A proposed class action, Javid v. MAC Cosmetics, Inc. (Case No. 2025-CH-08774), has been filed in Cook County, Illinois, accusing MAC Cosmetics of violating the Illinois Biometric Information Privacy Act (BIPA). The suit alleges that the brand's popular virtual try-on tools, used both in-store and online, scan and collect consumers' facial geometry data without the written, informed consent the law requires.
How MAC’s AR Try-On Tech Works—and What the Lawsuit Claims
The complaint states that MAC Cosmetics offers a virtual try-on tool that lets shoppers preview how lipsticks, foundations, and other products would look, using either a real-time video scan or an uploaded photo. The feature is available both at physical locations, such as the Schaumburg, Illinois store, and online through customers' device cameras or photo uploads.
According to the complaint, the technology captures precise facial geometry to overlay makeup onto a mirror-like replica of the customer's face, but it does not display consent forms or disclose how the data will be stored, used, or retained, as BIPA requires.
Plaintiff Fiza Javid alleges she encountered the tool at a suburban Chicago store and again on MAC's website, and was never notified about the data collection or asked to provide written consent, in violation of Illinois law. The suit argues that neither MAC's digital platforms nor its retail associates disclosed that biometric information would be recorded, how it would be used, or how long it would be stored.
The Legal Standard: What Does Illinois BIPA Require?
Enacted in 2008, the Illinois Biometric Information Privacy Act is the strictest biometric data law in the U.S.—requiring companies to obtain explicit, written consent before collecting or possessing any individual’s biometric data. Businesses must also publish written policies outlining the duration of data storage and the method of data destruction, and clearly disclose the purpose for collecting the data.
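To make the statute's checklist concrete, here is a minimal sketch, in TypeScript, of the records a BIPA-covered business would need to keep: a publicly available retention-and-destruction policy and a written release captured before any collection. The interfaces and field names are hypothetical illustrations of the statutory elements, not any company's actual schema.

```typescript
// Hypothetical model of the records a BIPA-covered business would keep.
// Illustrative only; not any company's actual schema.

// Publicly available written policy (740 ILCS 14/15(a)): a retention
// schedule and guidelines for permanently destroying biometric data.
interface BiometricRetentionPolicy {
  dataTypes: string[];          // e.g., ["facial geometry"]
  purpose: string;              // why the data is collected
  retentionPeriodDays: number;  // how long the data may be kept
  destructionMethod: string;    // how it is permanently destroyed
  publishedAt: Date;            // the policy must be publicly available
}

// Written release obtained before collection (740 ILCS 14/15(b)):
// the consumer is told what is collected, why, and for how long.
interface BiometricConsentRecord {
  subjectId: string;
  disclosedDataTypes: string[];
  disclosedPurpose: string;
  disclosedRetentionDays: number;
  consentGivenAt: Date;         // timestamped written release
  signatureRef: string;         // e.g., e-signature or checkbox audit record
}

// Collection may proceed only when a valid written release exists.
function mayCollectBiometrics(consent?: BiometricConsentRecord): boolean {
  return consent !== undefined && consent.signatureRef.length > 0;
}
```

The point of the sketch is simply that consent is a recorded artifact created before collection, not an implicit side effect of using the tool.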
Failure to comply can result in statutory damages of $1,000 per negligent violation or $5,000 per intentional or reckless violation—plus attorneys’ fees. Notably, BIPA allows private citizens to directly sue, which has sparked a major increase in consumer privacy litigation against both global tech giants and retail brands.
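For a sense of why these numbers drive so much litigation, here is a purely illustrative calculation assuming a hypothetical class of 10,000 consumers and one violation per person; actual exposure depends on how violations are counted and what a court ultimately awards.

```typescript
// Hypothetical illustration of BIPA statutory-damages exposure.
// Class size and per-person violation counts are assumptions,
// not figures from the MAC complaint.
const classMembers = 10_000;
const perNegligentViolation = 1_000; // $1,000 per negligent violation
const perRecklessViolation = 5_000;  // $5,000 per intentional or reckless violation

const negligentExposure = classMembers * perNegligentViolation; // $10,000,000
const recklessExposure = classMembers * perRecklessViolation;   // $50,000,000

console.log(`Negligent exposure: $${negligentExposure.toLocaleString()}`);
console.log(`Reckless exposure:  $${recklessExposure.toLocaleString()}`);
```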
Industry Impact: The New Frontline of Biometric Lawsuits
MAC’s legal troubles highlight a larger industry trend: beauty, fashion, and retail brands face mounting lawsuits over AR and biometric technology. Prior BIPA suits have produced major settlements, including Facebook’s (now Meta’s) $650 million settlement reached in 2020, and dozens more have been filed against both tech and non-tech companies in Illinois alone. As practicalecommerce.com and Law360 note, other companies, such as Walmart, have also been named in class actions for allegedly using facial recognition in stores without proper consent or transparency.
The suit seeks to represent all Illinois residents whose biometric data was allegedly collected through the virtual try-on feature without proper notice or consent, with separate in-store and online classes proposed. It follows a spring 2025 class action against Living Proof Inc. in Illinois over similar allegations involving virtual try-ons for hair products, a signal that every retail category using AR try-on technology is now on notice.
What’s Next? Brand Reactions and Consumer Rights
As of this writing, MAC Cosmetics and its parent, Estée Lauder Companies, have not released a public statement about the allegations. Legal experts expect a surge in privacy litigation as biometric and AR shopping experiences become more widespread. To avoid running afoul of strict state laws, brands will need to adopt privacy by design, with explicit, easy-to-understand consent flows and public retention policies.
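As a rough illustration of what such a consent flow could look like in a browser-based try-on tool, the sketch below gates camera access behind a plain-language disclosure and an affirmative opt-in. It is a generic example of the consent-before-capture pattern, not a description of MAC's or any vendor's actual code; the dialog helper, endpoint, and wording are hypothetical.

```typescript
// Hypothetical consent gate for a browser-based AR try-on.
// Generic sketch only; not any retailer's actual implementation.

interface ConsentChoice {
  accepted: boolean;
  timestamp: Date;
}

// Assumed helper: renders a plain-language disclosure (what is collected,
// why, and for how long) and resolves with the shopper's choice.
declare function showConsentDialog(disclosure: string): Promise<ConsentChoice>;

async function startTryOn(video: HTMLVideoElement): Promise<void> {
  const choice = await showConsentDialog(
    "This try-on scans your facial geometry to preview products on your face. " +
      "Scans are used only for this session and deleted per our retention policy."
  );

  if (!choice.accepted) {
    // No consent: never touch the camera or compute any face data.
    return;
  }

  // Record the written release before any capture begins
  // ("/api/biometric-consent" is a placeholder endpoint).
  await fetch("/api/biometric-consent", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ acceptedAt: choice.timestamp.toISOString() }),
  });

  // Only now request camera access and start the AR overlay.
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });
  video.srcObject = stream;
  await video.play();
}
```

The ordering is the point: disclosure and a recorded release come before any camera frame is captured or any facial geometry is computed, mirroring BIPA's consent-before-collection requirement.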
Affected Illinois consumers generally do not need to take action at the outset of such lawsuits, but they should save evidence of any AR try-on experiences and related communications. If the class is certified or a settlement is announced, those individuals would be notified and could be eligible for financial relief.
This lawsuit shows how vital robust privacy compliance is for every retailer rolling out virtual try-on and advanced digital experiences in the U.S. As BIPA litigation continues to snowball, leadership in transparency, consent, and privacy practices will be critical for maintaining trust and mitigating liability in the era of AI-driven retail.
