FTC chair says promises on data security need to be kept.
A letter from the Federal Trade Commission has expressed concern about the security of the data held by 23andMe.
In the letter, FTC chair Andrew Ferguson warned the Department of Justice’s bankruptcy regulators that any purchaser of 23andMe must adhere to the company’s current privacy policies for protecting consumers’ genetic and other data, reports The Record.
User Privacy and Choice
“The Company has promised, and continues to promise, that user privacy and choice are at the forefront of its business model,” Ferguson wrote. “23andMe tells its users that ‘privacy comes first’ and that ‘since day one, we’ve committed ourselves to protecting your privacy’.”
“Further, the Company commits to its users that they are in control of their data, and that users can decide how their information is used and for what purposes—including honoring the right of users to delete their personal information at any time.”
The news follows reports that 23andMe is set to enter bankruptcy in the U.S., with regulators and politicians raising concerns about the security of the biometric data held by the company.
Personal Information and Biological Samples
Ferguson went on to say that the FTC believes that, consistent with Section 363(b)(1) of the Bankruptcy Code, promises of data security to consumers must be kept. “This means that any bankruptcy-related sale or transfer involving 23andMe users’ personal information and biological samples will be subject to the representations the Company has made to users about both privacy and data security, and which users relied upon in providing their sensitive data to the Company,” he said.
He also said that any purchaser should expressly agree to be bound by and adhere to the terms of 23andMe’s privacy policies and applicable law, including as to any changes it subsequently makes to those policies.
Regulatory Options
An advisory by Punter Southall Law noted that this is not the first data auction of its kind: in 2000, online toy retailer Toysmart planned to sell its customer data to help pay off its debts, and the FTC intervened to block the sale.
“The main legislation protecting an individual’s personal data is GDPR, which applies across the EU,” the advisory said, adding that under the EU AI Act, most biometric identification or biometric categorisation AI systems are classed as either high risk or unacceptable risk.
“The outright prohibition on unacceptable risk AI systems started applying from 2nd February 2025.”
Written by
Dan Raywood is a B2B journalist with 25 years of experience, including covering cybersecurity for the past 17 years. He has extensively covered topics from Advanced Persistent Threats and nation-state hackers to major data breaches and regulatory changes.
He has spoken at events including 44CON, Infosecurity Europe, RANT Forum, BSides Scotland, Steelcon and the National Cyber Security Show, and served as editor of SC Media UK, Infosecurity Magazine and IT Security Guru. He was also an analyst with 451 Research and a product marketing lead at Tenable.