Just be sure to change your DNA if you were impacted by the hack.
"Taking action" is personal: just don't give personal information, like your DNA (!), to anyone.
You can't control what your relatives do, unfortunately.
I'm aware HN has a dim view of the GDPR, but I previously worked in compliance and it was a sea change in how big corporations and organizations viewed data collection.
User PII and especially sensitive data suddenly was viewed as "toxic" and that having it around was something that could only bring them hassle.
California's data privacy acts are similar (but much more narrowly focused).
Also, I always like to sum up what the intent of these acts typically are and what compliance means:
- Tell people what data you're going to collect, what you do with it, and who you share it with
- Keep their data reasonably secure
- Delete it if they ask
The United States is unlikely to have a national privacy law in the foreseeable future due to extensive lobbying by companies whose business depends on violating citizens' privacy. It's the same reason we're unlikely to get true net neutrality: there is too much money opposed to it.
Wasn't it time to take action after the Equifax leak, or the Facebook-Cambridge Analytica leak? Yahoo? Marriott International? Yahoo again somehow?
Nothing will change.
It's time for people to stop expecting things from their corpo-overlords or the governments they've purchased.
No one will take action.
Outside Europe privacy isn’t a priority for politicians.
Why doesn't this breach constitute some sort of HIPAA violation, i.e., exposure of personally identifiable information?
If a cousin, aunt, or uncle used this service, does that automatically mean my DNA is visible/accessible?
Good luck out-lobbying Google and Meta for privacy
Using unique passwords would have prevented this from happening on the user side, but I agree that for sensitive data, 2FA should be mandatory.
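For what it's worth, the most common app-based 2FA scheme, TOTP (RFC 6238), is simple enough that a minimal sketch fits in a few lines of standard-library Python. This is an illustration of the algorithm, not a hardened implementation:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, t=None, digits=6, step=30):
    """Compute an RFC 6238 TOTP code (HMAC-SHA1 variant)."""
    key = base64.b32decode(secret_b32.upper())
    # The moving factor is the number of `step`-second intervals since the epoch.
    counter = int((time.time() if t is None else t) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): take 4 bytes at an offset derived from the digest.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890", time = 59s
secret = base64.b32encode(b"12345678901234567890").decode()
print(totp(secret, t=59, digits=8))  # -> "94287082"
```

The point is that the server only needs a shared secret and a clock; a stolen password alone isn't enough to log in.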
Thank you for calling this what it is - a hack - despite 23andme's strenuous efforts to paint this as the fault of millions of users (see https://blog.23andme.com/articles/addressing-data-security-c...) rather than owning the vast technical or management failure that allowed this to continue undetected for months.
Without the risk of a giant fine or, say, jail time, many tech giants can and do get away with managing their data security badly.
That's right. It's happened before, and will continue to happen as long as there are no consequences.
Note that 23andMe is not the first online genealogy service to get hacked:
- In 2017, MyHeritage had 92 million accounts hacked https://www.hackread.com/dna-testing-website-myheritage-hack....
- In 2020, MyHeritage users were targeted in a separate phishing scheme. https://blog.myheritage.com/2020/07/security-alert-malicious...
- GEDmatch admitted “all user permissions were reset” in a 2020 attack. https://www.buzzfeednews.com/article/peteraldhous/hackers-ge...
- Ancestry and Ancestry-affiliated companies have had multiple security breaches over the past 10 years (https://www.hackread.com/software-firm-leaks-ancestry-com-us...)
- Ancestry has also destroyed people's archives when it decided it was no longer profitable or important enough to keep them. https://slate.com/technology/2015/04/myfamily-shuttered-ance...
- Last year, FamilySearch belatedly admitted a breach had exposed “users’ full names, genders, email addresses, birth dates, mailing addresses, phone numbers.” https://grahamcluley.com/seven-months-after-it-found-out-fam...
These are incidents that have been made public as required by law. There are surely thousands of other smaller incidents that are not reported, as well as major breaches that the companies themselves don’t even know about yet. And it will continue for years to come until lawsuits or brutal regulations with teeth are enacted.
For the short term, if you don't want your data leaked, don't give it away to others. For the long term, support research into homomorphic encryption.
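Homomorphic encryption lets a service compute on data it can never read. A toy sketch of the additively homomorphic Paillier scheme illustrates the idea; the hardcoded primes here are deliberately tiny and only for demonstration (real keys use primes of 1024+ bits):

```python
import math
import random

# Toy Paillier keypair. These primes are far too small for real use.
p, q = 293, 433
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)

def L(x):
    """Paillier's L function: L(x) = (x - 1) / n."""
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)  # modular inverse (Python 3.8+)

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# Multiplying ciphertexts adds the underlying plaintexts:
a, b = 17, 25
print(decrypt((encrypt(a) * encrypt(b)) % n2))  # -> 42
```

In principle, a genomics service built this way could match relatives or compute traits over encrypted genomes, so a breach would leak only ciphertext. Fully homomorphic schemes that support arbitrary computation exist but remain far more expensive.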
I get the need to tie in to a recent big news story for exposure reasons, but I think it would be good to be more explicit about the different problems.
We have businesses that are explicitly built on violating privacy.
We have businesses that provide services that require them to collect some private info. I'd put 23andme in this bucket.
We have businesses that have lax security, and actually get their systems broken into.
We have businesses that have fine security, but don’t force users to have good, unique passwords and 2FA. 23andme is in this bucket, right?
The first, we should be happy to run out of business; we should actively write laws that try to destroy them.
The third, we should fine them to the point where skimping on security is never a rational decision (and if that runs companies out of business, fine).
The second seems not too bad: every medical-field-related service is necessarily going to have some private info, and as long as they don't exploit it, that seems fine.
The fourth seems not so bad, there are all sorts of services that are not so important. I don’t have 2FA on, like, random forums and video games, who cares?
Combining two and four is pretty bad though.