Services like 23andMe can be fun but do they put our privacy at risk?
Everybody wants to know where they’re from, who they’re related to, and what they’re at risk of. From questions of paternity to screening for diseases, genetic testing was once a rigorous and serious endeavour. With big businesses like 23andMe, finding out about your DNA has never been so accessible. We see it in online advertisements, in discount coupons, and in testing kits given as family Christmas gifts. Genetic tests are the new horoscopes.
But such knowledge comes at a cost. In recent times, the safety of your genetic data has proven to be a cause for concern.
23andMe has sold over 10m DNA testing kits. And when asked on purchase, over 80% of its consumers agreed to their data being used by the company and scientists for research into the causes of diseases and potential treatments. However, this year 23andMe made a different kind of sale: the rights to a new drug, licensed to the Spanish pharmaceutical company Almirall. The drug was developed using its customers’ data; a point of pride for 23andMe. Speaking to Bloomberg News, its vice-president of business development Emily Drabant Conley remarked: “we’ve now gone from database to discovery to developing a drug.” When sending off their swabs to 23andMe, customers are signing over more than they realise.
23andMe’s terms of service are carefully worded and constructed for flexibility. They state that the genetic data you hand over (“Genetic and/or Self-Reported Information”) will not be disclosed to third parties at an individual level without your explicit consent, unless required by law. This does not prevent the company from selling data in aggregate. Furthermore, by agreeing to these terms of service you relinquish ownership of your genetic material, and thus receive no financial gain from research or products developed by 23andMe or its collaborators (“Waiver of Property Rights”). You have sold your DNA off at a loss, and are powerless over its usage. Speaking with the BBC, Tim Caulfield, research director of the Health Law Institute at the University of Alberta, expressed concern at how little awareness consumers have when they hand over their information: “people need to look carefully at privacy statements because often these firms are partnering with the pharmaceutical industry and people should be aware that is happening”.
Such examples are easily found. In 2016, 23andMe sold access to its data to more than 13 drug firms, with Genentech reportedly paying $10m (£8.3m) to look at the genes of those suffering from Parkinson’s disease. In 2018, almost 5 million 23andMe customers found that their genetic data and related health information had been sold to the major drug company GlaxoSmithKline, in a $300m deal granting it exclusive access for four years. This was under the research programme “23andMe Therapeutics”, whose collaborators range from non-profits such as The Michael J. Fox Foundation and academic institutions like the University of Chicago to for-profit pharmaceutical companies such as Pfizer and Biogen.
If you’re feeling altruistic and aren’t concerned about how your data is used, that complacency may be to your detriment. In 2019, 92m accounts on the DNA testing service MyHeritage were hacked and the data placed on a private server. Luckily for its customers, the breach exposed only email addresses and hashed passwords, not genetic data (or credit card information). But as the value of data increases, so will the drive to get it, and not just among hackers.
Researchers want it, the police want it, and one day employers may too. Insurance companies could use it to calculate how much you might cost them and deny you cover. As 23andMe’s terms of service note, failing to tell your insurance company about a health condition discovered through one of its tests may constitute fraud. Privacy becomes a criminal act.
Natalie Ram, a professor of law at the University of Baltimore specialising in bioethics, summarises it best: “if there is data that exists, there is a way for it to be exploited.” And unfortunately the UK does not appear to have specialised laws to protect against this. UK police may soon seek to follow their American counterparts’ example.
In April 2018, US police caught the serial rapist and murderer known as the Golden State Killer by uploading DNA believed to be the suspect’s to GEDmatch, a free database that helps people find relatives who have also submitted DNA. From the results, police built a family tree containing around 1,000 people’s details, which led them to the third and fourth cousins of Joseph James DeAngelo, who was then arrested and charged. The problem? Not a single individual whose data was used had consented to it being part of a murder investigation, and relatives became suspects. In 2019, GEDmatch attempted to restrict police access by making law enforcement use of genetic information opt-in, but a state judge overrode this with a warrant that same year. This sets a concerning precedent for consumer privacy, and for companies that have resisted cooperating with police investigations, such as 23andMe and Ancestry.com. As the example shows, even if you haven’t uploaded DNA, you are traceable if a relative has. So if you’ve committed a crime in the US, you had better hope that your fourth cousin Phil hasn’t taken to genealogy.
Even the most law-abiding citizen should be concerned. Is sacrificing your privacy a worthy exchange for knowing how Nordic your distant uncle is? And the safety concerns around sites like 23andMe only mount. What if the company changes hands? What if it goes out of business? Who would buy this information at a bankruptcy auction? You can close your account and delete your test results, but 23andMe can keep your DNA sample for up to 10 years, and rival service Helix “may store your DNA indefinitely”. A puppy isn’t just for Christmas, and neither is the family’s DNA test.