Hannah Pickard takes a look at the risks that come with putting mental health in the hands of technology.
Approximately one in four people in the UK will experience a mental health condition each year, and as many as one in six of us experience a common mental health problem in any given week. Apple is looking to combat this prevalence of mental ill-health by building detectors into the iPhone, such as monitoring of facial expressions and sleep patterns, to recognise and diagnose mental health conditions. Should we be concerned about these “detectors”, and do they pose risks to privacy and ethics?
Currently, mental health conditions can only be diagnosed by a health professional, usually a psychiatrist with years of medical training. Health professionals can weigh the many factors of a person’s life circumstances when making a diagnosis, something Apple’s proposed techniques could not replicate. The role that Apple wants to take in mental health diagnosis is troubling. Suggesting that mental health conditions can be diagnosed with an iPhone both belittles the work of health professionals and is likely to be inaccurate compared to a formal medical evaluation, which could lead to misdiagnosis.
"Suggesting that mental health conditions can be diagnosed with an iPhone both belittles the work of health professionals and is likely to be inaccurate compared to a formal medical evaluation."
Even if iPhones don’t end up diagnosing users but instead merely suggest a possible mental disorder, there are still issues. Individuals may come to rely falsely on their phones, believing that if their phone isn’t indicating a potential mental health condition, they must not have one. This could lead to underdiagnosis, the very problem Apple apparently wants to eradicate.
Additionally, if a user is notified that they are likely to have depression, for example, they may become very paranoid about how their phone acquired this information. For a person who already has a diagnosis of a mental health condition that presents with paranoia and delusions, these symptoms could be exacerbated. Other users may try to adapt their behaviour to receive, or avoid, a specific diagnosis. If these features do appear on future iPhones, then perhaps they should be offered on an opt-in basis, as individuals may not want technology monitoring their mental health on their behalf.
"If these features do appear on future iPhones, then perhaps they should be offered on an opt-in basis..."
A concern for many people will be privacy. How will Apple use the data acquired by these detectors? Apple has stated that this information will not be stored on its servers, but how can individuals know this for sure? Data leaks do happen, and Apple could sell this data to a range of companies, who could then target adverts at individuals exhibiting behaviours consistent with mental health issues. Many people use shopping as a way to cope with feelings of loneliness and sadness; diagnostic technology made by a business like Apple, which turns over huge profits, could prey on this through advertising.
Apple’s stated aim for this new technology is early intervention, but surely there are safer and more secure ways to achieve this? Half of mental health conditions begin by the age of 14, so it would make sense for schools to play a crucial part in early intervention. Indeed, the Early Intervention Trust has stated that school-based interventions are successful because schools can reach all children, regardless of the pupils’ backgrounds, and can remove potential barriers, such as parents acting as gatekeepers to mental health support.
Apple is still in the early stages of research for their iPhone mental health detectors, so it’s not something to worry too much about right now. However, it will be interesting nonetheless to see the development of technology and its growing involvement in all aspects of our lives in the coming years.