Help Before We Know We Need It?

28th May 2018

New reports are emerging that at least four former Royal Marines are suspected to have committed suicide in the past four weeks. Family men. It’s becoming ever more apparent that more needs to be done to support the people who are struggling to survive in an ever-changing world.

This latest spate of deaths came as it also emerged that a further three former soldiers are believed to have taken their lives since the beginning of April.

We all know people who are suffering, struggling to seek out the help they desperately need. And we also know there are more just like them who suffer in silence. Poor mental health is a killer. It doesn’t matter who you are, how tough you are or how many medals you have on your chest. Everyone is vulnerable, and not everyone will win the battle.

“There is a big problem with these often very strong characters. They look invincible, but they are carrying a lot of baggage and are often too proud to get help.”

Other figures emerging from the Ministry of Defence reveal that more than 5,000 Armed Services personnel were diagnosed with a mental health problem last year.

So what can we do about it?

Compared to other areas of healthcare, mental health is far more challenging to diagnose and treat. For example, only a third of people suffering from depression benefit from the first antidepressant they are prescribed, and it generally takes several months to determine whether an antidepressant is proving useful or not. The challenges in mental health care are further complicated by our limited understanding of the human brain and the fact that conditions are often the result of a combination of interdependent social, economic, neurological and genetic factors, the interplay between which is yet to be fully understood.

Talking therapies like CBT (cognitive behavioural therapy), which are based on the theory that thoughts, feelings, what we do and how our body feels are all connected, can be easier to measure in many cases, because feedback is more immediate, even if the benefits take longer to form. While these talk-driven approaches also yield varying results, they at least rely on human rather than biological data, and so offer some exciting opportunities in the space of AI and predictive analytics.

These technology-driven approaches in psychiatry will always be contentious, and data-driven care will no doubt polarise opinion, but the opportunities are apparent, especially in a world where some groups in society spend more time ‘talking’ to their phones than to their offline friends.

When people feel worried or distressed, they often fall into patterns of thinking and responding that can make how they feel worse. The idea of a predictive model that helps spot and change those problematic thinking styles or behavioural patterns is fast becoming a reality. It could help us move from a model of ‘diagnose and treat’ to one of ‘predict and prevent’, triaging and speeding up treatment for those most at risk.

Psychiatrists are smart, well trained and do their best to stay up to date with the latest research. But they can’t possibly memorise all the knowledge needed for every single situation, and they probably still don’t have enough information at their fingertips all of the time. Even if they did have access to the massive amounts of data necessary to compare treatment outcomes, they would still need time and expertise to analyse the information and make a recommendation for each patient.

This is where predictive analytics could step in, if it’s approached ethically. Imagine, for a moment, a world where the data those men collected on their phones was processed, understood, translated and interpreted in a more human-focused way. A predictive model could help to parse the meaning in their data into visual and actionable outputs for them. Like a coach, only smarter.

The opportunity to reveal surprising associations in that data, associations the human brain would never suspect, is a tantalising prospect… but we must approach it thoughtfully, not just technically.

Data acquisition

The first big hurdle would inevitably come in the acquisition of the data. When it comes to creating an excellent predictive model, the first stage is to gather all of the inputs or data sources. There’s no question that we now have access to a plethora of data, but in many cases, people have been collecting data that only they care about, and it might not be valuable, insightful or actionable. So getting the right data matters more than simply getting data.

The risk of not having the right data, or of bias in the data we use, is a real threat. Training datasets for predictive models in psychiatry must be built from diverse population samples. Algorithms trained on datasets that are narrow in terms of age or ethnicity will have little predictive power for the broader population.
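As a rough illustration of what checking for that kind of bias could look like, here is a minimal sketch in Python using pandas. The demographic columns, sample records and reference proportions are all invented for the example; a real project would use its own population statistics.

```python
import pandas as pd

# Hypothetical training sample with basic demographic columns (invented data).
training = pd.DataFrame({
    "age_band":  ["18-29", "18-29", "30-44", "30-44", "45-64", "30-44"],
    "ethnicity": ["white", "asian", "white", "black", "white", "white"],
})

# Illustrative reference proportions for the wider population (placeholder values).
reference_age = {"18-29": 0.20, "30-44": 0.26, "45-64": 0.33, "65+": 0.21}

observed = training["age_band"].value_counts(normalize=True)

# Flag any age band that is badly under-represented relative to the reference.
for band, expected in reference_age.items():
    share = observed.get(band, 0.0)
    if share < 0.5 * expected:
        print(f"Warning: {band} is {share:.0%} of the sample "
              f"but {expected:.0%} of the population")
```

A check this simple would only catch the most obvious gaps, but it shows the principle: the make-up of the training data has to be compared against the population the model is meant to serve.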

As the field of predictive psychiatry evolves, we will need to learn from mistakes made in other industries such as banking and gaming and ensure the evidence base is built upon diverse foundations.

Ethics and Privacy

Predictive psychiatry is founded on the concept that we can derive insight from data collected from many individuals, which we then push into the cloud to cross-reference, cross-fertilise and learn from. The management and sharing of this data raise some very complex issues around privacy and security. These problems are far from unique to this field; they are sharply felt across all healthcare systems, both private and public. However, within the area of mental health, the ethical concerns are arguably even more sensitive, given that some of the most vulnerable members of society will be directly affected. There are ways to mitigate these problems, such as carefully designed consent agreements which outline how the data will be used and shared, as well as greater transparency around how the data is stored, used and analysed. But because of recent data-sharing scandals (thanks Facebook!), it’s crucial to help people make more informed decisions about services powered by their data.

While this technology has considerable potential, expensive trials are unlikely to be a silver bullet for diagnosing and treating mental health problems. For a start, the cost and ethics of running a large-scale trial could easily prove prohibitive when we already see an acute squeeze on budgets for mental health services, so it might fall on industry to lead the charge. Highly technical experimental approaches are also time-consuming and often constitute only a single observation, making them less reliable than multiple measurements. So if the potential of predictive psychiatry is to be realised, we need better ways of trialling it: methods capable of guiding diagnosis and treatment, but from which data is easier and cheaper to collect than through academic trials alone.

The answer might be found in the data captured by the smartphones we use every single day; en masse, they could give us a large sample of the general population.

Data preparation

Before digging into all this data, it’s crucial first to step back and figure out what problem you are trying to solve with the model.

It seems entirely plausible, using large audience datasets and machine learning, to replicate the results of traditional test scores (including scales used to measure depression and anxiety) by drawing on information about how someone uses their phone. Our phones continuously and passively monitor patterns of behaviour, such as the timing of swipes, key presses and spacebar taps. This information would be gold dust, and if we could tap into it properly, it would be a massive opportunity for predictive psychiatry, and humanity as a whole. But the reality is that it’s typical for large datasets like this to be incomplete and to contain human error introduced during data entry. So data preparation is crucial to the future of predictive psychiatry. Any data should be adequately cleaned up to normalise common mistakes captured during the data acquisition phase. Only then will a predictive model be able to answer the specific questions and drive the actions we need. Some common ways to prepare your data include enrichment (bringing in external signals to complement current records), spam analysis and content normalisation.
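To make that concrete, here is a minimal sketch of the kind of clean-up involved, assuming a hypothetical export of passive phone-usage signals (the column names and values are invented): removing duplicate records, normalising types, and treating impossible values as gaps to be filled.

```python
import pandas as pd

# Hypothetical raw export of passive phone-usage signals, one row per user per day.
raw = pd.DataFrame({
    "user_id":       ["a01", "a01", "a02", "a02", "a02"],
    "date":          ["2018-05-01", "2018-05-01", "2018-05-01", "2018-05-02", "2018-05-03"],
    "key_presses":   [1240, 1240, None, 860, -5],   # duplicates, gaps and entry errors
    "night_unlocks": [3, 3, 7, None, 2],
})

clean = (
    raw.drop_duplicates(subset=["user_id", "date"])       # remove duplicate records
       .assign(date=lambda d: pd.to_datetime(d["date"]))  # normalise types
)

# Treat physically impossible values as missing, then fill gaps per user.
clean.loc[clean["key_presses"] < 0, "key_presses"] = None
clean[["key_presses", "night_unlocks"]] = (
    clean.groupby("user_id")[["key_presses", "night_unlocks"]]
         .transform(lambda s: s.fillna(s.median()))
)
print(clean)
```

Filling gaps with a per-person median is just one of many possible choices; the point is that these decisions need to be made deliberately before any modelling begins.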

Modelling

Once we understand the machine-learning problem we’re trying to solve, the next stage of building a robust model is to employ data science methodologies like classification or regression. Classification (together with its close relative, class probability estimation) is used to predict which of a small set of classes an individual belongs to. For instance, we might ask “Among patients of this community, who is most likely to respond to a certain set of questions and nudges?” There would then be two classes: “Will Not Respond” and “Will Respond.”
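As a toy illustration of that two-class question, a classifier might look like the sketch below. The features, labels and scikit-learn logistic regression are all assumptions made for the example, standing in for whatever signals and model a real project would choose.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Invented features per patient:
# [average daily key presses, night-time unlocks, days since last check-in]
X = np.array([
    [1200,  2,  1],
    [ 300,  9, 14],
    [ 950,  3,  2],
    [ 150, 12, 21],
    [1100,  1,  3],
    [ 400,  8, 10],
])
# 1 = responded to a previous set of questions and nudges, 0 = did not.
y = np.array([1, 0, 1, 0, 1, 0])

model = make_pipeline(StandardScaler(), LogisticRegression()).fit(X, y)

# Estimated probability that a new patient falls into the "Will Respond" class.
new_patient = np.array([[500, 7, 8]])
print(model.predict_proba(new_patient)[0, 1])
```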

Here we see the real value in prediction: not just giving an individual a different form of clinical care, but helping the carers themselves make better-informed decisions about how scarce resources could be applied, or whether an issue is looming that could be dealt with before a crisis point is hit.

On the other hand, regression (or value estimation) is used to predict the value of some variable for each person. Looking at historical data, you might produce a model that estimates a particular variable specific to each person, such as “How often will this person use this service?”
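A minimal sketch of that usage-frequency question, again with invented numbers and features, could look like this:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Invented historical data per person:
# [sessions last month, average session minutes, weeks since sign-up]
X = np.array([
    [12,  9.5,  4],
    [ 3,  2.0, 20],
    [ 8,  6.5, 10],
    [ 1,  1.5, 30],
    [15, 11.0,  2],
])
# Target: sessions this month, the value we want to estimate for each person.
y = np.array([14, 2, 9, 1, 16])

model = LinearRegression().fit(X, y)

# Estimate of how often a new person is likely to use the service this month.
print(model.predict(np.array([[6, 5.0, 12]]))[0])
```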

Both of these techniques, and many others, can deliver model outputs that drive powerful AI and predictive analytics use cases for service usage.

As already mentioned, the most valuable use case for this kind of technology would be to drive efficiency. With the right patient intelligence, teams can optimise treatments for effectiveness. And since predictive analytics delivers immediate feedback on the quality of an intervention, key performance metrics can be calculated in near real time rather than waiting for real-world outcomes to play out, which, as we know, can yield tragic results, especially in the space of mental health.

Banks have been employing this kind of prediction for decades using something called a stochastic model, which is used to project different outcomes under uncertainty. As a form of financial modelling, its purpose is to estimate how probable different consequences are within a forecast and to predict conditions for different situations. The Monte Carlo simulation is one example of a stochastic model; when used for portfolio evaluation, many simulations of how a portfolio may perform are run based on probability distributions of individual stock returns. Now imagine applying the same methodology (and even technology) to a group of patients: predicting which patients may need a higher degree of personal attention, versus those who may be better treated using more remote, digital techniques. Of course, it’s never about replacing a human; it’s always about making the right channel choices, based on the data that’s available.
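For a flavour of how simple the core of a Monte Carlo simulation can be, here is a sketch in Python using NumPy. The return assumptions and portfolio weights below are placeholders, not real market data; the same pattern of “simulate many plausible futures, then read off the distribution” is what the patient analogy would borrow.

```python
import numpy as np

rng = np.random.default_rng(42)

# Placeholder annual return assumptions for three holdings (mean, standard deviation).
means   = np.array([0.05, 0.07, 0.02])
stdevs  = np.array([0.10, 0.18, 0.04])
weights = np.array([0.5, 0.3, 0.2])   # portfolio allocation

# Simulate 10,000 possible one-year outcomes for the whole portfolio.
simulated = rng.normal(means, stdevs, size=(10_000, 3)) @ weights

print("Expected return:          ", simulated.mean())
print("5th percentile (bad year):", np.quantile(simulated, 0.05))
```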

Accurate predictions also add value when it comes to quantifying key performance indicators like the accuracy of recommendations, average engagement quality, chat-to-efficacy ratio and so on. By using these KPIs to look past traditional medical metrics and identify top-performing nudges and content, clinicians could gain deeper insights into which dialogues produce the fastest outcomes, reach the most people and deliver the most precise care.
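To show what that could look like in practice, here is a hedged sketch that computes a few such KPIs from a hypothetical log of nudges. The column names, outcome flags and the “chat-to-efficacy” calculation are illustrative assumptions, not established metric definitions.

```python
import pandas as pd

# Hypothetical log of nudges sent by a digital service, with simple outcome flags.
log = pd.DataFrame({
    "nudge":      ["breathing", "journaling", "breathing", "sleep", "journaling"],
    "engaged":    [1, 0, 1, 1, 1],   # did the person interact with the nudge?
    "improved":   [1, 0, 0, 1, 1],   # did their tracked score improve afterwards?
    "chat_turns": [4, 1, 6, 3, 5],   # length of the accompanying conversation
})

kpis = log.groupby("nudge").agg(
    engagement_rate=("engaged", "mean"),
    efficacy_rate=("improved", "mean"),
    avg_chat_turns=("chat_turns", "mean"),
)

# A crude "chat-to-efficacy" style ratio: conversation effort per unit of improvement.
kpis["chat_to_efficacy"] = kpis["avg_chat_turns"] / kpis["efficacy_rate"]
print(kpis)
```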

The Future?

This kind of precision, data-driven psychiatry has been discussed for years. Its potential to transform the way mental health is not only treated but diagnosed is seen as one of the most significant opportunities of a generation. Using data and predictive modelling, a highly individualised approach to care is a real possibility.

Using chat-based or passive data collection to provide an objective measure of someone’s mental health would be a huge deal: a measurement that could also be monitored as wellbeing fluctuates over time.

Understanding that data collection needs a strict informed-consent model is critical, but the potential to save lives is enormous. Given the rising number of people affected, along with their families, work colleagues and friends, it would be criminal for us not to be doing more. So we are.