Last Updated on March 26, 2018
A recent study from University College London suggests that machine learning could revolutionize the way that clinical trials are conducted and unleash a wave of innovation in the pharmaceutical industry.
The researchers chose to test their hypothesis by studying existing large-scale data sets related to stroke patients. They posited that the human brain and its interactions with various drug formulations are too complex for existing statistical models to capture.
By applying machine learning, the researchers hoped to demonstrate that existing stroke treatments are more effective than widely believed, and that those effects have been obscured by limitations in the way clinical trials are currently conducted.
After compiling that data, researchers implemented a sophisticated form of machine learning that can account for thousands of different variables in the brain. This contrasts with past clinical trials that focused on only three or four basic variables.
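To see why the number of variables matters, consider a minimal sketch of the idea. This is not the researchers' actual algorithm or data; it uses synthetic patient data and ridge regression (a hypothetical stand-in for their model) to show how an analysis restricted to a handful of variables can miss an effect that is spread thinly across thousands of them:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "trial" data: 500 patients, 1,000 weak predictors.
# The outcome depends on many small effects, none individually dominant.
n, p = 500, 1000
X = rng.normal(size=(n, p))
true_w = rng.normal(scale=0.1, size=p)          # many small true effects
y = X @ true_w + rng.normal(scale=0.5, size=n)  # outcome plus noise

def ridge_fit(X, y, lam):
    """Closed-form ridge regression: w = (X'X + lam*I)^-1 X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

def r_squared(X, y, w):
    """Fraction of outcome variance explained on held-out data."""
    resid = y - X @ w
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

# Hold out the last 100 patients for evaluation.
X_train, X_test = X[:400], X[400:]
y_train, y_test = y[:400], y[400:]

# Conventional analysis: only 3 variables considered.
w_few = ridge_fit(X_train[:, :3], y_train, 1.0)
r2_few = r_squared(X_test[:, :3], y_test, w_few)

# High-dimensional analysis: all 1,000 variables, regularized.
w_all = ridge_fit(X_train, y_train, 50.0)
r2_all = r_squared(X_test, y_test, w_all)

print(f"3-variable model   R^2: {r2_few:.3f}")
print(f"1000-variable model R^2: {r2_all:.3f}")
```

Under these assumptions the three-variable model explains almost none of the outcome, while the regularized full model recovers a substantial share of it, even though no single variable carries much signal on its own.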
After careful analysis, the researchers concluded that stroke treatments that had previously been dismissed as ineffective actually had significant therapeutic benefit. Due to the limitations of statistical modeling, however, those benefits were overlooked or ignored.
The results of this research are exciting but not especially surprising. The volume and complexity of data have always been major hurdles when conducting clinical trials.
Reaching accurate and actionable conclusions requires that the largest amount of data available be analyzed with the greatest degree of precision possible.
Achieving that on even a small scale requires a huge input of staff, time, and energy while delivering uncertain or inconclusive results.
The researchers at University College London are quick to point out that the value of machine learning does not lie in its ability to automate routine processes.
Rather, its value comes from conducting highly complex processes in perfectly formalized ways. As a result, clinical trials can combine diagnostic precision with statistical scale.
This is a specific example of a technological trend that promises to revolutionize healthcare broadly.
Machine learning that can filter through massive sets of data to extract meaningful insights has the potential to improve everything from clinical trials to patient care to hospital administration.
Healthcare data analytics will transform digitized medical information from an insurmountable obstacle into an undeniable asset.
The stroke researchers created their own algorithm, but it’s not necessary for healthcare organizations to have massive technical capabilities in order to benefit from machine learning.
This technology has matured significantly in recent years, and the improvements relate to its accessibility as much as its intelligence.
Organizations as diverse as university researchers, major hospital networks, insurance providers, and pharmaceutical companies are all investigating machine learning applications and searching for new approaches to data.
Machine learning may not be standard in the industry now, but in five to ten years it will be ubiquitous, and soon after that it will be mission critical.
Organizations eager to implement machine learning must start by assessing their own strengths and weaknesses in data management. Knowing which kinds of data are most valuable, and which issues inhibit access to that data, makes it possible to identify and apply machine learning solutions with deep impact.
Organizations must also identify solutions that make it possible to explore data in-depth without creating major technical burdens in the process.
The promise and potential of machine learning are exciting, but the solution cannot be more complicated than the problem. Organizations that make user-friendly features and functions a priority get more from their solutions in less time.
Once those solutions are common in clinical trials specifically and healthcare generally, expect things to improve drastically for patients and providers alike.