Open access peer-reviewed chapter

The Machine-Human Collaboration in Healthcare Innovation

Written By

Neta Kela-Madar and Itai Kela

Submitted: 23 July 2018 Reviewed: 31 July 2019 Published: 08 October 2019

DOI: 10.5772/intechopen.88951


Abstract

The biopharma industry is in crisis, as demonstrated by unsustainable research and development (R&D) costs. In parallel, the healthcare system suffers from skyrocketing costs, driven by the prevalence of chronic diseases and increased life expectancy. Innovative technologies have the potential to alleviate challenges in both the biopharma R&D model and healthcare. This chapter considers how Big Data analysis based on artificial intelligence and machine learning offers opportunities to drive greater efficiency across the entire R&D value chain, enhance the quality of the assets produced, and reduce the time and cost of bringing products to market. We also consider the unique challenges that arise when these fields are integrated into healthcare and medicine: the initially high costs when new medical and healthcare technologies reach the marketplace; widening socioeconomic health inequalities due to those high costs; and the distinct methodological challenges presented by cross-industry innovation, research, development, and implementation.

Keywords

  • artificial intelligence
  • healthcare
  • biopharma industry
  • personalized medicine
  • big data
  • digital transformation
  • machine learning
  • R&D
  • innovation

1. Introduction

The biopharma industry is facing significant challenges, reflected in unsustainable research and development (R&D) costs. This challenge shows itself in several ways. First, the cost needed to bring a product to market has risen from $1.188 billion in 2010 to a record $2.168 billion in 2018, amid aggressive pricing pressure. Second, patent expirations threaten numerous blockbuster drugs. As a result, biopharma companies' R&D returns fell from 10.1% in 2010 to a record low of 1.9% in 2018, the lowest level the industry has seen in nine years [1].

In parallel with biopharma's challenges, the healthcare system faces a crisis driven by the prevalence of chronic diseases and increased life expectancy, the main causes of skyrocketing healthcare costs (in the US, health spending is 18% of GDP and is expected to reach 19.6% by 2024) [2]. Today, 50% of the entire US population is considered chronically ill, accounting for 85% of overall healthcare costs [3]. Fortunately, the majority of chronic diseases can be prevented, or delayed until significantly later in life, through successful medical interventions.

Today, the healthcare industry is integrating novel genetic and digital technologies that help identify and cope with the complexity of chronic diseases and their often "silent" transition from healthy status to active disease with late-onset symptoms. The challenge is to move medical interventions upstream to the pre-disease state, when conditions are cheaper and easier to treat. Significant change must be made to the current pharma R&D model if productivity and profitability are to be restored and maximized. The prevailing view is that a complete digital transformation is needed to achieve these goals and deliver the next generation of scientific breakthroughs.

Big Data analysis based on artificial intelligence (AI) and machine learning (ML) offers opportunities to address some of these challenges: it is expected to drive greater efficiency across the entire R&D value chain and, eventually, to improve the quality of the assets produced as well as the time and cost it takes to bring them to market. The change is already beginning to take place. In fact, most of the big pharma companies (Novartis, Roche, Pfizer, Merck, AstraZeneca, GlaxoSmithKline, Sanofi, AbbVie, Bristol-Myers Squibb, Johnson & Johnson, and others) are already taking advantage of AI innovation (healthcareweekly.com, online) as it becomes a driving force in the innovation of medicine and healthcare.

AI powers the growing industry of personalized health technology, or personalized medicine, which will have a tremendous effect on healthcare management. The key idea behind the technological personalization of medicine and healthcare is to capture, analyze, and utilize individual patient characteristics, such as biomarkers, and to base medical decisions on these individual characteristics rather than on population averages. Another direct application is the development of assistive devices that augment traditional medical practice and healthcare, such as the broad use of robotics and patient-worn devices ("wearables") that optimize care. This chapter reviews leading publications in these areas and outlines major advantages as well as the methodological and clinical weak points that must be addressed for personalized medicine to realize its potential.
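
To make the contrast with population averages concrete, here is a minimal, hypothetical sketch in Python (scikit-learn, entirely synthetic data). The biomarker features and effect sizes are illustrative assumptions, not clinical values; it is a sketch of the idea, not any published model.

```python
# Illustrative sketch only: a toy model that predicts treatment response from
# individual biomarkers instead of prescribing by population average.
# All data here is synthetic; the biomarkers are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
# Hypothetical biomarkers, e.g., a gene-expression level and a protein marker.
X = rng.normal(size=(n, 2))
# Synthetic ground truth: response depends on the biomarkers, not the average.
y = (0.8 * X[:, 0] - 1.2 * X[:, 1] + rng.normal(scale=0.5, size=n)) > 0

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# Population-average decision: treat everyone according to the base rate.
population_rate = y_train.mean()
# Personalized decision: predicted response probability per patient.
personal_prob = model.predict_proba(X_test)[:, 1]
print(f"population response rate: {population_rate:.2f}")
print(f"personalized accuracy:    {model.score(X_test, y_test):.2f}")
```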


2. Toward the age of personalized medicine

Advances in technology are shifting the practice of medicine from anecdotal to data-driven. This shift has improved screening, prediction, diagnosis, and the treatment of disease, raising the quality and cost-effectiveness of medical care worldwide ([4]: p. 139). Personalized medicine is generally recognized as promising and advantageous in several important ways ([5]: pp. 1-2):

  • Medication becomes more effective as treatments are better matched to patients.
  • When patients are better matched to treatments, ineffective treatments and their accompanying harmful side effects are avoided.
  • Healthcare costs fall as therapies are used more appropriately.
  • Diseases are detected sooner, or even anticipated, so care shifts from detection to prevention, avoiding late, less effective, and more costly treatment.
  • Disease management becomes more effective through wearable patient technology.
  • Clinical trials become more accurate as patient selection becomes more precise.

Despite these apparent advantages, the technological personalization of medicine brings numerous challenges that must be addressed in order to harness its full potential. When healthcare and medical technologies first enter the marketplace, for example, they are often expensive, as the companies that developed them need to recoup high R&D expenses. As a result, personalized health technologies are adopted first by the more affluent, driving an even larger wedge between affluent and marginalized populations. This broadens the already wide socioeconomic gap in health inequalities in the short term ([5]: p. 2).

Solutions must be found to give patients across the socioeconomic spectrum access to personalized medicine, so that its benefits reach all populations. This is especially important because marginalized and disadvantaged populations are precisely those least likely to access and utilize these products, yet typically the very populations that would benefit from them disproportionately. Early disease diagnosis and management enabled by advances in personalized medicine are especially needed in these populations, and innovating for them is crucial if personalized health technology is to reach its public health potential. For this, creative, strategic health initiatives must be developed that lower costs while expanding access ([6]: pp. 2-4).


3. Challenges of human-machine innovation

The challenge in personalized medicine is methodological and inherent in cross-industry innovation itself: the ways in which different technologies are utilized for healthcare and medicine. While machine learning techniques can process large, complex data and provide accurate predictions based on this analysis, they are unable to provide a deeper understanding of the underlying phenomena ([6]: p. 5). In this respect, data science and AI do not replace classical research. As a result, a gap remains between the potential of personalized medicine and its realized application as solutions that impact clinical practice.

One foreseeable way to bridge this gap is a better coordinated interdisciplinary effort: scientists, physicians, patients and their advocates, regulatory agencies, and health insurance providers need to create a healthcare system that can learn and adapt as it develops ([6]: p. 12). In short, technology is not meant to replace physicians. Rather, the idea is to give physicians a tool that supports their decisions with accurate processing, understanding, and analysis of the large amounts of biomedical data already available ([6]: p. 13).

Another way to understand this difficulty is that personalized medicine is "underpinned" by convergent, cross-industry innovation, which naturally results in organizational complexity and uncertainty ([7]: p. 44). The question becomes how best to innovate given this challenge of cross-industry integration.

The two dominant forms of organizational learning aim for simplification and specialization, especially in the context of the uncertainty and complex integration issues that arise from innovation in an emerging cross-industry ecosystem. However, new research suggests the need to face this complexity through the adoption of a multitude of approaches, recognizing that uncertainty and risk are part and parcel of the very nature of innovation.

In this context, the management of risk might best be replaced by addressing uncertainty, with the understanding that in an emerging ecosystem of convergent innovation, comprehensive understanding is lacking. Approaches that embrace complexity rather than merely managing it may prove more effective, specifically by adopting numerous measures to address the divergent factors in cross-industry innovation ([7]: pp. 51-52).


4. AI and digital healthcare case study: AI for cardiac patients

As a case in point, consider the impact of AI in cardiology and cardiac imaging. Machine learning, and the "deep" neural networks used for this purpose, holds great promise when applied to medical imaging. Improving identification accuracy is critical both for patients at risk of cardiovascular events and for patients who are not at risk but are misdiagnosed and given unnecessary, sometimes harmful, treatments with negative side effects. The importance of improving accuracy in detection and diagnosis is thus monumental, given that cardiovascular disease is the leading cause of death worldwide ([4]: p. 139).

The use of AI in cardiology has increased dramatically in the past 5 years. Machine learning algorithms now outperform many traditional algorithms, including the established risk prediction algorithm of the American College of Cardiology (ACC)/American Heart Association (AHA), achieving a 3.6% improvement in predictive accuracy over the ACC/AHA algorithm ([4]: p. 139).
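
The kind of comparison reported there can be sketched in miniature. The following is emphatically not the ACC/AHA pooled cohort equations or the published study's model; it is a toy illustration on synthetic data of how a flexible machine learning model can outperform a fixed linear risk score when outcomes depend on interactions between risk factors.

```python
# A minimal sketch of the comparison described above, on synthetic data.
# The "traditional score" here is a stand-in linear score, NOT the actual
# ACC/AHA algorithm; the ML model is a generic off-the-shelf classifier.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 2000
# Hypothetical risk factors (e.g., age, systolic BP, cholesterol, smoking).
X = rng.normal(size=(n, 4))
# Synthetic outcome with a nonlinear interaction the linear score misses.
risk = X[:, 0] + 0.5 * X[:, 1] + X[:, 2] * X[:, 3]
y = (risk + rng.normal(scale=1.0, size=n)) > 1.0

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)
linear_score = X_te[:, 0] + 0.5 * X_te[:, 1]  # stand-in traditional score
ml = GradientBoostingClassifier(random_state=1).fit(X_tr, y_tr)

print("linear score AUC:", round(roc_auc_score(y_te, linear_score), 3))
print("ML model AUC:   ",
      round(roc_auc_score(y_te, ml.predict_proba(X_te)[:, 1]), 3))
```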

Still, major challenges lie ahead. Before AI can reliably be utilized in any field of medicine, let alone realize its potential for cardiac patients, the neural networks necessary for its application require constant and extremely time-consuming expansion and revision. Key difficulties are (i) the extremely large amount of training data required by neural networks; (ii) the need to annotate (label) any dataset used for training; (iii) understanding what the computer has learned, given that the patterns and knowledge gained by a network are encoded in the weights of its nodes; and (iv) the risk of "overfitting" the training data when designing and training a neural network.

In other words, more efficient machine learning, with improved accuracy from less training and less data, is needed to approximate the efficiency of human learning and bring its relevance to a clinical setting.
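
Difficulty (iv), overfitting, is routinely countered by monitoring a held-out validation set during training and stopping when its score plateaus. A minimal sketch, assuming synthetic data and scikit-learn's built-in early stopping:

```python
# A minimal sketch of overfitting and one standard mitigation.
# Synthetic, deliberately noisy data; an oversized network memorizes it
# unless a held-out validation split halts training early.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
X = rng.normal(size=(600, 20))
y = (X[:, 0] - X[:, 1] + rng.normal(scale=2.0, size=600)) > 0  # noisy labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=2)

# Without early stopping: the network fits the training noise.
big = MLPClassifier(hidden_layer_sizes=(256, 256), max_iter=2000,
                    random_state=2).fit(X_tr, y_tr)
# With early stopping: 10% of the training data is held out, and training
# halts once the validation score stops improving.
reg = MLPClassifier(hidden_layer_sizes=(256, 256), max_iter=2000,
                    early_stopping=True, validation_fraction=0.1,
                    random_state=2).fit(X_tr, y_tr)

for name, m in [("no early stopping", big), ("early stopping   ", reg)]:
    print(f"{name}: train={m.score(X_tr, y_tr):.2f} test={m.score(X_te, y_te):.2f}")
```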

Despite these challenges, AI in cardiac CT angiography has made tremendous gains in the past 10 years, and over the next 10 years the expectation is that we will see more AI software development and use in cardiac imaging than in the past 50 years ([4]: p. 139).


5. Decision-making tools for practitioners

A recent study of 6000 tissue samples from cancer patients across the US revealed that one case in every 71 was misdiagnosed, and up to one in five was misclassified. The same study reviewed 25 years of US malpractice claims and concluded that diagnostic errors were the cause of the most severe patient harm. According to the National Academies' Institute of Medicine, 10% of patient deaths and as many as 17% of hospital complications result from diagnostic errors ([8]: p. 1).

What is more, physicians were not the primary cause of most diagnostic errors. Instead, the study found, the fault lies chiefly in substandard collaboration and synthesis of information in the healthcare system, as well as communication gaps; the healthcare system as a whole failed to effectively support the diagnostic process ([8]: p. 1).

Now let us consider the application of AI to this need for collaboration and integration in healthcare to improve the diagnostic process. Optum, a leading provider of such solutions for the healthcare industry, developed a program called the Care Coordination Platform. It processes vast amounts of data and provides a comprehensive overview of every patient's full medical history, giving healthcare providers an immediate, complete picture of each patient. The platform suggests the most appropriate and cost-effective treatment options; identifies high-risk patients before symptoms occur; and uses adaptive algorithms that incorporate clinical data, claims, and socioeconomic figures ([8]: p. 2).

A recent study examined the effects of clinical decision-support systems (CDSSs) on practitioner performance and patient outcomes. It studied clinical decision-support tools made available to practitioners and patients, such as computer-generated clinical knowledge and patient-related information. When such data are filtered and made available at appropriate times, they were shown to enhance patient care. A CDSS can also send reminders, warnings, and test results; check for drug interactions, dosage errors, and contraindications; and list patients eligible for specific interventions such as immunizations and follow-ups [9].
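
As a hedged illustration of the rule-based checks just described, the sketch below encodes a tiny, invented interaction table and dose-range table. Real CDSSs draw on curated clinical knowledge bases, not hard-coded dictionaries; every drug, limit, and alert string here is a placeholder.

```python
# A hypothetical sketch of two classic CDSS checks: drug-interaction and
# dose-range lookups against a small local table. Illustrative data only.
from itertools import combinations

INTERACTIONS = {frozenset({"warfarin", "aspirin"}): "increased bleeding risk"}
DOSE_LIMITS_MG = {"warfarin": (1, 10), "aspirin": (75, 325), "metformin": (500, 2550)}

def check_prescription(drugs_mg: dict[str, float]) -> list[str]:
    """Return CDSS alerts for a prescription {drug: daily dose in mg}."""
    alerts = []
    # Check every pair of prescribed drugs against the interaction table.
    for a, b in combinations(drugs_mg, 2):
        if frozenset({a, b}) in INTERACTIONS:
            alerts.append(f"interaction {a}+{b}: {INTERACTIONS[frozenset({a, b})]}")
    # Check each dose against its (placeholder) allowed range.
    for drug, dose in drugs_mg.items():
        low, high = DOSE_LIMITS_MG.get(drug, (None, None))
        if low is not None and not (low <= dose <= high):
            alerts.append(f"dose alert: {drug} {dose} mg outside {low}-{high} mg")
    return alerts

print(check_prescription({"warfarin": 5, "aspirin": 500}))
# -> an interaction warning plus an aspirin dose alert
```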

The study found that CDSSs requiring large amounts of data entry adversely affect physician satisfaction and use of the system. When the large amounts of data a CDSS needs to be effective are incomplete, diagnoses are less accurate, or completing the data takes longer, delaying the delivery of accurate advice. Anticoagulant-prescribing CDSSs are a case in point: the data required are more complex, and patient variance is higher.

CDSSs requiring only a limited number of patient data items as input were the most used and the most clinically successful. Examples include preventive-care reminder systems for routine tasks such as blood pressure tests, Pap smears, and vaccinations.

The study concluded that CDSSs become more effective as their advice becomes more specific and sensitive; at the same time, manual data input must be minimized, and the advice must be available in a timely manner to be relevant for physician use.


6. Advanced genetic technologies

Personalized medicine is making an impact in advanced genetic technologies as well. Genome modulation (modification), in particular, has an array of applications, from energy, food, and industry to medicine. Researchers are turning to genome modulation in the hope that it will provide the key to understanding and answering some of life's most difficult and challenging questions.

Applied medically, genome modulation has been known as gene therapy but has evolved, with new technologies, into the science of gene editing. At the forefront of this technology is what is now known as Clustered Regularly Interspaced Short Palindromic Repeats, or CRISPR. Experts claim CRISPR has brought with it new streams of business based on its cutting-edge technologies [10].

One recent example of CRISPR technology application is the correction of blood clotting problems in newborn and adult mice, with marked success. The aim is to cure the majority of patients with hemophilia B with CRISPR-based gene targeting [10].

Growing interest in CRISPR technology is speeding its transition into research, clinical trials, and applications in humans, and it was recently tested on a human being for the first time: in China, a patient diagnosed with terminal lung cancer was treated with CRISPR gene-editing therapy as part of a clinical trial. Meanwhile, clinical trials using CRISPR technology are underway in the United States.

CRISPR technology has opened channels for business, but many daunting challenges remain before its application can be realized clinically. One such hurdle, if not the most significant, is regulation. Personalized medicine, including gene-editing technologies, operates in a heavily regulated environment built on peer-reviewed publications and clinical trials. Even in cooperation with the FDA, for example, it can still take a new technology 20 years to be approved.


7. Robotics and advanced medical devices

Another way in which personalized medicine is driven by advancements in technology is healthcare robotics. The introduction of robotics into healthcare is driven by the desire to improve quality and safety and to control expenditure. Surgical robots, service robots, companion robots, cognitive-therapy robots, robotic limbs and exoskeletons, humanoids, and rehabilitation robots are just a few of the applied areas already making use of this technology.

Despite clear advantages and a promising, growing future of robotics in healthcare and in medical devices, there is a need for a robotics strategy that addresses concerns and challenges. Patient and cultural perceptions, liability rules, and ethical debates present challenges to the integration and development of robotics in healthcare.

A recent study suggested that a deliberative approach is needed to find a balance between developing overarching rules in this industry and allowing innovation to flourish, and that robots and robotic devices should be viewed as “augmenting human capabilities and empowering professionals in their role” so that patients would have a more positive perception of robotics in their healthcare settings [11].

Another recent study suggests that robotics lags behind its healthcare potential primarily because the industry has yet to live up to a primary principle of cybernetics: according to this theory, robots and robotic devices should achieve a high level of adaptation and reaction to their environments, resulting in complete interaction between humans and robots [12]. The study analyzed robotics-assisted surgery, rehabilitation, prosthetics, and companion systems.

Across all of these areas, the study concluded that realizing the true potential of robotics requires a much greater degree of customization, defined as the robotic technology's adaptation to clinicians and patients. The authors argue that existing robotic systems are limited in their capacity for customization, which greatly limits their practical use in healthcare. The idea is that technology should adapt to users, rather than forcing users to adapt to technology.

Despite implicit or even explicit claims that robotic systems are superior to more traditional methods in healthcare, the clear advantage of these systems is currently unproven and highly dependent on the skills of the users. The success of such technologies therefore remains heavily dependent on adequate training and experience.


8. 3D printing drugs

As our last example in this chapter of the impact of technology on medicine, consider the "3D" printing (3DP) of oral drugs. While it may sound novel and revolutionary, drug manufacturing using 3DP technology is actually a combination of well-established technologies first developed to meet the needs of engineering prototyping [13], namely building objects through sequentially added layers.

There are a few driving forces behind the 3DP of oral drugs: personalization, on-demand capability, and the ability to manufacture drugs in new, decentralized locations. A recent study suggests that the key to utilizing 3DP technology successfully in healthcare and medicine is to maximize patient benefit while maintaining production efficiency, and that 3DP has a proven track record. As such, the authors argue, its future is clearly viable in three fields: preclinical work within a pharmaceutics framework; innovative drug-delivery concepts; and decentralized drug manufacturing [13].
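
To give a feel for the personalization argument, here is a back-of-the-envelope sketch translating a patient-specific dose into printed geometry. The drug loading, layer height, and tablet footprint are invented values for illustration, not parameters of any real printer or formulation.

```python
# An illustrative dose-to-geometry calculation for 3D-printed oral drugs:
# given a printed material with a known (hypothetical) drug loading, compute
# the tablet volume and layer count needed for a patient-specific dose.
import math

DRUG_LOAD_MG_PER_MM3 = 0.25  # hypothetical drug content of printed material
LAYER_HEIGHT_MM = 0.2        # hypothetical printer layer height
TABLET_RADIUS_MM = 5.0       # hypothetical cylindrical tablet footprint

def layers_for_dose(dose_mg: float) -> int:
    """Layers of a cylindrical tablet needed to carry a target dose."""
    volume_mm3 = dose_mg / DRUG_LOAD_MG_PER_MM3
    height_mm = volume_mm3 / (math.pi * TABLET_RADIUS_MM ** 2)
    return math.ceil(height_mm / LAYER_HEIGHT_MM)

for dose in (5, 10, 20):  # patient-specific doses in mg
    print(f"{dose} mg -> {layers_for_dose(dose)} layers")
```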

Although this technology is currently a niche and not an alternative to mainstream mass-production processes, there is a clear place for 3DP in healthcare and medicine, and its role will be defined more clearly in the future by incorporating considerations such as the ideal population product profile, drug formulation and engineering, and the management of regulatory and supply-chain factors [13].


9. How to engage the patient in human-machine innovation?

Owing to the technological advances described above, along with the growing need for smarter, preventive, more accurate, and more effective medicine, the healthcare industry is advancing into the digital age: the digital health revolution. This revolution is made possible by advances in medical information technologies: information storage, data analysis, mobile devices, sensors, and genetic information. Together, these enable the capture and analysis of vast amounts of information about patients, populations, and the environments and lifestyles in which they live, and thus the tailoring of personalized treatment accordingly.

The technological advances facilitating personalized medicine make it possible to tackle major challenges of the health system and of chronic disease [3]. Chronic diseases are the major financial burden on the healthcare system because most chronic disorders develop outside healthcare settings, and patients with these conditions require continuous interventions to make the behavioral and lifestyle changes needed to manage the disease effectively.

The challenge with chronic diseases is the transition from health to disease with late-onset symptoms that can be irreversible. Fortunately, the majority of chronic diseases can be prevented or delayed through interventions such as those described above, extending the health span (the duration of life spent in a state of wellness, free of disease). Current chronic disease management is characterized by fragmented interventions, communication, and recommendations from specialists, which typically begin only after the onset of disease symptoms. While an individual is still free of symptoms, preventive activities are managed mostly by individuals themselves.

Growing evidence links patient activation, defined as patients' willingness and ability to take independent actions to manage their health and care, to their health and cost outcomes. Methods and tools therefore need to be developed to increase patient activation and engagement and to accelerate the needed behavior change.

Combining the design-thinking approach with behavioral economics can motivate people to change their current health-related habits and improve their health. This underscores the need to devise a personalized, preventive medical infrastructure whose recommendations and motivation mechanisms are drawn from behavioral economics.

Behavioral economics aims to understand the irrational human decision processes that underpin suboptimal outcomes, which in our context translate into unhealthy behavior patterns. In recent years, government agencies around the world have employed behavioral economics models and methods as complements to the standard public-policy tools implemented by decision-makers. These measures, based on "Nudge" theory [14], are used to prevent policy-implementation failures and to positively influence motivation and decision-making by individuals and groups. Thus far, the theory has inspired a variety of applications in areas such as education, health, safety, and the environment. Extensive applied research performed in the UK by the Department for Environment, Food & Rural Affairs [15] has outlined nine principles influencing human behavior, based on research in social psychology and behavioral economics.

Integrating elements of persuasive technologies that support extrinsic motivation factors stemming from communication and social aspects, such as incentives and norms, will have a great impact on implementation and patient engagement. These technologies provide effective means of operationalizing "Nudge" theory, for example by producing email messages that raise awareness of required assignments, delivering informative messages related to the performance of those assignments, and promoting a climate that reflects social norms within online social networks. Studies have shown that nudging can also incorporate approaches that change physical or social environments to increase the likelihood of certain behaviors. This can include providing social-norm feedback that increases the likelihood of healthy behaviors, altering the defaults around how food and drinks are served, or even changing the layout of buildings to encourage physical activity ([16]: p. 263). Nudging focuses on a set of simple, low-cost remedies that may not require any legislation and can be used to address many problems emanating from human conduct. On the other hand, nudging can also reinforce behaviors that worsen the health of individuals ([16]: p. 264). For instance, food products may be labeled as healthy, causing consumers to ignore their energy content, which may lead to excessive consumption of such products.
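
A minimal sketch of how such a social-norm nudge message might be generated from a patient's adherence data. The thresholds, peer statistic, and wording below are illustrative assumptions, not a validated intervention.

```python
# A hypothetical sketch of "Nudge"-style messaging: frame a reminder with
# social-norm feedback when a patient's adherence lags that of peers.
def nudge_message(name: str, adherence: float, peer_adherence: float) -> str:
    """Pick a short engagement message from (invented) adherence figures."""
    if adherence >= peer_adherence:
        return f"{name}, great work - you're keeping up with your care plan."
    gap = round(100 * (peer_adherence - adherence))
    # Social-norm framing: compare to similar patients rather than scolding.
    return (f"{name}, patients like you complete about {gap}% more of their "
            f"weekly activity goals - a short walk today would close the gap.")

print(nudge_message("Dana", adherence=0.55, peer_adherence=0.80))
```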

The value of technologies that increase patient activation and engagement is paramount given the increasing incidence of chronic diseases. Developing "patient-centered" technologies will therefore increase their adoption and diffusion.

References

  1. Deloitte. Embracing the Future of Work to Unlock R&D Productivity. 2018. Available from: https://www2.deloitte.com/uk/en/pages/life-sciences-and-healthcare/articles/measuring-return-from-pharmaceutical-innovation.html
  2. Sagner M, McNeil A, Puska P, et al. The P4 health spectrum—A predictive, preventive, personalized and participatory continuum for promoting healthspan. Progress in Cardiovascular Diseases. 2016;59(5):506-521
  3. Kvedar JC. Digital medicine's march on chronic disease. Nature Biotechnology. 2016;34:239-246
  4. Dilsizian ME, Siegel EL. Machine meets biology: A primer on artificial intelligence in cardiology and cardiac imaging. Current Cardiology Reports. 2018;20:139
  5. Fröhlich H, et al. From hype to reality: Data science enabling personalized medicine. BMC Medicine. 2018;16:150. DOI: 10.1186/s12916-018-1122-7
  6. Allen LN, Christie GP. The emergence of personalized health technology. Journal of Medical Internet Research. 2016;18(5):e99
  7. Phillips MA, Harrington TS, Srai JS. Convergent innovation in emerging healthcare technology ecosystems: Addressing complexity and integration. Technology Innovation Management Review. 2017;7(9):44
  8. Ainslie J. An Epidemic of Misdiagnosis: Using AI to Solve a Quiet Crisis in Healthcare. AI Business; 2018. p. 1
  9. Jaspers MWM, Smeulers M, Vermeulen H, Peute LW. Effects of clinical decision-support systems on practitioner performance and patient outcomes: A synthesis of high-quality systematic review findings. Journal of the American Medical Informatics Association. 2011;18:327-334
  10. Stangelli J. CRISPR Tech Heralds Hype, Hope, and Hurdles for Gene-Based Therapeutics. Bio-IT World. 2016. Available from: http://www.bio-itworld.com/2016/12/12/crispr-tech-heralds-hype-hope-and-hurdlesfor-gene-based-therapeutics-or-how-to-cure-cancer-and-aids-with-this-one-weirdtrick.aspx [Accessed: 20 December 2016]
  11. Cresswell K, Cunningham-Burley S, Sheikh A. Health care robotics: Qualitative exploration of key challenges and future directions. Journal of Medical Internet Research. 2018;20(7):e10410. DOI: 10.2196/10410
  12. Andrade AO, Pereira AA, Walter S, Almeida R, Loureiro R, Compagna D, et al. Bridging the gap between robotic technology and health care. Biomedical Signal Processing and Control. 2014;10:65-78
  13. Hsiao WK, Lorber B, Reitsamer H, Khinast J. 3D printing of oral drugs: A new reality or hype? Expert Opinion on Drug Delivery. 2018;15(1):1-4. DOI: 10.1080/17425247.2017.1371698
  14. Thaler RH, Sunstein CB. Nudge. New Haven, CT: Yale University Press; 2008
  15. Department for Environment, Food and Rural Affairs. A Framework for Pro-environmental Behaviours. 2008. Available from: www.defra.gov.uk
  16. Marteau TM, Ogilvie D, Roland M, Suhrcke M, Kelly MP. Judging nudging: Can nudging improve population health? BMJ. 2011;342:263-265
