The Covid-19 pandemic has laid bare the deficiencies of health systems around the world, exacerbating systemic issues and underlining the need to address the growing demands of an ageing population. Systems are struggling as a result of long-term underinvestment; piecemeal funding designed to manage rather than solve fundamental problems; an approach geared towards treating disease rather than optimising health; and a failure to fully embrace technology.
Offering hope and real solutions to these problems, technology should be the centrepiece of a new health-care model. Harnessing both the power of data and advances in biotech can create a more predictive, preventative, personalised and participatory model, with the power to radically improve outcomes, dramatically reduce costs and bring about broader economic benefits. It is no longer sufficient to simply manage rising demands; we must ask how we reduce that demand too.
The technologies of today and tomorrow will be critical in revolutionising health alongside wider reforms. They will include:
Systems of record that can bring together comprehensive personalised health data in a secure way to reveal unprecedented insights about our own health and that of the population as a whole.
Systems of engagement that reimagine how we interact with health-care systems to make them more efficient and accessible.
Systems of intelligence that can utilise our individual health data to inform personalised treatments.
Systems of collaboration that can drive clinical research to develop novel treatments and help protect us from global health threats.
To realise this future, governments and their leaders should:
Invest over the long term in transformative strategies with data and technology at their heart. This means considering the economic case for investing in health over a multi-decade horizon while committing to and incentivising significant, long-term capital investment.
Focus on the foundations of a platform that will support a tech-enabled future of health. This means delivering foundational physical and digital infrastructure as well as digital skills on which technology can be built and exploited for the future of health delivery.
Prioritise the patient population and their access and engagement with digital tools. This means addressing the trust deficit by empowering patients through new levers of accountability, enabled by data and technology, as well as providing them with the technology that gives them greater autonomy over their own health.
By focusing on these three areas, governments around the world can radically improve the health of their populations, address long-standing health inequalities, reduce fiscal burdens and unlock the enormous economic potential of a healthier population.
From decoding the genome to unravelling how proteins fold, humans have made unparalleled progress in our ability to understand and influence biology since the turn of the 21st century. Simultaneously, advances in cloud computing, software development and artificial intelligence (AI) have given us unmatched powers to assemble and analyse data at a scale and level of sophistication previously unimaginable.
While these breakthroughs present a rare opportunity to transform health, the case for doing so is now one of necessity for many countries around the world, with their public-health systems under unprecedented pressure as they try to recover from a pandemic that is not yet over. In the UK, for instance, a record seven million people are waiting for elective care and emergency care is in crisis. But the pandemic is not solely to blame. In reality, Covid-19 has simply exacerbated the systemic challenges inherent in untenable 20th-century health-care models.
Taking a longer-term view, the figures are even more concerning. By 2024, total health expenditure in the UK, for example, will account for 44 per cent of day-to-day public-service spending. Household spending on private health care has almost doubled over the past decade and is fast approaching rates seen in the United States. Meanwhile, health outcomes have stalled or declined. Life expectancy has fallen for the first time in decades after plateauing over the past 10 years and health inequalities have widened. A similar picture can be seen in other developed countries. The United States spends at least twice as much on health care per capita as most other high-income countries, yet life expectancy there is equivalent or shorter, with significant disparities by geography and race.
The gap between life expectancy and healthy-life expectancy is also growing in many countries as the population ages and susceptibility to chronic diseases – such as cancer, diabetes, cardiovascular diseases and respiratory illnesses – increases. In advanced economies, declining birth rates combined with ageing populations will continue to push health-care costs higher against an eroding tax base. For developing nations, the need for improved health outcomes is even more acute. Life expectancy in low-income countries lags behind high-income countries by more than 20 years, according to the World Bank, and overall health is often poorer.
We must start to prioritise long-term strategies that can radically change our approach to health and reduce the demand on services tomorrow. Rather than rely on reactive strategies, we must exploit the technologies at our disposal and invest in those of tomorrow to shift health care towards maintenance and optimisation of health. Achieving this paradigm shift will require decisive action, strategic investment and public buy-in, but the rewards for countries will be radical improvements to health outcomes, a reduction in long-term health-care costs and significant, broader economic benefits.
What Architecture Will Be Needed?
To design architecture for the future of health care, we need new operating models and principles. This means integrating systems to build population-level insights, but also giving more autonomy to the individual. It will require greater deployment of technologies, such as wearables and robotics, but also the realisation of digital services and biomedical data infrastructure to stimulate drug discovery, enable proactive populational health management and reduce the costs of treatment.
The democratisation of health is one of the world’s most pressing challenges. While Covid-19 may have accelerated action – including the use of data and technologies – it has also reinforced the huge inadequacies of systems and highlighted the need for much faster reform. Building a new health platform for the 21st century will require a new approach at individual, domestic and international levels. This will involve reordering health around a core set of principles that enable individuals to own their health data, and governments to access both identifiable and de-identified data. Those principles are:
To be predictive, so that information from genomics, wearables, electronic health records (EHRs) and social determinants can provide early warnings about the risk, onset and progression of disease.
To be preventative, so that data can be used to take informed actions whether through therapeutics and treatment, lifestyle changes or populational health interventions.
To be personalised, so that treatment is based on individual profiles and health baselines rather than overgeneralised guidance.
To be participatory, so that individuals have a more meaningful relationship with their health-care providers as well as with their own health and data.
While getting to this point requires system-wide reform, it is important that technology is not seen as a substitute but as a complement to the clinical care that doctors and nurses provide. Technology should act as an enabling function that frees up time for professionals so they can focus on the patient. In essence, it should augment our ability to provide deep and quality care for all.
This new model, which would see much greater symbiosis between humans and technology, involves thinking about data as a digital specimen, with systems of record and engagement collecting and structuring data before artificial intelligence and systems of intelligence provide the insights needed to transform health for people. And to work beyond the individual borders of countries, we need systems of collaboration to harness our collective will to solve global health issues. Getting all these aspects right is critical because a tsunami of data would do little to move the dial – and could even be harmful – if it is not structured, curated and held securely. We need better solutions to generate data, but storing, sharing and making sense of them will also be paramount.
Systems of Record
The starting points for any such system are the individual and the building of a digitised, personal health record. It should connect to and include data from EHRs and, over time, also integrate personal data from genomics, multi-omics, wearable devices and social determinants of health information. Building digital-health profiles for citizens will not only offer unparalleled insights into our individual health to enable personalised treatments, they will also provide population-wide learnings, which can be used to inform research and public-health policy, enabling a more efficient health system. Each person should have ownership and autonomy over their own personal health record (and those of their dependants), with the ability to share details and to consent to allow research access – as they choose.
Digital-health ID and digital-health twins: A digital-health ID – a unique identifier that provides access to digital-health data and can be stored on a smartphone or similar device – is central to this concept. This is an area in which Estonia leads the way through its e-Health policy. Stored electronically, health records are presented in a standard format via a patient portal. Doctors can access the records from any clinic, emergency responders can use a patient’s ID code to read critical information, such as blood type and allergies, and prescriptions are issued electronically. For security, blockchain is used: the system creates an indelible record each time anyone accesses it, and penalties are issued for doing so without consent. Building on this, each person should be able to construct their own digital-health twin – a digital version of themselves linked to their ID that can be used to better visualise and model their health as technology develops.

The primary cost saving would result from system integration and efficiencies, but such infrastructure has to be seen as a critical enabler of health systems too. The seamless integration of physical and digital services must be underpinned by the right infrastructure. EHRs and interoperable data entry are critical to providing the best patient care, reducing burdens on physicians and enabling more sustainable health systems. A 2020 analysis by the Office of the National Coordinator for Health Information Technology (ONC) in the United States estimated the annual economic benefits of better electronic data sharing at between $1 billion and $5 billion a year, through reductions in duplicate testing as well as avoided emergency-room visits, hospitalisations, readmissions and adverse drug events.
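The audit mechanism on which Estonia’s model relies can be illustrated with a short sketch. This is a hypothetical, simplified hash-chained log (the class and method names are invented for illustration), not the actual e-Health implementation, but it shows the core idea: every access appends an entry whose hash depends on the previous entry, so any retrospective tampering breaks the chain and is detectable.

```python
import hashlib
import json
import time

class AccessLog:
    """Append-only, hash-chained log of health-record accesses.

    Illustrative only: a real deployment would distribute and anchor
    the chain; this sketch shows just the tamper-evidence property.
    """

    def __init__(self):
        self.entries = []

    def record_access(self, accessor_id: str, patient_id: str, purpose: str) -> dict:
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {
            "accessor": accessor_id,
            "patient": patient_id,
            "purpose": purpose,
            "timestamp": time.time(),
            "prev_hash": prev_hash,
        }
        # Hash the entry (which embeds the previous hash) to extend the chain.
        body["hash"] = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append(body)
        return body

    def verify(self) -> bool:
        """Recompute every hash; any edit to a past entry breaks the chain."""
        prev = "0" * 64
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            if body["prev_hash"] != prev:
                return False
            recomputed = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if recomputed != e["hash"]:
                return False
            prev = e["hash"]
        return True
```

Each entry records who accessed whose record and why; auditing is then a matter of replaying the chain rather than trusting any one database administrator.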
Genomics: The genetic material we inherit can significantly increase or reduce our predisposition to disease. The ability to read our genomic profiles is critical to understanding our individual disease-risk profile and can inform medical decisions to predict, prevent or treat certain conditions, including whether novel gene therapies could be appropriate. At a population level, aggregated profiles provide invaluable datasets that are enabling us to gain a more detailed understanding of the causes of disease, particularly those that are rare or complex. Next-generation sequencing technologies have reduced the cost of whole-genome sequencing to less than £1,000 per genome (100,000 times cheaper than 20 years ago) while the recent emergence of third-generation sequencing, using nanopore technology, could reduce this further. Moreover, nanopore technology has cut the time needed to sequence a whole genome to a matter of hours, with mobile platforms enabling sequencing outside traditional lab settings, such as at the bedside, providing the opportunity to integrate it into routine clinical care. The UK is leading the way in this field and has committed to sequencing the genomes of one million volunteers through Genomics England and UK Biobank, with the intention of increasing this to 5 million genomes by 2023. As costs continue to fall, it will become increasingly desirable to make individual genomic profiling a routine part of health care.
The cost of whole-genome sequencing has reduced significantly over the past 20 years
Source: Our World in Data. Moore's Law – the observation that computing power doubles roughly every two years – is shown for comparison
Multi-omics: Building on genomics to measure a more comprehensive set of biomolecules (including levels of mRNA, proteins and epigenetic markers) in tissues, multi-omics provides a more detailed view of the molecular composition of our cells and therefore the causes of health and disease. As with genomics, this has enormous diagnostic and therapeutic potential for individuals, but with much higher-resolution data that can provide more valuable medical insights. Beyond individuals, aggregated multi-omics data will produce powerful population datasets that will provide unprecedented insights into the causes and mechanisms of disease. In the United States, The Cancer Genome Atlas (TCGA) dataset contains over 2.5 petabytes of genomic, epigenomic, transcriptomic and proteomic data for 33 tumour types, and this programme is already leading to significant improvements in cancer classification, diagnosis and treatment. In addition, many of the identified molecular subtypes of cancer can now be targeted by available drugs, with others being revealed as potential targets for drug development. Personal, longitudinal, multi-omic profiling is the ultimate goal and has already been achieved for a handful of individuals, but this technology is still in its infancy. Therefore, governments must look to foster its development to drive down costs and enable its adoption at scale, as has been the case with genomics.
Wearables: Around 40 per cent of UK consumers and 58 per cent in the United States have access to a smartwatch or a fitness tracker. Wearables, which are being adopted as quickly as smartphones and could be as ubiquitous in 20 years’ time, present extraordinary opportunities. These devices can provide accurate, real-time monitoring of biological measures such as body temperature, heart rate and blood oxygen, helping to build long-term datasets that provide an understanding of a patient’s personal baselines. Wearables have recently been used to detect the onset of certain illnesses, including Covid-19, through deviations in an individual’s personal baselines, while the UK’s National Health Service (NHS) is starting to use smartwatches for remote monitoring purposes, such as the recent rollout of devices to patients with Parkinson’s disease.

The ability of wearables to provide lifestyle feedback and to serve as early-warning systems will be essential to preventative and personalised medicine. As their sophistication grows, the cost savings from improved lifestyles and earlier diagnoses could be profound. With simple measures accessible today, wearables are likely to be able to detect a range of metabolic, cardiovascular, pulmonary and psychological diseases in the near future and, in time, will be able to monitor an array of metrics to provide more granular real-time readouts of an individual’s health status. Given how profoundly they could benefit individual health and ease the burden on health-care systems, governments should support wider adoption of these technologies, both through public health-care systems and more generalised approaches, such as salary-sacrifice schemes. In the UK, a HeadUp Systems pilot promotes the monitoring of step count and diet, providing incentives such as gym passes for activity, while Singapore’s Health Promotion Board has begun using wearables to promote healthier lifestyles.
Wearables will become an essential component of preventative and personalised medicine
Source: PLOS Medicine
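The personal-baseline approach described above can be sketched as a simple deviation check. This is an illustrative toy, assuming a single metric (resting heart rate) and a basic z-score test; real early-warning systems use far richer models across many signals.

```python
from statistics import mean, stdev

def flag_deviation(history, latest, z_threshold=3.0):
    """Flag a wearable reading that deviates from a personal baseline.

    Illustrative sketch: applies a z-score to one metric against the
    individual's own history, rather than a population-wide norm.
    """
    baseline = mean(history)
    spread = stdev(history)
    if spread == 0:
        return latest != baseline
    z = abs(latest - baseline) / spread
    return z >= z_threshold

# Ten days of a steady ~60 bpm resting heart rate (illustrative data)
resting_hr = [59, 61, 60, 62, 58, 60, 61, 59, 60, 62]
```

The point of the personal baseline is that 74 bpm may be unremarkable for the population but a strong anomaly for this individual, which is what makes wearables useful for early detection of illness onset.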
Social determinants of health (SDOH) data: Social and environmental conditions can have an enormous impact on individuals, their health and their risk of disease. A recent study estimated that medical care may account for only 20 per cent of attributable health outcomes in a population, with SDOHs making up the remainder (excluding genetic factors). It follows, then, that systematised collection of SDOH data can provide important context on patients by helping clinicians make social-service referrals when appropriate, allowing researchers to assess the relationships between SDOH and disease, and supporting public-health practitioners to design populational health-management models that are effective, precise and financially viable.
EHR systems: EHRs are electronic databases that enable the storage of medical records, at patient and population levels, in a digital format. They allow a patient’s existing and historical information – including basic physiological statistics, medication and immunisation records, medical scans, test results and personal data – to be shared and accessed across different health-care settings to help inform medical decisions. When considering implementation of EHR systems, technical, operational and privacy-related barriers need to be resolved to ensure a good level of interoperability so that patient data is accessible between different health-care providers. Recent advances in cloud computing and federated data platforms offer potential solutions to interoperability issues while promising long-term efficiencies, both from an operational perspective and through the improved individual health outcomes resulting from better decision-making.

Interconnected EHRs can also provide populational datasets that, in combination with those from biobanks and SDOH data, can be used to inform medical research and public-health policy. Advancements in AI will enable rapid and advanced analysis of such datasets to offer deep insights into the genetic and environmental factors that contribute to health and disease.
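The interoperability point can be made concrete with a minimal sketch. The record below is loosely modelled on the HL7 FHIR “Patient” resource (an illustrative subset, not a conformant implementation): once providers agree on a common resource shape, reconciling records becomes a mechanical merge rather than manual transcription.

```python
# A minimal patient summary loosely modelled on the HL7 FHIR "Patient"
# resource; field names follow FHIR conventions but this is an
# illustrative subset only.
patient = {
    "resourceType": "Patient",
    "id": "example-001",
    "name": [{"family": "Smith", "given": ["Anna"]}],
    "birthDate": "1984-03-12",
}

def merge_records(primary: dict, incoming: dict) -> dict:
    """Combine two copies of a shared record, preferring primary values.

    With a shared schema, the primary provider's fields take precedence
    while any extra fields held by the incoming provider are retained.
    """
    merged = dict(incoming)
    merged.update(primary)  # primary provider's values win on conflicts
    return merged

# A hospital's copy adds a field the GP record lacks (hypothetical data)
hospital_copy = {**patient, "generalPractitioner": [{"display": "Dr Jones"}]}
shared = merge_records(patient, hospital_copy)
```

The hard part in practice is not the merge but agreeing the schema, resolving identity across providers and governing consent, which is why standards and federated platforms matter more than any single database.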
Systems of Engagement
The way we engage with public-health systems has been evolving over the past 20 years as internet and smartphone usage has grown exponentially, with the pandemic more recently accelerating this pace of change. We have become more comfortable interacting with systems outside traditional settings, including telemedicine and home-testing kits. Technology offers the opportunity to move closer to a more decentralised, efficient and accessible model of health care that provides more autonomy to the individual.
Moving forwards, we should take particular advantage of the behavioural changes resulting from the pandemic:
Telemedicine: The delivery of health services via digital platforms can make contact between patients and providers quicker and easier. During the pandemic, the use of telemedicine increased dramatically in many countries. In the UK, for example, primary-care appointments delivered over the phone or online increased from 15 per cent in December 2019 to more than 90 per cent by May 2020, according to NHS Digital. There has also been an enormous surge in uptake of online services among patients. More than 28 million people have downloaded the NHS App, enabling them to access advice, book appointments and order prescriptions more efficiently. However, its use is still limited – for example, only one in 1,000 GP appointments in the UK is booked through the app – and its functionality could be greatly extended, whether by notifying patients of clinical trials for which they are eligible or by linking data derived from wearable devices. Similarly, the US government updated its regulation in response to Covid-19 to enable providers to use telemedicine so that patients would not have to travel for certain types of appointment. Unsurprisingly, the use of telemedicine and investment in digital health rose greatly during the pandemic in the United States. This means there is now a golden opportunity to build on the success of such digital portals, positioning them as a much broader doorway for interaction with health services. Driving up the user base and usage rates (particularly among older demographics) as well as improving the functionality of such systems will not only bring operational efficiencies but also represent a decisive step towards comprehensively digitising health records.
Patient-centred communications systems: Health care is increasingly delivered by teams across many settings, yet those involved in a patient’s care struggle to communicate with one another and indeed the patient. For instance, health-care professionals in the UK typically spend two hours on administration (referrals, coordinating care and chasing results) – in other words, trying to communicate with one another – for every hour they spend with patients. This is in large part a result of the antiquated communications systems still used in health care, such as post and fax. Even email struggles to solve the problem because professionals cannot gain a comprehensive view of the whole care pathway or who has been involved in it.

A patient-centred communications system would allow health-care professionals to simply look up their patient’s details, view the care pathway and instantly communicate with anyone else involved. This would reduce time spent on administration. Users of Accurx, a UK start-up that offers such a system, have reported saving 2.3 hours per week on communicating with patients – and that is before taking into account the productivity benefits gained from the time saved communicating with other professionals. Such an approach could have significant wider system benefits too, whether through the reduction of referral times that are still dependent on letters or the avoidance of unnecessary accident-and-emergency visits because the appropriate care provider has been accessed instantly and successfully. Integrated care cannot be achieved without integrated communications between those involved in a patient’s care. A patient-centred communications system would solve this problem and provide the platform for new technologies to be seamlessly integrated into care pathways.
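The pattern described above can be sketched in a few lines. This is a hypothetical design, not Accurx’s actual system: the key idea is that messages are keyed by patient rather than by sender, so every professional involved sees the whole pathway in one place.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Message:
    sender: str  # e.g. "gp", "physio", "patient"
    text: str

class CareThread:
    """One message thread per patient, visible to the whole care team.

    Hypothetical sketch: keying communication by patient rather than
    by sender or organisation makes the full care pathway legible.
    """

    def __init__(self):
        self.threads = defaultdict(list)

    def post(self, patient_id: str, sender: str, text: str):
        self.threads[patient_id].append(Message(sender, text))

    def pathway(self, patient_id: str):
        """Everything said about this patient, in chronological order."""
        return self.threads[patient_id]
```

Contrast this with email, where each professional holds a private, partial slice of the conversation and no one can reconstruct who has been involved in the patient’s care.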
Remote monitoring and virtual wards: With hospitals facing significant capacity issues, technology holds the power to reduce the burden on care teams by enabling the management of patients through virtual wards, increasing accessibility and improving the ability to triage. This enhances continuity of care, allowing clinicians to prioritise their patients based on early identification of deterioration in individuals while reducing unnecessary complications, accident-and-emergency visits and outpatient appointments. As an example, the NHS has recently provided oximeters to Covid patients so they can monitor their blood-oxygen levels from home, reducing burdens on both patients and health-care providers. Such technology has significant applications for older patients in care settings although it could also enable at-home care for illnesses ranging from musculoskeletal to cardiovascular diseases, cancer to diabetes and mental-health conditions. The use of remote technology has also increased dramatically during the pandemic, with 24 remote projects seeking to cover more than 300,000 patients already underway in the UK this year. Estimates in the United States suggest uptake of such technology could reduce hospitalisations by as much as 40 per cent for heart disease while another study forecasts cost savings of up to $6 billion a year in the country.
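A virtual-ward triage rule of the kind used with home oximeters can be sketched as follows. The thresholds are illustrative placeholders, not clinical guidance; a real deployment would use locally approved escalation criteria.

```python
def triage_oximeter(spo2_readings, alert_below=94, urgent_below=92):
    """Classify home pulse-oximeter readings for a virtual ward.

    Illustrative sketch: thresholds and categories are placeholders,
    not clinical guidance.
    """
    latest = spo2_readings[-1]
    if latest < urgent_below:
        return "urgent"   # escalate to a clinician immediately
    if latest < alert_below:
        return "review"   # flag for same-day review
    return "routine"      # continue routine remote monitoring
```

Even a rule this simple illustrates the virtual-ward value proposition: deterioration is surfaced automatically across hundreds of patients, letting clinicians prioritise rather than poll everyone in turn.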
Decentralised diagnostics: Relocating diagnostics from traditional health-care settings to the home or virtual settings can lead to faster and cheaper results, and enable earlier detection of disease. Today, genomics and AI have an important role to play in creating new prediction-and-prevention models that mean health services can be there for people even before they are born, with data contributing to personalised care and more intelligent public-health models, and knowledge starting to empower individuals to co-create their own care. However, to achieve this genuine and disruptive step change, significant investment is required to support greater use of self-administered diagnostics as well as advanced technologies for point-of-care testing, digital pathology and radiography. Funding alone is not sufficient. The huge investment made in diagnostics to enable mass testing for Covid-19 is providing a unique opportunity to think beyond the crisis and ensure a lasting legacy that transforms how care is provided. In particular, the investment by government and industry in physical infrastructure (labs, manufacturing, distribution channels, digital devices and data) as well as the associated behaviour changes (home testing, awareness of individual and environmental risk factors, and use of digital enablers) paves the way for preventative public-health interventions beyond Covid-19.
Systems of Intelligence
While new models of health care enable us to harvest vast amounts of data, it is only by analysing and utilising them that we can realise fundamental changes. Advances in technology and biology are presenting profound opportunities to design new drugs, tailor cutting-edge personalised treatments and operate a more efficient health-care system. But many of these technologies are still immature and costly so health-care systems must foster their development to drive down costs.
AI: Presenting new virtual and physical opportunities in health, AI has the potential to change the future of diagnosis, drug discovery, epidemic management and even hospitals, including their administration and staffing. In the promising field of drug development and design, AI is increasingly being used to create new drugs, identify where existing treatments may be used for different diseases and design novel proteins with therapeutic purposes. For instance, recent advancements have enabled us to rapidly determine the three-dimensional structure of proteins so we can better understand their function and how to influence them. When used to analyse the massive datasets held by EHRs and those that could be created through the fields of genomics and multi-omics, AI represents a powerful and cost-effective strategy to uncover deep insights into therapeutics that could dramatically change health care and health outcomes. While drug development remains expensive today, costing between $1 billion and $5 billion for each new treatment and taking around ten years to bring to market, AI can dramatically improve the discovery phase of the process, as covered above. Another area for consideration is the clinical-trials process that follows discovery, which remains time-consuming and logistically complex. For this application, AI can help to improve patient identification and recruitment – a major hurdle to the success of clinical trials – as well as ongoing monitoring and dropout rates. It also brings more sophisticated analysis to post-market real-world data.
Furthermore, some AI methods can contribute to in silico trials (i.e., those that rely on virtual populations) designed to test the safety and efficacy of new drugs by helping researchers identify subpopulation variations and discover drivers of drug response.

Meanwhile, AI applications in diagnostics have been scaling up, with some of the biggest breakthroughs in radiology, where image-recognition capabilities are particularly powerful. For example, when Google’s DeepMind and London’s Moorfields Eye Hospital recently partnered to apply AI analysis to optical coherence tomography (OCT) scans, they were able to identify more than 50 eye diseases, including glaucoma, diabetic retinopathy and age-related macular degeneration. The AI proved correct in identifying diseases 94.5 per cent of the time – on a par with world-leading ophthalmologists. DeepMind has also used AI to analyse EHRs to detect acute kidney injury in patients up to 48 hours earlier than it is currently predicted, avoiding up to 30 per cent of cases through the early intervention of a doctor. Meanwhile, Kheiron Medical Technologies has partnered with the NHS to use AI to analyse mammograms and improve breast-cancer screenings, reducing both false positives and false negatives while releasing capacity among doctors.

When it comes to operations, a 2018 report by the UK’s Lord Darzi estimated that AI, combined with robotics, could save the NHS up to £12.5 billion a year by fully automating repetitive and administrative tasks, such as medical notes, appointments and prescriptions. In total, robots and AI systems could take on 31 per cent of GP workload, 23 per cent for hospital doctors and 29 per cent for nurses, according to the report. AI can also be used to triage patients more effectively, inform smart-energy solutions and simulate surgical procedures.
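The screening trade-off described above – reducing false positives and false negatives at the same time – is usually expressed through two standard metrics, sketched below. The figures in the test values are illustrative, not reported results from Kheiron or DeepMind.

```python
def screening_metrics(tp, fp, tn, fn):
    """Sensitivity and specificity from a screening confusion matrix.

    Reducing false negatives raises sensitivity; reducing false
    positives raises specificity - the twin goals of AI-assisted
    screening. Counts are illustrative placeholders.
    """
    sensitivity = tp / (tp + fn)  # share of true cases caught
    specificity = tn / (tn + fp)  # share of healthy cases cleared
    return sensitivity, specificity
```

An AI tool only "improves screening" if it moves both numbers in the right direction at once; improving one at the expense of the other simply trades missed cancers for unnecessary recalls.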
Employed more and more by large corporations to inform strategic workforce planning, recruitment and training, AI is also being deployed for advanced analytics on large datasets to forecast demand and future skills requirements. This application could be particularly relevant to those health services currently facing labour shortages and a lack of strategic workforce plans. Finally, AI is quickly finding a role in population-level health activities, including the surveillance and prediction of disease dynamics, and the targeting of interventions and risk management, with practitioners benefitting from the analysis of health data in real time.
Robotics: As they become ever more sophisticated, robotics offer a range of opportunities in precision surgery, rehabilitation, monitoring, care assistance and social interactions, for example by helping to combat loneliness in isolated patients. When deployed, these technologies will release capacity, deliver efficiencies and improve health care. Approximately one million robot-assisted surgeries were performed around the world last year. While the capital costs of robotic-assisted platforms remain high and savings are currently derived from better outcomes as a result of shorter surgeries and reduced post-surgical care, the price of robotics will come down to offer increasingly better value.
CRISPR technology and mRNA: The discovery and development of CRISPR technology has enabled the editing of DNA in extraordinarily precise ways. Coupled with the emergence of messenger RNA (mRNA) therapies during the pandemic, we now have access to a sophisticated set of tools that can manipulate cells to restore or change their function both within and outside the body in a controlled way. These breakthroughs open up unparalleled therapeutic possibilities. CRISPR gene-editing technology has already been used to treat some forms of blindness in patients while novel mRNA therapies have been effective in lab models at reprogramming immune cells to specifically target and repair damage caused by heart disease. Meanwhile, mRNA technology is producing novel vaccines (as seen globally during Covid-19) that offer the promise of new and more effective immunisation against a host of diseases. However, many of these therapies are prohibitively expensive with high upfront costs, so it is essential that public health-care systems develop novel procurement and commercial models that balance the need to incentivise innovation in these budding technologies with value for money for the taxpayer. Industry must also play its part by pricing reasonably and committing to innovative or risk-sharing commercial partnerships.
3D printing: The precision, reliability and material range of 3D printing have grown significantly in recent years. 3D printers already create custom-made prosthetic body parts using materials that are lighter, stronger and cheaper than those used in traditional manufacturing, and are typically much quicker to produce. Medical professionals and major medical-device companies, such as DePuy Synthes, are using 3D printers and information from scan data to produce precise, patient-specific models of bones, organs and tissues to improve diagnosis, plan operations and even practise surgeries, which can help to avoid complications and reduce errors. This technology can also create bespoke surgical tools for operations. Moreover, as stem-cell culture and tissue-engineering techniques become more advanced, 3D printers are being employed to “bioprint” organic structures for tissue and organ regeneration. 3D printing will play an integral role in personalised medicine, which is why its widespread adoption, regulation (a key challenge) and distribution must be a prime focus for health-care systems around the world.
Smart hospitals: Uniting many of the innovations above, smart hospitals connect clinical technologies in an integrated environment to provide better health care and patient experiences, as well as improved operational efficiency and workflow when compared to traditional hospitals. Many of the top-ranked hospitals in the world are considered “smart” and will play a critical role in the future of health-service delivery. The United States and China lead the world in smart-hospital adoption.
Systems of Collaboration
The pandemic has underlined our susceptibility to global health challenges that require coordinated efforts across national boundaries to overcome. By pooling resources and risk on a regional and global level, we can more rapidly bring novel therapies to market, with data and technology key to these efforts.
Pathogen-surveillance systems: During Covid-19, systems and platforms such as the Global Pathogen Analysis Service and GISAID, which help to identify and monitor the emergence of novel pathogens and infectious diseases, have played a significant role in assessing risks to global health and informing public-health policy across borders to limit their impact. The advent of next-generation sequencing technologies and cloud-computing services has further facilitated the development of such systems, which must now be expanded to ensure global coverage, hand in hand with the implementation of necessary governance structures, as the Global Health Security Consortium has advised. Given the UK’s contribution to genomic surveillance, its government should proactively seek to take a leading role in establishing global pathogen-surveillance systems, as well as vaccine-manufacturing capacity, to respond to emerging threats.
Health passports: The Covid-19 pandemic renewed interest in digital-health passports (i.e., digital credentials carrying vaccination and testing-status information) as a way to safely restore economic activities such as travel, hospitality, tourism and sports. However, having multiple bodies seeking to create their own versions of digital passports risks creating disparate systems that lack universal applicability. This underscores the need for agreed-upon standards and interoperability guidelines between issuers and verifiers of digital-health passports to ensure international compatibility.
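The issuer–verifier problem above can be made concrete with a toy sketch: a credential is only checkable if both parties agree on the payload schema and signing scheme. All names below are hypothetical, and a shared-key HMAC stands in for the public-key signatures that real credential schemes use; this is a self-contained illustration, not any specific passport standard.

```python
import hashlib
import hmac
import json

# Assumption: a shared key stands in for real issuer key material.
SHARED_KEY = b"demo-issuer-key"

def issue_credential(payload: dict) -> dict:
    """Serialise the payload canonically (sorted keys) and attach a signature."""
    body = json.dumps(payload, sort_keys=True).encode()
    sig = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": sig}

def verify_credential(credential: dict) -> bool:
    """Recompute the signature over the canonical payload and compare safely."""
    body = json.dumps(credential["payload"], sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["sig"])

cred = issue_credential({"name": "A. Patient", "vaccinated": True})
print(verify_credential(cred))  # True: issuer and verifier share one format
```

The point of the sketch is that verification fails the moment the two sides disagree on serialisation, schema or key material, which is exactly why disparate national passport systems cannot check one another's credentials without common standards.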
Research consortia: Collaboration between scientists and research institutions can leverage shared-data platforms for the benefit of clinical trials, health-care services and digitised biobanks, creating datasets of superior breadth and representation. Scientific collaborations are particularly relevant for research on rare diseases, the very nature of which means there are limited clinical data and samples available for study. Rapid data-sharing and knowledge dissemination lead to improvements and efficiencies in rare-disease diagnosis, treatment, and clinical-trials recruitment and management. One of the main challenges for such collaborations, however, is establishing sufficient regulatory alignment and standardisation in data collection, storage and sharing. While such agreements must ensure adequate security and privacy safeguards on the one hand, they must allow for high-quality pooled datasets, with practicable data-cleaning and wrangling requirements, on the other.
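The data-cleaning and standardisation challenge above comes down to a mundane but critical step: sites record the same measurement under different field names and units, and each site's records must be mapped to one agreed schema before pooling. The field names and values below are invented for illustration.

```python
# Two hypothetical sites record haemoglobin differently:
site_a = [{"patient": "A1", "hb_g_dl": 13.5}]      # grams per decilitre
site_b = [{"id": "B7", "haemoglobin_g_l": 128.0}]  # grams per litre

def harmonise_a(rec: dict) -> dict:
    """Map site A's fields to the agreed schema, converting g/dL to g/L."""
    return {"patient_id": rec["patient"], "hb_g_l": rec["hb_g_dl"] * 10.0}

def harmonise_b(rec: dict) -> dict:
    """Site B already uses g/L; only the identifier field is renamed."""
    return {"patient_id": rec["id"], "hb_g_l": rec["haemoglobin_g_l"]}

# Only after harmonisation can the records be pooled into one dataset.
pooled = [harmonise_a(r) for r in site_a] + [harmonise_b(r) for r in site_b]
print(pooled)
```

Multiplied across dozens of sites and thousands of variables, this mapping work is what "practicable data-cleaning and wrangling requirements" must keep tractable.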
Clinical research: Clinical trials are essential to health innovation, but the processes involved must be improved to ensure that emerging treatments are timely, effective, cost-effective and equitable. Unfortunately, only around 10 per cent of drug candidates succeed. Data from unsuccessful trials should be shared openly so that lessons can be learned from these outcomes, costly duplication avoided and biomarkers identified across patient subgroups, which may indicate opportunities for future drug development. Clinical trials must also make better use of data to overcome logistical challenges, including patient enrolment, as well as be made more diverse to identify efficacious treatments among underrepresented patient groups. This will require both a strategic commitment from pharmaceutical companies and deep engagement with underrepresented communities to gain their trust.
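The subgroup point above is worth spelling out: a drug that looks marginal overall may still show a clear signal in a biomarker-defined subgroup, which is one reason data from unsuccessful trials are worth sharing. The figures below are invented purely for illustration.

```python
# Hypothetical trial records: each patient has a biomarker status and an outcome.
trial = [
    {"biomarker_positive": True,  "responded": True},
    {"biomarker_positive": True,  "responded": True},
    {"biomarker_positive": True,  "responded": False},
    {"biomarker_positive": False, "responded": False},
    {"biomarker_positive": False, "responded": False},
    {"biomarker_positive": False, "responded": True},
]

def response_rate(records: list) -> float:
    """Fraction of patients in the records who responded to treatment."""
    return sum(r["responded"] for r in records) / len(records)

overall = response_rate(trial)
subgroup = response_rate([r for r in trial if r["biomarker_positive"]])
print(overall, subgroup)  # the biomarker-positive rate exceeds the overall rate
```

Here the overall response rate is 50 per cent while the biomarker-positive subgroup responds at roughly 67 per cent; with shared data, such patterns can be detected across trials even when each individual trial "failed".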
Biobanks: As another essential piece of infrastructure for driving biomedical research and personalised, precision medicine, biobanks are repositories and databases that house biological and clinical samples, and their associated data. UK Biobank, one of the world’s most advanced facilities, holds biological samples alongside relevant patient information for more than half a million UK residents. Operating at scale, biobanks offer an extensive range of samples that can be used for advanced biomedical research in several fields, including genomics, multi-omics and precision-medicine studies. Statistically powerful datasets can then be produced to help better understand the molecular basis of diseases and, across different populations, the different genetic and environmental risk factors associated with them.
Figure: The global network of major biobanks
Technology is not a silver bullet for the problems being experienced by many public-health services. While governments are beginning to take steps towards services underpinned by technologies, it is only by developing and adopting a comprehensive health-tech stack that they will be able to deliver exponential benefits for patients, practitioners and the population at large. Equally critical is building the underlying digital infrastructure, addressing workforce skills gaps, and driving digital literacy and engagement among the patient population.
Below, we set out three areas of focus for governments around the world, which they should consider adopting to deliver a more effective and equitable health-care system – one that will improve health outcomes, reduce costs and bring wider economic benefits.
Invest in long-term transformative strategies with data and technology at their heart. These strategies must sequence and prioritise the interventions that offer the best value for money, delivering them on greatly accelerated timescales. To be comprehensive, they must go beyond the boundaries of any health department’s remit, considering how social policies and industrial strategies can be aligned to improve population health while meeting future health-service requirements through the development, scaling and adoption of skills and technologies.

National strategies must also extend beyond national borders. The pandemic has demonstrated that a nation’s health can be greatly affected by threats from further afield. Individual countries must proactively contribute to wider regional and global strategies through the sharing of data, infrastructure and institutions. Governments also need to plan for the health services and technologies needed to meet the demands of their country’s shifting demographics, such as the growing over-65 populations of the developed world.

While this will require significant upfront and continued investment, it will reap rewards in time, making health care more affordable and sustainable in the medium to long term and bringing substantial economic benefits from improved population health. Too often, health spending has suffered from short-termism, with governments seeking to manage rather than solve issues and failing to enable transformational change at scale and pace. Invariably, this is a false economy and represents worse value for money for the taxpayer over the long term. Governments must begin to think of their population’s health as a national asset, considering the economic case for deep investment over a multi-decade horizon, just as they might for major infrastructure or defence projects.
As well as public investment, governments must look to use tax reliefs and demand-side levers to incentivise private enterprise to innovate and develop the technologies that can bring new, affordable treatments and operational efficiencies to clinical settings sooner.
Focus on the foundations of a platform that will support a tech-enabled future of health. Achieving digital maturity is a prerequisite for a health system built upon data and analytics. While many countries have begun the process of digitalisation, few have yet delivered fully digitised health-care systems, even decades on. The costs of both hardware and software have fallen significantly since the turn of the century, while the evidence of the operational efficiencies and enabling functions produced by digitalisation has only become stronger. Similarly, significant investment in network infrastructure – often poor in existing health-care settings – must be prioritised to realise the benefits of digitalisation and cloud technologies.

Issues concerning the interoperability of digital systems also need to be tackled. It is essential that universal EHR systems are prioritised, but they must also be unified or made interoperable, not just with one another but with external systems – such as those for clinical-trials management – to maximise the benefits of digitalisation. Technological hurdles were long the major barrier here, but recent advances in hyperscale cloud-computing services and federated data platforms offer new solutions to these decades-old issues.

Improving connectivity across health services in this way will not only improve decision-making and patient care, it will also help to modernise performance management and accountability at a more granular level. This will in turn enable greater local autonomy and innovation to drive a more effective and efficient system. Beyond digital infrastructure, governments must also focus on human capital, which is equally foundational to digital health. The skills requirements of modern health-care professionals have changed dramatically and continue to evolve rapidly.
Proficiently identifying current and future skills gaps, and developing strategic workforce plans to address these gaps, will be critical to the delivery of 21st-century health-care systems.
Prioritise the patient population. The ability of patients to access and engage with digital tools is essential. Governments should look to drive digital access and literacy in their patient populations, particularly in the over-65s who, as a demographic, are typically the least digitally literate but the most reliant on health services. Investments in broadband and mobile-data coverage will be essential for communicating with rural and hard-to-reach populations.
We stand at a unique juncture in history. The potential for technology to transform public health is arguably greater than ever. However, we remain a long way from delivering on this promise despite the ever-increasing need for change. The 20th-century model of health care, focused on managing illness rather than optimising health, has become outdated, inequitable and unsustainable. Its limitations will only become more apparent as populations continue to age in many countries. The public-health, fiscal and economic impacts of this demographic shift are already evident. To respond effectively, governments must seize unparalleled advances in data, technology and biotech to revolutionise how health care is delivered, making it more predictive, more preventative, more personalised and more participatory. To be effective, however, this also requires a change in culture. The role of governments moving forwards should be not only to invest in physical infrastructure and software but also to create fertile ground for technologies to flourish in health care and other public services. A future in which we maximise the opportunities of technology is not inevitable, but it will be greatly accelerated with the right principles, policy choices, levers and, above all, political will in place.