Saturday, December 30, 2017

Annual Reflections at the End of 2017

As longtime readers of this blog know, I always end each year with a reflection on the year past. I did this in the blog's first year, 2009, and have done it every year since.

The life of this blog has seen a remarkable transformation of the biomedical and health informatics field, especially for those of us who have been working in it for a long time. In my case, I entered the field in 1987 when I started my NLM postdoctoral fellowship at Brigham & Women’s Hospital in Boston, MA. After spending three years in Boston, I arrived at Oregon Health & Science University (OHSU) as a newly minted Assistant Professor. I have since climbed the ranks to full Professor and am the inaugural (and, to date, only) Chair of the School of Medicine’s Department of Medical Informatics & Clinical Epidemiology.

During my career, I have witnessed a great deal of other change and growth in information technology. I witnessed the birth of the World Wide Web in the early 1990s (with skepticism that it could really work, since Internet bandwidth was so limited back then). I was doing information retrieval (search) before the emergence of Google (why didn’t I come up with the idea of ranking Web page output by the number of links to each page?). And I watched the rest of healthcare, especially the policy folks, “discover” the potential benefits of the electronic health record (EHR).

It could be argued that EHRs were not quite ready for prime time when new President Barack Obama unveiled the Health Information Technology for Economic and Clinical Health (HITECH) Act as part of the American Recovery & Reinvestment Act (ARRA). HITECH can certainly be criticized in hindsight for a “meaningful use” program that placed too much emphasis on process measures and not enough on information exchange, standards, and interoperability. But, as those of us glass-half-full types would note, we do have a wired healthcare system now, and the next challenge is to meet the needs of patients and their providers.

I always remember the students who asked, in the early days of HITECH, whether there would still be jobs once we were “done” implementing. Of course, not only is implementation of large and complex software systems never truly “done,” there is also so much more to do to obtain value from them.

As for me personally, I remain gratified by my career choice at the intersection of medicine and computers. My interactions with my colleagues and students, helping and mentoring them in different ways, give me that nice human touch I gave up when I decided in 2001 to stop seeing patients.

My department at OHSU continues to thrive under my leadership and, more importantly, the dedication of its faculty, staff, and students. Our research programs remain impactful and well funded, and enrollment in our various educational programs remains strong.

My family also adds a critical dimension to my life, with the academic and career successes of my wife and two daughters as gratifying as my own. I did suffer a couple of unfortunate losses this year, with the passing of both my mother and father. Fortunately both lived long, relatively healthy lives, although my mother’s last years were compromised by dementia. I miss them both, and am sad that they will not see the rest of my family and me going forward in life.

And of course this blog is doing well. Last year I touted reaching 400,000 page views. This past month I barreled through the half-million page views milestone, and was able to make the 500,000th view myself, as seen in the picture below.


There are still challenges ahead, both for me and for the field. But this year, and likely next year, I take comfort not only from family, friends, and colleagues, but also from the satisfaction of my work.

Thursday, December 21, 2017

Apple Watch 2, A Year On

About a year ago, I described my early experience with the Apple Watch 2. I noted that based on my priorities for a digital watch, the Apple Watch 2 had excellent hardware but some limitations with its software. A year later, the software has improved, but is still not as good as I might like.

Everyone has different needs for devices such as a digital watch, and mine mostly revolve around running. Other functions, such as telling time, viewing local weather, and accessing text messages, are secondary. My main needs for running center around access to the data. I need data from my runs to live in the cloud, and not be stuck on my phone. I want to be able to access the details of my runs from any device, and share them with friends who do not need to be logged on to the app or its Web site to view them. I also want to be able to run without having to take my phone with me, even though I sometimes do, especially when I am traveling.

I have gone through a number of running apps on my Apple Watch. The requirement to be able to run with the watch and without the iPhone made the initial choices very limited. I started with Apple’s Activity app that comes with the watch. While the app has a nice interface and can be used without being tethered to the iPhone, its inability to export data beyond the iPhone makes it a non-starter. When I upgraded my iPhone shortly after obtaining my Apple Watch 2 last year, I promptly lost all of the runs that had been stored on my old iPhone, with no way to get them back.

Within the first few months of the Apple Watch 2 release, some of the other running app vendors released standalone versions of their apps. One of the first was RunGo. It was a decent app, although one has to explicitly save runs to the cloud as “My Routes,” as it is not done automatically. Nonetheless, RunGo has served me well in places such as Singapore, Bangkok, Honolulu, Siesta Key FL, Chicago, Philadelphia, and here in Portland (including my annual birthday 10-mile run).

More recently, I have settled on Strava, a long-time running and cycling app that by default stores exercise sessions in the cloud. The Strava Apple Watch app is not perfect. I wish the watch app displayed cumulative distance in hundredths of miles (instead of tenths) and showed cumulative time and distance in a larger size than the pace. I do like its auto-stop ability for when I get stuck at stop lights, although I get a little annoyed when it pauses temporarily when I raise the watch to view my distance and time. Strava too has served me well in a number of places, including Washington DC, New York City, and Abu Dhabi. All in all, I will stick with Strava for now.

Some may wonder whether I have considered upgrading to the Apple Watch 3, whose main new feature is a cellular chip that lets it be used without being tethered to the iPhone. Given my running needs, it may seem ironic that I do not see a need for the new watch. But with the exception of being out on my runs, I am just about always carrying my phone, so I see no need for the watch to stand alone at the other times when I am not exercising.

Monday, November 20, 2017

From Predictive to Prescriptive Analytics: Response to NLM RFI

The National Library of Medicine (NLM) recently posted a Request for Information (RFI) asking for comment on promising directions and opportunities for next-generation data science challenges in health and biomedicine. This blog posting lists the questions posed and my responses to them. A main focus of my input centers on the need for transition from predictive to prescriptive analytics, i.e., going beyond being able to predict with data and moving toward applying it to improve patient diagnoses and outcomes.

1. Promising directions for new data science research in the context of health and biomedicine.  Input might address such topics as Data Driven Discovery and Data Driven Health Improvement.

The scientific literature is increasingly filled with papers describing novel and exciting applications of data science, such as improving clinical diagnosis and making healthcare safer and more efficient. But there is more to impactful data science than the data and tools. We need studies that demonstrate real impact in improving patient and system outcomes. We need to assess the impact of efforts to improve data standards and data quality.

One way to look at this is to consider the growing area of data analytics, which may be thought of as applied data science. Data analytics is commonly described as having three levels [1]:
  • Descriptive - describing what the data say about what has happened
  • Predictive - using the data to predict what might happen going forward
  • Prescriptive - deciding on actions based on the data to improve outcomes
Of course, there is science behind each level. We are seeing a steady stream of scientific papers on the application of predictive analytics. One of the earliest foci was the use of clinical data to predict hospital readmission, especially after the Centers for Medicare and Medicaid Services (CMS) began penalizing US hospitals for excessive rates of readmission. This has led to dozens of papers over the last decade assessing various models and approaches for predicting hospital readmission, e.g., [2,3]. Another focus that has recently attracted attention is the use of deep learning for medical diagnosis through processing of radiology [4,5], pathology [6], and photographic images [7]. Even patient monitoring and health behaviors have shown potential for improvement via Big Data [8]. Likewise, as the database for precision medicine emerges, we will increasingly understand data-driven ways to treat different diseases, sometimes with therapies we never hypothesized for a given condition [9].
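
To make the distinction between the predictive and prescriptive levels concrete, here is a minimal sketch of a readmission-risk model in the general spirit of [2,3]; the features, synthetic data, and action threshold are my own illustrative assumptions, not those of any cited study.

```python
# Hypothetical sketch of a 30-day readmission risk model of the general kind
# described in [2,3]. Features and data are synthetic placeholders; a real model
# would be trained on EHR-derived variables and validated prospectively.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5000
# Assumed illustrative features: age, prior admissions in past year, length of stay
X = np.column_stack([
    rng.normal(65, 15, n),        # age (years)
    rng.poisson(1.0, n),          # prior admissions in past year
    rng.exponential(4.0, n),      # length of stay (days)
])
# Synthetic outcome loosely dependent on the features (illustration only)
logit = -4.0 + 0.02 * X[:, 0] + 0.6 * X[:, 1] + 0.05 * X[:, 2]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
risk = model.predict_proba(X_test)[:, 1]
print("AUC:", round(roc_auc_score(y_test, risk), 3))

# The prescriptive step is what comes after prediction: e.g., routing the
# highest-risk patients to a transitional-care program.
high_risk = risk > 0.3  # hypothetical action threshold
```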

These predictive analytics applications are important, but equally important is research into how they will best be applied. Attention to hospital readmissions has somewhat lowered their rate, but the problem is far from solved. We not only need to predict who these patients will be, but also devise programs that enable action on that data.

Likewise, as we learn to improve diagnosis and treatment of disease through predictive analytics, we will need to determine ways to act on those predictions, both for clinical researchers who discover possible new diagnostic tests and treatments and for clinicians who apply that complex new information in patient care. This will require both clinical decision support from machines and new organizational structures to conduct research and apply its results optimally in clinical care.

As such, a new thread of research in prescriptive analytics, i.e., applying the outcomes of data science research, is critical for realizing the value of biomedical science. The NLM should be at the forefront of thought leadership and funding for that research. Such research can build on the NLM's uniquely strong portfolio of existing research in biomedical informatics (of which some of us consider data science to be a part).

2. Promising directions for new initiatives relating to open science and research reproducibility. Input might address such topics as Advanced Data Management and Intelligent and Learning Systems for Health.

Open science and reproducibility of research are critical for the transition of data science from predictive to prescriptive analytics. Since the value of data science comes from understanding large populations of patients, it is only fair that all who contribute their data benefit from research using it. Therefore, we must devise methods to allow appropriate access to that data while still protecting the privacy of the individuals who have contributed it. We also need to devise approaches that give appropriate scientific credit to those who collect the data, along with a short, time-limited window for them to achieve the first publication of results from it.

Open science should not, however, just be thought of as open data. The models and algorithms that process such data are also increasingly complex. We need more research into understanding how such systems work, how different methods compare with each other, and where biases and other problems may be introduced. As such, the algorithms used must be open so they can be understood and improved.

3. Promising directions for workforce development and new partnerships. Input might address such topics as Workforce Development and Diversity and New Stakeholder Partnerships.

New directions in data science must take into account the human workforce needed to lead discovery as well as to apply it to achieve value. The best-known data analytics workforce analyses, from McKinsey [10] and IDC [11], are a few years old now, but both make the consistent point that we need not only a focused cadre of quantitative experts but also 5-10 times as many professionals who can contribute to the design of analyses and apply their results in ways that improve patient and system outcomes. In other words, we need individuals who not only know the optimal methods for prediction, but also domain experts and applications specialists who can collaborate with the quantitative experts to achieve the best outcomes of data science.

In conclusion, there are many opportunities to put data science and data analytics to work in advancing health and healthcare. This work must not only build on past work in biomedical informatics and other disciplines but also look to the future to best apply prediction in ways that improve the maintenance of health and the treatment of disease.

References

1. Davenport, TH (2015). Big Data at Work: Dispelling the Myths, Uncovering the Opportunities. Cambridge, MA, Harvard Business Review.
2. Amarasingham, R, Moore, BJ, et al. (2010). An automated model to identify heart failure patients at risk for 30-day readmission or death using electronic medical record data. Medical Care. 48: 981-988.
3. Futoma, J, Morris, J, et al. (2015). A comparison of models for predicting early hospital readmissions. Journal of Biomedical Informatics. 56: 229-238.
4. Oakden-Rayner, L, Carneiro, G, et al. (2017). Precision radiology: predicting longevity using feature engineering and deep learning methods in a radiomics framework. Scientific Reports. 7: 1648. https://www.nature.com/articles/s41598-017-01931-w.
5. Rajpurkar, P, Irvin, J, et al. (2017). CheXNet: radiologist-level pneumonia detection on chest x-rays with deep learning. https://arxiv.org/abs/1711.05225.
6. Liu, Y, Gadepalli, K, et al. (2017). Detecting cancer metastases on gigapixel pathology images. https://arxiv.org/abs/1703.02442.
7. Esteva, A, Kuprel, B, et al. (2017). Dermatologist-level classification of skin cancer with deep neural networks. Nature. 542: 115-118.
8. Price, ND, Magis, AT, et al. (2017). A wellness study of 108 individuals using personal, dense, dynamic data clouds. Nature Biotechnology. 35: 747-756.
9. Collins, FS and Varmus, H (2015). A new initiative on precision medicine. New England Journal of Medicine. 372: 793-795.
10. Manyika, J, Chui, M, et al. (2011). Big data: The next frontier for innovation, competition, and productivity, McKinsey Global Institute. http://www.mckinsey.com/insights/business_technology/big_data_the_next_frontier_for_innovation.
11. Anonymous (2014). IDC Reveals Worldwide Big Data and Analytics Predictions for 2015. Framingham, MA, International Data Corporation. http://bit.ly/IDCBigDataFutureScape2015.

Wednesday, November 8, 2017

End of an Era For Academic Informatics: Demise of the Home-Grown EHR

Pick your cliche to describe a major event this past week: Another domino is falling. The dawn of a new era. The news that Vanderbilt University Medical Center, home to one of the most esteemed academic informatics programs in the country, is replacing its collection of home-grown and commercial electronic health record (EHR) systems with Epic shows that the era of the home-grown academic EHR is coming to a close.

Whatever cliche we wish to use, the change is real for academic informatics. One by one, many of the major academic informatics programs have sunset their home-grown EHRs in favor of commercial systems, including Partners Healthcare, Mayo Clinic, Intermountain Healthcare, and the Veterans Administration.

The enterprise EHR has become too complex for a single academic program to maintain. Academic informatics programs are great at fostering innovation in areas such as clinical decision support and re-use of clinical data. But they are less adept at managing the more mundane yet increasingly complex operations of hospitals and healthcare systems, such as the transmission of orders from the hospital ward to departmental (e.g., radiology or pathology) systems, the delivery of results back to clinicians, and the generation of bills for services. When compliance and security requirements are added on top, maintaining such systems becomes untenable for academic programs.

Some in academic informatics lament this closing of an era. But ever the glass-half-full optimist, I do not necessarily view it as a bad thing. Now that EHR systems are mission-critical to healthcare delivery organizations and must be integrated with their myriad of other information systems, it is probably inappropriate for academic groups to develop and maintain them.

Fortunately, there are emerging tools for innovation on top of the mundane “plumbing” of the EHR. Probably the leading candidate to serve as such a platform is SMART on FHIR. A growing number of academic programs are using SMART on FHIR to innovate on top of commercial EHRs. Granted, some of the commercial EHR systems (e.g., Epic) currently support the Fast Healthcare Interoperability Resources (FHIR) standard incompletely, but we can remember another cliche, the famous Wayne Gretzky line about skating not to where the puck is, but to where it is going. As SMART on FHIR matures, I can envision it as a great platform for apps that read and write data from the EHR.
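
To give a flavor of what building on this platform looks like, below is a minimal sketch of reading data through a FHIR server's standard REST interface; the endpoint URL and patient ID are hypothetical placeholders, and a real SMART on FHIR app would also complete the SMART OAuth2 authorization flow before making these calls.

```python
# Minimal sketch of reading EHR data through a FHIR REST API. The endpoint and
# patient ID are hypothetical; a production SMART on FHIR app would first obtain
# an access token via the SMART/OAuth2 launch sequence and send it as a bearer token.
import requests

FHIR_BASE = "https://fhir.example.org/r4"   # placeholder FHIR R4 endpoint
PATIENT_ID = "12345"                        # placeholder patient ID
headers = {"Accept": "application/fhir+json"}

# Read a single Patient resource: GET [base]/Patient/[id]
patient = requests.get(f"{FHIR_BASE}/Patient/{PATIENT_ID}", headers=headers).json()
name = patient.get("name", [{}])[0]
print("Patient:", " ".join(name.get("given", [])), name.get("family", ""))

# Search for that patient's laboratory Observations:
# GET [base]/Observation?patient=[id]&category=laboratory
bundle = requests.get(
    f"{FHIR_BASE}/Observation",
    params={"patient": PATIENT_ID, "category": "laboratory"},
    headers=headers,
).json()
for entry in bundle.get("entry", []):
    obs = entry["resource"]
    code = obs.get("code", {}).get("text", "unknown test")
    value = obs.get("valueQuantity", {})
    print(code, value.get("value"), value.get("unit"))
```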

In some ways I liken the situation to the relationship between computer operating systems and academic computer science departments. Very few academic computer scientists do research on operating systems these days. Most academic computer scientists, just like the rest of us, use Windows, MacOS, Linux, iOS, and/or Android. Today’s modern operating systems are complex and require large companies to maintain them. Most academic computer science research now occurs on top of those operating systems, where academics can carry out their innovation knowing that the operating systems (to the best of their capabilities) will manage the data in files, connect to networks, and keep information secure.

This new environment should lead to new types of informatics innovation that take place on top of commercial EHRs, which may now be better viewed as the “operating system” providing the foundational functionality upon which academic informatics innovators can build. This could be a boon to places like my institution, which never even had a home-grown EHR. We are certainly pursuing SMART on FHIR development with rigor going forward.

Friday, November 3, 2017

Why Pursue a Career in Biomedical and Health Informatics?

The field of biomedical and health informatics offers an ever-growing number of career opportunities for those who enjoy working with data, information, and knowledge to improve the health of individuals and populations. The field develops solutions to improve the health of individuals, the delivery of healthcare, and the advancement of research in health-related areas. Jobs in informatics are highly diverse, running the spectrum from the highly technical to the very interpersonal. All are driven, however, by the goal of using data, information, and knowledge to improve all aspects of human health [1, 2].

Within biomedical and health informatics are a myriad of sub-disciplines, all of which apply the same fundamental science and methods but are focused on particular (and increasingly overlapping) subject domains. Informatics can be viewed as proceeding along a continuum from the cellular level (bioinformatics) to the person (medical or clinical informatics) to the population (public health informatics). Within clinical informatics may be a focus on specific healthcare disciplines, such as nursing (nursing informatics), pharmacy (pharmacy informatics), and radiology (radiology informatics) as well as on consumers and patients (consumer health informatics). There are also disciplines in informatics that apply across the cell-person-population spectrum:
  • Imaging informatics – informatics with a focus on the storage, retrieval, and processing of images
  • Research informatics – the use of informatics to facilitate biomedical and health research, including a focus on clinical and translational research that aims to accelerate research findings into healthcare practice
Another emerging discipline that has substantial overlap with informatics is data science (or data analytics in its more applied form). The growth in use of electronic health records, gene sequencing, and new modalities of imaging, combined with advances in machine learning, natural language understanding, and other areas of artificial intelligence, provides a wealth of data and tools for improving health. But informatics is not just about processing the data; its range of activity extends from ensuring the usability of systems for entering and working with high-quality data to applying the results of data analysis to improve the health of individuals and the population, as well as the safety and quality of healthcare delivery.

The variety of jobs in biomedical and health informatics means that there is a diversity in the education of those holding the jobs. Informatics has a body of knowledge and a way of thinking that advance the field. It is also an interdisciplinary field, existing at the interface of a number of other disciplines. For this reason, education has historically been at the graduate level, where individuals combine their initial education in one of the core disciplines (e.g., health or life sciences, computing or information sciences, etc.) with others as well as the core of informatics. An example of such a program is ours at Oregon Health & Science University (OHSU).

A variety of data show that professionals from this discipline are in high demand. Job sites such as Monster.com show a wide variety of well-paying jobs. A previous analysis of online job postings found 226,356 positions advertised [3]. More recently, a survey of healthcare IT leaders shows continued demand for professionals in this area [4]. For physicians working in the field, there is now a new medical subspecialty [5]. The nursing profession has had a specialization in nursing informatics for over a decade, and we are likely to see more certifications, for example the American Medical Informatics Association (AMIA) developing an Advanced Health Informatics Certification that will apply to all informatics professionals, not just those who are physicians and nurses.

Does one need to be a clinician to be trained and effective in a job in clinical informatics? Must one know computer programming to work in any area of informatics? The answers are no and no. Informatics is a very heterogeneous field, and there are opportunities for individuals from all types of backgrounds. One thing that is clear, however, is that the type of informatics job you assume will be somewhat dependent on your background. Those with healthcare backgrounds, particularly medicine or nursing, are likely to draw on that expertise for their informatics work in roles such as a Chief Medical or Nursing Informatics Officer. Those with other backgrounds still have plenty of opportunities in the field, with a wide variety of jobs and careers that are available.

Informatics is a career for the 21st century. There are a wide variety of jobs for people with diverse backgrounds, interests, and talents, all of whom can serve the health of society through effective use of information and associated technologies.

References

1. Hersh, W (2009). A stimulus to define informatics and health information technology. BMC Medical Informatics & Decision Making. 9: 24. http://www.biomedcentral.com/1472-6947/9/24/.
2. Hersh, W and Ehrenfeld, J (2017). Clinical Informatics. In Health Systems Science. S. Skochelak, R. Hawkins, L. Lawson et al. New York, NY, Elsevier: 105-116.
3. Schwartz, A, Magoulas, R, et al. (2013). Tracking labor demand with online job postings: the case of health IT workers and the HITECH Act. Industrial Relations: A Journal of Economy and Society. 52: 941–968.
4. Anonymous (2017). 2017 HIMSS Leadership and Workforce Survey. Chicago, IL, Healthcare Information Management Systems Society. http://www.himss.org/library/2017-himss-leadership-and-workforce-survey.
5. Detmer, DE and Shortliffe, EH (2014). Clinical informatics: prospects for a new medical subspecialty. Journal of the American Medical Association. 311: 2067-2068.

Wednesday, November 1, 2017

From Vendor-Centric to Patient-Centric Data Stores

There is growing consensus that patients should be the owners and stewards of their personal health and healthcare data. They should also have the right to control access to it by chosen healthcare professionals, institutions, and researchers. Current information systems in healthcare do not facilitate this point of view, as data is for the most part stored in the siloed systems of the places where patients obtain care.

If we accept the view that patients own their data and can control access to it, how do we facilitate the transition from provider-centric to patient-centric data storage? Such an ecosystem will require new models for data storage and its access. Existing business models for clinical systems will need to adapt to this new approach, although new business opportunities will emerge for companies and others that can succeed in this new environment.

My own view is that every patient should have a cloud-based data store to which they (or a designated surrogate, for minors or those unable to give consent) allow access by designated healthcare providers or others. A new business model will emerge for companies that facilitate connection of authorized systems to a patient’s data. Even existing electronic health record (EHR) vendors could participate, especially as many of them are building large data centers and cloud-based solutions (although this will require changes to their current business models, moving away from keeping the data in their own silos).

The market for this approach will necessarily have some regulation, most likely from the government. Those participating will need to adhere to a common set of standards. Systems will also need to maintain the integrity of data deposited by clinicians. Patients should be allowed to annotate data, and even challenge it, but not modify it (unless the clinician amends it).

This has implications for the EHR systems of the future. The current large monolithic systems will need to give way to systems that access data in a standardized way. The new EHR “system” may not look much different from current systems (although hopefully it will), but instead of accessing data from within its own stores, it will pull data from, and push data back to, the patient’s designated store.

A recent Perspective in JAMA lays out three necessary components for this vision to succeed [1]. The first is standard data elements. Among the approaches likely to achieve this are initiatives such as SMART on FHIR [2] and the Clinical Information Modeling Initiative (CIMI) [3, 4].

The JAMA piece posits a second required component, a standard data receipt for each clinical encounter, with the encounter pushed into the patient’s data store. Methods such as blockchain may provide the integrity needed to maintain the sanctity of the clinician's input.
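
To illustrate the integrity property being invoked, here is a minimal hash-chain sketch of the tamper-evidence idea behind blockchain-style ledgers; it is a simplified illustration of the concept, not the design proposed in the JAMA piece, and the encounter records are made-up examples.

```python
# Minimal sketch of the tamper-evidence idea behind a blockchain-style ledger:
# each encounter "receipt" is hashed together with the previous entry's hash,
# so any later modification of a stored record breaks the chain. This illustrates
# the concept only; it is not a full distributed-ledger implementation.
import hashlib
import json

def entry_hash(record: dict, prev_hash: str) -> str:
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

def append(chain: list, record: dict) -> None:
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"record": record, "prev": prev_hash,
                  "hash": entry_hash(record, prev_hash)})

def verify(chain: list) -> bool:
    prev_hash = "0" * 64
    for entry in chain:
        if entry["prev"] != prev_hash or entry["hash"] != entry_hash(entry["record"], prev_hash):
            return False
        prev_hash = entry["hash"]
    return True

chain = []
append(chain, {"encounter": "2017-10-30 office visit", "assessment": "hypertension, stable"})
append(chain, {"encounter": "2017-11-01 lab result", "potassium": 4.1})
print(verify(chain))                         # True: chain is intact
chain[0]["record"]["assessment"] = "edited"  # simulate tampering with a stored note
print(verify(chain))                         # False: tampering is detectable
```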

Finally, the third is a contract (I might have preferred calling it a compact) that sets the rules for access and control of such a system.

One question the JAMA piece did not address was, who pays? This is never an easy question in healthcare, since patients do not pay directly for many things; instead, their insurance pays. So a conversation will be necessary to determine how such a system is financed.

As with many aspects of informatics, the technology to implement all of this currently exists, and the real challenges are how to create the market and the regulations for this major transition in how patient data is stored and accessed. As with all developments in informatics, there will be “unintended consequences” along the way that will need thoughtful discussion among all stakeholders in this endeavor.

References

1. Mikk, KA, Sleeper, HA, et al. (2017). The pathway to patient data ownership and better health. Journal of the American Medical Association. 318: 1433-1434.
2. Mandel, JC, Kreda, DA, et al. (2016). SMART on FHIR: a standards-based, interoperable apps platform for electronic health records. Journal of the American Medical Informatics Association. 23: 899-908.
3. Oniki, TA, Zhuo, N, et al. (2016). Clinical element models in the SHARPn consortium. Journal of the American Medical Informatics Association. 23: 248-256.
4. Moreno-Conde, A, Moner, D, et al. (2015). Clinical information modeling processes for semantic interoperability of electronic health records: systematic review and inductive analysis. Journal of the American Medical Informatics Association. 22: 925-934.

Thursday, October 19, 2017

Markets Are Great When They Work, But They Don't in Most Aspects of Medicine

One of the most maddening parts of the healthcare debate to me concerns the role of markets and the mythos that if we somehow let the free market work, a new era of low costs and high-quality care would be ushered in. I have written before that while I believe free markets are the best approach for optimizing the quantity and quality of most consumer goods, healthcare is inherently different. For the most part, we do not seek healthcare as a market good, but rather as something we must use when we are sick to make us better (or to prevent us from getting sick). When we are acutely ill, there is very little choice we can exercise, and even when we are not sick, there are limits to the information available that prevent us from making the “best” purchasing decision (where best may refer not only to money, but also to perceived quality and other aspects of care we value).

To say that healthcare will improve if we let markets operate is naive at best. While some healthcare organizations might perform better due to attention to cost and efficiency, at the end of the day, healthcare is not something we want to leave to pure market principles. But ironically, despite the lack of operating as a free market, healthcare is profitable for many. I provided a number of examples in that posting a few years ago, and now some new information has come to the fore.

One is an interesting new book by Elisabeth Rosenthal, Editor-in-Chief of Kaiser Health News and a physician and former correspondent for The New York Times [1]. Dr. Rosenthal's book explores how all of the major players in healthcare - insurance companies, hospitals, physicians, pharmaceutical companies, medical device manufacturers, and even coders and researchers - operate under a set of “rules” of a highly dysfunctional market. These rules (backed by copious examples in the book and more elsewhere [2]) are, to quote:
  1. More treatment is always better. Default to the most expensive option.
  2. A lifetime of treatment is preferable to a cure.
  3. Amenities and marketing matter more than good care.
  4. As technologies age, prices can rise rather than fall.
  5. There is no free choice. Patients are stuck. And they’re stuck buying American.
  6. More competitors vying for business doesn’t mean better prices; it can drive prices up, not down.
  7. Economies of scale don’t translate to lower prices. With their market power, big providers can simply demand more.
  8. There is no such thing as a fixed price for a procedure or test. And the uninsured pay the highest prices of all.
  9. There are no standards for billing.
  10. Prices will rise to whatever the market will bear. The mother of all rules!
The last rule drives home the point of this posting. Even though the book is written from a somewhat liberal political bent, a political conservative could also find common cause with it in its demonstration of how the market is distorted by special interests that corrupt government attempts to regulate it.

More specific examples of market dysfunction are provided by two recent papers, both authored by OHSU faculty. The first, by Prasad and Mailankody, calls into question the oft-stated high costs of drug development, which are used to justify the ever-increasing prices charged [3]. Some have been highly critical of their methodology [4], while others have noted that the costs are highly variable but still bear no connection to the prices charged [5]. There is no question that drug development is expensive, and a pharmaceutical company may have many misses in between hits. But we need to be reasonable about using the cost of developing drugs to justify prices, especially in monopolistic or other situations where market-style choices are not available.

Another paper looks at repository corticotropin (rACTH) injection [6]. Although there is no evidence that this treatment is more effective for any indication than much cheaper synthetic corticosteroids, its use has grown substantially, due both to intensive marketing efforts and to conflicts of interest among those who use it most frequently. It is also one of a growing number of drugs whose price has risen substantially long after its development.

Other countries besides the US struggle with how to price drugs and other aspects of healthcare. The methods they employ, from negotiating on a national level to saying no to drugs that do not pass muster in cost-benefit analyses, are probably the only realistic solution when markets do not work and when government control of them gets subverted by special interests.

References
1. Rosenthal, E (2017). An American Sickness: How Healthcare Became Big Business and How You Can Take It Back. New York, NY, Penguin Press.
2. Rosenthal, E (2017). How Economic Incentives have Created our Dysfunctional US Medical Market. Medium. https://medium.com/@RosenthalHealth/how-economic-incentives-have-created-our-dysfunctional-us-medical-market-b681c51d6436.
3. Prasad, V and Mailankody, S (2017). Research and development spending to bring a single cancer drug to market and revenues after approval. JAMA Internal Medicine. Epub ahead of print.
4. Herper, M (2017). The Cost Of Developing Drugs Is Insane. That Paper That Says Otherwise Is Insanely Bad. Forbes, October 16, 2017. http://www.forbes.com/sites/matthewherper/2017/10/16/the-cost-of-developing-drugs-is-insane-a-paper-that-argued-otherwise-was-insanely-bad/.
5. Love, J (2017). Perspectives on Cancer Drug Development Costs in JAMA. Bill of Health. http://blogs.harvard.edu/billofhealth/2017/09/13/perspectives-on-cancer-drug-development-costs-in-jama/.
6. Hartung, DM, Johnston, K, et al. (2017). Trends and characteristics of US Medicare spending on repository corticotropin. JAMA Internal Medicine. Epub ahead of print.

Tuesday, October 17, 2017

The Still-Incomplete Answering of Questions About Physician Time With Computers

Another couple of studies have been published documenting the amount of time physicians spend with computers in primary care [1] and ophthalmology [2] clinics. Clearly these and other recent studies [3,4] show that physicians spend too much time with the electronic health record (EHR), especially when phrases like “pajama time” enter into the vernacular to refer to documentation that must take place after work at home because it could not be completed during the day.

But one aspect of these studies has always concerned me: there is no measure of what the appropriate amount of time is for physicians to spend outside the presence of the patient. This includes tasks like reviewing data that will help inform current decisions as well as entering data that other team members caring for the patient will use to inform their decision-making. While some dispute the value of our current approaches to measuring the quality of care delivered [5], I believe that most physicians accept there should be some measure of accountability for their decisions, especially given the high cost of care. This means that some time and effort must be devoted by physicians to measuring and improving the quality of the care that they deliver.

The newest time-motion study from primary care once again reiterates the large amount of time that the EHR consumes of the physician's day [1]. In this study, that time was found to be 5.9 hours of an 11.4-hour workday (just over half), plus 1.4 hours after hours. But if we look at the tasks on which this time was spent (Table 3 of the paper), we cannot deny that just about all of them are important to overall patient care, even if too much time is spent on them. Do we not want physicians to have some time for reviewing results, following up with patients, looking at their larger practice, etc.?

I have noted in the past that physicians have always spent a good deal of time not in the presence of patients. I have cited studies of this that even pre-date the computer era, but someone recently pointed me to an even older study, from 1973 [6]. In this study of 103 physicians in a general medicine clinic, physicians were found to spend 37.8% of their time charting, 5.3% consulting, 1.7% in other activities, and the remaining 55.2% with the patient. So even in the 1970s, ambulatory physicians spent only slightly more than half of their time in the presence of patients. As one who started his medical training in that era, I can certainly remember time spent trying to decipher unreadable handwriting as well as trying to track down paper charts and other missing information. I also remember caring for patients with no information except for what the patient could recollect.

Clearly we have a great deal of work to do to make our current EHRs better, especially in streamlining both data entry and retrieval. We also need to be careful not to equate measures like clicks and screens with performance, as a study from our institution found that those who efficiently navigated the most information in the record achieved the best results in a simulation task [7]. What we really need is studies that measure time taken for information-related activities in physician practice and determine which are most important to optimal patient care. Further research must also be done to optimize usability and workflow, including determining when other members of the team can contribute to overall efficiency of the care process.

References

1. Arndt, BG, Beasley, JW, et al. (2017). Tethered to the EHR: primary care physician workload assessment using EHR event log data and time-motion observations. Annals of Family Medicine. 15: 419-426.
2. Read-Brown, S, Hribar, MR, et al. (2017). Time requirements for electronic health record use in an academic ophthalmology center. JAMA Ophthalmology. Epub ahead of print.
3. Sinsky, C, Colligan, L, et al. (2016). Allocation of physician time in ambulatory practice: a time and motion study in 4 specialties. Annals of Internal Medicine. 165: 753-760.
4. Tai-Seale, M, Olson, CW, et al. (2017). Electronic health record logs indicate that physicians split time evenly between seeing patients and desktop medicine. Health Affairs. 36: 655-662.
5. Marcotte, BJ, Fildes, AG, et al. (2017). U.S. Health Care Reform Can’t Wait for Quality Measures to Be Perfect. Harvard Business Review, October 4, 2017. https://hbr.org/2017/10/u-s-health-care-reform-cant-wait-for-quality-measures-to-be-perfect.
6. Mamlin, JJ and Baker, DH (1973). Combined time-motion and work sampling study in a general medicine clinic. Medical Care. 11: 449-456.
7. March, CA, Steiger, D, et al. (2013). Use of simulation to assess electronic health record safety in the intensive care unit: a pilot study. BMJ Open. 3: e002549. http://bmjopen.bmj.com/content/3/4/e002549.long.

Tuesday, October 10, 2017

The Resurgence and Limitations of Artificial Intelligence in Medicine

I came of age in the biomedical informatics world in the late 1980s, which was near the end of the first era of artificial intelligence (AI). A good deal of work in what we called medical informatics at that time focused on developing “expert systems” that would aim to mimic, and perhaps someday replace, the cognition of physicians and others in healthcare.

But it was not to be, as excessive hype, stoked by misguided fears about losing out to Japan, led to the dreaded “AI winter.” Fortunately I had chosen to pursue research in information retrieval (search), which of course blossomed in the 1990s with the advent of the World Wide Web. The “decision support” aspect of AI did not go away, but rather was replaced with focused decision support that aimed to augment the cognition of physicians rather than replace it.

In recent years, it seemed that the term AI had almost disappeared from the vernacular. My only use of it came in my teaching, where I consider it essential for understanding the history of the informatics field.

But now the term is seeing a resurgence in use [1]. Furthermore, modern AI systems take different approaches. Rather than trying to represent the world and create algorithms that operate on those representations, AI has reemerged due to the convergence of large amounts of real-world data, increases in storage and computational capabilities of hardware, and new computation methods, especially in machine learning.

This has given rise to a new generation of applications that again try to outperform human experts in medical diagnosis and treatment recommendations. Most of these successful applications employ machine learning, sometimes so-called “deep learning,” and include the following (a rough sketch of the underlying transfer-learning approach follows the list):
  • Diagnosing skin lesions – keratinocyte carcinomas vs. benign seborrheic keratoses and malignant melanomas vs. benign nevi [2]
  • Classifying metastatic breast cancer on pathology slide images [3]
  • Predicting longevity from CT imaging [4]
  • Predicting cardiovascular risk factors from retinal fundus photographs [5]
  • Detecting arrhythmias comparable to cardiologists [6]
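
For those curious what is under the hood of such systems, here is a bare-bones sketch of the transfer-learning recipe common to many of these image-classification results; the backbone, labels, and data pipeline are illustrative assumptions and do not reproduce any specific study above.

```python
# Hypothetical sketch of transfer learning for medical image classification
# (e.g., malignant vs. benign skin lesions). This illustrates the general
# technique, not the architecture or data of any cited paper.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 2  # assumption: two-class problem for illustration

# Start from an ImageNet-pretrained backbone and reuse its learned visual features.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False  # freeze the backbone; train only the new classifier head

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.2),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# train_ds / val_ds would be tf.data.Dataset objects of labeled images, e.g.,
# built with tf.keras.utils.image_dataset_from_directory("lesions/").
# model.fit(train_ds, validation_data=val_ds, epochs=5)
```
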
Unfortunately, the hype is building back too, perhaps exemplified by the IBM Watson system [7]. I recently came across an interesting article by MIT Emeritus Professor Rodney Brooks that put a nice perspective on it and stimulated some of my own thinking [8].

From my perspective, the most interesting part of Brooks’ piece concerns “performance vs. competence.” He warns that we must not confuse performance on a single task, such as making a diagnosis from an image, with the larger competence, such as being a physician. As he states, “People hear that some robot or some AI system has performed some task. They then generalize from that performance to a competence that a person performing the same task could be expected to have. And they apply that generalization to the robot or AI system.”

I have no doubt that algorithmic accomplishments like the medical examples above will be used by physicians in the future, just as they now use automated interpretation of EKGs and other tests that comes, in part, from earlier AI work. But I have a hard time believing that the practice of medicine will evolve to patients submitting pictures or blood samples to computers to obtain an automated diagnosis and treatment plan. It will be a long time before computers can replace the larger perspective that an experienced physician brings to a patient’s condition, to say nothing of the emotional and other support that goes along with the context of the diagnosis and its treatment. Indeed, the doctors of Star Trek are augmented by automated tools but are, in the end, still compassionate individuals who diagnose and treat patients.

Somewhat tongue in cheek, I won’t say that machines replacing physicians is impossible, since there is a quote in a different part of the article, attributed to Arthur C. Clarke, aimed at people like myself: “When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong.” As someone who does not consider himself quite yet to be elderly, but who has worked in the field for several decades, I want to be careful not to say that something is “impossible.”

But on the other hand, while I am certain that we will see growing numbers of tools to improve the practice of medicine based on machine learning and other analyses of data, it is very difficult for me to imagine there being no continued role for the empathetic physician who puts the findings in context and supports, in other ways, the patient whose diagnosis and treatment are augmented by AI.

References

1. Stockert, J (2017). Artificial intelligence is coming to medicine — don’t be afraid. STAT, August 18, 2017. https://www.statnews.com/2017/08/18/artificial-intelligence-medicine/.
2. Esteva, A, Kuprel, B, et al. (2017). Dermatologist-level classification of skin cancer with deep neural networks. Nature. 542: 115-118.
3. Liu, Y, Gadepalli, K, et al. (2017). Detecting cancer metastases on gigapixel pathology images. arXiv.org: arXiv:1703.02442. https://arxiv.org/abs/1703.02442.
4. Oakden-Rayner, L, Carneiro, G, et al. (2017). Precision radiology: predicting longevity using feature engineering and deep learning methods in a radiomics framework. Scientific Reports. 7: 1648. https://www.nature.com/articles/s41598-017-01931-w.
5. Poplin, R, Varadarajan, AV, et al. (2017). Predicting Cardiovascular Risk Factors from Retinal Fundus Photographs using Deep Learning, Arxiv.org. https://arxiv.org/abs/1708.09843.
6. Rajpurkar, P, Hannun, AY, et al. (2017). Cardiologist-Level Arrhythmia Detection with Convolutional Neural Networks, Arxiv.org. https://arxiv.org/abs/1707.01836.
7. Ross, C and Swetlitz, I (2017). IBM pitched its Watson supercomputer as a revolution in cancer care. It’s nowhere close. STAT, September 5, 2017. https://www.statnews.com/2017/09/05/watson-ibm-cancer/.
8. Brooks, R (2017). The Seven Deadly Sins of AI Predictions. MIT Technology Review, October 6, 2017. https://www.technologyreview.com/s/609048/the-seven-deadly-sins-of-ai-predictions/.

Friday, October 6, 2017

HITECH Retrospective: Glass Half-Full or Half-Empty?

Last month, the New England Journal of Medicine published a pair of Perspective pieces about the Health Information Technology for Economic and Clinical Health (HITECH) Act (both available open access). The first was written by the current and three former Directors of the Office of the National Coordinator for Health IT (ONC) [1]. The second was written by two other national thought leaders who also have a wealth of implementation experience [2]. Both papers discuss the accomplishments and challenges, with the Directors’ piece more positive (glass half-full) than that of the outside thought leaders (glass half-empty).

In the first piece, Washington et al. pointed to the accomplishments of the HITECH era, where we have finally seen digitization of the healthcare industry, one of the last major industries to do so. The funding and other support provided by the HITECH Act have led to near-universal adoption of electronic health records (EHRs) in hospitals and substantial uptake in physician offices. They also point to a substantial body of evidence that supports the functionality required under the “meaningful use” program.

These authors also note the shortcomings of this rapid adoption, when not only the people but also healthcare organizations and even the EHR systems themselves were not ready for rapid uptake. They acknowledge that many healthcare providers are frustrated by poor usability and lack of actionable information, which they attribute in part to proprietary standards and information blocking. They advocate moving forward with a push for interoperability, secure and seamless flow of data, engagement of patients, and development of a learning health system.

Halamka and Tripathi, on the other hand, take a somewhat more negative view. While acknowledging the gains in adoption that have occurred under HITECH, they note (my emphasis), “We lost the hearts and minds of clinicians. We overwhelmed them with confusing layers of regulations. We tried to drive cultural change with legislation. We expected interoperability without first building the enabling tools. In a sense, we gave clinicians suboptimal cars, didn’t build roads, and then blamed them for not driving.” They note that the process measures of achieving meaningful use have become an end in themselves, without looking at the larger picture of how to improve the quality, safety, and cost of healthcare. They do point to a path forward, calling for streamlining of requirements to ensure interoperability and a focused set of appropriate quality measures, with EHR certification centered on these as well. They also encourage more market-driven solutions, with government regulation focused on providing incentives and standards for desired outcomes.

Taking more of a glass half-full point of view, I wrote in this blog several months ago that EHR adoption has “failed to translate” into the benefits that have been borne out in research studies. I noted the success of some institutions, mostly integrated delivery systems, in adopting EHRs effectively, and also the persistence in healthcare of the problems that motivated EHR adoption, such as suboptimal quality and safety of care while costs continue to rise.

A few other recent pieces have painted a path forward. The trade journal Medical Economics interviewed several physician informatics experts to collate their thoughts on what features a highly useful EHR might have, especially in contrast to the systems that a majority of physicians complain about today [3]. The set of features does not represent much more than we expect of all of our computer applications these days, yet their availability in EHRs continues to be elusive:
  • Make systems work together – achieve interoperability of data across systems
  • Make it easier and more intuitive – make systems easier to understand and use; reduce cognitive load
  • Add better analytics – add more capability to use data to coordinate and improve care
  • Support high-tech care delivery – be able to engage patients through video and asynchronous communication
  • Make EHRs smarter – systems anticipate user actions and provide reversible shortcuts
  • Become a virtual assistant – assist the clinician with all aspects of managing the delivery of care
A couple of other recent Perspective pieces in the New England Journal of Medicine provide some additional solutions. Two well-known informatics thought leaders from Boston Children’s Hospital lay out the case for an application programming interface (API) approach to the EHR based on standards and interoperability [4]. Although this piece has a different focus than the previous one, there is no question that the data normalization from FHIR Resources, the flexible interfaces that can be developed using SMART, and the ease of developing it all via SMART on FHIR could make those goals achievable.

In the second piece, a well-known leader in primary care medicine calls for delivering us from the current EHR purgatory [5]. His primary solutions focus on reforming the healthcare payment system, moving toward payment for outcomes and not volume, i.e., value-based care.

I agree with just about all that these authors have to say. While the meaningful use program required some benchmarks to ensure the HITECH incentive money was appropriately spent, we are probably beyond the need to continue requiring large numbers of process measures. We need to focus on the standards and interoperability that will open the door to doing more with the EHR than just documenting care, such as predictive analytics and research. Continuing to reform our payment system is a must, not only for better EHR usage but also to control costs and improve the health of the population.

There is also an important role for clinical informatics professionals and leaders, who must lead the way in righting the problems of the EHR and other information systems in healthcare. I have periodically reached back to a quote of my own after the unveiling of the HITECH Act: “This is a defining moment for the informatics field. Never before has such money and attention been lavished on it. HITECH provides a clear challenge for the field to 'get it right.' It will be interesting to look back on this time in the years ahead and see what worked and did not work. Whatever does happen, it is clear that informatics lives in a HITECH world now.” Informatics does live in this world now, and we must lead the way, not letting perfect get in the way of good, but making EHRs most useful for patients, clinicians, and all other participants in the healthcare system.

References

1. Washington, V, DeSalvo, K, et al. (2017). The HITECH era and the path forward. New England Journal of Medicine. 377: 904-906.
2. Halamka, JD and Tripathi, M (2017). The HITECH Era in Retrospect. New England Journal of Medicine. 377: 907-909.
3. Pratt, MK (2017). Physicians dream up a better EHR. Medical Economics, May 22, 2017. http://medicaleconomics.modernmedicine.com/medical-economics/news/physicians-dream-better-ehr.
4. Mandl, KD and Kohane, IS (2017). A 21st-century health IT system — creating a real-world information economy. New England Journal of Medicine. 376: 1905-1907.
5. Goroll, AH (2017). Emerging from EHR purgatory — moving from process to outcomes. New England Journal of Medicine. 376: 2004-2006.

Sunday, July 9, 2017

Kudos for the Informatics Professor, Winter/Spring 2017

As always, I have had ongoing opportunities to publish, speak, and otherwise disseminate information about informatics since my last “kudos” posting last fall.

One accolade I received was election as an inaugural member of the International Academy of Health Sciences Informatics (IAHSI). Informatics leaders from around the world voted to establish the initial membership of 121. I was delighted to be among the inaugural group, which will be inducted during the 16th World Congress on Medical and Health Informatics (Medinfo 2017) in Hangzhou, China in August, 2017.

I am also pleased to report a major accomplishment of the Oregon Health & Science University (OHSU) Biomedical Informatics Graduate Program, of which I am Director: we received notice of renewal of our NIH National Library of Medicine (NLM) Training Grant in Biomedical Informatics & Data Science. The grant will provide $3.8 million to fund PhD students and postdoctoral fellows in the program over the next five years.

During this time I also had the opportunity to publish a chapter in an important new book published by the American Medical Association, which I have already written about (Hersh W, Ehrenfeld J, Clinical Informatics, in Skochelak SE and Hawkins RE (eds.), Health Systems Science, 2017, 105-116).

I also gave a number of talks during this time, including one at the Data Day Health event in Austin, TX on January 15, 2017. The title of my talk was, Big Data Is Not Enough: People and Systems Are Needed to Benefit Health and Biomedicine.

I gave another talk at an interesting conference devoted to the challenges of the electronic health record. The conference, The Patient, the Practitioner, and the Computer, took place in Providence, RI on March 17-19, 2017. The title of my talk was, Failure to Translate: Why Have Evidence-Based EHR Interventions Not Generalized? This talk laid the groundwork for my subsequent posting published in this blog as well as in The Health Care Blog.

Finally, I also had the opportunity to lead a couple of webinars. One was for the H3ABioNet Seminars series of the Pan African Bioinformatics Network for H3Africa, which took place on April 18, 2017 and covered the same topic as the Data Day Health talk described above.

The other Webinar, Implementing Clinical Informatics in the MD Curriculum and Beyond, was delivered to the Association of Faculties of Medicine of Canada on June 13, 2017.

Monday, July 3, 2017

Eligibility for the Clinical Informatics Subspecialty: 2017 Update

Some of the most highly viewed posts in this blog have been those on eligibility for the clinical informatics subspecialty for physicians, the first in January, 2013 and updates in June, 2014 and March, 2016. A noteworthy event occurred last November when the "grandfathering" period was extended to 2022.

One of the reasons for these posts has been to use them as a starting point for replying to those who email or otherwise contact me with questions about their own eligibility. After all these years, I still get such emails and inquiries. While the advice in the previous posts is largely still correct, there have been a number of small changes, most notably the extension of the grandfathering period. There are still (only) two boards that qualify physicians for the exam, the American Board of Preventive Medicine (ABPM) and the American Board of Pathology (ABP). ABP handles qualifications for those with Pathology as a primary specialty and ABPM handles those from all other primary specialties. (Kudos to ABPM for finally updating and improving their Web site!)

The official eligibility statement for the subspecialty is essentially unchanged from the beginning of the grandfathering period and is documented on the ABPM and ABP Web sites. Because clinical informatics has been designated a subspecialty of all medical specialties, physicians must be board-certified in one of the 23 primary specialties (such as Internal Medicine, Family Medicine, Surgery, Radiology, etc.). Those who have let their primary board certification lapse, or who never had one, are not eligible to become board-certified in the subspecialty. Candidates must also have an active and unrestricted medical license in at least one US state.

For the first ten years of the subspecialty (through 2022), the "practice pathway" or completing a "non-traditional fellowship" (i.e., one not accredited by the Accreditation Council for Graduate Medical Education, or ACGME) will allow physicians to "grandfather" the training requirements, i.e., take the exam without completing a formal fellowship accredited by the ACGME. The practice pathway requires that a physician have "practiced" clinical informatics for a minimum of 25% time for three of the last five years. Time spent in formal informatics education is credited at one-half of practice, meaning that a recent master's degree or other educational program should be sufficient to achieve board eligibility. The non-traditional fellowship option allows board eligibility by completing a non-ACGME-accredited informatics fellowship, such as one sponsored by the National Library of Medicine, the Veterans Administration, or others. The ABPM Web site implies, but does not explicitly state, that a master's degree program will qualify one via this pathway as well. A number of physicians have achieved board eligibility (and subsequent certification) by completing the Master of Biomedical Informatics program we offer at Oregon Health & Science University (OHSU).
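
For those trying to estimate where they stand, the arithmetic of the practice pathway can be sketched roughly as follows. This is a minimal illustration of my reading of the rule above, using hypothetical numbers; it is not an official calculator, and ABPM's own determination is what actually counts.

# Rough sketch of the practice-pathway arithmetic as described above (not an official tool).
# For each of the last five years, supply the fraction of time spent practicing clinical
# informatics and the fraction spent in formal informatics education (credited at one-half).

def practice_pathway_estimate(years):
    """years: list of (practice_fraction, education_fraction) tuples for the last five years."""
    qualifying = 0
    for practice, education in years[-5:]:
        effective = practice + 0.5 * education  # education credited at one-half of practice
        if effective >= 0.25:
            qualifying += 1
    return qualifying >= 3  # at least 25% time for three of the last five years

# Hypothetical example: two years of a half-time master's program plus one year at 30% practice.
print(practice_pathway_estimate([(0.0, 0.5), (0.0, 0.5), (0.3, 0.0), (0.0, 0.0), (0.0, 0.0)]))  # True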

As always, I must provide the disclaimer that ABPM and ABP are the ultimate arbiters of eligibility, and anyone who has questions should contact ABPM or ABP. I only interpret their rules.

One bit of advice I always give to any physician who meets the practice pathway qualifications (or can do so by 2022) is to sit for the exam before the end of the grandfathering period. After that time, the only way to become certified in the subspecialty will be to complete a two-year, on-site, ACGME-accredited fellowship. While we were excited to be the third program nationally to launch a fellowship at OHSU, it will be a challenge for those who are mid-career, with jobs, family, and/or geographical roots, to pick up and move to become board-certified.

Starting in 2023, however, the only pathway to board eligibility will be via an ACGME-accredited fellowship. There are now nearly 30 such fellowships, but board certification will become much more difficult for physicians who are unable to pursue one. There are many categories of individuals for whom getting certified in the subspecialty after the grandfathering period will be a challenge:
  • Those who are mid-career - I have written in the past that the age range of OHSU online informatics students, including physicians, is spread almost evenly across all ages up to 65. Many physicians transition into informatics during the course of their careers, and not necessarily at the start.
  • Those pursuing research training in informatics, such as an NLM fellowship or, in the case of some of our current students, an MD/PhD program (who will not finish their residency until after the grandfathering period ends) - Why should these individuals also need to pursue an ACGME-accredited clinical fellowship to be eligible for the board exam, given their comparable level of informatics training, even if it is somewhat less clinical?
  • Those who already have had long medical training experiences, such as subspecialists with six or more years of training - Would such individuals want to do two additional years of informatics when, as I recently pointed out, it might be an ideal experience for them to overlay informatics and their subspecialty training?
There will be one other option for physicians who are not eligible for the board exam, which will be the Advanced Health Informatics Certification (AHIC) being developed by AMIA. This certification, according to current plans, will be available to all practitioners of informatics who have master's degrees in both a health profession and informatics, or a PhD in informatics alone. This will also provide an option for physicians who are not eligible for board certification. I am looking forward to AMIA releasing its detailed plans for this certification, not only for these physicians but also for other practitioners of informatics.

As I have stated before, I still hold out hope for the ideal situation for physician-informaticians, which in my opinion would be our own specialty or some other certification process. The work of informatics carried out by physicians is unique and not really dependent on their initial clinical specialty (or lack of one at all). I still believe that robust training is required to be an informatician; I just don't believe it needs to be a two-year, in-residence experience. An online master's degree or something equivalent, with a good deal of experiential learning in real-world settings, should be an option. The lack of such options will keep many talented physicians from joining the field. Such training would also be consistent with the 21st-century knowledge workforce, which will involve many career transitions over one's working lifetime.

Friday, May 19, 2017

Failure to Translate: Why Have Evidence-Based EHR Interventions Not Generalized?

The adoption of electronic health records (EHRs) has increased substantially in hospitals and clinician offices, in large part due to the “meaningful use” program of the Health Information Technology for Economic and Clinical Health (HITECH) Act. The motivation for increasing EHR use in the HITECH Act was supported by evidence-based interventions for known significant problems in healthcare. In spite of widespread adoption, EHRs have become a significant burden to physicians in terms of time and dissatisfaction with practice. This raises the question of why EHR interventions have been difficult to generalize across the health care system, despite evidence that they contribute to addressing major challenges in health care.

EHR interventions address known problems in health care of patient safety, quality of care, cost, and accessibility of information. These problems were identified a decade or two ago but still persist. Patient safety problems due to medical errors were brought to light with the publication of the Institute of Medicine report, To Err is Human [1], with recent analyses indicating medical errors are still a problem and may be underestimated [2]. Deficiencies in the quality of medical care delivered were identified almost a decade and a half ago [3] and continue to be a problem [4]. The excess cost of care in the US has been a persistent challenge [5] and continues to the present [6]. A final problem motivating the use of EHRs has been access to patient information that is known to exist but is inaccessible [7], with access now stymied by “information blocking” [8].

These problems motivated initial research on the value of EHRs. One early study found that display of charges during order entry resulted in a 12.7% decrease in total charges and 0.9 days shorter length of stay [9]. Another study found that computerized provider order entry (CPOE) reduced nonintercepted serious medication errors by 55%, from 10.7 events per 1000 patient-days to 4.86 events, with preventable adverse drug events (ADEs) reduced by 17% [10]. Additional studies of CPOE showed a reduction in redundant laboratory tests [11] and improved prescribing of equally efficacious but less costly medications [12]. Another study found that CPOE increased the use of important “corollary orders” by 25% [13]. Additional studies from many institutions followed and were collated in systematic reviews that built the evidence-based case for EHRs [14-17]. There were some caveats about the evidence base, such as publication bias [18] and the benefits mostly emanating from “health IT leader” institutions that made investments both in EHRs and in the personnel and leadership to use them successfully.

Despite the robust evidence base, why have the benefits of EHRs failed to generalize now that adoption is widespread? There are several reasons, some of which emanate from well-intentioned circumvention of the EHR for other purposes. For example, both institutions and payers (including the US government) view the EHR as a tool and reprioritize its functions toward cost reduction. There is also a desire to use the EHR to collect data for quality measurement - which should be done - but not in ways that add substantial burden to the clinician. Additionally, there are the meaningful use regulations, which were implemented to ensure that the substantial government investment in EHRs led to their use in clinically important ways but are now criticized as being a distraction for clinicians and vendors.

There are also some less nobly intentioned reasons why the value of EHRs has not generalized. One is “volume-based billing,” or the connection of billing to the volume of documentation, which leads to pernicious documentation practices [19]. Another is the financial motivation of EHR vendors, who may be selling systems that are burdensome to use or not ready for widespread adoption. Much of the early evidence for the benefits of EHRs came from “home-grown” systems, most of which have been replaced by commercial EHRs. These commercial EHRs do more than just provide clinical functionality; they redesign the delivery of care, sometimes beneficially and other times not. It can thus take a large expenditure on EHR infrastructure before any marginal benefit from a particular clinical function can be achieved, even if the rationale for that function is evidence-based.

Nonetheless, a number of “health IT leader” institutions have sustained successful EHR use and quality of care, such as Kaiser Permanente [20], Geisinger [21], and the Veterans Health Administration [22]. These institutions are not only integrated delivery systems but also have substantial expertise in clinical informatics. These qualities enable them to prioritize use of IT in the context of patients and practitioners as well as incorporate known best practices from clinical informatics focused on standards, interoperability, usability, workflow, and user engagement.

How, then, do we move forward? We can start by building on the technology foundation, albeit imperfect, that has come about from the HITECH Act. We must focus on translation, aiming to understand how to implement, across diverse settings, functionality that is well supported by the evidence, while carrying out further research in areas where the evidence is less clear. As with any clinical intervention, we must pay attention to both beneficial and adverse effects, learning from the growing body of knowledge on safe use of EHRs [23]. We must also train and deploy clinician informatics leaders who provide expertise at the intersection of health care and IT [24].

Finally, we should also reflect on the larger value of IT in health care settings. Approaches to cost containment, quality measurement, and billing via documentation must be reformulated to leverage the EHR and reduce the burden on clinicians. We should focus on issues such as practice and IT system redesign, best practices for the patient-practitioner-computer triad, and practitioner well-being [25]. We must build on value from other uses of EHRs and IT, including patient engagement and support for clinical research. Leadership for these changes must come from leading health care systems, professional associations, academia, and government.

References

1. Kohn LT, Corrigan JM, and Donaldson MS, eds. To Err Is Human: Building a Safer Health System. 2000, National Academies Press: Washington, DC.
2. Classen DC, Resar R, Griffin F, Federico F, Frankel T, Kimmel N, et al., 'Global trigger tool' shows that adverse events in hospitals may be ten times greater than previously measured. Health Aff, 2011. 30: 581-589.
3. McGlynn EA, Asch SM, Adams J, Keesey J, Hicks J, DeCristofaro A, et al., The quality of health care delivered to adults in the United States. N Engl J Med, 2003. 348: 2635-2645.
4. Levine DM, Linder JA, and Landon BE, The quality of outpatient care delivered to adults in the United States, 2002 to 2013. JAMA Intern Med, 2016. 176: 1778-1790.
5. Anderson GF, Frogner BK, Johns RA, and Reinhardt UE, Health care spending and use of information technology in OECD countries. Health Aff, 2006. 25: 819-831.
6. Squires D and Anderson C, U.S. Health Care from a Global Perspective: Spending, Use of Services, Prices, and Health in 13 Countries. 2015, The Commonwealth Fund: New York, NY, http://www.commonwealthfund.org/publications/issue-briefs/2015/oct/us-health-care-from-a-global-perspective.
7. Smith PC, Araya-Guerra R, Bublitz C, Parnes B, Dickinson LM, VanVorst R, et al., Missing clinical information during primary care visits. JAMA, 2005. 293: 565-571.
8. Adler-Milstein J and Pfeifer E, Information blocking: is it occurring and what policy strategies can address it? Milbank Q, 2017. 95: 117-135.
9. Tierney WM, Miller ME, Overhage JM, and McDonald CJ, Physician inpatient order writing on microcomputer workstations: effects on resource utilization. JAMA, 1993. 269: 379-383.
10. Bates DW, Leape LL, Cullen DJ, Laird N, Petersen LA, Teich JM, et al., Effect of computerized physician order entry and a team intervention on prevention of serious medication errors. JAMA, 1998. 280: 1311-1316.
11. Bates DW, Kuperman GJ, Rittenberg E, Teich JM, Fiskio J, Ma'luf N, et al., A randomized trial of a computer-based intervention to reduce utilization of redundant laboratory tests. Am J Med, 1999. 106: 144-150.
12. Teich JM, Merchia PR, Schmiz JL, Kuperman GJ, Spurr CD, and Bates DW, Effects of computerized physician order entry on prescribing practices. Arch Int Med, 2000. 160: 2741-2747.
13. Overhage JM, Tierney WM, Zhou XH, and McDonald CJ, A randomized trial of "corollary orders" to prevent errors of omission. J Am Med Inform Assoc, 1997. 4: 364-375.
14. Chaudhry B, Wang J, Wu S, Maglione M, Mojica W, Roth E, et al., Systematic review: impact of health information technology on quality, efficiency, and costs of medical care. Ann Intern Med, 2006. 144: 742-752.
15. Goldzweig CL, Towfigh A, Maglione M, and Shekelle PG, Costs and benefits of health information technology: new trends from the literature. Health Aff, 2009. 28: w282-w293.
16. Buntin MB, Burke MF, Hoaglin MC, and Blumenthal D, The benefits of health information technology: a review of the recent literature shows predominantly positive results. Health Aff, 2011. 30: 464-471.
17. Jones EB and Furukawa MF, Adoption and use of electronic health records among federally qualified health centers grew substantially during 2010-12. Health Aff, 2014. 33: 1254-1261.
18. Vawdrey DK and Hripcsak G, Publication bias in clinical trials of electronic health records. J Biomed Inform, 2013. 46: 139-141.
19. Kuhn T, Basch P, Barr M, and Yackel T, Clinical documentation in the 21st century: executive summary of a policy position paper from the American College of Physicians. Ann Intern Med, 2015. 162: 301-303.
20. Liang LL, Connected for Health - Using Electronic Health Records to Transform Care Delivery. 2010, San Francisco, CA: Jossey-Bass.
21. Maeng DD, Davis DE, Tomcavage J, Graf TR, and Procopio KM, Improving patient experience by transforming primary care: evidence from Geisinger's patient-centered medical homes. Pop Health Manag, 2013. 16: 157-163.
22. Longman P, Best Care Anywhere: Why VA Health Care is Better Than Yours. 2007, Sausalito, CA: Polipoint Press.
23. Sittig DF, Ash JS, and Singh H, The SAFER guides: empowering organizations to improve the safety and effectiveness of electronic health records. Am J Manag Care, 2014. 20: 418-423.
24. Detmer DE and Shortliffe EH, Clinical informatics: prospects for a new medical subspecialty. JAMA, 2014. 311: 2067-2068.
25. Adler-Milstein J, Embi PJ, Middleton B, Sarkar IN, and Smith J, Crossing the health IT chasm: considerations and policy recommendations to overcome current challenges and enable value-based care. J Am Med Inform Assoc, 2017: Epub ahead of print.

Monday, May 1, 2017

Navigating OHSU Informatics Education Programs and Content

Not infrequently, I receive emails asking about, or even expressing confusion about, the various informatics educational programs and products of Oregon Health & Science University (OHSU). With a couple of grant-funded curriculum development projects about to end, this is probably a good time for a posting here to help sort things out. Before I do that, I must give a plug to US News & World Report, which recently highlighted informatics as a graduate health degree that expands both knowledge and career opportunities.

OHSU has a number of educational programs in biomedical informatics. The core of all these programs is the Biomedical Informatics Graduate Program, which provides master's and PhD degrees in two tracks, health and clinical informatics (HCI) and bioinformatics and computational biomedicine (BCB). The HCI track also offers a Graduate Certificate that is a subset of the master's program, and these two programs are available in a distance learning format.

OHSU also offers two fellowship programs. One is a long-standing research-oriented fellowship program for PhD and postdoctoral students funded by the National Library of Medicine. The postdoctoral option also includes a master's degree. More recently, a clinically oriented fellowship for physicians has been launched. This fellowship is accredited by the Accreditation Council for Graduate Medical Education (ACGME) and allows sitting for the clinical informatics subspecialty board exam. The clinical informatics fellowship also provides the Graduate Certificate, with an option to pursue the master's degree.

OHSU also was the original participant in the AMIA 10x10 (“ten by ten”) program. The OHSU 10x10 course is a repackaging of the introductory course from the HCI track of the graduate program, and those completing the OHSU 10x10 course can take the optional final exam to receive academic credit from OHSU.

The OHSU biomedical informatics program has also participated in the development of a number of public repositories of educational materials that have been funded by US federal grants. OHSU was funded to develop both the original Office of the National Coordinator for Health IT (ONC) curriculum and its subsequent update. Development of the original curriculum was stopped when funding ended in 2013, with the archive freely available on the American Medical Informatics Association (AMIA) Web site. The updated curriculum has been expanded to 24 components, each about the size of a college course. It has been under development since 2015 and will be made publicly available on the ONC Web site (HealthIT.gov) this summer.

All of the grantees of the ONC update project have also been required to offer short-term training to 1000 incumbent healthcare professionals. The OHSU offering has focused on healthcare data analytics and has also provided continuing medical education (CME) credit for all physicians as well as Maintenance of Certification (MOC)-II credit for physicians certified in the clinical informatics subspecialty. The free courses offered as part of the ONC grant will be wrapping up at the end of May. We will likely offer them again in the future for a fee.

OHSU has also been funded, through two grants under the Big Data to Knowledge (BD2K) initiative of the National Institutes of Health (NIH), to develop open educational resources (OERs) and data skills courses. About 20 modules have been developed on various topics in biomedical science. The materials from this project are currently housed on a Web site that will transition to a permanent archive on GitHub when funding for the project ends later this year.

The long-term maintenance of repository materials is uncertain at this time. We are hopeful that resources to keep them up to date will be found, and OHSU will certainly continue to use them in its own educational programs.

Sunday, April 16, 2017

Participating in the March for Science

I plan to participate in the March for Science in Portland on April 22nd. I did not come to this decision lightly. This was different, for example, from my decision to participate in the Women’s March in January. That was a very easy decision to make based on my political views.

But science is more than just politics to me. It is of course my livelihood, as I am a faculty member at Oregon Health & Science University (OHSU) and one whose work is supported by public funds. Science is also, however, the dispassionate pursuit - to the best of human ability - of truth. As such, I do not want to see science subverted for political or other aims. I also want to be careful that others do not subvert the message of the march itself, which in my view is to inform people about the value of public support and taxpayer funding of science.

I am pleased that many organizations have reached similar decisions. The American Association for the Advancement of Science (AAAS) has endorsed the march, and I agree with their statement that the march is a "nonpartisan set of activities that aim to promote science education and the use of scientific evidence to inform policy." I am pleased that OHSU supports the march as well.

In the end, my concerns about the real threats to science outweighed my worries about science being subverted by politics. I consider the threats to science by the current political leadership of the US to be significant. I do not consider science to be a partisan subject. I cannot look at climate change, gun violence, or immunization-preventable diseases and state that research about them is driven by an ideological or political agenda. Yes, science is always full of disagreement and is never truly “settled,” but there are bounds of truth, and there is always a need to probe further even what we believe to be true. One of the beauties of science and its dispassionate search for answers is that it is self-correcting. So when science gets something wrong, there is a good likelihood that it will be corrected by further research.

Of course, I also recognize that the public purse to fund science, or anything else that the government funds, is not unlimited. That is why we have a political process to debate and enact appropriate amounts of taxation and public spending. It is also important to remember that basic research funded by the government is not research that would be funded by private industry. In fact, industry well knows it benefits from the basic research that enables companies to develop profitable products.

I also believe that the scientific enterprise in the US is a very efficient and effective allocation of tax dollars. The National Institutes of Health (NIH) budget of about $30 billion is about 1% of overall federal spending. Federal research grants not only fund scientific research but also education and training for the next generations of scientists, clinicians, and others. NIH funding is mostly awarded through highly competitive funding opportunities that often have success rates of only 10-15%. Despite what detractors say, the life of writing grant proposals is not a cushy one.

Even beyond the research itself, the money spent on scientific research brings money back to communities. When a faculty member like myself is awarded a grant, that money not only advances research and education, it creates jobs for people in the local community. In turn, all of those who are funded by the grant turn around and spend money in grocery stores, restaurants, and other local businesses. And of course the reality is that if my institution and state and local governments were not funded by this money, it would end up in other states. OHSU commissioned a study about five years ago showing that money spent on the institution has a multiplier effect on the local economy.

Although this march is about science generally, I hope that some other points specific to biomedical research come across. As pointed out by OHSU leadership, an abrupt cut in funding, such as the 18% cut to NIH proposed by the Trump Administration, will have an outsized impact because most NIH grants are multi-year awards. This means that only a portion of a given year’s funding goes to new projects. An abrupt cut would mean that, in its first year, very few new grants could be awarded. Given how competitive the environment for funding already is, we stand to lose the momentum of both established and emerging scientists.

I also hope that another point comes out: the misguided plan to, in essence, eliminate the Agency for Healthcare Research & Quality (AHRQ). While it will ostensibly be folded into NIH (AHRQ currently exists within the Department of Health and Human Services but outside NIH), the claim that its research is duplicated by other NIH entities is simply not true. AHRQ performs critical and novel research in under-researched areas of health and healthcare, such as patient safety, healthcare quality, and evidence-based medicine. As with other basic research, industry may benefit from and even develop products based on this research, but it is too far removed from their product cycle for them to want to fund it. As this is not the first time that efforts have been made to de-fund AHRQ, I have written about the value of AHRQ before.

I look forward to participating in the March for Science and advocating for the benefits of scientific research and its funding by the federal government. I hope the outpouring of support will educate people about the downside of neglecting basic scientific research and the importance of training new scientists.

Sunday, February 19, 2017

Big Change, Little Change: OHSU Biomedical Informatics Graduate Program Renames Tracks

The Oregon Health & Science University (OHSU) Biomedical Informatics Graduate Program is renaming the two tracks of its program. While the changes to the names of the tracks are small, they reflect the big changes in the field and evolving content of the curriculum.

Since 2006, the program has had two “tracks,” which have been called Clinical Informatics (CI) and Bioinformatics & Computational Biology (BCB). These two pathways through the program have been called “tracks” because they represent two different foci within the larger field of biomedical informatics, which is the discipline that acquires, organizes, and uses data, information, and knowledge to advance health-related sciences. Historically, the differences between the tracks represented their informatics focus, in particular people, populations, and healthcare (clinical informatics) vs. cellular and molecular biology, genomics, and imaging (bioinformatics).

In recent years, however, these distinctions have blurred as “omics” science has worked its way into clinical medicine. At the same time, health, healthcare, and public health have become much more data-driven, due in no small part to the large-scale adoption of electronic health records. As such, the two tracks have begun to represent different but still distinct foci, mostly in their depth of quantitative methods (deep vs. applied) but also in coverage of other topics (e.g., system implementation, especially in complex health environments; usability; and clinical data quality and standards).

The program believes that both tracks possess a set of common competencies at a high level that reflect the essential knowledge and skills of individuals who work in biomedical informatics. The curriculum organizes these competencies into “domains,” which are groups of required and elective courses that comprise the core curriculum of each track. To reflect the evolution of the program, the program has renamed the BCB track to Bioinformatics and Computational Biomedicine (still abbreviated BCB) and the CI track to Health and Clinical Informatics (now abbreviated HCI). The table below lists the common competencies and the names of the domains for each track. Each of the domains contains required courses, individual competency courses (where students are required to select a certain number of courses from a larger list, formerly called “k of n” courses), and elective courses.


The program will continue the overall structure of the curriculum with the “knowledge base” that represents the core curriculum of the master’s degree and the base curriculum for advanced study in the PhD program. A thesis or capstone is added to the knowledge base to qualify for the MS or MBI degree, respectively (the latter in the HCI track only). Additional courses are required for the PhD, ultimately culminating in a dissertation.

The materials and Web site for the program will be updated quickly to reflect the new names. The program will also be evolving course content as well as introducing new courses to reflect the foci of the new tracks. The program still fundamentally aims to train future researchers and leaders in the field of biomedical informatics.