For several years, I have advocated that the city of Portland and state of Oregon have the necessary ingredients to develop an industry cluster in health and biomedical information technology (IT). I expounded this vision in an Op-Ed piece in the Oregonian in 2008 and in the Silicon Forest blog in 2009.
Some recent happenings in the area make this vision more compelling. The first piece of news is not directly related to biomedical informatics but is relevant to the currently beleaguered Oregon economy: Intel, one of our major local high-tech employers, has announced plans to invest several billion dollars in renovating existing production facilities and building a new research and development center. This is good news for the local economy because of the promise of high-skill, high-paying jobs. It is also synergistic with other local development efforts, including those led by the Portland Development Commission, which identifies software as one of four key clusters for economic development.
There are also specific instances of highly visible health IT companies, such as Kryptiq, which just received an investment from the large national e-prescribing company Surescripts. Oregon also has the cachet of a strong open-source software community and the surrounding business activity to make it sustainable, not only in general but also specifically in health and healthcare. A local company that exemplifies this approach is the Collaborative Software Initiative, with its focus on public health.
I have always argued that Oregon is a potential hub for health and biomedical IT because of the confluence of strong industry, innovation in the healthcare delivery sector (Oregon is one of those "high quality, low cost" states), and the presence of a world-class academic program in biomedical and health informatics. I believe that these attributes can combine synergistically to foster economic development, improve the quality of people's health, and provide leadership and innovation in health and biomedical IT.
One encouraging recent happening is the publication of a draft report for comments calling for Portland State University (PSU) and Oregon Health & Science University (OHSU) to expand their collaboration by developing a formal strategic alliance. The report explicitly calls out the potential for developing joint programs in biomedical and health informatics.
There are other cities and regions that aspire to leadership in this area. The city of Atlanta recently published a glossy piece touting itself as an "epicenter" of health IT. The larger healthcare entrepreneurship scene in Nashville also includes a component of health IT. I hope the leaders in Portland and Oregon will share this vision.
Wednesday, October 27, 2010
Thursday, October 14, 2010
The Informatics Outlook for Physicians
In looking over the topics I have addressed in the year and a half of this blog, I have tried to convey biomedical and health informatics as a broad field with many roles and opportunities for people from a wide variety of professional backgrounds. One group that I have not addressed explicitly is physicians.
I admittedly have a kinship with physicians in the informatics field. After all, I am a physician by training, and even though I no longer actively care for patients, my training and early career experience provide a perspective that informs my understanding of the role of physicians in informatics.
Overall, the opportunities for physicians in informatics are substantial and growing. While many early informatics roles for physicians focused on research and development, the real growth opportunities are now for those seeking to be informatics practitioners. These practitioners play a variety of roles, not only in planning and implementing systems but also in deriving value from the information within them.
An ever-increasing number of healthcare organizations have recognized the importance of physician informatics leadership, manifested most frequently in positions that go by the name of Chief Medical Information Officer or Chief Medical Informatics Officer (both of which conveniently are represented by the acronym CMIO). While the CMIO position is probably now the most visible physician role in informatics, it is hardly the only one. Physicians also play other roles in healthcare organizations as well as in other entities, such as vendor, consulting, government, and research organizations.
The OHSU biomedical informatics graduate program has always had a strong representation from physicians, who comprise about 50% of our enrollment. They are, however, not the only demographic of student in the program, as we also have students from other healthcare professions (e.g., nurses, pharmacists, lab/radiology technicians, health information managers) as well as from outside the health professions (e.g., information technology, computer science, and even further afield from law, biology, business, and others). Furthermore, the professional diversity of our program has always, in my mind, been one of its assets, even though trying to teach informatics simultaneously to a physician, nurse, computer scientist, and businessperson can be a challenge!
But it has nonetheless been gratifying to see many physicians go on to assume roles and leadership in the field. The diverse roles taken by physicians who enter the field exemplify the expansion of these opportunities.
There are a number of issues for physicians contemplating careers in informatics to ponder. One concerns training. Should they seek formal training at all? If so, how much, and in what kinds of programs? I honestly cannot give an unequivocal answer. There are many physicians who move into informatics roles without any formal training. However, I do believe that over time formal training will become a requisite for informatics jobs. If nothing else, one's competitors for those jobs will have such training.
As for how much training is needed, that too is uncertain. There is a growing recognized knowledge base for the informatics field, as well as recognition of an increasing number of best practices. Physician-informaticians may not need to understand all the technical details of the systems with which they work, but they must have the big picture of both the technology and how it fits into their environment.
Another issue on the horizon for physicians is certification, in particular the proposed clinical informatics subspecialty. This subspecialty will be available to physicians in many, perhaps all, specialties (e.g., internal medicine, pediatrics, family medicine, surgery, etc.). There are still many unknowns about this process, such as how informatics experience and training other than formal on-site fellowship training will be viewed, and how physicians without board certification might be able to take part. Nonetheless, certification is important in healthcare professions, and certification in informatics will lead to more professional recognition of the field.
I believe it is safe to conclude that there are tremendous opportunities for physicians to be innovators and leaders in the proper and most effective use of information technology (IT), not only in healthcare but also in personal health, public health, and research.
Despite the uncertainty about some of the details, the outlook for physicians in informatics is bright, even after the initial wave of EHR adoption is complete (as addressed in a previous blog entry). The need for expertise in health IT implementation will only increase, especially as we see more coordination and quality measurement of care delivery.
Labels: biomedical informatics, certification, physicians
Tuesday, October 12, 2010
A Teachable Moment About Healthcare Reform and Markets
This is not a political blog, and I prefer to keep the postings focused on biomedical and health informatics. However, as an educator, I sometimes feel compelled to do some educating to remind people of the facts in a debate where there are substantial misperceptions. I believe this to be the case in the healthcare reform debate. While I do recognize there are different positions on the solutions to the problem, I believe there is nonetheless a factual basis upon which to base the discussion. One of my favorite quotes in life comes from the late Sen. Daniel Patrick Moynihan, who stated, "Everyone is entitled to his own opinion, but not his own facts."
I certainly have my opinions on how healthcare should be reformed in the United States, but I will have those debates elsewhere. I do, however, believe we need to get the facts straight. To that end, I had an Op-Ed column published in this week's Oregonian newspaper. The text is reproduced below. I do not have much optimism that this piece can alter the substance or tone of what passes for our political debate, but I do hope that it might cause some to think.
_____________________________________
Last week's article in The Oregonian about health care costs varying widely by hospital was hardly surprising to anyone familiar with the health care "marketplace." The problem with health care is that it doesn't obey the principles of markets, and the problem is unlikely to be fixed by letting the market work.
Before we even think about pricing of health care items, we must first remember that Americans don't commonly purchase health care. Rather, we purchase health insurance. Only the very rich, and certainly not the upper middle class or anyone less wealthy, could afford to buy health care on a per-item basis. Instead, we buy insurance, which means that the ability to afford substantial medical expenses will be possible if and when we need it. Naturally, we hope we remain healthy and don't need it.
Because we buy insurance and not care, we need to think about health care purchasing in terms of setting reasonable prices that large-volume insurers, including federal and state governments, can negotiate. Those opposed to health care reform proclaim that people should not be forced to buy insurance "they don't need." But buying something we hope we'll not need is the whole idea behind insurance. If we all pay something for insurance, then we spread the risk for those with truly high expenses. If we let people wait until they get sick to buy insurance, we defeat the purpose of insurance. That's one of the reasons why it's essential that young, healthy people be required to purchase health insurance.
We also need to think differently about "rationing" of health care, giving up the notion that it should never occur. As most free-market economists will tell you, rationing is a good thing. Rationing is the means by which free markets work, determining, for example, whether we can afford a particular house, car or computer. So the issue is not whether to ration health care, but rather how we will go about doing it, either through the purely free market or by some mechanism that attempts to maximize the allocation of health care resources to achieve the greatest common good. Saying that all health care decisions must be made between a patient and his or her physician is not an answer, since such a system is not economically sustainable and provides no mechanism to achieve any kind of rationing, even rationing by purely market mechanisms.
There are other aspects of the health care marketplace that we must remember when thinking about the price of care. When it comes to acute illness, few people are in the position to comparison shop on price, quality or anything else. If you suffer serious trauma or an acute life-threatening event, such as a heart attack, you generally go to the nearest hospital. Even if your illness is less acute and you can take time to make a decision, our health care system doesn't have the ability to provide information that would enable you to make truly informed decisions about quality or cost. Reputation and anecdotes about hospitals and clinicians are only that, and do not provide details on quality and skill. Furthermore, few patients are willing to go against the advice of their physician when recommendations for tests and/or treatments are given. There is truly little to keep people from spending money that the health care system wants them to spend.
And the system wants people to spend in a big way.
One of the most notorious examples of that is pharmaceutical companies. While these companies have created truly life-saving products over the years, they're also effective at creating medical conditions or advocating for prescribing of their products that people don't necessarily need. Even physicians sometimes have incentives to advocate for tests and treatments that patients truly don't need. There are too many entrenched self-interests in the health care system, which sometimes even piggyback on to "reform" efforts.
Some advocate for putting more financial burden on consumers through higher deductibles and co-pays, thus leading them to consider the cost of their care. While I'm not opposed to making consumers more cognizant of the cost of their care, the problem with this approach is that while individual people may have leverage with a physician practice, they have little if any leverage with hospitals or pharmaceutical companies. Another problem with increasing out-of-pocket health care costs is that consumers might be inclined to forgo screening or other preventive care that could reduce costs in the long run -- for example a colonoscopy that detects colon cancer at a very early stage when it's cheaper to treat.
Once we abandon the notion that markets will cure runaway health care costs, we can then work our way toward a meaningful conversation about costs and the role of government, insurers and others. It's unfortunate that this discussion has become so political and ideological, if not emotional, preventing us from having rational dialogue about the role of various participants in the system, including first and foremost the patient.
Thursday, October 7, 2010
New Data Reiterates Coming Need for Health IT Workforce
A recent survey from the College of Healthcare Information Management Executives (CHIME) provides additional data on the growing need for a skilled health IT workforce, with a particular need for "clinical software implementation and support staff." The survey was administered to healthcare CIOs in September 2010 and had 182 respondents, representing 13% of CHIME's membership. A summary and full report of the survey are available.
The respondents came from a variety of hospital types and sizes, from large academic centers to small community hospitals. Interestingly enough, the biggest needs, with staff shortages of more than 20%, were found at both the large and small ends of the institutional spectrum.
The highest proportion of open positions, as noted above, was in clinical software implementation and support, with 71% of CIOs reporting openings. The types of positions open included project managers, analysts, application coordinators, report writers, trainers, informatics staff, and technical staff. I consider all of these positions to be in the realm of "informatics," or at least positions for which informatics training would prepare one well.
One disappointing finding of the survey was that half of the respondents reported they did not foresee additional spending on bolstering IT staff. On the other hand, most of those organizations will be seeking their "meaningful use" incentive dollars, so hopefully their leadership can be convinced to invest in staff.
Healthcare organizations are not the only ones with needs and who are hiring. The EHR vendor Meditech reported that it will be hiring over 800 people in a new facility in Massachusetts. (Note to Oregon economic development leaders: There is opportunity for job creation in this field!)
These data are very consistent with a survey reported by HIMSS last spring, which had a total of 149 respondents. That survey found that 86% of organizations planned to hire additional IT staff in 2010. The areas in which respondents were most likely to hire included implementation support specialists (55%), implementation managers (51%), and technical support (48%). The area in which organizations most often felt they lacked qualified candidates was clinical informatics (30%), followed by implementation expert (26%) and software maintenance expert (17%).
Finally, the consulting firm CSC, whose web site of reports on various aspects of meaningful use is one of my favorites, has produced a report on HIT workforce shortages. The report summarizes the research (including my study that used HIMSS Analytics data), describes the ONC workforce development programs, and discusses the implications for healthcare organizations. The latter include competition for qualified staff; inexperience among those newly trained, leading to a lack of people with enough experience to assume leadership roles; attrition; and competition from other HIT tasks, such as ICD-10 implementation, HIPAA issues, and insurance exchanges.
The report's recommendation for overcoming these challenges is to "expand, retain, and exploit," i.e., train and develop staff from within as well as explore alternatives from outside the organization.
Of course, my advice is to hang tight, and hopefully the graduates from the newly funded ONC workforce development programs, including ours at OHSU, will start to fill the need soon.