One question that arises is who will provide all this education and training. A number of people have advocated that community colleges carry it out. A recent article in Healthcare IT News interviewed two such advocates, a health insurance company executive and the president of a community college association, who argued that community colleges should play that role.
In a rebuttal commentary, however, I replied that I was not so sure. There is no doubt that many health IT jobs will go to those educated in community colleges, such as the "informatics technicians" noted in a recent CNN posting on "emerging jobs poised for growth." But these are distinct from the emerging clinical informatics role, which requires a combination of understanding the clinical environment and its workflows, the ability to apply advanced information analysis (more so than IT or computer science skills), and a myriad of business and soft skills. As the director of an informatics graduate program, I acknowledge my bias, but I argued in my commentary that these programs, by slightly re-orienting and focusing their curricula, may be better suited to training this workforce. Since the proposed training is necessarily short-term, I noted that we are re-configuring our Graduate Certificate program so it can be completed in six months of full-time study.
One line of evidence supporting my view comes from the Health IT Compensation Survey (Vendome, 2009). This year's survey features a wealth of data that goes well beyond compensation, providing an interesting synopsis of the job functions and educational backgrounds of a wide variety of people who work in the industry. It segments respondents by job setting (e.g., hospital, company), and within every segment it subdivides people into leadership, clinical, and non-clinical positions.
Those in hospitals make up the largest segment in the survey, so I will focus on them. Among the leaders, 18% have doctoral or professional degrees, 48% have master's degrees, and all but 4% of the remainder have bachelor's degrees. The survey subdivides the clinical and non-clinical professionals into "high authority" and "low authority" groups. The breakdown of degrees within these groups is:
- Clinical/High Authority: 34% have doctoral or professional degrees, 29% have master's degrees, and 30% have bachelor's degrees
- Clinical/Low Authority: 20% have doctoral or professional degrees, 31% have master's degrees, and 35% have bachelor's degrees
- Non-Clinical/High Authority: 1% have doctoral or professional degrees, 36% have master's degrees, and 38% have bachelor's degrees
- Non-Clinical/Low Authority: 1% have doctoral or professional degrees, 24% have master's degrees, and 51% have bachelor's degrees
I do realize that community colleges play a strong role in rapidly adapting to the skills needs of their communities, and that many of their students already hold bachelor's or even graduate degrees and return to acquire new skills. There is also no question that some health IT jobs will require the kinds of skills that community colleges already teach, such as those in pure IT. I acknowledge that the person hired to harden a server against security breaches probably does not need courses in change management. But many others who work in health IT do!
The reality is that few community colleges have faculty with expertise in clinical informatics, which is not, as many seem to think, the mere addition of computer science, health information management, and health care courses. Informatics arises at the unique intersection of those areas, and the expertise for teaching it currently resides mostly in graduate-level informatics programs.