The Truven Health Blog

The latest healthcare topics from a trusted, proven, and unbiased source.


Using Data to Improve Healthcare

By Truven Staff
As others have pointed out repeatedly, our healthcare system is badly broken. In fact, we don’t have a healthcare system in this country – it’s a series of independent businesses, often competing with one another for profit. The three constituencies in the healthcare business are the customers (patients), the providers (doctors and hospitals), and the payers (health plans, employers, and the government). These three groups have perfectly misaligned incentives: patients want care at minimal cost, providers make more money by providing more care (whether it is needed or not), and payers want to minimize payments. The payment mechanism drives more care at higher cost, and the result is that the U.S. spends 18% of its GDP on healthcare – far more than any other country on the planet.

How does smarter use of data help this picture? In my opinion, more intelligent use of data is an important part of the answer. Data is a powerful tool to help physicians make better decisions. In the hospital setting, physicians should have access to ALL of a patient’s medical record, not just information gathered during a single hospital stay. In most Emergency Departments, doctors don’t have unfettered access to outpatient medical records that may provide important clues to making correct diagnoses. Tests are needlessly repeated, incorrect medications are given, and diagnostic errors are made all too often. Electronic medical records (EMRs) should be helping with this problem, but unfortunately most EMRs are simply digitized versions of the old paper record. We need EMRs to be longitudinal electronic health records, aggregating all of a person’s health information into a single record used by all providers of care. A unified health record then needs analytic tools that use the comprehensive record to improve care: provide guidelines for evidence-based medicine, prevent incorrect medication use, stop dosing errors, and prompt clinicians before tests and X-rays are needlessly repeated – in sum, improve the care.
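To make the repeat-test prompt concrete, here is a minimal sketch (hypothetical rule and test codes, not any specific product) of how a longitudinal record makes a duplicate-test check nearly trivial – the key is that the order history spans all care settings, not just the current stay:

```python
from datetime import date, timedelta

# Hypothetical lookback windows: flag a newly ordered test if the same
# test already appears in the patient's longitudinal record recently.
LOOKBACK = {"chest_xray": timedelta(days=30), "hba1c": timedelta(days=90)}

def duplicate_test_alert(order, history):
    """order: (test_code, order_date); history: list of (test_code, date)
    drawn from ALL care settings, not just the current hospital stay."""
    code, when = order
    window = LOOKBACK.get(code)
    if window is None:
        return None  # no rule for this test
    priors = [d for c, d in history if c == code and when - d <= window]
    if priors:
        return f"{code} already performed on {max(priors)}; confirm repeat is needed"
    return None

history = [("chest_xray", date(2014, 3, 1)), ("hba1c", date(2013, 12, 15))]
alert = duplicate_test_alert(("chest_xray", date(2014, 3, 20)), history)
# Fires: a chest X-ray was done 19 days ago, inside the 30-day window.
```

With only the single-stay record, `history` would be empty and the prompt could never fire – which is the author’s point about longitudinal data.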

A unified, single health record for a patient would be a great tool to help improve care, but in the U.S., we have more fundamental problems than a lack of accessible data. Today’s residency training programs should teach physicians how to use data and EMRs to make better decisions. An evaluation of a patient should always start with the physician sitting with the patient, taking a probing history – knowing what questions to ask and how to elicit symptoms. That information is supplemented by a proper physical examination and by knowing how to put it all together to formulate a diagnosis. We cannot rely on an EMR or CT scans to do this job – it must start with a thorough history and a proper physical. One of the most impactful lessons I was taught in residency was that if I finished taking a patient’s medical history and still didn’t have a series of probable diagnoses to consider, I needed to take more history. Unfortunately, in today’s hospitals, finding a diagnosis is all too often done by ordering more testing, and in a fee-for-service payment environment, more testing means more revenue. More procedures mean more revenue. Hospitals and physicians should be paid for providing a higher level of quality, not for volume.

I am a strong advocate of using medical data and providing better analytic tools to help physicians and patients, but tools are just tools. Physicians and other caregivers need these tools to improve care, but providers also need to listen to patients, think critically in making diagnostic assessments, care passionately about improving care, and use sound judgment at all times. Even the best tools cannot be fully effective in a fee-for-service world. Providers do need to improve the care they deliver, but the U.S. needs a sound healthcare strategy to solve our issues. Technology is part of that solution.

Michael L. Taylor, MD, FACP
Chief Medical Officer

Using Big Data in the Best Interest of the Patient

By Truven Staff
A recent USA Today article highlighted many of the ways in which ‘big data’ is being used to improve healthcare in the United States. The linkage of data across hospitals, insurance claims, electronic medical record systems, and genomics databases is helping to identify more efficient treatments and high-cost patients, and to determine best practices for treating patients with particular conditions.

Despite these benefits and many others, the creation of ‘big data’ assets is fraught with difficulties that may be limiting the true potential of existing data. In addition to privacy concerns and constraints which limit what types of data can be linked and by whom, there are issues around ownership and access to big data. Who should pay for the creation of these large data assets, and once created, who should have access? The answers are not straightforward and require the development of trust and a shared vision across many stakeholders.

Truven Health is actively involved in developing data infrastructures that both create big data and facilitate analyses while guiding appropriate interpretation. One of the first areas of focus is the creation of cancer data assets. To facilitate research that will truly answer important questions for patients, providers, and payers, we are exploring all avenues for linking data sources, from claims to EMRs to cancer registries. Only by combining data sources can we begin to address questions that will get the right treatment to the right patient at the right time. It isn’t just about generating big data; it’s about knowing how to use it to generate knowledge that is a game changer.
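A brief sketch of what “linking data sources” can look like in its simplest form (an illustrative deterministic scheme, not Truven’s actual method): identifiers are normalized and hashed into a privacy-preserving join key, so claims and registry records about the same patient can be combined without exchanging raw identifiers. Real-world linkage typically adds probabilistic matching to tolerate typos and name changes, and tokenization is usually done by a trusted third party.

```python
import hashlib

def link_token(first, last, dob):
    """Derive a privacy-preserving join key from normalized identifiers.
    Hypothetical scheme for illustration only."""
    normalized = f"{first.strip().lower()}|{last.strip().lower()}|{dob}"
    return hashlib.sha256(normalized.encode()).hexdigest()

# Two sources describing the same (fictional) patient, with formatting noise.
claims = {link_token("Ann", "Lee", "1960-05-02"): {"rx": "tamoxifen"}}
registry = {link_token("ann", "lee ", "1960-05-02"): {"stage": "II"}}

# Records join on the token despite the case and whitespace differences.
linked = {k: {**claims[k], **registry[k]} for k in claims.keys() & registry.keys()}
```

The combined record – treatment from claims plus stage from the registry – is exactly the kind of view neither source could provide alone.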

Kathleen Foley
Senior Director, Strategic Consulting (Life Sciences)

Shifting the Data-Sharing Mindset for the At-Risk Healthcare Environment

By Truven Staff
The shift from fee-for-service (FFS) to at-risk reimbursement also represents a shift from siloed data sources and facilities to physician-driven networks that need to be connected with patient-centric decision support and workflow applications. In the FFS world, health systems and health plans often thought of patient data as a strategic asset to help recruit and retain both physicians and patients. In the at-risk world, by contrast, the more data available about the patient, the better the ability to manage risk and coordinate care. The change in mindset from controlling patient data to allowing it to flow freely (albeit securely) across networks is a radical transformation for the U.S. healthcare system. The three key barriers to overcome are business models, proprietary data formats, and governance.

At the business-model level, as health systems, providers, and payers form new at-risk arrangements, they need to undergird those arrangements with the relevant flow of administrative and clinical data to manage performance and risk. Health plans that have been reluctant to share claims data need to shift gears both culturally and operationally to help the new provider-driven networks understand costs.

In terms of format, the federal government has been trying to stimulate interoperability standards through the ARRA HITECH rollout, but many vendors (particularly EMR vendors) have fought back to defend their proprietary data formats. As the volume of at-risk contracts grows, new at-risk entities will not be able to function without some form of interoperable gateway to share and receive patient data. That means EMR customers will be the ones to demand and implement interoperable gateways from their vendors – and if vendors continue to resist, a new generation of interoperability platform vendors will fill the need.

Finally, from a legal perspective, no one entity actually “owns” the patient data; the patient owns his or her own data. Yet this creates a governance quagmire, bogged down in consent-management policy and privacy mechanisms. If patients do not allow an at-risk network to see their data, the network cannot optimize performance. So we may see a new market dynamic whereby network participation requires upfront patient consent to data sharing (an opt-out consent model).
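The opt-out model described above reduces to a simple default-allow rule. A minimal sketch (hypothetical identifiers and registry – real consent management also tracks purpose of use, scope, and revocation dates):

```python
# Opt-out consent: sharing is permitted by default for network
# participants unless the patient has recorded an explicit opt-out.
opt_outs = {"patient-7842"}  # hypothetical consent registry

def may_share(patient_id, in_network):
    """Participation implies consent; an explicit opt-out always wins."""
    return in_network and patient_id not in opt_outs
```

Contrast with opt-in, where the default flips: nothing is shared until the patient affirmatively consents, which is precisely why at-risk networks favor the opt-out form.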

Larry Yuhasz
Director for Strategy and Business Development

Population Health Analytics: The Devil Is Truly In the Details

By Truven Staff
“Population Health” is an oft-discussed topic, but its definition varies depending on the vantage point of the presenter. Likewise, “Population Health Analytics” attempts to measure and improve an array of risk-bearing, clinically integrated activities, ranging from aggregate risk analysis to predictive interventions at the point of care.

Regardless of your particular turf, some common challenges lurk behind applying analytics to these problems. The roadblocks stem from the fundamental fact that the data sources you depend on for decision-making were not captured with cross-encounter analytics in mind. Source IT systems such as EMRs, billing systems, and electronic prescribing solutions were built to accomplish transactional goals for siloed provider organizations, not to support improved outcomes and cost control across the patient care continuum.

We’ve identified three areas of focus to help you avoid pitfalls:
  • Anticipate information-sharing challenges: Technical integration of data isn’t the hard part. The tough stuff is setting the trust conditions for authentic multi-stakeholder data sharing and governance.
  • Navigate the context of data creation: Operational processes obscure analytic classification of data, terminology standards are variable, and information arrives at different periodicities. Amidst this noise, reliable prediction, reporting, and alerting all require an “analytically-aware” implementation of data streams and measures.
  • Start with analytics you can take action on: Massive projects get everyone excited, but a moon launch isn’t necessarily your first step. Work backwards from where you have operational capacity to make improvements (basic quality measures across the continuum of care? risk and disease prevalence? alerting and interventions?) and focus your attention on a set of trusted measures to get you there.
Watch our four-part video series on population health analytics.

Grant Hoffman
VP, Clinical Integration

EMRs Should Re-Engineer Medical Data Collection

By Truven Staff
Electronic medical records hold great promise. To fulfill that promise, EMRs need to lead the way in re-engineering the way medical information is collected, processed, and utilized. That will not happen if the EMR simply converts the paper medical record into an electronic one, using the same formats. EMRs should be able to solve the challenges of workflow automation and allow for a more mobile platform for collecting data – but if an RN is now entering blood pressure into the EMR much as was done in a paper record, is there any efficiency advantage for that RN? I would argue there is not.

Medical journals are now commenting on physician progress notes in EMRs; many notes simply copy all prior physician notes and paste them back into the record with a new date, making the notes redundant and meaningless. This is an example of forcing the paper-record format into an EMR environment.

EMRs need to allow for automated data entry from digitized sources, but the data need to be converted into medical information with decision support, gaps-in-care prompts, and other innovations that improve individual patient care. Even that is not enough: EMRs need to allow physicians, nurses, and other healthcare professionals to manage entire populations, not just the patient sitting in front of them at the time. This is the true promise of a fully integrated EMR.

Dr. Michael Taylor
Chief Medical Officer

