The Truven Health Blog

The latest healthcare topics from a trusted, proven, and unbiased source.


Medicaid Program Integrity: Fighting Fraud in a Managed Care Environment

Monday, July 14, 2014
A recently published study by the Government Accountability Office (GAO) identified a need for states to ramp up their efforts to ensure Medicaid program integrity under managed care. Although a majority of Medicaid beneficiaries are now enrolled with managed care organizations (MCOs), and payments for those plans are growing at a faster rate than fee-for-service (FFS) expenditures, some states are only now beginning to shift their program integrity focus from FFS to managed care. 

Traditionally, Medicaid has fought FFS fraud, waste, abuse, and overpayment by applying edits and algorithms to claims in prepayment, and using data mining, investigation, and recovery modeling and analytics in post payment. More recently, Medicaid has stepped up fraud-prevention efforts by expanding the use of prepay predictive analytics and implementing provider credentialing and stringent ongoing provider surveillance, as required under the Affordable Care Act (ACA). 
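To make the prepayment side concrete, here is a minimal sketch of one classic claims edit, a duplicate-billing check. This is an illustration only, not one of Truven Health's actual algorithms, and the field names (`provider_id`, `procedure_code`, etc.) are hypothetical:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Claim:
    claim_id: str
    provider_id: str
    member_id: str
    procedure_code: str
    service_date: str  # ISO date string
    billed_amount: float

def find_duplicate_claims(claims):
    """Flag claims whose provider/member/procedure/date combination
    repeats an earlier claim -- a classic prepayment duplicate edit."""
    seen = {}
    flagged = []
    for claim in claims:
        key = (claim.provider_id, claim.member_id,
               claim.procedure_code, claim.service_date)
        if key in seen:
            flagged.append(claim.claim_id)  # later submission is suspect
        else:
            seen[key] = claim.claim_id
    return flagged
```

In practice such edits run against the full claims stream before payment, with exceptions routed to analysts rather than auto-denied.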

Best-practice Medicaid agencies have increased their managed care program integrity efforts through more comprehensive oversight of their contracted MCOs. They are collecting and validating encounter data, which allows them to perform advanced analytics to find fraud, waste, and abuse, and they are performing checks to ensure proper Medicaid administration. These agencies examine the full continuum of managed care fraud and abuse vulnerabilities:
  • Traditional FFS issues, such as over-utilization and billing for unnecessary or unused services
  • FFS/Managed Care crossover issues, including double billing and payment for ineligible recipients, such as prisoners and those with certain medical conditions or who are enrolled in certain waiver programs
  • Managed care operational issues, such as inaccurate encounter claims, under-utilization, and cherry-picking patients
  • Managed care financial auditing to ensure that MCOs accurately account for and categorize incurred costs, and that capitation rates are premised on correct information
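The crossover category above can be illustrated with a simple eligibility check: flagging capitation payments made for months in which a member was incarcerated. This is a hypothetical sketch with assumed data shapes, not a production rule:

```python
from datetime import date

def flag_ineligible_payments(payments, incarceration_periods):
    """Flag capitation payments for coverage months that overlap a
    member's incarceration -- a common FFS/managed care crossover issue.

    payments: list of (member_id, coverage_month_start_date)
    incarceration_periods: list of (member_id, start_date, end_date)
    """
    flagged = []
    for member_id, month_start in payments:
        for pid, start, end in incarceration_periods:
            if pid == member_id and start <= month_start <= end:
                flagged.append((member_id, month_start))
                break  # one match is enough to flag the payment
    return flagged
```

A real implementation would join capitation and eligibility files in the data warehouse, but the matching logic is essentially this interval-overlap test.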
Medicaid agencies need to be diligent stewards of their managed care contracts. While managed care adds new complexities and challenges for monitoring program integrity, the rapid growth in managed care enrollment adds to the urgency of putting in place effective oversight mechanisms. 

Critical Success Factors
As we look across best-practice Medicaid agencies, several critical success factors have been shown to produce significant results for the integrity of the program under managed care. Some of these critical success factors are:
  • Encounter data accuracy and completeness
  • Contract provisions and rules to support managed care payment integrity
  • Capitation payment review
  • Data analytics examining MCO services and comparing MCO utilization to FFS
  • Inter-MCO comparisons and analytics
  • Managed care organization auditing (both financial and operational)
By incorporating such success factors, Medicaid agencies can avoid common fraud, waste, and abuse pitfalls under managed care and improve the integrity of the program.

Truven Health Analytics™ has been helping managed care organizations in all of these dimensions for several years. Our experts have advised 20 states over the past 15 years about managed care encounter data strategy, and our program integrity experts have been delivering recoveries to Medicaid agencies for three decades. In fact, IDC MarketScape recently named us an industry leader in fraud, waste, and abuse solutions.*

For more information, please contact me at [email protected].

David Nelson
Vice President, Market Planning & Strategy

Getting to Enterprise Analytics in the Government Healthcare Sector Begins With a Modern, Connected Data Warehouse

Monday, June 30, 2014
Enterprise analytics is a hot buzz phrase these days. What used to be an analyst-only topic has moved to the executive level. And it’s no secret that the idea of analyzing disparate data from across an organization is becoming increasingly important in all of healthcare today – perhaps even more so in the government sector.

Policymakers are talking about it, elected officials want it, and taxpayers expect that it’s already happening.

Meanwhile, state agencies, such as Medicaid and Departments of Health and Human Services (HHS), are facing an urgent need to curtail rising costs, boost efficiencies, report accurate information, and improve quality of care.

To achieve that, they need not only to see the big picture of program data, but also to understand the intricacies of population health and even coordinate patient-level care across agencies. And thanks to Affordable Care Act-driven concepts, such as ACOs and risk-based contracts, it’s all at a tipping point.

The key lies in an interoperable data hub – a modern, connected warehouse that facilitates the flow of data and reporting, automates workflows, and helps staff be more efficient while providing the right decision-making knowledge to the right stakeholders. 

Of course, as this type of warehouse is developed, particular attention must be paid to data integrity – because without that, enterprise analytics are meaningless.

The development should be guided by an iron-clad master data management process, ensuring that all data values being collected and connected speak the same language. This results in a data warehouse that truly becomes a single source of truth across departments and agencies.

At Truven Health, we see the warehouse development process unfolding with these steps:
  • Identify stakeholders and “champions”
  • Assemble strong executive leadership
  • Create a shared vision of the modern data warehouse
  • Formalize the governance structure
  • Establish a clear decision-making process
  • Evaluate the governance system and adapt as necessary
  • Maintain transparent communications throughout development
  • Identify an enterprise reference model as part of the information architecture
After the enterprise warehouse is developed, we can then apply the all-important, advanced metrics and modeling. Just a few of the typical analytics and applications we recommend include:
  • Calculations for episode grouping
  • Hierarchical Condition Categories (HCC) score calculations
  • Risk stratifications
  • A measures engine
  • Practice-to-cohort comparisons
  • Disease registries
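As one small illustration of the analytics listed above, risk stratification can be as simple as bucketing members into tiers by comparing a risk score (an HCC-style score, for example) against ascending cut points. The cut points and function below are hypothetical, shown only to make the idea concrete:

```python
def stratify_by_risk(member_scores, cut_points=(0.5, 1.5, 3.0)):
    """Assign each member to a risk tier (0 = lowest) by counting how
    many ascending cut points the member's risk score meets or exceeds."""
    tiers = {}
    for member_id, score in member_scores.items():
        # score below all cut points -> tier 0; above all -> highest tier
        tiers[member_id] = sum(score >= c for c in cut_points)
    return tiers
```

For example, `stratify_by_risk({"A": 0.2, "B": 2.0, "C": 4.1})` places member A in tier 0, B in tier 2, and C in tier 3. Production risk models layer clinical grouping logic on top, but tiering for care-management outreach often reduces to exactly this kind of thresholding.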
Ultimately, the result is a connected depth and breadth of useful data that can be streamlined and analyzed at all levels, from the policy analyst to the caseworker on the front lines.

Rick Williams
VP Data Warehouse

Managing Medicaid Managed Care and Encounter Data

Monday, May 19, 2014
Medicaid agencies have increasingly turned to managed care organizations (MCOs) to deal with the tremendous increase in enrollment driven by the Affordable Care Act (ACA). The Centers for Medicare and Medicaid Services (CMS) released an Encounter Data Toolkit in November of 2013 to assist states with the operational task of managing the data streams from their MCO contractors. 

While most states are collecting encounter data, many face challenges in assessing the quality of data, and some still lack the confidence in their data to use it for rate setting, quality improvement, or public reporting. Over the past 15 years, Truven Health has helped nearly 20 states with their managed care programs and encounter data quality and completeness. We have assisted agencies with encounter data and managed care at all points of the encounter data process, including plan selection and evaluation, data collection, edit revisions, data quality improvement, and using data for plan management.

Most states choose to collect and process managed care data using their Medicaid management information systems (MMIS), for reasons that include the following:
  • The state can leverage the electronic data collection and translation processes already used for fee-for-service (FFS) claims.
  • The MMIS transaction system allows the state to process managed care data on a record-by-record basis, performing such tasks as editing and shadow pricing using procedures/protocols that are familiar because they are also used for FFS data.  
  • All data are maintained in the same system of record. The managed care data are housed with the FFS service data, which allows the Medicaid agency to incorporate all of the data, as needed and appropriate, in federal and state reports.
However, processing managed care data through the MMIS can also have drawbacks. Other states have experienced such issues as:
  • Delays in implementing new processes for managed care data because of the competing demands from FFS claims processing and associated system change orders.
  • Over-rejection of managed care encounters when edits designed for FFS claims processing are inappropriately applied to managed care records, which have already been adjudicated by the health plan.
  • Delays in the ongoing processing of managed care encounter data because persistent data quality issues cause repeated edit failures. This problem can be exacerbated if processes for resubmitting rejected records aren’t well designed and/or well understood and followed by the plans.
  • Inaccurate use or interpretation of managed care data in reporting and analysis because the nuances of encounter data are not accounted for in standard reports or communicated to users performing ad hoc analysis.
To avoid these problems, states can either make appropriate adjustments to their MMIS and its processes to fully accommodate encounter data, or consider other system options. States that are planning to re-procure their MMIS in the near future must also weigh how much to invest in the existing system. This is particularly true for states that are moving to statewide, capitated managed care.

Some states have recently asked Truven Health about collecting encounter data directly from their managed care organizations. States could use their data warehouse decision support system (DW/DSS) to collect and process encounter data as either an interim approach or as a longer term process independent of the MMIS. Factors in support of loading the data directly into the DW/DSS include:
  • The DW/DSS is designed to incorporate managed care data – the data model and analytic reporting applications already anticipate the inclusion of managed care data. The DW/DSS provides a single, integrated repository for FFS and managed care data, capable of supporting transformed Medicaid statistical information systems (T-MSIS) and other federal reporting, as well as state-specific reporting needs.
  • By outsourcing this specialized function to a vendor like Truven Health that is highly experienced with encounter data, a state might help speed the availability of the quality data needed for performance monitoring, rate-setting, and public accountability. 
  • Our experience with the validation of managed care data will also help speed improvements in data integrity and increase credibility of the information.
Specifically, Truven Health’s managed care encounter data services, using the DW/DSS, would include:
  • Receiving, processing, and translating managed care encounter data
  • Editing encounter data and providing feedback reports to managed care plans for resubmission
  • Storing encounter data and making it accessible for analysis alone or with FFS data
  • Incorporating encounter data into select federal reports
  • Validating and improving encounter data accuracy and completeness
  • An annual in-depth study of the quality of encounter data and development of a Data Quality Improvement Plan with each managed care organization
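The editing-and-feedback loop in the list above can be sketched as a field-level validation pass that splits incoming encounters into accepted records and a per-plan feedback report for resubmission. The record layout and required fields here are assumptions for illustration, not the actual DW/DSS edit set:

```python
from collections import defaultdict

# Hypothetical minimum field set; real edit libraries are far larger.
REQUIRED_FIELDS = ("member_id", "provider_id", "procedure_code", "service_date")

def edit_encounters(encounters):
    """Run field-level edits on encounter records and build a per-plan
    feedback report so plans can correct and resubmit rejected records."""
    accepted = []
    feedback = defaultdict(list)
    for rec in encounters:
        missing = [f for f in REQUIRED_FIELDS if not rec.get(f)]
        if missing:
            feedback[rec.get("plan_id", "UNKNOWN")].append(
                {"encounter_id": rec.get("encounter_id"), "errors": missing})
        else:
            accepted.append(rec)
    return accepted, dict(feedback)
```

Grouping rejects by plan matters: the feedback report is the mechanism that drives the resubmission process, and, as noted above, poorly designed resubmission processes are a leading cause of persistent data quality problems.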
As Medicaid agencies turn to MCOs to deal with the tremendous increase in enrollment driven by the ACA, they have a partner in their DW/DSS contractors to implement the best practices outlined in the Encounter Data Toolkit. For more information you can contact me at [email protected].

David Nelson
Vice President, Market Planning & Strategy

Organizational Culture Eats Processes Every Day for Breakfast

Tuesday, May 6, 2014
Technology adoption is a big challenge. You may have experienced situations where executives think that “if we buy the technology, they will use it.” That misconception has created the “shelf-ware” issue, where expensive software remains unused or marginally used for years because a comprehensive adoption plan was never considered.

When we speak about technology adoption, we’re essentially talking about process change. New technologies bring about change in current processes. We know there is a natural resistance to change. We could ask ourselves questions, such as:
  • How do we deal with this resistance?
  • Have we developed a positive strategy for coping with the process change?
Not everybody has the same attitude regarding a particular change, and some people dislike change because they perceive it as a loss of control. They look at change as “what have I lost,” and they may like the old process better. Perhaps the old process gave them certain functionality and a comfort level that they are missing now. Can we help them assess what they have left and move into a new, better attitude, such as “what can I do now?”

In some situations, lack of time gets in the way of process change, and neither the implementation team nor the user team has time to adopt the new process. We refer to that situation as “paint the train as we go.” The user team responds with “we don’t have time for this,” and the implementation team is too busy fixing technology defects to address customer needs properly. In this situation, can we add resources to cope with the lack of time? When the implementation has technology defects, new users may misunderstand the situation and conclude: “this process does not make sense.” Could we fix the existing technical problems, and then check whether the users are simply trying to use their old ways of working in the new system? Is the new system designed to work differently, but nobody has bothered to tell them how? Education goes a long way toward diminishing these issues.

Process change happens amidst two cultures: the implementing team culture and the customer culture. And what is organizational culture? According to Wikipedia, organizational culture includes organizational values, visions, norms, working language, systems, symbols, beliefs, and habits. I would suggest that organizational culture is built into the processes or lack of processes to conduct a particular task.
  • Have you focused on the interwoven processes used for analytics?
  • Do you have a process to onboard a new SAS user or a new SAS administrator?
  • Can you excel at supporting all the mundane, but concrete processes associated with a sophisticated analytic technology like SAS?
  • Can we improve those SAS processes, making the new system easy to use and capable of answering critical questions as well as facilitating the discovery of new questions?
  • What is the cultural reaction to new processes in both the implementation team as well as the customer team?
Culture can also get in the way of process performance by attacking the change agents. A process that functions poorly may be the result of misunderstandings. The documentation language may have conveyed a wrong meaning, but it could also be the result of a culture such as: “Not invented here.”  “We just don’t do that.” “That’s not desirable.”  “That’s wrong.” Since culture eats processes every day for breakfast, leaders should understand the culture needed to make an effective process change. Creating a list of the processes required and discussing it with the involved culture or subcultures will help address adoption issues effectively. 

Effective process change requires a three-pronged strategy: process review, technical review, and user support. It needs thoughtful intervention to avoid creating user mistrust. Consider a technology change that has failed. What happened? How did people resist the process change? What did you do to help the change? Which critical process was most affected? Where did the technical failures happen? Did a lack of knowledge of the new technology severely hinder users’ transition to the new process? Was there an effective training plan to deliver critical knowledge?

In summary, I would suggest that change leaders should:

  1. Identify and describe the old process and compare it with the new process that will help users perform their analytical SAS tasks efficiently
  2. Include all cultures and subcultures in the process redesign
  3. Create a comprehensive knowledge transfer plan
This focus should help create an effective adoption plan for predictive analytics.

Read more about the Truven Health SAS Center of Excellence.

Al Cordoba, M.S.
Director, SAS Center of Excellence

Using Algorithms and Predictive Models to Find Abuse and Fraud

Monday, April 14, 2014
A critical success factor in any program integrity effort is applying the appropriate algorithms and predictive models in pre-payment and post-payment claims analysis environments. Truven Health Analytics has experience developing and cataloging hundreds of algorithms that have been used (and are currently used) in various state agency, federal agency, health plan, and employer operations to detect abusive and fraudulent claims schemes. We have also seen predictive model intelligence growing in the marketplace, and we are helping payers improve their predictive models so that they more effectively fight fraud and identify high-risk claims before the claims are paid. While these sophisticated approaches are implemented to find what we didn’t see before, we also see our clients achieving results every year with some of the tried-and-true detection algorithms. Each year our expert panel – a team that works with payers across the healthcare spectrum every day – selects a set of key algorithms. We just presented a webinar on the Key Algorithms for 2014, and the presentation included:

  • A new approach to the overuse of modifiers, focusing on modifiers 22, 24, 57, 76, and 77
  • The device malfunction algorithm which identifies claims where the reason for treatment or services rendered is due to a malfunctioning implanted device
  • Extended DME rental use
  • Over utilization of diabetic supplies
  • Critical care on date of discharge
  • Advanced life support (ALS) transportation without an inpatient stay
  • Hospital acquired conditions
  • Over utilization of lumbar MRIs
  • Lumbar MRI, post lumbar MRI, or CT
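To show the shape of one such detection algorithm, here is a minimal sketch in the spirit of the over-utilization checks above: summing diabetic-supply units per member per month and flagging totals beyond a plausible limit. The threshold and data layout are hypothetical illustrations, not the logic of Truven Health's actual algorithm:

```python
from collections import defaultdict

def flag_overutilization(supply_claims, units_per_month_limit=200):
    """Flag members whose diabetic-supply units in any single month
    exceed a clinically plausible limit.

    supply_claims: list of (member_id, year_month, units) tuples
    """
    usage = defaultdict(int)
    for member_id, month, units in supply_claims:
        usage[(member_id, month)] += units  # aggregate across claims
    # Collect distinct members who exceeded the limit in any month
    return sorted({member for (member, _), total in usage.items()
                   if total > units_per_month_limit})
```

Algorithms like this generate leads, not verdicts: flagged members feed an investigative workflow, where analysts confirm whether the utilization pattern reflects fraud, abuse, or a legitimate clinical exception.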
Some of these algorithms represent new schemes we are seeing, and some represent schemes that continue to produce analytic results that program integrity (PI) units and Special Investigation Units (SIUs) can act on to make recoveries. Our team has produced the Key Algorithms list annually since 2003 to support the healthcare payer community that is dedicated to improving integrity and eliminating fraud, waste, and abuse in healthcare. If you would like more information on algorithms and predictive models, feel free to reach me at [email protected].

David Nelson
Vice President, Market Planning & Strategy