The Data Scientist - the WPC Healthcare Blog

Data Science Helps to Identify the Right “Tools” to Address Back Pain


Published by: Ray Guzman, CEO, WPC Healthcare

Machine Learning to Assist Back Surgery Decision-making

Whether you’re a long-haul trucker or a weekend tennis jock, there’s a good chance you’ve experienced back pain. Maybe your pain was so severe it led you to an orthopedic surgeon’s office. Watch out!

As a recent study details, surgery is what surgeons do—as in the old adage, if you’re a hammer, everything looks like a nail. And for the many patients who have had spinal fusion or other surgeries to alleviate back pain, surgery may not have improved outcomes. Moreover, those who chose alternative therapies for pain (such as yoga, swimming, and physical therapy) may have achieved better results.

While most of us can agree that medical diagnosis is as much art as science, the steady uptick in back surgery over the last 20 or so years clearly raises some questions.

For the suffering patient and the sometimes-baffled physician, it can be difficult to determine when surgical intervention will truly make a difference. Current diagnostic methods include the use of assessment tools and some level of automation that, while efficient and somewhat effective, allow for errors and force providers to review more causes than necessary.

Computer-aided diagnostics incorporating data science and machine learning can help. Though not intended to replace a physician’s diagnostic skills, adding data science to the diagnostic process offers more precise and accurate measurements faster and with fewer false-positives and false-negatives.

Data Science Challenges in Medicine

In the data science process, developing algorithms is best when the size of datasets is fairly large. In medicine, HIPAA and other patient privacy protections make it difficult to gather and combine datasets. Despite these challenges, the Data Science Team at WPC Healthcare decided to take a crack at creating a spinal algorithm to help diagnose pathologies of the spine by leveraging a unique data strategy.

The result—a medical decision-support framework consisting of three subsystems: feature engineering, feature selection, and model selection. The framework generates a model for classifying new observations. The data from that model are refined through an automated feature selection process, thus yielding enhanced prediction accuracy.
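To make the shape of such a framework concrete, here is a minimal sketch in Python with scikit-learn. It is illustrative only, not WPC's actual implementation, and synthetic data stands in for real spinal measurements:

    # A minimal sketch of the three-subsystem idea: engineered features,
    # automated feature selection, and model selection via cross-validation.
    # Synthetic data stands in for real spinal measurements.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import GridSearchCV, train_test_split
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler

    X, y = make_classification(n_samples=500, n_features=20, n_informative=8,
                               random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    pipeline = Pipeline([
        ("scale", StandardScaler()),         # simplified feature engineering
        ("select", SelectKBest(f_classif)),  # automated feature selection
        ("model", LogisticRegression(max_iter=1000)),
    ])

    # Model selection: search over feature counts and candidate classifiers.
    param_grid = {
        "select__k": [5, 10, 20],
        "model": [LogisticRegression(max_iter=1000),
                  RandomForestClassifier(random_state=0)],
    }
    search = GridSearchCV(pipeline, param_grid, cv=5)
    search.fit(X_train, y_train)
    print("held-out accuracy:", search.score(X_test, y_test))

Letting the grid search swap whole estimators, not just their settings, is what makes this a model-selection subsystem rather than a single tuned model.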

Leveraging this framework, the predictive accuracy increased significantly to almost 100 percent in terms of identifying the correct underlying pathology. Physicians using this framework can have greater confidence in the diagnoses provided to their patients and thus can make recommendations for the appropriate intervention.

Data scientists are often humbled by biology. Physicians are just as humbled by how effective data science can be in medical decision support by offering them the experience of hundreds of their peers nationally and globally. By taking a multidisciplinary approach to pathologies of the spine, WPC data scientists may have given providers a better set of tools to identify the nails that truly need a hammer, and those that just might need something less invasive.

To read more about this model, check out this journal article: WPC Spinal Algorithm.

What’s Your Data Strategy?


Published by: Ray Guzman, CEO, WPC Healthcare

December 15, 2016, NASHVILLE, TENNESSEE – When my family and I moved into our current home, we got a beautiful advertisement from a local hospital system, which also included a survey. If you filled it out, they’d give you a Home Depot gift card. I filled out the survey, which asked me how I liked to be contacted. I selected email and text. Not long after, I got an automated phone call and no further follow-up. While they were spot on in identifying us as a customer, their engagement strategy didn’t match up. This may have been unique to my survey, but I suspect a more persistent problem lurks: a data infrastructure that’s not performing as well as it should.

Whether it’s a clinical report from an EHR, a new way of measuring bad debt, or survey data, healthcare organizations are generating mountains of data. But efforts to glean appropriate insights to provide better patient outcomes aren’t yielding the needed results. With the final CMS rule implementing the Medicare Access and CHIP Reauthorization Act, the shift to value-based reimbursement has been greatly accelerated. In the march from volume to value, organizations that don’t take the time to transform data into a truly valuable asset face significant challenges.

Based on our experience at WPC, the road to truly optimize an organization’s data infrastructure has five core milestones:

1. Build a solid data foundation.

You have to walk before you run. The opportunity to monetize the value of big data actually starts well before algorithms and statistical models are applied. Similar to building a house, you must first construct a solid foundation to ensure the longevity and value of your investment. Hospitals that demonstrate the highest level of downstream analytical maturity focus their attention on data capture, data quality, data integration, use of external data resources and data governance.

2. Value data as an asset.

Data may be an intangible asset, but forward-thinking organizations monitor, manage and protect data with the same level of diligence that is applied to traditional brick-and-mortar holdings.

In an analysis from Ocean Tomo, a leading consultant in the area of intellectual capital, the proportion of intangible assets has grown from about 17 percent to 81 percent in recent years, mainly due to information playing an increasingly important role in data-driven organizations.¹

3. Establish a data governance policy.

Data brings risk – data breaches and ransomware attacks – and opportunity, as data assets become a competitive advantage. An enterprise-wide data governance policy balances the risk/reward ratio, and should answer these questions:

  • Who are the custodians of the data?
  • Who is accountable for the various aspects of the data, including its accuracy, accessibility, consistency and completeness?
  • How is the data stored, archived, backed up and protected from mishaps, theft or attack?
  • What audit procedures are in place to ensure ongoing compliance with government regulations?
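
One way to make the answers to those questions concrete and auditable is to keep a machine-readable stewardship record for each dataset. A minimal sketch, with hypothetical field names rather than any standard schema:

    # Illustrative only: a machine-readable stewardship record capturing
    # the governance questions above for one dataset. Field names are
    # hypothetical, not a standard schema.
    from dataclasses import dataclass, field

    @dataclass
    class GovernanceRecord:
        dataset: str
        custodian: str           # who holds the data day to day
        accountable_owner: str   # accountable for accuracy and completeness
        storage: str             # where and how it is stored
        backup_policy: str       # archival and backup procedure
        audit_cadence_days: int  # how often compliance is reviewed
        regulations: list = field(default_factory=list)

    record = GovernanceRecord(
        dataset="ehr_clinical_reports",
        custodian="Data Engineering",
        accountable_owner="Chief Data Officer",
        storage="encrypted warehouse, access-controlled",
        backup_policy="nightly snapshot, 7-year archive",
        audit_cadence_days=90,
        regulations=["HIPAA"],
    )
    print(record)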

4. Get started now.

Savvy healthcare leaders are not waiting for a universal data governance mandate. Instead, they invest early to deliver accurate and reliable data. This strategy allows for targeted analytical solutions that can be put into practice by clinicians and operating staff immediately. In doing so, efficient care organizations are able to move quickly and tackle the most pressing use cases for applied analytics.

In addition, healthcare leaders aren’t waiting for the changes a new presidential administration will bring. While the only certainty seems to be uncertainty when looking ahead, the pressure to keep costs down and quality up isn’t going away. Paying for value requires payers and providers to use, share, and analyze data. Hospitals and other health organizations can’t afford to take a wait-and-see approach to getting their data infrastructure in shape.

5. Leverage the ROI in organized data and analytics as a competitive differentiator.

Assuming all of the above milestones have been achieved, provider organizations are positioned to receive a compelling return on investment by putting that data to use. Why? Because the MACRA-driven push to value-based care requires providers to meaningfully measure and report on it. Data is baked into the healthcare cake. We can’t take it out now.

To create a basis on which predictive analytics can be applied in the future, providers must build a logical and efficient infrastructure for ongoing data collection.

I never heard anything more from the hospital with the fancy magazine and not-so-fancy survey. But I sure hope they’ve given their data strategy serious thought – and, just as important, serious action – because they missed an opportunity. They didn’t execute their strategy end-to-end, and the result was an investment that never yielded its full return.

¹ http://www.cmswire.com/cms/information-management/quantifying-the-value-of-your-data-026674.php

Sepsis: Every Second Matters


Published by: Ray Guzman, CEO, WPC Healthcare

By now, you’ve seen our continued drive to help hospitals and healthcare organizations tackle their biggest challenge: the diagnosis and treatment of sepsis. When we started this effort, we were consistently amazed by the numbers. Even more surprising? In the last 42 years, there has not been a single technology or clinical advancement beyond the identification protocols that remain the “gold standard” for sepsis identification. We’re still deploying the same screening processes. What’s missing? Those processes don’t take into account the precious value of time in the sepsis treatment equation.

With sepsis, time is what matters. Every hour of delay increases mortality by 7.6 percent. A 30 percent miss rate requiring additional testing and clinical identification means hours and lives lost. Couple the human element with the cost of readmissions, and it translates to $3.1 billion annually. Add the lack of communication between acute and post-acute providers, and you’ve got our current reality: an epidemic. Sepsis.

With protocols that haven’t changed in 42 years, how do we begin to save precious time, and most importantly, lives? By implementing a sepsis strategy that leverages big data and machine learning. The ability to identify patients most likely to acquire sepsis at the point of admission provides care teams with insights leading to protocol-based treatment, earlier.
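
As a rough illustration of what “identify at the point of admission” means in practice, here is a minimal sketch of an admission-time risk model. It is not WPC’s solution; everything in it is a synthetic stand-in:

    # A minimal sketch (not WPC's solution): score sepsis risk from data
    # available at admission so care teams can start protocols earlier.
    # The features and labels below are synthetic, hypothetical stand-ins.
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 2000
    X = np.column_stack([
        rng.normal(65, 15, n),     # age
        rng.normal(90, 20, n),     # heart rate at admission
        rng.normal(37.2, 0.8, n),  # temperature (C)
        rng.normal(9, 3, n),       # white blood cell count
        rng.integers(0, 2, n),     # prior-admission flag
    ])
    # Synthetic outcome loosely tied to the vitals, for illustration only.
    signal = 0.03 * (X[:, 1] - 90) + 1.5 * (X[:, 2] - 37.2) + 0.2 * X[:, 4]
    y = (signal + rng.normal(0, 1, n) > 1).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
    risk = model.predict_proba(X_te)[:, 1]  # per-patient risk at admission
    print("held-out AUC:", roc_auc_score(y_te, risk))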

Today, clinical indicators, beginning with a rise in blood pressure, occur on average 15 hours into the 36-hour decline from infection to death. EHR-based alert systems require additional validation, and communication breakdowns between hospital staff and physicians further plague the process.

With WPC’s unique sepsis solution, every healthcare organization now has the ability to incorporate data science, giving precious minutes back to the clinical staff to save lives and reduce current and future expenses associated with readmissions and ongoing treatment.

That all sounds great on paper, but what does implementing a sepsis solution mean to an already taxed and stressed healthcare system?

It means collaboration. A meaningful partnership between clinicians and technology. It means hope. The hope and promise of saving more lives. The clock is ticking.

Five Reasons to Leverage Data Science in the Fight against Sepsis


Published by: Ray Guzman, CEO, WPC Healthcare

It’s time to change the rules of the game.

Remember when Lucy, in A Charlie Brown Christmas, gave brother Linus five good reasons why he should go through with his role in the Christmas play by folding her five fingers into a fist? For patients battling sepsis—and the physicians racing against time to cure them—the infection packs a far more devastating punch.

With no real change in the protocol for managing sepsis in the last 40 years, it’s time hospitals pick up new tools to combat the infection. We’ve come up with five good reasons why using a data science approach to combating sepsis can deliver the knockout punch needed to save lives.

How Data Science Works to Combat Sepsis

Before we go into how data science (often described as machine learning) relates to sepsis, we should discuss how it works. Rather than leveraging another piece of software or hardware to integrate into existing technology, this type of strategy relies on computing systems that have the ability to learn. The power is in the algorithms used to train the system: focused exclusively on the problem of sepsis, the technology grows and improves over time as it is fed data from a variety of sources.
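
Here is a toy illustration of that “grows and improves over time” behavior: an incremental learner whose accuracy climbs as it is fed more batches of (synthetic) data:

    # A toy illustration of "learning that improves as more data arrives":
    # an incremental classifier updated batch by batch on synthetic data.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import SGDClassifier

    X, y = make_classification(n_samples=5000, n_features=10, random_state=0)
    X_test, y_test = X[4000:], y[4000:]

    model = SGDClassifier(loss="log_loss", random_state=0)
    classes = np.unique(y)
    for start in range(0, 4000, 500):  # feed the model 500 records at a time
        batch = slice(start, start + 500)
        model.partial_fit(X[batch], y[batch], classes=classes)
        print(f"after {start + 500} examples: accuracy "
              f"{model.score(X_test, y_test):.3f}")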

1. There really isn’t a human expert on the subject of sepsis—academic or otherwise.
Many good and talented people are working very hard to understand sepsis, yet what triggers the body to develop this runaway response to infection remains elusive. Untreated sepsis progresses so fast that clinicians struggle to get ahead of the curve, and often a diagnosis is confirmed very late, too late.

2. The best clinicians cannot explain their expertise.
Suppose you have two patients. They came in at the same time and were treated for the same illness with the same treatment protocol. One develops sepsis and the other doesn’t. Over time, many clinicians develop a “sixth sense” about who will be diagnosed. But it’s hard to quantify, much less instruct, on the basis of a “sixth sense.”

3. Sepsis definitions, protocols, and staff change over time.
Solid protocols are in place in many hospitals, but between nursing staff changes and the communication breakdown among hospital staff and physicians, additional time is added to a process in which every hour counts. With sepsis, the time between identification and treatment has major consequences in terms of lifelong chronic health conditions and, at worst, mortality.

4. There is no such thing as an average sepsis patient.
Sepsis patients are like snowflakes—no two are alike. That’s why it’s hard to get ahead of the sepsis curve. Sepsis can be masked by more than 84 diagnostic codes from bacterial meningitis to bronchitis to a simple UTI. Clinical indicators, beginning with a rise in blood pressure, occur on average 15 hours into the 36-hour decline from infection to death. But EHR-based alert systems generally require additional validation, resulting in even more time before the initiation of an aggressive sepsis treatment protocol.

5. Much of the low-hanging fruit of process improvement for sepsis has already been put into play.
The clinical indicators and protocols have been fine-tuned to the point that maximum benefit has been achieved. TREWScore, MEWS, and routine screening identify approximately 70 percent of sepsis patients (an area under the curve, or AUC, of about 0.70), but they do not take into account the precious value of time in the sepsis treatment equation. Every hour of delay in the application of effective antibiotics increases mortality by 7.6 percent. The combination of patients missed by traditional screening and the 24 to 26 hours lost to clinical identification leaves a trail of lives lost and expenses that have risen precipitously. With more than 258,000 sepsis deaths annually in the US, at a staggering cost of more than $32 billion per year, surely we can do better.

To put this in perspective, consider that what is happening in America is tantamount to two jumbo jet crashes with no survivors—every single day. If that were happening, wouldn’t the airline industry be looking for new approaches?
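
The arithmetic behind those figures is easy to check. A quick sketch, assuming the 7.6 percent increase compounds with each hour of delay (one common reading of that statistic):

    # Back-of-the-envelope arithmetic with the figures quoted above.
    # Assumption: the 7.6 percent mortality increase compounds with each
    # hour that effective antibiotics are delayed.
    for hours in (1, 6, 12, 24):
        print(f"{hours:>2} h delay -> mortality multiplier {1.076 ** hours:.2f}")

    deaths_per_year = 258_000
    print("deaths per day:", round(deaths_per_year / 365))  # ~707, on the order of two full jumbo jets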

Taking the best of what data science can offer to develop a truly new approach for combating sepsis is the right thing to do. For providers who are committed to reducing in-hospital and post discharge sepsis mortality without an additional cost burden, it’s time to look at new solutions to a vexing and all too devastating problem.

 

Three Approaches to Reducing the Ravages of Sepsis


Published by: Todd Gary, PhD

Muhammad Ali recently succumbed to sepsis. Ali was not alone in fighting this deadly condition. More than 1.6 million cases of sepsis in the US each year result in more than 258,000 deaths—an average of one person dying of sepsis every two minutes.

The financial cost of treating sepsis is staggering—at more than $23 billion annually—making it the single most expensive condition treated in US hospitals. Sepsis accounts for 40% of all ICU costs, and the average cost for treatment of ICU patients with sepsis is six times greater than that for nonsepsis ICU patients.

According to the Mayo Clinic website on sepsis, “Many doctors view sepsis as a three-stage syndrome, starting with sepsis and progressing through severe sepsis to septic shock. The goal is to treat sepsis during its early stage, before it becomes more dangerous.”

Sepsis is the body’s runaway immune response to life-threatening infection. The triggers of sepsis are not well understood, and the early clinical signs are often seen as a normal immune response, which makes sepsis hard to diagnose. Untreated sepsis progresses so rapidly that the diagnosis is often confirmed only after death. As sepsis progresses, blood supply to vital organs is restricted or shut down, resulting in severe sepsis, and if this condition persists, then the patient enters septic shock—a stage of sepsis for which the mortality rate can be as high as 70%.

Time is a critical factor affecting survival with sepsis. Infections that trigger sepsis and ultimately septic shock can be caused by bacteria, viruses, fungi, or parasites. A key step in early treatment or prevention of sepsis is fast, accurate identification of the pathogen and administration of effective treatment to slow or reverse the spread of the infection and the body’s septic response. Without timely intervention and treatment within the first few hours, patient survival falls dramatically, approaching zero by 36 hours.

There are three main approaches to lower the number of people experiencing the ravages of sepsis:

  • Early Identification. This includes using a data science approach to identify patients susceptible to sepsis, closer monitoring of those patients, more accurate means of diagnosing sepsis when other conditions are present, and awareness of the prevalence of pathogens.
  • Rapid Treatment. This includes having hospital staff on heightened alert for signs of sepsis, ready to quickly administer appropriate treatment targeted to the infection and to provide vital fluids to the patient.
  • Prevention. This includes lowering 30-day readmissions, which can be as high as 60 percent, and increasing the use of effective and consistent hygiene and of vaccinations.

The ultimate goal of these approaches is to reduce suffering, save lives and lower costs. As the interim director of the Data Science Institute at MTSU and visiting scholar at WPC, I am proud to work with WPC Healthcare on data science approaches designed to reduce the devastation of sepsis through early detection.


Is Your Data Strategy Future-Proof?


Published by: Pat McGrath, Chief Technical Officer at WPC Healthcare, Inc.

The fundamental principles of science still apply, even in the world of Big Data.

I think of my 7th-grade lab teacher sometimes when I work with a client on a data science-related project. When introducing the class to using metric scales, she emphasized the importance of calibrating the equipment first because, “Garbage in, garbage out.” In other words, you can’t expect to deliver accurate results if you don’t get the foundational aspects right up front.

This same philosophy applies to some of the most talked about, mulled over and trending topics in the world of healthcare – data science and predictive analytics. But what’s forgotten in the conversation is the emphasis on first getting a solid data foundation in place. Some mistakenly think that data is data, and it’s all equally valuable, accurate and usable. Not so. The distinction is referred to as data literacy – understanding the origin and meaning of the data being considered to enact change.

Organizations that want to succeed, deliver the best health outcomes, and control costs need to focus on getting a solid data foundation in place. Hospitals that demonstrate the highest level of downstream analytical maturity focus attention on the following upstream data-related activities:

  • Data capture
  • Data quality
  • Data integration
  • Data trustworthiness
  • Data governance
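
As a concrete illustration, the first three items on that list can be spot-checked with a few lines of code. A sketch using pandas, with hypothetical table and column names:

    # Illustrative upstream checks for the first three items above, using
    # pandas. Table and column names are hypothetical.
    import pandas as pd

    admissions = pd.DataFrame({
        "patient_id": [101, 102, 102, 104],
        "admit_date": ["2016-11-01", "2016-11-03", "2016-11-03", None],
        "ward": ["ICU", "ER", "ER", "ICU"],
    })

    # Data capture: how complete is each field?
    print("missing values per column:\n", admissions.isna().sum())

    # Data quality: duplicate records suggest a capture or integration defect.
    print("duplicate rows:", admissions.duplicated().sum())

    # Data integration: dates must parse consistently before systems can be joined.
    admissions["admit_date"] = pd.to_datetime(admissions["admit_date"], errors="coerce")
    print("unparseable dates:", admissions["admit_date"].isna().sum())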

Strategies for Optimizing Your Data Infrastructure

For organizations committed to developing a holistic data governance methodology, several core strategies can help you evolve:

  • Value data as an asset.
    At the heart of the analytical hospital is a commitment to treating data as a valuable asset that must be managed over time and constantly improved. As Peter Aiken said, “data is not technology.”

  • Don’t wait.
    Savvy healthcare leaders are not waiting for a universal data governance protocol. Instead, they invest early to deliver accurate and reliable, actionable data. This strategy allows for targeted analytical solutions that can be put into practice by clinicians and operating staff immediately.
  • A profound ROI in organized data and analytics.
    Provider organizations who invest time and resources into their data and analytics capabilities report a compelling return on investment. Related to that, senior leaders and governing boards are now including how their organizations capture, manage, store and utilize their data assets in the list of more traditional competitive differentiators such as price, product features and supply chain.
  • Your data foundation influences the value of your analytics.
    While success in the world of analytics is driven by a variety of elements, high performing organizations prioritize data management and governance rather than trying to solve for all data issues.
  • Prioritize a governance policy.
    Astute analytical leaders invest in their data assets based on a list of prioritized analytics use cases. That is a critical step towards becoming an analytical competitor.
  • Pursue opportunities for vast improvement.
    Data science and analytics can and should drive the business and clinical decisions of hospitals in order to make significant improvements in health outcomes.

I advise clients that while it may be initially intriguing to focus only on the ultimate goal (predictive analytics, machine learning, etc.), doing so bypasses the infrastructure and data governance work that is so crucial to long-term success.

 

Deploying Data Science to Proactively Address Zika Virus—Lessons Learned From Chicago


Abstract: Though public health officials are asserting cautious optimism that Zika virus will have minimal impact in North America, mosquito-borne viruses must be taken seriously. Communities throughout the US are confronting dengue fever, chikungunya and, more commonly, West Nile virus with an array of tools to manage the public health crisis. Data analytics gives affected communities the information they need not only to manage outbreaks but to predict where the next one is likely to occur, buying valuable time to minimize the impact.

The World Health Organization recently declared a public health emergency in connection with the Zika virus outbreak. No locally transmitted Zika virus cases have been reported in the continental United States, though a recently diagnosed Virginia college student joins roughly 30 people who contracted the virus while traveling in Central or South America. Zika is unlikely to have a significant impact in North America, but it is one of many mosquito-borne viruses that should be taken seriously, no matter where you live. Those on the front lines of containing these viral outbreaks would do well to harness data science as a powerful new tool to map where future outbreaks are likely to occur and thus minimize the devastating impact of a mosquito-borne virus.

What We Learned From Predicting West Nile Virus
Recently I addressed a global health conference in Colombo, Sri Lanka, on how data science incorporating predictive analytics was made available to the city of Chicago to predict where West Nile virus was likely to occur.

City public health systems have maintained comprehensive surveillance and control programs for West Nile since roughly 2004, but very few have developed a process to predict where the virus might occur. From the time the virus was first reported in the US in 1999 through 2012, treating West Nile cost $778 million in health care expenses and lost productivity.

Treatment is usually indiscriminate – and costly – insecticide spraying. Like chemotherapy, which kills non-cancerous cells alongside cancerous ones, the spraying helps kill mosquitoes but may not be the best thing for the environment, the municipality or public health in general.

To provide a more strategic, targeted approach, I worked with the city of Chicago to predict West Nile virus outbreaks. We collected a massive amount of weather and demographic data, as well as mapping previous testing and spraying locations. The result was an algorithm to predict when and where different species of mosquitoes will test positive for West Nile virus. City health officials could then use the algorithm to direct insecticide spraying and other measures, such as eliminating pools of standing water, only to the affected areas. The result: a better, faster and cheaper way to contain an outbreak—proactively.
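
For the curious, the modeling shape looks roughly like the sketch below. The column names and toy data are hypothetical; the real inputs were weather, demographic, and historical testing and spraying data:

    # A sketch of the modeling shape described above: join weather and trap
    # observations, then predict whether a trap's mosquitoes test positive.
    # Column names and data are hypothetical toy values.
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    traps = pd.DataFrame({
        "week": [22, 22, 23, 23, 24, 24, 25, 25],
        "latitude": [41.95, 41.88, 41.95, 41.88, 41.95, 41.88, 41.95, 41.88],
        "longitude": [-87.80, -87.62, -87.80, -87.62, -87.80, -87.62, -87.80, -87.62],
        "wnv_present": [0, 0, 0, 1, 1, 0, 1, 1],
    })
    weather = pd.DataFrame({
        "week": [22, 23, 24, 25],
        "avg_temp_f": [70, 75, 82, 85],
        "precip_in": [1.2, 0.4, 0.1, 0.0],
    })

    data = traps.merge(weather, on="week")  # integrate the data sources
    X = data[["week", "latitude", "longitude", "avg_temp_f", "precip_in"]]
    y = data["wnv_present"]

    model = RandomForestClassifier(n_estimators=100, random_state=0)
    scores = cross_val_score(model, X, y, cv=2)
    print("cross-validated accuracy:", scores.mean())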

Using Data to Combat Zika
For communities gearing up to combat Zika virus, the cost will be high, not only in terms of “vector control,” or mosquito management, but also in terms of the genetic defects linked to the disease. It’s time to get smarter about using available resources to fight Zika outbreaks.

First, it’s important to understand the unique behaviors of Zika-transmitting mosquitoes. They take shelter during the heat of the day to feed and, as a result, live closer to people than the mosquito that carries West Nile virus. We are collecting more data than at any other point in history; shouldn’t we make use of the data sitting in our repositories? Using data science to find the signal in the noise should prove useful for all of us. In the case of West Nile, our algorithm was able to predict outbreaks with 85 percent accuracy.

The tragic consequences of Zika virus currently unfolding in South America demand new approaches to combating the outbreak. Until antibodies can be developed into a vaccine to protect affected populations, data science is an essential addition to the public health tool kit for stemming current outbreaks.


In the Brave New World of Data-Science-Driven Healthcare, Will There Be Room for Physicians?


Imagine a computing platform that can read a healthcare organization’s email, Word documents, PDFs, EHR, and text files at marvelous speed, combining what it learns into a knowledge base without any help from you, the healthcare professional. When you are ready, you can ask it questions to help develop a working theory about facts, associations, or entities extracted from your enterprise data.

This is interesting on many levels, but what could such power mean for healthcare providers in the brave new world of data science?

Don’t panic—at least not yet!

The approach that many organizations are taking in the race to Big Data and beyond will certainly impact the level of panic. In my efforts to answer healthcare’s most challenging questions, I use machine-learning algorithms to help healthcare providers and health plans across the United States generate better outcomes clinically, financially, and operationally. However, there are ethical issues to consider as we leverage machines within healthcare. In my judgment, creating pure, undirected “artificial intelligence” is not as desirable as creating “beneficial intelligence” designed to support the work of healthcare professionals.

Recently, Hospital Corporation of America (HCA) made a strategic investment in Digital Reasoning to formalize their efforts to improve patient outcomes and reduce cost of care. In broad terms, Digital Reasoning has a corporate mission of using cognitive computing to create a better world. What is meant by “cognitive computing” and how will it make the world “better”?

Generally speaking, cognitive computing involves self-learning systems that use data mining, pattern recognition, and natural language processing to mimic the workings of the human brain. The goal of cognitive computing is to create automated IT systems capable of solving problems without human assistance.
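
To make the natural language processing ingredient less abstract, here is a tiny example of entity extraction, a first step toward the kind of knowledge base described above. It assumes spaCy and its small English model are installed, and the sample sentence is purely illustrative:

    # A small taste of the NLP ingredient described above: extracting named
    # entities from free text, a first step toward building a knowledge base.
    # Assumes spaCy and its small English model are installed:
    #   pip install spacy && python -m spacy download en_core_web_sm
    import spacy

    nlp = spacy.load("en_core_web_sm")
    note = ("The hospital admitted a patient from Nashville on Tuesday "
            "with a suspected bloodstream infection.")  # illustrative text

    for ent in nlp(note).ents:
        print(ent.text, "->", ent.label_)  # e.g., Nashville -> GPE, Tuesday -> DATE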

It is the last part of that statement that may have executives, clinicians, and coders anxious—and for good reason. After all, no one likes the idea of being replaced by a machine. Many healthcare professionals are certainly right to be guarded when vendors claim that data has great potential to create a more personal, automated, value-focused, and productive healthcare system and potentially reduce head count.

HCA has amassed a tremendous amount of patient data since its founding in 1968, so it’s easy to see why they are interested in exploring how machine-based intellectual capabilities could impact their business.