by Matt Ferrari
Chief Technology Officer
In this CTO Talk podcast (aired on HealthcareNOW Radio June 25 – July 23) my guest was Adam Klass, from VigiLanz Corp. Adam's story is an interesting journey: from a college intern fascinated by statistics to a successful CTO who built a company with his father in an effort to find meaning in the data hospitals and clinicians collect. Founded in 2001, the company has come a long way, and it continues to get noticed for innovative solutions, from health data and rules engines to machine learning. Gartner named VigiLanz a Cool Vendor in Healthcare Providers in 2015.
What Adam is doing is important. There’s been so much emphasis on getting patient history and data into the electronic health record (EHR) but until we actually get it back out and use it, we just have technology, not value-based care, and certainly not innovation in healthcare.
Interoperability and value-based care are challenges so many providers I meet are struggling with, and the investments in EHR/EMR technologies have been enormous. Healthcare Informatics cites this problem in its coverage of Black Book's survey, "State of the Global EHR Industry 2018," which projects global spending on EMR/EHR technology to exceed $30 billion (U.S.) by 2020.
As we build momentum toward value-based care, interoperability moves from a nice-to-have to a must-have. Health IT Analytics cites a recent Healthcare Financial Management Association (HFMA) survey (sponsored by Humana) that found "more than 70 percent of healthcare financial executives say that data interoperability must improve within the next three years to ensure the success of value-based care."
VigiLanz converts data in the EHR/EMR into actionable intelligence in its platform, making interoperability far easier. Instead of relying on HL7 flat files and waiting for hospitals to find the time and money to build out the interface, the company uses a web API to pull the information it needs from the EHR/EMRs, handling the connectivity and data-structure normalization itself. This can save a couple of hundred hours of IT time, and you can do the math on the cost involved. It also matters because to really find insights and make meaning of data we need a big N, which means gathering and normalizing data across disparate silos, often from within differing EHR technologies. With this approach, Adam is able to amass big data sets quickly, regardless of the EHR/EMR, and run rules engines, machine learning and predictive analytics against them to glean insights and support better clinical decision making, including advances in preventative care at hospitals across the country.
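To make the normalization idea concrete, here is a minimal, hypothetical sketch (not VigiLanz's actual API or schema; vendor names and field names are invented) of what mapping records pulled from two different EHR systems onto one shared schema might look like:

```python
# Hypothetical sketch: records pulled via web API from two different
# EHR vendors arrive with different field names; a per-vendor mapping
# folds them into one common schema so they can feed the same rules
# engine or model. All vendor and field names here are invented.

FIELD_MAPS = {
    "vendor_a": {"pt_id": "patient_id", "dob": "birth_date", "rx": "medications"},
    "vendor_b": {"PatientID": "patient_id", "BirthDate": "birth_date", "Meds": "medications"},
}

def normalize_record(record: dict, vendor: str) -> dict:
    """Rename a vendor-specific record's fields to the shared schema."""
    mapping = FIELD_MAPS[vendor]
    return {common: record[source] for source, common in mapping.items()}

# Records from two hospitals on different EHR systems...
a = normalize_record({"pt_id": "123", "dob": "1960-01-01", "rx": ["warfarin"]}, "vendor_a")
b = normalize_record({"PatientID": "456", "BirthDate": "1972-05-09", "Meds": ["heparin"]}, "vendor_b")
# ...land in the same shape, ready to be pooled into one big-N data set.
assert a.keys() == b.keys()
```

The value is in the pooling: once disparate silos share a schema, analytics can run across the combined set instead of one hospital at a time.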
Give the conversation a listen here on demand:
I found this conversation especially interesting because it can be tough to get people to embrace change, like the move to value-based care or a willingness to take disparate, fragmented health data and find a way to share it across systems. But when you can show them streamlined workflows, decreased maintenance costs and minimized risk, all leading to improved patient outcomes, folks get on board pretty quickly.
If you missed any of my previous CTO Talks, you can stream them on demand here on my SoundCloud playlist.