About Adam Curran

Adam Curran is a Product Marketing Manager at SRS. He oversees marketing intelligence to support the development of strategic marketing plans. Prior to joining the organization, he was a key member of a pharmaceutical software company’s Clinical Development Business Unit, specializing in the clinical data management elements of the drug development lifecycle. He was also the editor of that company’s microsite blog. He has also held roles at the UK’s National Energy Foundation and Skills Funding Agency.

The Importance of Flexible Technology in High-Performance Practices

An article posted recently to LinkedIn—about the jobs most and least likely to fall victim to robot replacements—started me thinking about the place of technology in healthcare. One takeaway from the article is that automation is best deployed for tasks that are manually or cognitively repetitive, freeing humans to specialize in tasks that are non-repetitive and non-predictable, ones the writer describes as requiring “human intuition, reasoning, empathy and emotion.”[1]

That was exactly the promise of electronic health record (EHR) technology—routine bureaucratic tasks would be automated, freeing doctors and staff to do what they do best: treat patients. Yet in a recent study published in the Annals of Internal Medicine, ambulatory physicians spent an average of a full hour at the computer for every hour they spent face to face with patients.[2] Imagine automating a factory and discovering that workers now worked twice as long, or produced half as much, because of the time required by the new technology that was supposed to reduce their workload.

Paradoxically, with recent advances in technology, it is now more possible than ever for EHRs to fulfill their original promise—and more; the problem is that most of the EHRs being offered to medical practices are simply the wrong technology. In an attempt to meet standardized government regulations, vendors have created standardized EHRs—gigantic, one-size-fits-all behemoths that attempt to meet the needs of all physicians but end up missing the mark with nearly everyone, particularly specialists. KLAS’ Ambulatory Specialty 2016—One Size Does Not Fit All—Performance Report found that although traditional EHR vendors try to cover all specialties, fields like ophthalmology, orthopedics, and dermatology still lack the functionality they require.[3]

This is why one size definitely does not fit all. The right EHR solution for a hospital or general practitioner, seeing a limited number of patients with a wide variety of conditions, will look quite different from the EHR for specialists who see a high volume of patients with similar complaints. And of course, different specialties won’t want exactly the same EHR, either, making flexibility—rather than universal applicability—a major prerequisite.

No wonder that 86% of specialists, according to Black Book Market Research, agree that the single biggest trend in technology replacements these days is the move to specialty-driven EHRs because of the workflow and productivity complications that accompany conventional, template-driven EHRs.[4]

Unfortunately, the problems with inflexible, template-driven EHRs don’t end with the lack of specialty-specific solutions. A secondary, but still significant, concern is the inability of many EHRs to be tailored to the needs of individual physicians within a practice. One doctor may prefer taking notes, another may input her own data, while a third dictates; one may be comfortable communicating through a patient portal, another prefers the phone. True flexibility means that no provider has to change the way that he or she has been practicing medicine simply to satisfy the demands of a generic template.

It also means that, when it comes to the increasingly crucial matter of data collection, the decision about how data should be collected—which data should be captured electronically and which should remain manual—is left up to the individual practice. In the next blog, I will look at what is called “role-based data entry,” and how this can increase productivity and cut costs.


 

[1] https://www.linkedin.com/pulse/5-jobs-robots-take-first-shelly-palmer

[2] http://annals.org/article.aspx?articleid=2546704

[3] Ambulatory Specialty 2016—One Size Does Not Fit All—Performance Report. KLAS. April 2016.

[4] https://blackbookmarketresearch.newswire.com/news/specialty-driven-ehrs-make-a-comeback-reveals-2016-black-book-11534546

The Right Tools for Relevant Results

There is discussion in the industry about the effectiveness of healthcare information technology (HCIT) solutions. And so there should be; although we have seen improvements in HCIT solutions, a significant number of physicians are not happy with their current systems. Perhaps it is because some vendors believe they know what is best for a practice, and build the system around their own vision at the expense of how the doctor likes to work. Or maybe it’s because vendors sell practices solutions that aren’t specialized to their requirements—leading to complexity, fatigue, and frustration. In either case, doctors are forced to use tools that are inappropriate to their needs and slow them down.

It’s not rocket science: doctors want tools that help them do their job effectively. Like the stethoscope—it’s one of the oldest medical tools still in use today, but it continues to perform an essential task, even in an era of high tech, and there is nothing complicated about it. Although it was originally invented to spare a young physician the embarrassment of putting his ear directly up against the chest of a young woman, it turned out to have enormous diagnostic value. Because of that, the stethoscope quickly caught on with other doctors.

Another good example is molecular breast imaging (MBI). Mammography has long been a good way to detect breast cancer, but MBI turns out to be three times more effective at finding tumors in dense breast tissue. MBI is simply a tool that produces better results.

What about laser surgery? Developed at first for eye and skin surgery, it has expanded its range to include different medical and cosmetic procedures, from cosmetic dermatology to the removal of precancerous lesions. Laser surgery allows doctors to perform certain specific surgeries more safely and accurately—again, a new tool that provides better results.

When it comes to HCIT solutions, however, the reception has been decidedly less enthusiastic. Maybe that’s because, in contrast to the examples above, it hasn’t been clear what the purpose of HCIT solutions actually is. To help doctors collect data on patients, or to help administrators collect data on doctors? To make practices more efficient, or to simplify the government’s monitoring of public health? Without a clear task to perform, it’s not surprising that HCIT solutions have produced mixed results. It’s hard to assess the value of a tool when you aren’t sure what it is supposed to do.

It turns out that, like the stethoscope, electronic health record solutions were a tool designed for extra-diagnostic reasons, and then later repurposed. However, unlike the stethoscope, the adoption of EHRs has been driven not by doctors who found them helpful, but by hospitals, insurance plans, and government agencies that sought to control skyrocketing costs and standardize healthcare. This disparity has been an underlying cause of ineffective workflows within these systems. And even when EHRs were designed with physicians in mind, they were designed for primary care physicians, leaving the specialist community underserved.

What is clear is that, when an HCIT solution is designed with the primary purpose of helping doctors, the industry does see value in it. According to the latest Black Book survey of specialty-driven EHRs, 80% of practices with specialty-distinctive EHRs affirm their confidence in their systems. The same survey reported that satisfaction among users who had switched to specialty-driven EHRs has shot up to 80%. And finally, 86% of specialists agreed that the biggest trend in technology replacements is the move to specialty-driven EHRs, due to the workflow and productivity complications specialists experience with conventional systems.

The statistics show what we already knew: doctors want technology and tools that give them relevant results. Like earlier great medical inventions, HCIT can play a vital role too. One positive development is that EHRs, like the lasers used in surgery, have evolved to serve a variety of specific purposes. Just as there isn’t a single type of laser used by both ophthalmologists and dermatologists, EHRs are increasingly specialty specific.

This means that specialists are no longer forced to use systems designed for primary care physicians that collect every piece of data that every type of doctor might possibly need. That sort of all-inclusive data collection doesn’t lead to better results; if anything, too much data causes unnecessary clutter, making analysis more difficult. What is crucial is having more RELEVANT data. Specialists need EHRs that collect the data that is relevant to them, and only the data that is relevant to them. They need an HCIT solution that is driven by their specialty, that respects their workflow, and that has the flexibility to handle their practice’s unique requirements.

To find out more about developments in HCIT solutions that are improving patient care, check out our latest whitepaper, “Healthcare: How Moving from Paperless to Frictionless is Improving Patient Care”.

Free-Flow Workflow: How Does It Help with Data Collection?

“Being flooded with information doesn’t mean we have the right information or that we’re in touch with the right people” – Bill Gates

We are able to collect a wealth of information today, thanks to technological improvements over the last couple of years. For a long time, specialists struggled to get the most out of earlier EHR solutions due to the limited data available. This was not so much the fault of EHR vendors as of the inherent limitations of the technology at the time. Additionally, the first “templated” EHR systems were specifically designed for primary care and family practice doctors. These systems could not meet specialists’ different data needs or handle their much higher patient volumes. I did a post recently on the evolution of data capture (read it here).

When it comes to submitting meaningful use data to CMS, however, identifying and collecting the right data from everything that is available generally takes a long time. Studies show an increase in the number of physicians who spend more than one day a week on paperwork, and indicate that many physicians still feel that EHRs do not save time. Although this technology allows practices to comply with meaningful use requirements, the cost seems too high.

What are we seeing here? Physicians are spending more time capturing data because of regulations, and this eats into the time available to see patients. How did we get to a point where the physician spends more time staring at the screen than looking at the patient? I’m not a doctor, but I imagine most went into the profession to help people as much as they can, so more face-to-face time with the patient should be the end goal.

What is the solution to handling this volume of data? Certainly not reducing the amount of data—it would be hard and time-consuming to determine which data to discard. The solution must focus on making it quicker to handle the data. This is where free-flow workflow comes into play. Rather than requiring the laborious process of submitting the data to each application separately, it reduces the repetitive steps involved, thereby streamlining the submission of data.

This big time saver helps to alleviate the pain, but there are still limitations. Fortunately, we are now at a point where we can get a workflow that isn’t just free-flow, but also adaptive. To find out more about this development and other future trends, you can read our white paper.

Hackathon 2.0: Bringing the Best Out of Participating Clients and Employees!

We have had a lot of fun here at SRS over the last couple of weeks; don’t worry, we have still been working hard! To clarify, we have been focused on our second annual Hackathon, a collaborative forum designed to innovate meaningful HCIT solutions for specialists.

We brought together enthusiastic employees from throughout the organization, as well as select clients, to come up with ideas for new and useful innovations. We didn’t see this as a mere side project; our staff was fully committed, working around the clock over the last couple of weeks to bring these great ideas to fruition.

This year’s theme was “Problem Solved”. Cross-functional teams were created and tasked with coming up with breakthrough solutions to problems affecting the patient and/or clinical experience.

Teams were also asked to think from the point of view of a healthcare IT start-up and were encouraged to invent a solution that responded to a real need in today’s market from a fresh perspective.

Each team presented its solution’s business case, along with a prototype, a video, and supporting marketing campaigns. Judges selected winners, and SRS will fund development of the innovations they believe will have the biggest impact on providing better healthcare through technology.

Several of the ideas selected will be showcased in the Innovation Expo at SRS’ annual User Summit. Clients can see future innovations in action and add their feedback at the event. Last year’s expo was one of the highlights of the conference.

We are always looking to hear great ideas, and we get very excited during the Hackathon period, which allows us to bring together our creative staff and client partners. That is the thing about great ideas: you just never know where the next one will come from! This is how to arrive at solutions that are truly user-centric in design.

Click here to learn more about how we do things.

Patient-centric Data Capture—Where Is It?

We all know how important the patient experience is becoming in clinical trials and healthcare. With more emphasis being placed on quality care and patients’ active participation in their own treatment, it follows that this will affect what solutions and services are required to satisfy consumers in this market. Consumers today have a flood of information available at their fingertips—an amount unimaginable even just 15 years ago. And while the ability to look up symptoms online in the middle of the night has undoubtedly increased the number of hypochondriacs, it has also produced a higher number of truly educated patients, and an accompanying need for specialists to respect and involve them in the diagnosis and treatment process.

But what does it mean to be patient-centric? Our good friend Wikipedia defines it as “support[ing] active involvement of patients and their families in the design of new care models and in decision-making about individual options for treatment.” Not much help, really, is it?

The Institute of Medicine defines it as “providing care that is respectful of and responsive to individual patient preferences, needs, and values, and ensuring that patient values guide all clinical decisions.” The difference between the definitions seems to come down to how involved the patient gets in his or her healthcare. The first suggests that the specialist is at the center of decision making but supports patient involvement as well. The latter, at least in my opinion, implies that the specialist actively collaborates with the patient, empowering them with the data necessary to make their own treatment decisions.

By either definition, however, data capture currently falls short of what it takes to be truly patient-centric, despite how far it has come over the last decade. Electronic health record (EHR) solutions have been widely adopted across healthcare specialties, and although the way they collect data can create friction and inefficiencies in specialists’ workflows, they still provide enormous benefits. They give specialists faster access to vast quantities of patient data than traditional paper-based systems do, and they eliminate the need for patients to fill out the same forms again and again at each specialist’s office.

With the power of technology growing at an exponential rate, new technology solutions emerge every day; the challenge is to figure out how to use them to address the real problems that medical practices face—in other words, to provide the right technology solution, one that really works for practices. At the moment, more often than not, EHR software interferes with and takes time away from the doctor-patient interaction. However, by giving specialists data-capture tools that allow them to focus on their traditional role as caregivers and that reduce the time and energy diverted away from patients, everyone benefits: specialists win, and therefore so do their patients.

There are already good vendors out there who are designing solutions with specialists’ requirements in mind, and some of these certainly help to give specialists more time with patients. However, to achieve a truly patient-centric solution, data capture will need to both predict and adapt to the data being fed into it in real-time. This would give specialists relevant, up-to-date information right at their fingertips, which they could use both to inform their own decision-making process and to educate the patient on their particular condition. The result would be a collaborative, evidence-based plan of care that—because the patient had participated in creating it—would lead to an increased patient commitment to the plan and a better outcome overall.

That’s what providing a truly patient-centric solution looks like.

To find out more about the evolution of data capture and what to expect in the future, you can read our recent white paper on this topic.

How the Evolution Started in Data-Capture Technology

Do you remember the days when cell phones were brand new? I am not referring to the Nokia 3310 (back when all we needed was a single game, Snake – simpler times . . .). I am talking about when cell phones were first introduced. Those were the days when they were purchased only by business people, and you could only make calls near a transmitter tower (oh, how mobile!). They used to come with big cases, but these were not for the phone itself; their real purpose was to hold the phone’s huge battery! Despite all that, the purpose of the original cell phones was clear—to make phone calls on the move. Well, as long as you were passing at least one transmitter tower on the way . . .

Fast-forward to today—the cell phone we once knew has completely changed, and along with it, we see a transformation in how people see and use their phones. What used to be their original purpose (making phone calls) has now been virtually replaced by activities such as Internet browsing, checking social networks, shopping, listening to music, and playing games (you can still download Snake, but it’s no longer pre-installed!).

It would probably be more fitting to call them powerful mini-computers; the average smartphone today is millions of times more powerful than all of NASA’s combined computing power in 1969. Smartphones today are even powerful enough to run old Windows operating systems such as Windows 95. Good to know for all those old-operating-system enthusiasts who want a bit of nostalgia on the go.

The evolution of cell phones eventually led to a revolution in the market. The pace at which technology was developing eventually led to the creation of the first iPhone—the rest is history!

So how does the evolution and revolution in cell phones relate to data-capture technology? Just as the first cell phones had only one purpose—talking—the first data-capture systems served a single, simple purpose: collecting and sharing information. In the 1990s, electronic data capture focused almost exclusively on big data associated with clinical trials, through EDC and electronic patient-reported outcomes (ePRO) systems, before eventually being adapted for private medical practice. Over the years, the opportunities afforded by electronic data capture have grown, driven in part by rising healthcare costs.

However, although these first digital data-capture systems offered some relief to physicians and other users, they were still time-consuming and cumbersome, creating more productivity issues than they solved. What was meant to save time had the opposite effect: while the new systems were being introduced, physicians ended up seeing fewer patients.

Back then, these solutions were designed for primary-care physicians. Specialists, who needed to maintain smaller sets of data, found that these first digital systems did not take their specific needs into account. What specialists required was a solution that would allow them to see many patients without sacrificing data quality or regulatory compliance. Fortunately, a few vendors had the insight to rise to the challenge and help solve these specialty-specific problems.

To find out more about the evolution of data capture and how EHR solutions are becoming revolutionary—like smartphones—read our recent whitepaper on this topic.

Top 5 Observations at HIMSS16


With a conference that draws over 50,000 attendees, 1300+ vendors, 300 educational sessions, and interesting keynote speakers, there is always plenty of food for thought. So much so that it can take a while to really assimilate all the information and process it into key observations.

Our team has just returned from the show, so I just wanted to quickly share our top 5 observations at HIMSS16:

  1. Value-based payments: There was much discussion of the shift to value-based payment. The MACRA/MIPS regulations are expected in the spring, which could mean as early as March or as late as June, with the Final Regulations mandated to be published by November 1. While the goal of MIPS is to simplify life for providers (by rolling up all the various current programs into one streamlined program), it’s a good bet that things will get more complex before they get easier. All of this raises the question: how will physicians be ready to comply beginning on January 1, 2017?
  2. Interoperability: No surprise that everyone was talking about this! It was reinforced when big-name healthcare technology providers promised to use standardized APIs to make access to patient information easier. Interestingly enough, this also ties in with HHS wanting to expand its oversight of electronic health record vendors. The proposal it released on March 1 would allow the agency to review how certified health IT products interact with other products, with the aim of preventing data blocking, and to review certified HIT vendors if required (and even to take away their certification if necessary!). The comment period for the ONC rule ends on May 2.
  3. Population health: This is increasingly one of the top buzzwords at the show. More and more people are talking about it, but there does not seem to be a clear definition of the value it brings. After discussions with various attendees and vendors, it was clear how unclear it was: everyone gave a different answer. The term population health is much more widely used than it was back in 2003, when David Kindig and Greg Stoddart defined it as “the health outcomes of a group of individuals, including the distribution of such outcomes within the group.” The management element involves using aggregated patient data to devise actions that improve both clinical and financial outcomes. But what data should be used, especially when it comes to specialty practices? Clearly this needs to be defined to ensure we get real value from these solutions.
  4. HHS and CMS: There was an interesting session with Karen DeSalvo (National Coordinator for HIT, Asst. Secretary, HHS) and Andy Slavitt (Acting Administrator, CMS) where the barriers to data sharing were discussed, and 3 commitments were announced:
    1. Consumers will be able to easily and securely access their electronic health information and send it wherever and to whomever they want.
    2. Providers will share information for patient care with other providers and will refrain from information blocking.
    3. The government will implement national interoperability standards, policies, and practices and will adopt best practices related to privacy and security.

This further reinforces the second observation in this post, about HHS wanting to expand its oversight of electronic health record vendors. The session also brought up an interesting point about data blocking: DeSalvo pointed out that a year ago there were a “host of organizations who denied that blocking even was happening,” and now these same groups are “willing to publicly say that they want to engage in something now they’ve acknowledged info blocking can exist.” Hopefully, these groups will follow through on their pledges. As Slavitt advised, “I strongly encourage you to recognize those that don’t [live up to their pledges]” (FierceHealthIT).

  5. EHR collides with the NFL: Denver Broncos quarterback Peyton Manning, the reigning Super Bowl champ, gave a speech at the show thanking the health IT community. For a man who has gone through 3 potentially career-ending neck surgeries, I think it is fair to say he can “fully appreciate the value of information systems to keep hospitals functioning.” A physician joined Manning on stage to discuss the NFL’s EHR system and its portals, which allow players access to their medical details. Manning put it like this: “Football is a game. Revolutionizing healthcare is a mighty endeavor.” He also noted that leaders in any field need to evolve to match circumstances (HealthcareIT News).

Of course, HIMSS is a huge show where other topics were discussed too, such as patient engagement and RCM. The points above are only our key takeaways. We want to understand the latest regulations and trends, and how they will impact healthcare specialists. What were your key takeaways?