Monday, October 1, 2018 2:00 PM

Clarity on Artificial Intelligence and Patient Privacy

Written by Mark H. Johnson, MHA, RN-BC, CPHIMS, FHIMSS, Vice President, Sales and Marketing - iatricSystems

According to Frost & Sullivan, Artificial Intelligence systems are projected to be a $6 billion industry by 2021¹. In fact, if you Google “artificial intelligence” and “patient privacy” you’ll get at least 35,000 results. There’s been a lot of hype in the media recently about artificial intelligence (AI) and whether it’s good or bad for patient privacy. No matter where you stand on the topic, there’s no doubt that AI is already helping privacy auditors save time. Read on to learn how…

When a privacy auditor sits down to review all of the users who recently accessed patients’ Protected Health Information (PHI) at your organization, it can be an overwhelming task, since millions of PHI views occur on a daily basis.

Now, imagine that privacy auditor reviewing a single worklist of truly suspicious activity, free of false positives. That saves time — your auditor doesn’t have to spend time reviewing long reports or wading through PHI audit logs. Better yet, imagine your privacy team receiving automatic alerts as soon as truly suspicious activity occurs. This is what Artificial Intelligence (AI) can help accomplish.

What is Artificial Intelligence?
Artificial Intelligence in healthcare refers to the use of algorithms to approximate human reasoning in the analysis of complex patient data. Simply put, AI is the ability for software to approximate conclusions without direct human input.

AI systems like Haystack can examine your patient privacy audit logs and millions of PHI views behind the scenes so your privacy and compliance auditors don’t have to. Haystack can then apply the same methodology your auditors would use themselves — plus other advanced methodology that they may not — to remove false positives and zero in on only the truly suspicious activity.

If you’ve done any reading on AI, you may have seen the phrase Machine Learning.

What is Machine Learning, and how does it differ from AI?
Machine Learning (ML) certainly falls under the broader umbrella of AI, but its definition is more specific. ML uses statistical techniques to give computers the ability to "learn" — or progressively improve performance on a specific task — without being explicitly programmed to do so. ML allows software to respond to situations that it has not encountered before, replacing manual processes that previously would have required lots of time-intensive human analysis.
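To make "learning without being explicitly programmed" concrete, here is a minimal sketch in the audit context: instead of hard-coding a rule for what counts as suspicious, a program fits a decision threshold from past cases that were labeled appropriate or inappropriate after investigation. The feature (a user's activity relative to the peer average) and all data below are invented for illustration; this is not Haystack's actual algorithm.

```python
# Toy ML sketch: learn a decision rule from labeled past audits
# rather than programming the rule by hand. (Illustrative only.)

def learn_threshold(cases):
    """cases: list of (ratio_to_peer_average, was_inappropriate) pairs.
    Returns the ratio threshold that best separates the two classes,
    i.e., the one with the fewest misclassified past cases."""
    candidates = sorted({ratio for ratio, _ in cases})
    best_t, best_errors = candidates[0], len(cases) + 1
    for t in candidates:
        # Predict "inappropriate" whenever the ratio is at or above t,
        # then count how many labeled cases that prediction gets wrong.
        errors = sum((ratio >= t) != bad for ratio, bad in cases)
        if errors < best_errors:
            best_t, best_errors = t, errors
    return best_t

# Hypothetical history: (activity ratio vs. peers, investigated outcome)
history = [(1.1, False), (0.9, False), (1.3, False), (3.8, True), (4.2, True)]
print(learn_threshold(history))  # 3.8 — learned, not hand-coded
```

As new investigations conclude, their outcomes can be appended to the history and the threshold refit, which is the "progressively improve performance" part of the ML definition above.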

Speaking of analysis, if you’ve read about AI and ML, you may have also seen the phrase Behavioral Analysis.

How does Behavioral Analysis fit into the equation?
AI can be used to track behavior — and in some cases even predict behavior. Let’s stick with tracking behavior for now. Each user in your system tends to follow a typical workflow pattern as they go about their day. Let’s say a nurse who works in the Emergency Department generated 645 user activity events during her shift, but other ED nurses who worked the same shift only generated 151 events on average that day. Is something amiss?

Perhaps. There are other factors that must be considered, but whether your team concludes that the nurse’s access was appropriate or inappropriate (perhaps inappropriate in this example), those types of Behavioral Analysis details (e.g., 151 average events vs. 645 events) can be used to help substantiate your team’s conclusion.
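The peer comparison in the ED nurse example can be sketched as a simple statistical check: measure how far a user's event count sits from the peer average, in units of the peers' standard deviation. The function name, threshold, and numbers below are illustrative assumptions, not Haystack's actual methodology.

```python
# Minimal sketch of the peer-comparison idea: flag a user whose
# daily event count deviates sharply from same-shift peers.
# Names and the z-score threshold are illustrative assumptions.

def flag_outlier(user_events, peer_event_counts, z_threshold=2.0):
    """Return True if user_events is a statistical outlier vs. peers."""
    n = len(peer_event_counts)
    mean = sum(peer_event_counts) / n
    variance = sum((c - mean) ** 2 for c in peer_event_counts) / n
    std = variance ** 0.5
    if std == 0:
        # Peers are identical; any deviation at all stands out.
        return user_events != mean
    z = (user_events - mean) / std
    return abs(z) > z_threshold

# The ED example: one nurse generated 645 events while her peers
# averaged about 151 that shift.
peers = [140, 155, 160, 149]          # hypothetical peer counts
print(flag_outlier(645, peers))       # True — worth a closer look
print(flag_outlier(150, peers))       # False — within normal range
```

The computed deviation (645 events against a 151-event peer average) is exactly the kind of Behavioral Analysis detail that can be attached to an incident to substantiate the team's conclusion either way.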

Saving staff time
Haystack combines Artificial Intelligence, Machine Learning, and Behavioral Analysis to help privacy auditors save time. Its AI analyzes millions of PHI views behind the scenes, picks out the suspicious activities, and removes false positives — so your auditors don’t have to. Its ML learns from past audits, adjusting algorithms automatically depending on whether incidents are deemed appropriate or inappropriate after investigation. And its Behavioral Analysis gives your staff details that can be used to help determine whether incidents are appropriate and to substantiate their conclusions.

Contact us anytime to gain more clarity on Artificial Intelligence and Patient Privacy.

References:

  1. Frost & Sullivan, January 2016, “From $600 M to $6 Billion, Artificial Intelligence Systems Poised for Dramatic Market Expansion in Healthcare,” https://ww2.frost.com/news/press-releases/600-m-6-billion-artificial-intelligence-systems-poised-dramatic-market-expansion-healthcare, accessed September 18, 2018.