Security and Privacy Hurdles Plaguing AI-Driven Health Services

The following is a guest article by Dr. Sriram Rajagopalan, Enterprise Agile Evangelist at Inflectra

Today’s most significant security and privacy risk in health services is consumers’ lack of awareness about how their personal health information is handled.

What do I mean?

As technology grows, we have fallen into the habit of photographing our private health data and storing it on smart devices via digital wallets. When our fingerprint is linked to a laptop, or our insurance card is stored in Apple Wallet, how much do we really know about where that data is stored and how it will be protected?

Technology is advancing faster than we can keep up, and we are giving away our personal information too freely. With regulations still catching up to AI-powered solutions, it is too early to say how much personal data we have already handed to firms to use, or abuse, freely.

Of course, with every new technology comes abuse of that technology. So, I recommend the steps below, urging all patients to practice extreme care. But first, let’s quickly differentiate personally identifiable information (PII) from personal health information (PHI) for the sake of clarity later on.

  • PII includes the name, date of birth, contact information (address, telephone, and email), financial information (bank details), and government identifiers (Social Security number, driver’s license number). 
  • PHI includes PII plus other data such as medical record numbers, health plan identifiers, certificate or license numbers, vehicle identification and license plate numbers, and biometric identifiers (face, fingerprints, voice, and retina scans).
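To make the distinction concrete, here is a minimal illustrative sketch in Python. The field names and the two category sets are my own simplification for demonstration, not an official regulatory taxonomy:

```python
# Illustrative simplification: PII covers direct personal identifiers;
# PHI is a superset of PII that adds health-related and biometric data.
PII_FIELDS = {
    "name", "date_of_birth", "address", "telephone", "email",
    "bank_account", "social_security_number", "drivers_license",
}

# Fields that are PHI but not plain PII (health and biometric identifiers).
PHI_ONLY_FIELDS = {
    "medical_record_number", "health_plan_id", "certificate_number",
    "vehicle_id", "license_plate", "fingerprint", "face_scan",
    "voice_print", "retina_scan",
}

PHI_FIELDS = PII_FIELDS | PHI_ONLY_FIELDS  # PHI includes all PII


def classify(field: str) -> str:
    """Return 'PHI', 'PII', or 'other' for a hypothetical field name."""
    if field in PHI_ONLY_FIELDS:
        return "PHI"
    if field in PII_FIELDS:
        return "PII"  # note: under this model, all PII is also PHI
    return "other"
```

The point of the sketch is simply that PHI is the larger set: anything that counts as PII automatically counts as PHI once it appears in a health context.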

With that, here are my tips for patients:

  1. Think twice before inputting PII or PHI into unknown mobile apps without knowing the practices they follow.
  2. Check for release notes on the details (“bugs and fixes” or “new features and enhancements” are not acceptable release notes) before updating the apps you already have.
  3. Do not enable automatic updates and develop the practice of reading about what these updates involve. 
  4. Delete apps or accounts that you no longer use.
  5. Do not share information that will allow you to be tracked (where you are, what you do, etc.) 
  6. Beware of your social media practices.
  7. Do not store your passwords in browsers, on sticky notes, in text files, etc. Use a dependable password manager and a strong password scheme.
  8. Practice disaster recovery on yourself. How would your life be affected if you could not access your phone? 
  9. Avoid the excessive use of search engines and use incognito mode.
  10. Encrypt data and backup your data frequently to avoid having your data lost.

However, that’s for patients. IT professionals in the healthcare industry must play their part. Healthcare systems in the clinical setting can immensely benefit from risk evaluation and asking the following questions: 

  • What if healthcare providers can’t access patient data when they are using electronic health records (EHR) systems to connect with patients? 
  • What if the computer on wheels (COW) that nurses roll over to the patient’s room to track vitals, administer the medicines, and record their observations can’t connect to the network? 
  • What if the public address systems used to call emergency codes go down when a physician or nurse is needed immediately?

The Intersection of HIPAA and AI Solutions 

When it comes to healthcare, specifically the Health Insurance Portability and Accountability Act (HIPAA), we need to consider two critical concepts before we look at AI solutions developed by a vendor in the HIPAA realm.

  1. First, we have three personas: a patient, a vendor (a 3rd party solution provider), and a healthcare professional (HCP) or organization (HCO). The patient is the consumer of the solution provided by the vendor. The vendor may become a business associate if contracted to develop a solution by the HCP or HCO. In such cases, the HCP/HCO becomes the covered entity. Without a contract, the vendor is not a business associate, and the HCP/HCO is not a covered entity. 
  2. Second, the vendor must deal with personal health information (PHI), such as blood test results, diagnostic images, patient-physician communications and appointments, etc.

The question of HIPAA applies to an AI solution when the following two conditions are met at a high level:

  1. A contract exists between the HCP/HCO and the vendor, making them a covered entity and business associate.
  2. The vendor is directly involved in creating or receiving PHI data from the consumer or the covered entity, maintaining the data in a system, or transmitting the data between the parties.

Put simply, the HIPAA regulations apply whenever:

  • AI solutions derive data from medical records, such as the parsing of lab reports, hospital records, or other clinical data sheets; or 
  • Direct or derived data is gathered and processed to provide specific healthcare-related decisions.

However, the HIPAA regulations may not apply if AI solutions use publicly available data or require people to enter their own data for independent record-keeping purposes.

Therefore, consider the use case of a consumer downloading a mobile app, inputting their blood A1C levels, and following its recommendations to monitor their food intake. Because no doctor specifically asked the patient to perform this function, the vendor is not a business associate, and HIPAA may not apply.

On the other hand, if the doctor prescribes a CPAP device from a vendor the doctor has contracted with to collect the patient’s sleep data (frequently over the air) and communicates with the patient based on that data, then the vendor’s solution is subject to HIPAA.
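The two high-level conditions above can be sketched as a simple decision rule. This is only an illustration of the article’s simplified test, with hypothetical function and parameter names; real HIPAA applicability always requires legal analysis:

```python
def hipaa_may_apply(has_contract_with_hcp: bool, handles_phi: bool) -> bool:
    """Illustrative sketch of the two high-level conditions:

    1. A contract exists between the HCP/HCO and the vendor, making
       them a covered entity and a business associate, respectively.
    2. The vendor creates, receives, maintains, or transmits PHI.

    HIPAA may apply only when both conditions hold.
    """
    return has_contract_with_hcp and handles_phi


# Consumer A1C-tracking app: self-entered data, no contract with a doctor.
print(hipaa_may_apply(False, True))   # False -> HIPAA may not apply

# Contracted CPAP vendor streaming the patient's sleep data to the doctor.
print(hipaa_may_apply(True, True))    # True -> subject to HIPAA
```

The sketch mirrors the two scenarios above: the self-service A1C app fails the contract condition, while the contracted CPAP vendor satisfies both.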

Conclusion

In the rapidly evolving healthcare landscape, the integration of artificial intelligence (AI) has brought numerous security and privacy hurdles. While AI-powered health services promise to revolutionize patient care, diagnosis, treatment, and administrative efficiency, this new technology also presents serious security concerns that demand our immediate attention—both as patients and IT professionals in the healthcare industry.

About Sriram Rajagopalan, Ph.D.

Sriram Rajagopalan, Ph.D. is the Head of Training & Learning Services as well as Enterprise Agile Evangelist at Inflectra. Sriram designs and orchestrates the training curriculum for Inflectra’s platform and delivers business process consulting for strategic companies in multiple industries using Inflectra’s products. 

   
