AI Glossary
Putting HIPAA and AI together means making sure any patient health information you feed into an AI tool stays private, secure, and fully under your control, just like it would be in a locked filing cabinet.
What it really means
HIPAA, the Health Insurance Portability and Accountability Act, sets the rules for how patient health information (called “protected health information” or PHI) can be stored, shared, and used. When you add AI into the mix, those rules don’t change. The AI vendor becomes what’s called a “business associate” under HIPAA, which means you’re responsible for making sure that vendor signs a contract promising to protect the data the same way you do.
I help healthcare-adjacent businesses, from dental practices and therapy offices to medical billing services, understand that most off-the-shelf AI tools (think ChatGPT, Claude, or Google’s Gemini) are not designed for PHI out of the box. The consumer version of ChatGPT, for example, can use whatever you type in to train its models unless you change the default settings. That’s a HIPAA violation waiting to happen if you’re pasting patient names, diagnoses, or insurance numbers.
The real meaning of HIPAA-compliant AI is this: the AI provider has signed a Business Associate Agreement (BAA) with you, the data stays encrypted at rest and in transit, and the tool doesn’t use your data to train its models. Period.
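If it helps to see that as a checklist, here’s a minimal sketch in Python of how I’d track it. The AIVendorCheck structure and its field names are my own shorthand for this post, not something pulled from HIPAA or any vendor’s paperwork.

```python
from dataclasses import dataclass

@dataclass
class AIVendorCheck:
    """Answers to get in writing before any PHI touches the tool.
    Field names are illustrative shorthand, not an official HIPAA checklist."""
    signed_baa: bool            # vendor has signed a Business Associate Agreement
    encrypted_at_rest: bool     # data is encrypted while stored on the vendor's servers
    encrypted_in_transit: bool  # data is encrypted moving between you and the vendor
    trains_on_your_data: bool   # vendor uses your inputs to train its models

def ok_for_phi(vendor: AIVendorCheck) -> bool:
    """True only if every requirement from the definition above is met."""
    return (vendor.signed_baa
            and vendor.encrypted_at_rest
            and vendor.encrypted_in_transit
            and not vendor.trains_on_your_data)

# A consumer chatbot with no BAA fails on the first question alone.
print(ok_for_phi(AIVendorCheck(False, True, True, True)))  # False
```

If any one of those answers comes back wrong, the whole thing is a no. That’s the “Period.” part.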
Where it shows up
You’ll run into HIPAA and AI in any business that touches patient data. I’ve worked with:
- A dental practice in Winter Park using AI to draft post-visit summaries for patients — they needed a BAA with the AI platform before they could send any patient names or treatment notes.
- A law firm in downtown Orlando that handles medical malpractice cases — they wanted AI to summarize medical records, but those records are PHI. Same rules apply.
- A medical billing company in Maitland using AI to flag coding errors in claims — that’s PHI flowing through the system, so the AI tool had to be HIPAA-compliant.
- A pool service in Clermont that does not deal with health data — they can use any AI tool freely. But if they ever started a side business doing health screenings? Different story.
The line is simple: if the data includes a patient’s name, address, birth date, Social Security number, medical record number, or health history (just a few of the 18 identifiers HIPAA calls out), you’re in HIPAA territory.
Common SMB use cases
Small and mid-market healthcare businesses are using AI in practical ways that stay within HIPAA rules. Here’s what I see most often:
- Drafting patient communications: AI writes follow-up emails, appointment reminders, or discharge instructions, but only after the practice strips out patient identifiers and uses a HIPAA-compliant tool (there’s a rough sketch of that scrubbing step after this list).
- Summarizing medical records: A doctor’s office uploads a patient’s chart into a HIPAA-compliant AI to get a one-paragraph summary of their history. The AI never stores the data.
- Transcribing clinical notes: AI listens to a doctor-patient conversation (with consent) and generates structured notes for the EHR — but the transcription service must have a BAA in place.
- Insurance claim review: A billing team uses AI to check for missing codes or potential denials before submitting claims. The AI tool processes PHI but doesn’t keep it.
None of these use cases require the AI to “think” about the patient — they just need to process text or speech securely.
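To make that “strip out patient identifiers first” step concrete, here’s a rough sketch of a first-pass scrub. The scrub_phi function and its patterns are mine and purely illustrative; pattern matching like this catches the obvious stuff, but on its own it does not make text HIPAA de-identified (more on that in the pitfalls below).

```python
import re

# First-pass scrub before a note goes to a HIPAA-compliant tool.
# These patterns are illustrative, not the full list of 18 HIPAA identifiers.
# Names are the hard part: no simple pattern catches them reliably.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
    "date": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
    "mrn": re.compile(r"\bMRN[:\s]*\d+\b", re.IGNORECASE),
}

def scrub_phi(text: str) -> str:
    """Replace obvious identifier patterns with placeholders like [SSN]."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label.upper()}]", text)
    return text

note = "Pt seen 3/14/2024, MRN 448812, callback 407-555-0182 re: crown prep."
print(scrub_phi(note))
# Pt seen [DATE], [MRN], callback [PHONE] re: crown prep.
```

Even with a scrub like this, the tool on the receiving end still needs to be the HIPAA-compliant kind; scrubbing is a belt-and-suspenders step, not a substitute for the BAA.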
Pitfalls (what gets oversold)
I’ve seen three common traps that trip up small businesses:
- “We’re just testing it with real data.” I’ve had a Winter Park dental office tell me they were “just experimenting” with ChatGPT by pasting a few patient notes. That’s a breach. Any use of PHI in a non-compliant tool is a violation, even if you delete it immediately.
- “The AI vendor says they’re HIPAA-compliant.” Some vendors claim compliance but don’t offer a BAA. Without that signed agreement, you’re on the hook. I always tell clients: “Show me the BAA, or we don’t connect it to patient data.”
- “We de-identified the data, so it’s fine.” De-identification is tricky. Removing a name isn’t enough if the AI can piece together the patient’s identity from other fields (age, zip code, rare diagnosis). Most small businesses don’t have the expertise to de-identify properly.
The oversell is that AI tools are “safe out of the box.” They’re not. You need the right contracts, the right settings, and the right training for your team.
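And to put a number on the de-identification trap specifically, here’s a toy example of the rough, k-anonymity-style count I walk clients through. The records and field names are made up; a real de-identification review is a job for someone who does it professionally.

```python
from collections import Counter

# Toy records with the names already removed. The combination of age,
# zip code, and diagnosis can still point at exactly one person.
records = [
    {"age": 67, "zip": "32789", "diagnosis": "type 2 diabetes"},
    {"age": 67, "zip": "32789", "diagnosis": "type 2 diabetes"},
    {"age": 34, "zip": "32751", "diagnosis": "hairy cell leukemia"},  # rare diagnosis
]

# Count how many records share each (age, zip, diagnosis) combination.
groups = Counter((r["age"], r["zip"], r["diagnosis"]) for r in records)

# Any combination that appears only once is effectively a fingerprint.
for combo, count in groups.items():
    if count == 1:
        print("Re-identification risk:", combo)
# Re-identification risk: (34, '32751', 'hairy cell leukemia')
```

That third record has no name on it, but in a small town there may be exactly one 34-year-old in that zip code with that diagnosis. This is why HIPAA offers a formal expert-determination path for de-identification instead of leaving it to guesswork.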
Related terms
- Business Associate Agreement (BAA): The contract you sign with an AI vendor stating they’ll protect PHI. No BAA, no PHI.
- Protected Health Information (PHI): Any individually identifiable health data — names, dates, medical records, insurance IDs.
- De-identification: Removing or altering PHI so it can’t be traced back to a patient. Harder than it sounds.
- Data residency: Where the AI vendor physically stores your data. HIPAA doesn’t flatly require that PHI stay in the U.S., but offshore storage makes oversight and enforcement much harder, so most practices, and many BAAs, insist on U.S.-based storage.
- Encryption at rest and in transit: Your data should be scrambled whether it’s sitting on a server or moving between systems. Non-negotiable for HIPAA compliance.
Want help with this in your business?
If you’re in Central Florida and wondering whether your AI tool is handling patient data safely, shoot me an email or use the contact form — happy to walk through it with you.