If you are considering AI tools for your dental practice — an AI receptionist, scribe, insurance coordinator, or patient outreach system — HIPAA compliance is not optional. Any tool that touches patient data must meet specific security requirements, and the responsibility falls on you as the practice owner to verify compliance before you deploy.
This guide covers what HIPAA requires when dental practices use AI, what to ask vendors, what a BAA is and why you need one, and the specific security controls that matter for AI tools handling patient calls, clinical notes, and scheduling data.
Why HIPAA Matters More With AI
Traditional dental software sits on a server in your office or in a known cloud environment. AI tools are different — they process patient conversations in real time, generate transcripts, access your PMS, and sometimes store recordings. The data flows are more complex, which means the compliance requirements are more specific.
When a patient calls your AI receptionist and gives their name, date of birth, and insurance information, and describes a toothache — that is protected health information (PHI). When an AI scribe listens to a provider-patient conversation and generates clinical notes — that is PHI. When an AI insurance coordinator pulls eligibility data — PHI again.
Every one of those interactions needs to be encrypted, access-controlled, logged, and covered by a Business Associate Agreement.
What HIPAA Actually Requires for AI Tools
1. Business Associate Agreement (BAA)
Any vendor that handles PHI on your behalf must sign a BAA with your practice. This is not optional and not negotiable. The BAA establishes:
What data the vendor can access
How they must protect it
What happens if there is a breach
Their obligations for data retention and deletion
If an AI vendor will not sign a BAA, do not use them. Period. It does not matter how good the product is — without a BAA, you are exposing your practice to regulatory risk.
2. Encryption in transit and at rest
Patient data must be encrypted when moving between systems (in transit) and when stored (at rest). This applies to:
Phone call audio and transcripts
Appointment data sent to/from your PMS
Clinical notes generated by AI scribes
Patient records accessed during calls
SMS and chat messages containing patient information
Ask vendors specifically: what encryption standard do you use? AES-256 at rest and TLS 1.2+ in transit are the current standards.
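As a sketch of the in-transit requirement, the following Python (standard-library `ssl` only) builds a client context that refuses anything below TLS 1.2. The function name is an assumption for illustration; a real integration would follow your vendor's connection guidance.

```python
import ssl

def make_client_context() -> ssl.SSLContext:
    """Build a TLS context that rejects anything below TLS 1.2."""
    ctx = ssl.create_default_context()            # verifies certificates by default
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse TLS 1.0 and 1.1
    return ctx

ctx = make_client_context()
print(ctx.minimum_version >= ssl.TLSVersion.TLSv1_2)  # True
```

The same question applies at rest: ask the vendor to name the cipher (AES-256) and where the keys are managed, not just "data is encrypted."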
3. Access controls and audit logging
Who can see patient data? How is access tracked? HIPAA requires:
Role-based access — only authorized personnel see PHI
Unique user identification — every access is tied to a specific person
Audit logs — a record of who accessed what data and when
Automatic session timeouts
For AI tools, this means: can you see who reviewed call transcripts? Is there a log of which patient records the AI accessed? Can you restrict access to specific team members?
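The access-control points above can be sketched as a toy audit log in Python. The role names, record IDs, and function are hypothetical, not any vendor's API; note that denied attempts are recorded too.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical roles; use the roles defined in your practice's policies.
AUTHORIZED_ROLES = {"dentist", "office_manager"}

@dataclass
class AuditEntry:
    user_id: str    # unique user identification: every access tied to one person
    role: str
    record_id: str  # which transcript or patient record was touched
    granted: bool   # denied attempts are logged as well
    timestamp: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

audit_log: list[AuditEntry] = []

def access_transcript(user_id: str, role: str, record_id: str) -> bool:
    """Role-based access check that leaves an audit trail either way."""
    granted = role in AUTHORIZED_ROLES
    audit_log.append(AuditEntry(user_id, role, record_id, granted))
    return granted

ok = access_transcript("u-101", "office_manager", "call-0042")      # granted
denied = access_transcript("u-202", "front_desk_temp", "call-0042")  # denied
print(ok, denied, len(audit_log))  # True False 2
```

When you ask a vendor "can I see who reviewed a transcript?", this is the kind of record they should be able to produce.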
4. Data retention and deletion policies
How long does the AI vendor keep patient data? Can you request deletion? HIPAA does not specify exact retention periods, but you need to know:
How long call recordings are stored
How long transcripts and summaries are kept
Whether data is used to train AI models (for dental patient data, it should not be)
What happens to data if you cancel the service
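A retention rule of the kind described can be sketched as a simple purge over stored recordings. The 90-day window and record shape are assumptions for illustration; HIPAA itself does not mandate a specific period, so set the window to whatever your BAA specifies.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention window; use the period agreed in your BAA.
RETENTION_DAYS = 90

def purge_expired(recordings: list[dict], now: datetime) -> list[dict]:
    """Keep only recordings newer than the retention window."""
    cutoff = now - timedelta(days=RETENTION_DAYS)
    return [r for r in recordings if r["created_at"] >= cutoff]

now = datetime(2025, 6, 1, tzinfo=timezone.utc)
recordings = [
    {"id": "rec-1", "created_at": datetime(2025, 5, 20, tzinfo=timezone.utc)},  # recent
    {"id": "rec-2", "created_at": datetime(2025, 1, 1, tzinfo=timezone.utc)},   # past retention
]
kept = purge_expired(recordings, now)
print([r["id"] for r in kept])  # ['rec-1']
```

The operational question for vendors is whether a purge like this actually runs on a schedule, and whether you can see proof that it did.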
5. Breach notification procedures
If patient data is compromised, what is the vendor's response plan? HIPAA requires notification of affected individuals within 60 days of discovering the breach. Your vendor should have:
A documented incident response plan
Clear notification timelines
Defined responsibilities (who notifies whom)
Regular security assessments to prevent breaches
How to Evaluate AI Vendors for HIPAA Compliance
When you are evaluating an AI receptionist, scribe, or any tool that touches patient data, ask these questions:
Will you sign a BAA? — Non-negotiable. If no, walk away.
Where is patient data stored? — US-based data centers? Which cloud provider?
What encryption do you use? — AES-256 at rest, TLS 1.2+ in transit.
Do you use patient data to train your AI models? — The answer should be no.
Who can access call recordings and transcripts? — Should be role-restricted with audit logs.
How long is data retained? — You should be able to configure retention periods.
Can data be deleted on request? — Yes, with documented proof of deletion.
When was your last security audit? — Should be annual at minimum.
What is your breach notification process? — Documented, with clear timelines.
Do you have SOC 2 certification? — Not required by HIPAA but demonstrates security maturity.
If a vendor gives vague answers to any of these — "we take security seriously" without specifics — that is a red flag. Compliant vendors have clear, documented answers because they have done the work.
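One way to keep vendor answers honest is to reduce them to pass/fail items. The keys below are hypothetical shorthand for the questions above, not a formal compliance test; a real evaluation would also capture the documentation behind each answer.

```python
# Hypothetical checklist keys mapping to the questions above.
REQUIRED = {
    "signs_baa": True,
    "aes256_at_rest": True,
    "tls12_in_transit": True,
    "no_training_on_phi": True,
    "role_based_access": True,
    "configurable_retention": True,
}

def evaluate_vendor(answers: dict) -> list:
    """Return the names of any requirements the vendor fails."""
    return [k for k, v in REQUIRED.items() if answers.get(k) != v]

vendor = {
    "signs_baa": True, "aes256_at_rest": True, "tls12_in_transit": True,
    "no_training_on_phi": False,  # fails: trains on patient data
    "role_based_access": True, "configurable_retention": True,
}
print(evaluate_vendor(vendor))  # ['no_training_on_phi']
```

Any non-empty result is a walk-away signal for the non-negotiable items (BAA, encryption, no training on PHI).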
HIPAA Considerations by AI Tool Type
AI Receptionist (phone calls and scheduling)
An AI receptionist like Ira handles patient phone calls, collects personal information, accesses your PMS for scheduling, and may send confirmation texts. HIPAA considerations:
Call recordings must be encrypted and access-controlled
Patient data pulled from PMS (name, DOB, appointment history) must be transmitted securely
SMS confirmations should contain minimal PHI (appointment time, not diagnosis details)
After-hours calls with emergency information need secure handling and escalation
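The minimal-PHI point for confirmation texts can be illustrated with a toy template. The wording and function name are assumptions; what matters is what the message leaves out.

```python
# Hypothetical template: appointment time only, nothing clinical.
def confirmation_sms(first_name: str, when: str) -> str:
    """No diagnosis, procedure, or insurance details in the message body."""
    return (f"Hi {first_name}, this is a reminder of your dental appointment "
            f"on {when}. Reply C to confirm.")

msg = confirmation_sms("Alex", "Tuesday at 2 PM")
print(msg)
```

Compare with a non-compliant version that names a procedure or insurance status; the safe message is the one a stranger could read without learning anything clinical.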
AI Scribe (clinical documentation)
An AI scribe like Sia listens to provider-patient conversations and generates clinical notes. This is the most PHI-intensive AI use case:
Audio capture of clinical conversations contains detailed PHI
Generated notes include diagnoses, treatment plans, medications
Data must be encrypted end-to-end from microphone to PMS
Notes should be saved directly to the patient chart, not stored separately in an unsecured location
AI Insurance Coordinator
An AI insurance coordinator like Milo verifies patient eligibility and processes benefit information:
Insurance ID numbers and coverage details are PHI
Eligibility queries must go through secure, authenticated channels
Benefit breakdowns stored in the system need encryption
AI Retention Manager (patient outreach)
An AI retention tool like Novi contacts overdue patients via phone, SMS, and email:
Outreach messages should use minimal PHI — "It has been a while since your last visit" not "Your periodontal treatment plan is overdue"
Patient opt-out preferences must be respected
Contact lists derived from PMS data need the same access controls as the PMS itself
Common HIPAA Mistakes Dental Practices Make With AI
Using a tool without a signed BAA. This is the most common and most dangerous mistake. If the vendor experiences a breach and there is no BAA, your practice bears the full regulatory burden.
Assuming cloud = compliant. Not all cloud services are HIPAA compliant. AWS, Azure, and GCP can be — but the vendor must configure them correctly. "We use AWS" does not mean "we are HIPAA compliant."
Storing call recordings with no access controls. If anyone on your team can listen to any patient call with no audit trail, that is a violation. Access should be role-based and logged.
Sending detailed PHI via unencrypted text. Appointment confirmation texts are fine ("Your appointment is Tuesday at 2 PM"). Treatment details via SMS are not.
Not reviewing vendor compliance annually. HIPAA compliance is not a one-time checkbox. Review your vendor's security posture, BAA terms, and audit results annually.
What Compliant AI Looks Like in Practice
When a dental practice deploys HIPAA-compliant AI tools, the workflow looks like this:
Patient calls. AI answers and the conversation is encrypted end-to-end.
AI accesses the PMS through a secure, authenticated connection to check the schedule.
Appointment is booked. Confirmation text sent with minimal PHI.
Call summary stored with role-based access. Only authorized staff can review.
If AI scribe is used: clinical conversation captured, notes generated, saved directly to patient chart — encrypted throughout.
All access logged. Retention policies enforced. BAA in place.
The patient notices nothing different except faster service. Behind the scenes, every data touchpoint is secured.
Frequently Asked Questions
Is AI for dental practices HIPAA compliant?
It can be — if the vendor has proper security controls, signs a BAA, encrypts data, and provides access logging. Not all AI tools are compliant. You must verify each vendor individually.
What is a BAA and do I need one?
A Business Associate Agreement is a legal contract between your practice and any vendor handling PHI. Yes, you need one with every AI tool that touches patient data. No exceptions.
Can AI use patient data to train its models?
Reputable dental AI vendors do not use your patient data to train general models. Ask specifically and get it in writing — ideally in the BAA.
Are call recordings HIPAA compliant?
They can be — if encrypted, access-controlled, and covered by a BAA. The recording itself is PHI and must be treated as such.
What happens if there is a data breach?
Under HIPAA, affected patients must be notified within 60 days of the breach's discovery. Your BAA should specify the vendor's obligations for breach detection, notification, and remediation. Review this before you sign.
How do I verify a vendor's HIPAA compliance?
Ask for their BAA, encryption details, access control documentation, last security audit date, and breach notification procedures. SOC 2 certification is an additional indicator of security maturity but is not required by HIPAA.