Dental Technology

Every Word Your Patient Says Is Being Transcribed. Does Your Consent Form Know That?

Key Takeaways

  • Overjet Voice, launched January 2026, stores encrypted audio recordings for seven years — far beyond what most practices disclose in any consent document currently in use.
  • A November 2025 class action against Sharp HealthCare alleges ambient AI recording without proper consent across potentially 100,000+ patients, with damages of $5,000 per violation under California's wiretapping statute.
  • Twelve states require all-party consent for recording, meaning a HIPAA-compliant business associate agreement does not automatically satisfy state wiretapping law — two completely separate legal frameworks apply simultaneously.
  • Research published in a peer-reviewed journal found consent rates dropped from 81.6% to 55.3% when patients were told the full details of ambient AI: what the data is, where it goes, and who can access it.
  • Independent practices are creating consent protocols ad hoc while DSOs face the same legal exposure with marginally better legal support — neither group is adequately prepared for state attorney general scrutiny.

Ambient AI voice documentation has moved from pilot to production inside dental operatories at remarkable speed. Overjet Voice reached general availability on January 28, 2026. VideaHealth launched Voice Notes in October 2025. These systems listen to everything said in your operatory, convert it to a clinical note, and retain the audio. Overjet Voice stores encrypted recordings for seven years. Most dental practices haven't updated their HIPAA notice, their intake paperwork, or their verbal disclosure protocol to acknowledge any of this. That gap is not a theoretical compliance concern. It is, right now, the fact pattern in active litigation.

Ambient AI Isn't Just Taking Notes — It's Creating a Verbatim Legal Record of Everything Said in Your Operatory

The marketing language around ambient documentation tools emphasizes what they eliminate: after-hours charting, documentation burden, transcription lag. What that language underemphasizes is what the tools create. Overjet Voice doesn't just generate a clinical note; it produces a time-stamped transcript and stores the underlying audio for seven years, which the company markets as "forensic-level protection against malpractice and board audits." That framing is accurate. It's also something your patients have no idea is happening.

These systems capture the full operatory conversation: the patient mentioning anxiety medication, the offhand comment about a domestic situation, the question about a procedure cost that reveals something about their financial stress. The clinical note that surfaces in the chart represents a fraction of what was recorded. The full audio record — every pause, every aside, every word — sits in a vendor's cloud infrastructure, potentially for the better part of a decade.

VideaHealth's Voice Notes uses a click-to-activate model that gives the provider control over when the system is listening. That's a meaningful design distinction. But even click-to-activate systems generate audio records whose post-transcription retention timeline is not spelled out in most press materials or product documentation. What happens to the audio after the note is approved? The answer varies by vendor and is almost never addressed in the intake forms patients sign.

HIPAA and State Wiretapping Laws Don't Agree on What 'Consent' Means for Continuous Ambient Recording

The most dangerous assumption a dental practice owner can make is that signing a business associate agreement with a HIPAA-compliant vendor resolves their legal exposure. It resolves one layer of it. State wiretapping law is an entirely separate framework, and it doesn't care about your BAA.

Twelve states, including California, Florida, Illinois, Pennsylvania, and Washington, require all-party consent before recording a private conversation. California Penal Code § 632 protects any communication in which a party has a reasonable expectation that no one is listening in — and the operatory is exactly that kind of environment — meaning ambient AI recording without explicit all-party consent is a potential violation of the California Invasion of Privacy Act. Civil damages run to $5,000 per violation. In a practice seeing 20 patients a day, the math becomes catastrophic quickly.
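To make that math concrete, here is a back-of-the-envelope sketch using the $5,000 statutory figure cited above. The patient-volume and working-day numbers are hypothetical assumptions for illustration, not data from any actual practice or case:

```python
# Back-of-the-envelope CIPA exposure estimate. The $5,000 figure is the
# statutory damages amount discussed above; patient volume and schedule
# are hypothetical assumptions.
STATUTORY_DAMAGES = 5_000  # dollars per violation

patients_per_day = 20        # hypothetical practice volume
working_days_per_year = 250  # hypothetical schedule

recordings_per_year = patients_per_day * working_days_per_year
exposure = recordings_per_year * STATUTORY_DAMAGES

print(f"{recordings_per_year:,} recordings/year")     # 5,000 recordings/year
print(f"${exposure:,} potential statutory exposure")  # $25,000,000 potential statutory exposure
```

One year of undisclosed recording at that volume produces five thousand potential violations — a $25 million statutory ceiling before a single dollar of actual damages is proven.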

The Justia 50-state survey on recording consent confirms the legal patchwork: 37 states operate under one-party consent and 12 require all-party consent, with the remainder occupying mixed or ambiguous territory. A DSO operating across California, Illinois, and Florida simultaneously faces three distinct legal regimes governing the same ambient AI product. HIPAA's framework was designed around the use and disclosure of protected health information in records, not around the act of continuously recording live conversations. The two frameworks co-exist without resolving each other.

What Your Intake Forms and Privacy Notices Actually Need to Say

The research on patient consent for ambient AI is direct and actionable. A peer-reviewed study published in JAMIA found that when patients received only basic information about ambient documentation technology, 81.6% consented. When they were told the full details — what AI features are in use, how long data is stored, which corporate entities can access it — consent rates dropped to 55.3%. That 26-point gap represents your legal and ethical disclosure obligation.

The same study found that over 96% of patients rated the following information as important: how their audio is used, where it is sent, and who can access it. These are not details that can be left buried in vendor trust-center documentation; patients expect to receive them before you start recording. The study also found that patients hold physicians accountable for clinical errors linked to ambient documentation (64.1%) and hold vendors accountable for data breaches (76.7%) — a liability split that won't protect a practice when a state AG or plaintiff's attorney comes looking.

Minimal adequate disclosure for a two-party consent state should include: that ambient AI is in use, the name of the vendor, whether audio is retained after note generation and for how long, who has access to stored audio, whether it may be used to train AI models, and how patients can opt out without affecting their care. Most current dental intake forms say none of this. A HIPAA notice that mentions "electronic records systems" does not cover continuous ambient audio capture by a third-party AI vendor.

California's AB 3030, effective January 1, 2025, adds an additional layer: healthcare providers using generative AI in patient communications must include explicit disclaimers and instructions for reaching a human provider. Texas and Colorado have enacted their own disclosure requirements. The regulatory floor is rising, state by state, while most dental practices are operating on consent forms written before any of this technology existed.

The Data Retention Question Nobody Is Asking: Who Owns the Voice Data Once the Chart Note Is Generated?

Overjet Voice's architecture stores audio for seven years. That retention window serves a legitimate purpose — malpractice defense, board audits, dispute resolution. It also means that a vendor holds a seven-year archive of every recorded patient interaction in your practice, and the legal status of that archive under your existing BAA almost certainly hasn't been stress-tested.

The Sharp HealthCare class action, filed in November 2025 in San Diego and covering a potential class of 100,000+ patients, crystallizes exactly what inadequate data governance looks like in practice. The complaint alleges that Sharp's ambient AI vendor retained audio for approximately 30 days and could not immediately delete recordings upon patient request. It further alleges that patient charts contained language stating patients "were advised" and "consented" to recording when they had not. The claims invoke the California Invasion of Privacy Act, with its $5,000-per-violation statutory damages, and the California Confidentiality of Medical Information Act. At 100,000 patients, potential statutory exposure is measured in hundreds of millions of dollars.

Your BAA should specify, in explicit terms: whether audio is retained after the clinical note is approved, the maximum retention period, whether audio can be used to train AI models (and whether patients must separately consent to that), and whether the vendor can delete specific recordings upon patient request and within what timeframe. If your current BAA with an ambient AI vendor doesn't address all four of those points, it needs to be renegotiated before the next state AG investigates a patient complaint.
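As a sketch of how that audit might be operationalized, the four provisions can be expressed as a simple checklist. The field names and the sample agreement below are invented for illustration and are not drawn from any actual BAA; this is a structural aid, not legal advice:

```python
# The four BAA provisions described above, modeled as a checklist.
# Field names and the sample agreement are hypothetical illustrations.
REQUIRED_BAA_TERMS = {
    "audio_retention_after_note_approval",  # is audio kept once the note is approved?
    "maximum_retention_period",             # e.g. "7 years"
    "model_training_use",                   # may audio train AI models? separate consent?
    "deletion_on_patient_request",          # can specific recordings be deleted, how fast?
}

def missing_baa_terms(baa: dict) -> set:
    """Return the required provisions this agreement fails to address."""
    return {term for term in REQUIRED_BAA_TERMS if not baa.get(term)}

# A hypothetical BAA that covers retention but is silent on training and deletion:
sample_baa = {
    "audio_retention_after_note_approval": "yes, encrypted at rest",
    "maximum_retention_period": "7 years",
}

print(sorted(missing_baa_terms(sample_baa)))
# ['deletion_on_patient_request', 'model_training_use']
```

Any term that comes back missing is the renegotiation agenda — before the agreement is signed, not after a complaint.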

How DSOs Are Handling Ambient AI Disclosure — and Why Independent Practices Are Making It Up as They Go

The federal court case involving Heartland Dental and RingCentral, dismissed without prejudice in January 2026, illustrates both the legal exposure and the procedural complexity DSOs face. The plaintiff alleged that AI transcription of patient phone calls occurred without explicit notification, potentially violating the Federal Wiretap Act. The court found the complaint procedurally deficient but allowed it to be refiled — meaning the substantive legal question remains open.

Large DSOs have legal departments that can draft disclosure language, update intake workflows across hundreds of locations, and monitor state-by-state regulatory changes. They're still getting this wrong, as the Heartland case shows. Independent practices, by contrast, are typically relying on the vendor's assurance that the product is "HIPAA compliant" and treating that as sufficient. It isn't. HIPAA compliance is a data security and use framework. It does not substitute for the specific disclosure obligations created by state wiretapping statutes, state AI transparency laws, or the evolving common-law standard for informed consent in ambient recording contexts.

Build the Consent Protocol Now, Before a State Attorney General Builds It for You

The practices that emerge from the ambient AI wave without legal exposure will be those that treated consent as a workflow design problem, not a checkbox. That means updating HIPAA notices to specifically identify ambient AI recording by vendor name. It means adding a dedicated section to patient intake that explains audio capture, retention duration, and opt-out rights. It means training front desk and clinical staff to obtain and document verbal confirmation before a session begins. And it means auditing your BAA for the four data retention and deletion provisions described above.

The American Bar Association's health law section has flagged ambient AI scribes as a priority compliance risk, specifically citing state recording law exposure, HIPAA business associate obligations, and AI transparency requirements as distinct compliance vectors requiring separate analysis. The practices waiting for federal guidance to consolidate these frameworks will be waiting past the point when their state's plaintiffs' bar has already found the case law.

Ambient AI is genuinely useful. It reduces documentation burden, improves note consistency, and lets clinicians stay present with patients instead of typing. None of that utility goes away when you build a real consent protocol around it. What goes away is the legal exposure — and right now, for most dental practices, that exposure is sitting unsigned on your front desk.

Frequently Asked Questions

Is a HIPAA business associate agreement with an ambient AI vendor sufficient to cover my legal exposure?

A BAA addresses HIPAA's data security and use requirements but does not satisfy state wiretapping laws, which operate as a completely separate legal framework. In California, Florida, Illinois, and the nine other states requiring all-party consent, recording a patient conversation without explicit disclosure can constitute a criminal violation regardless of BAA compliance. Your legal review needs to cover both frameworks independently.

What specific disclosures do patients need to receive before ambient AI recording begins?

Research published in peer-reviewed literature found that over 96% of patients rated knowing how their audio is used, where it is sent, and who can access it as important to their consent decision. Adequate disclosure should identify the vendor by name, state how long audio is retained after note generation, clarify whether audio may be used for AI model training, and explain the opt-out process. A general HIPAA notice referencing "electronic records systems" does not meet this standard.

How long do ambient AI vendors actually retain audio recordings?

Retention policies vary significantly by vendor and are not uniformly disclosed in marketing materials. [Overjet Voice](https://www.overjet.com/solutions/voice) stores encrypted audio for seven years. The [Sharp HealthCare lawsuit](https://www.fisherphillips.com/en/insights/insights/new-class-action-targets-healthcare-ai-recordings) alleged that a competing vendor retained audio for approximately 30 days and could not immediately delete it upon patient request — a fact that contributed to the CIPA and CMIA claims in that litigation. Practices should require explicit retention and deletion terms in their BAA.

Does California's AB 3030 apply to dental practices using ambient AI?

AB 3030, effective January 1, 2025, requires California health facilities, clinics, and physician offices to include disclaimers when using generative AI to generate patient communications pertaining to clinical information. [Morgan Lewis's analysis](https://www.morganlewis.com/pubs/2024/12/california-law-requiring-disclaimers-by-healthcare-providers-using-genai-will-affect-providers-and-genai-developers) confirms dental practices are covered entities under the statute. The law exempts AI-generated content that is reviewed and approved by a licensed provider before dissemination, but ambient-generated chart notes that go directly to a record system without provider review may not qualify for that exemption.

What happened in the Heartland Dental AI recording case?

A plaintiff sued Heartland Dental and RingCentral in federal court in Illinois, alleging that AI transcription of patient phone calls occurred without explicit notification in potential violation of the Federal Wiretap Act. The [U.S. District Court for the Northern District of Illinois dismissed the case without prejudice](https://m.dentalgoodnews.com/sys-nd/1921.html) on January 13, 2026, finding procedural deficiencies in the complaint — but allowed the plaintiff to refile with more specific allegations. The substantive legal question of whether undisclosed AI call transcription by a DSO violates federal wiretapping law remains unresolved.

More from Dental Technology

  • Ambient AI Is Saving DSO Dentists 45 Minutes a Day. Independent Practices Are Still Paying Someone to Type It.
  • The ADA Just Killed the Annual Bitewing Habit. Here's What That Means for Your Insurance Codes, Liability Exposure, and Informed Consent Forms.
  • Overjet Voice Is Writing Your Clinical Notes Now. Dentists Aren't Ready for What That Means for Liability, Accuracy, or the Future of the Chart.