You don't think about HIPAA much. You just practice.
Then one morning you open your email and there's a letter from your billing service. They were hit with a ransomware attack six months ago. Someone on their team clicked a phishing email, the attackers got in, and they stayed in the network for half a year before locking everything down. The billing service has been writing letters to your patients all week. Your patients. And now those patients are starting to call you.
You haven't looked at your contract with that billing service in two years. You signed it during onboarding. You don't remember what it says about who pays for what when something like this happens. You don't remember if it says anything about how fast they're supposed to tell you. You don't remember if it says anything at all.
Last month, that exact scenario stopped being hypothetical for the medical practices that worked with a Pennsylvania company called Consociate Health.
Consociate handles back-office work for medical groups — health plan administration, analytics, the kind of paperwork no one wants to do in-house. In 2021, an employee got a phishing email. They clicked. From there, the attackers were inside Consociate's systems for six months before launching a ransomware attack that locked everything down. By that point they had also accessed servers holding 136,539 patient records.
The Office for Civil Rights at HHS just announced its settlement with Consociate. $225,000 paid to the federal government. Plus two years of regulatory probation, with quarterly reports back to OCR.
Same week, OCR settled three more ransomware cases — another $940,000 in fines, another 290,000 patients exposed. The shared theme across all of them: each company failed to do basic security work. And each of their clinical clients now has a federal investigation paper trail that names them too.
OCR's investigation found a single root cause: Consociate had no proof they had ever actually assessed their own security risks — a basic requirement for anyone handling patient information. Five years later, here's the bill. $225,000, two years of probation, public docket entry.
What's not in the press release — but is sitting silently underneath this whole enforcement wave — is that every clinical practice that had a contract with Consociate is now living with the consequences of a security failure they had nothing to do with.
Here's the gap most independent practices share with the medical groups that hired Consociate: the contract was signed once, filed in a drawer, and never read again.
You have contracts like this with your billing service. Your EHR. Your transcription company. Your AI scribe. Your appointment-reminder system. Your secure messaging app. Your cloud backup. Your IT person. Your patient portal. Your analytics dashboard.
Most of these were signed during onboarding. The terms were whatever the vendor put in front of you. Maybe you read them. Maybe you skimmed. You signed because you needed the service to start. Then the contract went into a folder — physical or digital — and you have not read it since.
When OCR investigates a breach that flowed through your practice, they will ask: when did you last review those contracts? When did you last verify that the companies you trusted with patient information were actually doing the security work they promised? When did you last update the contract to reflect new requirements? When did you last cancel a service whose security stopped holding up?
These are not paperwork questions. They are operational questions. And the answer most practices give — I haven't — is now sitting inside a public federal settlement as part of why a six-figure penalty was imposed.
The Consociate story is not exotic. A phishing email. Six months before anyone noticed. Ransomware. Records exposed. That can happen to any company you've hired. The contract you signed with them is the only thing standing between you and the consequences when it does.
The contract has a regulatory name: Business Associate Agreement, or BAA. Every outside company that handles patient information on your behalf is required to sign one with you. Yours probably did at onboarding. To regulators, the BAA is the seam between you and the vendor — the document that says who is responsible for what when patient information moves between organizations.
In practice, the BAA is what your insurance carrier reads first when deciding whether to cover a breach. It's what OCR asks for first when they investigate. It's what your malpractice attorney needs first when patients start asking questions.
A BAA Review Checklist is not a re-execution document. It is an operational audit tool. You run it once a year against every BAA you have, and the questions force you to confront the gap between what your contracts say and what your vendors are actually doing.
Mechanically, here is what the discipline does.
It surfaces forgotten contracts. A first walk-through forces an inventory of every company on the hook with you. Most practices, doing this for the first time, discover at least one vendor they should have a BAA with and don't, and at least one BAA for a vendor they no longer use.
It surfaces stale BAAs by date. Some BAAs auto-renew, some don't, some were signed five years ago and never touched. The checklist's test is simple: if you can't remember when you signed it, ask the vendor for their current template. You don't read the regulations. The vendor's lawyers already did. Your job is to ask, read what they send, and sign.
It pressure-tests breach notification timelines. When a vendor has a breach, your BAA says how fast they have to tell you — and the language varies. Some say within 60 days, the regulatory ceiling. Some say without unreasonable delay. Some say promptly. When the breach actually happens, "promptly" is what the lawyers will fight about for six months. The checklist asks: do you actually know what your BAAs require?
It tells you who pays when things go wrong. A modern BAA covers three things in plain language. Who's on the hook if your vendor causes a breach — do they cover the legal bills, the patient notifications, the federal fine, or are you holding all of it? Can you ask to see proof of their security work — backups, staff training, system protections — or do you have to trust their word? What do they actually promise to do — encryption, multi-factor login, a regular training cadence — or is the contract vague? Older BAAs are usually vague. The checklist puts these answers on one page so you can see which vendors stand behind their service and which leave you exposed.
It generates a remediation list. The output is not a clean BAA. It is a list. BAAs to renegotiate. BAAs to terminate. Gaps to address. Vendors to vet harder. That list is your evidence — to OCR, to your malpractice carrier, to your own future self — that you actively manage the risk of working with these companies.
The Consociate story shows what happens when no one runs that audit on either side. The medical groups had BAAs with Consociate. Consociate had not done the security work. Each side assumed the other was doing the checking. Neither was.
Three steps this week, in order.
One. Pull every BAA you have into a single folder. Physical and digital. Email PDFs from your inbox. The vendors that come up: billing, EHR, transcription, AI scribe, secure messaging, cloud backup, IT person, telehealth platform, patient portal, analytics, scheduling, payment processing if it touches patient information. If you can't find a BAA for a company you should have one with, that's the gap to flag first.
Two. Sort them by what scares you most. Not by date signed. By the question: if this vendor had Consociate's exact breach this morning, what does my Tuesday morning look like? The vendor whose breach would hurt most goes at the top. That's the BAA you read first.
Three. Read that top BAA against three specific questions. What is the breach notification timeline, in days, in writing? What does the vendor actually promise to do — encryption, multi-factor login, monitoring, staff training cadence? Who pays for what when their negligence causes a breach? If any of those answers is "doesn't say" or "vague" or "I'm not sure," that BAA goes on your renegotiation list.
You don't have to do the full audit this week. You have to do one BAA. The first one. The one that scares you. From that, you'll know what the next twelve will require.
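If you track vendors in a spreadsheet or a simple file, the three questions in step three reduce to a mechanical triage. Here is a minimal sketch in Python; every vendor name, field name, and value below is a hypothetical illustration, not legal advice, and the fields assume you have already recorded the answers from each contract by hand:

```python
# Hypothetical triage of step three: flag any BAA where one of the
# three answers is missing or vague. Vendor names and values are
# illustrative examples only.
baas = [
    {"vendor": "Billing Co",
     "notify_days": None,         # contract only says "promptly" -- no number
     "security_promises": False,  # no encryption/MFA/training spelled out
     "indemnification": False},   # silent on who pays after their breach
    {"vendor": "EHR Inc",
     "notify_days": 30,
     "security_promises": True,
     "indemnification": True},
]

def needs_renegotiation(baa):
    """A BAA goes on the list when any of the three answers fails."""
    return (baa["notify_days"] is None
            or not baa["security_promises"]
            or not baa["indemnification"])

renegotiate = [b["vendor"] for b in baas if needs_renegotiation(b)]
print(renegotiate)  # the renegotiation list from step three
```

The point is not the script; it is that "doesn't say," "vague," and "I'm not sure" are all the same answer, and each one puts that contract on the renegotiation list.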
The medical groups that worked with Consociate are not in trouble because they didn't have BAAs. They're in trouble because they had BAAs they had stopped reading.
The Encrypted Chart Vault includes a BAA Review Checklist plus pre-drafted BAA language calibrated for independent practice — the questions to ask each vendor, the notification-timeline language to insist on, the indemnification clauses that actually transfer risk, the audit-rights paragraphs that mean something, and a tracking sheet for your annual review cycle. National edition is $299. NY edition is $349 and adds SHIELD Act and state-specific addenda.
Apply the framework above to your own contracts even if you don't use the Vault. The discipline matters more than the vendor.
With security,
Brad

Brad Lieberman, JD (retired), MSN, PMHNP-BC
Founder, The Encrypted Chart
www.encryptedchart.com · Vault: store.encryptedchart.com/l/binder
[email protected]
Footnotes
- HHS-OCR Resolution Agreement and Corrective Action Plan with Consociate, Inc. (April 2026): hhs.gov/hipaa/.../ra-cap-with-consociate-health
- HHS Office for Civil Rights Settles Four HIPAA Security Rule Ransomware Investigations (April 23, 2026): hhs.gov/press-room/ocr-settles-four-ransomware-investigations
- 45 C.F.R. § 164.308(a)(1)(ii)(A) — Security Management Process: Risk Analysis requirement.
