If you're building anything that touches patient data — a SaaS platform, a website with a patient portal, a mobile health app — you need to be HIPAA compliant. Not "we'll get to it later" compliant. Not "we checked a box" compliant. Actually, legally, no-one-goes-to-jail compliant.
HIPAA (the Health Insurance Portability and Accountability Act) isn't just about encryption. It's a framework that covers how you collect, store, transmit, and dispose of Protected Health Information (PHI). And the penalties for getting it wrong range from $100 per violation to $1.5 million per year — plus potential criminal charges.
So yeah, it's the kind of thing you want to get right before your CTO is explaining data storage practices to a federal auditor while sweating through a dress shirt.
At Apptivity, we've helped multiple healthcare startups and clinics build HIPAA-compliant applications. Here's what we've learned about what actually matters versus what people think matters.
01. What Counts as PHI?
Before you can protect PHI, you need to know what it is. PHI is any individually identifiable health information. That includes the obvious stuff — medical records, diagnoses, prescriptions — but also things developers often miss:
- Names combined with any health data
- Email addresses used for appointment reminders
- IP addresses logged alongside health queries
- Device IDs from mobile health apps
- Billing information tied to medical services
- Appointment dates and times
- Photos or images (X-rays, wound photos, etc.)
The key word is identifiable. A dataset of anonymized blood pressure readings isn't PHI. The same dataset with names attached is. If you're unsure whether your data qualifies, it probably does. Treat it as PHI until you've confirmed otherwise.
Think of it this way: if your database leak would make a patient's Thanksgiving dinner awkward, it's probably PHI.
02. The Three Rules of HIPAA
HIPAA compliance is built on three pillars: the Privacy Rule, the Security Rule, and the Breach Notification Rule. You need all three — not just the one your engineering team is most comfortable with.
03. The Privacy Rule: Who Can See What
The Privacy Rule governs who can access PHI and under what circumstances. For developers, this translates directly into your authorization system.
Minimum Necessary Standard — this is the one most apps get wrong. Every user should only see the minimum amount of PHI required to do their job. A billing clerk doesn't need to see clinical notes. A lab technician doesn't need to see insurance information. And your intern definitely doesn't need God-mode access to the production database, no matter how "responsible" they seem.
In practice, this means:
- Role-based access control (RBAC) isn't optional — it's required
- Every API endpoint that returns PHI should filter based on the requesting user's role
- Admin accounts shouldn't be a backdoor to all patient data
- Patient access — patients have the right to see their own records and request corrections
Don't build a system where everyone can see everything and plan to "lock it down later." That's like installing a front door after you've already been robbed. Build the permissions model first.
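The "minimum necessary" standard translates almost directly into code. Here's a minimal sketch in Python — the role names and fields are illustrative, not a prescription — showing a role-to-fields allowlist applied before any PHI leaves an endpoint:

```python
# Minimal sketch of "minimum necessary" filtering: each role sees only
# the fields it needs. Roles and field names here are illustrative.
ROLE_FIELDS = {
    "billing_clerk": {"patient_id", "name", "insurance", "balance"},
    "lab_tech": {"patient_id", "name", "lab_orders"},
    "physician": {"patient_id", "name", "insurance", "lab_orders",
                  "clinical_notes", "diagnoses"},
}

def filter_record(record: dict, role: str) -> dict:
    """Return only the fields the requesting role is allowed to see."""
    allowed = ROLE_FIELDS.get(role, set())  # unknown roles see nothing
    return {k: v for k, v in record.items() if k in allowed}

record = {
    "patient_id": "p-1001",
    "name": "Jane Doe",
    "insurance": "Acme Health",
    "clinical_notes": "Follow-up in two weeks.",
    "balance": 120.00,
}

print(filter_record(record, "billing_clerk"))
# billing clerk gets billing fields only -- no clinical notes
```

Note the default: a role you haven't explicitly configured sees nothing. Deny-by-default is the posture auditors want to see.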
04. The Security Rule: How You Protect ePHI
The Security Rule is where engineering teams spend most of their time. It covers the technical, physical, and administrative safeguards for electronic PHI (ePHI). This is the part where your developers will nod along in meetings and then quietly Google everything afterward.
Technical Safeguards
This is your bread and butter as a developer:
- Encryption in transit — TLS 1.2 or higher for all connections. No exceptions. No "but it's an internal service" excuses. Internal services gossip worse than a hospital break room.
- Encryption at rest — AES-256 for databases, file storage, and backups. Yes, backups too. Especially backups. Your backups are just unguarded copies of everything you're trying to protect — treat them accordingly.
- Access controls — unique user IDs, automatic logoff, emergency access procedures.
- Audit controls — log every access to PHI. Who accessed what, when, and why. Think of it as your system's diary, except this one might be subpoenaed.
- Integrity controls — mechanisms to ensure ePHI hasn't been altered or destroyed improperly.
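To make the audit-controls bullet concrete, here's a minimal sketch of structured audit logging — who, what, when, why — wrapped around a PHI read. The function names and the in-memory list are illustrative; in production this would write to an append-only, tamper-evident store:

```python
import datetime
import json

AUDIT_LOG = []  # illustrative; production needs an append-only store

def audit(user_id: str, action: str, resource: str, reason: str) -> None:
    """Record who accessed what PHI, when, and why."""
    AUDIT_LOG.append({
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "user_id": user_id,
        "action": action,       # e.g. "read", "update"
        "resource": resource,   # e.g. "patient/p-1001/labs"
        "reason": reason,       # e.g. "treatment", "billing"
    })

def read_labs(user_id: str, patient_id: str) -> dict:
    # Audit BEFORE returning data, so even a failed fetch leaves a trail
    audit(user_id, "read", f"patient/{patient_id}/labs", "treatment")
    return {"patient_id": patient_id, "labs": []}  # real fetch goes here

read_labs("dr-smith", "p-1001")
print(json.dumps(AUDIT_LOG[-1], indent=2))
```

The key design choice: the audit entry is written inside the data-access function, not left to each caller's discretion. If callers can forget to log, they will.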
Encryption Deep Dive
Encryption comes up so often in HIPAA conversations that it deserves its own section. The rule is simple: encrypt everything, everywhere.
HIPAA doesn't technically mandate encryption — it's listed as an "addressable" specification, meaning you can use an alternative if you document why it provides equivalent protection. But in practice, there is no reasonable alternative. The "alternative to encryption" is basically a strongly-worded letter promising to feel really bad if data leaks. Just encrypt it.
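For the in-transit half, enforcing a TLS floor takes two lines with Python's standard `ssl` module — a sketch of the client side; your load balancer or web server config needs the same floor:

```python
import ssl

# Enforce TLS 1.2 or higher for outbound connections.
# Recent Python versions default to this, but pinning it explicitly
# documents the requirement and survives environment changes.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

print(context.minimum_version)
```

At-rest encryption (AES-256 for databases and backups) is usually configured at the storage layer — database TDE, encrypted volumes, encrypted object storage — rather than in application code.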
Administrative Safeguards
These aren't code — they're processes. But developers need to know about them because they affect how you build:
- Risk assessments — you need to perform regular security risk assessments. This means your architecture should be documented well enough that someone can audit it. Yes, this means you actually have to write documentation. We're sorry.
- Workforce training — everyone who touches PHI needs training. Your onboarding flow for new team members should include HIPAA training. A 15-minute video nobody watches doesn't count, even though we've all tried it.
- Contingency planning — what happens if your database goes down? What's your disaster recovery plan? These need to be documented and tested. "We'll figure it out" is not a disaster recovery plan — it's a disaster.
Physical Safeguards
If you're in the cloud (and for a healthcare app, you should be), most physical safeguards are handled by your cloud provider. But you still need:
- Workstation security — developers working with PHI need encrypted hard drives and screen locks. The coffee shop is not a HIPAA-compliant workspace, no matter how good the Wi-Fi is.
- Device policies — what happens when an employee's laptop is stolen? Can you remote-wipe it?
- Access to physical servers — if you have any on-prem equipment, who can physically access it?
05. Making Your SaaS HIPAA Compliant
SaaS applications typically have the most complex compliance requirements because they store and process PHI continuously. Here's your checklist — print it out, tape it to your monitor, and don't remove it until every item has a checkmark:
Authentication and Authorization:
- Multi-factor authentication (MFA) for all users who access PHI
- Role-based access control with granular permissions
- Automatic session timeout (15 minutes is the common standard)
- Password policies that meet NIST guidelines (length over complexity — "correcthorsebatterystaple" beats "P@ssw0rd!" every time)
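The session-timeout item is easy to get subtly wrong (wall-clock time jumps, forgetting to reset on activity). A minimal sketch using a monotonic clock — class and field names are illustrative:

```python
import time
from typing import Optional

SESSION_TIMEOUT_SECONDS = 15 * 60  # the common 15-minute inactivity standard

class Session:
    def __init__(self, user_id: str):
        self.user_id = user_id
        # monotonic() is immune to wall-clock adjustments (NTP, DST)
        self.last_activity = time.monotonic()

    def touch(self) -> None:
        """Call on every authenticated request to reset the idle clock."""
        self.last_activity = time.monotonic()

    def is_expired(self, now: Optional[float] = None) -> bool:
        now = time.monotonic() if now is None else now
        return (now - self.last_activity) > SESSION_TIMEOUT_SECONDS

s = Session("user-1")
print(s.is_expired())                               # just created: not expired
print(s.is_expired(now=s.last_activity + 16 * 60))  # 16 minutes idle: expired
```

Expire on *inactivity*, not on session age — and enforce it server-side. A client-side timer the user's browser never runs is not a control.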
Data Layer:
- Database encryption at rest (AES-256)
- Field-level encryption for the most sensitive fields (SSNs, diagnoses)
- Encrypted backups stored in a separate region
- Data retention and disposal policies
API Security:
- HTTPS everywhere — no exceptions
- API authentication with short-lived tokens
- Rate limiting to prevent data harvesting
- Input validation on every endpoint that accepts PHI
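Rate limiting against data harvesting is typically a token bucket: a small burst allowance, then a steady refill. A self-contained sketch — the rate and capacity numbers are illustrative, and in production you'd keep buckets per user or API key in shared storage like Redis:

```python
import time

class TokenBucket:
    """Token-bucket rate limiter to slow bulk PHI harvesting.
    Parameters below are illustrative, not recommendations."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate                # tokens refilled per second
        self.capacity = capacity        # maximum burst size
        self.tokens = float(capacity)
        self.updated = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.updated) * self.rate)
        self.updated = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

# 5-request burst, then one new token every 10 seconds
bucket = TokenBucket(rate=0.1, capacity=5)
results = [bucket.allow() for _ in range(6)]
print(results)  # the burst is allowed; the sixth request is denied
```

When a request is denied, return 429 and log it — a steady stream of rate-limit hits on PHI endpoints is exactly the signal your intrusion detection should be watching.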
Monitoring:
- Audit logging for every PHI access event
- Intrusion detection and alerting
- Regular vulnerability scanning
- Penetration testing at least annually
06. Making Your Website HIPAA Compliant
A marketing website for a healthcare company doesn't necessarily need to be HIPAA compliant — unless it collects PHI. The moment you add a contact form that asks about symptoms, a patient portal, or an appointment booking system, HIPAA kicks in. Surprise! Your "simple marketing site" just became a compliance project.
Common pitfalls for healthcare websites:
- Contact forms — if patients describe symptoms or conditions in a contact form, that's PHI. Your form submission handler needs to encrypt that data and store it securely. That contact form you built in 20 minutes? It now needs the security posture of a bank vault.
- Analytics — standard Google Analytics is NOT HIPAA compliant. You need a HIPAA-compliant analytics solution or to strip all PHI from your tracking. Google knows enough about everyone already.
- Chat widgets — live chat where patients discuss health concerns? That's PHI flowing through a third-party service. Make sure that service has a BAA.
- Cookies — if cookies can be tied back to health-related browsing behavior, you have a problem. Your cookies might know more about your users' health than their primary care physician.
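If you must send events to a third-party tool without a BAA, strip identifiers before the payload leaves your server. A minimal sketch — these regex patterns are illustrative and nowhere near exhaustive; a real scrubber should allowlist known-safe fields rather than blocklist patterns:

```python
import re

# Illustrative redaction patterns -- NOT exhaustive. Real PHI scrubbing
# should allowlist safe fields, not rely on pattern-matching alone.
PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[REDACTED-SSN]"),
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[REDACTED-EMAIL]"),
    (re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"), "[REDACTED-PHONE]"),
]

def scrub(text: str) -> str:
    """Replace obvious identifiers before text leaves your infrastructure."""
    for pattern, replacement in PATTERNS:
        text = pattern.sub(replacement, text)
    return text

print(scrub("Patient jane@example.com, SSN 123-45-6789, called re: diabetes"))
```

Notice what the scrubber can't do: "called re: diabetes" tied to any identifier is still PHI. Pattern scrubbing reduces risk; it does not make a non-BAA analytics tool compliant.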
07. Making Your Mobile App HIPAA Compliant
Mobile apps add a whole extra layer of complexity because you're now dealing with a device you don't control. It's like trying to secure a building when the tenants keep leaving the windows open and handing out copies of the key.
Mobile-specific requirements:
- Local data storage — any PHI cached on the device must be encrypted. Use the platform's secure storage (iOS Keychain, Android Keystore).
- Biometric authentication — fingerprint or face ID for app access when PHI is displayed.
- Remote wipe — if a device is lost or stolen, you need the ability to remotely clear app data.
- Screenshot prevention — consider disabling screenshots on screens that display PHI. Because nothing says "HIPAA violation" quite like a screenshot of lab results posted to a group chat.
- Push notifications — never include PHI in push notification content. "Your lab results are ready" is fine. "Your HIV test is negative" is a violation. And a very awkward notification to have pop up during a work presentation.
- Background behavior — clear sensitive screens when the app goes to the background.
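The push-notification rule is easiest to enforce structurally on the server side: never interpolate caller-supplied text into a payload. A sketch — the template names and function are illustrative:

```python
# Server-side guard (sketch): push payloads come only from fixed,
# PHI-free templates. The app fetches details over an authenticated channel.
SAFE_TEMPLATES = {
    "lab_results": "Your lab results are ready.",
    "appointment": "You have an upcoming appointment.",
}

def push_payload(event_type: str) -> dict:
    """Build a push payload from an approved template. Refuses unknown
    event types rather than interpolating possibly-PHI-bearing text."""
    if event_type not in SAFE_TEMPLATES:
        raise ValueError(f"No approved PHI-free template for {event_type!r}")
    return {"title": "Health update", "body": SAFE_TEMPLATES[event_type]}

print(push_payload("lab_results"))
```

Because the payload builder physically cannot accept free text, nobody can "just quickly" put a diagnosis in a notification at 2 a.m.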
08. Business Associate Agreements (BAAs)
This is the part that catches most startups off guard. Every vendor, service, or tool that could potentially access PHI must sign a Business Associate Agreement with you. No BAA, no access to PHI — period.
Think of a BAA as a legal pinky promise between you and every service that might accidentally see patient data. Except instead of a pinky, it's a 40-page legal document. And instead of a promise, it's a binding contract with financial penalties.
Your BAA checklist:
- Cloud provider (AWS, GCP, Azure) — all offer BAAs
- Database hosting — most managed services offer BAAs on specific plans
- Email service — if sending emails containing PHI
- Authentication provider — they process user credentials tied to health data
- Payment processor — if billing is tied to medical services
- Monitoring and logging — if your logs contain PHI (and they probably do)
- Customer support tools — if support agents can see patient data
Services that often don't offer BAAs:
- Google Analytics (standard version)
- Most free-tier email marketing tools
- Standard Slack (your #patient-issues channel is living dangerously)
- Heroku (non-Shield plans)
- Most static hosting providers
If a vendor won't sign a BAA, you either can't use them for PHI-related workflows or you need to architect your system so PHI never touches their infrastructure. There is no third option. "But we really like their UI" is not a compliance strategy.
09. The Breach Notification Rule
Despite your best efforts, breaches happen. The Breach Notification Rule requires you to:
- Notify affected individuals within 60 days of discovering the breach
- Notify HHS (Department of Health and Human Services) — within 60 days for breaches affecting 500+ people, annually for smaller breaches
- Notify media — if the breach affects 500+ people in a single state, you must notify prominent media outlets. Yes, you might have to call the local news and tell them you messed up. Fun.
- Document everything — the breach, your response, and the corrective actions taken
Build your incident response plan now. Not after a breach. Now. Writing your incident response plan during an actual incident is like writing a fire escape plan while the building is on fire. It should include:
- How you detect breaches (monitoring and alerting)
- Who gets notified internally (your incident response team)
- How you assess the scope and severity
- Template notifications for affected individuals
- Contact information for your HIPAA compliance officer
- Steps for preserving evidence
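The 60-day clock starts at *discovery*, not at the breach itself — worth encoding so nobody does deadline math by hand during an incident. A trivial sketch:

```python
import datetime

def notification_deadline(discovered: datetime.date) -> datetime.date:
    """Individuals (and HHS, for breaches affecting 500+ people) must be
    notified within 60 days of the date the breach was DISCOVERED."""
    return discovered + datetime.timedelta(days=60)

print(notification_deadline(datetime.date(2024, 3, 1)))
```

"Within 60 days" is the outer limit, not a target — the rule also requires notification "without unreasonable delay," so treat the computed date as a hard backstop.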
10. Common Mistakes We See
After building multiple HIPAA-compliant applications, here are the mistakes we see most often. If you recognize yourself in this list, no judgment — just fix it before the auditors recognize you too.
- "We'll add compliance later" — Retrofitting HIPAA compliance is 5-10x more expensive than building it in from the start. Start with compliance on day one. "Later" in startup language usually means "after the lawsuit."
- Logging PHI in plaintext — Your application logs probably contain more PHI than you think. Audit your logging and either encrypt sensitive fields or exclude them. Somewhere, right now, a debug log is casually recording Social Security numbers in a file called app.log on a server with password "admin123."
- Forgetting about backups — Your database is encrypted, great. Are your backups? Are your log archives? Is your staging environment using production data? (It shouldn't be. But it is, isn't it? We know.)
- No BAA with your hosting provider — "We use AWS" doesn't automatically make you compliant. You need to sign their BAA and configure their HIPAA-eligible services. Using AWS without a BAA for healthcare data is like buying a safe and leaving it open.
- Treating HIPAA as a one-time project — Compliance is ongoing. You need regular risk assessments, annual training, and continuous monitoring. HIPAA compliance isn't a destination, it's a lifestyle. An exhausting, never-ending lifestyle.
- Sharing PHI over Slack or email — Your team probably discusses patient issues in Slack. Unless you're on Slack Enterprise Grid with a BAA, that's a violation. Yes, even in the private channel. Especially in the private channel.
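For the plaintext-logging mistake, a safety net you can add in minutes is a `logging.Filter` that redacts obvious patterns before anything hits disk. A sketch — the single SSN pattern is illustrative, and this is a backstop, not a substitute for not logging PHI in the first place:

```python
import logging
import re

SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")  # illustrative; extend as needed

class RedactPHI(logging.Filter):
    """Scrub obvious PHI patterns from log records before they are written.
    A safety net -- the real fix is to stop logging PHI at the source."""

    def filter(self, record: logging.LogRecord) -> bool:
        # Format the message, redact, and clear args so it isn't re-formatted
        record.msg = SSN.sub("[REDACTED]", record.getMessage())
        record.args = ()
        return True

logger = logging.getLogger("app")
handler = logging.StreamHandler()
handler.addFilter(RedactPHI())
logger.addHandler(handler)
logger.warning("patient SSN 123-45-6789 failed validation")
```

Attach the filter to every handler (files, stdout, log shippers) — a redacted console log next to an unredacted file handler protects nothing.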
11. The Bottom Line
HIPAA compliance is a lot of work. But it's not mysterious work. It's clear, well-documented requirements that translate directly into engineering decisions. Encrypt everything. Log everything. Control access to everything. Have a plan for when things go wrong. And for the love of all that is sacred, read the actual regulations before your lawyer has to read them to you in a deposition.
The biggest risk isn't that compliance is too hard — it's that teams delay it until they're too far into development to fix their architecture without a major rewrite.
If you're building a healthcare application and you're not sure where you stand on compliance, we offer HIPAA compliance audits at Apptivity. We'll review your architecture, identify gaps, and give you a concrete remediation plan. No scare tactics — just engineering advice from a team that's done this before. (The scare tactics in this blog post were free.)