Welcome to the Era of the AI-Native Practice
If you're a physician, pharmacist, lab operator, or clinical services provider who's been watching AI tools transform healthcare and wondering whether you could build something tailored to your own practice — you can. And the practitioners who've already started are producing real results.
I've spent more than a decade advising digital health companies through every stage of building, scaling, and navigating regulation. Telehealth startups, virtual care platforms, health tech vendors. In that time, the model was always the same: technology companies built the tools, healthcare providers bought them, and a wide technical capability gap kept the two sides apart.
AI is closing that gap. Practitioners who know their clinical workflows, their patient populations, and their operational pain points better than any outside product team can now build tools customized to exactly how they provide services. Not to replace the specialized vendor tools they already rely on — the ambient scribes documenting their patient encounters, the clinical decision support systems flagging drug interactions, the revenue cycle platforms processing their claims, the EHR systems anchoring their clinical workflows. Those tools still matter. They're woven into patient care and daily operations in ways that a homegrown build can't and maybe shouldn't try to replicate.
What's new is that practitioners are building alongside them. Custom patient communication tools that plug into their EHR. AI agents that handle intake and follow-up in ways their vendor platform never anticipated. Workflow automation tailored to the specific acuity levels, payor requirements, and patient demographics of their practice. Integrating vendor tools with custom builds to create something neither could produce alone.
I'm calling this the AI-native healthcare practice. Not a practice that adopted AI. A practice where the provider has stitched together the clinical tools they buy with tools they built themselves into a system hyper-customized to their patients and their vision. No two of these practices will look the same. That's the point.
And for digital health companies, I don't see this as a competitive threat. It's a market expansion. Every practitioner who starts building needs the clinical tools, the platforms, and the integrations that vendors provide. A physician who builds a custom follow-up agent is a better customer for your ambient scribe, not a replacement for it. The vendor ecosystem grows when the pool of builders grows.
Both of those developments are good. Both also come with legal and compliance exposure that most people aren't talking about yet.
The Clinician-Developers Are Already Here
A cardiologist placed third out of 13,000 applicants at a major global AI hackathon by building a patient-facing care platform in seven days between hospital shifts. An ER physician who describes himself as a "poorly-trained software developer" co-created one of the most widely used clinical decision tools in the world. A registered nurse taught himself to code and built a clinical AI platform that's now backed by serious venture funding.
Some exceptional practitioners have always bridged the gap between clinical expertise and technical capability. What's different now is that you don't have to be exceptional at tech to build.
The AMA's 2026 survey confirms the shift at scale: 81% of physicians now use AI in practice, up from 38% in 2023. That's more than double in three years. To be clear, using AI and building with AI are different things; most of that 81% are using vendor-built ambient scribes and documentation tools. But the comfort level is there, the curiosity is there, and the on-ramp from using to building is shorter than it's ever been.
This Isn't Just Physicians
The physician vibe coders are leading the headlines today. But the AI-native practice model applies across the entire clinical services industry.
A pharmacy owner who builds a custom medication adherence system that texts patients based on their specific regimen, integrates with their dispensing platform, and flags interaction risks their EHR system misses.
A home health company that builds AI-powered care coordination tools connecting field nurses with supervising physicians in real time, customized to the acuity levels and documentation requirements of their specific payor contracts.
A lab operator who builds automated result interpretation workflows tailored to the ordering patterns of their referring physicians, integrated with their vendor LIS but customized in ways the vendor never anticipated or never built because the economics didn't make sense.
A skilled nursing facility that builds family communication tools, fall risk prediction models calibrated to their patient population, and staffing optimization agents that work alongside their on-site team.
An outpatient clinic that builds a patient intake and follow-up system so tightly integrated with their clinical workflows that every patient interaction is personalized to the individual.
Each of these operators can now make the same "build, partner, or buy" decision that venture-backed telehealth companies have been making for years. And each one still depends on the specialized clinical tools, EHR platforms, and vendor integrations already embedded in their operations. I predict they'll build on top of and alongside those systems, not instead of them.
The Part of This Conversation That's Missing
I've spent more than a decade watching telehealth companies learn the regulatory and legal lessons of building tech-enabled healthcare businesses. Many learned after launch, often expensively. The same walls they hit are waiting for every practitioner-builder who moves fast without building the legal infrastructure in parallel.
The compliance landscape for practitioner-built AI tools is broader than most people realize. It isn't just HIPAA and malpractice. Here's what I'm seeing practitioners miss:
Is Your Tool a Medical Device?
If your tool makes clinical recommendations without a physician reviewing each one, it may cross the line from workflow or clinical decision support into FDA-regulated medical device. The January 2026 CDS guidance updated the four-prong exemption test. A tool that surfaces patient data for physician review is likely exempt. A tool that autonomously triages, diagnoses, or recommends treatment without physician-in-the-loop may require FDA clearance. The distinction matters, and it's not always obvious from the builder's perspective.
Are You Exposing Patient Data to Public AI?
When you build with AI platforms, patient data flows through third-party servers for processing. Consumer-tier AI accounts — the free or personal plans most people start with — typically do not offer BAAs and may use your data for model training. If your patient data has already flowed through a consumer AI API, you may have a HIPAA violation that's already occurred. Enterprise or API tiers with BAAs and zero-data-retention policies exist, but practitioners need to be on them before patient data enters the pipeline.
Beyond the AI vendor, the coding environment itself matters. A recent study tested over 100 AI models on common coding tasks and found that in 45% of test cases, the models produced code with OWASP Top-10 security vulnerabilities. A practitioner who vibe-codes a patient-facing tool without a security audit is deploying software with known vulnerability patterns into a clinical environment.
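The kind of flaw those audits catch is concrete. Below is a minimal sketch in Python, using an in-memory database, of the most common OWASP Top-10 pattern, injection, as AI coding models tend to produce it, alongside the one-line parameterized fix. The table and field names are purely illustrative, not drawn from any real system:

```python
import sqlite3

def find_patient_unsafe(conn, last_name):
    # Vulnerable pattern: user input interpolated directly into SQL.
    # A last_name of "x' OR '1'='1" makes the WHERE clause always true.
    return conn.execute(
        f"SELECT id, last_name FROM patients WHERE last_name = '{last_name}'"
    ).fetchall()

def find_patient_safe(conn, last_name):
    # Fixed pattern: a parameterized query treats input as data, not SQL.
    return conn.execute(
        "SELECT id, last_name FROM patients WHERE last_name = ?", (last_name,)
    ).fetchall()

# Hypothetical demo data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patients (id INTEGER, last_name TEXT)")
conn.executemany("INSERT INTO patients VALUES (?, ?)",
                 [(1, "Garcia"), (2, "Lee")])

injection = "x' OR '1'='1"
print(len(find_patient_unsafe(conn, injection)))  # 2: every record leaks
print(len(find_patient_safe(conn, injection)))    # 0: input treated as data
```

The fix costs one line, but only if someone reviews the generated code and knows to look. That review step is what "vibe coding" skips by default.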
Do Your Existing Vendor Contracts Allow What You're Building?
If you're building tools that integrate with your EHR, your practice management system, or any vendor platform, check the contract first. Many vendor agreements restrict how you can use APIs, what data you can extract, and whether you can build integrations that modify or extend the vendor's functionality. Building a custom workflow on top of a vendor platform may violate the terms you already agreed to. If you're negotiating custom API access, the contract should be clear on who owns the integration layer.
Who Owns What You Built?
The AI workflows, clinical protocols, custom agents, and prompt libraries you've built are valuable intellectual property. But ownership depends on who built it, what tools were used, what your employment or contractor agreements say, and what the AI vendor's terms of service provide. Some no-code platforms have ToS provisions that could affect your ownership of tools built on their platform.
If you're co-developing with a partner or contractor, document ownership clearly — the tool, the data, the improvements, the derivatives. If you're a physician in a group practice, check whether your employment agreement assigns practice-related IP to the group. The custom system you spent six months building could belong to a departing partner, a vendor, or a contractor depending on how your agreements are written.
If what you've built is a product rather than an internal workflow, consider whether it should be owned by a separate entity. Mixing practice assets and product IP in the same entity creates risk in both directions.
Patient Consent, Authorization, and Disclosure
Multiple states have enacted or are actively advancing laws requiring disclosure to patients when AI is used in clinical care, and the landscape is expanding rapidly. If your tool involves patient outreach that could look like marketing, TCPA rules apply, and the FCC ruled in 2024 that AI-generated voice calls are regulated. The healthcare exemption under the TCPA is narrow.
And, if your tool changes how you use or disclose protected health information, your Notice of Privacy Practices may need to be updated and redistributed. If the tool collects new categories of patient data or uses existing data in new ways, additional patient authorizations may be required.
Insurance Gaps
Malpractice insurers are actively updating policies to address AI. Some carriers are adding AI-specific exclusions; others require disclosure of AI tool usage. If you've built and deployed a clinical tool and your policy excludes AI-assisted decisions, you may be practicing without effective malpractice coverage for the tools you rely on most.
Beyond malpractice, practitioner-builders may need cyber liability insurance and technology errors-and-omissions coverage. The gap between these policy types is where most claims involving custom-built tools will land.
The HIPAA Security Risk Assessment You Haven't Updated
If you've deployed a new AI tool in your practice, your Security Risk Assessment needs to account for it. Data flows, access controls, encryption, audit logging, vendor relationships. The proposed HIPAA Security Rule update (expected mid-2026) would eliminate the distinction between "addressable" and "required" safeguards. Everything would become required. Mandatory encryption, multi-factor authentication, and AI-specific risk analysis requirements are coming.
Your audit trail matters too. If AI generates a clinical note or recommendation, you need to document what the AI recommended versus what the physician decided. HIPAA requires six years of audit log retention. Most vibe-coded tools don't generate this documentation by default.
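As a sketch of what that documentation might look like in practice, here is a minimal, hypothetical audit record in Python. The field names and schema are assumptions, not any regulatory standard; the point is that each entry captures what the AI recommended, what the clinician actually decided, and whether the two differ:

```python
import json
from datetime import datetime, timezone

def make_audit_entry(patient_id, ai_recommendation, clinician_decision, clinician_id):
    # One append-only record per AI-assisted decision. Field names are
    # illustrative; adapt to your own systems and counsel's guidance.
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "patient_id": patient_id,
        "ai_recommendation": ai_recommendation,    # what the model produced
        "clinician_decision": clinician_decision,  # what was actually done
        "clinician_id": clinician_id,
        "overridden": ai_recommendation != clinician_decision,
    }

entry = make_audit_entry(
    patient_id="P-1042",
    ai_recommendation="order troponin panel",
    clinician_decision="order troponin panel and ECG",
    clinician_id="DR-7",
)
line = json.dumps(entry)  # one JSON line per event, appended to durable storage
print(entry["overridden"])  # True: the physician modified the AI suggestion
```

Append-only JSON lines written to durable, backed-up storage are a simple format that makes the six-year retention window easy to satisfy; the harder part is remembering to emit the record at all, which vibe-coded tools won't do unless you build it in.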
Bias, Accessibility, and Nondiscrimination
The updated Section 1557 nondiscrimination rule (effective May 2025) explicitly covers AI and clinical decision support tools. If your tool produces different recommendations across patient demographics, you face civil rights liability with a private right of action. And if your tool is patient-facing, ADA and Section 504 require accessibility compliance — WCAG 2.1 AA standards, with enforcement deadlines arriving in 2026.
Business Continuity
If your practice depends on an AI tool and the vendor goes down, what happens? HIPAA requires contingency planning. 74% of healthcare cyber incidents are linked to third-party vendors. Building on a platform you don't control requires a plan for the day it's unavailable.
What Telehealth Companies Already Learned
Every one of these issues has been encountered, litigated, enforced, or expensively discovered by the telehealth companies that built tech-enabled healthcare businesses over the past decade. FDA classification questions. HIPAA violations from data flowing through unsecured channels. IP disputes between co-founders. Fee-splitting enforcement actions. Malpractice claims where the technology was a contributing factor.
The new wave of practitioner-builders has the advantage of learning from that decade of experience — if they're willing to build the legal and regulatory infrastructure in parallel with the technology.
This is exactly what we do. We've represented hundreds of digital health companies navigating these questions. The practitioner-builders emerging right now face the same legal landscape that telehealth startups faced five and ten years ago. We've already seen the movie. We know which scenes end badly.
The Bottom Line
The technical capability gap between healthcare practitioners and technology builders is closing. AI gave every practitioner with a laptop access to the same build-or-buy decision that venture-backed digital health companies have been making for a decade.
This is a market expansion. More builders means more demand for the specialized clinical tools, EHR platforms, and vendor integrations that practitioners build on top of. More sophisticated buyers means better products. More integrated practices means deeper vendor relationships. The digital health ecosystem gets bigger when practitioners start building, not smaller.
And the practitioners who act on this thoughtfully — combining vendor tools with custom-built systems, validating before deploying, and building the legal infrastructure in parallel — will design practices that no off-the-shelf tool could replicate.
The opportunity is massive, and so is the responsibility.
We built an 18-point legal compliance checklist for practitioner-builders, organized by phase: before you build, before you deploy, and ongoing operations. It covers FDA classification, data security, vendor contracts, IP ownership, patient consent, state AI disclosure laws, insurance, HIPAA risk assessment, audit trails, bias testing, accessibility, marketing compliance, and business continuity.
Email rebecca@elevarelaw.com to access the checklist.
Questions about your specific situation? Elevare Law helps digital health companies and (now more often!) healthcare providers build the regulatory and contractual infrastructure that makes tech-enabled healthcare work. The practitioner-builders emerging today face the same legal questions telehealth startups faced a decade ago. We've navigated them before; feel free to reach out.

