Medical Device AI Compliance in Texas: TRAIGA + SB 1188 Requirements
Medical devices with AI capabilities deployed in Texas face a dual compliance surface: the Responsible Artificial Intelligence Governance Act (TRAIGA) for prohibited practice screening and NIST alignment, plus SB 1188 for patient disclosure requirements. These are state-level obligations that layer on top of existing FDA clearance — an FDA 510(k) or De Novo approval does not satisfy Texas compliance.
This guide covers the compliance requirements for medical device manufacturers, hospital systems deploying clinical AI, and compliance officers managing the intersection of federal and state AI regulation.
Which Medical Devices Are In Scope?
Any medical device with AI or machine learning capabilities used in Texas patient care triggers both TRAIGA and SB 1188 obligations. This includes:
- Diagnostic imaging AI — radiology AI (Aidoc, Viz.ai, Zebra Medical), pathology AI, dermatology screening AI
- Clinical decision support — AI-powered clinical decision support systems (CDS) that recommend treatments, flag drug interactions, or prioritize patients
- Triage and risk scoring — sepsis prediction, deterioration alerts, emergency department triage AI
- Surgical AI — robotic-assisted surgery with AI guidance, pre-operative planning AI
- Remote monitoring — AI-powered wearables, continuous glucose monitors with predictive algorithms, cardiac monitoring with AI arrhythmia detection
- Clinical documentation — ambient AI scribes (Nuance DAX, Abridge), AI-powered coding and billing
- EHR-embedded AI — predictive models built into Epic, Cerner, and other EHR platforms
The scope is broader than many compliance teams realize. If a hospital uses Epic with Cosmos AI features enabled, those features are individually in-scope AI systems under TRAIGA.
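Because EHR-embedded features count as individually in-scope systems, the inventory has to be tracked at the feature level, not the platform level. A minimal sketch of such an inventory (the record fields and example entries are illustrative assumptions, not a TRAIGA-mandated schema):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AIDeviceRecord:
    """One in-scope AI system; EHR-embedded features get their own records."""
    name: str
    category: str                # e.g. "diagnostic imaging", "triage and risk scoring"
    embedded_in: Optional[str]   # host platform, if any (e.g. an EHR)
    fda_status: str              # "510(k)", "De Novo", "PMA", or "none"
    in_texas_use: bool

inventory = [
    AIDeviceRecord("sepsis-predictor", "triage and risk scoring", "EHR", "none", True),
    AIDeviceRecord("radiology-anomaly-cad", "diagnostic imaging", None, "510(k)", True),
]

# Every record used in Texas patient care needs TRAIGA screening and
# SB 1188 disclosure workflows, regardless of its FDA status.
in_scope = [d for d in inventory if d.in_texas_use]
```

Listing the embedded feature separately from its host EHR keeps the per-device obligations (screening, disclosure, NIST documentation) attached to the right unit.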
TRAIGA Requirements for Medical Device AI
Medical device AI must be screened against TRAIGA's seven prohibited practices, the same as any other AI system. The practices most relevant to medical devices are:
Biometric Categorization (High Risk)
AI systems that infer race, religion, or other protected characteristics from biometric data are prohibited. Medical devices processing biometric data (imaging, vital signs, genetic information) need explicit documentation that the AI does not categorize patients by protected characteristics for any non-clinical purpose.
The gray zone: Many diagnostic AI models are trained on datasets that include demographic labels. If the model uses race or ethnicity as an input variable for clinical prediction (e.g., eGFR calculations), document the clinical justification and ensure the categorization serves a legitimate medical purpose — not a prohibited one.
Vulnerability Exploitation (Medium Risk)
Medical AI that targets patient vulnerabilities to influence decisions could trigger this prohibition. A patient recommendation engine that steers patients toward higher-revenue treatment options based on their insurance status or financial vulnerability would cross the line.
Subliminal Manipulation (Lower Risk)
Medical device AI is generally lower risk here, but patient-facing AI interfaces that influence health decisions through undisclosed algorithmic nudging (e.g., medication adherence apps with hidden behavioral manipulation) should be reviewed.
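The screening above can be run as a per-device checklist. The sketch below is illustrative only: the practice descriptions paraphrase the categories discussed here rather than quoting statutory text, and the pass/fail logic is an assumption about how a compliance team might track sign-off:

```python
# Paraphrased prohibited-practice categories most relevant to medical devices.
PROHIBITED_PRACTICES = {
    "biometric_categorization": "infers protected characteristics from biometric data for non-clinical purposes",
    "vulnerability_exploitation": "targets patient vulnerabilities to steer decisions",
    "subliminal_manipulation": "influences health decisions via undisclosed algorithmic nudging",
}

def screen_device(cleared: dict[str, bool]) -> list[str]:
    """Return practices not yet cleared for a device; empty list = screened clean.

    `cleared` maps practice -> True once the risk is ruled out AND documented.
    """
    return [p for p in PROHIBITED_PRACTICES if not cleared.get(p, False)]

# A diagnostic model using race as a clinical input (e.g., eGFR) cannot mark
# biometric_categorization cleared until the clinical justification is documented.
flags = screen_device({"vulnerability_exploitation": True, "subliminal_manipulation": True})
```

Treating "not yet documented" the same as "failed" keeps the default posture conservative: a device is flagged until evidence exists.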
SB 1188 Patient Disclosure for Medical Devices
SB 1188 requires healthcare providers to disclose to patients when AI assists in their care. For medical devices, this means:
- Pre-service or at-time-of-service notice that an AI-powered device is being used in diagnosis, treatment planning, or clinical decision-making
- Conspicuous and clear disclosure — not buried in admission paperwork; specifically identifying the AI-assisted component
- Documentation — timestamp, the specific device/AI system referenced, patient acknowledgment method, associated encounter
- Dark-pattern-free design — disclosure cannot use pre-checked consent, misleading “accept all” language, or other manipulative UX patterns
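The documentation requirements above (timestamp, specific AI system, acknowledgment method, associated encounter, no dark patterns) map naturally onto a structured disclosure record. A minimal sketch, with field names as an illustrative schema rather than statutory language:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class DisclosureRecord:
    """SB 1188 disclosure evidence for one AI system in one patient encounter."""
    encounter_id: str
    ai_system: str               # the specific device/AI system disclosed
    disclosed_at: datetime       # timestamp of the disclosure
    acknowledgment_method: str   # e.g. "verbal-documented", "signed", "portal-click"
    pre_checked_consent: bool    # must be False: pre-checked boxes are a dark pattern

def record_disclosure(encounter_id: str, ai_system: str, method: str) -> DisclosureRecord:
    rec = DisclosureRecord(encounter_id, ai_system, datetime.now(timezone.utc), method, False)
    # Guard against dark-pattern consent capture at the point of recording.
    assert not rec.pre_checked_consent, "pre-checked consent is a prohibited dark pattern"
    return rec

rec = record_disclosure("enc-1042", "radiology-anomaly-cad", "verbal-documented")
```

Keeping the record keyed to a specific encounter and a specific AI system is what makes it usable as audit evidence later.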
The Per-Device Disclosure Problem
A single patient encounter may involve multiple AI-powered medical devices. A radiology appointment could use AI-assisted image acquisition, AI-powered anomaly detection, and AI-generated preliminary findings. Each is a separate AI system requiring separate disclosure.
Practical approaches:
- Departmental disclosure — “This department uses the following AI-assisted devices in your care: [list]” delivered at department check-in
- Encounter-specific disclosure — tied to specific procedures and orders, documented in the EHR
- Facility-wide baseline + procedure-specific supplements — a general facility notice that AI is used in clinical care, plus specific disclosure when high-impact AI devices are involved in a patient's treatment
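The departmental approach can be generated from the same AI inventory that drives screening. A sketch of a check-in notice builder (the department-to-device mapping and wording are hypothetical examples, not a prescribed format):

```python
# Hypothetical mapping of department -> AI-assisted systems used there,
# mirroring the radiology example above.
DEPARTMENT_DEVICES = {
    "radiology": [
        "AI-assisted image acquisition",
        "AI-powered anomaly detection",
        "AI-generated preliminary findings",
    ],
}

def departmental_notice(department: str) -> str:
    """Build a check-in notice naming each in-scope AI system separately,
    since each system requires its own disclosure."""
    devices = DEPARTMENT_DEVICES.get(department, [])
    lines = "\n".join(f"- {d}" for d in devices)
    return f"This department uses the following AI-assisted devices in your care:\n{lines}"

notice = departmental_notice("radiology")
```

Enumerating systems individually, rather than saying "AI is used," is what keeps a single departmental notice compatible with the per-device disclosure requirement.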
FDA Clearance Does Not Equal Texas Compliance
This is the critical point most medical device manufacturers and hospital compliance teams miss. FDA 510(k) clearance, De Novo authorization, or PMA approval does not satisfy TRAIGA or SB 1188 requirements.
The FDA evaluates safety and efficacy. TRAIGA evaluates prohibited intent. SB 1188 evaluates patient transparency. These are separate regulatory surfaces:
- FDA: “Is this device safe and effective?”
- TRAIGA: “Was this AI deployed with prohibited intent?” + “Can you demonstrate NIST alignment?”
- SB 1188: “Was the patient told AI assisted in their care?”
An FDA-cleared diagnostic AI device that is deployed without prohibited practice screening, without NIST alignment documentation, and without patient disclosure is non-compliant in Texas — regardless of its federal regulatory status.
Compliance Responsibility: Manufacturer vs. Hospital
Under TRAIGA, the deployer bears primary compliance responsibility. In the medical device context:
- Hospitals and health systems are the deployers — they select, integrate, and use the device on patients
- Manufacturers build the device but are not the deployer unless they also operate a clinical practice
- However, manufacturers should support hospital compliance by providing: prohibited practice attestations, NIST alignment documentation, data flow descriptions, and bias/fairness testing results
Procurement is the leverage point. Hospitals should require TRAIGA compliance documentation from AI medical device vendors as part of the procurement process. This is where evidence bundles become valuable — standardized compliance packages that vendors can provide to every Texas hospital customer.
The NIST Safe Harbor for Medical AI
The NIST AI RMF affirmative defense is especially valuable for medical device AI. Clinical AI systems have inherent complexity — bias risks, accuracy variability across patient populations, evolving model performance. Documenting NIST alignment demonstrates proactive governance:
- Govern: Clinical AI governance committee, device approval process, vendor evaluation criteria
- Map: Per-device risk mapping — patient populations affected, data sources, clinical decisions influenced, known limitations
- Measure: Accuracy benchmarks by patient demographics, bias monitoring protocols, performance drift detection
- Manage: Clinical AI incident response, device retirement criteria, vendor update verification
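Tracking the four RMF functions per device makes gaps visible before an audit does. A minimal sketch of a per-device evidence tracker (the evidence items are illustrative examples drawn from the bullets above, not an official NIST checklist):

```python
# The four NIST AI RMF functions, tracked per device.
NIST_FUNCTIONS = ("govern", "map", "measure", "manage")

def alignment_gaps(evidence: dict[str, list[str]]) -> list[str]:
    """Return RMF functions with no documented evidence for this device."""
    return [f for f in NIST_FUNCTIONS if not evidence.get(f)]

# Example evidence register for a hypothetical sepsis-prediction model.
sepsis_model_evidence = {
    "govern": ["clinical AI governance committee approval"],
    "map": ["affected patient populations documented", "known-limitations memo"],
    "measure": ["accuracy benchmarks by demographic cohort", "drift dashboard"],
    "manage": [],  # incident response not yet defined -> this is a gap
}

gaps = alignment_gaps(sepsis_model_evidence)
```

An empty `gaps` list across the inventory is the practical target state for asserting the NIST affirmative defense.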
Implementation Checklist for Medical Device AI
- Inventory every AI-powered medical device in your clinical environment — including EHR-embedded AI features
- Screen each against TRAIGA prohibited practices — biometric categorization is the highest risk category for medical devices
- Implement SB 1188 disclosure workflows — per-department or per-procedure, documented in the EHR
- Build NIST alignment per device — governance, risk mapping, measurement, management
- Update procurement requirements — require TRAIGA compliance documentation from vendors
- Establish clinical AI incident process — bias events, misdiagnosis patterns, model drift detection
- Generate evidence bundles — audit-ready packages for Joint Commission, AG, and procurement review
Automate Medical Device AI Compliance
TXAIMS includes healthcare-specific compliance surfaces: SB 1188 disclosure tracking, prohibited practice screening for clinical AI, NIST scoring per device, and evidence bundles formatted for healthcare compliance audits. Register as a healthcare deployer and the platform activates medical device compliance workflows automatically.
Ready to automate your TRAIGA compliance?
TXAIMS screens your AI systems, builds your NIST defense, and generates evidence bundles in minutes.
Start 14-day free trial