Abstract
Healthcare cybersecurity research concentrates on two threat classes: data confidentiality breaches and adversarial perturbation of machine learning models. This leaves a third attack surface largely unexamined: the ontological layer, the classification systems (ICD-10, CPT, SNOMED CT) through which clinical encounters become structured data. The present analysis is limited to administrative AI in primary care contexts.
Recent deployments of administrative AI demonstrate that this layer is already subject to measurable distortion: documentation tools inflate recorded symptom levels across all six RDoC domains (estimated increases of 30--51%; Castro et al. 2026) while reducing documented clinical interventions (adjusted OR 0.83), and coding assistants shift evaluation-and-management levels upward by 8--13 percentage points. Each documented distortion mechanism constitutes a potential attack vector.
This paper presents a six-class threat taxonomy for ontological attacks on clinical AI systems: (1) Ontology Poisoning, (2) Cascade Injection, (3) Semantic Confusion Attacks, (4) Documentation Flooding, (5) Knowledge Supply Chain Compromise, and (6) Feedback Loop Exploitation. Each class is characterised by attack surface, access level, detectability, harm profile, and analogous traditional attack. I argue that existing security frameworks (NIST CSF 2.0, MITRE ATT&CK) and regulatory instruments (EU AI Act, NIS2) lack coverage for ontological attacks. I propose ontological integrity, the fidelity of classification systems under AI mediation, as a security property requiring dedicated monitoring.
Three limitations bound the analysis: the taxonomy is anticipatory, the threat classes derive from one pipeline architecture, and economic incentives relative to traditional cybercrime remain unquantified.