Abstract
This paper introduces Syn-Optic, a novel hybrid framework designed to address the critical challenges of data scarcity, hardware variability, and decision uncertainty in autonomous diagnostic systems. The framework integrates three core components: a synthetic data generation module for creating annotated training datasets, a support vector regression-based optimization engine for real-time sensor parameter tuning, and a fusion architecture that combines learned perception with symbolic reasoning. We present extensive mathematical formulations for each component, including a novel loss function for synthetic-to-real domain adaptation and a constrained optimization routine for lens parameter adjustment. Comprehensive experiments demonstrate the framework's efficacy in two distinct domains: robotic-assisted medical imaging and autonomous vehicle perception. Results show that the Syn-Optic framework achieves a 37% reduction in mean reprojection error compared to baseline calibration methods, while the hybrid decision layer reduces safety-critical errors by over 99% compared to purely statistical models. The paper concludes that the integration of synthetic data, statistical optimization, and rule-based validation provides a robust pathway toward trustworthy and adaptive autonomous systems.
