EVALUATING THE ROLE OF ARTIFICIAL INTELLIGENCE IN DIAGNOSING CHRONIC SINUSITIS USING CT IMAGING: AN ANALYTICAL SYSTEMATIC REVIEW
Abstract
Background: Chronic rhinosinusitis (CRS) affects approximately 5–12% of adults worldwide, substantially burdening patients' quality of life and healthcare systems. Computed tomography (CT) remains the gold standard for CRS evaluation, yet conventional interpretation is time-intensive, operator-dependent, and subject to interobserver variability. These challenges highlight the need for more efficient and standardized diagnostic approaches.
Objective: This analytical systematic review evaluates the effectiveness of artificial intelligence (AI), including machine learning (ML) and deep learning (DL) algorithms, in diagnosing CRS through CT imaging. It aims to determine whether AI models can improve diagnostic accuracy, consistency, and workflow efficiency compared with conventional radiologic assessment.
Methods: Following PRISMA 2020 guidelines, ten peer-reviewed studies published between 2015 and 2025 were analyzed. The review synthesized data on AI algorithms used for sinus pathology detection, segmentation, and classification, comparing algorithm performance, reported as accuracy, sensitivity, specificity, and area under the receiver operating characteristic curve (AUC), against that of human interpretation.
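For orientation, the sketch below shows how these four metrics are derived from a binary classifier's outputs. It uses Python with scikit-learn as an assumed tooling choice, not any reviewed study's pipeline, and the labels and scores are hypothetical placeholders rather than data from the included studies.

```python
# Illustrative computation of the metrics compared in this review
# (accuracy, sensitivity, specificity, AUC). All inputs below are
# hypothetical placeholders, not data from the included studies.
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])        # hypothetical ground truth (1 = CRS)
y_score = np.array([0.92, 0.10, 0.85, 0.40,        # hypothetical model probabilities
                    0.25, 0.05, 0.78, 0.60])
y_pred = (y_score >= 0.5).astype(int)              # binarize at a 0.5 threshold

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
accuracy = (tp + tn) / (tp + tn + fp + fn)
sensitivity = tp / (tp + fn)                       # true-positive rate (recall)
specificity = tn / (tn + fp)                       # true-negative rate
auc = roc_auc_score(y_true, y_score)               # area under the ROC curve

print(f"accuracy={accuracy:.2f} sensitivity={sensitivity:.2f} "
      f"specificity={specificity:.2f} AUC={auc:.2f}")
```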
Results: Across the included studies, AI-driven CT analysis markedly improved diagnostic precision, with reported accuracies exceeding 90% and AUC values above 0.95, while reducing interpretation time and observer variability. Convolutional neural network (CNN) and U-Net architectures performed particularly well in identifying sinus opacification and structural remodeling.
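As a point of reference, the following is a minimal sketch of a U-Net-style encoder-decoder for per-pixel segmentation, the architecture family credited above with identifying sinus opacification. It is written in PyTorch as an assumed framework; the class name, channel widths, and single-level depth are illustrative and do not reproduce the models evaluated in the reviewed studies.

```python
# Minimal U-Net-style sketch: one encoder level, a bottleneck, and a
# decoder joined by a skip connection. Sizes are illustrative only.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with ReLU: the basic U-Net building block.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )

class TinyUNet(nn.Module):
    def __init__(self, in_ch=1, out_ch=1):
        super().__init__()
        self.enc = conv_block(in_ch, 16)                    # encoder level
        self.down = nn.MaxPool2d(2)                         # halve resolution
        self.bottleneck = conv_block(16, 32)
        self.up = nn.ConvTranspose2d(32, 16, 2, stride=2)   # restore resolution
        self.dec = conv_block(32, 16)                       # after skip concat
        self.head = nn.Conv2d(16, out_ch, 1)                # per-pixel logits

    def forward(self, x):
        e = self.enc(x)
        b = self.bottleneck(self.down(e))
        u = self.up(b)
        d = self.dec(torch.cat([u, e], dim=1))              # skip connection
        return self.head(d)                                 # mask logits

# Hypothetical usage on one single-channel 256x256 CT slice:
logits = TinyUNet()(torch.randn(1, 1, 256, 256))
print(logits.shape)  # torch.Size([1, 1, 256, 256])
```

The skip connection is the design choice that matters here: it carries fine spatial detail from the encoder directly to the decoder, which is what makes this family suited to delineating opacified sinus regions slice by slice.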
Conclusion: Incorporating AI into CT-based CRS diagnosis offers a more consistent, objective, and rapid assessment method than traditional interpretation alone. As AI technologies advance, they hold promise for standardizing CRS evaluation, improving diagnostic reliability, and supporting data-driven clinical decision-making.