BIAS AUDIT FRAMEWORKS: DEVELOPING TOOLS FOR EARLY DETECTION OF ALGORITHMIC BIAS IN AI DEVELOPMENT
Keywords:
Algorithmic bias, Fairness, AI development, AI, Bias audit framework, Equity, Accountability
Abstract
Algorithmic bias in artificial intelligence (AI) systems continues to pose significant ethical and societal challenges, especially in critical domains such as healthcare, education, and finance. Current approaches to bias mitigation often fail to provide a holistic, proactive solution that integrates fairness, accountability, and transparency into the AI development lifecycle. This study introduces a Bias Audit Framework designed to detect and mitigate algorithmic bias during the early stages of AI development. The framework comprises four core components: Data Bias Assessment, Model Bias Evaluation, Developer Awareness and Training, and Continuous Monitoring and Feedback. A healthcare dataset was used as a case study to evaluate the framework's efficacy. Initially, a logistic regression model trained on the imbalanced dataset achieved high overall performance (Accuracy: 85%, Precision: 0.89, Recall: 0.83) but exhibited fairness issues: the Disparate Impact Ratio (DIR) was 0.67 and the Equal Opportunity Difference (EOD) was 0.13, reflecting gender bias. After the Bias Audit Framework was applied, including oversampling, data augmentation, and threshold optimization, the model was retrained. Its performance remained robust (Accuracy: ~84–85%, Precision: ~0.88, Recall: ~0.88), while fairness improved significantly: female recall increased to 0.88, reducing the EOD to ~0, and the DIR improved to 0.85–0.95, indicating a more balanced and equitable model. By equipping developers with practical tools and emphasizing interdisciplinary collaboration, the framework ensures a systematic and ethical approach to addressing algorithmic bias. These findings underscore the importance of embedding bias mitigation practices into all stages of AI development to foster equitable and trustworthy AI systems.
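The two fairness metrics reported above have standard definitions: DIR is the ratio of positive-prediction rates between the unprivileged and privileged groups, and EOD is the difference in true-positive rates (recall) between them. The sketch below illustrates both computations in plain Python; the group labels and toy data are illustrative assumptions, not values from the study's dataset.

```python
# Minimal sketch of the two fairness metrics named in the abstract.
# Group names ("female"/"male") and the toy data are illustrative only.

def disparate_impact_ratio(y_pred, group):
    """DIR = P(y_pred=1 | unprivileged) / P(y_pred=1 | privileged)."""
    def positive_rate(g):
        preds = [p for p, s in zip(y_pred, group) if s == g]
        return sum(preds) / len(preds)
    return positive_rate("female") / positive_rate("male")

def equal_opportunity_difference(y_true, y_pred, group):
    """EOD = TPR(privileged) - TPR(unprivileged), over true positives only."""
    def tpr(g):
        preds = [p for t, p, s in zip(y_true, y_pred, group) if s == g and t == 1]
        return sum(preds) / len(preds)
    return tpr("male") - tpr("female")

# Toy example (8 individuals, binary labels and predictions)
y_true = [1, 1, 0, 1, 1, 0, 1, 0]
y_pred = [1, 0, 0, 1, 1, 1, 1, 0]
group  = ["female", "female", "female", "female",
          "male", "male", "male", "male"]

print(round(disparate_impact_ratio(y_pred, group), 2))            # prints 0.67
print(round(equal_opportunity_difference(y_true, y_pred, group), 2))  # prints 0.33
```

A DIR near 1.0 and an EOD near 0 indicate parity between groups; the common "80% rule" treats a DIR below 0.8 as evidence of disparate impact, which is why the initial value of 0.67 signalled bias and the post-mitigation range of 0.85–0.95 did not.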
FUDMA Journal of Sciences