
Model Explainability

A Shiny-based web application that enables users to upload images and analyze how deep learning models make predictions. The app provides two interpretability techniques:

  1. Grad-CAM (Gradient-weighted Class Activation Mapping) – Highlights important regions in an image that contributed to a model's prediction.
  2. SHAP (SHapley Additive exPlanations) – A game-theoretic approach that assigns importance values to each pixel.

This project focuses on XAI (Explainable AI) techniques for deep learning, built on a clean, scalable architecture.
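To make the Grad-CAM idea concrete, here is a minimal sketch, assuming TensorFlow/Keras and an ImageNet-pretrained VGG16 (the project's own implementation lives under src/python/explainability/): the gradient of the top class score with respect to the last convolutional feature maps is averaged spatially to weight each channel, and the weighted, ReLU-ed sum of the feature maps becomes the heatmap.

# A minimal Grad-CAM sketch (illustrative only), assuming TensorFlow/Keras and
# an ImageNet-pretrained VGG16; the project's own implementation lives under
# src/python/explainability/.
import numpy as np
import tensorflow as tf
from tensorflow.keras.applications.vgg16 import VGG16, preprocess_input
from tensorflow.keras.preprocessing import image

def gradcam_heatmap(img_path, last_conv_layer="block5_conv3"):
    """Return a heatmap in [0, 1] over the last conv layer's spatial grid."""
    model = VGG16(weights="imagenet")
    # Load and preprocess the image into a (1, 224, 224, 3) batch.
    img = image.img_to_array(image.load_img(img_path, target_size=(224, 224)))
    batch = preprocess_input(np.expand_dims(img, axis=0))
    # Model mapping the input to the last conv feature maps and the class scores.
    grad_model = tf.keras.Model(
        model.inputs, [model.get_layer(last_conv_layer).output, model.output]
    )
    with tf.GradientTape() as tape:
        conv_out, preds = grad_model(batch)
        top_class = tf.argmax(preds[0])
        top_score = preds[:, top_class]
    # Gradient of the top class score w.r.t. the feature maps, averaged
    # spatially to give one importance weight per channel.
    grads = tape.gradient(top_score, conv_out)
    weights = tf.reduce_mean(grads, axis=(0, 1, 2))
    heatmap = tf.nn.relu(tf.reduce_sum(conv_out[0] * weights, axis=-1))
    return (heatmap / (tf.reduce_max(heatmap) + 1e-8)).numpy()

heatmap = gradcam_heatmap("assets/images/dog.jpg")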

🔥 Features

  - Upload your own images directly in the browser
  - Grad-CAM heatmaps that highlight the regions driving a prediction
  - SHAP pixel-level importance values
  - Choice of pretrained models (e.g., VGG16, ResNet50)
  - Adjustable confidence threshold
  - A standalone Python API for generating explanations outside the Shiny app

📁 Project Structure

model-explainability/
├── src/
│   ├── python/                    # Python explainability modules
│   │   ├── models/                # Model loading and management
│   │   ├── explainability/        # Grad-CAM and SHAP implementations
│   │   ├── utils/                 # Utility functions
│   │   └── explainability_api.py  # Main API interface
│   └── shiny/                     # R Shiny application components
│       ├── ui.R                   # User interface
│       ├── server.R               # Server logic
│       └── modules/               # Modular Shiny components
├── tests/
│   ├── python/                    # Python unit tests
│   └── r/                         # R tests
├── assets/
│   ├── images/                    # Sample images and test data
│   └── docs/                      # Documentation and papers
├── config/                        # Configuration files
├── outputs/                       # Generated explanations (gitignored)
├── requirements.txt               # Python dependencies
├── app.R                          # Main Shiny app entry point
└── README.md

🛠️ Installation & Setup

Prerequisites

  - Python 3 with pip (for the explainability modules in requirements.txt)
  - R with access to CRAN (for Shiny and the other packages listed below)
  - Git, to clone the repository

1️⃣ Clone the Repository

git clone https://github.com/rodrick-mpofu/model-explainability.git
cd model-explainability

2️⃣ Install Python Dependencies

pip install -r requirements.txt

3️⃣ Install R Dependencies

# Install required R packages
install.packages(c("shiny", "shinydashboard", "shinythemes", 
                   "shinycssloaders", "reticulate", "png"))

4️⃣ Configure Python Environment

# In R, configure reticulate to use your Python environment
library(reticulate)
use_python("/path/to/your/python")  # or use_virtualenv()

5️⃣ Run the Application

# Run the Shiny app
shiny::runApp("app.R")

🎯 How It Works

  1. Upload an Image 📷
  2. Select Model & Technique (Grad-CAM or SHAP) 🏗️
  3. Adjust Confidence Threshold 🎚️
  4. View Model Explanation 🔥 (a heatmap for Grad-CAM, or pixel-level SHAP values)
  5. Interpret Results ✅

🐍 Python API Usage

You can also use the Python API directly:

from src.python.explainability_api import explain_image

# Generate Grad-CAM explanation
results = explain_image(
    img_path="assets/images/dog.jpg",
    model_name="vgg16",
    technique="gradcam",
    confidence_threshold=0.5
)

# Generate SHAP explanation  
results = explain_image(
    img_path="assets/images/dog.jpg",
    model_name="resnet50", 
    technique="shap",
    confidence_threshold=0.3
)
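
The exact structure of results is defined by explain_image in src/python/explainability_api.py. The snippet below is purely illustrative and assumes hypothetical "label", "confidence", and "heatmap" entries, just to show how an explanation might be inspected and saved:

import os
import matplotlib.pyplot as plt

# NOTE: the "label", "confidence", and "heatmap" keys are hypothetical --
# check the actual return value of explain_image before relying on them.
os.makedirs("outputs", exist_ok=True)
print(results["label"], results["confidence"])
plt.imshow(results["heatmap"], cmap="jet")
plt.axis("off")
plt.savefig("outputs/dog_gradcam.png", bbox_inches="tight")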

🧪 Testing

Run the test suite:

# Python tests
python -m pytest tests/python/

# R tests (if available)
Rscript -e "testthat::test_dir('tests/r')"
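
For orientation, a Python test in tests/python/ might look roughly like the sketch below; it is illustrative only and simply exercises the explain_image signature documented above:

# tests/python/test_explain_image.py -- illustrative sketch, not the actual suite
from src.python.explainability_api import explain_image

def test_gradcam_explanation_runs():
    results = explain_image(
        img_path="assets/images/dog.jpg",
        model_name="vgg16",
        technique="gradcam",
        confidence_threshold=0.5,
    )
    # At minimum, the call should return something for a valid input image.
    assert results is not None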

📚 Architecture Benefits

The new modular structure provides:

  - Separation of concerns: Python modules handle model loading and explanation generation, while the R/Shiny code handles the user interface
  - Reusability: the same explainability code serves both the Shiny app and the standalone Python API
  - Testability: Python and R components each have their own test suite under tests/
  - Extensibility: new models or explanation techniques can be added under src/python/ without touching the UI

📬 Contact & Contributions

Questions, bug reports, and pull requests are welcome via the GitHub repository.

📄 License

This project is available under the MIT License. See the LICENSE file for details.