Integrated Gradient Correlation#

Integrated Gradient Correlation (IGC) is a Python/PyTorch package that provides a dataset-wise attribution method.

It is designed to improve the interpretability of deep neural networks at the task level, rather than at the instance level as available attribution methods generally do. For theoretical details, please refer to the original paper [LelievreC24].

This package primarily provides a class that computes IGC attributions for PyTorch modules. It also offers utilities to calculate plain gradients, Integrated Gradients (IG) [STY17], and some naive dataset-wise attribution methods.
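As background for the utilities mentioned above, the sketch below illustrates the standard Integrated Gradients method [STY17] in plain PyTorch. It is not the package's actual API (the function name and signature here are hypothetical); it only shows the underlying computation: scaling input-minus-baseline by the path integral of the model's gradients, approximated with a Riemann sum.

```python
import torch

def integrated_gradients(model, x, baseline=None, steps=50):
    """Hypothetical IG sketch: attribution_i ≈ (x_i - x'_i) times the
    average gradient of the model output along the straight path from
    the baseline x' to the input x."""
    if baseline is None:
        baseline = torch.zeros_like(x)  # common default baseline
    total_grads = torch.zeros_like(x)
    for alpha in torch.linspace(0.0, 1.0, steps):
        # Interpolate between baseline and input, then backpropagate.
        point = (baseline + alpha * (x - baseline)).requires_grad_(True)
        output = model(point).sum()
        grad, = torch.autograd.grad(output, point)
        total_grads += grad
    return (x - baseline) * total_grads / steps

# For a linear model F(x) = w @ x with a zero baseline, IG recovers
# w_i * x_i exactly, and attributions sum to F(x) - F(baseline)
# (the completeness axiom).
model = torch.nn.Linear(3, 1, bias=False)
x = torch.tensor([[1.0, 2.0, 3.0]])
attr = integrated_gradients(model, x)
```

IGC extends this idea to the dataset level by aggregating IG attributions across a whole dataset into task-level statistics; see [LelievreC24] for the exact formulation.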

Usage#

API Reference#

Citations#

  • Integrated Gradient Correlation: a Dataset-wise Attribution Method
    Pierre Lelièvre, Chien-Chung Chen
    Department of Psychology, National Taiwan University
  • Package (latest version)

License#

The IGC library is freely available under the MIT License.

Copyright 2024 Pierre Lelièvre

Index#

Bibliography#

[LelievreC24]

Pierre Lelièvre and Chien-Chung Chen. Integrated Gradient Correlation: a Dataset-wise Attribution Method. April 2024. arXiv:2404.13910.

[SN20]

Mukund Sundararajan and Amir Najmi. The many Shapley values for model explanation. February 2020. arXiv:1908.08474.

[STY17]

Mukund Sundararajan, Ankur Taly, and Qiqi Yan. Axiomatic Attribution for Deep Networks. June 2017. arXiv:1703.01365.

[Wel62]

B. P. Welford. Note on a Method for Calculating Corrected Sums of Squares and Products. Technometrics, 4(3):419–420, August 1962. doi:10.1080/00401706.1962.10490022.