Algorithmic Accountability
/ˌælɡəˈrɪðmɪk əˌkaʊntəˈbɪlɪti/
Definitions
- (n.) The principle and practice of holding entities responsible for the design, deployment, and impact of automated decision-making systems under legal and ethical standards.
  - Example: "Algorithmic accountability requires companies to disclose how their AI affects loan approvals to prevent discrimination."
Forms
- algorithmic accountability
Commentary
The term centers on legal responsibility for the effects of algorithms, emphasizing transparency and fairness in automated decision-making.
This glossary is for general informational and educational purposes only. Definitions are jurisdiction-agnostic but reflect terminology and concepts primarily drawn from English and American legal traditions. Nothing herein constitutes legal advice or creates a lawyer-client relationship. Users should consult qualified counsel for advice on specific matters or jurisdictions.