How to enforce policy/regulation: “Trust but verify”

Aug 18, 2019 · AI-Government

Shaping the future of AI will require new regulation of technology. Possible directions include restricting the collection and use of data, requiring machine learning tools and frameworks that are fair by design, and mandating processes that promote accountability by allowing people to contest algorithmic decisions.

However, a natural question arises: how will these new policies and regulations be enforced? Without a means of assessing whether systems are in compliance, regulators are powerless to hold the designers and operators of those systems accountable.

Algorithm auditing is one potential answer to this question. Using algorithm auditing techniques, it is possible to scientifically evaluate whether a black-box system exhibits negative behaviors, such as discrimination against protected classes or predatory pricing. Algorithm auditing enables academics, members of civil society, investigative journalists, and regulators to assess whether algorithmic systems obey policy proscriptions and best practices.

Algorithm auditing methods have two desirable properties. First, systems can be audited without access to proprietary source code or datasets. This avoids obvious concerns about revealing trade secrets or sensitive data to third parties. Second, audits can be conducted in secret, making them suitable for both voluntary and involuntary compliance testing.
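To make this concrete, here is a minimal sketch of one common black-box auditing technique: a paired-probe (“sock puppet”) study, in which matched queries that differ only in a single attribute are sent to the system under audit and the responses are compared statistically. Everything in the sketch is hypothetical; the `quote_price` function and its ZIP-code surcharge merely stand in for whatever opaque system a real audit would probe through its public interface.

```python
"""Sketch of a paired-probe audit for differential pricing.

The `quote_price` function below is a hypothetical stand-in for a
black-box system reachable only through its public interface (e.g.,
a web API). No source code or training data is needed to audit it.
"""
import random
import statistics

def quote_price(profile: dict) -> float:
    # Placeholder for the system under audit. A real audit would issue
    # an HTTP request from a controlled client presenting this profile.
    base = 100.0
    # Hidden behavior the audit should detect (for demonstration only):
    # a surcharge keyed to the user's ZIP code.
    surcharge = 8.0 if profile["zip_code"].startswith("02") else 0.0
    return base + surcharge + random.gauss(0, 2)

def audit(profile_a: dict, profile_b: dict, n: int = 500) -> float:
    """Issue matched queries that differ only in the attribute of
    interest, then use a permutation test to check whether the
    observed price gap could plausibly arise by chance."""
    prices_a = [quote_price(profile_a) for _ in range(n)]
    prices_b = [quote_price(profile_b) for _ in range(n)]
    observed = statistics.mean(prices_a) - statistics.mean(prices_b)

    # Permutation test: shuffle the pooled prices and count how often
    # a random split produces a gap at least as large as observed.
    pooled = prices_a + prices_b
    extreme, trials = 0, 10_000
    for _ in range(trials):
        random.shuffle(pooled)
        diff = statistics.mean(pooled[:n]) - statistics.mean(pooled[n:])
        if abs(diff) >= abs(observed):
            extreme += 1
    p_value = extreme / trials
    print(f"mean gap = {observed:.2f}, p = {p_value:.4f}")
    return p_value

if __name__ == "__main__":
    audit({"zip_code": "02115"}, {"zip_code": "98105"})
```

A small p-value indicates that the price gap between the two probe profiles is unlikely to be chance, which is exactly the kind of evidence an auditor can present without ever seeing the system's internals.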


As part of the Social Contract 2020, we should carefully consider the compliance and enforcement role that algorithm audits can play. One option is to mandate independent algorithm audits of all consequential AI systems, similar to how we currently mandate financial audits (and should mandate cybersecurity audits). A second option is not to mandate algorithm audits, but to legalize and legitimize their practice. This would at least permit consequential AI systems to be audited, rather than allowing system owners to shroud their technology in a haze of legal use restrictions.

Christo Wilson

Associate Professor
Khoury College of Computer Sciences
Northeastern University

Member of the Social Contract 2020 Team

Michael Dukakis Leadership Fellow