Quantifying and minimizing bias in artificial intelligence systems is the goal of a new lab established within SMU’s AT&T Center for Virtualization. Pangiam, a global leader in artificial intelligence, is the first industry partner for the Intelligent Systems and Bias Examination Lab (ISaBEL).
ISaBEL’s mission is to understand how artificial intelligence (AI) systems, such as facial recognition algorithms, perform on diverse populations of users. The lab will examine how existing bias in these systems can be mitigated using the latest research, standards, and other peer-reviewed scientific studies.
Algorithms provide instructions for computers to follow in performing certain tasks, and bias can be introduced through things like incomplete training data or reliance on flawed information. As a result, the automated decisions driven by algorithms that support everything from airport security to judicial sentencing guidelines can inadvertently create disparate impact across certain groups.
ISaBEL will design and execute experiments using a variety of diverse datasets that will quantify AI system performance across demographic groups.
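To make that kind of measurement concrete, here is a minimal sketch of comparing a recognition system’s error rates across demographic groups. The data, group labels, threshold, and metric choice below are hypothetical illustrations, not ISaBEL’s actual datasets or methodology.

```python
# Sketch: per-group false non-match rate for a face recognition system.
# All records and group labels are hypothetical examples.
from collections import defaultdict

def false_non_match_rate(records, threshold):
    """Fraction of genuine (same-person) comparisons in each group whose
    similarity score falls below the decision threshold."""
    genuine = defaultdict(int)
    missed = defaultdict(int)
    for group, score, is_same_person in records:
        if is_same_person:
            genuine[group] += 1
            if score < threshold:
                missed[group] += 1
    return {g: missed[g] / genuine[g] for g in genuine if genuine[g]}

# Hypothetical comparison records: (demographic_group, similarity_score, is_same_person)
records = [
    ("group_a", 0.91, True), ("group_a", 0.42, True), ("group_a", 0.30, False),
    ("group_b", 0.88, True), ("group_b", 0.61, True), ("group_b", 0.25, False),
]

print(false_non_match_rate(records, threshold=0.5))
# e.g. {'group_a': 0.5, 'group_b': 0.0} -- a gap between groups worth investigating
```

Comparing the same error metric group by group, rather than reporting a single aggregate accuracy number, is what allows a disparity to be detected and quantified in the first place.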
As the lab grows, ISaBEL will seek additional industry partners to submit their algorithms for certification. “SMU’s AT&T Center for Virtualization is the perfect place to work on these issues with its focus on cross-disciplinary research, education and training, and community outreach,” said center director Suku Nair.
Both artificial intelligence and computer vision, which enables computers to extract information from digital images and videos, are evolving quickly and becoming increasingly accessible and widely adopted.
“How to study and mitigate bias in AI systems is a fast-moving area, with pockets of researchers all over the world making important contributions,” said John Howard, an AT&T Center research fellow and biometrics expert. “Labs like ISaBEL will help ensure these breakthroughs make their way into the products where they can do the most good and also educate the next generation of computer scientists about these important issues.”
AI industry leaders recognize that end users must clearly understand how bias is measured and how mitigation is progressing in order to build trust among the general public and drive full market adoption.
“At Pangiam, we are fundamentally committed to driving the industry forward with impactful efforts such as this,” said Pangiam Chief AI Officer and SMU alumnus Shaun Moore. “Bias mitigation has been a paramount focus for our team since 2018, and we set out to demonstrate publicly our effort toward parity of performance across countries and ethnicities. SMU is the perfect institution for this research.”
ISaBEL is currently recruiting graduate and undergraduate students to participate in the lab’s AI research. Please contact Suku Nair at [email protected] if interested.