Coverage-Guided Fairness Testing

Daniel Perez Morales, Takashi Kitamura, Shingo Takada

Research output: Conference contribution

Abstract

Software testing is a crucial task. Unlike conventional software, AI software that uses decision-making algorithms or classifiers needs to be tested for discrimination or bias. Such bias can cause discrimination against certain individuals based on their protected attributes, such as race, gender, or nationality. Discrimination as an unintended behavior is a major concern. Previous work tested for discrimination randomly, which resulted in variation in the results across test executions. These varying results indicate that each test execution misses some discrimination. Although it is nearly impossible to find all discrimination without checking every possible combination of inputs in the system, it is important to detect as much discrimination as possible. We thus propose Coverage-Guided Fairness Testing (CGFT). CGFT leverages combinatorial testing to generate an evenly-distributed test suite. We evaluated CGFT with two different datasets, creating three models with each. The results show an improvement in the amount of unfairness found using CGFT compared to previous work.
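The individual-discrimination check that such a test suite feeds can be sketched as follows. This is an illustrative sketch only, not the paper's implementation: the attribute domains and the deliberately biased stand-in classifier are hypothetical, and CGFT itself generates a combinatorial (t-way) covering array rather than exhaustively enumerating all combinations as done here.

```python
from itertools import product

# Hypothetical attribute domains (illustrative only).
domains = {
    "age_group": [0, 1, 2],
    "hours_per_week": [0, 1],
    "gender": [0, 1],  # protected attribute
}
protected = "gender"

def classifier(sample):
    # Stand-in model, deliberately biased on "gender" so the
    # check below has something to find.
    return int(sample["age_group"] + sample["gender"] >= 2)

def find_discrimination(domains, protected, model):
    """Return inputs whose prediction flips when only the protected
    attribute is changed -- the individual-discrimination criterion
    used in fairness testing."""
    keys = [k for k in domains if k != protected]
    found = []
    for values in product(*(domains[k] for k in keys)):
        sample = dict(zip(keys, values))
        preds = {v: model({**sample, protected: v})
                 for v in domains[protected]}
        if len(set(preds.values())) > 1:  # prediction depends on gender
            found.append(sample)
    return found

discriminatory = find_discrimination(domains, protected, classifier)
print(len(discriminatory))
```

With this toy classifier, only inputs with `age_group == 1` flip their prediction when `gender` changes, so the check reports those combinations as discriminatory instances.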

Original language: English
Host publication title: Computer and Information Science, 2021
Editors: Roger Lee
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 183-199
Number of pages: 17
ISBN (print): 9783030794736
DOI
Publication status: Published - 2021
Event: 20th IEEE/ACIS International Summer Semi-Virtual Conference on Computer and Information Science, ICIS 2021 - Shanghai, China
Duration: 23 Jun 2021 - 25 Jun 2021

Publication series

Name: Studies in Computational Intelligence
Volume: 985
ISSN (print): 1860-949X
ISSN (electronic): 1860-9503

Conference

Conference: 20th IEEE/ACIS International Summer Semi-Virtual Conference on Computer and Information Science, ICIS 2021
Country/Territory: China
City: Shanghai
Period: 21/6/23 - 21/6/25

ASJC Scopus subject areas

  • Artificial Intelligence
