Engineering Fair Systems

Software bias can arise from many sources, including poor design, implementation bugs, unintended component interactions, and the use of unsafe algorithms or biased training data. Our work focuses on using the engineering process to improve software fairness. For example, tools can help domain experts specify fairness properties and detect inconsistencies among those requirements; they can automatically generate test suites that measure bias in black-box systems, even when the system's source code and training data are unavailable; they can help developers and data scientists debug the causes of bias in both source code and data; and they can formally verify fairness properties of an implementation. This work combines research in software engineering with machine learning, computer vision, natural language processing, and theoretical computer science to create tools that help build fairer systems.
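To make the black-box testing idea concrete, here is a minimal sketch of causal fairness testing: randomly generate inputs, flip only a protected attribute, and measure how often the system's decision changes. The `biased_model` function, the attribute names, and the decision threshold are all hypothetical stand-ins for illustration, not any specific system or tool.

```python
import random

def biased_model(applicant):
    """Toy stand-in for a black-box decision system (hypothetical)."""
    score = applicant["income"] / 10000 + applicant["years_employed"]
    if applicant["gender"] == "female":  # deliberately injected bias
        score -= 2
    return score >= 5

def causal_fairness_test(model, protected_attr, values, n_trials=1000, seed=0):
    """Estimate how often changing only the protected attribute
    flips the model's decision, treating the model as a black box."""
    rng = random.Random(seed)
    flips = 0
    for _ in range(n_trials):
        # Generate a random input; ranges are arbitrary for this sketch.
        applicant = {
            "income": rng.randint(0, 100000),
            "years_employed": rng.randint(0, 40),
        }
        # Query the model once per protected-attribute value.
        outcomes = {
            model(dict(applicant, **{protected_attr: v})) for v in values
        }
        # If the outcomes differ, the attribute alone changed the decision.
        if len(outcomes) > 1:
            flips += 1
    return flips / n_trials

rate = causal_fairness_test(biased_model, "gender", ["female", "male"])
print(f"estimated discrimination rate: {rate:.2f}")
```

Because the test only queries the model on constructed inputs, it needs no access to source code or training data, which is what makes it applicable to black-box systems.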