Cyber & IT Supervisory Forum - Additional Resources
- Identify the classes of individuals, groups, or environmental ecosystems that might be impacted, through direct engagement with potentially impacted communities.
- Evaluate systems with regard to disability inclusion, including consideration of disability status in bias testing and of discriminatory screen-out processes that may arise from non-inclusive design or deployment decisions.
- Develop objective functions in consideration of systemic biases and in-group/out-group dynamics.
- Use context-specific fairness metrics to examine how system performance varies across groups, within groups, and/or for intersecting groups. Metrics may include statistical parity, error-rate equality, statistical parity difference, equal opportunity difference, average absolute odds difference, standardized mean difference, and percentage point differences.
- Customize fairness metrics to the specific context of use to examine how system performance and potential harms vary within contextual norms.
- Define acceptable levels of difference in performance in accordance with established organizational governance policies, business requirements, regulatory compliance, legal frameworks, and ethical standards within the context of use.
- Define the actions to be taken if disparity levels rise above acceptable levels.
- Identify groups within the expected population that may require disaggregated analysis, in collaboration with impacted communities.
- Leverage experts with knowledge of the specific context of use to investigate substantial measurement differences and identify root causes for those differences.
- Monitor system outputs for performance or bias issues that exceed established tolerance levels.
- Ensure periodic model updates; test and recalibrate with updated and more representative data to stay within acceptable levels of difference.
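The group-comparison metrics named above can be sketched in code. The following is a minimal illustration using NumPy; the function names, the binary privileged/unprivileged group encoding, and the 0.1 tolerance threshold are assumptions for the example, not prescribed values — actual thresholds must come from the governance policies and context of use described above.

```python
import numpy as np

def group_rates(y_true, y_pred, mask):
    """Selection rate, true-positive rate, and false-positive rate
    for the subgroup selected by the boolean mask."""
    yt, yp = y_true[mask], y_pred[mask]
    selection_rate = yp.mean()
    tpr = yp[yt == 1].mean() if (yt == 1).any() else np.nan
    fpr = yp[yt == 0].mean() if (yt == 0).any() else np.nan
    return selection_rate, tpr, fpr

def fairness_metrics(y_true, y_pred, group):
    """Compare an unprivileged group (group == 0) against a
    privileged group (group == 1) on three disparity metrics."""
    sr_u, tpr_u, fpr_u = group_rates(y_true, y_pred, group == 0)
    sr_p, tpr_p, fpr_p = group_rates(y_true, y_pred, group == 1)
    return {
        # Difference in selection rates between the two groups.
        "statistical_parity_difference": sr_u - sr_p,
        # Difference in true-positive rates (equal opportunity).
        "equal_opportunity_difference": tpr_u - tpr_p,
        # Mean absolute gap in FPR and TPR across groups.
        "average_abs_odds_difference": 0.5 * (abs(fpr_u - fpr_p)
                                              + abs(tpr_u - tpr_p)),
    }

def check_tolerance(metrics, threshold=0.1):
    """Flag each metric whose disparity exceeds the agreed tolerance
    (threshold here is illustrative, not a recommended value)."""
    return {name: abs(value) > threshold for name, value in metrics.items()}

# Toy example: group 0 is selected less often and has a lower TPR.
y_true = np.array([1, 1, 0, 0, 1, 1, 0, 0])
y_pred = np.array([1, 0, 0, 0, 1, 1, 1, 0])
group  = np.array([0, 0, 0, 0, 1, 1, 1, 1])
metrics = fairness_metrics(y_true, y_pred, group)
flags = check_tolerance(metrics)
```

In this toy data, the statistical parity difference is -0.5 (group 0 is selected at 25% versus 75% for group 1), which a monitoring process would flag against the example tolerance; in practice, intersecting subgroups and context-specific thresholds would replace this binary comparison.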