Cyber & IT Supervisory Forum - Additional Resources

GOVERN 4.2

Organizational teams document the risks and potential impacts of the AI technology they design, develop, deploy, evaluate, and use, and communicate about those impacts more broadly.

About

Impact assessments are one approach for driving responsible technology development practices. Within a specific use case, these assessments can provide a high-level structure for organizations to frame the risks of a given algorithm or deployment. Impact assessments can also serve as a mechanism for organizations to articulate risks and generate documentation for management and oversight activities when harms do arise.

Impact assessments may:
- be applied at the beginning of a process, but also iteratively and regularly, since goals and outcomes can evolve over time.
- include perspectives from AI actors, including operators, users, and potentially impacted communities (including historically marginalized communities, those with disabilities, and individuals impacted by the digital divide).
- assist in "go/no-go" decisions for an AI system.
- consider conflicts of interest, or undue influence, related to the organizational team being assessed.

See the MAP function playbook guidance for more information relating to impact assessments.

Suggested Actions
- Establish impact assessment policies and processes for AI systems used by the organization.
- Align organizational impact assessment activities with relevant regulatory or legal requirements.
- Verify that impact assessment activities are appropriate to evaluate the potential negative impact of a system and how quickly a system changes, and that assessments are applied on a regular basis.
- Utilize impact assessments to inform broader evaluations of AI system risk.
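The documentation and review cycle described above can be captured in a structured record. The sketch below is a hypothetical example, not a schema from the AI RMF itself: all field names, the 90-day interval, and the helper method are illustrative assumptions showing how an organization might track risks, impacted groups, conflicts of interest, a go/no-go outcome, and when regular reassessment is due.

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical record for documenting one AI impact assessment.
# Field names and the reassessment interval are illustrative only.
@dataclass
class ImpactAssessment:
    system_name: str
    assessed_on: date
    risks: list[str] = field(default_factory=list)            # articulated risks
    impacted_groups: list[str] = field(default_factory=list)  # incl. marginalized communities
    conflicts_of_interest: list[str] = field(default_factory=list)
    go_decision: bool = False  # "go/no-go" outcome for the AI system

    def needs_reassessment(self, today: date, interval_days: int = 90) -> bool:
        """Flag when the regular (iterative) reassessment interval has lapsed."""
        return (today - self.assessed_on).days >= interval_days

# Example: an assessment older than the assumed 90-day interval is flagged for review.
a = ImpactAssessment("credit-scoring-model", date(2023, 1, 1), risks=["disparate impact"])
print(a.needs_reassessment(date(2023, 6, 1)))  # True
```

Keeping assessments as structured records like this makes it straightforward to verify, as the suggested actions require, that assessments are applied on a regular basis and that unresolved risks are documented before a go decision.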

