Cyber & IT Supervisory Forum - Additional Resources

…developed and implemented by human-factors and socio-technical domain experts and researchers, designed to ensure control of interviewer and end-user subjectivity and biases.

- Identify and document approaches for evaluating and integrating elicited feedback from system end users, in collaboration with human-factors and socio-technical domain experts, to actively inform a process of continual improvement.
- Evaluate feedback from end users alongside evaluated feedback from impacted communities (MEASURE 3.3).
- Utilize end-user feedback to investigate how selected metrics and measurement approaches interact with organizational and operational contexts.
- Analyze and document system-internal measurement processes in comparison to collected end-user feedback.
- Identify and implement approaches to measure effectiveness of, and satisfaction with, end-user elicitation techniques, and document results.

Transparency & Documentation

Organizations can document the following:

- Did your organization address usability problems and test whether user interfaces served their intended purposes?
- How will user and peer engagement be integrated into the model development process and periodic performance review once deployed?
- To what extent can users or parties affected by the outputs of the AI system test the AI system and provide feedback?
- To what extent are the established procedures effective in mitigating bias, inequity, and other concerns resulting from the system?

AI Transparency Resources:

- GAO-21-519SP - Artificial Intelligence: An Accountability Framework for Federal Agencies & Other Entities.

