Using the appropriate assessment tools to identify and prioritise risks
Based on agreed principles, an organisation needs to define how it thinks about risk and impact: what is considered 'risky' or high impact, who decides this rating, and how the assessment is used to select use cases. This step is often driven by the organisation's industry or the type of work it does.
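One common way to make such a rating concrete is a likelihood-by-impact matrix. The sketch below is a minimal, hypothetical illustration: the 1-5 scales, thresholds, and use cases are assumptions for demonstration, not part of any specific framework.

```python
def rate(likelihood: int, impact: int) -> str:
    """Combine likelihood and impact (each scored 1-5) into a risk band.

    The thresholds here are illustrative; an organisation would set its
    own cut-offs based on its agreed principles.
    """
    score = likelihood * impact
    if score >= 15:
        return "high"
    if score >= 8:
        return "medium"
    return "low"

# Hypothetical use cases with agreed (likelihood, impact) scores.
use_cases = {
    "internal document search": (2, 2),
    "automated loan decisions": (4, 5),
    "marketing copy drafts": (3, 2),
}

# Prioritise review of the highest-rated use cases first.
for name, (l, i) in sorted(use_cases.items(),
                           key=lambda kv: kv[1][0] * kv[1][1],
                           reverse=True):
    print(f"{name}: {rate(l, i)}")
```

The point of the sketch is the process, not the numbers: the scales and thresholds encode the organisation's agreed principles, and the sorted output drives which use cases get scrutiny first.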
OECD AI Framework
The OECD has developed a framework for implementing trustworthy AI systems by providing a way to compare tools and practices across different use contexts. Many impact assessment tools are available; this meta-analysis provides broader context for selecting tools for specific use cases.
NIST Risk Management Framework
NIST has developed the AI Risk Management Framework (AI RMF) to help manage the risks that artificial intelligence (AI) poses to individuals, organisations, and society. The framework is designed to drive the incorporation of trustworthiness considerations into the design, development, use, and evaluation of AI products, services, and systems.