A platform for assessing the safety of AI has been created in the UK
13.05.2024 | 15:58 | The UK's AI Safety Institute (AISI) has launched Inspect, a platform designed to test and assess the safety of AI models in industrial, research and academic organisations.
The Inspect platform consists of three main components: datasets, solvers and scorers.
Datasets supply the samples used in evaluation tests, solvers carry out the tests, and scorers evaluate the results and aggregate test scores into metrics.
Inspect's components can be extended with third-party packages written in Python.
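The dataset/solver/scorer split described above can be illustrated with a short sketch. This is a minimal, hypothetical pipeline in plain Python (the class and function names here are illustrative, not the real Inspect API): a dataset of samples, a solver that runs each sample through the model under test, and a scorer that aggregates pass/fail results into an accuracy metric.

```python
from dataclasses import dataclass

@dataclass
class Sample:
    input: str   # prompt sent to the model under test
    target: str  # expected answer

def solver(model, dataset):
    """Run each sample through the model and collect (output, target) pairs."""
    return [(model(s.input), s.target) for s in dataset]

def scorer(results):
    """Aggregate per-sample pass/fail into a single accuracy metric."""
    passed = sum(1 for output, target in results if output == target)
    return passed / len(results)

# Toy stand-in for the model being evaluated.
def toy_model(prompt: str) -> str:
    return "4" if prompt == "2+2?" else "unknown"

dataset = [Sample("2+2?", "4"), Sample("capital of France?", "Paris")]
accuracy = scorer(solver(toy_model, dataset))
print(accuracy)  # 0.5
```

In Inspect itself these roles are filled by pluggable components, which is what allows third-party Python packages to supply new solvers or scorers without changing the evaluation harness.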
Ian Hogarth, chair of the UK AI Safety Institute, said he hopes the global AI community will use Inspect not only to run its own model safety tests, but also to adapt and build on the open-source platform.
Last month, the US National Institute of Standards and Technology (NIST) announced the launch of the NIST GenAI program, designed to evaluate generative AI technologies, including models that generate text and images.
ORIENT news