New tool released to test AI safety
The UK AI Safety Institute, part of the Department for Science, Innovation and Technology, has developed a cutting-edge artificial intelligence (AI) safety testing platform called Inspect, which aims to transform global AI safety evaluations.
Released for wider use, Inspect promises to make AI model development safer and more robust, giving businesses, academics, and AI developers the opportunity to contribute to AI safety.
The Inspect platform marks a significant step towards global collaboration on AI safety evaluations. Its release follows the establishment of the world's first state-backed AI Safety Institute, showing the UK's commitment to leading AI safety initiatives.
Accessible through an open-source licence, Inspect allows testers to assess specific capabilities of AI models and produce scores based on their performance. From evaluating core knowledge to autonomous capabilities, Inspect offers a comprehensive approach to AI safety evaluations.
For startups and technology businesses, Inspect presents an opportunity to engage in AI safety testing and development. By using this platform, they can contribute to shaping the future of AI safety while gaining insights into improving the safety and reliability of their own AI models.
Access the open-source Inspect platform.
First published 13 May 2024