US and UK Team Up to Address Risks of Artificial Intelligence


Washington, DC, Friday, April 13, 2024 – The U.S. and UK AI Safety Institutes signed a Memorandum of Understanding (MOU) on March 20, 2024, aimed at strengthening the infrastructure for testing AI models and at researching, evaluating, and providing guidance on AI safety. 1

The alliance builds on commitments made at the AI Safety Summit last November.

The partnership aims to conduct at least one joint testing exercise on a publicly accessible model and to build a shared pool of expertise drawn from the two institutes.

U.S. Commerce Secretary Gina Raimondo and UK Technology Secretary Michelle Donelan signed the collaborative agreement to energize the AI sector in both countries, particularly the scientific approach to AI safety.

“Because of our collaboration, our institutes will gain a better understanding of AI systems, conduct more robust evaluations, and issue more rigorous guidance,” U.S. Secretary of Commerce Gina Raimondo stated. 2

As AI platforms continue to grow, the need for effective safety measures becomes more pressing, and this collaborative launch is intended to address it. Both countries have also committed to partnering with other nations to help ensure AI safety worldwide.

“We have always been clear that ensuring the safe development of AI is a shared global issue. Only by working together can we address the technology’s risks head-on and harness its enormous potential to help us all live easier and healthier lives,” UK Secretary of State for Science, Innovation, and Technology, Michelle Donelan said. 2

Sources
  1. commerce.gov: https://www.commerce.gov/news/press-releases/2024/04/us-and-uk-announce-partnership-science-ai-safety
  2. commerce.gov: https://www.commerce.gov/news/press-releases/2024/04/us-and-uk-announce-partnership-science-ai-safety