‘We must not let AI shape us’ — UK, US, others sign first legally binding AI treaty
The international AI treaty aims to ensure the ethical and responsible use and development of AI systems
The U.K., U.S. and European Union (EU) member states are among several nations that have signed the first legally binding international treaty to ensure the ethical and responsible use and development of AI systems.
The decision-making Committee of Ministers within the Council of Europe adopted the AI Convention in May following years of discussions between 57 countries. “This Convention is a major step to ensuring that these new technologies can be harnessed without eroding our oldest values, like human rights and the rule of law,” Britain’s Lord Chancellor and Justice Secretary Shabana Mahmood said in a statement. The Council of Europe’s 46 member states, the European Union and 11 non-member states were involved in the drafting process, and academia, private sector representatives and civil society acted as process “observers.”
The AI Convention is distinct from the recently established EU AI Act, which regulates the development, deployment and use of AI systems within the EU internal market. The new treaty instead focuses on risk and impact assessments “in respect of actual and potential impacts on human rights, democracy and the rule of law.” It also includes the establishment of AI risk mitigation measures.
Additional signatories of the convention include Andorra, Georgia, Iceland, Norway, the Republic of Moldova, San Marino and Israel, but the Council of Europe noted that eligibility is open to “countries from all over the world.”
“Artificial intelligence has the capacity to radically improve the responsiveness and effectiveness of public services, and turbocharge economic growth,” said Mahmood. “However, we must not let AI shape us — we must shape AI.”
The treaty will come into effect three months after it is ratified by five signatories, at least three of which must be Council of Europe member states.