Silicon Saxony

#6: Putting the EU AI Act to the test: objectives, potential and unresolved issues

Can we safeguard democratic values, human rights and the rule of law in an AI-powered world, and what exactly does the EU AI Act mean for Europe and its position in the world? We take a closer look at the law, which has been described as a “blueprint for trustworthy AI around the world”.

Image: KI Bundesverband | elevait | Silicon Saxony

Contact info

Silicon Saxony

Marketing, Communication and Public Relations

Manfred-von-Ardenne-Ring 20 F

Phone: +49 351 8925 886

Fax: +49 351 8925 889

redaktion@silicon-saxony.de


Obstacle or opportunity? What does the EU AI Act – the world’s first comprehensive law on the use of AI – mean for Europe as a business location? We explain. The guests in Julia Nitzschner’s virtual podcast studio are Stefanie Baade (KI Bundesverband) and Gregor Blichmann (elevait).

Uniform rules for artificial intelligence in the EU and the responsible and ethical use of AI: this is the promise of the EU AI Act, which was finally adopted by the Council of the 27 EU member states on 21 May 2024. It is the world’s first comprehensive law on the regulation of artificial intelligence. Why is this so important for all of us? What exactly does the EU AI Act say and what does it mean for companies whose products and business models are based on the use of AI? We answer these and other questions in this episode.

We dive deep into the details of the EU AI Act. We outline why the law exists in the first place, explain the four risk levels using examples, and address uncertainties in dealing with the law as well as the next steps. Building on this, we discuss how companies can prepare for the new requirements and what else Europe needs as an AI location, beyond regulation, in order to be internationally competitive.

Finally, we assess the Act in terms of Europe’s competitiveness and its attractiveness as an AI location for international companies. Whether the AI Act will prove a stumbling block or a historic opportunity for the development and use of artificial intelligence is a question we hope you, like our two guests, will be able to answer by the end of this episode.

_ _ _ _ _ _ _ _

Topics of the episode

We talk to Stefanie Baade (Deputy Managing Director of the German AI Association) and Gregor Blichmann (CTO, elevait) about the following topics:

  • Details of the EU AI Act and the different risk levels of AI applications
  • Prohibited systems and strict requirements
  • Opportunities and challenges of the new regulation for Europe
  • Preparing companies for the new requirements
  • Evaluation and assessment

_ _ _ _ _ _ _ _

Listen to the episode (German language)

🔊 Podigee (web)

🔊 Spotify

🔊 Apple Podcasts

🔊 Deezer

_ _ _ _ _ _ _ _

Further information on the AI Act

Important aspects and challenges:

  • Creation of a legal framework for ethical and responsible AI development
  • Risk-based approach with different levels of risk
  • Transparency, data protection and diversity as ethical standards of the AI Act
  • Technical and bureaucratic challenges in implementing the regulation still need to be clarified
  • Small companies need support, which the German AI Association, among others, provides

Future steps and expectations:

  • Detailed organisation of the obligations to provide evidence and transparency measures
  • Creation of a standardised market and a central AI authority
  • Need for larger sums of funding and bold investments in AI
  • Promotion of technological sovereignty and education in the field of AI

Summary and outlook:

  • The AI Act offers potential for legally secure and trustworthy AI development
  • Need to master the balancing act between bureaucracy and innovation
  • Opportunities for Europe to set global standards as a pioneer and for other countries to follow suit

_ _ _ _ _ _ _ _

Further links

👉 AI Act – further information and timings

👉 AI Act – collected documents

👉 Artificial intelligence working group
