EU adopts AI Act: Key Takeaways for SMEs
> May 2024

Following its publication in the Official Journal of the EU on 12 July 2024, the AI Act will enter into force on 1 August 2024. From this date, the transition periods begin to run; once they expire, the respective provisions must be complied with. For small and medium-sized companies whose business does not directly involve AI, however, only a handful of provisions are likely to become relevant when using intelligent systems.
The AI Act marks the first piece of legislation to uniformly govern the use of intelligent systems by way of a regulation that is directly applicable in the EU Member States. Depending on the area of application and the risks a given AI system poses, various stakeholders are subject to obligations of varying severity. That being said, the basis for the AI Act's applicability is always the use of an AI system as defined in Art. 3 No. 1 AI Act. Firstly, it is important to recognise that not every semi-automated program constitutes such an AI system. If, for example, a company employs only rule-based software to automate certain processes, that software falls outside the scope of the AI Act from the outset. To be covered by the regulation, the system must instead be designed to operate with a degree of autonomy and infer from input data how to generate outputs.

Secondly, the AI Act distinguishes between different addressees of obligations, focusing on providers and deployers of AI systems. The legal definitions are rather broad and suggest that these terms overlap in certain respects. Nevertheless, most SMEs are likely to be classified as deployers: for cost reasons and/or due to a lack of expertise, they tend to source pre-made systems rather than having them developed individually or developing them themselves, which would qualify them as providers. This means they will not need to comply with provisions exclusively addressing providers, importers and distributors of AI systems.

In any case, SMEs are subject to the general obligation under Art. 4 AI Act to ensure that personnel dealing with AI systems have a sufficient level of AI literacy, for instance through training. Beyond that, the AI Act categorises individual obligations by the type of AI system concerned. There are three main categories:

+ Art. 5 AI Act lists prohibited AI practices whose systems may not be placed on the market at all. SMEs should only exceptionally encounter this type of AI system: the provision covers such topics as harmfully manipulating individuals' behaviour and predicting crimes in the style of Minority Report. If anything, Art. 5 para. 1 lit. c or g AI Act could (at least thematically) be relevant for SMEs with regard to applicant selection and performance reviews.

+ High-risk AI systems are subject to the greatest density of regulation, Arts. 6 to 49 AI Act. Here, deployers must follow a variety of rules, ranging from documentation and transparency obligations to the implementation of certain technical and organisational measures. Which systems count as high-risk is primarily governed by Annex III to the AI Act. Against this backdrop, it is conceivable that SMEs use high-risk AI in the area of ‘Employment, workers management and access to self-employment’ (Annex III, point 4). This particularly concerns systems that autonomously evaluate employee performance or screen and filter applications. Even where such systems are used, however, the exemption in Art. 6 para. 3 AI Act may release SMEs from the relevant obligations if the system does not pose a significant risk of harm to the health, safety or fundamental rights of natural persons. That being said, pursuant to Art. 6 para. 4 AI Act, this would not release them from the registration obligation under Art. 49 AI Act.

+ Finally, certain AI systems are subject to transparency obligations under Art. 50 AI Act. Depending on a system's area of application, SMEs could therefore be obliged to label content created by AI tools such as ChatGPT or Copilot. However, this does not apply where such content has undergone human review and a person assumes editorial responsibility for it.