News

AI compliance – Balancing regulation and innovation

07.10.2024

The use of artificial intelligence (AI) unlocks numerous opportunities for companies, but also entails considerable legal, ethical and commercial risks. A robust AI compliance structure helps companies ensure their use of AI is both innovative and legally compliant. The aim should be to align legal compliance with the company’s economic and strategic goals as part of overarching AI governance. Only then is there sufficient room for innovation and progress.

Challenging regulatory environment

Knowledge of the legal and regulatory framework relevant to each company is essential to setting up an AI compliance structure. In Germany and the European Union, this framework includes the following:

AI Act: The AI Act came into force on 2 August 2024. Depending on the specific purpose of an AI system, the Act imposes extensive obligations on both providers and operators of AI systems. To ensure compliance with the requirements of the AI Act (such as documentation and human oversight), appropriate compliance structures are needed, including the corresponding roles and responsibilities. The following aspects are particularly important:

  • Many companies assume the obligations of the AI Act do not apply to them because they ‘do not develop AI’. This is a fallacy. A provider and thus an obligated party within the meaning of the AI Act may also be a company that merely makes changes to an AI model or system or integrates it into its own products (e.g. integrating an LLM into an app as a chatbot).
  • Companies often believe they do not carry out prohibited AI practices or operate high-risk AI systems. On closer inspection, however, it becomes clear that their use of AI is a grey area (for example, using AI systems in customer service to recognise customer emotions and then make appropriate suggestions for communication).
  • Under the AI Act, employers are subject to certain notification and information obligations that apply to all companies using AI.
  • Although some of the obligations will not apply until 2026 or 2027, others will apply as early as February 2025. Companies should therefore note in particular the obligation to train their employees in the basics of AI; this obligation will probably apply to all companies.

Codetermination: Companies with a works council should involve it at an early stage as part of AI governance to prevent the risk of AI projects being stopped at short notice. This is also important when producing training courses and guidelines for employees.

Data protection: AI needs data: for training, for any fine-tuning and when the system is used. If personal data is involved, it must be processed in line with data protection law, in particular the GDPR. In this context, existing consents and privacy notices must be reviewed.

Data Act: The Data Act is also important in connection with AI. The Act, parts of which will apply as early as September 2025, requires a contractual basis in the form of a data licence if non-personal(!) data is collected in connection with IoT products, associated services and virtual assistants. If that data is used in an AI context, companies must ensure the corresponding licensing rights are in place.

Antitrust law: Many companies overlook the antitrust aspects that may arise when using AI. In particular, the collection of data must not conflict with the prohibition on sharing competitively sensitive data.

Intellectual property law (in particular copyright law): Intellectual property law continues to be of paramount importance in connection with AI systems. This initially concerns the question of what content a system or model may be trained on. Section 44b of the German Copyright Act (Urheberrechtsgesetz), which permits text and data mining under certain conditions, is particularly relevant here; the provision was recently the subject of a court decision for the very first time. There is also the question of who holds the rights to AI-generated content, particularly when sourcing and developing AI solutions. This question arises because purely machine-generated output does not enjoy any protection under German copyright law.

Protection of trade secrets: If a company’s trade secrets are used to train or operate an AI system, the company must ensure that the secrets remain covered by the German Trade Secrets Act (Geschäftsgeheimnisgesetz) and that this protection is not lost.

Industry-specific requirements: In addition to the legal framework mentioned, there may be industry-specific requirements. This may be the case for insurance companies and banks (e.g. section 25b of the German Banking Act (Kreditwesengesetz)).

Contractual obligations: If third-party data is used in the context of AI, in addition to compliance with legal requirements, companies must ensure that the use of the data does not breach contractual obligations, in particular confidentiality agreements and erasure obligations.

Note that, when making the legal assessment, companies should always bear in mind that AI does not stop at borders. In the case of cross-border deployment, the requirements of all relevant jurisdictions must therefore be taken into account at an early stage.

Best practices on the path to AI compliance

When establishing an AI compliance structure, the ideal project approach is based on an interdisciplinary team and takes into account all key stakeholders within the company from the outset.

This usually includes representatives of the legal and compliance departments, IT, IT security and any engineering departments. It is also advisable to involve the HR department or the works council, especially with regard to any employee notifications. The (IT) procurement department should also be involved in order to efficiently mitigate the risks associated with AI sourcing. In addition, companies must ensure that management is sufficiently engaged and that there is reliable contact with the individual departments so that the compliance requirements are actually implemented.