AI literacy

The AI Act requires providers and deployers of AI systems to have a sufficient level of AI literacy; this requirement became applicable at the beginning of February 2025.

In brief

  • The ability to understand AI, critically question it and use it responsibly is vital today. AI literacy is essential to make informed decisions, minimise risks and comply with statutory requirements. At the same time, AI literacy enables the best possible use to be made of the potential of AI and innovations to be developed.
  • The AI Act does not specifically set out how to build AI literacy. No standardised training measures are laid down. Organisations using AI can decide themselves how to ensure AI literacy to the best of their knowledge and judgement.
  • The individual context of an organisation is fundamental: the organisation’s role as a provider or deployer developing or using AI systems, together with the associated risk, and the staff’s previous knowledge and experience are decisive factors.
  • Developing AI literacy is an ongoing process. AI literacy should be regularly updated and adapted to technological developments.
  • The AI Act does not require certification for participation in training or qualification measures. Measures can be carried out internally or externally.
  • Measures taken to build AI literacy should be documented.
  • There are various free AI literacy training courses (see information further below).

These web pages and our Guidance note on AI literacy (in German) provide helpful information about the requirements and about developing AI literacy. The European AI Office supports the Member States in implementing Article 4 and provides information about AI talent, skills and literacy. It has also hosted a webinar on AI literacy; the recording of the webinar is available to view at the following link: Third AI Pact webinar on AI literacy | Shaping Europe’s digital future.

What is AI literacy?

“AI literacy” is defined in Article 3 point (56) of the AI Act:

“‘AI literacy’ means skills, knowledge and understanding that allow providers, deployers and affected persons, taking into account their respective rights and obligations in the context of this Regulation, to make an informed deployment of AI systems, as well as to gain awareness about the opportunities and risks of AI and possible harm it can cause.” Article 3(56)

According to this definition, AI literacy essentially comprises:

  • the skills, knowledge and understanding
  • to deploy AI systems in an informed, responsible and safe manner and
  • to gain awareness about the opportunities and risks (for example ethical, legal and societal) presented by AI.

The legal definition also reflects the two key pillars of the AI Act: minimising risks and possible harm from the deployment of AI and promoting opportunities and innovation through the use of AI. It is therefore in the businesses’ own interests to develop AI literacy.

The rules on AI literacy in Article 4 of the AI Act have been applicable since 2 February 2025:

“Providers and deployers of AI systems shall take measures to ensure, to their best extent, a sufficient level of AI literacy of their staff and other persons dealing with the operation and use of AI systems on their behalf, taking into account their technical knowledge, experience, education and training and the context the AI systems are to be used in, and considering the persons or groups of persons on whom the AI systems are to be used.” Article 4

Article 4 covers the following aspects:

  • Who: Providers and deployers of AI systems, including general-purpose AI systems such as chatbots, irrespective of the sector they operate in or the size of the organisation.
  • What: A sufficient level of AI literacy of all persons operating or using AI systems on an organisation’s behalf. AI literacy covers both an organisation’s own staff and other persons dealing with the operation or use of AI systems on the organisation’s behalf, such as contractors and service providers.
  • How: To the organisation’s best extent and taking into account factors relating to the persons and the context.

AI literacy as an enabler

It is in an organisation’s own interest, and therefore its own responsibility, to ensure a high level of AI literacy among its staff in order to make full use of the undisputed potential of AI. A high level of AI literacy enables organisations to, among other things:

  • Make informed decisions: AI literacy is intended to enable providers, deployers and persons concerned to make informed decisions about AI systems and consequently obtain the greatest benefits from AI systems, while protecting fundamental rights, health and safety and enabling democratic control;
  • Improve working conditions and promote innovation: AI literacy can help to improve working conditions and support the innovation path of trustworthy AI; and
  • Ensure compliance with and enforcement of the AI Act: AI literacy is intended to provide all relevant operators along the AI value chain with the necessary knowledge to ensure appropriate compliance with and correct enforcement of the AI Act.

What is or is not laid down in the AI Act?

For compliance with the AI literacy requirement, organisations should, when deploying and using AI:

  • ensure a general understanding of AI;
  • consider their own role as a provider or deployer;
  • consider the risks of the specific AI system in the specific context; and
  • incorporate the latest developments and changes.

Not required for compliance with Article 4 are:

  • formalised or standardised training measures;
  • (external) certification of any measures taken;
  • the introduction of an AI officer; or
  • regular pre-checks by supervisory authorities on measures for ensuring AI literacy.

A lack of AI literacy can be seen as non-compliance with a duty of care, in particular if this results in any harm. The Bundesnetzagentur therefore recommends that organisations keep a good record of their measures for ensuring AI literacy. This will enable them to prove at any time that they comply with the requirements in Article 4.

How to build AI literacy?

The AI Act deliberately does not lay down any specific formats or other formalised and standardised measures, simply because there is no one-size-fits-all solution: a standardised approach cannot accommodate such a wide range of needs. Organisations are free to tailor their measures for acquiring and expanding AI literacy to their individual needs. These needs depend on, for example, the AI systems that an organisation uses and the associated risks, the tasks that staff perform and how much the staff already know about AI. Organisations therefore decide for themselves how to ensure AI literacy; their decisions should be plausible and understandable.

As a guideline, the Bundesnetzagentur has identified four cornerstones that it considers appropriate for building AI literacy.

Four cornerstones for building AI literacy
1. Identifying individual needs

  • Which persons develop, operate or use AI systems?
  • Which AI systems are they?
  • For which purpose do the persons work with the AI systems?
  • Which risks are associated with working with the AI systems?

2. Designing measures, taking into account:

  • individual factors relating to the persons concerned, such as their training, experience, current knowledge and type of work;
  • the context in which an AI system is deployed, for example the field of application, the phase of use, the persons concerned and the purpose;
  • the associated risk; and
  • the organisation’s own role along the AI value chain.

3. Updating regularly

Building AI literacy should be a dynamic and continuous process because, among other things:

  • the concept of AI literacy may change over time;
  • the context in which an AI system is used by a person may change; and
  • technological developments may open up new fields of application.
4. Keeping adequate records of measures, including:

  • the type of measures,
  • the scope in terms of content and duration, and
  • the persons taking part.
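
The AI Act does not prescribe any particular form for such records. Purely as an illustration of how the elements listed above (type of measure, scope in terms of content and duration, and participants) could be captured in a structured way, the following minimal Python sketch may serve as a starting point; the class and field names are assumptions made for this example, not an official template.

# Illustrative sketch only: one possible way to record AI literacy measures,
# covering the documented elements named above (type, scope, duration, participants).
# Class and field names are assumptions, not an official template.
from dataclasses import dataclass
from datetime import date
from typing import List

@dataclass
class LiteracyMeasure:
    """One documented AI literacy measure, such as a workshop or training course."""
    measure_type: str           # e.g. "workshop", "self-learning programme"
    topics: List[str]           # scope in terms of content
    duration_hours: float       # scope in terms of duration
    date_held: date             # when the measure took place
    participants: List[str]     # persons or roles taking part
    provider: str = "internal"  # carried out internally or externally

# Hypothetical record for an introductory AI basics workshop
example = LiteracyMeasure(
    measure_type="workshop",
    topics=["basics of data and AI", "opportunities and risks of AI"],
    duration_hours=3.0,
    date_held=date(2025, 3, 12),
    participants=["customer service team"],
)
print(example)

Such records can be kept in whatever system an organisation already uses; what matters, as noted above, is being able to show at any time which measures have been taken.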

Format and content of measures

The AI Act does not lay down a specific format for measures. Measures can range from self-learning programmes, workshops and training courses to multi-staged advanced training, depending on individual needs and the specific context. Measures can be carried out internally or externally.

Organisations are free to decide themselves on the content of the measures.

The Bundesnetzagentur recommends an interdisciplinary and multi-stage structure for the content. An interdisciplinary structure allows proper account to be taken of the technical, legal and ethical aspects of AI and interaction with AI. A multi-stage structure accommodates the fact that AI literacy is built up in stages according to the various levels of knowledge among staff.

The following summary of possible content for building AI literacy is neither binding nor exhaustive. Content should be adapted to an organisation’s specific circumstances, such as its size, the sector it operates in and the level of technological maturity, as well as to the individual needs of the persons concerned.

Stage 1: Creating a basic understanding of data and AI within the organisation

  • Basics of data and AI, including terms and history
  • Overview of AI technologies, including machine learning and large language models
  • General opportunities and risks posed by AI, for example based on use cases or role play

Stage 2: Building advanced AI literacy

  • Role of the organisation along the AI value chain, for example developer or user
  • Technical aspects of the AI used
  • Specific opportunities and risks and legal classification of the AI used

Stage 3: Role-specific training with individual focal points, for example technical, legal or ethical aspects

Support

There is a wide range of offers available free of charge to support organisations in building AI literacy. These include European and national measures as well as private initiatives and associations:

The European AI Office organises events within the framework of the AI Pact to support the exchange of experience and knowledge and the process of promoting trustworthy AI. The aim of the AI Pact is to support organisations in planning ahead for the implementation of measures under the AI Act. The AI Office hosted a webinar on AI literacy as required by Article 4 of the AI Act on 20 February 2025; a recording of the webinar is available to view online. The AI Office keeps a list of examples of AI literacy practices and provides questions and answers on AI literacy. Voluntary codes of conduct promoting AI literacy are supported by the European Commission and the Member States in cooperation with stakeholders.
European Digital Innovation Hubs (EDIHs) help SMEs, small mid-caps and public sector organisations to respond to digital challenges and improve their competitiveness. Their services comprise testing before investing, expanding digital literacy, advising on financing options, and networking.
Around 100 AI instructors belong to the network of Mittelstand-Digital Innovation Hubs (MDZ) in Germany. The instructors support SMEs individually and on an impartial basis. Together with the Hubs, they offer workshops, presentations, roadshows, demonstrations and other services (such as an AI readiness check) relating to AI literacy specifically for SMEs, for example with sector-related and process-related use cases. The aim is to enable SMEs to identify the opportunities and challenges presented by AI for their own businesses and implement specific applications. The range of topics covered includes issues common to both AI and other sectors, such as IT security, sustainability, organisation and change management, and law. Multipliers can use the train-the-trainer programme, a comprehensive AI training course available free of charge.

Online courses

There is also a wide range of online courses available free of charge, including those offered by:

Examples

The European AI Office keeps a list of examples of ongoing AI literacy practices.
