Key Takeaways from the AI Pact Webinar on Article 4 of the EU AI Act
Yesterday, the AI Pact, supported by the EU AI Office and the European Commission, hosted its third AI Pact webinar, this one on AI literacy. The aim of the webinar was to help organisations learn more about the European Union's approach to Article 4 of the AI Act and to discover the ongoing practices of the AI Pact organisations. Article 4 of the AI Act, which requires providers and deployers of AI systems to ensure a sufficient level of AI literacy, entered into application on 2 February 2025. In this article, we break down some of the webinar's key takeaways.
Territorial Application of the EU AI Act
You may be thinking, “Wait. Does the EU AI Act even apply to my company?”.
Good question. Like the GDPR, the Act asserts significant extraterritorial jurisdiction. Regardless of a company’s location, the Act applies if a company is developing, selling, importing, manufacturing, distributing, or deploying AI that touches EU residents.
Here are a couple of examples of the Act’s breadth:
- Any company that makes AI systems available to employees in the EU for professional use is “deploying” AI in the EU.
- Any company that makes AI systems available to EU residents is “distributing” AI, even if the system is free.
The Act addresses many circumstances beyond these examples.
In sum, assuming the Act does not apply to you as a non-EU company is a big mistake that could lead to steep regulatory penalties, reputational damage, and increased legal risk. Look closely at your AI systems before making a decision.

AI Literacy Requirement
So what’s this AI literacy requirement about?
“AI literacy” refers to the knowledge and understanding required to effectively use, interact with, and critically evaluate AI systems.
The AI literacy requirement aims to enable responsible AI deployment and usage within organisations. Although there are no direct fines for non-compliance, a failure to create sufficient AI literacy may increase penalties for other violations.
More importantly for your AI program, a lack of AI literacy will increase the likelihood of violations occurring. Untrained employees may inadvertently misuse AI, leading to unintended harms or non-compliance with broader governance policies.
Building AI Literacy in a Corporate Environment
Let's break it down now. What do you actually need to do?
The Act does not prescribe any format or method for creating AI literacy, but making a good faith effort will likely go a long way with regulators. The easiest way to do this is to follow best practices already established in other compliance areas.
Start with these two prerequisites to AI governance:
1. Build an AI Map: Similar to a personal data map, you need to map where AI is being used in your company and by whom. Talk to your IT teams, survey your employees, and review a list of the applications you are paying for. Discover what’s in your tech stack to build a program that fits your organisation.
2. Create an AI Policy: A key part of every AI literacy program should be educating your teams on your company’s guidelines and rules for using AI – just like your privacy policy is a key part of your privacy training.
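To make the first prerequisite concrete, the AI map can start out as nothing more than a structured inventory of the systems you discover. Here is a minimal sketch in Python; the field names, example systems, and the `training_completed` flag are purely illustrative assumptions on my part, not anything mandated by the Act:

```python
from dataclasses import dataclass, field


# Hypothetical record for one AI system found in your tech stack.
# Fields are illustrative; adapt them to your own governance needs.
@dataclass
class AISystemRecord:
    name: str
    vendor: str
    business_units: list[str]       # which teams use the system
    purpose: str                    # what it is used for
    training_completed: bool = False  # has baseline literacy training happened?


def untrained_systems(inventory: list[AISystemRecord]) -> list[str]:
    """Return the systems whose users still need AI literacy training."""
    return [r.name for r in inventory if not r.training_completed]


inventory = [
    AISystemRecord("ChatGPT Enterprise", "OpenAI",
                   ["Marketing", "Legal"], "drafting and summarisation"),
    AISystemRecord("GitHub Copilot", "GitHub",
                   ["Engineering"], "code assistance",
                   training_completed=True),
]

print(untrained_systems(inventory))
```

Even a simple list like this gives you something to review with IT and legal, and the training flag ties the map directly to the literacy programme described below.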
Once you have these tools in hand, here is how I would approach increasing AI literacy within an organisation:
- Baseline Training: AI usage is spreading quickly. In most industries, it’s reasonable to assume that 95%+ of your employees will be using an AI system within the next 12 months (if they are not already). Add baseline AI literacy training to your mandatory training suite. Compliance vendors offer training like this that you can plug into your learning management system. Remember that you will likely want to make some customisations to ensure that employees know your specific policies.
- Specialized Training: Besides general AI systems, groups of employees are likely to use specialised systems. These employees will need specific knowledge about operating the systems safely, using the output, and managing related risks. The AI system vendor may offer training you can use, or you may have to create it internally. This can't be a one-off: new employees will need to be onboarded as use of the system expands.
- Deep Training: A smaller subset of employees will be responsible for AI decisions, rolling out AI technology, and the daily operation of AI systems. These employees will need more extensive training. Consider sending key leaders out for external training or bringing in an expert for a targeted internal group training. As AI competencies increase within your company, you will be able to manage more of this process internally.
- Communications: Employees will not fully internalise the need for responsible AI practices from a single training. You will need to build a cultural drumbeat around this and integrate it into your broader activities. Leverage your executives to talk about your AI policies. Include refreshers in your company communications. Consider using "AI Moments" at meetings to reinforce concepts. Get creative.
I know this sounds like a lot. AI has the potential to make us more productive, but achieving the promise of AI will require investing in people and processes. Take it one step at a time and you will be amazed with what you can accomplish in a few months.
The Wrap-up
On February 20, 2025, the AI Pact, in collaboration with the EU AI Office and the European Commission, hosted its third webinar on AI literacy. The session aimed to help organisations better understand Article 4 of the EU AI Act, which came into effect on February 2, 2025. This provision mandates that providers and deployers of AI systems ensure a sufficient level of AI literacy among their staff and users, equipping them with the knowledge and skills necessary to responsibly develop, deploy, and interact with AI technologies.
One of the key announcements in the webinar was the introduction of a "living repository"—a resource compiling AI literacy practices from AI Pact members. This initiative is designed to foster learning and exchange by showcasing real-world AI literacy initiatives. However, the EU AI Office emphasised that merely adopting these practices does not guarantee compliance with Article 4; organisations must ensure their own approaches align with regulatory expectations.
The webinar featured interactive discussions where AI Pact members shared strategies for promoting AI literacy within their organisations. These discussions provided valuable insights into the practical implementation of literacy programs across different sectors. However, one crucial point remained unresolved—the precise definition of AI literacy in the context of Article 4. While the session provided examples of best practices, it did not explicitly clarify the terminology or establish a concrete framework for compliance.
As organisations work toward meeting AI literacy requirements, it remains essential to stay informed on further guidance from the EU AI Office. Future clarifications and evolving best practices will likely play a significant role in shaping AI literacy initiatives across the European Union.