Practical Steps to Align Your Organisation with the EU AI Act
15 January 2025 // 4 Min Read
As the EU AI Act (EU Regulation 2024/1689) sets new standards for the responsible development and use of AI, organisations need to take proactive steps to ensure they’re fully compliant. Whether you’re already using AI systems or planning to adopt them, it’s crucial to understand your obligations under the Act.
In our previous post, we explored the penalties for non-compliance, the governance system, and how the Act extends its reach beyond the EU’s borders.
Below are some practical steps to help you navigate the requirements and align your organisation with the EU AI Act.
1. Develop an AI Policy
Start by creating or updating your organisation’s AI policy. This document should clearly set out the prohibited uses of AI as defined by the EU AI Act, such as deploying AI systems that manipulate behaviour, exploit vulnerabilities, or fall within the Act’s other prohibited practices (for example, social scoring).
Top Tip: Even if your organisation isn’t likely to intentionally engage in these activities, the broad definitions of the Act mean it’s essential to include these prohibitions in your policy. This not only helps you stay compliant but also shows your commitment to responsible AI use.
2. Establish AI Governance
Make sure you have the right person or team in place to oversee compliance with the AI Act. This could be your Data Protection Officer (DPO) or someone with the right technical expertise and seniority to manage high-risk AI systems.
Good governance ensures your organisation:
Monitors AI system performance
Effectively manages risks
Communicates clearly with stakeholders and regulators
3. Review AI Procurement Practices
If you’re commissioning or deploying high-risk AI systems, take extra care during procurement:
Ensure everyone involved understands the system’s risk level and the legal implications.
Carry out thorough due diligence before engaging any provider.
Be mindful of any customisation requests for third-party AI systems, as these might trigger additional compliance obligations.
4. Update AI Contracts
Your contracts should clearly outline the roles and responsibilities of all parties in the AI supply chain to ensure compliance with the Act. Key points to address include:
Allowing for potential regulatory changes over the term of the contract.
Setting expectations for data governance, transparency, and system design.
Top Tip: Check out the EU Commission’s model clauses for public procurement of AI systems. These can serve as a good starting point, even for private organisations, and include provisions around compliance. The Society for Computers and Law is also working on guidance and sample clauses to support businesses.
5. Review and Update Privacy Notices
Make sure your organisation’s privacy notices clearly explain how AI systems process personal data. Being transparent with employees, customers, and stakeholders helps build trust and ensures compliance with both the EU AI Act and GDPR.
6. Invest in AI Literacy
From 2 February 2025, providers and deployers of AI systems must meet the Act’s AI literacy obligations. This means:
Training staff and anyone else who operates or interacts with AI systems on your behalf.
Ensuring they have a “sufficient and appropriate level” of understanding about the risks, limitations, and proper use of AI.
By prioritising training, your team will be better equipped to operate AI systems safely and effectively, reducing the risk of non-compliance.
Conclusion: Take Action Now
Implementing these steps early will not only get your organisation ready for the EU AI Act but also position you as a leader in ethical and responsible AI deployment. With the regulation’s phased implementation already in motion, getting ahead of the game is key to avoiding penalties and maintaining trust in your AI systems.