The EU AI Act, which came into force on 1 August 2024, aims to foster the responsible development and use of artificial intelligence (AI). The legislation is intended to ensure that AI systems used in the EU are safe, transparent, traceable, non-discriminatory and environmentally friendly—and overseen by people rather than by automation.
The Act establishes a model for the regulation of AI and may set a baseline for global regulation, much as the General Data Protection Regulation (GDPR) has done for personal data protection.
Though the regulation may be seen as a compliance challenge, the AI Act aims to instil consumer confidence through reliability, transparency and fairness, and to act as a catalyst for innovation in AI adoption.
The AI Act applies to private organisations and public authorities and, according to Article 2, affects five main groups of stakeholders: providers, deployers, importers and distributors, product manufacturers, and affected persons. The Act expressly does not apply to private individuals using AI systems for purely personal purposes. There are further exemptions for systems used solely for scientific research and development, and for military, defence and national security purposes.
Organisations falling under the EU AI Act are expected to comply with its requirements by August 2026. For some obligations, however, such as the ban on prohibited practices and the duty to ensure staff AI literacy, the deadline is significantly shorter and applies from February 2025. Non-compliance carries significant financial penalties: fines of up to €35 million or up to 7% of total worldwide annual turnover, whichever is higher. PwC’s approach, however, focuses not on mere compliance but on using the Act as a tool to drive business transformation.
Our interdisciplinary Risk and Legal Teams are able to support your organisation throughout the AI Act compliance process, including in the following areas:
Algorithm selection and design: Assessing the suitability and alignment of algorithms with business objectives, ensuring ethical and responsible AI design practices.
Data collection: Verifying the relevance, sufficiency and representativeness of collected data to support robust model development and reduce biases.
Data quality and preprocessing: Ensuring data accuracy, completeness and consistency through rigorous preprocessing techniques, while identifying and mitigating potential data issues.
Model training and tuning: Evaluating the effectiveness of training processes and hyperparameter optimisation, and ensuring that models generalise well across diverse scenarios.
Model validation and performance metrics assessment: Testing model accuracy, robustness, sensitivity and specificity against real-world conditions, using industry-standard and bespoke performance metrics.
Monitoring and continuous improvement: Reviewing model deployment processes, scalability and integration into business workflows, with a focus on operational resilience and governance of Machine Learning Operations (MLOps).
Explainability and transparency: Assessing the interpretability of AI models to enhance trust and understanding among stakeholders, facilitating compliance with regulatory requirements.
Risk mitigation and bias detection: Identifying and mitigating risks, including algorithmic bias, to promote fairness and equity in AI outcomes (illustrated in the sketch after this list).
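To make the validation and bias-detection work above more concrete, the following minimal Python sketch shows the kind of checks involved: standard performance metrics (accuracy, sensitivity, specificity) alongside a simple demographic parity gap between two groups. The data, group labels and metric choices are hypothetical illustrations only, not requirements of the AI Act or a description of PwC's methodology.

```python
# Illustrative sketch: basic validation and fairness checks for a
# binary classifier. All values below are hypothetical placeholders.
import numpy as np
from sklearn.metrics import accuracy_score, confusion_matrix

# Hypothetical ground truth, predictions and a protected attribute
# (0/1 group membership) for a small evaluation set.
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0])
y_pred = np.array([1, 0, 1, 0, 0, 1, 1, 0, 1, 0])
group  = np.array([0, 0, 1, 1, 0, 1, 0, 1, 1, 0])

# Standard performance metrics (model validation).
accuracy = accuracy_score(y_true, y_pred)
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)   # true positive rate
specificity = tn / (tn + fp)   # true negative rate

# A simple bias indicator: the demographic parity gap, i.e. the
# difference in positive prediction rates between the two groups.
rate_g0 = y_pred[group == 0].mean()
rate_g1 = y_pred[group == 1].mean()
parity_gap = abs(rate_g0 - rate_g1)

print(f"accuracy={accuracy:.2f}, sensitivity={sensitivity:.2f}, "
      f"specificity={specificity:.2f}, parity_gap={parity_gap:.2f}")
```

In practice such checks would run on real validation data, use fairness metrics appropriate to the use case and be embedded in the monitoring and MLOps governance processes described above.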
We understand that the AI Act should be considered alongside other relevant regulations and standards, and we are therefore able to capture synergies between them and address compliance in a comprehensive way.
Compliance with the AI Act is a key element of responsible AI practice, which should be implemented and followed by every organisation that seeks to build trust in AI. Our teams are able to support your organisation across each of these elements.
Angel Bangachev
Director, Legal Business Solutions | Attorney-at-Law, PwC Bulgaria
Tel: +359 894 333 237