In this thought leadership article, Steven Roberts CDir, Vice Chair of the Compliance Institute's Data Protection & Information Security Working Group and Group Head of Marketing at Griffith College, outlines the key data protection legislative developments that directors need to understand. Steven is also a Chartered Director, Certified Data Protection Officer and Fellow of the Chartered Institute of Marketing. His forthcoming book on Data Protection for Business is due for publication by Clarus Press in 2026.
Data is a major driver of the Irish and international economies in the twenty-first century. It presents boards with challenges and opportunities as they balance conformance and performance. The latest example is the rapid growth of artificial intelligence (AI) technologies over the past twenty-four months, which is top of mind for many directors. Large language models (LLMs) such as ChatGPT, Claude and Perplexity offer potential gains in productivity and efficiency, alongside compliance risks in how these systems use and process personal data.
Accompanying this technological growth is a suite of new and recent legislation, as regulators look to provide guardrails and governance around these systems. The EU has been at the forefront of this regulatory activity, introducing laws in areas such as artificial intelligence, digital markets, data governance and cybersecurity as part of its Digital Decade strategy. In this short article, we will look at some of the key data protection issues that should be on board agendas in 2025.
The GDPR captured significant attention in 2018 with its potential for fines of up to €20 million or 4% of global turnover, whichever is greater. This was problematic for boards seeking to establish a risk profile, as the legislation was new and largely untested. Greater clarity has emerged over the past seven years. It is clear that much regulatory attention is focused on technology companies that have large-scale processing of personal data at the core of their business models. Fines for this sector have been eye-watering, most notably the €1.2 billion penalty imposed on Meta Platforms Ireland Limited by the Data Protection Commission (DPC) in 2023. This trend has continued in 2025, with the DPC announcing a fine of €530 million for TikTok in May.
The situation is somewhat different outside of this sector, and particularly for SMEs. The DPC has issued penalties to a wide range of Irish firms across various sectors. However, the levels of such fines have been far lower and commensurate with the scale of these businesses. In 2023, for example, the Dublin Circuit Court confirmed fines for five companies, with penalties ranging from €15,000 to €750,000.
Case law continues to evolve with regard to non-material damages. Recent High Court decisions indicate there may be a role for the Injuries Resolution Board, while compensation awards have tended to be modest.
EU supervisory authorities have criticised the boards of companies in breach of the GDPR. In 2024 the Dutch authority, in penalising Clearview AI, advised it was investigating whether directors of that company could be held personally liable. In Ireland, a recent High Court judgement found a director personally liable for a breach of the Data Protection Acts 1988 and 2003. Boards should monitor developments in this area closely, as it could indicate a shift in the focus of EU regulators and has the potential to significantly impact individual directors.
Irish companies must devote time to dealing with the range of new legislation passed at EU level. Much of this burden is falling on compliance, data protection and legal teams. It presents a number of challenges, including effectively resourcing these departments to avoid burnout, interpreting the impact of overlapping laws (for example, the GDPR and the AI Act) and ensuring adequate training both for specialists and across the company more generally. For SMEs, reliant on a handful of key managers, the support of third-party experts may play a crucial role. Marketing teams should also note the EU's recent decision to pause development of the proposed ePrivacy Regulation.
According to CSO data, more than 15% of all Irish enterprises used AI in 2024; for larger firms with more than 250 staff, the figure was 51%. There is an inherent friction between the principles of data privacy and the processing required to implement large-scale AI technologies, and this has drawn the attention of EU supervisory authorities. Italy fined OpenAI €15 million for using personal data to train ChatGPT without properly informing users and in the absence of stringent age verification measures, whilst Ireland's DPC announced an inquiry into the Grok LLM in April.

The EU's AI Act, which came into force in 2024, provides rules and safeguards around the use of AI technologies within the EU. Compliance and legal teams will be at the forefront in navigating areas of adjacency and overlap between the AI Act and the GDPR. Tools such as Data Protection Impact Assessments (DPIAs) will be vital in ensuring data protection risks and considerations are identified at the outset of any new project involving the processing of personal data. AI technologies are complex, lack transparency and carry increased risks around automated decision-making. Companies without clear AI policies should introduce them as a priority, enhancing and iterating them over time.
The GDPR had a ripple effect across the globe, with many countries introducing their own data privacy laws. At the same time, there is a clear tectonic shift in how the USA, and to a lesser extent the UK, are approaching regulation when compared to the European Union. The British and US governments are exploring ways to reduce the regulatory burden on industry, particularly regarding the growth potential of AI technologies. Company boards should keep a watchful eye on developments with the UK's Data (Use and Access) Act 2025. The UK's adequacy decision will be reviewed by the EU before the end of the year; any substantial divergence of UK privacy law from the GDPR may threaten renewal, which would significantly increase the data protection burden for firms trading into the UK.
Regular reviews of a board's skillsets and competencies help ensure it remains fit for purpose. The importance and complexity of new technology, alongside growing security and privacy risks in the areas of cyberattacks and data protection, suggest it is timely for boards to stress-test their current membership. Boards should give serious consideration to bringing in professionals with experience in areas such as digital transformation, information technology, compliance and cybersecurity.
Irish firms will continue to operate within a fast-changing regulatory environment. The ongoing maturity of the GDPR, new privacy laws in international markets, an evolving geopolitical landscape, and the introduction of adjacent legislation present challenges for boards and their management teams. Alongside this, the rapid growth of artificial intelligence technologies provides organisations with the opportunity for productivity and efficiency gains whilst also presenting significant data protection and compliance risks. Boards must adopt a flexible and adaptive approach, balancing performance and compliance. Directors with a growth mindset, both for their own development and that of the boards they serve on, will ensure their skillsets remain relevant.
This article is the view of the author and does not necessarily reflect IoD Ireland’s policy or position.