Shadow AI Audit | How to Use it 

The Shadow AI Audit helps organisations identify where employees are already using AI tools informally, understand the associated governance and data risks, and translate those insights into clear actions for executive management and meaningful oversight for the board. 

It provides a structured way for leadership teams to move from uncertainty to informed governance in an area that is already shaping how work gets done across organisations. The inventory covers all AI types - not just generative AI - and produces outputs suitable for review by the board or other governing bodies.

For executive directors, the audit offers a practical framework to organise and manage emerging AI use across the organisation.

For non-executive directors, it provides visibility into AI adoption, associated risks, and the adequacy of management’s governance response.

This toolkit includes:

Discovery Survey Template (Word)
Executive management can use the Discovery Survey template (a 14-question, editable, print-ready Word document) to gather insights from department heads and a representative sample of employees. The survey explores current AI tool usage, potential data exposure, and unmet needs within teams. Position the survey as a listening exercise rather than a compliance review; honest responses depend on psychological safety. The template also includes suggested framing guidance and a thank-you note for participants.

For boards and non-executive directors, the survey provides a structured starting point to understand how AI tools are already entering the organisation and whether management has visibility over their use.

Risk Matrix and Inventory Tracker (Excel Workbook)
Management teams can consolidate survey responses and review each identified AI tool using the risk matrix provided in the Excel workbook. Each tool is assessed against four criteria:

  • data sensitivity
  • external data transmission
  • vendor governance
  • level of user dependency

For every tool in the inventory, management assigns one of four actions: Block, Formalise, Adopt, or Monitor, alongside a named owner and a clear timeline. For non-executive directors, the resulting inventory provides a clear governance artefact that enables effective oversight of AI-related risk, including how management is prioritising mitigation and accountability.
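The inventory structure described above can be sketched in code. This is an illustrative sketch only: the Excel workbook's actual scoring rules are not specified in the toolkit, so the 1-3 scale, the threshold values, and the suggestion logic below are assumptions for demonstration; in the real process, management assigns the action.

```python
from dataclasses import dataclass, field

# The four assessment criteria and four actions named in the toolkit.
CRITERIA = ("data_sensitivity", "external_transmission",
            "vendor_governance", "user_dependency")
ACTIONS = ("Block", "Formalise", "Adopt", "Monitor")

@dataclass
class ToolEntry:
    """One row of the inventory: tool, risk scores, action, owner, timeline."""
    name: str
    scores: dict = field(default_factory=dict)  # criterion -> 1 (low) to 3 (high); assumed scale
    action: str = ""                            # one of ACTIONS, assigned by management
    owner: str = ""                             # named owner
    timeline: str = ""                          # e.g. "Q3"

    def risk_total(self) -> int:
        return sum(self.scores[c] for c in CRITERIA)

def suggest_action(entry: ToolEntry) -> str:
    """Hypothetical default suggestion from total risk; thresholds are assumptions."""
    total = entry.risk_total()
    if total >= 10:
        return "Block"
    if total >= 7:
        return "Formalise"
    if total >= 5:
        return "Monitor"
    return "Adopt"

tool = ToolEntry("Example chatbot (free tier)",
                 scores={"data_sensitivity": 3, "external_transmission": 3,
                         "vendor_governance": 2, "user_dependency": 2})
tool.action = suggest_action(tool)  # total risk 10 -> "Block"
```

A spreadsheet serves the same purpose; the value of the structure is that every tool carries a risk score, a single action, a named owner, and a timeline.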

Summary Report Template (Word Template)
Executive management can use the summary report template to present the audit findings to the leadership team, board, or governing body. The report provides a clear overview of the AI tool inventory, key risks, regulatory considerations, and recommended actions. A built-in measurement framework allows progress to be tracked over time.

For boards and non-executive directors, the report provides a concise basis for governance discussions around AI risk, compliance obligations (including the EU AI Act and GDPR), and the organisation’s broader approach to responsible AI adoption. Where appropriate, management may also communicate outcomes internally and highlight when previously informal tools are formally adopted. This helps close the feedback loop, reinforces that the exercise is about responsible governance rather than punishment, and builds trust for future initiatives.

How to Measure Success

Key indicators should be tracked across both leading and lagging measures to monitor progress. 

  • Over the first 30 days, leading indicators include the survey completion rate by department, the number of distinct tools identified, and the percentage of departments engaged in the discovery process. 
  • On a quarterly basis, lagging indicators should measure the reduction in shadow AI instances, the percentage of identified tools that have been formalised or officially adopted, the number of data exposure incidents (with a target of zero), and the process improvements identified through user intent analysis.
  • These metrics can be summarised through a reporting statement such as: “Shadow AI inventory reduced from X to Y; Z tools formalised; N process improvements identified from user intent analysis.”
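The reporting statement above can be generated directly from the tracked metrics. A minimal sketch follows; the function name and parameters are assumptions, and the figures shown are placeholders, not targets.

```python
def reporting_statement(baseline: int, current: int,
                        formalised: int, improvements: int) -> str:
    """Fill the toolkit's quarterly reporting template from the tracked metrics."""
    return (f"Shadow AI inventory reduced from {baseline} to {current}; "
            f"{formalised} tools formalised; "
            f"{improvements} process improvements identified from user intent analysis.")

# Placeholder figures for illustration only.
print(reporting_statement(baseline=42, current=15, formalised=9, improvements=6))
```

Keeping the statement mechanical ensures each quarter's report is directly comparable with the last.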

Embedding the Discovery Process

  • Position the exercise as "we want to understand and support", not "we're catching you out". Frame the discovery survey as a listening exercise; honest participation depends on it. People using shadow AI are mostly trying to do their jobs better.
  • Involve IT, department heads, and a sample of actual frontline users. Don't run this as a purely top-down exercise - the intelligence comes from the people using the tools. 
  • Counter resistance by committing upfront that Block is only one of four outcomes - and that Adopt and Formalise are real options for tools that are genuinely valuable.
  • Identify a quick win: pick one shadow tool that is clearly valuable and formalise it during the exercise. Publicise this. It demonstrates that the organisation listens, rewards honesty, and closes the loop on the initial exercise.

Disclaimer
The AI Governance Toolkit is provided by the Institute of Directors (IoD) Ireland for general informational purposes only. It is intended as a practical guide to support members. The toolkit does not constitute legal, regulatory or professional advice. IoD Ireland accepts no liability for any loss, damage or consequence arising from the use of, or reliance on, this material.