
Data Governance Framework

Data governance is the system of accountability, decision rights, and organisational structures that determines how data is managed throughout its lifecycle. The framework defines who can make decisions about data, what decisions require approval, and how conflicting priorities between data producers and consumers are resolved. For mission-driven organisations operating across multiple programmes, locations, and funding sources, data governance provides the scaffolding that enables consistent data practices without requiring centralised control of every data activity.

Data Governance
The exercise of authority and control over the management of data assets, encompassing the people, processes, and technologies required to ensure data is fit for purpose.
Data Owner
The individual with accountability for a data domain, responsible for defining data requirements, approving access, and ensuring compliance with policies. Data owners hold decision-making authority but rarely perform operational tasks.
Data Steward
The individual responsible for operational management of data within a domain, implementing the data owner’s decisions, maintaining quality, and resolving day-to-day issues.
Data Custodian
The technical role responsible for the physical management of data, including storage, security controls, backup, and system administration. Data custodians implement technical controls but do not make business decisions about data use.
Data Domain
A logical grouping of related data assets managed under common ownership. Domains align with business functions such as beneficiary data, financial data, or programme data.
Decision Rights
The explicit assignment of authority to make specific types of decisions about data, including who can create, modify, delete, share, or grant access to data assets.

Governance operating models

The operating model determines where data governance authority resides and how governance activities distribute across the organisation. Three primary models exist, each with distinct characteristics that suit different organisational contexts.

A centralised model concentrates governance authority in a dedicated team, often called a data office or data management unit. This team defines all data standards, approves data sharing requests, manages the data catalogue, and resolves cross-functional data issues. The centralised model works well for organisations under 200 staff with relatively homogeneous operations, where a small team can maintain visibility across all data activities. The primary advantage is consistency: every data asset follows the same standards, uses the same classification scheme, and undergoes the same approval processes. The disadvantage is scalability. As organisations grow beyond 300 staff or operate across more than five countries, the central team becomes a bottleneck. Requests queue for approval, local needs receive inadequate attention, and the governance function develops a reputation for slowing operations rather than enabling them.

A federated model distributes governance authority to business units, country offices, or programme teams, with each unit managing its own data according to locally appropriate standards. A small central function provides coordination, maintains shared standards where necessary, and resolves cross-unit disputes. The federated model suits large organisations with autonomous operating units, particularly those where country offices operate under different regulatory regimes or serve distinct populations. Federation enables responsiveness: a country office can establish data practices suited to local context without waiting for headquarters approval. The risk is fragmentation. Without sufficient coordination, federated governance produces incompatible data formats, duplicate master data, and inconsistent quality levels that impede cross-programme reporting and organisational learning.

A hybrid model combines elements of both approaches, centralising certain governance functions while federating others. High-risk data domains such as beneficiary personally identifiable information or financial data receive centralised oversight, while programme operational data remains under local control. The hybrid model requires explicit definition of which functions centralise and which federate, typically documented in a governance charter. Most organisations with 200-1000 staff and operations in 3-15 countries find hybrid governance provides the best balance between consistency and responsiveness.

+------------------------------------------------------------------+
|                   GOVERNANCE OPERATING MODELS                    |
+------------------------------------------------------------------+

CENTRALISED MODEL
+------------------------------------------------------------------+
|                                                                  |
|                      +------------------+                        |
|                      |   Central Data   |                        |
|                      |    Governance    |                        |
|                      |       Team       |                        |
|                      +--------+---------+                        |
|                               |                                  |
|         +---------------------+---------------------+            |
|         |                     |                     |            |
|         v                     v                     v            |
|  +------+-------+     +-------+------+      +-------+------+     |
|  | Programme A  |     | Programme B  |      | Programme C  |     |
|  | (implements) |     | (implements) |      | (implements) |     |
|  +--------------+     +--------------+      +--------------+     |
|                                                                  |
+------------------------------------------------------------------+

FEDERATED MODEL
+------------------------------------------------------------------+
|                                                                  |
|                    +--------------------+                        |
|                    |    Coordination    |                        |
|                    |  Function (light)  |                        |
|                    +----------+---------+                        |
|                               |                                  |
|         +---------------------+---------------------+            |
|         |                     |                     |            |
|         v                     v                     v            |
|  +------+-------+     +-------+------+      +-------+------+     |
|  | Country A    |     | Country B    |      | Country C    |     |
|  | Governance   |     | Governance   |      | Governance   |     |
|  | (autonomous) |     | (autonomous) |      | (autonomous) |     |
|  +--------------+     +--------------+      +--------------+     |
|                                                                  |
+------------------------------------------------------------------+

HYBRID MODEL
+------------------------------------------------------------------+
|                                                                  |
|    +--------------------+          +--------------------+        |
|    | Centralised        |          | Federated          |        |
|    | Functions:         |          | Functions:         |        |
|    |                    |          |                    |        |
|    | - PII governance   |          | - Programme data   |        |
|    | - Financial data   |          | - Operational data |        |
|    | - Master data      |          | - Local reporting  |        |
|    | - Cross-org        |          | - Context-specific |        |
|    |   standards        |          |   classifications  |        |
|    +----------+---------+          +----------+---------+        |
|               |                               |                  |
|               +---------------+---------------+                  |
|                               |                                  |
|                               v                                  |
|                     +---------+----------+                       |
|                     | Governance Charter |                       |
|                     |  (defines split)   |                       |
|                     +--------------------+                       |
|                                                                  |
+------------------------------------------------------------------+

Figure 1: Governance operating model comparison showing authority distribution

The choice of operating model depends on four factors: organisational size, geographic distribution, regulatory complexity, and existing organisational culture. Organisations below 100 staff almost always benefit from centralised governance because the overhead of federated coordination exceeds the benefit. Organisations above 500 staff with operations in more than 10 countries rarely succeed with pure centralisation because the central function cannot maintain sufficient contextual knowledge. Between these thresholds, the choice depends on whether the organisation values consistency or responsiveness more highly, and whether existing culture supports autonomous decision-making or expects hierarchical approval.
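The size and geography thresholds above can be sketched as a small decision helper. This is an illustrative heuristic only, assuming the staff and country figures quoted in the text; the other two factors, regulatory complexity and organisational culture, are not modelled here and still require human judgement.

```python
def recommend_operating_model(staff_count: int, country_count: int) -> str:
    """Illustrative heuristic based on the size/geography thresholds above.

    Regulatory complexity and organisational culture are deliberately
    excluded: they cannot be reduced to two integers.
    """
    if staff_count < 100:
        # Federated coordination overhead exceeds its benefit at this size.
        return "centralised"
    if staff_count > 500 and country_count > 10:
        # A central function cannot maintain sufficient contextual knowledge.
        return "federated"
    # The middle ground: centralise high-risk domains, federate the rest.
    return "hybrid"
```

A 60-person single-country charity resolves to centralised governance; an 800-person organisation in 15 countries resolves to federated; the 200-1000 staff band falls to hybrid.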

Roles and accountability

Effective governance requires clear assignment of accountability, responsibility, consultation, and information rights across data-related activities. The three core roles of data owner, data steward, and data custodian form the foundation of governance accountability, but each role requires precise definition of scope and authority.

Data owners hold accountability for entire data domains. The owner of beneficiary data bears responsibility for ensuring that data collection meets consent requirements, that data quality supports programme decisions, that access controls align with protection obligations, and that retention complies with policy. Data owners do not perform these activities directly. Instead, they set requirements, approve exceptions, and accept risk for decisions within their domain. In most organisations, data owners are senior managers or directors with sufficient authority to make binding decisions about data use. An organisation with five data domains (beneficiary, programme, financial, human resources, donor relations) requires five data owners, though a single individual can own multiple domains in smaller organisations.

Data stewards perform operational governance activities within domains. A steward maintains the data catalogue entries for their domain, monitors quality metrics, processes access requests within established parameters, and escalates issues requiring owner decision. Stewards work with data day-to-day and develop deep knowledge of data content, quality issues, and user needs. The steward role requires 2-4 hours per week in organisations with modest data complexity, rising to full-time positions in large organisations with complex data environments. Most organisations appoint stewards from staff already working with domain data rather than creating dedicated steward positions.

Data custodians manage technical infrastructure supporting data assets. Database administrators, system administrators, and IT operations staff act as custodians for data within systems they manage. Custodians implement access controls defined by owners, execute backup and recovery procedures, and maintain system security. The custodian role differs from owner and steward roles in that custodians do not make decisions about data content or business use. A custodian implements the access matrix defined by the owner but does not determine who should have access.

+------------------------------------------------------------------+
|                   DATA GOVERNANCE RACI MATRIX                    |
+------------------------------------------------------------------+
|                                                                  |
| Activity                     | Owner | Steward | Custodian | User |
|-----------------------------+-------+---------+-----------+------|
| Define data requirements    |   A   |    R    |     C     |  I   |
| Set quality standards       |   A   |    R    |     I     |  C   |
| Classify data sensitivity   |   A   |    R    |     I     |  I   |
| Approve access requests     |   A   |    R    |     I     |  -   |
| Grant system access         |   I   |    C    |     R     |  -   |
| Monitor data quality        |   I   |    A    |     R     |  C   |
| Remediate quality issues    |   I   |    A    |     C     |  R   |
| Implement security controls |   A   |    C    |     R     |  -   |
| Execute backup/recovery     |   I   |    I    |     A     |  -   |
| Define retention periods    |   A   |    R    |     C     |  I   |
| Authorise data sharing      |   A   |    R    |     I     |  -   |
| Maintain catalogue entries  |   I   |    A    |     C     |  C   |
| Respond to subject requests |   A   |    R    |     C     |  I   |
|                                                                  |
| Legend: A=Accountable, R=Responsible, C=Consulted, I=Informed    |
+------------------------------------------------------------------+

Figure 2: RACI matrix showing accountability distribution across governance roles

The RACI distinction between accountable and responsible matters significantly in practice. An individual who is accountable cannot delegate that accountability, though they can delegate the work. When a data breach occurs in the beneficiary data domain, the data owner remains accountable even if a steward failed to implement a required control. This accountability structure ensures senior management engagement in data governance rather than relegating governance to an administrative function.

Role assignment should follow the principle of separation of duties where the same individual does not both approve and implement data decisions. The data owner who approves an access request should not also grant system access. This separation requires at minimum two individuals in the governance chain for any data domain, though small organisations may accept the risk of combined roles when staff capacity prevents separation.
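A separation-of-duties check can be expressed directly against the RACI matrix. The sketch below encodes the two relevant rows from Figure 2 and flags any individual who would both approve and implement the same decision; the named role holders are hypothetical.

```python
# Extract of the RACI matrix in Figure 2 (A=Accountable, R=Responsible,
# C=Consulted, I=Informed).
RACI = {
    "approve_access_requests": {"owner": "A", "steward": "R", "custodian": "I"},
    "grant_system_access":     {"owner": "I", "steward": "C", "custodian": "R"},
}

def separation_violations(raci, assignments, approve_activity, implement_activity):
    """Return individuals who would both approve and implement a decision."""
    approvers = {assignments[role]
                 for role, code in raci[approve_activity].items() if code == "A"}
    implementers = {assignments[role]
                    for role, code in raci[implement_activity].items() if code == "R"}
    return approvers & implementers

# Distinct role holders pass the check; a combined owner/custodian, as a
# small organisation might accept, is flagged.
ok = {"owner": "amina", "steward": "joseph", "custodian": "li"}
combined = {"owner": "amina", "steward": "joseph", "custodian": "amina"}
print(separation_violations(RACI, ok,
                            "approve_access_requests", "grant_system_access"))
print(separation_violations(RACI, combined,
                            "approve_access_requests", "grant_system_access"))
```

Running such a check against the full role register during the annual charter review makes accepted combined-role risks explicit rather than accidental.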

Decision rights matrix

Governance effectiveness depends on clear definition of which roles can make which decisions under which circumstances. A decision rights matrix codifies authority levels and escalation paths for common data decisions. Without explicit decision rights, staff either escalate every decision to senior management (creating bottlenecks) or make inconsistent decisions without authority (creating risk).

Decision categories in data governance span creation, modification, access, sharing, and deletion. Each category requires decisions at multiple levels of impact and risk. Creating a new field in a data collection form differs in risk from creating an entirely new data domain. Sharing data with an implementing partner differs from sharing with a government authority. The decision rights matrix assigns authority based on the combination of decision category and impact level.

Low-impact decisions should resolve at operational levels without escalation. A data steward can approve a colleague’s request to access aggregate programme data for internal reporting without consulting the data owner. The steward has delegated authority for routine access within established parameters. Medium-impact decisions require owner involvement. Granting access to personally identifiable beneficiary data requires explicit owner approval regardless of the requestor’s role. High-impact decisions escalate to governance bodies. Sharing beneficiary data with external parties requires data council review, and decisions affecting multiple domains require cross-domain coordination.

+---------------+---------------------+---------------------+---------------------+
| Decision      | Low impact          | Medium impact       | High impact         |
| category      |                     |                     |                     |
+---------------+---------------------+---------------------+---------------------+
| Data creation | Steward approves    | Owner approves new  | Council approves    |
|               | new fields within   | data collection     | new data domains    |
|               | existing structures | initiatives         |                     |
+---------------+---------------------+---------------------+---------------------+
| Data          | Steward approves    | Owner approves      | Council approves    |
| modification  | corrections and     | structural changes  | changes affecting   |
|               | updates             |                     | multiple domains    |
+---------------+---------------------+---------------------+---------------------+
| Access        | Steward approves    | Owner approves      | Council approves    |
| requests      | access to non-      | access to PII or    | access for          |
|               | sensitive aggregate | sensitive data      | external parties    |
|               | data                |                     |                     |
+---------------+---------------------+---------------------+---------------------+
| Data sharing  | Steward approves    | Owner approves      | Council approves    |
|               | internal            | sharing with        | sharing with        |
|               | operational sharing | implementing        | government or       |
|               |                     | partners            | donors              |
+---------------+---------------------+---------------------+---------------------+
| Data deletion | Steward approves    | Owner approves      | Council approves    |
|               | routine retention-  | early deletion or   | deletion of         |
|               | based deletion      | preservation        | legally-held data   |
+---------------+---------------------+---------------------+---------------------+

The matrix requires accompanying definitions of impact levels. Low impact involves data that is non-sensitive, aggregate, and internal to the organisation. Medium impact involves personally identifiable information, sensitive operational data, or data shared with known partners under agreement. High impact involves data about vulnerable populations, data shared with parties lacking data sharing agreements, or decisions with regulatory implications. These definitions should appear in the governance charter and inform training for all staff with governance responsibilities.
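The matrix and its impact definitions together form a lookup: classify the request, then resolve the approving authority. The sketch below is a simplification of the charter definitions above; the boolean risk signals and the subset of decision categories shown are illustrative, not charter terminology.

```python
# The decision rights matrix as a lookup table (three of the five
# categories shown; creation and modification follow the same pattern).
DECISION_RIGHTS = {
    ("access", "low"): "steward",   ("access", "medium"): "owner",
    ("access", "high"): "council",
    ("sharing", "low"): "steward",  ("sharing", "medium"): "owner",
    ("sharing", "high"): "council",
    ("deletion", "low"): "steward", ("deletion", "medium"): "owner",
    ("deletion", "high"): "council",
}

def classify_impact(contains_pii: bool, vulnerable_population: bool,
                    external_party: bool, has_sharing_agreement: bool) -> str:
    """Simplified impact classifier following the charter definitions above."""
    if vulnerable_population or (external_party and not has_sharing_agreement):
        return "high"
    if contains_pii or external_party:
        return "medium"
    return "low"

def required_approver(category: str, **risk_signals) -> str:
    return DECISION_RIGHTS[(category, classify_impact(**risk_signals))]

# Aggregate internal data: the steward can decide without escalation.
print(required_approver("access", contains_pii=False, vulnerable_population=False,
                        external_party=False, has_sharing_agreement=False))
```

Encoding the matrix this way keeps the escalation rule in one place, so a workflow tool and the written charter cannot drift apart silently.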

Governance bodies

Governance bodies provide forums for cross-functional decision-making, dispute resolution, and strategic direction. Two primary bodies serve most organisations: a data council for strategic governance and working groups for operational coordination.

The data council comprises data owners, senior IT leadership, and representatives from compliance or legal functions. The council meets monthly or quarterly depending on organisational pace, with authority to approve data policies, resolve cross-domain disputes, set governance priorities, and allocate governance resources. Council decisions bind the organisation and override individual domain owner decisions where domains conflict. A typical council agenda includes review of governance metrics, approval of policy changes, resolution of escalated issues, and discussion of emerging data challenges.

Council composition should include 5-9 members to enable substantive discussion while remaining manageable. Councils exceeding 12 members struggle to reach decisions and tend toward information sharing rather than governance. Councils below 5 members lack sufficient perspective diversity and risk capture by dominant functions. The council requires a chair with authority to set agenda, manage discussion, and ensure decisions reach implementation. Many organisations assign council chair responsibility to a chief data officer, head of information management, or senior operations director.

Working groups address operational governance needs within specific areas. A data quality working group brings stewards together to share remediation approaches and coordinate quality standards. An integration working group addresses data flows between systems. Working groups meet more frequently than council (weekly or fortnightly), focus on implementation rather than policy, and escalate issues requiring council decision. Working group membership draws from stewards and practitioners rather than owners and executives.

+------------------------------------------------------------------+
|                    GOVERNANCE BODY STRUCTURE                     |
+------------------------------------------------------------------+
|                                                                  |
|                      +------------------+                        |
|                      |   DATA COUNCIL   |                        |
|                      |                  |                        |
|                      | - Data owners    |                        |
|                      | - IT leadership  |                        |
|                      | - Compliance     |                        |
|                      |                  |                        |
|                      | Meets: Monthly   |                        |
|                      | Focus: Strategy, |                        |
|                      |   policy,        |                        |
|                      |   disputes       |                        |
|                      +--------+---------+                        |
|                               |                                  |
|                               | Escalation                       |
|                               | and reporting                    |
|                               |                                  |
|         +---------------------+---------------------+            |
|         |                     |                     |            |
|         v                     v                     v            |
|  +------+-------+     +-------+------+      +-------+------+     |
|  | Data Quality |     | Integration  |      | Protection   |     |
|  | Working Group|     | Working Group|      | Data Group   |     |
|  |              |     |              |      |              |     |
|  | - Stewards   |     | - Stewards   |      | - Protection |     |
|  | - Analysts   |     | - IT staff   |      |   staff      |     |
|  |              |     |              |      | - M&E staff  |     |
|  | Meets:       |     | Meets:       |      |              |     |
|  | Fortnightly  |     | Weekly       |      | Meets:       |     |
|  |              |     |              |      | Fortnightly  |     |
|  | Focus:       |     | Focus:       |      |              |     |
|  | Quality      |     | System       |      | Focus:       |     |
|  | standards,   |     | interfaces,  |      | Sensitive    |     |
|  | remediation  |     | data flows   |      | data issues  |     |
|  +--------------+     +--------------+      +--------------+     |
|                                                                  |
+------------------------------------------------------------------+

Figure 3: Governance body structure showing council and working group relationships

Meeting cadence and format significantly affect governance effectiveness. Councils that meet less often than quarterly lose momentum and accumulate unresolved issues; councils that meet more often than monthly consume excessive senior time on operational matters that should resolve elsewhere. Working groups that meet less often than monthly fail to maintain coordination. The chair of each body should publish the agenda five business days before each meeting and circulate decisions within two business days afterwards.

Policy hierarchy

Data governance operates within a broader policy framework that includes information security, data protection, records management, and organisational policy. Understanding how these policies relate prevents duplication and ensures consistent treatment of overlapping concerns.

The information security policy establishes controls for protecting data confidentiality, integrity, and availability. Security policy defines technical controls such as encryption requirements, access authentication, and monitoring. Data governance defers to security policy for technical protection measures while informing security policy about data sensitivity levels requiring protection. The data classification scheme represents a key integration point: data governance defines business-driven classification (public, internal, confidential, restricted) while security policy defines the controls required for each classification level.
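The classification integration point can be sketched as a simple mapping: governance assigns the business classification, and the security policy supplies the controls for each level. The control names and values below are illustrative placeholders, not drawn from any real policy.

```python
# Governance-defined classification levels mapped to security-policy
# controls. Both the control names and the values are assumptions for
# illustration; a real mapping comes from the security policy itself.
CLASSIFICATION_CONTROLS = {
    "public":       {"encryption_at_rest": False, "access_review_days": None},
    "internal":     {"encryption_at_rest": False, "access_review_days": 365},
    "confidential": {"encryption_at_rest": True,  "access_review_days": 180},
    "restricted":   {"encryption_at_rest": True,  "access_review_days": 90},
}

def required_controls(classification: str) -> dict:
    """Resolve the security controls implied by a business classification."""
    try:
        return CLASSIFICATION_CONTROLS[classification]
    except KeyError:
        raise ValueError(f"Unknown classification: {classification!r}") from None
```

Keeping the mapping in one table means a change to the security policy updates every downstream check, rather than being restated in the governance framework.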

The data protection policy addresses legal obligations for personal data processing, particularly under GDPR and similar regulations. Data protection policy defines lawful basis for processing, consent requirements, data subject rights, and breach notification procedures. Data governance implements data protection requirements by ensuring data owners understand their obligations, stewards monitor compliance, and processes exist to fulfil data subject requests. In some organisations, data governance and data protection combine under single leadership. In others, data protection sits within legal or compliance functions while data governance sits within IT or operations.

Records management addresses retention, preservation, and disposition of organisational records. Records policy defines retention schedules based on legal and operational requirements. Data governance ensures data systems implement retention policy by archiving or deleting data according to schedule. The governance function tracks which data assets fall under which retention requirements and monitors compliance with disposition schedules.
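Tracking disposition compliance reduces to computing, for each data asset, the date its retention period expires. The retention periods below are illustrative examples, not policy; real schedules come from the records management policy.

```python
from datetime import date

# Illustrative retention periods in years per domain (examples only).
RETENTION_YEARS = {"financial": 7, "programme": 5, "beneficiary": 3}

def disposition_due(domain: str, record_closed: date) -> date:
    """Date after which the record should be archived or deleted.

    Note: a record closed on 29 February would need special handling,
    since replace() cannot produce that date in a non-leap year.
    """
    years = RETENTION_YEARS[domain]
    return record_closed.replace(year=record_closed.year + years)

def overdue_for_disposition(domain: str, record_closed: date, today: date) -> bool:
    """Feeds the governance function's disposition compliance monitoring."""
    return today > disposition_due(domain, record_closed)
```

A steward can run this over the catalogue to surface records past their schedule and escalate early-deletion or preservation decisions to the owner.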

+------------------------------------------------------------------+
|                         POLICY HIERARCHY                         |
+------------------------------------------------------------------+
|                                                                  |
| +--------------------------------------------------------------+ |
| |                    ORGANISATIONAL POLICY                     | |
| |             Mission, values, strategic direction             | |
| +-------------------------------+------------------------------+ |
|                                 |                                |
|                                 v                                |
| +-------------------------------+------------------------------+ |
| |               INFORMATION GOVERNANCE POLICIES                | |
| |                                                              | |
| |   +-------------+     +-------------+     +-------------+    | |
| |   | Information |     |    Data     |     |   Records   |    | |
| |   |  Security   |     | Protection  |     | Management  |    | |
| |   |   Policy    |     |   Policy    |     |   Policy    |    | |
| |   +------+------+     +------+------+     +------+------+    | |
| |          |                   |                   |           | |
| +----------|-------------------|-------------------|-----------+ |
|            |                   |                   |             |
|            v                   v                   v             |
| +----------+-------------------+-------------------+-----------+ |
| |                  DATA GOVERNANCE FRAMEWORK                   | |
| |                                                              | |
| |  Implements security   Implements data      Implements      | |
| |  classification and    subject rights and   retention and   | |
| |  access controls       consent management   disposition     | |
| |                                                              | |
| +-------------------------------+------------------------------+ |
|                                 |                                |
|                                 v                                |
| +-------------------------------+------------------------------+ |
| |                    OPERATIONAL STANDARDS                     | |
| |                                                              | |
| |      +------------------+          +------------------+      | |
| |      |   Data Quality   |          |     Metadata     |      | |
| |      |    Standards     |          |    Standards     |      | |
| |      +------------------+          +------------------+      | |
| |                                                              | |
| |      +------------------+          +------------------+      | |
| |      |   Master Data    |          |   Integration    |      | |
| |      |    Standards     |          |    Standards     |      | |
| |      +------------------+          +------------------+      | |
| |                                                              | |
| +--------------------------------------------------------------+ |
|                                                                  |
+------------------------------------------------------------------+

Figure 4: Policy hierarchy showing relationship between governance framework and related policies

The governance framework should reference but not duplicate content from related policies. Where data classification appears in the security policy, the governance framework references the security policy classification scheme rather than restating classification levels. This approach prevents inconsistency when policies update and reduces maintenance burden. The governance charter should explicitly state which policy governs which domain and how conflicts between policies resolve.

Maturity model

Governance maturity describes how well governance structures, processes, and capabilities meet organisational needs. A maturity model provides vocabulary for assessing current state and planning improvement. Five maturity levels capture the progression from ad hoc data management to optimised governance.

Level 1 (Initial) describes organisations with no formal governance structure. Data management occurs informally within projects or teams. No documented roles, no decision rights, no quality standards. Data quality issues surface through operational problems. Sharing happens based on personal relationships. Organisations at Level 1 experience significant data inconsistency, unknown data quality, and inability to respond to regulatory inquiries about data handling.

Level 2 (Developing) describes organisations with emerging governance awareness. Some data domains have informal owners. Quality issues receive attention when they cause visible problems. Basic documentation exists for critical data assets. Sharing follows informal approval processes. Organisations at Level 2 can respond to immediate data needs but lack consistency across domains and struggle with cross-functional data use.

Level 3 (Defined) describes organisations with documented governance structures. All data domains have assigned owners and stewards. A governance charter defines roles, decision rights, and escalation paths. A data council meets regularly. Quality standards exist and receive periodic monitoring. Organisations at Level 3 have predictable data governance but may not measure effectiveness or systematically improve.

Level 4 (Managed) describes organisations with measured governance effectiveness. Key performance indicators track governance activities: access request turnaround time, quality issue resolution rate, policy compliance levels. The data council reviews metrics and adjusts governance approach based on evidence. Automation supports routine governance activities. Organisations at Level 4 demonstrate consistent governance and continuous improvement.

Level 5 (Optimising) describes organisations where governance enables strategic data use. Governance processes anticipate needs rather than reacting to problems. Advanced capabilities such as automated data lineage, real-time quality monitoring, and predictive analytics inform governance decisions. Governance investment ties directly to organisational outcomes. Few organisations reach Level 5 across all domains; most focus optimisation efforts on highest-value data assets.

+------------------------------------------------------------------+
|                    GOVERNANCE MATURITY MODEL                     |
+------------------------------------------------------------------+
|                                                                  |
|  Level 5: OPTIMISING                                             |
|  +------------------------------------------------------------+  |
|  | Governance enables strategy | Predictive | Outcome-linked  |  |
|  +------------------------------------------------------------+  |
|                                ^                                 |
|                                |                                 |
|  Level 4: MANAGED                                                |
|  +------------------------------------------------------------+  |
|  | Measured effectiveness | KPIs tracked | Evidence-based     |  |
|  +------------------------------------------------------------+  |
|                                ^                                 |
|                                |                                 |
|  Level 3: DEFINED                                                |
|  +------------------------------------------------------------+  |
|  | Documented structures | Charter exists | Council meets     |  |
|  +------------------------------------------------------------+  |
|                                ^                                 |
|                                |                                 |
|  Level 2: DEVELOPING                                             |
|  +------------------------------------------------------------+  |
|  | Emerging awareness | Some owners | Informal processes      |  |
|  +------------------------------------------------------------+  |
|                                ^                                 |
|                                |                                 |
|  Level 1: INITIAL                                                |
|  +------------------------------------------------------------+  |
|  | No formal governance | Ad hoc management | Reactive only   |  |
|  +------------------------------------------------------------+  |
|                                                                  |
+------------------------------------------------------------------+
|                                                                  |
|  Assessment dimensions:                                          |
|  - Roles: Are owners, stewards, custodians assigned?             |
|  - Processes: Are decision rights and escalation defined?        |
|  - Bodies: Do council and working groups meet?                   |
|  - Metrics: Are governance KPIs tracked?                         |
|  - Automation: Are routine activities automated?                 |
|  - Outcomes: Does governance demonstrably improve data use?      |
|                                                                  |
+------------------------------------------------------------------+

Figure 5: Governance maturity model with assessment dimensions

Maturity assessment should occur annually, covering each data domain separately. A single organisation often exhibits different maturity levels across domains: financial data may reach Level 4 due to audit requirements while programme data remains at Level 2. Assessment results inform governance improvement priorities, with effort focused on domains where maturity gaps create the greatest operational or compliance risk.

Target maturity levels depend on organisational context. Organisations under 50 staff rarely benefit from pursuing maturity beyond Level 3; the overhead of Level 4 measurement exceeds the benefit. Organisations subject to stringent regulatory requirements may need Level 4 maturity in affected domains regardless of size. Most mission-driven organisations should target Level 3 maturity across all domains with Level 4 maturity for domains containing personally identifiable information or financial data.
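A per-domain gap assessment can be sketched as follows. The scores and the rule that a domain's level is capped by its weakest dimension are illustrative assumptions; the targets follow the guidance above (Level 3 baseline, Level 4 where a domain holds PII or financial data).

```python
# Hypothetical annual assessment scores (1-5) per dimension, per domain.
ASSESSMENT = {
    "financial":   {"roles": 4, "processes": 4, "bodies": 4, "metrics": 4},
    "programme":   {"roles": 2, "processes": 2, "bodies": 3, "metrics": 1},
    "beneficiary": {"roles": 3, "processes": 3, "bodies": 3, "metrics": 2},
}

# Targets from the guidance above: Level 3 baseline, Level 4 for domains
# containing PII or financial data.
TARGETS = {"financial": 4, "programme": 3, "beneficiary": 4}

def maturity_gaps(assessment, targets):
    """Assume a domain's level is capped by its weakest dimension;
    report the shortfall against each domain's target level."""
    gaps = {}
    for domain, scores in assessment.items():
        level = min(scores.values())
        if level < targets[domain]:
            gaps[domain] = targets[domain] - level
    return gaps

print(maturity_gaps(ASSESSMENT, TARGETS))
# → {'programme': 2, 'beneficiary': 2}
```

The output illustrates the prioritisation described above: financial data already meets its audit-driven target, so improvement effort goes to the programme and beneficiary domains where the gaps create the greater risk.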

Implementation considerations

Governance implementation varies significantly based on organisational context. Approaches that work for headquarters-based charities fail in federated international organisations. Approaches suitable for organisations with dedicated data teams fail where data responsibilities distribute across programme staff.

For organisations with minimal dedicated IT or data capacity

Governance can begin without dedicated governance staff. The essential first step assigns data ownership to existing managers who already bear accountability for functions producing or consuming data. The programme director owns programme data. The finance manager owns financial data. The human resources manager owns personnel data. These individuals already make decisions about data within their functions; formal ownership recognition makes implicit accountability explicit.

Stewardship can combine with existing roles where staff already work with domain data. The M&E officer who maintains programme databases assumes stewardship responsibilities for programme data. The finance officer who manages the accounting system assumes stewardship for financial data. Adding 2-4 hours weekly of explicit governance activity to existing roles proves more sustainable than creating dedicated positions in small organisations.

A minimal governance charter fits on 2-3 pages, covering owner and steward assignments by domain, decision rights for common scenarios, and escalation to senior management for exceptional situations. The charter requires annual review and update. A data council can consist of the senior management team meeting with data governance as a standing agenda item rather than a separate body. These minimal structures achieve Level 2-3 maturity at low cost, providing sufficient governance for organisations handling moderate data volumes without significant protection data.

For organisations with established IT or data functions

Organisations with dedicated data or IT staff can implement more sophisticated governance structures. A dedicated data governance coordinator or data manager provides continuity, maintains governance documentation, tracks governance metrics, and supports data owners and stewards in fulfilling their responsibilities. This role requires 0.5-1.0 FTE depending on data complexity and operates as a facilitation function rather than a control function.

Formal data council establishment requires terms of reference defining membership, meeting cadence, decision authority, and relationship to other governance bodies such as IT steering committees or programme quality groups. Council effectiveness depends on active data owner participation; councils where owners routinely send delegates lose decision-making authority and become discussion forums rather than governance bodies.

Investment in governance tooling becomes worthwhile at this scale. A data catalogue provides visibility into data assets and their governance status. Workflow automation routes access requests to appropriate approvers and tracks resolution. Quality monitoring dashboards surface issues requiring steward attention. These tools reduce governance friction and enable the measurement required for Level 4 maturity.
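A workflow tool of the kind described above needs little more than a request record, a routing table, and a turnaround measure. The sketch below is a minimal illustration; the classification-to-role routing is an assumed mapping consistent with the decision rights described earlier, not a prescribed design.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone
from typing import Optional

# Assumed routing: classification level -> approving role.
ROUTES = {"internal": "steward", "confidential": "owner", "restricted": "council"}

@dataclass
class AccessRequest:
    """Minimal record a workflow tool might keep for each access request."""
    requester: str
    dataset: str
    classification: str
    submitted: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    resolved: Optional[datetime] = None

def route(request: AccessRequest) -> str:
    """Send the request to the role with approval authority."""
    return ROUTES[request.classification]

def turnaround_hours(request: AccessRequest) -> Optional[float]:
    """Feeds the Level 4 KPI: access request turnaround time."""
    if request.resolved is None:
        return None
    return (request.resolved - request.submitted).total_seconds() / 3600

req = AccessRequest("field officer", "beneficiary_register", "confidential")
assert route(req) == "owner"
req.resolved = req.submitted + timedelta(hours=6)
assert turnaround_hours(req) == 6.0
```

Even this much structure yields the measurement Level 4 maturity requires: every request carries its timestamps, so turnaround statistics fall out of the records rather than needing separate collection.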

For organisations with federated structures

Federated organisations require explicit governance principles that distinguish global standards from local discretion. Global standards should address minimum data protection requirements, cross-organisation data sharing protocols, enterprise master data definitions, and common classification schemes. Local discretion should address operational data collection methods, local system selection within approved categories, context-specific quality thresholds, and local reporting formats.

A global data council sets organisation-wide policy while country or regional councils govern local implementation. The global council includes representatives from major operating regions, ensuring federation principles reflect diverse operational contexts. Clear escalation paths address situations where local decisions conflict with global standards or where cross-unit issues require resolution.

Federated governance requires investment in interoperability: common metadata standards enabling cross-unit data discovery, shared identifiers for master data entities, and integration patterns supporting data flow between units. Without this investment, federation produces data silos that undermine organisational learning and cross-programme reporting.
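One way to picture the shared-identifier investment is a central registry that maps each unit's local identifier to a single master identifier, so the same entity registered by two units resolves to one record. This is a minimal sketch under assumed names; real master data management involves matching, survivorship, and stewardship workflows this omits:

```python
import uuid
from typing import Optional

class MasterIdRegistry:
    """Maps (unit, local_id) pairs to a shared master identifier."""

    def __init__(self):
        self._by_local = {}  # (unit, local_id) -> master_id

    def register(self, unit: str, local_id: str,
                 master_id: Optional[str] = None) -> str:
        key = (unit, local_id)
        if key in self._by_local:
            return self._by_local[key]
        # Link to an existing master record if one is supplied, else mint a new one.
        mid = master_id or uuid.uuid4().hex
        self._by_local[key] = mid
        return mid

    def resolve(self, unit: str, local_id: str) -> Optional[str]:
        return self._by_local.get((unit, local_id))
```

With such a registry, a second unit that encounters an already-known entity links its local record to the existing master identifier rather than creating a silo-bound duplicate.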

Common implementation failures

Three patterns predict governance implementation failure. First, treating governance as an IT initiative rather than a business function: governance requires business owner engagement to succeed, and IT cannot govern data it does not own. Second, creating governance structures without decision authority: councils that advise but cannot decide become irrelevant as staff route around them. Third, implementing governance without communication: staff unaware of governance structures continue previous practices, creating shadow data management that governance cannot address.

For organisations in high-risk operating contexts

Organisations operating in hostile environments, handling protection data, or working with vulnerable populations require governance structures that address elevated risk. Data owners for sensitive domains require training in protection principles and regular engagement with protection specialists. Decision rights for sensitive data should restrict approval authority to trained individuals and require documented justification for access grants.

Governance processes for high-risk data should include provisions for emergency data protection: procedures for rapid data deletion when security deteriorates, break-glass access for evacuations, and data minimisation reviews before entering high-risk areas. These provisions integrate with physical security protocols and business continuity planning.

Protection data governance often benefits from additional oversight. A protection data working group brings together protection specialists, M&E staff handling sensitive data, and IT security staff to address the intersection of technical controls and protection principles. This group operates under the data council but with a specific mandate for data presenting protection risks.

Governance metrics

Measuring governance effectiveness requires metrics that capture both activity levels and outcomes. Activity metrics track governance process execution: number of access requests processed, quality issues identified, catalogue entries maintained. Outcome metrics track governance impact: data quality levels, access request turnaround times, policy compliance rates. Effective governance reporting combines both metric types to demonstrate that governance activities produce intended outcomes.

Core governance metrics include access request resolution time (target: 3 business days for routine requests, 1 business day for urgent requests), data quality score by domain (target: domain-specific based on quality dimensions), catalogue completeness (target: 100% of data assets documented within 90 days of creation), and policy compliance assessment results (target: no critical findings in annual assessment). The data council should review these metrics quarterly and investigate adverse trends.
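The catalogue completeness target above translates directly into a computation: of the assets whose 90-day documentation window has elapsed, what share are documented? This sketch uses invented asset records to show the shape of the calculation; the names and dates are illustrative only:

```python
from datetime import date

# Illustrative asset records: creation date and catalogue documentation status.
assets = [
    {"name": "beneficiary_master", "created": date(2024, 1, 10), "documented": True},
    {"name": "grant_ledger",       "created": date(2024, 2, 1),  "documented": True},
    {"name": "survey_raw",         "created": date(2024, 5, 20), "documented": False},
]

def catalogue_completeness(assets, as_of, grace_days=90):
    """Share of assets past the grace period that are documented.
    Assets still within the 90-day window are excluded from the denominator."""
    due = [a for a in assets if (as_of - a["created"]).days > grace_days]
    if not due:
        return 1.0  # nothing is overdue yet
    return sum(a["documented"] for a in due) / len(due)

score = catalogue_completeness(assets, as_of=date(2024, 9, 1))
```

Here all three assets are past the grace period and one is undocumented, so the score is 2/3, below the 100% target, which is precisely the kind of adverse result the quarterly council review should investigate.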

Metric collection requires tooling investment at scale. Manual metric collection works for organisations with fewer than 20 data assets but becomes unsustainable beyond that threshold. Governance tools that automate access request workflows, quality monitoring, and catalogue maintenance generate metrics as a byproduct of operation. Organisations implementing governance tools should ensure metric reporting capabilities exist before deployment.

See also