In a time when emerging digital tools (like artificial intelligence, but also digital twins, augmented and virtual reality tools, and business intelligence tools) are transforming how organizations work with data, understanding and improving data capacity has never been more important. This Data Capacity Workshop, hosted by Open North at the GOOD 2025 conference, brought together professionals from a wide range of sectors to explore what data capacity means, how it differs from data maturity, and the real-world challenges organizations face in managing and leveraging their data effectively.
As described in a recent blog, Open North has been working extensively with municipalities and nonprofits to provide capacity and maturity assessments. In this workshop, we wanted to present some of our insights and work with the participants to validate our thinking and discuss different approaches. The conversation was exceptionally rich, and in this blog we’d like to highlight some of the most crucial discussion points.
Understanding Data Capacity
In the workshop, we began by clarifying a crucial distinction in our thinking: the difference between data capacity and data maturity. Data maturity is defined by the UK’s Government Digital Service as an organization’s overall “capability, effectiveness, and readiness to use data.” For us, however, this definition poses challenges. An assessment of “(im)maturity” can be perceived as paternalistic, and it can also imply measurement against a universal, objective standard. In our work, we’ve found that talking about data capacity instead focuses on the specific skills, challenges, and opportunities an organization has. Capacity takes into account an organization’s unique context and needs. This distinction is important because it shifts the conversation from a rigid, one-size-fits-all assessment to a more flexible, context-aware approach.
For example, one municipality we worked with was quite small, with about 8,000 residents. Their central concern was modernizing their data storage solution. Measuring them against the capacity to effectively utilize predictive AI or a digital twin would set an unrealistic bar, and such an assessment would more likely demoralize staff than build support for a new data strategy.
Uneven data skills across teams
A second point raised in the workshop’s discussion highlighted the importance of assessing against customized levels of capacity. Participants noted that their organizations have uneven data literacy. While some employees have advanced data skills, others lack even a basic understanding of where data comes from or how it flows through the organization. Participants observed that most employees prefer to remain data consumers, using data provided to them, rather than actively contributing to data collection, cleaning, or analysis. One participant shared that they have pockets of expertise (people who know how to work with data) but very few who understand the full lifecycle of data in the organization. Most just use what’s given to them without questioning its origins or limitations.
A capacity assessment needs to account for the variable levels of necessary capacity (in this example a literacy capacity) across an organization. Not everyone needs to be an expert on the entire data lifecycle; some need only be skilled data consumers. Measuring everyone against the same level of maturity can be unhelpful and misleading in these cases.
New tools, or more training?
Another recurring theme across the workshop was skills capacity, particularly around using data tools. This is a perennial question for organizations: Are new tools needed, or do staff need to be trained to use the existing tools better? The solution to insufficient capacity isn’t always obvious. In the workshop, participants noted that organizations often invest in modern platforms like Power BI or Airtable but then face barriers such as
- restricted access (data locked in departmental silos),
- insufficient training (employees unaware of available tools or how to use them), and
- poor interoperability (disconnected systems that don’t communicate).
As one attendee remarked, buying new tools isn’t the solution; the real challenge is getting people access to the right data and teaching them how to use those tools effectively.
This discussion underscored the necessity of asking multiple, triangulating questions about capacity (e.g., on skills, tools, training, and data access) and deploying carefully worded capacity levels to uncover the real pain points and challenges. It is for this reason that Open North conducts its capacity assessments as qualitative interviews, rather than quantitative questionnaires. This enables us to go back to previous questions, dig into responses, and get at the underlying nature of the challenge.
Data is stored, but not always usable
Another constant question we encounter in our capacity work is around data storage. What is the appropriate storage architecture for an organization? What constitutes sufficient data accessibility? When asked whether their organization’s data was stored and accessible in a well-organized way, participants’ responses varied widely. Some participants highlighted cultural resistance—teams treating data as theirs rather than a shared resource. Others pointed to inconsistent standards, where certain datasets (like geospatial data) were well-documented and easy to find, while other datasets were scattered across drives and emails.
Beyond technical issues, many organizations struggle with legacy mindsets. Phrases like “this is how we’ve always done it” or “we can’t share that data” often reflect deeper cultural or political hurdles. In some cases, data governance is driven primarily by risk avoidance—legal or reputational concerns—rather than a strategic vision for how data could improve decision-making.
One data scientist shared her frustration that she was hired to analyze data, but she spent most of her time requesting access, navigating bureaucracy, and convincing multiple bosses to let her use the data they already had.
This indicated that on some capacity issues the responses can be extremely heterogeneous, differing by department, by unit, or even by data type! A capacity assessment needs to have the flexibility to capture nuance and granularity at this level. Otherwise it runs the risk of overgeneralizing.
Going forward with data capacity
This workshop showed that our approach to data capacity assessment makes intuitive sense, which is vital to an easy and effective process. The participants also highlighted the importance of thinking carefully about ambiguous wording, especially words used to denote levels of capacity (like “modern”). Open North will be integrating this feedback into our data capacity assessment tools. We are also working on a free, public-facing version that will let users take a quick test to see where they feel their organization stands, and to identify challenges that might require a more thorough capacity assessment to unravel.
If your municipality or organization is considering a digital project and has concerns around data capacity, don’t hesitate to reach out to us for an assessment. We offer services in the following areas:
- Data management: Supporting effective and strategic use and sharing of data across your organization.
- Cybersecurity: Supporting the confidentiality, integrity, and availability of your organization’s data.
- Privacy compliance: Establishing compliance with regulatory standards for data privacy.
- Data hub development: Supporting inter-organizational data sharing and collaboration through governance models and digital infrastructure.