As private and public bodies become digitally connected, we need data standards to allow technologies to be shared across systems and jurisdictions.
Canadians who rely on public transit know how uncomfortable it is to wait outside for a bus on the coldest day of winter. To avoid the cold, some are turning to transit apps that display real-time bus locations on their phones.
What people who use those apps might not know is that an array of legal, institutional and technical obstacles had to be overcome before that information reached their phones. It journeyed from the public transit agency that collected it to the app developer that repurposed it for display on a mobile interface. This seamless way of providing data, in a format that machines can read automatically, has made such apps extremely successful.
How do the data make that journey? Could the technology be replicated to improve access to all kinds of city data? And could the technology be put to use not just for innovative services but also to support more transparent and interactive governance?
Data standards can provide public data in a manner that allows them to be shared automatically across disparate systems and to be open and relatively easy to repurpose. When it comes to public transit information, the most commonly applied standard is the General Transit Feed Specification (GTFS). The standard was collaboratively developed by TriMet, the transit agency serving the Portland, Oregon region, which provided the data, and Google, which developed the script that exported the data into an open format so it could be repurposed. The initiative arose partly from a civil servant's conviction that transit directions should be as easy to find as driving directions. Thanks to the relative simplicity of the format, hundreds of jurisdictions worldwide have adopted the standard and now provide raw data about transit schedules openly online.
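Part of that simplicity is that a GTFS feed is a set of plain CSV text files with agreed-upon column names, so any developer can read one with a few lines of code. A minimal sketch in Python: the column names (stop_id, stop_name, stop_lat, stop_lon) come from the GTFS specification's stops.txt file, but the stop records themselves are invented for illustration.

```python
import csv
import io

# A tiny excerpt in the shape of a GTFS "stops.txt" file.
# Field names follow the GTFS spec; the data rows are made up.
STOPS_TXT = """stop_id,stop_name,stop_lat,stop_lon
1001,Main St & 1st Ave,45.5231,-122.6765
1002,Main St & 5th Ave,45.5240,-122.6810
"""

def parse_stops(text):
    """Read GTFS stop records into a list of dicts with typed coordinates."""
    stops = []
    for row in csv.DictReader(io.StringIO(text)):
        stops.append({
            "stop_id": row["stop_id"],
            "stop_name": row["stop_name"],
            "lat": float(row["stop_lat"]),
            "lon": float(row["stop_lon"]),
        })
    return stops

stops = parse_stops(STOPS_TXT)
print(stops[0]["stop_name"])  # prints: Main St & 1st Ave
```

Because every GTFS-publishing agency uses these same file and column names, the same parser works unchanged on feeds from hundreds of cities, which is exactly the interoperability a shared standard buys.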
Since the launch of GTFS in 2005, similar specifications have emerged for other types of government-derived information (such as service requests, budgets and traffic incidents), each intended for adoption across jurisdictions as a standard. When multiple municipalities adopt the same standards for their data, those data suddenly become comparable and discoverable. Tools that have been useful in one jurisdiction can then be scaled up across many. For instance, Yelp displays food inspection data in multiple cities with the help of LIVES, an open data standard for restaurant inspection data.
To help more cities standardize their open data, Geothink, a Canadian geospatial and open data research partnership, and the Center for Government Excellence, affiliated with Johns Hopkins University, have collaborated to release the first-ever Open Data Standards Directory. The site aims to help cities publish open data by providing a systematic approach to assessing standards against a set of metrics. The directory communicates a wide array of information about each standard, including its background characteristics (name, publisher, publisher reputation, etc.), its ability to make certain types of data interoperable, the degree to which it is open to use and transparent to others, its maintenance and development over time, and how it specifies the terms of use for both the standard and the data set. This set of metrics will inform the judgments that help providers of open data decide whether to adopt or reject a standard.
Governments at the international, national, regional and civic levels are increasingly opening their high-value data via online catalogues that publish data sets under open licences and in machine-readable formats. In Canada, Ontario and the City of Edmonton have adopted the International Open Data Charter. The charter provides an aspirational set of principles for releasing open data, including the idea that they be comparable and interoperable. This goal aids not only the publication of open data but also the coordination necessary to provide the data in ways that are useful. Yet making data truly open is no easy task. There are different approaches to tackling interoperability, and standardizing data to ensure interoperability brings an array of complex coordination and technical challenges.
News about Alphabet Inc.'s Sidewalk Labs in Toronto has brought to the forefront a debate over what role the private sector should play as local governments promote embedding, in public spaces, technologies (such as sensors and cameras) that collect and act on data. While proponents of the project are voicing their excitement about the urban project and its technological solutions, some are more critical of it and are drafting a list of crucial questions for project administrators and the city related to project governance, data access and data governance, public engagement, inclusivity, privacy law and the technology's hard infrastructure. As both private and public organizations work to become more digitally connected and more data driven, it is critical to pause and reflect on who develops and maintains these technologies and to consider their purpose and proposed value.
Technologies that act on and automate public data provided by the government and used to govern should be developed in consultation with many types of stakeholders. The release of the Open Data Standards Directory (which I helped to create) makes this approach a reality. The aim is to educate people more broadly about what data standards are and how they can help data remain accessible to cities and their citizens. Otherwise, you may find yourself left out and waiting for your bus in the cold.
Originally posted on Policy Options