Blog IODC 2016
Madrid. October 6-7, 2016

#IODC16

Global Open Data Standards Critical As We Move Forward

June 8, 2015 by IODC

A guest post from Lynne McAvoy.

At the 3rd International Open Data Conference 2015 (#iodc15), moderator José Alonso of the World Wide Web Foundation initiated the discussion on global data standards with a panel of open data supporters: Caroline Burle of the World Wide Web Consortium's Brazil office, Hudson Hollister of the Data Transparency Coalition, Chris Taggart of OpenCorporates, and Sarah Telford of the United Nations Office for the Coordination of Humanitarian Affairs (UN OCHA), with guest Michael Cañares of Step Up Consulting in the Philippines.

When asked what the phrase “global data standard” meant to the panelists, we heard that there are different areas in which standards are required. A global data standard would ensure a common semantic understanding across governments and countries, allowing us to effectively extract and use shared data while improving data quality. Without such a standard, there is no basis for comparing social indicators, from government financial transactions through to fair contracting practices or environmental assessments. Chris Taggart felt that the Open Contracting Data Standard is a first step in the right direction; by increasing clarity in the procurement process, it reduces the operational burden. Sarah Telford highlighted the importance of finding ways to communicate across countries, from lightweight standards such as the Humanitarian eXchange Language (HXL) through to ISO standards, which provide stable, internationally agreed-upon guidance. Caroline Burle felt that a global data standard should enable the use and sharing of data and, most critically, allow people to use the Web from anywhere.
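To make the "lightweight standard" idea concrete: HXL works by inserting a single row of agreed hashtags (such as `#adm1`, `#org`, `#affected`) beneath the human-readable column headers of an ordinary spreadsheet, so any HXL-aware tool can identify columns regardless of the header language. The sketch below is an assumed minimal example, not an official dataset; the place and organisation names are invented.

```python
import csv
import io

# Hypothetical HXL-tagged CSV: row 1 is the human-readable header,
# row 2 is the HXL hashtag row that machines key off.
raw = """Province,Organisation,People affected
#adm1,#org,#affected
Aceh,Example Relief Org,1200
"""

rows = list(csv.reader(io.StringIO(raw)))
headers, hxl_tags, data = rows[0], rows[1], rows[2:]

# Columns are located by hashtag, not by header wording or position.
column_for = {tag: i for i, tag in enumerate(hxl_tags)}
affected = data[0][column_for["#affected"]]
```

Because the hashtag row travels with the data, two agencies can keep their own header wording and still exchange files without a mapping exercise.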

Drivers for data standard creation include cultural change and top-down interest. In Jakarta, there are two kinds of response: for agencies lacking infrastructure, data standards provide a framework from which they can begin development; agencies with existing infrastructure can be resistant to adopting new standards unless it can be shown how the new standard will increase efficiency or provide a quick solution to an existing pain point. Chris Taggart mentioned that tools such as the Open Contracting Data Standard and the Legal Entity Identifier are critical as we move forward, and are a benefit to different levels of government in defining their data. An example of this is the location code PA: does this refer to Panama or Philadelphia? Sarah Telford stressed that expectations have changed. Previous data collection was done in Excel, using a cut-and-paste approach. Data collection methods must be nimble, and the time required to understand and use standards must be short. Caroline Burle has been promoting open data standards in Brazil since 2008, and has witnessed the rapid release of both open data and its metadata.

According to the panel, the best data standard models that have emerged over the past few years are those which exploit both top-down and bottom-up engagement, which increases the chances of adoption. A lack of data awareness and knowledge is seen as a challenge in Jakarta, where the first draft of a metadata standard, based on the World Wide Web Consortium’s Data Catalog Vocabulary (DCAT), was edited beyond recognition because of language issues. By moving away from the standardized terminology, the agency involved undermined the potential for interoperability and data sharing. The risk is that one agency can say “yea” or “nay” to the proposed model. We must engage users and encourage them to modify their behaviour when collecting, describing, and using data. Data standard models that work are led by knowledgeable people who have the time to dedicate to doing it right.

Chris Taggart mentioned several barriers to using data standard models: projects close; ISO standards must be paid for; examples provided in documentation often contain errors. This leads to the idea that standards exist, but are only available to some. What is needed is inclusiveness and iteration. He feels that “Open access is a fundamental requirement to what we are trying to accomplish, and the legacy approach will not work”. Sarah Telford stated that rapidly needed information requires a more nimble approach. The Humanitarian eXchange Language was born of necessity, but the reason it has been taken up so quickly is that a large organization, UN OCHA, is at the helm to promote it. “What makes a standard is that people adopt it and use it”.
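The Jakarta anecdote turns on a concrete point: DCAT's value lies in its standard property names, so renaming them (even for good linguistic reasons) breaks cross-catalog harvesting. A minimal sketch of a DCAT dataset record in JSON-LD follows; the dataset title and description are invented for illustration, while the `dcat:`/`dct:` property names and namespaces are the standard ones.

```python
import json

# Hand-written DCAT dataset description in JSON-LD. Labels shown to
# users can be translated freely; the property names below are what
# other catalogs harvest, so they must stay as the standard defines.
record = {
    "@context": {
        "dcat": "http://www.w3.org/ns/dcat#",
        "dct": "http://purl.org/dc/terms/",
    },
    "@type": "dcat:Dataset",
    "dct:title": "City procurement contracts 2015",  # illustrative
    "dct:description": "Awarded contracts published by a city agency.",
    "dcat:keyword": ["procurement", "contracts"],
}

serialized = json.dumps(record, indent=2)
```

A harvester looks for `dct:title`, not for a locally renamed variant, which is why the edited Jakarta draft lost its interoperability.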

When asked about a business model for maintaining a high-quality data standard, the panel agreed that it is not just about the money. Organizations need to change their culture and behaviours around data. Legacy systems and proprietary software are being set aside as we move to open data standards and open access tools: more nimble, agile models for handling data. The role of the Chief Information Officer is expanding to include the responsibilities of a Chief Data Officer, hopefully because organizations are recognizing the importance of having leadership in changing the information and data management culture.
