Francisco Morcillo is an advisor on urban and territorial innovation in the smart city field, an international lecturer, author of the blog 2TI #Smartcity and guest blogger in numerous national and international publications, and a mentor on business opportunities in smart cities. Founder and CEO of MB3-GESTION since 2005, he leads a line of work specialized in territorial innovation focused on the smart city field: defining ecosystems and business and management models for cities under sustainability criteria, combining the implementation of ICT solutions with urban and territorial planning, centered on the citizen and framed by social innovation.

The new urban citizen is not merely a recipient of public services: “citizens must feel the city through its services.” The city of the future is not simply one that deploys computer technology or counts on “clever apps that offer services to citizens.” It can all be simpler than that: the smart city concept, understood as an improvement in citizens’ quality of life through technology, is “about useful and easy stuff,” not about technological applications that invade our everyday life. For that reason, the key lies in creating an ecosystem in the city in which the layers of intelligence can be identified, its resources are known, and the value chain (the aforementioned quality of life) is defined, while also enabling business opportunities.

Cities must have a strategy, but “cities should not be the ones being hyper-connected, which is why I prefer the perspective and the strategy of understanding the hyper-connected citizen in order to give value to the smart city.” From the analysis of cities and their citizens, the possible business patterns should consider:

1.- The city and its data. The generation of data, driven by growing connectivity between objects and supported by predictive and analytical information technologies, becomes a useful tool for decision-making in the public sector and an additional source of information.

2.- Citizens and data. The connected citizen becomes an administrator of information who, through technology, both draws on and provides useful and relevant information for the city. Note that this information can also be useful for companies, which is where the importance of digital transformation in business lies.

3.- New habits. A connected city and citizen facilitate new personal and social relationships, as well as new solutions in areas such as education, health, sports and security. These evolving habits call for innovative new services to be implemented in the urban environment, with the objective of achieving the highest efficiency in the provision of public services.

4.- Data analysis of the city and its citizens, and the search for new trends, as a tool for the city’s digital transformation.

As stated above, getting to know the smart city, and hence its business models, must begin with an understanding of its citizens and of emerging trends. Knowing a city’s strategic technology plans is not enough; we must also know the new ways in which society relates, connects and participates, and the resources it has. Together these configure a new model, a new strategy we must reflect on if we want to advance the design process of the smart city, “their smart city.” This can be summarized in six key aspects:

1.- Ubiquity. At this stage in the development of the internet, the citizen’s new need is to be served as a mobile user across varied contexts and environments, rather than through a focus on devices. This suggests that it is not enough to design new technological solutions with assorted opportunities; the process must also move toward design based on user experience and citizens’ overall needs. Knowing profiles, demand, places of connection and digital environments is a major challenge in bringing technology closer to the real service.

2.- Self-service, supported by the information and data available in the cloud, as the new orientation of citizens’ demand. Leaving everything to the vagaries of servers and personal information held out of reach is no longer justifiable. Real time, the need for an immediate response, and self-administration as the foundation for improving production systems and public services have become a reality.

3.- The Internet of Things, or the Internet of Everything (IoE) as some technology companies call it. Beyond the citizen’s regular connection, we find the opportunity to connect devices, generating a digital environment that can yield great options or solve apparently unsolvable problems. Properly administering this data, managing and operating it, and providing options for generating economic efficiency are new approaches that should prevail in the design of the global management of the city.

4.- The redesign of future industries and designers will generate a process of creating Fab Labs and Urban Labs that enable the testing and “hacking” of the city. This is not just about supplying technological laboratories, nor does the future have to lie in opening a number of collaborative spaces with the latest technology. We need to test products and take on public and private challenges.

5.- Citizens’ intelligence and collective intelligence will force us to understand each environment and respond accordingly. The citizen does not want a battery of services that provide a few domestic solutions, or a series of options nobody asked for. To that end it is necessary to create a system that allows the context and collective needs to be understood, and then to respond. And not just from time to time: the information and data generated should act in an evolutionary, cumulative way, adapting to collective demand and generating further information tailored to the user, rather than a mere automation of data.

6.- The opening of data as a strategy, and not only for transparency, as open data sometimes seems to be oriented. This is an essential key to responding to demand in an evolutionary way. Systems, networks, data storage and data centers must give the citizen options that can be modified over time to adapt to current needs. In sum, development and operations must be coordinated to promote the fast, gradual and continuous development of applications and services.




Cover photo by Jonathan Pielmayer


September 16, 2016 Saskia Beer

Saskia Beer is a Dutch entrepreneur and founder of ZO!City (formerly known as Glamourmanifest) and TransformCity®. She trained as an architect and worked for renowned Dutch and Japanese offices. In 2010 she decided to thoroughly redefine her role by initiating unsolicited local projects to make the city more attractive, inclusive and resilient. In 2011 Glamourmanifest adopted, unsolicited, Amstel3, a 250 ha office district in Amsterdam with a 30% vacancy rate. After the municipality had to withdraw from its top-down redevelopment plans, Glamourmanifest built a multi-stakeholder network and support base around the area’s transformation. The size and impact of the project evolved incrementally, and since May 2016 it has formed the test bed for the smart participatory urban planning dashboard TransformCity®. TransformCity® won second prize (civic engagement) in Le Monde’s International Smart Cities Innovation Awards 2016, and is a best practice in the Citizen City Action Cluster of the European Innovation Partnership on Smart Cities and Communities (EIP-SCC). As a leading urban pioneer, Saskia regularly speaks to both students and professionals and is actively involved in the international discourse on new strategies and technologies for urban planning. She has lectured at various universities and served on several urban award juries.

Online citizen engagement is hot. We keep reading about it in funding calls, smart city websites, mission statements, research reports and governmental programs. And it is climbing the agenda for a reason: the success of many urban programs and smart city solutions depends on a change in behavior, or on active input and support from a critical mass. Both governments and service providers are becoming aware that they need the active engagement of citizens for the successful adoption of their solutions and programs. That means citizens need to be approached as active and equal partners rather than mere consumers.

As the founder of a startup building platforms for citizen engagement and participatory urban planning, I could not be happier, of course. We just launched the first beta pilot of our online participatory urban planning dashboard TransformCity® in the Amstel3 office district in Amsterdam. The dashboard integrates storytelling, data sharing, co-creation, participatory democracy, crowdsourcing and crowdfunding. Citizens, businesses, organizations and the government can now directly exchange information and ideas and collectively plan, change and own their city or neighborhood. The online map shows all relevant objects, such as buildings, roads, parks and stations, as clickable objects that hold basic and in-depth thematic information drawn from both open and user-generated datasets. Apart from the current situation, the timeline also features recent developments and future plans and scenarios. The full interactivity of the dashboard makes it easy to engage and respond directly to plans and projects. Everybody can share their own ideas and initiatives for the area, get feedback from other people and from the official institutions, and test the local base of support. With the underlying crowdfunding infrastructure, project-based alliances can be formed and resources combined. The rewards for this active engagement range from improvements in the daily environment to actual shares in local projects and developments.
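
To make the description above concrete, here is a minimal sketch of how map objects like buildings and parks might be modeled as clickable GeoJSON-style features whose properties carry thematic data and user-generated ideas. The identifiers, coordinates and values are invented for illustration, not taken from TransformCity®.

```python
# Hypothetical sketch: city objects as GeoJSON-style features. Properties
# hold both open data (e.g. vacancy rate) and user-generated input (ideas).

def make_feature(object_id, kind, lon, lat, **properties):
    """Wrap one city object as a GeoJSON point feature."""
    return {
        "type": "Feature",
        "id": object_id,
        "geometry": {"type": "Point", "coordinates": [lon, lat]},
        "properties": {"kind": kind, **properties},
    }

# An invented mini-map of an office district: one vacant building, one park
# with a citizen-proposed project attached to it.
area_map = {
    "type": "FeatureCollection",
    "features": [
        make_feature("b-01", "building", 4.94, 52.33, vacancy_rate=0.30),
        make_feature("p-01", "park", 4.95, 52.33, ideas=["public pavilion"]),
    ],
}
```

A front-end map library can render such a collection directly and open a detail panel when a feature is clicked, which is all “clickable objects” requires at the data level.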

The idea for this dashboard arose a few years ago. We were already active as independent agents in Amstel3, experimenting with new forms of community-driven urban transformation after the local government had to withdraw from its top-down planning approach due to the financial crisis. Under the name Glamourmanifest we had been building a local network of different stakeholders through an ongoing mix of guerrilla events, campaigns and workshops, using glamorous storytelling with a consistent set of metaphors, like champagne and high heels. We had gained trust and built a substantial database of local contacts, requirements, ambitions, priorities and existing initiatives. We also encountered a growing number of people with good ideas and a willingness to contribute to their daily environment. There was still capital available for Amstel3, but it was scattered among all these different parties, disguised by different interests and priorities, and held in different forms, like money, time, material and knowledge. What if we could unlock all this and connect it directly on a transparent and interactive platform dedicated to Amstel3? Could real projects then be formulated, executed and maintained by the community itself, without us being the linchpin, and thereby possibly our own weakest link?

Since May of this year we have been testing, improving and further developing the dashboard live in Amstel3. The reactions are very enthusiastic, and we are about to complete our first crowdfunding campaign, for transforming a muddy social trail next to the station into a temporary park with a public pavilion. We have also received many requests from other cities worldwide and are preparing to scale and customize the dashboard to different local contexts.

However, the most important lesson we learned is that our offline community building and co-creation activities need to be intensified now that we have the dashboard. For a new solution that is not yet part of people’s own routines to have maximum reach and impact, it needs constant activation and effort to stay relevant and worth people’s time and attention.

A clever mix of online and offline activities and tools will get the most sustainable results. So while we thought the days of popping champagne bottles were behind us now that we had all this smart technology, we actually just decided to order two extra pallets for this new phase. It’s a dirty job, but somebody has to do it.


August 23, 2016 Esteban Mucientes

Esteban Mucientes is financial executive officer at Cuchillo, an online marketing consultant and president of AERCO-PSM. He has collaborated with various Spanish public administrations on communication adaptation strategies, and has provided training in digital marketing to a number of public and private organizations. He is also a member of the teaching staff of the Master’s Degree in Digital Business at the Business School of the Official Chamber of Commerce and Industry of Valladolid and of the Master’s Degree BrandOffOn at ESDEN Business School.

Today, the Internet is everything.

After this statement we could just shut down the computer while nodding our heads. But no: data are everything. That is a fact. Without data, there would simply be no Internet.

It was that curiosity that drove Tim Berners-Lee to create the Web an eternity ago. Turn the lights off and play the video:

Why did I use this clip? Not just to show how easy it is to understand the motivation behind the web, but also because, brilliant as the idea was, linked data is not expressed clearly enough for such an extraordinary idea to reach a general audience.

Reaching the general audience is hard. And we have to face many obstacles.

The first problem is that, despite having great storytellers such as Tim Berners-Lee, the world of open data is still seen as something for nerds and not-very-sexy geeks (we saw this back in the day), when it actually underpins the development of democracy in many countries where that seems impossible. And yes, when I say storytellers I mean names like Carl Sagan or Neil deGrasse Tyson: people with the capacity not only to make their message heard but also to become memes:

Sure, we know Tim Berners-Lee has a certain capacity to become a meme, but he is not as sexy as Neil deGrasse Tyson:


However, the problem is not just the story being told, or who tells it. There is a brutal accessibility problem: we do not explain things in plain language. There is still a language gap that many people cannot easily bridge. In many cases this is encouraged by a class differentiation we do not notice but that, in the end, amounts to a more-than-evident kind of contempt. We are advocates of novelty, yet we get upset when other perspectives appear. I am sure that on many occasions you have heard the expression ‘meh, it’s made with WordPress’ or the handy ‘no, I only use open source because (add here some random argument concerning anti-capitalism, cost-opportunity or any other trifle).’

And, finally, a global perspective on what open data really is, is more than necessary. Open data eventually needs to mean the same in Mongolia as in Spain or Dubai, and to have the same end: to open up our societies, make governments more transparent and let citizens know exactly what their resources are being used for, not to mention a series of longed-for collateral effects: equality of opportunity, wider scientific progress, the development of new niche businesses…

Let’s spread the word as it deserves.


Cover photo by Charlie Foster


August 11, 2016 Jean-Marc Lazard

Jean-Marc Lazard is the co-founder and CEO of OpenDataSoft, a turnkey SaaS platform designed for the smart and easy transformation of all types of data into innovative services (APIs, data visualization, real-time monitoring). He was previously Head of Innovation and Strategic Projects at Exalead (Dassault Systèmes Group) for 5 years, and led innovative IT projects in the food-processing industry and in retail for 9 years. Having launched and mentored several startup projects, he is also an expert at Cap Digital, the French business cluster for digital content and services. A graduate of EDHEC Business School, Jean-Marc also has a degree in Applied Mathematics.

This is a summary of a five-part series that looks at strategies cities have used to align and advance their smart city and open data goals. You can read the complete report online in a five-week series being published on (link), or download the complete report as a free PDF at

Five Strategies for Success

Over the past decade, open public data has come to be recognized as an important means of improving government transparency and accountability and deepening citizen engagement. But open data has also proven itself to be an important tool for developing innovative solutions that advance core quality of life, sustainability and economic development goals.

This is especially true today as cities seek significant improvements to essential tasks such as water, waste, energy and transportation management, with many of the highest impact solutions using information and communication technology (ICT) to optimize and automate services.

While at first the high-volume, real-time sensor data generated by such ‘smart city’ solutions was used by cities to meet internal analytical and operational needs only, more and more cities are seeking to make this data publicly available.

Illustrative list of smart city solutions from the Ministry of Urban Development, Government of India, June 2015

They are finding that providing open access to machine data is essential for engaging ecosystem partners, businesses, civic technologists and internal staff in the effort to find innovative solutions to today’s urban challenges.

In the course of our work with cities over the past five years, we have compiled notes about what works and what doesn’t as they seek to foster innovation through open sensor data.

In particular, there are five strategies we have seen our clients and others use to succeed in aligning their smart city and open data goals in the era of intelligent and connected cities. The strategies are summarized here. A detailed discussion of each strategy with case studies can be downloaded at no cost from the OpenDataSoft website at

Strategy 1: Sometimes the Smartest Tech is Low-Tech

When exploring ways to extract value from open sensor data, don’t overlook the invaluable role inexpensive, low-tech options can play in advancing smart city goals. In many cities across the globe, especially developing cities where budgets are constrained and population growth is high, ‘low-tech’ collaborative technologies like mobile phones, social media, online platforms and low-cost sensor kits offer affordable alternative ways to collect data, use resources more efficiently and make better decisions, while empowering citizens to play a key role in shaping the future of their cities. For example:

  • In Maputo, Mozambique, a World Bank-sponsored waste management project uses crowdsourcing via mobile phones to gather input from citizens and waste collectors about trash removal issues.
  • In Jakarta, Indonesia, a real-time map of flooding in the city has been created by crowdsourcing flood reports from Twitter. Twitter is further being used by Jakarta residents to organize shared car journeys.
  • In Paris, France and Reykjavik, Iceland, citizens are using online platforms to propose, discuss and vote on ideas for improving the city, like the vertical garden project in Paris.
  • In Beijing and other Chinese cities, residents are starting to use low-cost sensors such as the PiMi Airbox to measure and map air pollution in their city.
By Jean-Pierre Dalbéra, Paris, France. Vertical garden (Musée du Quai Branly), CC BY 2.0,
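
The Jakarta example above can be sketched in a few lines: filter incoming messages with a naive keyword test, then bin the geotagged reports onto a coarse map grid. The keywords, coordinates and grid size below are illustrative assumptions, not the actual system used in Jakarta.

```python
# Illustrative sketch of crowdsourced flood mapping from short messages.
FLOOD_KEYWORDS = {"flood", "banjir"}  # "banjir" is Indonesian for flood

def is_flood_report(text: str) -> bool:
    """Naive keyword filter; a real system would also verify location and photos."""
    words = text.lower().split()
    return any(k in words for k in FLOOD_KEYWORDS)

def grid_cell(lat: float, lon: float, size: float = 0.05):
    """Snap a coordinate to a grid cell of `size` degrees."""
    return (round(lat / size) * size, round(lon / size) * size)

def build_flood_map(messages):
    """messages: iterable of (text, lat, lon); returns {cell: report_count}."""
    cells = {}
    for text, lat, lon in messages:
        if is_flood_report(text):
            cell = grid_cell(lat, lon)
            cells[cell] = cells.get(cell, 0) + 1
    return cells

# Fabricated sample input: two flood reports near the same area, one irrelevant.
messages = [
    ("Banjir near the station again", -6.21, 106.85),
    ("Lovely sunny day", -6.21, 106.85),
    ("Street flood on my commute", -6.22, 106.84),
]
flood_map = build_flood_map(messages)
```

Cells with many independent reports can then be drawn on a live map, which is the essence of the crowdsourced real-time flood map described above.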

Strategy 2: Go Small Before You Go Big

While many smart city goals can be accomplished in a timely and inexpensive manner by leveraging existing low-tech, collaborative technologies, when the right path is to invest in new equipment and systems, it’s a good idea to use pilot projects to go small before you go big.

Using a pilot approach is a great way to assess the feasibility, potential impact and return-on-investment (ROI) of a large-scale project before a full roll out. In addition, it is a strategy that can be very useful in working through issues of special importance in the smart city context, including governance issues like privacy and data security and ownership, and strategies for animating communities of civic technologists and start-ups to accelerate innovation.

Strategy 3: Collaborate, Collaborate, Collaborate

Defining a smart city vision and priorities is most likely to succeed when it is done in collaboration with residents and community groups. “Openness” in all forms is fast becoming a mandate, not an option.  Open data sharing and collaboration with civic tech communities and ecosystem partners is also essential for driving smart city success.

Enabling civic technology and business communities to join in smart city innovation extends the capabilities of what a city acting alone could do or fund, which is an enormous advantage given the budget constraints of cities everywhere. Encouraging city ecosystem partners in areas like transportation, energy and utilities to open their data as well can greatly accelerate the pace and impact of smart city improvements.

Strategy 4: Treat Your Sensor Data Like a Valuable Asset: It Is!

Sometimes, cities have difficulty persuading city ecosystem partners to open their data, even if it is the city that is paying the bill for the systems that generate the data. This is why more cities are beginning to incorporate openness as a mandate in all new city contracts, even if they don’t ultimately use all the data produced.

This shift is happening as cities increasingly realize that data is a very real and valuable asset. As this awareness grows, more cities are beginning to explore monetization options for their data, such as freemium plans that offer basic access at no cost but charge a fee for premium, high-volume access. This can be invaluable in covering the cost of providing reliable access to streaming data for heavy users while keeping access free for civic technologists and start-ups who develop citizen-centric applications that support local economies, improve citizens’ lives, and help everyone make more efficient use of public services. It can also provide a much-needed revenue stream to help cash-strapped cities fund smart city initiatives or meet basic budgetary needs.
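
The freemium model described above can be sketched as a simple usage meter: calls within a free quota cost nothing, and overage is billed per request. The quota and price below are invented for illustration.

```python
# Hypothetical sketch of freemium API metering for open city data.
class FreemiumMeter:
    def __init__(self, free_quota=1000, price_per_call=0.002):
        self.free_quota = free_quota        # free API calls per billing period
        self.price_per_call = price_per_call  # charge for each call beyond quota
        self.calls = 0

    def record_call(self):
        """Count one API request against this account."""
        self.calls += 1

    def invoice(self):
        """Cost for the period: only calls beyond the free quota are billed."""
        billable = max(0, self.calls - self.free_quota)
        return round(billable * self.price_per_call, 2)

# A heavy user: 1500 calls, of which 500 exceed the free quota.
meter = FreemiumMeter(free_quota=1000, price_per_call=0.002)
for _ in range(1500):
    meter.record_call()
```

Civic technologists who stay under the quota pay nothing, while high-volume commercial users fund the streaming infrastructure, which is exactly the balance the paragraph above describes.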

Strategy 5: Attend to the Tech Must-Haves

While there is much technology that can be sifted into nice-to-have and maybe-someday categories without a negative impact on smart city advancement, there are a few core technologies needed in order to extract value from smart city data.

One is an open data management platform that can provide data access to citizens, researchers, developers, city staff and city ecosystem partners (who should also provide access to their data to these same communities). Ideally, this platform should be natively designed to handle real-time data streams.

Second, cities need to be able to offer this access through robust, standards-based Application Programming Interfaces (APIs). APIs are software code interfaces that allow software applications to exchange data and services. In the context of smart cities, they enable a secure, reliable connection to continuously updated data for developers who want to build web or mobile applications, for researchers or analysts who want to plug city data into existing applications such as business intelligence software, and for external government agencies or ecosystem partners who want to integrate a city’s data.
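
As a hedged illustration of what consuming such an API might look like, the snippet below parses a made-up JSON payload of the kind a city bike-share endpoint could return. The endpoint shape, field names and values are all assumptions, not a real service.

```python
import json

# Invented example of a smart-city API response (JSON over HTTP in a real
# system; inlined here so the sketch is self-contained).
SAMPLE_RESPONSE = """
{
  "records": [
    {"station": "Central", "bikes_available": 7,
     "timestamp": "2016-08-11T09:00:00Z"},
    {"station": "Harbour", "bikes_available": 0,
     "timestamp": "2016-08-11T09:00:00Z"}
  ]
}
"""

def empty_stations(payload: str):
    """Return the names of stations with no bikes left."""
    data = json.loads(payload)
    return [r["station"] for r in data["records"] if r["bikes_available"] == 0]
```

A developer building a mobile app, an analyst plugging city data into BI software, or a partner agency would all consume the same interface, which is why a stable, documented API matters more than any single application.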

Finally, cities must have easy data visualization and dashboard-building tools that even non-technical citizens can use. Visualization in the form of charts, graphs and maps is essential for helping human beings make sense of all kinds of data, and it is absolutely a must-have for the otherwise overwhelming big data collections of the type produced by real-time sensors.
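
The point about visualization can be illustrated with the simplest possible chart: aggregated readings rendered as text bars. A real dashboard would use a charting library, but the step from raw numbers to a human-readable picture is the same; the traffic figures below are invented.

```python
# Minimal sketch: turn {label: value} readings into scaled ASCII bars.
def ascii_bars(readings, width=20):
    """readings: {label: value}; returns one bar line per label, scaled so
    the largest value fills `width` characters."""
    peak = max(readings.values()) or 1  # avoid division by zero if all zeros
    lines = []
    for label, value in readings.items():
        bar = "#" * round(width * value / peak)
        lines.append(f"{label:>5} | {bar} {value}")
    return lines

# Fabricated hourly traffic counts from a road sensor.
hourly_traffic = {"08:00": 420, "09:00": 610, "10:00": 305}
chart = ascii_bars(hourly_traffic)
```

Even this crude picture makes the morning peak obvious at a glance, which a table of raw sensor rows does not.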

We hope these lessons learned will be useful to you as you seek to align your Open Data and Smart City strategies, and to explore options for ensuring that citizens and application developers have ultra-simple access to all the useful data your city produces. This includes the sensor data upon which many of the most engaging and transformative web and mobile-based applications will be built.

There is no doubt that high-tech digital transformation can have enormous impact in helping cities meet the environmental, social and economic challenges of population growth in a world of increasingly strained natural resources and a changing climate. However, even with the most technologically sophisticated solutions, success depends on making residents true partners in defining what ‘smart’ means for their community, and enabling their participation in shaping their city to fulfill that vision. And that means a smart city is first and foremost, an open city.

The complete version of this whitepaper with case studies can be downloaded at no cost from the OpenDataSoft website at


July 19, 2016 Reyes Montiel

Reyes Montiel is an expert in public affairs who has worked on the formulation, management and evaluation of public policies. She has also participated in policy advocacy for public decision-making and in negotiation projects at national and international level. A trained journalist, she currently develops 2.0 communications plans for public and private companies and entities. She works on strategies for economic mobilization and citizen participation that make sense of open data services. A transparency activist, she participates in research projects on the causes of corruption and the evaluation of action plans against it, as well as in specific education plans.

Read the first part of the article here.

Don’t do it on your own; bring your friends along

Accumulated experience in collaboration between public administrations and different social actors has yielded several positive lessons. The Open Data Charter suggests introducing this kind of practice in different ways: through public-private partnerships, through programs and initiatives that promote the development of datasets, visualizations, applications and other tools, by involving the educational system, by using open data as a research tool, and by incorporating open data literacy programs. In sum, the publication of open data by the different public administrations should be the cornerstone upon which we build a system of participation and improvement in society.

Open data initiatives are not only platforms for opening and sharing one’s own data. They must also be channels for dialogue with other entities and institutions in the private and social spheres that own data. One of the ways of fostering PSI reuse is connecting with other data and other experiences, completing not only our dialogue but also our analysis.

An example of this collaboration is Metrolab in the United States, a network of more than 20 cities partnered with local universities and research institutes to work on smart city projects. Under this partnership, Rice University collaborated with Houston on a better distribution of cycling points, the University of Washington helped Seattle install sensors to predict residents’ energy demand, and MIT launched SENSEable in Cambridge, whose first application predicts disease outbreaks based on the analysis of waste water.

Importance of common standards and default opening

Robert Behn, professor at the Harvard Kennedy School, affirms that “one of the key aspects of an efficient organization, though it is not paid enough attention, is the common definition of keywords, as they can have different meanings for different people and, if a common definition is not established, collaboration can founder due to poor communication.” Any piece of information can be built in several ways, but its value depends on a shared understanding. This also applies to data, which is why standards are crucial in any open data policy. Data standards create a common structure that facilitates sharing information and cooperation between organizations.
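
A toy example of what a shared data standard buys you: if two organizations agree on field names and types, each can validate records before exchanging them. The field specification below is invented for illustration.

```python
# Hypothetical shared standard for an air-quality record: agreed field
# names and types that every participating organization validates against.
STANDARD = {
    "station_id": str,
    "pm25": float,       # fine particulate matter, micrograms per m3
    "recorded_at": str,  # ISO 8601 timestamp
}

def validate(record, standard=STANDARD):
    """Return a list of problems; an empty list means the record conforms."""
    problems = []
    for field, expected in standard.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], expected):
            problems.append(f"{field}: expected {expected.__name__}")
    return problems

good = {"station_id": "A-12", "pm25": 34.5,
        "recorded_at": "2016-07-19T10:00:00Z"}
bad = {"station_id": "A-12", "pm25": "high"}  # wrong type, missing timestamp
```

With a check like this at every organizational boundary, “pm25” cannot silently mean different things to different partners, which is precisely the communication failure Behn warns about.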

For that reason, one of the fundamental principles inspiring the International Open Data Charter is open by default. According to this principle, data held by government bodies should be free for citizens to access, use and reuse without any kind of restriction, while safeguarding their right to privacy. To develop properly, the open-by-default principle demands a commitment to adopting standards and policies for the creation, reuse, exchange and harmonization of open data.

One of the most important examples is the French Licence Ouverte (Open License), created to give legal support to the publication of open government data free of charge. This license was designed through a public consultation process with the participation of both the government and civil society.

Specific professionals in organizations

We have been hearing for a while that all those who handle, manage and use data will be key professionals of the future; that data reuse will provide new economic opportunities and will therefore be an important job-creating sector. A new managerial profile is making its way into organizations: the Chief Data Officer, or CDO, a data role of the kind Harvard Business Review called the “sexiest job of the 21st century.” The title suggests a technological profile dedicated by definition to managing information: data collection, storage and distribution. However, what must be expected from a CDO is the capacity to use data to devise feasible proposals and recommendations, as healthy data makes for good public policies. According to DJ Patil, the US Government’s Chief Data Scientist since February 2015, “we need experts in public policy, law, social services, environment, etc., and only then should we bring in experts in technology. The longer they work together, the more efficient they will be.”

Cover photo by Samuel Zeller


July 12, 2016 José Luis Marín

Holding both a Telecommunications Engineering degree and a Business Administration degree from the University of Valladolid (Spain), Mr. Marín has developed his professional career at Gateway S.C.S. (owner of the brand “EUROALERT.NET“), where he is currently a partner, a member of the management board and CEO. In this position, Mr. Marín supervises the huge challenge undertaken by Euroalert: building a pan-European platform to aggregate EU public procurement data, which represents about 18% of EU GDP, and delivering commercial services for SMEs and organizations all over the world.

He is the author of the book “Web 2.0. Una descripción muy sencilla de los cambios que estamos viviendo”, published by Netbiblo (2010), and co-author of “Open Data: Reutilización de la información pública” (2013). A strong supporter of open source software, innovation, and the spread of free and open knowledge, Mr. Marín participates in many initiatives, projects and events related to the Open Data movement. He has spoken at events such as FICOD09, PSI Meeting 2010, the Digital Agenda Assembly, Share PSI and SICARM, and at universities including Oviedo, Almería and Girona, always with the objective of promoting the release of public sector data in open and reusable formats.

Almost all governments worldwide and multilateral institutions such as the OECD, the UN, the World Bank or the European Commission began their open data policies with the release of the statistical datasets they produce. Because of that, we have a large number of indicators we can work with, in reasonably accessible formats, to study almost any issue, whether environmental, social, economic or a combination of all these aspects.
Besides providing us with the datasets, in some cases they have created tools to access the data easily (APIs), and even applications that help us work with the indicators (visualizations).
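As a hedged sketch of what working with such an API looks like, the snippet below parses a response shaped like the one the World Bank indicator API returns (a metadata object followed by a list of observations). The payload and figures are illustrative, not real query results.

```python
import json

# Illustrative payload mimicking the shape of a World Bank indicator
# API response: [metadata, list of observations]. Values are made up.
sample_response = json.loads("""
[
  {"page": 1, "pages": 1, "per_page": 50, "total": 3},
  [
    {"indicator": {"id": "SP.POP.TOTL"}, "country": {"value": "Spain"},
     "date": "2015", "value": 46444832},
    {"indicator": {"id": "SP.POP.TOTL"}, "country": {"value": "Spain"},
     "date": "2014", "value": 46480882},
    {"indicator": {"id": "SP.POP.TOTL"}, "country": {"value": "Spain"},
     "date": "2013", "value": 46620045}
  ]
]
""")

def latest_observation(response):
    """Return (date, value) for the most recent non-null data point."""
    _, observations = response
    dated = [(o["date"], o["value"]) for o in observations
             if o["value"] is not None]
    return max(dated)  # dates are year strings, so max() picks the latest

print(latest_observation(sample_response))  # ('2015', 46444832)
```

The same parsing logic applies to any indicator series retrieved from such an API, which is what makes these programmatic access points so much more useful than static downloads.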

These indicators follow periodic production cycles, which can be monthly, yearly or even multi-year due to the high cost of producing them. In general, the methodologies used to calculate the indicators are not available to citizens; in the best-case scenario, they are documented very superficially on a fact sheet.

Now let’s imagine for a moment that the national social security systems, the company registers, the customs registers, the environmental agencies, etc., released the data they hold as open datasets in real time. One effect we can easily imagine is that many indicators that today are released periodically could be known and, even better, explored in real time.

Moreover, this would remove the possibility of anyone obtaining privileged information, since we would all have the same ability to analyze the evolution of the indicators and make our own decisions. Better still, by working on the methodologies we could customize the calculations to our own particular situation.

The fact is that, in many cases, the production cycle of some indicators could be shortened until we get close to ‘real time’, and thanks to open government data the cost of production could be greatly reduced as well.
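To make the idea concrete, here is a minimal sketch, using purely hypothetical data, of how an indicator that today is published once per cycle could instead be updated with every incoming record:

```python
from collections import defaultdict

def streaming_indicator(records):
    """Update a per-month total each time a record arrives, instead of
    waiting for the end of a monthly or yearly publication cycle."""
    totals = defaultdict(float)
    for month, amount in records:
        totals[month] += amount
        yield month, totals[month]  # the indicator's current value

# Hypothetical stream of (month, amount) records, e.g. customs declarations.
stream = [("2016-05", 120.0), ("2016-05", 80.0), ("2016-06", 50.0)]
snapshots = list(streaming_indicator(stream))
print(snapshots)
# [('2016-05', 120.0), ('2016-05', 200.0), ('2016-06', 50.0)]
```

Each yielded snapshot is the indicator as it would look at that moment, which is exactly the kind of continuously updated figure a real-time open dataset would enable.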

Even though this is a big step forward, I don’t think we should settle for having the indicators as open data; we should aspire to examining the open datasets and methodologies used to calculate these indicators, and even to customizing them, because if conveniently anonymized there is no reason for them not to be released as open data.

Cover photo by William Iven.


July 7, 2016 Adolfo Antón Bravo

Adolfo Antón Bravo holds a Ph.D. in Information Sciences (Journalism) from the Complutense University of Madrid (Spain). His professional career has developed around communication: from graphic design to web design, from IT journalism to data journalism. He currently coordinates the data journalism working group at Medialab-Prado (Madrid), which holds monthly training sessions, an annual conference and a ddj production workshop.

In recent years he has organized the Sixth International Meeting of Research in Information and Communication (2013, UCM) and the Spanish Data Journalism Conference (2014, 2015 and 2016); he has coordinated the Data Journalism Production Workshop, a kind of hackathon (2014 and 2015, Medialab-Prado), the exhibition Ojo al Data (Mind the Data, 2015, Medialab-Prado) and the dataviz workshop Visualizar15 (2015, Medialab-Prado). He has also been a lecturer in data journalism master’s programs at Villanueva and UNIR (2015-2016) and a ddj trainer at Cinco Días, Huffington Post, Escuela de Unidad Editorial and Asociación de la Prensa de Madrid.

A member and president of Open Knowledge Spain, he has been involved in projects such as the School of Data. From time to time he writes about information and knowledge technologies.

Lately, virtual reality (VR) has been one of the most talked-about topics: VR is one of the niches that current journalism is exploring from different perspectives; a type of digital narration that recreates a specific scenario as fully as possible (fundamentally through audio and video), also drawing on other sensors, to provide the fullest possible first-person experience to anyone using a VR device. This immersive storytelling comes with ad hoc devices, such as VR headsets, and big productions.

Augmented reality (AR), which also uses virtual elements, not as its fundamental base but as an added layer, may be of even greater interest. The Guardian had already seen a great future for augmented reality back in 2010 (1 and 2), as its articles and columns revealed. However, in recent journalism conventions, conferences and even in the media, we have heard more about VR and 360º cameras than about AR. Robert Hernández, journalism professor at the University of Southern California, has spent several years studying these new forms of storytelling and has carried out AR experiments with his students and the Los Angeles Public Library. In 2015, he predicted in Nieman Lab’s prestigious report on journalism trends that 2016 was going to be the year of VR (VR/AR/MR).

Augmented reality offers countless possibilities for users to interact with their environment, since virtual elements are added to the user’s actual vision of reality (Vallejo, 2010); a mixture of virtual reality and real life. It can be used wherever people want to interact with their surroundings, interposing different layers of information or content between ocular vision and reality through data-input devices (such as a camera) and data-output devices (such as a mobile device screen).

Knowing the user’s geographic location is fundamental to creating appropriate layers of content. Contrary to VR, in AR one must be geo-located for a full experience. This location can be detected through RFID (Radio Frequency Identification) devices, QR (Quick Response) codes, bar codes or geo-location services. One or several layers of content can thus be added on top of reality for the user to interact with, and some of them can even be VR. In this sense AR encompasses VR, as it allows interaction both with virtual objects (contents) and with real objects. Whereas VR must build a virtual world that is as real as possible to achieve the fullest user experience, AR already has reality as its stage.
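A minimal sketch of that geo-located filtering step: given the user’s position, keep only the POIs within a given radius, so the application knows which content layers to render. The coordinates below are illustrative, and the haversine formula is one common way to compute the distance.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two WGS84 points."""
    r = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def nearby_pois(user_lat, user_lon, pois, radius_km=0.5):
    """Keep only the POIs close enough to be overlaid on the camera view."""
    return [p for p in pois
            if haversine_km(user_lat, user_lon, p["lat"], p["lon"]) <= radius_km]

# Illustrative POIs around central Madrid (coordinates are approximate).
pois = [
    {"name": "near museum", "lat": 40.4170, "lon": -3.7040},
    {"name": "far suburb", "lat": 40.5000, "lon": -3.7000},
]
print([p["name"] for p in nearby_pois(40.4168, -3.7038, pois)])
# ['near museum']
```

A real AR browser would feed the device’s GPS fix into such a filter continuously, re-evaluating the visible layers as the user moves.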

Why is AR interesting for the open data community?

Now that AR’s interest for journalism has been established, does it hold anything of value for the open data community? Fortunately, the answer is affirmative: technology. While VR uses VRML (Virtual Reality Modeling Language), in AR different actors have worked towards a unified language for AR applications: ARML (Augmented Reality Markup Language), an XML-based markup language for representing geographical data, in the vein of the formats widely used by tools such as Google Maps. It is a descriptive data format used to map points of interest (POIs) and add metadata to them. It was created in 2009 by the makers of the Wikitude World Browser to allow developers to create content for AR browsers.

Version 1.0 of ARML and KML are similar, both descriptive and proprietary data formats, whereas version 2.0 of ARML is currently being promoted by the OGC (Open Geospatial Consortium) and its ARML 2.0 Standards Working Group. ARML 2.0 allows dynamic parts that modify the properties defined in the descriptive area, defines a set of events for handling interaction, extends POI presentation options to visualizations such as 3D objects, lines and polygons, and will provide a connection point for (visual and audio) tracking methods.

Therefore, any individual, institution, journalist or media company can use and take advantage of the advances being made in ARML 2.0. Standardizing tracking, sensors or any other hardware-software interaction (such as the algorithms for on-screen projection, POI queries, POI storage formats and data transmission protocols) has been left outside the scope of this working group (Lechner, 2015).
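As an illustration of the POI-plus-metadata idea that ARML builds on, the sketch below serializes a point of interest as XML. The element names are deliberately simplified and are not the normative ARML 2.0 schema, and the coordinates are approximate.

```python
import xml.etree.ElementTree as ET

def poi_to_xml(name, lat, lon, description):
    """Serialize one POI as a small, ARML-inspired XML fragment."""
    poi = ET.Element("poi")
    ET.SubElement(poi, "name").text = name
    ET.SubElement(poi, "point", lat=str(lat), lon=str(lon))
    ET.SubElement(poi, "description").text = description
    return ET.tostring(poi, encoding="unicode")

doc = poi_to_xml("Medialab-Prado", 40.4119, -3.6934,
                 "Data journalism working group")
parsed = ET.fromstring(doc)
print(parsed.find("point").get("lat"))  # 40.4119
```

An AR browser consuming such a document would anchor the described content at the POI’s coordinates; ARML 2.0 adds the dynamic parts, events and richer geometries described above on top of this descriptive core.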

Meanwhile, the W3C (World Wide Web Consortium) has developed the Basic Geo (WGS84 lat/long) vocabulary, which expresses location data for POIs in RDF (Resource Description Framework), allowing these data to be exchanged between applications. Moreover, thanks to linked data, they can be exchanged with geographic information repositories such as GeoNames, or combined with other vocabularies such as FOAF (Friend Of A Friend).
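For instance, here is a minimal sketch of a POI expressed with the Basic Geo vocabulary’s geo:lat and geo:long properties, emitted as Turtle with plain strings to keep the example dependency-free (the URI and coordinates are illustrative):

```python
# Namespace of the W3C Basic Geo (WGS84 lat/long) vocabulary.
GEO = "http://www.w3.org/2003/01/geo/wgs84_pos#"

def poi_as_turtle(uri, lat, lon):
    """Emit one POI as a Turtle snippet using geo:lat / geo:long."""
    return (
        f"@prefix geo: <{GEO}> .\n"
        f"<{uri}> geo:lat \"{lat}\" ;\n"
        f"        geo:long \"{lon}\" .\n"
    )

snippet = poi_as_turtle("http://example.org/poi/1", 40.4168, -3.7038)
print(snippet)
```

Because the output is plain RDF, any linked data tool can merge it with other sources that reuse the same vocabulary, which is precisely what makes the exchange with repositories like GeoNames possible.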

The AR group in the W3C community has shaped the concept of the Augmented Web, the intersection of AR and the Web: a combination of HTML5, Web Audio, WebGL and WebRTC that improves the experience of those who visit websites. Now we only have to use it. Who wants to be first?


Bibliographic references

Lechner, Martin. OGC Augmented Reality Markup Language 2.0 (ARML 2.0). Open Geospatial Consortium, accessed 2016-05-01.

Vallejo Martín-Albo, César. Realidad aumentada, 2010-09-29, accessed 2016-05-01.


Cover photo by sndrv


July 5, 2016 Daniel Sarasa

Daniel Sarasa holds a Master’s degree in City Sciences. Urban Innovation Planner and Program Manager at Zaragoza’s Smart City Department. Team ideologist & launcher of “Etopia. Center for Arts and Technology”, Zaragoza’s Open Urban Lab. Co-author of “Zaragoza’s Open Government Strategy 2012-2015. Towards a Smart Citizenship.” Co-editor at

In an era of increasing urban concentration, the multiple risks that our society faces in the environmental, social and economic spheres can be better addressed by adopting innovative and well-informed urban policies. Those policies cannot ignore the potential offered by the set of processes and technologies grouped under the generic ‘big data’ buzzword.

If we take a historical angle, after the ‘big bang’ explosion of data, the new ‘dataverse’ is going through a rapid inflationary phase; an expansion that is anything but homogeneous. Clearly, Internet businesses are leading the way, fueled first by their inherent use of big data technologies, some of which are even powering the development of the general concept of big data itself, and second by the highly competitive and innovative markets in which those companies operate.

However, the use those Internet giants make of big data serves only their internal business development. In some cases, those companies are disrupting local economies in areas such as transportation or accommodation while bypassing local regulations. Uber is a good example of a new paradox. The serious blow it inflicts on the community of local cab drivers not only affects self-employment in the city but also reduces overall local tax collection. To make matters worse, its systems may very well use, for instance, the data about road outages that the city hall releases in open data formats for routing optimization, while the company locks away the vast amount of valuable information gathered through its daily trips around the city.

There is astonishingly little (although still some) evidence of big data providing actual value for cities and communities, compared to what it brings internally to the most advanced companies; beyond, that is, the indirect benefits it provides, through the explosion of new Internet businesses, to the communities where the companies that effectively use big data at the core of their businesses sit and pay most of their taxes.

As a consequence of this gap between the value that big data provides internally for private organizations and what it does for the general public, we infer that there is a clear potential to increase the economic and social dividends of big data. In the absence of city, state, or nation-wide regulations about data sharing schemes, we feel there is room for an organizational and conversational approach to a mutual relationship between key city players (public and private) that would allow sharing big data to create ‘big knowledge’, in ways that favor public interests while protecting individual privacy and legitimate business assets.

We have been exploring this idea for some time in conversations with researchers at home. Now, thanks to the support of Real Colegio Complutense, a joint institution between Harvard University and several Spanish universities, we are extending this exploration to the Boston area, one of the most innovative milieus in the world. In the following weeks we will meet with prominent urban practitioners, academics and researchers to shape this idea of a ‘shared, city-wide approach, capable of turning dark urban big data into local civic energy’.

We’ll keep you posted about our findings!

P.S. If you feel you have an interesting story on how sharing big data can add social and/or economic value for our cities, we will be more than happy to hear it (just post a comment so we can get in touch).

Cover photo by Luis Llerena


June 23, 2016 Geraldine Garcia

Geraldine is the editor in chief of the blog about open knowledge, “Abierto al público”, and a communications consultant in the Knowledge Management Division at the Inter-American Development Bank (IDB). 

She has a Bachelor’s degree in Political Science from the University of Buenos Aires. During her second year as a student in the Master’s in International Studies program at Torcuato Di Tella University in Argentina, she participated in a study exchange program at the GW Elliott School of International Affairs, which brought her to Washington DC.

Before joining the IDB, she spent five years working for PR agencies as a consultant on communications and public affairs for multinational companies in Argentina, where she developed press campaigns, communication and crisis management strategies, and plans for engaging with government. Previously, she also worked as a contributor to the traditional English-language newspaper of the Argentine capital, the Buenos Aires Herald.

Open data has become an essential resource for promoting transparency and innovation. The growing enthusiasm for open data is reflected by the adoption of the International Open Data Charter by 20 national and municipal governments since its launch in October 2015.

Nevertheless, as the debate surrounding why data should be open comes to a close, a new one has started that focuses on how to measure its results and impact.

To analyze the developments on this subject, I’ve compiled the following list of six resources, whose interactive displays and case studies you can use to explore open data’s progress and actual impact at a global level.

1. Open Data Inception

This interactive map allows you to explore over 2,500 open data portals around the world that are geotagged by country. The interesting thing about this visualization is that it consolidates different sources of information on the open data portals of cities, countries and organizations in one place.

The aim of this project is to facilitate access to and discovery of open data. To that end, it includes a search function so you can quickly filter by topic or place of interest. The platform also allows you to add portals to the map, and you can access the complete list of portals, as well as an API, on the OpenDataSoft website.



This is a map of 519 portals curated by a group of open data experts from around the world, including representatives from local, regional and national governments, as well as international organizations such as the World Bank and numerous NGOs.

Its information is also available on GitHub and in CSV or JSON formats.


3. Global Open Data Index

This is an initiative of the Open Knowledge non-profit network, which evaluates the state of government open data around the world. Its third edition, launched in 2015, assesses 122 countries and adds five new datasets, including data on government procurement, water quality, land ownership, weather forecasts, and health performance.

The index is based on a global survey, a consultation with civil society, and a discussion forum. You can access the complete description of its methodology here.

The main challenge this index reveals is that only 9% (156 out of 1,586) of the world’s datasets are open. In fact, this number represents a decrease in comparison to the 12% recorded in the previous edition.


4. Open Data Barometer

The Open Data Barometer aims to demonstrate the impact of open data initiatives around the world. It reveals that only 10% of government data is published as open data. In addition, only 8% of governments publish data on public spending, while only 6% of the assessed countries have opened their data on public contracts.

The Barometer covers 92 countries, which it ranks based on three criteria: First, readiness for open data initiatives. Second, the implementation of open data programs. Third, the impact that open data is having on business, politics and civil society.

As a result, this tool allows for the analysis of global trends, and provides comparative data on countries and regions based on a methodology that includes contextual data, technical assessments and indicators. You can access the visualization of its rankings here, and the complete report here.

It is the third edition produced by the World Wide Web Foundation, as a result of collaborative work with the Open Data for Development (OD4D) network and the Omidyar Network. The initiative includes input from 150 researchers and government representatives, and involves more than six months and over 9,000 hours of research work.

In addition, this edition includes an assessment of countries in relation to the principles of the International Open Data Charter.


5. Open Data Inventory

This tool by the Open Data Watch organization assesses the coverage and openness of data available on the websites of the national statistical offices of 125 low- and middle-income countries. Each assessment covers 20 categories of social, economic and environmental statistics.

With its inventory visualization, you can explore coverage and openness scores by country and by overall position in the rankings. In addition, the 2015 edition includes a report which analyzes the state of affairs in terms of the openness of national statistics.

The country with the highest score in the inventory is Mexico (68%), followed closely by Moldova (66%) and Mongolia (64.5%). The country with the lowest score is Uzbekistan in Central Asia (3%), followed by Haiti (3.7%) and Swaziland in southern Africa (6.4%).


6. Case studies on the impact of open data

This year New York University’s GovLab launched a new repository of detailed case studies on the impact that open data is having on the world.  To date, it includes 25 cases and a report with recommendations.

The case studies focus on four ways open data makes an impact. First, improving government. Second, empowering citizens. Third, creating opportunities for citizens and organizations. Lastly, solving public problems.

You can learn more about this initiative and noteworthy case studies in the region by reading this Abierto al público blog post written by GovLab.




June 16, 2016 Reyes Montiel

Reyes Montiel is an expert in public affairs who has worked on the formulation, management and evaluation of public policies. She has also taken part in advocacy activities for public decision-making and in negotiation projects at the national and international level. A trained journalist, she currently develops 2.0 communication plans for public and private companies and entities. She works on strategies for economic mobilization and citizen participation that make sense of open data services. A transparency activist, she participates in research projects on the causes of corruption and the evaluation of action plans against it, as well as in specific education plans.

Read the second part of the article here.

Numerous open data portals have been launched during the past few years; there are about 80 open data initiatives from different regional and local governments in Spain. Nor is it only local and regional governments: various sectoral public bodies, such as AEMET, the Confederación Hidrográfica del Júcar[1] or the Port Authority of Melilla, have also joined in opening up their public data. We have a broad variety of initiatives with different approaches, and once we have reached the point of having regulation, portals and data, what is the next step? Where are we headed with open data?

Open data as a tool

Despite the significant number of existing portals, there is a real perception that there is no “return on investment.” The pending subject is the reuse of public data, and the first reuse is internal. So far, governments that have implemented open data initiatives have seen open data portals as a complement to transparency. Regardless of the reuse-promotion policies enacted in each place (some more intense than others), the clearest sign that administrations are unaware of the value their data can generate is that not even they use their own data openly. Departments (inwardly) and institutions (outwardly) are still watertight compartments. Open data portals are designed to display and share information in an organized way, and owning and publishing public information online gives administrations a data bank that is fundamental for formulating and managing the public policies they need to enact.

Administrations often see open data as a threat, but it makes life easier for governments and civil servants, because value lies not in “owning information” but in capturing its value. Open data opens up interdepartmental and interdisciplinary perspectives that save resources and improve decision-making.

How to turn open data into a tool?

First, by working on the demand side and not so much on the supply side. It is necessary to make open data plans and strategies more collaborative, as is being done in Ireland or the United Kingdom. It is important to know the catalogue one has access to, but it is also fundamental to know the inventory of datasets that are not yet published, that is, to know what is available in order to request it. In this regard, the United Kingdom again provides citizens with an inventory of currently unpublished datasets: a catalogue of the data that could become available, with a call for citizens to select and prioritize those they find interesting.

Second, it is important to consult users about structural revisions of existing data in order to adapt them to current social needs. Retrieving and processing data requires resources, and it is important to prioritize according to these demands. An example of actions that seek to bring data supply and demand closer together can be found in initiatives such as the European Data Portal, which surveys the demands and data-use needs of small and medium-sized businesses.

It is also important to divide strategies by sector in order to identify interesting, valuable datasets on which to build a data-driven innovation ecosystem: health, economy, environment, social inclusion, agriculture, etc. Open Laws is a very interesting example in the legal sector. It is also necessary to explore how citizens can capture the value of their own data, applying the open data philosophy to personal data. One initiative in the United States explores the use of data on citizens’ energy consumption so that they can manage their own demand.

[1] Water Confederation of the Júcar River (Spain)


Cover photo by Samuel Zeller
