Building an open data ecosystem one bit at a time
26 Jul 2014 · Marcel
Smartphones, the Web and in-car location devices did not exist when the military created the Global Positioning System. But the now-familiar services built around GPS, from Yelp to public transit apps, would not have been possible had the Defense Department kept it closed to the public.
When governments make technologies like GPS open, there are cultural and economic benefits, even if it’s unclear at the outset what they are. The same might be said when city governments open their internal data to Web developers, citizens and entrepreneurs.
“We know that there’s a lot of possible benefits from open data. It’s impossible to predict ahead of time how that’s going to manifest in a particular city,” said Jason Denizac, a Web developer and Code for America fellow currently residing in Chattanooga.
Chattanooga city government is one of many hoping to cultivate an ecosystem between open data publishers and consumers, and it’s taking steps to make more of its internal data publicly available.
Mayor Andy Berke recently signed an executive order to implement an open data policy, which outlines the steps his administration plans to take in the next year and establishes some of the technical specifications that should be used when publishing data.
Additionally, the City Council approved an outcomes-based budget that will measure policy goals against key performance indicators. The budget requires city departments and publicly funded agencies to identify datasets that can be used to measure performance.
In an interview, Berke said the collection and dissemination of city data will provide transparency for the public, and it will allow the government to make course corrections between budgeting cycles.
“All these efforts are working together,” he said. “By collecting this data, if we see something that’s not going right, we can go back and say, ‘What’s going on here? How can we make some corrections?’ Or if something’s going really well, we can try to bump it up to the next level.”
The online dashboard, ChattaData, slated to go live in September, will show progress on policy outcomes. A new office of performance management will monitor departments and agencies to ensure goals are being met. The office will work daily with departments to help them do their jobs better, Berke said.
The mayor hopes that the city’s internal data will power new applications that can solve civic problems.
But making the transition from data collector to open data publisher is not as simple as flipping a switch and releasing a stream of ones and zeros. There are legacy issues to resolve, contracts with software vendors and technical barriers to overcome.
Officials meet weekly at City Hall with representatives of Code for America, Open Chattanooga and other organizations to determine which datasets require minimal effort to publish but can yield the most value for citizens.
The city might initially convert static budget figures to a machine-readable format, for example, whereas dynamic information, like 311 records, is always changing and requires more effort to publish and maintain. The goal is to eventually make many of those real-time streams available, too.
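The conversion described above is mundane but concrete. As an illustrative sketch (the department names, figures, and file names here are hypothetical, not Chattanooga's actual data), "machine-readable" usually means moving from a static table or PDF export to a structured format like CSV or JSON that apps can consume directly:

```python
import csv
import json

# Hypothetical budget line items, standing in for a department's
# static spreadsheet export.
rows = [
    {"department": "Parks", "fiscal_year": "2014", "budget": "1200000"},
    {"department": "Transit", "fiscal_year": "2014", "budget": "3400000"},
]

# Write the table as CSV -- roughly what a department might publish today.
with open("budget.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["department", "fiscal_year", "budget"])
    writer.writeheader()
    writer.writerows(rows)

# Read it back and re-publish as JSON, coercing the budget column to a
# number so downstream apps don't have to guess at types.
with open("budget.csv", newline="") as f:
    records = [
        {**row, "budget": int(row["budget"])}
        for row in csv.DictReader(f)
    ]

with open("budget.json", "w") as f:
    json.dump(records, f, indent=2)
```

A one-time script like this is enough for static figures; the harder part the city describes is keeping a pipeline like it running continuously against changing sources such as 311 records.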
Some of the systems used in city government have been “broken for so long that we’ve really got to evaluate the quality of what we’re putting out there,” Chief Policy Officer Stacy Richardson said. “That’s where a lot of our energy and time has been focused.”
The city has some outstanding contracts with companies that lock data in closed formats. In those instances, there are legal hurdles. A bulk export may require special authorization. The name of an address field in another system may be proprietary. Those small speed bumps add up and delay the overall release of information, Richardson said.
The Public Library received funding last year from the Benwood and Knight foundations to build an online portal to host this data. City government plans to give an additional $92,700 for the project.
Developers gathered earlier this month for Hackanooga, a 48-hour hacking event. Representatives from Code for America and Socrata used the library’s portal to create visualizations. One team worked on an app to help students with disabilities find colleges that can accommodate their needs. Another worked on a system to explore map data in 3-D.
Chicago has the kind of robust tech culture Chattanooga officials hope to foster. In 2010, developers there gathered for an event called City Camp. Two years later, the mayor ordered an open data policy, and his administration began making more city data available. Now, more than 1,000 datasets are posted online, and numerous hacking groups build municipal apps.
There are several reasons developers are drawn to the municipal level, according to Open City co-founder Derek Eder: It’s uncharted territory. The information is new and unexplored. There’s a lot of cultural overlap between the open data and open-source communities. And the data deals with issues the average citizen can relate to, like crime or school performance.
“The data that you get from cities, and the issues that cities are dealing with, are much more tangible, immediate and real than the issues you see happening on the federal level,” Eder said.
Plus, citizens have more influence in city governments than they do in Washington, D.C.
One Open City app poses a familiar question: Is there sewage in the Chicago River? The app not only provides a mechanism for accountability; it includes a primer on why it happens and what steps are being taken to prevent it.
The project has led to business meetings between Eder’s company and the governmental body that treats Chicago’s wastewater.
“They saw what we can do, and they want to have an app like that for themselves,” he said.
As in Chicago, the Berke administration wants developers to use data to solve civic problems and test new products but also hopes its efforts will encourage other local governments, organizations and businesses to follow its lead and put their own datasets on the library’s curated portal.
“I think there’s a lot of high-value datasets out there,” Richardson said. “The library’s been in the business of curating information for hundreds of years. Why should digital information be any different?”
Tags: open data