THE STATE OF OPENNESS, PART 3: MACRO-OPENNESS & THE WHITE HOUSE OPEN DATA INNOVATION SUMMIT




Written by Sam McClenney.

The topic of macro-openness and the idea of openness as an institution (the next topic in this series) overlap a good bit, so in the interest of not being redundant, I’m going to try to keep them as separate and distinct as possible. That means this Medium post will be a little shorter than the others I have done, and more focused as well.

Pretty much all of my Medium posts up until now have been about openness at a very local level, aka micro-openness. Here’s an interesting question, though: how do all these grassroots organizations and open evangelists know when something is open? What are the requirements for data to be open? What technical benchmarks have to be met for something to be considered open source? Who makes these decisions? How are standards for openness set? These are questions we can only answer by looking at openness at a macro level.

The White House Open Data Innovation Summit was held concurrently with Data Transparency 2016 on September 28, 2016. Together, the two events provided a state of the union on open data. Individuals and companies that make their bones on open data descended on Washington, DC for an honest talk about where open data stands at the federal level compared to the rest of the world. My read of what was shared on social media is that there was some patting on the back, but much of the discussion was about the lack of progress in meeting the standards and benchmarks the Obama administration set for opening federal government data.

I’m sure that there was much discussion on changing the culture at the federal level around open data. I’m confident that talks were held around building a federal ecosystem of data where standardized spreadsheets are normal. Let me be as frank as possible about this: 95% of these conversations were complete and utter bullshit and meant nothing for the future of openness. I’m sure that will piss off a few of the open pundits who spoke at the summit. I’m fine with that because this is something they need to hear.

Before I get too far into my rant, I do want to provide one caveat. The OPEN Government Data Act is currently working its way through Congress. Here is an article by the Data Coalition that describes the impact of the act: http://www.datacoalition.org/the-open-government-data-act-a-sweeping-open-data-mandate-for-all-federal-information/

The OPEN Government Data Act legitimizes President Obama’s open data policy by making it law. It requires federal agencies to publish their information in open formats and catalog it in a centralized inventory. In short, all government information must become published open data. That’s a game-changer: open data would be the law, not just a priority. That said, none of it matters if open data professionals don’t change their mindset. I’m about to hurt some feelings.

Here’s the truth: most of the people working in open data have lost sight of the point so completely that they can’t find their way outside of their Socrata or CKAN portal. They have forgotten that open data was never about getting a medal for having a cool website. It was never about having the most datasets available. It was, and has always been, about meeting citizens halfway and giving them access to the information they have a right to. Anything else is just noise.

I suggest taking a look at this blog post that my mentor Jason Hare wrote. He says it well. https://www.linkedin.com/pulse/white-house-open-data-innovation-summit-what-i-said-jason-hare?trk=prof-post

He mentions that there was some celebration around the 200k datasets on data.gov. Cool, but that might be the dumbest marker of success I’ve ever heard. How many of those datasets are actually getting reused? I’m guessing not many if, as he says, a decent number of them are in HTML or PDF format. Seriously, how lost do you have to be to think that a PDF counts as a dataset? It’s embarrassing.
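You don’t have to take anyone’s word for the format problem, either. The catalog behind data.gov is a CKAN instance, and a short script against its search API can show how many records list machine-readable formats versus PDF or HTML. This is a rough sketch, assuming the standard CKAN package_search endpoint at catalog.data.gov is publicly exposed and that the res_format facet is populated; treat the numbers as a sanity check, not an audit.

```python
# Sanity-check the "200k datasets" claim: ask the catalog itself how many records
# resolve to machine-readable formats versus PDF/HTML.
# Assumes data.gov still exposes the standard CKAN search API at catalog.data.gov;
# field names and counts may change over time.
import json
import urllib.parse
import urllib.request

params = urllib.parse.urlencode({
    "rows": 0,  # we only want counts, not the records themselves
    "facet.field": json.dumps(["res_format"]),
    "facet.limit": 50,
})
url = "https://catalog.data.gov/api/3/action/package_search?" + params

with urllib.request.urlopen(url) as resp:
    result = json.load(resp)["result"]

print(f"Total datasets in the catalog: {result['count']}")
for fmt, count in sorted(result["facets"]["res_format"].items(),
                         key=lambda kv: kv[1], reverse=True):
    print(f"{fmt or '(no format listed)':<25} {count}")
```

If a large share of those resources come back as PDF or HTML, the headline dataset count says very little about how much of the catalog anyone can actually reuse.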

I know it’s not a walk in the park to change protocols across the different departments of the federal government. I’m sure there is plenty of resistance to requiring certain metadata and data formatting. I get it. That doesn’t mean we get to half-ass this. For some reason, the federal government has decided that quantity is more important than quality, as if throwing enough mud at the wall means not only that some of it will stick, but that it will somehow form a work of art the world can admire. Sorry, but that’s not how any of this works.
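For readers wondering what “requiring certain metadata” actually looks like in practice: federal agencies publish their data inventories as data.json files following the Project Open Data (DCAT-US) metadata schema. The sketch below builds one illustrative record in Python. The field names follow the published schema, but this is not the complete or authoritative list of required fields, and every value here is made up for illustration; consult the schema itself before relying on it.

```python
# Illustrative sketch of a single Project Open Data (DCAT-US) style record.
# All names and URLs below are hypothetical examples, not real agency data.
import json

dataset_record = {
    "title": "Example Building Permits",
    "description": "Permits issued by the example agency, updated monthly.",
    "keyword": ["permits", "construction"],
    "modified": "2016-09-01",
    "publisher": {"@type": "org:Organization", "name": "Example Agency"},
    "contactPoint": {
        "@type": "vcard:Contact",
        "fn": "Open Data Coordinator",
        "hasEmail": "mailto:opendata@example.gov",
    },
    "identifier": "example-agency-building-permits",
    "accessLevel": "public",
    "distribution": [
        {
            "@type": "dcat:Distribution",
            "downloadURL": "https://example.gov/data/permits.csv",
            "mediaType": "text/csv",  # a machine-readable format, not a PDF
        }
    ],
}

print(json.dumps(dataset_record, indent=2))
```

Filling in a record like this for every dataset is tedious, but it is exactly the kind of quality-over-quantity work that makes a catalog usable instead of just large.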

Open data people need to get out of their portals. They need to leave their desks, walk out of the office, and go to the nearest coffee shop. Then they need to sit down with somebody who has no idea what open data is and ask them one question: “What information can we provide you that you don’t already have?” It’s that simple. It’s not necessarily that easy, and it might require many trips to Starbucks, but seriously, what else do you have to do? Work in your portal and update metadata? Trust me, this is a much better use of your time. Open data’s biggest asset is the people who use it, not the data itself. Demand-driven, not supply-driven. It’s about time we start acting like it.

I will challenge the status quo of open data even more at All Things Open later this month, during the lunchtime open data panel moderated by Jason Hare. The link below has more information. It’s going to be a good time, and open data leaders at both the local and national level will be there. https://allthingsopen.org/talk/open-data-panel-discussion/

Thanks again for reading, and look for Openness as an Institution later this week.

Sam is an open data enthusiast from Raleigh, North Carolina. He is currently starting his own open data services and consulting firm, Samuel H. McClenney Associates, and will soon be found at www.smcclenney.com

For now, he can be reached at sam@smcclenney.com

Original article: https://medium.com/@sam_44726/the-state-of-openness-part-3-macro-openness-the-white-house-open-data-innovation-summit-d7abf9d1bad9#.ynswmd2ty
