
The open data movement is a juggernaut. In the decades since the passage of the Freedom of Information Act, citizens have come to expect access to public information of all kinds in the easiest, fastest way possible.

At first, federal government agencies struggled to make their information available to the public. Data was typically old, inconsistent, and formatted in unwieldy ways. Decades of pressure from government transparency and accountability advocates, however, have raised expectations for the timeliness and usefulness of government data. These higher expectations have seeped down to the state, county, and municipal levels, with the result that the quantity and quality of open data have never been higher.

Not everything is rosy, though. The march of progress toward ubiquitous, robust, and timely open data is now confronting perhaps its biggest obstacle: Accurate data makes some people look bad. This wouldn’t be a problem if not for the fact that the people most likely to come off looking bad are the very people required to furnish the data in the first place.

“Looking bad” when data on their work is analyzed is not just unwelcome news for these government employees. It can affect their livelihoods if they’re fired or passed over for promotion. It can even expose individuals and agencies to lawsuits. Imagine the simple case of deciding where to place a fire station. If a municipal employee’s data analysis leads to putting the station someplace where firefighters can’t reach certain neighborhoods quickly enough, then people can actually die. Most jobs do not have this life-and-death aspect, but many government jobs do.

Ted Lehr, chief information architect for the City of Austin, has been the pied piper for open data in our fair city, and he recently observed that risk aversion in government work is a fact of life. This means that the move to “born open” government data will likely face an uphill battle as we get closer to 100% openness. The last holdouts of government data repositories (most likely public safety and public health agencies) may only become open when the fallout from the inevitable analyses of those data can be mitigated in some way.

Also, there is no universally accepted model for how governments recover the expense of making their data accessible. An agency in one county may offer a given dataset for free as an API while another county charges thousands of dollars for a DVD of the same data.

There is good news for publishers and other value-added resellers of government data in all this: without a universal model for access to data, there will still be a place for aggregation, normalization, curation, and independent analysis.


posted by Shyamali Ghosh on April 8, 2015

As the information industry continues to evolve, we at IEI are convinced that crowdsourced updating will play an ever-larger role in its foundation. No longer will databases languish waiting for periodic updates. Rather, they will be constantly updated via a combination of user feedback, event-driven triggers, and crowdsourced revisions overseen by professional editors.
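To make the idea of constant updating concrete, here is a minimal, hypothetical sketch of how proposed changes from user feedback, event-driven triggers, and crowd revisions might funnel into a single review queue before an editor approves anything. This is not WorkFusion code and not IEI’s production system; every name here (ProposedUpdate, UpdateQueue, and so on) is illustrative only.

```python
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum


class UpdateSource(Enum):
    """Where a proposed change originated."""
    USER_FEEDBACK = "user_feedback"    # e.g. a "report an error" form on the product
    EVENT_TRIGGER = "event_trigger"    # e.g. a news feed signals a company merger
    CROWD_REVISION = "crowd_revision"  # a completed task from a crowd worker


@dataclass
class ProposedUpdate:
    """A single suggested change to one field of one record."""
    record_id: str
    field_name: str
    new_value: str
    source: UpdateSource
    submitted_at: datetime = field(default_factory=datetime.utcnow)


class UpdateQueue:
    """Collects proposed changes from all sources for editorial review."""

    def __init__(self) -> None:
        self.pending: list[ProposedUpdate] = []
        self.applied: list[ProposedUpdate] = []

    def submit(self, update: ProposedUpdate) -> None:
        self.pending.append(update)

    def review(self, editor_approves) -> None:
        """Run each pending update past an editor; apply approvals, keep the rest pending."""
        still_pending = []
        for update in self.pending:
            if editor_approves(update):
                self.applied.append(update)   # in a real system: write to the database
            else:
                still_pending.append(update)  # or route back to the crowd for rework
        self.pending = still_pending


if __name__ == "__main__":
    queue = UpdateQueue()
    queue.submit(ProposedUpdate("org-1042", "phone", "+1-512-555-0188",
                                UpdateSource.USER_FEEDBACK))
    queue.submit(ProposedUpdate("org-1042", "ceo_name", "J. Rivera",
                                UpdateSource.EVENT_TRIGGER))
    # Approve everything for the demo; a real editor would inspect each change.
    queue.review(lambda update: True)
    print(f"{len(queue.applied)} update(s) applied, {len(queue.pending)} still pending")
```

The point of the sketch is simply that updates arrive continuously from heterogeneous sources, while a professional editor remains the gatekeeper for what actually lands in the database.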

We manage our customers’ crowdsourcing projects via the WorkFusion platform – enterprise software that allows us to handle hundreds of thousands of tasks per month (1). It helps us keep complete control over multiple concurrent projects, diverse workforces, and the quality of the data produced.

We chose WorkFusion over its competitors for a few key reasons. It handles the same core tasks as other platforms (2) and has robust process design tools. It was created with private crowds and blended crowds in mind (3). It includes a repository of automated routines (“machines”) that we can insert at the beginning, middle, and end of complex processes. In short, WorkFusion-powered processes allow projects to scale up and down quickly with consistently high-quality output.
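Because “machines” bracketing human work is the core design pattern here, a small sketch may help. The following is a hypothetical Python pipeline, not WorkFusion’s actual API: an automated normalization step runs first, a stand-in for a crowd task runs in the middle, and an automated validation step runs at the end. All function names are illustrative.

```python
from typing import Callable, Iterable

# A step takes a work item (a dict of fields) and returns the updated item.
Step = Callable[[dict], dict]


def machine_normalize(item: dict) -> dict:
    """Automated step at the start: clean raw input before humans see it."""
    item["company_name"] = item["company_name"].strip().title()
    return item


def crowd_verify(item: dict) -> dict:
    """Human step in the middle: a stand-in for a task routed to a crowd worker."""
    # A real pipeline would create a task and wait for the worker's answer;
    # here we simply mark the record as checked by a person.
    item["verified_by_crowd"] = True
    return item


def machine_validate(item: dict) -> dict:
    """Automated step at the end: final consistency check before publication."""
    item["valid"] = bool(item["company_name"]) and item.get("verified_by_crowd", False)
    return item


def run_pipeline(items: Iterable[dict], steps: list[Step]) -> list[dict]:
    """Push every work item through the configured sequence of steps, in order."""
    processed = []
    for item in items:
        for step in steps:
            item = step(item)
        processed.append(item)
    return processed


if __name__ == "__main__":
    raw = [{"company_name": "  acme widgets inc "}]
    done = run_pipeline(raw, [machine_normalize, crowd_verify, machine_validate])
    print(done)
```

Expressing every stage, human or automated, as the same kind of step is what lets a pipeline scale up and down quickly: crowd capacity can be added where throughput is needed, and automated routines can be swapped in where the work is predictable.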

Managing crowdsourcing projects requires intimate knowledge of the platform and mastery of several process design and management tasks (4). IEI’s project managers have decades of experience handling complex publishing processes professionally, and we’ve been using WorkFusion for three years now. This experience ensures that we manage data processes better than firms that are new to data management or that have reluctantly grafted project management services onto their software businesses.

Our experience also means that we don’t set unrealistic customer expectations on the cost savings possible via crowdsourcing. This approach is not just or even primarily about saving money on labor. It is about speed, scale, quality, and consistency.

Finally, WorkFusion’s focus is on building the most robust project management tool possible. Leaving the actual project design and management of crowdsourced processes to its customers (and specialists like IEI) means that it will continue to push the limits on what crowdsourcing can achieve. This allows its customers to reap the “better, cheaper, faster” benefits of a world where labor is flexible and data processes can be automated and managed to a degree that was unthinkable only a few short years ago.


posted by Shyamali Ghosh on April 1, 2015