How do "Citizen Scientists" stack up against the experts?

Anybody can contribute to the progress of science! There are plenty of "Citizen Scientist" projects that let you help with scientific research. But one question lingers: How reliable are the results produced by non-experts? A new study provides some good news.

The new research, which was published last month in the open-access journal PLOS ONE, focused specifically on the Geo-Wiki Project. The goal of the project is to improve the quality of global land cover maps, said study lead author Linda See, a geographer with the International Institute for Applied Systems Analysis in Austria.

There are several global land cover maps that scientists often use in their research, including GLC-2000, MODIS and GlobCover. However, these maps frequently disagree. "When I overlay the maps, one of the maps may tell me that there should be cropland here, while another map says it should be grassland," See told io9.

Improving Participation With Competitions

To help correct the discrepancies, the Geo-Wiki Project enlists citizen scientists to analyze those points of disagreement, using a Google Earth interface on the project website, and determine the land cover actually present at each spot. Of course, getting people involved in citizen science projects isn't always easy, so the Geo-Wiki overlords began running themed competitions, in which people could win Amazon vouchers and co-authorship on scientific papers resulting from the competitions.

One of the competitions focused on the overall land available for biofuel crops, See said. Previous estimates had put the land available for bioenergy production at anywhere between 320 million and 1,411 million hectares. But when See's colleague, Steffen Fritz, looked at some of the maps used to come up with the estimates, he immediately knew something was wrong. "He looked at the maps and was incensed," See said. "He thought, 'This is crazy, there's not that much land available.'"

So the researchers created a competition around this issue — they took samples from the biofuel maps and instructed their Geo-Wiki volunteers to determine the type of land cover and the degree of human impact in each 1 km by 1 km Google Earth pixel. "The idea was that if the land is already being used and you can clearly see this, then it can't be used for biofuels," See explained, adding that certain land covers aren't right for biofuel production either.

Over about two months, more than 60 participants analyzed some 53,000 locations. Around half of the participants were considered "experts" based on their background in remote sensing or spatial sciences. For each location, the volunteers had to identify the types of land cover in the images, including tree cover, shrub cover, grassland, wetland and barren land (desert), among others. They also had to estimate the degree of human impact (as a percentage) by looking for signs of such things as roads, deforestation and agricultural fields. Finally, the volunteers indicated how confident they were in their analysis of each point.

See, Fritz and another colleague met to discuss and agree on labels for 299 control points. They measured how well the experts and non-experts did in the overall competition by comparing the volunteers' answers on those control points with their own, and they chose winners based on both the quality and the quantity of the analyzed locations.
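The scoring idea above can be sketched in a few lines of code. This is a hypothetical illustration only: the point labels, data and function names are invented, and the paper's actual scoring scheme is not spelled out in this article. The sketch simply shows what "comparing a volunteer's answers on the control points with the experts' agreed answers" amounts to.

```python
# Hypothetical sketch of control-point scoring. All names and data
# below are invented for illustration; they are not from the study.

def control_point_accuracy(answers, control):
    """Fraction of control points where a volunteer's label matches
    the label the researchers agreed on."""
    matches = sum(1 for point, label in control.items()
                  if answers.get(point) == label)
    return matches / len(control)

# Labels the researchers agreed on for three control points (made up).
control = {"p1": "cropland", "p2": "grassland", "p3": "tree cover"}

# One volunteer's answers for the same points (made up).
volunteer = {"p1": "cropland", "p2": "barren", "p3": "tree cover"}

print(control_point_accuracy(volunteer, control))  # 2 of 3 match
```

A real ranking would also fold in the number of locations each volunteer analyzed, since winners were chosen on both quality and quantity.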

Experts vs. Non-Experts

Looking at the control-point data, the researchers found that the experts and non-experts were equally able to determine the level of human impact in the images. However, experts did better than non-experts at identifying land cover, though not by much — they were correct 69 percent and 62 percent of the time, respectively.

Some land covers are just difficult to identify, See said. Grasslands can look like barren land, for example. And then there are issues with definitions. "When does a shrub become a tree?" she said. Some of these common errors, however, can be corrected for systematically.

The results from the competition were staggering: The researchers used the citizen scientists' contributions to downgrade the amount of land available for biofuel crops by almost 80 percent.

Geo-Wiki Project leaders are now updating the system to help improve the accuracy of its citizen scientists. For one thing, future competitions will include test points that pop up every now and then; if a volunteer gets one wrong, the system will explain why. Programmers are also working out how to add social interaction to the competitions, which would let users discuss the images and learn from each other.

"The main point right now is that we can use the information provided by citizen scientists," See said. "They can be just as reliable as experts."

Check out the study in its entirety in PLOS ONE.

Top image via USGS. Inset images via Linda See/Geo-Wiki Project.