Article

Citizen Science and Crowdsourcing for Earth Observations: An Analysis of Stakeholder Opinions on the Present and Future

Department of Computer Science, University of Sheffield, Sheffield S1 4DP, UK
* Author to whom correspondence should be addressed.
Remote Sens. 2017, 9(1), 87; https://doi.org/10.3390/rs9010087
Submission received: 12 August 2016 / Revised: 25 November 2016 / Accepted: 11 January 2017 / Published: 19 January 2017
(This article belongs to the Special Issue Citizen Science and Earth Observation)

Abstract
The impact of crowdsourcing and citizen science activities on academia, businesses, governance and society has been enormous. This is more prevalent today, with citizens and communities collaborating with organizations, businesses and authorities to contribute in a variety of ways, from being mere data providers to being key stakeholders in various decision-making processes. The “Crowdsourcing for observations from Satellites” project is a recently concluded study supported by demonstration projects funded by the European Space Agency (ESA). The objective of the project was to investigate the different facets of how crowdsourcing and citizen science impact upon the validation, use and enhancement of Observations from Satellites (OS) products and services. This paper presents our findings from a stakeholder analysis activity involving participants who are experts in crowdsourcing and citizen science for Earth Observations. The activity identified three critical areas that need attention by the community and offers suggestions that could help address some of the challenges identified.


1. Introduction

Over the last few decades, satellites have taken a primary role in a large number of our daily activities. Billions of users across the globe consult weather services, navigation applications, send and receive data, and share information online via satellites. Satellite observations and applications are hence embedded into the daily fabric of modern society with a large number of application areas such as agriculture, land monitoring, emergency response, defence, security and natural resource management. Observations from Satellites (OS)—including observations of our home planet from Earth Observation (EO) satellites; measurements, experiments and videos taken from the International Space Station (ISS); observations of the universe (Space Science); and measurements from and for the Global Navigation Satellite Systems (GNSS)—provide the foundation for science to better understand our planet and universe.
Rapid advances in the last few years in the field of Information and Communication Technology (ICT)—including the Internet, cloud computing, social networks, and most importantly mobile telephony—have revolutionized the way people connect and share information with each other. The proliferation of smartphones and accessible Internet connectivity has largely contributed to a massive amount of information being readily available at the disposal of citizens. This has transformed the information environment, where information is ubiquitously available to users at all times, even during long commutes. At the same time, citizens can create and share content across a wide variety of platforms using multiple mechanisms. To this end, many businesses and entire industries have built their business models on the collective contributions of citizens and users. Popular websites such as TripAdvisor, Amazon, eBay and most modern e-commerce platforms exploit the potential of crowdsourced data to provide customers with a greater understanding of the value of the purchases they intend to make. Wikipedia, Flickr and OpenStreetMap, on the other hand, serve as excellent real-world examples where crowdsourcing has provided an immense wealth of information to be further used by organizations and communities worldwide. In fact, it has also been reported that in many instances information collected from citizens and informal institutions can be more detailed and of higher quality than that provided by official institutions [1,2,3,4], and the role of citizen-generated information is ever increasing. As a testament to the importance of citizen-generated crowdsourced data, the Digital Earth vision also encapsulates the role of citizens in daily life not only as mere providers of data, but as contributors and collaborators [5]. The Digital Earth vision highlights the main policy, scientific and societal drivers that enable the vision of a “Digital Earth” as a multi-resolution, three-dimensional representation of the planet to find, visualise and analyse large volumes of physical and social environment data.
These two primary advancements have thereby created entirely new opportunities for users of OS—who are also becoming providers of information—to exploit the data for both science and societal applications. Citizen science and crowdsourcing themselves have a rich history, dating back centuries, with early examples of crowd participation in tasks such as contributing to the development of what would eventually be called The Oxford English Dictionary [6] or devising means of finding longitude to an accuracy of 30 miles [7]. Large numbers of volunteers can be recruited over wide geographical areas to collect, submit and interpret data at low cost [8]. Such widespread data collection (potentially over extended time periods) would simply be infeasible without citizen participation. Indeed, a wide geographical spread is essential to understanding the processes behind many of the important global challenges of today: vegetation loss, climate change, natural resource management, migration patterns, etc. Additionally, the volume of observation data (satellite-, airborne- and land-based)—some of which can only be interpreted by humans—is constantly growing.
Crowdsourcing detailed, high-resolution annotations of such data hence facilitates timely scientific analysis and decision making. Most studies, in this respect, focus on validation of OS and annotation of images by employing volunteers (e.g., [9]). While this potential has already been recognized (The Horizon 2020 Space Advisory Group’s Advice on potential priorities for research and innovation in the Work Programme 2016–2017 notes the importance of crowdsourcing and citizen science and involvement of the public [10]) as significant [11], thereby resulting in several research projects (The NASA Roses Program’s Citizen Science for Earth Systems [12] call for project proposals addressing Earth Observations and Crowdsourcing; [13]) and funded competitions [14] in the recent years, we believe that there is a need to systematically study the potential, benefits, and opportunities as well as risks involved in such efforts by involving different communities.
To this end, the Crowd4Sat project [15] aimed to study how crowdsourcing and citizen science can impact Observations from Satellites and to explore their potential through four demonstration projects involving different stakeholders and crowdsourcing mechanisms. This paper presents the findings from an activity within the project in which stakeholders in crowdsourcing, citizen science and Earth Observations were contacted to provide their views and opinions on the present state and future of the field. While several studies discuss how crowdsourcing can contribute to Earth Observations and Observations from Satellites [16,17], to our knowledge, this is the first systematic study of stakeholder views and opinions on this rapidly evolving field. The next section presents an overview of crowdsourcing and citizen science, followed by a description of the stakeholder analysis activity. Finally, we present a summary of stakeholder opinions and highlight the three most critical aspects that need to be addressed by the community. We conclude the paper with a discussion on future work.

2. Citizen Science and Earth Observations

The involvement of volunteers in crowdsourcing solutions or in research has a long history [18,19]. This involvement, however, has evolved over recent years due to several factors: Firstly, the use of modern technology and smartphones has significantly changed how citizens can participate in large scale studies. Secondly, the advent of the Internet and Web 2.0 technologies has revolutionized content authoring and means of communication. Prior to the 19th century, science and scientific participation was limited to people with alternative employment or to affluent individuals conducting investigations as a hobby [19]. Citizen science has made significant contributions to many areas of study such as archaeology, palaeontology, zoology, ecology and astronomy [20]. In recent years, there has been a paradigm shift whereby citizens and communities are empowered with tools and technologies to directly communicate and collaborate with authorities, businesses, research institutions and scientists. Modern citizen science has also moved from participation of only the privileged few to activities that can potentially encapsulate all citizens and communities, one of the earliest examples being the Christmas Bird Count, started in 1900 [19].
At the same time, technically proficient citizens and hobbyists can use their development skills—using DIY kits such as Arduino [21] and Raspberry Pi [22], or software analysis frameworks such as Tableau [23] and Qlikview [24], which provide excellent visual means for users to gain insight into their datasets—to create tools and solutions to sense their environments, analyse large scale data and so on. Finally, the move toward an open framework, where tools, data and technical plans are openly released and made freely available by developers, authorities and institutions to be used and improved by others, has created an environment that fosters collaboration and interdisciplinary research (a large number of datasets are made available for free by organisations, institutes and governments, to be re-used for developing tools, applications and analyses via open data portals (e.g., [25,26])). All of these factors have now created enormous opportunities [27] for citizens to participate in very large scale scientific studies whereby they can collect observations, analyse data, provide insight and even participate in co-authoring scientific papers [8,28,29] (players of the fold.it project, which explores engineering solutions to protein folding, have regularly been co-authors of scientific publications) at a scale and cost never imagined before [8,28].
While participating in individual projects can provide means for citizens to be involved in scientific studies and research, their participation has traditionally been top-down, with institutions and professional scientists inviting communities to contribute; hence, many communities and citizens are not aware of projects that may be of interest to them. Websites such as SciStarter [30] break this barrier and provide all citizens who are motivated to contribute with means to search for citizen science projects that are aligned to their specific interests. This can empower citizens to connect with and contribute to projects of interest even if they have not been previously engaged. In the top-down citizen science projects discussed thus far, the research questions, data collection mechanisms, analyses, reporting and methodologies have been defined by professional scientists. Civic/community science, on the other hand, enhances the role of citizens in a bottom-up approach, where citizens, based on their interests, develop citizen science projects employing scientific techniques, often in collaboration with scientists [29,31]. iSeeChange [32] is an excellent example that combines citizen science with citizen journalism: citizens observe changes in their natural environment and report them via mobile or online applications. These observations are then examined by scientists and experts, who can respond with an explanation or conduct further experiments.
The notion of exploiting crowdsourcing in scientific and space applications as a means for data collection, distributed problem solving and, eventually, close collaboration has gained recent traction, more so with the increasing means for citizen participation [33]. The Geo-Wiki Project [34] serves as an excellent testament to this by engaging online volunteers in the validation of global land cover maps based on their observations in Google Earth as well as local knowledge. The Cities at Night project uses crowdsourcing to catalogue and classify high resolution images taken from the International Space Station [35]. In the same spirit, the Image Detective project [36] enables citizens to locate astronaut photographs of Earth via a web based interface. The Stars4All project [37] raises awareness of light pollution by engaging citizens in crowdsourcing activities via mobile photo sensors and gamification. The Dark Sky Matter project [38] similarly enables citizens to provide crowdsourced data to measure night sky brightness. Citizen science for Earth Observations is also being applied to a variety of application areas such as validation and integration of land cover maps [39,40], urban forest management [41], air quality and pollution [42,43], biodiversity conservation [44] and so on. Several examples exist that demonstrate the value of crowdsourcing and Volunteered Geographic Information (VGI, [1]) within the context of humanitarian relief and emergency response. Tomnod [45] crowdsources object and place annotation from satellite imagery and has engaged volunteer communities in several large scale emergencies and studies. Tomnod is one of the best examples of using crowdsourcing and volunteers to analyse satellite imagery over large areas, eventually generating large amounts of data. For example, the search for the missing Malaysian Airlines MH370 aircraft had 2.3 million Internet users submitting 18 million tags [46] for over 745,000 images [47], and campaigns following the Chile (12,425 objects) and Nepal (1 million objects) earthquakes mapped large areas to identify damaged buildings and infrastructure.
Though not strictly contributing toward Observations from Satellites, several crowdsourcing initiatives collect VGI that can significantly help increase and catalogue local knowledge and observations. eBird [9] is an excellent long-standing example that harnesses the power of crowdsourced ornithological observations across the world to conduct scientific studies at a scale that would not have been possible with traditional means of scientific study. The large amount of data collected by eBird (250 million observations covering 98.5% of the world’s bird species, provided by 300 thousand users, as presented by Rick Bonney in his keynote at the ECSA conference, May 2016; notes from [48]) is open access and has been instrumental in several scientific publications, demonstrating the value of such crowdsourced data. Similar initiatives in recording observations in nature also exist in other domains, such as Leafsnap [49] (automatically identifying and recording plant species from physical attributes such as leaf shape, colour, etc.), NatureServe [50] (recording animals, plants and habitats), and GeoExposures [51] (documenting geological features such as excavations, pipelines, surveys, etc.).

3. The Crowd4Sat Project

The “CROWDsourcing for observations from SATellites” project is a recently concluded project supported by case studies funded by the European Space Agency [52]. The objective of the project was to investigate the different facets of how crowdsourcing and citizen science impact upon the validation, use and enhancement of ESA Observations from Satellites (OS) products and services. Furthermore, the project investigated how ESA products can be used in crowdsourcing. The Crowd4Sat project kicked off on 2 February 2015 and, over a duration of 12 months, comprised two main strands of activities: Strategic Roadmapping and State of the Art; and Demonstration projects (case studies).
The four key demonstration projects, each addressing a concrete scientific and societal problem, informed the technical findings of the project: pollution in metropolitan areas, land use, water management and snow coverage, and flood management and prevention. The project developed, modified and deployed technologies to address the four demonstration projects in a variety of crowdsourcing settings, with differing levels of citizen engagement within each demonstration project. In addition to the demonstration project activities, a thorough analysis of the field was also conducted by means of a literature survey, stakeholder analysis, technology survey and road-mapping. This paper discusses one strand of the survey conducted, which involved understanding stakeholder views and opinions of citizen science and crowdsourcing for Earth Observations. Over the duration of the project, several areas of further research, recommendations and suggestions were identified, based on analyses, user feedback and stakeholder interviews.

4. Stakeholder Survey

During the final phase of the Crowd4Sat project, multiple stakeholders were identified who were highly relevant to and involved in crowdsourcing/citizen science OS-based EO initiatives. Over a period of five weeks, stakeholders were contacted to elicit information regarding their experiences with a variety of aspects of crowdsourcing and citizen science, particularly related to EO. This section describes the study conducted with their involvement. Stakeholders, in this scenario, are academics, researchers, developers or even volunteers with a high interest, practical experience and expertise in the subject. Their expertise lies in both crowdsourcing and citizen science and Earth Observations, gained particularly from research experience, practice or having led projects and teams in such initiatives. They are typically experts who have a very strong understanding of the field and how it is expected to evolve in the next few years, and most have significant experience in running large projects, initiatives or organizations.

4.1. Methodology

A systematic approach was undertaken in conducting the survey. Although various attempts have been made in the past to survey and exploit the benefits of crowdsourcing and citizen science for Earth and Satellite Observations [33,53], to our knowledge, this is the first time that a variety of stakeholders have been involved in providing their views and opinions on their perception of the current and future state of the field. Hence, the study was designed to be exploratory, with a few open-ended questions to gauge stakeholder perception, and a few categorical questions aimed at understanding expected timelines for the adoption of crowdsourcing/citizen science methodologies.

4.2. Design

The primary concern when designing the study was to ensure that experts were consulted through means appropriate for their participation. The large multitude of projects and initiatives across the globe also introduced geographical limitations; hence, conducting in-person interviews and focus groups was determined to be unsuitable for this study. Finally, given the highly specific nature of the expertise required, and the exploratory nature of the study, it was decided to conduct structured interviews via an online survey questionnaire. This removed the need to arrange interviews in which the interviewer and interviewee would have to be available for real-time interaction, while allowing stakeholders to take their time to deliberate over their responses.
The process of designing the survey began in the last week of November 2015 and extended over a few weeks. During the first weeks of January, the survey was shared among the project partners, and requests were sent to prospective participants from the last week of January. A variety of relevant organizations and individuals were contacted with brief details of the Crowd4Sat project aims and the survey goals. The responses were then collected; category-based questions (those with drop-down selections) were plotted to show the distribution of responses, while subjective questions were collected together and key concepts highlighted. The findings of our analysis are presented in the following sections.

4.3. Survey

The survey started with a few questions related to the participant, collecting names, organizations and email addresses. The questions in the survey were uncontroversial, and it was deemed important to understand the responses of participants in relation to their experience and roles within their organizations. This was explained before participants proceeded with the survey. Along with contact information, participants were asked to provide some brief information on their experience in crowdsourcing activities and satellite-based Earth Observation activities. This self-assessment of personal experience was aimed at providing us with an understanding of the wide variety of participants.
The survey then aimed to capture specific information on how the stakeholders expect crowdsourcing and citizen science to be employed in the near future in three different settings: Scientific (as deployers), Commercial (as deployers) and Societal (citizens, authorities, civil protection, tourism, and science and academia). Adoption was categorized into short (<3 years), medium (3–5 years) and long term (5–15 years), by different types of users such as innovators, early adopters, early majority, late majority and laggards. A generic description of these categories (from [54]), as provided to the participants, is as follows:
  • Innovators: willing to take risks. Their risk tolerance allows them to adopt technologies that may ultimately fail.
  • Early Adopters: judicious choice of adoption; socially forward.
  • Early Majority: adopt an innovation after a varying degree of time that is significantly longer than the innovators and early adopters.
  • Late Majority: adopt an innovation after the average participant.
  • Laggards: last to adopt an innovation. Averse to change agents and typically focused on “traditions”.
A final open-ended question on how general adoption by the wider citizen community would evolve in the future concluded this section. The following sections of the survey aimed at understanding citizen science and crowdsourcing in a variety of aspects: impact, benefits and opportunities, and barriers. The final section was a generic one, to understand ease of access and manipulation of satellite observation data. The survey concluded with questions requesting permission to contact the participant in the future regarding the survey and the project in general.

4.4. Participants

The survey was created via Google Forms and made available at [55]. It was designed over several weeks, with consultation between project partners and study of similar surveys. An initial lookup of relevant projects and initiatives provided a set of 10 key stakeholders, who were initially contacted to request suggestions of further stakeholders. In all, 30 stakeholders were contacted directly, with requests also sent to several project-based mailing lists. Overall, around 50 stakeholders with expertise in a variety of fields are estimated to have been contacted. Responses were ultimately received from 15 participants, all highly experienced experts in the field with a variety of backgrounds.
A variety of stakeholders responded to the survey; participant roles were categorized as follows: commercial, consulting, researching/testing/developing solutions for citizen science, and leading/organizing crowdsourcing teams/activities. Given their expertise in crowdsourcing and citizen science, particularly relevant to its application in OS, the participants were ideally positioned to provide their input. A majority (9) of the participants are organisers, coordinators or managers of programs (scientific or commercial) that heavily exploit crowdsourcing and OS data, and hence have a good understanding of the issues related to the collection, analysis and exploitation of such datasets. Furthermore, these participants have practical experience in involving and engaging with citizens and communities. Two participants belonged to local authorities and had practical experience in using crowdsourced data for the enhancement of OS data by engaging citizens and testing technological solutions. The remaining four participants were users who develop and deploy technology for citizen science and crowdsourcing, particularly aimed at the collection of crowdsourced data and aligning it with satellite imagery.
The stakeholders also employ crowdsourcing/citizen science activities to address a variety of application areas, such as understanding mobility, the environment (air quality, land cover and use, and water management), high accuracy time-based applications, education, classification of satellite imagery, emergency management and commodity price collection. As can be seen, the engaged stakeholders encapsulate a wide variety of the communities (scientific, commercial, citizens, authorities and decision makers) related to the survey and, given their expertise on the topic and practical experience in the field, can provide invaluable insights.

4.5. Analysis

The survey was conducted via Google Forms and shared online with the stakeholders. As a result, the responses were available in a structured manner, organized in a spreadsheet. As spreadsheets are very convenient for summarising and analysing categorical responses, two types of analyses were conducted for the two types of data. Categorical responses were grouped based on the categories, and a distribution count was calculated for each combination of categories (e.g., the number of respondents who chose adoption of crowdsourcing by innovators in the short term); a minimal sketch of such a tally is shown below. The second analysis involved understanding the subjective opinions of participants on the current state and future prospects of the field in a variety of contexts. This involved a three-step process: separately developing categories based on an analysis of responses; coding responses based on the developed categories (Figure 1, left); and aggregating categories as hand-written notes (Figure 1, right-top). Finally, these categories were organized using affinity diagrams [56] to aggregate all responses in one place, structured within different thematic areas. Affinity diagrams are an excellent means to consolidate issues, concerns and insights and structure them into larger spaces, providing a high level overview of the data collected. Typically, affinity diagrams involve using post-it notes on large spaces such as walls or boards; however, for this analysis it was sufficient to use a pen-and-paper method with colour-coded notes [57]. These are standard HCI approaches used in the analysis of subjective datasets such as views, opinions and feedback, and they were essential in organizing the data collected from the survey and its subsequent analyses.
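The categorical tally described above can be reproduced with a few lines of scripting. The following Python sketch is purely illustrative (the project used a spreadsheet; the column names and category labels here are hypothetical) and shows how responses exported from Google Forms could be aggregated into a distribution table:

# Minimal, illustrative sketch (not the project's actual script): tallying
# categorical survey responses into a distribution table. Column names and
# category labels are hypothetical.
import pandas as pd
# Hypothetical export: one row per respondent, one column per adopter category,
# each cell holding the adoption horizon chosen by that respondent.
responses = pd.DataFrame({
    "Innovators":    ["Short", "Short", "Short", "Medium"],
    "EarlyAdopters": ["Short", "Medium", "Short", "Medium"],
    "Laggards":      ["Long", "Long", "Medium", "Long"],
})
# For each adopter category, count how many respondents chose each horizon
# (e.g., respondents expecting Innovators to adopt in the short term).
distribution = (
    responses.melt(var_name="adopter_category", value_name="horizon")
             .groupby(["adopter_category", "horizon"])
             .size()
             .unstack(fill_value=0)
)
print(distribution)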

5. Survey Results

The responses of stakeholders were highly insightful and identified several issues, while highlighting the immense potential of engaging citizens in crowdsourcing and scientific activities, particularly to enhance OS and space-based products. The survey, being online, gave stakeholders the opportunity to follow up with the authors had there been any confusion about the survey questions. Prior to contacting stakeholders, the survey had been validated and cross-checked by the Crowd4Sat consortium. The survey also included descriptions of terms that could have been misinterpreted, such as the standard definitions of innovators, early adopters and so on. As a result, the responses indicate that the survey questions were interpreted by all respondents as originally intended.

5.1. Scientific Community

The first aspect involved the scientific community—the questions were aimed at understanding uptake among academics, researchers and institutes in the short, medium and long term. Figure 2 shows the distribution of stakeholder views on the adoption of citizen science initiatives in the scientific community.
As can be observed, and indeed expected, all participants agreed that innovators are most likely to adopt citizen science/crowdsourcing activities within the next three years, while most agreed that laggards may take up to 15 years. A consensus was also observed, with most participants agreeing that early adopters in the scientific community can be expected to deploy such initiatives within three years. Opinion regarding the early and late majority was divided, indicating a relatively spread-out distribution of adoption.
When asked (as a subjective question) how they felt scientific adoption would change in the future, most experts shared a positive outlook on the future of citizen science in the scientific community. Some (3) were relatively negative, noting that there is a fair amount of “resistance to the use of citizen science data” owing to complexities, and hence expecting a slow process. Another participant noted uncertainty in several aspects, such as precisely “determining the information value” of citizen science data. One participant also questioned the acceptance of citizen science solutions due to the “cultural changes required” in adopting such mechanisms—for example, scientists who adhere to traditional knowledge practices may be hesitant to accept new forms of data. A further concern was that crowdsourced observations need to be “relatively cheap”, as the scientific community would be unlikely to include such observations if there were expenses involved in testing them.
Positive responses, however, were highly encouraging, with some participants noting that adoption will create a standard for the scientific communities and that such mechanisms are ready to be adopted right now. An example of crowdsourced information being used for scientific research is the eBird project, as discussed earlier. One participant noted that advances in social media coincide with such citizen science approaches and that the two need better integration, which could provide a step change in citizen science. Other needs cited include more UAV data, free very high resolution data, new low cost satellites serving free data, and high volumes of high precision data, all of which could trigger a step change in the adoption of citizen science initiatives.

5.2. Commercial Organisations

Similar to the adoption by the Scientific Community, participants were requested to provide their views on citizen science/crowdsourcing initiatives deployed by commercial organizations. This is a slightly different aspect, as enterprises are primarily driven by financial benefits and rewards, while the scientific community is driven by scientific goals. Figure 3 presents the distribution of the stakeholder views on adoption of citizen science and crowdsourcing initiatives in a commercial setting.
Compared to Figure 2, we can observe a larger variation in the opinions of the stakeholders regarding adoption in a commercial setting. While most participants agree that laggards are most likely to adopt citizen science in the longer term, it appears that early adopters and the early majority are more likely to adopt citizen science solutions in the short to medium term. Innovators in commercial organizations will likely be more conservative in adopting citizen science solutions in the short term. This could possibly be due to the “perceived risks involved” in citizen science solutions, and the need for investments to cover expenses in the field (as mentioned in the previous section).
This is reflected in the opinion of one stakeholder: “Sound business case not yet so clear and proved from the business point of view”. This was a common theme across most of the participants. Several stakeholders noted that the aim of commercial entities is to drive profits and earn revenue—while some organizations (such as Apple, Google, etc.) are using such initiatives internally, there is very limited use of citizen science in a commercial setting. One participant mentioned that some organizations can be “supporters of citizen science activities as part of their charitable or educational support activities”, which can then seed new commercial models and products. Overall, the views on commercial organizations adopting citizen science initiatives were conservative and neatly summed up by the following comment: “I don’t have a great feel for this—companies aren’t set up for this. I can see them making some kind of business model out of this, e.g., micropayments...”.

5.3. Societal Adoption

The adoption of citizen science and crowdsourcing in various aspects of daily life can indicate how well a new technology or approach has impacted citizens. In order to assess this, participants were requested to share their views on how soon different sections of society can be expected to adopt citizen science and crowdsourcing.
As can be observed from Figure 4, most of the areas could adopt citizen science initiatives in the short to medium term (within the next five years). Few stakeholders believed that adoption would be in the longer term. Among all the options, the stakeholders felt that citizen science initiatives would be adopted earliest in science and academia. Tourism and civil protection can also be expected to have many early adopters. This can be well explained, since the tourism and civil protection (emergency response) domains have seen a paradigm shift in the way citizens can contribute. The tourism industry has been revolutionized by a variety of websites such as TripAdvisor and Airbnb, which rely on citizens to drive their data. Emergency response, in limited cases, has also seen an uptake of citizen-generated data [59,60,61], and several authorities are highly active on social media and exploit social data as a means to find critical information, albeit after much effort to overcome institutional resistance. Indeed, many of the crowdsourcing initiatives in the Earth Observation domain have already been in mapping disasters and emergencies [62,63,64], and it is expected that citizen participation and crowdsourcing will continue to be a major aspect of emergency response. Hence, there is an excellent opportunity for citizen science and crowdsourcing initiatives to be adopted by such domains, although there is still a long way to go. Concerns such as lack of resources, limited staff experience, and fears over data quality, privacy and security [65,66] continue to exist and need to be addressed for wider acceptance.
Three overarching themes were identified in the responses of the participants in this section: citizen adoption of technology and the digital medium in general; ease of use and usefulness; and region of interest. A general increase in the uptake of technology such as mobile phones and tablets, together with the increased penetration of the Internet among citizens, is expected to be the primary driver of the adoption of citizen science initiatives in society. The ease of use (and subsequent value) of the technology is extremely important in influencing the adoption of citizen science. Participants felt that increased citizen participation is likely in emerging countries, and this can be accelerated by the involvement of non-profit organizations in this sector.

5.4. Participant Engagement

Three subjective questions were part of this aspect of the survey:
  • How has citizen engagement changed over the years?
  • How do you expect participant involvement to change in the future?
  • How do you think participant engagement can be boosted?
The first question was an attempt to summarize the experience of the stakeholders in involving participants. Most of the participants have been directly involved with citizen science and crowdsourcing initiatives, while some have also led several initiatives and large projects. Hence, it is important to understand how engaged citizens and communities are in such projects. Interestingly, responses to this question had a few overarching themes. Many (8) participants had positive experiences and noted an increased involvement of citizens, while a few (5) noted that a generic answer to the question was not possible. Two participants mentioned that there has been little change in the past years, primarily due to a lack of awareness of crowdsourcing and citizen science initiatives. Furthermore, certain topics such as traffic and mobility have seen a great change in adoption owing to the benefits citizens can gain from their participation. Another participant noted that participation from common citizens was still low, while expert citizens and researchers currently contribute the most to crowdsourcing initiatives. Citizen participation has also increased due to several groups being more willing to engage, such as boy scouts, youth groups, science clubs, senior citizen homes and so on.
The difficulty in providing a concrete response to this question was explained in several ways: firstly, engagement varies greatly based on application areas, projects and programs, and over the duration of the activities—some “have built up over many years and plateaued in number of participants”, while “others have gone down”. Some major issues that citizen science owners are currently struggling with are sustaining contributions, limited activity and attrition of participants. Presently, it is also difficult to assess the status of many initiatives. Many factors could be attributed to greater or lower participation, such as commercial and social drivers, communication strategy, incentives and crowd motivation. However, these differ for each project: some advertise better or use the media better to recruit, while others have better interfaces and feedback/contact with their citizens, etc.
The second question was to ascertain how experts expected citizen science and crowdsourcing to evolve in the coming years. This question received a more harmonious response from all participants—most stakeholders believe there will be a constant increase in citizen participation. Several reasons were provided, with access to technology and increasing awareness of technology and citizen science initiatives being the most cited. A few participants mentioned the need to tie such initiatives to the personal activities and hobbies of citizens to ensure citizen participation. Another way to ensure greater participation is to develop suitable tools to address citizen science needs related to issues that are important and relevant to citizens. One responder mentions: “citizen engagement will consist of three components: (1) consolidation and expansion of citizen observatories and citizen science projects; (2) recruitment of new citizen observers from society at large; and (3) improvement of knowledge transfer capacity building through the formation of local leaders to strengthen local-community networks”. This provides an insight into how future road mapping activities can be conducted: as mentioned, a large number of projects already exist, and some effort can be focused on integrating and consolidating such initiatives. Initiatives such as SciStarter can help volunteers find the kinds of crowdsourcing projects and application areas they are interested in and passionate about, and hence eventually increase engagement.
The final question addressed how citizen engagement can be improved in the context of crowdsourcing and citizen science. Several ideas emerged: one of the most popular was to create appropriate strategies and planning to ensure citizens’ interests are aligned with the citizen science activities. This will need “win-win” models to boost engagement by providing some kind of benefit to the participants. Engagement needs to be researched, as “it could take different actions to keep different people engaged. For some it may be rapid response on the data collected ..., for others it could be hearing from scientists involved in the program”. Another core idea emerged very strongly: communication will be key to ensuring citizen participation, starting from raising awareness of the culture of citizen science at schools, to communicating with citizens and partnering with other organizations that already have members. The use of social media was also noted, and there is a need for stronger ties between citizen science and social media. One participant summarized the various ideas that emerged very well:
“(we can ensure greater engagement) through good design, good media campaigns, strong feedback mechanisms, appealing idea, strong link to science, good links to grassroots organizations”.

5.5. Impacts, Benefits and Opportunities

This section of the survey addressed the crowdsourcing/citizen science aspect of Earth Observations: in essence, four questions were posed to the participants:
  • How do you think citizen science could be better used with satellite observations?
  • How do you expect citizen science initiatives could contribute to society over the next 3, 5, 15 years?
  • How do you think satellite related citizen science initiatives could better target education and outreach?
  • Are there any areas you think citizen science could really make a difference?
Stakeholder responses to the first question addressed a few key aspects, primarily using citizen science data to augment and improve understanding of the contextual environment. There should be mechanisms that allow citizens to see “immediately or automatically the results of data collection”, and that communicate the end results of their activities and how they have contributed. Another participant also mentioned this idea, whereby the data collected can be made easy to access and visualize, NASA Giovanni [67] being a good example of such an initiative. Citizen science data can potentially verify and improve the spatial and temporal resolution of satellite observations, and there is a need to find more usable ways of working with multiple instruments. Producing guidelines on how new citizen science observatories can be started, along with best practices for technical tools, could provide a good starting point. Indeed, the need for ensuring the sustainability of crowdsourcing solutions is an urgent one. One good example where crowdsourcing has served well is the disaster management domain. Another suggested use for citizen science could be in the validation of EO products and the interpretation of imagery.
The outlook for citizen science initiatives contributing to society was highly positive among the stakeholders: most believe that in the next 3–5 years there will be a significant contribution within different aspects of society. Crowdsourcing can be expected to grow fast in the next three years, though in terms of applications for EO and OS data, crowdsourcing is expected to remain a niche over that period. As OS are reaching a stage of maturity, citizen science could be a disruptive technology that promises significant improvement. One of the main areas of impact could be providing complementary observations during the revisit times of satellites. Several participants noted that emergency response and environmental monitoring could be areas where citizen science could significantly contribute to society. A variety of application areas were also suggested, such as fisheries, climate change monitoring, biodiversity monitoring, water resources monitoring, carbon sources and agricultural applications.
All participants agreed that targeting education and outreach would be a very helpful activity and, as a result, many ideas were proposed. It is very effective to teach young school students from the very first classes; however, the communication and content need to be differentiated by the targeted age. There is a need to underline and highlight the potential of citizen science initiatives. Some ideas emerged on making scientific research projects mandatory in all science classes for grades 3–12 and in college-level general science education. Another possibility could be to provide tools and data for free to undergraduate and postgraduate students. Finding new academic groups not traditionally involved in crowdsourcing could also help increase the uptake of citizen science and crowdsourcing. A gaming approach to education was also suggested, while partnering with existing programs could also help in accessing more students. One such example of gamification is the fold.it game [68], which presents gamers with computationally difficult protein folding problems in the form of puzzles [69]. Further ideas were also suggested, such as outreach activities at science festivals, “train the trainer” workshops, engaging stakeholders and volunteers, bioblitzes (group activities in which scientists, hobbyists and volunteers engage in an intensive field study over a continuous period, in an attempt to record biodiversity and the wide variety of species living in a geographical area, as well as to encourage more public participation and generate interest; from [70]), DIY workshops, species surveys, etc.
A wide variety of potential areas where citizen science could make a difference were also identified, for example: health citizen science (e.g., uncovering root causes), urban planning, homeland security, environmental modelling, crisis/emergency management, public affairs, science education and the automotive industry, to name a few.

5.6. Barriers

In order for citizen science initiatives to be implemented and adopted on a larger scale, several barriers will need to be overcome: participants were requested to provide their views on procedural barriers, technical barriers, and trust and provenance. Multiple procedural barriers were identified: data protection and privacy (particularly challenging with real-time data and the open data movement); standardization; clear policies; ethics and anonymization (respecting the privacy of users); commercial barriers (particularly when information needs to be shared among multiple sectors); understanding the true value of citizen science data; and confidence in citizen science data (professional/scientific). Further procedural barriers were also identified, such as the need to bring together multiple groups to solve a common problem, buy-in from national mapping agencies and so on.
Technical barriers involve dealing with the heterogeneous nature of data, lack of Big Data infrastructure, unavailability of sufficient data, complexity of tools and technologies, bias in the data (temporal and spatial), the need for easy-to-use technologies for both technology-savvy and naïve users, Internet availability, and data sharing.
The final barriers citizen science initiatives need to overcome are in trust and provenance. Most participants agreed that this is a complex task and extremely challenging to address: “Anybody working within legislated frameworks will have issues with data that have unknown quality/undescribable quality” (provenance), while “many aspects including security and privacy and possible manipulations” are concerns that relate to trust. One participant noted the need to ensure bias is not introduced into the citizen science data; hence, a clear process of capturing provenance information, including demographics, is needed. The lack of a data quality standard for citizen-science-generated data is also a significant challenge for the community. Most responses call for standardized validation processes, which will need to be investigated. There is also a concern regarding the importance given to citizen science data—while the public can trust “CS data to choose a restaurant and book their holidays”, “professional and some scientific communities are still doubtful about CS data quality and may be reticent to adoption”.

6. Summary and Conclusions

While the need for crowdsourcing and citizen science in the enhancement of space-based observations is clear and well motivated, the field is a long way from being an established aspect of Earth Observations. In spite of the benefits being clear to all stakeholders, significant and consolidated effort is required for crowdsourcing to gain greater acceptance and prominence. We believe that, with rising interest in and demand for crowdsourced information, there is also a need to understand the hurdles and challenges foreseen by stakeholders, which can then be addressed systematically. This is an extremely important step at a critical juncture where rising demand and interest in low-cost approaches for enhancing space-based observations meet large communities of well-motivated, technically proficient and increasingly conscious citizens. Addressing these concerns at this stage can help shape how the field evolves in the near future. To this end, we identify the critical aspects of the stakeholder views and attempt to create a roadmap for the citizen science and crowdsourcing community. We believe these are the most pressing aspects and that they will need to be addressed in order to make a step change in crowdsourcing not only for Earth Observations, but also for academia, research, society and industry. We list below the three most critical areas that we believe must be urgently addressed for a significant impact:

6.1. Data Governance, Standardization, Trust

Crowdsourced data can make a significant impact on an application area; however, the quality and trustworthiness of the information and of the contributors are often uncertain and difficult to assess. Privacy is critical, as personal information needs to be protected, anonymized and always securely stored. At the same time, there is a need to preserve the maximum amount of metadata and provenance information related to crowdsourced observations, to further provide measures for assessing data quality and user context. The primary recommendation for such concerns is to start developing policies and protocols that ensure a well-established standard for collecting, treating and storing crowdsourced data. Standardization is extremely important, as it can generalize how such data are treated, with confidentiality and privacy taken into consideration. The survey reported in [71] highlighted the diversity of 121 citizen science initiatives, presented how various projects can re-use and share data, and discussed how a large number of the initiatives already have a dedicated data management plan providing access to raw data. Such surveys can be extremely important in understanding how other citizen science projects handle various aspects of data (e.g., re-use, sharing, licensing, and quality control).
A standardized approach will also help in increasing credibility and confidence in the data collected, and hence make the data more acceptable to different communities such as scientific bodies, authorities and industrial organisations. Furthermore, standardization will significantly help in data sharing and re-use between projects and initiatives. Given that further complexities do exist in sharing commercially sensitive information and in licensing restrictions, we believe a formal and standardized approach to data collection, processing and storage will eventually significantly simplify the process. Hence, the primary recommendation is to establish a framework of standardization, based on guidelines and principles for formalizing the process of data governance. Core working groups such as ECSA’s (European Citizen Science Association) Projects, Data, Tools and Technology Committee (the European Citizen Science Association’s working group on projects, data, tools and technologies [72]) and the Citizen Science Association’s Data and Metadata working group [73] have already started efforts to consolidate expertise in the field and can help adapt the activities of wider communities (the Data on the Web Best Practices W3C Working Group [74]; the Spatial Data on the Web W3C Working Group [75]) toward developing a standardized framework for data governance and management. The ISO 19156 model for Observations and Measurements can also serve as an excellent starting point [76]. Particularly relevant in this context is the work done by the Semantic Web [77] and Linked Data [78] communities, where data are represented with metadata in a machine-readable and processable manner, adhering to strict ontological rules and domain formalisms. Such initiatives can also significantly assist in addressing the concerns regarding provenance and trust in crowdsourced datasets. Provenance is metadata regarding an object’s (here, a data point or observation) origin and history [79]. Several models have been proposed for formalizing provenance in crowdsourced data and VGI [80,81,82,83], which can be used as a great starting point for capturing provenance.
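As a purely illustrative sketch of how such vocabularies could be combined in practice (this is our own example, not an output of the Crowd4Sat project; all identifiers, values and the contributor are hypothetical), a single crowdsourced observation could be encoded together with its provenance metadata using the W3C SOSA and PROV-O vocabularies, for instance via the Python rdflib library:

# Illustrative only: one crowdsourced observation with provenance metadata,
# expressed using the W3C SOSA and PROV-O vocabularies. All URIs and values
# are hypothetical placeholders.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD
PROV = Namespace("http://www.w3.org/ns/prov#")
SOSA = Namespace("http://www.w3.org/ns/sosa/")
EX = Namespace("http://example.org/crowd/")  # placeholder project namespace
g = Graph()
g.bind("prov", PROV)
g.bind("sosa", SOSA)
obs = EX["observation/42"]
volunteer = EX["volunteer/0017"]
# The observation itself: what was observed, the result and when.
g.add((obs, RDF.type, SOSA.Observation))
g.add((obs, SOSA.observedProperty, EX["property/waterLevel"]))
g.add((obs, SOSA.hasSimpleResult, Literal("1.8", datatype=XSD.decimal)))
g.add((obs, SOSA.resultTime, Literal("2016-11-25T10:30:00Z", datatype=XSD.dateTime)))
# Provenance: who contributed the observation and through which activity.
g.add((volunteer, RDF.type, PROV.Agent))
g.add((obs, PROV.wasAttributedTo, volunteer))
g.add((obs, PROV.wasGeneratedBy, EX["activity/mobileAppSubmission"]))
print(g.serialize(format="turtle"))

Records of this kind keep the contributor, method and timing of each observation alongside the measurement itself, which is precisely the metadata needed for downstream quality assessment.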

6.2. Social Presence, Support and Education

All stakeholders stressed the importance of the social aspect of crowdsourcing and citizen science. Of course, the very need for citizen engagement warrants a significant focus on the societal aspects of involving communities and citizens. One view dominated stakeholder responses: such initiatives need a very strong focus on communication. It is important to critically analyse successful and unsuccessful crowdsourcing initiatives to understand how they have communicated with citizens. Such initiatives typically use advertising and marketing campaigns, outdoor activities and social media, and there is a need to study and understand the impact and possibilities of each mechanism. Communication also requires a continuous exchange of ideas, feedback and viewpoints between citizens, authorities and scientists. Often, engagement from scientists and experts in the field can significantly increase the interest of the public and ensure continued interaction [84,85,86,87,88].
The next important aspect is to align crowdsourcing activities with citizen interests. This requires firstly understanding the topics, areas and domains of interest of target communities and then developing mechanisms to ensure the required crowdsourced observations can be collected through activities that involve such topics of interest (the recently released game Pokémon Go, for example, has been hugely successful at engaging citizens in physical activities and exploring localities, and the SciStarter blog discusses the possibilities of combining such games with citizen science activities [89]). There is a need to involve citizens and communities in iterative user-centred design processes, where citizens are consulted on different aspects of the projects to instil a sense of belonging and ownership. Engaging citizens in outreach activities can provide excellent ways for them to become interested in science and the scientific process in an enjoyable environment—though such events have often been part of camps and school activities, there is a need to increase such efforts. In conjunction with ubiquitous technologies such as smartphones, smart watches and tablets, such efforts can help engage different communities such as school children and young professionals. The elderly population also risks being alienated if adequate care is not taken to develop technologies that can assist them in providing their contributions. Their experience and local and community knowledge can offer an immense wealth of information that is invaluable to citizen science initiatives—hence, there should be a focus on developing tools and technologies that are user friendly and intuitive for such communities. Finally, involving non-profit organizations and tie-ups with grassroots organizations can help develop win-win models for social engagement, which can eventually help foster greater collaboration.

6.3. Technical Implementation

While engaging citizens and communities has social and procedural dimensions, several technical challenges still hinder citizen engagement. A concern among the stakeholders was the lack of generic technologies that citizens and communities can quickly adopt and use in their daily crowdsourcing activities, particularly for dealing with complicated datasets, combining heterogeneous data and working with Big Data infrastructure. The first step in addressing this would be to formalize the process of data collection and archiving itself; hence, a standardized data governance framework would be a significant help toward developing and using generic technologies. One of the most important views of stakeholders was the need to engage citizens by providing them with interactive visualizations and feedback about the data they contributed and how they contributed to the broader initiative. A number of user-friendly and freely available tools, such as Google Spreadsheets, Infogr.am [90], Tableau [91] and Visualizefree [92], can already help (often non-technical) users explore and analyse datasets without the need to write complex code or algorithms. These tools enable users to upload their data as CSV files or databases, or to provide a URL where the data files are hosted, after which dashboards can be easily created with the help of intuitive wizards and visualization suggestions. Additionally, such tools can also generate publicly accessible links that can be shared within communities. Many such tools, such as Tableau, are also highly community-driven, and hence a significant amount of support (e.g., tutorials, examples, and discussion boards) is available for users requiring assistance. Similar (commercially available) tools [93,94] are currently used in commercial settings to support managers and business analysts with their Business Intelligence needs. Although widely available today, the stakeholders acknowledged that there is a need to increase awareness of such tools among citizens, and also to develop them further into innovative, user-friendly tools that can be used by all user communities. Such tools can be easily deployed within a standardized framework to help citizens engage with their data, and hence feel more involved in the process of data collection and citizen science.
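As a simple illustration of the kind of contributor feedback such tools generate, the sketch below (using the pandas and matplotlib Python libraries) summarises a hypothetical CSV export of a volunteer's contributions. The file name and column names are assumptions made for illustration only and do not correspond to any of the tools or projects cited above.

```python
# Illustrative sketch: summarise and plot a volunteer's contributions from a
# hypothetical CSV export so contributors can see their own impact over time.
import pandas as pd
import matplotlib.pyplot as plt

# Assumed export: one row per contributed observation, with a timestamp
# ("observed_at") and an observation category ("category").
observations = pd.read_csv("contributions.csv", parse_dates=["observed_at"])

# Count contributions per month, split by category.
monthly = (
    observations
    .groupby([observations["observed_at"].dt.to_period("M"), "category"])
    .size()
    .unstack(fill_value=0)
)

# Stacked bar chart of contributions over time.
monthly.plot(kind="bar", stacked=True)
plt.xlabel("Month")
plt.ylabel("Number of contributed observations")
plt.title("Your contributions over time")
plt.tight_layout()
plt.savefig("contribution_summary.png")
```

Dashboards of this sort, whether produced with a few lines of code or through the no-code tools discussed above, give contributors visible evidence of how their data feed into the broader initiative.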
At the same time, several tools have been developed that can seamlessly manage the entire lifecycle of crowdsourced data: collection and gathering, linking with external resources, analysis, visualization and final reporting. Ushahidi [95] is perhaps one of the best-known tools that communities have been using over the past few years, and several plugins are widely available that can add value to the data. Several other tools, such as OpenDataKit [96] and Gemma [97], are also available that aim at mass data collection and mapping. While some of these tools may require some technical proficiency, the process of data collection and its subsequent analysis has never been easier. In fact, many of the tools discussed above do not require users to write a single line of code; this has significantly simplified the process of initiating, monitoring and sustaining citizen science projects. While many of these tools are not designed for raw EO and OS data, using crowdsourcing to annotate satellite imagery, via user-friendly tools and open data, can increase the capability of citizens to make highly significant discoveries. Several examples of such initiatives exist, particularly in planetary and space science, such as Galaxy Zoo [98], Zooniverse [99] and iMars [100,101], with some already generating scientific discoveries. For example, SpaceWarps [102] discovered 29 new gravitational lenses with the help of volunteers, which had earlier been missed by computer programs [103]. Google Earth is also an excellent example where discoveries have been made by citizens studying features in satellite imagery [104,105].
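When satellite image tiles are annotated by many volunteers, the individual labels must be combined into a single consensus label. The sketch below shows one simple, generic way of doing this by majority vote; the tile identifiers, labels and agreement threshold are illustrative assumptions and do not reflect the actual aggregation methods used by Galaxy Zoo, Zooniverse or any other project cited here.

```python
# Illustrative sketch: combine several volunteers' labels for the same satellite
# image tile by majority vote, keeping only tiles with sufficient agreement.
from collections import Counter

# Hypothetical volunteer annotations: tile identifier -> list of contributed labels.
annotations = {
    "tile_001": ["forest", "forest", "cropland", "forest"],
    "tile_002": ["water", "water", "water"],
    "tile_003": ["urban", "cropland", "water"],
}

CONSENSUS_THRESHOLD = 0.6  # assumed: at least 60% of volunteers must agree


def aggregate_labels(labels):
    """Return (majority label, agreement ratio) for one tile's volunteer labels."""
    counts = Counter(labels)
    label, votes = counts.most_common(1)[0]
    return label, votes / len(labels)


for tile_id, labels in annotations.items():
    label, agreement = aggregate_labels(labels)
    if agreement >= CONSENSUS_THRESHOLD:
        print(f"{tile_id}: consensus label '{label}' (agreement {agreement:.0%})")
    else:
        print(f"{tile_id}: no consensus, flag for expert review")
```

Tiles that fail to reach consensus can be routed back to additional volunteers or to domain experts, which is one way crowdsourced annotation pipelines keep quality high without discarding contributions.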

6.4. Human Centered Design

Finally, technical solutions requiring citizen involvement need a significant effort from the User Experience (UX) and interaction design communities. There is a great need for developing highly usable and intuitive interfaces, devices and technologies in order to provide users with an enjoyable and stimulating experience. This is one of the important aspects that can significantly impact the acceptability of solutions and, eventually, the continued engagement of users. Several guidelines and principles can be used as an excellent starting point for both browser-based and mobile interfaces [106,107,108,109]. At the same time, care should also be taken to ensure that solutions are designed for all user groups; while technology-savvy users can be comfortable with various interaction mechanisms and interfaces, several user communities risk being left out if solutions are not designed with them in mind. Elderly retired volunteers are one community that can provide highly valuable insights, opinions and local knowledge, which need to be accessed and catalogued by appropriate means. Interaction and data presentation mechanisms specifically designed for elderly populations are critical to ensuring such communities can effectively participate in citizen science and crowdsourcing initiatives (e.g., [110,111]).
Assessing stakeholder views and opinions is a significant part of understanding how the field will evolve over the next few years. With their many years of experience and expertise in crowdsourcing and EO, the stakeholders have a very strong understanding of, and deep insight into, the practical considerations, opportunities and challenges ahead. This strand of activity within the Crowd4Sat project has hence been highly engaging and has provided a wealth of insight and ideas. It is also important to understand the limitations of the survey. As discussed earlier, the number of participants was comparatively low (15). However, all participants are highly active experts in the field of crowdsourcing, particularly for Earth Observations, and a majority are involved in initiating, monitoring and sustaining citizen science and crowdsourcing projects aimed at enhancing and exploiting satellite observation data. Participants also belonged to various communities, such as research and academia, industry, and decision makers and authorities. Although a larger number of respondents would be very helpful, we believe the views and opinions of the participants are concrete and provide a strong starting point for future surveys and studies. At the same time, it must be noted that all the participants are heavy users of crowdsourcing and EO data, and hence the survey does not include the viewpoints of sceptics.
The next steps will involve stakeholders in a more engaged manner, using a variety of methodologies such as focus groups and interviews, but focusing on specific areas and domains. This will also involve more focused questions on space-based products, datasets and services. For example, one interesting way to continue will be to invite developer communities to build tools, devices and applications within the framework of hackathons and data mashups. Given the need for highly usable and well-designed solutions, a possible next step could be to employ the developed solutions within the context of citizen science workshops; the lessons and findings could be paramount to addressing one of the big challenges in the field. We also aim to conduct a more focused and thorough review of existing citizen science initiatives for EO and OS, particularly those that align with the application areas identified in this paper: scientific, societal and commercial. Finally, another avenue we aim to explore is how a standardized framework based on Semantic Web and Linked Data technologies can be developed for releasing crowdsourced information and crowdsourcing tools.

Acknowledgments

This research was supported by the European Space Agency funded Crowd4Sat project. The authors would also like to sincerely thank all the stakeholders and organizations for their inputs and interesting ideas. We would also like to acknowledge the EU FP7 funded WeSenseIt project (Grant agreement No. 308429) and the EU H2020 funded Seta project (Grant agreement No. 688082).

Author Contributions

All authors designed the survey and analysed the results.

Conflicts of Interest

The authors declare no conflict of interest. The funding sponsors had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Goodchild, M.F. Citizens as sensors: The world of volunteered geography. GeoJournal 2007, 69, 211–221. [Google Scholar] [CrossRef]
  2. Elwood, S. Volunteered geographic information: Future research directions motivated by critical, participatory, and feminist GIS. GeoJournal 2008, 72, 173–183. [Google Scholar] [CrossRef]
  3. Longueville, B.D.; Luraschi, G.; Smits, P.; Peedell, S.; Groeve, T.D. Citizens as sensors for natural hazards: A VGI integration workflow. Geomatica 2010, 64, 41–59. [Google Scholar]
  4. Gill, A.; Bunker, D. Crowd Sourcing Challenges Assessment Index for Disaster Management. Available online: https://pdfs.semanticscholar.org/7725/5b503bdab5b82627fa3e801042ef81bbd669.pdf (accessed on 12 January 2017).
  5. Craglia, M.; de Bie, K.; Jackson, D.; Pesaresi, M.; Remetey-Fülöpp, G.; Wang, C.; Annoni, A.; Bian, L.; Campbell, F.; Ehlers, M.; et al. Digital Earth 2020: Towards the vision for the next decade. Int. J. Digit. Earth 2012, 5, 4–21. [Google Scholar] [CrossRef]
  6. Milford, H. The Making of the Dictionary. Periodical 1928, 13, 3–33. [Google Scholar]
  7. Sobel, D. Longitude: The True Story of a Lone Genius Who Solved the Greatest Scientific Problem of His Time; Bloomsbury Publishing USA: New York, NY, USA, 2007. [Google Scholar]
  8. Howe, J. The rise of crowdsourcing. Wired Mag. 2006, 14, 1–4. [Google Scholar]
  9. Sullivan, B.L.; Wood, C.L.; Iliff, M.J.; Bonney, R.E.; Fink, D.; Kelling, S. eBird: A citizen-based bird observation network in the biological sciences. Biol. Conserv. 2009, 142, 2282–2292. [Google Scholar] [CrossRef]
  10. HORIZON 2020 Space Advisory Group, Advice on Potential Priorities for Research and Innovation in the Work Programme 2016–2017. July 2014. Available online: http://ec.europa.eu/transparency/regexpert/index.cfm?do=groupDetail.groupDetailDoc&id=15249&no=1 (accessed on 12 January 2017).
  11. See, L.; Fritz, S.; McCallum, I. Beyond sharing Earth observations. Environment 2014, 166, 168. [Google Scholar]
  12. NASA Science Mission Directorate, Citizen Science for Earth Systems Program. 2016. Available online: http://go.nasa.gov/1SUOO63 (accessed on 20 November 2016).
  13. Citizens’ Observatories Projects, Citizens’ Observatories. 2012. Available online: http://www.citizen-obs.eu/Home.aspx (accessed on 20 November 2016).
  14. Copernicus Masters, Copernicus Masters Winners. 2016. Available online: http://copernicus-masters.com/index.php?kat=competition.html&anzeige=winners.html (accessed on 20 November 2016).
  15. Crowd4Sat|Crowdsourcing for Observations from Satellites. Available online: http://crowd4sat.eu (accessed on 16 January 2017).
  16. Gould, M.; Craglia, M.; Goodchild, M.F.; Annoni, A.; Camara, G.; Kuhn, W.; Mark, D.; Masser, I.; Maguire, D.; Liang, S.; et al. Next-Generation Digital Earth: A Position Paper from the Vespucci Initiative for the Advancement of Geographic Information Science. Int. J. Spat. Data Infrastruct. Res. 2008, 3, 146–167. [Google Scholar]
  17. Goodchild, M.F.; Guo, H.; Annoni, A.; Bian, L.; de Bie, K.; Campbell, F.; Craglia, M.; Ehlers, M.; van Genderen, J.; Jackson, D.; et al. Next-generation digital earth. Proc. Natl. Acad. Sci. USA 2012, 109, 11088–11094. [Google Scholar] [CrossRef] [PubMed]
  18. Haklay, M. Long-Running Citizen Science and Flynn Effect. 2014. Available online: https://povesham.wordpress.com/2014/10/25/long-running-citizen-science-and-flynn-effect/ (accessed on 12 January 2017).
  19. Silvertown, J. A new dawn for citizen science. Trends Ecol. Evol. 2009, 24, 467–471. [Google Scholar] [CrossRef] [PubMed]
  20. Wikipedia, Cave of Altamira. Available online: https://en.wikipedia.org/wiki/Cave_of_Altamira (accessed on 22 November 2016).
  21. Arduino. Available online: https://www.arduino.cc (accessed on 20 November 2016).
  22. Raspberry Pi—Teach, Learn, and Make with Raspberry Pi. Available online: https://www.raspberrypi.org (accessed on 20 November 2016).
  23. Tableau Software. Available online: http://www.tableau.com/ (accessed on 20 November 2016).
  24. Qlik. Available online: http://www.qlik.com/us/ (accessed on 20 November 2016).
  25. Search for a Dataset—The Datahub. Available online: https://datahub.io/dataset (accessed on 20 November 2016).
  26. Search Government Data. Available online: http://data.gov.uk (accessed on 20 November 2016).
  27. Paulos, E.; Honicky, R.; Hooker, B. Citizen Science: Enabling Participatory Urbanism. Urban Informatics: Community Integration and Implementation. Available online: https://pdfs.semanticscholar.org/7a60/9f5fce9cf6b0df91d33b8c8b80d121a2f35e.pdf (accessed on 17 January 2017).
  28. Rise of the citizen scientist. Nature 2015. [CrossRef]
  29. Breen, J.; Dosemagen, S.; Warren, J.; Lippincott, M. Mapping Grassroots: Geodata and the structure of community-led open environmental science. ACME 2015, 14, 849–873. [Google Scholar]
  30. SciStarter. Available online: http://scistarter.com/ (accessed on 20 November 2016).
  31. Haklay, M. Citizen Science and Policy: A European Perspective; The Woodrow Wilson Center, Commons Lab: Washington, DC, USA, 2015. [Google Scholar]
  32. I See Change—Community Climate & Weather Journal. Available online: https://www.iseechange.org (accessed on 20 November 2016).
  33. Pankratius, V.; Lind, F.; Coster, A.; Erickson, P.; Semeter, J. Mobile crowd sensing in space weather monitoring: The mahali project. IEEE Commun. Mag. 2014, 52, 22–28. [Google Scholar] [CrossRef]
  34. Fritz, S.; McCallum, I.; Schill, C.; Perger, C.; Grillmayer, R.; Achard, F.; Kraxner, F.; Obersteiner, M. Geo-Wiki. Org: The use of crowdsourcing to improve global land cover. Remote Sens. 2009, 1, 345–354. [Google Scholar] [CrossRef]
  35. De Miguel, A.S. Spatial, Temporal and Spectral Variation of Light Pollution and Its Sources: Methodology and Resources. 2016. Available online: https://www.researchgate.net/publication/304212932_ (accessed on 17 January 2017).
  36. National Aeronautics and Space Administration (NASA). Image Detective. Available online: https://eol.jsc.nasa.gov/BeyondThePhotography/ImageDetective/ (accessed on 22 November 2016).
  37. Stars4All—A collective Awareness Program for Promoting Dark Skies in Europe. Available online: http://www.stars4all.eu/ (accessed on 22 November 2016).
  38. Dark Sky Meter. 2015. Available online: http://www.darkskymeter.com/ (accessed on 22 November 2016).
  39. De Vecchi, D.; Dell’Acqua, F. Cloopsy: A crowdsourcing mobile app to support and integrate Copernicus land cover mapping. In Proceedings of the EO Open Science, Rome, Italy, 12–14 September 2016.
  40. Del Frate, F.; Carbone, F.; Benedetti, A.; Porzio, L.; Grillini, A.; Picchiani, M. A New Citizen Science Platform for Social Land Cover Maps from EO Data. In Proceedings of the EO Open Science, Rome, Italy, 12–14 September 2016.
  41. Moreno, L. Street Health. Earth Observation, Machine Learning and Opendata for Urban forest management. In Proceedings of the EO Open Science, Rome, Italy, 12–14 September 2016.
  42. Snik, F.; Rietjens, J.H.H.; Apituley, A.; Volten, H.; Mijling, B.; Di Noia, A.; Heikamp, S.; Heinsbroek, R.C.; Hasekamp, O.P.; Smit, J.M.; et al. Mapping atmospheric aerosols with a citizen science network of smartphone spectropolarimeters. Geophys. Res. Lett. 2014, 41, 7351–7358. [Google Scholar] [CrossRef]
  43. Schneider, P.; Castell, N.; Vogt, M.; Lahoz, W.; Bartonova, A. Making Sense of Crowdsourced Observations: Data Fusion Techniques for Real-Time Mapping of Urban Air Quality. Available online: http://meetingorganizer.copernicus.org/EGU2015/EGU2015-3503-1.pdf (accessed on 12 January 2017).
  44. Dubois, G.; Bastin, L.; Fritz, S.; Graziano, M.; Martínez-López, J.; Pekel, J.-F.; Paganini, M. Remote sensing and citizen science in support to biodiversity conservation. The case of DOPA, a Digital Observatory for Protected Areas. In Proceedings of the Living Planet Symposium, Prague, Czech Republic, 9–13 May 2016.
  45. Tomnod. Available online: http://www.tomnod.com (accessed on 20 November 2016).
  46. Partyka, J. Post-Mortem on Flight MH370 Crowdsource Search. 2015. Available online: http://gpsworld.com/post-mortem-on-flight-mh370-crowdsource-search/ (accessed on 20 November 2016).
  47. Fishwick, C. Tomnod—The Online Search Party Looking for Malaysian Airlines Flight MH370. 2014. Available online: http://theguardian.com/world/2014/mar/14/tomnod-online-search-malaysian-airlines-flight-mh370 (accessed on 20 November 2016).
  48. Haklay, M. ECSA2016: Open Citizen Science—Day 2 (Morning). 2016. Available online: https://povesham.wordpress.com/2016/05/20/ecsa2016-open-citizen-science-day-2-morning/ (accessed on 20 November 2016).
  49. Kumar, N.; Belhumeur, P.N.; Biswas, A.; Jacobs, D.W.; Kress, W.J.; Lopez, I.C.; Soares, J.V. Leafsnap: A computer vision system for automatic plant species identification. In Computer Vision—ECCV 2012; Springer: Berlin/Heidelberg, Germany, 2012; pp. 502–516. [Google Scholar]
  50. Regan, T.J.; Master, L.L.; Hammerson, G.A. Capturing expert knowledge for threatened species assessments: A case study using NatureServe conservation status ranks. Acta Oecol. 2004, 26, 95–107. [Google Scholar] [CrossRef]
  51. Powell, J.; Nash, G.; Bell, P. GeoExposures: Documenting temporary geological exposures in Great Britain through a citizen-science web site. Proc. Geol. Assoc. 2013, 124, 638–647. [Google Scholar] [CrossRef]
  52. European Space Agency (ESA). Available online: http://www.esa.int/ESA (accessed on 16 January 2017).
  53. Kyba, C.; Wagner, J.; Kuechly, H.; Walker, C.; Elvidge, C.; Falchi, F.; Ruhtz, T.; Fischer, J.; Hölker, F. Citizen science provides valuable data for monitoring global night sky luminance. Sci. Rep. 2013, 3, 1835. [Google Scholar] [CrossRef] [PubMed]
  54. Wikimedia Foundation, Diffusion of Innovations. Available online: http://en.wikipedia.org/wiki/Diffusion_of_innovations#Adopter_categories (accessed on 20 November 2016).
  55. Future of Citizen Science/Crowdsourcing using Satellite Data Survey Form. Available online: https://goo.gl/54MqsM (accessed on 16 January 2017).
  56. Beyer, H.; Holtzblatt, K. Contextual Design: Defining Customer-Centered Systems; Morgan Kaufmann Publishers Inc.: San Francisco, CA, USA, 1997. [Google Scholar]
  57. Blandford, A. Semi-Structured Qualitative Studies. In The Encyclopedia of Human-Computer Interaction, 2nd ed.; The Interaction Design Foundation: Aarhus, Denmark, 2013. [Google Scholar]
  58. Usability Study Results (Sneak Preview). Available online: https://blog.wikimedia.org/2009/04/24/usability-study-results-sneak-preview/ (accessed on 16 January 2017).
  59. GISCorps. Available online: http://www.giscorps.org/ (accessed on 23 November 2016).
  60. MapAction. Available online: http://mapaction.org/ (accessed on 23 November 2016).
  61. United Nations Office for the Coordination of Humanitarian Affairs (UN OCHA). Available online: http://www.unocha.org/ (accessed on 23 November 2016).
  62. Heinzelman, J.; Waters, C. Crowdsourcing Crisis Information in Disaster-Affected Haiti; US Institute of Peace: Washington, DC, USA, 2010. [Google Scholar]
  63. Kerle, N.; Hoffman, R.R. Collaborative damage mapping for emergency response: The role of Cognitive Systems Engineering. Nat. Hazards Earth Syst. Sci. 2013, 13, 97–113. [Google Scholar] [CrossRef]
  64. Kotovirta, V.; Toivanen, T.; Tergujeff, R.; Häme, T.; Molinier, M. Citizen Science for Earth Observation: Applications in Environmental Monitoring and Disaster Response. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, 40, 1221. [Google Scholar] [CrossRef]
  65. Tapia, A.H.; Bajpai, K.; Jansen, B.J.; Yen, J.; Giles, L. Seeking the trustworthy tweet: Can microblogged data fit the information needs of disaster response and humanitarian relief organizations. In Proceedings of the 8th International ISCRAM Conference, Lisbon, Portugal, 8–11 May 2011.
  66. Li, L.; Goodchild, M.F. The Role of Social Networks in Emergency Management: A Research Agenda. Int. J. Inf. Syst. Crisis Response Manag. 2010, 2, 49–59. [Google Scholar] [CrossRef]
  67. Giovanni. Available online: http://giovanni.gsfc.nasa.gov/giovanni (accessed on 16 January 2017).
  68. Solve Puzzles for Science|Fold.it. Available online: http://fold.it/portal/ (accessed on 23 November 2016).
  69. Cooper, S.; Treuille, A.; Barbero, J.; Leaver-Fay, A.; Tuite, K.; Khatib, F.; Snyder, A.C.; Beenen, M.; Salesin, D.; Baker, D.; et al. The challenge of designing scientific discovery games. In Proceedings of the Fifth International Conference on the Foundations of Digital Games, Monterey, CA, USA, 19–21 June 2010.
  70. BioBlitz—Wikipedia. Available online: https://en.wikipedia.org/wiki/BioBlitz (accessed on 23 November 2016).
  71. Schade, S.; Tsinaraki, C. Survey Report: Data Management in Citizen Science Projects; Publications Office of the European Union: Luxembourg, 2016. [Google Scholar]
  72. European Citizen Science Association (ECSA). Projects, Data, Tools, and Technology. Available online: http://ecsa.citizen-science.net/working-groups/projects-data-tools-and-technology-committee (accessed on 21 November 2016).
  73. Metadata Archives|Citizen Science Blog. Available online: http://citizenscience.org/category/metadata/ (accessed on 20 November 2016).
  74. W3C, Data on the Web Best Practices. Available online: https://www.w3.org/2013/dwbp/wiki/Main_Page#Mission (accessed on 20 November 2016).
  75. W3C, Spatial Data on the Web Working Group. Available online: https://www.w3.org/2015/spatial/wiki/Main_Page (accessed on 20 November 2016).
  76. International Organization for Standardization, ISO 19156:2011—Geographic Information—Observations and Measurements. 15 December 2011. Available online: http://www.iso.org/iso/catalogue_detail.htm?csnumber=32574 (accessed on 18 November 2016).
  77. Semantic Web—W3C. Available online: https://www.w3.org/standards/semanticweb (accessed on 20 November 2016).
  78. Data—W3C. Available online: https://www.w3.org/standards/semanticweb/data (accessed on 21 November 2016).
  79. Frew, J. Provenance and Volunteered Geographic Information. 2007. Available online: http://www.ncgia.ucsb.edu/projects/vgi/docs/position/Frew_paper.pdf (accessed on 12 January 2017).
  80. Keßler, C.; Trame, J.; Kauppinen, T. Provenance and Trust in Volunteered Geographic Information: The Case of OpenStreetMap. Available online: http://kauppinen.net/tomi/cosit11poster.pdf (accessed on 12 January 2017).
  81. Kuhn, W.; Kauppinen, T.; Janowicz, K. Linked Data—A paradigm shift for Geographic Information Science. In Proceedings of the Eighth International Conference on Geographic Information Science (GIScience2014), Vienna, Austria, 24–26 September 2014.
  82. Sheppard, S.A.; Wiggins, A.; Terveen, L. Capturing quality: Retaining provenance for curated volunteer monitoring data. In Proceedings of the 17th ACM Conference on Computer Supported Cooperative Work & Social Computing, Baltimore, MD, USA, 15–19 February 2014.
  83. Belhajjame, K.; Cheney, J.; Corsar, D.; Garijo, D.; Soiland-Reyes, S.; Zednik, S.; Zhao, J. PROV-O: The PROV Ontology. 2013. Available online: https://www.w3.org/TR/prov-o/ (accessed on 21 November 2016).
  84. Crall, A.W.; Newman, G.J.; Jarnevich, C.S.; Stohlgren, T.J.; Waller, D.M.; Graham, J. Improving and integrating data on invasive species collected by citizen scientists. Biol. Invasions 2010, 12, 3419–3428. [Google Scholar] [CrossRef]
  85. Stohlgren, T.J.; Schnase, J.L. Risk analysis for biological hazards: What we need to know about invasive species. Risk Anal. 2006, 26, 163–173. [Google Scholar] [CrossRef] [PubMed]
  86. Bonter, D.N.; Cooper, C.B. Data validation in citizen science: A case study from Project FeederWatch. Front. Ecol. Environ. 2012, 10, 305–307. [Google Scholar] [CrossRef]
  87. Prestopnik, N.R.; Crowston, K. Gaming for (citizen) science: Exploring motivation and data quality in the context of crowdsourced science through the design and evaluation of a social-computational system. In Proceedings of the 2011 IEEE Seventh International Conference on e-Science Workshops (eScienceW), Stockholm, Sweden, 5–8 December 2011.
  88. Raddick, M.J.; Bracey, G.; Gay, P.L.; Lintott, C.J.; Murray, P.; Schawinski, K.; Szalay, A.S.; Vandenberg, J. Galaxy Zoo: Exploring the Motivations of Citizen Science Volunteers. Available online: https://arxiv.org/abs/0909.2925 (accessed on 18 January 2017).
  89. Cutraro, J. Poke Around with Citizen Science. 2016. Available online: http://scistarter.com/blog/2016/07/poke-around-citizen-science/#sthash.dYsYU7R0.dpbs (accessed on 21 November 2016).
  90. Create Online Charts & Infographics|infogr.am. Available online: https://infogr.am/ (accessed on 20 November 2016).
  91. Tableau; Tableau Software: Seattle, WA, USA, 2016.
  92. Free Visualization Software|Free Analysis Software|Free Analytics. Available online: http://visualizefree.com/index.jsp (accessed on 20 November 2016).
  93. Plotly|Make Charts and Dashboards Online. Available online: https://plot.ly/ (accessed on 21 November 2016).
  94. Yurbi|Bring Your Data to Life. Available online: https://www.yurbi.com/ (accessed on 21 November 2016).
  95. Ushahidi. Available online: https://www.ushahidi.com/ (accessed on 20 November 2016).
  96. Open Data Kit. Available online: https://opendatakit.org/ (accessed on 20 November 2016).
  97. Gemma—Geospatial Engine for Mass Mapping Applications. Available online: http://gemma.casa.ucl.ac.uk/ (accessed on 20 November 2016).
  98. Lintott, C.J.; Schawinski, K.; Slosar, A.; Land, K.; Bamford, S.; Thomas, D.; Raddick, M.J.; Nichol, R.C.; Szalay, A.; Andreescu, D.; et al. Galaxy Zoo: Morphologies derived from visual inspection of galaxies from the Sloan Digital Sky Survey. Mon. Not. R. Astron. Soc. 2008, 389, 1179–1189. [Google Scholar] [CrossRef]
  99. Zooniverse. Available online: https://www.zooniverse.org/ (accessed on 22 November 2016).
  100. i-Mars. Available online: http://www.i-mars.eu/sum (accessed on 22 November 2016).
  101. Szpir, M. Clickworkers on Mars American Scientist. 2002. Available online: http://www.americanscientist.org/issues/pub/clickworkers-on-mars (accessed on 22 November 2016).
  102. SpaceWarps. Available online: https://spacewarps.org/ (accessed on 22 November 2016).
  103. Hadhazy, A. Crowdsourcing the Universe: How Citizen Scientists are Driving Discovery (Kavli Roundtable). 2016. Available online: http://www.space.com/31626-crowdsourced-astronomy-finding-faint-galaxies-in-deep-space.html (accessed on 22 November 2016).
  104. Cain, F. Geologist Finds a Meteorite Crater in Google Earth. Available online: http://www.universetoday.com/13263/geologist-finds-a-meteorite-crater-in-google-earth/ (accessed on 23 November 2016).
  105. The Telegraph, Meteor Crater Found on Google Earth Could Help Prepare for Future Impacts. 2010. Available online: http://www.telegraph.co.uk/news/science/science-news/8026237/Meteor-crater-found-on-Google-Earth-could-help-prepare-for-future-impacts.html (accessed on 23 November 2016).
  106. Nielsen, J. Ten Usability Heuristics for User Interface Design. 1995. Available online: http://tfa.stanford.edu/download/TenUsabilityHeuristics.pdf (accessed on 12 January 2017).
  107. Gong, J.; Tarasewich, P. Guidelines for handheld mobile device interface design. In Proceedings of the DSI 2004 Annual Meeting, Boston, MA, USA, 20–23 November 2004.
  108. Baharuddin, R.; Singh, D.; Razali, R. Usability Dimensions for Mobile Applications—A Review. Res. J. Appl. Sci. Eng. Technol. 2013, 5, 2225–2231. [Google Scholar]
  109. Reeves, L.M.; Lai, J.; Larson, J.A.; Oviatt, S.; Balaji, T.S.; Buisine, S.; Collings, P.; Cohen, P.; Kraal, B.; Martin, J.-C.; et al. Guidelines for multimodal user interface design. Commun. ACM 2004, 47, 57–59. [Google Scholar] [CrossRef]
  110. Pernice, K.; Nielsen, J. Web Usability for Senior Citizens: Design Guidelines Based on Usability Studies with People Age 65 and Older; Nielsen Norman Group: Fremont, CA, USA, 2008. [Google Scholar]
  111. Zaphiris, P.; Ghiawadwala, M.; Mughal, S. Age-centered Research-Based Web Design Guidelines. In CHI 2005—Late Breaking Results: Posters, Proceedings of the CHI EA ’05, CHI ’05 Extended Abstracts on Human Factors in Computing Systems, Portland, OR, USA, 2–7 April 2005; ACM: New York, NY, USA, 2005. [Google Scholar]
Figure 1. Analysis of stakeholder responses: (left) following identification of categories based on an analysis of survey responses, each participant's feedback is encoded with a relevant category—image blurred to preserve anonymity of responders; (right-top) hand-written notes helped summarise category counts based on survey responses; and (right-bottom) an example affinity diagram—image from [58].
Figure 2. Stakeholder views on the adoption of citizen science initiatives in the scientific community. Short term: less than or equal to three years; Medium term: between three and five years; and Long term: between five and 15 years.
Figure 3. Stakeholder views on the adoption of citizen science initiatives by Commercial Organizations. Short term: less than or equal to three years; Medium term: between three and five years; and Long term: between five and 15 years.
Figure 4. Stakeholder views on the adoption of citizen science initiatives by different communities—citizens, local authorities, civil protection, tourism and science. Short term: less than or equal to three years; Medium term: between three and five years; and Long term: between five and 15 years.
