
Yesterday, Caren Cooper and SciStarter hosted a Twitter discussion tagged #CitSciChat. The chat was structured around ten questions, tweeted out one at a time for anyone to answer. There is a Storify thread summarising the discussion, which gives a good overview.

However, I also analysed the discussion using a couple of digital tools. With NodeXL, I collected both the content of the tweets and the meta-data. Then I visualised the interactions and social network(s) that emerged in the real-time discussion.

In total there were 191 participants (nodes) creating 867 interactions with each other (edges).

Here are a few preliminary visualisations and a summary of the discussion (feel free to re-use them).


First, the network structure of the interactions. Larger nodes mean a higher degree of incoming “@-tweets”. The colors are distributed according to the Modularity Filter in Gephi, indicating different “communities”.

[Image: network graph of the #CitSciChat interactions]

To zoom in, use either the interactive map or the PDF file. Unsurprisingly, the moderator @CoopSciScoop is the central node of the network. But there are also other familiar names, and discussions emerging outside the centre of gravity.
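
For readers who want to reproduce the basics of this analysis without NodeXL or Gephi, the counts above can be recomputed from any exported edge list using only the Python standard library. This is a sketch: the edge list below is illustrative, not the actual #CitSciChat data, and the account names (other than @CoopSciScoop) are made up.

```python
from collections import Counter

# Illustrative @-mention edges as (tweeter, mentioned_account) pairs.
# These are placeholder names, NOT the real #CitSciChat archive.
edges = [
    ("alice", "CoopSciScoop"),
    ("bob", "CoopSciScoop"),
    ("carol", "CoopSciScoop"),
    ("CoopSciScoop", "bob"),
    ("carol", "alice"),
]

# Participants (nodes) are all accounts appearing at either end of an edge.
nodes = {account for edge in edges for account in edge}

# Node size in the graph corresponds to in-degree: the number of
# incoming "@-tweets" an account receives.
in_degree = Counter(target for _, target in edges)
most_central = in_degree.most_common(1)[0][0]

print(len(nodes), "nodes,", len(edges), "edges")
print("most central:", most_central)
```

On the real data this yields the figures reported above (191 nodes, 867 edges, with @CoopSciScoop most central); reproducing the colour-coded "communities" would additionally require a modularity algorithm such as the Louvain method used by Gephi.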

Another way of visualising the network is to use the participants’ profile images.

[Image: the same network rendered with participants’ profile images]

The discussion

I also collected the content of all the tweets. Because every question was marked up with Q1, Q2, Q3… and every answer with A1, A2, A3…, it was fairly easy to grep the relevant lines in the messy thread. For clarity, I have shortened and cleansed the tweets to give a brief glimpse of the answers to each question (for the full conversation, see the Storify link above). Please excuse the “raw” format; I prioritised staying close to the source data over summarising everything neatly.
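
The grep step can be sketched as follows. This is a hypothetical reconstruction: the sample lines below stand in for the real archive, and the only assumption is that answers carry the “A1”, “A2”, … markers described above.

```python
import re

# Hypothetical raw lines standing in for the exported tweet archive.
tweets = [
    "A1: civic science, public science, participatory science #CitSciChat",
    "RT @CoopSciScoop: Here is first topic #CitSciChat Q1",
    "A2: weather monitoring has been performed for years #CitSciChat",
    "Great name, not so great acronym. RT @maryeford: A1: PPSR! #CitSciChat",
]

# Answers to question 1 are marked "A1"; the word boundaries (\b)
# keep markers like "A10" from matching "A1".
a1 = re.compile(r"\bA1\b")
answers_q1 = [t for t in tweets if a1.search(t)]

print(len(answers_q1))  # 2: the direct answer and the retweeted one
```

The same pattern, with the question number swapped in, produces each of the ten lists below.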

Here is first topic #CitSciChat Q1. There are other names for #citizenscience – what are some?

citizen sensing 🙂
More items CitizenObservatories ResponsibleResearchInnovation Science2.0 DigitalSocialPlatform SmartCitizens #CitSciChat
fablabs livinglabs crowdscience crowdfunding
crowdsourcing,volunteered geographic information, civic science #CitSciChat
In environmental projects, we often use public participation #CitSciChat
#citizenscience is a wide concept! #CitSciChat OpenDigitalScience ParticipatoryScience AmateurScience PopularScience AmateurScience…
we also have all the lovely amateur naturalist, birdwatcher, butterfly collector, amateur astronomer and other older terms…
Here in Seattle the #quantifiedself folks are often doing #citscichat but don’t label that way
PPSR = Great descriptive name, not so great acronym. RT @maryeford: A1: Public Participation in Scientific Research… PPSR! #CitSciChat
sometimes “crowdsourcing” used as a term to describe certain kinds of cit sci #citscichat
volunteered science, volunteer monitoring, sometimes crowdsourcing is used interchangeably #CitSciChat
Public Participation in Scientific Research… PPSR! #CitSciChat
civic science, public science, participatory science …RT @CoopSciScoop: Here is first topic #CitSciChat Q1

Thoughts – #CitSciChat Q2. What disciplines are involved in #citizenscience?

which are not?
economics (seeing more evaluations come from univ’s econ departs) #CitSciChat
Deciphering handwriting! Not sure what “discipline” that falls under though. #CitSciChat
Are there disciplines that are notably NOT involved in citizen science that we think *should* be? #citscichat
Ecology, biology, social, economic, GIS info to move towards sustainable fisheries in Gulf of California #CitSciChat
I think #citizenscience makes the biggest impact in the education discipline
I also believe that #citizenscience should be never disciplinary…
we’ve found examples of CS in arts, linguistics, geography, biology, biochem, genetics, oceanography… #CitSciChat
we work in physics, biotechnology, humanities, environmental monitoring, policy making, education and learning, arts, ethics..
Hello everyone! This is a great conversation! A2: weather monitoring has been performed for years by the community #CitSciChat
Ecology, Environmental sciences, physics (astronomy), geology, medical, marine science. Not chemistry so I have noted!
All! the more diversity the better. CitizenScience is a means, not a goal. Any field can be citizenscience-enabled-improved
We really like to focus on the challenge and then build the research team. We have a transdisciplinary approach #CitSciChat
#biodiversity studies are very common. #waterquality studies are historically common in #CitizenScience projects.
#citizenscience is crucial to our #habitat monitoring projects. #CitSciChat
participatory geography and Human-Computer Interaction are important to make citizen science projects successful…
Based on submissions to #CitSci2015 – so many! Bio, astro, human health, env. monitoring, digitizing, mapping, etc!
Hello follks. In weather, @CoCoRaHS has been used by thousands for decades for precipitation records. #CitSciChat
We do a lot of cit sci work in ecology/environmental science #citscichat
Seismology for sure. It has always relied on citizen observations. Only way to know past earthquakes #CitSciChat

#CitSciChat Q3. What are goals of #citizenscience associations like @CSA, ECSA, and @CSNA?

be inclusive! and define standards, ethical guidelines, promote overarching data archives, annual conferences… lovely work!
please read the White Paper on Citizen Science for Europe (by @socientize & community) recommendations for meso level!
#citizenscience is above all #openscience and we shall avoid the crazily competitive aspect of #science
associations must provide support, add value, do things, aggregate/integrate communities, multiplier effect…
facilitate coordination! (spread knowledge!) identify, monitor, evaluate, exchange best practices and methods…
Agree! But i think that most important aspect is to share methodologies, experiences, failure, success, worries..
Supporting countries, address singularities of actors, promote new projects, interface among stakeholders, share resources…
more concrete: Research, deployment, infrastructure, porting
I think the new Citizen Science:Theory and Practice journal will be a great resource. http://t.co/vfkIA38ovp #CitSciChat
Would love to see #tenure pts for scientists who use best practices in #citizenscience #CitSciChat
we all scaffolding infrastructures should do citizen2 science, citizen meta-science, deliver citizen science as a service
Bit of a deep dive but A3 paper from @SyracuseU http://t.co/7O26VYRbaV … #CitSciChat
Another goal of @CitSciAssoc is to promote the value of cit sci. Also to provide access to tools/resources
Feedback I get is non-scientists enjoyment in participating/contributing to pro research. #citscichat #citizenscience
for ECSA (European Cit Sci Assoc) details at http://t.co/9qXl6NCGNr #CitSciChat about linking different activities in Europe and beyond
is global community of practice for citizen science, has goal to advance the field thru innovation/collaboration
Connecting people to nature. Increasing peoples understanding of the nature of science. Obtaining data for research.
Advocate, explain and smooth the way. Help with funding ideas. Help with not re-inventing the wheel. #CitSciChat
Great As! also connecting data & observations, raising new Qs, reuse/confirm data, improve practice @CoopSciScoop #CitSciChat
connecting communities, sparking and supporting collaborations, raising the collective citsci tide…RT #CitSciChat Q3.
Researchers sometimes want to improve outreach via #citizenscience #citscichat
The goal of @CitSciOZ is to make everything easier. On citizens as well as scientists. Introduce “Best Practice”.

And #CitSciChat Q4. What sorts of resources do practitioners of #citizenscience need?

What are best practices for participant metadata: what demographic data can be requested, how can it be stored safely, etc. #CitSciChat
I’d like ethical guidelines – e.g. set of best practices to keep outdoor observers safe. #CitSciChat
guidelines,multimedia documentation,agenda,engagement strategies and channels, funding opportunities, learning materials…
evaluation metrics and indicators (different levels: scientific, societal, economic, environmental, behavioral) @ODS_study
shared spaces and events to meet in the physical world. educational resources: didactical units… experimental data
we have news: we are creating a huge space for citizen science (500.000€ equipment) in Zaragoza
computing resources: servers (housing, hosting),storage, computing,tools,frameworks,software,middleware.. fablab or CitSciLab
@mhaklay that cit sci practitioners must work together on question of ensuring data quality (technology helps!)
Transparency in process and results to generate trust & increase participation #CitizenScience #CitSciChat
a platform for collaboration so that all #CitizenScientists can work as a community #CitSciChat
#CitizenScience needs people – participants, researchers, educators, evaluators, marketers, grant writers, and more.#CitSciChat #team
Patience and understanding. The key is to adapt to participants expectations — as scientists we learned this!
Protocols that are easy to follow! @CoopSciScoop
Cit sci practitioners need to learn from the successes & failures of others, to be able to bounce ideas off each other
seem that a lot of people are looking for information about motivations and how to ensure data quality? #CitSciChat @CoopSciScoop
Data recording devices (computer/ tablet, pen and paper, camera/ smart phone!) & easy training! @CoopSciScoop #CitSciChat
Right now I would benefit the most from a list of successful incentives for getting citizens involved!
Very diverse resources such as community managers, open data expertise & data privacy specialists.
Other practitioners! 🙂 #CitSciChat
#CitSciChat friends if you want to be a citizen scientist and help on the shark research boat here, check out http://t.co/A4vfb3d6Xp
Scientist: Access to previously sourced data. Website & social media for advertising and keeping in touch @CoopSciS…

#CitSciChat Q5. What are the pros and cons of these #citizenscience associations?

democracy, visibility, efficiency unexpected actors to the policy-making and decision processes… trusted community!!!
cons: top-down, self-interest, not real actors! just like science must be open, citizen science associations must be open
pervasive scientific literacy “The scientific spirit is of more value than its products” – Thomas Huxley
New associations will have to be careful to be global. Hard to involve those from some regions. #CitSciChat
Share, collaborate, new networks and colleagues Cons:Echo chamber, me toos, doesn’t always include volunteers/amateurs
But the sci questions might be! RT @jmhulbert: A5: #citscichat a con may be #citizenscience initiatives no longer be novel to funders
A major pro of a #CitizenScience association is having it be viewed more formally and respectfully by the entire scientific community. #CitSciChat
Forming a #CitizenScience association is a pro and con. I hope the #inclusiveness of CS stays. #CitSciChat #community #team
Pros: provide a resource for learning and advancing citizen knowledge & awareness not just the associations needs #CitSciChat
Cons: academics and scientists don’t always trust data and understand process #CitSciChat
We may be in danger of over-professionalizing cit sci. Some of its value is in putting science back in the hands of citizens #citscichat
a con may be that #citizenscience initiatives may no longer be novel or exciting to funding sources
Cons: Risk of becoming a discipline as other existing ones, Pros. Sharing, sharing and sharing and sharing
Pros:creating community in #citizenscience! Promoting citsci. Con: don’t want to silo citizen science from science. #CitSciChat
I think even though they don’t mean to, academics (me included) can ‘drown out’ citizens. Will have to avoid this. #CitSciChat
Being part of a #citizenscience community, networking, sharing info/resources. Cons: keeping momentum/interest #citscichat
on the positive side, ability to have a clear voice and ensure that #CitizenScience is securing its place #CitSciChat
For A5, one risk is that the associations limit what is included in citizen science and what isn’t #CitSciChat – need to keep it wide for now
Pros: collective learning, transparency, sustainable long-term commitments #CitSciChat

#CitSciChat Q6. What are some examples of best practices in #citizenscience?

respect for participants: inform, state conditions clearly, help give return and feedback. dont forget the #science pa…
Provide feedback to your #CitizenScience participants. We should want them to know that their participation is valued. #CitSciChat
Example of great privacy/legal dashboard for #citscichat health-related http://t.co/USCDXcin8D @Sagebio #resilience project
Using/understanding scientific method is (in my experience) great method for establishing solid #citizenscience foundation. #CitSciChat
identify and use open and easily portable data frameworks that can be adopted to answer many questions @coopsciscoop
Political action and immediate impact on very concrete issues #CitSciChat
focusing on the educational output of engaging the public!
program empowering local communities to lead in fisheries management bc only they can produce data #CitSciChat
#citizenscience projects for both science engagement & science data. Sci questions that can ONLY be answered through…

#CitSciChat Q7. What’s the range of activities of citizen scientists? #citizenscience

citizens promote&co-create experiments with scientific-based methodologies and tools @crowdcrafting
citizens provide intellectual power (doing scientific tasks, participating in complex surveys) and citizens provide resources (computers, smartphones, cameras, scientific equipment…), experimental data, peer-based community…
citizen scientists use, fund, collaborate, contribute, evaluate, appraise, change, shift, evolve …
data but also learn about underlying model & assumptions, build instruments (apps+sensors) share learnings
Human-wildlife conflict & community capacity building: Room for citizen social sciences & mixed methodologies #CitSciChat
Citizen scientists touch, listen, visualize, discover, learn, initiate, develop, emerge, support, catalyze, participate…
with the right training nothing is beyond the range of citizen scientists!
Taking photos of organisms is primarily what we do at #NHMLA. Collecting spiders and sending them to us has been done too. #CitSciChat
our program includes fisheries monitoring, biological monitoring, gathering GIS data, communicating results #CitSciChat
Range of activities basically includes outside, inside and under water here in #Australia! #CitSciChat

#CitSciChat Q8. What are hoped for outcomes of the #citizenscience conference in Feb?

how can we ensure all who can’t come to #CitSci2015 can benefit from lessons and knowledge that is shared?
Excited to get more #citizenscience in the classroom and aligned to #NGSS so teachers can use the lessons!
Example, A8 – http://t.co/CJ3LUYKkt2 Beta life-science #citizenscience project #citscichat (participants must READ well) h/t …
A8 seeing all the new connections and ideas for projects, and the growing communities. Learning about other projects #CitSciChat
for me, personally: meeting more of our amazing community. Networking. Finding synergies in new places. So excited! #…
Would like to understand divide between life and environmental sciences in #citizenscience Need bridges #citscichat
ideas that help starting new citizen science projects be successful and ideas that help those projects make an impact
To help local projects link into global data collecting efforts. To move other projects outside of US too. …
Making new connections that lead to advances in the field of #citizenscience – new projects, partnerships
Action! All conferences should be looking to facilitate action and not just discussion. #CitSciChat
Inclusion, everyone feeling like they have a voice. Building community. Making #citizenscience even better & stronger. #CitSciChat

#CitSciChat Q9. Are #citizenscience practices universal or do they differ in different countries?

I am personally finding them pretty universal having experience in both Oz and the US #CitSciChat
logo that stays with pubs. showing %data from #citizenscience, Way to suggest future research direction #citscichat
Absolutely! 🙂 RT @LQDdata: A9: #CitizenScience awareness = participation = ownership = sustainability #CitSciChat
awareness = participation = ownership = sustainability #CitSciChat
some practices might be similar -take photos report sightings etc but bound to vary by availability of resources, inv…
I think it differs. We talk a lot in @CitSciOz about metadata vs language. Words like “subject” can mean different things #CitSciChat
You are always learning! #CitSciChat
we adapt dynamics of our programs in each community and incorporate new technology and techniques as they become available #CitSciChat
a growing respect for #CitizenScience data and the conclusion they report #CitSciChat
There are common methodologies. Wonderful thing is that every project is different and cultures are different
Further, do countries do ‘science’ broadly differently too? #citscichat & does this lead to best practice discussions again
Having reviewed many #CitSci2015 proposals, I think they differ. The conference should confirm that. #CitSciChat
#citscichat Maybe incentives for participation lead to the greatest differences between projects?
#citscichat I suspect may be a reporting or documentation element here – many regions have involvement difficult to monitor
I feel for A9, we still need some reviews and research! eg. what about range of languages being used today in #citizenscience? #CitSciChat

#CitSciChat Q10. Please share your favorite things on the #citizenscience horizon?

I want to see wearable citizen science,digital arts and collective intelligence,neurodata gathering and new societal values!
more people connecting to internet of things to generate data
tools and big efforts to mobilize data a people to make positives change on local and global scales…
Projects sharing platforms/technologies/protocols so we don’t have to build new software over & over. #citscichat
Hoping to eventually see dedicated funding lines for #citizenscience, particularly for long-term projects w/ low upkeep #citscichat
data *and* communities RT @MeghaninMotion: A10: Connecting silo’ed, distributed but related data.. #CitSciChat
Increasing use of mobile tech & access to variety of projects regardless of where participants live. #CitSciChat
Connecting silo’ed & distributed but related data thru efforts of multiple (potentially unrelated) projects #CitSciChat @CoopSciScoop
Neurosynaptic processors opening up previously human-only visual identification projects (galaxies, cells, pa…
improved policymaking, more access to tools/data/projects, new interdisciplinary communities (in-person)RT @CoopSciScoop:…
sensor network for #biodiversity. It’s the only way to really get the data needed for conservation! #CitSciChat
improved policymaking, more access to tools/data/projects, new interdisciplinary communities (in-person)RT @CoopSciScoop: #CitSciChat

/Christopher
For more information and source data, please e-mail christopher dot kullenberg AT gu dot se

At the 2nd General Assembly of the European Citizen Science Association (ECSA), held in Berlin on November 26, 2014, Lucy Robinson and Jade Cawthray of the Natural History Museum in London presented the recent progress of the ECSA working group on Principles and Standards in Citizen Science and asked participants for feedback and input. Since it was recommended to make the ten principles public, to encourage broader discussion than would take place within ECSA alone, we circulate them through this blog as well.

The ten principles of citizen science

  1. Citizen science projects actively involve citizens in scientific research. Citizens can act as contributors, collaborators or as project leaders, and have a meaningful role in the research project.
  2. Citizen science projects have a genuine research question or goal.
  3. Citizen scientists benefit from taking part. Benefits may include learning opportunities, social benefits, community cohesion, gathering evidence for a local issue, or the opportunity to influence policy.
  4. Citizen scientists may, if they wish, participate in multiple stages of the scientific process. This may include developing the research question, designing the method, gathering and analysing data, and publishing the results.
  5. Citizen scientists receive feedback from the project. For example, how their data are being used and what the research, policy or societal outcomes are.
  6. Citizen science data are considered equally valuable as traditionally collected data.
  7. Citizen science project data and meta-data are made publicly available, and results are published in an open access format. Data sharing may occur during or after the project, unless there are security or privacy concerns that prevent this.
  8. Citizen scientists are acknowledged in project results and publications.
  9. Citizen science programmes are evaluated for their scientific output, data quality, participant experience and wider societal or policy impact.
  10. Citizen science is a flexible concept which can be adapted and applied within diverse situations and disciplines. Citizen science lends itself to cross-disciplinary work, bringing new perspectives and skills to a research project.
A lively discussion followed this presentation. In general, more effort appears to be needed to define citizen science, as well as what we mean by “citizen” and what we mean by “scientist”. As Riesch and Potter (2014) noted, citizen science is a contested term with multiple origins, underpinned by different views: on one side, the term was coined in the mid-1990s by Rick Bonney in the US (see Bonney et al., 2009) to refer to public participation and science communication projects. On the other side, the term was used in the UK by Irwin (1995) to refer to his developing concept of scientific citizenship, which foregrounds the necessity of opening up science and science policy processes to the public (Riesch & Potter, 2014, p. 107).
Several key points emerged from the discussion, but, for the sake of this presentation, we would like to highlight the following:
  • The scientific component is important: citizen science should be about science and not only about public engagement or education.
  • The notion of “science” needs to be further developed: it should also be hypothesis-led, or perhaps “evidence-based”.
  • Citizen science should not be narrowed down to monitoring: much citizen science also deals with analyzing and manipulating existing data.
  • Citizens’ participation is not only about data collection, but also about analysis and interpretation of data.

These points state clearly the need to shift from the citizen scientist working for scientists as an “avian biological sensor” (Sullivan et al., 2009, p. 2290) – merely involved in observing, collecting and classifying data – to a participant who can work together with scientists to analyze and interpret data. As also emerges from the most cited literature on citizen science, citizens have typically engaged in large-scale projects involving exploratory research aimed at surveillance monitoring, conducted without specific hypotheses in mind. The idea of expertise that emerges from the reports of published citizen science projects is too often limited to the pronouncements of scientists, reflecting a very restricted model of the relationship between citizens and scientists.

References

Irwin, A. (1995). Citizen Science: A Study of People, Expertise and Sustainable Development. London: Routledge.

Riesch, H., & Potter, C. (2014). Citizen science as seen by scientists: Methodological, epistemological and ethical dimensions. Public Understanding of Science, 23(1), 107–120.

Sullivan, B. L., et al. (2009). eBird: A citizen-based bird observation network in the biological sciences. Biological Conservation, 142, 2282–2292.

In a guest blog post on CitizenSci, Gwen Ottinger writes about a fresh study on air quality monitoring. The study reveals that “[a]ir concentrations of potentially dangerous compounds and chemical mixtures are frequently present near oil and gas production sites”, which in turn negatively affects the health of local residents. The data used in the study was collected by volunteer citizen scientists, using cheap buckets where:

[s]amples were ultimately collected near production pads, compressor stations, condensate tank farms, gas processing stations, and wastewater and produced water impoundments in five states (Arkansas, Colorado, Ohio, Pennsylvania, and Wyoming). (Macey et al., 2014, p. 6)

The method of using buckets was advanced by the Louisiana Bucket Brigade as early as 1995, inspired by the famous litigation against the Pacific Gas and Electric Company by Erin Brockovich.

This case is particularly interesting because the citizen scientists were active in shaping the research problem, and even in choosing the locations for collecting samples. Ottinger writes:

The recently released study pioneers a new approach to choosing sites for air quality monitoring: it mobilizes citizens to identify the areas where sampling was most likely to show the continuous impact of fracking emissions. Citizens chose places in their communities where they noticed a high degree of industrial activity, visible emissions, or health symptoms that could be caused by breathing toxic chemicals. They took samples themselves, following rigorous protocols developed by non-profit groups working in conjunction with regulatory agencies and academic researchers.

Moreover, in another article, in Science, Technology, & Human Values, Ottinger analyses the “Buckets of Resistance” of the Louisiana Bucket Brigade. She argues that the effectiveness of citizen scientists depended to a large extent on standards and standardized practices. To measure air quality successfully, the citizen scientists had to follow certain standardized procedures and tests already used by established scientists. This way, the measurements could “count” as proper scientific observations. However, other actors also used the same standards as an entry point for criticism of the citizen scientists’ measurements.

In the ‘bucket brigade’ and similar cases, the citizen scientists seem to have a great deal of influence in configuring the research process as a whole. The problematization occurs on a local level, where citizens identify and react to a problem in their communities. The decisions on what to measure (and what not to measure) also seem to be in the hands of volunteers. However, as Ottinger shows, the standards and established procedures are harder to reshape. In order to make ‘science proper’, the citizens need to relate and connect to an already existing paradigm of scientific knowledge and practice.

Do you know of any other interesting projects that share similar features as above? Please leave a comment!

References

Ottinger, G. (2010). Buckets of Resistance: Standards and the Effectiveness of Citizen Science. Science, Technology, & Human Values, 35(2), 244–270.

Several of the most cited articles we examined stress the importance of a functional protocol. Protocols are considered critical for establishing control over the tasks performed by citizen scientists. In fact, scientists setting up citizen science projects are typically concerned with the accuracy, reliability and usability of data collected by citizens. How can amateurs collect data that are as good as those generated by professional researchers? According to the scientists interviewed by Cohn (2008), amateurs can collect reliable data and help advance scientific knowledge if they are properly trained to use instruments and to collect and read data. Furthermore, it is important to design specific protocols that limit the tasks assigned to amateurs, test them, and see whether reliable data are collected.

What are protocols, by the way? Bonney et al. (2009) described clearly what a protocol is and what it is for. They tell us that protocols specify when, where, and how data should be gathered. Used in large projects spanning multiple locations – such as the Seed Preference Test (SPT), which in 1994 attracted more than 17,000 participants of all ages and birding abilities (Trumbull et al., 2000) – protocols “define a formal design or action plan for data collection” (p. 980), which allows observations made by many independent amateurs to be combined and used for analysis. These protocols should be clear, easy to use, and engaging for volunteer participants. Bonney et al. (2009) described how project designers working at the Cornell Lab of Ornithology (CLO) have tested draft protocols both with local groups, by accompanying them in the field and observing them as they collect and submit data, and with distant groups, by collecting their feedback online.

Unsurprisingly, protocols are one of the pillars supporting the engagement of citizen scientists, as emerges from our reading of the articles. Arguably, they act as ‘representatives’ of professional scientists – “boundary objects” aligning heterogeneous participants and professional scientists, as in the SPT project. They reflect a normative view of how science should be performed, and normative expectations of what the “scientific citizen” should do once involved in a research project. Similarly to a speed bump, a technical artifact with an inbuilt script that prescribes that drivers proceed slowly (Latour, 1992), protocols have an inbuilt script that prescribes what citizen scientists should observe and report. Control over observation tasks is delegated to this tool. Obviously, drivers can choose to ignore speed bumps – drive over them without slowing down. Similarly, citizen scientists can choose to ignore the protocols – perhaps over-reporting certain species of birds and under-reporting others. They will not be fined for their behavior, as they would be by the police if they ignored speed bumps, but their data is unlikely to pass scientists’ scrutiny.

References

Bonney, R., Cooper, C. B., Dickinson, J., Kelling, S., Phillips, T., Rosenberg, K. V., & Shirk, J. (2009). Citizen Science: A Developing Tool for Expanding Science Knowledge and Scientific Literacy. BioScience, 59(11), 977-984.

Cohn, J. P. (2008). Citizen Science: Can Volunteers Do Real Research? BioScience, 58(3), 192-197.

Latour, B. (1992). Where are the Missing Masses? The Sociology of a Few Mundane Artifacts. In W. E. Bijker & J. Law (Eds.), Shaping Technology/Building Society: Studies in Sociotechnical Change (pp. 225-258). Cambridge, MA: MIT Press.

Trumbull, D., Bonney, R., Bascom, D., & Cabral, A. (2000). Thinking Scientifically during Participation in a Citizen-Science Project. Science Education, 84(2), 265-275.

In early September this year, Caren B. Cooper, Jennifer Shirk and Benjamin Zuckerberg published an article called The Invisible Prevalence of Citizen Science in Global Research: Migratory Birds and Climate Change. This article analyzes the role of citizen science in the (most cited) articles describing the “impacts of climate change on avian migration”. The results are quite interesting:

We found that 85 of the 171 papers that we could classify were based on citizen science, constituting 5 to 20 papers per claim (Appendix S1). Citizen science heavily informed claims related to ecological patterns and consequences and was less frequently cited for claims about mechanisms (Table 1).

In other words, when it comes to avian migration and climate change, citizen scientists contribute to almost half of the body of scientific facts that we rely on for knowing about this phenomenon. Moreover, when the quality of the data was examined, the authors found no difference between observations made by citizen scientists and observations made by conventional means.

However, Cooper, Shirk and Zuckerberg point to a problem with the visibility of citizen scientists. It seems that the scientific community has not yet properly recognized the contribution of citizen scientists, and the authors argue that there is a “stigma” attached to involving the public:

The use of citizen science data in an active field of ecological research, such as migration phenology, is strong evidence that any stigma associated with the use of data collected by volunteers is unwarranted. Yet, the contributions of citizen science were not readily detectable in most cases. Thus, the stigma may persist unless researchers begin to draw attention to the citizen-science elements in their research papers.

As a consequence, scientific articles do not always make citizen participation visible in their keywords, titles or abstracts. The authors therefore suggest that “citizen science” be used as a standardized keyword for all further studies that involve such contributions.

Cooper CB, Shirk J, Zuckerberg B (2014) The Invisible Prevalence of Citizen Science in Global Research: Migratory Birds and Climate Change. PLoS ONE 9(9): e106508. doi:10.1371/journal.pone.0106508

Franzoni and Sauermann (2014), in their article titled Crowd science: The organization of scientific research in open collaborative projects, suggest a classification of crowd science projects according to task complexity and structure, which also helps explain how and why projects perform the way they do, whether successful or not.

They define task complexity in terms of the relationships between individual sub-tasks. Lower task complexity (usually preferred) is attained by minimizing the dependencies between sub-tasks. A large and complex problem can therefore be modularized: divided into many smaller modules that address smaller problems, with a strategy or architecture specifying how the modules fit together. The authors take modularization to allow for a greater division of labor. Task structure, in turn, denotes how well defined the structure of sub-tasks is.

Task complexity and task structure are useful for examining what amateurs are asked to do. Several “citizen science” projects, such as Galaxy Zoo, ask for contributions that only require skills common to the general human population. In Galaxy Zoo, for example, citizen scientists classifying galaxies can work independently on their sub-tasks, without having to consider what other project participants contribute. This modularization, or granularity of tasks, as Benkler and Nissenbaum (2006) called it, allows people with different levels of motivation to work together by contributing small- or large-grained modules, consistent with their level of interest in the project. Furthermore, modularization is compatible with loosely coupled work (Olson & Olson, 2000), which has fewer dependencies, is more routine, and has clear tasks and procedures. As a result, less communication is needed to complete the task.
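To make the idea of modular, loosely coupled sub-tasks concrete, here is a minimal sketch in Python. All names and the toy classification rule are invented for illustration and are not taken from Galaxy Zoo's actual platform; the point is only that fully decoupled sub-tasks can be completed in any order, by any volunteer, without changing the combined result.

```python
# A sketch of modularized, loosely coupled sub-tasks (toy example).
from dataclasses import dataclass
import random

@dataclass(frozen=True)
class Task:
    image_id: int

def classify_image(task: Task) -> str:
    # Stand-in for a single volunteer's judgement; it requires no
    # knowledge of any other task or participant in the project.
    return "spiral" if task.image_id % 2 else "elliptical"

tasks = [Task(i) for i in range(10)]

# Because the tasks are fully decoupled, the order in which volunteers
# pick them up does not affect the combined result.
shuffled = random.sample(tasks, k=len(tasks))
results_in_order = {t.image_id: classify_image(t) for t in tasks}
results_shuffled = {t.image_id: classify_image(t) for t in shuffled}
assert results_in_order == results_shuffled
```

The assertion holds precisely because each sub-task is a self-contained module; any dependency between tasks would force coordination, and hence more communication, between participants.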

According to Franzoni and Sauermann, crowd science projects could benefit from modularization by differentiating task complexity and structure so as to target citizens with different skills and expertise at different stages of a project. Different crowd science projects display more or less clearly formulated task complexities and structures and can be classified accordingly.

It should be noted that it is not only the organization of crowd science projects, often involving a number of independent participants in multiple locations, that demands independent and well-structured tasks, but also the emphasis on controlled and prescribed protocols and on the validation and accuracy of data. As Bonney et al. (2009) put it:

Citizen science data are gathered through protocols that specify when, where, and how data should be collected. Protocols must define a formal design or action plan for data collection that will allow observations made by multiple participants in many locations to be combined for analysis.

The need for accurate and validated data requires that convergent tasks (Nickerson, 2014) be assigned to citizen scientists, meaning that scientists look for a single output from contributors. Classifying stars, or annotating according to standard labels provided by experts, are examples of convergent tasks. Since, in most citizen science projects reported in the literature we have examined, citizen scientists are only expected to perform tasks according to prescribed protocols, but not to design those tasks, which remains the scientists’ responsibility, it is worth reflecting on Nickerson’s thought-provoking words (which refer to Taylor’s advocated division between the design and the performance of tasks):

Distressingly, current crowd work seems to be at the early stages of recapitulating factory employment practices (p. 40).

References

Benkler, Y., & Nissenbaum, H. (2006). Commons-based peer production and virtue. Journal of Political Philosophy, 14(4), 394-419.

Bonney, R., Cooper, C. B., Dickinson, J., Kelling, S., Phillips, T., Rosenberg, K. V., & Shirk, J. (2009). Citizen Science: A Developing Tool for Expanding Science Knowledge and Scientific Literacy. BioScience, 59(11), 977-984.

Franzoni, C., & Sauermann, H. (2014). Crowd science: The organization of scientific research in open collaborative projects. Research Policy, 43, 1–20.

Nickerson, J. V. (2014). Crowd work and collective learning. In A. Littlejohn & A. Margaryan (Eds.), Technology-Enhanced Professional Learning (pp. 39-47). Routledge.

Olson, G. M., & Olson, J. S. (2000). Distance matters. Human-Computer Interaction, 15, 139–179.

Today, the terms “citizen science” and “crowd science” are quite the buzzwords. Searching the internet for these terms yields millions of hits. But what happens when you search the Web of Science for actual scientific publications? Are citizen and crowd science really producing properly peer-reviewed articles, and if so, where can these articles be found?

We used the following search string in the Thomson-Reuters Web of Science Core Collection (requires subscription) to cover as much citizen/crowd science as possible:

TS="crowd science" OR TS="citizen science" OR TS="crowdsourcing" OR TS="crowd sourcing"

The search produced 1462 articles, which we then imported into a clever application called HistCite, which lets you sort and search the output data. HistCite makes it possible, for example, to sort the publications by journal (click on the image below to access the data interactively):
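The core of this sorting step is just a tally of the journal field across all exported records. As a rough sketch, the same count can be reproduced in a few lines of Python on a tab-delimited Web of Science export, where “SO” is the WoS field tag for the source (journal) title. The sample records below are invented for illustration; they are not rows from our actual dataset.

```python
# Sketch: tally publications per journal in a tab-delimited WoS export.
import csv
import io
from collections import Counter

def journals_by_count(lines, top=10):
    """Return the `top` most frequent journals ("SO" column) in the export."""
    reader = csv.DictReader(lines, delimiter="\t")
    counts = Counter(row["SO"] for row in reader if row.get("SO"))
    return counts.most_common(top)

# Invented sample records standing in for the real export file.
sample = io.StringIO(
    "SO\tTI\n"
    "PLoS ONE\tFirst invented record\n"
    "PLoS ONE\tSecond invented record\n"
    "BioScience\tThird invented record\n"
)
print(journals_by_count(sample))  # [('PLoS ONE', 2), ('BioScience', 1)]
```

In practice you would pass an open file handle for the downloaded export instead of the `StringIO` sample.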

top10journals

The journal with the most citizen/crowd science publications is the open-access multidisciplinary PLoS ONE. Skipping the conference proceedings for now, the next journal in line is Frontiers in Ecology and the Environment, followed by the computer science journal IEEE Internet Computing. The rest of the list concerns biology, conservation, ecology, and one medicine journal. As a very preliminary observation, the mainstream of citizen/crowd science appears to fall within the disciplines of “life”, “computers” and “medicine”.

However, there are other things you can do with scientometric data. Using the application VOSviewer, it is possible to visualize the journals according to a principle called bibliographic coupling of sources. This means that articles citing similar sources (other articles, books, etc.) are regarded as being “close” to each other; the same analysis can be performed at the journal level. What we see below are journals that cluster together because they cite similar references.
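The underlying measure is simple: the coupling strength of two units is the number of cited references they share. A minimal sketch, using invented reference lists rather than our real dataset:

```python
# Sketch of bibliographic coupling: two journals are "close" when the
# articles they publish cite many of the same references.
def coupling_strength(refs_a: set[str], refs_b: set[str]) -> int:
    # Coupling strength = number of shared cited references.
    return len(refs_a & refs_b)

# Toy data: aggregated reference lists per journal (invented).
cited_refs = {
    "Journal A": {"Bonney 2009", "Trumbull 2000", "Cooper 2014"},
    "Journal B": {"Bonney 2009", "Cooper 2014", "Latour 1992"},
    "Journal C": {"Knuth 1974", "Dijkstra 1968"},
}

print(coupling_strength(cited_refs["Journal A"], cited_refs["Journal B"]))  # 2 shared references
print(coupling_strength(cited_refs["Journal A"], cited_refs["Journal C"]))  # 0: no overlap
```

VOSviewer computes (a normalized version of) this pairwise overlap for every pair of journals and then lays out the map so that strongly coupled journals end up near each other.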

impact1

(click to enlarge)

Here we find one cluster on the right-hand side (red) that focuses on ecology/zoology/conservation, and another, blue cluster on the left side, dominated by computer science journals. In the middle, the single journal PLoS ONE forms a center of gravity. This seems to confirm, at least visually, the top-10 list that we produced above.

If we zoom in on the red cluster (if your computer supports Java, you can do this interactively), we see the following:

impact2

(click to enlarge)

Here, at least at a preliminary glance, there seems to be an interesting line of research. What you can do next is return to HistCite and look for the most cited authors, the individual articles that are most cited globally, or the publications that are most cited within the dataset (n=1462).

Of course, this is not a complete picture. Scientific publications always lag behind “science in action”. If there is a strong trend right now, it will not show in the publication data until years later.
