Zooniverse is a citizen science web portal owned and operated by the Citizen Science Alliance. It is home to the internet's largest, most popular, and most successful citizen science projects.[3] The organization grew from the original Galaxy Zoo project and now hosts dozens of projects which allow volunteers to participate in crowdsourced scientific research. It has headquarters at Oxford University and the Adler Planetarium.[4] Unlike many early internet-based citizen science projects (such as SETI@home), which used spare computer processing power to analyse data, an approach known as volunteer computing, Zooniverse projects require the active participation of human volunteers to complete research tasks. Projects have been drawn from disciplines including astronomy, ecology, cell biology, humanities, and climate science.[5]
As of 14 February 2014, the Zooniverse community consisted of more than 1 million registered volunteers.[6] The volunteers are often collectively referred to as "Zooites".[7][8] The data collected from the various projects has led to the publication of more than 70 scientific papers.[9] A daily news website called 'The Daily Zooniverse' provides information on the different projects under the Zooniverse umbrella, and has a presence on social media.
Galaxy Zoo: The fourth and latest incarnation of the Galaxy Zoo project, in which users are shown images of a galaxy and asked a series of questions to classify its morphology. The current sample includes images of high-redshift galaxies taken by the Hubble Space Telescope and low-redshift galaxies from the Sloan Digital Sky Survey in New Mexico.[11]
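The classification interface is essentially a branching questionnaire: each answer selects the next question until a morphology label is reached. The Python sketch below illustrates that idea only; the questions, answers, and labels are illustrative assumptions, not the project's actual decision tree.

```python
# A minimal sketch of a Galaxy Zoo-style branching questionnaire.
# The tree contents here are invented for illustration.
TREE = {
    "start": ("Is the galaxy smooth, or does it have features/a disk?",
              {"smooth": "roundness", "features": "spiral"}),
    "roundness": ("How round is it?",
                  {"round": "LABEL:elliptical (round)",
                   "cigar": "LABEL:elliptical (cigar-shaped)"}),
    "spiral": ("Are spiral arms visible?",
               {"yes": "LABEL:spiral", "no": "LABEL:disk, no arms"}),
}

def classify(answers):
    """Walk the tree using a dict of {question_node: chosen_answer}."""
    node = "start"
    while not node.startswith("LABEL:"):
        question, branches = TREE[node]
        node = branches[answers[node]]   # the answer picks the next node
    return node[len("LABEL:"):]

print(classify({"start": "features", "spiral": "yes"}))  # -> "spiral"
```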
Moon Zoo: High-resolution images of the Moon's surface provided by the Lunar Reconnaissance Orbiter are used by volunteers to create detailed crater counts, mapping the variation in age of lunar rocks.[12][13]
The Milky Way Project: Detecting bubbles in the interstellar medium that indicate regions where the early stages of star formation are taking place. The project uses infrared images from the Spitzer Space Telescope, as well as sub-millimetre data from Herschel.[15][16]
Planet Four: Analyse images of the surface of Mars taken near the Martian southern polar cap. Classifications include marking fans and blotches caused by sublimating gas and geysers beneath the carbon dioxide ice. Images come from the HiRISE camera on board the Mars Reconnaissance Orbiter.[18][19][20][21]
Radio Galaxy Zoo: Examine radio-wavelength images of astrophysical jets in galaxies that are powered by accretion onto a black hole. The task is to correctly associate any radio components with an infrared image of the black hole's host galaxy.[22]
Sunspotter: Examine images of sunspots and rank pairs of images according to their relative complexity. The science goal is to examine how the complexity of sunspots evolves over time and how they produce eruptions. Data for the project comes from the Michelson Doppler Imager aboard the SOHO spacecraft.[24][25]
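Turning many pairwise "which of these two looks more complex?" judgments into a single global ranking is the aggregation step such a project needs. One standard way to do this is an Elo-style rating update, sketched below; this is an illustrative assumption about the approach, not Sunspotter's published algorithm, and the parameter values are conventional defaults.

```python
# A minimal sketch: aggregate pairwise complexity votes into a ranking
# with Elo-style updates. Parameters k, base, and start are conventional
# Elo defaults, chosen here for illustration.
from collections import defaultdict

def elo_rank(votes, k=32, base=400.0, start=1500.0):
    """votes: iterable of (winner_id, loser_id) pairwise judgments."""
    rating = defaultdict(lambda: start)
    for winner, loser in votes:
        # Expected score of the winner under the logistic Elo model.
        expected = 1.0 / (1.0 + 10 ** ((rating[loser] - rating[winner]) / base))
        rating[winner] += k * (1.0 - expected)
        rating[loser] -= k * (1.0 - expected)
    return sorted(rating.items(), key=lambda kv: kv[1], reverse=True)

# Example: image "b" is judged more complex than "a" twice and "c" once.
print(elo_rank([("b", "a"), ("b", "a"), ("b", "c")]))
```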
Asteroid Zoo: Examine sets of time-lapse images to search for moving objects that could be undiscovered asteroids. Data comes from the Catalina Sky Survey telescopes in Arizona. The project is run in conjunction with Planetary Resources, which is focused on developing technology for asteroid mining.[26]
Planet Four: Terrains: Analyse images of the surface of Mars, taken by the low-resolution Context camera on board the Mars Reconnaissance Orbiter, to identify future targets for the spacecraft's higher-resolution HiRISE camera to investigate.[28]
Old Weather: Zooites use a special interface to digitally transcribe weather and sea-ice data from the logbooks of United States Arctic exploration and research ships that were at sea between 1850 and 1950.[29] The current material is the third phase of the project.
Cyclone Center: Classifying tropical cyclones using a modified version of the Dvorak technique. Volunteers are shown a series of images from infrared sensors on weather satellites and asked a number of questions to identify the type and strength of each storm.[30][31]
Bat Detective: Monitor the status of bat populations by classifying the sounds bats make for echolocation and social purposes.[32] The calls are originally recorded using ultrasonic microphones and are played back at a slower speed to bring them within the range of human hearing; the data are also shown visually in the form of a spectrogram.[33]
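Both presentation techniques mentioned here are simple to reproduce: "time expansion" (replaying samples at a fraction of the recording rate, which lowers pitch and stretches duration by the same factor) and a spectrogram plot. The sketch below assumes a mono WAV file; the file name and the 10x slowdown factor are illustrative assumptions, not project specifics.

```python
# A minimal sketch of time-expanded playback and spectrogram display
# for an ultrasonic recording. Assumes a mono WAV file.
import numpy as np
import matplotlib.pyplot as plt
from scipy.io import wavfile
from scipy.signal import spectrogram

rate, samples = wavfile.read("bat_call.wav")  # e.g. recorded at 384 kHz

# Time expansion: writing samples out at 1/10th of the original rate
# plays a 40 kHz call at 4 kHz, well within human hearing, 10x slower.
slowdown = 10
wavfile.write("bat_call_slow.wav", rate // slowdown, samples)

# Spectrogram of the original recording: frequency vs. time heat map.
f, t, Sxx = spectrogram(samples.astype(float), fs=rate, nperseg=1024)
plt.pcolormesh(t, f / 1000.0, 10 * np.log10(Sxx + 1e-12), shading="auto")
plt.xlabel("Time (s)")
plt.ylabel("Frequency (kHz)")
plt.title("Bat call spectrogram")
plt.show()
```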
Snapshot Serengeti: Classifying animals in the Serengeti National Park in Tanzania using images gathered from 225 camera traps. The purpose is to study how species are distributed across the landscape and interact with each other.[34]
Plankton Portal: Classifying plankton from images gathered by the In Situ Ichthyoplankton Imaging System to understand how plankton types are distributed at a variety of ocean depths. The information can be used to map oceanic carbon dioxide levels, for which plankton are a useful indicator.[38][39][40]
Condor Watch: Examine images from motion-activated cameras in Pinnacles National Park in northern California. Volunteers identify California condors and mark their distance from feeding sources such as animal carcasses.[41]
Penguin Watch: Remote camera images of areas in the Southern Ocean and on the Antarctic Peninsula are tagged for sightings of penguins of various species. Scientists aim to measure changes in the timing of penguin breeding, nest survival rates, and rates of predation on penguin chicks, and to determine where colonies overwinter at breeding sites.[45][46]
Chimp & See: By identifying individual chimpanzees in videos and highlighting examples of tool use and other behaviour patterns, volunteers help scientists from the Max Planck Institute for Evolutionary Anthropology understand chimpanzee culture, population size, and demographics in specific regions of Africa.[47]
Orchid Observers: Photograph wild orchids throughout the summer of 2015 and/or annotate images and transcribe data from the orchid collection of the Natural History Museum, London.[48]
Wildebeest Watch: Interpret the movement of wildebeest in images from camera traps in the Serengeti National Park to help scientists better map their migration movements and understand the collective intelligence of herds.[49]
Fossil Finder: Examine surface images from fossil-bearing landscapes in the Turkana Basin in northern Kenya to identify potential sites of fossils and stone tools for further investigation.[52]
Jungle Rhythms: Transcribe hand-drawn observations, made between 1937 and 1958, of life-cycle events for over 2,000 trees in the tropical forests of the Democratic Republic of the Congo.[53]
Western Shield Camera Watch: Examine camera-trap images from Western Australia to help the Western Shield project manage the impact of feral foxes and cats on the region's native wildlife.
Snapshot Wisconsin: Examine camera-trap images from Wisconsin to identify animals and help scientists better understand trends in the distribution of wildlife populations in the state.[54]
Operation War Diary: Transcribe British war diaries from World War I, helping historians to track troop movements, add to catalogue metadata, and delve into the individual experiences of soldiers.[58][59]
Measuring the ANZACs: Transcribe documents to help create a comprehensive database of New Zealand war history, comprising the names, jobs, birthplaces, and health at enlistment of Australian and New Zealand soldiers serving in the New Zealand Army during World War I.[62]
Worm Watch Lab: Watch videos of nematode worms to collect genetic data that will assist medical research.[65][66] The classifications provide researchers with data on brain and gene function.[67] The nematode species studied is Caenorhabditis elegans.[68]
Old Weather (original phase): Between October 2010 and July 2012, some 16,400 volunteers transcribed weather data from 1,090,745 pages[79] of the logbooks of World War I-era Royal Navy ships. The project generated 1.6 million weather observations that will be used to improve climate modelling.[80][81]
Star Date: M83 (launched 13 January 2014; retired 2014): Described the shapes and colors of star clusters in the Southern Pinwheel Galaxy (M83) using images from the Hubble Space Telescope.[86]
Whale FM (pattern matching; launched 29 November 2011): Categorized the sounds made by killer whales and followed the travels of individual animals around the oceans.[30] Volunteers heard an audio clip of the whale sounds and viewed the data as a spectrogram. The project was run in conjunction with Scientific American.[87]
Seafloor Explorer (filtering): Identified species and ground cover in images of the seafloor to create a library of seafloor habitats.[30] The images came from a robotic camera that mapped the seafloor off the coast of the northeastern United States.[88]
SETILive (launched 29 February 2012; retired 12 October 2014): Attempted to use human volunteers to identify potential signals from intelligent extraterrestrial life that might be missed by computer algorithms.[89] The data came from radio observations by the Allen Telescope Array of stars in the Kepler field of view.[90]
References
^"Projects". Citizen Science Alliance. Retrieved 19 November 2011.
^A.K. Finkbeiner (2010). A Grand and Bold Thing: An Extraordinary New Map of the Universe Ushering In A New Era of Discovery. Free Press. ISBN1416552162.
^Wakefield, Jane (29 February 2012). "Seti Live website to crowdsource alien life". BBC News Technology. Los Angeles, California. British Broadcasting Company. Retrieved 18 March 2012.