DARPA Network Challenge
The 2009 DARPA Network Challenge was a prize competition for exploring the roles the Internet and social networking play in the real-time communications, wide-area collaborations, and practical actions required to solve broad-scope, time-critical problems. The competition was sponsored by the Defense Advanced Research Projects Agency (DARPA), a research organization of the United States Department of Defense. The challenge was designed to help the military generate ideas for operating under a range of circumstances, such as natural disasters. Congress authorized DARPA to award cash prizes to further DARPA's mission to sponsor revolutionary, high-payoff research that bridges the gap between fundamental discoveries and their use for national security.
In the competition, teams had to locate ten red balloons placed around the United States and then report their findings to DARPA. Due to the distributed nature of the contest, many teams used online resources, such as social media sites, to gather information or to recruit people who would look for balloons. Teams often had to deal with false submissions, and so they needed to devise ways to validate and confirm reported sightings. The contest concluded in under nine hours, far sooner than DARPA had expected, and carried broad implications for the power of online social networking and crowdsourcing in general.
Specifics of the competition
Under the rules of the competition, the $40,000 challenge award would be granted to the first team to submit the locations of 10 moored, 8-foot, red weather balloons at 10 previously undisclosed fixed locations in the continental United States. The balloons were to be placed in readily accessible locations visible from nearby roads. The balloons were deployed at 10:00 AM Eastern Time on December 5, 2009, and scheduled to be taken down at 5:00 PM. DARPA was prepared to deploy them for a second day and wait for up to a week for a team to find all of the balloons.
Part of the purpose of the challenge was to force participants to discern actually pertinent information from noise. Many teams came across false reports of sightings, both accidental and deliberate: one permissible strategy was to spam social networks with false reports to throw competitors off the trail of real sightings. Verification of balloon sightings was therefore paramount to success.
The contest was announced only about a month before the start date, which limited the amount of time teams had to prepare. The ability of many to do so showed the effectiveness of mass and social media for distributing information and organizing people quickly. Information about the challenge actually spread in a window far shorter than a month: in the week preceding launch day, traffic to the official competition site rose from an average of 1,000 hits per day to 20,000 hits per day. Similarly, the efforts of many competing teams went viral in the last few days before the start date.
DARPA selected the date of the competition to commemorate the 40th anniversary of the Internet.
Even though DARPA was prepared to deploy the balloons for a second day and accept submissions for up to a week until a team found all 10 balloons, the MIT Red Balloon Challenge Team won the competition in under 9 hours. A team from the Georgia Tech Research Institute (GTRI), which located nine balloons, won second place. Two other teams found eight balloons, five found seven, and the iSchools team (which represented Pennsylvania State University, University of Illinois at Urbana-Champaign, University of Pittsburgh, Syracuse University, and University of North Carolina at Chapel Hill), whose strategy is described below, finished tenth with six balloons. In table form, the top ten teams were:
| Rank | Team | Location | Balloons found | Time |
| 1 | MIT Red Balloon Challenge Team | Cambridge, MA | 10 | 6:52:41 PM |
| 2 | GTRI "I Spy a Red Balloon" Team | Atlanta, GA | 9 | 6:59:11 PM |
| 3 | Christian Rodriguez and Tara Chang (Red Balloon Race) | Cambridge, MA | 8 | 6:52:54 PM |
| 4 | Dude It's a Balloon | Glen Rock, NJ | 8 | 7:42:41 PM |
| 5 | Groundspeak Geocachers | Seattle, WA | 7 | 4:02:23 PM |
| 6 | Army of Eyes Mutual Mobile | Austin, TX | 7 | 4:33:20 PM |
| 7 | Team Decinena | Evergreen, CO | 7 | 6:46:37 PM |
| 9 | Nerdfighters | Missoula, MT | 7 | 8:19:24 PM |
| 10 | iSchools DARPA Challenge Team | State College, PA | 6 | 6:13:08 PM |
The winning MIT team used a technique similar to multi-level marketing to recruit participants, with the prize money to be distributed up the chain of participants leading to successful balloon spottings, and all prize income remaining after distribution to participants to be given to charity. The team's strategy for public collaboration in finding the balloons was explained on their website:
We're giving $2000 per balloon to the first person to send us the correct coordinates, but that's not all -- we're also giving $1000 to the person who invited them. Then we're giving $500 to whoever invited the inviter, and $250 to whoever invited them, and so on ... (see how it works).
It might play out like this. Alice joins the team, and we give her an invite link like http://balloon.media.mit.edu/alice. Alice then e-mails her link to Bob, who uses it to join the team as well. We make a http://balloon.media.mit.edu/bob link for Bob, who posts it to Facebook. His friend Carol sees it, signs up, then twitters about http://balloon.media.mit.edu/carol. Dave uses Carol's link to join ... then spots one of the DARPA balloons! Dave is the first person to report the balloon's location to us, and the MIT Red Balloon Challenge Team is the first to find all 10. Once that happens, we send Dave $2000 for finding the balloon. Carol gets $1000 for inviting Dave, Bob gets $500 for inviting Carol, and Alice gets $250 for inviting Bob. The remaining $250 is donated to charity.
The strategy was a variant of the Query Incentive Network model of Kleinberg and Raghavan, with the main difference being that the incentive rewards in the team's technique scale down for later participants. The recursive nature of the reward had two beneficial effects. First, participants had an incentive to involve others, since new recruits would not become competitors for the reward but rather cooperating partners. Second, people not located in the United States were motivated to participate by passing along information even though they had no way of spotting a balloon in person. This helped the team garner over 5,000 participants, despite beginning with only four initial participants.
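The geometric payout described above can be sketched in a few lines. The amounts follow the team's published example ($2,000 to the finder, halving up the chain, remainder to charity); the function itself and its names are a hypothetical illustration, not the team's actual code:

```python
def balloon_payouts(referral_chain, finder_reward=2000):
    """Split the per-balloon prize along a referral chain.

    referral_chain lists participants from the balloon's finder back up
    to the root recruiter, e.g. ["Dave", "Carol", "Bob", "Alice"].
    The finder gets finder_reward; each inviter up the chain gets half
    of the previous payout. The per-balloon budget is twice the finder
    reward ($4,000 = $40,000 prize / 10 balloons), so the chain can
    never exhaust it; the unspent remainder goes to charity.
    """
    budget = 2 * finder_reward          # $4,000 per balloon
    payouts = {}
    reward = finder_reward
    for person in referral_chain:
        payouts[person] = reward
        reward //= 2                    # halve the reward at each hop
    charity = budget - sum(payouts.values())
    return payouts, charity
```

Run on the Alice/Bob/Carol/Dave chain from the quoted example, this yields $2,000, $1,000, $500, and $250 respectively, with $250 left over for charity. Because the payouts form a geometric series bounded by twice the finder's reward, the scheme stays within budget no matter how long the referral chain grows.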
To determine whether submissions were legitimate or fake, the team employed at least three strategies. The first strategy was examining whether there were multiple submissions for a location. If this was the case, then the likelihood of a balloon actually being there was thought to be higher. A second strategy was to check whether the IP address of the submitter matched the supposed location of the balloon. A third strategy was to examine photos accompanying the submission. Real photos included a DARPA employee and a DARPA banner, details which were not announced, while faked ones did not.
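The three checks above could be combined into a simple confirmation rule. The sketch below is a loose illustration of that idea, not the MIT team's actual code; the field names, weights, and threshold are invented:

```python
def assess_sighting(submissions, balloon_state):
    """Score one reported balloon location using the three heuristics:
    corroboration by multiple submissions, an IP-derived location that
    matches the claim, and a photo showing the DARPA banner/employee.

    Each submission is a dict with an 'ip_state' (state inferred from
    the submitter's IP address) and a 'photo_has_banner' flag.
    Weights and the acceptance threshold are hypothetical.
    """
    score = 0
    if len(submissions) > 1:             # independent corroboration
        score += 1
    if any(s["ip_state"] == balloon_state for s in submissions):
        score += 1                       # IP matches claimed location
    if any(s["photo_has_banner"] for s in submissions):
        score += 2                       # hardest evidence to fake
    return score >= 2                    # treat as likely genuine
```

A single report with a matching IP and a banner photo would pass, while a lone report with neither would be rejected, mirroring how the team weighted the undisclosed banner detail most heavily.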
A detailed analysis of the winning strategy highlighted the important role that social media played. Analysis of Twitter data showed that while some teams relied on large initial bursts of activity over Twitter, mentions of those teams quickly faded. It was argued that due to the recursive incentive structure, the MIT team was able to create a more sustained social media impact than most teams.
The second-place GTRI team used a strategy that relied heavily on Internet publicity and social media. They created a Web site three weeks before the launch day and used a variety of media-related efforts, including a Facebook group, in order to increase the visibility of the team and increase the chance that people who spotted the balloons would report the sightings to them.
The team promised to donate all winnings to charity to appeal to the altruism of participants. However, lacking an incentive structure as strong as the winning MIT team's scheme, their network of participants grew to only about 1,400 people.
With regard to validating submissions, the team assumed that because of the charitable nature of their effort, the number of false submissions would be low. They relied primarily on personal validation, conducting phone conversations with submitters.
The tenth-place iSchools team, which represented five universities, tried two distinct approaches. The first was directly recruiting team members to look for the balloons on launch day. These members included students, faculty, and alumni on official mailing lists and social media website groups for organizations on the team (e.g., Pennsylvania State University). Only a few of these observers actually participated, however, and only one balloon was found using this strategy.
The second strategy was using open-source intelligence methods to do cyberspace searching for results related to the challenge. This was the main source of their success in locating balloons. This strategy, in turn, consisted of two distinct sub-strategies. The first was to use a group of human analysts who would manually search online on a variety of information sources, including Twitter and the websites of competing teams, compile reported sightings, and then evaluate the validity of sightings based on the reputation of the sources.
The second sub-strategy was an automated Web crawler that captured data from Twitter and opposing teams' websites and then analyzed it. This technology worked slowly and would have benefited from a longer contest duration, but the Twitter crawler proved especially useful because tweets sometimes contained geographic information.
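The kind of geographic extraction such a crawler might perform can be sketched with a simple pattern match over tweet text. The iSchools team's actual crawler is not public; the regular expression and example text below are hypothetical:

```python
import re

# Matches decimal latitude/longitude pairs such as "33.749, -84.388".
COORD_RE = re.compile(r"(-?\d{1,2}\.\d+)\s*,\s*(-?\d{1,3}\.\d+)")

def extract_coords(tweet_text):
    """Pull (lat, lon) pairs out of free-form tweet text, keeping only
    pairs that fall within valid latitude/longitude ranges."""
    coords = []
    for lat_s, lon_s in COORD_RE.findall(tweet_text):
        lat, lon = float(lat_s), float(lon_s)
        if -90 <= lat <= 90 and -180 <= lon <= 180:
            coords.append((lat, lon))
    return coords
```

A real crawler would combine such extraction with place-name lookup and source-reputation scoring, but even this crude filter turns a stream of tweets into mappable candidate sightings.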
To confirm the validity of possible sightings, recruited team members were used when possible. If none were available, new observers were recruited from organizations located near the sighting. The distributed location of the different organizations in the team allowed this to be a feasible strategy. Photographic analysis was used to confirm or dispute the validity of claims.
The team also encountered a case of another team accidentally leaking information about a sighting and then trying to cover it up. The iSchools team used a variety of information sources, including social networks, to determine the real location. This demonstrated that information drawn from a wide variety of public websites can be cross-checked to verify a reported sighting.
Prior to the competition, numerous people had discussed possible strategies, including satellite photography, aerial photography, and crowdsourcing to detect balloons, as well as the possibility of misinformation campaigns to stop other teams from winning. In the actual competition, teams employed a wide variety of strategies.
One team leader, Jason Brindel of San Rafael, California, organized a team of around 140 people. His plan was to create a web site and Twitter account dedicated to the challenge that would allow his team members to communicate their findings. Anyone participating in the challenge would be allowed to submit information, provided that they included details corroborating their submission. Brindel planned to have the team scour the Internet for mentions of balloons across news sites, blogs, and social media sites.
George Hotz, a Twitter celebrity later famous for hacking the PlayStation 3 and settling a lawsuit with Sony, prepared for only about an hour, posting a single tweet an hour before the start of the competition. Hotz was able to locate eight balloons: four were found within his Twitter network of almost 50,000 followers, and four were acquired through trades of information with other teams.
The fifth-place finisher, the Groundspeak Geocachers, deployed active geocachers and Groundspeak employees to search for balloons. They were successful in finding eight balloons, but due to a data entry error, were only credited with seven.
A team calling themselves Nerdfighters utilized their existing network of followers from the Brotherhood 2.0 vlog to launch a viral video before the competition. They managed to attract 2,000 active balloon seekers. They also utilized 3,000 Nerdfighters who scanned for Internet traffic related to the competition and specialized in launching a misinformation campaign, hoping to confuse or misdirect other teams. They also created a network of cell phone users to provide direct text message verification of findings.
A team of iPhone application developers based in Austin, TX formed Army of Eyes. They developed their application soon after the original challenge announcement so that it would be available by challenge launch day.
The iNeighbors team, made up of members of an existing social media site for neighborhood watch communities, performed no recruitment or trading efforts. Their goal was to evaluate the ability of their network to effectively report on abnormal activity within neighborhoods. They were able to successfully locate five of the ten balloons.
The challenge generated a number of insights. First, it showed how mass and social media can act complementarily. While mass media were useful primarily for spreading general information about the challenge, social media were effective for viral dissemination of information about the challenge to potential team recruits. Second, it showed how social media can be useful as a data mining source. For example, the iSchools team did better than many other teams by simply monitoring public websites. Third, the challenge showed the variety of ways in which social networking can be utilized: the MIT and GTRI teams used social networks primarily to facilitate fast communication among participants, while the iSchools team used them as a source of information.
Fourth, the challenge showed the general effectiveness of crowdsourcing techniques for solving geographically distributed, time-sensitive problems. The DARPA program managers were surprised by how quickly the challenge was completed. However, filtering useful data from public sites can be difficult, and independently verifying publicly listed information remains hard to do both efficiently and accurately.
DARPA noted that though social networks can be a powerful source of intelligence, using them may be politically sensitive due to the privacy concerns involved with data mining user content. Similarly, the winning MIT team surmised that their recursive approach would only be effective if the effort's goal was seen to be moral and good by its participants.
Verified balloon locations
The officially verified coordinates of the balloons, listed by their tag numbers, were:
- Balloon 1: Union Square, San Francisco, California
- Balloon 2: Chaparral Park, Scottsdale, Arizona
- Balloon 3: Tonsler Park, Charlottesville, Virginia 
- Balloon 4: Chase Palm Park, Santa Barbara, California
- Balloon 5: Lee Park, Memphis, Tennessee
- Balloon 6: Collins Avenue, Miami, Florida
- Balloon 7: Glasgow Park, Christiana, Delaware
- Balloon 8: Katy Park, Katy, Texas
- Balloon 9: Waterfront Park, Portland, Oregon
- Balloon 10: Centennial Park, Atlanta, Georgia
Inspired by the success of the DARPA Network Challenge, DARPA launched the Shredder Challenge in 2011. This competition aimed to explore methods to reconstruct documents shredded by a variety of paper shredding techniques. As with the DARPA Network Challenge, some teams used crowdsourcing to solicit human help in reconstructing the documents. The winning team used a computer-vision algorithm to suggest fragment pairings to human assemblers for verification.
In January 2012, the University of Pennsylvania School of Medicine launched the MyHeartMap Challenge to map Automatic External Defibrillators (AEDs) in the city of Philadelphia. According to the organizer Dr. Raina Merchant, "DARPA succeeded with locating red balloons. AEDs are a natural extension of a brilliant idea."
Also inspired by the DARPA Network Challenge, a contest called Tag Challenge was sponsored by the United States Department of State and the Institute of International Education. Tag Challenge sought to have teams locate and obtain pictures of five individuals in five different cities across North America and Europe within twelve hours on March 31, 2012. Despite the fact that the potential winnings were considerably lower than for the DARPA Network Challenge, organizers sought to test the ability of the methods discovered in that challenge to "find a person of interest" rather than a statically located object.