1st & Ten (graphics system)
1st & Ten is the name of a computer system that generates and displays one version of the yellow first down line that a TV viewer sees during a live broadcast of a college or professional American football or Canadian football game. A competing system that performs the same task using different technology is called L-VIS (Live Video Insertion System). The line, which is not physically present on the field and is seen only by the television audience, spans the width of the football field and indicates the location of the first down marker. The purpose of the line is to make it easier for television viewers to follow play on the field. Some television football broadcasts change the color of the line from yellow to red on 4th down, or show a second computer-generated line (usually blue) that marks the line of scrimmage.
The system uses a combination of motion sensors mounted on the broadcast cameras to record where they are pointing, match moving computer graphics technology, and an enhanced version of chroma key ("green screen") technology.
The idea of creating an on-field marker to help TV viewers identify 1st down distances was conceived and patented in 1978 by David W. Crain, who presented the concept to Roone Arledge and Roger Goodman of ABC News and Sports and to the CBS Technology Center. At the time, both decided the broadcast industry was not ready to use Crain's invention. In 1998, ESPN programmer Gary Morgenstern and others revived the idea. ESPN's NFL coordinating producer, Fred Gaudelli, was tasked with overseeing an implementation for the network. The 1st & Ten line was first broadcast by Sportvision, a private company, during ESPN's coverage of a Bengals-Ravens game on September 27, 1998. Later that season, on Thanksgiving Day (November 26, 1998), Princeton Video Image (PVI) aired its version of the virtual yellow down line on a CBS broadcast of a Pittsburgh Steelers-Detroit Lions game. Four years later, SportsMEDIA introduced a third version during NBC's coverage of a Notre Dame game.
The rivalry between PVI and Sportvision began with a collaboration. In July 1995, PVI had successfully used its L-VIS (Live Video Insertion System) match moving technology to broadcast virtual advertising behind home plate on a local broadcast of a Trenton Thunder baseball game in Trenton, NJ. In January 1996, Roy Rosser, Director of Special Projects at PVI, saw Sportvision's FoxTrax puck on the broadcast of the 1996 NHL All-Star Game and realized that a combination of L-VIS and FoxTrax would allow virtual insertions in a wider range of situations than either could handle on its own, given the computing power affordable at the time. He contacted Stan Honey, CTO at Sportvision, and the two companies undertook a joint demonstration of their combined technologies during the 1996 World Series between the Atlanta Braves and the New York Yankees at Atlanta-Fulton County Stadium. The test was not a success and the two companies parted ways, each developing complementary systems that were eventually used to broadcast Sportvision's "1st & Ten" line and PVI's "Yellow Down Line". In October 1999, Sportvision sued PVI, alleging that PVI's virtual signage, first down line, and other products infringed Fox/Sportvision patents. In August 2001, PVI counterclaimed against Sportvision in the federal court action, alleging that Sportvision's virtual strike zone and virtual signage products infringed a PVI patent. In 2002, the companies settled the lawsuits out of court through a cross-licensing deal.
Before the game
Each football field has a unique crown and contour and is not perfectly flat, so a 3D model of the field is made prior to the game. Because the field changes little over the course of a season, this 3D model is usually generated only once per season. The field also has a unique color palette, typically various shades of green, depending on the type of surface (i.e. real or artificial grass) and the weather (e.g. bright, shady, or even snowing). In addition, after the cameras are set up, the position of each camera relative to the field is established so it can be used in conjunction with the previously created 3D model of the field.
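A field's crown can be captured in a model as simple as a function mapping a field position to an elevation. The sketch below is a minimal illustration of that idea, assuming a parabolic crown that is highest along the center line; the dimensions and crown height are illustrative, not taken from any real survey.

```python
# Minimal sketch of a field "crown" model: elevation at any (x, y) point on
# the field, using a simple parabolic profile across the field's width.
# All numbers are illustrative assumptions, not from a real survey.

def make_crown_model(width_yd=53.3, length_yd=120.0, crown_in=18.0):
    """Return a function z(x, y) giving elevation in inches above the
    sidelines: highest along the center line, zero at each sideline."""
    half = width_yd / 2.0
    def elevation(x, y):
        # x: distance across the field from one sideline (yards)
        # y: distance along the field (yards), unused by this simple profile
        t = (x - half) / half            # -1 at one sideline, +1 at the other
        return crown_in * (1.0 - t * t)  # parabolic drop toward the sidelines
    return elevation

z = make_crown_model()
print(round(z(53.3 / 2, 50), 1))  # center of the field: 18.0
print(round(z(0.0, 50), 1))       # at the sideline: 0.0
```

A real system would instead interpolate surveyed elevation samples, but the interface (position in, height out) is the same.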
There are usually a number of cameras shooting the field, but typically only three main cameras are used for an American football broadcast (one on the fifty-yard line, and one on each twenty-yard line). The cameras whose video will be used with the graphics system have electronic encoders within parts of the camera assembly (in the lens and in the moving platform the camera sits on, sometimes called a "panhead") that monitor how the camera is used during the game (pan, tilt, zoom, focus and extender). The encoders transmit that information live 30 or more times per second to the broadcaster's production truck, where it is processed by Sportvision computers (typically one per camera). A camera with this type of extra hardware is usually called an "instrumented" camera. This information helps keep the yellow 1st & Ten line in the proper place, without distortion, whenever the camera follows the players or the ball.
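The per-frame data an instrumented camera streams can be pictured as a small record with one field per encoder channel. This is a sketch only; the record layout and field names are our assumptions, not Sportvision's wire format.

```python
# Sketch of the per-frame pose record an "instrumented" camera might stream
# to the truck. One field per encoder channel named in the text (pan, tilt,
# zoom, focus, extender); the exact format is an assumption.

from dataclasses import dataclass

@dataclass(frozen=True)
class CameraPoseSample:
    camera_id: int
    frame: int          # video frame this sample is synchronized with
    pan_deg: float      # rotation of the panhead, in degrees
    tilt_deg: float
    zoom: float         # normalized: 0.0 (widest) to 1.0 (tightest)
    focus: float
    extender: bool      # whether the 2x lens extender is engaged

# At 30 samples per second and 30 fps video, one sample per frame:
samples = [CameraPoseSample(1, f, 12.5, -3.0, 0.4, 0.7, False)
           for f in range(30)]
assert len(samples) == 30
```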
In larger productions, several other cameras can be "instrumented" to work with the graphics system, but these are usually limited to the following additional types: a camera placed high enough to see the whole field, typically called the "all 22" camera, and a camera shooting from above one end zone, called an "end-zone camera" or, in the industry, often just "camera 4". The Skycam (a moving camera suspended on cables above the field) can also be used to draw a yellow line over its video, but its mechanism differs in some major ways from that of the typical "instrumented" camera.
For the initial implementation, there were seven computers in total and a crew of four. Recent implementations require around four computers (one per camera, plus a shared computer for chroma-keying and other tasks), which can be run by a single operator, although two is optimal. The primary operator usually uses a KVM switch to move between camera computers and has a separate monitor, keyboard, and mouse for the chroma-keying computer.
Of the original four-member crew, two members, one inside the stadium and one in front of a computer, communicated the position of the real first down line to make sure everything was working. The third crew member was a troubleshooter. The last crew member monitored the various colors that make up the color palette onto which the line is drawn.
In recent setups, only a single operator is required for all cameras. The operator clicks on the ball in the video to set the line of scrimmage and right-clicks where the first down line should be (or presses a button that automatically positions it 10 yards in the direction of play). If lighting conditions are relatively stable, the primary operator can also monitor the chroma-key settings, but a secondary operator is often used when conditions become too variable.
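The automatic placement described above is simple arithmetic: advance the line of scrimmage 10 yards in the direction of play and clamp at the goal line. A minimal sketch, with a yard-line convention (0 to 100 from the offense's own goal line) and function name that are our assumptions:

```python
# Sketch of the "auto-place" button: first down target is 10 yards beyond
# the line of scrimmage, clamped at the goal lines. The 0..100 yard-line
# convention and names are illustrative assumptions.

def first_down_line(scrimmage_yd: float, direction: int,
                    to_go: float = 10.0) -> float:
    """direction is +1 or -1 depending on which way the offense is driving."""
    target = scrimmage_yd + direction * to_go
    return max(0.0, min(100.0, target))  # clamp at the goal lines

print(first_down_line(25.0, +1))  # 35.0
print(first_down_line(95.0, +1))  # 100.0 (goal to go)
```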
Each camera's set of encoders transmits position data to an aggregator box that translates the digital information into modulated audio, which is sent down to the corresponding camera computer in the truck. This data is synchronized with the video from that camera. At the camera computer, the camera position data is demodulated back into digital data for use by the program that draws the "yellow line" over the video.
Separately, the chroma-keying computer is told what colors of the field are okay to draw over (basically grass) and that information is sent to the camera computers.
The old way
The first computer in the truck gathers all the separate readings from the cameras and transmits a single, consolidated data stream to the central computer.
The central computer takes these readings, the 3D field model and color palette, and the knowledge of which camera is on the air, and uses a geometric calculation to determine which pixels in the video frame make up the first down line. All pixels that are obstructed by a player, a referee, the ball, or any other object are identified and excluded from the calculation. This ensures that the 1st & Ten line is projected only onto the field.
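The geometric step above can be sketched with a basic pinhole camera model: a 3D point on the field (such as an endpoint of the first down line) is rotated into the camera's frame using the pan and tilt readings, then perspective-divided into pixel coordinates. This is a deliberately simplified model (no lens distortion, no extender, illustrative focal length), not the production algorithm.

```python
# Minimal pinhole-projection sketch: map a 3D field point to (u, v) pixel
# coordinates using camera position, pan, tilt, and an effective focal
# length in pixels. Simplified: no lens distortion is modeled.

import math

def project(point, cam_pos, pan_deg, tilt_deg, focal_px, cx=960, cy=540):
    """Project a 3D field point (x, y, z) to (u, v) pixel coordinates."""
    # Translate into a camera-centered frame
    dx = point[0] - cam_pos[0]
    dy = point[1] - cam_pos[1]
    dz = point[2] - cam_pos[2]
    p, t = math.radians(pan_deg), math.radians(tilt_deg)
    # Rotate by pan about the vertical axis...
    x1 = dx * math.cos(p) + dy * math.sin(p)
    y1 = -dx * math.sin(p) + dy * math.cos(p)
    # ...then by tilt about the camera's horizontal axis
    y2 = y1 * math.cos(t) + dz * math.sin(t)   # depth along the optical axis
    z2 = -y1 * math.sin(t) + dz * math.cos(t)
    # Perspective divide onto the image plane
    u = cx + focal_px * x1 / y2
    v = cy - focal_px * z2 / y2
    return u, v

# A point straight down the optical axis lands at the image center:
u, v = project((0, 50, 0), (0, 0, 0), 0.0, 0.0, 2000.0)
print(round(u), round(v))  # 960 540
```

Evaluating this projection for the line's endpoints (and the field model's elevations along it) yields the pixel span the line should occupy; the occlusion test then decides which of those pixels are actually painted.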
The PVI Virtual Media system relies on a single spotter to relay the down and distance, and a single operator at the studio, as its vision system does not need camera data to perform the insertion. The primary operator of the Sportvision system does the spotting by simply clicking on the video to place the line.
The only pixels that should change are the ones that are the same color as the field, typically several shades of green. As a result, a few situations are difficult. One is when a player's uniform color nearly matches that of the field (for example, the Green Bay Packers' jerseys on a bright, sunny day, or Bronco Stadium in Boise, Idaho, where the field and the home team's uniforms share the same blue shade). Another is when the field itself changes, as during a rain or snow storm, or when a grass field becomes very muddy; in those cases, the field's color palette must be expanded to include brown and/or white shades. The most difficult situation is when the shade of the field is constantly changing, as when moving clouds cast shadows over some parts of the field but not others while drifting across it.
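The keying decision described above can be sketched as a per-pixel test: repaint a pixel yellow only if its color falls within the field palette. The crude RGB thresholds below are illustrative assumptions; real systems key with a richer palette and more robust color spaces.

```python
# Sketch of the chroma-key decision: a pixel is repainted yellow only when
# its color matches the field palette (here, a crude "green dominates"
# test in RGB). Thresholds are illustrative assumptions.

def is_field_color(r, g, b):
    """True when the pixel looks like grass: green clearly dominates."""
    return g > 90 and g > r + 20 and g > b + 20

YELLOW = (255, 215, 0)

def key_pixel(pixel):
    """Repaint field-colored pixels; leave players, refs, and ball alone."""
    return YELLOW if is_field_color(*pixel) else pixel

print(key_pixel((40, 160, 50)))   # grass pixel: becomes (255, 215, 0)
print(key_pixel((30, 60, 120)))   # blue jersey pixel: stays (30, 60, 120)
```

The hard cases in the text map directly onto this test: a green jersey passes the field test when it should not, and a muddy or shadowed patch fails it when it should pass, which is why the palette must be monitored and adjusted during the game.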
The data collection and computation also require time. The audio feed passes through an audio delay so that it stays synchronized with the delayed video. The total delay for the viewer, relative to the live feed, ends up being about 2/3 of a second.
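At 30 frames per second, a 2/3-second delay is about 20 frames, so the synchronization step amounts to holding each feed in a fixed-length buffer. A minimal sketch of such a delay buffer, with names of our own choosing:

```python
# Sketch of a fixed frame-delay buffer: push the live frame in, get back
# the frame from `delay_frames` ago (None until the buffer has filled).
# At 30 fps, 20 frames is roughly the 2/3 s viewer delay described above.

from collections import deque

class FrameDelay:
    def __init__(self, delay_frames=20):
        self.buf = deque(maxlen=delay_frames + 1)

    def push(self, frame):
        self.buf.append(frame)
        return self.buf[0] if len(self.buf) == self.buf.maxlen else None

d = FrameDelay(delay_frames=20)
out = [d.push(i) for i in range(25)]
print(out[19], out[20], out[24])  # None 0 4
```

Running the audio through an identical delay keeps both feeds aligned once the graphics computation has caught up.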
After the camera computer has determined which pixels represent the 1st & Ten line, it takes that pixel information and draws the yellow line into the video at around 60 times per second (depending on the video refresh rate).
In recent years the system has been upgraded to add more features. During Fox broadcasts, the Sportvision system also generates an arrow-like graphic on the field with down and distance text information inside of an arrow pointing in the direction of play. Competitors have also added this feature in recent years.
Additionally, the Sportvision system can also place virtual graphics that have another embedded video feed inside them like a video picture frame. This is sometimes called "video-in-perspective".
This technology is also the basis for showing ads where they do not physically exist (e.g. behind home plate in baseball during national broadcasts), and for Race F/X, in which graphics can be displayed on the race track and information can follow a specific car no matter what the camera does. This technology is used by CBS, ESPN, Fox, NBC, NFL Network, RDS, TSN, and TNT.
- "Kicking Reality Up a Notch", by Leslie Berlin, The New York Times, July 11, 2009
- "When the Game's on the Line, the Line's on the Screen", by Matt Lake, The New York Times, January 27, 2000
- US Patent 5,917,553, "Method and apparatus for enhancing the broadcast of a live event"
- US Patent 6,100,925, "Image insertion in video streams using a combination of physical sensors and pattern recognition"
- US Patent 5,953,076, "System and method of real time insertions into video using adaptive occlusion with a static reference image"
- US Patent 4,084,184, "TV object locator and image identifier", David W. Crain
- How Stuff Works, "The First Down Line"
- "Football Made Simpler", by Glen Dickson, Broadcasting & Cable, June 7, 1999
- Sportvision, Inc. and Fox Sports Productions, Inc. v. Princeton Video Image, Inc., Civil Action No. 99-CV-20998 (N.D. Cal.)
- "Sportvision, Fox and PVI Settle Patent Litigation; Lawsuits Resolved By Cross-Licenses and Interference Not Pursued", Business Wire, February 21, 2002
- "Untitled Document". Archived from the original on 2009-09-26. Retrieved 2009-09-24.
- Sportvision
-  PVI Virtual Media Services
-  Computing Basics - How Did They Do That? Thin Yellow Line