
Disc Storage format

1080p-encoded titles have been released on DVD, Blu-ray Disc and HD DVD. 1080p content has been released exclusively on Blu-ray after July 1, 2008 (does this mean all 1080p is Blu-ray, or all Blu-ray is 1080p?). Blu-ray players have been able to output 1080p video since their inception.[citation needed] Current Blu-ray players allow output of film-based material in conventional interlaced 1080i60 form. 1080p displays that are able to apply 3:2 pulldown reversal can deinterlace film-based content and achieve full 1080p image quality.[citation needed]
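The 3:2 (2:3) pulldown reversal mentioned above can be sketched in a few lines. This is a simplified illustration, not any player's actual firmware: frames are modeled as (top_field, bottom_field) pairs, the cadence is assumed locked, and real pulldown's alternating field dominance is ignored for clarity.

```python
# Simplified sketch of 2:3 pulldown and its reversal. A film frame is
# modeled as a (top_field, bottom_field) pair; real pulldown also
# alternates field dominance, which is omitted here.

def pulldown_2_3(frames):
    """Spread 24 fps progressive frames across a 60i field sequence:
    frames alternately contribute 2 and 3 fields, so every 4 film
    frames become 10 fields (5 interlaced frame periods)."""
    fields = []
    for i, (top, bottom) in enumerate(frames):
        count = 2 if i % 2 == 0 else 3
        fields.extend([top, bottom, top][:count])  # 3rd field repeats the top
    return fields

def reverse_pulldown_2_3(fields):
    """Recover the progressive frames by dropping each repeated fifth
    field and re-pairing the rest (assumes the cadence is locked)."""
    frames = []
    for i in range(0, len(fields), 5):
        group = fields[i:i + 5]
        if len(group) >= 2:
            frames.append((group[0], group[1]))
        if len(group) >= 4:
            frames.append((group[2], group[3]))
    return frames

film = [("t%d" % i, "b%d" % i) for i in range(4)]
assert reverse_pulldown_2_3(pulldown_2_3(film)) == film  # lossless round trip
```

A display that detects this cadence can discard the duplicate fields and weave the remaining pairs back into the original progressive frames, which is why pulldown reversal can recover full 1080p detail from film-sourced 1080i60.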

This Page Needs To Be Locked

If possible, can someone please lock this page to prevent others from adding pointless banter that contributes nothing. Vlad788 10:46, 5 Jun 2005 (UTC)

"1080p (professional) is similar to 1080i (inexperienced), but twice as good

1080i is interlaced, whereas 1080p is non-interlaced. which is only really relevent if you have a really old fashioned CRT monitor"

It has been deleted, but please lock this if possible to prevent this from occurring again. Thanks ;).


The second point is absolutely correct and should not have been deleted. 1080i material is deinterlaced on plasma and LCD displays, so standard 1080i50 material is in fact rendered at 1080p25, meaning that there is basically no difference in the end result. (talk) 09:12, 19 June 2008 (UTC)

That doesn't seem right. How can you deinterlace without causing motion artifacts? Either you are wrong or Deinterlacing is. Snielsen (talk) 00:55, 22 October 2008 (UTC)
No, that is correct. If the source material is 25fps and it is interlaced to 50 fields, then deinterlacing will fully restore the original 25fps progressive content with no artefacts. Teppic74 (talk) 14:16, 24 September 2011 (UTC)
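Teppic74's point can be demonstrated with a toy example: when both fields come from the same instant (25 fps progressive content carried as 1080i50), splitting a frame into fields and weaving them back is an exact inverse. The frame here is an arbitrary list of rows, purely for illustration.

```python
# Toy demonstration that interlacing 25 fps progressive material into
# 50 fields and weaving it back loses nothing.

def split_fields(frame):
    """Return (top_field, bottom_field): the even and odd rows."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Re-interleave two fields taken from the same source frame."""
    woven = []
    for t, b in zip(top, bottom):
        woven.extend([t, b])
    return woven

frame = [[y * 10 + x for x in range(4)] for y in range(4)]  # 4x4 test frame
assert weave(*split_fields(frame)) == frame  # exact reconstruction
```

The same reasoning is why 1080i50 carrying 25 fps material deinterlaces to 1080p25 without artefacts; it breaks down only when the two fields were sampled at different moments in time.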

This statement has an incorrect reference and doesn't make sense: "1080p and 1080i are currently the highest-resolution formats..."

Two problems here: 1) the citation does not actually mention 1080i at all; 2) it is a misleading statement.

There is a lot of misunderstanding, often helped along by misleading marketing in the HD space. By saying 1080p and 1080i are the highest-resolution formats, one could easily imply - incorrectly - that they are about the same. In fact, most consumer products marketed as high-definition are either 720p/1080i or upscaling equipment that is used simply to present a decent version of lower-resolution material on high-res displays. We should try to clear up this confusion rather than add to it. Perhaps this might be a good substitute:

The high-definition consumer marketplace changes rapidly. Only in recent years has 1080p source material (links to Blu-ray and HD DVD) become available. Many high-definition products are sold today that display in the somewhat lower resolutions of 720p and 1080i, or provide link:upscaling capability to display lower-resolution source material on higher-resolution displays. Most broadcast high-definition material is in no higher than 720p/1080i format. The highest-resolution format available on the broad consumer market is 1080p source material presented on 1080p-capable TVs and displays.

Technically speaking they are the same resolution, just with different scanning methods. --Ray andrew (talk) 21:59, 4 January 2008 (UTC)
No. Only half the lines are drawn at a time, so it takes far fewer physical pixels to display 1080i. You can display 1080i on a 720p display; you need a 1080p display with more pixels to display a 1080p image. 1080p creates a sharper, higher-resolution image. (Also see the talk point below on "highest resolution formats".) From a subjective point of view, many people find 720p to be a better picture than 1080i. That is a secondary point. The key is that 1080p is much better than either of those. —Preceding unsigned comment added by (talk) 16:21, 7 January 2008 (UTC)
Sorry, but that's completely incorrect; see Display_resolution#overview. --Ray andrew (talk) 18:56, 7 January 2008 (UTC)
Ok, I didn't say this clearly. For the last several years HD meant 720p or 1080i. True, 1080i means 1080 lines, but 1080i has for the most part been displayed on 720p displays all this time. Strictly speaking this is downscaling - you are right. My point here is that the marketing materials can mislead people into buying lower-resolution equipment by calling everything HD. If you buy a display that can only display 720p and 1080i, you are likely buying a machine with less pixel density and a lower-resolution image than a 1080p native display. You are right - I was confusing the display capability and the standard. All the more reason this statement should be rewritten. Here is another attempt. Maybe too long, but I've said my piece:

1080p Native Resolution

1080p refers to a resolution standard but is also used to describe video equipment capabilities. For example, video equipment that upscales to 1080p takes lower-resolution material and reformats it for a higher-resolution display. The image that results is different from the display of original 1080p source material on a native 1080p-capable display. Some displays with a native 720p resolution advertise that they can display high-definition 720p and 1080i resolutions. In this case a downscaling from the original 1080i image is applied for display on the lower-resolution display. 1080p native resolution equipment is equipment that can present 1080p and 1080i resolution images in full detail without downscaling. 1080p source material (e.g. Blu-ray and HD DVD) and 1080p native resolution video equipment are both available in the consumer marketplace today. —Preceding unsigned comment added by (talk) 20:30, 8 January 2008 (UTC)

Meaningless marketing nonsense

"True High Definition" is meaningless marketing poop and should not be here. Mirror Vax 10:46, 5 Jun 2005 (UTC)

Poop or not, it is now widely used in the industry and was certainly prominent at the CES. I haven't seen much objection to the term (actually haven't seen any until now). Definitions (powered by uses it in its definition of full HDTV. is a resource for small and medium-sized business professionals. is an IT encyclopedia. I'm sure you are familiar with CNET as well.,,sid44_gci1071898,00.html
Hope that helps. Parmaestro 11:33, 5 Jun 2005 (UTC)
I did a Google search and found that the vast majority of references to "true high definition" did NOT refer to 1080p. So in the last CES "true high definition" meant 1080p - next CES it will mean something else. It's not a technical term - it refers to whatever they are currently selling. It doesn't mean anything more than "New and Improved". Mirror Vax 11:55, 5 Jun 2005 (UTC)
Let's try to incorporate whatever concerns and objections you have in the text. Parmaestro 12:11, 5 Jun 2005 (UTC)
Since it is a buzzword and used in marketing materials, I changed the sentence to match. Regardless, it is the best available. However, I don't know if there is source material that can be displayed on a 1080p set. Microsoft has some videos, but I don't know if you can use a DVI connector from your computer to display on a 1080p set. I don't have the money to test, but if someone wants to buy me an HDTV set... : ) --Daev 18:40, 9 August 2005 (UTC)


I'm pretty sure that "Most 17" computer monitors that support 1280x1024 60hz" would display an 'Out of Range' or similar message if pushed to 1920x1080. Simply forcing a graphics card's output to a certain level does not make the monitor compatible. Also, the frequency is 'Hz', not 'hz'.

You'd think that, but every single monitor I've seen that does 1280x1024@60Hz can do 1920x1080@60Hz. Horizontal resolution matters not to the monitor, just the pixel clock in the graphics card. Vertical resolution, on the other hand, does matter, but since 1080 lines aren't that far from 1024 (56 extra lines), these monitors generally don't have any trouble. The physical size of the monitor (17") doesn't really matter, though; it's just a generalization based on average maximum timings by size.
This statement that most CRTs can display 1920x1080@60Hz if they can do 1024x768@85Hz is ridiculous. Simple maths shows that the required pixel clock for the former is about 124MHz; the latter is only 67MHz. There certainly are CRTs that can handle it, but the average 17" CRT has a pixel clock somewhere in the vicinity of 85MHz IIRC (it's been some years since this was relevant...), which is miles away from what it'd need. If a citation can't be provided, this statement should be removed or at least heavily edited. —Preceding unsigned comment added by (talk) 21:32, 11 September 2008 (UTC)
Pixel clock (and thus horizontal pixel count) is meaningless for an analog VGA connection. You could drive a display that supports 1024x768 with an 8192x768 signal and it would work. The display's bandwidth limit would attenuate high frequencies, i.e. horizontal details, though. The only things that matter with a VGA connection are the vertical frequency (refresh rate) and the horizontal frequency (refresh rate times the total number of lines). Since "1080p" has almost the same number of lines as 1280x1024, the horizontal and vertical frequencies are nearly the same. Displays usually support a range of horizontal and vertical frequencies (see multisync). Totsugeki (talk) 22:22, 18 February 2009 (UTC)
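The numbers in the two comments above are easy to check. The sketch below computes active-pixel rates only; the blanking intervals that real CRT timings require add very roughly another 25-35% on top, and the function names are purely illustrative.

```python
def active_pixel_rate_mhz(width, height, refresh_hz):
    """Pixel rate for the active picture alone; real timings add
    horizontal and vertical blanking on top of this."""
    return width * height * refresh_hz / 1e6

def line_rate_khz(lines, refresh_hz):
    """Horizontal (line) frequency: refresh rate times line count.
    Only active lines are counted here, again ignoring blanking."""
    return lines * refresh_hz / 1e3

print(active_pixel_rate_mhz(1920, 1080, 60))  # about 124 MHz
print(active_pixel_rate_mhz(1024, 768, 85))   # about 67 MHz
print(line_rate_khz(1080, 60))                # about 64.8 kHz
print(line_rate_khz(1024, 60))                # about 61.4 kHz
```

The last two lines support Totsugeki's point: 1920x1080@60 and 1280x1024@60 put almost the same line rate on the monitor, which is what a multisync CRT actually cares about over VGA.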

I agree; the first thing I thought when I read that sentence was "that's a stupid thing to say". If a monitor is capable of running at a resolution such as 1920x1080@60Hz, it should report so when it communicates its abilities over DDC.

DDC only reports standard resolutions.
I tried this on my crappy 6-year-old Gateway 17" CRT monitor and 1920x1080 @ 60Hz worked fine, if a bit blurry - I did this with the custom resolution feature in the Nvidia display control panel. However, I don't know how picky LCD monitors are in comparison. --Zilog Jones 00:08, 23 August 2005 (UTC)
LCD monitors use a completely different display method. They might show a picture in analog mode, but it would be downsampled to fit the native resolution. Most LCD monitors, though, would just say "Out of Range".
Most 17" monitors *cannot* display 1920x1080 at any refresh rate - they may take the signal, but they don't have sufficient dot pitch to display the full resolution (the reason the Gateway above was blurry). A typical 17" CRT has about 16" viewable in the diagonal and makes a 3-4-5 triangle, so the horizontal is 16*4/5 = 12.8". For 1920 pixels to crowd into 12.8" you have to have 150 pixels per inch. 150^(-1) = .0067 in, which is 0.169 mm - an unheard-of dot pitch on a CRT. For example, at work I have a 19" monitor that does 1280x1024 perfectly but doesn't look quite right at 1600x1200 - a quick calculation gives the reason: at the monitor's dot pitch, I only have 1400 dots in the horizontal. This monitor *cannot* display UXGA, even if it takes the signal.
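The dot-pitch arithmetic above checks out; here it is as a short calculation (the 16" viewable diagonal and the 4:3 shape are the assumptions made in the comment):

```python
# Verifying the dot-pitch estimate for a 17" CRT (16" viewable, 4:3).

viewable_diagonal_in = 16.0
width_in = viewable_diagonal_in * 4 / 5   # 3-4-5 triangle: 12.8" wide
ppi_needed = 1920 / width_in              # pixels per inch for 1920 across
pitch_mm = 25.4 / ppi_needed              # required dot pitch

print(width_in)            # 12.8
print(ppi_needed)          # 150.0
print(round(pitch_mm, 3))  # 0.169
```

0.169 mm is well below the 0.25-0.28 mm dot pitch typical of consumer CRTs of that era, which is why such a monitor may accept the signal yet cannot resolve the full horizontal detail.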

My ViewSonic E70f (17", 16" viewable, 1280x1024 60Hz) can display a 1920x1080 image reasonably well at a max of 60Hz. I wouldn't recommend it, though, because it causes the monitor to emit a high-pitched squeal.

Some other information regarding PC monitors seems less than accurate or just incomplete. For example, my ViewSonic G225fB (21", 20" viewable, 2048x1536 85Hz) is more than capable of taking a true 1920x1080p signal. I run it at 1920x1440, and that's really no different from 1920x1080 despite being a 4:3 screen. It just has black borders on the top and bottom. The black bar issue tends to be highly overrated. A 1920x1080 frame displayed on a 1920x1440 screen has no image loss, so the black bars are irrelevant. It matters more with VHS and DVD, where they physically cut into the available image size. The same is essentially true of 16:10 CRTs that can run at 1920x1200 or better. However, those are rare.

However, to achieve this without using a PC-based solution, such as with a console, HD-DVD player or HDTV tuner, you must pass it through a device capable of padding the signal or using fixed-aspect-ratio timing, or it will be distorted. In this case you must adjust the vertical size manually. Generalleoff 02:28, 15 October 2007 (UTC)

Consumer Television Capabilities

I can see some merit in mentioning capabilities specific to manufacturers. The Samsung comment about accepting 1080p was added by someone other than myself. I put in the capabilities of the Mitsubishi TVs because they were mentioned in the 2005 CES sentence, and to have a fairer representation than only one brand gaining extra mention.

I'm opening this discussion because this seems like a slippery slope. Should brand-specific information be removed altogether from this section, or limited in some way before even more people come along to make it even more comprehensive? Or should it stand since it's new information and technology, and the issue reevaluated when 1080p sets become more common and the list gets more out of hand?

Eventually most manufacturers will offer sets with 1080p, but at the moment they are not ubiquitous, so having specific information may be helpful to interested viewers. Early sets such as the Qualia have some historical significance. The Samsung HLR was mentioned because it was one of the first sets that were confirmed to accept 1080p. Ditto for the HP MD (possibly the first for HDMI). Shawnc 23:35, 6 November 2005 (UTC)
The CES in Jan 2006 introduced dozens of 1080p displays.[1]. Not all of these would be available initially, but some model-specific details in the main article can probably be removed now. Shawnc 23:21, 14 January 2006 (UTC)
Good call. Sounds much better that way, too. Rsalerno 23:50, 2 February 2006 (UTC)

collapse 1080i and 1080p into one entry?

They both have duplicate paragraphs; maybe they should just be combined into one entry.

They are distinct formats though. Shawnc 23:25, 14 January 2006 (UTC)

==> In the range of quality, 1080i is far from 1080p; one could see 1080i collapsed with 540p, with 720p in between... Also the field rate is important, as flicker artifacts are reported in 1080p24 but not in 1080p59.94 or 1080p60. The motion picture industry, which could not afford paying for Douglas Trumbull's Showscan, was limited to 24fps, with its stroboscopic effects (e.g. spokes rotating backwards, etc.), and that is the main reason behind the Blu-ray & Microsoft WMV-HD 1080p24 limit. As much as one would like 1080p60, most agree that artifacts even with the best 2:3 pulldown do not bode well for 24fps material, so even if one has a whole 1080p60 chain [Sony/Dalsa/Kinetta 1080p60 camera, native 1080p60 display], there is still a concern about 1080p24 display flicker. One could double-flash the frame, so I don't think it's hopeless, but the whole 1080p60 origination business is at stake until some Blu-ray MPEG4 1080p60 can be produced. Microsoft might be able to raise the cap in WMV-HD as well.

But what about PAL, which gets none of this 2:3 crap the other formats get (PAL just gets a 4% speed boost and an SSRC sound tempo change)? Some progressive PAL DVDs surpass some NTSC HD-DVDs because of this.

Why are 1080p and 2:3 pulldown being mentioned in the same sentence? 2:3 pulldown only relates to frame rate conversion between 24 and 30 frames, whether in 1080, 720, or standard definition. Sony has a progressive format referred to as psf, or progressive segmented frame, which just means that in laying the progressive frame to tape they dump one half (the odd lines) to one field and the other half of the progressive frame (the even lines) to the other field. If you have acquired footage shot in true interlaced 1080i, it will never become progressive in the same sense, because each field is its own moment in time, displaced from the previous field and the one after it by the same duration.

In regards to 24-frame flicker, this is a throwback to the flicker of movie theatres (remember those?), where audiences were literally entranced by the flashing of the screen. Some people believe this is a desirable effect...:>
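The distinction drawn above between psf and true interlace can be illustrated with a tiny synthetic example (a one-pixel "object" that moves one column per field period; all names and sizes here are made up for illustration):

```python
# psf: both fields sampled at the same instant -> weave is clean.
# True interlace: fields sampled at different instants -> weave combs.

WIDTH = 8

def row(pos):
    """A one-line 'image': a single bright pixel at column pos."""
    return [1 if x == pos else 0 for x in range(WIDTH)]

def weave(top, bottom):
    """Interleave a top field and a bottom field into one frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame.extend([t, b])
    return frame

# psf: the object sits at column 2 in BOTH fields (same moment in time)
psf_frame = weave([row(2), row(2)], [row(2), row(2)])

# true interlace: the bottom field is captured a field period later,
# by which time the object has moved to column 3
interlaced_frame = weave([row(2), row(2)], [row(3), row(3)])

assert all(r == row(2) for r in psf_frame)         # clean progressive frame
assert interlaced_frame[0] != interlaced_frame[1]  # "combing" between lines
```

This is why psf tape is progressive in all but name, while footage shot as true 1080i has no single progressive frame to recover: adjacent lines belong to different moments in time.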

You need Windows to play 1080p DVDs?

"Some 1080p and near-1080p content have been released on regular DVD-ROM disks using WMV HD compression. These titles cannot be played in normal DVD players and can only be viewed on a Windows-based computer with a 3.0 GHz or faster CPU, among other hardware requirements.[2]"

Why a Windows-based computer? And why 3.0GHz? Ever heard of Linux, or Mac for that matter?

That statement refers to WMV-HD only. The info from its webpage:

"Optimum Configuration (Play 1080p video with 5.1 surround sound) Microsoft Windows XP 3.0 GHz processor or equivalent" Shawnc 07:51, 24 March 2006 (UTC)

According to the source cited, Microsoft's WMV website, it reads exactly as above: "3.0 GHz processor or equivalent". However, the article cited an Intel Pentium 4, which is not mentioned in the reference. I removed the brand affiliation. Toastysoul 01 August 2006

1080p content

MacBreak is the first 1080p podcast!!!!! --BorisFromStockdale 20:28, 22 March 2006 (UTC)

Cell reference added

Quoting a performance threshold in clock frequency relative to an unspecified device/technology/manufacturer is bad dental hygiene, and grants Intel far more implicit credit than they deserve. I thought Cell would serve as a useful counterpoint, and that it was also pertinent enough, as 1080p is easily within Cell's performance wheelhouse, though my text might say more than best serves the article at that juncture.

This could be pertinent as well. Here Microsoft claims that 1080p cannot be properly supported by games on the current and upcoming generation of game consoles; one must take this FUD with a grain of salt, for while their admission concerning their own Xbox 360 is dead on the mark, the Sony PS3 borders on being able to pull this off, and perhaps some PS3 games (but not many) will pull this off once the dust finally settles. I might add that pulling this off does not necessarily add anything to the gaming experience; the feat could devolve entirely into adolescent bragging rights.

1080p used by studios?

I'm fairly sure that 1080p is not used for filming/editing most movies; the standard resolution in Hollywood is 2K (about 2048 pixels across). Digital projectors in movie houses are also not 1080p.

Listen to This Week in Media, episode 4 or 5 (I do not remember which one). There they talk about digital movie projectors in cinemas, and most projectors out now are 1080p... --BorisFromStockdale 18:28, 28 May 2006 (UTC)

graphic at bottom

isn't the NTSC standard 720x486, not 640x480? 20:21, 15 May 2006 (UTC)

Digital media designed for NTSC playback is usually 720x480. Some standards (e.g. Super Video CD) use 480x480. But NTSC itself is an analogue standard without a precisely defined number of horizontal pixels, so the definition you actually get from it is hard to quantify exactly. JulesH 08:06, 26 August 2006 (UTC)

Video Gaming section

Requesting a new Video Gaming section for talk on next-gen video game consoles and their ability to produce 1080p. --Jack Zhang 00:32, 30 September 2006 (UTC)

The two paragraphs that discussed Microsoft (with no mention of Sony, etc.) and 1080 upgrades in Xbox 360 software are nothing more than advertising one's fanboyism; they have nothing to do with the 1080 standard and belong in discussion of the video game systems, not in the resolution standard. That passage has been so edited as to be meaningless, and I removed the remaining paragraph.

Resolution is a quality of Gaming Consoles/Games that should be discussed on those pages. Gaming Consoles are not an element of a resolution standard that needs to be discussed on this page. 00:28, 5 November 2006 (UTC)

1080p broadcasts

Why isn't anyone broadcasting at 1080p (24fps and 30fps) even though it's an ATSC standard (see ), and has been since ATSC was conceived (not a late addition, see Internet Archive --> 1996 for example). Every television with an ATSC tuner (practically every digital TV in the US for example) supports 1080p at film frame rates. I don't know about cable or satellite tv, but it would be logical to broadcast movies and many television series over the air as 1080p in the US instead of degrading quality by encoding interlaced. Or is the article wrong in stating that no one broadcasts in 1080p? What about other standards than ATSC? I live in a country without HDTV so I can't check the bitstreams myself. Totsugeki

At least the conversion 1080p30 <-> 1080i60 (-> 1080p60 display) can and should be done losslessly. It is just a matter of indicating and selecting the correct (de-)interlacing mode. I am unsure whether 3:2 pulldown can be reversed losslessly for progressive displays, but I assume it is, too. The question is whether consumer hardware does what it should with the common interlaced signals. I further assume the general compatibility is better with interlaced signals, especially for CRT HDTV sets. That is just a lot of assumptions and semi-knowledge, I know. Christoph Päper 13:10, 22 November 2006 (UTC)

Ability of the eye to see 1080p section

Can someone clarify this section? It doesn't make any sense. -- 16:48, 1 March 2007 (UTC)

What in the section doesn't make sense? Should we start with the first sentence? Daniel.Cardenas 19:43, 1 March 2007 (UTC)
A person's ability to distinguish small details is described by visual acuity. When the individual pixels are barely resolvable, increased resolution would indeed bring no benefit for the viewer, unless the display could be brought closer. Just an anecdote: when I worked in a computer helpdesk, some older people complained about the tiny size of the pixels at a 1280x1024 resolution on a 17" TFT panel. The "high" resolution brought no advantage to those people, so 1024x768 had to be used instead. The section is misleading, though, because not all people watch displays at the limits of their vision. If a person can clearly see the pixels in a display, the resolution could then be doubled, the pixel size halved, and the viewing distance would not need to be changed. The section does not explain the variables pertaining to these scenarios. 23:49, 30 March 2007 (UTC)
I agree with the above. This section should be removed. The concept of visual acuity is not specific, unique, or even very notable to 1080p compared to other display formats. Visual acuity and the relationship between viewing size/distance apply to all displays and all resolutions. One could argue that the resolution of a video iPod is "too good" at "normal viewing distances" and that its 480p display offers "no benefit" to the viewer. Obviously an entry like that concerning visual acuity and portable devices should not exist on the 480p page, just like it shouldn't exist here on the 1080p page. Furthermore, when you specify the relationship between visual acuity and 1080p, it makes little or no sense to make no mention of 1080i (which is the same frame resolution). The very fact someone would include this is indicative of a POV. When 1080p became a popular marketing tool in the home electronics industry, it was met with a lot of resistance within the industry because at the time, virtually everyone was making non-1080p-capable displays. Fearing that consumers would feel slighted getting anything less than a 1080p-capable set, the issue became almost politicized. The "arguments against 1080p" were touted by retail salespeople, as well as consumers who had already invested in non-1080p sets. Internet message boards like AVSForum were flooded with arguments over 1080p. As ridiculous as it sounds, it became an emotional issue for many people. The fundamental concept of visual acuity in relation to high-res televisions is sound (obviously the human eye will not resolve 1920x1080 on a 19" display viewed from 15 ft. away). But the concept applies to all display resolutions. The screen size/viewing distance will determine how much detail you can actually see. Common sense. Yet when some people talk about 1080p, this concept has taken the form of a political talking point - weasel words and all. This section is an example of that.
Notice how a mention of visual acuity does not appear on the 1080i page, the QXGA page, or the 70mm film page. Lifterus 21:02, 5 May 2007 (UTC)

I don't see anyone above you suggesting the section be removed. The 1080i article is focused on the interlacing. QXGA is for computer work, and people don't sit far away; 70mm is an analog technology used in big theaters. I've pointed several people to this section and they have thanked me for the info. Daniel.Cardenas 21:31, 5 May 2007 (UTC)

The post by User:62:220.237.65 above me takes issue with the subjective nature and POV of the section. I mentioned that I agree with him or her. The 1080p article does not, nor should it, invoke questions or discussions of visual acuity any more than other formats like the ones I linked to. If the 1080i article is focused mainly on "the interlace", as you put it, and you believe that is appropriate (I don't), why doesn't this article just focus on "the progressive"? It's the same resolution, so any relationship regarding visual acuity with respect to 1080p applies just as equally to 1080i. The QXGA resolution, even at "normal computer viewing distances" on a 30" monitor, is still "too good" according to your own argument. And please explain why you think a visual acuity lesson belongs here but not on the 70mm film page? The fact that 70mm film is "analog", as you put it (film is almost never referred to as "analog" - it's misleading because it is nothing like analog videotape), has absolutely nothing to do with the human eye's ability to resolve images. Analog and digital have absolutely no relevance to this topic. 1080p in and of itself is not necessarily digital. You can display a 1080p image on an analog CRT set. In fact that is what most professional color grading monitors are - analog. You say you pointed people here and it helped them? What were you helping them with? Are you a television salesperson using Wikipedia to help you exercise a sales technique? If you think the subject of visual acuity and buying a new television is important, then create its own page explaining which resolutions for certain screen sizes at certain distances are ideal. But it doesn't belong here on the 1080p page, just like it doesn't belong on the 480p page, the 720p page, the 1080i page, etc. Not to mention the HDTV page in general. Come to think of it, why isn't it mentioned with 4K Digital Cinema or on the Ultra HDTV page?
Or why don't you bring the subject to light on the Red page? Lifterus 19:37, 6 May 2007 (UTC)
I should also mention that you have no business using visual acuity as an argument that a certain resolution "has no benefit". Even if individual pixels at a certain resolution are too far away to resolve, a higher-than-needed resolution can still have benefits such as masking lossy compression artifacts. On what is supposed to be an NPOV encyclopedia, you've written an argument based on anecdotes of what you think "normal viewing distance" is and what "normal display size" is. Don't lots of people buy 70" televisions and sit 10 feet away? I know people that do that. Can you link to a verifiable source showing a poll of what distance the vast majority of people sit from their televisions? The section just doesn't belong here. None of these blanket arguments that cover "all 1080p capable displays and/or media" belong here. Lifterus 19:53, 6 May 2007 (UTC)
Even if you can't tell the difference between 720 and 1080, the extra detail is actually there, and the picture really is clearer and of much higher detail. This may sound obvious, but when people keep telling you that you can't tell the difference between 720 and 1080, it is easy to forget that 1080 really does have that extra detail. In the end that is a fact, and saying that you can't tell the difference is personal opinion and also bias. The rest of the internet may be confused as to which is actually "better", but Wikipedia of all places has to state the fact that it IS better, with a side note saying that 720 to 1080 isn't as big a leap as 480 to 720, so some people do not appreciate the improvement 1080 offers compared to the improvement 720 already offered them when they bought a new TV. JayKeaton 12:19, 15 May 2007 (UTC)

How do you define clearer? It won't be clearer to a person looking at a 32-inch screen 10 feet away. The section has references showing that the average person can't tell the difference beyond a certain screen size and distance. That is fact based on research, and that is what the section says. Also, Consumer Reports published an article saying it isn't better for a typical viewing distance and size. That again is fact based on research. You will have to describe under what circumstances it is better. If you think it is better under all circumstances, then you haven't checked the facts. Daniel.Cardenas 15:42, 15 May 2007 (UTC)
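For what it's worth, the "certain size and distance" figure can be derived from visual acuity. The sketch below assumes the commonly quoted one-arcminute acuity figure; it is a back-of-the-envelope model, not the methodology of the cited references.

```python
import math

def max_useful_distance(picture_height, lines=1080, acuity_arcmin=1.0):
    """Distance beyond which adjacent scan lines subtend less than the
    assumed acuity and merge to the eye, so extra lines stop being
    visible. Whatever unit picture_height is in carries through."""
    line_pitch = picture_height / lines
    return line_pitch / math.tan(math.radians(acuity_arcmin / 60.0))

print(round(max_useful_distance(1.0), 1))             # ~3.2 picture heights for 1080 lines
print(round(max_useful_distance(1.0, lines=720), 1))  # ~4.8 picture heights for 720 lines
```

Under this assumption, a 1080-line picture only shows more detail than a 720-line one when you sit closer than roughly 4.8 picture heights, which is broadly consistent with the "about three times the picture height" recommendations argued over in the next section.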

Also I'm expecting that all my future TV purchases will be 1080p sets. I'm an engineer who works on video. But I won't be buying one that is too small for the intended viewing distance. Daniel.Cardenas 15:53, 15 May 2007 (UTC)

But regardless of viewing distance, the quality is still there. If you walk past the screen, or walk right up to it, or gather around the screen really close to play split-screen video games, the quality is actually there. Even if you are a thousand miles away from the screen, the quality will still be there. Like I said, this simple fact is often lost in the debate over whether you can tell the difference or not; it makes people think that there is no difference, which is incorrect. JayKeaton 01:54, 16 May 2007 (UTC)

Some Experts Say

Text has been recently added to say "some experts say" even though the additional reference[2] provided

  1. does not conflict with the two original references [3] [4]
  2. the additional reference does not state the reason for the recommendation like the first two do, based on research. Therefore it is less authoritative.

I'm removing the additional text for these reasons. Let me know if you disagree. Daniel.Cardenas 05:08, 31 May 2007 (UTC)

Yes, I disagree. Hitachi officially recommends [5] sitting back 3.3 times the height of the screen. Richard Fisher from HDTV Magazine says that you should sit between 2.5 and 3.3 times the height. says 3.75 times the height [6]. All of these sources, including the official Hitachi report, quote distances different from the one that is included in the article. Further to that, of the two sources that were already in the article, [7] and [8], one is actually based on the Schubin report source; the hdtvmag source in fact only seems to link to the Schubin report, so it is essentially only one source. Plus the Schubin report link is sponsored by JBC, so I am not sure how unbiased it really can be. Not to mention that searching for "theschubinreport", "mark schubin" and even just "schubin" on Wikipedia doesn't really bring up anything about him at all; in fact the only page it comes up with is this 1080p page, so I don't think that Mr Schubin is really notable at all. So the three-times-the-height figure is only one person's opinion, whereas the Hitachi company puts forward a different viewing distance, Richard Fisher, the expert from HDTV Mag, puts forth a different distance (different from the HDTV Mag source that just links to the Schubin report one), and Broadcast Engineering even offers a completely different distance. I don't see what the fuss is about. Have you perfectly engineered your living room around the three-times distance, and you can't bring yourself to accept that it probably wasn't the right one? JayKeaton 21:38, 31 May 2007 (UTC)
Did you understand the angry man image? You seem to be focusing on the recommendation rather than the research behind it. I changed the article text to say approximate, which none of the above recommendations disagrees with. About my living room: there is no TV there, or a family room, but my basement has a 12-foot-wide screen, where seating is closer than 3 times the viewing height.  :-) Daniel.Cardenas 16:47, 1 June 2007 (UTC)
I recently purchased a 1080p screen (well, my room mate did), and I found we needed to replace our old rusty couch with light-framed arm chairs so we can bring ourselves right up to the screen for 1080p images, then easily move the chairs back for lower-quality sources. But I'm not basing this on my own personal research: many professional sources, like Hitachi, CNET and HDTV Mag, which employ people experienced with HD TVs or actual TV technicians, recommend many different viewing distances. Even the manual for my Sony TV quotes a completely different minimum viewing distance. Three times the height is the result of one investigation into viewing distances, and one result doesn't mean the only result, especially when there is a startling array of different results out there from different experts in this field. I have even read around three times the height given as a viewing distance for 720p TVs, and 720p isn't even half the resolution of 1080p. I get a bit funny with things like this on Wikipedia: it is a small thing in a small section of an article, but I just feel compelled to make things accurate when there are sources and facts to back it up. It's the details I worry about, not so much the formatting (someone else will always clean that up); it's the small details that often get skewed and glossed over that I just have to fix. I've looked at the sources out there for this, I've looked at the source that was included, and despite the efforts made by that source, I just have to fix this so the wording is accurate and not limiting itself to something that isn't completely true. JayKeaton 18:03, 1 June 2007 (UTC)
Did you understand the angry man image? If not then you didn't really understand the reference given. Daniel.Cardenas 20:26, 1 June 2007 (UTC)
I do not believe that the angry man image is a conclusive enough method of pinpointing the exact distance needed from a television screen. There are other sources that mention the science they use to determine the minimum distance, including the total possible viewing angle of a screen, reference images displayed on the screen, calibration of levels, and just general hands-on experience from experts who determine the best distance simply by being around HD screens as part of their jobs reviewing, testing or working with them. Now, from what I gather, MIT has a lot of respect as a technical institute, and such respect is not unearned, but an image made in 1997 by someone named Aude, and contrast-level reference images in less than 720p resolution, do not seem to apply to 1080p at all. In fact, I don't see how this Schubin Report mp3 can be conclusive at all: he doesn't even talk about test groups for 1080p, and it sounds like he has hardly tested a 1080p screen himself; he's just going by the math of it all and looking at low-resolution reference JPEGs. JayKeaton 21:10, 1 June 2007 (UTC)
In fact, straight after Schubin's report on HDTVs, the Schubin report tells people to go buy a JBC TV that has full cinematic quality and other glowing specs. Funnily enough, the whole mp3 file is sponsored by JBC. He even admits that his numbers are based purely on theoretical trigonometry, and the research behind it all was done in only one consumer test, a test that didn't measure consumers' reactions to TVs but only how far consumers sit from them. And the angry man image does not measure how close you can sit to your HDTV; it only measures how far away is too far, so it measures the MAXIMUM viewing distance, not the minimum. It's all impressive numbers and science, but it's not a test, it's not even research; it is all just theoretical numbers. It is also almost a year old now: the first HD DVD player was only released in April, and the first HD DVDs were only really released at the end of May, less than two months before that report was written. He mentions HDTV, but TV isn't broadcast in 1080p, so I really don't know what he is basing it on besides pen-and-paper theory. And ignoring ALL of that, one man's talk in an internet audio blog does not overrule all other sources; it is just one theoretical opinion that mainly talks about the maximum viewing distance, the complete opposite of what we are talking about here, the minimum viewing distance! JayKeaton 22:29, 1 June 2007 (UTC)
Feel free to reword the section to state that this is about maximum viewing distance. The section is about the ability of the eye to see 1080p, not about recommended viewing distances. The additional references given are about "recommended viewing distances" and not about the ability of the eye to see 1080p. Daniel.Cardenas 17:35, 3 June 2007 (UTC)
The whole point about the ability to see 1080p is to sit close enough to make out the dots. You can probably see 1080p from 50 feet away; it's the ability to make out 1080p detail that matters when talking about the ability to see 1080p. Adding "maximum" and "approximately" was a very good move though: it means the section now says that you must sit at least that close, or closer, or your eyes start to lose detail, and "approximately" means that not all experts say 3 times the height back, but almost all might agree that around 3 times the height is about the right maximum distance, as it is close to most maximum viewing distance recommendations. There is a little more room to have the TV further away according to the experts, but it is approximately the maximum distance, which makes the section as factual, accurate and useful as it's going to get. JayKeaton 23:31, 3 June 2007 (UTC)
Although it does bring into question the example given: "For example at nine feet (2.75 m) away you need at least a 46 inch (115 cm) display to see a benefit from 1080p or for optimum viewing of the resolution you need a 70 inch (175 cm) display". At nine feet away, a 70-inch display is bordering very close to sitting too close to the screen, as 9 feet practically IS the minimum viewing distance; you could probably sit a further 7 feet away and still resolve 1080p. Also, having the largest possible screen for your viewing distance, while it does give you the biggest screen possible, isn't necessarily the "optimum" size in terms of viewing the resolution. It brings you as close as possible to the resolution, yes, but it sounds like it is trying to say that being as close as possible to the screen gives you the "best" ability to resolve the resolution, which isn't at all accurate. It would be far better to word it to give both the smallest screen size for optimum resolution viewing from 9 feet, which would be 42 or 46 inches, and the largest, which would be 70 inches. JayKeaton 23:50, 3 June 2007 (UTC)
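For what it's worth, the trigonometry these viewing-distance figures rest on can be sketched. A minimal calculation, assuming the commonly cited visual-acuity limit of one arcminute per pixel (an illustrative assumption, not a figure taken from any of the sources above):

```python
import math

ARCMIN = math.radians(1 / 60)  # one arcminute, a common estimate of visual acuity

def max_distance_for_1080p(screen_height):
    """Farthest distance (same units as screen_height) at which one of the
    1080 pixel rows still subtends a full arcminute of visual angle."""
    pixel_height = screen_height / 1080
    return pixel_height / math.tan(ARCMIN)

# A 46-inch 16:9 display is about 22.5 inches tall.
h = 46 * 9 / math.hypot(16, 9)
d = max_distance_for_1080p(h)
print(round(d / h, 2))   # distance as a multiple of screen height -> 3.18
print(round(d / 12, 1))  # in feet for the 46-inch example -> 6.0
```

Under that acuity assumption, the maximum distance at which 1080p detail remains resolvable works out to roughly 3.2 times the picture height, which is where the "approximately 3 times the height" figure comes from; at nine feet the same arithmetic gives a diagonal of roughly 70 inches.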


This article needs information on progressively scanned frames.

The "psF" does in fact refer to "progressive segmented Frame" and is a concept suggested for professional HDTV with 1920 x 1080. Below you may find an excerpt from "A Guide to Standard and High-Definition Digital Video Measurements" by TEKTRONIX:

"Segmented frame production formats: Several formats in the scanning formats table are nomenclated 1:1sF.

(I would be more than happy to insert a table in .jpg format if I only knew how to do this...)

The “sF” designates a “segmented frames” format per SMPTE recommended practice RP211. In segmented frame formats, the picture is captured as a frame in one scan, as in progressive formats, but transmitted as in an interlaced format with even lines in one field then odd lines in the next field.

(I would be more than happy to insert another picture to illustrate the "psF" concept if I only knew how to do this...)

The assignment of lines is the same as in an interlaced system, but the picture is captured for both fields in one pass eliminating spatial mis-registration that occurs with movement in an interlaced system. This gives the advantages of progressive scan but reduces the amount of signal processing required and doubles the presentation rate (reducing 24 to 30 Hz visual flicker) in the analog domain."
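The split-and-weave described in that excerpt can be sketched in a few lines. A minimal illustration (not production code) of how a progressively captured frame is carried as two fields and reassembled without interlace artifacts:

```python
# Sketch of progressive segmented Frame (psF): a frame captured in one
# progressive scan is transmitted as two interlace-style fields.
frame = [f"line {i}" for i in range(1080)]  # one progressively captured frame

field_1 = frame[0::2]  # even-numbered lines (first segment)
field_2 = frame[1::2]  # odd-numbered lines (second segment)

# The receiver weaves the segments back together. Because both fields were
# captured at the same instant, reassembly is exact: there is no spatial
# mis-registration between fields, unlike true interlace with motion.
rebuilt = [None] * len(frame)
rebuilt[0::2] = field_1
rebuilt[1::2] = field_2
assert rebuilt == frame
```

The assert holds precisely because both segments come from a single scan; in a true interlaced capture the two fields sample different moments in time, and the same weave would show combing on moving objects.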

Please get in touch with me at <> so I can mail the pictures mentioned. — Preceding unsigned comment added by (talk) 00:51, 26 December 2013 (UTC)

1080p over the Internet[edit]

This section read like a brochure, so I removed the "OMFG OAR COMAPNE IZ TEH GREETEST!!!!1!" crap that some jerk, obviously employed by Constructive Lab Ltd, had put in, and made it read more neutrally. But does the section provide enough info to warrant inclusion?

Highest resolution formats[edit]

I know it is not really that important, but 720p is higher resolution than 1080i in one sense. 1080i is only higher than 720p in spatial resolution (pixels per frame); it has a lower temporal resolution than 720p (complete frames per second). Granted, people notice spatial resolution more than temporal resolution, but it still makes the sentence in this article not quite accurate. Minor point, but no one else seems to have mentioned it.-- 03:38, 20 August 2007 (UTC)
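The arithmetic behind that spatial/temporal distinction can be written out. A quick sketch, assuming the usual broadcast rates (720p at 60 full frames per second; 1080i at 60 fields, i.e. 30 complete frames, per second):

```python
# Raw numbers behind the 720p-versus-1080i comparison (broadcast rates assumed).
w720, h720, fps720 = 1280, 720, 60      # 720p60: 60 full frames per second
w1080, h1080, fps1080 = 1920, 1080, 30  # 1080i60: 60 fields = 30 full frames/s

spatial_720 = w720 * h720    # pixels per frame
spatial_1080 = w1080 * h1080

print(spatial_720, spatial_1080)  # 921600 2073600 -> 1080i wins spatially
print(fps720, fps1080)            # 60 30 -> 720p wins temporally
```

So each format is "higher resolution" along a different axis, which is why a flat "1080i is higher resolution than 720p" claim is imprecise.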

full HD 1080p compatibility[edit]

I am looking at the Sony home cinema projectors. Some are labeled "full HD 1080p compatibility" (e.g. VPL-AW10); others say "Full High Definition 1920 (H) x 1080 (V) resolution" (e.g. VPL-VW100). This looks to me like a marketing trick. Should this be noted on this page? -- 14:48, 21 August 2007 (UTC)

Fullscreen 1080p?[edit]

According to the article: "The term usually assumes a widescreen aspect ratio of 16:9, implying a horizontal resolution of 1920 pixels. This creates a frame resolution of 1920×1080." So, what resolution is 1080p fullscreen content delivered at? Is it 1440x1080, 1920x1080, or something different? I ask because DVDs are 480p and the resolution is always 720x480, which is then resized to either fullscreen or widescreen. -driver8

1080p is only defined for 16:9 transmission; any other aspect ratio must be converted (e.g. with black bars). --Ray andrew (talk) 13:34, 5 December 2007 (UTC)
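As an aside on the 1440x1080 figure raised in the question: some 1080-line acquisition formats (HDV, notably) do store frames at 1440x1080 using non-square pixels, which are stretched back to a 16:9 raster on display. A quick sketch of that arithmetic:

```python
# Anamorphic 1080-line storage: 1440x1080 stored pixels with a 4:3 pixel
# aspect ratio stretch out to the 1920x1080 display raster.
from fractions import Fraction

stored_w, stored_h = 1440, 1080
pixel_aspect = Fraction(4, 3)  # each stored pixel is 4/3 as wide as it is tall

display_w = int(stored_w * pixel_aspect)
print(display_w, stored_h)            # 1920 1080
print(Fraction(display_w, stored_h))  # 16/9, the frame shape the article describes
```

The transmitted frame is still a 16:9 picture either way; only the sampling grid differs, which is consistent with the point that 1080p as broadcast is a 16:9 format.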

1080i HD-DVD?[edit] wrote, "→Broadcasts: That was worded terribly. It gave me the impression the writer was implying HD-DVD video is encoded as 1080i, which isn't." Are you certain? I believe that is in fact the claim. jhawkinson (talk) 05:49, 8 March 2008 (UTC)

No, it isn't encoded in 1080i; the source material, just like Blu-ray movies, is natively 1080p. The person who edited that was probably confused by the fact that the first Toshiba player only output a 1080i signal, but that's a lot different from claiming the actual media on the disc used a 1080i wrapper or whatever. —Preceding unsigned comment added by (talkcontribs) 13:14, 11 March 2008
I moved the above comment from his talk page. I've talked to the editor in question, who was traveling last week, and he'll look into getting a better citation, but no, his claim is indeed that the 1080p source material is flagged as 1080i and the decoder does the appropriate pulldown. jhawkinson (talk) 03:47, 12 March 2008 (UTC)

Full 1080p source content[edit]

At some point this article should mention that most content is not 100% 1080p. All of the common compression schemes (MPEG-2, VC-1, H.264) use only 1/4 of the color information while trying to keep 100% of the luminance: the input to the encoders is YUV 4:2:0 format, which contains only 1/4 of the color resolution. The only time you get 100% 1080p is from various computer sources, such as viewing a still picture from a PS3 on your big screen. Daniel.Cardenas (talk) 23:15, 2 April 2008 (UTC)
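The 1/4-color point follows directly from the 4:2:0 sample geometry: chroma is stored at half resolution in both dimensions. A minimal sketch of the sample counts for a 1080p frame:

```python
# Sketch of 4:2:0 chroma subsampling: luma is stored at full resolution,
# each chroma plane at half resolution both horizontally and vertically.
w, h = 1920, 1080

luma_samples = w * h                  # Y plane: full 1080p resolution
chroma_samples = (w // 2) * (h // 2)  # each of Cb, Cr: quarter resolution

print(luma_samples)                   # 2073600
print(chroma_samples / luma_samples)  # 0.25 -> the "1/4 of the color" above
```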

Video game console[edit]

Would it not be appropriate to mention both the Xbox 360 and the PlayStation 3 as consoles capable of outputting 1080p, or use a less specific wording such as "some video game consoles"? —Preceding unsigned comment added by (talk) 00:24, 11 April 2008 (UTC)

"Full HD"[edit]

The lede of the article currently says that "Full HD" is a synonym of 1080p. But I don't think that this is true. According to HD ready, "Full HD" is actually even weaker than "HD ready". As an example, this monitor is definitely not 1080p (it's not tall enough), but is labeled "Full HD". Can someone with more knowledge of this look into what's going on with these terms? —AySz88\^-^ 20:26, 25 November 2008 (UTC)

HD Capture - Video Cameras[edit]

There should be some information here about the ways to capture HD content, the various cameras and chips available, and how they relate to the different HD formats. —Preceding unsigned comment added by (talk) 19:05, 9 December 2008 (UTC)

Sanyo's recently released Xacti HD2000 is the first consumer camcorder that records 1080p 60fps. Totsugeki (talk) 15:39, 20 March 2009 (UTC)

Ps3 1080p capable?[edit]

I've got a PS3 (80 GB version) and a good new HDMI cable. I've tried to set my PS3 to 1080p on my own almost brand new 42" 1080p TV, and it won't work; it only supports 576p, 720p and 1080i. I've tried it on two other 1080p TVs, and the same result occurred. So the statement that the PS3 supports 1080p is questionable. (talk) 02:23, 3 August 2009 (UTC)

Computer Monitor section[edit]

The Computer monitor section mentions that some larger modern LCD monitors can display resolutions higher than 1920x1080. However, perhaps it should be noted what resolutions some common HD-capable CRTs were able to reach, such as 2048x1536.--Senor Freebie (talk) 05:25, 3 February 2010 (UTC)

Internet Bandwidth[edit]

One question that should be answered is how much Internet bandwidth (Mbit/s) is needed to reliably stream full HD. I read a bunch of websites and they seem unsure and contradictory. —Preceding unsigned comment added by (talk) 20:16, 22 August 2010 (UTC)
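No sourced figure is given here, but the scale of the problem can be shown with back-of-envelope arithmetic. The 8 Mbit/s streaming target below is purely an illustrative assumption, not a figure from any source on this page:

```python
# Back-of-envelope bitrate arithmetic for streaming 1080p video.
w, h, fps, bits_per_pixel = 1920, 1080, 30, 24  # uncompressed RGB-equivalent

raw_bps = w * h * fps * bits_per_pixel  # uncompressed bitrate in bits/second
print(round(raw_bps / 1e6))             # 1493 -> roughly 1.5 Gbit/s raw

target_bps = 8e6                        # assumed streaming budget: 8 Mbit/s
print(round(raw_bps / target_bps))      # 187 -> compression ratio needed
```

The wide range of published numbers largely reflects codec efficiency and quality targets: the required compression ratio is in the hundreds, so small changes in encoder settings swing the delivered bitrate by several Mbit/s.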

A whole lot of unimportant trivia[edit]

I came to this page to see what the resolution of "1080" is. What a vain, futile effort. The article is jam-packed with information a broadcasting director would want to know, and nothing most other people would. The answer to my question was buried in a semi-relevant graph (that is, a graph tangentially relevant to the discussion on the page).

If I were to rewrite this page in one sentence, it would read "A marketing label for 1920x1080." (talk) 07:06, 7 March 2011 (UTC)

I added 1920 x 1080 to the lead. Both numbers already appeared in the lead, albeit in separate paragraphs. --Kvng (talk) 00:39, 8 March 2011 (UTC)

History section needed in article[edit]

There are some questions a history section should answer, like when 1080 was standardized, when it was first adopted (and for what), and what the rationale was in choosing 1080 lines (and not 1000, 1024 or another number). (talk) 13:03, 2 January 2012 (UTC)

That's exactly what I came here for! At I saw a picture of John Carmack using some high-tech monitor capable of 1080p in the mid-90s, and started to wonder when the 1920x1080 resolution was first specced, and so on. (talk) 01:01, 4 July 2012 (UTC)

Merge this and 1080i into HDTV article?[edit]

I don't see anything here or 1080i that wouldn't be better folded into High-definition television. What do you think? Khendon (talk) 13:33, 8 December 2012 (UTC)