Rename back to input lag
The original name was more accurate, because it's lag behind human input. It's certain websites that don't quite clarify what "input lag" means. See my long explanation above, which lays out the precedent for the name "input lag". Add a new section on "display lag" vs. "input lag" and the confusion around "input lag" (some sites use it to mean the lag on the video input; others use it to mean the lag relative to human input). The terminology "display lag" does not cover lag introduced on the computer side. Mdrejhon (talk) 23:49, 1 October 2012 (UTC)
- Examples of precedent: see Anandtech's definition of input lag , a really good article on the many sources of input lag between the human input (the "input device") and the human eyeballs. An example of capturing input lag with a high-speed video camera is on YouTube at , showing the lag between input device and visible action. I'd rather have a single, good "input lag" article that covers the multiple different sources of lag. Besides, Google agrees too:
- Quoted search on Google "display lag" = 21,200 results 
- Quoted search on Google "input lag" = 829,000 results 
"New ways to measure"
The article says:
Many new ways to measure input lag have been ingeniously developed and produce satisfying results. One such method involves connecting a laptop to an HDTV through a composite connection and running a timecode that shows on the laptop's screen and the HDTV simultaneously. Whenever paused, the difference in time between the two displays can be interpreted as an estimate of the input lag.
Doing it that way reintroduces the old mistakes! The mentioned "satisfying" results are simply wrong. The timecode will definitely not be sent at the same time to the laptop's LCD and to the HDTV through the composite connection. This is one of the biggest mistakes you can make; it's comparable to plain stopwatches on two monitors. In the best case, all you can determine is the difference between these two different monitors (the laptop's LCD vs. the HDTV), but, as has been proven by the developer of SMTT, even this is not the case, since digital and analog signals are *not* sent at the same time; most of the time they are not even sent at the same frequency. V-Sync deactivated, CRT (tested with high-end scientific equipment: NO input lag) vs. TFT: http://www.prad.de/images/monitore/specials/inputlag/V-Sync-60Hz-Problem_teilweise-geloest.png The horizontal yellow lines show the actual position of the real screen refresh. As you can see, it is not synchronous.
If you use the following test setup: http://www.prad.de/images/monitore/specials/inputlag/dvi-a_vs_dvi-d.png (analog vs. digital signal output, with a high-end oscilloscope for the measurement), you will see the following results (animated GIFs): http://www.prad.de/images/monitore/specials/inputlag/9600xt_1280x1024_6.2_signale_v-sync-an_clone.gif And with a different graphics card, just to prove that this is not just an Nvidia-related problem: http://www.prad.de/images/monitore/specials/inputlag/hd4850_1280x1024_9.5_signale_v-sync-egal_clone.gif
Explanation: the pink peak in the first animated GIF and the green peak in the second is the analog vertical sync signal. The oscilloscope triggers on the digital vertical sync signal (the cyan part shows the bit patterns of the DVI-D link). What you can see is that the position of the analog vertical sync signal relative to the digital one is not constant. Doing a test like the "ingeniously" new one described above will result in the display of non-synchronized pictures: at one moment you will compare frame 1 on monitor A to frame 2 on monitor B, and the very next moment you will compare frame 1 on monitor A to frame 1 on monitor B, caused solely by the asynchronous output of the graphics card, without any monitor-related interaction.
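The effect described above can be illustrated with a small simulation (the periods and phase offset below are invented for illustration, not measured values): two outputs running at slightly different frequencies drift against each other, so a paused "timecode photo" compares different frames at different moments, even though neither display adds any lag of its own.

```python
def last_frame_drawn(t, period, phase):
    """Index of the most recent frame this output refreshed before time t."""
    return int((t - phase) // period)

P_DIGITAL = 1 / 60.00   # digital output's frame period in seconds (assumed)
P_ANALOG = 1 / 59.94    # analog output's period, assumed slightly different

observed = set()
for i in range(1000):
    t = 0.5 + i * 0.01  # moments at which the "photo" is taken
    observed.add(last_frame_drawn(t, P_DIGITAL, 0.0)
                 - last_frame_drawn(t, P_ANALOG, 0.004))

# The apparent frame difference is not constant: it flips between values
# purely because of the asynchronous outputs.
print(sorted(observed))  # → [0, 1]
```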
The "ingeniously" developed test is not worth doing at all.
The quoted text should be deleted, as it creates the illusion that measurement methods already proven wrong are correct after all.
Monitor List

I've removed the monitor list, as it is completely unsourced. The same goes for the whole article, however. - JohnyDog 13:11, 27 September 2006 (UTC)
This article should be changed to note that input lag isn't only an LCD phenomenon, but rather a digital display phenomenon. Plasmas, DLPs, and even digital CRTs can suffer from it as well. It probably has most to do with the image buffering and scaling these displays do, and with the processing involved in deinterlacing when the video source is in an interlaced format.
- Um, video from a computer usually isn't interlaced, and scaling does not occur at the native resolution, AFAIK. Nil Einne 19:05, 31 July 2007 (UTC)
- He does not say they do... Computer material requires less processing and thus should (not necessarily will) have less input lag. Some displays always pipe data through the scaler, even at native resolution, and thus have uniform input lag.
- Above anonymous poster is totally correct according to A/V Synchronization: How Bad Is Bad? and HDTVs and Video Game Lag: The Problem and the Solution. --Musaran (talk) 10:37, 23 February 2008 (UTC)
Any claims of overdrive causing input lag are misleading, as that input lag is usually due to image-processing chips: deinterlacing, brightness/contrast and scaling adjustments, etc. One of the fastest LCDs around in terms of response time, the Samsung 226BW (2 ms), has input lag close to 0. The lag is always less than one frame (~17 ms @ 60 Hz), indicating that no frame buffering is going on. http://www.behardware.com/articles/667-7/samsung-226bw-a-and-s-series-the-verdict.html I don't see how overdrive could cause input lag; all it does is temporarily apply more voltage than is needed to reach the required brightness, so the liquid crystal untwists quicker. There is no need to buffer frames to do that, since overdrive only works on the current transition and does no interpolation based on next/previous frames. --22.214.171.124 (talk) 03:22, 30 July 2008 (UTC)
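As a toy illustration of that last point (assuming a simple linear boost; real panels use calibrated overdrive lookup tables), the drive level depends only on the previous and target values of the current transition, so no frame buffer is involved:

```python
def overdrive_level(prev, target, boost=0.5):
    """Overshoot the target value in the direction of the transition,
    clamped to the panel's drivable range (0-255)."""
    driven = target + boost * (target - prev)
    return max(0, min(255, round(driven)))

print(overdrive_level(0, 128))    # rising transition → 192 (drive past 128)
print(overdrive_level(192, 128))  # falling transition → 96 (drive below 128)
print(overdrive_level(128, 128))  # no change → 128 (no overshoot)
```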
Article begins by describing this effect and why it is, by its very nature, variable and not accurately measurable, and then goes on to point out several times that 'manufacturers do not advertise input lag for their displays.' This should be obvious - we've already determined that there is no set input lag value for any given display. Generally poorly written and should be removed if not revised/sourced. Tagged as unreferenced. 126.96.36.199 01:24, 16 November 2007 (UTC)
- "we've already determined": Who is "we"? Where?
- "there is no set input lag value for any given display": Tests measure reproducible lag values. Do you mean there is no such thing as input lag, or that it depends on display settings and material mode? If so, how would that be incorporated to this article? --Musaran (talk) 10:37, 23 February 2008 (UTC)
- It is accurately measurable, as many computer hardware review sites have shown. It just requires (as ironically already mentioned) a "control" display, an "input lag" display, a camera with a high shutter speed, and free stopwatch software. The only part that is not "accurately measurable" is that during testing you will find monitors have minimums, maximums, and averages for how long it takes to display each frame, similar to benchmarking frames per second. See here:  -- Cody-7 (talk) 21:45, 28 February 2008 (UTC)
- "we've already determined that there is no set input lag value for any given display." what is your source? who is "we"? are you an idiot? Randomvillian4378 (talk) 03:10, 8 July 2008 (UTC)
- These claims are clearly wrong, since the fastest monitor tested by TFT Central uses a TN panel, and overdrive is not panel-specific. http://www.tftcentral.co.uk/images/nec_2490wuxi/input_lag1.jpg -188.8.131.52 (talk) 13:02, 2 April 2009 (UTC)
Sources

I've added a source to the first paragraph, but some of these specific claims are hard to cite. For example, the claim that "Most sensitive users can tolerate latency under 20ms." I happen to have a 24-inch LCD that has been measured at 20 ms of lag, and I agree 20 ms is probably a bearable number for a sensitive user, but how do I back up that claim?
I'd be grateful if you all could help find some more sources. I'm removing the no-sources template at the top because the article now has at least one source. I feel that tag clutters the page; it's obvious the article will still need more sources. Thanks. -- Cody-7 (talk) 22:10, 28 February 2008 (UTC)
What are "external devices" ?
The article says :
External devices have also been shown to reduce overall latency by providing faster image-space resizing algorithms than those present in the LCD screen.
What are those external devices? It is not explained, nor is there a link. I know a few bits about display and PC technology, but have no idea what "external devices" refers to. Someone please clarify. --Xerces8 (talk) 09:32, 10 December 2008 (UTC)
It's input lag
Does that even exist?
World of Warcraft lag not being an issue
"For instance, in a relatively slow MMORPG such as World of Warcraft, slight delays are far more tolerable than in medium paced tactical shooters"
I dispute this: being even a few milliseconds behind while, for example, tanking a raid boss such as Auriaya in the 25-player dungeon Ulduar causes great annoyance. Having perceived it myself, I object to this statement.
Introduction to the Scientific Method of testing
Much of this article appears to be just speculation by hobbyists using no formal test methodology beyond "there seems to be a difference, I just know it!"... and most of the cites are from hobbyists too, also not using any formal test methods.
Here is what you need to have factual and proper research of this topic:
1. You need to establish a baseline using identical equipment in such a way that you get repeatable, equal results. This is known as the "control". It is no good to start off comparing an LCD to a CRT in your first "test". You need to begin with at least two identical devices and accurately establish that you are getting the exact same results from both. At LEAST two devices, maybe more, so as to increase the sample size. (Oh, it sounds like this isn't going to be cheap for some home hobbyist to do themselves.)
1a. Through this initial research you may discover that your signal sources are not as uniform and reliable as you may have expected. A dual monitor video card is not necessarily going to output exactly-timed signals from each port. Probably some sort of external distribution amplifier is going to be needed to do this testing properly. If you want full HDCP compliance in this distribution amp, that may further increase the costs of the testing process.
2. You need a sufficiently reliable output detection system to verify accuracy. It isn't good enough to just "eyeball" it and from that try to establish one as better than the other. A camera of some sort is needed, and probably not a generic digital camera either. Something with a very high framerate designed specifically for the task of capturing high speed events may be necessary. This may cost a few thousand dollars/euros, moving this project outside the range of some home hobbyist to test by themselves.
3. Once you have established that you are getting exactly identical results from two or more identical sources, with verifiable documentation of that fact, then you can begin testing other devices against your baseline. Though just one device of one type against one device of the other type may not be good enough; testing one pair against another pair would be better, since it will show variability between two devices of one kind compared with two of the other. Or test three or more of one kind against three or more of the other. Oh, now this too gets complex and expensive.
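Once captured, the per-photo readings from step 3 reduce to simple statistics; a minimal sketch (the sample values below are invented purely for illustration):

```python
# Hypothetical lag readings in milliseconds between a device under test
# and the verified baseline, one per high-speed-camera photo (step 2).
# These numbers are made up for illustration only.
samples_ms = [18.2, 33.1, 21.7, 35.0, 19.9, 28.4, 24.6, 31.3]

lag_min = min(samples_ms)
lag_max = max(samples_ms)
lag_avg = sum(samples_ms) / len(samples_ms)

print(f"min {lag_min:.1f} ms, max {lag_max:.1f} ms, avg {lag_avg:.1f} ms")
```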
In short, testing this with any sort of verifiability and credibility requires someone with deep pockets willing to do professional testing with expensive test equipment, and this article isn't going to be reliably sourced from some video a hobbyist posted on Youtube. DMahalko (talk) 20:19, 8 February 2011 (UTC)
- You can verify that they are within a frame of each other using a simple camera. You are correct that there may be sub-frame timing differences between the ports, but we are generally only concerned with timing differences on the order of whole frames. The general assumption is that the frame rate is around 60 Hz, so any delays under 17 ms are considered insignificant.
- The refresh rate of most monitors is in the 60-85 Hz range, so a shutter speed faster than 1/30 of a second will not improve results. A standard digital camera is more than adequate.
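The one-frame threshold the replies above rely on is simple arithmetic (the refresh rates here are the common ones mentioned, nothing measured):

```python
# One frame period in milliseconds at common refresh rates; timing
# differences smaller than one period are what the replies above
# treat as insignificant.
for hz in (60, 75, 85):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per frame")
# 60 Hz -> 16.7 ms, 75 Hz -> 13.3 ms, 85 Hz -> 11.8 ms
```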
- I know this idea probably doesn't mean anything, but it's possible that using the Wii Remote's speaker is a good way to measure the lag. For example, in Mario Kart Wii, both the TV and the remote's speaker make a noise when you drive into an item box. Setting up two microphones (one on the controller and one on the TV) and doing a simultaneous recording should make it relatively easy to detect how long the delay is. Toomai Glittershine (talk) 02:10, 9 March 2011 (UTC)
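The Wii Remote idea above amounts to estimating the delay between two simultaneous recordings, which can be done by finding the lag that maximizes their cross-correlation. A minimal sketch with synthetic "clicks" standing in for the two microphone recordings (the sample rate and waveforms are invented for illustration; only nonnegative lags are searched, i.e. the TV is assumed to be the later of the two):

```python
def best_lag(ref, delayed, max_lag):
    """Lag in samples at which `delayed` best lines up with `ref`,
    found by maximizing the cross-correlation over lags 0..max_lag."""
    def corr(lag):
        return sum(ref[i] * delayed[i + lag] for i in range(len(ref) - lag))
    return max(range(max_lag + 1), key=corr)

RATE = 8000                              # samples per second (assumed)
click = [1.0, 0.8, 0.5, 0.2]             # a short synthetic click
controller = click + [0.0] * 396         # Wii Remote speaker: click at t = 0
tv = [0.0] * 80 + click + [0.0] * 316    # TV: same click, 80 samples later

lag = best_lag(controller, tv, 200)
print(f"estimated delay: {lag} samples = {1000 * lag / RATE:.0f} ms")
# → estimated delay: 80 samples = 10 ms
```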
Move to display lag
The request to rename this article to Display lag has been carried out.
This is a mistake, says an industry insider. "Input lag" is a superset of "display lag" (display-specific lag). I don't like the rename unless we create a separate article for computer-side lag -- I interpret "input lag" as lag behind human/controller/keyboard/mouse input (joystick, keyboard, etc.); that's what is called the "input". Input lag is the lag between the input (mouse, keyboard, joystick) and the actual display.

Lots of things in the computer can contribute to lag, not just the display. Turning on double buffering adds extra input lag (e.g. the Windows 7 and Windows 8 compositing managers use double buffering, unlike Windows XP, so Windows 7/8 with Aero has slightly more keyboard-input/mouse-input lag than Windows XP), and video games with a VSYNC ON/OFF setting behave the same way: turning VSYNC ON adds game-controller input lag. We also need an expanded "Sources of Input Lag" section that covers computer-side lag, because for our group, "input lag" refers to gaming input, not the signal input.

That said, display lag is certainly a consideration for non-interactive sources (e.g. video, television), but display lag manifests as an annoying human problem MAINLY when there is interactivity (a human providing input, e.g. via keyboard or mouse). Therefore "Input Lag" is possibly a MORE accurate title for this article, provided the article is expanded to include computer-side input lag. If I step up to the plate and rewrite this Wikipedia article within a year or two (I have worked in the home theater industry -- google "mark rejhon home theater" for proof -- so I have a bunch of good references I might be able to dig up), I am going to suggest renaming it back to "Input Lag", with "input" referring to human input, not signal input. That's where the lag counts as a human annoyance: it chiefly annoys only when there's interactivity, i.e. human input. Also, most forums use the terminology "input lag".
Most discussion forums (HardOCP, Overclock, TomsHardware, etc.) use the terminology "input lag", and most references (e.g. http://www.tftcentral.co.uk/articles/input_lag.htm ...) use "input lag", even though some sites only cover the display side of the input-lag equation (because that side is less controllable, e.g. it influences purchase decisions). If it's renamed back to the more accurate title, it'll help me volunteer time to gradually rewrite parts of this article. It's lag behind _human_ input (keyboard, mouse, joystick, gamepad). Mdrejhon (talk) 23:38, 1 October 2012 (UTC)
console gaming example- forum references
Hi, I realise forum references are not WP:verifiable, but I just wanted to put them here to provide evidence for the claim that "below 30ms is not noticeable, discussions on gaming forums tend to agree with this value", so here are a couple of links (it's hard to find many sources because the threads often get confused between input lag and frame rate):
http://hardforum.com/showthread.php?t=1592751 Cheers, 1292simon (talk) 07:10, 29 January 2012 (UTC)
There's a big difference between something being "not noticeable" and it not meaning anything, in particular if you are trying for some sort of record or playing a difficult speed-intensive section. Milliseconds can count even if you don't notice them: think of time-trial games where just milliseconds separate records, and then consider that with input lag one player is seeing the display that much later the entire time. Anonywiki (talk) 17:56, 22 October 2012 (UTC)
Research on input and display lag.
Input Lag Example for Console Gaming
I believe this text is worth keeping, since "input lag" is a big issue for many people. While the use of the term "input lag" may differ from the academic definition, it is a commonly used term and the search term people use when researching  . If there is another WP article where this text would be a better fit, we could always look into that. Cheers, 1292simon (talk) 10:32, 23 October 2012 (UTC)
- Actually, since the article is now called "Display Lag", I agree that it doesn't really fit, so I've moved it. Cheers, 1292simon (talk) 09:05, 24 October 2012 (UTC)
Keep talk pages separate if new independent article
As posted "over there"....
#REDIRECT [[Talk:Display lag]]
DMahalko (talk) 21:36, 28 October 2012 (UTC)