Talk:Path tracing

From Wikipedia, the free encyclopedia
WikiProject Computing (Rated Start-class)
This article is within the scope of WikiProject Computing, a collaborative effort to improve the coverage of computers, computing, and information technology on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
This article has been rated as Start-Class on the project's quality scale.
This article has not yet received a rating on the project's importance scale.

Added images[edit]

Added some images. Are there any other helpful effects to demonstrate that aren't shown presently? Qutorial (talk) 02:17, 8 July 2016 (UTC)


Why is the cos(theta) term being multiplied by the reflectance twice in the pseudocode? It looks wrong to me. Could somebody look over the pseudocode and check whether it's actually correct? (talk) 08:46, 16 July 2012 (UTC)

Now being in a position to answer my own question, I confirm that it is incorrect. I fixed it and tried to make the pseudocode a little better. Please fix if I made any errors or said anything stupid. (talk) 11:07, 24 September 2012 (UTC)


I gotta say that "pseudocode" is the most C++ish I've ever seen, complete with references, dereference operators and other C++ syntax quirks. Why call it pseudocode when it's perfectly legal C++? It's just lacking a few method definitions... (talk) 20:54, 20 November 2007 (UTC)

I noticed that, too. I changed it a bit, took out some of the pointless C++ syntax and left in some useful C++ syntax. —Preceding unsigned comment added by (talk) 02:14, 6 January 2009 (UTC)

Incorrect formula?[edit]

I think there shouldn't be a multiplication by "cost". If we sample from a probability density measured with respect to projected solid angle, there should be no cos(theta) in the integrand; if we sample from a probability density measured with respect to solid angle, there should be a cos(theta) in "scale", which will cancel out with the cos(theta) in the integrand. And I guess it should be made clear which density we use when we calculate "RandomUnitVectorInHemisphere". —Preceding unsigned comment added by Iceglow (talkcontribs) 10:08, 12 May 2009 (UTC)

Three years later, the issue has been addressed. Although I may have gotten the cosines wrong - please recheck. I assumed the RandomUnitVectorInHemisphere was a naive, not cosine-weighted distribution (= naive sampling, PDF of 1). This way the cosine term stays, and I also added a few notes about sampling schemes and energy conservation below the pseudocode. (talk) 11:09, 24 September 2012 (UTC)
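For anyone revisiting this thread, here is a quick numerical check of the cancellation being discussed (a sketch of my own, not the article's pseudocode): integrating a constant radiance of 1 over the hemisphere weighted by cos(theta) gives pi, and both sampling schemes converge to it. With a uniform pdf of 1/(2*pi) the cosine stays in the estimator; with a cosine-weighted pdf of cos(theta)/pi it cancels. Function names are mine.

```python
import math
import random

def estimate_uniform(n, rng):
    # Uniform hemisphere sampling: pdf = 1 / (2*pi), so each sample
    # contributes cos(theta) / pdf = 2 * pi * cos(theta).
    total = 0.0
    for _ in range(n):
        cos_theta = rng.random()  # for a uniform hemisphere, z = cos(theta) is uniform on [0, 1]
        total += 2.0 * math.pi * cos_theta
    return total / n

def estimate_cosine_weighted(n, rng):
    # Cosine-weighted hemisphere sampling: pdf = cos(theta) / pi, so the
    # cosine cancels and every sample contributes exactly pi.
    total = 0.0
    for _ in range(n):
        cos_theta = math.sqrt(rng.random())  # cos(theta) drawn cosine-weighted
        total += cos_theta / (cos_theta / math.pi)
    return total / n

rng = random.Random(1)
print(estimate_uniform(100_000, rng))          # noisy estimate near pi
print(estimate_cosine_weighted(100_000, rng))  # exactly pi
```

Note that the cosine-weighted estimator has zero variance here precisely because the cosine in the integrand and the cosine in the pdf cancel; the uniform estimator only reaches pi in the limit.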


The way the algorithm is described, it seems like path tracing is just ray tracing but with infinite recursion depth. It might be useful to state this if it is the case, and if not, perhaps provide a description of how path tracing differs from ray tracing.

-- Well actually, I will contend that path tracing is an algorithm that is totally alien to ray tracing, and that it is not something that is simply added on top of a ray tracer. Miloserdia (talk) 10:19, 21 May 2012 (UTC)

Full rewrite?[edit]

I would like to take this article on as the main author and completely rewrite most of it. If no one on Wikipedia gives me a red light on this in a few days, I will go ahead and do it. Here are some highlights I would like to expand upon in the article.

  • Here is my definition: "Path Tracing is a computer graphics method for rendering three-dimensional images that are indistinguishable from photographs." Anyone have a problem with that?
  • When the word "unbiased" is used to describe a renderer, that invariably indicates that Path Tracing was used in that software.
  • Kajiya's rendering equation can be solved by two different algorithms. (1) Shooting rays from the light sources and creating paths in the scene. The path is cut off at a random number of bouncing steps and the resulting light is sent through the projected pixel on the output image. During rendering, billions of paths are created, and the output image is the average of every pixel that received some contribution. It is not the sum of these contributions, but their direct average. (2) Gathering rays from a point on a surface. A ray is projected from the surface to the scene in a bouncing path that terminates when a light source is intersected. The light is then sent backwards through the path and out the pixel on the output image. The creation of a single path is called a "sample". For a single point on a surface, approximately 800 samples (up to as many as 3 thousand samples) are taken. The final output of the pixel is the average of all these samples, not the sum!
  • Path Tracing is not a variation of Ray Tracing, nor is it a generalization of that earlier rendering method. In path tracing, at every bounce in the light path the illuminance contribution is weighted by a function called a BRDF -- a bidirectional reflectance distribution function. These functions relate the incoming angle of light to its outgoing angle. In orthodox Ray Tracing, the algorithm merely casts a shadow ray to the light source and then "shades" the pixel. Ray tracing does not sample an integral of incoming illuminance, nor does it use averages, nor does it contain concepts of convergence.
  • High Dynamic Range. Unlike Ray Tracing or Scanline Graphics, Path Tracing must invariably be performed with colors in the High Dynamic Range. This means the resulting raw output must be taken through some manner of Tone Mapping before an actual pixel value is output to a monitor. Path Tracing in a naked form would output an image file in HDR format.
  • It was known that the two algorithms above, "Shooting" and "Gathering" respectively, were capable of producing a numerical solution to Kajiya's rendering equation. Later developments in Path Tracing algorithms were motivated by the fact that those algorithms are so slow as to be infeasible on a desktop computer.
  • Bidirectional Path Tracing. BPT was created explicitly to obtain faster convergence of the integral. A shooting path and a gathering path are traced independently, and then the head of the shooting path is connected to the tail of the gathering path. The light is then projected through every bounce and back out into the pixel. This technique at first seems paradoxically slower, since for every gathering sample we additionally trace a whole shooting path. In practice, however, the extra speed of convergence far outweighs any performance loss from the extra ray casts on the shooting side.
  • Importance Sampling. The central performance bottleneck in Path Tracing is the complex geometrical calculation of casting a ray. Importance Sampling is a technique motivated by the desire to cast fewer rays through the scene while still converging correctly to the integral of incoming illuminance on the surface point. This is done by casting more rays in directions in which the illuminance would have been greater anyway. If the density of rays cast in certain directions matches the strength of contributions in those directions, the result is identical, but far fewer rays were actually cast. Importance Sampling is used to match Lambert's cosine law, and can also be used to match BRDFs.
  • Metropolis Light Transport. This algorithm was created in order to get faster convergence in scenes in which the light must pass through oddly shaped or small holes in order to reach the part of the scene that the camera is viewing. It has also shown promise in correctly rendering odd or complicated caustics. Instead of generating random paths, new sampling paths are created as slight mutations of existing paths. In this sense, the algorithm "remembers" the rare paths that connect light sources to the camera.
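To make the "average, not sum" point in the list above concrete, here is a toy sketch (my own illustration, not from any article revision) of the gathering algorithm in the classic "furnace" setting: every bounce hits a Lambertian surface that both emits and reflects. With cosine-weighted importance sampling the BRDF-times-cosine-over-pdf weight is exactly the albedo, and Russian roulette with continuation probability equal to the albedo terminates paths at a random depth while keeping the estimate unbiased; the expected pixel value is emitted / (1 - albedo). Function names are mine.

```python
import random

def gather_sample(albedo, emitted, rng):
    # One "gathering" path in a furnace scene. Cosine-weighted importance
    # sampling makes the per-bounce weight BRDF * cos / pdf =
    # (albedo/pi) * cos / (cos/pi) = albedo, and Russian roulette with
    # continuation probability `albedo` divides that weight back out,
    # so the path throughput stays exactly 1.
    radiance = 0.0
    while True:
        radiance += emitted                # light collected at this bounce
        if rng.random() >= albedo:         # Russian roulette termination
            return radiance

def render_pixel(albedo, emitted, samples, rng):
    # The pixel value is the AVERAGE of the samples, never their sum.
    return sum(gather_sample(albedo, emitted, rng) for _ in range(samples)) / samples

rng = random.Random(7)
# Analytic answer: emitted * (1 + albedo + albedo^2 + ...) = emitted / (1 - albedo)
print(render_pixel(0.5, 1.0, 200_000, rng))  # converges to 2.0
```

Summing instead of averaging here would diverge as more samples were taken, which is the error the list warns against.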

Miloserdia (talk) 11:33, 21 May 2012 (UTC)

Path tracing will be ported to DX12 and Vulkan...[edit]

"In 2015, path tracing will be ported to DirectX 12 and Vulkan API." By whom? In what context? Citation needed! SteveBaker (talk) 15:31, 11 September 2015 (UTC)

Wrong BRDF calculation in pseudocode?[edit]

Why is there a multiplication by 2 when calculating the BRDF in the pseudocode? According to the [Lambertian reflectance] article, shouldn't it be a simple reflectance * cos_theta?
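If the pseudocode samples the hemisphere uniformly, the factor of 2 is actually correct: the Monte Carlo estimator divides by the sampling pdf, and for a uniform hemisphere pdf of 1/(2*pi) combined with the Lambertian BRDF of reflectance/pi, the pi terms cancel and a 2 remains. A minimal sketch of that arithmetic (assuming uniform hemisphere sampling; the variable names are mine):

```python
import math

reflectance = 0.8                # Lambertian surface albedo
cos_theta = 0.5                  # cosine between normal and sampled direction

brdf = reflectance / math.pi     # Lambertian BRDF is reflectance / pi
pdf = 1.0 / (2.0 * math.pi)      # uniform hemisphere sampling density
weight = brdf * cos_theta / pdf  # standard Monte Carlo weighting

# The pi terms cancel, leaving 2 * reflectance * cos_theta:
assert math.isclose(weight, 2.0 * reflectance * cos_theta)
```

With cosine-weighted sampling (pdf = cos(theta)/pi) the cosine would cancel too and no factor of 2 would appear, so the constant in the pseudocode depends entirely on which sampling scheme it assumes.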