Nvidia NVENC

From Wikipedia, the free encyclopedia

Nvidia NVENC is a feature of Nvidia graphics cards that performs H.264 video encoding, offloading this compute-intensive task from the CPU. It was introduced with the Kepler-based GeForce 600 series in March 2012.[1][2]

The encoder works with Share game capture, which is included in Nvidia's GeForce Experience software and is supported in many other streaming and recording programs, such as Open Broadcaster Software (OBS) and Bandicam.[3][4]


NVENC has undergone several hardware revisions since its introduction with the first Kepler GPU (GK104).[5]

First generation (Kepler)

The first generation of NVENC, which is shared by all Kepler-based GPUs, supports H.264 high profile (YUV 4:2:0, I/P/B frames, CAVLC/CABAC), H.264 SVC temporal encode, and Display Encode Mode (DEM).

Nvidia's documentation states a peak encoder throughput of 8x realtime at a resolution of 1920x1080 (where the baseline "1x" equals 30 fps). Actual throughput varies with the selected preset, user-controlled parameters and settings, and the GPU/memory clock frequencies. The published 8x rating is achievable with the NVENC high-performance preset, which sacrifices compression efficiency and quality for encoder throughput. The high-quality preset is considerably slower but produces fewer compression artifacts.
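The Nx-realtime ratings quoted in this article can be converted into frame rates with simple arithmetic against the stated 1x = 30 fps baseline. A minimal sketch (the helper name is hypothetical; only the 30 fps baseline and the multipliers come from the text):

```python
# NVENC throughput multipliers are quoted against a "1x" baseline of
# 1920x1080 at 30 fps, per Nvidia's documentation cited above.
BASELINE_FPS = 30

def effective_fps(multiplier: float) -> float:
    """Convert a quoted Nx-realtime rating into frames per second at 1080p."""
    return multiplier * BASELINE_FPS

print(effective_fps(8))   # Kepler's 8x rating: 240 fps
print(effective_fps(16))  # second-generation NVENC's 16x rating: 480 fps
```

The 16x figure matches the "1080p @ 480fps" number given for second-generation NVENC below.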

Second generation (Maxwell GM107/GM108)

Introduced with the first-generation Maxwell architecture, second-generation NVENC adds support for the high-performance HP444 profile (YUV 4:4:4, predictive lossless encoding) and increases encoder throughput to up to 16x realtime, which corresponds to about 1080p at 480 fps with the high-performance preset.

Third generation (Maxwell GM20x)

Introduced with the second-generation Maxwell architecture, third-generation NVENC implements High Efficiency Video Coding (HEVC, H.265) and also increases the H.264 encoder's throughput to cover 4K resolution at 60 fps (2160p60). However, it does not support B-frames for HEVC encoding (only I- and P-frames). The maximum NVENC HEVC coding tree unit (CU) size is 32 (the HEVC standard allows a maximum of 64), and its minimum CU size is 8.
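The CU-size limits described above can be expressed as a small validity check. A hypothetical sketch (the function and constant names are illustrative; the numeric limits come from the text):

```python
# Constraints described above: third-generation NVENC supports HEVC
# coding-unit sizes from 8 up to 32, while the HEVC standard itself
# allows coding tree units up to 64.
HEVC_STANDARD_MAX_CTU = 64
NVENC_GM20X_MIN_CU = 8
NVENC_GM20X_MAX_CU = 32

def nvenc_supports_cu(size: int) -> bool:
    """True if a (power-of-two) CU size fits third-gen NVENC's limits."""
    is_pow2 = size > 0 and (size & (size - 1)) == 0
    return is_pow2 and NVENC_GM20X_MIN_CU <= size <= NVENC_GM20X_MAX_CU

print(nvenc_supports_cu(32))  # True: NVENC's maximum
print(nvenc_supports_cu(64))  # False: legal in the standard, beyond NVENC
```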

HEVC encoding also lacks Sample Adaptive Offset (SAO). Adaptive quantization, look-ahead rate control, adaptive B-frames (H.264 only), and adaptive GOP features were added with the release of Nvidia Video Codec SDK 7.[6] These features rely on CUDA cores for hardware acceleration.

SDK 7 supports two forms of adaptive quantization: spatial AQ (H.264 and HEVC) and temporal AQ (H.264 only).

Nvidia's consumer-grade (GeForce) cards are restricted to two simultaneous encoding jobs. Their professional Quadro cards do not have this restriction.

Fourth generation (Pascal GP10x)

Fourth-generation NVENC implements HEVC Main10 10-bit hardware encoding. It also doubles 4K encoding performance for H.264 and HEVC compared with the previous generation. It supports HEVC at 8K resolution, 4:4:4 chroma subsampling, lossless encoding, and sample adaptive offset (SAO).

Nvidia Video Codec SDK 8 added a Pascal-exclusive, CUDA-based weighted prediction feature. Weighted prediction is not supported if the encode session is configured with B-frames (H.264).

There is no B-frame support for HEVC encoding, and the maximum CU size is 32x32.

Fifth generation (Volta GV10x)

Operating system support

The Nvidia NVENC SIP core needs to be supported by the device driver. The driver exposes one or more interfaces (e.g. OpenMAX IL) to NVENC. The NVENC SIP core can only be accessed through the proprietary NVENC API (as opposed to the open-source VDPAU API).

The NVENC API is bundled with Nvidia's GeForce driver.

NVENC is available for many operating systems. A free and open-source device driver is available, but support for Nvidia NVENC is not documented.[7]

FFmpeg has supported NVENC since 2014,[8] and NVENC support is included in Nvidia's official drivers.[9]
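A common way to reach NVENC in practice is through FFmpeg's h264_nvenc encoder, including the SDK 7 features (adaptive quantization, look-ahead) mentioned above. The sketch below assembles such a command line; the option names (-spatial_aq, -temporal_aq, -rc-lookahead) follow recent FFmpeg builds and have changed across versions, so verify them locally with `ffmpeg -h encoder=h264_nvenc`:

```python
# Sketch: build an FFmpeg command that encodes with NVENC (h264_nvenc).
# Requires an FFmpeg build with NVENC support and an NVENC-capable GPU.
import shlex

def nvenc_ffmpeg_cmd(src: str, dst: str, lookahead: int = 20) -> list:
    return [
        "ffmpeg", "-i", src,
        "-c:v", "h264_nvenc",       # hardware H.264 encode on the GPU
        "-spatial_aq", "1",         # spatial adaptive quantization (SDK 7+)
        "-temporal_aq", "1",        # temporal AQ (H.264 only)
        "-rc-lookahead", str(lookahead),  # look-ahead rate control
        dst,
    ]

print(shlex.join(nvenc_ffmpeg_cmd("input.mp4", "output.mp4")))
```

The command list can be passed to `subprocess.run` on a machine with a suitable GPU and driver.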

External links

Nvidia's NVENC page