VLC GPU Decoding
Introduction to GPU decoding in VLC
The VLC framework can use your graphics card (GPU) to decode H.264 streams (often wrongly called "HD videos") under certain circumstances.
Because of its modular approach and its transcoding/streaming capabilities, VLC uses the GPU at the decoding stage only, and then copies the data back to feed the other stages (streaming, filtering, or any video output plugged after that).
This means that, compared to some other implementations, GPU decoding in VLC can be slower, because it needs to get the data back from the GPU. In exchange, you can plug ANY video output (sink) after it and use all the VLC video filters.
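As an illustration, here is a minimal libVLC sketch that requests hardware decoding for one media item, so that decoded frames still flow through the normal VLC pipeline (filters, then any output). The ":avcodec-hw=dxva2" option is an assumption that depends on your VLC version and platform (older 1.1.x builds exposed a boolean "ffmpeg-hw" switch instead); treat this as a sketch, not the definitive usage.

    #include <vlc/vlc.h>
    #include <stdlib.h>

    int main(int argc, char *argv[])
    {
        if (argc < 2)
            return EXIT_FAILURE;

        libvlc_instance_t *vlc = libvlc_new(0, NULL);
        if (vlc == NULL)
            return EXIT_FAILURE;

        libvlc_media_t *media = libvlc_media_new_path(vlc, argv[1]);

        /* Ask the avcodec decoder for hardware acceleration. The option
         * name and value ("avcodec-hw", "dxva2") vary with the VLC version
         * and platform; this assumes a recent Windows build. */
        libvlc_media_add_option(media, ":avcodec-hw=dxva2");

        libvlc_media_player_t *player = libvlc_media_player_new_from_media(media);
        libvlc_media_release(media);

        libvlc_media_player_play(player);
        /* ... run an event loop here; frames copied back from the GPU can
         * go through any VLC video filter or video output. */
        libvlc_media_player_stop(player);

        libvlc_media_player_release(player);
        libvlc_release(vlc);
        return EXIT_SUCCESS;
    }

If hardware decoding cannot be set up (unsupported GPU, driver or OS), VLC normally falls back to software decoding, so the same code keeps working.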
Windows
VLC supports DxVA in its version 2.0 (DxVA2). That means that Windows Vista, Windows Server 2008 or Windows 7 is required. If you are using Windows XP, GPU decoding in VLC cannot work for you yet.
Linux
On Linux, there is code for VDPAU and VAAPI. There is also some code for a VAAPI video output, which hasn't been merged into the current Git tree yet.
Mac OS X
Mac OS X doesn't provide a GPU decoding API. Complain to Apple.
Requirements for Windows DxVA2 in VLC
Graphics card
So far, VLC developers recommend using nVidia cards to get the best performance.
nVidia
For nVidia GPUs, you are required to use a GPU supporting the 2nd generation of PureVideo (VP2 or newer), which means that you need a GeForce 8, GeForce 9 (advised), GeForce 200 series or newer.
To be sure, check your GPU against the PureVideo table on Wikipedia and verify that it is VP2 or newer.
ATI
For ATI GPUs, it is a bit trickier.
First, you are required to use a GPU supporting Unified Video Decoder.
We believe a GPU supporting UVD+ is enough, but you might require a UVD2-compatible GPU; we don't have the hardware to test so far. We have only tested against the Radeon HD 4000 series.
Moreover, ATI cards seem to be very slow at getting the data back from the GPU. So far, it seems that you need an SSE4.1-capable CPU, in order to use the MOVNTDQA instruction for the copy-back. This may change in the future!
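To illustrate why MOVNTDQA matters, here is a minimal, hypothetical sketch of a copy-back loop using the SSE4.1 _mm_stream_load_si128 intrinsic (which compiles to MOVNTDQA). This is not VLC's actual code; it only shows the technique of streaming loads from uncached, write-combined (USWC) GPU memory, where ordinary loads are very slow.

    #include <smmintrin.h>   /* SSE4.1 intrinsics: _mm_stream_load_si128 (MOVNTDQA) */
    #include <stdint.h>
    #include <stddef.h>

    /* Hypothetical copy of a decoded frame from uncached, write-combined
     * (USWC) GPU memory into normal system memory. Streaming loads fetch
     * whole cache lines instead of stalling on every read. Assumes
     * 16-byte-aligned pointers and a size that is a multiple of 64. */
    static void copy_from_gpu(uint8_t *dst, const uint8_t *src, size_t size)
    {
        for (size_t i = 0; i < size; i += 64) {
            /* Stream-load one 64-byte cache line in four 16-byte chunks. */
            __m128i a = _mm_stream_load_si128((__m128i *)(src + i));
            __m128i b = _mm_stream_load_si128((__m128i *)(src + i + 16));
            __m128i c = _mm_stream_load_si128((__m128i *)(src + i + 32));
            __m128i d = _mm_stream_load_si128((__m128i *)(src + i + 48));
            /* Regular stores into cacheable system memory. */
            _mm_store_si128((__m128i *)(dst + i), a);
            _mm_store_si128((__m128i *)(dst + i + 16), b);
            _mm_store_si128((__m128i *)(dst + i + 32), c);
            _mm_store_si128((__m128i *)(dst + i + 48), d);
        }
    }

Without SSE4.1, every read from such memory stalls individually, which is why the copy-back can dominate the decoding time on pre-SSE4.1 CPUs.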
Intel
We haven't tested any Intel implementation so far.