VLC GPU Decoding
Introduction to GPU decoding in VLC
The VLC framework can use your graphics card (GPU) to decode H.264 streams (often loosely called HD videos) under certain circumstances.
Because of its modular approach and its transcoding/streaming capabilities, VLC uses the GPU for the decoding stage only, then copies the decoded frames back to system memory for the remaining stages (streaming, filtering, or any video output plugged after that).
This means that, compared to some other implementations, GPU decoding in VLC can be slower, because the frames have to travel back from the GPU. In exchange, you can plug ANY video output (sink) into it and use all of VLC's video filters.
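As a rough illustration of that pipeline shape (a hypothetical sketch, not VLC's actual API; every function name below is invented), the flow looks like this:

 /* Hypothetical sketch of the pipeline described above: decode on the
  * GPU, copy the frame back to system RAM, then run the normal stages.
  * These are stub functions for illustration only, NOT real VLC APIs. */
 #include <stdlib.h>
 
 typedef struct { unsigned char *pixels; } frame_t;
 
 static frame_t *gpu_decode(const void *h264)   /* decode stage, on the GPU */
 { (void)h264; return calloc(1, sizeof(frame_t)); }
 
 static void copy_back_to_ram(frame_t *f)       /* the extra copy that can  */
 { (void)f; }                                   /* make VLC slower          */
 
 static void filter_and_output(frame_t *f)      /* any filter, any sink     */
 { free(f); }
 
 int main(void)
 {
     frame_t *f = gpu_decode(NULL);  /* 1. decoding happens on the GPU      */
     copy_back_to_ram(f);            /* 2. the frame is copied back to RAM  */
     filter_and_output(f);           /* 3. so every filter and output works */
     return 0;
 }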
Windows
VLC supports DxVA in its version 2.0 (DxVA2). That means that Windows Vista, Windows Server 2008, or Windows 7 is required. If you are using Windows XP, GPU decoding in VLC cannot work for you yet.
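A quick way to check whether a given machine has the DxVA2 runtime at all is to try loading dxva2.dll, which only ships with Windows Vista and later. This is a generic Win32 probe, not VLC code:

 /* Generic Win32 probe (not VLC code): dxva2.dll only exists on Windows
  * Vista / Server 2008 / 7, so failing to load it means DxVA2-based GPU
  * decoding cannot work on this machine. */
 #include <windows.h>
 #include <stdio.h>
 
 int main(void)
 {
     HMODULE h = LoadLibrary(TEXT("dxva2.dll"));
     if (h) {
         printf("DxVA2 runtime found - GPU decoding is possible.\n");
         FreeLibrary(h);
     } else {
         printf("No dxva2.dll - this looks like Windows XP or older.\n");
     }
     return 0;
 }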
Linux
On Linux, there is code for both VDPAU and VAAPI. There is also some code for a VAAPI video output, but it is not yet merged into the current Git tree.
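To see which of the two acceleration APIs your Linux system exposes, you can try to dlopen the usual library sonames (libvdpau.so.1 and libva.so.1 are an assumption about your distribution's packaging; this is a generic probe, independent of VLC):

 /* Generic Linux probe (not VLC code): check whether the VDPAU and/or
  * VA-API runtime libraries are installed, using their usual sonames.
  * Build with: cc probe.c -ldl */
 #include <dlfcn.h>
 #include <stdio.h>
 
 static void probe(const char *name, const char *soname)
 {
     void *h = dlopen(soname, RTLD_LAZY);
     printf("%s: %s\n", name, h ? "available" : "not found");
     if (h) dlclose(h);
 }
 
 int main(void)
 {
     probe("VDPAU (nVidia)", "libvdpau.so.1");
     probe("VA-API",         "libva.so.1");
     return 0;
 }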
Requirements for DxVA2
Graphics card
nVidia
For nVidia GPUs, you are required to use a GPU supporting PureVideo, which means you need a GeForce 8 or GeForce 9 (advised).