VLC GPU Decoding
Revision as of 07:28, 27 November 2009
Introduction to GPU decoding in VLC
The VLC framework can use your graphics card (aka GPU) to decode H.264 streams (wrongly called "HD videos") under certain circumstances.
Because of its modular approach and its transcoding/streaming capabilities, VLC does GPU decoding at the decoding stage only, and then copies the data back so it can go through the other stages (streaming, filtering, or any video output plugged in after that).
This means that, compared to some other implementations, GPU decoding in VLC can be slower, because it needs to copy the data back from the GPU. In exchange, you can plug ANY video output (sink) into it and use all of VLC's video filters.
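The pipeline described above can be sketched as follows. This is a schematic illustration of the stage ordering only, not actual VLC code; all function and stage names here are hypothetical:

```python
# Schematic sketch of the pipeline described above: only the decode
# stage runs on the GPU; frames are copied back to system memory, so
# any filter or video output can be plugged in afterwards.
# All names are illustrative, not real VLC APIs.

def gpu_decode(stream):
    """Decode an H.264 stream on the GPU (e.g. via DxVA2 or VDPAU)."""
    return [f"gpu_frame({pkt})" for pkt in stream]

def copy_back(gpu_frames):
    """Copy decoded frames from GPU memory back to system memory."""
    return [f"cpu_frame({f})" for f in gpu_frames]

def run_pipeline(stream, filters, output):
    """Decode on the GPU, copy back, then run ordinary CPU stages."""
    frames = copy_back(gpu_decode(stream))
    for flt in filters:          # any video filter still works here
        frames = [flt(f) for f in frames]
    return [output(f) for f in frames]
```

The copy-back step in the middle is exactly the cost mentioned above: it makes decoding slower than implementations that keep frames on the GPU, but it is what lets arbitrary filters and outputs follow.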
Windows
VLC supports DxVA in its version 2.0 only. That means Windows Vista, Windows Server 2008, or Windows 7 is required. If you are using Windows XP, GPU decoding cannot work for you yet.
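Windows Vista, Server 2008, and 7 all report NT kernel major version 6, while XP reports 5, so the requirement above amounts to checking the major version. A minimal sketch (the helper name is ours, not VLC's):

```python
def dxva2_possible(nt_major_version: int) -> bool:
    """DxVA 2.0 requires Windows Vista (NT 6.0) or later;
    Windows XP (NT 5.x) cannot use it."""
    return nt_major_version >= 6

# On an actual Windows machine you could feed in
# sys.getwindowsversion().major: XP gives 5, Vista/2008/7 give 6.
```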
Linux
On Linux, there is code for VDPAU and VAAPI. There is also some code for a VAAPI video output, but it is not yet merged into the current Git tree.
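Whether the VDPAU or VAAPI libraries are even present on a Linux system can be probed with Python's standard `ctypes.util.find_library`. This is a convenience check of our own, not something VLC itself does:

```python
from ctypes.util import find_library

def hw_decode_backends():
    """Report which GPU decoding libraries are installed.
    find_library returns the library name if found, else None."""
    return {
        "vdpau": find_library("vdpau") is not None,  # nVidia's VDPAU
        "va": find_library("va") is not None,        # VAAPI (libva)
    }
```

Having the library installed is of course only a prerequisite; the driver and GPU must support the API as well.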
Requirements for DxVA2
Graphic card
VLC developers currently recommend nVidia cards for the best performance.
nVidia
For nVidia GPUs, you need a card supporting the second generation of PureVideo (VP2 or newer), which means a GeForce 8, GeForce 9 (recommended), GeForce 200 or newer.
To be sure, check your GPU against the PureVideo table on Wikipedia and verify that it is VP2 or newer.