I have an NVIDIA 560-based graphics card.
When I run LuxRender, whether the build from the community repository or the OpenCL version from the LuxRender site, it does not appear to use the GPU while rendering, judging by the fact that the card's temperature doesn't rise.
glxgears reports about 18800 fps.
I am using the latest NVIDIA driver on a 64-bit installation.
What do I need to do to get luxrender to use the GPU?
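A first thing worth checking (a sketch, assuming a typical Linux OpenCL setup) is whether any OpenCL ICD is registered at all. Drivers conventionally drop an `.icd` file in `/etc/OpenCL/vendors`; without one, no application can reach the GPU through OpenCL:

```shell
#!/bin/sh
# List registered OpenCL ICD files. /etc/OpenCL/vendors is the conventional
# location; the NVIDIA driver package installs nvidia.icd there.
list_icds() {
    dir="${1:-/etc/OpenCL/vendors}"
    # ls fails quietly when the directory is missing or the glob matches nothing
    ls "$dir"/*.icd 2>/dev/null || echo "no ICD files in $dir"
}

list_icds "$@"
```

If this prints "no ICD files", reinstalling the NVIDIA driver package (which provides the ICD) is the first step before touching LuxRender itself.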
TIA
Last edited by gabu (2011-07-09 13:33:18)
I ran luxrender (the one from the luxrender site) from the command line and got this message:
./luxrender: /usr/lib/libOpenCL.so: no version information available (required by ./luxrender)
However, no such message appears when I run the community version of luxrender from the command line.
I have community/opencl-headers 1.1.20100715-3 installed.
Do I have to do anything more to get a luxrender installation that is capable of using the GPU?
Browsing the LuxRender documentation and forum, it looks like you have to make explicit statements in the scene definition file to cause the GPU to be used. One user reports that he had to modify the example test files to enable GPU use. It seems odd, though, that the example files supplied with the OpenCL version aren't set up to exercise the GPU capability out of the box!
Is my interpretation right?
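For reference, a minimal sketch of what such an edit can look like. In LuxRender 0.8 the GPU-accelerated path appears to be selected with a Renderer directive near the top of the .lxs scene file; the exact directive name and options depend on the version, so "hybrid" here is an assumption based on the 0.8 documentation:

```
# Select the GPU-accelerated hybrid renderer (LuxRender 0.8; name may
# differ in other versions). Without this line the CPU renderer is used.
Renderer "hybrid"

# ...the rest of the scene (Sampler, Camera, WorldBegin, etc.) is unchanged
```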
After getting my hands on a LuxRender scene file that had been set up for GPU use:
- the community-sourced luxrender program reported in its log that it has no OpenCL capability;
- the OpenCL-capable program from the LuxRender website is rendering using the GPU!
The only remaining question is why the community program lacks OpenCL capability: either it was compiled without it, or something is missing on my PC that would enable it.
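One quick way to tell those two possibilities apart (a sketch; `/usr/bin/luxrender` is the assumed install path for the community package) is to check whether the binary was linked against libOpenCL at all. If ldd shows no OpenCL entry, the package was simply built without it, and no local change can enable it:

```shell
#!/bin/sh
# Report whether a binary is dynamically linked against libOpenCL.
check_opencl_link() {
    if ldd "$1" 2>/dev/null | grep -qi 'libOpenCL'; then
        echo "linked against OpenCL"
    else
        echo "no OpenCL linkage"
    fi
}

# e.g.: check_opencl_link /usr/bin/luxrender
```

A "no OpenCL linkage" result would mean the community build needs recompiling (e.g. from ABS/AUR) with OpenCL support enabled, rather than anything being wrong with the driver setup.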