GPGPU-WG KnowledgeBase Batch Schedulers Torque
Revision as of 18:46, 27 January 2014
The latest production version of [http://www.adaptivecomputing.com/downloading/?file=/torque/torque-4.2.6.1.tar.gz Torque (version 4.2.6.1)] is not widely used in the production EGI Grid. Further information on configuring support for GPGPUs can be found under [http://docs.adaptivecomputing.com/torque/archive/4-0-1/Content/topics/3-nodes/schedulingGPUs.htm Torque GPGPU scheduling].
Support for GPGPUs was introduced in Torque 2.5.6. The number of GPGPUs made available on a client node is controlled through the Torque nodes file on the Torque server (normally /var/spool/torque/server_priv/nodes). For example, to indicate that 2 GPGPUs are available on wn001.example.com, we would set the following in the nodes file:
 wn001.example.com np=8 gpus=2
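A job can then request those GPGPUs at submission time via the standard Torque resource syntax. A minimal sketch of a job script, assuming a Torque 2.5.6+ setup as above (the script name, walltime, and node count are illustrative, not taken from this page):

```shell
# Illustrative Torque job script (gpu_job.sh); limits here are examples only.
#PBS -l nodes=1:gpus=2        # request one node with 2 of its GPGPUs
#PBS -l walltime=01:00:00     # example walltime

# Torque exposes the allocated GPUs to the job via $PBS_GPUFILE,
# which names a file listing one allocated GPU per line
# (e.g. wn001.example.com-gpu0).
echo "Allocated GPUs:"
cat "$PBS_GPUFILE"
```

The script would be submitted with <code>qsub gpu_job.sh</code>.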
== NVidia Support ==
If you are using high-end NVidia GPGPUs (Kepler/Fermi), Torque can extract and publish information about the status of these cards. However, you must re-compile Torque to support these features. See: [[NVidia GPGPU support]]. This information could be quite useful for GIP plugins to determine the status of the available GPGPUs.
'''Caveat:''' This feature does not work with mid-to-low-range GTX cards.
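Once Torque has been rebuilt with NVidia support, the card status can be inspected from the command line. A sketch, assuming a Torque built with the <code>--enable-nvidia-gpus</code> configure flag (the node name is illustrative, and the exact fields reported depend on the Torque version and driver):

```shell
# Query a GPGPU-equipped node; requires Torque compiled with NVML support.
pbsnodes wn001.example.com
# With NVidia support enabled, the output additionally carries a
# gpu_status attribute describing each card (ID, product name,
# utilisation, temperature, ...), which a GIP plugin could parse
# to publish GPGPU availability for the node.
```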
== MAUI Support ==
See: