GPGPU-WG KnowledgeBase Batch Schedulers Torque
Revision as of 18:31, 27 January 2014
The latest production version of Torque (version 4.2.6) is not widely used in the production EGI Grid. Further information on configuring GPGPU support can be found under [Torque GPGPU scheduling].
Support for GPGPUs was introduced in Torque 2.5.6. The number of GPGPUs made available on a client node is controlled through the Torque nodes file on the Torque server (normally /var/spool/torque/server_priv/nodes). For example, to indicate that 2 GPGPUs are available on wn001.example.com, we would set the following in the nodes file:
wn001.example.com np=8 gpus=2
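To illustrate the nodes-file format described above, the following is a minimal sketch of a parser for entries of the form shown (hostname followed by np= and gpus= attributes). The function name and the handling of extra attributes are our own illustration, not part of Torque itself; a real nodes file may carry additional node properties that this sketch simply ignores.

```python
def parse_nodes_file(text):
    """Return {hostname: {"np": int, "gpus": int}} for each node line.

    Parses Torque server_priv/nodes entries such as:
        wn001.example.com np=8 gpus=2
    """
    nodes = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        host, *attrs = line.split()
        # Assumed defaults for illustration: one core, no GPUs declared.
        entry = {"np": 1, "gpus": 0}
        for attr in attrs:
            key, sep, value = attr.partition("=")
            if sep and key in entry:
                entry[key] = int(value)
        nodes[host] = entry
    return nodes

print(parse_nodes_file("wn001.example.com np=8 gpus=2"))
```

Running this on the example line above yields a single entry for wn001.example.com with 8 cores and 2 GPGPUs, which is how the server learns how many GPUs it may schedule on that node.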
NVidia Support
If you are using high-end Nvidia GPGPUs (Kepler/Fermi), Torque can extract and publish information about the status of these cards. However, you must re-compile Torque to enable these features. See: [NVidia GPGPU support]. This information could be quite useful for GIP plugins to determine the status of the available GPGPUs.
Caveat: This feature does not work with mid-to-low-range GTX cards.
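As a rough sketch of the re-compilation step mentioned above, a Torque 4.x build with NVidia GPU status support typically involves pointing configure at the NVML headers and library; the exact flag names and NVML paths below are assumptions based on Torque 4.x documentation and should be verified against `./configure --help` for your version.

```shell
# Illustrative build sketch only; flag names and NVML install paths
# are assumptions, check your Torque version's configure options.
./configure --enable-nvidia-gpus \
            --with-nvml-include=/usr/local/cuda/include \
            --with-nvml-lib=/usr/local/cuda/lib64
make && make install
```

After rebuilding, the pbs_mom on each GPU node must also be replaced with the NVML-enabled build for GPU status reporting to work.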
MAUI Support
See: