GPGPU-WG KnowledgeBase Batch Schedulers Torque

The latest production version of Torque (version 4.2.6.1) is not widely used in the production EGI Grid. Moreover, Torque is normally used in conjunction with the MAUI scheduler.


Support for GPGPUs was introduced in Torque 2.5.6. The number of GPGPUs made available on a client node is controlled through the Torque nodes file on the Torque server (normally /var/spool/torque/server_priv/nodes). For example, to indicate that 2 GPGPUs are available on wn001.example.com, we would set the following in the nodes file:

wn001.example.com np=8 gpus=2
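
The nodes file is only re-read when the Torque server daemon is (re)started, so a minimal sketch of applying and verifying the change is shown below; the service name and the exact output format are assumptions that depend on the Torque version and operating system.

# Restart the server daemon so that the nodes file is re-read
service pbs_server restart

# Check that the GPGPUs are now advertised for the node;
# the output should contain an attribute such as "gpus = 2"
pbsnodes wn001.example.com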

Further information on configuring GPGPU support can be found in the Torque GPGPU scheduling documentation.
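
Once the nodes file advertises GPGPUs, jobs can request them at submission time. The example below is a sketch only: the job script name is hypothetical, and the exact resource syntax should be confirmed against the Torque GPGPU scheduling documentation for the installed version.

# Request one node with one CPU core and one GPGPU
qsub -l nodes=1:ppn=1:gpus=1 my_gpu_job.sh

# Inside the running job, the GPGPUs allocated by Torque are listed
# in the file pointed to by the $PBS_GPUFILE environment variable
cat $PBS_GPUFILE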


NVidia Support

If you are using high-end Nvidia GPGPUs (Kepler/Fermi), Torque can extract and publish information about the status of these cards. However, you must re-compile Torque to support these features. See: NVidia GPGPU support (http://docs.adaptivecomputing.com/torque/archive/4-0-1/help.htm#topics/3-nodes/NVIDIAGPGPUs.htm). This status information could be quite useful for GIP plugins to determine the availability and usage of GPGPU resources.

Caveat: This feature does not work with mid-to-low-range GTX cards.
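
To illustrate what the re-compilation involves, the configure invocation below is a sketch only; the flag names and the NVML header/library locations are assumptions that vary per Torque version and CUDA installation, and should be checked against the NVidia GPGPU support documentation linked above.

# Build Torque against the NVIDIA Management Library (NVML)
./configure --enable-nvidia-gpus \
            --with-nvml-include=/usr/local/cuda/include \
            --with-nvml-lib=/usr/lib64
make && make install

# After the daemons are restarted, pbsnodes reports a per-node
# gpu_status attribute, roughly of the form:
#   gpu_status = gpu[0]=gpu_id=0000:0A:00.0;gpu_utilization=0%;
#                gpu_memory_utilization=0%;gpu_state=Unallocated;...
pbsnodes wn001.example.com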


MAUI Support

See: Torque with MAUI (https://wiki.egi.eu/w/index.php?title=GPGPU-WG:GPGPU_Working_Group_KnowlegeBase:Batch_Schedulers:Torque_MAUI)