GPGPU-WG KnowledgeBase Batch Schedulers Torque
{{Template:Op menubar}} {{TOC_right}}
[[Category:Task_forces]]
'''[[GPGPU_Working_Group| << GPGPU Working Group main page]]'''

The latest production version of [http://www.adaptivecomputing.com/downloading/?file=/torque/torque-4.2.6.1.tar.gz Torque (version 4.2.6.1)] is not widely used in the production EGI Grid. Moreover, it is normally used in conjunction with the MAUI scheduler.
Support for GPGPUs was introduced in Torque 2.5.6. The number of GPGPUs made available on a client node is controlled through the Torque '''nodes''' file on the Torque server (normally /var/spool/torque/server_priv/nodes). For example, to indicate that 2 GPGPUs are available on wn001.example.com, we would set the following in the nodes file:
<pre>
wn001.example.com np=8 gpus=2
</pre>
Further information on configuring support for GPGPUs can be found under [http://docs.adaptivecomputing.com/torque/archive/4-0-1/Content/topics/3-nodes/schedulingGPUs.htm Torque GPGPU scheduling].
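Once GPGPUs are declared in the nodes file, jobs can request them through the standard <code>nodes</code> resource specification. The following is a minimal sketch of a job script; the job name, resource counts, and walltime are illustrative assumptions, not values from this page:

<pre>
#!/bin/bash
# Hypothetical job script requesting 2 GPGPUs on a single node.
#PBS -N gpu-test
#PBS -l nodes=1:ppn=1:gpus=2
#PBS -l walltime=00:10:00

# Torque writes the GPUs assigned to the job into the file named
# by $PBS_GPUFILE; listing it shows which devices were allocated.
cat $PBS_GPUFILE
</pre>

The script would be submitted in the usual way, e.g. <code>qsub gpu-test.sh</code>.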
== NVidia Support ==
If you are using high-end Nvidia GPGPUs (Kepler/Fermi), Torque can extract and publish information about the status of these cards. However, you must re-compile Torque to support these features. See: [http://docs.adaptivecomputing.com/torque/archive/4-0-1/help.htm#topics/3-nodes/NVIDIAGPGPUs.htm NVidia GPGPU support]. This status information could be quite useful for GIP plugins to determine the availability and usage of GPGPU resources.

'''Caveat:''' This feature does not work with mid-to-low range GTX cards.
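As a sketch of how this status information surfaces: assuming Torque was rebuilt with NVML support (the <code>--enable-nvidia-gpus</code> configure option), the node status reported by <code>pbsnodes</code> gains a <code>gpu_status</code> attribute. The node name below is illustrative:

<pre>
# Query the node and filter for GPU-related attributes; on an
# NVML-enabled build the output includes a gpu_status line with
# per-card fields (mode, memory, temperature, driver version).
pbsnodes wn001.example.com | grep -i gpu
</pre>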
== MAUI Support ==
See: [https://wiki.egi.eu/w/index.php?title=GPGPU-WG:GPGPU_Working_Group_KnowlegeBase:Batch_Schedulers:Torque_MAUI Torque with MAUI]
Latest revision as of 16:01, 22 January 2015