'''Alert:''' The wiki is deprecated and due to be decommissioned by the end of September 2022.
The content is being migrated to other platforms; new updates will be ignored and lost.
If needed, you can get in touch with the EGI SDIS team using operations @ egi.eu.

GPGPU-WG KnowledgeBase Batch Schedulers Torque

From EGIWiki
{{Template:Op menubar}} {{TOC_right}}
[[Category:Task_forces]]

'''[[GPGPU_Working_Group| << GPGPU Working Group main page]]'''

The latest production version of [http://www.adaptivecomputing.com/downloading/?file=/torque/torque-4.2.6.1.tar.gz Torque (version 4.2.6.1)] is not widely used in the production EGI Grid. Moreover, it is normally used in conjunction with the MAUI scheduler.

Support for GPGPUs was introduced in Torque 2.5.6. The number of GPGPUs made available on a client node is controlled through the Torque nodes file on the Torque server (normally <code>/var/spool/torque/server_priv/nodes</code>). For example, to indicate that 2 GPGPUs are available on wn001.example.com, set the following in the nodes file:


<pre>
wn001.example.com np=8 gpus=2
</pre>
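As a rough illustration of the nodes-file layout (this helper is ours, not part of Torque), each entry is a hostname followed by key=value attributes such as np and gpus, plus optional bare property labels:

```python
def parse_nodes_line(line):
    """Split one Torque server_priv/nodes entry into its parts.

    Illustrative helper only (not part of Torque): assumes the common
    'hostname np=N gpus=M [properties...]' layout shown above.
    """
    hostname, *fields = line.split()
    attrs = {}
    properties = []
    for field in fields:
        if "=" in field:
            key, value = field.split("=", 1)
            # np and gpus are counts; keep other values as strings.
            attrs[key] = int(value) if value.isdigit() else value
        else:
            # Bare tokens in a nodes entry are node properties (labels).
            properties.append(field)
    return hostname, attrs, properties

host, attrs, props = parse_nodes_line("wn001.example.com np=8 gpus=2")
# host is "wn001.example.com"; attrs holds {"np": 8, "gpus": 2}
```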


Further information on configuring support for GPGPUs can be found under
[http://docs.adaptivecomputing.com/torque/archive/4-0-1/Content/topics/3-nodes/schedulingGPUs.htm Torque GPGPU scheduling].
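Jobs then request GPUs through the nodes=...:gpus=... resource form described in that documentation. A minimal sketch of assembling such a submission (the helper name and defaults are ours, not Torque's; check your site's accepted syntax):

```python
def gpu_qsub_command(script, nodes=1, ppn=8, gpus=2):
    """Build a qsub argument list requesting GPUs.

    Sketch only: uses the 'nodes=N:ppn=P:gpus=G' resource form supported
    by Torque since 2.5.6; sites may require different resource strings.
    """
    resource = f"nodes={nodes}:ppn={ppn}:gpus={gpus}"
    return ["qsub", "-l", resource, script]

# gpu_qsub_command("gpu_job.sh") builds:
#   ["qsub", "-l", "nodes=1:ppn=8:gpus=2", "gpu_job.sh"]
```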




== NVidia Support ==


If you are using high-end Nvidia GPGPUs (Kepler/Fermi), Torque can extract and publish information about the status of these cards. However, you must re-compile Torque to support these features. See: [http://docs.adaptivecomputing.com/torque/archive/4-0-1/help.htm#topics/3-nodes/NVIDIAGPGPUs.htm NVidia GPGPU support]. This status information could be quite useful for GIP plugins to determine the availability and usage of GPGPU resources.
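When Torque is rebuilt with NVidia support, the per-node GPU status is exposed as a gpu_status attribute (visible via pbsnodes). The exact layout varies between Torque versions; assuming a comma-separated gpu[N]=key=value;key=value layout, a GIP plugin could pull out per-GPU fields roughly like this (the sample string and helper are hypothetical, not real Torque output):

```python
def parse_gpu_status(gpu_status):
    """Extract per-GPU key/value fields from a gpu_status-style string.

    Illustrative only: assumes a 'gpu[N]=k=v;k=v,...' layout; real
    Torque output differs between versions, so verify against pbsnodes.
    """
    gpus = {}
    for chunk in gpu_status.split(","):
        if not chunk.startswith("gpu["):
            continue  # skip non-GPU fields such as driver version or timestamp
        label, _, body = chunk.partition("=")
        fields = dict(item.split("=", 1) for item in body.split(";") if "=" in item)
        gpus[label] = fields
    return gpus

# Hypothetical sample in the assumed layout:
sample = "gpu[0]=gpu_mode=Default;gpu_utilization=35%,gpu[1]=gpu_mode=Default;gpu_utilization=0%"
status = parse_gpu_status(sample)
```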


'''Caveat:''' This feature does not work with mid-to-low range GTX cards.
== MAUI Support ==


See: [https://wiki.egi.eu/w/index.php?title=GPGPU-WG:GPGPU_Working_Group_KnowlegeBase:Batch_Schedulers:Torque_MAUI Torque with MAUI]

Latest revision as of 16:01, 22 January 2015
