Profiling with IPM and Vampir

 

Note: This information is only for versions of the UM before 10.0.

NCI has a variety of profiling tools that can be helpful when developing. IPM is a low-overhead way to collect message-passing information, while Vampir/Trace Analyser is a more in-depth profiler.

To use these with the UM, some hand-edits are provided. You don't need to recompile when you add IPM, but you do with Vampir and Trace Analyser.

IPM

To run

Add this handedit to Input/Output Control->User hand edit files in the UMUI

~access/umui_jobs/hand_edits/profiling/ipm.sh
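
The contents of ipm.sh are not reproduced here. Since IPM does not need a recompile, it is presumably attached at run time; a minimal sketch of that approach, assuming IPM's standard preload mechanism and environment variables (the real hand-edit may differ):

# Sketch only -- the actual hand-edit is ~access/umui_jobs/hand_edits/profiling/ipm.sh
module load ipm

# Preload the IPM library so the unmodified UM executable is profiled.
# The library path is an assumption; check the ipm module for the real location.
export LD_PRELOAD=$IPM_ROOT/lib/libipm.so

# Ask IPM for a full report at the end of the run.
export IPM_REPORT=full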

To view

OpenMPI

When using OpenMPI, IPM creates an XML file in the UM output directory whose name consists of your user name plus a timestamp. ipm_view is used to display the file, showing a number of graphs of call time and load balance.

$ module load ipm
$ ipm_view $DATAOUTPUT/$USER/saatd/saw562.1346035173.302475.0
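
The same XML file can also be turned into a static HTML report with IPM's ipm_parse utility, assuming it is installed alongside the ipm module; it writes a directory of HTML pages that can be opened in a browser:

$ module load ipm
$ ipm_parse -html $DATAOUTPUT/$USER/saatd/saw562.1346035173.302475.0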

[Image: Ipm.png] See also: NCI IPM notes

Intel MPI

Intel MPI's IPM produces a text file containing communication statistics:

$ cat $DATAOUTPUT/$USER/saatc/stats.txt
Intel(R) MPI Library Version 4.0 Update 3
__ MPI Communication Statistics __

Stats level: 3
P2P scope:< FULL >
Collectives scope:< FULL >

1346044901 Process 0 of 4 on node vayu2 lifetime = 394501.92

Data Transfers
Src     Dst     Amount(MB)      Transfers
-----------------------------------------
000 --> 000     0.000000e+00    0
000 --> 001     5.019608e-01    6
000 --> 002     1.968384e-03    2
000 --> 003     0.000000e+00    0
=========================================
Totals          5.039291e-01    8

Communication Activity
Operation       Volume(MB)      Calls
-----------------------------------------
P2P
Csend           3.929138e-03    4
Send            5.000000e-01    4
Bsend           0.000000e+00    0
Rsend           0.000000e+00    0
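
The "Stats level: 3" line and the default stats.txt file name suggest the hand-edit turns on Intel MPI's built-in statistics gathering through environment variables. A sketch of settings that would produce this kind of report, assuming Intel MPI's documented I_MPI_STATS variables (the actual ipm.sh hand-edit may do this differently):

# Assumed settings only -- check ~access/umui_jobs/hand_edits/profiling/ipm.sh
export I_MPI_STATS=3                # statistics detail level, matching the report
export I_MPI_STATS_FILE=stats.txt   # output file name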

See also: Intel MPI docs

Vampir

To run

Add this handedit to Input/Output Control->User hand edit files in the UMUI

~access/umui_jobs/hand_edits/profiling/vampir.sh
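
The recompile is needed because the .otf trace is produced by instrumentation built into the executable. A sketch of the kind of setup involved, assuming OpenMPI's bundled VampirTrace wrappers and its documented environment variables (the real hand-edit may do this differently, for example through the UM's build configuration):

# Assumptions only -- the actual hand-edit is ~access/umui_jobs/hand_edits/profiling/vampir.sh
# Build step: compile with the VampirTrace-instrumented wrappers,
# e.g. mpif90-vt in place of mpif90.
# Run step: typical VampirTrace settings controlling the trace output.
export VT_FILE_PREFIX=saatd                    # base name of the .otf trace
export VT_PFORM_GDIR=$DATAOUTPUT/$USER/saatd   # directory the trace is written to
export VT_BUFFER_SIZE=32M                      # per-process trace buffer size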

To view

OpenMPI

When using OpenMPI, the vampir utility is used to view the output files and graph the call timeline:

$ module load vampir
$ vampir $DATAOUTPUT/$USER/saatd/saatd.otf

[Image: Vampir.png] See also: Vampir docs

Intel MPI

Intel has its own version of Vampir, called Trace Analyser, which serves a similar function. To view graphs of call timelines and other information, use the Charts menu.

$ module load intel-itac
$ traceanalyser $DATAOUTPUT/$USER/saatc/saatc.stf
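
The .stf input comes from the Intel Trace Collector. A sketch of how tracing is typically enabled, assuming Intel MPI's documented -trace option and the Trace Collector's VT_LOGFILE_PREFIX variable (the vampir.sh hand-edit may use a different mechanism):

$ module load intel-itac
$ mpiifort -trace ... -o um.exe                      # relink with the Trace Collector; um.exe and ... are placeholders
$ export VT_LOGFILE_PREFIX=$DATAOUTPUT/$USER/saatc   # directory the .stf is written to
$ mpirun -n 4 ./um.exe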

[Image: Itac.png] See also: Trace Collector docs, Trace Analyser docs