P2EL Examples

To ease readability, the following commands have been broken into several lines. If you wish to resubmit any of them, you will need to rejoin it into a single line, unless you are using a P2EL-aware editor such as the "P2EL editor" of the "GWE Terminal" or the "GWE Web Control Panel". Also, concrete parameter values in these commands have been replaced by intuitive, self-explanatory placeholder tags.

Parallel Online Documents Reader

${INDEX}=$range(0,50,01) 
${FASTA_FILE}=$in(http://www.expasy.org/uniprot/P801${INDEX}.fas)

cat ${FASTA_FILE}

This command instructs GWE to read in parallel, from the online Expasy database, the 51 FASTA sequences generated by expanding the variable ${INDEX}.
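The exact semantics of $range are not documented on this page; judging from the examples, the third argument is the step, and leading zeros in the step literal appear to set a zero-padding width for the generated values. A minimal Python sketch of that assumed behavior:

```python
def p2el_range(start, end, step="1"):
    """Sketch of P2EL's $range: an inclusive integer range whose values
    are zero-padded to the width of the step literal. This padding rule
    is an assumption inferred from these examples, not a specification."""
    width = len(step)
    return [str(i).zfill(width) for i in range(start, end + 1, int(step))]

# ${INDEX}=$range(0,50,01) -> '00', '01', ..., '50' (51 values)
index = p2el_range(0, 50, "01")
```

With 51 values of ${INDEX}, GWE expands `cat ${FASTA_FILE}` into 51 parallel jobs, one per downloaded file.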

Parallel Backup-er

${INDEX}=$range(1,100,001)
${DEST}=$out(sftp://destinationHost/destinationDir/destinationFile-${INDEX}.tar) 
${SOURCE}=$in(sftp://sourceHost/sourceDir/sourceFile-${INDEX}) 

tar -cf ${DEST} ${SOURCE}

This command instructs GWE to download the matching files, tar them, and save the resulting archives to a different host.
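Conceptually, GWE expands this template into one concrete command per value of ${INDEX} before scheduling. A rough Python sketch of that expansion (the URLs come from the example above; the 3-digit padding mirrors the "001" step literal and is an assumption about $range's behavior):

```python
# Expand the backup template into its individual parallel jobs.
jobs = []
for i in range(1, 101):
    index = str(i).zfill(3)  # assumed zero-padding from the "001" step literal
    source = f"sftp://sourceHost/sourceDir/sourceFile-{index}"
    dest = f"sftp://destinationHost/destinationDir/destinationFile-{index}.tar"
    # At run time GWE would stage 'source' in, run tar, and upload 'dest'.
    jobs.append(f"tar -cf {dest} {source}")
```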

Parallel Directories Analyzer

${REMOTE_HOME_ROOT}=sftp://host/home
${FILES}=$dir(${REMOTE_HOME_ROOT},user-\d*,.*) 
${TARGET_FILE}=$in(${FILES}) 
${FILE_NAME}=$regExp(${FILES}, /, [^/]*, $) 
${STAT_DIR}=$out(${REMOTE_HOME_ROOT}/admin/dirStats/${FILE_NAME}) 

${RUNTIME_USER_HOME}/file-analysis.sh ${TARGET_FILE} ${STAT_DIR} 

This command instructs GWE to find all files and directories stored on the remote 'host' machine under the home directory of any user whose user name starts with user- followed by digits. Once found, GWE must do the following for each matching file/directory:

  • Download the file/directory to the compute node that is going to analyze it.
  • Execute the file-analysis.sh script found in the home directory, on the cluster, of the user who submitted this command.
  • Upload the directory (presumably generated by the script) specified as the second parameter of the script to the home directory of the user admin, under the subdirectory dirStats.

    The following is a simple test sample for the script file-analysis.sh:

    #!/bin/sh
    # $1 = downloaded target file/directory, $2 = output stats directory
    mkdir $2
    ls -alR $1 > $2/ls-test
    ls -alR $1
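The ${FILE_NAME} line above uses $regExp to strip everything up to the last '/'. Assuming it behaves like an end-anchored regular-expression match, a Python equivalent would be:

```python
import re

def last_path_component(path):
    """Mimic ${FILE_NAME}=$regExp(${FILES}, /, [^/]*, $): keep the final
    path segment, i.e. everything after the last '/'."""
    return re.search(r"[^/]*$", path).group(0)
```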
    

Parallel FreeSurfer Subject Cases Processor

${PATH}=sftp://sourceHost/subjectsPath 
${FILES}=$dir(${PATH},.*) 
${INPUT_DIR}=$in(${FILES})
${SUBJ_ID}=$regExp(${FILES}, /, [^/]*, $)
${OUTPUT_DIR}=$out(${PATH}/${SUBJ_ID}_out)

${RUNTIME_USER_HOME}/RunFreesurfer.sh ${INPUT_DIR} ${OUTPUT_DIR}

This command is similar to the Parallel Directories Analyzer. It instructs GWE to find all directories under sftp://sourceHost/subjectsPath and, for each of them:

  • Download it to the compute node tasked to process it.
  • Execute, against this downloaded directory, the RunFreesurfer.sh script found in the home directory, on the cluster, of the user who submitted this command.
  • Upload the directory generated by FreeSurfer to the source remote host as a sibling of its source subject directory, with the same name suffixed by the token _out.

Parallel Slicer's BSpline Deformable Registration (NAMIC Dataset)

${SLICER_MODULE}=$const(BSplineDeformableRegistration,BSplineDeformableRegistration-version2) 
${FIXED_FILES}=http://www.na-mic.org/ViewVC/index.cgi/trunk/Libs/MRML/Testing/TestData/fixed.nrrd?view=co
${MOVING_FILES}=http://www.na-mic.org/ViewVC/index.cgi/trunk/Libs/MRML/Testing/TestData/moving.nrrd?view=co
${FIXED_LOCAL}=$in(${FIXED_FILES},fixed.nrrd) 
${MOVING_LOCAL}=$in(${MOVING_FILES},moving.nrrd)
${OUTPUT}=$out(sftp://destinationHost/path/${SLICER_MODULE}/out-${ITER}-${HIST}-${SAM}.nrrd)
${ITER}=$range(10,50,5) ${HIST}=$range(20,100,010) ${SAM}=$range(500,5000,0750)

${SLICER_HOME}/Slicer3 --launch ${SLICER_HOME}/lib/Slicer3/Plugins/${SLICER_MODULE} 
--iterations ${ITER} --gridSize 5 --histogrambins ${HIST} --spatialsamples ${SAM} 
--maximumDeformation 1 --default 0 
--resampledmovingfilename ${OUTPUT} ${FIXED_LOCAL} ${MOVING_LOCAL} 

This command instructs GWE to execute in parallel 700 parameter-exploration invocations for each of two different versions of the BSplineDeformableRegistration Slicer3 module (1400 parallel runs in total) and to store each output file on the specified remote host. This command requires the ${SLICER_HOME} variable to be defined in the grid descriptor file.

Parallel Slicer's BSpline Deformable Registration (OASIS Dataset)

${URI}=http://www.oasis-brains.org/app/action/DownloadImages/template/ClosePage.vm?download_type=zip&search_element=xnat%3AmrSessionData&search_field=xnat%3AmrSessionData.ID&scanmpr1=true&scanmpr2=true&scanmpr3=true&scanmpr4=true&search_value=
${FIX_NAME}=OAS1_0101_MR1
${FIX}=$in(${URI}${FIX_NAME},${FIX_NAME}.zip,Y)
${MOV_NUM}=$range(1,51,10)
${MOV_NAME}=OAS1_00${MOV_NUM}_MR1
${MOV}=$in(${URI}${MOV_NAME},${MOV_NAME}.zip,Y)
${SCAN}=$range(1,4)

${ITER}=$const(10,20) ${HIST}=$range(20,100,060) ${SAM}=$range(500,5000,3000)
${OUT}=${RUNTIME_USER_HOME}/oasis-results/out-${SYSTEM_JOB_NUM}-${MOV_NUM}-${ITER}-${HIST}-${SAM}.
${OUT_HDR}=$out(${OUT}hdr)
${OUT_IMG}=$out(${OUT}img)

${SLICER_HOME}/Slicer3 --launch ${SLICER_HOME}/lib/Slicer3/Plugins/BSplineDeformableRegistration 
--iterations ${ITER} --gridSize 5 --histogrambins ${HIST} --spatialsamples ${SAM} --maximumDeformation 1 
--default 0 --resampledmovingfilename ${OUT_HDR} 
${FIX}-contents/${FIX_NAME}/RAW/${FIX_NAME}_mpr-${SCAN}_anon.hdr 
${MOV}-contents/${MOV_NAME}/RAW/${MOV_NAME}_mpr-${SCAN}_anon.hdr

This command instructs GWE to download a set of OASIS MRI images and execute BSplineDeformableRegistration runs over them. It also instructs GWE to save the results, at completion, under the runtime user's home directory on the running cluster. This command requires the ${SLICER_HOME} variable to be defined in the grid descriptor file.

Parallel Slicer's BSpline Deformable Registration (OASIS Dataset) Using P2EL Macros

${MOV}=$oasis_xcat(http://www.gridwizardenterprise.org/test/oasis-id.xcat)
${FIX}=$oasis(0101,1)
${ITER}=$const(10,20) ${HIST}=$range(20,100,060) ${SAM}=$range(500,5000,3000)
${BSPLINE}=$bspline(${SLICER_HOME},${ITER},${HIST},${SAM})
${OUT}=$uploadHDR(${RUNTIME_USER_HOME}/oasis-result2/${BSPLINE_EXPLORATION_ID}/out-${FIX_ID}-${MOV_OASIS_ID}-${SYSTEM_JOB_NUM})
${BSPLINE_CMD} ${OUT_HDR} ${FIX_HDR} ${MOV_OASIS_HDR}

The exact same command as the previous one, greatly simplified by the use of custom P2EL macros.

Parallel Slicer's BSpline Deformable Registration (XNAT System Images) Using P2EL Macros (XNAT Library)

${XNAT_SYS}=http://central.xnat.org
${FIX_EXPERIMENTS}=$xnatExperiments(${XNAT_SYS},CENTRAL_OASIS_CS,OAS1_0101)
${FIX_IMAGE_LIST}=$xnatListImages(${FIX_EXPERIMENTS},OAS1_0101_MR1,mpr-1)
${FIX}=$xnatStageHDR(${XNAT_SYS},${FIX_IMAGE_LIST})
${MOV_SUBJECTS}=$const(OAS1_0101,OAS1_0102)
${MOV}=$xnatStageAllImages(${XNAT_SYS},CENTRAL_OASIS_CS,${MOV_SUBJECTS})
${ITER}=$const(5,10) ${HIST}=$range(20,100,030) ${SAM}=$range(500,5000,1500)
${BSPLINE}=$bspline(${SLICER_HOME},${ITER},${HIST},${SAM})
${OUT}=$uploadHDR(${RUNTIME_USER_HOME}/gwe-res/${BSPLINE_EXPLORATION_ID}/out-${SYSTEM_JOB_NUM})
${BSPLINE_CMD} ${OUT} ${FIX} ${MOV}

A command similar to the previous one, running registrations over images queried and downloaded from an XNAT system through its REST API. It relies heavily on the out-of-the-box P2EL macros written for XNAT support.

Parallel Creation Of Multiple Slices For All Volumes Associated To A Particular XNAT Session

${XNAT_SYS}=http://central.xnat.org
${XPATH}=//rows/row/cell[(position()=3)and(substring-after(.,'.')='nrrd')]/text()
${VOL_REL_URI}=$xpath(${XNAT_SYS}/REST/experiments/CENTRAL_E00465/scans/brain/files?format=xml,${XPATH})
${VOL}=$in(${XNAT_SYS}${VOL_REL_URI})

${AXIS}=$const(0,1,2)
${PLANE}=$range(100,140,2.5)
${SLICING_CMD}=$sliceGen(${SLICER_HOME},${VOL_REL_URI},${AXIS},${PLANE})

${VOL_REL_URI_PATH}=$regExp(0,${VOL_REL_URI},,.*,/[^/]*$)

mkdir -p ${VOL_REL_URI_PATH} && cp ${VOL} ${VOL_REL_URI} && ${SLICING_CMD}

This command instructs GWE to query the XNAT REST API for all NRRD volumes attached to the given session and, for each volume and each axis/plane combination, generate the corresponding slice in parallel.
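The XPath above picks, from XNAT's file listing, the third cell of each row whose value ends in .nrrd (note that substring-after takes the text after the first '.', so multi-dot names behave accordingly). A hedged standard-library Python equivalent, run against a made-up listing document:

```python
import xml.etree.ElementTree as ET

def nrrd_uris(listing_xml):
    """Rough equivalent of the XPath filter: for each <row>, take the
    third <cell> and keep it when the text after its first '.' is 'nrrd'."""
    root = ET.fromstring(listing_xml)
    uris = []
    for row in root.iter("row"):
        cells = row.findall("cell")
        if len(cells) >= 3:
            text = cells[2].text or ""
            # partition() splits on the FIRST '.', matching substring-after()
            head, sep, tail = text.partition(".")
            if sep and tail == "nrrd":
                uris.append(text)
    return uris
```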

Parallel Slicer's DWI Streamline Tractography

${SEED_POINT}=[[0,0,0]]
${TENSOR_TYPE}=$const(Two-Tensor)
${ANISOTROPY}=$const(Cl1)
${ANISOTROPY_THRESHOLD}=$range(0.1,0.1,0.1)
${STOPPING_CURVATURE}=$range(0.8,0.9,0.1)
${INTEGRATION_STEP_LENGTH}=$range(0.4,0.4,0.1)
${STEPS}=$range(1,1,1)
${CONFIDENCE}=$range(1,1,1)
${FRACTION}=$range(0.1,0.1,0.1)
${MINIMUM_LENGTH}=$range(40,40,10)
${MAXIMUM_LENGTH}=$range(800,800,100)
${MINIMUM_STEPS}=$range(1,1,1)
${ROI_LABEL}=$range(1,1,1)
${NUMBER_OF_SEED_POINTS_PER_VOXEL}=$range(1,1,1)
${OUTPUT}=$const(/media/sda7/Testing/out-${SEED_POINT}-${TENSOR_TYPE}-${ANISOTROPY}-${ANISOTROPY_THRESHOLD}-${STOPPING_CURVATURE}-${INTEGRATION_STEP_LENGTH}-${STEPS}.vtk)

${SLICER_HOME}/Slicer3 --no_splash --evalpython 
"import Slicer; 
volNode=Slicer.slicer.VolumesGUI.GetLogic().AddArchetypeVolume('/media/sda7/Images/01053-dwi-filt-Ed.nhdr','FirstImage',0); 
roiNode = Slicer.slicer.VolumesGUI.GetLogic().AddArchetypeVolume('/media/sda7/Programming/Slicer/Slicer3-build/Working.nhdr','ROIImage',1); 
p = Slicer.Plugin('DWI Streamline Tractography'); 
nodeType = 'vtkMRMLFiberBundleNode'; 
outputNode = Slicer.slicer.MRMLScene.CreateNodeByClass(nodeType); 
fiberBundleNode=Slicer.slicer.MRMLScene.AddNode(outputNode); 
p.Execute(inputVolume=volNode.GetID(), inputROI=roiNode.GetID(), seedPoint=${SEED_POINT}, 
outputFibers=fiberBundleNode.GetID(), tensorType='${TENSOR_TYPE}', anisotropy='${ANISOTROPY}', 
anisotropyThreshold=${ANISOTROPY_THRESHOLD}, stoppingCurvature=${STOPPING_CURVATURE}, 
integrationStepLength=${INTEGRATION_STEP_LENGTH}, steps=${STEPS}, confidence=${CONFIDENCE}, 
fraction=${FRACTION}, minimumLength=${MINIMUM_LENGTH}, maximumLength=${MAXIMUM_LENGTH}, 
minimumSteps=${MINIMUM_STEPS}, roiLabel=${ROI_LABEL}, 
numberOfSeedpointsPerVoxel=${NUMBER_OF_SEED_POINTS_PER_VOXEL}, fiducial=False); 
Slicer.slicer.ModelsGUI.GetLogic().SaveModel('${OUTPUT}',fiberBundleNode)"
    

This command instructs GWE to execute in parallel Slicer's "DWI Streamline Tractography" module, which is Python-based. Other Python-based modules can follow the same pattern to run as standalone command-line executables on the grid.

Parallel Volume Slices Generator

${SLICER_HOME}=/Users/admin/GSlicer3-3.3-alpha-2009-02-01-darwin-x86-0.7.2.alpha
${volumes_files_dir}=/demos/gwe/data
${volumes_name_regexp}=.*[.](nrrd|nhdr)
${volumes_filenames}=$dir(${volumes_files_dir},${volumes_name_regexp})
${axis}=$const(0,1,2)
${plane}=$range(40,160,2)
${sliceGenerationCommand}=$sliceGen(${SLICER_HOME},${volumes_filenames},${axis},${plane})

${sliceGenerationCommand}

This command instructs GWE to execute in parallel UNU commands against all NRRD images found in a specific directory, generating 61 slices (every other plane from plane 40 to plane 160) along each of the 3 axes. These images are ideal to work with in conjunction with the result browser available in the "GWE Control Panel", as featured in the GWE introductory screencast.
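Assuming the same inclusive $range semantics seen in the other examples, the per-axis and per-volume slice counts follow directly:

```python
# Planes 40 through 160 inclusive, every other plane (step 2) -- an
# assumption based on the inclusive $range behavior seen elsewhere here.
planes = list(range(40, 161, 2))
axes = [0, 1, 2]
slices_per_volume = len(planes) * len(axes)
```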

Parallel Gradient Anisotropic Diffusion

${CONDUCT}=$range(0.5,2.5,0.5)
${TIME}=$range(0.01,0.05,0.01)
${ITER}=$const(1,11,21,31)
${outNRRD}=${RUNTIME_USER_HOME}/grad-test/${SYSTEM_JOB_NUM}.nrrd
${grad}=$gradient(${SLICER_HOME},${CONDUCT},${TIME},${ITER})
${png0}=$pngGen(${SLICER_HOME},${outNRRD},0,100)
${png1}=$pngGen(${SLICER_HOME},${outNRRD},1,100)
${png2}=$pngGen(${SLICER_HOME},${outNRRD},2,100)

${grad} ${RUNTIME_USER_HOME}/grad-test/target.nrrd ${outNRRD} && ${png0} && ${png1} && ${png2}

This command instructs GWE to execute gradient anisotropic diffusion runs in parallel against a target volume and, for each run, take a slice at plane 100 along each of its axes so a representative image can be browsed easily. These representative images are ideal to work with in conjunction with the result browser available in the "GWE Control Panel".
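
The parameter sweep above expands to the cross product of the three swept variables. A quick sketch of the resulting job count, assuming inclusive $range endpoints as in the other examples:

```python
import itertools

conductances = [0.5, 1.0, 1.5, 2.0, 2.5]       # $range(0.5,2.5,0.5), assumed inclusive
time_steps   = [0.01, 0.02, 0.03, 0.04, 0.05]  # $range(0.01,0.05,0.01)
iterations   = [1, 11, 21, 31]                 # $const(1,11,21,31)

# Each combination becomes one parallel gradient anisotropic diffusion run.
runs = list(itertools.product(conductances, time_steps, iterations))
```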