Submitting Workflows to the Local Job Scheduler for Windows
The Local Job Scheduler is a Windows service that can run some Insight workflows on the local machine. It is integrated into the DUG Insight installer, so ensure that the Local Job Scheduler was selected during Insight installation. For further information, refer to Downloading and Installing on Windows.
Submitting workflows to the Local Job Scheduler currently has the following limitations:
- Task splitting is not supported.
- The Dataload and Create Survey processes are supported only for 3D volumes.
- Some processes, such as SEG-Y Output, are not supported.
- Multiple workflows can be queued, but only one workflow is executed at a time (see the sketch after this list).
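The queueing behaviour can be pictured as a one-worker FIFO queue. The following is a minimal Python sketch of that model only; it is not Insight's actual implementation, and all names in it are illustrative:

```python
import queue
import threading
import time

# Minimal sketch of the scheduler's queueing behaviour: many workflows
# may be queued, but a single worker executes them one at a time, in
# submission order. Names are illustrative, not Insight's API.
job_queue = queue.Queue()

def worker():
    while True:
        workflow = job_queue.get()
        print(f"Running {workflow} ...")
        time.sleep(1)  # stand-in for the actual workflow execution
        print(f"Finished {workflow}")
        job_queue.task_done()

threading.Thread(target=worker, daemon=True).start()

for name in ("denoise", "stack", "migration-qc"):
    job_queue.put(name)  # queued immediately; executed strictly one at a time

job_queue.join()  # wait until all queued workflows have completed
```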
Submitting Workflows
- Add the desired processes and configure them accordingly. Refer to Using Workflows for detailed steps.
- In the control panel Process list, click the workflow(s) you would like to submit to the Local Job Scheduler.
- Input Workflow: Optionally, use the output of another workflow as input to make the current workflow a Dependent Workflow. Dependent workflows are built and submitted by their preceding workflow.
- Input Volume: The input volume for the first process in the workflow, which will flow through to the subsequent processes in the workflow.
- Match Trace By: Optionally, configure how processes that combine traces from different volumes choose the matching traces. Indexed Order is the usual method; for binned data or other special circumstances, use the Headers option to match on standard or custom headers (see the sketch after this list).
- Variables: Displays the workflow's input variables. New variables can be created in the Variable section of the workflow process, and an input can be linked to a variable by clicking the icon next to the selection box. Currently, only volume inputs are supported. Changing a variable's value automatically changes the value of all linked inputs.
- Product Links: Lists data that is automatically defined from the configuration of the workflow and its processes. Click an item to view it.
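To make the Match Trace By options concrete, here is a hedged Python sketch of the two strategies; the trace records and header names are invented for illustration and do not reflect Insight's internal data structures:

```python
# Two traces per volume; volume_b stores its traces in a different order.
volume_a = [{"inline": 100, "xline": 20, "samples": [0.1, 0.2]},
            {"inline": 100, "xline": 21, "samples": [0.3, 0.4]}]
volume_b = [{"inline": 100, "xline": 21, "samples": [0.5, 0.6]},
            {"inline": 100, "xline": 20, "samples": [0.7, 0.8]}]

# Indexed Order: pair the i-th trace of each volume. Correct only when
# both volumes store traces in the same order (not the case here).
by_index = list(zip(volume_a, volume_b))

# Headers: pair traces whose header values agree, regardless of storage
# order; useful for binned data or other special cases.
def key(trace, headers=("inline", "xline")):
    return tuple(trace[h] for h in headers)

b_lookup = {key(t): t for t in volume_b}
by_header = [(t, b_lookup[key(t)]) for t in volume_a if key(t) in b_lookup]
```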
Job Configuration and Creation
- Iteration: This controls whether the scripts and output data are written to seiTimeProc/test or seiTimeProc/prod.
- Subdirectory: Select a subdirectory to contain the workflow within the selected iteration directory. This allows grouping of workflows according to the following structure: .../seiTimeProc/{iteration}/{subdirectory}/workflow (see the sketch after this list).
- Inline: Define the inline extent of the workflow. This must be a subset of the input volume's inline extent.
- Crossline: Define the crossline extent of the workflow. This must be a subset of the input volume's crossline extent.
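As a rough illustration of how Iteration, Subdirectory, and the extent settings combine, the Python sketch below composes the job directory and checks the extents; all values and the helper function are hypothetical:

```python
from pathlib import Path

# Hypothetical values for illustration only.
iteration = "test"            # written under seiTimeProc/test rather than seiTimeProc/prod
subdirectory = "denoise-qc"   # optional grouping directory
workflow = "workflow"

# Resulting layout: .../seiTimeProc/{iteration}/{subdirectory}/workflow
job_dir = Path("seiTimeProc") / iteration / subdirectory / workflow
print(job_dir)  # e.g. seiTimeProc/test/denoise-qc/workflow (separators vary by OS)

# The job's inline/crossline extents must be a subset of the
# input volume's extents.
def is_subset(requested, available):
    return available[0] <= requested[0] and requested[1] <= available[1]

volume_inlines, volume_xlines = (1000, 2000), (500, 900)
job_inlines, job_xlines = (1200, 1800), (500, 900)

assert is_subset(job_inlines, volume_inlines)
assert is_subset(job_xlines, volume_xlines)
```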
Advanced
Compression:
- Input Data and Output Default: By default, the output data is compressed in the same way as the input data. Raw field data emerges from data loading as 32-bit, and you should switch to 16-bit as soon as the data has been de-spiked. Compression happens at the end of the workflow, so the workflow that performs the de-spiking can already output 16-bit data. Different compression levels can be selected for different output volumes by selecting Allow different compression for some outputs (see the size sketch after this list).
- Output Volumes: For workflows with multiple output volumes (e.g. with QC Outputs or Export Volume processes), you can turn additional volumes on or off for writing to disk and, if required, select different compression levels.
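The disk-space effect of choosing 16-bit over 32-bit samples is easy to estimate. A back-of-the-envelope Python sketch, with invented volume dimensions:

```python
# Back-of-the-envelope disk usage for a 3D volume at different sample
# sizes. The dimensions are invented for illustration.
inlines, crosslines, samples_per_trace = 1000, 1400, 1500
n_samples = inlines * crosslines * samples_per_trace

size_32bit_gb = n_samples * 4 / 1e9  # 4 bytes per 32-bit sample
size_16bit_gb = n_samples * 2 / 1e9  # 2 bytes per 16-bit sample

print(f"32-bit: {size_32bit_gb:.1f} GB, 16-bit: {size_16bit_gb:.1f} GB")
# Halving the sample size halves the output volume on disk, which is why
# switching to 16-bit after de-spiking is recommended.
```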
- Click Build Workflow.
- Click Submit All. The job is now added to the Local Job Scheduler queue and will be executed when it reaches the front of the queue. For more information, refer to Managing Submitted Workflows.