Queue Job, Inspect Job and Other Job Actions



Queue Job

Once you have built a job and are ready to run it, click 'Queue' on the Job Builder. This will bring up the Queue Modal.

1. Select the lane where the job should run. By default, new jobs you create will run in the current active workspace. If you wish to run the job and link it into a different workspace, you can also specify this under 'Run Job in'.

2. Optionally, you can set the Job Priority. See: Priority Queuing.

3. Click 'Create' to run the job.
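If you prefer to queue jobs from a shell rather than the Queue Modal, the minimal sketch below has the same effect, assuming the `enqueue_job` function is available in your instance's `cryosparcm cli` (check the cryosparcm cli reference for the exact signature in your version). The project UID, job UID and lane name are placeholders.

```bash
# Queue job J42 of project P3 to the lane named "default".
# 'P3', 'J42' and 'default' are placeholders; substitute your own UIDs and lane.
cryosparcm cli "enqueue_job('P3', 'J42', 'default')"
```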

Queue a chain of jobs that will run automatically

You can queue up a chain of jobs, which will each commence as soon as their respective required inputs (i.e., the outputs from jobs earlier in the series) become available.

For example, while an Import Movies job is running, open the job card, and drag and drop the imported-movies output into a new Patch Motion Correction job in the Job Builder. Queue the Patch Motion Correction job and it will remain in the "Queued - waiting because inputs are not ready" state until the imported movies become available, at which point it will start running automatically.
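The same behaviour applies when queuing from the command line: a job whose inputs are not yet ready simply waits for them. A hedged sketch, using hypothetical job UIDs (J10 for the running Import Movies job, J11 for the dependent Patch Motion Correction job) and the same `enqueue_job` cli function as above:

```bash
# J11 depends on the output of J10, which is still running. Queuing J11 now
# leaves it in the "Queued - waiting because inputs are not ready" state until
# J10's imported movies become available, then it starts automatically.
cryosparcm cli "enqueue_job('P3', 'J11', 'default')"
```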

View Job Status

You can easily determine the status of jobs within the workspace view by looking at the coloured dots on each job card: purple = building, teal = queued, gray = started, blue = running, green = completed, orange = killed and red = failed. Hovering over a job's status circle displays the status as a tooltip.
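You can also query a job's status from a terminal. This is a sketch assuming a `get_job_status` function is exposed by your version's `cryosparcm cli`; the UIDs are placeholders.

```bash
# Prints the job's status string, e.g. "building", "queued", "running",
# "completed", "killed" or "failed".
cryosparcm cli "get_job_status('P3', 'J42')"
```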

Inspect Job

Once the job starts to run, you can view its progress and intermediate outputs at any time from the inspect view.

Open inspect view

To open the inspect view, click on the job number (at the top right of the job card), or select the job and press SPACEBAR.

The job inspect view has several tabs, including an 'Overview' (the stream log), 'Inputs and Parameters', 'Outputs' and 'Metadata'. To navigate through the stream log without having to scroll, select a checkpoint from the top, or use the filters. Clicking 'Follow latest' causes the stream log to tail the end of the job's output in real time as it runs. You can also filter the log by images, plots, etc.

Press SPACEBAR again, or click the 'x' at the top right of the inspect view, to close it.
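For a terminal-based view of a job's output, `cryosparcm joblog` tails the raw job log, roughly the command-line counterpart of 'Follow latest'. The project and job UIDs below are placeholders.

```bash
# Stream the raw log of job J42 in project P3 to the terminal as it runs.
cryosparcm joblog P3 J42
```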

View parameters that were modified

To quickly view which parameters you adjusted for a given job, open the inspect view and click on the 'Inputs and Parameters' tab. Parameters which were unchanged from their default values display with a blue D. Parameters you specified are outlined in green and display with a green S.

Interact with outputs and result groups

Each output in cryoSPARC jobs contains "low level" result groups that may be useful for advanced processing strategies. For a detailed overview of outputs and passthrough results, please see: Inputs and Outputs in cryoSPARC.

Other Job Actions

Link Job

To avoid redundant processing, it's possible to link a job into multiple workspaces and continue processing in a new or existing workspace. Select a job and click 'Link Job' in the Job Details panel, then select another Workspace in the same Project where you would like to link the job.

Any changes you make to the job in one workspace will be reflected in the other, as both workspaces will be displaying the exact same job. If a linked job is deleted from any workspace, all other instances of it in other workspaces will also be deleted. You can also unlink a job from a workspace if needed.

Move Job

When re-organizing your workspaces, you may find it necessary to move a job from one workspace to another. To do so, select the job and navigate to the 'Details' tab. Here, you can click the Move Job dropdown to select which workspace you would like to move the job to. The job will disappear from its original workspace and move to the new one.

Kill Job

You can kill a job while it is running; a pop-up message will ask you to confirm. Killed jobs display as orange and remain part of the workspace unless you delete them. Killed jobs will not have their intermediate results expunged until they are cleared.
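A job can also be killed from the command line, for example via the `kill_job` cli function, assuming it is available on your instance (placeholder UIDs):

```bash
# Kill running job J42 in project P3 (same effect as the Kill action in the UI).
cryosparcm cli "kill_job('P3', 'J42')"
```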

Clear Job

When a job has been killed, has failed, or has completed, it can be cleared, meaning that its results and outputs are erased from the file system, but its connections to other jobs and its parameters are retained. This makes it easy to correct a mistakenly set parameter or to re-run a failed job. Clearing also removes queued jobs from the queue and resets jobs to Building status.
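The equivalent command-line sketch, assuming the `clear_job` cli function exists in your version (placeholder UIDs):

```bash
# Clear job J42: erase its results from disk, keep its connections and
# parameters, and reset it to Building status.
cryosparcm cli "clear_job('P3', 'J42')"
```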

Clear Intermediate Results

Unused intermediate results can be cleared from iterative jobs to save space. This action can be executed at the job level (by clicking the "Clear Intermediate Results" button on the Job Details panel) or at the project level for every job (by clicking the "Clear Intermediate Results" button on the Project Details panel). This function removes all unused outputs created by iterative jobs that save raw data at every iteration. Final results for every result slot are retained, whether or not they have been used elsewhere.
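Intermediate results can also be cleared for a whole project from a terminal. This sketch assumes your version exposes a `clear_intermediate_results` cli function (check the cryosparcm cli reference); the project UID is a placeholder.

```bash
# Remove unused intermediate results from every job in project P3.
cryosparcm cli "clear_intermediate_results('P3')"
```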

Export Job

Any individual job can be exported for sharing, manipulation, or archiving. Jobs must be exported manually in order to create a "consolidated" exported-job directory which is then importable. For more details, please see: Exporting a job.

Clone Job

Cloning is particularly useful when you wish to process subsequent jobs in a new workspace for better organization of your experiment, or when you want to quickly replicate a job (to try different parameter settings) without dragging and dropping inputs into the Job Builder. Select the job you wish to clone and, from the 'Details' tab, click 'Clone job'. By default, the job is cloned into the current workspace. Click 'Queue' to launch it; a modal will appear asking whether you wish to run the job in the current workspace or in a new workspace, and you can select as appropriate. Note that you can always edit the parameters of a newly cloned job before queuing, and you can change its inputs by removing connected inputs and dragging in new ones.
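Cloning can likewise be scripted; a sketch assuming a `clone_job` cli function is available in your version (placeholder UIDs):

```bash
# Clone job J42 in project P3; the clone is created in Building status and can
# be edited and queued like any other job.
cryosparcm cli "clone_job('P3', 'J42')"
```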

Mark Job as Complete

This option allows you to mark failed or killed jobs in cryoSPARC as completed, in order to allow their latest outputs to be used for further processing. For more details, please see: Data Management in cryoSPARC.

Delete Job

With a job selected, you can delete it from the 'Details' tab. A pop-up message will ask you to confirm the deletion. Once deleted, the job disappears from the workspace (except in the tree view, if it has children that need to be displayed). Note that job numbers within a workspace are unique, so a new job in the same workspace will not be assigned the same number as a previously run or previously deleted job. Deleted jobs are cleared first, meaning their intermediate and final results are erased from disk.
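Deletion is also available from the command line, for example via the `delete_job` cli function (a sketch with placeholder UIDs):

```bash
# Delete job J42 from project P3. The job is cleared first, so its results are
# removed from disk before the job record is deleted.
cryosparcm cli "delete_job('P3', 'J42')"
```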

See also: Management and Monitoring, if you wish to create and launch jobs through the command line interface.