
Jobs

The fundamental unit of data processing within CryoSPARC is a job. Jobs are given inputs and parameters, do work, and through that work create outputs. The outputs of one job can be connected to the inputs of another job, allowing you to build branching pipelines through which you can explore and refine your data.
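
The same create → connect → queue flow can also be driven programmatically with the cryosparc-tools Python library (see the CryoSPARC Tools section). The sketch below is illustrative only: the credentials, project/workspace/job UIDs, job type, and parameter are placeholders drawn from the cryosparc-tools documentation, not values from your instance.

```python
from cryosparc.tools import CryoSPARC

# Connect to a running CryoSPARC instance (placeholder credentials).
cs = CryoSPARC(
    license="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
    host="localhost",
    base_port=39000,
    email="user@example.com",
    password="password",
)

project = cs.find_project("P3")  # placeholder project UID

# Create a job whose inputs are connected to outputs of existing jobs,
# then queue it — the programmatic equivalent of building and queuing
# a job in the interface.
job = project.create_job(
    "W1",                         # workspace UID (placeholder)
    "extract_micrographs_multi",  # job type, as in the cryosparc-tools docs
    connections={
        "micrographs": ("J2", "micrographs"),
        "particles": ("J3", "particles"),
    },
    params={"box_size_pix": 448},
)
job.queue("default")              # lane name (placeholder)
```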

Job Card

The most common representation of a job in CryoSPARC is a “Job Card”. The card is composed of a number of sections that organize job information in a consistent way, with flexibility to accommodate the specific differences between job types.

[Image: A 2D Classification job in the default image view]

Header

The header shows the job’s ID and a status indicator with the colour of the job’s current status (e.g. green for completed, red for failed, etc.). The whole left-hand side of the header is a button that opens the job dialog when clicked.

The right-hand side of the header shows all of the available job actions, including buttons to toggle the outputs view, open the quick actions menu, and star the job for future reference.

Central Area

Underneath the header, you can give your job a unique title if you wish.

The description is an optional text field that can be added and edited in the sidebar (pencil icon). This field supports markdown and will turn into a scrollable sub-section on the card if a non-trivial amount of information is added.

The corners of the job card are populated with at-a-glance information about the outputs, such as the number of particles or classes. This lets you get a quick overview of your job without needing to open it.

Outputs View

The outputs view can be accessed either by clicking the outputs view toggle (cube with arrow icon) in the card header, or by pressing the O key to switch to the view on all cards at once. This view shows a list of all of the outputs created by the job. These outputs can be used in two ways:

  • Outputs can be dragged and dropped into the inputs of a building job, visible in the sidebar when the job is selected. Further information is available in the “Creating and Running Jobs” guide page.

  • Outputs can alternatively be clicked, which adds them to the “Job Cart”, a system that allows you to filter subsequent jobs based on whether their inputs match the selected outputs. This system is explored further in the “Creating and Running Jobs” section of the guide.

[Image: A 2D Classification job in the “outputs” view. Outputs are selectable for use in the “Job Builder” and “Job Cart”.]
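
Outputs can also be inspected programmatically. The sketch below uses cryosparc-tools to load one named output of a completed job as a dataset; the credentials, UIDs, and output name are placeholders rather than values from a real instance.

```python
from cryosparc.tools import CryoSPARC

cs = CryoSPARC(license="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
               host="localhost", base_port=39000,
               email="user@example.com", password="password")

# Placeholder project and job UIDs; "particles_selected" is assumed to be
# the name of an output (e.g. from a Select 2D Classes job).
job = cs.find_project("P3").find_job("J42")
particles = job.load_output("particles_selected")

print(len(particles))           # number of particles in the output
print(particles.fields()[:5])   # first few result fields in the dataset
```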

Footer

The job’s footer contains a variety of at-a-glance information about the processing that has occurred within the job. Each icon can be hovered to reveal a tooltip with additional relevant information.

  • The timer shows how long your job has been running or how long it ran for. It can be hovered to show who created the job and when they created it, along with the time it concluded processing if it is completed, killed, or failed.

  • The lane icon shows the name of the lane to which the job has been assigned. Hovering will show the target name and type.

  • The GPU icon shows the number of GPUs used by the job, and if running, will show the names of the GPUs in use. Hovering will show the number of GPUs used and available on the target.

  • The gear icon shows the number of custom parameters set on the job; hovering shows those parameters and their values.

  • Tag buttons shown beside the tag icon represent the tags that have been applied to the job. Clicking on these tag buttons will set the given tag as a filter in the current view.
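
Much of this at-a-glance information is stored on the job’s database document, which can be read with cryosparc-tools. The sketch below is assumption-laden: the credentials and UIDs are placeholders, and the document field names may vary between CryoSPARC versions.

```python
from cryosparc.tools import CryoSPARC

cs = CryoSPARC(license="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
               host="localhost", base_port=39000,
               email="user@example.com", password="password")
job = cs.find_project("P3").find_job("J15")  # placeholder UIDs

doc = job.doc  # the job's database document (a dictionary)

# Field names below are assumptions about the job document schema and may
# differ between CryoSPARC versions; .get() avoids KeyErrors if they do.
print(doc.get("job_type"), doc.get("status"), doc.get("created_at"))
print(doc.get("params_spec"))  # custom (non-default) parameters set on the job
```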

Job Actions

There are a variety of actions available for interacting with jobs. These range from directly assigning jobs the computational resources that they need to run, to organizing jobs between different workspaces, and even building templates from pre-existing jobs to expedite your work.

Job actions can be accessed in two places: the “Quick Actions Menu”, and the sidebar “Actions Menu”.

The “Quick Actions Menu”

You can access the quick actions menu by clicking the trigger button in a card header (represented by an icon with three horizontal dots) or by right-clicking on the card.

The sidebar “Actions Menu”

This menu can be accessed by clicking the “Actions” button at the bottom of the job sidebar while the job you wish to act on is selected. This menu contains all core and compound job actions, but does not include quick actions, selection actions, or navigation actions.

Core Actions

  • Queue Job: Queuing a job will assign the necessary computational resources to it so that it can begin processing your data.

  • Link Job: Linking a job allows you to connect a job in one workspace to another workspace within the same project. The job will exist in both workspaces as though it were created in them.

  • Move Job: Moving a job will connect it to another workspace, and remove it from the current one.

  • Clone Job: Cloning a job will create an identical copy of it, including all of its custom parameters and inputs.

  • Kill Job: Killing a job will interrupt its processing and completely stop it from running. The job will release its computational resources and tokens.

  • Clear Job: Clearing a job resets it to building status so that its parameters and inputs can be edited again. This is necessary to return the job to an operable state after it has failed or has been killed.

  • Mark Job as Complete: Marking a job as complete will change its status to complete and allow you to access its outputs as though the job had completed normally. This can be useful if a job fails or is killed but the generation of its outputs had reached a point sufficient for further data processing.

  • Unlink Job from Workspace: Unlinking a job will remove it from the current workspace. It will still be fully accessible in all other workspaces it had been linked to or from.

  • Delete Job: Deleting a job removes it permanently from all views and renders it inoperable for further data processing. Deleted jobs can still be accessed through a filter for accounting purposes, and continue to exist in the database in a limited form.
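
Several of these core actions have programmatic counterparts in cryosparc-tools. The sketch below (placeholder credentials, UIDs, and lane name) queues a job and waits for it to finish, with the kill and clear equivalents shown as comments.

```python
from cryosparc.tools import CryoSPARC

cs = CryoSPARC(license="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
               host="localhost", base_port=39000,
               email="user@example.com", password="password")
job = cs.find_project("P3").find_job("J15")  # placeholder UIDs

job.queue("default")  # Queue Job: assign resources on a lane named "default"
job.wait_for_done()   # block until the job completes, fails, or is killed

# Kill Job and Clear Job also have direct equivalents (commented out here,
# since they would interrupt or reset the job queued above):
# job.kill()   # interrupt a running job and release its resources
# job.clear()  # reset the job to building status so it can be edited and re-run
```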

Multi Actions

These actions can only be performed with multiple jobs selected.

  • Clone {x} Jobs

  • Clone Job Chain (only available for valid chains of jobs): This action allows you to clone all jobs between, and including, two initial job selections. One job represents the start of the chain, and the other represents the end of the chain.

  • Delete {x} Jobs

Quick Create Actions

Quick create actions allow for the creation of predefined template jobs that are automatically connected to the outputs of the current job. They are touched on in more detail in the “Creating and Running Jobs” section of the documentation.

Secondary Actions

  • Clear Intermediate Results: This will clear all unused outputs created by the job, in order to save storage space used by the raw data.

  • Export Job: This will export all essential files (images, streamlog events, etc.) into a folder inside its parent project’s directory. This folder can be used to import the job into another project or even another instance.

  • Create Workflow: Workflows are multi-job templates that allow you to quickly build pre-populated sets of jobs to create automated pipelines. They can be explored in more detail in the “Workflows” section of the guide.

  • Create Blueprint: Blueprints are per-job templates that can be used to quickly create a job and populate its parameters, or populate/overwrite the parameters of a pre-existing job. They are covered in more depth in the “Blueprints” section of the guide.

  • Edit Tags: Allows you to add or remove pre-configured tags from the job. Tags and tagging are expanded upon in the “Tags” section of the guide.

Selection Actions

  • Select Job Chain: Selects all jobs between two initial job selections; one represents the start of the chain and the other represents the end of the chain.

  • Select Ancestor Jobs: Selects the chain of jobs that lead up to and connect to the current job.

  • Select Descendant Jobs: Selects the chain of jobs that connect to and branch out from the current job.
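
Conceptually, these selections are traversals of the job connection graph. The sketch below is purely illustrative and uses a hypothetical parent map rather than any CryoSPARC API; it shows how the set of ancestor jobs can be collected by walking input connections backwards.

```python
from collections import deque

# Hypothetical parent map (not a CryoSPARC API): job UID -> UIDs of the
# jobs it takes its inputs from.
parents = {
    "J5": ["J3", "J4"],
    "J4": ["J2"],
    "J3": ["J2"],
    "J2": ["J1"],
    "J1": [],
}

def select_ancestors(job_uid: str) -> set:
    """Collect every job that leads up to and connects into job_uid."""
    seen = set()
    queue = deque(parents.get(job_uid, []))
    while queue:
        uid = queue.popleft()
        if uid not in seen:
            seen.add(uid)
            queue.extend(parents.get(uid, []))
    return seen

print(sorted(select_ancestors("J5")))  # ['J1', 'J2', 'J3', 'J4']
```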

Navigation Actions

  • View workspaces in {project}: This action will navigate you to the workspaces page corresponding to the project that the job is inside.

  • View jobs in {project} {workspace}: This action will navigate you to the jobs page within the workspace that the job is assigned to.
