
Get Started with CryoSPARC: Introductory Tutorial (v3)

In this tutorial, we will process a small dataset from movies to reconstructed density map. If you are new to data processing in CryoSPARC, we highly recommend following along!


The information in this section applies to CryoSPARC ≤v3.3. For CryoSPARC v4.0+, please see: Get Started with CryoSPARC: Introductory Tutorial (v4.0+)

Cryo-EM Data Processing in CryoSPARC: Introductory Tutorial

We recommend starting off with the T20S tutorial to become familiar with the workflow in CryoSPARC. This dataset is a 20-movie subset of the T20S proteasome dataset (EMPIAR-10025). While not a representative example of the complexity of most cryo-EM projects today, it is a good way to become familiar with the interface and software features and to learn how CryoSPARC organizes jobs and projects.

For a refresher on the interface, projects and jobs, please see the User Interface and Usage Guide.

Overview of processing the T20S dataset from raw movie data to a high-resolution 3D structure.

Introduction: Dashboard, Projects, Workspaces and Jobs

The Dashboard provides at-a-glance information on your Projects, Workspaces and the status of Jobs. It also shows the change log for new versions of CryoSPARC. The header and footer contain links to the Projects view, Workspaces, the Resource Manager and the identity of the current user.

CryoSPARC organizes your workflow by Project, e.g., P1, P2, etc. Projects contain one or more Workspaces, which in turn house Jobs.

Projects are strict divisions. Files and jobs from different projects are stored in dedicated project directories and jobs cannot be connected from one project to another.

Workspaces are soft divisions, and allow for logical separation of jobs and workflows so they can be more easily managed in a large project. Jobs may be connected across workspaces and each job may belong to more than one Workspace.

Step 1: Create a Project

  • Navigate to the Projects view by clicking on the drawer icon in the header, or from the Projects button in the footer:

  • To create a project, press the n key or click the "+ Add" button in the header. A dialog window appears to enter new project details.

  • Enter a project Title and browse for a location for the associated Project directory with the File Browser. The project directory should already exist. CryoSPARC populates it with job directories as you create jobs. All files associated with the project will be stored inside the selected project directory. You may also enter a Description for your project.

  • Click "Create". The new project now appears on the Projects page.

Step 2: Create a Workspace

Use Workspaces to organize or separate portions of the cryo-EM workflow for convenience or experimentation. Create at least one Workspace within a Project before running a Job.

  • Select the project number (e.g., "P67") to open the Project.

  • Alternatively, select the Projects drop-down in the header. This opens a searchable list of all Projects associated with your user account. Select an entry to open the associated project.

  1. Create a New Workspace with the "+ Add" button in the header or press N on your keyboard. Alternatively, select New Workspace from the Project Details panel on the right side of the screen. Set a Title (may be changed later) and optionally a description.

  1. Click "Create". This will create the new Workspace.

Step 3: Download the Tutorial Dataset

  • Log in to the machine where CryoSPARC is installed via command-line.

  • Navigate to or create a directory into which to download the test dataset (approx. 8 GB). This location should have read permissions for the Linux user account running CryoSPARC.

  • Run the command cryosparcm downloadtest while in this directory. This downloads a subset of the T20S dataset.

  • Run tar -xf empiar_10025_subset.tar to decompress the downloaded data.
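Taken together, a terminal session for this step might look like the following. This is only a sketch: the ~/cryosparc_tutorial directory name is an arbitrary example, and it assumes the cryosparcm command is on the PATH of the account you logged in as.

    mkdir -p ~/cryosparc_tutorial && cd ~/cryosparc_tutorial   # any location readable by CryoSPARC works
    cryosparcm downloadtest                  # downloads empiar_10025_subset.tar (approx. 8 GB)
    tar -xf empiar_10025_subset.tar          # unpacks the movies and gain reference
    find . -name "*.tif" | wc -l             # should report 20 movie files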

Step 4: Import Movies

  • In CryoSPARC, navigate to the new Workspace. To do so, navigate to the project to see a list of workspaces, and then into the workspace.

  • Select the Job Builder in the right sidebar. The Job Builder displays all available job types by category (e.g., workflows, imports, motion correction, etc.). A separate tutorial on the Job Builder is also available (see Tutorial: Job Builder).

  • Select the Import Movies job type in the Job Builder. This creates a new job within the current Workspace, displayed as a card. By default, new jobs are set to Building status, indicated on the job card in purple. To change parameters, select the Building job and toggle between active and inactive building states with B on your keyboard, or click the "Building" badge on the job card.

  • Select the Movies data path: Click the file browse icon and select the movie files (.mrc or .tif format). To select multiple files, use a wildcard, e.g., *.mrc. This selects all files that match the wildcard expression. The file browser displays the list of selected files along with the number of matches at the bottom. For this tutorial dataset, navigate to the directory where the test data was downloaded. Use the wildcard expression *.tif to select all TIFF format movies in the folder. There should be 20 imported movies.

  • Select the Gain reference path with the file browser: Select the single .mrc file in the folder where the test data was downloaded.

  • Edit Job parameters from the Builder; enter the following parameters (obtained from the original publication in eLife):

    1. Raw pixel size (Å): 0.6575

    2. Accelerating voltage (kV): 300

    3. Spherical aberration (mm): 2.7

    4. Total exposure dose (e/Å^2): 53

  • Parameters with default values are marked with a blue D (for default); user-specified values are marked with a green S. After changing a parameter, the blue D icon changes to a green S, indicating the parameter differs from its default value.

  • Click Queue to start the import. Use the subsequent dialog to select a lane/node on which to run the job. The available lanes depend on your installation configuration. By default, import and interactive jobs run on the master node as they are not resource intensive. Press the Create button. (If you prefer the command line, see the optional sketch at the end of this step.)

  • The Import Movies job queues and starts running. Look for the Job card in the workspace to monitor its status.

  • To open a Job and view its progress, click the Job number at the top left of the Job card. Alternatively, select the job card and press the spacebar on your keyboard.

This opens the Inspect view, which shows a streaming log of the real-time progress for the Import Job. Scroll through the stream log to view results. Select a checkpoint to find a specific location in the stream log, or click 'Show from top' to return to the beginning. Additional actions and detailed information for the job are available in the details panel. The Output of the import job, i.e., the 20 imported movies, is available on the right hand side of the event log.

  • To exit the job/close the Inspect view, press the spacebar again, or press the × button on the top-right of the dialog.

  • Once finished, the job's status indicator changes to "Completed" in green.
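As an optional alternative to the Queue dialog, jobs that have been built in the interface can also be queued from the command line with cryosparcm cli. The call below is only a sketch: P1, J1 and "default" are placeholders for your own project UID, job UID and lane, and the exact enqueue_job signature can differ between CryoSPARC versions (check the cryosparcm cli reference for your installation).

    cryosparcm cli "enqueue_job(project_uid='P1', job_uid='J1', lane='default')"   # queue built job J1 in project P1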

Step 5: Motion Correction

For our next stage of processing, we will perform Motion Correction on the imported movies. Motion Correction refers to the alignment and averaging of input movies into single-frame micrographs, for use downstream.

  • In the Job Builder, select Patch Motion Correction (multi), which is parallelized over multiple GPUs if you have them available. This creates a new job in building state so that its inputs and parameters are editable in the right side panel.

  • The Patch Motion Correction job requires raw movies as Inputs. Open the previously completed Import Movies job, then drag and drop its movies output onto the Movies placeholder in the Job Builder.

  • Once dropped, the connected output name appears in the Job Builder as an Input:

  • If you have multiple GPUs available, you can speed up the processing time by setting the Number of GPUs to parallelize parameter within the "Compute settings" section to the number of GPUs you would like to assign to that job.

  • Queue the job and select a lane. It is generally not necessary to adjust the Patch Motion Correction job parameters; they are automatically tuned based on the data.

  • Once the job starts to run, the card updates with a preview image.

Step 6: CTF Estimation

Our next step is to perform Contrast Transfer Function (CTF) Estimation. This stage involves the estimation of several CTF parameters in the dataset, including the defocus and astigmatism of each micrograph.

  • Select Patch CTF Estimation (multi) in the Job Builder to create a new job.

  • This job type requires micrographs as the input. Open the previous Patch Motion Correction job, and drag and drop the output (20 micrographs) into the Micrograph placeholder in the Job Builder.

You can connect outputs of jobs that haven't completed into the inputs of a building job. In this case, the newly created job will start to run automatically when all parent jobs have completed. This makes it easy to queue up a series of jobs to run without having to wait until they're completed to queue them manually.

  • As with Patch Motion Correction, the job will complete faster by allocating multiple GPUs. This can be configured with the Number of GPUs to parallelize parameter.

  • Queue the job to start. It is generally not necessary to adjust the Patch CTF Estimation job parameters; they are automatically tuned based on the data.

Step 7: Particle Picking (Blob Picker)

It is important to attain a large number of high-quality particles for an optimal reconstruction. The Blob Picker is a common starting point for particle picking, as it is a quick way to attain an initial set of particle images that can be used to refine picking techniques over time.

Blob picking is a good idea because it verifies data quality and sets expectations for what particle images, projections, and structures should look like. We'll use blob picks to generate a set of templates that can be used as an input to the Template Picker, which will generate a set of much higher-quality picks matching the two primary 2D views of the T20S structure.

  • Select the Blob Picker job, then enter the Job Builder and set Minimum particle diameter (Å) to 100 and Maximum particle diameter (Å) to 200.

  • Locate the exposures output from the previously completed Patch CTF Estimation job. Drag and drop these into the Micrograph placeholder in the Job Builder. Queue the job.

Step 8: Inspect Picks (Blob Picks)

Use the Inspect Picks job to view and interactively adjust the results of blob-based (and template-based) automatic particle picking.

  • Select Inspect Particle Picks from the Job Builder.

  • Drag and drop both the particles and micrographs outputs from the previously completed Blob Picker job. Queue the job.

  • Once the job is ready to interact with, it will be marked as 'Waiting' and an "Interactive" tab will be available in the job details dialog:

  • Browse through and select from the list of micrographs on the left side.

  • The histogram on the left side shows statistics across all pick locations, including false positives and true particles. The y-axis measures the Power Score, and the x-axis measures the Normalized Cross-Correlation (NCC) Score for each particle pick. True particles generally have high NCC scores (indicating agreement in shape with the templates) and a moderate-to-high Power score (indicating the presence of significant signal). Picks that have too little power are false positives containing only ice, while picks with very high power are carbon edges, ice crystals, aggregate particles, etc.

  • Make adjustments to the parameters below if needed. All adjustments are saved automatically.

    • Adjust the lowpass filter slider if needed to better view the picks.

    • Adjust the box size to make it easier to see the location of picks. Often a very small box size (32) can be helpful.

The box size is not used in computing the outputs of this job; Inspect Picks only outputs particle (x, y) location coordinates.

  • Adjust the NCC slider (approx. 0.350).

  • Adjust the Power threshold slider. This helps to remove false positives (approx. between 1075 and 1745).

  • Once satisfied with the picks, select the "Done Picking! Output Locations!" button. This completes the Inspect Particle Picks job and saves the selected particle locations.

Step 9: Extract from Micrographs (Blob Picks)

This job extracts particles from micrographs, and writes them out to disk for downstream jobs to read.

  • Select Extract from Micrographs (parallelized over multiple GPUs, if you have them available).

  • Open the recently-completed Inspect Picks job. Drag and drop both the micrographs and particles outputs into the corresponding inputs in the Job Builder.

We generally recommend selecting a box size that is at least double the diameter of the particle. The box size controls how much of the micrograph is cropped around each particle location. Larger box sizes capture more of the high-resolution signal that is spread out spatially due to the effect of defocus (CTF) in the microscope. However, larger box sizes also significantly increase the computational expense of further processing. To mitigate this, you can use the Downsample Particles job to speed up processing for jobs that do not require the full resolution of the data, such as 2D Classification.

  • In the Job Builder, look under the Particle Extraction section and change the Extraction box size (pix) to 440.
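As a quick sanity check on this value (simple arithmetic, not a CryoSPARC setting): at the 0.6575 Å/pixel raw pixel size used in this tutorial, a 440-pixel box spans roughly 440 × 0.6575 ≈ 289 Å around each pick.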

  • Queue the job.

  • Once the job completes, you'll notice that the number of resulting particles is smaller than the number of input picks (in this walkthrough, 13,436 picks yield 11,781 extracted particles); this is because the extraction process excludes picks that are too close to the edge of the micrograph.

Step 10: 2D Classification (To Generate Templates for the Template Picker)

2D Classification is a commonly used job in cryo-EM processing to get a first look at the data, group particles by 2D view, remove false positive picks, and even get early insights into potential heterogeneity present in the dataset. In this step, we will use 2D Classification to group particles by 2D view, and then use the resulting class averages (also referred to as "templates") to improve our particle picking.

  • Select 2D Classification from the Job Builder.

  • Drag and drop the particles output from the previously completed Extract from Micrographs job into the particles input, then queue the job.

  • In the stream log, preview images of class averages appear after each iteration. Classification into 50 classes (the default number) takes about 15 minutes on a single GPU.

  • The quality of classes depends on the quality of input particles. While we could use these blob-picked particles directly for 3D Reconstruction, we can obtain better results by repeating our particle picking using these templates generated by 2D Classification. Thus, in this tutorial, we will use these classes to inform the Template Picker of the shape of our target structure. The next step in this workflow is the Select 2D Classes job.

Step 11: Select 2D Classes (To Select Templates for the Template Picker)

Select 2D Classes allows us to select a subset of the templates generated by 2D Classification, and to reject the rest.

  • Select Select 2D Classes from the Job Builder.

  • Drag and drop both the particles and class_averages outputs from the most recently completed 2D Classification job.

  • Queue the job. Once the data is loaded, the job status changes to Waiting and Interactive class selection mode is ready.

  • Select a "good" class for each distinct view of the structure. In this case, a top view and side view. Use both the number of particles and the provided class resolution score to identify good classes of particles. The interactive job provides several ways to sort the classes in ascending or descending order based on:

    • # of particles: The total number of particles in each class

    • Resolution: The relative resolution of all particles in the class (Å)

    • ECA: Effective classes assigned

  • Use the sort and selection controls to quickly sort and filter the class selection. Each class has a right-click context menu that allows for selecting a set of classes above or below a particular criterion.

Avoid selecting classes that contain only a partial particle or a non-particle junk image.

  • When finished, select Done at the top right side of the window. The job completes. (In this walkthrough, two classes, one top view and one side view, were selected and the remaining 48 were excluded.)

Step 12: Template Picker

The Template Picker operates similarly to the Blob Picker but accepts an input set of templates, which it uses to more precisely pick particles that match the shape of the target structure.

  • Connect the 'selected classes' output of the Select 2D Classes job into the template input.

  • Connect the exposures output of the Patch CTF Estimation job into the micrographs input.

  • Set the Particle diameter (Å) value to 190.

  • Queue the job. It should take around 15 seconds to process the dataset; in this walkthrough, the Template Picker picked 19,067 particles across the 20 micrographs.

Step 13: Inspect Picks (Template Picks)

As with the Blob Picker, use the Inspect picks job to view and interactively adjust the results of template-based automatic particle picking.

  • Select Inspect Particle Picks from the Job Builder.

  • Drag and drop both the particles and micrographs outputs from the previously completed Template Picker job. Queue the job.

  • Select an NCC score threshold of around 0.350.

  • Select a power score between around 930 and 1990. After applying these thresholds and outputting the locations, 12,014 particles remained in this walkthrough.

Step 14: Extract from Micrographs (Template Picks)

We will now repeat the extraction process given the new set of pick locations generated by the latest Inspect Picks job.

  • Select Extract from Micrographs (parallelized over multiple GPUs, if you have them available).

  • Open the recently-completed Inspect Picks job. Drag and drop both the micrographs and particles outputs into the corresponding inputs in the Job Builder.

  • In the Job Builder, look under the Particle Extraction section and change the Extraction box size (pix) to 440.

  • Queue the job. As before, slightly fewer particles are extracted than were picked (10,947 in this walkthrough), since picks too close to the micrograph edge are excluded.

Step 15: 2D Classification (Template Picks)

  • Select 2D Classification from the Job Builder.

  • Drag and drop the particles output from the previously completed Extract from Micrographs job into the particles input, then queue the job.

  • The particles extracted from the template picks produce much higher quality 2D classes. Proceed to the next step to select the highest quality classes for 3D reconstruction.

Step 16: Select 2D Classes

  • Select Select 2D Classes from the Job Builder.

  • Drag and drop both the particles and class_averages outputs from the most recently completed 2D Classification job.

  • Once queued and running, switch to the Interactive tab and select all of the good quality classes.

    • In this case, we want to keep all true particles in our dataset (rather than just selecting one top and one side view, as previously done in step 11), and reject all false positives.

    • The visual quality of a 2D class, together with its resolution, number of particles, and ECA, can all provide proxy measurements of the quality of the underlying particles that comprise the class.

    • Note that clear "junk" classes, corresponding to non-particle images, blurry images, ice crystals, etc., should be rejected at this stage. For example, the following shows one possible selection of 2D classes to retain:


  • Once you click 'Done', the job will generate outputs for each group of classes and particles: one for the selected set and another for the excluded set.

Step 17: Ab-initio Reconstruction

Now that we have a set of good quality particle picks, we can proceed to 3D Reconstruction.

  • Select Ab-initio Reconstruction from the Job Builder.

  • Drag and drop the particles_selected output from the most recently completed Select 2D Classes job (the classes selected from the result of the template picker) into the Particle stacks input in the Job Builder.

  • Note: You do not need to enforce symmetry during Ab-initio Reconstruction.

  • Queue the job. Results appear in real-time in the stream log as iterations progress. Ab-initio reconstruction should resolve the T20S structure to a coarse resolution.

Step 18: Homogeneous Refinement

Now that we have a coarse-resolution 3D density, we can refine the density to high resolution using the Homogeneous Refinement job.

  • Select Homogeneous Refinement from the Job Builder.

  • Drag and drop both the particles_all_classes and volume_class_0 from the recently completed Ab-initio Reconstruction into the Particle Stacks and Initial Volume inputs, respectively.

  • Set the following parameter:

    • Symmetry: D7

  • Queue the job. Results appear in real time in the stream log. The refinement job performs a rapid gold-standard refinement using the Expectation Maximization and branch-and-bound algorithms. The job displays the current resolution, measured via Fourier Shell Correlation, and other diagnostic information for each iteration.
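For background (standard cryo-EM practice rather than anything specific to this job), the Fourier Shell Correlation compares the two independently refined half-maps F₁ and F₂ within each resolution shell k:

    FSC(k) = Σ F₁(kᵢ)·F₂*(kᵢ) / sqrt( Σ|F₁(kᵢ)|² · Σ|F₂(kᵢ)|² ),   summing over Fourier voxels kᵢ in shell k

The reported "gold-standard" resolution is the shell at which this curve falls below the 0.143 threshold.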

  • Once complete, download the volume and/or mask directly from the Outputs section on the right hand side: Select the drop-down to choose the outputs you wish to download. A refinement job outputs a map_sharp, the final refined volume with automatic B-factor sharpening applied and filtered to the estimated FSC resolution.
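The same volumes are also written to disk inside the refinement job's directory within the project directory chosen in Step 1, so they can be copied without going through the browser. A minimal sketch, assuming a project directory of /data/P1 and a refinement job J16 (both placeholders for your own paths and UIDs):

    ls /data/P1/J16/*.mrc    # the sharpened volume is typically the file with "map_sharp" in its name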

Step 19: Sharpening (Optional)

For optimal results in publications and for model-building, it is often necessary to re-sharpen and adjust the B-factor. This step is optional as Homogeneous Refinement already outputs a sharpened volume (map_sharp) with a B-factor reported within the Guinier plot.

  • Sharpen the result of the refinement with the Sharpening Tools job from the Utilities section in the Job Builder.

  • Drag and drop the volume output from the previous refinement job into the volume input.

  • Set a B-Factor. Get a good starting B-Factor value from the final Guinier plot in the stream log of the refinement job you previously ran. It is recommended to input a B-Factor (as a negative value) within ±20 of the value reported in the plot. In this case, try -56.7 and -96.7 (±20 around the reported value of approximately -76.7).

  • Activate the Generate new FSC mask parameter, which will generate a new mask for the purposes of FSC calculation from the input volume. This is done by thresholding, dilating, and padding the input structure.

  • Queue the job. Once complete, download the sharpened map (map_sharp) from the output.

  • After visually assessing the map, optionally run another sharpening job (clear or clone the existing one) with a different B-Factor.
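After downloading the two sharpened maps, you can compare them in a molecular graphics program; the figures for this tutorial were made in UCSF ChimeraX. A hedged example, assuming the chimerax launcher is on your PATH and using hypothetical filenames for the downloaded maps:

    chimerax sharpened_b57.mrc sharpened_b97.mrc   # open both maps to compare the effect of the two B-factors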

Step 20: Inspect Workflows

Once you have assembled a workflow of connected jobs within a project, you can switch to the Tree View to understand how jobs are connected to obtain the final result. Click the flowchart icon in the top header.

Within the Tree View, you can select jobs and modify/connect them in the same way as previously demonstrated in the Card view. For more information on the Tree View and other useful tips, see the User Interface and Usage Guide.

Conclusion

Now that you have refined the data to a high-resolution structure, you can apply more advanced processing techniques. Explore the Job Builder and other documentation to see the available job types and processing options. Common workflows include:

  • Heterogeneous Refinement or 3D Classification to refine multiple conformations and simultaneously classify particles

  • Sub-classification to identify small, slightly differing populations

  • Heterogeneous Ab-initio Reconstruction to find multiple unexpected conformational states or multiple distinct particles in the data

  • 3D Variability Analysis to explore both discrete and continuous heterogeneity in the dataset

  • Non-Uniform Refinement to improve resolution by accounting for disordered regions and local variations in a structure

  • Multiple rounds of 2D Classification to remove more junk particles

  • Re-picking with multiple higher quality 2D classes

  • Masked/Local Refinements to focus on sub-regions of a structure

  • Global (per-exposure-group) or Local (per-particle) CTF Refinement

For detailed explanations of all available job types and commonly adjusted parameters, see All Job Types in CryoSPARC, the Tutorial Videos, and the Data Processing Tutorials.

Check back to see updates to this guide, as new features and algorithms are in constant development within CryoSPARC.
