The Topaz Denoise job can be used to remove noise from micrographs using the pretrained model provided by Topaz. The job can also train a new model, which can then be used for denoising. This job has the following inputs and outputs.

In the Topaz Denoise job, users have the option of using the provided pretrained model or training a model for immediate and future use. The user must therefore select which model to use from three general categories:

- Using the pretrained model
- Training a new model
- Using a model previously trained by the user
The Topaz Denoise job requires the micrographs that will be denoised to be connected to the micrographs input slot regardless of model specification. The job inputs and build parameters required to select each category are summarized in the table below and detailed further beneath it.
To use the pretrained model, the denoise_model and training_micrographs input slots must be empty.
To train a new model, micrographs must be connected to the training_micrographs input slot, and the denoise_model input slot must be empty.
When a new model is trained, the job will produce a topaz_denoise_model output, allowing the trained model to be used in other Topaz Denoise jobs. How to use this output is described in the Using model previously trained by user section below.
To use a model previously trained by the user, connect the topaz_denoise_model output from the Topaz Denoise job that trained the model into the denoise_model input slot. The training_micrographs input slot must remain empty.
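The three input-slot combinations above can be summarized programmatically. The following is a minimal sketch (not part of CryoSPARC or Topaz; the function name and argument handling are hypothetical, though the slot names mirror the job's inputs) that maps a set of connected inputs to its model category:

```python
# Sketch: map connected input slots of the Topaz Denoise job to the
# three model categories. Illustrative only -- not CryoSPARC/Topaz code.

def model_category(micrographs, denoise_model=None, training_micrographs=None):
    # The micrographs input slot is always required.
    if micrographs is None:
        raise ValueError("the micrographs input slot is always required")
    # Pretrained model: both optional slots must be empty.
    if denoise_model is None and training_micrographs is None:
        return "pretrained model"
    # Training a new model: training_micrographs connected, denoise_model empty.
    if training_micrographs is not None and denoise_model is None:
        return "train a new model"
    # Previously trained model: denoise_model connected, training_micrographs empty.
    if denoise_model is not None and training_micrographs is None:
        return "previously trained model"
    raise ValueError("denoise_model and training_micrographs cannot both be connected")
```

For example, connecting only micrographs selects the pretrained model, while also connecting a topaz_denoise_model output to denoise_model selects the previously trained model.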
Once the Topaz Denoise job is complete, it will output micrograph comparisons; the number shown depends on the Number of plots to show build parameter. Each comparison features two micrographs on the same row: the micrograph on the left is the original prior to denoising, and the micrograph on the right is its denoised version. This side-by-side comparison informs the user of the effect of the denoising.
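To illustrate what such a comparison conveys, here is a toy sketch in which a simple 3×3 mean filter stands in for the denoiser (this is not Topaz's U-net model; the synthetic "micrograph" and filter are purely illustrative):

```python
import numpy as np

# Toy before/after denoising comparison. A 3x3 mean filter stands in
# for the denoiser; real Topaz denoising uses a trained U-net.
rng = np.random.default_rng(0)
clean = np.zeros((64, 64))
clean[24:40, 24:40] = 1.0                      # a bright "particle"
noisy = clean + rng.normal(0.0, 0.5, clean.shape)

# 3x3 mean filter built from shifted copies (no SciPy dependency).
denoised = sum(
    np.roll(np.roll(noisy, dy, axis=0), dx, axis=1)
    for dy in (-1, 0, 1) for dx in (-1, 0, 1)
) / 9.0

# The denoised image is closer to the clean signal than the noisy one,
# which is what the side-by-side plots let the user judge visually.
err_noisy = np.mean((noisy - clean) ** 2)
err_denoised = np.mean((denoised - clean) ** 2)
```

Plotting `noisy` and `denoised` next to each other reproduces, in miniature, the left/right layout of the job's comparison plots.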
If the Topaz Denoise job is used to train a model, a plot of the training and validation losses will also be shown. Both losses should decrease over time. If the training loss is decreasing while the validation loss is increasing, the model has overfit and the training parameters must be tuned. The simplest approach to resolving overfitting is to reduce the learning rate.
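The overfitting signature described above (training loss falling while validation loss rises) can be checked mechanically. The sketch below is illustrative, assuming loss histories recorded once per epoch; the function and window size are hypothetical, not CryoSPARC output:

```python
# Sketch: detect the overfitting signature -- training loss decreasing
# while validation loss increases over the most recent epochs.

def is_overfitting(train_losses, val_losses, window=3):
    """Return True if, over the last `window` epochs, the training loss
    went down while the validation loss went up."""
    if len(train_losses) < window + 1 or len(val_losses) < window + 1:
        return False  # not enough history to judge a trend
    train_trend = train_losses[-1] - train_losses[-1 - window]
    val_trend = val_losses[-1] - val_losses[-1 - window]
    return train_trend < 0 and val_trend > 0
```

When this condition is detected, a sensible first response (as noted above) is to lower the learning rate and retrain.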