Computer Vision News - August 2021

    # Example model configuration: ground_truth_ids lists the structures to segment,
    # and azure_dataset_id names the AzureML dataset used for training.
    class HeadAndNeck(HeadAndNeckBase):
        def __init__(self) -> None:
            super().__init__(
                ground_truth_ids=["parotid_l", "parotid_r", "smg_l", "smg_r", "spinal_cord"],
                azure_dataset_id="name-of-your-AML-dataset-with-prostate-data")

Now that the model is created (or adapted to run with InnerEye), how does the training work? To train on the Azure platform using multiple machines (nodes), run the following command:

    python InnerEyeLocal/ML/runner.py --azureml --model=Prostate --num_nodes=2

To test an existing model registered in AzureML:

    python InnerEye/ML/runner.py --azureml --model=Prostate --cluster=my_cluster_name \
        --no-train --model_id=Prostate:1

Of course, as a last step, one would need to visualize the results. Assuming a Python file that performs the cross-validation and plots it, run the following command to obtain those results:

    python InnerEye/ML/visualizers/plot_cross_validation.py --run_recovery_id ... --epoch ...

All the results from model inference, along with any plots and validation outputs, are written to the output folder. This folder follows a specific structure, with checkpoints saved, metrics in CSV format, additional build information and much, much more! Feel free to investigate the documentation for details of all the useful files that are created; a small illustrative sketch of inspecting such a metrics file is given at the end of this section.

Deployment

An overview of a deployment method is shown in Fig 2. For deployment of a segmentation model, all the code that was used in training, plus the checkpoint(s) of the model, is packaged up into a folder and then registered in AzureML for later use.
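To make that registration step concrete, here is a minimal sketch using the azureml-core SDK, assuming a workspace config.json is available locally; the folder path, model name and description are placeholder assumptions, and this is not InnerEye's own packaging code.

    # Minimal sketch: register a folder (training code + checkpoints) as an
    # AzureML model. Paths and names below are illustrative assumptions.
    from azureml.core import Workspace
    from azureml.core.model import Model

    ws = Workspace.from_config()  # reads the local config.json for the workspace

    model = Model.register(
        workspace=ws,
        model_path="outputs/final_model",  # folder with code and checkpoint(s)
        model_name="Prostate",             # name under which the model is stored
        description="Segmentation model packaged with its training code")

    print(model.name, model.version)      # each registration bumps the version

Once registered this way, the same model folder can be retrieved later by name and version, which is what makes it available for downstream deployment.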
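And, as mentioned above, a purely illustrative look at the metrics written to the output folder; the file name "outputs/metrics.csv" and its columns are assumptions rather than the exact layout InnerEye produces.

    # Hypothetical example of loading a metrics CSV from the run's output folder.
    import pandas as pd

    metrics = pd.read_csv("outputs/metrics.csv")  # path is an assumption
    print(metrics.head())  # e.g. per-epoch / per-structure values such as Dice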
