### On uncertainty estimation in active learning for image segmentation

This repository provides the implementation for our paper [**On uncertainty estimation in active learning for image segmentation** (Bo Li, Tommy Sonne Alstrøm)](https://arxiv.org/abs/2007.06364). We show experimentally that a region-based active learning strategy reaches higher segmentation accuracy and a better-calibrated model much faster than full-image acquisition:

![performance](DATA/first_figure.jpg)
    
#### Installation and preparation

1. Clone this repo and enter it:
    
   ```bash
   git clone https://lab.compute.dtu.dk/papers/on-uncertainty-estimation-in-active-learning.git region_active_learning
   cd region_active_learning
   chmod +x requirement.sh
   chmod +x produce_figure.sh
   ```
    
    
2. Create a virtual environment with the required packages:
    
   ```bash
   conda env create -f active_learning.yaml
   source activate act
   ```
    
    
3. Prepare the dataset and the pretrained ResNet-50 checkpoint:
    
   ```bash
   ./requirement.sh
   ```
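
   After it finishes, you can optionally sanity-check the result. A minimal sketch, assuming `requirement.sh` downloads into the repo's `DATA` folder (the actual layout is determined by the script):

   ```python
   # Optional check (assumed layout): list what was downloaded into DATA/.
   import os

   for name in sorted(os.listdir("DATA")):
       print(name)
   ```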
    
    
#### Evaluate the model

To evaluate the model at each acquisition step, run
    
```python
python3 -c 'import Test as te;te.running_test_for_single_acquisition_step(model_dir)'
Args:
  model_dir: the directory where the model checkpoint is saved
```
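
For example, to evaluate one saved run you can call the same function directly from Python. The path below is a placeholder for illustration, not a directory created by this repo:

```python
# Hypothetical usage: point model_dir at the folder holding the
# checkpoint from one acquisition step, then run the evaluation.
import Test as te

model_dir = "path/to/acquisition_step_ckpt"  # placeholder path
te.running_test_for_single_acquisition_step(model_dir)
```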
    
#### Train the model

Both entry points below take the same integer `stage` argument; the acquisition scores it selects are sketched after the list.

- For full-image-based active learning, run
    
  ```python
  python3 -c 'import Train_Active_Full_Im as tafi;tafi.running_loop_active_learning_full_image(stage)'
  Args:
    stage: int, 0: random, 1: VarRatio, 2: Entropy, 3: BALD
  ```
    
- For region-based active learning, run

  ```python
  python3 -c 'import Train_Active_Region_Im as tari;tari.running_loop_active_learning_region(stage)'
  Args:
    stage: int, 0: random, 1: VarRatio, 2: Entropy, 3: BALD
  ```
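
As a reference for what the non-random stages compute (the standard definitions of these scores, not necessarily the exact code in this repo), here is a minimal NumPy sketch, assuming MC-dropout softmax samples of shape `(T, N, C)` for `T` stochastic forward passes, `N` pixels or regions, and `C` classes:

```python
import numpy as np

def acquisition_scores(probs: np.ndarray):
    """Standard uncertainty scores behind stages 1-3.

    probs: MC-dropout softmax samples, shape (T, N, C).
    Returns per-pixel/region VarRatio, predictive entropy, and BALD.
    """
    eps = 1e-12
    mean_p = probs.mean(axis=0)                             # (N, C)
    var_ratio = 1.0 - mean_p.max(axis=1)                    # 1: VarRatio
    entropy = -(mean_p * np.log(mean_p + eps)).sum(axis=1)  # 2: Entropy
    expected_entropy = -(probs * np.log(probs + eps)).sum(axis=2).mean(axis=0)
    bald = entropy - expected_entropy                       # 3: BALD (mutual information)
    return var_ratio, entropy, bald
```

Stage 0 simply samples at random; the three scores above rank candidates by different notions of model uncertainty.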
    
    
#### Reproduce figures

The statistics used to reproduce the figures are saved in the *Exp_Stat* folder. To reproduce the figures in the paper, run
    
```bash
./produce_figure.sh
```