Tianchi Medical AI Competition [Season 1]: Intelligent Diagnosis of Pulmonary Nodules, with UNet/VGG/Inception/ResNet/DenseNet
- config.py # good practice to centralize hyperparameters
- preprocess.py # Step 1, preprocess, store numpy/meta 'cache' at ./preprocess/
- train_segmentation.py # Step 2, segmentation with UNet Model
- model_UNet.py # UNet model definition
- train_classificaion.py # Step 3, classification with VGG/Inception/ResNet/DenseNet
- model_VGG.py # VGG model definition
- model_Inception.py # Inception model definition
- model_ResNet.py # ResNet model definition
- model_DenseNet.py # DenseNet model definition
- generators.py # generator for segmentation & classification models
- visual_utils.py # 3D visual tools
- dataset/ # dataset location, configured in config.py
- preprocess/ # 'cache' of preprocessed numpy/meta data, configured in config.py
- train_ipynbs # training process notebooks
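A centralized config.py like the one listed above might look like the following sketch. Every name and value here is an illustrative assumption (except the 3e-5/1e-4 learning rates and RESNET_BLOCKS, which this README mentions), not the repository's actual settings:

```python
# config.py -- hypothetical sketch of centralized hyperparameters.
# Names and values are illustrative assumptions unless noted.

DATASET_DIR = "./dataset/"        # raw CT scans; change to your local path
PREPROCESS_DIR = "./preprocess/"  # numpy/meta cache written by preprocess.py
TRAIN_LOG_DIR = "./train_logs/"   # per-run training logs

# Segmentation (UNet)
UNET_LEARNING_RATE = 3e-5         # value this README reports works well
UNET_INPUT_SHAPE = (512, 512, 1)  # assumed slice size

# Classification (VGG/Inception/ResNet/DenseNet)
CLF_LEARNING_RATE = 1e-4          # value this README reports works well
RESNET_BLOCKS = 3                 # tunable block count (assumed default)
BATCH_SIZE = 8                    # assumed
```

Keeping these in one module means preprocess.py and both training scripts can share a single import instead of duplicating constants.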
- SimpleITK is used to read the CT files, process them, and store the results in the cache as numpy arrays.
- The scikit-image lib is used for the lung segmentation; many parameter settings were tried to find the best cutting.
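The repository's actual preprocessing code is not shown here, but a typical step in such a pipeline is clipping the CT volume to a Hounsfield-unit lung window and scaling it to [0, 1]. A NumPy sketch (the window thresholds are common lung-window values, assumed rather than taken from this repo's preprocess.py):

```python
import numpy as np

def normalize_hu(volume, hu_min=-1000.0, hu_max=400.0):
    """Clip a CT volume to a lung HU window and scale to [0, 1].

    The volume would typically come from SimpleITK, e.g.
    sitk.GetArrayFromImage(sitk.ReadImage(path)); hu_min/hu_max
    are assumed defaults, not this repository's values.
    """
    volume = np.clip(volume.astype(np.float32), hu_min, hu_max)
    return (volume - hu_min) / (hu_max - hu_min)
```

The normalized arrays can then be written to the ./preprocess/ cache with np.save so later steps never re-read the raw CTs.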
See the /train_ipynbs/preprocess.ipynb file for exploratory plots: the distribution of how much of each whole CT the lung part takes up, and the tumor size distribution.
dice_coef_loss is used as the loss function. The training plots show that hyperparameter tuning really matters.
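The Dice coefficient loss named above can be sketched in NumPy as follows; the repository's Keras version would operate on backend tensors instead, and the smoothing constant is an assumed common default:

```python
import numpy as np

def dice_coef(y_true, y_pred, smooth=1.0):
    # Dice = 2*|A ∩ B| / (|A| + |B|); `smooth` avoids division by
    # zero on empty masks (assumed default, a common choice).
    y_true = y_true.ravel()
    y_pred = y_pred.ravel()
    intersection = np.sum(y_true * y_pred)
    return (2.0 * intersection + smooth) / (np.sum(y_true) + np.sum(y_pred) + smooth)

def dice_coef_loss(y_true, y_pred):
    # Minimizing 1 - Dice maximizes overlap between the predicted
    # and ground-truth nodule masks.
    return 1.0 - dice_coef(y_true, y_pred)
```

Dice loss is a common choice for segmentation targets like lung nodules, where the foreground is tiny and plain cross-entropy would be dominated by the background.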
The ResNet implementation uses the bottleneck block instead of the basic_block. A bottleneck residual block consists of a 1x1 convolution that reduces the channel count, a 3x3 convolution at the reduced width, and a 1x1 convolution that restores the channels, with a shortcut connection around them; the number of blocks is exposed as RESNET_BLOCKS in the config to tune.

The DenseNet implementation draws heavily on the original paper: https://arxiv.org/abs/1608.06993
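Why prefer the bottleneck block over the basic_block? Counting convolution weights makes the trade-off concrete. A small sketch (biases and batch norm omitted; the 4x channel reduction is the conventional ResNet choice, assumed here):

```python
def conv_params(k, c_in, c_out):
    # Weight count of a k x k convolution (biases omitted).
    return k * k * c_in * c_out

def basic_block_params(channels):
    # basic_block: two 3x3 convolutions at full width.
    return 2 * conv_params(3, channels, channels)

def bottleneck_params(channels, reduction=4):
    # bottleneck: 1x1 reduce -> 3x3 at reduced width -> 1x1 expand.
    mid = channels // reduction
    return (conv_params(1, channels, mid)
            + conv_params(3, mid, mid)
            + conv_params(1, mid, channels))
```

At 256 channels the basic block needs 1,179,648 weights while the bottleneck needs 69,632, roughly 17x fewer, which is what makes deeper networks affordable at the same width.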
A learning rate of 3e-5 works well for the UNet, and 1e-4 works well for the classification models. Training logs are written to /train_logs/<model-name>-run-<hour>-<minute>.
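The per-run log directory described above can be generated from the current time. A stdlib sketch, assuming the <model-name>-run-<hour>-<minute> pattern is produced exactly as written (the helper name is hypothetical):

```python
import os
from datetime import datetime

def run_log_dir(model_name, base="./train_logs"):
    # Builds e.g. ./train_logs/UNet-run-14-05 from the current
    # hour and minute, matching the pattern this README describes.
    now = datetime.now()
    return os.path.join(base, f"{model_name}-run-{now:%H-%M}")
```

One directory per run keeps the logs of repeated experiments (e.g. learning-rate sweeps) from overwriting each other.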