From a5029783fbd18960c3f2916be11a88058d790124 Mon Sep 17 00:00:00 2001
From: majorli
Date: Fri, 2 Jun 2023 16:13:34 +0800
Subject: [PATCH 1/2] add Cityscapes dataset template for README.md

Signed-off-by: majorli
---
 docs/DATASET.md | 34 ++++++++++++++++++++++++++++++++++
 1 file changed, 34 insertions(+)

diff --git a/docs/DATASET.md b/docs/DATASET.md
index c71dba726..f4b231c63 100644
--- a/docs/DATASET.md
+++ b/docs/DATASET.md
@@ -67,3 +67,37 @@ coco2017
 ├── val2017.txt
 └── ...
 ```
+
+## 3. Cityscapes
+
+### 3.1 Introduction
+
+Cityscapes is a large-scale dataset for urban scene understanding and autonomous driving. It contains 5000 images with fine pixel-level annotations, of which 2975 are used for training.
+The dataset focuses on street scenes and provides finely annotated object segmentation for categories such as persons, animals, vehicles and road facilities.
+
+### 3.2 Preparation
+
+Visit the [Cityscapes official website](https://www.cityscapes-dataset.com/), then choose 'Download' to download the Cityscapes dataset.
+
+Replace `/path/to/cityscapes` with your local Cityscapes path in the later training process. The unzipped dataset directory structure should look like:
+
+```bash
+cityscapes/
+├── gtFine
+│   ├── test
+│   ├── train
+│   │   ├── aachen
+│   │   └── bochum
+│   └── val
+│       ├── frankfurt
+│       ├── lindau
+│       └── munster
+└── leftImg8bit
+    ├── train
+    │   ├── aachen
+    │   └── bochum
+    └── val
+        ├── frankfurt
+        ├── lindau
+        └── munster
+```
--
Gitee

From 073bf83923864aa4a5a9f1ce3d081db19fe5c09f Mon Sep 17 00:00:00 2001
From: majorli
Date: Fri, 2 Jun 2023 17:11:49 +0800
Subject: [PATCH 2/2] apply cityscapes dataset template to models

Signed-off-by: majorli
---
 .../apcnet/pytorch/README.md             | 27 +++++++++-
 .../bisenetv2/paddlepaddle/README.md     | 37 +++++++++++---
 .../deeplabv3/paddlepaddle/README.md     | 35 ++++++++++---
 .../deeplabv3plus/paddlepaddle/README.md | 35 ++++++++++---
 .../dnlnet/paddlepaddle/README.md        | 35 +++++++++++--
 .../gcnet/pytorch/README.md              | 28 +++++++++--
 .../ocrnet/pytorch/README.md             | 49 +++++++++++--------
 .../unet/paddlepaddle/README.md          | 39 ++++++++++++---
 8 files changed, 231 insertions(+), 54 deletions(-)

diff --git a/cv/semantic_segmentation/apcnet/pytorch/README.md b/cv/semantic_segmentation/apcnet/pytorch/README.md
index 81cb2a946..cfd2bafe1 100644
--- a/cv/semantic_segmentation/apcnet/pytorch/README.md
+++ b/cv/semantic_segmentation/apcnet/pytorch/README.md
@@ -23,11 +23,34 @@ python3 setup.py build && cp build/lib.linux*/mmcv/_ext.cpython* mmcv
 
 ## Step 2: Prepare Datasets
 
-Download cityscapes from file server or official website [Cityscapes](https://www.cityscapes-dataset.com)
+Visit the [Cityscapes official website](https://www.cityscapes-dataset.com/), then choose 'Download' to download the Cityscapes dataset.
+
+Replace `/path/to/cityscapes` with your local Cityscapes path in the later training process. The unzipped dataset directory structure should look like:
+
+```bash
+cityscapes/
+├── gtFine
+│   ├── test
+│   ├── train
+│   │   ├── aachen
+│   │   └── bochum
+│   └── val
+│       ├── frankfurt
+│       ├── lindau
+│       └── munster
+└── leftImg8bit
+    ├── train
+    │   ├── aachen
+    │   └── bochum
+    └── val
+        ├── frankfurt
+        ├── lindau
+        └── munster
+```
 
 ```shell
 mkdir -p data/
-ln -s ${CITYSCAPES_DATASET_PATH} data/cityscapes
+ln -s /path/to/cityscapes data/cityscapes
 ```
 
 ## Step 3: Training
diff --git a/cv/semantic_segmentation/bisenetv2/paddlepaddle/README.md b/cv/semantic_segmentation/bisenetv2/paddlepaddle/README.md
index 60cf6a180..1406f1444 100644
--- a/cv/semantic_segmentation/bisenetv2/paddlepaddle/README.md
+++ b/cv/semantic_segmentation/bisenetv2/paddlepaddle/README.md
@@ -17,17 +17,41 @@ pip3 install -r requirements.txt
 
 ## Step 2: Download data
 
-Download the [CityScapes Dataset](https://www.cityscapes-dataset.com/)
+Visit the [Cityscapes official website](https://www.cityscapes-dataset.com/), then choose 'Download' to download the Cityscapes dataset.
+
+Replace `/path/to/cityscapes` with your local Cityscapes path in the later training process. The unzipped dataset directory structure should look like:
+
+```bash
+cityscapes/
+├── gtFine
+│   ├── test
+│   ├── train
+│   │   ├── aachen
+│   │   └── bochum
+│   └── val
+│       ├── frankfurt
+│       ├── lindau
+│       └── munster
+└── leftImg8bit
+    ├── train
+    │   ├── aachen
+    │   └── bochum
+    └── val
+        ├── frankfurt
+        ├── lindau
+        └── munster
+```
 
 ```bash
 # Datasets preprocessing
 pip3 install cityscapesscripts
 
-python3 tools/convert_cityscapes.py --cityscapes_path /home/datasets/cityscapes/ --num_workers 8
+python3 tools/convert_cityscapes.py --cityscapes_path /path/to/cityscapes --num_workers 8
+
+python3 tools/create_dataset_list.py /path/to/cityscapes --type cityscapes --separator ","
 
-python3 tools/create_dataset_list.py /home/datasets/cityscapes --type cityscapes --separator ","
 # CityScapes PATH as follow:
-ls -al /home/datasets/cityscapes/
+ls -al /path/to/cityscapes
 total 11567948
 drwxr-xr-x 4 root root 227 Jul 18 03:32 .
 drwxr-xr-x 6 root root 179 Jul 18 06:48 ..
@@ -45,11 +69,12 @@ drwxr-xr-x 5 root root 58 Jul 18 03:30 leftImg8bit
 
 ## Step 3: Run BiSeNetV2
 
 ```bash
-# Make sure your dataset path is the same as above
-data_dir=${data_dir:-/home/datasets/cityscapes/}
+# Change '/path/to/cityscapes' to your local Cityscapes dataset path
+data_dir=/path/to/cityscapes
 sed -i "s#: data/cityscapes#: ${data_dir}#g" configs/_base_/cityscapes.yml
 export FLAGS_cudnn_exhaustive_search=True
 export FLAGS_cudnn_batchnorm_spatial_persistent=True
+
 # One GPU
 export CUDA_VISIBLE_DEVICES=0
 python3 train.py --config configs/bisenet/bisenet_cityscapes_1024x1024_160k.yml --do_eval --use_vdl --save_interval 500 --save_dir output
diff --git a/cv/semantic_segmentation/deeplabv3/paddlepaddle/README.md b/cv/semantic_segmentation/deeplabv3/paddlepaddle/README.md
index c8043bbd8..cd77ec722 100644
--- a/cv/semantic_segmentation/deeplabv3/paddlepaddle/README.md
+++ b/cv/semantic_segmentation/deeplabv3/paddlepaddle/README.md
@@ -15,17 +15,40 @@ pip3 install -r requirements.txt
 
 ## Step 2: Download data
 
-Download the [CityScapes Dataset](https://www.cityscapes-dataset.com/)
+Visit the [Cityscapes official website](https://www.cityscapes-dataset.com/), then choose 'Download' to download the Cityscapes dataset.
+
+Replace `/path/to/cityscapes` with your local Cityscapes path in the later training process. The unzipped dataset directory structure should look like:
+
+```bash
+cityscapes/
+├── gtFine
+│   ├── test
+│   ├── train
+│   │   ├── aachen
+│   │   └── bochum
+│   └── val
+│       ├── frankfurt
+│       ├── lindau
+│       └── munster
+└── leftImg8bit
+    ├── train
+    │   ├── aachen
+    │   └── bochum
+    └── val
+        ├── frankfurt
+        ├── lindau
+        └── munster
+```
 
 ```bash
 # Datasets preprocessing
 pip3 install cityscapesscripts
 
-python3 tools/convert_cityscapes.py --cityscapes_path /home/datasets/cityscapes/ --num_workers 8
+python3 tools/convert_cityscapes.py --cityscapes_path /path/to/cityscapes --num_workers 8
 
-python3 tools/create_dataset_list.py /home/datasets/cityscapes --type cityscapes --separator ","
+python3 tools/create_dataset_list.py /path/to/cityscapes --type cityscapes --separator ","
 
 # CityScapes PATH as follow:
-ls -al /home/datasets/cityscapes/
+ls -al /path/to/cityscapes
 total 11567948
 drwxr-xr-x 4 root root 227 Jul 18 03:32 .
 drwxr-xr-x 6 root root 179 Jul 18 06:48 ..
@@ -43,8 +66,8 @@ drwxr-xr-x 5 root root 58 Jul 18 03:30 leftImg8bit
 
 ## Step 3: Run DeepLab
 
 ```bash
-# Make sure your dataset path is the same as above
-data_dir=${data_dir:-/home/datasets/cityscapes/}
+# Change '/path/to/cityscapes' to your local Cityscapes dataset path
+data_dir=/path/to/cityscapes
 sed -i "s#: data/cityscapes#: ${data_dir}#g" configs/_base_/cityscapes.yml
 export FLAGS_cudnn_exhaustive_search=True
 export FLAGS_cudnn_batchnorm_spatial_persistent=True
diff --git a/cv/semantic_segmentation/deeplabv3plus/paddlepaddle/README.md b/cv/semantic_segmentation/deeplabv3plus/paddlepaddle/README.md
index a00aa6c52..a575696cf 100644
--- a/cv/semantic_segmentation/deeplabv3plus/paddlepaddle/README.md
+++ b/cv/semantic_segmentation/deeplabv3plus/paddlepaddle/README.md
@@ -15,17 +15,40 @@ pip3 install -r requirements.txt
 
 ## Step 2: Download data
 
-Download the [CityScapes Dataset](https://www.cityscapes-dataset.com/)
+Visit the [Cityscapes official website](https://www.cityscapes-dataset.com/), then choose 'Download' to download the Cityscapes dataset.
+
+Replace `/path/to/cityscapes` with your local Cityscapes path in the later training process. The unzipped dataset directory structure should look like:
+
+```bash
+cityscapes/
+├── gtFine
+│   ├── test
+│   ├── train
+│   │   ├── aachen
+│   │   └── bochum
+│   └── val
+│       ├── frankfurt
+│       ├── lindau
+│       └── munster
+└── leftImg8bit
+    ├── train
+    │   ├── aachen
+    │   └── bochum
+    └── val
+        ├── frankfurt
+        ├── lindau
+        └── munster
+```
 
 ```bash
 # Datasets preprocessing
 pip3 install cityscapesscripts
 
-python3 tools/convert_cityscapes.py --cityscapes_path /home/datasets/cityscapes/ --num_workers 8
+python3 tools/convert_cityscapes.py --cityscapes_path /path/to/cityscapes --num_workers 8
 
-python3 tools/create_dataset_list.py /home/datasets/cityscapes --type cityscapes --separator ","
+python3 tools/create_dataset_list.py /path/to/cityscapes --type cityscapes --separator ","
 
 # CityScapes PATH as follow:
-ls -al /home/datasets/cityscapes/
+ls -al /path/to/cityscapes
 total 11567948
 drwxr-xr-x 4 root root 227 Jul 18 03:32 .
 drwxr-xr-x 6 root root 179 Jul 18 06:48 ..
@@ -43,8 +66,8 @@ drwxr-xr-x 5 root root 58 Jul 18 03:30 leftImg8bit
 
 ## Step 3: Run DeepLabV3+
 
 ```bash
-# Make sure your dataset path is the same as above
-data_dir=${data_dir:-/home/datasets/cityscapes/}
+# Change '/path/to/cityscapes' to your local Cityscapes dataset path
+data_dir=/path/to/cityscapes
 sed -i "s#: data/cityscapes#: ${data_dir}#g" configs/_base_/cityscapes.yml
 export FLAGS_cudnn_exhaustive_search=True
 export FLAGS_cudnn_batchnorm_spatial_persistent=True
diff --git a/cv/semantic_segmentation/dnlnet/paddlepaddle/README.md b/cv/semantic_segmentation/dnlnet/paddlepaddle/README.md
index c23f2e0ae..b68eb2e67 100644
--- a/cv/semantic_segmentation/dnlnet/paddlepaddle/README.md
+++ b/cv/semantic_segmentation/dnlnet/paddlepaddle/README.md
@@ -15,19 +15,44 @@ pip3 install -r requirements.txt
 ```
 
 ## Step 2: Prepare Datasets
-Download [CityScapes](https://www.cityscapes-dataset.com/), the path as /home/datasets/cityscapes/.
-Datasets preprocessing:
+
+Visit the [Cityscapes official website](https://www.cityscapes-dataset.com/), then choose 'Download' to download the Cityscapes dataset.
+
+Replace `/path/to/cityscapes` with your local Cityscapes path in the later training process. The unzipped dataset directory structure should look like:
+
+```bash
+cityscapes/
+├── gtFine
+│   ├── test
+│   ├── train
+│   │   ├── aachen
+│   │   └── bochum
+│   └── val
+│       ├── frankfurt
+│       ├── lindau
+│       └── munster
+└── leftImg8bit
+    ├── train
+    │   ├── aachen
+    │   └── bochum
+    └── val
+        ├── frankfurt
+        ├── lindau
+        └── munster
 ```
+
+Datasets preprocessing:
+```bash
 pip3 install cityscapesscripts
 
-python3 tools/convert_cityscapes.py --cityscapes_path /home/datasets/cityscapes/ --num_workers 8
+python3 tools/convert_cityscapes.py --cityscapes_path /path/to/cityscapes --num_workers 8
 
-python3 tools/create_dataset_list.py /home/datasets/cityscapes --type cityscapes --separator ","
+python3 tools/create_dataset_list.py /path/to/cityscapes --type cityscapes --separator ","
 ```
 
 then the cityscapes path as follows:
 ```
-root@5574247e63f8:~# ls -al /home/datasets/cityscapes/
+root@5574247e63f8:~# ls -al /path/to/cityscapes
 total 11567948
 drwxr-xr-x 4 root root 227 Jul 18 03:32 .
 drwxr-xr-x 6 root root 179 Jul 18 06:48 ..
diff --git a/cv/semantic_segmentation/gcnet/pytorch/README.md b/cv/semantic_segmentation/gcnet/pytorch/README.md
index 74f7ede19..079b01aea 100755
--- a/cv/semantic_segmentation/gcnet/pytorch/README.md
+++ b/cv/semantic_segmentation/gcnet/pytorch/README.md
@@ -9,12 +9,34 @@ Global context blocks are applied to multiple layers in a backbone network to co
 ## Step 1: Installing
 ### Datasets
-- download cityscape from file server or official urls
-[Cityscapes](https://www.cityscapes-dataset.com/login/)
+Visit the [Cityscapes official website](https://www.cityscapes-dataset.com/), then choose 'Download' to download the Cityscapes dataset.
+
+Replace `/path/to/cityscapes` with your local Cityscapes path in the later training process. The unzipped dataset directory structure should look like:
+
+```bash
+cityscapes/
+├── gtFine
+│   ├── test
+│   ├── train
+│   │   ├── aachen
+│   │   └── bochum
+│   └── val
+│       ├── frankfurt
+│       ├── lindau
+│       └── munster
+└── leftImg8bit
+    ├── train
+    │   ├── aachen
+    │   └── bochum
+    └── val
+        ├── frankfurt
+        ├── lindau
+        └── munster
+```
 ```bash
 mkdir data/
-ln -s ${CITYSCAPES_DATASET_PATH} data/cityscapes
+ln -s /path/to/cityscapes data/cityscapes
 ```
 - convert_datasets
 ```bash
diff --git a/cv/semantic_segmentation/ocrnet/pytorch/README.md b/cv/semantic_segmentation/ocrnet/pytorch/README.md
index b9e7d2860..ad6cdaca1 100644
--- a/cv/semantic_segmentation/ocrnet/pytorch/README.md
+++ b/cv/semantic_segmentation/ocrnet/pytorch/README.md
@@ -11,26 +11,35 @@ Last, the representation similarity we compute the relation between each pixel a
 ## Step 1: Installing
 ### Datasets
-- download cityscape from official urls
-[Cityscapes](https://www.cityscapes-dataset.com/)
-
-
-- when done data folder looks like
-````bash
-data/
-├── cityscapes
-    ├── gtFine
-    │   ├── test
-    │   ├── train
-    │   └── val
-    └── leftImg8bit
-    │ ├── test
-    │   ├── train
-    │ └── val
-    ├── test.lst
-    ├── trainval.lst
-   └── val.lst
-````
+Visit the [Cityscapes official website](https://www.cityscapes-dataset.com/), then choose 'Download' to download the Cityscapes dataset.
+
+Replace `/path/to/cityscapes` with your local Cityscapes path in the later training process. The unzipped dataset directory structure should look like:
+
+```bash
+cityscapes/
+├── gtFine
+│   ├── test
+│   ├── train
+│   │   ├── aachen
+│   │   └── bochum
+│   └── val
+│       ├── frankfurt
+│       ├── lindau
+│       └── munster
+└── leftImg8bit
+    ├── train
+    │   ├── aachen
+    │   └── bochum
+    └── val
+        ├── frankfurt
+        ├── lindau
+        └── munster
+```
+
+```bash
+mkdir data/
+ln -s /path/to/cityscapes data/cityscapes
+```
 ### Environment
 ```bash
diff --git a/cv/semantic_segmentation/unet/paddlepaddle/README.md b/cv/semantic_segmentation/unet/paddlepaddle/README.md
index cbfe7bce7..474455e2f 100644
--- a/cv/semantic_segmentation/unet/paddlepaddle/README.md
+++ b/cv/semantic_segmentation/unet/paddlepaddle/README.md
@@ -14,19 +14,46 @@ pip3 install -r requirements.txt
 ```
 
 ## Step 2: Prepare Datasets
-Download [CityScapes](https://www.cityscapes-dataset.com/), the path as /home/datasets/cityscapes/.
-Datasets preprocessing:
+
+Visit the [Cityscapes official website](https://www.cityscapes-dataset.com/), then choose 'Download' to download the Cityscapes dataset.
+
+Replace `/path/to/cityscapes` with your local Cityscapes path in the later training process. The unzipped dataset directory structure should look like:
+
+```bash
+cityscapes/
+├── gtFine
+│   ├── test
+│   ├── train
+│   │   ├── aachen
+│   │   └── bochum
+│   └── val
+│       ├── frankfurt
+│       ├── lindau
+│       └── munster
+└── leftImg8bit
+    ├── train
+    │   ├── aachen
+    │   └── bochum
+    └── val
+        ├── frankfurt
+        ├── lindau
+        └── munster
 ```
+
+Datasets preprocessing:
+
+```bash
 pip3 install cityscapesscripts
 
-python3 tools/convert_cityscapes.py --cityscapes_path /home/datasets/cityscapes/ --num_workers 8
+python3 tools/convert_cityscapes.py --cityscapes_path /path/to/cityscapes --num_workers 8
 
-python3 tools/create_dataset_list.py /home/datasets/cityscapes --type cityscapes --separator ","
+python3 tools/create_dataset_list.py /path/to/cityscapes --type cityscapes --separator ","
 ```
 
 then the cityscapes path as follows:
-```
-root@5574247e63f8:~# ls -al /home/datasets/cityscapes/
+
+```bash
+root@5574247e63f8:~# ls -al /path/to/cityscapes
 total 11567948
 drwxr-xr-x 4 root root 227 Jul 18 03:32 .
 drwxr-xr-x 6 root root 179 Jul 18 06:48 ..
--
Gitee