fix links in doc (#759)
* fix doc

* fix link

* fix doc

* fix link

* fix doc

* fix mkdoc

* remove acl

* remove acl in code

* remove acl in code

* fix mkdoc

* remove gpu description

* fix link

* revert example change

* fix
alien-0119 authored Nov 4, 2024
1 parent d20a39e commit dda7b75
Showing 56 changed files with 192 additions and 388 deletions.
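Many of the link fixes below swap a heading anchor (e.g. `#1-mindocr-models` becomes `#2-mindspore-lite-mindir-convert`). As a rough sketch of where those fragments come from, GitHub derives an anchor from a heading approximately as follows; this is an approximation of GitHub's behavior, not its exact algorithm, and `github_anchor` is a hypothetical helper, not part of this repository:

```python
import re

def github_anchor(heading: str) -> str:
    """Approximate GitHub's heading-to-anchor rule:
    lowercase, drop punctuation, turn spaces into hyphens.
    CJK characters count as word characters and are kept."""
    text = heading.strip().lower()
    text = re.sub(r"[^\w\- ]", "", text)  # keep word chars, hyphens, spaces
    return text.replace(" ", "-")

print(github_anchor("1. Model Export"))                   # 1-model-export
print(github_anchor("2. MindSpore Lite MindIR Convert"))  # 2-mindspore-lite-mindir-convert
```

Under this rule, a renumbered or retitled heading silently changes its anchor, which is why the stale `#1-mindocr-models` fragments below had to be rewritten.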
2 changes: 1 addition & 1 deletion README.md
@@ -295,7 +295,7 @@ You can do MindSpore Lite inference in MindOCR using **MindOCR models** or **Thi
 
 For the detailed performance of the trained models, please refer to [https://github.com/mindspore-lab/mindocr/blob/main/configs](./configs).
 
-For details of MindSpore Lite and ACL inference models support, please refer to [MindOCR Models Support List](docs/en/inference/mindocr_models_list.md) and [Third-party Models Support List](docs/en/inference/thirdparty_models_list.md) (PaddleOCR etc.).
+For details of MindSpore Lite inference models support, please refer to [MindOCR Models Support List](docs/en/inference/mindocr_models_list.md) and [Third-party Models Support List](docs/en/inference/thirdparty_models_list.md) (PaddleOCR etc.).
 
 ## Dataset List
2 changes: 1 addition & 1 deletion configs/cls/mobilenetv3/README.md
@@ -132,7 +132,7 @@ python tools/train.py -c configs/cls/mobilenetv3/cls_mv3.yaml
 Please set `distribute` in yaml config file to be `True`.
 
 ```shell
-# n is the number of GPUs/NPUs
+# n is the number of NPUs
 mpirun --allow-run-as-root -n 4 python tools/train.py -c configs/cls/mobilenetv3/cls_mv3.yaml
 ```
2 changes: 1 addition & 1 deletion configs/cls/mobilenetv3/README_CN.md
@@ -134,7 +134,7 @@ python tools/train.py -c configs/cls/mobilenetv3/cls_mv3.yaml
 请确保yaml文件中的`distribute`参数为`True`。
 
 ```shell
-# n is the number of GPUs/NPUs
+# n is the number of NPUs
 mpirun --allow-run-as-root -n 4 python tools/train.py -c configs/cls/mobilenetv3/cls_mv3.yaml
 yaml
 ```
8 changes: 4 additions & 4 deletions configs/det/dbnet/README.md
@@ -397,7 +397,7 @@ python tools/train.py -c=configs/det/dbnet/db_r50_icdar15.yaml
 Please set `distribute` in yaml config file to be True.
 
 ```shell
-# n is the number of GPUs/NPUs
+# n is the number of NPUs
 mpirun --allow-run-as-root -n 2 python tools/train.py --config configs/det/dbnet/db_r50_icdar15.yaml
 ```
 
@@ -418,7 +418,7 @@ Please refer to the tutorial [MindOCR Inference](../../../docs/en/inference/infe
 
 - Model Export
 
-Please [download](#2-results) the exported MindIR file first, or refer to the [Model Export](../../README.md) tutorial and use the following command to export the trained ckpt model to MindIR file:
+Please [download](#3-results) the exported MindIR file first, or refer to the [Model Export](../../../docs/en/inference/convert_tutorial.md#1-model-export) tutorial and use the following command to export the trained ckpt model to MindIR file:
 
 ```shell
 python tools/export.py --model_name_or_config dbnet_resnet50 --data_shape 736 1280 --local_ckpt_path /path/to/local_ckpt.ckpt
@@ -430,11 +430,11 @@ The `data_shape` is the model input shape of height and width for MindIR file. T
 
 - Environment Installation
 
-Please refer to [Environment Installation](../../../docs/en/inference/environment.md#2-mindspore-lite-inference) tutorial to configure the MindSpore Lite inference environment.
+Please refer to [Environment Installation](../../../docs/en/inference/environment.md) tutorial to configure the MindSpore Lite inference environment.
 
 - Model Conversion
 
-Please refer to [Model Conversion](../../../docs/en/inference/convert_tutorial.md#1-mindocr-models),
+Please refer to [Model Conversion](../../../docs/en/inference/convert_tutorial.md#2-mindspore-lite-mindir-convert),
 and use the `converter_lite` tool for offline conversion of the MindIR file.
 
 - Inference
10 changes: 5 additions & 5 deletions configs/det/dbnet/README_CN.md
@@ -375,7 +375,7 @@ python tools/train.py --config configs/det/dbnet/db_r50_icdar15.yaml
 请确保yaml文件中的`distribute`参数为True。
 
 ```shell
-# n is the number of GPUs/NPUs
+# n is the number of NPUs
 mpirun --allow-run-as-root -n 2 python tools/train.py --config configs/det/dbnet/db_r50_icdar15.yaml
 ```
 
@@ -391,11 +391,11 @@ python tools/eval.py --config configs/det/dbnet/db_r50_icdar15.yaml
 
 ## 5. MindSpore Lite 推理
 
-请参考[MindOCR 推理](../../../docs/cn/inference/inference_tutorial.md)教程,基于MindSpore Lite在Ascend 310上进行模型的推理,包括以下步骤:
+请参考[MindOCR 推理](../../../docs/zh/inference/inference_tutorial.md)教程,基于MindSpore Lite在Ascend 310上进行模型的推理,包括以下步骤:
 
 - 模型导出
 
-请先[下载](#2-实验结果)已导出的MindIR文件,或者参考[模型导出](../../README.md)教程,使用以下命令将训练完成的ckpt导出为MindIR文件:
+请先[下载](#3-实验结果)已导出的MindIR文件,或者参考[模型导出](../../../docs/zh/inference/convert_tutorial.md#1-模型导出)教程,使用以下命令将训练完成的ckpt导出为MindIR文件:
 
 ```shell
 python tools/export.py --model_name_or_config dbnet_resnet50 --data_shape 736 1280 --local_ckpt_path /path/to/local_ckpt.ckpt
@@ -407,11 +407,11 @@ python tools/export.py --model_name_or_config configs/det/dbnet/db_r50_icdar15.y
 
 - 环境搭建
 
-请参考[环境安装](../../../docs/cn/inference/environment.md#2-mindspore-lite推理)教程,配置MindSpore Lite推理运行环境。
+请参考[环境安装](../../../docs/zh/inference/environment.md)教程,配置MindSpore Lite推理运行环境。
 
 - 模型转换
 
-请参考[模型转换](../../../docs/cn/inference/convert_tutorial.md#1-mindocr模型)教程,使用`converter_lite`工具对MindIR模型进行离线转换。
+请参考[模型转换](../../../docs/zh/inference/convert_tutorial.md#2-mindspore-lite-mindir-转换)教程,使用`converter_lite`工具对MindIR模型进行离线转换。
 
 - 执行推理
 
6 changes: 3 additions & 3 deletions configs/det/dbnet/README_CN_PP-OCRv3.md
@@ -326,10 +326,10 @@ model:
 
 * 分布式训练
 
-在大量数据的情况下,建议用户使用分布式训练。对于在多个昇腾910设备或着GPU卡的分布式训练,请将配置参数`system.distribute`修改为True, 例如:
+在大量数据的情况下,建议用户使用分布式训练。对于在多个昇腾910设备的分布式训练,请将配置参数`system.distribute`修改为True, 例如:
 
 ```shell
-# 在多个 GPU/Ascend 设备上进行分布式训练
+# 在多个 Ascend 设备上进行分布式训练
 mpirun --allow-run-as-root -n 4 python tools/train.py --config configs/det/dbnet/db_mobilenetv3_ppocrv3.yaml
 ```
 
@@ -338,7 +338,7 @@ mpirun --allow-run-as-root -n 4 python tools/train.py --config configs/det/dbnet
 如果要在没有分布式训练的情况下在较小的数据集上训练模型,请将配置参数`distribute`修改为False 并运行:
 
 ```shell
-# CPU/GPU/Ascend 设备上的单卡训练
+# CPU/Ascend 设备上的单卡训练
 python tools/train.py --config configs/det/dbnet/db_mobilenetv3_ppocrv3.yaml
 ```
 
8 changes: 4 additions & 4 deletions configs/det/east/README.md
@@ -138,7 +138,7 @@ python tools/train.py --config configs/det/east/east_r50_icdar15.yaml
 Please set `distribute` in yaml config file to be True.
 
 ```shell
-# n is the number of GPUs/NPUs
+# n is the number of NPUs
 mpirun --allow-run-as-root -n 8 python tools/train.py --config configs/det/east/east_r50_icdar15.yaml
 ```
 
@@ -158,7 +158,7 @@ Please refer to the tutorial [MindOCR Inference](../../../docs/en/inference/infe
 
 - Model Export
 
-Please [download](#2-results) the exported MindIR file first, or refer to the [Model Export](../../README.md) tutorial and use the following command to export the trained ckpt model to MindIR file:
+Please [download](#2-results) the exported MindIR file first, or refer to the [Model Export](../../../docs/en/inference/convert_tutorial.md#1-model-export) tutorial and use the following command to export the trained ckpt model to MindIR file:
 
 ``` shell
 python tools/export.py --model_name_or_config east_resnet50 --data_shape 720 1280 --local_ckpt_path /path/to/local_ckpt.ckpt
@@ -170,11 +170,11 @@ The `data_shape` is the model input shape of height and width for MindIR file. T
 
 - Environment Installation
 
-Please refer to [Environment Installation](../../../docs/en/inference/environment.md#2-mindspore-lite-inference) tutorial to configure the MindSpore Lite inference environment.
+Please refer to [Environment Installation](../../../docs/en/inference/environment.md) tutorial to configure the MindSpore Lite inference environment.
 
 - Model Conversion
 
-Please refer to [Model Conversion](../../../docs/en/inference/convert_tutorial.md#1-mindocr-models),
+Please refer to [Model Conversion](../../../docs/en/inference/convert_tutorial.md#2-mindspore-lite-mindir-convert),
 and use the `converter_lite` tool for offline conversion of the MindIR file.
 
 - Inference
10 changes: 5 additions & 5 deletions configs/det/east/README_CN.md
@@ -133,7 +133,7 @@ python tools/train.py --config configs/det/east/east_r50_icdar15.yaml
 请确保yaml文件中的`distribute`参数为True。
 
 ```shell
-# n is the number of GPUs/NPUs
+# n is the number of NPUs
 mpirun --allow-run-as-root -n 8 python tools/train.py --config configs/det/east/east_r50_icdar15.yaml
 ```
 
@@ -149,11 +149,11 @@ python tools/eval.py --config configs/det/east/east_r50_icdar15.yaml
 
 ### 3.6 MindSpore Lite 推理
 
-请参考[MindOCR 推理](../../../docs/cn/inference/inference_tutorial.md)教程,基于MindSpore Lite在Ascend 310上进行模型的推理,包括以下步骤:
+请参考[MindOCR 推理](../../../docs/zh/inference/inference_tutorial.md)教程,基于MindSpore Lite在Ascend 310上进行模型的推理,包括以下步骤:
 
 - 模型导出
 
-请先[下载](#2-实验结果)已导出的MindIR文件,或者参考[模型导出](../../README.md)教程,使用以下命令将训练完成的ckpt导出为MindIR文件:
+请先[下载](#2-实验结果)已导出的MindIR文件,或者参考[模型导出](../../../docs/zh/inference/convert_tutorial.md#1-模型导出)教程,使用以下命令将训练完成的ckpt导出为MindIR文件:
 
 ``` shell
 python tools/export.py --model_name_or_config east_resnet50 --data_shape 720 1280 --local_ckpt_path /path/to/local_ckpt.ckpt
@@ -165,11 +165,11 @@ python tools/export.py --model_name_or_config configs/det/east/east_r50_icdar15.
 
 - 环境搭建
 
-请参考[环境安装](../../../docs/cn/inference/environment.md#2-mindspore-lite推理)教程,配置MindSpore Lite推理运行环境。
+请参考[环境安装](../../../docs/zh/inference/environment.md)教程,配置MindSpore Lite推理运行环境。
 
 - 模型转换
 
-请参考[模型转换](../../../docs/cn/inference/convert_tutorial.md#1-mindocr模型)教程,使用`converter_lite`工具对MindIR模型进行离线转换。
+请参考[模型转换](../../../docs/zh/inference/convert_tutorial.md#2-mindspore-lite-mindir-转换)教程,使用`converter_lite`工具对MindIR模型进行离线转换。
 
 
 - 执行推理
10 changes: 5 additions & 5 deletions configs/det/fcenet/README.md
@@ -157,7 +157,7 @@ python tools/train.py -c=configs/det/fcenet/fce_icdar15.yaml
 Please set `distribute` in yaml config file to be True.
 
 ```shell
-# n is the number of GPUs/NPUs
+# n is the number of NPUs
 mpirun --allow-run-as-root -n 2 python tools/train.py --config configs/det/fcenet/fce_icdar15.yaml
 ```
 
@@ -174,11 +174,11 @@ python tools/eval.py -c=configs/det/fcenet/fce_icdar15.yaml
 
 ### 3.6 MindSpore Lite Inference
 
-Please refer to the tutorial [MindOCR Inference](../../../docs/en/inference/inference_tutorial_en.md) for model inference based on MindSpot Lite on Ascend 310, including the following steps:
+Please refer to the tutorial [MindOCR Inference](../../../docs/en/inference/inference_tutorial.md) for model inference based on MindSpot Lite on Ascend 310, including the following steps:
 
 - Model Export
 
-Please [download](#2-results) the exported MindIR file first, or refer to the [Model Export](../../README.md) tutorial and use the following command to export the trained ckpt model to MindIR file:
+Please [download](#2-results) the exported MindIR file first, or refer to the [Model Export](../../../docs/en/inference/convert_tutorial.md#1-model-export) tutorial and use the following command to export the trained ckpt model to MindIR file:
 
 ```shell
 python tools/export.py --model_name_or_config fcenet_resnet50 --data_shape 736 1280 --local_ckpt_path /path/to/local_ckpt.ckpt
@@ -190,11 +190,11 @@ The `data_shape` is the model input shape of height and width for MindIR file. T
 
 - Environment Installation
 
-Please refer to [Environment Installation](../../../docs/en/inference/environment_en.md#2-mindspore-lite-inference) tutorial to configure the MindSpore Lite inference environment.
+Please refer to [Environment Installation](../../../docs/en/inference/environment.md) tutorial to configure the MindSpore Lite inference environment.
 
 - Model Conversion
 
-Please refer to [Model Conversion](../../../docs/en/inference/convert_tutorial_en.md#1-mindocr-models),
+Please refer to [Model Conversion](../../../docs/en/inference/convert_tutorial.md#2-mindspore-lite-mindir-convert),
 and use the `converter_lite` tool for offline conversion of the MindIR file.
 
 - Inference
10 changes: 5 additions & 5 deletions configs/det/fcenet/README_CN.md
@@ -165,7 +165,7 @@ python tools/train.py --config configs/det/fcenet/fce_icdar15.yaml
 请确保yaml文件中的`distribute`参数为True。
 
 ```shell
-# n is the number of GPUs/NPUs
+# n is the number of NPUs
 mpirun --allow-run-as-root -n 2 python tools/train.py --config configs/det/fcenet/fce_icdar15.yaml
 ```
 
@@ -181,11 +181,11 @@ python tools/eval.py --config configs/det/fcenet/fce_icdar15.yaml
 
 ### 3.6 MindSpore Lite 推理
 
-请参考[MindOCR 推理](../../../docs/cn/inference/inference_tutorial_cn.md)教程,基于MindSpore Lite在Ascend 310上进行模型的推理,包括以下步骤:
+请参考[MindOCR 推理](../../../docs/zh/inference/inference_tutorial.md)教程,基于MindSpore Lite在Ascend 310上进行模型的推理,包括以下步骤:
 
 - 模型导出
 
-请先[下载](#2-实验结果)已导出的MindIR文件,或者参考[模型导出](../../README.md)教程,使用以下命令将训练完成的ckpt导出为MindIR文件:
+请先[下载](#2-实验结果)已导出的MindIR文件,或者参考[模型导出](../../../docs/zh/inference/convert_tutorial.md#1-模型导出)教程,使用以下命令将训练完成的ckpt导出为MindIR文件:
 
 ```shell
 python tools/export.py --model_name_or_config fcenet_resnet50 --data_shape 736 1280 --local_ckpt_path /path/to/local_ckpt.ckpt
@@ -197,11 +197,11 @@ python tools/export.py --model_name_or_config configs/det/fcenet/fce_icdar15.yam
 
 - 环境搭建
 
-请参考[环境安装](../../../docs/cn/inference/environment_cn.md#2-mindspore-lite推理)教程,配置MindSpore Lite推理运行环境。
+请参考[环境安装](../../../docs/zh/inference/environment.md)教程,配置MindSpore Lite推理运行环境。
 
 - 模型转换
 
-请参考[模型转换](../../../docs/cn/inference/convert_tutorial_cn.md#1-mindocr模型)教程,使用`converter_lite`工具对MindIR模型进行离线转换。
+请参考[模型转换](../../../docs/zh/inference/convert_tutorial.md#2-mindspore-lite-mindir-转换)教程,使用`converter_lite`工具对MindIR模型进行离线转换。
 
 - 执行推理
 
8 changes: 4 additions & 4 deletions configs/det/psenet/README.md
@@ -168,7 +168,7 @@ python tools/train.py --config configs/det/psenet/pse_r152_icdar15.yaml
 Please set `distribute` in yaml config file to be True.
 
 ```shell
-# n is the number of GPUs/NPUs
+# n is the number of NPUs
 mpirun --allow-run-as-root -n 8 python tools/train.py --config configs/det/psenet/pse_r152_icdar15.yaml
 ```
 
@@ -188,7 +188,7 @@ Please refer to the tutorial [MindOCR Inference](../../../docs/en/inference/infe
 
 - Model Export
 
-Please [download](#2-results) the exported MindIR file first, or refer to the [Model Export](../../README.md) tutorial and use the following command to export the trained ckpt model to MindIR file:
+Please [download](#2-results) the exported MindIR file first, or refer to the [Model Export](../../../docs/en/inference/convert_tutorial.md#1-model-export) tutorial and use the following command to export the trained ckpt model to MindIR file:
 
 ```shell
 python tools/export.py --model_name_or_config psenet_resnet152 --data_shape 1472 2624 --local_ckpt_path /path/to/local_ckpt.ckpt
@@ -200,11 +200,11 @@ The `data_shape` is the model input shape of height and width for MindIR file. T
 
 - Environment Installation
 
-Please refer to [Environment Installation](../../../docs/en/inference/environment.md#2-mindspore-lite-inference) tutorial to configure the MindSpore Lite inference environment.
+Please refer to [Environment Installation](../../../docs/en/inference/environment.md) tutorial to configure the MindSpore Lite inference environment.
 
 - Model Conversion
 
-Please refer to [Model Conversion](../../../docs/en/inference/convert_tutorial.md#1-mindocr-models),
+Please refer to [Model Conversion](../../../docs/en/inference/convert_tutorial.md#2-mindspore-lite-mindir-convert),
 and use the `converter_lite` tool for offline conversion of the MindIR file.
 
 - Inference
10 changes: 5 additions & 5 deletions configs/det/psenet/README_CN.md
@@ -168,7 +168,7 @@ python tools/train.py --config configs/det/psenet/pse_r152_icdar15.yaml
 请确保yaml文件中的`distribute`参数为True。
 
 ```shell
-# n is the number of GPUs/NPUs
+# n is the number of NPUs
 mpirun --allow-run-as-root -n 8 python tools/train.py --config configs/det/psenet/pse_r152_icdar15.yaml
 ```
 
@@ -184,11 +184,11 @@ python tools/eval.py --config configs/det/psenet/pse_r152_icdar15.yaml
 
 ### 3.6 MindSpore Lite 推理
 
-请参考[MindOCR 推理](../../../docs/cn/inference/inference_tutorial.md)教程,基于MindSpore Lite在Ascend 310上进行模型的推理,包括以下步骤:
+请参考[MindOCR 推理](../../../docs/zh/inference/inference_tutorial.md)教程,基于MindSpore Lite在Ascend 310上进行模型的推理,包括以下步骤:
 
 - 模型导出
 
-请先[下载](#2-实验结果)已导出的MindIR文件,或者参考[模型导出](../../README.md)教程,使用以下命令将训练完成的ckpt导出为MindIR文件:
+请先[下载](#2-实验结果)已导出的MindIR文件,或者参考[模型导出](../../../docs/zh/inference/convert_tutorial.md#1-模型导出)教程,使用以下命令将训练完成的ckpt导出为MindIR文件:
 
 ```shell
 python tools/export.py --model_name_or_config psenet_resnet152 --data_shape 1472 2624 --local_ckpt_path /path/to/local_ckpt.ckpt
@@ -200,11 +200,11 @@ python tools/export.py --model_name_or_config configs/det/psenet/pse_r152_icdar1
 
 - 环境搭建
 
-请参考[环境安装](../../../docs/cn/inference/environment.md#2-mindspore-lite推理)教程,配置MindSpore Lite推理运行环境。
+请参考[环境安装](../../../docs/zh/inference/environment.md)教程,配置MindSpore Lite推理运行环境。
 
 - 模型转换
 
-请参考[模型转换](../../../docs/cn/inference/convert_tutorial.md#1-mindocr模型)教程,使用`converter_lite`工具对MindIR模型进行离线转换。
+请参考[模型转换](../../../docs/zh/inference/convert_tutorial.md#2-mindspore-lite-mindir-转换)教程,使用`converter_lite`工具对MindIR模型进行离线转换。
 
 - 执行推理
 
6 changes: 3 additions & 3 deletions configs/kie/layoutlmv3/README.md
@@ -183,7 +183,7 @@ eval:
 ```
 
 **Notes:**
-- As the global batch size (batch_size x num_devices) is important for reproducing the result, please adjust `batch_size` accordingly to keep the global batch size unchanged for a different number of GPUs/NPUs, or adjust the learning rate linearly to a new global batch size.
+- As the global batch size (batch_size x num_devices) is important for reproducing the result, please adjust `batch_size` accordingly to keep the global batch size unchanged for a different number of NPUs, or adjust the learning rate linearly to a new global batch size.
 
 
 ### 3.2 Model Training
@@ -193,7 +193,7 @@
 It is easy to reproduce the reported results with the pre-defined training recipe. For distributed training on multiple Ascend 910 devices, please modify the configuration parameter `distribute` as True and run:
 
 ```shell
-# distributed training on multiple GPU/Ascend devices
+# distributed training on multiple Ascend devices
 mpirun --allow-run-as-root -n 8 python tools/train.py --config configs/kie/layoutlmv3/ser_layoutlmv3_xfund_zh.yaml
 ```
 
@@ -203,7 +203,7 @@ mpirun --allow-run-as-root -n 8 python tools/train.py --config configs/kie/layou
 If you want to train or finetune the model on a smaller dataset without distributed training, please modify the configuration parameter`distribute` as False and run:
 
 ```shell
-# standalone training on a CPU/GPU/Ascend device
+# standalone training on a CPU/Ascend device
 python tools/train.py --config configs/kie/layoutlmv3/ser_layoutlmv3_xfund_zh.yaml
 ```
 
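A recurring theme in this commit is relative links whose targets moved (`docs/cn/…` to `docs/zh/…`, `environment_en.md` to `environment.md`). A minimal link checker like the sketch below could catch such breakage before merging; `broken_relative_links` and its regex are hypothetical helpers written for illustration, not part of MindOCR's tooling, and anchor fragments are not validated:

```python
import re
from pathlib import Path

# Matches [text](target) and captures the target path, stripping any #anchor.
LINK_RE = re.compile(r"\[[^\]]*\]\(([^)#]*)(?:#[^)]*)?\)")

def broken_relative_links(md_file: str) -> list[str]:
    """Return relative link targets in a markdown file that do not exist
    on disk. External URLs and pure in-page anchors are skipped."""
    base = Path(md_file).parent
    broken = []
    for target in LINK_RE.findall(Path(md_file).read_text(encoding="utf-8")):
        if not target or target.startswith(("http://", "https://", "mailto:")):
            continue  # skip external links and bare #anchors
        if not (base / target).exists():
            broken.append(target)
    return broken
```

Run over every `*.md` file in the repository (e.g. via `Path(".").rglob("*.md")`), this would have flagged the 56 stale paths this commit repairs.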