From 6c566abbec7e1cc6e7254f74076d48cc79d263ef Mon Sep 17 00:00:00 2001
From: huanxiaoling <3174348550@qq.com>
Date: Wed, 12 Oct 2022 11:04:34 +0800
Subject: [PATCH] modify the wrong links in files

---
 docs/lite/docs/source_en/use/micro.md         |  6 ++--
 .../source_en/faq/implement_problem.md        | 34 +++++++++++++++++++
 .../learning_rate_and_optimizer.md            |  2 +-
 .../source_en/migration_guide/overview.md     |  2 +-
 .../source_zh_cn/faq/implement_problem.md     | 34 +++++++++++++++++++
 tutorials/source_en/beginner/introduction.md  |  2 +-
 6 files changed, 74 insertions(+), 6 deletions(-)

diff --git a/docs/lite/docs/source_en/use/micro.md b/docs/lite/docs/source_en/use/micro.md
index be414f0065..22bf496233 100644
--- a/docs/lite/docs/source_en/use/micro.md
+++ b/docs/lite/docs/source_en/use/micro.md
@@ -132,7 +132,7 @@ The following describes how to prepare the environment for using the conversion
 ```

 The `src` directory in the generated code is the directory where the model inference code is located. The `benchmark` is just a routine that calls the code in the `src` directory in an integrated way.

- For more details on integrated calls, please refer to the section on [Code Integration and Compilation Deployment](#Code Integration and Compilation Deployment).
+ For more details on integrated calls, please refer to the section on [Code Integration and Compilation Deployment](#code-integration-and-compilation-deployment).

 ### (Optional) Model Input Shape Configuration

@@ -171,7 +171,7 @@ For the meaning of each option in the configuration file, refer to Table 1.
 #### Involved Calling Interfaces

 By integrating the code and calling the following interfaces, the user can configure multi-threaded inference of the model.

-For specific interface parameters, refer to [API Document](https://www.mindspore.cn/lite/api/en/master/index.html).
+For specific interface parameters, refer to [API Document](https://www.mindspore.cn/lite/api/en/master/index.html).

 | Function | Function definition |
 | ---------------- | ----------------------------------------------------------------------- |

@@ -193,7 +193,7 @@ At present, this function is only enabled when the `target` is configured as x86
 In MCU scenarios such as Cortex-M, limited by the memory size and computing power of the device, Int8 quantization operators are usually used for deployment inference to reduce the runtime memory size and speed up operations.

-If you already have a fully quantized Int8 model, you can refer to the section on [Generating Inference Code by Running converter_lite](#Generating Inference Code by Running converter_lite) and try to generate Int8 quantized inference code directly, without reading this chapter.
+If you already have a fully quantized Int8 model, you can refer to the section on [Generating Inference Code by Running converter_lite](#generating-inference-code-by-running-converter-lite) and try to generate Int8 quantized inference code directly, without reading this chapter.

 In general, the user has only a trained Float32 model. In this case, generating Int8 quantized inference code requires the post-training quantization function of the conversion tool. See the following for the specific steps.
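Since the post-training quantization workflow described above is driven entirely by the converter configuration file, a rough sketch of such a file is shown below as an editor's illustration; it is not part of the patched document. The section and key names used here, such as `[micro_param]`, `[common_quant_param]`, `[data_preprocess_param]` and their fields, are assumptions recalled from the MindSpore Lite converter documentation and should be verified against the Configuration section that follows.

```text
# micro_quant.cfg -- hypothetical configuration sketch; verify every key against the
# converter_lite documentation before use.

[micro_param]
# generate micro inference code for the x86 target
enable_micro=true
target=x86
codegen_mode=Inference
support_parallel=false

[common_quant_param]
# full (weight + activation) Int8 post-training quantization
quant_type=FULL_QUANT
bit_num=8

[data_preprocess_param]
# calibration data used to estimate activation ranges; "input_1" is a placeholder input name
calibrate_path=input_1:/path/to/calibration/data
calibrate_size=100
input_type=IMAGE
```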
 #### Configuration

diff --git a/docs/mindspore/source_en/faq/implement_problem.md b/docs/mindspore/source_en/faq/implement_problem.md
index 81938f493b..829983afc4 100644
--- a/docs/mindspore/source_en/faq/implement_problem.md
+++ b/docs/mindspore/source_en/faq/implement_problem.md
@@ -621,3 +621,37 @@ A: The reason for this error is that the user did not configure the operator par
 Therefore, the user needs to set the operator parameters appropriately to avoid such errors.
+
+**Q: How do I understand the "Ascend Error Message" in the error message?**
+
+A: The "Ascend Error Message" is the fault message thrown by CANN (Ascend Heterogeneous Computing Architecture) when an error occurs while MindSpore is calling a CANN interface. It contains information such as the error code and the error description. For example:
+
+```python
+Traceback (most recent call last):
+  File "train.py", line 292, in <module>
+    train_net()
+  File "/home/resnet_csj2/scripts/train_parallel0/src/model_utils/moxing_adapter.py", line 104, in wrapped_func
+    run_func(*args, **kwargs)
+  File "train.py", line 227, in train_net
+    set_parameter()
+  File "train.py", line 114, in set_parameter
+    init()
+  File "/home/miniconda3/envs/ms/lib/python3.7/site-packages/mindspore/communication/management.py", line 149, in init
+    init_hccl()
+RuntimeError: Ascend kernel runtime initialization failed.
+
+\----------------------------------------------------
+\- Ascend Error Message:
+\----------------------------------------------------
+EJ0001: Failed to initialize the HCCP process. Reason: Maybe the last training process is running. //EJ0001 is the error code, followed by the description and cause of the error. In this example, the cause is that distributed training on the same 8 nodes was started several times, causing a process conflict
+Solution: Wait for 10s after killing the last training process and try again. //The message printed here gives the solution to the problem; in this example the user is advised to clean up the residual processes
+TraceBack (most recent call last): //The information printed here is the stack information used by developers for locating the fault; users generally do not need to pay attention to it
+```
+
+```text
+tsd client wait response fail, device response code[1]. unknown device error.[FUNC:WaitRsp][FILE:process_mode_manager.cpp][LINE:233]
+```
+
+In addition, CANN may throw some inner errors, for example, error code "EI9999: Inner Error". If you cannot find a related case description on the MindSpore official website or in the forum, you can ask for help in the community by raising an issue.
+
+
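For readers who want to react to this failure in their own launch scripts, the following is a minimal sketch added by the editor; it is not part of the patch. It assumes an Ascend environment in which `init()` fails, and simply extracts CANN error codes such as `EJ0001` or `EI9999` from the exception text before re-raising.

```python
# Hypothetical handling sketch: pull the CANN error code out of the RuntimeError
# raised by init() when HCCL/HCCP initialization fails on Ascend.
import re

from mindspore.communication import init

try:
    init()  # raises RuntimeError carrying the "Ascend Error Message" block on failure
except RuntimeError as err:
    message = str(err)
    if "Ascend Error Message" in message:
        # CANN error codes look like "EJ0001:" or "EI9999:"; the text after the code
        # gives the reason and, when available, a suggested solution.
        codes = re.findall(r"\b(E[A-Z]\d{4}):", message)
        print("CANN error code(s):", codes)
    raise
```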
\ No newline at end of file
diff --git a/docs/mindspore/source_en/migration_guide/model_development/learning_rate_and_optimizer.md b/docs/mindspore/source_en/migration_guide/model_development/learning_rate_and_optimizer.md
index 10903e53d0..adc50fd267 100644
--- a/docs/mindspore/source_en/migration_guide/model_development/learning_rate_and_optimizer.md
+++ b/docs/mindspore/source_en/migration_guide/model_development/learning_rate_and_optimizer.md
@@ -1,6 +1,6 @@
 # Learning Rate and Optimizer

-
+

 Before reading this chapter, please read the official MindSpore tutorial [Optimizer](https://www.mindspore.cn/tutorials/en/master/advanced/modules/optim.html).

diff --git a/docs/mindspore/source_en/migration_guide/overview.md b/docs/mindspore/source_en/migration_guide/overview.md
index 00cb3128b0..d2b7b394a6 100644
--- a/docs/mindspore/source_en/migration_guide/overview.md
+++ b/docs/mindspore/source_en/migration_guide/overview.md
@@ -41,6 +41,6 @@ This chapter will introduce some methods of debugging and tuning from three aspe
 This chapter contains a complete network migration sample. From the analysis and replication of the benchmark network, it details the steps of script development and precision debugging and tuning, and finally lists the common problems and corresponding optimization methods during the migration process, as well as framework performance issues.

-## [FAQs](https://www.mindspore.cn/docs/en/master/migration_guide/faq.html)
+## FAQs

 This chapter lists the frequently-asked questions and corresponding solutions.

diff --git a/docs/mindspore/source_zh_cn/faq/implement_problem.md b/docs/mindspore/source_zh_cn/faq/implement_problem.md
index 39d2f075d0..10724bc8b0 100644
--- a/docs/mindspore/source_zh_cn/faq/implement_problem.md
+++ b/docs/mindspore/source_zh_cn/faq/implement_problem.md
@@ -606,3 +606,37 @@ A: The cause of this problem is that the user did not configure the operator parameters correctly, causing the operator to request
 Therefore, the user needs to set the operator parameters appropriately to avoid such errors.
+
+**Q: How do I understand the "Ascend Error Message" in the error message?**
+
+A: The "Ascend Error Message" is the fault message thrown by CANN (Ascend Heterogeneous Computing Architecture) when an error occurs while MindSpore is calling a CANN interface. It contains information such as the error code and the error description. For example:
+
+```python
+Traceback (most recent call last):
+  File "train.py", line 292, in <module>
+    train_net()
+  File "/home/resnet_csj2/scripts/train_parallel0/src/model_utils/moxing_adapter.py", line 104, in wrapped_func
+    run_func(*args, **kwargs)
+  File "train.py", line 227, in train_net
+    set_parameter()
+  File "train.py", line 114, in set_parameter
+    init()
+  File "/home/miniconda3/envs/ms/lib/python3.7/site-packages/mindspore/communication/management.py", line 149, in init
+    init_hccl()
+RuntimeError: Ascend kernel runtime initialization failed.
+
+\----------------------------------------------------
+\- Ascend Error Message:
+\----------------------------------------------------
+EJ0001: Failed to initialize the HCCP process. Reason: Maybe the last training process is running. //EJ0001 is the error code, followed by the description and cause of the error. In this example, the cause is that distributed training on the same 8 nodes was started several times, causing a process conflict
+Solution: Wait for 10s after killing the last training process and try again. //The message printed here gives the solution to the problem; in this example the user is advised to clean up the residual processes
+TraceBack (most recent call last): //The information printed here is the stack information used by developers for locating the fault; users generally do not need to pay attention to it
+```
+
+```text
+tsd client wait response fail, device response code[1]. unknown device error.[FUNC:WaitRsp][FILE:process_mode_manager.cpp][LINE:233]
+```
+
+In addition, in some cases CANN throws inner errors, for example, error code "EI9999: Inner Error". In such cases, if you cannot find a related case description on the MindSpore official website or in the forum, you can ask for help in the community by raising an issue.
+
+
\ No newline at end of file
diff --git a/tutorials/source_en/beginner/introduction.md b/tutorials/source_en/beginner/introduction.md
index 42f5ae69a3..512e426137 100644
--- a/tutorials/source_en/beginner/introduction.md
+++ b/tutorials/source_en/beginner/introduction.md
@@ -76,7 +76,7 @@ After the neural network model is trained, you can export the model or load the
 MindSpore provides users with three different levels of APIs to support AI application (algorithm/model) development, from high to low: the High-Level Python API, the Medium-Level Python API, and the Low-Level Python API. The High-Level API provides better encapsulation, the Low-Level API provides better flexibility, and the Medium-Level API combines flexibility and encapsulation to meet the needs of developers in different fields and at different levels.

-![MindSpore API](https://mindspore-website.obs.cn-north-4.myhuaweicloud.com/website-images/master/tutorials/source_zh_cn/beginner/images/introduction3.png)
+![MindSpore API](https://mindspore-website.obs.cn-north-4.myhuaweicloud.com/website-images/master/tutorials/source_en/beginner/images/introduction3.png)

 - High-Level Python API
--
Gitee
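To make the three API levels mentioned in the patched paragraph concrete, here is a minimal sketch added by the editor; it is not part of the patch. It builds the same affine transform with the medium-level `nn` API and with low-level primitive operators; the high-level API (for example the `Model` training wrapper) would combine such cells with a loss, an optimizer, and a training loop.

```python
# Sketch of the medium- and low-level Python APIs on the same computation.
import numpy as np
from mindspore import Tensor, nn, ops

x = Tensor(np.ones((2, 3), np.float32))

# Medium-level API: an nn.Cell layer that manages its own weight and bias.
dense = nn.Dense(3, 4)
y_medium = dense(x)

# Low-level API: the same affine transform written with primitive operators,
# reusing the layer's parameters explicitly.
matmul = ops.MatMul(transpose_b=True)
y_low = matmul(x, dense.weight) + dense.bias

print(y_medium.shape, y_low.shape)  # (2, 4) (2, 4)
```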