diff --git a/tutorials/source_en/beginner/autograd.md b/tutorials/source_en/beginner/autograd.md
index 6fead1c1c05ce181726eb7a3791320bcc8879d0c..4fc41ed0172053675ba3dba3ac76c1058abba86c 100644
--- a/tutorials/source_en/beginner/autograd.md
+++ b/tutorials/source_en/beginner/autograd.md
@@ -14,10 +14,8 @@ This chapter uses `ops.GradOperation` in MindSpore to find first-order derivativ
 
 ## First-order Derivative of the Input
 
-The formula needs to be defined before the input can be derived:
-$$
-f(x)=wx+b \tag {1}
-$$
+The formula needs to be defined before the input can be derived: $f(x)=wx+b \tag {1}$
+
 The example code below is an expression of Equation (1), and since MindSpore is functionally programmed, all expressions of computational formulas are represented as functions.
 
 ```python
@@ -36,10 +34,7 @@ class Net(nn.Cell):
         return f
 ```
 
-Define the derivative class `GradNet`. In the `__init__` function, define the `self.net` and `ops.GradOperation` networks. In the `construct` function, compute the derivative of `self.net`. Its corresponding MindSpore internally produces the following formula (2):
-$$
-f^{'}(x)=w\tag {2}
-$$
+Define the derivative class `GradNet`. In the `__init__` function, define the `self.net` and `ops.GradOperation` networks. In the `construct` function, compute the derivative of `self.net`. Its corresponding MindSpore internally produces the following formula (2): $f^{'}(x)=w\tag {2}$
 
 ```python
 from mindspore import dtype as mstype
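The tutorial touched by this diff asserts formula (2): the derivative of $f(x)=wx+b$ with respect to $x$ is simply $w$. As a quick sanity check of that claim, here is a minimal framework-free Python sketch using a central finite difference; it does not use MindSpore or the tutorial's `GradNet`, and the values of `w`, `b`, and the step `h` are illustrative assumptions:

```python
# Equation (1): f(x) = w*x + b, with illustrative values w=2.0, b=3.0
def f(x, w=2.0, b=3.0):
    return w * x + b

def numerical_grad(fn, x, h=1e-6):
    """Central finite difference approximation: (f(x+h) - f(x-h)) / (2h)."""
    return (fn(x + h) - fn(x - h)) / (2 * h)

# Equation (2) predicts f'(x) = w (= 2.0 here) for every x,
# independent of the bias term b.
for x in [-1.0, 0.0, 5.0]:
    print(f"x={x}: numerical f'(x) = {numerical_grad(f, x):.6f}")
```

For a linear function the central difference is exact up to floating-point rounding, so every printed gradient should match `w` regardless of `x`, which is what `ops.GradOperation` computes analytically in the tutorial.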