From 9f387d01c2d99356f867079e08e1fae5a4eceb60 Mon Sep 17 00:00:00 2001
From: praneet pabolu
Date: Wed, 22 Jun 2022 12:35:04 +0530
Subject: [PATCH] Corrected some typos

---
 .../tutorials/getting-started/crash-course/3-autograd.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/python_docs/python/tutorials/getting-started/crash-course/3-autograd.md b/docs/python_docs/python/tutorials/getting-started/crash-course/3-autograd.md
index b1496590ffd4..79a50cdd306d 100644
--- a/docs/python_docs/python/tutorials/getting-started/crash-course/3-autograd.md
+++ b/docs/python_docs/python/tutorials/getting-started/crash-course/3-autograd.md
@@ -217,7 +217,7 @@ c.backward()
 ```
 
 You can see that `b` is a linear function of `a`, and `c` is chosen from `b`.
-The gradient with respect to `a` be will be either `[c/a[0], 0]` or `[0,
+The gradient with respect to `a` will be either `[c/a[0], 0]` or `[0,
 c/a[1]]`, depending on which element from `b` is picked. You see the results
 of this example with this code:
 
@@ -233,7 +233,7 @@ along this axis is the same as summing that axis and multiplying by `1/3`.
 
 You can control gradients for different ndarray operations. For instance,
 perhaps you want to check that the gradients are propagating properly?
 the `attach_grad()` method automatically detaches itself from the gradient.
-Therefore, the input up until y will no longer look like it has `x`. To
+Therefore, the input up until `y` will no longer look like it has `x`. To
 illustrate this notice that `x.grad` and `y.grad` is not the same in the second example.
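For readers looking at this patch without the surrounding tutorial open, the sketch below illustrates the two behaviors the corrected sentences describe. It is a minimal sketch, assuming the `mxnet.np`/`autograd` API used elsewhere in the crash course; the variable names `a`, `b`, `c`, `x`, `y` mirror the diff context, but the exact code in `3-autograd.md` may differ.

```python
from mxnet import autograd, np, npx
npx.set_np()

# First hunk: c is one element picked from b, so only the picked
# position of a receives a gradient: [c/a[0], 0] or [0, c/a[1]].
a = np.random.uniform(size=2)
a.attach_grad()
with autograd.record():
    b = a * 2                            # b is a linear function of a
    c = b[0] if b.sum() >= 0 else b[1]   # c is chosen from b
c.backward()
print(a.grad)  # nonzero only in the picked position

# Second hunk: calling attach_grad() on an intermediate value detaches
# it from the graph, so backward() no longer reaches x through y.
x = np.arange(4.0)
x.attach_grad()
with autograd.record():
    y = x * x
    y.attach_grad()  # detaches y from the graph that produced it
    z = y.sum()
z.backward()
print(x.grad)  # zeros: the path from z back to x was cut at y
print(y.grad)  # dz/dy = ones; x.grad and y.grad are not the same
```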