[GitHub] [incubator-mxnet] aaronmarkham commented on a change in pull request #16500: Fixing broken links
aaronmarkham commented on a change in pull request #16500: Fixing broken links URL: https://github.com/apache/incubator-mxnet/pull/16500#discussion_r335668894 ## File path: docs/python_docs/python/tutorials/packages/gluon/loss/loss.md ## @@ -19,8 +19,8 @@ Loss functions are used to train neural networks and to compute the difference between output and target variable. A critical component of training neural networks is the loss function. A loss function is a quantitative measure of how bad the predictions of the network are when compared to ground truth labels. Given this score, a network can improve by iteratively updating its weights to minimise this loss. Some tasks use a combination of multiple loss functions, but often you'll just use one. MXNet Gluon provides a number of the most commonly used loss functions, and you'll choose certain loss functions depending on your network and task. Some common task and loss function pairs include: -- regression: [L1Loss](/api/python/docs/api/gluon/_autogen/mxnet.gluon.loss.L1Loss.html), [L2Loss](/api/python/docs/api/gluon/_autogen/mxnet.gluon.loss.L2Loss.html) -- classification: [SigmoidBinaryCrossEntropyLoss](/api/python/docs/api/gluon/_autogen/mxnet.gluon.loss.SigmoidBinaryCrossEntropyLoss.html), [SoftmaxBinaryCrossEntropyLoss](/api/python/docs/api/gluon/_autogen/mxnet.gluon.loss.SoftmaxBinaryCrossEntropyLoss.html) +- regression: [L1Loss](/api/python/docs/api/gluon/_autogen/mxnet.gluon.loss.L1Loss.html), [L2Loss](/api/python/docs/api/gluon/loss/index.html#mxnet.gluon.loss.L2Loss) +- classification: [SigmoidBinaryCrossEntropyLoss](/api/python/docs/api/gluon/_autogen/mxnet.gluon.loss.SigmoidBinaryCrossEntropyLoss.html), [SoftmaxBinaryCrossEntropyLoss](/api/python/docs/api/gluon/_autogen/mxnet.gluon.loss.SoftmaxBinaryCrossEntropyLoss.html) Review comment: ```suggestion - classification: [SigmoidBinaryCrossEntropyLoss](/api/python/docs/api/gluon/loss/index.html#mxnet.gluon.loss.SigmoidBinaryCrossEntropyLoss), 
[SoftmaxCrossEntropyLoss](/api/python/docs/api/gluon/loss/index.html#mxnet.gluon.loss.SoftmaxCrossEntropyLoss) ``` This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
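The semantics of the two regression losses named in the diff above can be sketched in plain Python. This mirrors the math that Gluon's `L1Loss` and `L2Loss` compute (including the conventional 1/2 factor in the L2 case), but it is only an illustrative sketch, not the Gluon implementation:

```python
# Illustrative sketch of the two regression losses listed in the tutorial.
# Not the Gluon implementation; Gluon's L2Loss also carries the 1/2 factor.

def l1_loss(preds, labels):
    """Mean absolute error: mean(|pred - label|)."""
    return sum(abs(p - l) for p, l in zip(preds, labels)) / len(preds)

def l2_loss(preds, labels):
    """Half mean squared error: mean(0.5 * (pred - label)^2)."""
    return sum(0.5 * (p - l) ** 2 for p, l in zip(preds, labels)) / len(preds)

preds, labels = [1.0, 2.0, 4.0], [1.0, 3.0, 2.0]
print(l1_loss(preds, labels))  # (0 + 1 + 2) / 3 = 1.0
print(l2_loss(preds, labels))  # 0.5 * (0 + 1 + 4) / 3 ≈ 0.8333
```

Minimising the L2 loss penalises large errors more strongly than L1, which is why the two appear as alternative choices for regression tasks.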
[GitHub] [incubator-mxnet] aaronmarkham commented on a change in pull request #16500: Fixing broken links
aaronmarkham commented on a change in pull request #16500: Fixing broken links URL: https://github.com/apache/incubator-mxnet/pull/16500#discussion_r335666384 ## File path: docs/python_docs/python/tutorials/packages/gluon/blocks/hybridize.md ## @@ -294,7 +294,7 @@ def hybrid_forward(self, F, x): Would get you this error `TypeError: 'Symbol' object does not support item assignment`. -Direct item assignment is not possible in symbolic graph since it needs to be part of a computational graph. One way is to use add more inputs to your graph and use masking or the [`F.where`](https://mxnet.apache.org/api/python/docs/api/ndarray/_autogen/mxnet.ndarray.where.html) operator. +Direct item assignment is not possible in symbolic graph since it needs to be part of a computational graph. One way is to use add more inputs to your graph and use masking or the [`F.where`](/api/python/docs/api/ndarray/ndarray.htmlmxnet.ndarray.where.html) operator. Review comment: ```suggestion Direct item assignment is not possible in a symbolic graph since it needs to be part of a computational graph. One way is to add more inputs to your graph and use masking or the [F.where](/api/python/docs/api/ndarray/ndarray.html#mxnet.ndarray.where) operator. ```
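The masking trick the tutorial text refers to can be sketched in plain Python: instead of assigning into a tensor (`x[i] = v`, which `Symbol` forbids), build the result elementwise from a condition and two sources, which is what MXNet's `where` operator does on real tensors. Plain Python lists stand in for the tensor types here:

```python
# Sketch of elementwise select, the shape of operation mx `where` performs.
# Plain-Python stand-in; real code would call F.where on NDArray/Symbol.

def where(cond, a, b):
    """Return a[i] where cond[i] is truthy, else b[i]."""
    return [ai if c else bi for c, ai, bi in zip(cond, a, b)]

x = [1.0, -2.0, 3.0, -4.0]
# "Set negative entries to zero" without any item assignment:
clipped = where([v >= 0 for v in x], x, [0.0] * len(x))
print(clipped)  # [1.0, 0.0, 3.0, 0.0]
```

Because the select is expressed as an operator over inputs rather than a mutation, it can be recorded as a node in the computational graph.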
[GitHub] [incubator-mxnet] aaronmarkham commented on a change in pull request #16500: Fixing broken links
aaronmarkham commented on a change in pull request #16500: Fixing broken links URL: https://github.com/apache/incubator-mxnet/pull/16500#discussion_r335665652 ## File path: docs/python_docs/python/tutorials/packages/gluon/blocks/hybridize.md ## @@ -277,10 +277,10 @@ def hybrid_forward(self, F, x): Trying to access the shape of a tensor in a hybridized block would result in this error: `AttributeError: 'Symbol' object has no attribute 'shape'`. -Again, you cannot use the shape of the symbol at runtime as symbols only describe operations and not the underlying data they operate on. -Note: This will change in the future as Apache MXNet will support [dynamic shape inference](https://cwiki.apache.org/confluence/display/MXNET/Dynamic+shape), and the shapes of symbols will be symbols themselves +Again, you cannot use the shape of the symbol at runtime as symbols only describe operations and not the underlying data they operate on. +Note: This will change in the future as Apache MXNet will support [dynamic shape inference](https://cwiki.apache.org/confluence/display/MXNET/Dynamic+shape), and the shapes of symbols will be symbols themselves -There are also a lot of operators that support special indices to help with most of the use-cases where you would want to access the shape information. For example, `F.reshape(x, (0,0,-1))` will keep the first two dimensions unchanged and collapse all further dimensions into the third dimension. See the documentation of the [`F.reshape`](https://mxnet.apache.org/api/python/docs/api/ndarray/_autogen/mxnet.ndarray.reshape.html) for more details. +There are also a lot of operators that support special indices to help with most of the use-cases where you would want to access the shape information. For example, `F.reshape(x, (0,0,-1))` will keep the first two dimensions unchanged and collapse all further dimensions into the third dimension. 
See the documentation of the [`F.reshape`](/api/python/docs/api/ndarray/ndarray.htmlmxnet.ndarray.reshape.html) for more details. Review comment: ```suggestion There are also a lot of operators that support special indices to help with most of the use-cases where you would want to access the shape information. For example, `F.reshape(x, (0,0,-1))` will keep the first two dimensions unchanged and collapse all further dimensions into the third dimension. See the documentation of the [`F.reshape`](/api/python/docs/api/ndarray/ndarray.html#mxnet.ndarray.reshape) for more details. ```
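The special-index behaviour described in that diff (`0` copies the corresponding input dimension, `-1` infers one dimension from the remaining elements) can be modelled in a few lines of plain Python. This only sketches the shape arithmetic, not the reshape itself:

```python
# Sketch of MXNet reshape's special target-shape codes:
#   0  -> keep the input dimension at the same position
#  -1  -> infer this dimension from the element count
from functools import reduce
from operator import mul

def resolve_reshape(in_shape, target):
    out = []
    for i, d in enumerate(target):
        out.append(in_shape[i] if d == 0 else d)   # -1 stays as placeholder
    total = reduce(mul, in_shape, 1)
    if -1 in out:
        known = reduce(mul, (d for d in out if d != -1), 1)
        out[out.index(-1)] = total // known        # infer the leftover dim
    return tuple(out)

# (2, 3, 4, 5) with (0, 0, -1): keep 2 and 3, collapse 4*5 into 20.
print(resolve_reshape((2, 3, 4, 5), (0, 0, -1)))  # (2, 3, 20)
```

This is why `F.reshape(x, (0, 0, -1))` works inside a hybridized block: the shape logic is resolved symbolically without ever reading `x.shape` in Python.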
[GitHub] [incubator-mxnet] aaronmarkham commented on a change in pull request #16500: Fixing broken links
aaronmarkham commented on a change in pull request #16500: Fixing broken links URL: https://github.com/apache/incubator-mxnet/pull/16500#discussion_r335664581 ## File path: docs/python_docs/python/tutorials/packages/autograd/index.md ## @@ -159,7 +159,7 @@ print('is_training:', is_training, output) We called `dropout` while `autograd` was recording this time, so our network was in training mode and we see dropout of the input this time. Since the probability of dropout was 50%, the output is automatically scaled by 1/0.5=2 to preserve the average activation. -We can force some operators to behave as they would during training, even in inference mode. One example is setting `mode='always'` on the [Dropout](https://mxnet.apache.org/api/python/ndarray/ndarray.html?highlight=dropout#mxnet.ndarray.Dropout) operator, but this usage is uncommon. +We can force some operators to behave as they would during training, even in inference mode. One example is setting `mode='always'` on the [Dropout](/api/python/ndarray/ndarray.html?highlight=dropout#mxnet.ndarray.Dropout) operator, but this usage is uncommon. Review comment: ```suggestion We can force some operators to behave as they would during training, even in inference mode. One example is setting `mode='always'` on the [Dropout](/api/python/ndarray/ndarray.html#mxnet.ndarray.Dropout) operator, but this usage is uncommon. ```
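The 1/0.5=2 scaling mentioned in that tutorial text is the standard inverted-dropout rule: surviving activations are multiplied by 1/(1-p) so the expected activation is unchanged. A deterministic sketch (a fixed mask stands in for the random draw, and this is not the MXNet operator itself):

```python
# Sketch of inverted dropout's scaling. With p = 0.5, survivors are
# multiplied by 1 / (1 - p) = 2, keeping the average activation constant.

def dropout(x, mask, p=0.5):
    """Zero entries where mask is 0; scale survivors by 1/(1-p)."""
    scale = 1.0 / (1.0 - p)
    return [xi * mi * scale for xi, mi in zip(x, mask)]

x = [1.0, 2.0, 3.0, 4.0]
mask = [1, 0, 1, 0]          # fixed mask: pretend half the units dropped
print(dropout(x, mask))      # [2.0, 0.0, 6.0, 0.0]
```

Because the scaling happens at training time, no correction is needed at inference time, which is why simply not applying dropout in inference mode gives consistent activations.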
[GitHub] [incubator-mxnet] aaronmarkham commented on a change in pull request #16500: Fixing broken links
aaronmarkham commented on a change in pull request #16500: Fixing broken links URL: https://github.com/apache/incubator-mxnet/pull/16500#discussion_r335489138 ## File path: docs/python_docs/python/tutorials/packages/onnx/inference_on_onnx_model.md ## @@ -29,7 +29,7 @@ In this tutorial we will: ## Pre-requisite To run the tutorial you will need to have installed the following python modules: -- [MXNet > 1.1.0](http://mxnet.apache.org/install/index.html) +- [MXNet > 1.1.0]() Review comment: ```suggestion - [MXNet > 1.1.0](/get_started) ```
[GitHub] [incubator-mxnet] aaronmarkham commented on a change in pull request #16500: Fixing broken links
aaronmarkham commented on a change in pull request #16500: Fixing broken links URL: https://github.com/apache/incubator-mxnet/pull/16500#discussion_r335488905 ## File path: docs/python_docs/python/tutorials/packages/onnx/fine_tuning_gluon.md ## @@ -31,7 +31,7 @@ In this tutorial we will: ## Pre-requisite To run the tutorial you will need to have installed the following python modules: -- [MXNet > 1.1.0](http://mxnet.apache.org/install/index.html) +- [MXNet > 1.1.0]() Review comment: ```suggestion - [MXNet > 1.1.0](/get_started) ```
[GitHub] [incubator-mxnet] aaronmarkham commented on a change in pull request #16500: Fixing broken links
aaronmarkham commented on a change in pull request #16500: Fixing broken links URL: https://github.com/apache/incubator-mxnet/pull/16500#discussion_r335488509 ## File path: docs/python_docs/python/tutorials/packages/onnx/super_resolution.md ## @@ -24,7 +24,7 @@ In this tutorial we will: ## Prerequisites This example assumes that the following python packages are installed: -- [mxnet](http://mxnet.apache.org/install/index.html) +- [mxnet]() Review comment: ```suggestion - [mxnet](/get_started) ```
[GitHub] [incubator-mxnet] aaronmarkham commented on a change in pull request #16500: Fixing broken links
aaronmarkham commented on a change in pull request #16500: Fixing broken links URL: https://github.com/apache/incubator-mxnet/pull/16500#discussion_r335488231 ## File path: docs/static_site/src/pages/api/faq/distributed_training.md ## @@ -91,7 +91,7 @@ In the case of distributed training though, we would need to divide the dataset Typically, this split of data for each worker happens through the data iterator, on passing the number of parts and the index of parts to iterate over. -Some iterators in MXNet that support this feature are [mxnet.io.MNISTIterator]({{'/api/python/docs/api/gluon-related/_autogen/mxnet.io.MNISTIter.html#mxnet.io.MNISTIter'|relative_url}}) and [mxnet.io.ImageRecordIter]({{'/api/python/docs/api/gluon-related/_autogen/mxnet.io.ImageRecordIter.html#mxnet.io.ImageRecordIter'|relative_url}}). +Some iterators in MXNet that support this feature are [mxnet.io.MNISTIterator]({{'//api/mxnet/io/index.html#mxnet.io.MNISTIter'|relative_url}}) and [mxnet.io.ImageRecordIter]({{'/api/mxnet/io/index.html#mxnet.io.ImageRecordIter'|relative_url}}). Review comment: @ThomasDelteil what are these fancy urls all about? In what context should they be used?
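The "number of parts and index of parts" split that the FAQ text describes can be sketched in plain Python. This models one common scheme, a contiguous split into near-equal slices; the exact partitioning MXNet's iterators use is an implementation detail, so treat this only as an illustration of the contract behind the `num_parts`/`part_index` style of argument:

```python
# Sketch of sharding a dataset across workers: each worker receives a
# contiguous, near-equal slice determined by its part index. Assumed
# scheme for illustration, not MXNet's actual iterator code.

def shard(samples, num_parts, part_index):
    """Return the slice of `samples` owned by worker `part_index`."""
    n = len(samples)
    start = part_index * n // num_parts
    end = (part_index + 1) * n // num_parts
    return samples[start:end]

data = list(range(10))
print(shard(data, num_parts=3, part_index=0))  # [0, 1, 2]
print(shard(data, num_parts=3, part_index=1))  # [3, 4, 5]
print(shard(data, num_parts=3, part_index=2))  # [6, 7, 8, 9]
```

Every sample lands in exactly one shard, so the workers jointly cover the dataset without overlap, which is the property distributed data-parallel training relies on.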
[GitHub] [incubator-mxnet] aaronmarkham commented on a change in pull request #16500: Fixing broken links
aaronmarkham commented on a change in pull request #16500: Fixing broken links URL: https://github.com/apache/incubator-mxnet/pull/16500#discussion_r335487459 ## File path: docs/python_docs/python/tutorials/deploy/export/onnx.md ## @@ -28,7 +28,7 @@ In this tutorial, we will learn how to use MXNet to ONNX exporter on pre-trained ## Prerequisites To run the tutorial you will need to have installed the following python modules: -- [MXNet >= 1.3.0](http://mxnet.apache.org/install/index.html) +- [MXNet >= 1.3.0]() Review comment: ```suggestion - [MXNet >= 1.3.0](/get_started) ```