sandeep-krishnamurthy closed pull request #46: Removed torch.html and all its
references. Fixed Nesterov Momentum education …
URL: https://github.com/apache/incubator-mxnet-site/pull/46
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:


diff --git a/_modules/mxnet/optimizer.html b/_modules/mxnet/optimizer.html
index ec3ead4c..9b29ac84 100644
--- a/_modules/mxnet/optimizer.html
+++ b/_modules/mxnet/optimizer.html
@@ -1267,7 +1267,7 @@ <h1>Source code for mxnet.optimizer</h1><div class="highlight"><pre>
 
 <span class="sd">    Much like Adam is essentially RMSprop with momentum,</span>
 <span class="sd">    Nadam is Adam RMSprop with Nesterov momentum available</span>
-<span class="sd">    at http://cs229.stanford.edu/proj2015/054_report.pdf.</span>
+<span class="sd">    at https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ.</span>
 
 <span class="sd">    This optimizer accepts the following parameters in addition to those accepted</span>
 <span class="sd">    by :class:`.Optimizer`.</span>
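The hunk above only swaps the Nadam citation URL, but for readers skimming the provenance diff, here is a minimal pure-Python sketch of the update rule that docstring describes. The function name and scalar treatment are illustrative assumptions, not MXNet's implementation (MXNet exposes the real optimizer as `mxnet.optimizer.Nadam`, which operates on NDArrays and adds schedules and weight decay):

```python
import math

def nadam_step(theta, grad, m, v, t,
               lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Nadam update for a single scalar parameter (illustrative sketch)."""
    m = beta1 * m + (1 - beta1) * grad           # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad * grad    # second-moment estimate
    m_hat = m / (1 - beta1 ** (t + 1))           # bias correction, one step ahead
    g_hat = grad / (1 - beta1 ** t)              # bias-corrected raw gradient
    v_hat = v / (1 - beta2 ** t)
    m_bar = beta1 * m_hat + (1 - beta1) * g_hat  # Nesterov-style lookahead blend
    theta -= lr * m_bar / (math.sqrt(v_hat) + eps)
    return theta, m, v

# Minimize f(theta) = theta**2 from theta = 1.0
theta, m, v = 1.0, 0.0, 0.0
for t in range(1, 501):
    theta, m, v = nadam_step(theta, 2.0 * theta, m, v, t)
print(f"theta after 500 steps: {theta:.3f}")
```

The `m_bar` blend of the bias-corrected moment with the bias-corrected current gradient is what distinguishes Nadam from plain Adam; the rest of the step is identical.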
diff --git a/api/python/model.html b/api/python/model.html
index 321daa14..bd9979b7 100644
--- a/api/python/model.html
+++ b/api/python/model.html
@@ -2187,7 +2187,7 @@ <h1 id="logo-wrap">
 <dd><p>The Nesterov Adam optimizer.</p>
 <p>Much like Adam is essentially RMSprop with momentum,
 Nadam is Adam RMSprop with Nesterov momentum available
-at <a class="reference external" href="http://cs229.stanford.edu/proj2015/054_report.pdf">http://cs229.stanford.edu/proj2015/054_report.pdf</a>.</p>
+at <a class="reference external" href="https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ">https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ</a>.</p>
 <p>This optimizer accepts the following parameters in addition to those accepted
 by <a class="reference internal" href="optimization/optimization.html#mxnet.optimizer.Optimizer" title="mxnet.optimizer.Optimizer"><code class="xref py py-class docutils literal"><span class="pre">Optimizer</span></code></a>.</p>
 <table class="docutils field-list" frame="void" rules="none">
diff --git a/api/python/optimization.html b/api/python/optimization.html
index bb8cf9c6..3c563e60 100644
--- a/api/python/optimization.html
+++ b/api/python/optimization.html
@@ -811,7 +811,7 @@ <h1 id="logo-wrap">
 <dd><p>The Nesterov Adam optimizer.</p>
 <p>Much like Adam is essentially RMSprop with momentum,
 Nadam is Adam RMSprop with Nesterov momentum available
-at <a class="reference external" href="http://cs229.stanford.edu/proj2015/054_report.pdf">http://cs229.stanford.edu/proj2015/054_report.pdf</a>.</p>
+at <a class="reference external" href="https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ">https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ</a>.</p>
 <p>This optimizer accepts the following parameters in addition to those accepted
 by <a class="reference internal" href="#mxnet.optimizer.Optimizer" title="mxnet.optimizer.Optimizer"><code class="xref py py-class docutils literal"><span class="pre">Optimizer</span></code></a>.</p>
 <table class="docutils field-list" frame="void" rules="none">
diff --git a/api/python/optimization/optimization.html b/api/python/optimization/optimization.html
index 03e4c76b..4fb684e1 100644
--- a/api/python/optimization/optimization.html
+++ b/api/python/optimization/optimization.html
@@ -978,7 +978,7 @@ <h1 id="logo-wrap">
 <dd><p>The Nesterov Adam optimizer.</p>
 <p>Much like Adam is essentially RMSprop with momentum,
 Nadam is Adam RMSprop with Nesterov momentum available
-at <a class="reference external" href="http://cs229.stanford.edu/proj2015/054_report.pdf">http://cs229.stanford.edu/proj2015/054_report.pdf</a>.</p>
+at <a class="reference external" href="https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ">https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ</a>.</p>
 <p>This optimizer accepts the following parameters in addition to those accepted
 by <a class="reference internal" href="#mxnet.optimizer.Optimizer" title="mxnet.optimizer.Optimizer"><code class="xref py py-class docutils literal"><span class="pre">Optimizer</span></code></a>.</p>
 <table class="docutils field-list" frame="void" rules="none">
diff --git a/faq/develop_and_hack.html b/faq/develop_and_hack.html
index b105ca99..1f068729 100644
--- a/faq/develop_and_hack.html
+++ b/faq/develop_and_hack.html
@@ -207,7 +207,6 @@ <h1 id="logo-wrap">
 <div class="toctree-wrapper compound">
 <ul>
 <li class="toctree-l1"><a class="reference internal" href="new_op.html">Create new operators</a></li>
-<li class="toctree-l1"><a class="reference internal" href="torch.html">Use Torch from MXNet</a></li>
 <li class="toctree-l1"><a class="reference internal" href="env_var.html">Set environment variables of MXNet</a></li>
 </ul>
 </div>
diff --git a/faq/index.html b/faq/index.html
index aedc70fd..c74d9f6c 100644
--- a/faq/index.html
+++ b/faq/index.html
@@ -229,7 +229,6 @@ <h1 id="logo-wrap">
 <li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/add_op_in_backend.html">How do I implement operators in MXNet backend?</a></li>
 <li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/new_op.html">How do I create new operators in MXNet?</a></li>
 <li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/env_var.html">How do I set MXNet’s environmental variables?</a></li>
-<li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/torch.html">How do I use MXNet as a front end for Torch?</a></li>
 </ul>
 </li>
 <li class="toctree-l2"><a class="reference internal" href="#questions-about-using-mxnet">Questions about Using MXNet</a></li>
@@ -303,7 +302,6 @@ <h1 id="logo-wrap">
 <li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/add_op_in_backend.html">How do I implement operators in MXNet backend?</a></li>
 <li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/new_op.html">How do I create new operators in MXNet?</a></li>
 <li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/env_var.html">How do I set MXNet’s environmental variables?</a></li>
-<li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/torch.html">How do I use MXNet as a front end for Torch?</a></li>
 </ul>
 </div>
 </div>
diff --git a/faq/torch.html b/faq/torch.html
deleted file mode 100644
index f3553a0c..00000000
--- a/faq/torch.html
+++ /dev/null
@@ -1,315 +0,0 @@
-<!DOCTYPE html>
-
-<html lang="en">
-<head>
-<meta charset="utf-8"/>
-<meta content="IE=edge" http-equiv="X-UA-Compatible"/>
-<meta content="width=device-width, initial-scale=1" name="viewport"/>
-<title>How to Use MXNet As an (Almost) Full-function Torch Front End — mxnet  documentation</title>
-<link crossorigin="anonymous" 
href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/css/bootstrap.min.css"; 
integrity="sha384-1q8mTJOASx8j1Au+a5WDVnPi2lkFfwwEAa8hDDdjZlpLegxhjVME1fgjWPGmkzs7"
 rel="stylesheet"/>
-<link 
href="https://maxcdn.bootstrapcdn.com/font-awesome/4.5.0/css/font-awesome.min.css";
 rel="stylesheet"/>
-<link href="../_static/basic.css" rel="stylesheet" type="text/css">
-<link href="../_static/pygments.css" rel="stylesheet" type="text/css">
-<link href="../_static/mxnet.css" rel="stylesheet" type="text/css"/>
-<script type="text/javascript">
-      var DOCUMENTATION_OPTIONS = {
-        URL_ROOT:    '../',
-        VERSION:     '',
-        COLLAPSE_INDEX: false,
-        FILE_SUFFIX: '.html',
-        HAS_SOURCE:  true,
-        SOURCELINK_SUFFIX: ''
-      };
-    </script>
-<script src="https://code.jquery.com/jquery-1.11.1.min.js"; 
type="text/javascript"></script>
-<script src="../_static/underscore.js" type="text/javascript"></script>
-<script src="../_static/searchtools_custom.js" type="text/javascript"></script>
-<script src="../_static/doctools.js" type="text/javascript"></script>
-<script src="../_static/selectlang.js" type="text/javascript"></script>
-<script 
src="https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.1/MathJax.js?config=TeX-AMS-MML_HTMLorMML";
 type="text/javascript"></script>
-<script type="text/javascript"> jQuery(function() { 
Search.loadIndex("/searchindex.js"); Search.init();}); </script>
-<script>
-      
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
-      (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new
-      Date();a=s.createElement(o),
-      
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
-      
})(window,document,'script','https://www.google-analytics.com/analytics.js','ga');
-
-      ga('create', 'UA-96378503-1', 'auto');
-      ga('send', 'pageview');
-
-    </script>
-<!-- -->
-<!-- <script type="text/javascript" src="../_static/jquery.js"></script> -->
-<!-- -->
-<!-- <script type="text/javascript" src="../_static/underscore.js"></script> 
-->
-<!-- -->
-<!-- <script type="text/javascript" src="../_static/doctools.js"></script> -->
-<!-- -->
-<!-- <script type="text/javascript" 
src="https://cdn.mathjax.org/mathjax/latest/MathJax.js?config=TeX-AMS-MML_HTMLorMML";></script>
 -->
-<!-- -->
-<link 
href="https://raw.githubusercontent.com/dmlc/web-data/master/mxnet/image/mxnet-icon.png";
 rel="icon" type="image/png"/>
-</link></link></head>
-<body 
background="https://raw.githubusercontent.com/dmlc/web-data/master/mxnet/image/mxnet-background-compressed.jpeg";
 role="document">
-<div class="content-block"><div class="navbar navbar-fixed-top">
-<div class="container" id="navContainer">
-<div class="innder" id="header-inner">
-<h1 id="logo-wrap">
-<a href="../" id="logo"><img 
src="https://raw.githubusercontent.com/dmlc/web-data/master/mxnet/image/mxnet_logo.png"/></a>
-</h1>
-<nav class="nav-bar" id="main-nav">
-<a class="main-nav-link" href="../install/index.html">Install</a>
-<a class="main-nav-link" href="../tutorials/index.html">Tutorials</a>
-<span id="dropdown-menu-position-anchor">
-<a aria-expanded="true" aria-haspopup="true" class="main-nav-link 
dropdown-toggle" data-toggle="dropdown" href="#" role="button">Gluon <span 
class="caret"></span></a>
-<ul class="dropdown-menu navbar-menu" id="package-dropdown-menu">
-<li><a class="main-nav-link" href="../gluon/index.html">About</a></li>
-<li><a class="main-nav-link" href="http://gluon.mxnet.io";>Tutorials</a></li>
-</ul>
-</span>
-<span id="dropdown-menu-position-anchor">
-<a aria-expanded="true" aria-haspopup="true" class="main-nav-link 
dropdown-toggle" data-toggle="dropdown" href="#" role="button">API <span 
class="caret"></span></a>
-<ul class="dropdown-menu navbar-menu" id="package-dropdown-menu">
-<li><a class="main-nav-link" href="../api/python/index.html">Python</a></li>
-<li><a class="main-nav-link" href="../api/scala/index.html">Scala</a></li>
-<li><a class="main-nav-link" href="../api/r/index.html">R</a></li>
-<li><a class="main-nav-link" href="../api/julia/index.html">Julia</a></li>
-<li><a class="main-nav-link" href="../api/c++/index.html">C++</a></li>
-<li><a class="main-nav-link" href="../api/perl/index.html">Perl</a></li>
-</ul>
-</span>
-<span id="dropdown-menu-position-anchor-docs">
-<a aria-expanded="true" aria-haspopup="true" class="main-nav-link 
dropdown-toggle" data-toggle="dropdown" href="#" role="button">Docs <span 
class="caret"></span></a>
-<ul class="dropdown-menu navbar-menu" id="package-dropdown-menu-docs">
-<li><a class="main-nav-link" href="../faq/index.html">FAQ</a></li>
-<li><a class="main-nav-link" 
href="../architecture/index.html">Architecture</a></li>
-<li><a class="main-nav-link" 
href="https://github.com/apache/incubator-mxnet/tree/1.0.0/example";>Examples</a></li>
-<li><a class="main-nav-link" href="../model_zoo/index.html">Model Zoo</a></li>
-</ul>
-</span>
-<a class="main-nav-link" href="https://github.com/dmlc/mxnet";>Github</a>
-<span id="dropdown-menu-position-anchor-community">
-<a aria-expanded="true" aria-haspopup="true" class="main-nav-link 
dropdown-toggle" data-toggle="dropdown" href="#" role="button">Community <span 
class="caret"></span></a>
-<ul class="dropdown-menu navbar-menu" id="package-dropdown-menu-community">
-<li><a class="main-nav-link" href="../community/index.html">Community</a></li>
-<li><a class="main-nav-link" 
href="../community/contribute.html">Contribute</a></li>
-<li><a class="main-nav-link" href="../community/powered_by.html">Powered 
By</a></li>
-</ul>
-</span>
-<a class="main-nav-link" href="http://discuss.mxnet.io";>Discuss</a>
-<span id="dropdown-menu-position-anchor-version" style="position: relative"><a 
href="#" class="main-nav-link dropdown-toggle" data-toggle="dropdown" 
role="button" aria-haspopup="true" aria-expanded="true">Versions(1.0.0)<span 
class="caret"></span></a><ul id="package-dropdown-menu" 
class="dropdown-menu"><li><a class="main-nav-link" 
href=https://mxnet.incubator.apache.org/>1.0.0</a></li><li><a 
class="main-nav-link" 
href=https://mxnet.incubator.apache.org/versions/0.12.1/index.html>0.12.1</a></li><li><a
 class="main-nav-link" 
href=https://mxnet.incubator.apache.org/versions/0.12.0/index.html>0.12.0</a></li><li><a
 class="main-nav-link" 
href=https://mxnet.incubator.apache.org/versions/0.11.0/index.html>0.11.0</a></li><li><a
 class="main-nav-link" 
href=https://mxnet.incubator.apache.org/versions/master/index.html>master</a></li></ul></span></nav>
-<script> function getRootPath(){ return "../" } </script>
-<div class="burgerIcon dropdown">
-<a class="dropdown-toggle" data-toggle="dropdown" href="#" role="button">☰</a>
-<ul class="dropdown-menu" id="burgerMenu">
-<li><a href="../install/index.html">Install</a></li>
-<li><a class="main-nav-link" href="../tutorials/index.html">Tutorials</a></li>
-<li class="dropdown-submenu">
-<a href="#" tabindex="-1">Community</a>
-<ul class="dropdown-menu">
-<li><a href="../community/index.html" tabindex="-1">Community</a></li>
-<li><a href="../community/contribute.html" tabindex="-1">Contribute</a></li>
-<li><a href="../community/powered_by.html" tabindex="-1">Powered By</a></li>
-</ul>
-</li>
-<li class="dropdown-submenu">
-<a href="#" tabindex="-1">API</a>
-<ul class="dropdown-menu">
-<li><a href="../api/python/index.html" tabindex="-1">Python</a>
-</li>
-<li><a href="../api/scala/index.html" tabindex="-1">Scala</a>
-</li>
-<li><a href="../api/r/index.html" tabindex="-1">R</a>
-</li>
-<li><a href="../api/julia/index.html" tabindex="-1">Julia</a>
-</li>
-<li><a href="../api/c++/index.html" tabindex="-1">C++</a>
-</li>
-<li><a href="../api/perl/index.html" tabindex="-1">Perl</a>
-</li>
-</ul>
-</li>
-<li class="dropdown-submenu">
-<a href="#" tabindex="-1">Docs</a>
-<ul class="dropdown-menu">
-<li><a href="../tutorials/index.html" tabindex="-1">Tutorials</a></li>
-<li><a href="../faq/index.html" tabindex="-1">FAQ</a></li>
-<li><a href="../architecture/index.html" tabindex="-1">Architecture</a></li>
-<li><a href="https://github.com/apache/incubator-mxnet/tree/1.0.0/example"; 
tabindex="-1">Examples</a></li>
-<li><a href="../model_zoo/index.html" tabindex="-1">Model Zoo</a></li>
-</ul>
-</li>
-<li><a href="../architecture/index.html">Architecture</a></li>
-<li><a class="main-nav-link" 
href="https://github.com/dmlc/mxnet";>Github</a></li>
-<li id="dropdown-menu-position-anchor-version-mobile" class="dropdown-submenu" 
style="position: relative"><a href="#" tabindex="-1">Versions(1.0.0)</a><ul 
class="dropdown-menu"><li><a tabindex="-1" 
href=https://mxnet.incubator.apache.org/>1.0.0</a></li><li><a tabindex="-1" 
href=https://mxnet.incubator.apache.org/versions/0.12.1/index.html>0.12.1</a></li><li><a
 tabindex="-1" 
href=https://mxnet.incubator.apache.org/versions/0.12.0/index.html>0.12.0</a></li><li><a
 tabindex="-1" 
href=https://mxnet.incubator.apache.org/versions/0.11.0/index.html>0.11.0</a></li><li><a
 tabindex="-1" 
href=https://mxnet.incubator.apache.org/versions/master/index.html>master</a></li></ul></li></ul>
-</div>
-<div class="plusIcon dropdown">
-<a class="dropdown-toggle" data-toggle="dropdown" href="#" role="button"><span 
aria-hidden="true" class="glyphicon glyphicon-plus"></span></a>
-<ul class="dropdown-menu dropdown-menu-right" id="plusMenu"></ul>
-</div>
-<div id="search-input-wrap">
-<form action="../search.html" autocomplete="off" class="" method="get" 
role="search">
-<div class="form-group inner-addon left-addon">
-<i class="glyphicon glyphicon-search"></i>
-<input class="form-control" name="q" placeholder="Search" type="text"/>
-</div>
-<input name="check_keywords" type="hidden" value="yes">
-<input name="area" type="hidden" value="default"/>
-</input></form>
-<div id="search-preview"></div>
-</div>
-<div id="searchIcon">
-<span aria-hidden="true" class="glyphicon glyphicon-search"></span>
-</div>
-<!-- <div id="lang-select-wrap"> -->
-<!--   <label id="lang-select-label"> -->
-<!--     <\!-- <i class="fa fa-globe"></i> -\-> -->
-<!--     <span></span> -->
-<!--   </label> -->
-<!--   <select id="lang-select"> -->
-<!--     <option value="en">Eng</option> -->
-<!--     <option value="zh">中文</option> -->
-<!--   </select> -->
-<!-- </div> -->
-<!--     <a id="mobile-nav-toggle">
-        <span class="mobile-nav-toggle-bar"></span>
-        <span class="mobile-nav-toggle-bar"></span>
-        <span class="mobile-nav-toggle-bar"></span>
-      </a> -->
-</div>
-</div>
-</div>
-<script type="text/javascript">
-        $('body').css('background', 'white');
-    </script>
-<div class="container">
-<div class="row">
-<div aria-label="main navigation" class="sphinxsidebar leftsidebar" 
role="navigation">
-<div class="sphinxsidebarwrapper">
-<ul>
-<li class="toctree-l1"><a class="reference internal" 
href="../api/python/index.html">Python Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" 
href="../api/r/index.html">R Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" 
href="../api/julia/index.html">Julia Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" 
href="../api/c++/index.html">C++ Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" 
href="../api/scala/index.html">Scala Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" 
href="../api/perl/index.html">Perl Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" href="index.html">HowTo 
Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" 
href="../architecture/index.html">System Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" 
href="../tutorials/index.html">Tutorials</a></li>
-<li class="toctree-l1"><a class="reference internal" 
href="../community/index.html">Community</a></li>
-</ul>
-</div>
-</div>
-<div class="content">
-<div class="page-tracker"></div>
-<div class="section" id="how-to-use-mxnet-as-an-almost-full-function-torch-front-end">
-<span id="how-to-use-mxnet-as-an-almost-full-function-torch-front-end"></span><h1>How to Use MXNet As an (Almost) Full-function Torch Front End<a class="headerlink" href="#how-to-use-mxnet-as-an-almost-full-function-torch-front-end" title="Permalink to this headline">¶</a></h1>
-<p>This topic demonstrates how to use MXNet as a front end to two of Torch’s major functionalities:</p>
-<ul class="simple">
-<li>Call Torch’s tensor mathematical functions with MXNet.NDArray</li>
-<li>Embed Torch’s neural network modules (layers) into MXNet’s symbolic graph</li>
-</ul>
-<div class="section" id="compile-with-torch">
-<span id="compile-with-torch"></span><h2>Compile with Torch<a class="headerlink" href="#compile-with-torch" title="Permalink to this headline">¶</a></h2>
-<ul class="simple">
-<li>Install Torch using the <a class="reference external" href="http://torch.ch/docs/getting-started.html">official guide</a>.<ul>
-<li>If you haven’t already done so, copy <code class="docutils literal"><span class="pre">make/config.mk</span></code> (Linux) or <code class="docutils literal"><span class="pre">make/osx.mk</span></code> (Mac) into the MXNet root folder as <code class="docutils literal"><span class="pre">config.mk</span></code>. In <code class="docutils literal"><span class="pre">config.mk</span></code> uncomment the lines <code class="docutils literal"><span class="pre">TORCH_PATH</span> <span class="pre">=</span> <span class="pre">$(HOME)/torch</span></code> and <code class="docutils literal"><span class="pre">MXNET_PLUGINS</span> <span class="pre">+=</span> <span class="pre">plugin/torch/torch.mk</span></code>.</li>
-<li>By default, Torch should be installed in your home folder (so <code class="docutils literal"><span class="pre">TORCH_PATH</span> <span class="pre">=</span> <span class="pre">$(HOME)/torch</span></code>). Modify TORCH_PATH to point to your torch installation, if necessary.</li>
-</ul>
-</li>
-<li>Run <code class="docutils literal"><span class="pre">make</span> <span class="pre">clean</span> <span class="pre">&amp;&amp;</span> <span class="pre">make</span></code> to build MXNet with Torch support.</li>
-</ul>
-</div>
-<div class="section" id="tensor-mathematics">
-<span id="tensor-mathematics"></span><h2>Tensor Mathematics<a class="headerlink" href="#tensor-mathematics" title="Permalink to this headline">¶</a></h2>
-<p>The mxnet.th module supports calling Torch’s tensor mathematical functions with mxnet.nd.NDArray. See <a class="reference external" href="https://github.com/dmlc/mxnet/blob/master/example/torch/torch_function.py">complete code</a>:</p>
-<div class="highlight-python"><div class="highlight"><pre><span></span>    
<span class="kn">import</span> <span class="nn">mxnet</span> <span 
class="kn">as</span> <span class="nn">mx</span>
-    <span class="n">x</span> <span class="o">=</span> <span 
class="n">mx</span><span class="o">.</span><span class="n">th</span><span 
class="o">.</span><span class="n">randn</span><span class="p">(</span><span 
class="mi">2</span><span class="p">,</span> <span class="mi">2</span><span 
class="p">,</span> <span class="n">ctx</span><span class="o">=</span><span 
class="n">mx</span><span class="o">.</span><span class="n">cpu</span><span 
class="p">(</span><span class="mi">0</span><span class="p">))</span>
-    <span class="k">print</span> <span class="n">x</span><span 
class="o">.</span><span class="n">asnumpy</span><span class="p">()</span>
-    <span class="n">y</span> <span class="o">=</span> <span 
class="n">mx</span><span class="o">.</span><span class="n">th</span><span 
class="o">.</span><span class="n">abs</span><span class="p">(</span><span 
class="n">x</span><span class="p">)</span>
-    <span class="k">print</span> <span class="n">y</span><span 
class="o">.</span><span class="n">asnumpy</span><span class="p">()</span>
-
-    <span class="n">x</span> <span class="o">=</span> <span 
class="n">mx</span><span class="o">.</span><span class="n">th</span><span 
class="o">.</span><span class="n">randn</span><span class="p">(</span><span 
class="mi">2</span><span class="p">,</span> <span class="mi">2</span><span 
class="p">,</span> <span class="n">ctx</span><span class="o">=</span><span 
class="n">mx</span><span class="o">.</span><span class="n">cpu</span><span 
class="p">(</span><span class="mi">0</span><span class="p">))</span>
-    <span class="k">print</span> <span class="n">x</span><span 
class="o">.</span><span class="n">asnumpy</span><span class="p">()</span>
-    <span class="n">mx</span><span class="o">.</span><span 
class="n">th</span><span class="o">.</span><span class="n">abs</span><span 
class="p">(</span><span class="n">x</span><span class="p">,</span> <span 
class="n">x</span><span class="p">)</span> <span class="c1"># in-place</span>
-    <span class="k">print</span> <span class="n">x</span><span 
class="o">.</span><span class="n">asnumpy</span><span class="p">()</span>
-</pre></div>
-</div>
-<p>For help, use the <code class="docutils literal"><span 
class="pre">help(mx.th)</span></code> command.</p>
-<p>We’ve added support for most common functions listed on <a class="reference external" href="https://github.com/torch/torch7/blob/master/doc/maths.md">Torch’s documentation page</a>.
-If you find that the function you need is not supported, you can easily register it in <code class="docutils literal"><span class="pre">mxnet_root/plugin/torch/torch_function.cc</span></code> by using the existing registrations as examples.</p>
-</div>
-<div class="section" id="torch-modules-layers">
-<span id="torch-modules-layers"></span><h2>Torch Modules (Layers)<a class="headerlink" href="#torch-modules-layers" title="Permalink to this headline">¶</a></h2>
-<p>MXNet supports Torch’s neural network modules through the <code class="docutils literal"><span class="pre">mxnet.symbol.TorchModule</span></code> symbol.
-For example, the following code defines a three-layer DNN for classifying MNIST digits (<a class="reference external" href="https://github.com/dmlc/mxnet/blob/master/example/torch/torch_module.py">full code</a>):</p>
-<div class="highlight-python"><div class="highlight"><pre><span></span>    
<span class="n">data</span> <span class="o">=</span> <span 
class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span 
class="o">.</span><span class="n">Variable</span><span class="p">(</span><span 
class="s1">'data'</span><span class="p">)</span>
-    <span class="n">fc1</span> <span class="o">=</span> <span 
class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span 
class="o">.</span><span class="n">TorchModule</span><span 
class="p">(</span><span class="n">data_0</span><span class="o">=</span><span 
class="n">data</span><span class="p">,</span> <span 
class="n">lua_string</span><span class="o">=</span><span 
class="s1">'nn.Linear(784, 128)'</span><span class="p">,</span> <span 
class="n">num_data</span><span class="o">=</span><span class="mi">1</span><span 
class="p">,</span> <span class="n">num_params</span><span 
class="o">=</span><span class="mi">2</span><span class="p">,</span> <span 
class="n">num_outputs</span><span class="o">=</span><span 
class="mi">1</span><span class="p">,</span> <span class="n">name</span><span 
class="o">=</span><span class="s1">'fc1'</span><span class="p">)</span>
-    <span class="n">act1</span> <span class="o">=</span> <span 
class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span 
class="o">.</span><span class="n">TorchModule</span><span 
class="p">(</span><span class="n">data_0</span><span class="o">=</span><span 
class="n">fc1</span><span class="p">,</span> <span 
class="n">lua_string</span><span class="o">=</span><span 
class="s1">'nn.ReLU(false)'</span><span class="p">,</span> <span 
class="n">num_data</span><span class="o">=</span><span class="mi">1</span><span 
class="p">,</span> <span class="n">num_params</span><span 
class="o">=</span><span class="mi">0</span><span class="p">,</span> <span 
class="n">num_outputs</span><span class="o">=</span><span 
class="mi">1</span><span class="p">,</span> <span class="n">name</span><span 
class="o">=</span><span class="s1">'relu1'</span><span class="p">)</span>
-    <span class="n">fc2</span> <span class="o">=</span> <span 
class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span 
class="o">.</span><span class="n">TorchModule</span><span 
class="p">(</span><span class="n">data_0</span><span class="o">=</span><span 
class="n">act1</span><span class="p">,</span> <span 
class="n">lua_string</span><span class="o">=</span><span 
class="s1">'nn.Linear(128, 64)'</span><span class="p">,</span> <span 
class="n">num_data</span><span class="o">=</span><span class="mi">1</span><span 
class="p">,</span> <span class="n">num_params</span><span 
class="o">=</span><span class="mi">2</span><span class="p">,</span> <span 
class="n">num_outputs</span><span class="o">=</span><span 
class="mi">1</span><span class="p">,</span> <span class="n">name</span><span 
class="o">=</span><span class="s1">'fc2'</span><span class="p">)</span>
-    <span class="n">act2</span> <span class="o">=</span> <span 
class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span 
class="o">.</span><span class="n">TorchModule</span><span 
class="p">(</span><span class="n">data_0</span><span class="o">=</span><span 
class="n">fc2</span><span class="p">,</span> <span 
class="n">lua_string</span><span class="o">=</span><span 
class="s1">'nn.ReLU(false)'</span><span class="p">,</span> <span 
class="n">num_data</span><span class="o">=</span><span class="mi">1</span><span 
class="p">,</span> <span class="n">num_params</span><span 
class="o">=</span><span class="mi">0</span><span class="p">,</span> <span 
class="n">num_outputs</span><span class="o">=</span><span 
class="mi">1</span><span class="p">,</span> <span class="n">name</span><span 
class="o">=</span><span class="s1">'relu2'</span><span class="p">)</span>
-    <span class="n">fc3</span> <span class="o">=</span> <span 
class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span 
class="o">.</span><span class="n">TorchModule</span><span 
class="p">(</span><span class="n">data_0</span><span class="o">=</span><span 
class="n">act2</span><span class="p">,</span> <span 
class="n">lua_string</span><span class="o">=</span><span 
class="s1">'nn.Linear(64, 10)'</span><span class="p">,</span> <span 
class="n">num_data</span><span class="o">=</span><span class="mi">1</span><span 
class="p">,</span> <span class="n">num_params</span><span 
class="o">=</span><span class="mi">2</span><span class="p">,</span> <span 
class="n">num_outputs</span><span class="o">=</span><span 
class="mi">1</span><span class="p">,</span> <span class="n">name</span><span 
class="o">=</span><span class="s1">'fc3'</span><span class="p">)</span>
-    <span class="n">mlp</span> <span class="o">=</span> <span 
class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span 
class="o">.</span><span class="n">SoftmaxOutput</span><span 
class="p">(</span><span class="n">data</span><span class="o">=</span><span 
class="n">fc3</span><span class="p">,</span> <span class="n">name</span><span 
class="o">=</span><span class="s1">'softmax'</span><span class="p">)</span>
-</pre></div>
-</div>
-<p>Let’s break it down. First <code class="docutils literal"><span class="pre">data</span> <span class="pre">=</span> <span class="pre">mx.symbol.Variable('data')</span></code> defines a Variable as a placeholder for input.
-Then, it’s fed through Torch’s nn modules with:
-<code class="docutils literal"><span class="pre">fc1</span> <span class="pre">=</span> <span class="pre">mx.symbol.TorchModule(data_0=data,</span> <span class="pre">lua_string='nn.Linear(784,</span> <span class="pre">128)',</span> <span class="pre">num_data=1,</span> <span class="pre">num_params=2,</span> <span class="pre">num_outputs=1,</span> <span class="pre">name='fc1')</span></code>.
-To use Torch’s criterion as loss functions, you can replace the last line with:</p>
-<div class="highlight-python"><div class="highlight"><pre><span></span>    
<span class="n">logsoftmax</span> <span class="o">=</span> <span 
class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span 
class="o">.</span><span class="n">TorchModule</span><span 
class="p">(</span><span class="n">data_0</span><span class="o">=</span><span 
class="n">fc3</span><span class="p">,</span> <span 
class="n">lua_string</span><span class="o">=</span><span 
class="s1">'nn.LogSoftMax()'</span><span class="p">,</span> <span 
class="n">num_data</span><span class="o">=</span><span class="mi">1</span><span 
class="p">,</span> <span class="n">num_params</span><span 
class="o">=</span><span class="mi">0</span><span class="p">,</span> <span 
class="n">num_outputs</span><span class="o">=</span><span 
class="mi">1</span><span class="p">,</span> <span class="n">name</span><span 
class="o">=</span><span class="s1">'logsoftmax'</span><span class="p">)</span>
-    <span class="c1"># Torch's label starts from 1</span>
-    <span class="n">label</span> <span class="o">=</span> <span 
class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span 
class="o">.</span><span class="n">Variable</span><span class="p">(</span><span 
class="s1">'softmax_label'</span><span class="p">)</span> <span 
class="o">+</span> <span class="mi">1</span>
-    <span class="n">mlp</span> <span class="o">=</span> <span 
class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span 
class="o">.</span><span class="n">TorchCriterion</span><span 
class="p">(</span><span class="n">data</span><span class="o">=</span><span 
class="n">logsoftmax</span><span class="p">,</span> <span 
class="n">label</span><span class="o">=</span><span class="n">label</span><span 
class="p">,</span> <span class="n">lua_string</span><span 
class="o">=</span><span class="s1">'nn.ClassNLLCriterion()'</span><span 
class="p">,</span> <span class="n">name</span><span class="o">=</span><span 
class="s1">'softmax'</span><span class="p">)</span>
-</pre></div>
-</div>
-<p>The input to the nn module is named data_i for i = 0 ... num_data-1. <code 
class="docutils literal"><span class="pre">lua_string</span></code> is a single 
Lua statement that creates the module object.
-For Torch's built-in module, this is simply <code class="docutils literal"><span class="pre">nn.module_name(arguments)</span></code>.
-If you are using a custom module, place it in a .lua script file and load it 
with <code class="docutils literal"><span class="pre">require</span> <span 
class="pre">'module_file.lua'</span></code> if your script returns a torch.nn 
object, or <code class="docutils literal"><span class="pre">(require</span> 
<span class="pre">'module_file.lua')()</span></code> if your script returns a 
torch.nn class.</p>
-</div>
-</div>
-</div>
-</div>
-<div aria-label="main navigation" class="sphinxsidebar rightsidebar" 
role="navigation">
-<div class="sphinxsidebarwrapper">
-<h3><a href="../index.html">Table Of Contents</a></h3>
-<ul>
-<li><a class="reference internal" href="#">How to Use MXNet As an (Almost) 
Full-function Torch Front End</a><ul>
-<li><a class="reference internal" href="#compile-with-torch">Compile with 
Torch</a></li>
-<li><a class="reference internal" href="#tensor-mathematics">Tensor 
Mathematics</a></li>
-<li><a class="reference internal" href="#torch-modules-layers">Torch Modules 
(Layers)</a></li>
-</ul>
-</li>
-</ul>
-</div>
-</div>
-</div><div class="footer">
-<div class="section-disclaimer">
-<div class="container">
-<div>
-<img height="60" 
src="https://raw.githubusercontent.com/dmlc/web-data/master/mxnet/image/apache_incubator_logo.png"/>
-<p>
-            Apache MXNet is an effort undergoing incubation at The Apache 
Software Foundation (ASF), <strong>sponsored by the <i>Apache 
Incubator</i></strong>. Incubation is required of all newly accepted projects 
until a further review indicates that the infrastructure, communications, and 
decision making process have stabilized in a manner consistent with other 
successful ASF projects. While incubation status is not necessarily a 
reflection of the completeness or stability of the code, it does indicate that 
the project has yet to be fully endorsed by the ASF.
-        </p>
-<p>
-            "Copyright ? 2017, The Apache Software Foundation
-            Apache MXNet, MXNet, Apache, the Apache feather, and the Apache 
MXNet project logo are either registered trademarks or trademarks of the Apache 
Software Foundation."
-        </p>
-</div>
-</div>
-</div>
-</div> <!-- pagename != index -->
-</div>
-<script crossorigin="anonymous" 
integrity="sha384-0mSbJDEHialfmuBBQP6A4Qrprq5OVfW37PRR3j5ELqxss1yVqOtnepnHVP9aJ7xS"
 
src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/js/bootstrap.min.js";></script>
-<script src="../_static/js/sidebar.js" type="text/javascript"></script>
-<script src="../_static/js/search.js" type="text/javascript"></script>
-<script src="../_static/js/navbar.js" type="text/javascript"></script>
-<script src="../_static/js/clipboard.min.js" type="text/javascript"></script>
-<script src="../_static/js/copycode.js" type="text/javascript"></script>
-<script src="../_static/js/page.js" type="text/javascript"></script>
-<script type="text/javascript">
-        $('body').ready(function () {
-            $('body').css('visibility', 'visible');
-        });
-    </script>
-</body>
-</html>
\ No newline at end of file
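The deleted guide above notes that inputs to a wrapped nn module are passed as keyword arguments named data_i for i = 0 ... num_data-1. A minimal Python sketch of that naming convention (the helper `torch_module_kwargs` is invented here for illustration and is not part of MXNet):

```python
# Hypothetical helper (not part of MXNet): builds the data_0 ... data_{n-1}
# keyword mapping that the TorchModule input-naming convention describes.
def torch_module_kwargs(*inputs):
    return {"data_%d" % i: sym for i, sym in enumerate(inputs)}

# Two inputs become keywords data_0 and data_1:
kwargs = torch_module_kwargs("input_a", "input_b")
```

Under that assumption, a call like `mx.symbol.TorchModule(lua_string='nn.Linear(784, 128)', num_data=1, ..., **torch_module_kwargs(data))` would be equivalent to passing `data_0=data` explicitly.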
diff --git a/how_to/develop_and_hack.html b/how_to/develop_and_hack.html
index ddc5aa9c..bbb4c1d7 100644
--- a/how_to/develop_and_hack.html
+++ b/how_to/develop_and_hack.html
@@ -169,7 +169,6 @@ <h1 id="logo-wrap">
 <div class="toctree-wrapper compound">
 <ul>
 <li class="toctree-l1"><a class="reference internal" href="new_op.html">Create 
new operators</a></li>
-<li class="toctree-l1"><a class="reference internal" href="torch.html">Use 
Torch from MXNet</a></li>
 <li class="toctree-l1"><a class="reference internal" href="env_var.html">Set 
environment variables of MXNet</a></li>
 </ul>
 </div>
diff --git a/how_to/index.html b/how_to/index.html
index 380d51cb..f88b0b62 100644
--- a/how_to/index.html
+++ b/how_to/index.html
@@ -188,7 +188,6 @@ <h1 id="logo-wrap">
 <li class="toctree-l3"><a class="reference external" 
href="https://mxnet.incubator.apache.org/community/contribute.html";>How do I 
contribute a patch to MXNet?</a></li>
 <li class="toctree-l3"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/new_op.html";>How do I create 
new operators in MXNet?</a></li>
 <li class="toctree-l3"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/env_var.html";>How do I set 
MXNet's environmental variables?</a></li>
-<li class="toctree-l3"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/torch.html";>How do I use MXNet 
as a front end for Torch?</a></li>
 </ul>
 </li>
 <li class="toctree-l2"><a class="reference internal" 
href="#questions-about-using-mxnet">Questions about Using MXNet</a></li>
@@ -251,7 +250,6 @@ <h1 id="logo-wrap">
 <li class="toctree-l1"><a class="reference external" 
href="https://mxnet.incubator.apache.org/community/contribute.html";>How do I 
contribute a patch to MXNet?</a></li>
 <li class="toctree-l1"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/new_op.html";>How do I create 
new operators in MXNet?</a></li>
 <li class="toctree-l1"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/env_var.html";>How do I set 
MXNet's environmental variables?</a></li>
-<li class="toctree-l1"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/torch.html";>How do I use MXNet 
as a front end for Torch?</a></li>
 </ul>
 </div>
 </div>
diff --git a/how_to/torch.html b/how_to/torch.html
deleted file mode 100644
index f69a78f8..00000000
--- a/how_to/torch.html
+++ /dev/null
@@ -1,264 +0,0 @@
-<!DOCTYPE html>
-
-<html lang="en">
-<head>
-<meta charset="utf-8"/>
-<meta content="IE=edge" http-equiv="X-UA-Compatible"/>
-<meta content="width=device-width, initial-scale=1" name="viewport"/>
-<title>How to Use MXNet As an (Almost) Full-function Torch Front End — mxnet  documentation</title>
-<link crossorigin="anonymous" 
href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/css/bootstrap.min.css"; 
integrity="sha384-1q8mTJOASx8j1Au+a5WDVnPi2lkFfwwEAa8hDDdjZlpLegxhjVME1fgjWPGmkzs7"
 rel="stylesheet"/>
-<link 
href="https://maxcdn.bootstrapcdn.com/font-awesome/4.5.0/css/font-awesome.min.css";
 rel="stylesheet"/>
-<link href="../_static/basic.css" rel="stylesheet" type="text/css">
-<link href="../_static/pygments.css" rel="stylesheet" type="text/css">
-<link href="../_static/mxnet.css" rel="stylesheet" type="text/css"/>
-<script type="text/javascript">
-      var DOCUMENTATION_OPTIONS = {
-        URL_ROOT:    '../',
-        VERSION:     '',
-        COLLAPSE_INDEX: false,
-        FILE_SUFFIX: '.html',
-        HAS_SOURCE:  true,
-        SOURCELINK_SUFFIX: ''
-      };
-    </script>
-<script src="../_static/jquery-1.11.1.js" type="text/javascript"></script>
-<script src="../_static/underscore.js" type="text/javascript"></script>
-<script src="../_static/searchtools_custom.js" type="text/javascript"></script>
-<script src="../_static/doctools.js" type="text/javascript"></script>
-<script src="../_static/selectlang.js" type="text/javascript"></script>
-<script 
src="https://cdn.mathjax.org/mathjax/latest/MathJax.js?config=TeX-AMS-MML_HTMLorMML";
 type="text/javascript"></script>
-<script type="text/javascript"> jQuery(function() { 
Search.loadIndex("/searchindex.js"); Search.init();}); </script>
-<script>
-      
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
-      (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new
-      Date();a=s.createElement(o),
-      
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
-      
})(window,document,'script','https://www.google-analytics.com/analytics.js','ga');
-
-      ga('create', 'UA-96378503-1', 'auto');
-      ga('send', 'pageview');
-
-    </script>
-<!-- -->
-<!-- <script type="text/javascript" src="../_static/jquery.js"></script> -->
-<!-- -->
-<!-- <script type="text/javascript" src="../_static/underscore.js"></script> 
-->
-<!-- -->
-<!-- <script type="text/javascript" src="../_static/doctools.js"></script> -->
-<!-- -->
-<!-- <script type="text/javascript" 
src="https://cdn.mathjax.org/mathjax/latest/MathJax.js?config=TeX-AMS-MML_HTMLorMML";></script>
 -->
-<!-- -->
-<link 
href="https://raw.githubusercontent.com/dmlc/web-data/master/mxnet/image/mxnet-icon.png";
 rel="icon" type="image/png"/>
-</link></link></head>
-<body role="document"><div class="navbar navbar-fixed-top">
-<div class="container" id="navContainer">
-<div class="innder" id="header-inner">
-<h1 id="logo-wrap">
-<a href="../" id="logo"><img src="../_static/mxnet.png"/></a>
-</h1>
-<nav class="nav-bar" id="main-nav">
-<a class="main-nav-link" href="../get_started/install.html">Install</a>
-<a class="main-nav-link" href="../tutorials/index.html">Tutorials</a>
-<span id="dropdown-menu-position-anchor">
-<a aria-expanded="true" aria-haspopup="true" class="main-nav-link 
dropdown-toggle" data-toggle="dropdown" href="#" role="button">Gluon <span 
class="caret"></span></a>
-<ul class="dropdown-menu" id="package-dropdown-menu">
-<li><a class="main-nav-link" href="../gluon/index.html">About</a></li>
-<li><a class="main-nav-link" href="http://gluon.mxnet.io/";>Tutorials</a></li>
-</ul>
-</span>
-<a class="main-nav-link" href="../how_to/index.html">How To</a>
-<span id="dropdown-menu-position-anchor">
-<a aria-expanded="true" aria-haspopup="true" class="main-nav-link 
dropdown-toggle" data-toggle="dropdown" href="#" role="button">API <span 
class="caret"></span></a>
-<ul class="dropdown-menu" id="package-dropdown-menu">
-<li><a class="main-nav-link" href="../api/python/index.html">Python</a></li>
-<li><a class="main-nav-link" href="../api/scala/index.html">Scala</a></li>
-<li><a class="main-nav-link" href="../api/r/index.html">R</a></li>
-<li><a class="main-nav-link" href="../api/julia/index.html">Julia</a></li>
-<li><a class="main-nav-link" href="../api/c++/index.html">C++</a></li>
-<li><a class="main-nav-link" href="../api/perl/index.html">Perl</a></li>
-</ul>
-</span>
-<a class="main-nav-link" href="../architecture/index.html">Architecture</a>
-<!-- <a class="main-nav-link" href="../community/index.html">Community</a> -->
-<a class="main-nav-link" href="https://github.com/dmlc/mxnet";>Github</a>
-<span id="dropdown-menu-position-anchor-version" style="position: relative"><a 
href="#" class="main-nav-link dropdown-toggle" data-toggle="dropdown" 
role="button" aria-haspopup="true" aria-expanded="true">Versions(0.11.0)<span 
class="caret"></span></a><ul id="package-dropdown-menu" 
class="dropdown-menu"><li><a class="main-nav-link" 
href=https://mxnet.incubator.apache.org/>0.11.0</a></li><li><a 
class="main-nav-link" 
href=https://mxnet.incubator.apache.org/versions/master/index.html>master</a></li></ul></span></nav>
-<script> function getRootPath(){ return "../" } </script>
-<div class="burgerIcon dropdown">
-<a class="dropdown-toggle" data-toggle="dropdown" href="#" role="button">?</a>
-<ul class="dropdown-menu dropdown-menu-right" id="burgerMenu">
-<li><a href="../get_started/install.html">Install</a></li>
-<li><a href="../tutorials/index.html">Tutorials</a></li>
-<li><a href="../how_to/index.html">How To</a></li>
-<li class="dropdown-submenu">
-<a href="#" tabindex="-1">API</a>
-<ul class="dropdown-menu">
-<li><a href="../api/python/index.html" tabindex="-1">Python</a>
-</li>
-<li><a href="../api/scala/index.html" tabindex="-1">Scala</a>
-</li>
-<li><a href="../api/r/index.html" tabindex="-1">R</a>
-</li>
-<li><a href="../api/julia/index.html" tabindex="-1">Julia</a>
-</li>
-<li><a href="../api/c++/index.html" tabindex="-1">C++</a>
-</li>
-<li><a href="../api/perl/index.html" tabindex="-1">Perl</a>
-</li>
-</ul>
-</li>
-<li><a href="../architecture/index.html">Architecture</a></li>
-<li><a class="main-nav-link" 
href="https://github.com/dmlc/mxnet";>Github</a></li>
-<li id="dropdown-menu-position-anchor-version-mobile" class="dropdown-submenu" 
style="position: relative"><a href="#" tabindex="-1">Versions(0.11.0)</a><ul 
class="dropdown-menu"><li><a tabindex="-1" 
href=https://mxnet.incubator.apache.org/>0.11.0</a></li><li><a tabindex="-1" 
href=https://mxnet.incubator.apache.org/versions/master/index.html>master</a></li></ul></li></ul>
-</div>
-<div class="plusIcon dropdown">
-<a class="dropdown-toggle" data-toggle="dropdown" href="#" role="button"><span 
aria-hidden="true" class="glyphicon glyphicon-plus"></span></a>
-<ul class="dropdown-menu dropdown-menu-right" id="plusMenu"></ul>
-</div>
-<div id="search-input-wrap">
-<form action="../search.html" autocomplete="off" class="" method="get" 
role="search">
-<div class="form-group inner-addon left-addon">
-<i class="glyphicon glyphicon-search"></i>
-<input class="form-control" name="q" placeholder="Search" type="text"/>
-</div>
-<input name="check_keywords" type="hidden" value="yes">
-<input name="area" type="hidden" value="default"/>
-</input></form>
-<div id="search-preview"></div>
-</div>
-<div id="searchIcon">
-<span aria-hidden="true" class="glyphicon glyphicon-search"></span>
-</div>
-<!-- <div id="lang-select-wrap"> -->
-<!--   <label id="lang-select-label"> -->
-<!--     <\!-- <i class="fa fa-globe"></i> -\-> -->
-<!--     <span></span> -->
-<!--   </label> -->
-<!--   <select id="lang-select"> -->
-<!--     <option value="en">Eng</option> -->
-<!--     <option value="zh">??</option> -->
-<!--   </select> -->
-<!-- </div> -->
-<!--     <a id="mobile-nav-toggle">
-        <span class="mobile-nav-toggle-bar"></span>
-        <span class="mobile-nav-toggle-bar"></span>
-        <span class="mobile-nav-toggle-bar"></span>
-      </a> -->
-</div>
-</div>
-</div>
-<div class="container">
-<div class="row">
-<div aria-label="main navigation" class="sphinxsidebar leftsidebar" 
role="navigation">
-<div class="sphinxsidebarwrapper">
-<ul>
-<li class="toctree-l1"><a class="reference internal" 
href="../api/python/index.html">Python Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" 
href="../api/r/index.html">R Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" 
href="../api/julia/index.html">Julia Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" 
href="../api/c++/index.html">C++ Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" 
href="../api/scala/index.html">Scala Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" 
href="../api/perl/index.html">Perl Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" href="index.html">HowTo 
Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" 
href="../architecture/index.html">System Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" 
href="../tutorials/index.html">Tutorials</a></li>
-</ul>
-</div>
-</div>
-<div class="content">
-<div class="section" 
id="how-to-use-mxnet-as-an-almost-full-function-torch-front-end">
-<span 
id="how-to-use-mxnet-as-an-almost-full-function-torch-front-end"></span><h1>How 
to Use MXNet As an (Almost) Full-function Torch Front End<a class="headerlink" 
href="#how-to-use-mxnet-as-an-almost-full-function-torch-front-end" 
title="Permalink to this headline">?</a></h1>
-<p>This topic demonstrates how to use MXNet as a front end to two of Torch's major functionalities:</p>
-<ul class="simple">
-<li>Call Torch's tensor mathematical functions with MXNet.NDArray</li>
-<li>Embed Torch's neural network modules (layers) into MXNet's symbolic graph</li>
-</ul>
-<div class="section" id="compile-with-torch">
-<span id="compile-with-torch"></span><h2>Compile with Torch<a 
class="headerlink" href="#compile-with-torch" title="Permalink to this 
?">
headline">¶</a></h2>
-<ul class="simple">
-<li>Install Torch using the <a class="reference external" 
href="http://torch.ch/docs/getting-started.html";>official guide</a>.<ul>
-<li>If you haven't already done so, copy <code class="docutils literal"><span class="pre">make/config.mk</span></code> (Linux) or <code class="docutils literal"><span class="pre">make/osx.mk</span></code> (Mac) into the MXNet root folder as <code class="docutils literal"><span class="pre">config.mk</span></code>. In <code class="docutils literal"><span class="pre">config.mk</span></code> uncomment the lines <code class="docutils literal"><span class="pre">TORCH_PATH</span> <span class="pre">=</span> <span class="pre">$(HOME)/torch</span></code> and <code class="docutils literal"><span class="pre">MXNET_PLUGINS</span> <span class="pre">+=</span> <span class="pre">plugin/torch/torch.mk</span></code>.</li>
-<li>By default, Torch should be installed in your home folder (so <code 
class="docutils literal"><span class="pre">TORCH_PATH</span> <span 
class="pre">=</span> <span class="pre">$(HOME)/torch</span></code>). Modify 
TORCH_PATH to point to your torch installation, if necessary.</li>
-</ul>
-</li>
-<li>Run <code class="docutils literal"><span class="pre">make</span> <span 
class="pre">clean</span> <span class="pre">&amp;&amp;</span> <span 
class="pre">make</span></code> to build MXNet with Torch support.</li>
-</ul>
-</div>
-<div class="section" id="tensor-mathematics">
-<span id="tensor-mathematics"></span><h2>Tensor Mathematics<a 
class="headerlink" href="#tensor-mathematics" title="Permalink to this 
?">
headline">¶</a></h2>
-<p>The mxnet.th module supports calling Torch's tensor mathematical functions with mxnet.nd.NDArray. See <a class="reference external" href="https://github.com/dmlc/mxnet/blob/master/example/torch/torch_function.py";>complete code</a>:</p>
-<div class="highlight-python"><div class="highlight"><pre><span></span>    
<span class="kn">import</span> <span class="nn">mxnet</span> <span 
class="kn">as</span> <span class="nn">mx</span>
-    <span class="n">x</span> <span class="o">=</span> <span 
class="n">mx</span><span class="o">.</span><span class="n">th</span><span 
class="o">.</span><span class="n">randn</span><span class="p">(</span><span 
class="mi">2</span><span class="p">,</span> <span class="mi">2</span><span 
class="p">,</span> <span class="n">ctx</span><span class="o">=</span><span 
class="n">mx</span><span class="o">.</span><span class="n">cpu</span><span 
class="p">(</span><span class="mi">0</span><span class="p">))</span>
-    <span class="k">print</span> <span class="n">x</span><span 
class="o">.</span><span class="n">asnumpy</span><span class="p">()</span>
-    <span class="n">y</span> <span class="o">=</span> <span 
class="n">mx</span><span class="o">.</span><span class="n">th</span><span 
class="o">.</span><span class="n">abs</span><span class="p">(</span><span 
class="n">x</span><span class="p">)</span>
-    <span class="k">print</span> <span class="n">y</span><span 
class="o">.</span><span class="n">asnumpy</span><span class="p">()</span>
-
-    <span class="n">x</span> <span class="o">=</span> <span 
class="n">mx</span><span class="o">.</span><span class="n">th</span><span 
class="o">.</span><span class="n">randn</span><span class="p">(</span><span 
class="mi">2</span><span class="p">,</span> <span class="mi">2</span><span 
class="p">,</span> <span class="n">ctx</span><span class="o">=</span><span 
class="n">mx</span><span class="o">.</span><span class="n">cpu</span><span 
class="p">(</span><span class="mi">0</span><span class="p">))</span>
-    <span class="k">print</span> <span class="n">x</span><span 
class="o">.</span><span class="n">asnumpy</span><span class="p">()</span>
-    <span class="n">mx</span><span class="o">.</span><span 
class="n">th</span><span class="o">.</span><span class="n">abs</span><span 
class="p">(</span><span class="n">x</span><span class="p">,</span> <span 
class="n">x</span><span class="p">)</span> <span class="c1"># in-place</span>
-    <span class="k">print</span> <span class="n">x</span><span 
class="o">.</span><span class="n">asnumpy</span><span class="p">()</span>
-</pre></div>
-</div>
-<p>For help, use the <code class="docutils literal"><span 
class="pre">help(mx.th)</span></code> command.</p>
-<p>We've added support for most common functions listed on <a class="reference external" href="https://github.com/torch/torch7/blob/master/doc/maths.md";>Torch's documentation page</a>.
-If you find that the function you need is not supported, you can easily 
register it in <code class="docutils literal"><span 
class="pre">mxnet_root/plugin/torch/torch_function.cc</span></code> by using 
the existing registrations as examples.</p>
-</div>
-<div class="section" id="torch-modules-layers">
-<span id="torch-modules-layers"></span><h2>Torch Modules (Layers)<a 
class="headerlink" href="#torch-modules-layers" title="Permalink to this 
?">
headline">¶</a></h2>
-<p>MXNet supports Torch's neural network modules through the <code class="docutils literal"><span class="pre">mxnet.symbol.TorchModule</span></code> symbol.
-For example, the following code defines a three-layer DNN for classifying 
MNIST digits (<a class="reference external" 
href="https://github.com/dmlc/mxnet/blob/master/example/torch/torch_module.py";>full
 code</a>):</p>
-<div class="highlight-python"><div class="highlight"><pre><span></span>    
<span class="n">data</span> <span class="o">=</span> <span 
class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span 
class="o">.</span><span class="n">Variable</span><span class="p">(</span><span 
class="s1">'data'</span><span class="p">)</span>
-    <span class="n">fc1</span> <span class="o">=</span> <span 
class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span 
class="o">.</span><span class="n">TorchModule</span><span 
class="p">(</span><span class="n">data_0</span><span class="o">=</span><span 
class="n">data</span><span class="p">,</span> <span 
class="n">lua_string</span><span class="o">=</span><span 
class="s1">'nn.Linear(784, 128)'</span><span class="p">,</span> <span 
class="n">num_data</span><span class="o">=</span><span class="mi">1</span><span 
class="p">,</span> <span class="n">num_params</span><span 
class="o">=</span><span class="mi">2</span><span class="p">,</span> <span 
class="n">num_outputs</span><span class="o">=</span><span 
class="mi">1</span><span class="p">,</span> <span class="n">name</span><span 
class="o">=</span><span class="s1">'fc1'</span><span class="p">)</span>
-    <span class="n">act1</span> <span class="o">=</span> <span 
class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span 
class="o">.</span><span class="n">TorchModule</span><span 
class="p">(</span><span class="n">data_0</span><span class="o">=</span><span 
class="n">fc1</span><span class="p">,</span> <span 
class="n">lua_string</span><span class="o">=</span><span 
class="s1">'nn.ReLU(false)'</span><span class="p">,</span> <span 
class="n">num_data</span><span class="o">=</span><span class="mi">1</span><span 
class="p">,</span> <span class="n">num_params</span><span 
class="o">=</span><span class="mi">0</span><span class="p">,</span> <span 
class="n">num_outputs</span><span class="o">=</span><span 
class="mi">1</span><span class="p">,</span> <span class="n">name</span><span 
class="o">=</span><span class="s1">'relu1'</span><span class="p">)</span>
-    <span class="n">fc2</span> <span class="o">=</span> <span 
class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span 
class="o">.</span><span class="n">TorchModule</span><span 
class="p">(</span><span class="n">data_0</span><span class="o">=</span><span 
class="n">act1</span><span class="p">,</span> <span 
class="n">lua_string</span><span class="o">=</span><span 
class="s1">'nn.Linear(128, 64)'</span><span class="p">,</span> <span 
class="n">num_data</span><span class="o">=</span><span class="mi">1</span><span 
class="p">,</span> <span class="n">num_params</span><span 
class="o">=</span><span class="mi">2</span><span class="p">,</span> <span 
class="n">num_outputs</span><span class="o">=</span><span 
class="mi">1</span><span class="p">,</span> <span class="n">name</span><span 
class="o">=</span><span class="s1">'fc2'</span><span class="p">)</span>
-    <span class="n">act2</span> <span class="o">=</span> <span 
class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span 
class="o">.</span><span class="n">TorchModule</span><span 
class="p">(</span><span class="n">data_0</span><span class="o">=</span><span 
class="n">fc2</span><span class="p">,</span> <span 
class="n">lua_string</span><span class="o">=</span><span 
class="s1">'nn.ReLU(false)'</span><span class="p">,</span> <span 
class="n">num_data</span><span class="o">=</span><span class="mi">1</span><span 
class="p">,</span> <span class="n">num_params</span><span 
class="o">=</span><span class="mi">0</span><span class="p">,</span> <span 
class="n">num_outputs</span><span class="o">=</span><span 
class="mi">1</span><span class="p">,</span> <span class="n">name</span><span 
class="o">=</span><span class="s1">'relu2'</span><span class="p">)</span>
-    <span class="n">fc3</span> <span class="o">=</span> <span 
class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span 
class="o">.</span><span class="n">TorchModule</span><span 
class="p">(</span><span class="n">data_0</span><span class="o">=</span><span 
class="n">act2</span><span class="p">,</span> <span 
class="n">lua_string</span><span class="o">=</span><span 
class="s1">'nn.Linear(64, 10)'</span><span class="p">,</span> <span 
class="n">num_data</span><span class="o">=</span><span class="mi">1</span><span 
class="p">,</span> <span class="n">num_params</span><span 
class="o">=</span><span class="mi">2</span><span class="p">,</span> <span 
class="n">num_outputs</span><span class="o">=</span><span 
class="mi">1</span><span class="p">,</span> <span class="n">name</span><span 
class="o">=</span><span class="s1">'fc3'</span><span class="p">)</span>
-    <span class="n">mlp</span> <span class="o">=</span> <span 
class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span 
class="o">.</span><span class="n">SoftmaxOutput</span><span 
class="p">(</span><span class="n">data</span><span class="o">=</span><span 
class="n">fc3</span><span class="p">,</span> <span class="n">name</span><span 
class="o">=</span><span class="s1">'softmax'</span><span class="p">)</span>
-</pre></div>
-</div>
-<p>Let's break it down. First <code class="docutils literal"><span class="pre">data</span> <span class="pre">=</span> <span class="pre">mx.symbol.Variable('data')</span></code> defines a Variable as a placeholder for input.
-Then, it's fed through Torch's nn modules with:
-<code class="docutils literal"><span class="pre">fc1</span> <span class="pre">=</span> <span class="pre">mx.symbol.TorchModule(data_0=data,</span> <span class="pre">lua_string='nn.Linear(784,</span> <span class="pre">128)',</span> <span class="pre">num_data=1,</span> <span class="pre">num_params=2,</span> <span class="pre">num_outputs=1,</span> <span class="pre">name='fc1')</span></code>.
-To use a Torch criterion as the loss function, you can replace the last line with:</p>
-<div class="highlight-python"><div class="highlight"><pre><span></span>    
<span class="n">logsoftmax</span> <span class="o">=</span> <span 
class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span 
class="o">.</span><span class="n">TorchModule</span><span 
class="p">(</span><span class="n">data_0</span><span class="o">=</span><span 
class="n">fc3</span><span class="p">,</span> <span 
class="n">lua_string</span><span class="o">=</span><span 
class="s1">'nn.LogSoftMax()'</span><span class="p">,</span> <span 
class="n">num_data</span><span class="o">=</span><span class="mi">1</span><span 
class="p">,</span> <span class="n">num_params</span><span 
class="o">=</span><span class="mi">0</span><span class="p">,</span> <span 
class="n">num_outputs</span><span class="o">=</span><span 
class="mi">1</span><span class="p">,</span> <span class="n">name</span><span 
class="o">=</span><span class="s1">'logsoftmax'</span><span class="p">)</span>
-    <span class="c1"># Torch's label starts from 1</span>
-    <span class="n">label</span> <span class="o">=</span> <span 
class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span 
class="o">.</span><span class="n">Variable</span><span class="p">(</span><span 
class="s1">'softmax_label'</span><span class="p">)</span> <span 
class="o">+</span> <span class="mi">1</span>
-    <span class="n">mlp</span> <span class="o">=</span> <span 
class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span 
class="o">.</span><span class="n">TorchCriterion</span><span 
class="p">(</span><span class="n">data</span><span class="o">=</span><span 
class="n">logsoftmax</span><span class="p">,</span> <span 
class="n">label</span><span class="o">=</span><span class="n">label</span><span 
class="p">,</span> <span class="n">lua_string</span><span 
class="o">=</span><span class="s1">'nn.ClassNLLCriterion()'</span><span 
class="p">,</span> <span class="n">name</span><span class="o">=</span><span 
class="s1">'softmax'</span><span class="p">)</span>
-</pre></div>
-</div>
-<p>The input to the nn module is named data_i for i = 0 ... num_data-1. <code 
class="docutils literal"><span class="pre">lua_string</span></code> is a single 
Lua statement that creates the module object.
-For Torch's built-in module, this is simply <code class="docutils literal"><span class="pre">nn.module_name(arguments)</span></code>.
-If you are using a custom module, place it in a .lua script file and load it 
with <code class="docutils literal"><span class="pre">require</span> <span 
class="pre">'module_file.lua'</span></code> if your script returns a torch.nn 
object, or <code class="docutils literal"><span class="pre">(require</span> 
<span class="pre">'module_file.lua')()</span></code> if your script returns a 
torch.nn class.</p>
-</div>
-</div>
-<div class="container">
-<div class="footer">
-<p> </p>
-</div>
-</div>
-</div>
-<div aria-label="main navigation" class="sphinxsidebar rightsidebar" 
role="navigation">
-<div class="sphinxsidebarwrapper">
-<h3><a href="../index.html">Table Of Contents</a></h3>
-<ul>
-<li><a class="reference internal" href="#">How to Use MXNet As an (Almost) 
Full-function Torch Front End</a><ul>
-<li><a class="reference internal" href="#compile-with-torch">Compile with 
Torch</a></li>
-<li><a class="reference internal" href="#tensor-mathematics">Tensor 
Mathematics</a></li>
-<li><a class="reference internal" href="#torch-modules-layers">Torch Modules 
(Layers)</a></li>
-</ul>
-</li>
-</ul>
-</div>
-</div>
-</div> <!-- pagename != index -->
-<script crossorigin="anonymous" 
integrity="sha384-0mSbJDEHialfmuBBQP6A4Qrprq5OVfW37PRR3j5ELqxss1yVqOtnepnHVP9aJ7xS"
 
src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/js/bootstrap.min.js";></script>
-<script src="../_static/js/sidebar.js" type="text/javascript"></script>
-<script src="../_static/js/search.js" type="text/javascript"></script>
-<script src="../_static/js/navbar.js" type="text/javascript"></script>
-<script src="../_static/js/clipboard.min.js" type="text/javascript"></script>
-<script src="../_static/js/copycode.js" type="text/javascript"></script>
-<script type="text/javascript">
-        $('body').ready(function () {
-            $('body').css('visibility', 'visible');
-        });
-    </script>
-</div></body>
-</html>
diff --git a/versions/0.11.0/api/python/model.html 
b/versions/0.11.0/api/python/model.html
index fc1d1247..ba1dc53d 100644
--- a/versions/0.11.0/api/python/model.html
+++ b/versions/0.11.0/api/python/model.html
@@ -1966,7 +1966,7 @@ <h1 id="logo-wrap">
 <dd><p>The Nesterov Adam optimizer.</p>
 <p>Much like Adam is essentially RMSprop with momentum,
 Nadam is Adam RMSprop with Nesterov momentum available
-at <a class="reference external" 
href="http://cs229.stanford.edu/proj2015/054_report.pdf";>http://cs229.stanford.edu/proj2015/054_report.pdf</a>.</p>
+at <a class="reference external" 
href="https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ";>https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ</a>.</p>
 <p>This optimizer accepts the following parameters in addition to those 
accepted
 by <a class="reference internal" 
href="optimization.html#mxnet.optimizer.Optimizer" 
title="mxnet.optimizer.Optimizer"><code class="xref py py-class docutils 
literal"><span class="pre">Optimizer</span></code></a>.</p>
 <table class="docutils field-list" frame="void" rules="none">
diff --git a/versions/0.11.0/api/python/optimization.html 
b/versions/0.11.0/api/python/optimization.html
index 52d6e1de..1abc12e0 100644
--- a/versions/0.11.0/api/python/optimization.html
+++ b/versions/0.11.0/api/python/optimization.html
@@ -811,7 +811,7 @@ <h1 id="logo-wrap">
 <dd><p>The Nesterov Adam optimizer.</p>
 <p>Much like Adam is essentially RMSprop with momentum,
 Nadam is Adam RMSprop with Nesterov momentum available
-at <a class="reference external" 
href="http://cs229.stanford.edu/proj2015/054_report.pdf";>http://cs229.stanford.edu/proj2015/054_report.pdf</a>.</p>
+at <a class="reference external" 
href="https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ";>https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ</a>.</p>
 <p>This optimizer accepts the following parameters in addition to those 
accepted
 by <a class="reference internal" href="#mxnet.optimizer.Optimizer" 
title="mxnet.optimizer.Optimizer"><code class="xref py py-class docutils 
literal"><span class="pre">Optimizer</span></code></a>.</p>
 <table class="docutils field-list" frame="void" rules="none">
diff --git a/versions/0.11.0/how_to/develop_and_hack.html 
b/versions/0.11.0/how_to/develop_and_hack.html
index 7251e80f..2872dcb5 100644
--- a/versions/0.11.0/how_to/develop_and_hack.html
+++ b/versions/0.11.0/how_to/develop_and_hack.html
@@ -169,7 +169,6 @@ <h1 id="logo-wrap">
 <div class="toctree-wrapper compound">
 <ul>
 <li class="toctree-l1"><a class="reference internal" href="new_op.html">Create 
new operators</a></li>
-<li class="toctree-l1"><a class="reference internal" href="torch.html">Use 
Torch from MXNet</a></li>
 <li class="toctree-l1"><a class="reference internal" href="env_var.html">Set 
environment variables of MXNet</a></li>
 </ul>
 </div>
diff --git a/versions/0.11.0/how_to/index.html 
b/versions/0.11.0/how_to/index.html
index 7b39bce0..237cafc8 100644
--- a/versions/0.11.0/how_to/index.html
+++ b/versions/0.11.0/how_to/index.html
@@ -188,7 +188,6 @@ <h1 id="logo-wrap">
 <li class="toctree-l3"><a class="reference external" 
href="https://mxnet.incubator.apache.org/community/contribute.html";>How do I 
contribute a patch to MXNet?</a></li>
 <li class="toctree-l3"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/new_op.html";>How do I create 
new operators in MXNet?</a></li>
 <li class="toctree-l3"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/env_var.html";>How do I set 
MXNet's environmental variables?</a></li>
-<li class="toctree-l3"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/torch.html";>How do I use MXNet 
as a front end for Torch?</a></li>
 </ul>
 </li>
 <li class="toctree-l2"><a class="reference internal" 
href="#questions-about-using-mxnet">Questions about Using MXNet</a></li>
@@ -251,7 +250,6 @@ <h1 id="logo-wrap">
 <li class="toctree-l1"><a class="reference external" 
href="https://mxnet.incubator.apache.org/community/contribute.html";>How do I 
contribute a patch to MXNet?</a></li>
 <li class="toctree-l1"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/new_op.html";>How do I create 
new operators in MXNet?</a></li>
 <li class="toctree-l1"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/env_var.html";>How do I set 
MXNet's environmental variables?</a></li>
-<li class="toctree-l1"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/torch.html";>How do I use MXNet 
as a front end for Torch?</a></li>
 </ul>
 </div>
 </div>
diff --git a/versions/0.12.0/api/python/model.html 
b/versions/0.12.0/api/python/model.html
index 1fbcaa53..c998ee65 100644
--- a/versions/0.12.0/api/python/model.html
+++ b/versions/0.12.0/api/python/model.html
@@ -2170,7 +2170,7 @@ <h1 id="logo-wrap">
 <dd><p>The Nesterov Adam optimizer.</p>
 <p>Much like Adam is essentially RMSprop with momentum,
 Nadam is Adam RMSprop with Nesterov momentum available
-at <a class="reference external" 
href="http://cs229.stanford.edu/proj2015/054_report.pdf";>http://cs229.stanford.edu/proj2015/054_report.pdf</a>.</p>
+at <a class="reference external" 
href="https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ";>https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ</a>.</p>
 <p>This optimizer accepts the following parameters in addition to those 
accepted
 by <a class="reference internal" 
href="optimization/optimization.html#mxnet.optimizer.Optimizer" 
title="mxnet.optimizer.Optimizer"><code class="xref py py-class docutils 
literal"><span class="pre">Optimizer</span></code></a>.</p>
 <table class="docutils field-list" frame="void" rules="none">
diff --git a/versions/0.12.0/api/python/optimization.html 
b/versions/0.12.0/api/python/optimization.html
index 800d05f0..b057f0c0 100644
--- a/versions/0.12.0/api/python/optimization.html
+++ b/versions/0.12.0/api/python/optimization.html
@@ -811,7 +811,7 @@ <h1 id="logo-wrap">
 <dd><p>The Nesterov Adam optimizer.</p>
 <p>Much like Adam is essentially RMSprop with momentum,
 Nadam is Adam RMSprop with Nesterov momentum available
-at <a class="reference external" 
href="http://cs229.stanford.edu/proj2015/054_report.pdf";>http://cs229.stanford.edu/proj2015/054_report.pdf</a>.</p>
+at <a class="reference external" 
href="https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ";>https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ</a>.</p>
 <p>This optimizer accepts the following parameters in addition to those 
accepted
 by <a class="reference internal" href="#mxnet.optimizer.Optimizer" 
title="mxnet.optimizer.Optimizer"><code class="xref py py-class docutils 
literal"><span class="pre">Optimizer</span></code></a>.</p>
 <table class="docutils field-list" frame="void" rules="none">
diff --git a/versions/0.12.0/api/python/optimization/optimization.html 
b/versions/0.12.0/api/python/optimization/optimization.html
index 75e4df51..3d761c19 100644
--- a/versions/0.12.0/api/python/optimization/optimization.html
+++ b/versions/0.12.0/api/python/optimization/optimization.html
@@ -961,7 +961,7 @@ <h1 id="logo-wrap">
 <dd><p>The Nesterov Adam optimizer.</p>
 <p>Much like Adam is essentially RMSprop with momentum,
 Nadam is Adam RMSprop with Nesterov momentum available
-at <a class="reference external" 
href="http://cs229.stanford.edu/proj2015/054_report.pdf";>http://cs229.stanford.edu/proj2015/054_report.pdf</a>.</p>
+at <a class="reference external" 
href="https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ";>https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ</a>.</p>
 <p>This optimizer accepts the following parameters in addition to those 
accepted
 by <a class="reference internal" href="#mxnet.optimizer.Optimizer" 
title="mxnet.optimizer.Optimizer"><code class="xref py py-class docutils 
literal"><span class="pre">Optimizer</span></code></a>.</p>
 <table class="docutils field-list" frame="void" rules="none">
diff --git a/versions/0.12.0/faq/develop_and_hack.html 
b/versions/0.12.0/faq/develop_and_hack.html
index e1aa7825..8393f576 100644
--- a/versions/0.12.0/faq/develop_and_hack.html
+++ b/versions/0.12.0/faq/develop_and_hack.html
@@ -207,7 +207,6 @@ <h1 id="logo-wrap">
 <div class="toctree-wrapper compound">
 <ul>
 <li class="toctree-l1"><a class="reference internal" href="new_op.html">Create 
new operators</a></li>
-<li class="toctree-l1"><a class="reference internal" href="torch.html">Use 
Torch from MXNet</a></li>
 <li class="toctree-l1"><a class="reference internal" href="env_var.html">Set 
environment variables of MXNet</a></li>
 </ul>
 </div>
diff --git a/versions/0.12.0/faq/index.html b/versions/0.12.0/faq/index.html
index 40e9495e..8c997b11 100644
--- a/versions/0.12.0/faq/index.html
+++ b/versions/0.12.0/faq/index.html
@@ -225,7 +225,6 @@ <h1 id="logo-wrap">
 <li class="toctree-l3"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/add_op_in_backend.html";>How do 
I implement operators in MXNet backend?</a></li>
 <li class="toctree-l3"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/new_op.html";>How do I create 
new operators in MXNet?</a></li>
 <li class="toctree-l3"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/env_var.html";>How do I set 
MXNet's environmental variables?</a></li>
-<li class="toctree-l3"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/torch.html";>How do I use MXNet 
as a front end for Torch?</a></li>
 </ul>
 </li>
 <li class="toctree-l2"><a class="reference internal" 
href="#questions-about-using-mxnet">Questions about Using MXNet</a></li>
@@ -291,7 +290,6 @@ <h1 id="logo-wrap">
 <li class="toctree-l1"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/add_op_in_backend.html";>How do 
I implement operators in MXNet backend?</a></li>
 <li class="toctree-l1"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/new_op.html";>How do I create 
new operators in MXNet?</a></li>
 <li class="toctree-l1"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/env_var.html";>How do I set 
MXNet's environmental variables?</a></li>
-<li class="toctree-l1"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/torch.html";>How do I use MXNet 
as a front end for Torch?</a></li>
 </ul>
 </div>
 </div>
diff --git a/versions/0.12.0/how_to/develop_and_hack.html 
b/versions/0.12.0/how_to/develop_and_hack.html
index 267c6472..cd3b98b6 100644
--- a/versions/0.12.0/how_to/develop_and_hack.html
+++ b/versions/0.12.0/how_to/develop_and_hack.html
@@ -169,7 +169,6 @@ <h1 id="logo-wrap">
 <div class="toctree-wrapper compound">
 <ul>
 <li class="toctree-l1"><a class="reference internal" href="new_op.html">Create 
new operators</a></li>
-<li class="toctree-l1"><a class="reference internal" href="torch.html">Use 
Torch from MXNet</a></li>
 <li class="toctree-l1"><a class="reference internal" href="env_var.html">Set 
environment variables of MXNet</a></li>
 </ul>
 </div>
diff --git a/versions/0.12.0/how_to/index.html 
b/versions/0.12.0/how_to/index.html
index de6c9258..7a75c635 100644
--- a/versions/0.12.0/how_to/index.html
+++ b/versions/0.12.0/how_to/index.html
@@ -188,7 +188,6 @@ <h1 id="logo-wrap">
 <li class="toctree-l3"><a class="reference external" 
href="https://mxnet.incubator.apache.org/community/contribute.html";>How do I 
contribute a patch to MXNet?</a></li>
 <li class="toctree-l3"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/new_op.html";>How do I create 
new operators in MXNet?</a></li>
 <li class="toctree-l3"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/env_var.html";>How do I set 
MXNet's environmental variables?</a></li>
-<li class="toctree-l3"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/torch.html";>How do I use MXNet 
as a front end for Torch?</a></li>
 </ul>
 </li>
 <li class="toctree-l2"><a class="reference internal" 
href="#questions-about-using-mxnet">Questions about Using MXNet</a></li>
@@ -251,7 +250,6 @@ <h1 id="logo-wrap">
 <li class="toctree-l1"><a class="reference external" 
href="https://mxnet.incubator.apache.org/community/contribute.html";>How do I 
contribute a patch to MXNet?</a></li>
 <li class="toctree-l1"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/new_op.html";>How do I create 
new operators in MXNet?</a></li>
 <li class="toctree-l1"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/env_var.html";>How do I set 
MXNet's environmental variables?</a></li>
-<li class="toctree-l1"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/torch.html";>How do I use MXNet 
as a front end for Torch?</a></li>
 </ul>
 </div>
 </div>
diff --git a/versions/0.12.1/api/python/model.html 
b/versions/0.12.1/api/python/model.html
index d9efe56c..73e8bced 100644
--- a/versions/0.12.1/api/python/model.html
+++ b/versions/0.12.1/api/python/model.html
@@ -2170,7 +2170,7 @@ <h1 id="logo-wrap">
 <dd><p>The Nesterov Adam optimizer.</p>
 <p>Much like Adam is essentially RMSprop with momentum,
 Nadam is Adam RMSprop with Nesterov momentum available
-at <a class="reference external" 
href="http://cs229.stanford.edu/proj2015/054_report.pdf";>http://cs229.stanford.edu/proj2015/054_report.pdf</a>.</p>
+at <a class="reference external" 
href="https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ";>https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ</a>.</p>
 <p>This optimizer accepts the following parameters in addition to those 
accepted
 by <a class="reference internal" 
href="optimization/optimization.html#mxnet.optimizer.Optimizer" 
title="mxnet.optimizer.Optimizer"><code class="xref py py-class docutils 
literal"><span class="pre">Optimizer</span></code></a>.</p>
 <table class="docutils field-list" frame="void" rules="none">
diff --git a/versions/0.12.1/api/python/optimization.html 
b/versions/0.12.1/api/python/optimization.html
index 3e6c6ae1..ee20509b 100644
--- a/versions/0.12.1/api/python/optimization.html
+++ b/versions/0.12.1/api/python/optimization.html
@@ -811,7 +811,7 @@ <h1 id="logo-wrap">
 <dd><p>The Nesterov Adam optimizer.</p>
 <p>Much like Adam is essentially RMSprop with momentum,
 Nadam is Adam RMSprop with Nesterov momentum available
-at <a class="reference external" 
href="http://cs229.stanford.edu/proj2015/054_report.pdf";>http://cs229.stanford.edu/proj2015/054_report.pdf</a>.</p>
+at <a class="reference external" 
href="https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ";>https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ</a>.</p>
 <p>This optimizer accepts the following parameters in addition to those 
accepted
 by <a class="reference internal" href="#mxnet.optimizer.Optimizer" 
title="mxnet.optimizer.Optimizer"><code class="xref py py-class docutils 
literal"><span class="pre">Optimizer</span></code></a>.</p>
 <table class="docutils field-list" frame="void" rules="none">
diff --git a/versions/0.12.1/api/python/optimization/optimization.html 
b/versions/0.12.1/api/python/optimization/optimization.html
index b6195786..e6e50a6a 100644
--- a/versions/0.12.1/api/python/optimization/optimization.html
+++ b/versions/0.12.1/api/python/optimization/optimization.html
@@ -961,7 +961,7 @@ <h1 id="logo-wrap">
 <dd><p>The Nesterov Adam optimizer.</p>
 <p>Much like Adam is essentially RMSprop with momentum,
 Nadam is Adam RMSprop with Nesterov momentum available
-at <a class="reference external" 
href="http://cs229.stanford.edu/proj2015/054_report.pdf";>http://cs229.stanford.edu/proj2015/054_report.pdf</a>.</p>
+at <a class="reference external" 
href="https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ";>https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ</a>.</p>
 <p>This optimizer accepts the following parameters in addition to those 
accepted
 by <a class="reference internal" href="#mxnet.optimizer.Optimizer" 
title="mxnet.optimizer.Optimizer"><code class="xref py py-class docutils 
literal"><span class="pre">Optimizer</span></code></a>.</p>
 <table class="docutils field-list" frame="void" rules="none">
diff --git a/versions/0.12.1/faq/develop_and_hack.html 
b/versions/0.12.1/faq/develop_and_hack.html
index 3380cbf1..c156f09a 100644
--- a/versions/0.12.1/faq/develop_and_hack.html
+++ b/versions/0.12.1/faq/develop_and_hack.html
@@ -207,7 +207,6 @@ <h1 id="logo-wrap">
 <div class="toctree-wrapper compound">
 <ul>
 <li class="toctree-l1"><a class="reference internal" href="new_op.html">Create 
new operators</a></li>
-<li class="toctree-l1"><a class="reference internal" href="torch.html">Use 
Torch from MXNet</a></li>
 <li class="toctree-l1"><a class="reference internal" href="env_var.html">Set 
environment variables of MXNet</a></li>
 </ul>
 </div>
diff --git a/versions/0.12.1/faq/index.html b/versions/0.12.1/faq/index.html
index a500ff30..b44be604 100644
--- a/versions/0.12.1/faq/index.html
+++ b/versions/0.12.1/faq/index.html
@@ -225,7 +225,6 @@ <h1 id="logo-wrap">
 <li class="toctree-l3"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/add_op_in_backend.html";>How do 
I implement operators in MXNet backend?</a></li>
 <li class="toctree-l3"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/new_op.html";>How do I create 
new operators in MXNet?</a></li>
 <li class="toctree-l3"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/env_var.html";>How do I set 
MXNet's environmental variables?</a></li>
-<li class="toctree-l3"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/torch.html";>How do I use MXNet 
as a front end for Torch?</a></li>
 </ul>
 </li>
 <li class="toctree-l2"><a class="reference internal" 
href="#questions-about-using-mxnet">Questions about Using MXNet</a></li>
@@ -291,7 +290,6 @@ <h1 id="logo-wrap">
 <li class="toctree-l1"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/add_op_in_backend.html";>How do 
I implement operators in MXNet backend?</a></li>
 <li class="toctree-l1"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/new_op.html";>How do I create 
new operators in MXNet?</a></li>
 <li class="toctree-l1"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/env_var.html";>How do I set 
MXNet's environmental variables?</a></li>
-<li class="toctree-l1"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/torch.html";>How do I use MXNet 
as a front end for Torch?</a></li>
 </ul>
 </div>
 </div>
diff --git a/versions/0.12.1/how_to/develop_and_hack.html 
b/versions/0.12.1/how_to/develop_and_hack.html
index abb1e38c..ffe4f1b5 100644
--- a/versions/0.12.1/how_to/develop_and_hack.html
+++ b/versions/0.12.1/how_to/develop_and_hack.html
@@ -169,7 +169,6 @@ <h1 id="logo-wrap">
 <div class="toctree-wrapper compound">
 <ul>
 <li class="toctree-l1"><a class="reference internal" href="new_op.html">Create 
new operators</a></li>
-<li class="toctree-l1"><a class="reference internal" href="torch.html">Use 
Torch from MXNet</a></li>
 <li class="toctree-l1"><a class="reference internal" href="env_var.html">Set 
environment variables of MXNet</a></li>
 </ul>
 </div>
diff --git a/versions/0.12.1/how_to/index.html 
b/versions/0.12.1/how_to/index.html
index c7b49a4d..ac836950 100644
--- a/versions/0.12.1/how_to/index.html
+++ b/versions/0.12.1/how_to/index.html
@@ -188,7 +188,6 @@ <h1 id="logo-wrap">
 <li class="toctree-l3"><a class="reference external" 
href="https://mxnet.incubator.apache.org/community/contribute.html";>How do I 
contribute a patch to MXNet?</a></li>
 <li class="toctree-l3"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/new_op.html";>How do I create 
new operators in MXNet?</a></li>
 <li class="toctree-l3"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/env_var.html";>How do I set 
MXNet's environmental variables?</a></li>
-<li class="toctree-l3"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/torch.html";>How do I use MXNet 
as a front end for Torch?</a></li>
 </ul>
 </li>
 <li class="toctree-l2"><a class="reference internal" 
href="#questions-about-using-mxnet">Questions about Using MXNet</a></li>
@@ -251,7 +250,6 @@ <h1 id="logo-wrap">
 <li class="toctree-l1"><a class="reference external" 
href="https://mxnet.incubator.apache.org/community/contribute.html";>How do I 
contribute a patch to MXNet?</a></li>
 <li class="toctree-l1"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/new_op.html";>How do I create 
new operators in MXNet?</a></li>
 <li class="toctree-l1"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/env_var.html";>How do I set 
MXNet's environmental variables?</a></li>
-<li class="toctree-l1"><a class="reference external" 
href="https://mxnet.incubator.apache.org/how_to/torch.html";>How do I use MXNet 
as a front end for Torch?</a></li>
 </ul>
 </div>
 </div>
diff --git a/versions/master/_modules/mxnet/optimizer.html 
b/versions/master/_modules/mxnet/optimizer.html
index daee3d9f..aca10f1f 100644
--- a/versions/master/_modules/mxnet/optimizer.html
+++ b/versions/master/_modules/mxnet/optimizer.html
@@ -1381,7 +1381,7 @@ <h1>Source code for mxnet.optimizer</h1><div 
class="highlight"><pre>
 
 <span class="sd">    Much like Adam is essentially RMSprop with 
momentum,</span>
 <span class="sd">    Nadam is Adam RMSprop with Nesterov momentum 
available</span>
-<span class="sd">    at 
http://cs229.stanford.edu/proj2015/054_report.pdf.</span>
+<span class="sd">    at 
https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ.</span>
 
 <span class="sd">    This optimizer accepts the following parameters in 
addition to those accepted</span>
 <span class="sd">    by :class:`.Optimizer`.</span>
diff --git a/versions/master/api/python/model.html 
b/versions/master/api/python/model.html
index 496a3b98..40404c7c 100644
--- a/versions/master/api/python/model.html
+++ b/versions/master/api/python/model.html
@@ -2238,7 +2238,7 @@ <h1 id="logo-wrap">
 <dd><p>The Nesterov Adam optimizer.</p>
 <p>Much like Adam is essentially RMSprop with momentum,
 Nadam is Adam RMSprop with Nesterov momentum available
-at <a class="reference external" 
href="http://cs229.stanford.edu/proj2015/054_report.pdf";>http://cs229.stanford.edu/proj2015/054_report.pdf</a>.</p>
+at <a class="reference external" 
href="https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ";>https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ</a>.</p>
 <p>This optimizer accepts the following parameters in addition to those 
accepted
 by <a class="reference internal" 
href="optimization/optimization.html#mxnet.optimizer.Optimizer" 
title="mxnet.optimizer.Optimizer"><code class="xref py py-class docutils 
literal"><span class="pre">Optimizer</span></code></a>.</p>
 <table class="docutils field-list" frame="void" rules="none">
diff --git a/versions/master/api/python/optimization.html 
b/versions/master/api/python/optimization.html
index d0a5c530..5fd057d6 100644
--- a/versions/master/api/python/optimization.html
+++ b/versions/master/api/python/optimization.html
@@ -863,7 +863,7 @@ <h1 id="logo-wrap">
 <dd><p>The Nesterov Adam optimizer.</p>
 <p>Much like Adam is essentially RMSprop with momentum,
 Nadam is Adam RMSprop with Nesterov momentum available
-at <a class="reference external" 
href="http://cs229.stanford.edu/proj2015/054_report.pdf";>http://cs229.stanford.edu/proj2015/054_report.pdf</a>.</p>
+at <a class="reference external" 
href="https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ";>https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ</a>.</p>
 <p>This optimizer accepts the following parameters in addition to those 
accepted
 by <a class="reference internal" href="#mxnet.optimizer.Optimizer" 
title="mxnet.optimizer.Optimizer"><code class="xref py py-class docutils 
literal"><span class="pre">Optimizer</span></code></a>.</p>
 <table class="docutils field-list" frame="void" rules="none">
diff --git a/versions/master/api/python/optimization/optimization.html 
b/versions/master/api/python/optimization/optimization.html
index 4683430b..4abd9b57 100644
--- a/versions/master/api/python/optimization/optimization.html
+++ b/versions/master/api/python/optimization/optimization.html
@@ -1029,7 +1029,7 @@ <h1 id="logo-wrap">
 <dd><p>The Nesterov Adam optimizer.</p>
 <p>Much like Adam is essentially RMSprop with momentum,
 Nadam is Adam RMSprop with Nesterov momentum available
-at <a class="reference external" 
href="http://cs229.stanford.edu/proj2015/054_report.pdf";>http://cs229.stanford.edu/proj2015/054_report.pdf</a>.</p>
+at <a class="reference external" 
href="https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ";>https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ</a>.</p>
 <p>This optimizer accepts the following parameters in addition to those 
accepted
 by <a class="reference internal" href="#mxnet.optimizer.Optimizer" 
title="mxnet.optimizer.Optimizer"><code class="xref py py-class docutils 
literal"><span class="pre">Optimizer</span></code></a>.</p>
 <table class="docutils field-list" frame="void" rules="none">
diff --git a/versions/master/how_to/develop_and_hack.html 
b/versions/master/how_to/develop_and_hack.html
index 3b09fd27..15973515 100644
--- a/versions/master/how_to/develop_and_hack.html
+++ b/versions/master/how_to/develop_and_hack.html
@@ -198,7 +198,6 @@ <h1 id="logo-wrap">
 <div class="toctree-wrapper compound">
 <ul>
 <li class="toctree-l1"><a class="reference internal" href="new_op.html">Create 
new operators</a></li>
-<li class="toctree-l1"><a class="reference internal" href="torch.html">Use 
Torch from MXNet</a></li>
 <li class="toctree-l1"><a class="reference internal" href="env_var.html">Set 
environment variables of MXNet</a></li>
 </ul>
 </div>
diff --git a/versions/master/how_to/index.html 
b/versions/master/how_to/index.html
index 9720ea4a..eba17956 100644
--- a/versions/master/how_to/index.html
+++ b/versions/master/how_to/index.html
@@ -215,7 +215,6 @@ <h1 id="logo-wrap">
 <li class="toctree-l3"><a class="reference external" 
href="https://mxnet.incubator.apache.org/versions/master/community/contribute.html";>How
 do I contribute a patch to MXNet?</a></li>
 <li class="toctree-l3"><a class="reference external" 
href="https://mxnet.incubator.apache.org/versions/master/how_to/new_op.html";>How
 do I create new operators in MXNet?</a></li>
 <li class="toctree-l3"><a class="reference external" 
href="https://mxnet.incubator.apache.org/versions/master/how_to/env_var.html";>How
do I set MXNet's environmental variables?</a></li>
-<li class="toctree-l3"><a class="reference external" 
href="https://mxnet.incubator.apache.org/versions/master/how_to/torch.html";>How 
do I use MXNet as a front end for Torch?</a></li>
 </ul>
 </li>
 <li class="toctree-l2"><a class="reference internal" 
href="#questions-about-using-mxnet">Questions about Using MXNet</a></li>
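The docstring corrected throughout this diff describes Nadam as Adam with Nesterov momentum. For reference, the update it alludes to can be sketched in plain NumPy. This is an illustrative sketch of Dozat's Nadam formulation with a constant beta1 (no momentum schedule), not MXNet's actual implementation; the function name and defaults are chosen for the example.

```python
import numpy as np

def nadam_step(theta, grad, m, v, t, lr=0.002, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Nadam update: Adam's moment estimates plus a Nesterov-style lookahead.

    theta: parameter value, grad: gradient at theta,
    m/v: running first/second moment estimates, t: step count (starts at 1).
    """
    m = beta1 * m + (1 - beta1) * grad           # first moment (momentum)
    v = beta2 * v + (1 - beta2) * grad ** 2      # second moment (RMSprop-style)
    m_hat = m / (1 - beta1 ** (t + 1))           # bias correction, one step ahead
    g_hat = grad / (1 - beta1 ** t)              # bias-corrected current gradient
    v_hat = v / (1 - beta2 ** t)
    # Nesterov lookahead: blend corrected momentum with the current gradient
    m_bar = beta1 * m_hat + (1 - beta1) * g_hat
    theta = theta - lr * m_bar / (np.sqrt(v_hat) + eps)
    return theta, m, v
```

The blend on the `m_bar` line is what distinguishes Nadam from plain Adam, which would use `m_hat` alone; everything else matches the "RMSprop with momentum" description in the corrected docstring.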


 

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
