http://git-wip-us.apache.org/repos/asf/madlib-site/blob/9a2b301d/docs/rc/group__grp__deprecated.html ---------------------------------------------------------------------- diff --git a/docs/rc/group__grp__deprecated.html b/docs/rc/group__grp__deprecated.html new file mode 100644 index 0000000..aaa9813 --- /dev/null +++ b/docs/rc/group__grp__deprecated.html @@ -0,0 +1,149 @@ +<!-- HTML header for doxygen 1.8.4--> +<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> +<html xmlns="http://www.w3.org/1999/xhtml"> +<head> +<meta http-equiv="Content-Type" content="text/xhtml;charset=UTF-8"/> +<meta http-equiv="X-UA-Compatible" content="IE=9"/> +<meta name="generator" content="Doxygen 1.8.14"/> +<meta name="keywords" content="madlib,postgres,greenplum,machine learning,data mining,deep learning,ensemble methods,data science,market basket analysis,affinity analysis,pca,lda,regression,elastic net,huber white,proportional hazards,k-means,latent dirichlet allocation,bayes,support vector machines,svm"/> +<title>MADlib: Deprecated Modules</title> +<link href="tabs.css" rel="stylesheet" type="text/css"/> +<script type="text/javascript" src="jquery.js"></script> +<script type="text/javascript" src="dynsections.js"></script> +<link href="navtree.css" rel="stylesheet" type="text/css"/> +<script type="text/javascript" src="resize.js"></script> +<script type="text/javascript" src="navtreedata.js"></script> +<script type="text/javascript" src="navtree.js"></script> +<script type="text/javascript"> +/* @license magnet:?xt=urn:btih:cf05388f2679ee054f2beb29a391d25f4e673ac3&dn=gpl-2.0.txt GPL-v2 */ + $(document).ready(initResizable); +/* @license-end */</script> +<link href="search/search.css" rel="stylesheet" type="text/css"/> +<script type="text/javascript" src="search/searchdata.js"></script> +<script type="text/javascript" src="search/search.js"></script> +<script type="text/javascript"> +/* @license magnet:?xt=urn:btih:cf05388f2679ee054f2beb29a391d25f4e673ac3&dn=gpl-2.0.txt GPL-v2 */ + $(document).ready(function() { init_search(); }); +/* @license-end */ +</script> +<script type="text/x-mathjax-config"> + MathJax.Hub.Config({ + extensions: ["tex2jax.js", "TeX/AMSmath.js", "TeX/AMSsymbols.js"], + jax: ["input/TeX","output/HTML-CSS"], +}); +</script><script type="text/javascript" async src="https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.2/MathJax.js"></script> +<!-- hack in the navigation tree --> +<script type="text/javascript" src="eigen_navtree_hacks.js"></script> +<link href="doxygen.css" rel="stylesheet" type="text/css" /> +<link href="madlib_extra.css" rel="stylesheet" type="text/css"/> +<!-- google analytics --> +<script> + (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){ + (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o), + m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m) + })(window,document,'script','//www.google-analytics.com/analytics.js','ga'); + ga('create', 'UA-45382226-1', 'madlib.apache.org'); + ga('send', 'pageview'); +</script> +</head> +<body> +<div id="top"><!-- do not remove this div, it is closed by doxygen! 
--> +<div id="titlearea"> +<table cellspacing="0" cellpadding="0"> + <tbody> + <tr style="height: 56px;"> + <td id="projectlogo"><a href="http://madlib.apache.org"><img alt="Logo" src="madlib.png" height="50" style="padding-left:0.5em;" border="0"/ ></a></td> + <td style="padding-left: 0.5em;"> + <div id="projectname"> + <span id="projectnumber">1.15</span> + </div> + <div id="projectbrief">User Documentation for Apache MADlib</div> + </td> + <td> <div id="MSearchBox" class="MSearchBoxInactive"> + <span class="left"> + <img id="MSearchSelect" src="search/mag_sel.png" + onmouseover="return searchBox.OnSearchSelectShow()" + onmouseout="return searchBox.OnSearchSelectHide()" + alt=""/> + <input type="text" id="MSearchField" value="Search" accesskey="S" + onfocus="searchBox.OnSearchFieldFocus(true)" + onblur="searchBox.OnSearchFieldFocus(false)" + onkeyup="searchBox.OnSearchFieldChange(event)"/> + </span><span class="right"> + <a id="MSearchClose" href="javascript:searchBox.CloseResultsWindow()"><img id="MSearchCloseImg" border="0" src="search/close.png" alt=""/></a> + </span> + </div> +</td> + </tr> + </tbody> +</table> +</div> +<!-- end header part --> +<!-- Generated by Doxygen 1.8.14 --> +<script type="text/javascript"> +/* @license magnet:?xt=urn:btih:cf05388f2679ee054f2beb29a391d25f4e673ac3&dn=gpl-2.0.txt GPL-v2 */ +var searchBox = new SearchBox("searchBox", "search",false,'Search'); +/* @license-end */ +</script> +</div><!-- top --> +<div id="side-nav" class="ui-resizable side-nav-resizable"> + <div id="nav-tree"> + <div id="nav-tree-contents"> + <div id="nav-sync" class="sync"></div> + </div> + </div> + <div id="splitbar" style="-moz-user-select:none;" + class="ui-resizable-handle"> + </div> +</div> +<script type="text/javascript"> +/* @license magnet:?xt=urn:btih:cf05388f2679ee054f2beb29a391d25f4e673ac3&dn=gpl-2.0.txt GPL-v2 */ +$(document).ready(function(){initNavTree('group__grp__deprecated.html','');}); +/* @license-end */ +</script> +<div id="doc-content"> +<!-- window showing the filter options --> +<div id="MSearchSelectWindow" + onmouseover="return searchBox.OnSearchSelectShow()" + onmouseout="return searchBox.OnSearchSelectHide()" + onkeydown="return searchBox.OnSearchSelectKey(event)"> +</div> + +<!-- iframe showing the search results (closed by default) --> +<div id="MSearchResultsWindow"> +<iframe src="javascript:void(0)" frameborder="0" + name="MSearchResults" id="MSearchResults"> +</iframe> +</div> + +<div class="header"> + <div class="summary"> +<a href="#groups">Modules</a> </div> + <div class="headertitle"> +<div class="title">Deprecated Modules</div> </div> +</div><!--header--> +<div class="contents"> +<a name="details" id="details"></a><h2 class="groupheader">Detailed Description</h2> +<p>Deprecated modules that will be removed in the next major version (2.0). There are newer MADlib modules that have replaced these functions. </p> +<table class="memberdecls"> +<tr class="heading"><td colspan="2"><h2 class="groupheader"><a name="groups"></a> +Modules</h2></td></tr> +<tr class="memitem:group__grp__indicator"><td class="memItemLeft" align="right" valign="top"> </td><td class="memItemRight" valign="bottom"><a class="el" href="group__grp__indicator.html">Create Indicator Variables</a></td></tr> +<tr class="memdesc:group__grp__indicator"><td class="mdescLeft"> </td><td class="mdescRight">Provides utility functions helpful for data preparation before modeling. 
<br /></td></tr> +<tr class="separator:"><td class="memSeparator" colspan="2"> </td></tr> +<tr class="memitem:group__grp__mlogreg"><td class="memItemLeft" align="right" valign="top"> </td><td class="memItemRight" valign="bottom"><a class="el" href="group__grp__mlogreg.html">Multinomial Logistic Regression</a></td></tr> +<tr class="memdesc:group__grp__mlogreg"><td class="mdescLeft"> </td><td class="mdescRight">Also known as softmax regression, models the relationship between one or more independent variables and a categorical dependent variable. <br /></td></tr> +<tr class="separator:"><td class="memSeparator" colspan="2"> </td></tr> +</table> +</div><!-- contents --> +</div><!-- doc-content --> +<!-- start footer part --> +<div id="nav-path" class="navpath"><!-- id is needed for treeview function! --> + <ul> + <li class="footer">Generated on Mon Aug 6 2018 21:55:39 for MADlib by + <a href="http://www.doxygen.org/index.html"> + <img class="footer" src="doxygen.png" alt="doxygen"/></a> 1.8.14 </li> + </ul> +</div> +</body> +</html>
http://git-wip-us.apache.org/repos/asf/madlib-site/blob/9a2b301d/docs/rc/group__grp__deprecated.js ---------------------------------------------------------------------- diff --git a/docs/rc/group__grp__deprecated.js b/docs/rc/group__grp__deprecated.js new file mode 100644 index 0000000..05ef03b --- /dev/null +++ b/docs/rc/group__grp__deprecated.js @@ -0,0 +1,5 @@ +var group__grp__deprecated = +[ + [ "Create Indicator Variables", "group__grp__indicator.html", null ], + [ "Multinomial Logistic Regression", "group__grp__mlogreg.html", null ] +]; \ No newline at end of file http://git-wip-us.apache.org/repos/asf/madlib-site/blob/9a2b301d/docs/rc/group__grp__desc__stats.html ---------------------------------------------------------------------- diff --git a/docs/rc/group__grp__desc__stats.html b/docs/rc/group__grp__desc__stats.html new file mode 100644 index 0000000..21c7333 --- /dev/null +++ b/docs/rc/group__grp__desc__stats.html @@ -0,0 +1,152 @@ +<!-- HTML header for doxygen 1.8.4--> +<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> +<html xmlns="http://www.w3.org/1999/xhtml"> +<head> +<meta http-equiv="Content-Type" content="text/xhtml;charset=UTF-8"/> +<meta http-equiv="X-UA-Compatible" content="IE=9"/> +<meta name="generator" content="Doxygen 1.8.14"/> +<meta name="keywords" content="madlib,postgres,greenplum,machine learning,data mining,deep learning,ensemble methods,data science,market basket analysis,affinity analysis,pca,lda,regression,elastic net,huber white,proportional hazards,k-means,latent dirichlet allocation,bayes,support vector machines,svm"/> +<title>MADlib: Descriptive Statistics</title> +<link href="tabs.css" rel="stylesheet" type="text/css"/> +<script type="text/javascript" src="jquery.js"></script> +<script type="text/javascript" src="dynsections.js"></script> +<link href="navtree.css" rel="stylesheet" type="text/css"/> +<script type="text/javascript" src="resize.js"></script> +<script type="text/javascript" src="navtreedata.js"></script> +<script type="text/javascript" src="navtree.js"></script> +<script type="text/javascript"> +/* @license magnet:?xt=urn:btih:cf05388f2679ee054f2beb29a391d25f4e673ac3&dn=gpl-2.0.txt GPL-v2 */ + $(document).ready(initResizable); +/* @license-end */</script> +<link href="search/search.css" rel="stylesheet" type="text/css"/> +<script type="text/javascript" src="search/searchdata.js"></script> +<script type="text/javascript" src="search/search.js"></script> +<script type="text/javascript"> +/* @license magnet:?xt=urn:btih:cf05388f2679ee054f2beb29a391d25f4e673ac3&dn=gpl-2.0.txt GPL-v2 */ + $(document).ready(function() { init_search(); }); +/* @license-end */ +</script> +<script type="text/x-mathjax-config"> + MathJax.Hub.Config({ + extensions: ["tex2jax.js", "TeX/AMSmath.js", "TeX/AMSsymbols.js"], + jax: ["input/TeX","output/HTML-CSS"], +}); +</script><script type="text/javascript" async src="https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.2/MathJax.js"></script> +<!-- hack in the navigation tree --> +<script type="text/javascript" src="eigen_navtree_hacks.js"></script> +<link href="doxygen.css" rel="stylesheet" type="text/css" /> +<link href="madlib_extra.css" rel="stylesheet" type="text/css"/> +<!-- google analytics --> +<script> + (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){ + (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o), + 
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m) + })(window,document,'script','//www.google-analytics.com/analytics.js','ga'); + ga('create', 'UA-45382226-1', 'madlib.apache.org'); + ga('send', 'pageview'); +</script> +</head> +<body> +<div id="top"><!-- do not remove this div, it is closed by doxygen! --> +<div id="titlearea"> +<table cellspacing="0" cellpadding="0"> + <tbody> + <tr style="height: 56px;"> + <td id="projectlogo"><a href="http://madlib.apache.org"><img alt="Logo" src="madlib.png" height="50" style="padding-left:0.5em;" border="0"/ ></a></td> + <td style="padding-left: 0.5em;"> + <div id="projectname"> + <span id="projectnumber">1.15</span> + </div> + <div id="projectbrief">User Documentation for Apache MADlib</div> + </td> + <td> <div id="MSearchBox" class="MSearchBoxInactive"> + <span class="left"> + <img id="MSearchSelect" src="search/mag_sel.png" + onmouseover="return searchBox.OnSearchSelectShow()" + onmouseout="return searchBox.OnSearchSelectHide()" + alt=""/> + <input type="text" id="MSearchField" value="Search" accesskey="S" + onfocus="searchBox.OnSearchFieldFocus(true)" + onblur="searchBox.OnSearchFieldFocus(false)" + onkeyup="searchBox.OnSearchFieldChange(event)"/> + </span><span class="right"> + <a id="MSearchClose" href="javascript:searchBox.CloseResultsWindow()"><img id="MSearchCloseImg" border="0" src="search/close.png" alt=""/></a> + </span> + </div> +</td> + </tr> + </tbody> +</table> +</div> +<!-- end header part --> +<!-- Generated by Doxygen 1.8.14 --> +<script type="text/javascript"> +/* @license magnet:?xt=urn:btih:cf05388f2679ee054f2beb29a391d25f4e673ac3&dn=gpl-2.0.txt GPL-v2 */ +var searchBox = new SearchBox("searchBox", "search",false,'Search'); +/* @license-end */ +</script> +</div><!-- top --> +<div id="side-nav" class="ui-resizable side-nav-resizable"> + <div id="nav-tree"> + <div id="nav-tree-contents"> + <div id="nav-sync" class="sync"></div> + </div> + </div> + <div id="splitbar" style="-moz-user-select:none;" + class="ui-resizable-handle"> + </div> +</div> +<script type="text/javascript"> +/* @license magnet:?xt=urn:btih:cf05388f2679ee054f2beb29a391d25f4e673ac3&dn=gpl-2.0.txt GPL-v2 */ +$(document).ready(function(){initNavTree('group__grp__desc__stats.html','');}); +/* @license-end */ +</script> +<div id="doc-content"> +<!-- window showing the filter options --> +<div id="MSearchSelectWindow" + onmouseover="return searchBox.OnSearchSelectShow()" + onmouseout="return searchBox.OnSearchSelectHide()" + onkeydown="return searchBox.OnSearchSelectKey(event)"> +</div> + +<!-- iframe showing the search results (closed by default) --> +<div id="MSearchResultsWindow"> +<iframe src="javascript:void(0)" frameborder="0" + name="MSearchResults" id="MSearchResults"> +</iframe> +</div> + +<div class="header"> + <div class="summary"> +<a href="#groups">Modules</a> </div> + <div class="headertitle"> +<div class="title">Descriptive Statistics<div class="ingroups"><a class="el" href="group__grp__stats.html">Statistics</a></div></div> </div> +</div><!--header--> +<div class="contents"> +<a name="details" id="details"></a><h2 class="groupheader">Detailed Description</h2> +<p>Methods to compute descriptive statistics of a dataset. 
</p> +<table class="memberdecls"> +<tr class="heading"><td colspan="2"><h2 class="groupheader"><a name="groups"></a> +Modules</h2></td></tr> +<tr class="memitem:group__grp__sketches"><td class="memItemLeft" align="right" valign="top"> </td><td class="memItemRight" valign="bottom"><a class="el" href="group__grp__sketches.html">Cardinality Estimators</a></td></tr> +<tr class="memdesc:group__grp__sketches"><td class="mdescLeft"> </td><td class="mdescRight">Methods to estimate the number of unique values contained in data. <br /></td></tr> +<tr class="separator:"><td class="memSeparator" colspan="2"> </td></tr> +<tr class="memitem:group__grp__correlation"><td class="memItemLeft" align="right" valign="top"> </td><td class="memItemRight" valign="bottom"><a class="el" href="group__grp__correlation.html">Covariance and Correlation</a></td></tr> +<tr class="memdesc:group__grp__correlation"><td class="mdescLeft"> </td><td class="mdescRight">Generates a covariance or Pearson correlation matrix for pairs of numeric columns in a table. <br /></td></tr> +<tr class="separator:"><td class="memSeparator" colspan="2"> </td></tr> +<tr class="memitem:group__grp__summary"><td class="memItemLeft" align="right" valign="top"> </td><td class="memItemRight" valign="bottom"><a class="el" href="group__grp__summary.html">Summary</a></td></tr> +<tr class="memdesc:group__grp__summary"><td class="mdescLeft"> </td><td class="mdescRight">Calculates general descriptive statistics for any data table. <br /></td></tr> +<tr class="separator:"><td class="memSeparator" colspan="2"> </td></tr> +</table> +</div><!-- contents --> +</div><!-- doc-content --> +<!-- start footer part --> +<div id="nav-path" class="navpath"><!-- id is needed for treeview function! --> + <ul> + <li class="footer">Generated on Mon Aug 6 2018 21:55:39 for MADlib by + <a href="http://www.doxygen.org/index.html"> + <img class="footer" src="doxygen.png" alt="doxygen"/></a> 1.8.14 </li> + </ul> +</div> +</body> +</html> http://git-wip-us.apache.org/repos/asf/madlib-site/blob/9a2b301d/docs/rc/group__grp__desc__stats.js ---------------------------------------------------------------------- diff --git a/docs/rc/group__grp__desc__stats.js b/docs/rc/group__grp__desc__stats.js new file mode 100644 index 0000000..d49a7aa --- /dev/null +++ b/docs/rc/group__grp__desc__stats.js @@ -0,0 +1,6 @@ +var group__grp__desc__stats = +[ + [ "Cardinality Estimators", "group__grp__sketches.html", "group__grp__sketches" ], + [ "Covariance and Correlation", "group__grp__correlation.html", null ], + [ "Summary", "group__grp__summary.html", null ] +]; \ No newline at end of file http://git-wip-us.apache.org/repos/asf/madlib-site/blob/9a2b301d/docs/rc/group__grp__early__stage.html ---------------------------------------------------------------------- diff --git a/docs/rc/group__grp__early__stage.html b/docs/rc/group__grp__early__stage.html new file mode 100644 index 0000000..307443d --- /dev/null +++ b/docs/rc/group__grp__early__stage.html @@ -0,0 +1,155 @@ +<!-- HTML header for doxygen 1.8.4--> +<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> +<html xmlns="http://www.w3.org/1999/xhtml"> +<head> +<meta http-equiv="Content-Type" content="text/xhtml;charset=UTF-8"/> +<meta http-equiv="X-UA-Compatible" content="IE=9"/> +<meta name="generator" content="Doxygen 1.8.14"/> +<meta name="keywords" content="madlib,postgres,greenplum,machine learning,data mining,deep learning,ensemble methods,data science,market basket 
analysis,affinity analysis,pca,lda,regression,elastic net,huber white,proportional hazards,k-means,latent dirichlet allocation,bayes,support vector machines,svm"/> +<title>MADlib: Early Stage Development</title> +<link href="tabs.css" rel="stylesheet" type="text/css"/> +<script type="text/javascript" src="jquery.js"></script> +<script type="text/javascript" src="dynsections.js"></script> +<link href="navtree.css" rel="stylesheet" type="text/css"/> +<script type="text/javascript" src="resize.js"></script> +<script type="text/javascript" src="navtreedata.js"></script> +<script type="text/javascript" src="navtree.js"></script> +<script type="text/javascript"> +/* @license magnet:?xt=urn:btih:cf05388f2679ee054f2beb29a391d25f4e673ac3&dn=gpl-2.0.txt GPL-v2 */ + $(document).ready(initResizable); +/* @license-end */</script> +<link href="search/search.css" rel="stylesheet" type="text/css"/> +<script type="text/javascript" src="search/searchdata.js"></script> +<script type="text/javascript" src="search/search.js"></script> +<script type="text/javascript"> +/* @license magnet:?xt=urn:btih:cf05388f2679ee054f2beb29a391d25f4e673ac3&dn=gpl-2.0.txt GPL-v2 */ + $(document).ready(function() { init_search(); }); +/* @license-end */ +</script> +<script type="text/x-mathjax-config"> + MathJax.Hub.Config({ + extensions: ["tex2jax.js", "TeX/AMSmath.js", "TeX/AMSsymbols.js"], + jax: ["input/TeX","output/HTML-CSS"], +}); +</script><script type="text/javascript" async src="https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.2/MathJax.js"></script> +<!-- hack in the navigation tree --> +<script type="text/javascript" src="eigen_navtree_hacks.js"></script> +<link href="doxygen.css" rel="stylesheet" type="text/css" /> +<link href="madlib_extra.css" rel="stylesheet" type="text/css"/> +<!-- google analytics --> +<script> + (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){ + (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o), + m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m) + })(window,document,'script','//www.google-analytics.com/analytics.js','ga'); + ga('create', 'UA-45382226-1', 'madlib.apache.org'); + ga('send', 'pageview'); +</script> +</head> +<body> +<div id="top"><!-- do not remove this div, it is closed by doxygen! 
--> +<div id="titlearea"> +<table cellspacing="0" cellpadding="0"> + <tbody> + <tr style="height: 56px;"> + <td id="projectlogo"><a href="http://madlib.apache.org"><img alt="Logo" src="madlib.png" height="50" style="padding-left:0.5em;" border="0"/ ></a></td> + <td style="padding-left: 0.5em;"> + <div id="projectname"> + <span id="projectnumber">1.15</span> + </div> + <div id="projectbrief">User Documentation for Apache MADlib</div> + </td> + <td> <div id="MSearchBox" class="MSearchBoxInactive"> + <span class="left"> + <img id="MSearchSelect" src="search/mag_sel.png" + onmouseover="return searchBox.OnSearchSelectShow()" + onmouseout="return searchBox.OnSearchSelectHide()" + alt=""/> + <input type="text" id="MSearchField" value="Search" accesskey="S" + onfocus="searchBox.OnSearchFieldFocus(true)" + onblur="searchBox.OnSearchFieldFocus(false)" + onkeyup="searchBox.OnSearchFieldChange(event)"/> + </span><span class="right"> + <a id="MSearchClose" href="javascript:searchBox.CloseResultsWindow()"><img id="MSearchCloseImg" border="0" src="search/close.png" alt=""/></a> + </span> + </div> +</td> + </tr> + </tbody> +</table> +</div> +<!-- end header part --> +<!-- Generated by Doxygen 1.8.14 --> +<script type="text/javascript"> +/* @license magnet:?xt=urn:btih:cf05388f2679ee054f2beb29a391d25f4e673ac3&dn=gpl-2.0.txt GPL-v2 */ +var searchBox = new SearchBox("searchBox", "search",false,'Search'); +/* @license-end */ +</script> +</div><!-- top --> +<div id="side-nav" class="ui-resizable side-nav-resizable"> + <div id="nav-tree"> + <div id="nav-tree-contents"> + <div id="nav-sync" class="sync"></div> + </div> + </div> + <div id="splitbar" style="-moz-user-select:none;" + class="ui-resizable-handle"> + </div> +</div> +<script type="text/javascript"> +/* @license magnet:?xt=urn:btih:cf05388f2679ee054f2beb29a391d25f4e673ac3&dn=gpl-2.0.txt GPL-v2 */ +$(document).ready(function(){initNavTree('group__grp__early__stage.html','');}); +/* @license-end */ +</script> +<div id="doc-content"> +<!-- window showing the filter options --> +<div id="MSearchSelectWindow" + onmouseover="return searchBox.OnSearchSelectShow()" + onmouseout="return searchBox.OnSearchSelectHide()" + onkeydown="return searchBox.OnSearchSelectKey(event)"> +</div> + +<!-- iframe showing the search results (closed by default) --> +<div id="MSearchResultsWindow"> +<iframe src="javascript:void(0)" frameborder="0" + name="MSearchResults" id="MSearchResults"> +</iframe> +</div> + +<div class="header"> + <div class="summary"> +<a href="#groups">Modules</a> </div> + <div class="headertitle"> +<div class="title">Early Stage Development</div> </div> +</div><!--header--> +<div class="contents"> +<a name="details" id="details"></a><h2 class="groupheader">Detailed Description</h2> +<p>Implementations which are in an early stage of development. Interface and implementation are subject to change. </p> +<table class="memberdecls"> +<tr class="heading"><td colspan="2"><h2 class="groupheader"><a name="groups"></a> +Modules</h2></td></tr> +<tr class="memitem:group__grp__cg"><td class="memItemLeft" align="right" valign="top"> </td><td class="memItemRight" valign="bottom"><a class="el" href="group__grp__cg.html">Conjugate Gradient</a></td></tr> +<tr class="memdesc:group__grp__cg"><td class="mdescLeft"> </td><td class="mdescRight">Finds the solution to the function \( \boldsymbol Ax = \boldsymbol b \), where \(A\) is a symmetric, positive-definite matrix and \(x\) and \( \boldsymbol b \) are vectors. 
<br /></td></tr> +<tr class="separator:"><td class="memSeparator" colspan="2"> </td></tr> +<tr class="memitem:group__grp__knn"><td class="memItemLeft" align="right" valign="top"> </td><td class="memItemRight" valign="bottom"><a class="el" href="group__grp__knn.html">k-Nearest Neighbors</a></td></tr> +<tr class="memdesc:group__grp__knn"><td class="mdescLeft"> </td><td class="mdescRight">Finds k nearest data points to the given data point and outputs majority vote value of output classes for classification, and average value of target values for regression. <br /></td></tr> +<tr class="separator:"><td class="memSeparator" colspan="2"> </td></tr> +<tr class="memitem:group__grp__bayes"><td class="memItemLeft" align="right" valign="top"> </td><td class="memItemRight" valign="bottom"><a class="el" href="group__grp__bayes.html">Naive Bayes Classification</a></td></tr> +<tr class="memdesc:group__grp__bayes"><td class="mdescLeft"> </td><td class="mdescRight">Constructs a classification model from a dataset where each attribute independently contributes to the probability that a data point belongs to a category. <br /></td></tr> +<tr class="separator:"><td class="memSeparator" colspan="2"> </td></tr> +<tr class="memitem:group__grp__sample"><td class="memItemLeft" align="right" valign="top"> </td><td class="memItemRight" valign="bottom"><a class="el" href="group__grp__sample.html">Random Sampling</a></td></tr> +<tr class="memdesc:group__grp__sample"><td class="mdescLeft"> </td><td class="mdescRight">Provides utility functions for sampling operations. <br /></td></tr> +<tr class="separator:"><td class="memSeparator" colspan="2"> </td></tr> +</table> +</div><!-- contents --> +</div><!-- doc-content --> +<!-- start footer part --> +<div id="nav-path" class="navpath"><!-- id is needed for treeview function! 
--> + <ul> + <li class="footer">Generated on Mon Aug 6 2018 21:55:39 for MADlib by + <a href="http://www.doxygen.org/index.html"> + <img class="footer" src="doxygen.png" alt="doxygen"/></a> 1.8.14 </li> + </ul> +</div> +</body> +</html> http://git-wip-us.apache.org/repos/asf/madlib-site/blob/9a2b301d/docs/rc/group__grp__early__stage.js ---------------------------------------------------------------------- diff --git a/docs/rc/group__grp__early__stage.js b/docs/rc/group__grp__early__stage.js new file mode 100644 index 0000000..407c515 --- /dev/null +++ b/docs/rc/group__grp__early__stage.js @@ -0,0 +1,7 @@ +var group__grp__early__stage = +[ + [ "Conjugate Gradient", "group__grp__cg.html", null ], + [ "k-Nearest Neighbors", "group__grp__knn.html", null ], + [ "Naive Bayes Classification", "group__grp__bayes.html", null ], + [ "Random Sampling", "group__grp__sample.html", null ] +]; \ No newline at end of file http://git-wip-us.apache.org/repos/asf/madlib-site/blob/9a2b301d/docs/rc/group__grp__elasticnet.html ---------------------------------------------------------------------- diff --git a/docs/rc/group__grp__elasticnet.html b/docs/rc/group__grp__elasticnet.html new file mode 100644 index 0000000..e3d2c0f --- /dev/null +++ b/docs/rc/group__grp__elasticnet.html @@ -0,0 +1,764 @@ +<!-- HTML header for doxygen 1.8.4--> +<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> +<html xmlns="http://www.w3.org/1999/xhtml"> +<head> +<meta http-equiv="Content-Type" content="text/xhtml;charset=UTF-8"/> +<meta http-equiv="X-UA-Compatible" content="IE=9"/> +<meta name="generator" content="Doxygen 1.8.14"/> +<meta name="keywords" content="madlib,postgres,greenplum,machine learning,data mining,deep learning,ensemble methods,data science,market basket analysis,affinity analysis,pca,lda,regression,elastic net,huber white,proportional hazards,k-means,latent dirichlet allocation,bayes,support vector machines,svm"/> +<title>MADlib: Elastic Net Regularization</title> +<link href="tabs.css" rel="stylesheet" type="text/css"/> +<script type="text/javascript" src="jquery.js"></script> +<script type="text/javascript" src="dynsections.js"></script> +<link href="navtree.css" rel="stylesheet" type="text/css"/> +<script type="text/javascript" src="resize.js"></script> +<script type="text/javascript" src="navtreedata.js"></script> +<script type="text/javascript" src="navtree.js"></script> +<script type="text/javascript"> +/* @license magnet:?xt=urn:btih:cf05388f2679ee054f2beb29a391d25f4e673ac3&dn=gpl-2.0.txt GPL-v2 */ + $(document).ready(initResizable); +/* @license-end */</script> +<link href="search/search.css" rel="stylesheet" type="text/css"/> +<script type="text/javascript" src="search/searchdata.js"></script> +<script type="text/javascript" src="search/search.js"></script> +<script type="text/javascript"> +/* @license magnet:?xt=urn:btih:cf05388f2679ee054f2beb29a391d25f4e673ac3&dn=gpl-2.0.txt GPL-v2 */ + $(document).ready(function() { init_search(); }); +/* @license-end */ +</script> +<script type="text/x-mathjax-config"> + MathJax.Hub.Config({ + extensions: ["tex2jax.js", "TeX/AMSmath.js", "TeX/AMSsymbols.js"], + jax: ["input/TeX","output/HTML-CSS"], +}); +</script><script type="text/javascript" async src="https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.2/MathJax.js"></script> +<!-- hack in the navigation tree --> +<script type="text/javascript" src="eigen_navtree_hacks.js"></script> +<link href="doxygen.css" rel="stylesheet" type="text/css" /> 
+<link href="madlib_extra.css" rel="stylesheet" type="text/css"/> +<!-- google analytics --> +<script> + (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){ + (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o), + m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m) + })(window,document,'script','//www.google-analytics.com/analytics.js','ga'); + ga('create', 'UA-45382226-1', 'madlib.apache.org'); + ga('send', 'pageview'); +</script> +</head> +<body> +<div id="top"><!-- do not remove this div, it is closed by doxygen! --> +<div id="titlearea"> +<table cellspacing="0" cellpadding="0"> + <tbody> + <tr style="height: 56px;"> + <td id="projectlogo"><a href="http://madlib.apache.org"><img alt="Logo" src="madlib.png" height="50" style="padding-left:0.5em;" border="0"/ ></a></td> + <td style="padding-left: 0.5em;"> + <div id="projectname"> + <span id="projectnumber">1.15</span> + </div> + <div id="projectbrief">User Documentation for Apache MADlib</div> + </td> + <td> <div id="MSearchBox" class="MSearchBoxInactive"> + <span class="left"> + <img id="MSearchSelect" src="search/mag_sel.png" + onmouseover="return searchBox.OnSearchSelectShow()" + onmouseout="return searchBox.OnSearchSelectHide()" + alt=""/> + <input type="text" id="MSearchField" value="Search" accesskey="S" + onfocus="searchBox.OnSearchFieldFocus(true)" + onblur="searchBox.OnSearchFieldFocus(false)" + onkeyup="searchBox.OnSearchFieldChange(event)"/> + </span><span class="right"> + <a id="MSearchClose" href="javascript:searchBox.CloseResultsWindow()"><img id="MSearchCloseImg" border="0" src="search/close.png" alt=""/></a> + </span> + </div> +</td> + </tr> + </tbody> +</table> +</div> +<!-- end header part --> +<!-- Generated by Doxygen 1.8.14 --> +<script type="text/javascript"> +/* @license magnet:?xt=urn:btih:cf05388f2679ee054f2beb29a391d25f4e673ac3&dn=gpl-2.0.txt GPL-v2 */ +var searchBox = new SearchBox("searchBox", "search",false,'Search'); +/* @license-end */ +</script> +</div><!-- top --> +<div id="side-nav" class="ui-resizable side-nav-resizable"> + <div id="nav-tree"> + <div id="nav-tree-contents"> + <div id="nav-sync" class="sync"></div> + </div> + </div> + <div id="splitbar" style="-moz-user-select:none;" + class="ui-resizable-handle"> + </div> +</div> +<script type="text/javascript"> +/* @license magnet:?xt=urn:btih:cf05388f2679ee054f2beb29a391d25f4e673ac3&dn=gpl-2.0.txt GPL-v2 */ +$(document).ready(function(){initNavTree('group__grp__elasticnet.html','');}); +/* @license-end */ +</script> +<div id="doc-content"> +<!-- window showing the filter options --> +<div id="MSearchSelectWindow" + onmouseover="return searchBox.OnSearchSelectShow()" + onmouseout="return searchBox.OnSearchSelectHide()" + onkeydown="return searchBox.OnSearchSelectKey(event)"> +</div> + +<!-- iframe showing the search results (closed by default) --> +<div id="MSearchResultsWindow"> +<iframe src="javascript:void(0)" frameborder="0" + name="MSearchResults" id="MSearchResults"> +</iframe> +</div> + +<div class="header"> + <div class="headertitle"> +<div class="title">Elastic Net Regularization<div class="ingroups"><a class="el" href="group__grp__super.html">Supervised Learning</a> » <a class="el" href="group__grp__regml.html">Regression Models</a></div></div> </div> +</div><!--header--> +<div class="contents"> +<div class="toc"><b>Contents</b><ul> +<li class="level1"> +<a href="#train">Training Function</a> </li> +<li class="level1"> +<a href="#optimizer">Optimizer Parameters</a> 
</li> +<li class="level1"> +<a href="#predict">Prediction Functions</a> </li> +<li class="level1"> +<a href="#examples">Examples</a> </li> +<li class="level1"> +<a href="#background">Technical Background</a> </li> +<li class="level1"> +<a href="#literature">Literature</a> </li> +<li class="level1"> +<a href="#related">Related Topics</a> </li> +</ul> +</div><p>This module implements elastic net regularization [1] for linear and logistic regression. Regularization is a technique often used to prevent overfitting.</p> +<p><a class="anchor" id="train"></a></p><dl class="section user"><dt>Training Function</dt><dd>The training function has the following syntax: <pre class="syntax"> +elastic_net_train( tbl_source, + tbl_result, + col_dep_var, + col_ind_var, + regress_family, + alpha, + lambda_value, + standardize, + grouping_col, + optimizer, + optimizer_params, + excluded, + max_iter, + tolerance + ) +</pre></dd></dl> +<p><b>Arguments</b> </p><dl class="arglist"> +<dt>tbl_source </dt> +<dd><p class="startdd">TEXT. The name of the table containing the training data.</p> +<p class="enddd"></p> +</dd> +<dt>tbl_result </dt> +<dd><p class="startdd">TEXT. Name of the output table containing output model. The output table produced by the <a class="el" href="elastic__net_8sql__in.html#a735038a5090c112505c740a90a203e83" title="Interface for elastic net. ">elastic_net_train()</a> function has the following columns: </p><table class="output"> +<tr> +<th>regress_family </th><td>The regression type: 'gaussian' or 'binomial'. </td></tr> +<tr> +<th>features </th><td>Array of features (independent variables) passed to the algorithm. </td></tr> +<tr> +<th>features_selected </th><td>Array of features selected by the algorithm. </td></tr> +<tr> +<th>coef_nonzero </th><td>Coefficients of the selected features. </td></tr> +<tr> +<th>coef_all </th><td>Coefficients of all features, both selected and unselected. </td></tr> +<tr> +<th>intercept </th><td>Intercept for the model. </td></tr> +<tr> +<th>log_likelihood </th><td>Log of the likelihood value produced by the algorithm. </td></tr> +<tr> +<th>standardize </th><td>BOOLEAN. If data has been normalized, will be set to TRUE. </td></tr> +<tr> +<th>iteration_run </th><td>The number of iterations executed. </td></tr> +</table> +<p class="enddd"></p> +</dd> +<dt>col_dep_var </dt> +<dd><p class="startdd">TEXT. An expression for the dependent variable.</p> +<dl class="section note"><dt>Note</dt><dd>Both <em>col_dep_var</em> and <em>col_ind_var</em> can be valid PostgreSQL expressions. For example, <code>col_dep_var = 'log(y+1)'</code>, and <code>col_ind_var = 'array[exp(x[1]), x[2], 1/(1+x[3])]'</code>. In the binomial case, you can use a Boolean expression, for example, <code>col_dep_var = 'y < 0'</code>.</dd></dl> +</dd> +<dt>col_ind_var </dt> +<dd><p class="startdd">TEXT. An expression for the independent variables. Use <code>'*'</code> to specify all columns of <em>tbl_source</em> except those listed in the <em>excluded</em> string described below. If <em>col_dep_var</em> is a column name, it is automatically excluded from the independent variables. However, if <em>col_dep_var</em> is a valid PostgreSQL expression, any column names used within the expression are only excluded if they are explicitly listed in the <em>excluded</em> argument. Therefore, it is a good idea to add all column names involved in the dependent variable expression to the <em>excluded</em> string.</p> +<p class="enddd"></p> +</dd> +<dt>regress_family </dt> +<dd><p class="startdd">TEXT. 
For regression type, specify either 'gaussian' ('linear') or 'binomial' ('logistic').</p> +<p class="enddd"></p> +</dd> +<dt>alpha </dt> +<dd><p class="startdd">FLOAT8. Elastic net control parameter with a value in the range [0, 1]. A value of 1 means L1 regularization, and a value of 0 means L2 regularization.</p> +<p class="enddd"></p> +</dd> +<dt>lambda_value </dt> +<dd><p class="startdd">FLOAT8. Regularization parameter (must be positive).</p> +<p class="enddd"></p> +</dd> +<dt>standardize (optional) </dt> +<dd><p class="startdd">BOOLEAN, default: TRUE. Whether to normalize the data or not. Setting to TRUE usually yields better results and faster convergence.</p> +<p class="enddd"></p> +</dd> +<dt>grouping_col (optional) </dt> +<dd><p class="startdd">TEXT, default: NULL. A single column or a list of comma-separated columns that divides the input data into discrete groups, resulting in one regression per group. When this value is NULL, no grouping is used and a single model is generated for all data.</p> +<dl class="section note"><dt>Note</dt><dd>Expressions are not currently supported for 'grouping_col'.</dd></dl> +</dd> +<dt>optimizer (optional) </dt> +<dd><p class="startdd">TEXT, default: 'fista'. Name of optimizer, either 'fista' or 'igd'. FISTA [2] is an algorithm with a fast global rate of convergence for solving linear inverse problems. Incremental gradient descent (IGD) is a stochastic approach to minimizing an objective function [4].</p> +<p class="enddd"></p> +</dd> +<dt>optimizer_params (optional) </dt> +<dd><p class="startdd">TEXT, default: NULL. Optimizer parameters, delimited with commas. These parameters differ depending on the value of <em>optimizer</em> parameter. See the descriptions below for details.</p> +<p class="enddd"></p> +</dd> +<dt>excluded (optional) </dt> +<dd><p class="startdd">TEXT, default: NULL. If the <em>col_ind_var</em> input is '*' then <em>excluded</em> can be provided as a comma-delimited list of column names that are to be excluded from the features. For example, <code>'col1, col2'</code>. If the <em>col_ind_var</em> is an array, <em>excluded</em> must be a list of the integer array positions to exclude, for example <code>'1,2'</code>. If this argument is NULL or an empty string, no columns are excluded.</p> +<p class="enddd"></p> +</dd> +<dt>max_iter (optional) </dt> +<dd><p class="startdd">INTEGER, default: 1000. The maximum number of iterations allowed.</p> +<p class="enddd"></p> +</dd> +<dt>tolerance </dt> +<dd>FLOAT8, default: 1e-6. This is the criterion to stop iterating. Both the 'fista' and 'igd' optimizers compute the difference between the log likelihood of two consecutive iterations, and when the difference is smaller than <em>tolerance</em> or the iteration number is larger than <em>max_iter</em>, the computation stops. </dd> +</dl> +<p><a class="anchor" id="optimizer"></a></p><dl class="section user"><dt>Other Parameters</dt><dd></dd></dl> +<p>For <em>optimizer_params</em>, there are several parameters that can be supplied in a string containing a comma-delimited list of name-value pairs . 
All of these named parameters are optional and use the format "<param_name> = <value>".</p> +<p>The parameters described below are organized by category: warmup, cross validation and optimization.</p> +<p><em><b>Warmup parameters</b></em> </p><pre class="syntax"> + $$ + warmup = <value>, + warmup_lambdas = <value>, + warmup_lambda_no = <value>, + warmup_tolerance = <value> + $$ +</pre><dl class="arglist"> +<dt>warmup </dt> +<dd><p class="startdd">Default: FALSE. If <em>warmup</em> is TRUE, a series of strictly descending lambda values are used, which end with the lambda value that the user wants to calculate. A larger lambda gives a sparser solution, and the sparse solution is then used as the initial guess for the next lambda's solution, which can speed up the computation for the next lambda. For larger data sets, this can sometimes accelerate the whole computation and may in fact be faster than computation with only a single lambda value.</p> +<p class="enddd"></p> +</dd> +<dt>warmup_lambdas </dt> +<dd><p class="startdd">Default: NULL. Set of lambda values to use when <em>warmup</em> is TRUE. The default is NULL, which means that lambda values will be automatically generated.</p> +<p class="enddd"></p> +</dd> +<dt>warmup_lambda_no </dt> +<dd><p class="startdd">Default: 15. Number of lambda values used in <em>warm-up</em>. If <em>warmup_lambdas</em> is not NULL, this value is overridden by the number of provided lambda values.</p> +<p class="enddd"></p> +</dd> +<dt>warmup_tolerance </dt> +<dd>The value of tolerance used during warmup. The default value is the same as the <em>tolerance</em> argument described above. </dd> +</dl> +<p><em><b>Cross validation parameters</b></em> </p><dl class="section note"><dt>Note</dt><dd>Please note that for performance reasons, warmup is disabled whenever cross validation is used. Also, cross validation is not supported if grouping is used.</dd></dl> +<pre class="syntax"> + $$ + n_folds = <value>, + validation_result = <value>, + lambda_value = <value>, + n_lambdas = <value>, + alpha = <value> + $$ +</pre><p>Hyperparameter optimization can be carried out using the built-in cross validation mechanism, which is activated by assigning a value greater than 1 to the parameter <em>n_folds</em>.</p> +<p>The cross validation scores are the mean and standard deviation of the accuracy when predicted on the validation fold, averaged over all folds and all rows. For classification, the accuracy metric used is the ratio of correct classifications. For regression, the accuracy metric used is the negative of mean squared error (negative to make it a concave problem, thus selecting <em>max</em> means the highest accuracy).</p> +<p>The values of a parameter to cross validate should be provided in a list. For example, to regularize with the L1 norm and use a lambda value from the set {0.3, 0.4, 0.5}, include 'lambda_value={0.3, 0.4, 0.5}'. Note that the use of '{}' and '[]' are both valid here.</p> +<dl class="arglist"> +<dt>n_folds </dt> +<dd><p class="startdd">Default: 0. Number of folds (k). Must be at least 2 to activate cross validation. If a value of k > 2 is specified, each fold is then used as a validation set once, while the other k - 1 folds form the training set. </p> +<p class="enddd"></p> +</dd> +<dt>validation_result </dt> +<dd><p class="startdd">Default: NULL. Name of the table to store the cross validation results, including the values of parameters and their averaged error values. The table is only created if the name is not NULL. 
</p> +<p class="enddd"></p> +</dd> +<dt>lambda_value </dt> +<dd><p class="startdd">Default: NULL. Set of regularization values to be used for cross validation. The default is NULL, which means that lambda values will be automatically generated.</p> +<p class="enddd"></p> +</dd> +<dt>n_lambdas </dt> +<dd><p class="startdd">Default: 15. Number of lambdas to cross validate over. If a list of lambda values is not provided in the <em>lambda_value</em> set above, this parameter can be used to autogenerate the set of lambdas. If the <em>lambda_value</em> set is not NULL, this value is overridden by the number of provided lambda values. </p> +<dl class="section note"><dt>Note</dt><dd>If you want to cross validate over alpha only and not lambda, then set <em>lambda_value</em> to NULL and <em>n_lambdas</em> to 0. In this case, cross validation will be done on the set of <em>alpha</em> values specified in the next parameter. The lambda value used will be the one specified in the main function call at the top of this page.</dd></dl> +</dd> +<dt>alpha </dt> +<dd>Elastic net control parameter. This is a list of values to apply cross validation on. (Note that alpha values are not autogenerated.) If not specified, the alpha value used will be the one specified in the main function call at the top of this page. </dd> +</dl> +<p><em><b>Optimizer parameters</b></em></p> +<p><b>FISTA</b> Parameters </p><pre class="syntax"> + $$ + max_stepsize = <value>, + eta = <value>, + use_active_set = <value>, + activeset_tolerance = <value>, + random_stepsize = <value> + $$ +</pre><dl class="arglist"> +<dt>max_stepsize </dt> +<dd><p class="startdd">Default: 4.0. Initial backtracking step size. At each iteration, the algorithm first tries <em>stepsize = max_stepsize</em>, and if it does not work out, it then tries a smaller step size, <em>stepsize = stepsize/eta</em>, where <em>eta</em> must be larger than 1. At first glance, this seems to perform repeated iterations for even one step, but using a larger step size actually greatly increases the computation speed and minimizes the total number of iterations. A careful choice of <em>max_stepsize</em> can decrease the computation time by more than 10 times.</p> +<p class="enddd"></p> +</dd> +<dt>eta </dt> +<dd><p class="startdd">Default: 2.0 If stepsize does not work, <em>stepsize/<em>eta</em> is</em> tried. Must be greater than 1. </p> +<p class="enddd"></p> +</dd> +<dt>use_active_set </dt> +<dd><p class="startdd">Default: FALSE. If <em>use_active_set</em> is TRUE, an active-set method is used to speed up the computation. Considerable speedup is obtained by organizing the iterations around the active set of features—those with nonzero coefficients. After a complete cycle through all the variables, we iterate only on the active set until convergence. If another complete cycle does not change the active set, we are done. Otherwise, the process is repeated.</p> +<p class="enddd"></p> +</dd> +<dt>activeset_tolerance </dt> +<dd><p class="startdd">The value of tolerance used during active set calculation. The default value is the same as the <em>tolerance</em> argument described above. </p> +<p class="enddd"></p> +</dd> +<dt>random_stepsize </dt> +<dd>Default: FALSE. Whether to add some randomness to the step size. Sometimes, this can speed up the calculation. 
</dd> +</dl> +<p><b>IGD</b> parameters </p><pre class="syntax"> + $$ + stepsize = <value>, + step_decay = <value>, + threshold = <value>, + parallel = <value> + $$ +</pre> <dl class="arglist"> +<dt>stepsize </dt> +<dd><p class="startdd">The default is 0.01.</p> +<p class="enddd"></p> +</dd> +<dt>step_decay </dt> +<dd><p class="startdd">The actual stepsize used for current step is (previous stepsize) / exp(step_decay). The default value is 0, which means that a constant stepsize is used in IGD.</p> +<p class="enddd"></p> +</dd> +<dt>threshold </dt> +<dd><p class="startdd">Default: 1e-10. When a coefficient is really small, set this coefficient to be 0.</p> +<p>Due to the stochastic nature of SGD, we can only obtain very small values for the fitting coefficients. Therefore, <em>threshold</em> is needed at the end of the computation to screen out tiny values and hard-set them to zeros. This is accomplished as follows: (1) multiply each coefficient with the standard deviation of the corresponding feature; (2) compute the average of absolute values of re-scaled coefficients; (3) divide each rescaled coefficient with the average, and if the resulting absolute value is smaller than <em>threshold</em>, set the original coefficient to zero.</p> +<p class="enddd"></p> +</dd> +<dt>parallel </dt> +<dd><p class="startdd">Whether to run the computation on multiple segments. The default is TRUE.</p> +<p class="enddd">SGD is a sequential algorithm in nature. When running in a distributed manner, each segment of the data runs its own SGD model and then the models are averaged to get a model for each iteration. This averaging might slow down the convergence speed, but it affords the ability to process large datasets on a cluster of machines. This algorithm, therefore, provides the <em>parallel</em> option to allow you to choose whether to do parallel computation. </p> +</dd> +</dl> +<p><a class="anchor" id="predict"></a></p><dl class="section user"><dt>Prediction Function</dt><dd></dd></dl> +<h4>Per-Tuple Prediction</h4> +<p>The prediction function returns a double value for the Gaussian family and a Boolean value for the Binomial family.</p> +<p>The predict function has the following syntax (<a class="el" href="elastic__net_8sql__in.html#a96db4ff4ba3ea363fafbf6c036c19fae" title="Prediction for linear models use learned coefficients for a given example. ">elastic_net_gaussian_predict()</a> and <a class="el" href="elastic__net_8sql__in.html#aa78cde79f1f2caa7c5b38f933001d793" title="Prediction for logistic models use learned coefficients for a given example. ">elastic_net_binomial_predict()</a>): </p><pre class="syntax"> +elastic_net_<family>_predict( + coefficients, + intercept, + ind_var + ) +</pre><p><b>Arguments</b> </p><dl class="arglist"> +<dt>coefficients </dt> +<dd>DOUBLE PRECISION[]. Fitting coefficients, usually <em>coef_all</em> or <em>coef_nonzero</em>. </dd> +<dt>intercept </dt> +<dd>DOUBLE PRECISION. Intercept for the model. </dd> +<dt>ind_var </dt> +<dd>DOUBLE PRECISION[]. Independent variables that correspond to coefficients. Use <em>features</em> column in <em>tbl_result</em> for <em>coef_all</em>, and <em>features_selected</em> for <em>coef_nonzero</em>. See the <a href="#additional_example">examples for this case below</a>. <dl class="section note"><dt>Note</dt><dd>Unexpected results or errors may be returned in the case that this argument <em>ind_var</em> is not specified properly. 
</dd></dl> +</dd> +</dl> +<p>For the binomial family, there is a function (<a class="el" href="elastic__net_8sql__in.html#a308718fd5234bc1007b971a639aadf71" title="Compute the probability of belonging to the True class for a given observation. ">elastic_net_binomial_prob()</a>) that outputs the probability of the instance being TRUE: </p><pre class="syntax"> +elastic_net_binomial_prob( + coefficients, + intercept, + ind_var + ) +</pre><h4>Per-Table Prediction</h4> +<p>Alternatively, you can use another prediction function that stores the prediction result in a table (<a class="el" href="elastic__net_8sql__in.html#a3578608204ac9b2d3442ff42977f632b" title="Prediction and put the result in a table can be used together with General-CV. ">elastic_net_predict()</a>). This is useful if you want to use elastic net together with the general cross validation function. </p><pre class="syntax"> +elastic_net_predict( tbl_model, + tbl_new_sourcedata, + col_id, + tbl_predict + ) +</pre><p> <b>Arguments</b> </p><dl class="arglist"> +<dt>tbl_model </dt> +<dd>TEXT. Name of the table containing the output from the training function. </dd> +<dt>tbl_new_sourcedata </dt> +<dd>TEXT. Name of the table containing the new source data. </dd> +<dt>col_id </dt> +<dd>TEXT. Unique ID associated with each row. </dd> +<dt>tbl_predict </dt> +<dd>TEXT. Name of table to store the prediction result. </dd> +</dl> +<p>You do not need to specify whether the model is "linear" or "logistic" because this information is already included in the <em>tbl_model</em> table.</p> +<p><a class="anchor" id="examples"></a></p><dl class="section user"><dt>Examples</dt><dd></dd></dl> +<ol type="1"> +<li>Display online help for the <a class="el" href="elastic__net_8sql__in.html#a735038a5090c112505c740a90a203e83" title="Interface for elastic net. 
">elastic_net_train()</a> function: <pre class="example"> +SELECT madlib.elastic_net_train(); +</pre></li> +<li>Create an input data set of house prices and features: <pre class="example"> +DROP TABLE IF EXISTS houses; +CREATE TABLE houses ( id INT, + tax INT, + bedroom INT, + bath FLOAT, + price INT, + size INT, + lot INT, + zipcode INT); +INSERT INTO houses (id, tax, bedroom, bath, price, size, lot, zipcode) VALUES +(1 , 590 , 2 , 1 , 50000 , 770 , 22100 , 94301), +(2 , 1050 , 3 , 2 , 85000 , 1410 , 12000 , 94301), +(3 , 20 , 3 , 1 , 22500 , 1060 , 3500 , 94301), +(4 , 870 , 2 , 2 , 90000 , 1300 , 17500 , 94301), +(5 , 1320 , 3 , 2 , 133000 , 1500 , 30000 , 94301), +(6 , 1350 , 2 , 1 , 90500 , 820 , 25700 , 94301), +(7 , 2790 , 3 , 2.5 , 260000 , 2130 , 25000 , 94301), +(8 , 680 , 2 , 1 , 142500 , 1170 , 22000 , 94301), +(9 , 1840 , 3 , 2 , 160000 , 1500 , 19000 , 94301), +(10 , 3680 , 4 , 2 , 240000 , 2790 , 20000 , 94301), +(11 , 1660 , 3 , 1 , 87000 , 1030 , 17500 , 94301), +(12 , 1620 , 3 , 2 , 118600 , 1250 , 20000 , 94301), +(13 , 3100 , 3 , 2 , 140000 , 1760 , 38000 , 94301), +(14 , 2070 , 2 , 3 , 148000 , 1550 , 14000 , 94301), +(15 , 650 , 3 , 1.5 , 65000 , 1450 , 12000 , 94301), +(16 , 770 , 2 , 2 , 91000 , 1300 , 17500 , 76010), +(17 , 1220 , 3 , 2 , 132300 , 1500 , 30000 , 76010), +(18 , 1150 , 2 , 1 , 91100 , 820 , 25700 , 76010), +(19 , 2690 , 3 , 2.5 , 260011 , 2130 , 25000 , 76010), +(20 , 780 , 2 , 1 , 141800 , 1170 , 22000 , 76010), +(21 , 1910 , 3 , 2 , 160900 , 1500 , 19000 , 76010), +(22 , 3600 , 4 , 2 , 239000 , 2790 , 20000 , 76010), +(23 , 1600 , 3 , 1 , 81010 , 1030 , 17500 , 76010), +(24 , 1590 , 3 , 2 , 117910 , 1250 , 20000 , 76010), +(25 , 3200 , 3 , 2 , 141100 , 1760 , 38000 , 76010), +(26 , 2270 , 2 , 3 , 148011 , 1550 , 14000 , 76010), +(27 , 750 , 3 , 1.5 , 66000 , 1450 , 12000 , 76010); +</pre></li> +<li>Train the model: <pre class="example"> +DROP TABLE IF EXISTS houses_en, houses_en_summary; +SELECT madlib.elastic_net_train( 'houses', -- Source table + 'houses_en', -- Result table + 'price', -- Dependent variable + 'array[tax, bath, size]', -- Independent variable + 'gaussian', -- Regression family + 0.5, -- Alpha value + 0.1, -- Lambda value + TRUE, -- Standardize + NULL, -- Grouping column(s) + 'fista', -- Optimizer + '', -- Optimizer parameters + NULL, -- Excluded columns + 10000, -- Maximum iterations + 1e-6 -- Tolerance value + ); +</pre></li> +<li>View the resulting model: <pre class="example"> +-- Turn on expanded display to make it easier to read results. 
+\x on +SELECT * FROM houses_en; +</pre> Result: <pre class="result"> +-[ RECORD 1 ]-----+------------------------------------------- +family | gaussian +features | {tax,bath,size} +features_selected | {tax,bath,size} +coef_nonzero | {22.785201806,10707.9664343,54.7959774173} +coef_all | {22.785201806,10707.9664343,54.7959774173} +intercept | -7798.71393905 +log_likelihood | -512248641.971 +standardize | t +iteration_run | 10000 +</pre></li> +<li>Use the prediction function to evaluate residuals: <pre class="example"> +\x off +SELECT id, price, predict, price - predict AS residual +FROM ( + SELECT + houses.*, + madlib.elastic_net_gaussian_predict( + m.coef_all, -- Coefficients + m.intercept, -- Intercept + ARRAY[tax,bath,size] -- Features (corresponding to coefficients) + ) AS predict + FROM houses, houses_en m) s +ORDER BY id; +</pre> Result: <pre class="result"> + id | price | predict | residual +----+--------+------------------+------------------- + 1 | 50000 | 58545.391894031 | -8545.391894031 + 2 | 85000 | 114804.077663003 | -29804.077663003 + 3 | 22500 | 61448.835664388 | -38948.835664388 + 4 | 90000 | 104675.17768007 | -14675.17768007 + 5 | 133000 | 125887.70644358 | 7112.29355642 + 6 | 90500 | 78601.843595366 | 11898.156404634 + 7 | 260000 | 199257.358231079 | 60742.641768921 + 8 | 142500 | 82514.559377081 | 59985.440622919 + 9 | 160000 | 137735.93215082 | 22264.06784918 + 10 | 240000 | 250347.627648647 | -10347.627648647 + 11 | 87000 | 97172.428263539 | -10172.428263539 + 12 | 118600 | 119024.150628605 | -424.150628604999 + 13 | 140000 | 180692.127913358 | -40692.127913358 + 14 | 148000 | 156424.249824545 | -8424.249824545 + 15 | 65000 | 102527.938104575 | -37527.938104575 + 16 | 91000 | 102396.67273637 | -11396.67273637 + 17 | 132300 | 123609.20149988 | 8690.79850012 + 18 | 91100 | 74044.833707966 | 17055.166292034 + 19 | 260011 | 196978.853287379 | 63032.146712621 + 20 | 141800 | 84793.064320781 | 57006.935679219 + 21 | 160900 | 139330.88561141 | 21569.11438859 + 22 | 239000 | 248524.823693687 | -9524.82369368701 + 23 | 81010 | 95805.325297319 | -14795.325297319 + 24 | 117910 | 118340.599145495 | -430.599145494998 + 25 | 141100 | 182970.632857058 | -41870.632857058 + 26 | 148011 | 160981.259711945 | -12970.259711945 + 27 | 66000 | 104806.443048275 | -38806.443048275 +</pre></li> +</ol> +<h4>Example with Grouping</h4> +<ol type="1"> +<li>Reuse the houses table above and train the model by grouping on zip code: <pre class="example"> +DROP TABLE IF EXISTS houses_en1, houses_en1_summary; +SELECT madlib.elastic_net_train( 'houses', -- Source table + 'houses_en1', -- Result table + 'price', -- Dependent variable + 'array[tax, bath, size]', -- Independent variable + 'gaussian', -- Regression family + 0.5, -- Alpha value + 0.1, -- Lambda value + TRUE, -- Standardize + 'zipcode', -- Grouping column(s) + 'fista', -- Optimizer + '', -- Optimizer parameters + NULL, -- Excluded columns + 10000, -- Maximum iterations + 1e-6 -- Tolerance value + ); +</pre></li> +<li>View the resulting model with a separate model for each group: <pre class="example"> +-- Turn on expanded display to make it easier to read results. 
+\x on +SELECT * FROM houses_en1; +</pre> Result: <pre class="result"> +-[ RECORD 1 ]-----+-------------------------------------------- +zipcode | 94301 +family | gaussian +features | {tax,bath,size} +features_selected | {tax,bath,size} +coef_nonzero | {27.0542096962,12351.5244083,47.5833289771} +coef_all | {27.0542096962,12351.5244083,47.5833289771} +intercept | -7191.19791597 +log_likelihood | -519199964.967 +standardize | t +iteration_run | 10000 +-[ RECORD 2 ]-----+-------------------------------------------- +zipcode | 76010 +family | gaussian +features | {tax,bath,size} +features_selected | {tax,bath,size} +coef_nonzero | {15.6325953499,10166.6608469,57.8689916035} +coef_all | {15.6325953499,10166.6608469,57.8689916035} +intercept | 513.912201627 +log_likelihood | -538806528.45 +standardize | t +iteration_run | 10000 +</pre></li> +<li>Use the prediction function to evaluate residuals: <pre class="example"> +\x off +SELECT madlib.elastic_net_predict( + 'houses_en1', -- Model table + 'houses', -- New source data table + 'id', -- Unique ID associated with each row + 'houses_en1_prediction' -- Table to store prediction result + ); +SELECT houses.id, + houses.price, + houses_en1_prediction.prediction, + houses.price - houses_en1_prediction.prediction AS residual +FROM houses_en1_prediction, houses +WHERE houses.id = houses_en1_prediction.id ORDER BY id; +</pre></li> +</ol> +<p><a class="anchor" id="additional_example"></a></p><h4>Example where coef_nonzero is different from coef_all</h4> +<ol type="1"> +<li>Reuse the <a href="#examples">houses</a> table above and train the model with alpha=1 (L1) and a large lambda value (30000). <pre class="example"> +DROP TABLE IF EXISTS houses_en2, houses_en2_summary; +SELECT madlib.elastic_net_train( 'houses', -- Source table + 'houses_en2', -- Result table + 'price', -- Dependent variable + 'array[tax, bath, size]', -- Independent variable + 'gaussian', -- Regression family + 1, -- Alpha value + 30000, -- Lambda value + TRUE, -- Standardize + NULL, -- Grouping column(s) + 'fista', -- Optimizer + '', -- Optimizer parameters + NULL, -- Excluded columns + 10000, -- Maximum iterations + 1e-6 -- Tolerance value + ); +</pre></li> +<li>View the resulting model and see coef_nonzero is different from coef_all: <pre class="example"> +-- Turn on expanded display to make it easier to read results. +\x on +SELECT * FROM houses_en2; +</pre> Result: <pre class="result"> +-[ RECORD 1 ]-----+-------------------------------- +family | gaussian +features | {tax,bath,size} +features_selected | {tax,size} +coef_nonzero | {6.94744249834,29.7137297658} +coef_all | {6.94744249834,0,29.7137297658} +intercept | 74445.7039382 +log_likelihood | -1635348585.07 +standardize | t +iteration_run | 151 +</pre></li> +<li>We can still use the prediction function with <em>coef_all</em> to evaluate residuals: <pre class="example"> +\x off +SELECT id, price, predict, price - predict AS residual +FROM ( + SELECT + houses.*, + madlib.elastic_net_gaussian_predict( + m.coef_all, -- All coefficients + m.intercept, -- Intercept + ARRAY[tax,bath,size] -- All features + ) AS predict + FROM houses, houses_en2 m) s +ORDER BY id; +</pre></li> +<li>We can speed up the prediction function with <em>coef_nonzero</em> to evaluate residuals. 
This requires the user to examine the <em>features_selected</em> column in the result table to construct the correct set of independent variables to provide to the prediction function: <pre class="example">
+\x off
+SELECT id, price, predict, price - predict AS residual
+FROM (
+    SELECT
+        houses.*,
+        madlib.elastic_net_gaussian_predict(
+            m.coef_nonzero,   -- Non-zero coefficients
+            m.intercept,      -- Intercept
+            ARRAY[tax,size]   -- Features corresponding to non-zero coefficients
+            ) AS predict
+    FROM houses, houses_en2 m) s
+ORDER BY id;
+</pre> The two queries above will result in the same residuals: <pre class="result">
+ id | price | predict | residual
+----+--------+------------------+-------------------
+ 1 | 50000 | 101424.266931887 | -51424.2669318866
+ 2 | 85000 | 123636.877531235 | -38636.877531235
+ 3 | 22500 | 106081.206339915 | -83581.2063399148
+ 4 | 90000 | 119117.827607296 | -29117.8276072958
+ 5 | 133000 | 128186.922684709 | 4813.0773152912
+ 6 | 90500 | 108190.009718915 | -17690.009718915
+ 7 | 260000 | 157119.312909723 | 102880.687090277
+ 8 | 142500 | 113935.028663057 | 28564.9713369428
+ 9 | 160000 | 131799.592783846 | 28200.4072161544
+ 10 | 240000 | 182913.598378673 | 57086.4016213268
+ 11 | 87000 | 116583.600144218 | -29583.6001442184
+ 12 | 118600 | 122842.722992761 | -4242.7229927608
+ 13 | 140000 | 148278.940070862 | -8278.94007086201
+ 14 | 148000 | 134883.191046754 | 13116.8089532462
+ 15 | 65000 | 122046.449722531 | -57046.449722531
+ 16 | 91000 | 118423.083357462 | -27423.0833574618
+ 17 | 132300 | 127492.178434875 | 4807.8215651252
+ 18 | 91100 | 106800.521219247 | -15700.521219247
+ 19 | 260011 | 156424.568659889 | 103586.431340111
+ 20 | 141800 | 114629.772912891 | 27170.2270871088
+ 21 | 160900 | 132285.913758729 | 28614.0862412706
+ 22 | 239000 | 182357.802978806 | 56642.197021194
+ 23 | 81010 | 116166.753594318 | -35156.753594318
+ 24 | 117910 | 122634.299717811 | -4724.29971781059
+ 25 | 141100 | 148973.684320696 | -7873.68432069599
+ 26 | 148011 | 136272.679546422 | 11738.3204535782
+ 27 | 66000 | 122741.193972365 | -56741.193972365
+(27 rows)
+</pre></li>
+</ol>
+<h4>Example with Cross Validation</h4>
+<ol type="1">
+<li>Reuse the houses table above. Here we use 3-fold cross validation with 3 automatically generated lambda values and 3 specified alpha values. (This can take some time to run since elastic net is effectively being called 27 times for these combinations, then a 28th time for the whole dataset.)
<pre class="example"> +DROP TABLE IF EXISTS houses_en3, houses_en3_summary, houses_en3_cv; +SELECT madlib.elastic_net_train( 'houses', -- Source table + 'houses_en3', -- Result table + 'price', -- Dependent variable + 'array[tax, bath, size]', -- Independent variable + 'gaussian', -- Regression family + 0.5, -- Alpha value + 0.1, -- Lambda value + TRUE, -- Standardize + NULL, -- Grouping column(s) + 'fista', -- Optimizer + $$ n_folds = 3, -- Cross validation parameters + validation_result=houses_en3_cv, + n_lambdas = 3, + alpha = {0, 0.1, 1} + $$, + NULL, -- Excluded columns + 10000, -- Maximum iterations + 1e-6 -- Tolerance value + ); +SELECT * FROM houses_en3; +</pre> <pre class="result"> +-[ RECORD 1 ]-----+-------------------------------------------- +family | gaussian +features | {tax,bath,size} +features_selected | {tax,bath,size} +coef_nonzero | {22.4584188479,11657.0739045,52.1624090811} +coef_all | {22.4584188479,11657.0739045,52.1624090811} +intercept | -5067.33396522 +log_likelihood | -543193170.15 +standardize | t +iteration_run | 10000 +</pre></li> +<li>Details of the cross validation: <pre class="example"> +SELECT * FROM houses_en3_cv ORDER BY mean_neg_loss DESC; +</pre> <pre class="result"> + alpha | lambda_value | mean_neg_loss | std_neg_loss +<br /> +-------+--------------+------------------------------------------+ + 0.0 | 0.1 | -36094.4685768 | 10524.4473253 + 0.1 | 0.1 | -36136.2448004 | 10682.4136993 + 1.0 | 100.0 | -37007.9496501 | 12679.3781975 + 1.0 | 0.1 | -37018.1019927 | 12716.7438015 + 0.1 | 100.0 | -59275.6940173 | 9764.50064237 + 0.0 | 100.0 | -59380.252681 | 9763.26373034 + 1.0 | 100000.0 | -60353.0220769 | 9748.10305107 + 0.1 | 100000.0 | {large neg number} | {large pos number} + 0.0 | 100000.0 | {large neg number} | {large pos number} +(9 rows) +</pre></li> +</ol> +<p><a class="anchor" id="notes"></a></p><dl class="section user"><dt>Note</dt><dd>It is <b>strongly</b> <b>recommended</b> that you run <code><a class="el" href="elastic__net_8sql__in.html#a735038a5090c112505c740a90a203e83" title="Interface for elastic net. ">elastic_net_train()</a></code> on a subset of the data with a limited <em>max_iter</em> before applying it to the full data set with a large <em>max_iter</em>. In the pre-run, you can adjust the parameters to get the best performance and then apply the best set of parameters to the whole data set.</dd></dl> +<p><a class="anchor" id="background"></a></p><dl class="section user"><dt>Technical Background</dt><dd></dd></dl> +<p>Elastic net regularization seeks to find a weight vector that, for any given training example set, minimizes: </p><p class="formulaDsp"> +\[\min_{w \in R^N} L(w) + \lambda \left(\frac{(1-\alpha)}{2} \|w\|_2^2 + \alpha \|w\|_1 \right)\] +</p> +<p> where \(L\) is the metric function that the user wants to minimize. Here \( \alpha \in [0,1] \) and \( lambda \geq 0 \). 
+<p><a class="anchor" id="background"></a></p><dl class="section user"><dt>Technical Background</dt><dd></dd></dl>
+<p>Elastic net regularization seeks to find a weight vector that, for any given training example set, minimizes: </p><p class="formulaDsp">
+\[\min_{w \in R^N} L(w) + \lambda \left(\frac{(1-\alpha)}{2} \|w\|_2^2 + \alpha \|w\|_1 \right)\]
+</p>
+<p> where \(L\) is the metric function that the user wants to minimize. Here \( \alpha \in [0,1] \) and \( \lambda \geq 0 \). If \(\alpha = 0\), we have ridge regularization (also known as Tikhonov regularization), and if \(\alpha = 1\), we have LASSO regularization.</p>
+<p>For the Gaussian response family (or linear model), we have </p><p class="formulaDsp">
+\[L(\vec{w}) = \frac{1}{2}\left[\frac{1}{M} \sum_{m=1}^M (w^{t} x_m + w_{0} - y_m)^2 \right] \]
+</p>
+<p>For the Binomial response family (or logistic model), we have </p><p class="formulaDsp">
+\[ L(\vec{w}) = \sum_{m=1}^M\left[y_m \log\left(1 + e^{-(w_0 + \vec{w}\cdot\vec{x}_m)}\right) + (1-y_m) \log\left(1 + e^{w_0 + \vec{w}\cdot\vec{x}_m}\right)\right]\ , \]
+</p>
+<p> where \(y_m \in \{0,1\}\).</p>
+<p>To get better convergence, one can rescale the value of each element of \(x\) </p><p class="formulaDsp">
+\[ x' \leftarrow \frac{x - \bar{x}}{\sigma_x} \]
+</p>
+<p> and for the Gaussian case we also let </p><p class="formulaDsp">
+\[y' \leftarrow y - \bar{y} \]
+</p>
+<p> and then minimize with the regularization terms. At the end of the calculation, the original scales are restored and an intercept term is obtained as a by-product.</p>
+<p>Note that fitting after scaling is not equivalent to fitting on the unscaled data.</p>
+<p><a class="anchor" id="literature"></a></p><dl class="section user"><dt>Literature</dt><dd></dd></dl>
+<p>[1] Elastic net regularization, <a href="http://en.wikipedia.org/wiki/Elastic_net_regularization">http://en.wikipedia.org/wiki/Elastic_net_regularization</a></p>
+<p>[2] Beck, A. and M. Teboulle (2009), A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM J. on Imaging Sciences 2(1), 183-202.</p>
+<p>[3] Shai Shalev-Shwartz and Ambuj Tewari, Stochastic Methods for L1 Regularized Loss Minimization. Proceedings of the 26th International Conference on Machine Learning, Montreal, Canada, 2009.</p>
+<p>[4] Stochastic gradient descent, <a href="https://en.wikipedia.org/wiki/Stochastic_gradient_descent">https://en.wikipedia.org/wiki/Stochastic_gradient_descent</a></p>
+<p><a class="anchor" id="related"></a></p><dl class="section user"><dt>Related Topics</dt><dd></dd></dl>
+<p>File <a class="el" href="elastic__net_8sql__in.html" title="SQL functions for elastic net regularization. ">elastic_net.sql_in</a> documenting the SQL functions. </p>
+</div><!-- contents -->
+</div><!-- doc-content -->
+<!-- start footer part -->
+<div id="nav-path" class="navpath"><!-- id is needed for treeview function! -->
+  <ul>
+    <li class="footer">Generated on Mon Aug 6 2018 21:55:39 for MADlib by
+    <a href="http://www.doxygen.org/index.html">
+    <img class="footer" src="doxygen.png" alt="doxygen"/></a> 1.8.14 </li>
+  </ul>
+</div>
+</body>
+</html>