http://git-wip-us.apache.org/repos/asf/spark-website/blob/26c57a24/site/docs/2.3.0/api/R/write.jdbc.html ---------------------------------------------------------------------- diff --git a/site/docs/2.3.0/api/R/write.jdbc.html b/site/docs/2.3.0/api/R/write.jdbc.html new file mode 100644 index 0000000..4045f3a --- /dev/null +++ b/site/docs/2.3.0/api/R/write.jdbc.html @@ -0,0 +1,148 @@ +<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd"><html xmlns="http://www.w3.org/1999/xhtml"><head><title>R: Save the content of SparkDataFrame to an external database...</title> +<meta http-equiv="Content-Type" content="text/html; charset=utf-8" /> +<link rel="stylesheet" type="text/css" href="R.css" /> + +<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/styles/github.min.css"> +<script src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/highlight.min.js"></script> +<script src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/languages/r.min.js"></script> +<script>hljs.initHighlightingOnLoad();</script> +</head><body> + +<table width="100%" summary="page for write.jdbc {SparkR}"><tr><td>write.jdbc {SparkR}</td><td style="text-align: right;">R Documentation</td></tr></table> + +<h2>Save the content of SparkDataFrame to an external database table via JDBC.</h2> + +<h3>Description</h3> + +<p>Save the content of the SparkDataFrame to an external database table via JDBC. Additional JDBC +database connection properties can be set (...) +</p> + + +<h3>Usage</h3> + +<pre> +write.jdbc(x, url, tableName, mode = "error", ...) + +## S4 method for signature 'SparkDataFrame,character,character' +write.jdbc(x, url, tableName, + mode = "error", ...) +</pre> + + +<h3>Arguments</h3> + +<table summary="R argblock"> +<tr valign="top"><td><code>x</code></td> +<td> +<p>a SparkDataFrame.</p> +</td></tr> +<tr valign="top"><td><code>url</code></td> +<td> +<p>JDBC database url of the form <code>jdbc:subprotocol:subname</code>.</p> +</td></tr> +<tr valign="top"><td><code>tableName</code></td> +<td> +<p>the name of the table in the external database.</p> +</td></tr> +<tr valign="top"><td><code>mode</code></td> +<td> +<p>one of 'append', 'overwrite', 'error', 'errorifexists', 'ignore' +save mode (it is 'error' by default)</p> +</td></tr> +<tr valign="top"><td><code>...</code></td> +<td> +<p>additional JDBC database connection properties.</p> +</td></tr> +</table> + + +<h3>Details</h3> + +<p>Also, mode is used to specify the behavior of the save operation when +data already exists in the data source. There are four modes: +</p> + +<ul> +<li><p> 'append': Contents of this SparkDataFrame are expected to be appended to existing data. +</p> +</li> +<li><p> 'overwrite': Existing data is expected to be overwritten by the contents of this +SparkDataFrame. +</p> +</li> +<li><p> 'error' or 'errorifexists': An exception is expected to be thrown. +</p> +</li> +<li><p> 'ignore': The save operation is expected to not save the contents of the SparkDataFrame +and to not change the existing data.
+</p> +</li></ul> + + + +<h3>Note</h3> + +<p>write.jdbc since 2.0.0 +</p> + + +<h3>See Also</h3> + +<p>Other SparkDataFrame functions: <code><a href="SparkDataFrame.html">SparkDataFrame-class</a></code>, +<code><a href="summarize.html">agg</a></code>, <code><a href="alias.html">alias</a></code>, +<code><a href="arrange.html">arrange</a></code>, <code><a href="as.data.frame.html">as.data.frame</a></code>, +<code><a href="attach.html">attach,SparkDataFrame-method</a></code>, +<code><a href="broadcast.html">broadcast</a></code>, <code><a href="cache.html">cache</a></code>, +<code><a href="checkpoint.html">checkpoint</a></code>, <code><a href="coalesce.html">coalesce</a></code>, +<code><a href="collect.html">collect</a></code>, <code><a href="columns.html">colnames</a></code>, +<code><a href="coltypes.html">coltypes</a></code>, +<code><a href="createOrReplaceTempView.html">createOrReplaceTempView</a></code>, +<code><a href="crossJoin.html">crossJoin</a></code>, <code><a href="cube.html">cube</a></code>, +<code><a href="dapplyCollect.html">dapplyCollect</a></code>, <code><a href="dapply.html">dapply</a></code>, +<code><a href="describe.html">describe</a></code>, <code><a href="dim.html">dim</a></code>, +<code><a href="distinct.html">distinct</a></code>, <code><a href="dropDuplicates.html">dropDuplicates</a></code>, +<code><a href="nafunctions.html">dropna</a></code>, <code><a href="drop.html">drop</a></code>, +<code><a href="dtypes.html">dtypes</a></code>, <code><a href="except.html">except</a></code>, +<code><a href="explain.html">explain</a></code>, <code><a href="filter.html">filter</a></code>, +<code><a href="first.html">first</a></code>, <code><a href="gapplyCollect.html">gapplyCollect</a></code>, +<code><a href="gapply.html">gapply</a></code>, <code><a href="getNumPartitions.html">getNumPartitions</a></code>, +<code><a href="groupBy.html">group_by</a></code>, <code><a href="head.html">head</a></code>, +<code><a href="hint.html">hint</a></code>, <code><a href="histogram.html">histogram</a></code>, +<code><a href="insertInto.html">insertInto</a></code>, <code><a href="intersect.html">intersect</a></code>, +<code><a href="isLocal.html">isLocal</a></code>, <code><a href="isStreaming.html">isStreaming</a></code>, +<code><a href="join.html">join</a></code>, <code><a href="limit.html">limit</a></code>, +<code><a href="localCheckpoint.html">localCheckpoint</a></code>, <code><a href="merge.html">merge</a></code>, +<code><a href="mutate.html">mutate</a></code>, <code><a href="ncol.html">ncol</a></code>, +<code><a href="nrow.html">nrow</a></code>, <code><a href="persist.html">persist</a></code>, +<code><a href="printSchema.html">printSchema</a></code>, <code><a href="randomSplit.html">randomSplit</a></code>, +<code><a href="rbind.html">rbind</a></code>, <code><a href="registerTempTable-deprecated.html">registerTempTable</a></code>, +<code><a href="rename.html">rename</a></code>, <code><a href="repartition.html">repartition</a></code>, +<code><a href="rollup.html">rollup</a></code>, <code><a href="sample.html">sample</a></code>, +<code><a href="saveAsTable.html">saveAsTable</a></code>, <code><a href="schema.html">schema</a></code>, +<code><a href="selectExpr.html">selectExpr</a></code>, <code><a href="select.html">select</a></code>, +<code><a href="showDF.html">showDF</a></code>, <code><a href="show.html">show</a></code>, +<code><a href="storageLevel.html">storageLevel</a></code>, <code><a href="str.html">str</a></code>, +<code><a href="subset.html">subset</a></code>, <code><a 
href="summary.html">summary</a></code>, +<code><a href="take.html">take</a></code>, <code><a href="toJSON.html">toJSON</a></code>, +<code><a href="unionByName.html">unionByName</a></code>, <code><a href="union.html">union</a></code>, +<code><a href="unpersist.html">unpersist</a></code>, <code><a href="withColumn.html">withColumn</a></code>, +<code><a href="withWatermark.html">withWatermark</a></code>, <code><a href="with.html">with</a></code>, +<code><a href="write.df.html">write.df</a></code>, <code><a href="write.json.html">write.json</a></code>, +<code><a href="write.orc.html">write.orc</a></code>, <code><a href="write.parquet.html">write.parquet</a></code>, +<code><a href="write.stream.html">write.stream</a></code>, <code><a href="write.text.html">write.text</a></code> +</p> + + +<h3>Examples</h3> + +<pre><code class="r">## Not run: +##D sparkR.session() +##D jdbcUrl <- "jdbc:mysql://localhost:3306/databasename" +##D write.jdbc(df, jdbcUrl, "table", user = "username", password = "password") +## End(Not run) +</code></pre> + + +<hr /><div style="text-align: center;">[Package <em>SparkR</em> version 2.3.0 <a href="00Index.html">Index</a>]</div> +</body></html>
http://git-wip-us.apache.org/repos/asf/spark-website/blob/26c57a24/site/docs/2.3.0/api/R/write.json.html ---------------------------------------------------------------------- diff --git a/site/docs/2.3.0/api/R/write.json.html b/site/docs/2.3.0/api/R/write.json.html new file mode 100644 index 0000000..9dcac7d --- /dev/null +++ b/site/docs/2.3.0/api/R/write.json.html @@ -0,0 +1,121 @@ +<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd"><html xmlns="http://www.w3.org/1999/xhtml"><head><title>R: Save the contents of SparkDataFrame as a JSON file</title> +<meta http-equiv="Content-Type" content="text/html; charset=utf-8" /> +<link rel="stylesheet" type="text/css" href="R.css" /> + +<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/styles/github.min.css"> +<script src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/highlight.min.js"></script> +<script src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/languages/r.min.js"></script> +<script>hljs.initHighlightingOnLoad();</script> +</head><body> + +<table width="100%" summary="page for write.json {SparkR}"><tr><td>write.json {SparkR}</td><td style="text-align: right;">R Documentation</td></tr></table> + +<h2>Save the contents of SparkDataFrame as a JSON file</h2> + +<h3>Description</h3> + +<p>Save the contents of a SparkDataFrame as a JSON file (<a href="http://jsonlines.org/"> +JSON Lines text format or newline-delimited JSON</a>). Files written out +with this method can be read back in as a SparkDataFrame using read.json(). +</p> + + +<h3>Usage</h3> + +<pre> +write.json(x, path, ...) + +## S4 method for signature 'SparkDataFrame,character' +write.json(x, path, mode = "error", ...) +</pre> + + +<h3>Arguments</h3> + +<table summary="R argblock"> +<tr valign="top"><td><code>x</code></td> +<td> +<p>A SparkDataFrame</p> +</td></tr> +<tr valign="top"><td><code>path</code></td> +<td> +<p>The directory where the file is saved</p> +</td></tr> +<tr valign="top"><td><code>...</code></td> +<td> +<p>additional argument(s) passed to the method.</p> +</td></tr> +<tr valign="top"><td><code>mode</code></td> +<td> +<p>one of 'append', 'overwrite', 'error', 'errorifexists', 'ignore' +save mode (it is 'error' by default)</p> +</td></tr> +</table> + + +<h3>Note</h3> + +<p>write.json since 1.6.0 +</p> + + +<h3>See Also</h3> + +<p>Other SparkDataFrame functions: <code><a href="SparkDataFrame.html">SparkDataFrame-class</a></code>, +<code><a href="summarize.html">agg</a></code>, <code><a href="alias.html">alias</a></code>, +<code><a href="arrange.html">arrange</a></code>, <code><a href="as.data.frame.html">as.data.frame</a></code>, +<code><a href="attach.html">attach,SparkDataFrame-method</a></code>, +<code><a href="broadcast.html">broadcast</a></code>, <code><a href="cache.html">cache</a></code>, +<code><a href="checkpoint.html">checkpoint</a></code>, <code><a href="coalesce.html">coalesce</a></code>, +<code><a href="collect.html">collect</a></code>, <code><a href="columns.html">colnames</a></code>, +<code><a href="coltypes.html">coltypes</a></code>, +<code><a href="createOrReplaceTempView.html">createOrReplaceTempView</a></code>, +<code><a href="crossJoin.html">crossJoin</a></code>, <code><a href="cube.html">cube</a></code>, +<code><a href="dapplyCollect.html">dapplyCollect</a></code>, <code><a href="dapply.html">dapply</a></code>, +<code><a href="describe.html">describe</a></code>, <code><a href="dim.html">dim</a></code>, +<code><a 
href="distinct.html">distinct</a></code>, <code><a href="dropDuplicates.html">dropDuplicates</a></code>, +<code><a href="nafunctions.html">dropna</a></code>, <code><a href="drop.html">drop</a></code>, +<code><a href="dtypes.html">dtypes</a></code>, <code><a href="except.html">except</a></code>, +<code><a href="explain.html">explain</a></code>, <code><a href="filter.html">filter</a></code>, +<code><a href="first.html">first</a></code>, <code><a href="gapplyCollect.html">gapplyCollect</a></code>, +<code><a href="gapply.html">gapply</a></code>, <code><a href="getNumPartitions.html">getNumPartitions</a></code>, +<code><a href="groupBy.html">group_by</a></code>, <code><a href="head.html">head</a></code>, +<code><a href="hint.html">hint</a></code>, <code><a href="histogram.html">histogram</a></code>, +<code><a href="insertInto.html">insertInto</a></code>, <code><a href="intersect.html">intersect</a></code>, +<code><a href="isLocal.html">isLocal</a></code>, <code><a href="isStreaming.html">isStreaming</a></code>, +<code><a href="join.html">join</a></code>, <code><a href="limit.html">limit</a></code>, +<code><a href="localCheckpoint.html">localCheckpoint</a></code>, <code><a href="merge.html">merge</a></code>, +<code><a href="mutate.html">mutate</a></code>, <code><a href="ncol.html">ncol</a></code>, +<code><a href="nrow.html">nrow</a></code>, <code><a href="persist.html">persist</a></code>, +<code><a href="printSchema.html">printSchema</a></code>, <code><a href="randomSplit.html">randomSplit</a></code>, +<code><a href="rbind.html">rbind</a></code>, <code><a href="registerTempTable-deprecated.html">registerTempTable</a></code>, +<code><a href="rename.html">rename</a></code>, <code><a href="repartition.html">repartition</a></code>, +<code><a href="rollup.html">rollup</a></code>, <code><a href="sample.html">sample</a></code>, +<code><a href="saveAsTable.html">saveAsTable</a></code>, <code><a href="schema.html">schema</a></code>, +<code><a href="selectExpr.html">selectExpr</a></code>, <code><a href="select.html">select</a></code>, +<code><a href="showDF.html">showDF</a></code>, <code><a href="show.html">show</a></code>, +<code><a href="storageLevel.html">storageLevel</a></code>, <code><a href="str.html">str</a></code>, +<code><a href="subset.html">subset</a></code>, <code><a href="summary.html">summary</a></code>, +<code><a href="take.html">take</a></code>, <code><a href="toJSON.html">toJSON</a></code>, +<code><a href="unionByName.html">unionByName</a></code>, <code><a href="union.html">union</a></code>, +<code><a href="unpersist.html">unpersist</a></code>, <code><a href="withColumn.html">withColumn</a></code>, +<code><a href="withWatermark.html">withWatermark</a></code>, <code><a href="with.html">with</a></code>, +<code><a href="write.df.html">write.df</a></code>, <code><a href="write.jdbc.html">write.jdbc</a></code>, +<code><a href="write.orc.html">write.orc</a></code>, <code><a href="write.parquet.html">write.parquet</a></code>, +<code><a href="write.stream.html">write.stream</a></code>, <code><a href="write.text.html">write.text</a></code> +</p> + + +<h3>Examples</h3> + +<pre><code class="r">## Not run: +##D sparkR.session() +##D path <- "path/to/file.json" +##D df <- read.json(path) +##D write.json(df, "/tmp/sparkr-tmp/") +## End(Not run) +</code></pre> + + +<hr /><div style="text-align: center;">[Package <em>SparkR</em> version 2.3.0 <a href="00Index.html">Index</a>]</div> +</body></html> http://git-wip-us.apache.org/repos/asf/spark-website/blob/26c57a24/site/docs/2.3.0/api/R/write.ml.html 
---------------------------------------------------------------------- diff --git a/site/docs/2.3.0/api/R/write.ml.html b/site/docs/2.3.0/api/R/write.ml.html new file mode 100644 index 0000000..2fdd3bd --- /dev/null +++ b/site/docs/2.3.0/api/R/write.ml.html @@ -0,0 +1,62 @@ +<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd"><html xmlns="http://www.w3.org/1999/xhtml"><head><title>R: Saves the MLlib model to the input path</title> +<meta http-equiv="Content-Type" content="text/html; charset=utf-8" /> +<link rel="stylesheet" type="text/css" href="R.css" /> +</head><body> + +<table width="100%" summary="page for write.ml {SparkR}"><tr><td>write.ml {SparkR}</td><td style="text-align: right;">R Documentation</td></tr></table> + +<h2>Saves the MLlib model to the input path</h2> + +<h3>Description</h3> + +<p>Saves the MLlib model to the input path. For more information, see the specific +MLlib model below. +</p> + + +<h3>Usage</h3> + +<pre> +write.ml(object, path, ...) +</pre> + + +<h3>Arguments</h3> + +<table summary="R argblock"> +<tr valign="top"><td><code>object</code></td> +<td> +<p>a fitted ML model object.</p> +</td></tr> +<tr valign="top"><td><code>path</code></td> +<td> +<p>the directory where the model is saved.</p> +</td></tr> +<tr valign="top"><td><code>...</code></td> +<td> +<p>additional argument(s) passed to the method.</p> +</td></tr> +</table> + + +<h3>See Also</h3> + +<p><a href="spark.als.html">spark.als</a>, <a href="spark.bisectingKmeans.html">spark.bisectingKmeans</a>, <a href="spark.decisionTree.html">spark.decisionTree</a>, +</p> +<p><a href="spark.gaussianMixture.html">spark.gaussianMixture</a>, <a href="spark.gbt.html">spark.gbt</a>, +</p> +<p><a href="spark.glm.html">spark.glm</a>, <a href="glm.html">glm</a>, <a href="spark.isoreg.html">spark.isoreg</a>, +</p> +<p><a href="spark.kmeans.html">spark.kmeans</a>, +</p> +<p><a href="spark.lda.html">spark.lda</a>, <a href="spark.logit.html">spark.logit</a>, +</p> +<p><a href="spark.mlp.html">spark.mlp</a>, <a href="spark.naiveBayes.html">spark.naiveBayes</a>, +</p> +<p><a href="spark.randomForest.html">spark.randomForest</a>, <a href="spark.survreg.html">spark.survreg</a>, <a href="spark.svmLinear.html">spark.svmLinear</a>, +</p> +<p><a href="read.ml.html">read.ml</a> +</p> + +<hr /><div style="text-align: center;">[Package <em>SparkR</em> version 2.3.0 <a href="00Index.html">Index</a>]</div> +</body></html> http://git-wip-us.apache.org/repos/asf/spark-website/blob/26c57a24/site/docs/2.3.0/api/R/write.orc.html ---------------------------------------------------------------------- diff --git a/site/docs/2.3.0/api/R/write.orc.html b/site/docs/2.3.0/api/R/write.orc.html new file mode 100644 index 0000000..14e5133 --- /dev/null +++ b/site/docs/2.3.0/api/R/write.orc.html @@ -0,0 +1,120 @@ +<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd"><html xmlns="http://www.w3.org/1999/xhtml"><head><title>R: Save the contents of SparkDataFrame as an ORC file,...</title> +<meta http-equiv="Content-Type" content="text/html; charset=utf-8" /> +<link rel="stylesheet" type="text/css" href="R.css" /> + +<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/styles/github.min.css"> +<script src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/highlight.min.js"></script> +<script src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/languages/r.min.js"></script> 
+<script>hljs.initHighlightingOnLoad();</script> +</head><body> + +<table width="100%" summary="page for write.orc {SparkR}"><tr><td>write.orc {SparkR}</td><td style="text-align: right;">R Documentation</td></tr></table> + +<h2>Save the contents of SparkDataFrame as an ORC file, preserving the schema.</h2> + +<h3>Description</h3> + +<p>Save the contents of a SparkDataFrame as an ORC file, preserving the schema. Files written out +with this method can be read back in as a SparkDataFrame using read.orc(). +</p> + + +<h3>Usage</h3> + +<pre> +write.orc(x, path, ...) + +## S4 method for signature 'SparkDataFrame,character' +write.orc(x, path, mode = "error", ...) +</pre> + + +<h3>Arguments</h3> + +<table summary="R argblock"> +<tr valign="top"><td><code>x</code></td> +<td> +<p>A SparkDataFrame</p> +</td></tr> +<tr valign="top"><td><code>path</code></td> +<td> +<p>The directory where the file is saved</p> +</td></tr> +<tr valign="top"><td><code>...</code></td> +<td> +<p>additional argument(s) passed to the method.</p> +</td></tr> +<tr valign="top"><td><code>mode</code></td> +<td> +<p>one of 'append', 'overwrite', 'error', 'errorifexists', 'ignore' +save mode (it is 'error' by default)</p> +</td></tr> +</table> + + +<h3>Note</h3> + +<p>write.orc since 2.0.0 +</p> + + +<h3>See Also</h3> + +<p>Other SparkDataFrame functions: <code><a href="SparkDataFrame.html">SparkDataFrame-class</a></code>, +<code><a href="summarize.html">agg</a></code>, <code><a href="alias.html">alias</a></code>, +<code><a href="arrange.html">arrange</a></code>, <code><a href="as.data.frame.html">as.data.frame</a></code>, +<code><a href="attach.html">attach,SparkDataFrame-method</a></code>, +<code><a href="broadcast.html">broadcast</a></code>, <code><a href="cache.html">cache</a></code>, +<code><a href="checkpoint.html">checkpoint</a></code>, <code><a href="coalesce.html">coalesce</a></code>, +<code><a href="collect.html">collect</a></code>, <code><a href="columns.html">colnames</a></code>, +<code><a href="coltypes.html">coltypes</a></code>, +<code><a href="createOrReplaceTempView.html">createOrReplaceTempView</a></code>, +<code><a href="crossJoin.html">crossJoin</a></code>, <code><a href="cube.html">cube</a></code>, +<code><a href="dapplyCollect.html">dapplyCollect</a></code>, <code><a href="dapply.html">dapply</a></code>, +<code><a href="describe.html">describe</a></code>, <code><a href="dim.html">dim</a></code>, +<code><a href="distinct.html">distinct</a></code>, <code><a href="dropDuplicates.html">dropDuplicates</a></code>, +<code><a href="nafunctions.html">dropna</a></code>, <code><a href="drop.html">drop</a></code>, +<code><a href="dtypes.html">dtypes</a></code>, <code><a href="except.html">except</a></code>, +<code><a href="explain.html">explain</a></code>, <code><a href="filter.html">filter</a></code>, +<code><a href="first.html">first</a></code>, <code><a href="gapplyCollect.html">gapplyCollect</a></code>, +<code><a href="gapply.html">gapply</a></code>, <code><a href="getNumPartitions.html">getNumPartitions</a></code>, +<code><a href="groupBy.html">group_by</a></code>, <code><a href="head.html">head</a></code>, +<code><a href="hint.html">hint</a></code>, <code><a href="histogram.html">histogram</a></code>, +<code><a href="insertInto.html">insertInto</a></code>, <code><a href="intersect.html">intersect</a></code>, +<code><a href="isLocal.html">isLocal</a></code>, <code><a href="isStreaming.html">isStreaming</a></code>, +<code><a href="join.html">join</a></code>, <code><a href="limit.html">limit</a></code>, +<code><a 
href="localCheckpoint.html">localCheckpoint</a></code>, <code><a href="merge.html">merge</a></code>, +<code><a href="mutate.html">mutate</a></code>, <code><a href="ncol.html">ncol</a></code>, +<code><a href="nrow.html">nrow</a></code>, <code><a href="persist.html">persist</a></code>, +<code><a href="printSchema.html">printSchema</a></code>, <code><a href="randomSplit.html">randomSplit</a></code>, +<code><a href="rbind.html">rbind</a></code>, <code><a href="registerTempTable-deprecated.html">registerTempTable</a></code>, +<code><a href="rename.html">rename</a></code>, <code><a href="repartition.html">repartition</a></code>, +<code><a href="rollup.html">rollup</a></code>, <code><a href="sample.html">sample</a></code>, +<code><a href="saveAsTable.html">saveAsTable</a></code>, <code><a href="schema.html">schema</a></code>, +<code><a href="selectExpr.html">selectExpr</a></code>, <code><a href="select.html">select</a></code>, +<code><a href="showDF.html">showDF</a></code>, <code><a href="show.html">show</a></code>, +<code><a href="storageLevel.html">storageLevel</a></code>, <code><a href="str.html">str</a></code>, +<code><a href="subset.html">subset</a></code>, <code><a href="summary.html">summary</a></code>, +<code><a href="take.html">take</a></code>, <code><a href="toJSON.html">toJSON</a></code>, +<code><a href="unionByName.html">unionByName</a></code>, <code><a href="union.html">union</a></code>, +<code><a href="unpersist.html">unpersist</a></code>, <code><a href="withColumn.html">withColumn</a></code>, +<code><a href="withWatermark.html">withWatermark</a></code>, <code><a href="with.html">with</a></code>, +<code><a href="write.df.html">write.df</a></code>, <code><a href="write.jdbc.html">write.jdbc</a></code>, +<code><a href="write.json.html">write.json</a></code>, <code><a href="write.parquet.html">write.parquet</a></code>, +<code><a href="write.stream.html">write.stream</a></code>, <code><a href="write.text.html">write.text</a></code> +</p> + + +<h3>Examples</h3> + +<pre><code class="r">## Not run: +##D sparkR.session() +##D path <- "path/to/file.json" +##D df <- read.json(path) +##D write.orc(df, "/tmp/sparkr-tmp1/") +## End(Not run) +</code></pre> + + +<hr /><div style="text-align: center;">[Package <em>SparkR</em> version 2.3.0 <a href="00Index.html">Index</a>]</div> +</body></html> http://git-wip-us.apache.org/repos/asf/spark-website/blob/26c57a24/site/docs/2.3.0/api/R/write.parquet.html ---------------------------------------------------------------------- diff --git a/site/docs/2.3.0/api/R/write.parquet.html b/site/docs/2.3.0/api/R/write.parquet.html new file mode 100644 index 0000000..4c2d008 --- /dev/null +++ b/site/docs/2.3.0/api/R/write.parquet.html @@ -0,0 +1,129 @@ +<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd"><html xmlns="http://www.w3.org/1999/xhtml"><head><title>R: Save the contents of SparkDataFrame as a Parquet file,...</title> +<meta http-equiv="Content-Type" content="text/html; charset=utf-8" /> +<link rel="stylesheet" type="text/css" href="R.css" /> + +<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/styles/github.min.css"> +<script src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/highlight.min.js"></script> +<script src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/languages/r.min.js"></script> +<script>hljs.initHighlightingOnLoad();</script> +</head><body> + +<table width="100%" summary="page for write.parquet {SparkR}"><tr><td>write.parquet 
{SparkR}</td><td style="text-align: right;">R Documentation</td></tr></table> + +<h2>Save the contents of SparkDataFrame as a Parquet file, preserving the schema.</h2> + +<h3>Description</h3> + +<p>Save the contents of a SparkDataFrame as a Parquet file, preserving the schema. Files written out +with this method can be read back in as a SparkDataFrame using read.parquet(). +</p> + + +<h3>Usage</h3> + +<pre> +write.parquet(x, path, ...) + +saveAsParquetFile(x, path) + +## S4 method for signature 'SparkDataFrame,character' +write.parquet(x, path, mode = "error", + ...) + +## S4 method for signature 'SparkDataFrame,character' +saveAsParquetFile(x, path) +</pre> + + +<h3>Arguments</h3> + +<table summary="R argblock"> +<tr valign="top"><td><code>x</code></td> +<td> +<p>A SparkDataFrame</p> +</td></tr> +<tr valign="top"><td><code>path</code></td> +<td> +<p>The directory where the file is saved</p> +</td></tr> +<tr valign="top"><td><code>...</code></td> +<td> +<p>additional argument(s) passed to the method.</p> +</td></tr> +<tr valign="top"><td><code>mode</code></td> +<td> +<p>one of 'append', 'overwrite', 'error', 'errorifexists', 'ignore' +save mode (it is 'error' by default)</p> +</td></tr> +</table> + + +<h3>Note</h3> + +<p>write.parquet since 1.6.0 +</p> +<p>saveAsParquetFile since 1.4.0 +</p> + + +<h3>See Also</h3> + +<p>Other SparkDataFrame functions: <code><a href="SparkDataFrame.html">SparkDataFrame-class</a></code>, +<code><a href="summarize.html">agg</a></code>, <code><a href="alias.html">alias</a></code>, +<code><a href="arrange.html">arrange</a></code>, <code><a href="as.data.frame.html">as.data.frame</a></code>, +<code><a href="attach.html">attach,SparkDataFrame-method</a></code>, +<code><a href="broadcast.html">broadcast</a></code>, <code><a href="cache.html">cache</a></code>, +<code><a href="checkpoint.html">checkpoint</a></code>, <code><a href="coalesce.html">coalesce</a></code>, +<code><a href="collect.html">collect</a></code>, <code><a href="columns.html">colnames</a></code>, +<code><a href="coltypes.html">coltypes</a></code>, +<code><a href="createOrReplaceTempView.html">createOrReplaceTempView</a></code>, +<code><a href="crossJoin.html">crossJoin</a></code>, <code><a href="cube.html">cube</a></code>, +<code><a href="dapplyCollect.html">dapplyCollect</a></code>, <code><a href="dapply.html">dapply</a></code>, +<code><a href="describe.html">describe</a></code>, <code><a href="dim.html">dim</a></code>, +<code><a href="distinct.html">distinct</a></code>, <code><a href="dropDuplicates.html">dropDuplicates</a></code>, +<code><a href="nafunctions.html">dropna</a></code>, <code><a href="drop.html">drop</a></code>, +<code><a href="dtypes.html">dtypes</a></code>, <code><a href="except.html">except</a></code>, +<code><a href="explain.html">explain</a></code>, <code><a href="filter.html">filter</a></code>, +<code><a href="first.html">first</a></code>, <code><a href="gapplyCollect.html">gapplyCollect</a></code>, +<code><a href="gapply.html">gapply</a></code>, <code><a href="getNumPartitions.html">getNumPartitions</a></code>, +<code><a href="groupBy.html">group_by</a></code>, <code><a href="head.html">head</a></code>, +<code><a href="hint.html">hint</a></code>, <code><a href="histogram.html">histogram</a></code>, +<code><a href="insertInto.html">insertInto</a></code>, <code><a href="intersect.html">intersect</a></code>, +<code><a href="isLocal.html">isLocal</a></code>, <code><a href="isStreaming.html">isStreaming</a></code>, +<code><a href="join.html">join</a></code>, <code><a 
href="limit.html">limit</a></code>, +<code><a href="localCheckpoint.html">localCheckpoint</a></code>, <code><a href="merge.html">merge</a></code>, +<code><a href="mutate.html">mutate</a></code>, <code><a href="ncol.html">ncol</a></code>, +<code><a href="nrow.html">nrow</a></code>, <code><a href="persist.html">persist</a></code>, +<code><a href="printSchema.html">printSchema</a></code>, <code><a href="randomSplit.html">randomSplit</a></code>, +<code><a href="rbind.html">rbind</a></code>, <code><a href="registerTempTable-deprecated.html">registerTempTable</a></code>, +<code><a href="rename.html">rename</a></code>, <code><a href="repartition.html">repartition</a></code>, +<code><a href="rollup.html">rollup</a></code>, <code><a href="sample.html">sample</a></code>, +<code><a href="saveAsTable.html">saveAsTable</a></code>, <code><a href="schema.html">schema</a></code>, +<code><a href="selectExpr.html">selectExpr</a></code>, <code><a href="select.html">select</a></code>, +<code><a href="showDF.html">showDF</a></code>, <code><a href="show.html">show</a></code>, +<code><a href="storageLevel.html">storageLevel</a></code>, <code><a href="str.html">str</a></code>, +<code><a href="subset.html">subset</a></code>, <code><a href="summary.html">summary</a></code>, +<code><a href="take.html">take</a></code>, <code><a href="toJSON.html">toJSON</a></code>, +<code><a href="unionByName.html">unionByName</a></code>, <code><a href="union.html">union</a></code>, +<code><a href="unpersist.html">unpersist</a></code>, <code><a href="withColumn.html">withColumn</a></code>, +<code><a href="withWatermark.html">withWatermark</a></code>, <code><a href="with.html">with</a></code>, +<code><a href="write.df.html">write.df</a></code>, <code><a href="write.jdbc.html">write.jdbc</a></code>, +<code><a href="write.json.html">write.json</a></code>, <code><a href="write.orc.html">write.orc</a></code>, +<code><a href="write.stream.html">write.stream</a></code>, <code><a href="write.text.html">write.text</a></code> +</p> + + +<h3>Examples</h3> + +<pre><code class="r">## Not run: +##D sparkR.session() +##D path <- "path/to/file.json" +##D df <- read.json(path) +##D write.parquet(df, "/tmp/sparkr-tmp1/") +##D saveAsParquetFile(df, "/tmp/sparkr-tmp2/") +## End(Not run) +</code></pre> + + +<hr /><div style="text-align: center;">[Package <em>SparkR</em> version 2.3.0 <a href="00Index.html">Index</a>]</div> +</body></html> http://git-wip-us.apache.org/repos/asf/spark-website/blob/26c57a24/site/docs/2.3.0/api/R/write.stream.html ---------------------------------------------------------------------- diff --git a/site/docs/2.3.0/api/R/write.stream.html b/site/docs/2.3.0/api/R/write.stream.html new file mode 100644 index 0000000..68dfd68 --- /dev/null +++ b/site/docs/2.3.0/api/R/write.stream.html @@ -0,0 +1,181 @@ +<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd"><html xmlns="http://www.w3.org/1999/xhtml"><head><title>R: Write the streaming SparkDataFrame to a data source.</title> +<meta http-equiv="Content-Type" content="text/html; charset=utf-8" /> +<link rel="stylesheet" type="text/css" href="R.css" /> + +<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/styles/github.min.css"> +<script src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/highlight.min.js"></script> +<script src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/languages/r.min.js"></script> +<script>hljs.initHighlightingOnLoad();</script> +</head><body> + +<table 
width="100%" summary="page for write.stream {SparkR}"><tr><td>write.stream {SparkR}</td><td style="text-align: right;">R Documentation</td></tr></table> + +<h2>Write the streaming SparkDataFrame to a data source.</h2> + +<h3>Description</h3> + +<p>The data source is specified by the <code>source</code> and a set of options (...). +If <code>source</code> is not specified, the default data source configured by +spark.sql.sources.default will be used. +</p> + + +<h3>Usage</h3> + +<pre> +write.stream(df, source = NULL, outputMode = NULL, ...) + +## S4 method for signature 'SparkDataFrame' +write.stream(df, source = NULL, + outputMode = NULL, partitionBy = NULL, trigger.processingTime = NULL, + trigger.once = NULL, ...) +</pre> + + +<h3>Arguments</h3> + +<table summary="R argblock"> +<tr valign="top"><td><code>df</code></td> +<td> +<p>a streaming SparkDataFrame.</p> +</td></tr> +<tr valign="top"><td><code>source</code></td> +<td> +<p>a name for external data source.</p> +</td></tr> +<tr valign="top"><td><code>outputMode</code></td> +<td> +<p>one of 'append', 'complete', 'update'.</p> +</td></tr> +<tr valign="top"><td><code>...</code></td> +<td> +<p>additional external data source specific named options.</p> +</td></tr> +<tr valign="top"><td><code>partitionBy</code></td> +<td> +<p>a name or a list of names of columns to partition the output by on the file +system. If specified, the output is laid out on the file system similar to Hive's +partitioning scheme.</p> +</td></tr> +<tr valign="top"><td><code>trigger.processingTime</code></td> +<td> +<p>a processing time interval as a string, e.g. '5 seconds', +'1 minute'. This is a trigger that runs a query periodically based on the processing +time. If value is '0 seconds', the query will run as fast as possible, this is the +default. Only one trigger can be set.</p> +</td></tr> +<tr valign="top"><td><code>trigger.once</code></td> +<td> +<p>a logical, must be set to <code>TRUE</code>. This is a trigger that processes only +one batch of data in a streaming query then terminates the query. Only one trigger can be +set.</p> +</td></tr> +</table> + + +<h3>Details</h3> + +<p>Additionally, <code>outputMode</code> specifies how data of a streaming SparkDataFrame is written to a +output data source. There are three modes: +</p> + +<ul> +<li><p> append: Only the new rows in the streaming SparkDataFrame will be written out. This +output mode can be only be used in queries that do not contain any aggregation. +</p> +</li> +<li><p> complete: All the rows in the streaming SparkDataFrame will be written out every time +there are some updates. This output mode can only be used in queries that +contain aggregations. +</p> +</li> +<li><p> update: Only the rows that were updated in the streaming SparkDataFrame will be written +out every time there are some updates. If the query doesn't contain aggregations, +it will be equivalent to <code>append</code> mode. 
+</p> +</li></ul> + + + +<h3>Note</h3> + +<p>write.stream since 2.2.0 +</p> +<p>experimental +</p> + + +<h3>See Also</h3> + +<p><a href="read.stream.html">read.stream</a> +</p> +<p>Other SparkDataFrame functions: <code><a href="SparkDataFrame.html">SparkDataFrame-class</a></code>, +<code><a href="summarize.html">agg</a></code>, <code><a href="alias.html">alias</a></code>, +<code><a href="arrange.html">arrange</a></code>, <code><a href="as.data.frame.html">as.data.frame</a></code>, +<code><a href="attach.html">attach,SparkDataFrame-method</a></code>, +<code><a href="broadcast.html">broadcast</a></code>, <code><a href="cache.html">cache</a></code>, +<code><a href="checkpoint.html">checkpoint</a></code>, <code><a href="coalesce.html">coalesce</a></code>, +<code><a href="collect.html">collect</a></code>, <code><a href="columns.html">colnames</a></code>, +<code><a href="coltypes.html">coltypes</a></code>, +<code><a href="createOrReplaceTempView.html">createOrReplaceTempView</a></code>, +<code><a href="crossJoin.html">crossJoin</a></code>, <code><a href="cube.html">cube</a></code>, +<code><a href="dapplyCollect.html">dapplyCollect</a></code>, <code><a href="dapply.html">dapply</a></code>, +<code><a href="describe.html">describe</a></code>, <code><a href="dim.html">dim</a></code>, +<code><a href="distinct.html">distinct</a></code>, <code><a href="dropDuplicates.html">dropDuplicates</a></code>, +<code><a href="nafunctions.html">dropna</a></code>, <code><a href="drop.html">drop</a></code>, +<code><a href="dtypes.html">dtypes</a></code>, <code><a href="except.html">except</a></code>, +<code><a href="explain.html">explain</a></code>, <code><a href="filter.html">filter</a></code>, +<code><a href="first.html">first</a></code>, <code><a href="gapplyCollect.html">gapplyCollect</a></code>, +<code><a href="gapply.html">gapply</a></code>, <code><a href="getNumPartitions.html">getNumPartitions</a></code>, +<code><a href="groupBy.html">group_by</a></code>, <code><a href="head.html">head</a></code>, +<code><a href="hint.html">hint</a></code>, <code><a href="histogram.html">histogram</a></code>, +<code><a href="insertInto.html">insertInto</a></code>, <code><a href="intersect.html">intersect</a></code>, +<code><a href="isLocal.html">isLocal</a></code>, <code><a href="isStreaming.html">isStreaming</a></code>, +<code><a href="join.html">join</a></code>, <code><a href="limit.html">limit</a></code>, +<code><a href="localCheckpoint.html">localCheckpoint</a></code>, <code><a href="merge.html">merge</a></code>, +<code><a href="mutate.html">mutate</a></code>, <code><a href="ncol.html">ncol</a></code>, +<code><a href="nrow.html">nrow</a></code>, <code><a href="persist.html">persist</a></code>, +<code><a href="printSchema.html">printSchema</a></code>, <code><a href="randomSplit.html">randomSplit</a></code>, +<code><a href="rbind.html">rbind</a></code>, <code><a href="registerTempTable-deprecated.html">registerTempTable</a></code>, +<code><a href="rename.html">rename</a></code>, <code><a href="repartition.html">repartition</a></code>, +<code><a href="rollup.html">rollup</a></code>, <code><a href="sample.html">sample</a></code>, +<code><a href="saveAsTable.html">saveAsTable</a></code>, <code><a href="schema.html">schema</a></code>, +<code><a href="selectExpr.html">selectExpr</a></code>, <code><a href="select.html">select</a></code>, +<code><a href="showDF.html">showDF</a></code>, <code><a href="show.html">show</a></code>, +<code><a href="storageLevel.html">storageLevel</a></code>, <code><a href="str.html">str</a></code>, 
+<code><a href="subset.html">subset</a></code>, <code><a href="summary.html">summary</a></code>, +<code><a href="take.html">take</a></code>, <code><a href="toJSON.html">toJSON</a></code>, +<code><a href="unionByName.html">unionByName</a></code>, <code><a href="union.html">union</a></code>, +<code><a href="unpersist.html">unpersist</a></code>, <code><a href="withColumn.html">withColumn</a></code>, +<code><a href="withWatermark.html">withWatermark</a></code>, <code><a href="with.html">with</a></code>, +<code><a href="write.df.html">write.df</a></code>, <code><a href="write.jdbc.html">write.jdbc</a></code>, +<code><a href="write.json.html">write.json</a></code>, <code><a href="write.orc.html">write.orc</a></code>, +<code><a href="write.parquet.html">write.parquet</a></code>, <code><a href="write.text.html">write.text</a></code> +</p> + + +<h3>Examples</h3> + +<pre><code class="r">## Not run: +##D sparkR.session() +##D df <- read.stream("socket", host = "localhost", port = 9999) +##D isStreaming(df) +##D wordCounts <- count(group_by(df, "value")) +##D +##D # console +##D q <- write.stream(wordCounts, "console", outputMode = "complete") +##D # text stream +##D q <- write.stream(df, "text", path = "/home/user/out", checkpointLocation = "/home/user/cp" +##D partitionBy = c("year", "month"), trigger.processingTime = "30 seconds") +##D # memory stream +##D q <- write.stream(wordCounts, "memory", queryName = "outs", outputMode = "complete") +##D head(sql("SELECT * from outs")) +##D queryName(q) +##D +##D stopQuery(q) +## End(Not run) +</code></pre> + + +<hr /><div style="text-align: center;">[Package <em>SparkR</em> version 2.3.0 <a href="00Index.html">Index</a>]</div> +</body></html> http://git-wip-us.apache.org/repos/asf/spark-website/blob/26c57a24/site/docs/2.3.0/api/R/write.text.html ---------------------------------------------------------------------- diff --git a/site/docs/2.3.0/api/R/write.text.html b/site/docs/2.3.0/api/R/write.text.html new file mode 100644 index 0000000..66e3f84 --- /dev/null +++ b/site/docs/2.3.0/api/R/write.text.html @@ -0,0 +1,121 @@ +<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd"><html xmlns="http://www.w3.org/1999/xhtml"><head><title>R: Save the content of SparkDataFrame in a text file at the...</title> +<meta http-equiv="Content-Type" content="text/html; charset=utf-8" /> +<link rel="stylesheet" type="text/css" href="R.css" /> + +<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/styles/github.min.css"> +<script src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/highlight.min.js"></script> +<script src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/languages/r.min.js"></script> +<script>hljs.initHighlightingOnLoad();</script> +</head><body> + +<table width="100%" summary="page for write.text {SparkR}"><tr><td>write.text {SparkR}</td><td style="text-align: right;">R Documentation</td></tr></table> + +<h2>Save the content of SparkDataFrame in a text file at the specified path.</h2> + +<h3>Description</h3> + +<p>Save the content of the SparkDataFrame in a text file at the specified path. +The SparkDataFrame must have only one column of string type with the name "value". +Each row becomes a new line in the output file. +</p> + + +<h3>Usage</h3> + +<pre> +write.text(x, path, ...) + +## S4 method for signature 'SparkDataFrame,character' +write.text(x, path, mode = "error", ...) 
+</pre> + + +<h3>Arguments</h3> + +<table summary="R argblock"> +<tr valign="top"><td><code>x</code></td> +<td> +<p>A SparkDataFrame</p> +</td></tr> +<tr valign="top"><td><code>path</code></td> +<td> +<p>The directory where the file is saved</p> +</td></tr> +<tr valign="top"><td><code>...</code></td> +<td> +<p>additional argument(s) passed to the method.</p> +</td></tr> +<tr valign="top"><td><code>mode</code></td> +<td> +<p>one of 'append', 'overwrite', 'error', 'errorifexists', 'ignore' +save mode (it is 'error' by default)</p> +</td></tr> +</table> + + +<h3>Note</h3> + +<p>write.text since 2.0.0 +</p> + + +<h3>See Also</h3> + +<p>Other SparkDataFrame functions: <code><a href="SparkDataFrame.html">SparkDataFrame-class</a></code>, +<code><a href="summarize.html">agg</a></code>, <code><a href="alias.html">alias</a></code>, +<code><a href="arrange.html">arrange</a></code>, <code><a href="as.data.frame.html">as.data.frame</a></code>, +<code><a href="attach.html">attach,SparkDataFrame-method</a></code>, +<code><a href="broadcast.html">broadcast</a></code>, <code><a href="cache.html">cache</a></code>, +<code><a href="checkpoint.html">checkpoint</a></code>, <code><a href="coalesce.html">coalesce</a></code>, +<code><a href="collect.html">collect</a></code>, <code><a href="columns.html">colnames</a></code>, +<code><a href="coltypes.html">coltypes</a></code>, +<code><a href="createOrReplaceTempView.html">createOrReplaceTempView</a></code>, +<code><a href="crossJoin.html">crossJoin</a></code>, <code><a href="cube.html">cube</a></code>, +<code><a href="dapplyCollect.html">dapplyCollect</a></code>, <code><a href="dapply.html">dapply</a></code>, +<code><a href="describe.html">describe</a></code>, <code><a href="dim.html">dim</a></code>, +<code><a href="distinct.html">distinct</a></code>, <code><a href="dropDuplicates.html">dropDuplicates</a></code>, +<code><a href="nafunctions.html">dropna</a></code>, <code><a href="drop.html">drop</a></code>, +<code><a href="dtypes.html">dtypes</a></code>, <code><a href="except.html">except</a></code>, +<code><a href="explain.html">explain</a></code>, <code><a href="filter.html">filter</a></code>, +<code><a href="first.html">first</a></code>, <code><a href="gapplyCollect.html">gapplyCollect</a></code>, +<code><a href="gapply.html">gapply</a></code>, <code><a href="getNumPartitions.html">getNumPartitions</a></code>, +<code><a href="groupBy.html">group_by</a></code>, <code><a href="head.html">head</a></code>, +<code><a href="hint.html">hint</a></code>, <code><a href="histogram.html">histogram</a></code>, +<code><a href="insertInto.html">insertInto</a></code>, <code><a href="intersect.html">intersect</a></code>, +<code><a href="isLocal.html">isLocal</a></code>, <code><a href="isStreaming.html">isStreaming</a></code>, +<code><a href="join.html">join</a></code>, <code><a href="limit.html">limit</a></code>, +<code><a href="localCheckpoint.html">localCheckpoint</a></code>, <code><a href="merge.html">merge</a></code>, +<code><a href="mutate.html">mutate</a></code>, <code><a href="ncol.html">ncol</a></code>, +<code><a href="nrow.html">nrow</a></code>, <code><a href="persist.html">persist</a></code>, +<code><a href="printSchema.html">printSchema</a></code>, <code><a href="randomSplit.html">randomSplit</a></code>, +<code><a href="rbind.html">rbind</a></code>, <code><a href="registerTempTable-deprecated.html">registerTempTable</a></code>, +<code><a href="rename.html">rename</a></code>, <code><a href="repartition.html">repartition</a></code>, +<code><a 
href="rollup.html">rollup</a></code>, <code><a href="sample.html">sample</a></code>, +<code><a href="saveAsTable.html">saveAsTable</a></code>, <code><a href="schema.html">schema</a></code>, +<code><a href="selectExpr.html">selectExpr</a></code>, <code><a href="select.html">select</a></code>, +<code><a href="showDF.html">showDF</a></code>, <code><a href="show.html">show</a></code>, +<code><a href="storageLevel.html">storageLevel</a></code>, <code><a href="str.html">str</a></code>, +<code><a href="subset.html">subset</a></code>, <code><a href="summary.html">summary</a></code>, +<code><a href="take.html">take</a></code>, <code><a href="toJSON.html">toJSON</a></code>, +<code><a href="unionByName.html">unionByName</a></code>, <code><a href="union.html">union</a></code>, +<code><a href="unpersist.html">unpersist</a></code>, <code><a href="withColumn.html">withColumn</a></code>, +<code><a href="withWatermark.html">withWatermark</a></code>, <code><a href="with.html">with</a></code>, +<code><a href="write.df.html">write.df</a></code>, <code><a href="write.jdbc.html">write.jdbc</a></code>, +<code><a href="write.json.html">write.json</a></code>, <code><a href="write.orc.html">write.orc</a></code>, +<code><a href="write.parquet.html">write.parquet</a></code>, <code><a href="write.stream.html">write.stream</a></code> +</p> + + +<h3>Examples</h3> + +<pre><code class="r">## Not run: +##D sparkR.session() +##D path <- "path/to/file.txt" +##D df <- read.text(path) +##D write.text(df, "/tmp/sparkr-tmp/") +## End(Not run) +</code></pre> + + +<hr /><div style="text-align: center;">[Package <em>SparkR</em> version 2.3.0 <a href="00Index.html">Index</a>]</div> +</body></html> --------------------------------------------------------------------- To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org For additional commands, e-mail: commits-h...@spark.apache.org