Hello,
My hard drive has about 80 GB of space left on it, and the RAM is about
12 GB.
I am not sure of the size of the .tsv file, but it will most likely be around
30 GB.
Thanks,
Wilbert Seoane
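
With numbers like those, a minimal sketch of a memory-conscious local sparklyr
connection might look like the following; the driver-memory figure and the
scratch directory are illustrative assumptions, not values taken from this
thread.

    library(sparklyr)

    # Sketch only: sized for a machine with ~12 GB of RAM and ~80 GB of free disk.
    conf <- spark_config()
    conf[["sparklyr.shell.driver-memory"]] <- "8G"       # leave headroom below the 12 GB of RAM
    conf[["spark.local.dir"]] <- "/path/with/free/space" # hypothetical scratch dir on the 80 GB drive
    sc <- spark_connect(master = "local", config = conf)

Pointing spark.local.dir at the drive with free space lets shuffles and spills
go to disk rather than competing for the limited RAM.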
On Fri, May 29, 2020 at 5:03 PM Anwar AliKhan
wrote:
> What is the size of your .tsv file, sir?
What is the size of your .tsv file, sir?
What is the size of your local hard drive, sir?
Regards
Wali Ahaad
On Fri, 29 May 2020, 16:21, wrote:
> Hello,
>
> I plan to load in a local .tsv file from my hard drive using sparklyr (an
> R package). I have figured out how to do this already on small files.
If you load a file on your computer, that is unrelated to Spark.
Whatever you load via Spark APIs will at some point live in memory on the
Spark cluster, or the storage you back it with if you store it.
Whether the cluster and storage are secure (like, ACLs / auth enabled) is
up to whoever runs the cluster.
What do you mean by secure here?
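
To make the "up to whoever runs the cluster" part concrete: the options on the
Spark security page can be passed from sparklyr as ordinary Spark properties.
This is a rough sketch, not a recommendation; the secret is a placeholder, and
on YARN the authentication secret is normally generated for you, so not every
line below applies to every deployment.

    library(sparklyr)

    conf <- spark_config()
    conf[["spark.authenticate"]] <- "true"              # RPC authentication between Spark processes
    conf[["spark.authenticate.secret"]] <- "change-me"  # placeholder; on YARN this is handled for you
    conf[["spark.network.crypto.enabled"]] <- "true"    # encrypt RPC traffic
    conf[["spark.io.encryption.enabled"]] <- "true"     # encrypt local shuffle/spill files
    sc <- spark_connect(master = "yarn-client", config = conf)

Whether the data is protected once it lands in storage (file permissions,
ACLs) is still up to whoever runs that storage.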
On Fri, May 29, 2020 at 10:21 AM wrote:
> Hello,
>
> I plan to load in a local .tsv file from my hard drive using sparklyr (an
> R package). I have figured out how to do this already on small files.
>
> When I decide to receive my client’s large .tsv file, can I be confident
> that loading in data this way will be secure?
Hello,
I plan to load in a local .tsv file from my hard drive using sparklyr (an R
package). I have figured out how to do this already on small files.
When I decide to receive my client’s large .tsv file, can I be confident that
loading in data this way will be secure? I know that this creates
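
For reference, the kind of load described above could look like the sketch
below. The table name and path are made up; delimiter = "\t" handles the
tab-separated format, and memory = FALSE keeps Spark from trying to cache the
whole ~30 GB file in RAM.

    library(sparklyr)

    sc <- spark_connect(master = "local")

    clients_tbl <- spark_read_csv(
      sc,
      name      = "client_data",                      # hypothetical table name
      path      = "file:///path/to/client_file.tsv",  # hypothetical local path
      delimiter = "\t",
      header    = TRUE,
      memory    = FALSE                               # avoid caching the full file in memory
    )

With a local master the data never leaves the machine; the security question
really starts once a remote cluster or shared storage is involved.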
On 13 Oct 2016, at 14:40, Mendelson, Assaf
<assaf.mendel...@rsa.com> wrote:
Hi,
We have a Spark cluster and we want to add some security to it. I was
looking at the documentation (at
http://spark.apache.org/docs/latest/security.html) and had some questions.
1. Do all executors
Can anyone assist with this?
From: Mendelson, Assaf [mailto:assaf.mendel...@rsa.com]
Sent: Thursday, October 13, 2016 3:41 PM
To: user@spark.apache.org
Subject: Spark security
Hi,
We have a Spark cluster and we want to add some security to it. I was
looking at the documentation (at
http://spark.apache.org/docs/latest/security.html) and had some questions.
Hi,
We have a Spark cluster and we want to add some security to it. I was
looking at the documentation (at
http://spark.apache.org/docs/latest/security.html) and had some questions.
1. Do all executors listen on the same blockManager port? For example, in
YARN there are multiple executors
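
Not an answer from the documentation, just a sketch of the knobs involved:
spark.blockManager.port pins the base port, spark.driver.blockManager.port
(in recent Spark versions) overrides it for the driver, and when several
executors share a host Spark retries successive ports up to
spark.port.maxRetries. It is shown through a sparklyr config only to stay
consistent with the rest of this thread; spark-defaults.conf works the same
way.

    library(sparklyr)

    conf <- spark_config()
    conf[["spark.blockManager.port"]] <- "38000"         # base port; co-located executors retry upward
    conf[["spark.driver.blockManager.port"]] <- "38100"  # driver-side override (recent Spark versions)
    conf[["spark.port.maxRetries"]] <- "32"              # how far upward the retries may go
    sc <- spark_connect(master = "yarn-client", config = conf)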