RE: SparkR Error in sparkR.init(master=“local”) in RStudio

2015-10-09 Thread Khandeshi, Ami
It seems the problem is with creating Usage: RBackend 

From: Sun, Rui [mailto:rui@intel.com]
Sent: Wednesday, October 07, 2015 10:23 PM
To: Khandeshi, Ami; Hossein
Cc: akhandeshi; user@spark.apache.org
Subject: RE: SparkR Error in sparkR.init(master=“local”) in RStudio

Can you extract the spark-submit command from the console output, and run it on 
the Shell, and see if there is any error message?
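The suggestion above can be sketched in base R — re-run the exact command SparkR printed and capture its output so the launcher's own error becomes visible (the spark-submit path and temp file below are the ones reported elsewhere in this thread):

```r
# Re-run the spark-submit command that sparkR.init() printed, capturing both
# stdout and stderr so the underlying launcher error is visible:
out <- system2("c:\\DevTools\\spark-1.5.1\\bin\\spark-submit.cmd",
               args = c("sparkr-shell",
                        "C:\\Users\\a554719\\AppData\\Local\\Temp\\RtmpkXZVBa\\backend_port45ac487f2fbd"),
               stdout = TRUE, stderr = TRUE)
cat(out, sep = "\n")
```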

From: Khandeshi, Ami [mailto:ami.khande...@fmr.com]
Sent: Wednesday, October 7, 2015 9:57 PM
To: Sun, Rui; Hossein
Cc: akhandeshi; user@spark.apache.org
Subject: RE: SparkR Error in sparkR.init(master=“local”) in RStudio

Tried multiple permutations of setting SPARK_HOME… Still the same issue:
> Sys.setenv(SPARK_HOME="c:\\DevTools\\spark-1.5.1")
> .libPaths(c(file.path(Sys.getenv("SPARK_HOME"),"R","lib"),.libPaths()))
> library(SparkR)

Attaching package: ‘SparkR’

The following objects are masked from ‘package:stats’:

filter, na.omit

The following objects are masked from ‘package:base’:

intersect, rbind, sample, subset, summary, table, transform

> sc<-sparkR.init(master = "local")
Launching java with spark-submit command 
c:\DevTools\spark-1.5.1/bin/spark-submit.cmd   sparkr-shell 
C:\Users\a554719\AppData\Local\Temp\RtmpkXZVBa\backend_port45ac487f2fbd
Error in sparkR.init(master = "local") :
  JVM is not ready after 10 seconds


From: Sun, Rui [mailto:rui....@intel.com]
Sent: Wednesday, October 07, 2015 2:35 AM
To: Hossein; Khandeshi, Ami
Cc: akhandeshi; user@spark.apache.org
Subject: RE: SparkR Error in sparkR.init(master=“local”) in RStudio

Not sure "/C/DevTools/spark-1.5.1/bin/spark-submit.cmd" is a valid path?
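That doubt can be checked from base R; "/C/DevTools/..." is a Unix-style rendering of the path that cmd.exe cannot execute, which is consistent with the "status 127" (command not found) warning reported below:

```r
# Check whether the launcher script exists at a path Windows understands.
# "/C/DevTools/..." is not a valid Windows path, so the first test should
# fail while the second succeeds if Spark is really installed there:
file.exists("/C/DevTools/spark-1.5.1/bin/spark-submit.cmd")  # likely FALSE
file.exists("C:/DevTools/spark-1.5.1/bin/spark-submit.cmd")  # TRUE if Spark is installed at that path
```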

From: Hossein [mailto:fal...@gmail.com]
Sent: Wednesday, October 7, 2015 12:46 AM
To: Khandeshi, Ami
Cc: Sun, Rui; akhandeshi; user@spark.apache.org
Subject: Re: SparkR Error in sparkR.init(master=“local”) in RStudio

Have you built the Spark jars? Can you run the Spark Scala shell?

--Hossein

On Tuesday, October 6, 2015, Khandeshi, Ami 
<ami.khande...@fmr.com.invalid> wrote:
> Sys.setenv(SPARKR_SUBMIT_ARGS="--verbose sparkr-shell")
> Sys.setenv(SPARK_PRINT_LAUNCH_COMMAND=1)
>
> sc <- sparkR.init(master="local")
Launching java with spark-submit command 
/C/DevTools/spark-1.5.1/bin/spark-submit.cmd   --verbose sparkr-shell 
C:\Users\a554719\AppData\Local\Temp\Rtmpw11KJ1\backend_port31b0afd4391
Error in sparkR.init(master = "local") :
  JVM is not ready after 10 seconds
In addition: Warning message:
running command '"/C/DevTools/spark-1.5.1/bin/spark-submit.cmd"   --verbose 
sparkr-shell 
C:\Users\a554719\AppData\Local\Temp\Rtmpw11KJ1\backend_port31b0afd4391' had 
status 127

-Original Message-
From: Sun, Rui [mailto:rui@intel.com]
Sent: Tuesday, October 06, 2015 9:39 AM
To: akhandeshi; user@spark.apache.org
Subject: RE: SparkR Error in sparkR.init(master=“local”) in RStudio

What you have done is supposed to work.  Need more debugging information to 
find the cause.

Could you add the following lines before calling sparkR.init()?

Sys.setenv(SPARKR_SUBMIT_ARGS="--verbose sparkr-shell")
Sys.setenv(SPARK_PRINT_LAUNCH_COMMAND=1)

Then see if you can find any hint in the console output.

-Original Message-
From: akhandeshi [mailto:ami.khande...@gmail.com]
Sent: Tuesday, October 6, 2015 8:21 PM
To: user@spark.apache.org
Subject: Re: SparkR Error in sparkR.init(master=“local”) in RStudio

I couldn't get this working...

I have JAVA_HOME set.
I have defined SPARK_HOME
Sys.setenv(SPARK_HOME="c:\DevTools\spark-1.5.1")
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths())) 
library("SparkR", lib.loc="c:\\DevTools\\spark-1.5.1\\lib")
library(SparkR)
sc<-sparkR.init(master="local")

I get
Error in sparkR.init(master = "local") :
  JVM is not ready after 10 seconds

What am I missing??
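One likely culprit in the snippet above is the single-backslash SPARK_HOME string. In R string literals a backslash starts an escape sequence, so `"c:\DevTools\spark-1.5.1"` is not a valid string (`\D` is an unrecognized escape and raises a parse error). A minimal base-R sketch of the corrected forms:

```r
# Use doubled backslashes or forward slashes in Windows paths in R:
Sys.setenv(SPARK_HOME = "c:\\DevTools\\spark-1.5.1")
# or, equivalently:
Sys.setenv(SPARK_HOME = "c:/DevTools/spark-1.5.1")
# normalizePath renders the path in native Windows form for inspection:
normalizePath(Sys.getenv("SPARK_HOME"), winslash = "\\", mustWork = FALSE)
```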






--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/SparkR-Error-in-sparkR-init-master-local-in-RStudio-tp23768p24949.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org




--
--Hossein


RE: SparkR Error in sparkR.init(master=“local”) in RStudio

2015-10-09 Thread Khandeshi, Ami
Thank you for your help!  I was able to resolve it by changing my working 
directory to a local drive.  The default was a mapped network drive.
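A sketch of the fix described above, combining it with the SPARK_HOME setup from earlier in the thread (the local directory path below is hypothetical):

```r
# spark-submit was failing because R's working directory was on a mapped
# network drive, so move to a local directory before initializing SparkR:
getwd()                          # e.g. "Z:/..." on a mapped drive
setwd("C:/Users/me/spark-work")  # hypothetical directory on a local drive

Sys.setenv(SPARK_HOME = "c:\\DevTools\\spark-1.5.1")
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
library(SparkR)
sc <- sparkR.init(master = "local")
```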





Spark shell never leaves ACCEPTED state in YARN CDH5

2015-03-25 Thread Khandeshi, Ami
I am seeing the same behavior.  I have enough resources.  How do I resolve 
it?

Thanks,

Ami


RE: Help with processing multiple RDDs

2014-11-11 Thread Khandeshi, Ami
I am running locally in client mode.  I have allocated as much as 85g to the 
driver, executor, and daemon.  When I look at the Java processes, I see two 
(plus jps itself):
20974 SparkSubmitDriverBootstrapper
21650 Jps
21075 SparkSubmit
I have tried repartitioning before, but my understanding is that it comes with 
an overhead, and in my previous attempt I didn't achieve much success.  I am 
not clear how best to get even partitions; any thoughts?

I am caching the RDD and performing a count on the keys.

I am running it again with repartitioning on the dataset; let's see if that 
helps!  I will send you the logs as soon as it completes!

Thank you,  I sincerely appreciate your help!

Regards,

Ami

-Original Message-
From: Kapil Malik [mailto:kma...@adobe.com] 
Sent: Tuesday, November 11, 2014 9:05 PM
To: akhandeshi; u...@spark.incubator.apache.org
Subject: RE: Help with processing multiple RDDs

Hi,

How is the 78g distributed among the driver, daemon, and executor?

Can you please paste the log lines saying that there is not enough memory to 
hold the data in memory?  Are you collecting any data in the driver?

Lastly, did you try repartitioning to create smaller, evenly distributed 
partitions?

Regards,

Kapil 

-Original Message-
From: akhandeshi [mailto:ami.khande...@gmail.com] 
Sent: 12 November 2014 03:44
To: u...@spark.incubator.apache.org
Subject: Help with processing multiple RDDs

I have been struggling to process a set of RDDs.  Conceptually, it is not a 
large data set.  It seems that no matter how much memory I give the JVM or how 
I partition, I can't process this data.  I am caching the RDD.  I have tried 
persist(memory and disk), persist(memory) and persist(off_heap) with no 
success.  Currently I am giving 78g each to my driver, daemon, and executor 
memory.

Currently, it seems to have trouble with one of the largest partitions, 
rdd_22_29, which is 25.9 GB.

The metrics page shows Summary Metrics for 29 Completed Tasks.  However, I 
don't see a few of the partitions in the list below.  I do see warnings in the 
log file indicating that there is not enough memory to hold the data in 
memory.  I don't understand what I am doing wrong or how to troubleshoot it. 
Any pointers would be appreciated...

14/11/11 21:28:45 WARN CacheManager: Not enough space to cache partition
rdd_22_20 in memory! Free memory is 17190150496 bytes.
14/11/11 21:29:27 WARN CacheManager: Not enough space to cache partition
rdd_22_13 in memory! Free memory is 17190150496 bytes.
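The warnings above can be read together with the partition sizes listed below; a quick back-of-the-envelope check in R shows why caching keeps failing:

```r
# The CacheManager reports 17190150496 bytes of free storage memory, while
# the largest cached partition (rdd_22_29) is 25.9 GB, so that partition can
# never fit in the memory store no matter how often caching is retried.
free_bytes <- 17190150496
free_bytes / 1024^3         # about 16.0 GiB of free storage memory
25.9 > free_bytes / 1024^3  # TRUE: the partition exceeds free storage memory
```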


Block Name   Storage Level                       Size in Memory   Size on Disk   Executors
rdd_22_0     Memory Deserialized 1x Replicated   2.1 MB           0.0 B          mddworker.c.fi-mdd-poc.internal:54974
rdd_22_10    Memory Deserialized 1x Replicated   7.0 GB           0.0 B          mddworker.c.fi-mdd-poc.internal:54974
rdd_22_11    Memory Deserialized 1x Replicated   1290.2 MB        0.0 B          mddworker.c.fi-mdd-poc.internal:54974
rdd_22_12    Memory Deserialized 1x Replicated   1167.7 KB        0.0 B          mddworker.c.fi-mdd-poc.internal:54974
rdd_22_14    Memory Deserialized 1x Replicated   3.8 GB           0.0 B          mddworker.c.fi-mdd-poc.internal:54974
rdd_22_15    Memory Deserialized 1x Replicated   4.0 MB           0.0 B          mddworker.c.fi-mdd-poc.internal:54974
rdd_22_16    Memory Deserialized 1x Replicated   2.4 GB           0.0 B          mddworker.c.fi-mdd-poc.internal:54974
rdd_22_17    Memory Deserialized 1x Replicated   37.6 MB          0.0 B          mddworker.c.fi-mdd-poc.internal:54974
rdd_22_18    Memory Deserialized 1x Replicated   120.9 MB         0.0 B          mddworker.c.fi-mdd-poc.internal:54974
rdd_22_19    Memory Deserialized 1x Replicated   755.9 KB         0.0 B          mddworker.c.fi-mdd-poc.internal:54974
rdd_22_2     Memory Deserialized 1x Replicated   289.5 KB         0.0 B          mddworker.c.fi-mdd-poc.internal:54974
rdd_22_21    Memory Deserialized 1x Replicated   11.9 KB          0.0 B          mddworker.c.fi-mdd-poc.internal:54974
rdd_22_22    Memory Deserialized 1x Replicated   24.0 B           0.0 B          mddworker.c.fi-mdd-poc.internal:54974
rdd_22_23    Memory Deserialized 1x Replicated   24.0 B           0.0 B          mddworker.c.fi-mdd-poc.internal:54974
rdd_22_24    Memory Deserialized 1x Replicated   3.0 MB           0.0 B          mddworker.c.fi-mdd-poc.internal:54974
rdd_22_25    Memory Deserialized 1x Replicated   24.0 B           0.0 B          mddworker.c.fi-mdd-poc.internal:54974
rdd_22_26    Memory Deserialized 1x Replicated   4.0 GB           0.0 B          mddworker.c.fi-mdd-poc.internal:54974
rdd_22_27    Memory Deserialized 1x Replicated   24.0 B           0.0 B          mddworker.c.fi-mdd-poc.internal:54974
rdd_22_28    Memory Deserialized 1x Replicated   1846.1 KB        0.0 B          mddworker.c.fi-mdd-poc.internal:54974
rdd_22_29    Memory Deserialized 1x Replicated   25.9 GB          0.0 B          mddworker.c.fi-mdd-poc.internal:54974
rdd_22_3     Memory Deserialized 1x Replicated   267.1 KB         0.0 B          mddworker.c.fi-mdd-poc.internal:54974
rdd_22_4     Memory Deserialized 1x Replicated   24.0 B           0.0 B