Re: SparkContext initialization error - java.io.IOException: No space left on device
Thank you both - yup: the /tmp disk space was filled up :)

On Sun, Sep 6, 2015 at 11:51 AM, Ted Yu wrote:
> Use the following command if needed:
> df -i /tmp
[rest of quoted thread trimmed; the full messages appear below]
Re: SparkContext initialization error - java.io.IOException: No space left on device
The folder is in "/tmp" by default. Could you use "df -h" to check the free space of /tmp?

Best Regards,
Shixiong Zhu

2015-09-05 9:50 GMT+08:00 shenyan zhen:
> Has anyone seen this error? Not sure which dir the program was trying to write to.
[rest of quoted message trimmed; the original post appears below]
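Shixiong's suggestion can be run as-is; a minimal sketch of the check, assuming the driver host is a Linux box with GNU coreutils (the scripted variant uses GNU df's `--output` option, and the 90% threshold is an arbitrary example value):

```shell
# Show free space on the filesystem backing /tmp, the default location
# of Spark's staged files in yarn-client mode per this thread.
df -h /tmp

# Scripted variant: pull out the use% figure and warn above a threshold.
usage=$(df --output=pcent /tmp | tail -n 1 | tr -dc '0-9')
if [ "$usage" -ge 90 ]; then
  echo "WARN: /tmp is ${usage}% full"
fi
```

If the warning fires, clearing old files under /tmp (or staging Spark's scratch space on a larger disk) resolves the ENOSPC at submit time, which is what the original poster confirmed.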
Re: SparkContext initialization error - java.io.IOException: No space left on device
Use the following command if needed:

df -i /tmp

See https://wiki.gentoo.org/wiki/Knowledge_Base:No_space_left_on_device_while_there_is_plenty_of_space_available

On Sun, Sep 6, 2015 at 6:15 AM, Shixiong Zhu wrote:
> The folder is in "/tmp" by default. Could you use "df -h" to check the free space of /tmp?
[rest of quoted thread trimmed; the full messages appear below]
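Ted's point (and the linked Gentoo article) is that a filesystem can report plenty of free blocks yet still throw "No space left on device" once its inode table is exhausted, which typically happens when a directory like /tmp accumulates huge numbers of small files. A quick sketch of both checks side by side:

```shell
# Block-level free space: ENOSPC cause #1.
df -h /tmp

# Inode usage: ENOSPC cause #2, even when df -h shows free space.
# IUse% near 100% means no new files can be created.
df -i /tmp
```

If `df -i` shows inode exhaustion, finding and deleting the directory with the most entries (e.g. `find /tmp -xdev -type f | wc -l` per subdirectory) is the usual remedy.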
SparkContext initialization error - java.io.IOException: No space left on device
Has anyone seen this error? Not sure which dir the program was trying to write to.

I am running Spark 1.4.1, submitting the Spark job to YARN, in yarn-client mode.

15/09/04 21:36:06 ERROR SparkContext: Error adding jar (java.io.IOException: No space left on device), was the --addJars option used?

15/09/04 21:36:08 ERROR SparkContext: Error initializing SparkContext.
java.io.IOException: No space left on device
    at java.io.FileOutputStream.writeBytes(Native Method)
    at java.io.FileOutputStream.write(FileOutputStream.java:300)
    at java.util.zip.DeflaterOutputStream.deflate(DeflaterOutputStream.java:178)
    at java.util.zip.ZipOutputStream.closeEntry(ZipOutputStream.java:213)
    at java.util.zip.ZipOutputStream.finish(ZipOutputStream.java:318)
    at java.util.zip.DeflaterOutputStream.close(DeflaterOutputStream.java:163)
    at java.util.zip.ZipOutputStream.close(ZipOutputStream.java:338)
    at org.apache.spark.deploy.yarn.Client.createConfArchive(Client.scala:432)
    at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:338)
    at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:561)
    at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:115)
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57)
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:141)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:497)

Thanks,
Shenyan
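The `createConfArchive` frame in the trace shows the YARN client zipping the Hadoop/Spark conf under the JVM's temp directory (java.io.tmpdir, /tmp by default) before upload. When /tmp is small and cannot be freed, one workaround is to point the driver's temp and scratch directories at a larger disk. A hedged sketch, where /data/spark-tmp is a hypothetical path with ample space and your-app.jar a placeholder application:

```shell
# Stage Spark scratch files and the driver JVM's java.io.tmpdir on a
# larger disk instead of the default /tmp.
# /data/spark-tmp and your-app.jar are hypothetical example names.
mkdir -p /data/spark-tmp
spark-submit \
  --master yarn-client \
  --conf spark.local.dir=/data/spark-tmp \
  --driver-java-options "-Djava.io.tmpdir=/data/spark-tmp" \
  your-app.jar
```

`spark.local.dir` covers Spark's own shuffle/scratch space; the `-Djava.io.tmpdir` override is what would move the conf-archive step seen in this trace, since that code path uses the JVM temp dir rather than Spark's local dir.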