[ https://issues.apache.org/jira/browse/MAHOUT-814?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13108392#comment-13108392 ]

Sean Owen commented on MAHOUT-814:
----------------------------------

Your analysis is completely correct. java.io.tmpdir is more than that -- it's 
the definition of "writable temp space" for Java itself. There should be 
nothing wrong with using this; in fact, it's the only reliable temp space 
available.

1. If it's not writable, that's a problem with the host machine. As you might 
have noticed, Jenkins seems to be quite flaky; a few times a week it will fail 
for some internal machine-specific reason.

2. However, I too am not sure that's the issue; it could still be a clash 
somehow. Can you change the job that uses q-temp.seq to stash it in a file 
whose name includes a timestamp, like q-temp-125095095090.seq? In general I 
think this is a good strategy, which is why the test framework does exactly 
this (rough sketch below).

It's either nothing for us to fix (1), or, I think, a case of making temp files 
more unique (2). You shouldn't have to do anything more complex.
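To be concrete about (2), something along these lines would do it. This is 
only a sketch; the helper name and exact filename format are illustrative, not 
what's in the attached patch.

{code}
import org.apache.hadoop.fs.Path;

public class TempNames {
  // Builds a per-run name like /tmp/q-temp-1316522645123.seq instead of the
  // fixed /tmp/q-temp.seq, so two JVMs on the same host can't collide.
  static Path uniqueTempQPath() {
    return new Path(System.getProperty("java.io.tmpdir"),
        "q-temp-" + System.currentTimeMillis() + ".seq");
  }
}
{code}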

> SSVD local tests should use their own tmp space to avoid collisions
> -------------------------------------------------------------------
>
>                 Key: MAHOUT-814
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-814
>             Project: Mahout
>          Issue Type: Bug
>    Affects Versions: 0.5
>            Reporter: Grant Ingersoll
>            Assignee: Dmitriy Lyubimov
>            Priority: Minor
>             Fix For: 0.6
>
>         Attachments: MAHOUT-814.patch
>
>
> Running Mahout in an environment where Jenkins is also running, and I am getting:
> {quote}
> java.io.FileNotFoundException: /tmp/q-temp.seq (Permission denied)
>         at java.io.FileOutputStream.open(Native Method)
>         at java.io.FileOutputStream.<init>(FileOutputStream.java:209)
>         at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:187)
>         at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:183)
>         at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:241)
>         at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.<init>(ChecksumFileSystem.java:335)
>         at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:368)
>         at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:528)
>         at org.apache.hadoop.io.SequenceFile$BlockCompressWriter.<init>(SequenceFile.java:1198)
>         at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:401)
>         at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:284)
>         at org.apache.mahout.math.hadoop.stochasticsvd.qr.QRFirstStep.getTempQw(QRFirstStep.java:263)
>         at org.apache.mahout.math.hadoop.stochasticsvd.qr.QRFirstStep.flushSolver(QRFirstStep.java:104)
>         at org.apache.mahout.math.hadoop.stochasticsvd.qr.QRFirstStep.map(QRFirstStep.java:175)
>         at org.apache.mahout.math.hadoop.stochasticsvd.qr.QRFirstStep.collect(QRFirstStep.java:279)
>         at org.apache.mahout.math.hadoop.stochasticsvd.QJob$QMapper.map(QJob.java:142)
>         at org.apache.mahout.math.hadoop.stochasticsvd.QJob$QMapper.map(QJob.java:71)
>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>         at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:212)
> {quote}
> Also seeing the following tests fail:
> {quote}
> Tests in error: 
>   testSSVDSolverSparse(org.apache.mahout.math.hadoop.stochasticsvd.LocalSSVDSolverSparseSequentialTest): Q job unsuccessful.
>   testSSVDSolverPowerIterations1(org.apache.mahout.math.hadoop.stochasticsvd.LocalSSVDSolverSparseSequentialTest): Q job unsuccessful.
>   testSSVDSolverPowerIterations1(org.apache.mahout.math.hadoop.stochasticsvd.LocalSSVDSolverDenseTest): Q job unsuccessful.
>   testSSVDSolverDense(org.apache.mahout.math.hadoop.stochasticsvd.LocalSSVDSolverDenseTest): Q job unsuccessful.
> {quote}
> I haven't checked all of them, but I suspect they all fail for the same 
> reason. We should dynamically create a temp area for each test, using 
> temporary directories under the main temp dir.
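
A minimal sketch of the per-test temp area proposed above. The class and method 
names here are purely illustrative assumptions, not the contents of the 
attached MAHOUT-814.patch.

{code}
import java.io.File;
import java.io.IOException;

public class TestTempDirs {
  // Creates a dedicated directory for one test run under the main temp dir,
  // e.g. /tmp/mahout-LocalSSVDSolverDenseTest-1316522645123/, so concurrent
  // builds on the same host never share a fixed path like /tmp/q-temp.seq.
  static File createTestTempDir(String testName) throws IOException {
    File dir = new File(System.getProperty("java.io.tmpdir"),
        "mahout-" + testName + "-" + System.currentTimeMillis());
    if (!dir.mkdirs()) {
      throw new IOException("could not create " + dir);
    }
    dir.deleteOnExit();
    return dir;
  }
}
{code}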

--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira

        
