[jira] [Commented] (OOZIE-3529) Oozie not supported for s3 as filesystem

2019-10-04 Thread Hadoop QA (Jira)


[ 
https://issues.apache.org/jira/browse/OOZIE-3529?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16944350#comment-16944350
 ] 

Hadoop QA commented on OOZIE-3529:
--

PreCommit-OOZIE-Build started


> Oozie not supported for s3 as filesystem
> 
>
> Key: OOZIE-3529
> URL: https://issues.apache.org/jira/browse/OOZIE-3529
> Project: Oozie
>  Issue Type: Bug
>  Components: core
>Affects Versions: 4.3.1, 5.1.0
>Reporter: Denes Bodo
>Assignee: Denes Bodo
>Priority: Critical
>  Labels: S3
> Fix For: 5.2.0
>
> Attachments: OOZIE-3529.001.patch, OOZIE-3529.002.patch, 
> OOZIE-3529.003.patch, OOZIE-3529.004.patch, OOZIE-3529.005.patch, 
> OOZIE-3529.006.patch, id.pig, job.properties, workflow.xml
>
>
> Many customers who use the S3 file system as a secondary one experience the
> following error when Oozie tries to submit the YARN application (a
> configuration sketch follows the stack trace below):
> {noformat}
> 2019-04-29 13:02:53,770  WARN ForkedActionStartXCommand:523 - 
> SERVER[hwnode1.puretec.purestorage.com] USER[hrt_qa] GROUP[-] TOKEN[] 
> APP[demo-wf] JOB[001-190423141707256-oozie-oozi-W] 
> ACTION[001-190423141707256-oozie-oozi-W@streaming-node] Error starting 
> action [streaming-node]. ErrorType [ERROR], ErrorCode 
> [UnsupportedOperationException], Message [UnsupportedOperationException: 
> Accessing local file system is not allowed]
> org.apache.oozie.action.ActionExecutorException: 
> UnsupportedOperationException: Accessing local file system is not allowed
>   at 
> org.apache.oozie.action.ActionExecutor.convertException(ActionExecutor.java:446)
>   at 
> org.apache.oozie.action.hadoop.JavaActionExecutor.createLauncherConf(JavaActionExecutor.java:1092)
>   at 
> org.apache.oozie.action.hadoop.MapReduceActionExecutor.createLauncherConf(MapReduceActionExecutor.java:309)
>   at 
> org.apache.oozie.action.hadoop.JavaActionExecutor.submitLauncher(JavaActionExecutor.java:1197)
>   at 
> org.apache.oozie.action.hadoop.JavaActionExecutor.start(JavaActionExecutor.java:1472)
>   at 
> org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:234)
>   at 
> org.apache.oozie.command.wf.ForkedActionStartXCommand.execute(ForkedActionStartXCommand.java:41)
>   at 
> org.apache.oozie.command.wf.ForkedActionStartXCommand.execute(ForkedActionStartXCommand.java:30)
>   at org.apache.oozie.command.XCommand.call(XCommand.java:287)
>   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>   at 
> org.apache.oozie.service.CallableQueueService$CallableWrapper.run(CallableQueueService.java:179)
>   at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>   at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>   at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.UnsupportedOperationException: Accessing local file 
> system is not allowed
>   at 
> org.apache.hadoop.fs.RawLocalFileSystem.initialize(RawLocalFileSystem.java:48)
>   at 
> org.apache.hadoop.fs.LocalFileSystem.initialize(LocalFileSystem.java:47)
>   at 
> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3354)
>   at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:124)
>   at 
> org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3403)
>   at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3371)
>   at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:477)
>   at org.apache.hadoop.fs.FileSystem.getLocal(FileSystem.java:433)
>   at 
> org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.confChanged(LocalDirAllocator.java:301)
>   at 
> org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathForWrite(LocalDirAllocator.java:378)
>   at 
> org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.createTmpFileForWrite(LocalDirAllocator.java:461)
>   at 
> org.apache.hadoop.fs.LocalDirAllocator.createTmpFileForWrite(LocalDirAllocator.java:200)
>   at 
> org.apache.hadoop.fs.s3a.S3AFileSystem.createTmpFileForWrite(S3AFileSystem.java:572)
>   at 
> org.apache.hadoop.fs.s3a.S3ADataBlocks$DiskBlockFactory.create(S3ADataBlocks.java:811)
>   at 
> org.apache.hadoop.fs.s3a.S3ABlockOutputStream.createBlockIfNeeded(S3ABlockOutputStream.java:190)
>   at 
> org.apache.hadoop.fs.s3a.S3ABlockOutputStream.<init>(S3ABlockOutputStream.java:168)
>   at org.apache.hadoop.fs.s3a.S3AFileSystem.create(S3AFileSystem.java:778)
>   at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1169)
>   at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1149)
>   at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1038)
>   at org.apache.hadoop.fs.FileSystem.create(FileSystem.ja
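
The stack trace above points at S3A's disk buffering: S3ADataBlocks$DiskBlockFactory asks
LocalDirAllocator for a temporary file, and that call initializes the local file system that
Oozie blocks on this code path ("Accessing local file system is not allowed"). As a minimal
sketch only, these are the stock Hadoop S3A properties that govern that buffering; the values
shown are illustrative and are not the change carried by the attached patches:

{noformat}
<!-- Hadoop client configuration, e.g. core-site.xml (illustrative values) -->
<property>
  <!-- Directory where S3A stages upload blocks on local disk -->
  <name>fs.s3a.buffer.dir</name>
  <value>${hadoop.tmp.dir}/s3a</value>
</property>
<property>
  <!-- "disk" (the default) goes through LocalDirAllocator as seen in the trace;
       "array" or "bytebuffer" keep upload blocks in memory instead -->
  <name>fs.s3a.fast.upload.buffer</name>
  <value>disk</value>
</property>
{noformat}

Buffering in memory sidesteps the LocalDirAllocator call entirely, at the cost of holding each
in-flight block on the heap or in direct buffers for the duration of the upload.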

Failed: OOZIE-3529 PreCommit Build #1233

2019-10-04 Thread Apache Jenkins Server
Jira: https://issues.apache.org/jira/browse/OOZIE-3529
Build: https://builds.apache.org/job/PreCommit-OOZIE-Build/1233/

###############################################################################
## LAST 100 LINES OF THE CONSOLE
###############################################################################
[...truncated 909.05 KB...]
error: 
a/core/src/test/java/org/apache/oozie/service/TestHadoopAccessorService.java: 
No such file or directory
Checking patch a/core/src/test/java/org/apache/oozie/test/XFsTestCase.java => 
b/core/src/test/java/org/apache/oozie/test/XFsTestCase.java...
error: a/core/src/test/java/org/apache/oozie/test/XFsTestCase.java: No such 
file or directory
Checking patch a/docs/src/site/markdown/AG_HadoopConfiguration.md => 
b/docs/src/site/markdown/AG_HadoopConfiguration.md...
error: a/docs/src/site/markdown/AG_HadoopConfiguration.md: No such file or 
directory
Checking patch a/core/pom.xml => b/core/pom.xml...
error: a/core/pom.xml: No such file or directory
Checking patch 
a/core/src/main/java/org/apache/oozie/service/HadoopAccessorService.java => 
b/core/src/main/java/org/apache/oozie/service/HadoopAccessorService.java...
error: 
a/core/src/main/java/org/apache/oozie/service/HadoopAccessorService.java: No 
such file or directory
Checking patch a/core/src/main/resources/oozie-default.xml => 
b/core/src/main/resources/oozie-default.xml...
error: a/core/src/main/resources/oozie-default.xml: No such file or directory
Checking patch 
a/core/src/test/java/org/apache/oozie/service/TestHadoopAccessorService.java => 
b/core/src/test/java/org/apache/oozie/service/TestHadoopAccessorService.java...
error: 
a/core/src/test/java/org/apache/oozie/service/TestHadoopAccessorService.java: 
No such file or directory
Checking patch a/core/src/test/java/org/apache/oozie/test/XFsTestCase.java => 
b/core/src/test/java/org/apache/oozie/test/XFsTestCase.java...
error: a/core/src/test/java/org/apache/oozie/test/XFsTestCase.java: No such 
file or directory
Checking patch a/docs/src/site/markdown/AG_HadoopConfiguration.md => 
b/docs/src/site/markdown/AG_HadoopConfiguration.md...
error: a/docs/src/site/markdown/AG_HadoopConfiguration.md: No such file or 
directory
Checking patch core/pom.xml...
Checking patch 
core/src/main/java/org/apache/oozie/service/HadoopAccessorService.java...
Checking patch core/src/main/resources/oozie-default.xml...
Checking patch 
core/src/test/java/org/apache/oozie/service/TestHadoopAccessorService.java...
Checking patch core/src/test/java/org/apache/oozie/test/XFsTestCase.java...
Checking patch docs/src/site/markdown/AG_HadoopConfiguration.md...
Checking patch core/pom.xml...
error: while searching for:
compile






error: patch failed: core/pom.xml:504
error: core/pom.xml: patch does not apply
Checking patch 
core/src/main/java/org/apache/oozie/service/HadoopAccessorService.java...
error: while searching for:
public static final String KERBEROS_AUTH_ENABLED = CONF_PREFIX + 
"kerberos.enabled";
public static final String KERBEROS_KEYTAB = CONF_PREFIX + "keytab.file";
public static final String KERBEROS_PRINCIPAL = CONF_PREFIX + 
"kerberos.principal";

private static final String OOZIE_HADOOP_ACCESSOR_SERVICE_CREATED = 
"oozie.HadoopAccessorService.created";
private static final String DEFAULT_ACTIONNAME = "default";

error: patch failed: 
core/src/main/java/org/apache/oozie/service/HadoopAccessorService.java:100
error: core/src/main/java/org/apache/oozie/service/HadoopAccessorService.java: 
patch does not apply
Checking patch core/src/main/resources/oozie-default.xml...
Hunk #1 succeeded at 2231 (offset 12 lines).
Checking patch 
core/src/test/java/org/apache/oozie/service/TestHadoopAccessorService.java...
error: while searching for:
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.oozie.ErrorCode;
import org.apache.oozie.util.XConfiguration;

public class TestHadoopAccessorService extends XFsTestCase {


error: patch failed: 
core/src/test/java/org/apache/oozie/service/TestHadoopAccessorService.java:38
error: 
core/src/test/java/org/apache/oozie/service/TestHadoopAccessorService.java: 
patch does not apply
Checking patch core/src/test/java/org/apache/oozie/test/XFsTestCase.java...
error: while searching for:
conf.set("oozie.service.HadoopAccessorService.hadoop.configurations", 
"*=hadoop-conf");
conf.set("oozie.service.HadoopAccessorService.action.configurations", 
"*=action-conf");

has = new HadoopAccessorService();
has.init(conf);
Configuration jobConf = has.createConfiguration(getNameNodeUri());

error: patch failed: 
core/src/test/java/org/apache/oozie/test/XFsTestCase.java:79
error: core/src/test/java/org/apache/oozie/test/XFsTestCase.java: patch does 
not apply
Checking patch docs/src/site/markdown/AG_HadoopConfiguration.md...
error: while searching for:


```

## Limitations


error: patch failed: docs/src/s

[jira] [Commented] (OOZIE-3529) Oozie not supported for s3 as filesystem

2019-10-04 Thread Hadoop QA (Jira)


[ 
https://issues.apache.org/jira/browse/OOZIE-3529?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16944361#comment-16944361
 ] 

Hadoop QA commented on OOZIE-3529:
--


Testing JIRA OOZIE-3529

Cleaning local git workspace



{color:red}-1{color} Patch failed to apply to head of branch





[jira] Subscription: Oozie Patch Available

2019-10-04 Thread jira
Issue Subscription
Filter: Oozie Patch Available (91 issues)

Subscriber: ooziedaily

Key Summary
OOZIE-3542  Handle better old Hdfs implementations in ECPolicyDisabler
https://issues.apache.org/jira/browse/OOZIE-3542
OOZIE-3536  oozie-main(pom.xml)  plugin maven-javadoc-plugin upgrade version 
caused configuration can't find the Tag 
https://issues.apache.org/jira/browse/OOZIE-3536
OOZIE-3529  Oozie not supported for s3 as filesystem
https://issues.apache.org/jira/browse/OOZIE-3529
OOZIE-3482  Fix bug in CoordSubmitXCommand#validateCoordinatorJob
https://issues.apache.org/jira/browse/OOZIE-3482
OOZIE-3480  Add windowactionstatus metrics in DBLiteWorkflowStoreService
https://issues.apache.org/jira/browse/OOZIE-3480
OOZIE-3461  CoordMaterializeTriggerService code cleanup
https://issues.apache.org/jira/browse/OOZIE-3461
OOZIE-3449  Make spark-2 as the default profile
https://issues.apache.org/jira/browse/OOZIE-3449
OOZIE-3447  Run test case in local : It shows oozie-hsqldb-orm.xml exception
https://issues.apache.org/jira/browse/OOZIE-3447
OOZIE-3418  Upgrade to Guava 27
https://issues.apache.org/jira/browse/OOZIE-3418
OOZIE-3404  The env variable of SPARK_HOME needs to be set when running pySpark
https://issues.apache.org/jira/browse/OOZIE-3404
OOZIE-3375  Can't use empty  in coordinator
https://issues.apache.org/jira/browse/OOZIE-3375
OOZIE-3367  Using && in EL expressions in oozie bundle.xml files generates 
parse errors
https://issues.apache.org/jira/browse/OOZIE-3367
OOZIE-3366  Update workflow status and subworkflow status on suspend command
https://issues.apache.org/jira/browse/OOZIE-3366
OOZIE-3364  Rerunning Oozie bundle jobs starts the coordinators in 
indeterminate order
https://issues.apache.org/jira/browse/OOZIE-3364
OOZIE-3362  When killed, SSH action should kill the spawned processes on target 
host
https://issues.apache.org/jira/browse/OOZIE-3362
OOZIE-3335  Cleanup parseFilter methods
https://issues.apache.org/jira/browse/OOZIE-3335
OOZIE-3328  Create Hive compatibility action executor to run hive actions using 
beeline
https://issues.apache.org/jira/browse/OOZIE-3328
OOZIE-3320  Oozie ShellAction should support absolute bash file path
https://issues.apache.org/jira/browse/OOZIE-3320
OOZIE-3319  Log SSH action callback error output
https://issues.apache.org/jira/browse/OOZIE-3319
OOZIE-3301  Update NOTICE file
https://issues.apache.org/jira/browse/OOZIE-3301
OOZIE-3274  Remove slf4j
https://issues.apache.org/jira/browse/OOZIE-3274
OOZIE-3266  Coord action rerun support RERUN_SKIP_NODES option
https://issues.apache.org/jira/browse/OOZIE-3266
OOZIE-3256  refactor OozieCLI class
https://issues.apache.org/jira/browse/OOZIE-3256
OOZIE-3254  [coordinator] LAST_ONLY and NONE execution modes: possible 
OutOfMemoryError when there are too many coordinator actions to materialize
https://issues.apache.org/jira/browse/OOZIE-3254
OOZIE-3199  Let system property restriction configurable
https://issues.apache.org/jira/browse/OOZIE-3199
OOZIE-3196  Authorization: restrict world readability by user
https://issues.apache.org/jira/browse/OOZIE-3196
OOZIE-3170  Oozie Diagnostic Bundle tool fails with NPE due to missing service 
class
https://issues.apache.org/jira/browse/OOZIE-3170
OOZIE-3137  Add support for log4j2 in HiveMain
https://issues.apache.org/jira/browse/OOZIE-3137
OOZIE-3135  Configure log4j2 in SqoopMain
https://issues.apache.org/jira/browse/OOZIE-3135
OOZIE-3091  Oozie Sqoop Avro Import fails with "java.lang.NoClassDefFoundError: 
org/apache/avro/mapred/AvroWrapper"
https://issues.apache.org/jira/browse/OOZIE-3091
OOZIE-3071  Oozie 4.3 Spark sharelib uses a different version of commons-lang3 
than Spark 2.2.0
https://issues.apache.org/jira/browse/OOZIE-3071
OOZIE-3063  Sanitizing variables that are part of openjpa.ConnectionProperties
https://issues.apache.org/jira/browse/OOZIE-3063
OOZIE-3062  Set HADOOP_CONF_DIR for spark action
https://issues.apache.org/jira/browse/OOZIE-3062
OOZIE-2952  Fix Findbugs warnings in oozie-sharelib-oozie
https://issues.apache.org/jira/browse/OOZIE-2952
OOZIE-2834  ParameterVerifier logging non-useful warning for workflow definition
https://issues.apache.org/jira/browse/OOZIE-2834
OOZIE-2812  SparkConfigurationService should support loading configurations 
from multiple Spark versions
https://issues.apache.org/jira/browse/OOZIE-2812
OOZIE-2795  Create lib directory or symlink for Oozie CLI during packaging
https://issues.apache.org/jira/browse/OOZIE-2795
OOZIE-2784  Include WEEK as a parameter in the Coordinator Expression