[jira] [Commented] (HDFS-9758) libhdfs++: Implement Python bindings

2016-06-08 Thread Tibor Kiss (JIRA)

[ 
https://issues.apache.org/jira/browse/HDFS-9758?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15320657#comment-15320657
 ] 

Tibor Kiss commented on HDFS-9758:
--

Going forward with the implementation I've realized that some calls (e.g. 
hdfsExists, hdfsCreateFile, etc.) are not exposed through the C interface.
Is this intentional, or just a work in progress?

> libhdfs++: Implement Python bindings
> 
>
> Key: HDFS-9758
> URL: https://issues.apache.org/jira/browse/HDFS-9758
> Project: Hadoop HDFS
>  Issue Type: Sub-task
>  Components: hdfs-client
>Reporter: James Clampffer
>Assignee: Tibor Kiss
> Attachments: hdfs_posix.py
>
>
> It'd be really useful to have bindings for various scripting languages.  
> Python would be a good start because of its popularity and how easy it is to 
> interact with shared libraries using the ctypes module.  I think bindings for 
> the V8 engine that nodeJS uses would be a close second in terms of expanding 
> the potential user base.
> Probably worth starting with just adding a synchronous API and building from 
> there to avoid interactions with Python's garbage collector until the 
> bindings prove to be solid.
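
A generic sketch of the synchronous-first approach suggested above: wrapping an asynchronous, callback-style call into a blocking one with std::promise/std::future. The AsyncExists/SyncExists names and signatures here are placeholders for illustration, not the actual libhdfs++ API.
{noformat}
#include <functional>
#include <future>
#include <string>

// Hypothetical async primitive: invokes the callback with a status code.
void AsyncExists(const std::string& path, std::function<void(int)> cb) {
  cb(path.empty() ? -1 : 0);  // stand-in for real asynchronous work
}

// Synchronous wrapper: block until the callback delivers the result.
int SyncExists(const std::string& path) {
  std::promise<int> promise;
  std::future<int> result = promise.get_future();
  AsyncExists(path, [&promise](int status) { promise.set_value(status); });
  return result.get();
}

int main() { return SyncExists("/tmp/foo"); }
{noformat}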



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: hdfs-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: hdfs-issues-h...@hadoop.apache.org



[jira] [Commented] (HDFS-10354) Fix compilation & unit test issues on Mac OS X with clang compiler

2016-05-04 Thread Tibor Kiss (JIRA)

[ 
https://issues.apache.org/jira/browse/HDFS-10354?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15270526#comment-15270526
 ] 

Tibor Kiss commented on HDFS-10354:
---

Thanks [~bobhansen] for the detailed review!

I'd propose continuing the discussion of the thread_local issue in this 
[ticket|https://issues.apache.org/jira/browse/HDFS-10355].
I've incorporated your suggestion regarding hdfsTell; it makes sense.

With regard to logging: I agree that casting at every use is error prone.
I've tried overloading with size_t (and unsigned long), without much success:
since size_t is a typedef for __SIZE_TYPE__ (which is a compiler macro), it will
always conflict with either uint32_t or uint64_t.

Further ideas on how to resolve this would be appreciated.
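
To make the conflict concrete, here is a minimal, self-contained reproduction; the LogMessage name and overload set are illustrative only, not the actual libhdfs++ logging API:
{noformat}
#include <cstdint>
#include <cstddef>

struct LogMessage {
  LogMessage& operator<<(std::uint32_t) { return *this; }
  LogMessage& operator<<(std::uint64_t) { return *this; }
  // Adding a size_t overload fails on platforms where size_t is the same type
  // as uint64_t (e.g. 'unsigned long' on LP64 Linux): it becomes a redefinition.
  // LogMessage& operator<<(std::size_t) { return *this; }
};

int main() {
  LogMessage msg;
  std::size_t n = 3;
  // Without a size_t overload, Apple clang rejects "msg << n" as ambiguous,
  // because size_t ('unsigned long' there) converts equally well to uint32_t
  // and uint64_t.  Widening explicitly at the call site resolves it, at the
  // cost of a cast at every use:
  msg << static_cast<std::uint64_t>(n);
  return 0;
}
{noformat}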


> Fix compilation & unit test issues on Mac OS X with clang compiler
> --
>
> Key: HDFS-10354
> URL: https://issues.apache.org/jira/browse/HDFS-10354
> Project: Hadoop HDFS
>  Issue Type: Sub-task
>  Components: hdfs-client
> Environment: OS X: 10.11
> clang: Apple LLVM version 7.0.2 (clang-700.1.81)
>Reporter: Tibor Kiss
>Assignee: Tibor Kiss
> Attachments: HDFS-10354.HDFS-8707.001.patch, 
> HDFS-10354.HDFS-8707.002.patch
>
>
> Compilation fails with multiple errors on Mac OS X.
> Unit test test_test_libhdfs_zerocopy_hdfs_static also fails to execute on OS 
> X.
> Compile error 1:
> {noformat}
>  [exec] Scanning dependencies of target common_obj
>  [exec] [ 45%] Building CXX object 
> main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/base64.cc.o
>  [exec] [ 45%] Building CXX object 
> main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/status.cc.o
>  [exec] [ 46%] Building CXX object 
> main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/sasl_digest_md5.cc.o
>  [exec] [ 46%] Building CXX object 
> main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/hdfs_public_api.cc.o
>  [exec] [ 47%] Building CXX object 
> main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/options.cc.o
>  [exec] [ 48%] Building CXX object 
> main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/configuration.cc.o
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/common/configuration.cc:85:12:
>  error: no viable conversion from 'optional' to 'optional'
>  [exec] return result;
>  [exec]^~
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:427:13:
>  note: candidate constructor not viable: no known conversion from 
> 'std::experimental::optional' to 'std::experimental::nullopt_t' for 1st 
> argument
>  [exec]   constexpr optional(nullopt_t) noexcept : OptionalBase() {};
>  [exec] ^
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:429:3:
>  note: candidate constructor not viable: no known conversion from 
> 'std::experimental::optional' to 'const 
> std::experimental::optional &' for 1st argument
>  [exec]   optional(const optional& rhs)
>  [exec]   ^
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:438:3:
>  note: candidate constructor not viable: no known conversion from 
> 'std::experimental::optional' to 'std::experimental::optional long> &&' for 1st argument
>  [exec]   optional(optional&& rhs) 
> noexcept(is_nothrow_move_constructible::value)
>  [exec]   ^
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:447:13:
>  note: candidate constructor not viable: no known conversion from 
> 'std::experimental::optional' to 'const long long &' for 1st argument
>  [exec]   constexpr optional(const T& v) : OptionalBase(v) {}
>  [exec] ^
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:449:13:
>  note: candidate constructor not viable: no known conversion from 
> 'std::experimental::optional' to 'long long &&' for 1st argument
>  [exec]   constexpr optional(T&& v) : OptionalBase(constexpr_move(v)) 
> {}
>  [exec] ^
>  [exec] 1 error generated.
>  [exec] make[2]: *** 
> [main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/configuration.cc.o]
>  Error 1
>  [exec] make[1]: *** 
> [main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/all] Error 2
>  [exec] make: *** 

[jira] [Updated] (HDFS-10355) Fix thread_local related build issue on Mac OS X

2016-05-03 Thread Tibor Kiss (JIRA)

 [ 
https://issues.apache.org/jira/browse/HDFS-10355?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tibor Kiss updated HDFS-10355:
--
Description: 
The native HDFS library uses C++11 features heavily.
One such feature is the thread_local storage class, which is supported by GCC,
Visual Studio and the community version of the clang compiler, but not by
Apple's clang (the default on OS X boxes).
See further details here: http://stackoverflow.com/a/29929949

Even though not many Hadoop clusters run on OS X, developers still use this
platform for development.

The problem can be solved in multiple ways:
 a) Stick to gcc/g++ or community-based clang on OS X. Developers will need
extra steps to build Hadoop.
 b) Work around thread_local with a helper class (see the sketch after this
list).
 c) Get rid of all the globals marked with thread_local. An interface change
will be required.
 d) Disable multi-threading support in the native client on OS X and document
this limitation.
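
A minimal sketch of option (b), assuming pthreads (which OS X provides): a helper class that keeps one heap-allocated string per thread under a pthread key. The class name and the errstr usage are illustrative only, not the actual libhdfs++ code.
{noformat}
#include <pthread.h>
#include <string>

// Emulates "thread_local std::string" on toolchains without thread_local
// support by storing a per-thread heap-allocated string under a pthread key.
class ThreadLocalString {
 public:
  ThreadLocalString() { pthread_key_create(&key_, &Destroy); }
  ~ThreadLocalString() { pthread_key_delete(key_); }

  std::string& Get() {
    auto* value = static_cast<std::string*>(pthread_getspecific(key_));
    if (value == nullptr) {
      value = new std::string();
      pthread_setspecific(key_, value);  // released by Destroy() at thread exit
    }
    return *value;
  }

 private:
  static void Destroy(void* value) { delete static_cast<std::string*>(value); }
  pthread_key_t key_;
};

// Usage replacing "thread_local std::string errstr;":
static ThreadLocalString errstr;
// errstr.Get() = "error message visible only to the current thread";
{noformat}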

Compile error related to thread_local:
{noformat}
 [exec] 
/Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/bindings/c/hdfs.cc:66:1:
 error: thread-local storage is not supported for the current target
 [exec] thread_local std::string errstr;
 [exec] ^
 [exec] 
/Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/bindings/c/hdfs.cc:87:1:
 error: thread-local storage is not supported for the current target
 [exec] thread_local std::experimental::optional 
fsEventCallback;
 [exec] ^
 [exec] 
/Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/bindings/c/hdfs.cc:88:1:
 error: thread-local storage is not supported for the current target
 [exec] thread_local std::experimental::optional 
fileEventCallback;
 [exec] ^
 [exec] 1 warning and 3 errors generated.
{noformat}

  was:
The native HDFS library uses C++11 features heavily.
One such feature is the thread_local storage class, which is supported by GCC,
Visual Studio and the community version of the clang compiler, but not by
Apple's clang (the default on OS X boxes).
See further details here: http://stackoverflow.com/a/29929949

Even though not many Hadoop clusters run on OS X, developers still use this
platform for development.

The problem can be solved in multiple ways:
 a) Stick to gcc/g++ or community-based clang on OS X. Developers will need
extra steps to build Hadoop.
 b) Work around thread_local with a helper class.
 c) Get rid of all the globals marked with thread_local. An interface change
will be required.
 d) Disable multi-threading support in the native client on OS X and document
this limitation.

Compile error related to thread_local:
{noformat}
 [exec] 
/Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/bindings/c/hdfs.cc:66:1:
 error: thread-local storage is not supported for the current target
 [exec] thread_local std::string errstr;
 [exec] ^
 [exec] 1 warning and 1 error generated.
 [exec] make[2]: *** 
[main/native/libhdfspp/lib/bindings/c/CMakeFiles/bindings_c_obj.dir/hdfs.cc.o] 
Error 1
 [exec] make[1]: *** 
[main/native/libhdfspp/lib/bindings/c/CMakeFiles/bindings_c_obj.dir/all] Error 2
 [exec] make: *** [all] Error 2
{noformat}


> Fix thread_local related build issue on Mac OS X
> 
>
> Key: HDFS-10355
> URL: https://issues.apache.org/jira/browse/HDFS-10355
> Project: Hadoop HDFS
>  Issue Type: Sub-task
>  Components: hdfs-client
> Environment: OS: Mac OS X 10.11
> clang: Apple LLVM version 7.0.2 (clang-700.1.81)
>Reporter: Tibor Kiss
>
> The native HDFS library uses C++11 features heavily.
> One such feature is the thread_local storage class, which is supported by GCC,
> Visual Studio and the community version of the clang compiler, but not by
> Apple's clang (the default on OS X boxes).
> See further details here: http://stackoverflow.com/a/29929949
> Even though not many Hadoop clusters run on OS X, developers still use this
> platform for development.
> The problem can be solved in multiple ways:
>  a) Stick to gcc/g++ or community-based clang on OS X. Developers will need
> extra steps to build Hadoop.
>  b) Work around thread_local with a helper class.
>  c) Get rid of all the globals marked with thread_local. An interface change
> will be required.
>  d) Disable multi-threading support in the native client on OS X and document
> this limitation.
> Compile error related to thread_local:
> {noformat}
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/bindings/c/hdfs.cc:66:1:
>  error: thread-local storage 

[jira] [Commented] (HDFS-10355) Fix thread_local related build issue on Mac OS X

2016-05-03 Thread Tibor Kiss (JIRA)

[ 
https://issues.apache.org/jira/browse/HDFS-10355?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15269166#comment-15269166
 ] 

Tibor Kiss commented on HDFS-10355:
---

Yes, it does have pthread support.

> Fix thread_local related build issue on Mac OS X
> 
>
> Key: HDFS-10355
> URL: https://issues.apache.org/jira/browse/HDFS-10355
> Project: Hadoop HDFS
>  Issue Type: Sub-task
>  Components: hdfs-client
> Environment: OS: Mac OS X 10.11
> clang: Apple LLVM version 7.0.2 (clang-700.1.81)
>Reporter: Tibor Kiss
>
> The native HDFS library uses C++11 features heavily.
> One such feature is the thread_local storage class, which is supported by GCC,
> Visual Studio and the community version of the clang compiler, but not by
> Apple's clang (the default on OS X boxes).
> See further details here: http://stackoverflow.com/a/29929949
> Even though not many Hadoop clusters run on OS X, developers still use this
> platform for development.
> The problem can be solved in multiple ways:
>  a) Stick to gcc/g++ or community-based clang on OS X. Developers will need
> extra steps to build Hadoop.
>  b) Work around thread_local with a helper class.
>  c) Get rid of all the globals marked with thread_local. An interface change
> will be required.
>  d) Disable multi-threading support in the native client on OS X and document
> this limitation.
> Compile error related to thread_local:
> {noformat}
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/bindings/c/hdfs.cc:66:1:
>  error: thread-local storage is not supported for the current target
>  [exec] thread_local std::string errstr;
>  [exec] ^
>  [exec] 1 warning and 1 error generated.
>  [exec] make[2]: *** 
> [main/native/libhdfspp/lib/bindings/c/CMakeFiles/bindings_c_obj.dir/hdfs.cc.o]
>  Error 1
>  [exec] make[1]: *** 
> [main/native/libhdfspp/lib/bindings/c/CMakeFiles/bindings_c_obj.dir/all] 
> Error 2
>  [exec] make: *** [all] Error 2
> {noformat}






[jira] [Updated] (HDFS-10354) Fix compilation & unit test issues on Mac OS X with clang compiler

2016-05-03 Thread Tibor Kiss (JIRA)

 [ 
https://issues.apache.org/jira/browse/HDFS-10354?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tibor Kiss updated HDFS-10354:
--
Description: 
Compilation fails with multiple errors on Mac OS X.
Unit test test_test_libhdfs_zerocopy_hdfs_static also fails to execute on OS X.

Compile error 1:
{noformat}
 [exec] Scanning dependencies of target common_obj
 [exec] [ 45%] Building CXX object 
main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/base64.cc.o
 [exec] [ 45%] Building CXX object 
main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/status.cc.o
 [exec] [ 46%] Building CXX object 
main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/sasl_digest_md5.cc.o
 [exec] [ 46%] Building CXX object 
main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/hdfs_public_api.cc.o
 [exec] [ 47%] Building CXX object 
main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/options.cc.o
 [exec] [ 48%] Building CXX object 
main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/configuration.cc.o
 [exec] 
/Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/common/configuration.cc:85:12:
 error: no viable conversion from 'optional' to 'optional'
 [exec] return result;
 [exec]^~
 [exec] 
/Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:427:13:
 note: candidate constructor not viable: no known conversion from 
'std::experimental::optional' to 'std::experimental::nullopt_t' for 1st 
argument
 [exec]   constexpr optional(nullopt_t) noexcept : OptionalBase() {};
 [exec] ^
 [exec] 
/Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:429:3:
 note: candidate constructor not viable: no known conversion from 
'std::experimental::optional' to 'const std::experimental::optional &' for 1st argument
 [exec]   optional(const optional& rhs)
 [exec]   ^
 [exec] 
/Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:438:3:
 note: candidate constructor not viable: no known conversion from 
'std::experimental::optional' to 'std::experimental::optional 
&&' for 1st argument
 [exec]   optional(optional&& rhs) 
noexcept(is_nothrow_move_constructible::value)
 [exec]   ^
 [exec] 
/Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:447:13:
 note: candidate constructor not viable: no known conversion from 
'std::experimental::optional' to 'const long long &' for 1st argument
 [exec]   constexpr optional(const T& v) : OptionalBase(v) {}
 [exec] ^
 [exec] 
/Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:449:13:
 note: candidate constructor not viable: no known conversion from 
'std::experimental::optional' to 'long long &&' for 1st argument
 [exec]   constexpr optional(T&& v) : OptionalBase(constexpr_move(v)) {}
 [exec] ^
 [exec] 1 error generated.
 [exec] make[2]: *** 
[main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/configuration.cc.o] 
Error 1
 [exec] make[1]: *** 
[main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/all] Error 2
 [exec] make: *** [all] Error 2
{noformat}
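
For reference on the first error: the candidate list above shows that the bundled tr2 optional (like std::experimental::optional) has no converting constructor from a differently parameterized optional, so the contained value has to be converted explicitly before being wrapped again. A self-contained illustration with a stripped-down stand-in type (names and types are illustrative, not the libhdfs++ code):
{noformat}
// Minimal stand-in for an optional type without converting constructors,
// reproducing the "no viable conversion from 'optional<U>' to 'optional<T>'"
// class of error shown above.
template <typename T>
struct Maybe {
  bool has_value = false;
  T value{};
  Maybe() = default;
  Maybe(const T& v) : has_value(true), value(v) {}  // only T itself converts in
};

Maybe<long long> Lookup(bool found) {
  Maybe<int> result;
  if (found) result = Maybe<int>(42);
  // return result;   // error: no viable conversion from 'Maybe<int>'
  //                  //        to 'Maybe<long long>'
  if (result.has_value) return Maybe<long long>(result.value);  // convert the value
  return Maybe<long long>();
}

int main() { return Lookup(true).has_value ? 0 : 1; }
{noformat}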

Compile error 2:
{noformat}
 [exec] 
/Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/fs/filesystem.cc:285:66:
 error: use of overloaded operator '<<' is ambiguous (with operand types 
'hdfs::LogMessage' and 'size_type' (aka 'unsigned long'))
 [exec]   << " Existing thread count = " << 
worker_threads_.size());
 [exec]   
~~~^~
{noformat}

There is an additional compile failure in the native client related to thread_local.
The complexity of the error warrants tracking that issue in a [separate 
ticket|https://issues.apache.org/jira/browse/HDFS-10355].
{noformat}
 [exec] 
/Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/bindings/c/hdfs.cc:66:1:
 error: thread-local storage is not supported for the current target
 [exec] thread_local std::string errstr;
 [exec] ^
 [exec] 1 warning and 1 error generated.
 [exec] make[2]: *** 
[main/native/libhdfspp/lib/bindings/c/CMakeFiles/bindings_c_obj.dir/hdfs.cc.o] 
Error 1
 [exec] make[1]: *** 
[main/native/libhdfspp/lib/bindings/c/CMakeFiles/bindings_c_obj.dir/all] Error 2
 

[jira] [Assigned] (HDFS-9758) libhdfs++: Implement Python bindings

2016-05-02 Thread Tibor Kiss (JIRA)

 [ 
https://issues.apache.org/jira/browse/HDFS-9758?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tibor Kiss reassigned HDFS-9758:


Assignee: Tibor Kiss

> libhdfs++: Implement Python bindings
> 
>
> Key: HDFS-9758
> URL: https://issues.apache.org/jira/browse/HDFS-9758
> Project: Hadoop HDFS
>  Issue Type: Sub-task
>  Components: hdfs-client
>Reporter: James Clampffer
>Assignee: Tibor Kiss
> Attachments: hdfs_posix.py
>
>
> It'd be really useful to have bindings for various scripting languages.  
> Python would be a good start because of its popularity and how easy it is to 
> interact with shared libraries using the ctypes module.  I think bindings for 
> the V8 engine that nodeJS uses would be a close second in terms of expanding 
> the potential user base.
> Probably worth starting with just adding a synchronous API and building from 
> there to avoid interactions with Python's garbage collector until the 
> bindings prove to be solid.






[jira] [Updated] (HDFS-10354) Fix compilation & unit test issues on Mac OS X with clang compiler

2016-05-02 Thread Tibor Kiss (JIRA)

 [ 
https://issues.apache.org/jira/browse/HDFS-10354?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tibor Kiss updated HDFS-10354:
--
Attachment: HDFS-10354.HDFS-8707.002.patch

> Fix compilation & unit test issues on Mac OS X with clang compiler
> --
>
> Key: HDFS-10354
> URL: https://issues.apache.org/jira/browse/HDFS-10354
> Project: Hadoop HDFS
>  Issue Type: Sub-task
>  Components: hdfs-client
> Environment: OS X: 10.11
> clang: Apple LLVM version 7.0.2 (clang-700.1.81)
>Reporter: Tibor Kiss
>Assignee: Tibor Kiss
> Attachments: HDFS-10354.HDFS-8707.001.patch, 
> HDFS-10354.HDFS-8707.002.patch
>
>
> Compilation fails with multiple errors on Mac OS X.
> Unit test test_test_libhdfs_zerocopy_hdfs_static also fails to execute on OS 
> X.
> Compile error 1:
> {noformat}
>  [exec] Scanning dependencies of target common_obj
>  [exec] [ 45%] Building CXX object 
> main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/base64.cc.o
>  [exec] [ 45%] Building CXX object 
> main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/status.cc.o
>  [exec] [ 46%] Building CXX object 
> main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/sasl_digest_md5.cc.o
>  [exec] [ 46%] Building CXX object 
> main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/hdfs_public_api.cc.o
>  [exec] [ 47%] Building CXX object 
> main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/options.cc.o
>  [exec] [ 48%] Building CXX object 
> main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/configuration.cc.o
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/common/configuration.cc:85:12:
>  error: no viable conversion from 'optional' to 'optional'
>  [exec] return result;
>  [exec]^~
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:427:13:
>  note: candidate constructor not viable: no known conversion from 
> 'std::experimental::optional' to 'std::experimental::nullopt_t' for 1st 
> argument
>  [exec]   constexpr optional(nullopt_t) noexcept : OptionalBase() {};
>  [exec] ^
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:429:3:
>  note: candidate constructor not viable: no known conversion from 
> 'std::experimental::optional' to 'const 
> std::experimental::optional &' for 1st argument
>  [exec]   optional(const optional& rhs)
>  [exec]   ^
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:438:3:
>  note: candidate constructor not viable: no known conversion from 
> 'std::experimental::optional' to 'std::experimental::optional long> &&' for 1st argument
>  [exec]   optional(optional&& rhs) 
> noexcept(is_nothrow_move_constructible::value)
>  [exec]   ^
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:447:13:
>  note: candidate constructor not viable: no known conversion from 
> 'std::experimental::optional' to 'const long long &' for 1st argument
>  [exec]   constexpr optional(const T& v) : OptionalBase(v) {}
>  [exec] ^
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:449:13:
>  note: candidate constructor not viable: no known conversion from 
> 'std::experimental::optional' to 'long long &&' for 1st argument
>  [exec]   constexpr optional(T&& v) : OptionalBase(constexpr_move(v)) 
> {}
>  [exec] ^
>  [exec] 1 error generated.
>  [exec] make[2]: *** 
> [main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/configuration.cc.o]
>  Error 1
>  [exec] make[1]: *** 
> [main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/all] Error 2
>  [exec] make: *** [all] Error 2
> {noformat}
> Compile error 2:
> {noformat}
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/fs/filesystem.cc:285:66:
>  error: use of overloaded operator '<<' is ambiguous (with operand types 
> 'hdfs::LogMessage' and 'size_type' (aka 'unsigned long'))
>  [exec]   << " Existing thread count = " 
> << worker_threads_.size());
>  [exec]   
> ~~~^~
> 

[jira] [Created] (HDFS-10355) Fix thread_local related build issue on Mac OS X

2016-05-02 Thread Tibor Kiss (JIRA)
Tibor Kiss created HDFS-10355:
-

 Summary: Fix thread_local related build issue on Mac OS X
 Key: HDFS-10355
 URL: https://issues.apache.org/jira/browse/HDFS-10355
 Project: Hadoop HDFS
  Issue Type: Sub-task
  Components: hdfs-client
 Environment: OS: Mac OS X 10.11
clang: Apple LLVM version 7.0.2 (clang-700.1.81)
Reporter: Tibor Kiss


The native HDFS library uses C++11 features heavily.
One such feature is the thread_local storage class, which is supported by GCC,
Visual Studio and the community version of the clang compiler, but not by
Apple's clang (the default on OS X boxes).
See further details here: http://stackoverflow.com/a/29929949

Even though not many Hadoop clusters run on OS X, developers still use this
platform for development.

The problem can be solved in multiple ways:
 a) Stick to gcc/g++ or community-based clang on OS X. Developers will need
extra steps to build Hadoop.
 b) Work around thread_local with a helper class.
 c) Get rid of all the globals marked with thread_local. An interface change
will be required.
 d) Disable multi-threading support in the native client on OS X and document
this limitation.

Compile error related to thread_local:
{noformat}
 [exec] 
/Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/bindings/c/hdfs.cc:66:1:
 error: thread-local storage is not supported for the current target
 [exec] thread_local std::string errstr;
 [exec] ^
 [exec] 1 warning and 1 error generated.
 [exec] make[2]: *** 
[main/native/libhdfspp/lib/bindings/c/CMakeFiles/bindings_c_obj.dir/hdfs.cc.o] 
Error 1
 [exec] make[1]: *** 
[main/native/libhdfspp/lib/bindings/c/CMakeFiles/bindings_c_obj.dir/all] Error 2
 [exec] make: *** [all] Error 2
{noformat}






[jira] [Updated] (HDFS-10354) Fix compilation & unit test issues on Mac OS X with clang compiler

2016-05-02 Thread Tibor Kiss (JIRA)

 [ 
https://issues.apache.org/jira/browse/HDFS-10354?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tibor Kiss updated HDFS-10354:
--
Attachment: HDFS-10354.HDFS-8707.001.patch

> Fix compilation & unit test issues on Mac OS X with clang compiler
> --
>
> Key: HDFS-10354
> URL: https://issues.apache.org/jira/browse/HDFS-10354
> Project: Hadoop HDFS
>  Issue Type: Sub-task
>  Components: hdfs-client
> Environment: OS X: 10.11
> clang: Apple LLVM version 7.0.2 (clang-700.1.81)
>Reporter: Tibor Kiss
>Assignee: Tibor Kiss
> Attachments: HDFS-10354.HDFS-8707.001.patch
>
>
> Compilation fails with multiple errors on Mac OS X.
> Unit test test_test_libhdfs_zerocopy_hdfs_static also fails to execute on OS 
> X.
> Compile error 1:
> {noformat}
>  [exec] Scanning dependencies of target common_obj
>  [exec] [ 45%] Building CXX object 
> main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/base64.cc.o
>  [exec] [ 45%] Building CXX object 
> main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/status.cc.o
>  [exec] [ 46%] Building CXX object 
> main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/sasl_digest_md5.cc.o
>  [exec] [ 46%] Building CXX object 
> main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/hdfs_public_api.cc.o
>  [exec] [ 47%] Building CXX object 
> main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/options.cc.o
>  [exec] [ 48%] Building CXX object 
> main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/configuration.cc.o
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/common/configuration.cc:85:12:
>  error: no viable conversion from 'optional' to 'optional'
>  [exec] return result;
>  [exec]^~
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:427:13:
>  note: candidate constructor not viable: no known conversion from 
> 'std::experimental::optional' to 'std::experimental::nullopt_t' for 1st 
> argument
>  [exec]   constexpr optional(nullopt_t) noexcept : OptionalBase() {};
>  [exec] ^
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:429:3:
>  note: candidate constructor not viable: no known conversion from 
> 'std::experimental::optional' to 'const 
> std::experimental::optional &' for 1st argument
>  [exec]   optional(const optional& rhs)
>  [exec]   ^
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:438:3:
>  note: candidate constructor not viable: no known conversion from 
> 'std::experimental::optional' to 'std::experimental::optional long> &&' for 1st argument
>  [exec]   optional(optional&& rhs) 
> noexcept(is_nothrow_move_constructible::value)
>  [exec]   ^
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:447:13:
>  note: candidate constructor not viable: no known conversion from 
> 'std::experimental::optional' to 'const long long &' for 1st argument
>  [exec]   constexpr optional(const T& v) : OptionalBase(v) {}
>  [exec] ^
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:449:13:
>  note: candidate constructor not viable: no known conversion from 
> 'std::experimental::optional' to 'long long &&' for 1st argument
>  [exec]   constexpr optional(T&& v) : OptionalBase(constexpr_move(v)) 
> {}
>  [exec] ^
>  [exec] 1 error generated.
>  [exec] make[2]: *** 
> [main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/configuration.cc.o]
>  Error 1
>  [exec] make[1]: *** 
> [main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/all] Error 2
>  [exec] make: *** [all] Error 2
> {noformat}
> Compile error 2:
> {noformat}
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/fs/filesystem.cc:285:66:
>  error: use of overloaded operator '<<' is ambiguous (with operand types 
> 'hdfs::LogMessage' and 'size_type' (aka 'unsigned long'))
>  [exec]   << " Existing thread count = " 
> << worker_threads_.size());
>  [exec]   
> ~~~^~
> {noformat}
> There is an addition 

[jira] [Updated] (HDFS-10354) Fix compilation & unit test issues on Mac OS X with clang compiler

2016-05-02 Thread Tibor Kiss (JIRA)

 [ 
https://issues.apache.org/jira/browse/HDFS-10354?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tibor Kiss updated HDFS-10354:
--
Status: Patch Available  (was: Open)

> Fix compilation & unit test issues on Mac OS X with clang compiler
> --
>
> Key: HDFS-10354
> URL: https://issues.apache.org/jira/browse/HDFS-10354
> Project: Hadoop HDFS
>  Issue Type: Sub-task
>  Components: hdfs-client
> Environment: OS X: 10.11
> clang: Apple LLVM version 7.0.2 (clang-700.1.81)
>Reporter: Tibor Kiss
>Assignee: Tibor Kiss
> Attachments: HDFS-10354.HDFS-8707.001.patch
>
>
> Compilation fails with multiple errors on Mac OS X.
> Unit test test_test_libhdfs_zerocopy_hdfs_static also fails to execute on OS 
> X.
> Compile error 1:
> {noformat}
>  [exec] Scanning dependencies of target common_obj
>  [exec] [ 45%] Building CXX object 
> main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/base64.cc.o
>  [exec] [ 45%] Building CXX object 
> main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/status.cc.o
>  [exec] [ 46%] Building CXX object 
> main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/sasl_digest_md5.cc.o
>  [exec] [ 46%] Building CXX object 
> main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/hdfs_public_api.cc.o
>  [exec] [ 47%] Building CXX object 
> main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/options.cc.o
>  [exec] [ 48%] Building CXX object 
> main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/configuration.cc.o
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/common/configuration.cc:85:12:
>  error: no viable conversion from 'optional' to 'optional'
>  [exec] return result;
>  [exec]^~
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:427:13:
>  note: candidate constructor not viable: no known conversion from 
> 'std::experimental::optional' to 'std::experimental::nullopt_t' for 1st 
> argument
>  [exec]   constexpr optional(nullopt_t) noexcept : OptionalBase() {};
>  [exec] ^
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:429:3:
>  note: candidate constructor not viable: no known conversion from 
> 'std::experimental::optional' to 'const 
> std::experimental::optional &' for 1st argument
>  [exec]   optional(const optional& rhs)
>  [exec]   ^
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:438:3:
>  note: candidate constructor not viable: no known conversion from 
> 'std::experimental::optional' to 'std::experimental::optional long> &&' for 1st argument
>  [exec]   optional(optional&& rhs) 
> noexcept(is_nothrow_move_constructible::value)
>  [exec]   ^
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:447:13:
>  note: candidate constructor not viable: no known conversion from 
> 'std::experimental::optional' to 'const long long &' for 1st argument
>  [exec]   constexpr optional(const T& v) : OptionalBase(v) {}
>  [exec] ^
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:449:13:
>  note: candidate constructor not viable: no known conversion from 
> 'std::experimental::optional' to 'long long &&' for 1st argument
>  [exec]   constexpr optional(T&& v) : OptionalBase(constexpr_move(v)) 
> {}
>  [exec] ^
>  [exec] 1 error generated.
>  [exec] make[2]: *** 
> [main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/configuration.cc.o]
>  Error 1
>  [exec] make[1]: *** 
> [main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/all] Error 2
>  [exec] make: *** [all] Error 2
> {noformat}
> Compile error 2:
> {noformat}
>  [exec] 
> /Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/fs/filesystem.cc:285:66:
>  error: use of overloaded operator '<<' is ambiguous (with operand types 
> 'hdfs::LogMessage' and 'size_type' (aka 'unsigned long'))
>  [exec]   << " Existing thread count = " 
> << worker_threads_.size());
>  [exec]   
> ~~~^~
> {noformat}
> There is an addition compile 

[jira] [Updated] (HDFS-10354) Fix compilation & unit test issues on Mac OS X with clang compiler

2016-05-02 Thread Tibor Kiss (JIRA)

 [ 
https://issues.apache.org/jira/browse/HDFS-10354?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tibor Kiss updated HDFS-10354:
--
Description: 
Compilation fails with multiple errors on Mac OS X.
Unit test test_test_libhdfs_zerocopy_hdfs_static also fails to execute on OS X.

Compile error 1:
{noformat}
 [exec] Scanning dependencies of target common_obj
 [exec] [ 45%] Building CXX object 
main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/base64.cc.o
 [exec] [ 45%] Building CXX object 
main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/status.cc.o
 [exec] [ 46%] Building CXX object 
main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/sasl_digest_md5.cc.o
 [exec] [ 46%] Building CXX object 
main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/hdfs_public_api.cc.o
 [exec] [ 47%] Building CXX object 
main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/options.cc.o
 [exec] [ 48%] Building CXX object 
main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/configuration.cc.o
 [exec] 
/Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/common/configuration.cc:85:12:
 error: no viable conversion from 'optional' to 'optional'
 [exec] return result;
 [exec]^~
 [exec] 
/Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:427:13:
 note: candidate constructor not viable: no known conversion from 
'std::experimental::optional' to 'std::experimental::nullopt_t' for 1st 
argument
 [exec]   constexpr optional(nullopt_t) noexcept : OptionalBase() {};
 [exec] ^
 [exec] 
/Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:429:3:
 note: candidate constructor not viable: no known conversion from 
'std::experimental::optional' to 'const std::experimental::optional &' for 1st argument
 [exec]   optional(const optional& rhs)
 [exec]   ^
 [exec] 
/Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:438:3:
 note: candidate constructor not viable: no known conversion from 
'std::experimental::optional' to 'std::experimental::optional 
&&' for 1st argument
 [exec]   optional(optional&& rhs) 
noexcept(is_nothrow_move_constructible::value)
 [exec]   ^
 [exec] 
/Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:447:13:
 note: candidate constructor not viable: no known conversion from 
'std::experimental::optional' to 'const long long &' for 1st argument
 [exec]   constexpr optional(const T& v) : OptionalBase(v) {}
 [exec] ^
 [exec] 
/Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:449:13:
 note: candidate constructor not viable: no known conversion from 
'std::experimental::optional' to 'long long &&' for 1st argument
 [exec]   constexpr optional(T&& v) : OptionalBase(constexpr_move(v)) {}
 [exec] ^
 [exec] 1 error generated.
 [exec] make[2]: *** 
[main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/configuration.cc.o] 
Error 1
 [exec] make[1]: *** 
[main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/all] Error 2
 [exec] make: *** [all] Error 2
{noformat}

Compile error 2:
{noformat}
 [exec] 
/Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/fs/filesystem.cc:285:66:
 error: use of overloaded operator '<<' is ambiguous (with operand types 
'hdfs::LogMessage' and 'size_type' (aka 'unsigned long'))
 [exec]   << " Existing thread count = " << 
worker_threads_.size());
 [exec]   
~~~^~
{noformat}

There is an additional compile failure in the native client related to thread_local.
The complexity of the error warrants tracking that issue in a separate ticket.
{noformat}
 [exec] 
/Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/bindings/c/hdfs.cc:66:1:
 error: thread-local storage is not supported for the current target
 [exec] thread_local std::string errstr;
 [exec] ^
 [exec] 1 warning and 1 error generated.
 [exec] make[2]: *** 
[main/native/libhdfspp/lib/bindings/c/CMakeFiles/bindings_c_obj.dir/hdfs.cc.o] 
Error 1
 [exec] make[1]: *** 
[main/native/libhdfspp/lib/bindings/c/CMakeFiles/bindings_c_obj.dir/all] Error 2
 [exec] make: *** [all] Error 2
{noformat}


Unit 

[jira] [Created] (HDFS-10354) Fix compilation & unit test issues on Mac OS X with clang compiler

2016-05-02 Thread Tibor Kiss (JIRA)
Tibor Kiss created HDFS-10354:
-

 Summary: Fix compilation & unit test issues on Mac OS X with clang 
compiler
 Key: HDFS-10354
 URL: https://issues.apache.org/jira/browse/HDFS-10354
 Project: Hadoop HDFS
  Issue Type: Sub-task
  Components: hdfs-client
 Environment: OS X: 10.11
clang: Apple LLVM version 7.0.2 (clang-700.1.81)
Reporter: Tibor Kiss
Assignee: Tibor Kiss


Compilation fails with multiple errors on Mac OS X.
Unit test test_test_libhdfs_zerocopy_hdfs_static also fails to execute on OS X.

Compile error 1:
{noformat}
 [exec] Scanning dependencies of target common_obj
 [exec] [ 45%] Building CXX object 
main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/base64.cc.o
 [exec] [ 45%] Building CXX object 
main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/status.cc.o
 [exec] [ 46%] Building CXX object 
main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/sasl_digest_md5.cc.o
 [exec] [ 46%] Building CXX object 
main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/hdfs_public_api.cc.o
 [exec] [ 47%] Building CXX object 
main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/options.cc.o
 [exec] [ 48%] Building CXX object 
main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/configuration.cc.o
 [exec] 
/Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/common/configuration.cc:85:12:
 error: no viable conversion from 'optional' to 'optional'
 [exec] return result;
 [exec]^~
 [exec] 
/Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:427:13:
 note: candidate constructor not viable: no known conversion from 
'std::experimental::optional' to 'std::experimental::nullopt_t' for 1st 
argument
 [exec]   constexpr optional(nullopt_t) noexcept : OptionalBase() {};
 [exec] ^
 [exec] 
/Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:429:3:
 note: candidate constructor not viable: no known conversion from 
'std::experimental::optional' to 'const std::experimental::optional &' for 1st argument
 [exec]   optional(const optional& rhs)
 [exec]   ^
 [exec] 
/Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:438:3:
 note: candidate constructor not viable: no known conversion from 
'std::experimental::optional' to 'std::experimental::optional 
&&' for 1st argument
 [exec]   optional(optional&& rhs) 
noexcept(is_nothrow_move_constructible::value)
 [exec]   ^
 [exec] 
/Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:447:13:
 note: candidate constructor not viable: no known conversion from 
'std::experimental::optional' to 'const long long &' for 1st argument
 [exec]   constexpr optional(const T& v) : OptionalBase(v) {}
 [exec] ^
 [exec] 
/Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp:449:13:
 note: candidate constructor not viable: no known conversion from 
'std::experimental::optional' to 'long long &&' for 1st argument
 [exec]   constexpr optional(T&& v) : OptionalBase(constexpr_move(v)) {}
 [exec] ^
 [exec] 1 error generated.
 [exec] make[2]: *** 
[main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/configuration.cc.o] 
Error 1
 [exec] make[1]: *** 
[main/native/libhdfspp/lib/common/CMakeFiles/common_obj.dir/all] Error 2
 [exec] make: *** [all] Error 2
{noformat}

Compile error 2:
{noformat}
 [exec] 
/Users/tiborkiss/workspace/apache-hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/fs/filesystem.cc:285:66:
 error: use of overloaded operator '<<' is ambiguous (with operand types 
'hdfs::LogMessage' and 'size_type' (aka 'unsigned long'))
 [exec]   << " Existing thread count = " << 
worker_threads_.size());
 [exec]   
~~~^~
{noformat}

There is an additional compile failure in the native client related to thread_local.
The complexity of the error warrants tracking that issue in a separate ticket.

Unit test failure:
{noformat}
 [exec]  2/16 Test  #2: test_test_libhdfs_zerocopy_hdfs_static 
..***Failed2.07 sec
 [exec] log4j:WARN No appenders could be found for logger 
(org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
 [exec] log4j:WARN Please initialize 

[jira] [Commented] (HDFS-10332) hdfs-native-client fails to build with CMake 2.8.11 or earlier

2016-04-29 Thread Tibor Kiss (JIRA)

[ 
https://issues.apache.org/jira/browse/HDFS-10332?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15263707#comment-15263707
 ] 

Tibor Kiss commented on HDFS-10332:
---

Thanks for committing, James!

> hdfs-native-client fails to build with CMake 2.8.11 or earlier
> --
>
> Key: HDFS-10332
> URL: https://issues.apache.org/jira/browse/HDFS-10332
> Project: Hadoop HDFS
>  Issue Type: Sub-task
>  Components: hdfs-client
>Reporter: Tibor Kiss
>Assignee: Tibor Kiss
>Priority: Minor
> Attachments: HDFS-10332.01.patch, HDFS-10332.HDFS-8707.001.patch
>
>
> Due to new syntax introduced in CMake 2.8.12 (the DIRECTORY component of 
> get_filename_component), the native client won't build with CMake 2.8.11 or earlier.
> Currently RHEL 6 & 7 are using older versions of CMake. 
> Error log:
> {noformat}
> [INFO] --- maven-antrun-plugin:1.7:run (make) @ hadoop-hdfs-native-client ---
> [INFO] Executing tasks
> main:
>  [exec] JAVA_HOME=, 
> JAVA_JVM_LIBRARY=/usr/java/jdk1.7.0_79/jre/lib/amd64/server/libjvm.so
>  [exec] JAVA_INCLUDE_PATH=/usr/java/jdk1.7.0_79/include, 
> JAVA_INCLUDE_PATH2=/usr/java/jdk1.7.0_79/include/linux
>  [exec] Located all JNI components successfully.
>  [exec] -- Could NOT find PROTOBUF (missing:  PROTOBUF_LIBRARY 
> PROTOBUF_INCLUDE_DIR)
>  [exec] -- valgrind location: MEMORYCHECK_COMMAND-NOTFOUND
>  [exec] -- checking for module 'fuse'
>  [exec] --   package 'fuse' not found
>  [exec] -- Failed to find Linux FUSE libraries or include files.  Will 
> not build FUSE client.
>  [exec] -- Configuring incomplete, errors occurred!
>  [exec] CMake Error at main/native/libhdfspp/CMakeLists.txt:71 
> (get_filename_component):
>  [exec]   get_filename_component unknown component DIRECTORY
>  [exec] Call Stack (most recent call first):
>  [exec]   main/native/libhdfspp/CMakeLists.txt:95 (copy_on_demand)
>  [exec]
>  [exec]
>  [exec] CMake Error at main/native/libhdfspp/CMakeLists.txt:71 
> (get_filename_component):
>  [exec]   get_filename_component unknown component DIRECTORY
>  [exec] Call Stack (most recent call first):
>  [exec]   main/native/libhdfspp/CMakeLists.txt:96 (copy_on_demand)
>  [exec]
>  [exec]
>  [exec] CMake Error at main/native/libhdfspp/CMakeLists.txt:71 
> (get_filename_component):
>  [exec]   get_filename_component unknown component DIRECTORY
>  [exec] Call Stack (most recent call first):
>  [exec]   main/native/libhdfspp/CMakeLists.txt:97 (copy_on_demand)
>  [exec]
>  [exec]
>  [exec] CMake Error at main/native/libhdfspp/CMakeLists.txt:71 
> (get_filename_component):
>  [exec]   get_filename_component unknown component DIRECTORY
>  [exec] Call Stack (most recent call first):
>  [exec]   main/native/libhdfspp/CMakeLists.txt:98 (copy_on_demand)
>  [exec]
>  [exec]
>  [exec] CMake Error: The following variables are used in this project, 
> but they are set to NOTFOUND.
>  [exec] Please set them or make sure they are set and tested correctly in 
> the CMake files:
>  [exec] PROTOBUF_LIBRARY (ADVANCED)
>  [exec] linked by target "hdfspp" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp
>  [exec] linked by target "hdfspp_static" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp
>  [exec] linked by target "protoc-gen-hrpc" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/proto
>  [exec] linked by target "bad_datanode_test" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
>  [exec] linked by target "hdfs_builder_test" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
>  [exec] linked by target "hdfspp_errors_test" in directory 
> /home/tiborkiss/devel/workspace
>  [exec] 
> /hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
>  [exec] linked by target "libhdfs_threaded_hdfspp_test_shim_static" 
> in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/test
>  [exec] linked by target "logging_test" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
>  [exec] linked by target "node_exclusion_test" in directory 
> 

[jira] [Commented] (HDFS-9758) libhdfs++: Implement Python bindings

2016-04-29 Thread Tibor Kiss (JIRA)

[ 
https://issues.apache.org/jira/browse/HDFS-9758?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15263651#comment-15263651
 ] 

Tibor Kiss commented on HDFS-9758:
--

Thanks for the detailed comments, [~James Clampffer]!

I've proposed Cython, cppyy and Boost.Python simply because I thought we'd 
like to expose the object-oriented interface to Python.
If that's not the case, ctypes is surely the simplest option, without imposing 
any dependency.

Cython would be beneficial if we either needed OO interfaces or preferred 
performance over programmer productivity.

As you have already implemented a reasonable amount of code for ctypes, I think 
it is best to extend that further.
If you like, I can pick up where you left off, finish the remaining calls, and 
include Python tests.


> libhdfs++: Implement Python bindings
> 
>
> Key: HDFS-9758
> URL: https://issues.apache.org/jira/browse/HDFS-9758
> Project: Hadoop HDFS
>  Issue Type: Sub-task
>  Components: hdfs-client
>Reporter: James Clampffer
> Attachments: hdfs_posix.py
>
>
> It'd be really useful to have bindings for various scripting languages.  
> Python would be a good start because of its popularity and how easy it is to 
> interact with shared libraries using the ctypes module.  I think bindings for 
> the V8 engine that nodeJS uses would be a close second in terms of expanding 
> the potential user base.
> Probably worth starting with just adding a synchronous API and building from 
> there to avoid interactions with Python's garbage collector until the 
> bindings prove to be solid.






[jira] [Updated] (HDFS-10332) hdfs-native-client fails to build with CMake 2.8.11 or earlier

2016-04-27 Thread Tibor Kiss (JIRA)

 [ 
https://issues.apache.org/jira/browse/HDFS-10332?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tibor Kiss updated HDFS-10332:
--
Attachment: (was: HDFS-10332-HDFS-8707.001.patch)

> hdfs-native-client fails to build with CMake 2.8.11 or earlier
> --
>
> Key: HDFS-10332
> URL: https://issues.apache.org/jira/browse/HDFS-10332
> Project: Hadoop HDFS
>  Issue Type: Sub-task
>  Components: hdfs-client
>Reporter: Tibor Kiss
>Assignee: Tibor Kiss
>Priority: Minor
> Attachments: HDFS-10332.01.patch, HDFS-10332.HDFS-8707.001.patch
>
>
> Due to new syntax introduced in CMake 2.8.12 (the DIRECTORY component of 
> get_filename_component), the native client won't build with CMake 2.8.11 or earlier.
> Currently RHEL 6 & 7 are using older versions of CMake. 
> Error log:
> {noformat}
> [INFO] --- maven-antrun-plugin:1.7:run (make) @ hadoop-hdfs-native-client ---
> [INFO] Executing tasks
> main:
>  [exec] JAVA_HOME=, 
> JAVA_JVM_LIBRARY=/usr/java/jdk1.7.0_79/jre/lib/amd64/server/libjvm.so
>  [exec] JAVA_INCLUDE_PATH=/usr/java/jdk1.7.0_79/include, 
> JAVA_INCLUDE_PATH2=/usr/java/jdk1.7.0_79/include/linux
>  [exec] Located all JNI components successfully.
>  [exec] -- Could NOT find PROTOBUF (missing:  PROTOBUF_LIBRARY 
> PROTOBUF_INCLUDE_DIR)
>  [exec] -- valgrind location: MEMORYCHECK_COMMAND-NOTFOUND
>  [exec] -- checking for module 'fuse'
>  [exec] --   package 'fuse' not found
>  [exec] -- Failed to find Linux FUSE libraries or include files.  Will 
> not build FUSE client.
>  [exec] -- Configuring incomplete, errors occurred!
>  [exec] CMake Error at main/native/libhdfspp/CMakeLists.txt:71 
> (get_filename_component):
>  [exec]   get_filename_component unknown component DIRECTORY
>  [exec] Call Stack (most recent call first):
>  [exec]   main/native/libhdfspp/CMakeLists.txt:95 (copy_on_demand)
>  [exec]
>  [exec]
>  [exec] CMake Error at main/native/libhdfspp/CMakeLists.txt:71 
> (get_filename_component):
>  [exec]   get_filename_component unknown component DIRECTORY
>  [exec] Call Stack (most recent call first):
>  [exec]   main/native/libhdfspp/CMakeLists.txt:96 (copy_on_demand)
>  [exec]
>  [exec]
>  [exec] CMake Error at main/native/libhdfspp/CMakeLists.txt:71 
> (get_filename_component):
>  [exec]   get_filename_component unknown component DIRECTORY
>  [exec] Call Stack (most recent call first):
>  [exec]   main/native/libhdfspp/CMakeLists.txt:97 (copy_on_demand)
>  [exec]
>  [exec]
>  [exec] CMake Error at main/native/libhdfspp/CMakeLists.txt:71 
> (get_filename_component):
>  [exec]   get_filename_component unknown component DIRECTORY
>  [exec] Call Stack (most recent call first):
>  [exec]   main/native/libhdfspp/CMakeLists.txt:98 (copy_on_demand)
>  [exec]
>  [exec]
>  [exec] CMake Error: The following variables are used in this project, 
> but they are set to NOTFOUND.
>  [exec] Please set them or make sure they are set and tested correctly in 
> the CMake files:
>  [exec] PROTOBUF_LIBRARY (ADVANCED)
>  [exec] linked by target "hdfspp" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp
>  [exec] linked by target "hdfspp_static" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp
>  [exec] linked by target "protoc-gen-hrpc" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/proto
>  [exec] linked by target "bad_datanode_test" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
>  [exec] linked by target "hdfs_builder_test" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
>  [exec] linked by target "hdfspp_errors_test" in directory 
> /home/tiborkiss/devel/workspace
>  [exec] 
> /hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
>  [exec] linked by target "libhdfs_threaded_hdfspp_test_shim_static" 
> in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/test
>  [exec] linked by target "logging_test" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
>  [exec] linked by target "node_exclusion_test" in directory 
> 

[jira] [Commented] (HDFS-9758) libhdfs++: Implement Python bindings

2016-04-27 Thread Tibor Kiss (JIRA)

[ 
https://issues.apache.org/jira/browse/HDFS-9758?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15259810#comment-15259810
 ] 

Tibor Kiss commented on HDFS-9758:
--

We have several options to implement Python bindings for the pure C++ HDFS 
Client:
 - CFFI (MIT License)
 - cppyy (MIT License)
 - Ctypes (MIT License)
 - Cython (Apache License)
 - SWIG (GPL License)
 - Boost.Python (Boost Software License)
 - pure Python extensions

While CFFI is simple and clean, it does not support C++.
cppyy would be a great choice, but it only supports PyPy at this time.
ctypes has been part of CPython since 2.5, but its C++ support is not great.
Cython supports both C and C++ and seems a reasonable choice.
SWIG also supports C and C++, and it could later be used to bring support for
other scripting languages. Its licensing could be a problem.
Boost.Python seems to have great C++ support at first glance. Its license
needs to be studied.

Thoughts / feelings / preferences?
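
To make the Ctypes option concrete, below is a minimal sketch of what a binding 
could look like. It assumes the libhdfs++ build produces a shared library 
(named libhdfspp.so here) that exposes the classic libhdfs-style C entry points 
hdfsConnect, hdfsExists and hdfsDisconnect; the library name, the NameNode 
address/port and the set of exported symbols are illustrative assumptions, not 
a statement about the current state of the C shim.

{noformat}
import ctypes

# Load the shared library (name assumed; adjust to the actual build output).
lib = ctypes.CDLL("libhdfspp.so")

# Declare the signatures we rely on (libhdfs-style C API).
lib.hdfsConnect.restype = ctypes.c_void_p              # opaque hdfsFS handle
lib.hdfsConnect.argtypes = [ctypes.c_char_p, ctypes.c_uint16]
lib.hdfsExists.restype = ctypes.c_int                  # 0 if the path exists
lib.hdfsExists.argtypes = [ctypes.c_void_p, ctypes.c_char_p]
lib.hdfsDisconnect.restype = ctypes.c_int
lib.hdfsDisconnect.argtypes = [ctypes.c_void_p]

fs = lib.hdfsConnect(b"localhost", 8020)
if not fs:
    raise RuntimeError("hdfsConnect failed")
try:
    print("/tmp exists:", lib.hdfsExists(fs, b"/tmp") == 0)
finally:
    lib.hdfsDisconnect(fs)
{noformat}

Starting with a synchronous wrapper along these lines keeps the binding layer 
trivial and avoids callback/garbage-collector interactions on the Python side.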

> libhdfs++: Implement Python bindings
> 
>
> Key: HDFS-9758
> URL: https://issues.apache.org/jira/browse/HDFS-9758
> Project: Hadoop HDFS
>  Issue Type: Sub-task
>  Components: hdfs-client
>Reporter: James Clampffer
>
> It'd be really useful to have bindings for various scripting languages.  
> Python would be a good start because of its popularity and how easy it is to 
> interact with shared libraries using the ctypes module.  I think bindings for 
> the V8 engine that nodeJS uses would be a close second in terms of expanding 
> the potential user base.
> Probably worth starting with just adding a synchronous API and building from 
> there to avoid interactions with python's garbage collector until the 
> bindings prove to be solid.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HDFS-10332) hdfs-native-client fails to build with CMake 2.8.11 or earlier

2016-04-26 Thread Tibor Kiss (JIRA)

 [ 
https://issues.apache.org/jira/browse/HDFS-10332?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tibor Kiss updated HDFS-10332:
--
Attachment: HDFS-10332.HDFS-8707.001.patch

> hdfs-native-client fails to build with CMake 2.8.11 or earlier
> --
>
> Key: HDFS-10332
> URL: https://issues.apache.org/jira/browse/HDFS-10332
> Project: Hadoop HDFS
>  Issue Type: Sub-task
>  Components: hdfs-client
>Reporter: Tibor Kiss
>Assignee: Tibor Kiss
>Priority: Minor
> Attachments: HDFS-10332-HDFS-8707.001.patch, HDFS-10332.01.patch, 
> HDFS-10332.HDFS-8707.001.patch
>
>
> Due to the new syntax introduced in CMake 2.8.12 (the DIRECTORY component of 
> get_filename_component), the native-client won't build.
> Currently RHEL 6 & 7 ship an older version of CMake. 
> Error log:
> {noformat}
> [INFO] --- maven-antrun-plugin:1.7:run (make) @ hadoop-hdfs-native-client ---
> [INFO] Executing tasks
> main:
>  [exec] JAVA_HOME=, 
> JAVA_JVM_LIBRARY=/usr/java/jdk1.7.0_79/jre/lib/amd64/server/libjvm.so
>  [exec] JAVA_INCLUDE_PATH=/usr/java/jdk1.7.0_79/include, 
> JAVA_INCLUDE_PATH2=/usr/java/jdk1.7.0_79/include/linux
>  [exec] Located all JNI components successfully.
>  [exec] -- Could NOT find PROTOBUF (missing:  PROTOBUF_LIBRARY 
> PROTOBUF_INCLUDE_DIR)
>  [exec] -- valgrind location: MEMORYCHECK_COMMAND-NOTFOUND
>  [exec] -- checking for module 'fuse'
>  [exec] --   package 'fuse' not found
>  [exec] -- Failed to find Linux FUSE libraries or include files.  Will 
> not build FUSE client.
>  [exec] -- Configuring incomplete, errors occurred!
>  [exec] CMake Error at main/native/libhdfspp/CMakeLists.txt:71 
> (get_filename_component):
>  [exec]   get_filename_component unknown component DIRECTORY
>  [exec] Call Stack (most recent call first):
>  [exec]   main/native/libhdfspp/CMakeLists.txt:95 (copy_on_demand)
>  [exec]
>  [exec]
>  [exec] CMake Error at main/native/libhdfspp/CMakeLists.txt:71 
> (get_filename_component):
>  [exec]   get_filename_component unknown component DIRECTORY
>  [exec] Call Stack (most recent call first):
>  [exec]   main/native/libhdfspp/CMakeLists.txt:96 (copy_on_demand)
>  [exec]
>  [exec]
>  [exec] CMake Error at main/native/libhdfspp/CMakeLists.txt:71 
> (get_filename_component):
>  [exec]   get_filename_component unknown component DIRECTORY
>  [exec] Call Stack (most recent call first):
>  [exec]   main/native/libhdfspp/CMakeLists.txt:97 (copy_on_demand)
>  [exec]
>  [exec]
>  [exec] CMake Error at main/native/libhdfspp/CMakeLists.txt:71 
> (get_filename_component):
>  [exec]   get_filename_component unknown component DIRECTORY
>  [exec] Call Stack (most recent call first):
>  [exec]   main/native/libhdfspp/CMakeLists.txt:98 (copy_on_demand)
>  [exec]
>  [exec]
>  [exec] CMake Error: The following variables are used in this project, 
> but they are set to NOTFOUND.
>  [exec] Please set them or make sure they are set and tested correctly in 
> the CMake files:
>  [exec] PROTOBUF_LIBRARY (ADVANCED)
>  [exec] linked by target "hdfspp" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp
>  [exec] linked by target "hdfspp_static" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp
>  [exec] linked by target "protoc-gen-hrpc" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/proto
>  [exec] linked by target "bad_datanode_test" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
>  [exec] linked by target "hdfs_builder_test" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
>  [exec] linked by target "hdfspp_errors_test" in directory 
> /home/tiborkiss/devel/workspace
>  [exec] 
> /hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
>  [exec] linked by target "libhdfs_threaded_hdfspp_test_shim_static" 
> in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/test
>  [exec] linked by target "logging_test" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
>  [exec] linked by target "node_exclusion_test" in directory 
> 

[jira] [Updated] (HDFS-10332) hdfs-native-client fails to build with CMake 2.8.11 or earlier

2016-04-26 Thread Tibor Kiss (JIRA)

 [ 
https://issues.apache.org/jira/browse/HDFS-10332?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tibor Kiss updated HDFS-10332:
--
Attachment: HDFS-10332-HDFS-8707.001.patch

> hdfs-native-client fails to build with CMake 2.8.11 or earlier
> --
>
> Key: HDFS-10332
> URL: https://issues.apache.org/jira/browse/HDFS-10332
> Project: Hadoop HDFS
>  Issue Type: Sub-task
>  Components: hdfs-client
>Reporter: Tibor Kiss
>Assignee: Tibor Kiss
>Priority: Minor
> Attachments: HDFS-10332-HDFS-8707.001.patch, HDFS-10332.01.patch
>
>
> Due to the new syntax introduced in CMake 2.8.12 (the DIRECTORY component of 
> get_filename_component), the native-client won't build.
> Currently RHEL 6 & 7 ship an older version of CMake. 
> Error log:
> {noformat}
> [INFO] --- maven-antrun-plugin:1.7:run (make) @ hadoop-hdfs-native-client ---
> [INFO] Executing tasks
> main:
>  [exec] JAVA_HOME=, 
> JAVA_JVM_LIBRARY=/usr/java/jdk1.7.0_79/jre/lib/amd64/server/libjvm.so
>  [exec] JAVA_INCLUDE_PATH=/usr/java/jdk1.7.0_79/include, 
> JAVA_INCLUDE_PATH2=/usr/java/jdk1.7.0_79/include/linux
>  [exec] Located all JNI components successfully.
>  [exec] -- Could NOT find PROTOBUF (missing:  PROTOBUF_LIBRARY 
> PROTOBUF_INCLUDE_DIR)
>  [exec] -- valgrind location: MEMORYCHECK_COMMAND-NOTFOUND
>  [exec] -- checking for module 'fuse'
>  [exec] --   package 'fuse' not found
>  [exec] -- Failed to find Linux FUSE libraries or include files.  Will 
> not build FUSE client.
>  [exec] -- Configuring incomplete, errors occurred!
>  [exec] CMake Error at main/native/libhdfspp/CMakeLists.txt:71 
> (get_filename_component):
>  [exec]   get_filename_component unknown component DIRECTORY
>  [exec] Call Stack (most recent call first):
>  [exec]   main/native/libhdfspp/CMakeLists.txt:95 (copy_on_demand)
>  [exec]
>  [exec]
>  [exec] CMake Error at main/native/libhdfspp/CMakeLists.txt:71 
> (get_filename_component):
>  [exec]   get_filename_component unknown component DIRECTORY
>  [exec] Call Stack (most recent call first):
>  [exec]   main/native/libhdfspp/CMakeLists.txt:96 (copy_on_demand)
>  [exec]
>  [exec]
>  [exec] CMake Error at main/native/libhdfspp/CMakeLists.txt:71 
> (get_filename_component):
>  [exec]   get_filename_component unknown component DIRECTORY
>  [exec] Call Stack (most recent call first):
>  [exec]   main/native/libhdfspp/CMakeLists.txt:97 (copy_on_demand)
>  [exec]
>  [exec]
>  [exec] CMake Error at main/native/libhdfspp/CMakeLists.txt:71 
> (get_filename_component):
>  [exec]   get_filename_component unknown component DIRECTORY
>  [exec] Call Stack (most recent call first):
>  [exec]   main/native/libhdfspp/CMakeLists.txt:98 (copy_on_demand)
>  [exec]
>  [exec]
>  [exec] CMake Error: The following variables are used in this project, 
> but they are set to NOTFOUND.
>  [exec] Please set them or make sure they are set and tested correctly in 
> the CMake files:
>  [exec] PROTOBUF_LIBRARY (ADVANCED)
>  [exec] linked by target "hdfspp" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp
>  [exec] linked by target "hdfspp_static" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp
>  [exec] linked by target "protoc-gen-hrpc" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/proto
>  [exec] linked by target "bad_datanode_test" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
>  [exec] linked by target "hdfs_builder_test" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
>  [exec] linked by target "hdfspp_errors_test" in directory 
> /home/tiborkiss/devel/workspace
>  [exec] 
> /hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
>  [exec] linked by target "libhdfs_threaded_hdfspp_test_shim_static" 
> in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/test
>  [exec] linked by target "logging_test" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
>  [exec] linked by target "node_exclusion_test" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests

[jira] [Updated] (HDFS-10332) hdfs-native-client fails to build with CMake 2.8.11 or earlier

2016-04-26 Thread Tibor Kiss (JIRA)

 [ 
https://issues.apache.org/jira/browse/HDFS-10332?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tibor Kiss updated HDFS-10332:
--
Status: Patch Available  (was: Open)

> hdfs-native-client fails to build with CMake 2.8.11 or earlier
> --
>
> Key: HDFS-10332
> URL: https://issues.apache.org/jira/browse/HDFS-10332
> Project: Hadoop HDFS
>  Issue Type: Sub-task
>  Components: hdfs-client
>Reporter: Tibor Kiss
>Assignee: Tibor Kiss
>Priority: Minor
> Attachments: HDFS-10332.01.patch
>
>
> Due to the new syntax introduced in CMake 2.8.12 (the DIRECTORY component of 
> get_filename_component), the native-client won't build.
> Currently RHEL 6 & 7 ship an older version of CMake. 
> Error log:
> {noformat}
> [INFO] --- maven-antrun-plugin:1.7:run (make) @ hadoop-hdfs-native-client ---
> [INFO] Executing tasks
> main:
>  [exec] JAVA_HOME=, 
> JAVA_JVM_LIBRARY=/usr/java/jdk1.7.0_79/jre/lib/amd64/server/libjvm.so
>  [exec] JAVA_INCLUDE_PATH=/usr/java/jdk1.7.0_79/include, 
> JAVA_INCLUDE_PATH2=/usr/java/jdk1.7.0_79/include/linux
>  [exec] Located all JNI components successfully.
>  [exec] -- Could NOT find PROTOBUF (missing:  PROTOBUF_LIBRARY 
> PROTOBUF_INCLUDE_DIR)
>  [exec] -- valgrind location: MEMORYCHECK_COMMAND-NOTFOUND
>  [exec] -- checking for module 'fuse'
>  [exec] --   package 'fuse' not found
>  [exec] -- Failed to find Linux FUSE libraries or include files.  Will 
> not build FUSE client.
>  [exec] -- Configuring incomplete, errors occurred!
>  [exec] CMake Error at main/native/libhdfspp/CMakeLists.txt:71 
> (get_filename_component):
>  [exec]   get_filename_component unknown component DIRECTORY
>  [exec] Call Stack (most recent call first):
>  [exec]   main/native/libhdfspp/CMakeLists.txt:95 (copy_on_demand)
>  [exec]
>  [exec]
>  [exec] CMake Error at main/native/libhdfspp/CMakeLists.txt:71 
> (get_filename_component):
>  [exec]   get_filename_component unknown component DIRECTORY
>  [exec] Call Stack (most recent call first):
>  [exec]   main/native/libhdfspp/CMakeLists.txt:96 (copy_on_demand)
>  [exec]
>  [exec]
>  [exec] CMake Error at main/native/libhdfspp/CMakeLists.txt:71 
> (get_filename_component):
>  [exec]   get_filename_component unknown component DIRECTORY
>  [exec] Call Stack (most recent call first):
>  [exec]   main/native/libhdfspp/CMakeLists.txt:97 (copy_on_demand)
>  [exec]
>  [exec]
>  [exec] CMake Error at main/native/libhdfspp/CMakeLists.txt:71 
> (get_filename_component):
>  [exec]   get_filename_component unknown component DIRECTORY
>  [exec] Call Stack (most recent call first):
>  [exec]   main/native/libhdfspp/CMakeLists.txt:98 (copy_on_demand)
>  [exec]
>  [exec]
>  [exec] CMake Error: The following variables are used in this project, 
> but they are set to NOTFOUND.
>  [exec] Please set them or make sure they are set and tested correctly in 
> the CMake files:
>  [exec] PROTOBUF_LIBRARY (ADVANCED)
>  [exec] linked by target "hdfspp" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp
>  [exec] linked by target "hdfspp_static" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp
>  [exec] linked by target "protoc-gen-hrpc" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/proto
>  [exec] linked by target "bad_datanode_test" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
>  [exec] linked by target "hdfs_builder_test" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
>  [exec] linked by target "hdfspp_errors_test" in directory 
> /home/tiborkiss/devel/workspace
>  [exec] 
> /hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
>  [exec] linked by target "libhdfs_threaded_hdfspp_test_shim_static" 
> in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/test
>  [exec] linked by target "logging_test" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
>  [exec] linked by target "node_exclusion_test" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
> tiborkiss@eiger ~/d/w/hadoop ❯❯❯ 

[jira] [Updated] (HDFS-10332) hdfs-native-client fails to build with CMake 2.8.11 or earlier

2016-04-26 Thread Tibor Kiss (JIRA)

 [ 
https://issues.apache.org/jira/browse/HDFS-10332?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tibor Kiss updated HDFS-10332:
--
Attachment: HDFS-10332.01.patch

> hdfs-native-client fails to build with CMake 2.8.11 or earlier
> --
>
> Key: HDFS-10332
> URL: https://issues.apache.org/jira/browse/HDFS-10332
> Project: Hadoop HDFS
>  Issue Type: Sub-task
>  Components: hdfs-client
>Reporter: Tibor Kiss
>Assignee: Tibor Kiss
>Priority: Minor
> Attachments: HDFS-10332.01.patch
>
>
> Due to the new syntax introduced in CMake 2.8.12 (the DIRECTORY component of 
> get_filename_component), the native-client won't build.
> Currently RHEL 6 & 7 ship an older version of CMake. 
> Error log:
> {noformat}
> [INFO] --- maven-antrun-plugin:1.7:run (make) @ hadoop-hdfs-native-client ---
> [INFO] Executing tasks
> main:
>  [exec] JAVA_HOME=, 
> JAVA_JVM_LIBRARY=/usr/java/jdk1.7.0_79/jre/lib/amd64/server/libjvm.so
>  [exec] JAVA_INCLUDE_PATH=/usr/java/jdk1.7.0_79/include, 
> JAVA_INCLUDE_PATH2=/usr/java/jdk1.7.0_79/include/linux
>  [exec] Located all JNI components successfully.
>  [exec] -- Could NOT find PROTOBUF (missing:  PROTOBUF_LIBRARY 
> PROTOBUF_INCLUDE_DIR)
>  [exec] -- valgrind location: MEMORYCHECK_COMMAND-NOTFOUND
>  [exec] -- checking for module 'fuse'
>  [exec] --   package 'fuse' not found
>  [exec] -- Failed to find Linux FUSE libraries or include files.  Will 
> not build FUSE client.
>  [exec] -- Configuring incomplete, errors occurred!
>  [exec] CMake Error at main/native/libhdfspp/CMakeLists.txt:71 
> (get_filename_component):
>  [exec]   get_filename_component unknown component DIRECTORY
>  [exec] Call Stack (most recent call first):
>  [exec]   main/native/libhdfspp/CMakeLists.txt:95 (copy_on_demand)
>  [exec]
>  [exec]
>  [exec] CMake Error at main/native/libhdfspp/CMakeLists.txt:71 
> (get_filename_component):
>  [exec]   get_filename_component unknown component DIRECTORY
>  [exec] Call Stack (most recent call first):
>  [exec]   main/native/libhdfspp/CMakeLists.txt:96 (copy_on_demand)
>  [exec]
>  [exec]
>  [exec] CMake Error at main/native/libhdfspp/CMakeLists.txt:71 
> (get_filename_component):
>  [exec]   get_filename_component unknown component DIRECTORY
>  [exec] Call Stack (most recent call first):
>  [exec]   main/native/libhdfspp/CMakeLists.txt:97 (copy_on_demand)
>  [exec]
>  [exec]
>  [exec] CMake Error at main/native/libhdfspp/CMakeLists.txt:71 
> (get_filename_component):
>  [exec]   get_filename_component unknown component DIRECTORY
>  [exec] Call Stack (most recent call first):
>  [exec]   main/native/libhdfspp/CMakeLists.txt:98 (copy_on_demand)
>  [exec]
>  [exec]
>  [exec] CMake Error: The following variables are used in this project, 
> but they are set to NOTFOUND.
>  [exec] Please set them or make sure they are set and tested correctly in 
> the CMake files:
>  [exec] PROTOBUF_LIBRARY (ADVANCED)
>  [exec] linked by target "hdfspp" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp
>  [exec] linked by target "hdfspp_static" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp
>  [exec] linked by target "protoc-gen-hrpc" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/proto
>  [exec] linked by target "bad_datanode_test" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
>  [exec] linked by target "hdfs_builder_test" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
>  [exec] linked by target "hdfspp_errors_test" in directory 
> /home/tiborkiss/devel/workspace
>  [exec] 
> /hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
>  [exec] linked by target "libhdfs_threaded_hdfspp_test_shim_static" 
> in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/test
>  [exec] linked by target "logging_test" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
>  [exec] linked by target "node_exclusion_test" in directory 
> /home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
> tiborkiss@eiger ~/d/w/hadoop ❯❯❯ cat 

[jira] [Updated] (HDFS-10332) hdfs-native-client fails to build with CMake 2.8.11 or earlier

2016-04-26 Thread Tibor Kiss (JIRA)

 [ 
https://issues.apache.org/jira/browse/HDFS-10332?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tibor Kiss updated HDFS-10332:
--
Description: 
Due to the new syntax introduced in CMake 2.8.12 (the DIRECTORY component of 
get_filename_component), the native-client won't build.

Currently RHEL 6 & 7 ship an older version of CMake. 

Error log:
{noformat}
[INFO] --- maven-antrun-plugin:1.7:run (make) @ hadoop-hdfs-native-client ---
[INFO] Executing tasks

main:
 [exec] JAVA_HOME=, 
JAVA_JVM_LIBRARY=/usr/java/jdk1.7.0_79/jre/lib/amd64/server/libjvm.so
 [exec] JAVA_INCLUDE_PATH=/usr/java/jdk1.7.0_79/include, 
JAVA_INCLUDE_PATH2=/usr/java/jdk1.7.0_79/include/linux
 [exec] Located all JNI components successfully.
 [exec] -- Could NOT find PROTOBUF (missing:  PROTOBUF_LIBRARY 
PROTOBUF_INCLUDE_DIR)
 [exec] -- valgrind location: MEMORYCHECK_COMMAND-NOTFOUND
 [exec] -- checking for module 'fuse'
 [exec] --   package 'fuse' not found
 [exec] -- Failed to find Linux FUSE libraries or include files.  Will not 
build FUSE client.
 [exec] -- Configuring incomplete, errors occurred!
 [exec] CMake Error at main/native/libhdfspp/CMakeLists.txt:71 
(get_filename_component):
 [exec]   get_filename_component unknown component DIRECTORY
 [exec] Call Stack (most recent call first):
 [exec]   main/native/libhdfspp/CMakeLists.txt:95 (copy_on_demand)
 [exec]
 [exec]
 [exec] CMake Error at main/native/libhdfspp/CMakeLists.txt:71 
(get_filename_component):
 [exec]   get_filename_component unknown component DIRECTORY
 [exec] Call Stack (most recent call first):
 [exec]   main/native/libhdfspp/CMakeLists.txt:96 (copy_on_demand)
 [exec]
 [exec]
 [exec] CMake Error at main/native/libhdfspp/CMakeLists.txt:71 
(get_filename_component):
 [exec]   get_filename_component unknown component DIRECTORY
 [exec] Call Stack (most recent call first):
 [exec]   main/native/libhdfspp/CMakeLists.txt:97 (copy_on_demand)
 [exec]
 [exec]
 [exec] CMake Error at main/native/libhdfspp/CMakeLists.txt:71 
(get_filename_component):
 [exec]   get_filename_component unknown component DIRECTORY
 [exec] Call Stack (most recent call first):
 [exec]   main/native/libhdfspp/CMakeLists.txt:98 (copy_on_demand)
 [exec]
 [exec]
 [exec] CMake Error: The following variables are used in this project, but 
they are set to NOTFOUND.
 [exec] Please set them or make sure they are set and tested correctly in 
the CMake files:
 [exec] PROTOBUF_LIBRARY (ADVANCED)
 [exec] linked by target "hdfspp" in directory 
/home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp
 [exec] linked by target "hdfspp_static" in directory 
/home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp
 [exec] linked by target "protoc-gen-hrpc" in directory 
/home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/proto
 [exec] linked by target "bad_datanode_test" in directory 
/home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
 [exec] linked by target "hdfs_builder_test" in directory 
/home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
 [exec] linked by target "hdfspp_errors_test" in directory 
/home/tiborkiss/devel/workspace
 [exec] 
/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
 [exec] linked by target "libhdfs_threaded_hdfspp_test_shim_static" in 
directory 
/home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/test
 [exec] linked by target "logging_test" in directory 
/home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
 [exec] linked by target "node_exclusion_test" in directory 
/home/tiborkiss/devel/workspace/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tests
tiborkiss@eiger ~/d/w/hadoop ❯❯❯ cat ../HADOOP-cmake-error.txt
[INFO] --- maven-antrun-plugin:1.7:run (make) @ hadoop-hdfs-native-client ---
[INFO] Executing tasks

main:
 [exec] JAVA_HOME=, 
JAVA_JVM_LIBRARY=/usr/java/jdk1.7.0_79/jre/lib/amd64/server/libjvm.so
 [exec] JAVA_INCLUDE_PATH=/usr/java/jdk1.7.0_79/include, 
JAVA_INCLUDE_PATH2=/usr/java/jdk1.7.0_79/include/linux
 [exec] Located all JNI components successfully.
 [exec] -- Could NOT find PROTOBUF (missing:  PROTOBUF_LIBRARY 
PROTOBUF_INCLUDE_DIR)
 [exec] -- valgrind location: MEMORYCHECK_COMMAND-NOTFOUND
 [exec] -- checking for module 'fuse'
 [exec] --   package 'fuse' not found
 [exec] -- 

[jira] [Created] (HDFS-10332) hdfs-native-client fails to build with CMake 2.8.11 or earlier

2016-04-26 Thread Tibor Kiss (JIRA)
Tibor Kiss created HDFS-10332:
-

 Summary: hdfs-native-client fails to build with CMake 2.8.11 or 
earlier
 Key: HDFS-10332
 URL: https://issues.apache.org/jira/browse/HDFS-10332
 Project: Hadoop HDFS
  Issue Type: Sub-task
  Components: hdfs-client
Reporter: Tibor Kiss
Priority: Minor


Due to the new syntax introduced in CMake 2.8.12 (the DIRECTORY component of 
get_filename_component), the native-client won't build.

Currently RHEL 6 & 7 ship an older version of CMake. 





--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Assigned] (HDFS-10332) hdfs-native-client fails to build with CMake 2.8.11 or earlier

2016-04-26 Thread Tibor Kiss (JIRA)

 [ 
https://issues.apache.org/jira/browse/HDFS-10332?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tibor Kiss reassigned HDFS-10332:
-

Assignee: Tibor Kiss

> hdfs-native-client fails to build with CMake 2.8.11 or earlier
> --
>
> Key: HDFS-10332
> URL: https://issues.apache.org/jira/browse/HDFS-10332
> Project: Hadoop HDFS
>  Issue Type: Sub-task
>  Components: hdfs-client
>Reporter: Tibor Kiss
>Assignee: Tibor Kiss
>Priority: Minor
>
> Due to the new syntax introduced in CMake 2.8.12 (the DIRECTORY component of 
> get_filename_component), the native-client won't build.
> Currently RHEL 6 & 7 ship an older version of CMake. 



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HDFS-10199) Unit tests TestCopyFiles, TestDistCh, TestLogalyzer under org.apache.hadoop.tools are failing

2016-03-23 Thread Tibor Kiss (JIRA)

[ 
https://issues.apache.org/jira/browse/HDFS-10199?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15209087#comment-15209087
 ] 

Tibor Kiss commented on HDFS-10199:
---

Thanks [~arpitagarwal] & [~cnauroth]!

> Unit tests TestCopyFiles, TestDistCh, TestLogalyzer under 
> org.apache.hadoop.tools are failing
> -
>
> Key: HDFS-10199
> URL: https://issues.apache.org/jira/browse/HDFS-10199
> Project: Hadoop HDFS
>  Issue Type: Bug
>Affects Versions: 2.8.0
>Reporter: Tibor Kiss
>Assignee: Tibor Kiss
>Priority: Minor
> Fix For: 2.8.0
>
> Attachments: HDFS-10199-branch-2.01.patch, HDFS-10199.01.patch
>
>
> Due to the logging changes introduced in HDFS-9402, the TestCopyFiles, 
> TestDistCh and TestLogalyzer tests are failing with a ClassCastException 
> thrown from org.apache.hadoop.tools.TestCopyFiles' constructor.
> Error message:
> {noformat}
> Running org.apache.hadoop.tools.TestLogalyzer
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.213 sec - 
> in org.apache.hadoop.tools.TestLogalyzer
> Results :
> Failed tests:
>   TestSuite$1.warning Exception in constructor: testDeleteLocal 
> (java.lang.ClassCastException: org.slf4j.impl.Log4jLoggerAdapter cannot be 
> cast to org.apache.commons.logging.impl.Log4JLogger
> at org.apache.hadoop.tools.TestCopyFiles.<init>(TestCopyFiles.java:63)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native 
> Method)
> at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> at junit.framework.TestSuite.createTest(TestSuite.java:63)
> at junit.framework.TestSuite.addTestMethod(TestSuite.java:310)
> at junit.framework.TestSuite.addTestsFromTestCase(TestSuite.java:153)
> at junit.framework.TestSuite.<init>(TestSuite.java:132)
> at 
> org.junit.internal.runners.JUnit38ClassRunner.<init>(JUnit38ClassRunner.java:72)
> at 
> org.junit.internal.builders.JUnit3Builder.runnerForClass(JUnit3Builder.java:11)
> at 
> org.junit.runners.model.RunnerBuilder.safeRunnerForClass(RunnerBuilder.java:59)
> at 
> org.junit.internal.builders.AllDefaultPossibilitiesBuilder.runnerForClass(AllDefaultPossibilitiesBuilder.java:26)
> at 
> org.junit.runners.model.RunnerBuilder.safeRunnerForClass(RunnerBuilder.java:59)
> at 
> org.junit.internal.requests.ClassRequest.getRunner(ClassRequest.java:26)
> at 
> org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:262)
> at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
> at 
> org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
> at 
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
> at 
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
> at 
> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
> )
> {noformat}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HDFS-10199) Unit tests TestCopyFiles, TestDistCh, TestLogalyzer under org.apache.hadoop.tools are failing

2016-03-23 Thread Tibor Kiss (JIRA)

[ 
https://issues.apache.org/jira/browse/HDFS-10199?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15208655#comment-15208655
 ] 

Tibor Kiss commented on HDFS-10199:
---

Thanks [~arpitagarwal], just renamed & uploaded the patch.

> Unit tests TestCopyFiles, TestDistCh, TestLogalyzer under 
> org.apache.hadoop.tools are failing
> -
>
> Key: HDFS-10199
> URL: https://issues.apache.org/jira/browse/HDFS-10199
> Project: Hadoop HDFS
>  Issue Type: Bug
>Affects Versions: 2.8.0
>Reporter: Tibor Kiss
>Assignee: Tibor Kiss
>Priority: Minor
> Fix For: 2.8.0
>
> Attachments: HDFS-10199-branch-2.01.patch, HDFS-10199.01.patch
>
>
> Due to the logging changes introduced in HDFS-9402, the TestCopyFiles, 
> TestDistCh and TestLogalyzer tests are failing with a ClassCastException 
> thrown from org.apache.hadoop.tools.TestCopyFiles' constructor.
> Error message:
> {noformat}
> Running org.apache.hadoop.tools.TestLogalyzer
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.213 sec - 
> in org.apache.hadoop.tools.TestLogalyzer
> Results :
> Failed tests:
>   TestSuite$1.warning Exception in constructor: testDeleteLocal 
> (java.lang.ClassCastException: org.slf4j.impl.Log4jLoggerAdapter cannot be 
> cast to org.apache.commons.logging.impl.Log4JLogger
> at org.apache.hadoop.tools.TestCopyFiles.<init>(TestCopyFiles.java:63)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native 
> Method)
> at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> at junit.framework.TestSuite.createTest(TestSuite.java:63)
> at junit.framework.TestSuite.addTestMethod(TestSuite.java:310)
> at junit.framework.TestSuite.addTestsFromTestCase(TestSuite.java:153)
> at junit.framework.TestSuite.<init>(TestSuite.java:132)
> at 
> org.junit.internal.runners.JUnit38ClassRunner.<init>(JUnit38ClassRunner.java:72)
> at 
> org.junit.internal.builders.JUnit3Builder.runnerForClass(JUnit3Builder.java:11)
> at 
> org.junit.runners.model.RunnerBuilder.safeRunnerForClass(RunnerBuilder.java:59)
> at 
> org.junit.internal.builders.AllDefaultPossibilitiesBuilder.runnerForClass(AllDefaultPossibilitiesBuilder.java:26)
> at 
> org.junit.runners.model.RunnerBuilder.safeRunnerForClass(RunnerBuilder.java:59)
> at 
> org.junit.internal.requests.ClassRequest.getRunner(ClassRequest.java:26)
> at 
> org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:262)
> at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
> at 
> org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
> at 
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
> at 
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
> at 
> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
> )
> {noformat}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HDFS-10199) Unit tests TestCopyFiles, TestDistCh, TestLogalyzer under org.apache.hadoop.tools are failing

2016-03-23 Thread Tibor Kiss (JIRA)

 [ 
https://issues.apache.org/jira/browse/HDFS-10199?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tibor Kiss updated HDFS-10199:
--
Attachment: HDFS-10199-branch-2.01.patch

> Unit tests TestCopyFiles, TestDistCh, TestLogalyzer under 
> org.apache.hadoop.tools are failing
> -
>
> Key: HDFS-10199
> URL: https://issues.apache.org/jira/browse/HDFS-10199
> Project: Hadoop HDFS
>  Issue Type: Bug
>Affects Versions: 2.8.0
>Reporter: Tibor Kiss
>Assignee: Tibor Kiss
>Priority: Minor
> Fix For: 2.8.0
>
> Attachments: HDFS-10199-branch-2.01.patch, HDFS-10199.01.patch
>
>
> Due to the logging changes introduced in HDFS-9402, the TestCopyFiles, 
> TestDistCh and TestLogalyzer tests are failing with a ClassCastException 
> thrown from org.apache.hadoop.tools.TestCopyFiles' constructor.
> Error message:
> {noformat}
> Running org.apache.hadoop.tools.TestLogalyzer
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.213 sec - 
> in org.apache.hadoop.tools.TestLogalyzer
> Results :
> Failed tests:
>   TestSuite$1.warning Exception in constructor: testDeleteLocal 
> (java.lang.ClassCastException: org.slf4j.impl.Log4jLoggerAdapter cannot be 
> cast to org.apache.commons.logging.impl.Log4JLogger
> at org.apache.hadoop.tools.TestCopyFiles.<init>(TestCopyFiles.java:63)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native 
> Method)
> at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> at junit.framework.TestSuite.createTest(TestSuite.java:63)
> at junit.framework.TestSuite.addTestMethod(TestSuite.java:310)
> at junit.framework.TestSuite.addTestsFromTestCase(TestSuite.java:153)
> at junit.framework.TestSuite.<init>(TestSuite.java:132)
> at 
> org.junit.internal.runners.JUnit38ClassRunner.<init>(JUnit38ClassRunner.java:72)
> at 
> org.junit.internal.builders.JUnit3Builder.runnerForClass(JUnit3Builder.java:11)
> at 
> org.junit.runners.model.RunnerBuilder.safeRunnerForClass(RunnerBuilder.java:59)
> at 
> org.junit.internal.builders.AllDefaultPossibilitiesBuilder.runnerForClass(AllDefaultPossibilitiesBuilder.java:26)
> at 
> org.junit.runners.model.RunnerBuilder.safeRunnerForClass(RunnerBuilder.java:59)
> at 
> org.junit.internal.requests.ClassRequest.getRunner(ClassRequest.java:26)
> at 
> org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:262)
> at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
> at 
> org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
> at 
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
> at 
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
> at 
> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
> )
> {noformat}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HDFS-10199) Unit tests TestCopyFiles, TestDistCh, TestLogalyzer under org.apache.hadoop.tools are failing

2016-03-23 Thread Tibor Kiss (JIRA)

 [ 
https://issues.apache.org/jira/browse/HDFS-10199?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tibor Kiss updated HDFS-10199:
--
Status: Patch Available  (was: Open)

> Unit tests TestCopyFiles, TestDistCh, TestLogalyzer under 
> org.apache.hadoop.tools are failing
> -
>
> Key: HDFS-10199
> URL: https://issues.apache.org/jira/browse/HDFS-10199
> Project: Hadoop HDFS
>  Issue Type: Bug
>Affects Versions: 2.8.0
>Reporter: Tibor Kiss
>Priority: Minor
> Fix For: 2.8.0
>
> Attachments: HDFS-10199.01.patch
>
>
> Due to the logging changes introduced in HDFS-9402, the TestCopyFiles, 
> TestDistCh and TestLogalyzer tests are failing with a ClassCastException 
> thrown from org.apache.hadoop.tools.TestCopyFiles' constructor.
> Error message:
> {noformat}
> Running org.apache.hadoop.tools.TestLogalyzer
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.213 sec - 
> in org.apache.hadoop.tools.TestLogalyzer
> Results :
> Failed tests:
>   TestSuite$1.warning Exception in constructor: testDeleteLocal 
> (java.lang.ClassCastException: org.slf4j.impl.Log4jLoggerAdapter cannot be 
> cast to org.apache.commons.logging.impl.Log4JLogger
> at org.apache.hadoop.tools.TestCopyFiles.<init>(TestCopyFiles.java:63)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native 
> Method)
> at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> at junit.framework.TestSuite.createTest(TestSuite.java:63)
> at junit.framework.TestSuite.addTestMethod(TestSuite.java:310)
> at junit.framework.TestSuite.addTestsFromTestCase(TestSuite.java:153)
> at junit.framework.TestSuite.<init>(TestSuite.java:132)
> at 
> org.junit.internal.runners.JUnit38ClassRunner.<init>(JUnit38ClassRunner.java:72)
> at 
> org.junit.internal.builders.JUnit3Builder.runnerForClass(JUnit3Builder.java:11)
> at 
> org.junit.runners.model.RunnerBuilder.safeRunnerForClass(RunnerBuilder.java:59)
> at 
> org.junit.internal.builders.AllDefaultPossibilitiesBuilder.runnerForClass(AllDefaultPossibilitiesBuilder.java:26)
> at 
> org.junit.runners.model.RunnerBuilder.safeRunnerForClass(RunnerBuilder.java:59)
> at 
> org.junit.internal.requests.ClassRequest.getRunner(ClassRequest.java:26)
> at 
> org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:262)
> at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
> at 
> org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
> at 
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
> at 
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
> at 
> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
> )
> {noformat}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HDFS-10199) Unit tests TestCopyFiles, TestDistCh, TestLogalyzer under org.apache.hadoop.tools are failing

2016-03-23 Thread Tibor Kiss (JIRA)

[ 
https://issues.apache.org/jira/browse/HDFS-10199?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15208426#comment-15208426
 ] 

Tibor Kiss commented on HDFS-10199:
---

Attached a trivial fix.

> Unit tests TestCopyFiles, TestDistCh, TestLogalyzer under 
> org.apache.hadoop.tools are failing
> -
>
> Key: HDFS-10199
> URL: https://issues.apache.org/jira/browse/HDFS-10199
> Project: Hadoop HDFS
>  Issue Type: Bug
>Affects Versions: 2.8.0
>Reporter: Tibor Kiss
>Priority: Minor
> Fix For: 2.8.0
>
> Attachments: HDFS-10199.01.patch
>
>
> Due to the logging changes introduced in HDFS-9402, the TestCopyFiles, 
> TestDistCh and TestLogalyzer tests are failing with a ClassCastException 
> thrown from org.apache.hadoop.tools.TestCopyFiles' constructor.
> Error message:
> {noformat}
> Running org.apache.hadoop.tools.TestLogalyzer
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.213 sec - 
> in org.apache.hadoop.tools.TestLogalyzer
> Results :
> Failed tests:
>   TestSuite$1.warning Exception in constructor: testDeleteLocal 
> (java.lang.ClassCastException: org.slf4j.impl.Log4jLoggerAdapter cannot be 
> cast to org.apache.commons.logging.impl.Log4JLogger
> at org.apache.hadoop.tools.TestCopyFiles.<init>(TestCopyFiles.java:63)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native 
> Method)
> at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> at junit.framework.TestSuite.createTest(TestSuite.java:63)
> at junit.framework.TestSuite.addTestMethod(TestSuite.java:310)
> at junit.framework.TestSuite.addTestsFromTestCase(TestSuite.java:153)
> at junit.framework.TestSuite.<init>(TestSuite.java:132)
> at 
> org.junit.internal.runners.JUnit38ClassRunner.<init>(JUnit38ClassRunner.java:72)
> at 
> org.junit.internal.builders.JUnit3Builder.runnerForClass(JUnit3Builder.java:11)
> at 
> org.junit.runners.model.RunnerBuilder.safeRunnerForClass(RunnerBuilder.java:59)
> at 
> org.junit.internal.builders.AllDefaultPossibilitiesBuilder.runnerForClass(AllDefaultPossibilitiesBuilder.java:26)
> at 
> org.junit.runners.model.RunnerBuilder.safeRunnerForClass(RunnerBuilder.java:59)
> at 
> org.junit.internal.requests.ClassRequest.getRunner(ClassRequest.java:26)
> at 
> org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:262)
> at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
> at 
> org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
> at 
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
> at 
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
> at 
> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
> )
> {noformat}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Created] (HDFS-10199) Unit tests TestCopyFiles, TestDistCh, TestLogalyzer under org.apache.hadoop.tools are failing

2016-03-23 Thread Tibor Kiss (JIRA)
Tibor Kiss created HDFS-10199:
-

 Summary: Unit tests TestCopyFiles, TestDistCh, TestLogalyzer under 
org.apache.hadoop.tools are failing
 Key: HDFS-10199
 URL: https://issues.apache.org/jira/browse/HDFS-10199
 Project: Hadoop HDFS
  Issue Type: Bug
Affects Versions: 2.8.0
Reporter: Tibor Kiss
Priority: Minor
 Fix For: 2.8.0


Due to the logging changes introduced in HDFS-9402, the TestCopyFiles, 
TestDistCh and TestLogalyzer tests are failing with a ClassCastException 
thrown from org.apache.hadoop.tools.TestCopyFiles' constructor.

Error message:
{noformat}
Running org.apache.hadoop.tools.TestLogalyzer
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.213 sec - in 
org.apache.hadoop.tools.TestLogalyzer

Results :

Failed tests:
  TestSuite$1.warning Exception in constructor: testDeleteLocal 
(java.lang.ClassCastException: org.slf4j.impl.Log4jLoggerAdapter cannot be cast 
to org.apache.commons.logging.impl.Log4JLogger
at org.apache.hadoop.tools.TestCopyFiles.<init>(TestCopyFiles.java:63)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at junit.framework.TestSuite.createTest(TestSuite.java:63)
at junit.framework.TestSuite.addTestMethod(TestSuite.java:310)
at junit.framework.TestSuite.addTestsFromTestCase(TestSuite.java:153)
at junit.framework.TestSuite.<init>(TestSuite.java:132)
at 
org.junit.internal.runners.JUnit38ClassRunner.<init>(JUnit38ClassRunner.java:72)
at 
org.junit.internal.builders.JUnit3Builder.runnerForClass(JUnit3Builder.java:11)
at 
org.junit.runners.model.RunnerBuilder.safeRunnerForClass(RunnerBuilder.java:59)
at 
org.junit.internal.builders.AllDefaultPossibilitiesBuilder.runnerForClass(AllDefaultPossibilitiesBuilder.java:26)
at 
org.junit.runners.model.RunnerBuilder.safeRunnerForClass(RunnerBuilder.java:59)
at 
org.junit.internal.requests.ClassRequest.getRunner(ClassRequest.java:26)
at 
org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:262)
at 
org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
at 
org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
at 
org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
at 
org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
at 
org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
)
{noformat}




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HDFS-10199) Unit tests TestCopyFiles, TestDistCh, TestLogalyzer under org.apache.hadoop.tools are failing

2016-03-23 Thread Tibor Kiss (JIRA)

 [ 
https://issues.apache.org/jira/browse/HDFS-10199?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tibor Kiss updated HDFS-10199:
--
Attachment: HDFS-10199.01.patch

> Unit tests TestCopyFiles, TestDistCh, TestLogalyzer under 
> org.apache.hadoop.tools are failing
> -
>
> Key: HDFS-10199
> URL: https://issues.apache.org/jira/browse/HDFS-10199
> Project: Hadoop HDFS
>  Issue Type: Bug
>Affects Versions: 2.8.0
>Reporter: Tibor Kiss
>Priority: Minor
> Fix For: 2.8.0
>
> Attachments: HDFS-10199.01.patch
>
>
> Due to the logging changes introduced in HDFS-9402, the TestCopyFiles, 
> TestDistCh and TestLogalyzer tests are failing with a ClassCastException 
> thrown from org.apache.hadoop.tools.TestCopyFiles' constructor.
> Error message:
> {noformat}
> Running org.apache.hadoop.tools.TestLogalyzer
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.213 sec - 
> in org.apache.hadoop.tools.TestLogalyzer
> Results :
> Failed tests:
>   TestSuite$1.warning Exception in constructor: testDeleteLocal 
> (java.lang.ClassCastException: org.slf4j.impl.Log4jLoggerAdapter cannot be 
> cast to org.apache.commons.logging.impl.Log4JLogger
> at org.apache.hadoop.tools.TestCopyFiles.<init>(TestCopyFiles.java:63)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native 
> Method)
> at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> at junit.framework.TestSuite.createTest(TestSuite.java:63)
> at junit.framework.TestSuite.addTestMethod(TestSuite.java:310)
> at junit.framework.TestSuite.addTestsFromTestCase(TestSuite.java:153)
> at junit.framework.TestSuite.<init>(TestSuite.java:132)
> at 
> org.junit.internal.runners.JUnit38ClassRunner.<init>(JUnit38ClassRunner.java:72)
> at 
> org.junit.internal.builders.JUnit3Builder.runnerForClass(JUnit3Builder.java:11)
> at 
> org.junit.runners.model.RunnerBuilder.safeRunnerForClass(RunnerBuilder.java:59)
> at 
> org.junit.internal.builders.AllDefaultPossibilitiesBuilder.runnerForClass(AllDefaultPossibilitiesBuilder.java:26)
> at 
> org.junit.runners.model.RunnerBuilder.safeRunnerForClass(RunnerBuilder.java:59)
> at 
> org.junit.internal.requests.ClassRequest.getRunner(ClassRequest.java:26)
> at 
> org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:262)
> at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
> at 
> org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
> at 
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
> at 
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
> at 
> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
> )
> {noformat}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)