[https://issues.apache.org/jira/browse/ARROW-1130?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17658162#comment-17658162]
Rok Mihevc commented on ARROW-1130:
-----------------------------------
This issue has been migrated to [issue
#16751|https://github.com/apache/arrow/issues/16751] on GitHub. Please see the
[migration documentation|https://github.com/apache/arrow/issues/14542] for
further details.
> io-hdfs-test failure
> --------------------
>
> Key: ARROW-1130
> URL: https://issues.apache.org/jira/browse/ARROW-1130
> Project: Apache Arrow
> Issue Type: Bug
> Components: C++
> Environment: Ubuntu 16.04, GCC 4.8, Parquet-cpp
> Reporter: Young Park
> Assignee: Wes McKinney
> Priority: Blocker
> Fix For: 0.5.0
>
>
> Hi,
> I have noticed that arrow-cpp's io-hdfs-test fails when built with GCC 4.8
> but passes when built with GCC 5.4 (in the latter case it simply skips all
> tests because it does not connect to the HDFS client).
> Looking at the test output log, it asked me to set the variable
> ARROW_HDFS_TEST_USER, so I set it to 'root' and set ARROW_HDFS_TEST_PORT to
> '9000' (the port I use to connect to my local HDFS), and the test passes.
> Do I need to configure the environment and the variables in a specific way to
> get it to work?
> I'm mainly asking because I am trying to use the Arrow and Parquet C++
> libraries in an external project and keep running into segfaults in the
> libhdfs jni_helper, even though I successfully connect to HDFS on my local
> Hadoop cluster and can even read a single Parquet file. I'm hoping this will
> help me figure out the issue in my external project as well.
> Thank you in advance for your help.
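For reference, the environment variables mentioned in the report can be set like this before invoking the test. This is a sketch for a local single-node Hadoop setup: the values 'root', 'localhost', and '9000' are examples, not Arrow defaults, and ARROW_HDFS_TEST_HOST is an assumption inferred from the naming of the two variables the reporter mentions.

```shell
# Environment the C++ io-hdfs-test reads before connecting to HDFS.
# Values below are examples for a local single-node Hadoop setup.
export ARROW_HDFS_TEST_USER=root        # HDFS user the test connects as
export ARROW_HDFS_TEST_HOST=localhost   # NameNode host (assumed variable name)
export ARROW_HDFS_TEST_PORT=9000        # NameNode RPC port (fs.defaultFS)

# Then run the test binary from the arrow-cpp build directory, e.g.:
# ./debug/io-hdfs-test
```

The port should match the fs.defaultFS setting in your Hadoop core-site.xml; if the variables are unset, the test is expected to skip rather than fail, which matches the GCC 5.4 behavior described above.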
--
This message was sent by Atlassian Jira
(v8.20.10#820010)