[ https://issues.apache.org/jira/browse/ARROW-8135?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17061490#comment-17061490 ]

Matej Murin commented on ARROW-8135:
------------------------------------

[~wesm] Yes, thank you for the link! It was the aws-sdk-cpp package!
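For anyone else hitting the same ImportError, a quick way to check whether the dynamic loader can actually see the AWS SDK libraries is via `ctypes.util.find_library`. This is only a diagnostic sketch: the traceback names `libaws-cpp-sdk-s3.so`, while `aws-cpp-sdk-core` is an assumption about a sibling library the SDK package typically ships.

```python
import ctypes.util

def missing_shared_libs(names):
    """Return the library base names the dynamic loader cannot locate."""
    # find_library("foo") searches the loader's paths for libfoo.so
    # and returns None when the library is not visible.
    return [n for n in names if ctypes.util.find_library(n) is None]

# "aws-cpp-sdk-s3" comes from the traceback; "aws-cpp-sdk-core" is an
# assumed companion library from the same aws-sdk-cpp package.
for name in missing_shared_libs(["aws-cpp-sdk-s3", "aws-cpp-sdk-core"]):
    print(f"lib{name}.so is not visible to the dynamic loader")
```

If a library prints as missing even though the file exists on disk, the usual fix is to add its directory to `LD_LIBRARY_PATH` (or run `ldconfig`) before importing pyarrow.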

> [Python] Problem importing PyArrow on a cluster
> -----------------------------------------------
>
>                 Key: ARROW-8135
>                 URL: https://issues.apache.org/jira/browse/ARROW-8135
>             Project: Apache Arrow
>          Issue Type: Bug
>          Components: C++
>    Affects Versions: 0.16.0
>         Environment: Linux, RedHat CentOS 7
>            Reporter: Matej Murin
>            Priority: Major
>              Labels: newbie
>
> Hi, when I try to import pyarrow in Python, I get the following error:
> File "<stdin>", line 1, in <module>
>   File "/services/matejm/anaconda3/lib/python3.7/site-packages/pyarrow/__init__.py", line 49, in <module>
>     from pyarrow.lib import cpu_count, set_cpu_count
> ImportError: libaws-cpp-sdk-s3.so: cannot open shared object file: No such file or directory
> What could this be related to? I have searched wherever I could and could not find any reason for it, so I figured I might as well ask here.
> Thank you very much.
> Note: I installed pyarrow and its dependencies offline, since our cluster sits behind a company firewall that does not allow pip or conda installations.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)