thinkORo commented on issue #40754:
URL: https://github.com/apache/arrow/issues/40754#issuecomment-2017552083

   Sure. Here is a list of environment variables that unfortunately did not work:
   
   ```python
   import os
   os.environ['TLS_SKIP_VERIFY']="TRUE"
   os.environ['TLS_VERIFY']="FALSE"
   os.environ['VERIFY_CLIENT']="FALSE"
   os.environ['VERIFY']="FALSE"
   os.environ['CURLOPT_SSL_VERIFYHOST']="FALSE"
   os.environ['CURLOPT_SSL_VERIFYPEER']="FALSE"
   
   os.environ['REQUESTS_CA_BUNDLE']='path_to_my.pem'
   os.environ['AWS_CA_BUNDLE']='path_to_my.pem'
   os.environ['CURL_CA_BUNDLE']='path_to_my.pem'
   os.environ['ARROW_SSL_CERT_FILE']='path_to_my.crt'
   os.environ['SSL_CERT_FILE']='path_to_my.crt'
   ```
   
   From my perspective the problem is that the S3FileSystem implementation in 
pyarrow.fs has a different "signature" than s3fs. In s3fs.S3FileSystem I can 
pass client_kwargs to provide a specific certificate; in pyarrow's 
implementation I couldn't find an equivalent option.
   
   But: my initial problem was related to pyiceberg (which is based on 
pyarrow), and there I got a hint to check the SSL verification paths with
   
   ```python
   import ssl
   paths = ssl.get_default_verify_paths()
   print(paths)
   ```
   I'm running Python in a virtual environment. There, the "openssl_cafile" 
reported by ssl points to a cert.pem inside the virtual environment 
(./envs/name_of_my_venv/ssl/cert.pem), which I had to adjust by appending the 
content of my CA's PEM.
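   The adjustment can be sketched like this (hedged: "my_ca.pem" is a placeholder, and the actual append is commented out so nothing on disk is modified):

   ```python
   # Locate the CA file that Python's ssl module actually uses by default.
   import ssl

   paths = ssl.get_default_verify_paths()
   print(paths.openssl_cafile)  # in my case: ./envs/name_of_my_venv/ssl/cert.pem

   # Appending a custom CA would look like this ("my_ca.pem" is a placeholder;
   # commented out to avoid touching a real file):
   # with open("my_ca.pem") as src, open(paths.openssl_cafile, "a") as dst:
   #     dst.write("\n" + src.read())
   ```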
   
   And with that adjustment I got it to work. More or less :-) But at least 
without the initially mentioned certificate problem.
