My application needs to read and write data on a distributed in-memory file system.
The application is not a Map/Reduce or Spark job.
Let's say I need to run a Docker orchestration pipeline, where each step is
a Docker container, and I want to save the intermediate results of each step
and pass them to the next step. A container may use Java, Python, R or
C++ code to execute its logic.

Is it possible to use Python and R to read/write directly to IGFS in PRIMARY
mode or in DUAL_SYNC mode?
Technically I could use, for example, ZeroMQ to implement IPC between the
Java and Python processes and go through the IGFS native API on the Java
side, but that would introduce additional latency.

If I run IGFS as a Hadoop accelerator on top of HDFS, can I use the Python
HDFS interfaces described here?
http://wesmckinney.com/blog/python-hdfs-interfaces/
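For reference, in the Hadoop-accelerator setup IGFS is exposed to clients as a Hadoop FileSystem implementation configured in core-site.xml, roughly like the fragment below (class names and the default endpoint port are taken from the Ignite Hadoop accelerator docs; double-check them against your Ignite version). If that is right, Python clients that go through libhdfs/the Java FileSystem API might reach it, whereas clients that speak the HDFS RPC wire protocol directly presumably would not.

```xml
<!-- Sketch of core-site.xml entries for IGFS as a Hadoop file system.
     Verify class names and endpoint against your Ignite release. -->
<configuration>
  <property>
    <name>fs.igfs.impl</name>
    <value>org.apache.ignite.hadoop.fs.v1.IgniteHadoopFileSystem</value>
  </property>
  <property>
    <name>fs.AbstractFileSystem.igfs.impl</name>
    <value>org.apache.ignite.hadoop.fs.v2.IgniteHadoopFileSystem</value>
  </property>
  <!-- Then address the file system as, e.g., igfs://igfs@localhost:10500/ -->
</configuration>
```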

Thanks



--
View this message in context: 
http://apache-ignite-users.70518.x6.nabble.com/IGFS-Python-and-R-clients-tp12361.html
Sent from the Apache Ignite Users mailing list archive at Nabble.com.
