Hi

What would be the best way to use Spark with neural networks (especially RNN/LSTM)?


I think it would be possible with a combination of tools:

PySpark + Anaconda + pandas + NumPy + Keras + TensorFlow + scikit-learn


But what about scalability and usability with Spark (PySpark)?


How compatible are the data structures (for example, DataFrames) between Spark and
the other tools?

And is it possible to convert them between the different tools?
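For context on the conversion question: PySpark's DataFrame does have a built-in bridge to pandas (`toPandas()`), which collects the data to the driver, and from pandas it is a short step to the NumPy arrays that Keras expects. A minimal sketch of that last step (the `pdf` frame and `make_windows` helper here are illustrative, not from any library):

```python
import numpy as np
import pandas as pd

# Suppose `pdf` came from spark_df.toPandas() — PySpark's bridge from a
# Spark DataFrame to a pandas DataFrame. Note it collects everything to
# the driver, so it only works when the data fits in driver memory.
pdf = pd.DataFrame({"value": np.arange(10, dtype="float32")})

def make_windows(series: np.ndarray, timesteps: int):
    """Slice a 1-D series into overlapping (samples, timesteps, 1) windows
    plus next-step targets — the 3-D input shape Keras LSTM layers expect."""
    X = np.stack([series[i:i + timesteps]
                  for i in range(len(series) - timesteps)])
    y = series[timesteps:]
    return X[..., np.newaxis], y

X, y = make_windows(pdf["value"].to_numpy(), timesteps=3)
print(X.shape, y.shape)  # (7, 3, 1) (7,)
```

The catch, as the scalability question suggests, is that this pattern is single-node: Spark does the distributed ETL, but the Keras/TensorFlow training happens on one machine after `toPandas()`.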


---

Eras
