I am trying to connect to the AWS managed Elasticsearch service using the Spark ES connector, but I am not able to.
I am passing es.nodes and es.port, with es.nodes.wan.only set to true, but the job fails with the error below:

    34 ERROR NetworkClient: Node [x.x.x.x:443] failed (The server x.x.x.x failed to respond); no other nodes left - aborting...
    org.elasticsearch.hadoop.EsHadoopIllegalArgumentException: Cannot detect ES version - typically this happens if the network/Elasticsearch cluster is not accessible or when targeting a WAN/Cloud instance without the proper setting 'es.nodes.wan.only'
        at org.elasticsearch.hadoop.rest.InitializationUtils.discoverEsVersion(InitializationUtils.java:294)
        at org.elasticsearch.spark.rdd.EsSpark$.doSaveToEs(EsSpark.scala:103)
        at org.elasticsearch.spark.rdd.EsSpark$.saveToEs(EsSpark.scala:79)
        at org.elasticsearch.spark.rdd.EsSpark$.saveToEs(EsSpark.scala:76)
        ... 50 elided
    Caused by: org.elasticsearch.hadoop.rest.EsHadoopNoNodesLeftException: Connection error (check network and/or proxy settings) - all nodes failed; tried [[x.x.x.x:443]]

Has anyone here already connected to the AWS managed ES service from Spark, and if so, how can it be done?

--
Thanks,
Deepak
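P.S. For reference, here is roughly how I am setting things up, as a minimal Scala sketch. The endpoint hostname is a placeholder (the real one is redacted above), and the index name and sample document are only illustrative:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.elasticsearch.spark._ // adds saveToEs() to RDDs

object EsWriteTest {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("es-write-test")
      // Placeholder AWS ES domain endpoint
      .set("es.nodes", "my-domain.us-east-1.es.amazonaws.com")
      .set("es.port", "443")
      // Set because the cluster is behind a cloud endpoint,
      // per the hint in the exception message
      .set("es.nodes.wan.only", "true")

    val sc = new SparkContext(conf)

    // Illustrative write that triggers the failure above
    sc.makeRDD(Seq(Map("message" -> "hello")))
      .saveToEs("test/doc")
  }
}
```

This is the point at which saveToEs aborts with the "Cannot detect ES version" error shown above.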