Hi all,

I am using a PySpark program to write data into an Elasticsearch index using
the upsert operation (sample code snippet below).

def writeDataToES(final_df):
    write_options = {
        "es.nodes": elastic_host,
        "es.port": elastic_port,
        "es.net.ssl": "true",
        "es.nodes.wan.only": "true",
        "es.net.http.auth.user": elastic_user_name,
        "es.net.http.auth.pass": elastic_password,
        "es.spark.dataframe.write.null": "true",
        "es.mapping.id": mapping_id,
        "es.write.operation": "upsert",
    }
    final_df.write.format("org.elasticsearch.spark.sql") \
        .options(**write_options) \
        .mode("append") \
        .save(index_name)


While writing data from a Delta table to the Elasticsearch index, I am
getting an error for a few records (error message below):

Py4JJavaError: An error occurred while calling o1305.save.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task
4 in stage 524.0 failed 4 times, most recent failure: Lost task 4.3 in
stage 524.0 (TID 12805) (192.168.128.16 executor 1):
org.elasticsearch.hadoop.EsHadoopException: Could not write all entries for
bulk operation [1/1]. Error sample (first [5] error messages):
 org.elasticsearch.hadoop.rest.EsHadoopRemoteException:
illegal_argument_exception: Illegal group reference: group index is missing
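In case it helps narrow things down: "Illegal group reference: group index is missing" is the message Java's regex replacement (Matcher.replaceAll) throws when a replacement string contains a stray "$", so one thing I have been checking is whether the failing records contain "$" or "\" in the field used for es.mapping.id. A minimal sketch of that check (the `records` sample and the `"id"` field name here are hypothetical stand-ins for the DataFrame rows and the column referenced by `mapping_id`):

```python
# Characters that Java's Matcher.replaceAll treats specially in a
# replacement string; a bare "$" raises "Illegal group reference".
SPECIAL = set("$\\")

def suspect_ids(records, id_field):
    """Return the ids that contain "$" or "\\" and so may break
    Java regex replacement downstream."""
    return [r[id_field] for r in records
            if any(ch in SPECIAL for ch in str(r[id_field]))]

# Hypothetical sample rows standing in for the Delta table data.
sample = [{"id": "abc-1"}, {"id": "price$usd"}, {"id": "x\\y"}]
print(suspect_ids(sample, "id"))  # → ['price$usd', 'x\\y']
```

This is only a diagnostic sketch, not a fix; I am not certain this is the cause in my case.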

Could you guide me on this? Am I missing anything?

If you need any additional details, please let me know.

Thanks
