bb33chen opened a new issue, #34:
URL: https://github.com/apache/doris-streamloader/issues/34

   The operation succeeds when using Stream Load via curl:
   **curl -v -L --location-trusted -u "${USER}:${PASSW}" \
     -H "format: parquet" \
     -H "label: ${LABEL}" \
     -H "timeout: ${TIMEOUT}" \
     -H "Expect: 100-continue" \
     -T "${SOURCE_FILE}" \
     "http://${DORIS_HOST}:${DORIS_PORT}/api/${DB_NAME}/${TABLE_NAME}/_stream_load" \
     | tee -a "${LOG_FILE}"**
    
   yet an error occurs with doris-streamloader after adding the header `format:parquet`:
   **${DORIS_LOADER} \
     --source_file="${SOURCE_FILE}" \
     --url="${DORIS_URL}" \
     --header="format:parquet" \
     --db="${DB_NAME}" \
     --table="${TABLE_NAME}" \
     --u="${USER}" \
     --p="${PASSW}" \
     --workers=${WORKERS} \
     --check_utf8=false \
     2>&1 | tee -a "${LOG_FILE}"**
   
   
   as shown below:
   
   regent@dorisFEServer01:~/testdata$ bash streamload_parquet.sh 
   [2026-01-27 09:00:05] Start Doris Stream Load
   time=2026-01-27 09:00:05 level=error msg=Read file failed, error message: EOF, before retrying, we suggest:
   1.Check the input data files and fix if there is any problem.
   2.Do select count(*) to check whether data is partially loaded.
   3.If the data is partially loaded and duplication is unacceptable, consider dropping the table (with caution that all data in the table will be lost) and retry.
   4.Otherwise, just retry.
    line=reader/reader.go:139
   time=2026-01-27 09:00:05 level=error msg=5.When using a specified line delimiter, the file must end with that delimiter. line=reader/reader.go:141
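
   A possible diagnosis (not confirmed against the streamloader source; the file name `sample.parquet` below is only a placeholder): Parquet is a binary format that begins and ends with the 4-byte magic `PAR1`, so a valid Parquet file will generally not end with a line delimiter such as `\n`. A reader that splits the input on a line delimiter can then report a spurious EOF. A quick shell sketch to check both conditions on a local file:

   ```shell
   # Diagnostic sketch. "sample.parquet" is a hypothetical file name;
   # substitute your actual ${SOURCE_FILE}.
   FILE="sample.parquet"

   # Create a minimal stand-in file for illustration (magic bytes only).
   printf 'PAR1' > "$FILE"

   # A Parquet file ends with the 4-byte magic "PAR1".
   if [ "$(tail -c 4 "$FILE")" = "PAR1" ]; then
     echo "looks like a Parquet file (trailing PAR1 magic present)"
   fi

   # Command substitution strips trailing newlines, so this is empty
   # only when the last byte of the file is a newline.
   if [ -n "$(tail -c 1 "$FILE")" ]; then
     echo "file does NOT end with a newline"
   fi
   ```

   If the second check fires on your file, the loader's line-delimiter handling (error 5 above) rather than the data itself is the likely culprit.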


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]

Reply via email to