How do I import a dataset of 30k documents into Elasticsearch?
Using curl, I am importing 30k documents from a file in the bulk format described in the documentation:

curl -i -X POST localhost:9200/_bulk -H "Content-Type: application/x-ndjson" --data-binary @dataset-bulk-30k.ndjson

The request is rejected with:

HTTP/1.1 413 Request Entity Too Large
content-length: 0
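For context, a 413 from Elasticsearch usually means the request body is larger than the node's `http.max_content_length` limit, which defaults to 100mb. If the file really has to go up in one request, one option is to raise that limit in the node configuration (a sketch, assuming the default limit is the cause and that doubling it covers the file size):

```yaml
# elasticsearch.yml — raise the HTTP request body limit
# (200mb is an illustrative value, not a recommendation)
http.max_content_length: 200mb
```

Splitting the file into smaller bulk requests, as in the answer below, avoids touching cluster settings at all and keeps per-request memory pressure lower.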
I imported it like this. If anyone has a better option, I'd be glad to hear it.
split --verbose -l1000 dataset-bulk-30k.ndjson bulk.
for f in bulk.??; do echo $f; curl -i -X POST localhost:9200/_bulk -H "Content-Type: application/x-ndjson" --data-binary @$f; done
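One thing to watch with `split -l` is that each document in the bulk format is an action line followed by a source line, so the line count per chunk must stay even, otherwise a pair gets cut in half across two files and the first request after the cut fails to parse. A minimal sketch of that constraint, using a tiny made-up bulk file (`tiny-bulk.ndjson` and its contents are illustrative):

```shell
# Build a tiny bulk file of two documents (action line + source line each).
printf '%s\n' '{"index":{"_id":"1"}}' '{"field":1}' \
              '{"index":{"_id":"2"}}' '{"field":2}' > tiny-bulk.ndjson

# An even -l keeps every action/source pair inside one chunk;
# -l1000 in the loop above is safe for the same reason.
split -l 2 tiny-bulk.ndjson tiny.

# Each chunk now starts with an action line and can be POSTed to _bulk on its own.
head -1 tiny.aa   # → {"index":{"_id":"1"}}
head -1 tiny.ab   # → {"index":{"_id":"2"}}
```

With an odd chunk size (say `-l 3`), the second chunk would begin with a bare source line and Elasticsearch would reject it as malformed.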