What is the best way to import XML (8 Gb) into a postgresql table?
I need to load the FIAS files, or rather the new GAR ( https://fias.nalog.ru/Updates ), into a PostgreSQL database.
The peculiarity of these files is the large XML size (up to 7 GB).
Loading is currently implemented for MS SQL via sqlxmlbulkload; 90 GB loads in about 4.5 hours.
Everything there is quite simple: there is an XSD with additional annotations, you point the tool at this XSD plus the XML, and that's it.
As part of import substitution we need to switch to PG. I started studying the question and found nothing similar, only the option where the XML is read into memory or into a table and then processed with xpath.
MS SQL has a similar approach, but there it failed on large files, and I expect PG would have the same problem.
If anyone has experience with PG and large XML, I would be grateful for any information.
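For reference, a minimal sketch of the streaming alternative, assuming a hypothetical table gar_addr_obj(objectid, objectguid, name) and an element tag OBJECT with attributes OBJECTID, OBJECTGUID, NAME (the real names come from the GAR XSD): parse the XML incrementally with lxml's iterparse and feed rows to PostgreSQL via COPY, so the whole file never has to fit in memory.

```python
# Sketch: stream-parse a large GAR XML file and load it into PostgreSQL via COPY.
# Table name, columns and element/attribute names are placeholders for illustration.
import io

import psycopg2
from lxml import etree

BATCH = 50_000  # rows per COPY round-trip


def rows(xml_path):
    """Yield tuples from <OBJECT .../> elements without loading the whole file."""
    for _, elem in etree.iterparse(xml_path, events=("end",), tag="OBJECT"):
        name = (elem.get("NAME") or "")
        # crude escaping for COPY text format (tabs/backslashes/newlines)
        name = name.replace("\\", "\\\\").replace("\t", " ").replace("\n", " ")
        yield (elem.get("OBJECTID"), elem.get("OBJECTGUID"), name)
        elem.clear()                       # free the processed element
        while elem.getprevious() is not None:
            del elem.getparent()[0]        # drop already-processed siblings


def load(xml_path, dsn):
    conn = psycopg2.connect(dsn)
    cur = conn.cursor()
    buf, n = io.StringIO(), 0
    copy_sql = "COPY gar_addr_obj (objectid, objectguid, name) FROM STDIN"
    for objectid, objectguid, name in rows(xml_path):
        buf.write(f"{objectid}\t{objectguid}\t{name}\n")
        n += 1
        if n % BATCH == 0:                 # flush a batch to the server
            buf.seek(0)
            cur.copy_expert(copy_sql, buf)
            buf = io.StringIO()
    buf.seek(0)
    cur.copy_expert(copy_sql, buf)         # flush the tail
    conn.commit()
    conn.close()


if __name__ == "__main__":
    load("AS_ADDR_OBJ.XML", "dbname=gar user=postgres")
```

COPY is generally the fastest bulk-insert path in PostgreSQL, and iterparse keeps memory usage flat regardless of file size; the slow part then tends to be the XML parsing itself rather than the inserts.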
Have a look at my git. This project is there; implement a plugin for PostgreSQL.