What should I choose for parsing large XML?
I have the following task: write daemons that process a large number of large XML files. Processing has to run continuously so the data stays up to date.
What would you suggest for such a task?
The project is in PHP, and we are moving to a microservices architecture, so the choice is between phpdaemon and Golang.
Also, please recommend a data store that can take a constant, heavy stream of writes. Say the handlers run in 10 threads and process several thousand objects per second; I need storage that can sustain about 20k writes per second without degrading.
Thanks
Tarantool
habrahabr.ru/company/mailru/blog/273695
habrahabr.ru/company/mailru/blog/273695/#comment_8...
phpdaemon is already dead; I'd recommend amphp instead. But if you have enough experience with Go, Go is the better choice.
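For illustration, a minimal sketch of how you could stream-parse a large file in Go with the standard encoding/xml decoder, so the whole document is never held in memory. The Item struct, the "item" element name, and the file name are assumptions standing in for your real schema:

package main

import (
	"encoding/xml"
	"fmt"
	"os"
)

// Item is a placeholder for whatever element your XML actually contains.
type Item struct {
	ID   string `xml:"id,attr"`
	Name string `xml:"name"`
}

func main() {
	f, err := os.Open("large.xml") // hypothetical input file
	if err != nil {
		panic(err)
	}
	defer f.Close()

	dec := xml.NewDecoder(f)
	for {
		tok, err := dec.Token()
		if err != nil {
			break // io.EOF when the file is done, or a parse error
		}
		// Decode each <item> element as it streams past instead of
		// unmarshalling the entire document at once.
		if se, ok := tok.(xml.StartElement); ok && se.Name.Local == "item" {
			var it Item
			if err := dec.DecodeElement(&it, &se); err != nil {
				continue // skip malformed elements
			}
			fmt.Println(it.ID, it.Name)
		}
	}
}

You can run one such decoder per goroutine, one per file, to get the 10 parallel handlers the question mentions.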
As for storage: you can write to memory first (Redis, or a message queue) and then aggregate into something like CouchDB. Or write straight to CouchDB/MongoDB, and if that isn't fast enough, put a write buffer in front.
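A hedged sketch of that buffering idea in Go, assuming the github.com/redis/go-redis/v9 client; the key name, batch size, and payload are made-up placeholders. Batching pushes through a pipeline keeps the round-trip count far below one command per object, which matters at ~20k writes/sec:

package main

import (
	"context"
	"log"

	"github.com/redis/go-redis/v9" // assumed client library
)

func main() {
	ctx := context.Background()
	rdb := redis.NewClient(&redis.Options{Addr: "localhost:6379"})

	// Accumulate parsed objects and flush them in batches via a pipeline.
	buf := make([]string, 0, 1000)
	flush := func() {
		if len(buf) == 0 {
			return
		}
		pipe := rdb.Pipeline()
		for _, item := range buf {
			pipe.RPush(ctx, "parsed:objects", item) // hypothetical key
		}
		if _, err := pipe.Exec(ctx); err != nil {
			log.Println("flush failed:", err)
		}
		buf = buf[:0]
	}

	// Stand-in for the parser's output stream.
	for i := 0; i < 5000; i++ {
		buf = append(buf, "serialized-object")
		if len(buf) == cap(buf) {
			flush()
		}
	}
	flush() // drain whatever is left
}

A separate consumer can then drain the Redis list and aggregate into CouchDB/MongoDB at whatever rate the document store tolerates.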