PHP
nlan, 2015-12-22 19:06:43

What to choose for parsing large XML?

There is the following task.
Write daemons to process a large number of large XML files. They will run continuously so the data is constantly kept up to date.
What would you suggest for such a task?
The project is in PHP and we are moving to a microservices architecture, so the choice is between phpdaemon and Go.
Also, please suggest a data store that can continuously ingest a large volume of data.
Say the handlers run in 10 threads and process several thousand objects per second. We need storage that can withstand 20k writes per second without degrading.
Thanks

2 answers
abcyu, 2015-12-22
@nlan

Tarantool
habrahabr.ru/company/mailru/blog/273695
habrahabr.ru/company/mailru/blog/273695/#comment_8...

Sergey Protko @Fesor, 2015-12-22

phpdaemon is already dead. I recommend you take amphp. But if you have enough experience with Go, Go is better.
As for storage: you can write to memory (Redis, or a queue) and then aggregate into something like CouchDB. Or write directly to CouchDB/MongoDB, and if performance is not enough, write through a buffer.
