Python
Pavel Osipov, 2011-08-27 18:26:59

Python web project structure for working with graphs?

Good day
I'm thinking of implementing a small research project in Python. The idea is this: a remote client sends a request with an id to the server; a Python script loads a relatively large graph by that id, does something with it, returns some metric, closes the connection, then updates the graph and saves it.
So I'm asking knowledgeable people for advice on the architecture. The main requirement is minimal latency with 100 graphs being serviced concurrently.
And in general, maybe it makes no sense to implement this in Python at all?
I plan to use Apache + mod_wsgi, Redis as a convenient NoSQL store, and pickle to serialize the graph into the database in binary form.
The Python script also needs to run as a daemon.
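Roughly, I picture the storage part like this (just a rough sketch; the save_graph/load_graph names and the 'graph:' key prefix are mine, only for illustration):

    import pickle
    import redis

    r = redis.Redis(host='localhost', port=6379, db=0)

    def save_graph(graph_id, graph):
        # serialize the graph object to bytes and store it under its id
        r.set('graph:%s' % graph_id, pickle.dumps(graph, pickle.HIGHEST_PROTOCOL))

    def load_graph(graph_id):
        # fetch and deserialize the graph, or None if the id is unknown
        data = r.get('graph:%s' % graph_id)
        return pickle.loads(data) if data is not None else None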
Share your thoughts, is this the right approach?
Thanks in advance!

2 answer(s)
Fedor Indutny, 2011-08-27
@donnerjack13589

It might also be worth looking at neo4j.org/. I don't need graph databases in my own work, but people seem to recommend it.
Personally, I would look towards node.js + cluster (nodejs.org/ + github.com/learnboost/cluster), but that's a matter of taste. There is a neo4j module for node.js: search.npmjs.org/#/neo4j.

homm, 2011-08-27
@homm

> The main requirement is minimal latency with 100 graphs being serviced concurrently.
As a framework, take Tornado. Apache with mod_wsgi can be thrown out.
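Something along these lines (a minimal sketch; load_graph, compute_metric and save_graph are placeholders for your own logic from the question):

    import tornado.ioloop
    import tornado.web

    # Placeholders for the actual graph logic.
    def load_graph(graph_id):
        return {}            # e.g. deserialize from Redis

    def compute_metric(graph):
        return len(graph)    # whatever metric the task needs

    def save_graph(graph_id, graph):
        pass                 # e.g. serialize back into Redis

    class MetricHandler(tornado.web.RequestHandler):
        def get(self, graph_id):
            graph = load_graph(graph_id)
            self.write({'id': graph_id, 'metric': compute_metric(graph)})
            save_graph(graph_id, graph)

    application = tornado.web.Application([
        (r'/graph/(\w+)/metric', MetricHandler),
    ])

    if __name__ == '__main__':
        application.listen(8888)
        tornado.ioloop.IOLoop.instance().start()

A single Tornado process serves many connections from one event loop, so 100 concurrent requests are not a problem as long as the per-request work on the graph itself stays short.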
> pickle to serialize the graph into the database in binary form.
marshal is up to 3 times faster.
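You can measure it on your own data; a quick comparison might look like this (assuming the graph consists only of plain built-in types such as dicts and lists, which is all marshal can handle, while pickle also works for arbitrary classes):

    import marshal
    import pickle
    import timeit

    # a toy adjacency-dict graph, just for the measurement
    graph = dict((i, list(range(i % 50))) for i in range(10000))

    t_pickle = timeit.timeit(lambda: pickle.dumps(graph, pickle.HIGHEST_PROTOCOL), number=100)
    t_marshal = timeit.timeit(lambda: marshal.dumps(graph), number=100)
    print('pickle: %.3fs  marshal: %.3fs' % (t_pickle, t_marshal))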
