NetyNicka, 2014-10-29 00:16:08
PHP

How to implement a social media aggregator?

Hello, colleagues! I've received an order to build an efficient social network aggregator.
The idea:
Each user can create several news feeds, each of which pulls from various social networks (FB, TW, G+, YT).
A typical user ends up with 5 feeds, each with 5 connected social accounts (all posts are fetched initially, then updated incrementally by date).
Users can view their own feeds as well as other users' feeds.
So far I see two options:
1) Crawl posts from all sources, save them to a database, and serve users from there.
2) Have the parser act as a proxy, fetching content at request time.
The problem with the first option is that it eats a lot of memory and resources (100 users × 5 feeds × 5 accounts is already 2,500 requests per update cycle and a sizable database).
The second suffers from latency, awkward sorting across sources, and so on.
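Whichever option wins, the core task is the same: normalizing posts from the different networks into one structure and merging them by date. A minimal PHP sketch (the field names here are my own assumption, not a fixed schema):

```php
<?php
// Merge posts pulled from several social accounts into one feed,
// newest first. Each post is assumed to be normalized into an array
// with at least a 'date' (unix timestamp) and a 'text' key.
function mergeFeeds(array $sources)
{
    $all = array();
    foreach ($sources as $posts) {
        $all = array_merge($all, $posts);
    }
    usort($all, function ($a, $b) { return $b['date'] - $a['date']; });
    return $all;
}

$fb = array(array('date' => 1414500000, 'text' => 'FB post'));
$tw = array(
    array('date' => 1414560000, 'text' => 'fresh TW post'),
    array('date' => 1414400000, 'text' => 'older TW post'),
);

$feed = mergeFeeds(array($fb, $tw));
// $feed[0]['text'] is 'fresh TW post' — the newest across both sources
```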
I'd like to hear your opinions and ideas for implementing this.


2 answers
Alexander Aksentiev, 2014-10-29
@NetyNicka

It depends on the requirements beyond just "social network aggregator".
A proxy is cheaper in terms of resources.
Speed shouldn't be a problem with the big networks. Take the VK API, for example:
average response time 48 ms, uptime 100%.
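For illustration, reading a VK wall with wall.get comes down to one HTTP GET plus JSON decoding. The sketch below builds the URL but parses a hard-coded sample payload in the v5 response shape instead of making the live call; the owner_id, API version, and whether an access token is needed depend on the method and privacy settings, so check the VK API docs:

```php
<?php
// Building the request URL for VK's wall.get (not executed here):
$url = 'https://api.vk.com/method/wall.get?' . http_build_query(array(
    'owner_id' => 1,       // whose wall to read (example value)
    'count'    => 10,
    'v'        => '5.25',  // API version, an assumption for 2014
));
// $json = file_get_contents($url);  // the live call would go here

// Sample payload mimicking the v5 response shape (response.items[]):
$json = '{"response":{"count":1,"items":[{"id":7,"date":1414560000,"text":"hello"}]}}';

$data  = json_decode($json, true);
$posts = array();
foreach ($data['response']['items'] as $item) {
    $posts[] = array('date' => $item['date'], 'text' => $item['text']);
}
// $posts[0]['text'] is 'hello'
```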
If you need any reasonably complex sorting, you'll want the first option. Likewise if you plan to notify users about particular events.
Why are you worried about the number of requests and the storage? You won't be storing media: it's enough to keep the text and links to any attachments.
Again, it all depends on the requirements and functionality.
There are further options, too. Delete old data after some period N, and re-download it on the fly if it's needed again.
Or something in between options 1 and 2: store only meta information, a list of post IDs, and fetch the posts themselves on the fly when a user requests the feed.
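That hybrid variant can be sketched as follows: keep a lightweight index of (network, post ID, date) tuples, and let the request handler resolve the post bodies from the networks' APIs or a short-lived cache. The class and field names below are illustrative, not a finished design:

```php
<?php
// Hybrid of options 1 and 2: persist only post metadata, fetch
// full bodies on demand. In production the index would live in a
// database; an in-memory array stands in for it here.
class FeedIndex
{
    private $index = array();

    public function remember($network, $postId, $date)
    {
        $this->index[] = array(
            'network' => $network,
            'id'      => $postId,
            'date'    => $date,
        );
    }

    // Return references to the N newest posts; the caller then pulls
    // the bodies from the networks (or a cache) by network + id.
    public function newest($limit)
    {
        $copy = $this->index;
        usort($copy, function ($a, $b) { return $b['date'] - $a['date']; });
        return array_slice($copy, 0, $limit);
    }
}

$idx = new FeedIndex();
$idx->remember('fb', 'fb_101', 1414500000);
$idx->remember('tw', '5150',   1414560000);

$page = $idx->newest(1);
// $page[0] refers to the tw post, the newest one in the index
```

The point of the split is that the index stays tiny (an ID and a timestamp per post), while sorting and pagination still work server-side without storing full content.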

frees2, 2014-10-29
@frees2

I have experience with aggregation. Those charming admins at the social networks, in particular Facebook, Google search, G+ (RSS) and so on, constantly change their "headers" (markup), so don't count on a scraper working permanently. It will cost you your nerves.
