PHP
Sergey, 2015-06-25 10:44:26

What is the best way to design a database for a dating site?

There is a task described here: yiiframework.ru/forum/viewtopic.php?t=10771. The solution proposed in that topic is a key -> value scheme. How good is that solution? I don't think it will work very well: as soon as search and filtering involve two fields you need indexes on both of them, and since the table will be huge due to aggressive denormalization, inserts will take a very long time.
What do you think?
upd. The link is not my question; it is just an example of what needs to be done and a description of the problem, along with the key -> value solution. My own idea is to create one fairly wide table, e.g. profile (whom I am looking for, age, purpose of dating, marital status, etc.), plus a separate table for the multi-select items (there are only a few of them).
upd2. The complication is that almost every questionnaire item is not just a choice from the predefined options but also lets the user enter a value of their own.
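
To make the idea concrete, here is a rough sketch of the schema I have in mind (MySQL via PDO assumed; all table and column names are just placeholders, not a final design):

```php
<?php
// Placeholder credentials; adjust for the real environment.
$pdo = new PDO('mysql:host=localhost;dbname=dating;charset=utf8mb4', 'user', 'secret');

// One wide profile table: every single-choice questionnaire item is a
// column, so filtering is a plain WHERE over indexable columns.
$pdo->exec("
    CREATE TABLE profile (
        user_id        INT UNSIGNED PRIMARY KEY,
        looking_for    TINYINT UNSIGNED NOT NULL,   -- whom I am looking for
        age            TINYINT UNSIGNED NOT NULL,
        purpose        TINYINT UNSIGNED NOT NULL,   -- purpose of dating
        marital_status TINYINT UNSIGNED NOT NULL,
        -- upd2: items that allow a free-form answer get a companion
        -- text column; NULL means a predefined option was chosen
        purpose_other  VARCHAR(255) NULL,
        KEY idx_search (looking_for, age, purpose)
    ) ENGINE=InnoDB
");

// Multi-select items (there are few of them) go to a separate link table.
$pdo->exec("
    CREATE TABLE profile_interest (
        user_id     INT UNSIGNED NOT NULL,
        interest_id SMALLINT UNSIGNED NOT NULL,
        PRIMARY KEY (user_id, interest_id)
    ) ENGINE=InnoDB
");
```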


4 answers
Roman Mirilaczvili, 2015-06-25
@2ord

See the Introduction to Graph Databases slides.
I hope this clarifies how best to design the database for this kind of task.

Artem Voronov, 2015-06-25
@newross

Before drawing conclusions, you need to determine the size of the audience. What if it is only 5000-10000 people? What is there to slow down at that scale?
As for denormalization: it is one of the basic highload techniques.
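
For a sense of scale, a minimal sketch of the kind of filter query in question (assuming the wide profile table and composite index sketched in the question; at 5000-10000 rows this is instant, and even on millions of rows it is an index range scan):

```php
<?php
// Placeholder connection; same schema as sketched in the question.
$pdo = new PDO('mysql:host=localhost;dbname=dating;charset=utf8mb4', 'user', 'secret');

// Two-field filter: the composite index (looking_for, age, ...) covers
// equality on the first column plus a range on the second, so a single
// index serves both conditions -- no need for one index per field.
$stmt = $pdo->prepare(
    'SELECT user_id
       FROM profile
      WHERE looking_for = :looking_for
        AND age BETWEEN :age_min AND :age_max
      LIMIT 50'
);
$stmt->execute(['looking_for' => 1, 'age_min' => 25, 'age_max' => 35]);
$matches = $stmt->fetchAll(PDO::FETCH_COLUMN);
```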

KonstV, 2015-06-25
@KonstV

In any highload project, in my opinion, you start by building on a plain relational DB without overcomplicating things. Then you optimize according to the situation.

Djadka Maxho, 2015-06-25
@Djadka

It all depends on the amount and volume of data. If you have 10k users, the data will amount to a couple of gigabytes, and you shouldn't even think about denormalization: you can use joins and tune MySQL for them, and it performs briskly with the right settings (a sketch follows below). If some table grows beyond several gigabytes, then it is time to think about denormalization. From personal experience: don't waste time on premature optimization, because it may never be needed, and bottlenecks appear where you least expect them.
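
A sketch of what that join approach looks like (assuming the profile and profile_interest tables sketched in the question; the main MySQL setting to tune here is innodb_buffer_pool_size, so the working set stays in RAM):

```php
<?php
// Placeholder connection; tables as sketched in the question.
$pdo = new PDO('mysql:host=localhost;dbname=dating;charset=utf8mb4', 'user', 'secret');

// Normalized join: filter by profile columns and by one multi-select
// interest in a single query. At a couple of gigabytes of data this
// performs well as long as the join columns are indexed.
$stmt = $pdo->prepare(
    'SELECT p.user_id
       FROM profile p
       JOIN profile_interest pi ON pi.user_id = p.user_id
      WHERE p.looking_for = :looking_for
        AND p.age BETWEEN :age_min AND :age_max
        AND pi.interest_id = :interest
      LIMIT 50'
);
$stmt->execute([
    'looking_for' => 1,
    'age_min'     => 25,
    'age_max'     => 35,
    'interest'    => 7,
]);
$matches = $stmt->fetchAll(PDO::FETCH_COLUMN);
```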
