Database
Evgeny Yerko, 2014-11-06 17:09:11

Key/value storage for large volumes, usable from PHP?

Hello!
Please share your experience and tell me what you use.
The situation is as follows:
- server resources are limited
- there is a MySQL database (about 300 thousand records) with three columns (BIGINT(20), INT(11) and TEXT), InnoDB
- the database is large (for me), about 6GB, and it keeps getting bigger
So I am looking for a key/value store that can handle a large volume, with a simple, clear driver (set, get, delete, update), so I can move the "text" column into it; that column is where all the main data actually lives, stored with serialize().
- I tried Redis, but then realized it keeps the whole dataset in RAM, which is not quite what I need.
- I have no experience with MongoDB and found no guidance on whether it will handle my database.
P.S. The database grows every day.
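For illustration, here is a minimal sketch of the set/get/delete/update interface asked for above, using Python's stdlib `dbm` module as a stand-in for a disk-backed key/value store (the question is about PHP, where drivers for stores like LevelDB expose the same four operations; `pickle` stands in here for PHP's serialize()/unserialize()):

```python
import dbm
import os
import pickle    # stands in for PHP's serialize()/unserialize()
import tempfile

path = os.path.join(tempfile.mkdtemp(), "records.db")
with dbm.open(path, "c") as db:                  # "c" = create if missing
    key = str(12345).encode()                    # bigint id as a bytes key
    db[key] = pickle.dumps({"name": "example"})  # set (an update is the same call)
    value = pickle.loads(db[key])                # get
    del db[key]                                  # delete
    assert key not in db

print(value)
```

Unlike Redis, a store like this keeps the data on disk and only touches RAM for the pages it reads, which matches the constraint in the question.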



5 answers
Evgeniy Yerko, 2014-11-07
@OsBen

Thanks, everyone.
My own experience is with building IMs, where the database is usually up to 300MB (in my experience).
Also, I work through phpMyAdmin, and at this size it already starts to lag (I know, I know, I should use the console); I'm used to Windows.

xmoonlight, 2014-11-06
@xmoonlight

Your database is nothing in terms of record count.
Design a proper structure and proper queries, and MySQL will handle it without trouble.
See also: Apache Cassandra review and installation (in Russian)
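One concrete reading of "a proper structure" (my illustration, not the answerer's words) is to keep the heavy TEXT blob in a side table so that the table you filter and index on stays narrow. A sketch with Python's sqlite3 standing in for MySQL; the table and column names are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Narrow "hot" table holds only the columns you filter on;
# the heavy serialized blob lives in a side table, fetched by PK on demand.
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, flag INTEGER)")
conn.execute("CREATE TABLE item_blobs (id INTEGER PRIMARY KEY, body TEXT)")

conn.execute("INSERT INTO items VALUES (1, 1)")
conn.execute("INSERT INTO item_blobs VALUES (1, 'serialized payload')")

# Filtering scans only the narrow table; the join pulls the blob by PK.
row = conn.execute(
    "SELECT b.body FROM items i JOIN item_blobs b ON b.id = i.id WHERE i.flag = 1"
).fetchone()
print(row[0])
```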

Boris Benkovsky, 2014-11-06
@benbor

Almost all of the space in your database is taken up by the text itself, so whichever database you choose, you will still end up with your 6GB (give or take). So there is not much advice to give: 300 thousand records is trivial, far too few for query performance to be the bottleneck. You will simply have to buy more disk space.
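One lever that shrinks the payload regardless of which database ends up holding it is compressing the serialized text before writing. A hedged sketch, with Python's zlib standing in for PHP's gzcompress()/gzuncompress(); the sample string below is invented for illustration:

```python
import zlib

# A repetitive serialize()-style payload; real text usually compresses
# less dramatically, but TEXT columns are often quite redundant.
serialized = b'a:2:{s:4:"name";s:7:"example";s:4:"tags";a:1:{i:0;s:3:"php";}}' * 50
compressed = zlib.compress(serialized, level=6)

print(len(serialized), len(compressed))
restored = zlib.decompress(compressed)
assert restored == serialized                  # lossless round trip
```

The trade-off is CPU on every read and write, and losing the ability to search inside the column with SQL.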

Puma Thailand, 2014-11-07
@opium

300k records is the home database of a high-school student who decided to index his photos.
6GB even fits in RAM on an ordinary computer.
Even a 600GB database is nothing;
start thinking about this when you have 10 terabytes of data.

Ninazu, 2017-06-23
@Ninazu

The task is similar, so I did not create a new topic. My situation is slightly different, but the essence is the same.
The database currently holds 200M+ records in MySQL; the field type is VARCHAR(50). The table consists of a single PK column; I just need to store all unique values.
It now weighs 12GB of data plus 46GB of indexes.
The server has only 512MB of RAM and a 32GB disk, so for now the database lives on my local machine with a forwarded port, since it does not fit on the host. I need a less resource-hungry solution, perhaps some kind of key/value store, or better still one with no values at all, keys only.
It must support querying the database in batches. Right now I look records up with
IN (?, ?, ...)
and insert them with
INSERT IGNORE
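The batch pattern described above can be sketched as follows, with Python's sqlite3 standing in for MySQL (sqlite spells the statement INSERT OR IGNORE, but the shape of the batched IN (?,?,...) lookup is the same):

```python
import sqlite3

# A one-column PK table used as a set of unique values, filled with
# INSERT ... IGNORE and probed in batches with IN (?,?,...).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE seen (val VARCHAR(50) PRIMARY KEY)")

batch = ["a", "b", "c", "a"]                     # the duplicate "a" is dropped by the PK
conn.executemany("INSERT OR IGNORE INTO seen (val) VALUES (?)",
                 [(v,) for v in batch])

probe = ["a", "x", "c"]
placeholders = ",".join("?" * len(probe))        # builds "?,?,?"
rows = conn.execute(
    f"SELECT val FROM seen WHERE val IN ({placeholders})", probe
).fetchall()
found = {r[0] for r in rows}
print(found)                                     # "a" and "c" are present, "x" is not
```

For a keys-only workload like this, the value part of a key/value store can simply be left empty; the key's existence is the datum.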
