big data
Oleg Tarakanov, 2015-03-28 22:48:18

How do you deal with big data in MS SQL Server?

Hello!
Our company develops regional solutions for the education system. We work on ASP.NET + MS SQL Server.
The databases of our largest customers are rapidly approaching 500 GB; in a year or two we expect to run into databases of 5-10 TB.
The "just rewrite it for Oracle" kind of solution is not an option: the system is genuinely serious and in heavy demand by its end users, school staff.
Raising the server requirements has worked so far, but asking for servers with 1 TB of RAM and 8 CPUs looks wild even for large regions (Krasnodar Territory, Moscow Region).
We are looking for a solution that a sysadmin can implement, with minimal developer involvement.
Googling hasn't turned up anything useful so far. I've heard that you can split the database and place individual tables on other hosts, but I couldn't find any links to back that up.
I would appreciate any advice or questions.
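
As a note on the "place individual tables on other hosts" idea: SQL Server can do this with linked servers and a distributed partitioned view, where each host holds one slice of a table and a view unions the slices back together. A minimal sketch, with hypothetical server, database, and table names:

```sql
-- Hypothetical names: local member table dbo.Grades_2014, a second host
-- reachable through a linked server [SRV2] with database [EduDb].
-- The CHECK constraint on the partitioning column is what lets the
-- optimizer skip remote branches that cannot match a query.
CREATE TABLE dbo.Grades_2014 (
    GradeId    BIGINT   NOT NULL,
    SchoolYear SMALLINT NOT NULL CHECK (SchoolYear = 2014),
    StudentId  INT      NOT NULL,
    Value      TINYINT  NOT NULL,
    CONSTRAINT PK_Grades_2014 PRIMARY KEY (SchoolYear, GradeId)
);

-- The application keeps querying a single name; the view stitches the
-- local and remote member tables back together.
CREATE VIEW dbo.Grades AS
    SELECT GradeId, SchoolYear, StudentId, Value FROM dbo.Grades_2014
    UNION ALL
    SELECT GradeId, SchoolYear, StudentId, Value FROM [SRV2].[EduDb].dbo.Grades_2015;
```

For the view to stay updatable and prunable, the partitioning column has to carry the CHECK constraints and be part of the primary key on every member table.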

3 answers
Artyom Karetnikov, 2015-03-31
@druoleg

Show us the structure of your largest tables and let's take a look. At the savings bank I worked with a database of more than 2 terabytes, and we really do have a lot of users and a lot of transactions. A terabyte of RAM was not needed. :)

Cool Admin, 2015-03-28
@ifaustrue

Why so much RAM? What is the size of the "hot" data, and what does the load pattern look like? Why is the database so huge? Is it storing binary data, by any chance? Can it be split into several smaller databases?
These and similar questions are your directions for optimization. And something tells me that good tiered storage plus moving the binaries out to a separate place will solve your database problems.
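
If "moving the binaries out" turns out to be the answer, one built-in option is a FILESTREAM filegroup, so that document blobs live on the file system rather than inside the data files. A rough sketch with hypothetical database, path, and table names, assuming FILESTREAM is already enabled on the instance:

```sql
-- Add a FILESTREAM filegroup and point it at a directory on a separate volume
-- (the parent folder D:\FSData must exist; the container folder must not).
ALTER DATABASE EduDb ADD FILEGROUP DocumentsFS CONTAINS FILESTREAM;
ALTER DATABASE EduDb ADD FILE
    (NAME = 'EduDb_Documents', FILENAME = 'D:\FSData\EduDb_Documents')
    TO FILEGROUP DocumentsFS;

-- A FILESTREAM table needs a ROWGUIDCOL column with a UNIQUE constraint;
-- the VARBINARY(MAX) column contents are stored as files in the container.
CREATE TABLE dbo.StudentDocuments (
    DocumentId UNIQUEIDENTIFIER ROWGUIDCOL NOT NULL UNIQUE DEFAULT NEWID(),
    StudentId  INT NOT NULL,
    Content    VARBINARY(MAX) FILESTREAM NULL
) FILESTREAM_ON DocumentsFS;
```

The blobs then stop inflating the data files and the buffer pool, and filegroup-level backups can treat the document store separately from the hot relational part.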

Ilya T., 2015-03-28
@Insaned

Nothing is clear from your post; it raises more questions than it answers. Why a terabyte of RAM? In what way would Oracle actually be better for you? What is the load profile? What is the data structure? If the load is predominantly reads, scaling horizontally is no problem. If the load is mixed, you need to look at optimization. In general, grab a proper DBA by the ears and set him loose on your database.
5-10 TB these days is not a big database at all.
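
On the "optimization" side, a common first step for tables of this size is partitioning by date, so that old school years sit on cheaper storage and can be compressed or switched out without touching the current year. A sketch with hypothetical filegroup and table names (the filegroups must already exist in the database):

```sql
-- Four partitions: everything before 2013, then 2013, 2014, and 2015 onward.
CREATE PARTITION FUNCTION pfSchoolYear (SMALLINT)
    AS RANGE RIGHT FOR VALUES (2013, 2014, 2015);

-- Map old years to an archive filegroup and recent years to faster storage.
CREATE PARTITION SCHEME psSchoolYear
    AS PARTITION pfSchoolYear
    TO (FG_Archive, FG_Archive, FG_2014, FG_2015);

-- The partitioning column must be part of the clustered key.
CREATE TABLE dbo.AttendanceLog (
    LogId      BIGINT    NOT NULL,
    SchoolYear SMALLINT  NOT NULL,
    StudentId  INT       NOT NULL,
    LoggedAt   DATETIME2 NOT NULL,
    CONSTRAINT PK_AttendanceLog PRIMARY KEY (SchoolYear, LogId)
) ON psSchoolYear (SchoolYear);
```

Old partitions can then be rebuilt with page compression or switched out into an archive table, which is exactly the kind of work a DBA can do without touching the application code.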
