Database
tarasverq, 2015-02-18 13:34:37

How to work with a large database using C# + MS SQL?

Good day!
There is a database into which a large amount of data needs to be inserted — roughly 5 million records per day. Before adding a record, we first check whether it already exists in the database; if it does, we update the existing record instead.
At first I used the visual dataset designer in Visual Studio: generated the tables and found code examples on the Internet.
It was like this:

usersTableAdapter usersTA = new usersTableAdapter(); // typed TableAdapter generated by the dataset designer
var usersTable = new users.usersDataTable();
usersTA.Fill(usersTable); // loads the ENTIRE users table into memory

Accordingly, the entire table was held in RAM and then updated.
Once 10 million records had been inserted, program startup at this stage became very slow, and both RAM and the page file filled up. Processing was slow as well: about a minute per 5,000 records.
After that I decided this mess was worth rewriting as a stored procedure. I added the stored procedure in the same dataset designer and removed all use of usersTable from the code, keeping only the TableAdapter. RAM usage stopped growing, but processing time actually increased, to a minute and forty seconds per 5,000 records.
Current stored procedure code:
IF NOT EXISTS (SELECT * FROM [dbo].[users] WHERE [id] = @id)
BEGIN
    INSERT INTO [dbo].[users]
           (...)
    VALUES
           (...);
    SET @result = 1;
END
ELSE
BEGIN
    UPDATE [dbo].[users]
       SET ...
     WHERE [id] = @id;
    SET @result = 0;
END

My question: how can I make this processing as fast as possible? Which components are best to use? Should I keep checking the database for existing records, or should it be done some other way?


2 answer(s)
Ivan Filatov, 2015-02-18
@tarasverq

How are you passing the 5,000 records to the stored procedure — one at a time? If so, the time will of course grow.
Two options:
1. Convert the whole batch (all 5,000 records) to XML and pass the XML to the stored procedure, which parses it and adds/updates the data.
2. Bulk-load all the records (INSERT) into a temporary table in the database, then run a stored procedure that performs the add/update logic against the main table by comparing the two tables — the main one and the temporary one.
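Option 2 above can be sketched in C# with SqlBulkCopy: load the batch into a staging table in one streamed operation, then let the server do the insert-or-update in a single MERGE. This is only a sketch under assumptions — the staging table name, the `id`/`name` column list, and the connection string are illustrative; substitute the real schema from the question.

```csharp
using System.Data;
using System.Data.SqlClient;

static class BatchUpserter
{
    // Bulk-load one batch into a session-local temp table, then upsert
    // it into dbo.users server-side instead of making 5,000 round trips.
    public static void UpsertBatch(DataTable batch, string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // 1. Staging table with the same schema as dbo.users
            //    (#-prefixed, so it lives only for this session).
            using (var create = new SqlCommand(
                "IF OBJECT_ID('tempdb..#users_stage') IS NOT NULL DROP TABLE #users_stage; " +
                "SELECT TOP 0 * INTO #users_stage FROM dbo.users;", conn))
            {
                create.ExecuteNonQuery();
            }

            // 2. SqlBulkCopy pushes the whole batch in one operation.
            using (var bulk = new SqlBulkCopy(conn))
            {
                bulk.DestinationTableName = "#users_stage";
                bulk.WriteToServer(batch);
            }

            // 3. A single MERGE replaces the per-row EXISTS check.
            //    Column names here are placeholders for the real column list.
            using (var merge = new SqlCommand(
                "MERGE dbo.users AS t " +
                "USING #users_stage AS s ON t.id = s.id " +
                "WHEN MATCHED THEN UPDATE SET t.name = s.name " +
                "WHEN NOT MATCHED THEN INSERT (id, name) VALUES (s.id, s.name);", conn))
            {
                merge.ExecuteNonQuery();
            }
        }
    }
}
```

Note that the temp table and the bulk copy must share the same open connection, since a `#` temp table is visible only to the session that created it.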

Alexey, 2015-02-18
@HaJIuBauKa

Why pull all the data into memory?
First fetch the list of unique record keys (or the set of fields that make a record unique), then fetch and process the records one by one.
Is it data synchronization you are trying to perform?
That is, do you just need to transfer the data keyed by ID?
If so, then you only need to remember the ID where the last load ended and continue from there.
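The "remember the last ID" idea can be sketched like this, assuming an integer [id] column on the [dbo].[users] table from the question; everything else (method and variable names) is illustrative:

```csharp
using System;
using System.Data.SqlClient;

static class SyncResume
{
    // Returns the highest id already loaded, so the next run can
    // continue from there instead of re-checking every record.
    public static long GetLastProcessedId(SqlConnection conn)
    {
        using (var cmd = new SqlCommand(
            "SELECT ISNULL(MAX([id]), 0) FROM [dbo].[users]", conn))
        {
            return Convert.ToInt64(cmd.ExecuteScalar());
        }
    }
}
```

The caller would then request only source records with an ID greater than this value.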
