How to successfully parse a huge .csv file?
There is a task: take a CSV file, parse it, and write the data to a database (I use Entity Framework). The target table has 617 columns with different data types (int, real, varchar, datetime). Question: what is the best way to do this?
If there is a lot of data and you just need to insert it (plain INSERT INTO), forget about Entity Framework and use SqlBulkCopy.
You should also think about reducing the number of columns; 617 is too many :-)
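A minimal sketch of the SqlBulkCopy approach, assuming a semicolon-separated file with a header row; the connection string, table name, and file name are placeholders, not anything from the question:

using System.Data;
using System.Data.SqlClient;
using System.IO;

class BulkLoader
{
    static void Main()
    {
        // Placeholder connection string and destination table.
        const string connectionString = "Server=.;Database=MyDb;Integrated Security=true";
        const string tableName = "dbo.HugeTable";

        var table = new DataTable();
        using (var reader = new StreamReader("data.csv"))
        {
            // First line: column headers. All columns are created as strings;
            // SqlBulkCopy will attempt conversion to the destination column types.
            foreach (var header in reader.ReadLine().Split(';'))
                table.Columns.Add(header);

            string line;
            while ((line = reader.ReadLine()) != null)
                table.Rows.Add(line.Split(';')); // naive split: no quoted fields
        }

        using (var bulk = new SqlBulkCopy(connectionString))
        {
            bulk.DestinationTableName = tableName;
            bulk.BatchSize = 5000; // flush in batches rather than one huge transaction
            bulk.WriteToServer(table);
        }
    }
}

If the file does not fit in memory, feed SqlBulkCopy an IDataReader (for example, CsvDataReader from the CsvHelper package) instead of staging everything in a DataTable.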
You will have to handle the types by hand if the CSV carries no type information (for example, in the table header). Alternatively, you can try to infer the data type automatically from the first row of data. For example, if a field contains only digits, treat it as int ( \d+ ); if digits with a decimal separator, float ( [\d\,\.]+ ); if it looks like a date ( \d{1,2}\.\d{1,2}\.\d{2,4} ), then datetime (or date); if true or false ( true|false ), then bool (bit); and varchar by default. The parentheses show rough regular-expression patterns to check against.
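A rough sketch of that inference in C#, checking the patterns from most to least specific (the date check must come before the float one, since a date also matches [\d\,\.]+); the method name and the varchar fallback are illustrative choices, not part of the original answer:

using System.Text.RegularExpressions;

static class TypeGuesser
{
    // Guess a SQL column type from one sample value.
    public static string GuessSqlType(string value)
    {
        if (Regex.IsMatch(value, @"^(true|false)$", RegexOptions.IgnoreCase))
            return "bit";
        if (Regex.IsMatch(value, @"^\d{1,2}\.\d{1,2}\.\d{2,4}$"))
            return "datetime";
        if (Regex.IsMatch(value, @"^\d+$"))
            return "int";
        if (Regex.IsMatch(value, @"^[\d\,\.]+$"))
            return "float";
        return "varchar(255)"; // default when nothing else matches
    }
}

Inferring from a single row is fragile (the first row may be empty or atypical in some column); scanning several rows and taking the widest type that fits them all is safer.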
CSV-TO-SQL is a handy online service. It will generate a SQL script and work out which columns are varchar, bool, and so on. But 617 columns is brutal, of course ..)
If this is a one-time task, it is better to import through the DBMS client:
MsSQL: https://support.discountasp.net/kb/a1179/how-to-im...
MySQL: mysqlimport or LOAD DATA INFILE (see the sketch below)
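For the MySQL route, a minimal LOAD DATA INFILE sketch; the file path, table name, and separators are placeholders to adjust for your CSV:

LOAD DATA INFILE '/path/to/data.csv'
INTO TABLE my_table
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;  -- skip the header row

The target table (with all 617 columns and their types) must already exist; a script from a tool like the CSV-TO-SQL service mentioned above can create it.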