.NET – using Dapper ORM to improve the performance of SQLite batch inserts

I have a desktop application that uses SQLite, batch inserting tens of thousands of rows into the SQLite database, and I want help optimizing the batch insert performance. It currently takes up to 50 seconds to insert 60 megabytes of data into the database.

>What connection string parameters can I use to improve performance? Should I change the buffer size? Is that possible through a connection string parameter? Are there any other connection string parameters that would improve performance? My current connection string is:

>I'm using Dapper ORM (built by the Stack Overflow team). Is there a faster way to batch insert into SQLite?
>I'm using System.Data.SQLite in .NET to insert into SQLite. Can I get better performance from a specially compiled version of SQLite? Is one version of SQLite better than another? I'm currently using System.Data.SQLite from http://sqlite.phxsoftware.com.
>Currently I wrap the inserts in a transaction to make them faster (this was a good improvement).
>I insert into one table at a time, 17 tables in total. Could I parallelize this across threads to make it faster?
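On the connection string question, one commonly tried tuning (an illustration, not the asker's actual string — the data source name here is hypothetical, and the keywords should be checked against the System.Data.SQLite documentation) is to relax journaling and sync behavior directly in the connection string:

```
Data Source=Peaks.sqlite;Version=3;Journal Mode=Memory;Synchronous=Off;Page Size=4096;
```

`Synchronous=Off` and an in-memory journal skip the per-commit fsync, so a crash mid-batch can corrupt the database file; this trade-off is reasonable for rebuildable data, not for a primary store.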

Current performance. Is this typical? Can I do better?

>55,000 rows into a 19-column table: 2.25 seconds to insert (24k inserts/second)
>10,000 rows into a 63-column table: 2.74 seconds to insert (3.7k/second)

I like SQLite, but I want to make it faster. At present, saving my objects to an XML file via XML serialization is faster than saving them to the SQLite database, so my boss asks: why switch to SQLite at all? Or should we be using MongoDB or some other object database instead?

Solution

So I finally found the trick to high-performance bulk inserts into SQLite from .NET. This trick improved insert performance 4.1x! My total save time went from 27 seconds to 6.6 seconds. Wow!

The trick comes from an article on doing fast bulk inserts into SQLite. The key is to reuse the same parameter objects, assigning new values for each record inserted. It takes .NET real time to build all those DbParameter objects: for example, 100k rows with 30 columns means 3 million parameter objects that must be created. Creating and reusing just 30 parameter objects is much faster.

New performance:

>55,000 rows (19 columns): 0.53 seconds = ~100k inserts/second

internal const string PeakResultsInsert = @"INSERT INTO PeakResult values(@Id,@PeakID,@QuanPeakID,@ISTDRetentionTimeDiff)";

            // Create the command once; it will be reused for every row.
            var command = cnn.CreateCommand();
            command.CommandText = BatchConstants.PeakResultsInsert;

            string[] parameterNames = new[]
                                 {
                                   "@Id", "@PeakID", "@QuanPeakID", "@ISTDRetentionTimeDiff"
                                  };

            // Build each DbParameter once and keep references to them,
            // so the same objects can be reused for every row.
            DbParameter[] parameters = parameterNames.Select(pn =>
            {
                DbParameter parameter = command.CreateParameter();
                parameter.ParameterName = pn;
                command.Parameters.Add(parameter);
                return parameter;
            }).ToArray();

            // Reassign values on the existing parameters for each row
            // instead of creating new parameter objects per insert.
            foreach (var peakResult in peakResults)
            {
                parameters[0].Value = peakResult.Id;
                parameters[1].Value = peakResult.PeakID;
                parameters[2].Value = peakResult.QuanPeakID;
                parameters[3].Value = peakResult.ISTDRetentionTimeDiff;

                command.ExecuteNonQuery();
            }
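The snippet above omits the transaction wrapper that the question says is already in use. A minimal sketch of how the reused-parameter loop might be combined with a single transaction (the structure here is illustrative, not the original author's exact code; `cnn` is assumed to be an open System.Data.SQLite connection):

```csharp
// Sketch: wrap the whole batch in one transaction so SQLite commits once,
// not once per row — per-row commits are what make naive inserts slow.
using (var transaction = cnn.BeginTransaction())
{
    var command = cnn.CreateCommand();
    command.Transaction = transaction;
    command.CommandText = BatchConstants.PeakResultsInsert;

    // ... create and reuse the DbParameter array exactly as above ...

    foreach (var peakResult in peakResults)
    {
        // ... assign parameter values as above, then:
        command.ExecuteNonQuery();
    }

    transaction.Commit();   // one commit for the entire batch
}
```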

In the end, I couldn't use Dapper for the inserts into my big tables. For my small tables, I still use Dapper.

A few other things I found:

>I tried using multiple threads to insert data into the same database, with no improvement (no difference).
>Upgrading System.Data.SQLite from 1.0.69 to 1.0.79 made no difference in my performance, as far as I could see.
>Not assigning a type to the DbParameter doesn't seem to make a performance difference.
>For reads, I couldn't improve on Dapper's performance.
