Java – a faster way to batch save using Hibernate?

I have a program that reads text files line by line, creates a Hibernate entity object from each line, and saves them. I have several such text files to process, each with about 300,000 lines. My current implementation is very slow, and I would like to know what I can do to improve it.

My main method processes the text file line by line as follows:

// read the file line by line
FileInputStream fileInputStream = new FileInputStream(new File(fileName));
InputStreamReader inputStreamReader = new InputStreamReader(fileInputStream);
BufferedReader bufferedReader = new BufferedReader(inputStreamReader);
int lineCount = 0;
String line = bufferedReader.readLine();
while (line != null)
{
    // convert the line into an Observations object and persist it
    convertAndPersistObservationsLine(line);

    // if the number of lines we've processed has built up to the JDBC batch size then flush
    // and clear the session in order to control the size of Hibernate's first level cache
    lineCount++;
    if (lineCount % JDBC_CACHE_SIZE == 0)
    {
        observationsDao.flush();
        observationsDao.clear();
    }

    line = bufferedReader.readLine();
}

The convertAndPersistObservationsLine() method just splits the text line into tokens, creates a new entity object, fills the entity's fields with data from the tokens, and then saves the object via a DAO method that calls Hibernate's session.saveOrUpdate(). The DAO methods flush() and clear() are direct calls to the corresponding Hibernate Session methods.

The Hibernate property hibernate.cache.use_second_level_cache is set to false, the Hibernate property hibernate.jdbc.batch_size is set to 50, and the Java constant JDBC_CACHE_SIZE has the same value.
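For reference, the configuration just described corresponds to a hibernate.properties fragment roughly like this (only the two properties mentioned above; the rest of my configuration is omitted):

```properties
# JDBC batching: Hibernate groups up to 50 inserts into one JDBC batch
hibernate.jdbc.batch_size=50
# the second-level cache is disabled for this bulk load
hibernate.cache.use_second_level_cache=false
```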

Can anyone suggest a better way to approach this, or any tweaks to the above that might improve the performance of this bulk loader?

Thank you for your help

– James

Solution

The code itself and the Hibernate configuration look correct (in the sense that they follow the batch insert idiom from the documentation), but here are some additional suggestions:

As mentioned earlier, make sure you are not using an ID generator that defeats batching, such as IDENTITY. With GenerationType.AUTO, the persistence provider picks a strategy based on the database, so depending on your database you may have to switch to the TABLE or SEQUENCE strategy (Hibernate can then cache blocks of IDs using the hi/lo algorithm).
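To illustrate why a block-caching generator helps: with hi/lo, the generator needs only one database round trip per block of IDs, so Hibernate can assign IDs to new entities in memory and keep batching the inserts. Here is a minimal, self-contained sketch of the idea in plain Java (this is not Hibernate's actual implementation; the class name and block size are made up for illustration):

```java
// Sketch of the hi/lo ID allocation idea: one "sequence" fetch per block
// of maxLo IDs, instead of one database round trip per insert.
public class HiLoAllocator {
    private final int maxLo;            // block size (e.g. the allocation size)
    private long hi = -1;               // current "hi" value from the sequence
    private int lo;                     // position within the current block
    private long nextSequenceValue = 0; // stands in for a database sequence call

    public HiLoAllocator(int maxLo) {
        this.maxLo = maxLo;
        this.lo = maxLo;                // force a sequence fetch on first use
    }

    // Returns the next ID; only "hits the database" once per maxLo IDs.
    public synchronized long next() {
        if (lo == maxLo) {
            hi = nextSequenceValue++;   // simulated sequence round trip
            lo = 0;
        }
        return hi * maxLo + lo++;
    }
}
```

With a block size of 50, IDs 0–49 cost a single sequence call, 50–99 the next one, and so on, which is why the inserts in between remain batchable.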

Also verify that Hibernate actually performs the batching as expected. To do this, enable logging and monitor the BatchingBatcher to track the size of the batches it executes (the batch size is logged).

In your particular case, you might also consider using the StatelessSession interface (once the batching issue is solved, of course).
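A StatelessSession bypasses the first-level cache and dirty checking entirely, so there is nothing to flush or clear. A sketch of how the loading loop could look with it (assuming a sessionFactory is available and convertLine() is your existing parsing logic; this is not compiled against your code):

```java
// Sketch: bulk load with a StatelessSession. There is no first-level
// cache, so the flush()/clear() bookkeeping from the original loop goes away.
StatelessSession session = sessionFactory.openStatelessSession();
Transaction tx = session.beginTransaction();
try (BufferedReader reader = new BufferedReader(
        new InputStreamReader(new FileInputStream(fileName)))) {
    String line;
    while ((line = reader.readLine()) != null) {
        Observations obs = convertLine(line); // your existing parsing logic
        session.insert(obs);                  // issues the INSERT directly
    }
    tx.commit();
} finally {
    session.close();
}
```

Note that a stateless session also skips interceptors and cascading, so it is only appropriate when the entities are simple, as they appear to be here.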
