Java – how to reduce memory churn

Background

I have a Spring Batch program that reads a file (the sample file I'm using is 4 GB), does a small amount of processing on it, and then writes it to an Oracle database.

My program uses one thread to read the file and 12 worker threads to process records and push them to the database.

I'm churning through a lot of young-generation memory, which makes my program slower than I'd like.

Setup

JDK 1.6.0_18, Spring Batch 2.1.x, 4-core machine, 16 GB RAM

-Xmx12G 
-Xms12G 
-XX:NewRatio=1 
-XX:+UseParallelGC
-XX:+UseParallelOldGC

Problem

With these JVM parameters, I get about 5.5 GB of memory for the tenured generation and about 5.5 GB for the young generation.

While processing this file, my tenured generation is fine: it grows to a maximum of about 3 GB, and I never need a full GC.

However, the young generation regularly hits its maximum. It grows to the 5 GB range, then a parallel minor GC occurs and the young gen drops to about 500 MB. A minor GC is better than a full GC, but it still slows my program down (I'm fairly sure the application still pauses when a young-gen collection happens, because I see database activity stall). I'm spending over 5% of program time paused in minor GCs, which seems excessive. Over the course of processing this 4 GB file, I'd say I churn through 50–60 GB of young-gen memory.
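One way to confirm how much time is going to minor collections is to read the JVM's own GC counters via the standard java.lang.management API. A minimal sketch (with the parallel collectors configured above, HotSpot typically reports the beans as "PS Scavenge" for minor GC and "PS MarkSweep" for old-gen GC; the loop allocating throwaway strings is only there to provoke some young-gen activity):

```java
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;

public class GcStats {
    public static void main(String[] args) {
        // Allocate some short-lived garbage so a minor GC is likely to run.
        for (int i = 0; i < 1_000_000; i++) {
            String s = "record-" + i;   // throwaway young-gen object
            if (s.length() == 0) System.out.println(s);
        }
        // Each registered collector reports its cumulative totals.
        for (GarbageCollectorMXBean gc
                : ManagementFactory.getGarbageCollectorMXBeans()) {
            System.out.println(gc.getName()
                    + ": count=" + gc.getCollectionCount()
                    + ", timeMs=" + gc.getCollectionTime());
        }
    }
}
```

Comparing the accumulated `timeMs` against wall-clock runtime gives the "% of time in GC" figure directly, without a profiler attached.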

There are no obvious flaws in my program. I try to follow general OO principles and write clean Java code. I try not to create objects for no reason; I'm using thread pools and passing objects along wherever possible instead of creating new ones. I'm going to start profiling the application, but I wondered whether anyone has some good general rules of thumb or anti-patterns to avoid excessive memory churn. Is 50–60 GB of churn the best I can do for a 4 GB file? Do I have to dust off JDK 1.2 tricks like object pooling? (Although Brian Goetz gave a presentation that included reasons why object pooling is silly and we don't need to do it any more, and I trust him far more than I trust myself :) :))
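On the rules-of-thumb question, the biggest wins in file-processing loops usually come from not allocating per record. A minimal sketch of the reuse idea, not the original program's code (the input, field layout, and buffer size here are made up for illustration):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;

public class ReuseDemo {
    public static void main(String[] args) throws IOException {
        // Stand-in for the real file reader.
        BufferedReader in = new BufferedReader(
                new StringReader("a,1\nb,2\nc,3\n"));

        // One builder reused across iterations instead of a fresh
        // StringBuilder (or line.split(",") String[]) per line.
        StringBuilder key = new StringBuilder(64);
        long total = 0;
        String line;
        while ((line = in.readLine()) != null) {
            key.setLength(0);                // reset, don't reallocate
            int comma = line.indexOf(',');
            key.append(line, 0, comma);      // avoids split()'s garbage
            total += Integer.parseInt(line.substring(comma + 1));
        }
        System.out.println("sum=" + total);  // prints "sum=6"
    }
}
```

The same idea applies per worker thread: each of the 12 threads can own its scratch buffers, so reuse needs no synchronization.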

Solution

I think a session with a memory profiler will shed a lot of light on this topic. It gives a good overview of how many objects of which classes are created, which can be quite revealing.

I am always amazed at how many strings get generated.

Cross references between domain objects are also telling. If you suddenly see three times as many objects derived from a source object as there are source objects, something is going on there.

NetBeans has a good profiler built in. I've used JProfiler in the past. I think if you spend enough time in Eclipse, you can get the same information out of the TPTP tools.
