Java – is there any way to free memory during big data processing?
I have a database that stores invoices. I have to use the information in all invoices to perform complex operations for any given month through a series of algorithms. Retrieving and processing the data required for these operations takes a lot of memory, because there may be a large number of invoices. When the interval the user requests calculations for spans several years, the problem gets progressively worse. As a result, I get a PermGen exception, because it seems the garbage collector cannot finish its work between the monthly calculations.
I know that calling System.gc() to remind the GC to do its work is considered a bad habit. So my question is: is there any other way to release memory? Can I force the JVM to use hard-disk swap to temporarily store part of the computation?
In addition, I tried calling System.gc() after each month-end calculation. The result is high CPU utilization (due to the garbage-collector calls) and low memory utilization. It works, but I don't think it's a suitable solution.
Solution
Do not use System.gc(). It always takes a long time to run and usually doesn't accomplish anything.
The best way is to rewrite the code to minimize memory usage. You haven't explained exactly how the code works, but here are two ideas:
> Try reusing the data structures you generate for each month. So if you have a list of invoices, reuse it for the next month instead of allocating a new one.
> If you do need all of the data, consider writing processed results to temporary files during processing and reloading them when they are needed.
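Both ideas can be sketched in a few lines. Note that `Invoice`, `processMonth`, `spill`, and `reload` are hypothetical names invented for illustration, since the question doesn't show the real code; the temp-file idea is shown here with standard Java serialization, though any on-disk format would do:

```java
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class MonthlyProcessor {
    // Hypothetical invoice record; the real class is not shown in the question.
    record Invoice(String id, double amount) implements Serializable {}

    // Idea 1: reuse one list across months instead of allocating a new one,
    // so the same backing array is refilled rather than garbage-collected.
    private final List<Invoice> buffer = new ArrayList<>();

    double processMonth(List<Invoice> monthInvoices) {
        buffer.clear();               // reuse the same structure each month
        buffer.addAll(monthInvoices);
        return buffer.stream().mapToDouble(Invoice::amount).sum();
    }

    // Idea 2: spill intermediate results to a temporary file so only one
    // month's data has to live in memory at a time...
    static Path spill(List<Invoice> invoices) throws IOException {
        Path tmp = Files.createTempFile("invoices-", ".bin");
        try (ObjectOutputStream out =
                 new ObjectOutputStream(Files.newOutputStream(tmp))) {
            out.writeObject(new ArrayList<>(invoices));
        }
        return tmp;
    }

    // ...and reload them only when the final aggregation needs them.
    @SuppressWarnings("unchecked")
    static List<Invoice> reload(Path tmp)
            throws IOException, ClassNotFoundException {
        try (ObjectInputStream in =
                 new ObjectInputStream(Files.newInputStream(tmp))) {
            return (List<Invoice>) in.readObject();
        }
    }
}
```

The point of both patterns is the same: keep the peak working set bounded by one month's data, so the old generations never fill up with several years of invoices at once.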