Java – what is the most efficient way to load data from a file into a collection as needed?

I'm developing a Java project that allows users to parse multiple files that may have thousands of lines. The parsed information is stored in different objects, which are then added to a collection.

Since the GUI does not need to load all these objects at once and keep them in memory, I'm looking for an efficient way to load/unload data from the files so that data is loaded into the collection only when the user requests it.
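One common way to sketch this kind of on-demand loading is a small cache that parses a record only when it is first requested and evicts the least-recently-used entries so the whole file never sits in memory. This is a minimal illustration using the JDK's `LinkedHashMap`; the loader function is a hypothetical stand-in for the real file-parsing code.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Function;

/**
 * Minimal on-demand loader sketch: values are produced by the loader
 * function only on first access, and the least-recently-used entries are
 * evicted once the cache exceeds maxEntries.
 */
class LazyRecordCache<K, V> {
    private final Map<K, V> cache;
    private final Function<K, V> loader;

    LazyRecordCache(int maxEntries, Function<K, V> loader) {
        this.loader = loader;
        // accessOrder = true turns LinkedHashMap into a simple LRU structure
        this.cache = new LinkedHashMap<>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                return size() > maxEntries;
            }
        };
    }

    /** Returns the cached value, invoking the loader on first access. */
    V get(K key) {
        return cache.computeIfAbsent(key, loader);
    }

    int size() {
        return cache.size();
    }
}
```

In a real application the loader would seek into the file (or re-run the parser for one record) instead of computing a string.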

I'm just evaluating options right now. I've also been thinking about the best way to reload previously viewed data after loading a subset into the collection and rendering it in the GUI. Rerun the parser, repopulate the collection, and repopulate the GUI? Or maybe find a way to keep the collection in memory, or serialize/deserialize the collection itself?
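The serialize/deserialize option can be sketched with plain Java serialization: write a parsed subset to disk once, then restore it later without re-running the parser. This assumes the element type is `Serializable`; the file name and the use of `String` elements here are placeholders.

```java
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

// Sketch: persist a parsed subset with Java serialization so a previously
// viewed subset can be restored without re-parsing the source files.
class SubsetStore {
    static void save(List<String> subset, Path file) throws IOException {
        try (ObjectOutputStream out =
                 new ObjectOutputStream(Files.newOutputStream(file))) {
            // copy into ArrayList, which is guaranteed Serializable
            out.writeObject(new ArrayList<>(subset));
        }
    }

    @SuppressWarnings("unchecked")
    static List<String> load(Path file) throws IOException, ClassNotFoundException {
        try (ObjectInputStream in =
                 new ObjectInputStream(Files.newInputStream(file))) {
            return (List<String>) in.readObject();
        }
    }
}
```

Serialization avoids a second parse, but note that it is sensitive to class changes between runs; for long-lived data a more stable format (or a database, as suggested below in the answer) tends to age better.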

I know that loading/unloading subsets of data becomes tricky if some kind of filtering is applied. Suppose I filter by IDs: my new subset could then contain data from two previously parsed subsets. That wouldn't be a problem if I kept a master copy of all the data in memory.

I've read that Google Collections (now part of Guava) is very effective when dealing with large amounts of data and provides utilities that simplify many operations, so it might offer an alternative way to keep the collections in memory. This is just a general discussion; the question of which collection type to use is a separate and complex one.

Do you know of any general recommendations for this kind of task? I'd like to hear what you've done in similar scenarios.

I can provide more details if necessary

Solution

You can embed a database into the application, such as HSQLDB. This lets you parse the files once and then use SQL to run both simple and complex queries against the data.
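A hedged sketch of that approach: insert the parsed rows into an in-memory HSQLDB table, then answer GUI requests with SQL instead of holding every parsed object in a collection. This assumes the `hsqldb` jar is on the classpath (the driver is auto-registered via JDBC's service loader); the table name, columns, and sample IDs are made up for illustration.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

// Sketch of the embedded-database approach with an in-memory HSQLDB
// instance ("jdbc:hsqldb:mem:..." keeps everything in RAM; a file-based
// URL would persist between runs). Requires the hsqldb jar on the classpath.
class EmbeddedDbExample {
    static int countMatching(String idPrefix) throws SQLException {
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:hsqldb:mem:parsed", "SA", "")) {
            try (Statement st = conn.createStatement()) {
                st.execute("CREATE TABLE records (id VARCHAR(32), payload VARCHAR(256))");
            }
            // in real code this loop would be driven by the file parser
            try (PreparedStatement ins =
                     conn.prepareStatement("INSERT INTO records VALUES (?, ?)")) {
                for (String id : new String[] {"A1", "A2", "B1"}) {
                    ins.setString(1, id);
                    ins.setString(2, "line for " + id);
                    ins.executeUpdate();
                }
            }
            // the GUI's "filter by ID" case becomes a plain SQL query
            try (PreparedStatement q = conn.prepareStatement(
                     "SELECT COUNT(*) FROM records WHERE id LIKE ?")) {
                q.setString(1, idPrefix + "%");
                try (ResultSet rs = q.executeQuery()) {
                    rs.next();
                    return rs.getInt(1);
                }
            }
        }
    }
}
```

This also addresses the filtering concern from the question: a filter that cuts across previously parsed subsets is just a `WHERE` clause, with no need to keep a master copy of everything in a Java collection.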
