Java – is readFully() at risk of choking on huge files?

I've noticed that when I use readFully() on a file instead of read(byte[]), the processing time is greatly reduced. However, readFully() may be a double-edged sword: if I accidentally try to read a huge file, thousands of megabytes in size, won't it choke?

This is the function I use to generate a SHA-256 checksum:

public static byte[] createChecksum(File log,String type) throws Exception {
    DataInputStream fis = new DataInputStream(new FileInputStream(log));
    Long len = log.length();
    // note: intValue() silently truncates for files larger than Integer.MAX_VALUE bytes
    byte[] buffer = new byte[len.intValue()];
    fis.readFully(buffer); // TODO: readFully may come at the risk of
                            // choking on a huge file.
    fis.close();
    MessageDigest complete = MessageDigest.getInstance(type);
    complete.update(buffer);
    return complete.digest();
}

If I use instead:

DataInputStream fis = new DataInputStream(new BufferedInputStream(new FileInputStream(log)));

Will that avoid the risk? Or is the best choice (when you can't control the size of the data) to always bound the number of bytes read at a time and loop until all bytes have been read?

(Thinking about it, since the MessageDigest API takes a complete byte array at a time, I don't see how to compute a checksum without loading all the data into memory first, but I suppose that's a problem for another thread.)

Solution

You should allocate a buffer of only moderate size (say, 65536 bytes) and do it in a loop: read up to 64 KB at a time and feed each chunk to the digester with complete.update(). Be careful with the last block, so that you only process the number of bytes actually read (which may be less than 64 KB).
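
Here is a minimal sketch of that loop, keeping the question's createChecksum(File, String) signature; the 64 KB buffer size and variable names are just illustrative:

import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public static byte[] createChecksum(File log, String type)
        throws IOException, NoSuchAlgorithmException {
    MessageDigest complete = MessageDigest.getInstance(type);
    try (InputStream in = new FileInputStream(log)) {
        byte[] buffer = new byte[65536]; // fixed 64 KB buffer, regardless of file size
        int bytesRead;
        // read() returns the number of bytes actually read, or -1 at end of stream
        while ((bytesRead = in.read(buffer)) != -1) {
            // feed only the bytes actually read; the final chunk is
            // usually shorter than the full buffer
            complete.update(buffer, 0, bytesRead);
        }
    }
    return complete.digest();
}

Alternatively, java.security.DigestInputStream wraps an input stream and updates a MessageDigest automatically on every read, so you never have to call update() yourself.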
