C# Is it better to keep objects in memory or reload them?

tom982

Emeritus
Joined
May 31, 2012
Posts
4,351
Location
New York
I've been working on a small project and have code resembling this:

Code:
using (ZipFile zip = ZipFile.Read(ipaFile))
{
    // Extract a certain file
    // Process data from this file
    // Extract further files from the zip file based on the results from the previous file
}

Which got me thinking, is it bad practice to keep something in memory for a long time? It doesn't particularly apply here, as step #2 is pretty fast so keeping it in memory isn't an issue, but is there a point where it's better to take an approach like this:

Code:
using (ZipFile zip = ZipFile.Read(ipaFile))
{
    // Extract a certain file
}
    
// Process data from this file

using (ZipFile zip = ZipFile.Read(ipaFile))
{
    // Extract further files from the zip file based on the results from the previous file
}

The IPA files I'm loading are anywhere from 1MB to 312MB so I'm not dealing with anything massive.

If a lot of time consuming operations are done between the two file extractions, is it better to dispose of it and initialise it again later? I know it would take more time to reload it, but wanted to know if there are any downsides to keeping it loaded. I'm not concerned about this for this particular project but just curious for future reference.

Thanks!
 
I am no expert with C#, but I think you could keep the files in RAM, since these days most systems have plenty of it. Also, once you dispose of the object, you would have to load the data back into RAM from disk again when you need it, and in the worst case that reload could add a couple of seconds each time.

If I were you, I would go with method #1, but since I am no expert, wait for other opinions as well :)


-Pranav
 
It's not bad practice at all; whether it's good or bad application design depends on what you need from that file and how often you need to grab stuff from it. If the data is no longer useful to the rest of the program's functionality, then it's no longer needed and you probably shouldn't keep it in memory any more. If the file is large, though, you shouldn't store the entire thing in memory; avoid the assumption that the RAM can hold it (keep in mind applications have to share this memory). What you should instead do is read parts of the file in chunks and extract what you need at the time of the read operation.
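
A minimal sketch of what that chunked reading can look like (ProcessChunk here is just a placeholder for whatever you do with each block, and the buffer size is an illustrative choice):

Code:
// Read the file in fixed-size blocks instead of loading it all at once.
using (FileStream fs = File.OpenRead(ipaFile))
{
    byte[] buffer = new byte[81920]; // 80 KB per chunk
    int bytesRead;
    while ((bytesRead = fs.Read(buffer, 0, buffer.Length)) > 0)
    {
        // Only 'bytesRead' bytes of 'buffer' are valid on this pass;
        // process them now, then let the next read overwrite the buffer.
        ProcessChunk(buffer, bytesRead);
    }
}

At any moment only one buffer's worth of the file is in memory, no matter how large the file is on disk.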
 
and what you should instead do is read parts of the file in chunks and extract what you need at the time of the read operation.
Could you please elaborate on this or maybe provide an example of code?

By the way, that avatar is scary :|
 
Thank you both, I assumed as much but just wanted to make sure.

If the file is large, then you shouldn't store the entire thing in memory though, and avoid the assumption that the RAM can hold it (keep in mind applications have to share this memory), and what you should instead do is read parts of the file in chunks and extract what you need at the time of the read operation.

Is that with a BufferedStream? I was looking into parsing huge text files a few weeks ago and ended up using a BufferedStream, and I suppose the same applies to reading other files. This is only a little tool to help me collect some data, though, so it doesn't need to be robust enough to distribute to others; it just needs to function, and I've got plenty of memory for it.
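
Something along these lines, if I remember right (the buffer size is just an example):

Code:
// A BufferedStream wraps another stream; small reads are served
// from its in-memory buffer instead of each hitting the disk.
using (FileStream fs = File.OpenRead(ipaFile))
using (BufferedStream bs = new BufferedStream(fs, 65536)) // 64 KB buffer
using (BinaryReader reader = new BinaryReader(bs))
{
    byte first = reader.ReadByte(); // cheap: comes from the buffer
}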
 
and what you should instead do is read parts of the file in chunks and extract what you need at the time of the read operation.
Could you please elaborate on this or maybe provide an example of code?

By the way, that avatar is scary :|

By reading chunks of the file via a file stream, you don't have to read the entire file at once. If you're interested in extracting some kind of image data from the file, you read until you find the image signature (assuming the data is uncompressed, for an easy explanation), then start reading and parsing the image from the file chunk by chunk until the image has been read. Some files are over 2GB in size, so you probably should never read them entirely into memory regardless of what you're doing with that file.
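
Purely as an illustration, scanning a stream for a PNG signature without loading the whole file could look like this (byte-at-a-time for clarity; a real version would read into a buffer):

Code:
// Scan for the start of a PNG (0x89 'P' 'N' 'G') in a stream.
byte[] signature = { 0x89, 0x50, 0x4E, 0x47 };
using (FileStream fs = File.OpenRead(someFile)) // someFile: whatever you're scanning
{
    int matched = 0;
    int b;
    while ((b = fs.ReadByte()) != -1)
    {
        // Track how many consecutive signature bytes we've matched.
        if (b == signature[matched])
            matched++;
        else
            matched = (b == signature[0]) ? 1 : 0;

        if (matched == signature.Length)
        {
            long start = fs.Position - signature.Length;
            // Signature found at 'start'; parse the image data
            // from here onwards, chunk by chunk.
            break;
        }
    }
}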

Some people still only have 2GB of memory, some have 4, others 8, and the lucky ones have 16 or more. You can't just assume that someone has sufficient memory, and keep in mind that this is total memory, not what might be available at the time your program runs on their system.
 
