5 Replies - 2849 Views - Last Post: 07 June 2008 - 06:31 PM

#1 Godzilla

  • D.I.C Head
  • Reputation: 0
  • Posts: 53
  • Joined: 29-May 08

Memory mapping a file too large to fit in memory?

Posted 06 June 2008 - 08:40 PM

How would one approach mmapping a file that's too large to fit in memory? For instance, you have 1GB of RAM and the file is 2GB, or you have two files, each 1GB in size, and you want to map both into memory.

Is there any reasonable approach to mapping such files, or does one have little choice other than to read the data in manually, piece by piece, via binary streams?

Replies To: Memory mapping a file too large to fit in memory?

#2 NickDMax

  • Can grep dead trees!
  • Reputation: 2250
  • Posts: 9,245
  • Joined: 18-February 07

Re: Memory mapping a file too large to fit in memory?

Posted 07 June 2008 - 01:09 AM

Well... at some level you have to read the file in byte by byte (more or less). Now, you can ask a stream to go out and get you a nice big chunk, and you can even continue to use that stream's get pointer to maintain your location within the file.

So basically you read the data into a buffer as you need it. By filling the buffer in a single operation you can save some time as opposed to loading smaller chunks of data as you need them.
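
Something like this, for instance (a minimal sketch; the file name and buffer size are made up, and error handling is left out):

#include <fstream>
#include <vector>
using namespace std;

int main(){
	ifstream in("big.dat", ios::binary);  // hypothetical file name
	vector<char> buffer(1024 * 1024);     // 1 MB buffer; the size is arbitrary
	streamsize total = 0;
	// Each pass fills the buffer in one read call; the last pass
	// may be partial, which is why gcount() is checked too.
	while (in.read(&buffer[0], buffer.size()) || in.gcount() > 0) {
		total += in.gcount();             // bytes actually read this pass
		// ... process buffer[0 .. gcount) here ...
	}
	return 0;
}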

#3 perfectly.insane

  • D.I.C Addict
  • Reputation: 70
  • Posts: 644
  • Joined: 22-March 08

Re: Memory mapping a file too large to fit in memory?

Posted 07 June 2008 - 09:55 AM

It depends on the API. You can map portions of files into memory under Windows and Linux.

There's CreateFileMapping/MapViewOfFile for Windows. MapViewOfFile gives you a view (a window) of the file you've opened, which you can make smaller than the memory you have.
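
A rough sketch of the Windows side (the file name and view size are just placeholders, the file is assumed to be at least that large, and error handling is abbreviated):

#include <windows.h>
#include <iostream>

int main(){
	HANDLE file = CreateFileA("big.dat", GENERIC_READ, FILE_SHARE_READ,
	                          NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
	if (file == INVALID_HANDLE_VALUE) return 1;

	// With max size 0/0 the mapping covers the file's current size.
	HANDLE mapping = CreateFileMappingA(file, NULL, PAGE_READONLY, 0, 0, NULL);
	if (!mapping) { CloseHandle(file); return 1; }

	// Map a 64 MB view at offset 0; note that non-zero offsets must be
	// multiples of the system allocation granularity (see GetSystemInfo).
	const DWORD viewSize = 64 * 1024 * 1024;
	void* view = MapViewOfFile(mapping, FILE_MAP_READ, 0, 0, viewSize);
	if (view) {
		const char* bytes = static_cast<const char*>(view);
		std::cout << "first byte: " << (int)bytes[0] << '\n';
		UnmapViewOfFile(view);  // unmap before mapping the next window
	}
	CloseHandle(mapping);
	CloseHandle(file);
	return 0;
}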

There's mmap for Linux and other Unices, which takes offset and length parameters for this purpose.
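
And roughly the same idea with mmap (again, the file name and window size are placeholders, and the file is assumed to be at least window bytes long):

#include <fcntl.h>
#include <sys/mman.h>
#include <unistd.h>
#include <iostream>

int main(){
	int fd = open("big.dat", O_RDONLY);
	if (fd < 0) return 1;

	long page = sysconf(_SC_PAGESIZE);
	size_t window = 64UL * 1024 * 1024;   // 64 MB view; arbitrary choice
	off_t offset = 0;                     // must be a multiple of the page size
	if (offset % page != 0) { close(fd); return 1; }

	void* view = mmap(NULL, window, PROT_READ, MAP_PRIVATE, fd, offset);
	if (view == MAP_FAILED) { close(fd); return 1; }

	const char* bytes = static_cast<const char*>(view);
	std::cout << "first byte: " << (int)bytes[0] << '\n';

	munmap(view, window);                 // unmap before mapping the next window
	close(fd);
	return 0;
}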

Here's a strange example for Windows:
http://www.dreaminco...h...st&p=361708

Memory-mapped files are not always the most efficient option; this is largely implementation-dependent.

#4 Godzilla

  • D.I.C Head
  • Reputation: 0
  • Posts: 53
  • Joined: 29-May 08

Re: Memory mapping a file too large to fit in memory?

Posted 07 June 2008 - 03:11 PM

Perhaps a better way to phrase it would be: what techniques are out there to map or index a file, not necessarily into memory?

I'd be using this on Linux, though something that works on Windows as well would of course be preferred over something Linux-specific.

#5 perfectly.insane

  • D.I.C Addict
  • Reputation: 70
  • Posts: 644
  • Joined: 22-March 08

Re: Memory mapping a file too large to fit in memory?

Posted 07 June 2008 - 03:22 PM

If you're not mapping a file into memory, then you'd read it with stream I/O functions. You can randomly access locations in a file by moving the file pointer to a different position (e.g. fseek, istream::seekg, and so on).
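
For instance, a quick fseek/fread version of that kind of random access (assuming a binary file of unsigned ints, as in your case):

#include <stdio.h>

// Read the num-th unsigned int from a binary file via fseek/fread.
unsigned int get_item_c(FILE *fp, long num){
	unsigned int value = 0;
	fseek(fp, num * (long)sizeof(unsigned int), SEEK_SET);
	fread(&value, sizeof(unsigned int), 1, fp);
	return value;
}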

#6 Godzilla

  • D.I.C Head
  • Reputation: 0
  • Posts: 53
  • Joined: 29-May 08

Re: Memory mapping a file too large to fit in memory?

Posted 07 June 2008 - 06:31 PM

I wrote the following code:
#include <fstream>
using namespace std;

typedef unsigned int uint;  // uint isn't a standard type, so define it

// Seek to the num-th uint in the binary file and read it.
uint get_item(ifstream &in, int num){
	in.seekg(num * sizeof(uint), ios_base::beg);
	uint value;
	in.read((char*)&value, sizeof(uint));
	return value;
}

So, for instance, given a binary file of uints and an ifstream in opened on that file, one could get the uint at index 20 with get_item(in, 20).

Would this be the best approach to take?
