8 TB is not the mathematical limit:
2^64 bytes = 16 exabytes
However, current 64-bit x86 processors only implement 48 bits of virtual address space. Read the pagetable blog entry I posted (http://www.pagetable.com/?p=29):
2^48 bytes = 256 terabytes
So 8 terabytes is a limit set by Windows, not by the hardware.
Like I said about IMAGE_FILE_LARGE_ADDRESS_AWARE, this is a flag that has to be set when the binary is linked. Microsoft's own products presumably have it enabled, but what about everyone else's?
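For what it's worth, enabling it is trivial if you control the build; it is just a linker switch, and editbin can even flip the bit on an already-built binary. A sketch with the MSVC toolchain (app.cpp/app.exe are placeholder names):

    rem Set IMAGE_FILE_LARGE_ADDRESS_AWARE in the EXE header at link time:
    cl /EHsc app.cpp /link /LARGEADDRESSAWARE

    rem Or patch an existing executable after the fact:
    editbin /LARGEADDRESSAWARE app.exe

The point stands though: someone has to remember to do it, which is exactly why you cannot count on third-party products having it set.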
I have read the paper about Multics, and what you are saying is not the same as what they are saying.
When they talk about mapping it to a hard drive, they mean executable code. They are not talking about user-generated variables filled with arbitrary data, and they are not talking about actually reading a file from disk into memory.
So if I have a program that is 15 megabytes, only the first megabyte or so gets loaded; when more is needed, the process page faults and the required pages are read in from the file on disk. The same thing happens with DLLs, except that they are shared across processes: if the DLL is already in memory, the new process is simply mapped to that copy, otherwise it page faults and fetches it from disk.
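You can actually see this from inside a process: VirtualQuery reports an executable image as MEM_IMAGE, i.e. backed by the binary on disk, rather than private pagefile-backed memory. A minimal Win32 sketch (error handling kept to a minimum):

    #include <windows.h>
    #include <stdio.h>

    int main()
    {
        // Base address of our own EXE as mapped into this process.
        HMODULE self = GetModuleHandle(NULL);

        MEMORY_BASIC_INFORMATION mbi;
        if (VirtualQuery(self, &mbi, sizeof(mbi)))
        {
            // MEM_IMAGE means the pages are backed by the binary on
            // disk, so clean pages can be discarded and re-read on a
            // page fault instead of being written to the pagefile.
            printf("%s\n", mbi.Type == MEM_IMAGE
                               ? "MEM_IMAGE (file-backed)"
                               : "not file-backed");
        }
        return 0;
    }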
Any memory the user creates on the heap/stack (mine is on the heap, when I go grab a 1.4 GB file from a network share) will be stored in the pagefile if required, in particular when it overflows physical RAM.
I have since reduced the file size to 1 GB. That comes in at just under 1.9 GB of memory usage (had I used C instead of C++ I would not have had as much overhead). Do note, most programmers use C# these days, so the overhead I am seeing with C++ is almost nothing compared to what they would see.
Once the file is loaded into memory, I pull the network cable from the machine, so that the only way to get at the data is by actually reading it from memory or the pagefile. If what you say is correct, this will fail. (A rough sketch of the test is below.)
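Roughly, the test looks like this (the UNC path is a hypothetical stand-in for my network share):

    #include <fstream>
    #include <vector>
    #include <cstdio>

    int main()
    {
        // Hypothetical share holding the ~1 GB test file.
        std::ifstream in("\\\\server\\share\\big.dat",
                         std::ios::binary | std::ios::ate);
        std::streamsize size = in.tellg();
        if (size <= 0)
            return 1;
        in.seekg(0);

        // Private heap allocation: these pages are anonymous, so the
        // only backing store Windows has for them is the pagefile.
        std::vector<char> data(static_cast<size_t>(size));
        in.read(data.data(), size);
        in.close();

        std::puts("Pull the network cable, then press Enter.");
        std::getchar();

        // Touch every page. If the pages were somehow mapped back to
        // the file on the share, this would fail with the cable out;
        // it doesn't, because the copy lives in RAM and/or the pagefile.
        unsigned long long sum = 0;
        for (size_t i = 0; i < data.size(); i += 4096)
            sum += static_cast<unsigned char>(data[i]);
        std::printf("checksum: %llu\n", sum);
        return 0;
    }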
Do note, during these tests nothing in the file is ever changed. The same results have been achieved with the file sitting on a remote server as well as on the local disk. Even the local copy still gets swapped out of memory.
Notepad, for example, edits files on the local disk. One would assume, then, that if it were cheap to just allocate memory and let it be mapped back to the file, Notepad could open files over 1 GB without any issue. Instead, Notepad does exactly what any inefficient program would do: it copies the entire file into memory. (Try it: load a 100 MB file into Notepad and notice how Notepad now takes up 100 MB of memory.) Even if you are not planning to edit the file, Notepad still takes up that much space. Since the data has not been touched, nothing has marked the pages dirty, so the system could just map them back to the file; when physical RAM runs out, it would only need to keep a small record of where each page is supposed to be mapped from, instead of writing the data to the pagefile.
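The cheap approach Notepad could take is a read-only file mapping, where clean pages are backed by the file itself and the pagefile never enters the picture. A minimal Win32 sketch of the idea ("big.txt" is a placeholder, error handling mostly omitted):

    #include <windows.h>
    #include <stdio.h>

    int main()
    {
        HANDLE file = CreateFileA("big.txt", GENERIC_READ, FILE_SHARE_READ,
                                  NULL, OPEN_EXISTING,
                                  FILE_ATTRIBUTE_NORMAL, NULL);
        if (file == INVALID_HANDLE_VALUE)
            return 1;

        // Read-only section backed by big.txt itself, not the pagefile.
        HANDLE mapping = CreateFileMappingA(file, NULL, PAGE_READONLY,
                                            0, 0, NULL);
        const char* view =
            (const char*)MapViewOfFile(mapping, FILE_MAP_READ, 0, 0, 0);

        // Nothing has been read from disk yet. Touching a byte page
        // faults in just that one page; under memory pressure a clean
        // page is simply discarded and re-fetched from big.txt later.
        printf("first byte: %c\n", view[0]);

        UnmapViewOfFile(view);
        CloseHandle(mapping);
        CloseHandle(file);
        return 0;
    }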
I am sorry Perris, but what you have been saying so far is only correct in a narrow sense: it is correct for executable code, as with my 15 MB binary example, and not for any other case. Notepad shows just that. It is a naive application that copies everything into RAM, which is why it cannot open big files.
Programmers have been using tricks for years to parse only a subset of the data at a time, for exactly this reason: there is no way it will all fit into memory. When you load a 50 MB Word document full of images and text, Word only loads the page you are currently viewing, generates the viewable image, throws it out of memory, and loads the next page as you scroll. That way it never has the full 50 MB of document in memory at once, which would be inefficient.
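At its simplest the trick is a fixed-size window that slides through the file, so memory use stays flat no matter how large the file gets. A sketch (file name and window size are arbitrary):

    #include <fstream>
    #include <vector>

    // Placeholder for the real per-window work, e.g. rendering the
    // page that is currently in view.
    static void processChunk(const char* data, std::streamsize n)
    {
        (void)data;
        (void)n;
    }

    int main()
    {
        const std::streamsize kWindow = 4 * 1024 * 1024; // 4 MB window
        std::vector<char> buf(static_cast<size_t>(kWindow));

        std::ifstream in("bigdoc.bin", std::ios::binary);
        while (in)
        {
            in.read(buf.data(), kWindow);
            std::streamsize got = in.gcount();
            if (got <= 0)
                break;
            // Only this window is ever resident; the previous one is
            // overwritten in place, so the process sits at ~4 MB
            // regardless of file size.
            processChunk(buf.data(), got);
        }
        return 0;
    }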
Here we go, found the error number on a thread:
http://forums.microsoft.com/WindowsHomeServer/ShowPost.aspx?PostID=1444560&SiteID=50
So this is definitely a Windows Vista issue that the original poster was having. All he can hope for is that Microsoft gets their act together and fixes it, because it is a system bug, and one that will not be fixed by switching to 64-bit.