Question
I am investigating a strange problem with my application, where the behaviour is different on 2 versions of Windows:
- Windows XP (32-bit)
- Windows Server 2008 (64-bit)
My findings are as follows.
Windows XP (32-bit)
When running my test scenario, the XML parser fails at a certain point during the parsing of a very large configuration file (see this question for more information).
At the time of failure, the process size is approximately 2.3GB. Note that a registry key has been set to allow the process to exceed the default maximum process size of 2GB (on 32-bit operating systems).
The symptom of the failure is a call to IXMLDOMDocument::load() failing, as described in the question linked above.
Windows Server 2008 (64-bit)
I run exactly the same test scenario in Windows Server 2008 -- the only variable is the operating system. When I look at my process under Task Manager, it has a * 32
next to it, which I am assuming means it is running in 32-bit compatibility mode.
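For what it's worth, the "* 32" suffix in Task Manager marks a 32-bit process running under the WOW64 layer of 64-bit Windows, and that assumption can be confirmed from inside the process with the Win32 IsWow64Process call (available from Windows XP SP2 onwards). A minimal, purely illustrative sketch:

#include <windows.h>
#include <stdio.h>

int main()
{
    BOOL isWow64 = FALSE;
    // IsWow64Process reports whether this (32-bit) process is running under
    // the WOW64 emulation layer of 64-bit Windows -- the same condition that
    // Task Manager flags with "* 32".
    if (IsWow64Process(GetCurrentProcess(), &isWow64))
        printf("Running under WOW64: %s\n", isWow64 ? "yes" : "no");
    return 0;
}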
What I am noticing is that at the point where the XML parsing fails on Windows XP, the process size on Windows Server 2008 is only about 1GB (in other words, approximately half the size it reaches on Windows XP).
The XML parsing does not fail on Windows Server 2008; it all works as it should.
My questions are:
Why would a 32-bit application (running in 32-bit mode) consume half the amount of memory on a 64-bit operating system? Is it really using half the memory, is it using virtual memory differently, or is it something else?
Acknowledging that my application seems to be using half the amount of memory on Windows Server 2008, does anyone have any ideas as to why the XML parsing would be failing on Windows XP? Every time I run the test case, the error accessed via
IXMLDOMParseError
(see this answer) is different. Because this appears to be non-deterministic, it suggests to me that I am running into a memory usage problem rather than dealing with malformed XML.
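For reference, the failing call pattern looks roughly like the sketch below, using MSXML 6 from C++. The file path is hypothetical and error handling is reduced to dumping the IXMLDOMParseError details:

#include <windows.h>
#include <atlbase.h>
#include <msxml6.h>
#include <stdio.h>

int main()
{
    CoInitialize(NULL);
    {
        CComPtr<IXMLDOMDocument> doc;
        if (SUCCEEDED(doc.CoCreateInstance(__uuidof(DOMDocument60))))
        {
            doc->put_async(VARIANT_FALSE);           // load synchronously

            VARIANT_BOOL ok = VARIANT_FALSE;
            HRESULT hr = doc->load(CComVariant(L"C:\\config\\huge.xml"), &ok);

            if (FAILED(hr) || ok != VARIANT_TRUE)
            {
                // load() reports parse problems through IXMLDOMParseError
                // rather than through the HRESULT alone.
                CComPtr<IXMLDOMParseError> err;
                if (SUCCEEDED(doc->get_parseError(&err)) && err)
                {
                    long code = 0, line = 0;
                    CComBSTR reason;
                    err->get_errorCode(&code);
                    err->get_line(&line);
                    err->get_reason(&reason);
                    wprintf(L"load failed: code 0x%08lx, line %ld, reason: %ls\n",
                            (unsigned long)code, line,
                            (BSTR)reason ? (BSTR)reason : L"(none)");
                }
            }
        }
    }
    CoUninitialize();
    return 0;
}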
Answer 1:
You didn't say how you observed the process; I'll assume you used Taskmgr.exe. Beware that its default view gives very misleading values in the Memory column: it shows the working set size, the amount of RAM the process is currently using. That has nothing to do with the source of your problem, which is running out of virtual memory address space. There is also not much reason to assume that Windows Server 2008 would show the same value as XP; it has a significantly different memory manager.
You can see the virtual memory size as well; use View + Columns.
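To make the same distinction programmatically rather than through Task Manager, a sketch using GetProcessMemoryInfo could look like this (assuming Windows XP SP2 or later, where PROCESS_MEMORY_COUNTERS_EX reports the private commit charge):

#include <windows.h>
#include <psapi.h>
#include <stdio.h>

#pragma comment(lib, "psapi.lib")

int main()
{
    PROCESS_MEMORY_COUNTERS_EX pmc;
    pmc.cb = sizeof(pmc);

    if (GetProcessMemoryInfo(GetCurrentProcess(),
                             (PROCESS_MEMORY_COUNTERS*)&pmc, sizeof(pmc)))
    {
        // WorkingSetSize is what Task Manager's default Memory column reflects:
        // the RAM currently assigned to the process.
        // PrivateUsage (the private commit charge) is a much closer indicator
        // of the virtual memory pressure discussed above than the working set.
        printf("Working set:    %lu MB\n",
               (unsigned long)(pmc.WorkingSetSize / (1024 * 1024)));
        printf("Private commit: %lu MB\n",
               (unsigned long)(pmc.PrivateUsage / (1024 * 1024)));
    }
    return 0;
}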
The reason your program doesn't bomb on a 64-bit operating system is that 32-bit processes there have close to 4 gigabytes of addressable virtual memory. On a 32-bit operating system, the process has to share the address space with the operating system and gets only 2 gigabytes, or more if you use the /3GB boot option.
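A quick way to see which limit applies to a given process is to ask Windows where its user-mode address space ends; a minimal sketch, assuming a plain Win32 console build:

#include <windows.h>
#include <stdio.h>

int main()
{
    SYSTEM_INFO si;
    GetSystemInfo(&si);

    // For a normal 32-bit process on 32-bit Windows this prints an address just
    // under 2GB (or under 3GB with /3GB plus /LARGEADDRESSAWARE); for a
    // /LARGEADDRESSAWARE 32-bit process on 64-bit Windows it is just under 4GB.
    printf("User-mode address space ends at: %p\n", si.lpMaximumApplicationAddress);
    return 0;
}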
Use the SAX parser to avoid consuming so much memory.
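A minimal SAX2 sketch with MSXML 6 follows. It only counts elements as they stream past; a real handler would pull out the configuration data it needs in startElement/characters, so the whole document never has to be held in memory at once. The ElementCounter class and the file path are illustrative, not something prescribed here:

#include <windows.h>
#include <atlbase.h>
#include <msxml6.h>
#include <stdio.h>

// Minimal SAX2 content handler: it merely counts elements, so memory use stays
// flat no matter how large the document is.
class ElementCounter : public ISAXContentHandler
{
    LONG m_refs;
public:
    LONG elements;
    ElementCounter() : m_refs(1), elements(0) {}

    // IUnknown -- the object is stack-allocated in this sketch, so Release()
    // never deletes it.
    STDMETHODIMP QueryInterface(REFIID riid, void** ppv)
    {
        if (riid == __uuidof(IUnknown) || riid == __uuidof(ISAXContentHandler)) {
            *ppv = static_cast<ISAXContentHandler*>(this);
            AddRef();
            return S_OK;
        }
        *ppv = NULL;
        return E_NOINTERFACE;
    }
    STDMETHODIMP_(ULONG) AddRef()  { return InterlockedIncrement(&m_refs); }
    STDMETHODIMP_(ULONG) Release() { return InterlockedDecrement(&m_refs); }

    // ISAXContentHandler -- only startElement does any work here.
    STDMETHODIMP startElement(const wchar_t*, int, const wchar_t*, int,
                              const wchar_t*, int, ISAXAttributes*)
    { ++elements; return S_OK; }

    STDMETHODIMP putDocumentLocator(ISAXLocator*)                                { return S_OK; }
    STDMETHODIMP startDocument()                                                 { return S_OK; }
    STDMETHODIMP endDocument()                                                   { return S_OK; }
    STDMETHODIMP startPrefixMapping(const wchar_t*, int, const wchar_t*, int)    { return S_OK; }
    STDMETHODIMP endPrefixMapping(const wchar_t*, int)                           { return S_OK; }
    STDMETHODIMP endElement(const wchar_t*, int, const wchar_t*, int,
                            const wchar_t*, int)                                 { return S_OK; }
    STDMETHODIMP characters(const wchar_t*, int)                                 { return S_OK; }
    STDMETHODIMP ignorableWhitespace(const wchar_t*, int)                        { return S_OK; }
    STDMETHODIMP processingInstruction(const wchar_t*, int, const wchar_t*, int) { return S_OK; }
    STDMETHODIMP skippedEntity(const wchar_t*, int)                              { return S_OK; }
};

int main()
{
    CoInitialize(NULL);
    {
        CComPtr<ISAXXMLReader> reader;
        if (SUCCEEDED(reader.CoCreateInstance(__uuidof(SAXXMLReader60))))
        {
            ElementCounter handler;
            reader->putContentHandler(&handler);
            HRESULT hr = reader->parseURL(L"C:\\config\\huge.xml");  // hypothetical path
            printf("parse hr=0x%08lx, elements seen: %ld\n",
                   (unsigned long)hr, handler.elements);
        }
    }
    CoUninitialize();
    return 0;
}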
Answer 2:
Not only are there differences in available memory between 32-bit and 64-bit (as discussed in the previous answers), but it's the availability of contiguous memory that may be killing your app on 32-bit.
On a 32-bit machine your app's DLLs will be littering the memory landscape in the first 2GB of address space (the app at 0x00400000, OS DLLs up at 0x7xxx0000, other DLLs elsewhere). Most likely the largest contiguous block you have available is about 1.1GB.
On a 64-bit machine (which gives you the 4GB address space with /LARGEADDRESSAWARE) you'll have at least one block in that 4GB space that is 2GB or more in size.
So there is your difference: if your XML parser relies on one large blob of memory rather than many small blobs, it may be running out of contiguous usable space on 32-bit but not on 64-bit.
If you want to visualize this on the 32-bit OS, grab a copy of VMValidator (free) and look at the Virtual view for a visualization of your memory layout, plus the Pages and Paragraphs views to see the data for each memory page/paragraph.
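If you prefer a programmatic check to a GUI tool, a sketch that walks the address space with VirtualQuery and reports the largest free region (an upper bound on the biggest single allocation the parser could make) might look like this:

#include <windows.h>
#include <stdio.h>

int main()
{
    SYSTEM_INFO si;
    GetSystemInfo(&si);

    SIZE_T largestFree = 0;
    BYTE* p = (BYTE*)si.lpMinimumApplicationAddress;

    // Walk the whole user-mode address space region by region and remember the
    // biggest block that is neither reserved nor committed.
    while (p < (BYTE*)si.lpMaximumApplicationAddress)
    {
        MEMORY_BASIC_INFORMATION mbi;
        if (VirtualQuery(p, &mbi, sizeof(mbi)) == 0)
            break;
        if (mbi.State == MEM_FREE && mbi.RegionSize > largestFree)
            largestFree = mbi.RegionSize;
        p = (BYTE*)mbi.BaseAddress + mbi.RegionSize;
    }

    printf("Largest free virtual region: %lu MB\n",
           (unsigned long)(largestFree / (1024 * 1024)));
    return 0;
}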
Source: https://stackoverflow.com/questions/2152492/process-sizes-and-differences-in-behaviour-on-32bit-vs-64bit-windows-versions