#1
Industry conversion to 64-bit and WOW64
Neither Microsoft, AMD, nor Intel is coming clean to the public about what the conversion to 64-bit architecture really entails for backward compatibility and for the performance of existing applications. Backward compatibility has been the bane of CPU designers for the last 20 years. To get clunky CISC chips running at 1.0 GHz and beyond, they had to redesign the core pipeline of the CPU as a RISC core, because RISC cores are much simpler than CISC cores and can therefore run cooler at higher clock speeds. The problem is that these new Pentium IIIs still needed to run software compiled to run natively on 486s, 386s, or even 286s (be that as it may). To pull this off, extra hardware was added to convert old CISC instructions on the fly into short sequences of RISC instructions that the RISC core could execute.

The idea behind the upcoming conversion to 64-bit is that we abandon backward compatibility, so that the complete computer, from OS to software to motherboard to CPU, is seamlessly integrated in architecture. But Microsoft realized very quickly that dropping compatibility with existing 32-bit apps would be a business nightmare (although it would be a utopia in terms of computer speed). So they bundled a WOW64 system with their 64-bit version of XP: Windows XP Professional x64 Edition. If you google the details of the WOW64 system, you will see that it is indeed called an EMULATOR of 32-bit apps, meaning the computer does not send 32-bit code directly to the CPU, but instead traps every incoming 32-bit instruction and converts it to 64-bit before sending it to the native 64-bit processor. Anyone familiar with the difference between emulation and "compiled native code" will realize that this entails a hefty performance degradation for 32-bit code run on a 64-bit platform.

How severe is this degradation? Essentially, every single memory reference in old 32-bit code has to be trapped so that all those 32-bit pointers can be converted into 64-bit pointers. Applications read and write memory nearly all the time; roughly 90% of executed instructions reference memory in some way, and only rarely do you have spans of code that juggle values in the CPU's registers alone. Ironically, if you google WOW64, you get webpages claiming that a 64-bit computer can run old 32-bit apps "without performance degradation". That is a bald-faced lie coming out of marketing.

I am glad that the CPU industry has finally given up wholesale on backward compatibility. It needed to be done at some point, even if it's a "growing pain" of sorts. But what has essentially happened is that backward compatibility has been outsourced to the operating system, in the hope that the end user will not "notice" the slowdown. Professionals running 32-bit business apps in an office probably will not "notice" anything at all. But those of you who work with intensive graphics applications or physics simulations, beware.
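To make the pointer-widening concern above concrete, here is a minimal Python sketch of the one place translation genuinely must happen: a pointer-bearing structure crossing the 32-to-64-bit boundary. The field layout, names, and address value are invented for illustration; this models the idea, not any actual WOW64 data structure.

```python
import struct

# A hypothetical 32-bit structure: a 4-byte pointer field and a
# 4-byte length (layout and values made up for illustration).
msg32 = struct.pack("<II", 0x0040F000, 128)

# A thunking layer must rebuild pointer-bearing structures for the
# 64-bit side: the pointer field widens from 4 to 8 bytes,
# zero-extended, since a 32-bit process's addresses all fit in the
# low 4 GB of the address space.
ptr32, length = struct.unpack("<II", msg32)
msg64 = struct.pack("<QI", ptr32, length)

print(len(msg32), len(msg64))  # 8 12
```

Note this kind of translation happens at the system-call boundary, per structure, not per executed instruction.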
#2
"HMS Beagle" wrote in message ...
> If you google the details of the WoW64 system you will see that it is indeed called an EMULATOR of 32-bit apps. [snip] Anyone who is familiar with the difference between emulation and "compiled native code" will realize that this entails a hefty performance degradation of 32-bit code run on a 64-bit platform.

I think I'll believe this more than I'll believe you:

'For example, the version of 64-bit Windows developed for the Intel Itanium 2 processor uses Wow64win.dll to set up the emulation of x86 instructions within the Itanium 2's unique instruction set. That's a more computationally expensive task than the Wow64win.dll's functions on the AMD64 architecture, which switches the processor hardware from its 64-bit mode to 32-bit mode when it's time to execute a 32-bit thread, and then handles the switch back to 64-bit mode. No emulation is required here.'

http://en.wikipedia.org/wiki/WOW64

-- Derek
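A quick way to see which word size a given process actually runs at is to check its pointer width. This sketch uses the standard library only; on a native 64-bit interpreter it reports 8, while a 32-bit process (e.g. one running under WOW64) would report 4.

```python
import ctypes
import struct

# Pointer width of the *current* process: 8 bytes in a native 64-bit
# build, 4 bytes in a 32-bit process such as one hosted by WOW64.
ptr_bytes = ctypes.sizeof(ctypes.c_void_p)

# struct.calcsize("P") reports the same native pointer size via the
# struct module, so the two should always agree.
print(ptr_bytes, struct.calcsize("P"))
```

The mode-switch question in the quoted passage is exactly about how a 4-byte-pointer process like this is hosted on an 8-byte-pointer kernel.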
#3
Derek Baker wrote:
> [snip] '...the Wow64win.dll's functions on the AMD64 architecture, which switches the processor hardware from its 64-bit mode to 32-bit mode when it's time to execute a 32-bit thread, and then handles the switch back to 64-bit mode. No emulation is required here.'
> http://en.wikipedia.org/wiki/WOW64

I guess that's just one more reason to stick with the AMD64 line of CPUs. My XP3700 Clawhammer feels so much faster than any Intel processor I've used, and it runs a LOT cooler!!!
#4
"andrew" wrote:
> My XP3700 Clawhammer feels so much faster than any Intel processor I've used, and it runs a LOT cooler!!!

Wait until you get into the 90nm models. They are amazing!

-- Ed Light
Smiley :-/ MS Smiley :-\
Send spam to the FTC at
Thanks, robots.
#5
HMS Beagle wrote:
> [snip discussion of WOW64 and 32-bit performance degradation]

Let's not forget: apps that aren't easy to port to a 64-bit compile come down to poor design and programming in the first place...
#6
On Mon, 20 Jun 2005 11:59:43 +0100, "Derek Baker" wrote:
> '...the AMD64 architecture, which switches the processor hardware from its 64-bit mode to 32-bit mode when it's time to execute a 32-bit thread, and then handles the switch back to 64-bit mode. No emulation is required here.'

But wait! That looks and smells and sounds like full hardware backward compatibility. You do realize that having a 32-bit pipe inside the 64-bit CPU adds extra circuitry, and extra circuitry means the chip runs hotter and is therefore less stable at higher clock speeds. I know I realize this. But do you?

My understanding was that this conversion to 64-bit was going to abandon backward compatibility at the hardware level so that we could run these next-generation processors cooler, and hence faster. Maybe I was wrong. Maybe the chip manufacturers have caved in and decided to do the same old thing they have been doing for the last 20 years: conserving old, clunky hardware designed to run code written 10 years ago.
#7
"HMS Beagle" wrote:
> You do realize that having a 32-bit pipe inside the 64bit CPU adds extra circuitry, and extra circuitry means that you run hotter, thus you are less stable at higher clock speeds. I know I realize this. But do you?

The Winchesters and later run _really_ cool. Like, on 2/3 or less the power of the former ones. When mine is at 100% along with the video card, it only reaches 45C if it's warm inside. I do have an Arctic Freezer 64 (with the ratty base) on it with no fan, and an aluminum tape duct to the 80mm 2300 rpm case fan, so it's not a typical case, but they are fabulously low-power.

-- Ed Light
Smiley :-/ MS Smiley :-\
Send spam to the FTC at
Thanks, robots.
#8
"HMS Beagle" wrote in message ...
> But wait! That looks and smells and sounds like full hardware backward compatibility. You do realize that having a 32-bit pipe inside the 64bit CPU adds extra circuitry, and extra circuitry means that you run hotter, thus you are less stable at higher clock speeds. I know I realize this. But do you?

There is ONE pipe for each task; it does both 64-bit and 32-bit.

-- Derek
#9
"HMS Beagle" wrote:
> But wait! That looks and smells and sounds like full hardware backward compatibility. You do realize that having a 32-bit pipe inside the 64bit CPU adds extra circuitry, and extra circuitry means that you run hotter, thus you are less stable at higher clock speeds. I know I realize this. But do you?

Hmm, maybe you should read up on the AMD 64-bit extensions before posting this misinformation around. The chip doesn't have both a 64-bit and a 32-bit pipe; it uses a single pipe in which only the lower 32 bits are used when executing 32-bit code. Hence, running 32-bit code means only the lower half of the pipe is being utilized.

> My understanding is that this conversion to 64bit was going to abandon backward compatibility (at the hardware level) so that we can run these next-generation processors cooler, and hence faster.

You are thinking of Intel's Itanium 64-bit chip. It requires software emulation to run legacy 32-bit applications. AMD's 64-bit extensions are fully backward compatible, to the point that the 64-bit chips can boot and run a 32-bit OS that is unaware it is running on a 64-bit CPU. The fact that I can install a plain copy of 32-bit Windows or Linux without any special emulation software is proof that no emulation occurs. The fact that 32-bit games run blazingly fast on 32-bit Windows, in fact faster than 64-bit games on 64-bit Windows, is also proof that there is no performance penalty.

> Maybe I was wrong. Maybe the chip manufacturers have caved in and decided to do the same old stuff they have been doing for the last 20 years.

While AMD64 is backward compatible with 32-bit code, it is hardly old, clunky hardware. The 64-bit architecture adds a significant number of new 64-bit registers to the programming model, with the original 32-bit registers extended to a full 64 bits. This alone allows compilers to do a much better job of optimizing code; some Linux tests showed performance improvements of 30% from this fact alone, while encryption software that can make full use of the 64-bit registers benefits by 200%. The architectural features of this new chip match those of any other 64-bit architecture. Also, thanks to the variable-length instruction word, code for AMD64 is much smaller than the same code for a 64-bit RISC chip, so AMD64 uses less memory bandwidth to read its instruction stream, leaving more available for data (although it makes instruction decoding more complex, we can live with complex if it makes the chip faster).
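The compatibility behavior described above, where 32-bit code uses only the low half of the chip's datapath, can be modeled in a few lines. This is a toy model of 32-bit register semantics, not a simulation of any real pipeline: results are simply truncated to 32 bits, which is also how x86-64 hardware treats 32-bit operations (the upper half of the destination register is zeroed).

```python
MASK32 = 0xFFFFFFFF  # the low 32 bits of a 64-bit register


def add32(a, b):
    """Toy model of a 32-bit ADD running on 64-bit hardware:
    only the low 32 bits participate, and the result wraps."""
    return (a + b) & MASK32


# 0xFFFFFFFF + 1 wraps to 0, exactly as in a real 32-bit register.
print(add32(0xFFFFFFFF, 1))
```

Because the wraparound falls out of simple masking, supporting 32-bit arithmetic on a 64-bit ALU costs essentially no extra datapath, which is the poster's point about there being one pipe rather than two.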
#10
/me thinks somebody is getting 0wned!

BL.

- -- Brad Littlejohn | Unix Systems Administrator, Web + NewsMaster, BOFH.. Smeghead! | http://www.sbcglobal.net/~tyketto
PGP: 1024D/E319F0BF