quote: As the term became common after the introduction of the 80386, it usually implies binary compatibility with the 32-bit instruction set of the 80386. This may sometimes be emphasized as x86-32 or x32 to distinguish it either from the original 16-bit "x86-16" or from the 64-bit x86-64. Although most x86 processors used in new personal computers and servers have 64-bit capabilities, to avoid compatibility problems with older computers or systems, the term x86-64 (or x64) is often used to denote 64-bit software, with the term x86 implying only 32-bit.
quote: So you're both going to pretend that the convention isn't to use "x86" to refer to 32-bit stuff and "x64" to refer to 64-bit stuff?
quote: illiteratehack writes "10 years ago AMD released its first Opteron processor, the first 64-bit x86 processor. The firm's 64-bit 'extensions' allowed the chip to run existing 32-bit x86 code in a bid to avoid the problems faced by Intel's Itanium processor. However, AMD suffered from a lack of native 64-bit software support, with Microsoft's Windows XP 64-bit edition severely hampering its adoption in the workstation market." But it worked out in the end.
quote: I have more expertise about computer hardware in my left pinky than either of you children will ever learn.
quote: The insistence of toddlers that long-established norms like x86 vs. x64 to denote 32-bit vs. 64-bit somehow never existed