Edited on Thu Apr-22-10 12:05 AM by RoyGBiv
Why is 32-bit still the "default"?
I mean, I understood it in the beginning. Processors that supported 64-bit were the exception, and there was little incentive to make 64-bit apps, drivers, etc. available except for "enthusiasts" living on the bleeding edge.
But that's changed.
I realize it's a lot better than it once was, but some fairly major applications still don't have 64-bit versions. Firefox comes to mind. You can get a 64-bit build, but it's not the standard release, and then you have to deal with certain extensions that won't function in a 64-bit environment or that need to be compiled separately just for your machine. Running a 32-bit app in a 64-bit environment isn't really a problem, but why should it still be necessary? Why is Flash's 64-bit version still an alpha product (even though it seems to work fairly well)? Why did I just spend 30 minutes compiling Thunderbird and then compiling Enigmail just so I could have a 64-bit version and use GnuPG with it?
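From what I can gather, the extension problem comes down to the binary interface: a compiled extension has to match the architecture of the program it loads into, so a 32-bit plugin simply can't be loaded by a 64-bit browser. Here's a minimal sketch of the difference (my own illustration, not anything from Mozilla's code), showing how basic type sizes change between the two builds:

/* Sketch only: pointer and long sizes differ between 32-bit and 64-bit
 * builds on Linux, which is why binary extensions must be recompiled
 * for the architecture of the host application. */
#include <stdio.h>

int main(void)
{
    printf("sizeof(void *) = %zu\n", sizeof(void *));
    printf("sizeof(long)   = %zu\n", sizeof(long));
    /* Built with `gcc -m64` this prints 8 and 8;
     * built with `gcc -m32` it prints 4 and 4. */
    return 0;
}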
I don't remember this kind of lag before 32-bit became the standard. Not being a programmer, I admit to a vast degree of ignorance here. Is there a good, practical reason for it?
OnEdit: One reason I find this odd is that Mozilla has 64-bit versions of its daily builds in its repositories. The final product they release, however, is 32-bit. I don't get it.