64-bit hardware requires more silicon to build. That means more dollars, which kills the desire to go to appliance-type computing.
There are still 8-bit microprocessors being used in embedded applications everywhere, though 16-bit parts are gaining ground. Smaller means cheaper, which means mainstream.
As for upgrading industrial hardware to support new software: our latest fiasco at work was horrid. The new $100 motherboard did not have enough of the right PCI slot types for the $5k specialized serial-link boards we must use, so we lost one of three data channels. The new ViewSonic 26-inch widescreen monitors do not work right with the new motherboard's video (yes, we tried varying resolutions, HDMI vs. component, different cables, etc.), and text is so blurry it is illegible. The 19-inch CRTs that were failing after 12 years of service were still crystal sharp.
My personal PC got an upgrade from a 21-inch CRT to a 25-inch Samsung LCD. I could not read the text without severe eyestrain; small fonts appeared as black smears. Our IT "monitor expert" came by and wasted 4 hours of his time and mine trying to get something usable. (That's more dollars than a new Alienware system would cost!) He went through 2 additional, different new monitors (Samsung and HP) and 2 new cables. Two of the three monitors he saw immediately had mask issues causing ghosting, and he sent them back. With the best cable and monitor he could find, he finally realised the older ATI video card was not driving the monitor adequately. He tried another video card, which did not work properly in that PC. Then he threw up his hands and said to get the guy his old monitor back in lieu of a total system replacement. Then, to get the CRT back instead of an LCD, I had to waste 2 hours going to plant medical for special dispensation to have a CRT... My CRT was long gone, so they gave me another scrap one which every day or so develops a case of the jitters where the fonts go all jagged until I cycle power.
Upgrade hardware in an industrial environment? HELL NO!