I was recently reading Tracy Kidder’s excellent book The Soul of a New Machine.

The author pointed out what a big deal the transition to 32-bit computing was.

However, in the last 20 years, I don’t really remember a big fuss being made over most computers going to 64-bit as a de facto standard. Why is this?

  • GomaEspumaRegional@alien.top · 10 months ago

    It depends on what you mean by 64-bit computing, which is not the same thing as x86 becoming a 64-bit architecture.
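
    In practice, “64-bit” usually refers to the width of the native registers and pointers, and hence the address space. As a rough illustration (my own sketch, not tied to any particular machine discussed here), a small C program can show what a given build uses; on an LP64 system such as x86_64 Linux it prints 8/8/8, while a 32-bit x86 build prints 4/4/4:

    ```c
    #include <stdio.h>

    int main(void) {
        /* Rough proxy for the "bitness" of a build: pointer and word sizes. */
        printf("sizeof(void *) = %zu\n", sizeof(void *));
        printf("sizeof(long)   = %zu\n", sizeof(long));
        printf("sizeof(size_t) = %zu\n", sizeof(size_t));
        return 0;
    }
    ```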

    FWIW, 64-bit computing had been a thing in the supercomputer/mainframe space since the 70s, and high-end microprocessors had supported 64-bit since the early 90s.

    So by the time AMD introduced x86_64, there had already been about a quarter century of 64-bit computing ;-)

    It was a big deal for x86 vendors, though, as that is when x86 took over most of the datacenter and workstation markets.