I was recently reading Tracy Kidder’s excellent book The Soul of a New Machine.

The author pointed out what a big deal the transition to 32-bit computing was.

However, in the last 20 years, I don’t really remember a big fuss being made about most computers going 64-bit as a de facto standard. Why is this?

  • JaggedMetalOs@alien.top · 10 months ago

    I think it’s more that in the PC world, early x86 really wasn’t very good, and the 386 brought not just 32 bits but also a lot of other improvements (like paged virtual memory and a proper protected mode) that made more advanced OSes like Windows 95, NT, and Linux possible. So you had this major step up in capability, not specifically because of the move from 16 to 32 bits but happening at the same time.

    IIRC for platforms that used the Motorola 68k, the move from the 16-bit 68000 to the 32-bit 68020 wasn’t nearly as big, because the chips were more similar (the 68000 was kind of a 16/32-bit hybrid anyway, with 32-bit registers on a 16-bit bus; the Atari ST was even named after Sixteen/Thirtytwo).

    And the move from 32-bit to 64-bit CPUs in modern times was the same: there wasn’t any major step up in capability other than a much larger RAM address space.
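    To put rough numbers on that, here’s a minimal C sketch that just prints the flat address-space limits implied by pointer width (4 GiB is the theoretical 32-bit ceiling; real OSes usually gave each process even less):

        #include <stdio.h>
        #include <stdint.h>

        int main(void) {
            /* Pointer width on whatever machine this is compiled for. */
            printf("pointer width here: %zu bits\n", sizeof(void *) * 8);

            /* Flat address space implied by pointer width:
               2^32 bytes = 4 GiB, 2^64 bytes = 16 EiB. */
            unsigned long long limit32 = (unsigned long long)UINT32_MAX + 1;
            printf("32-bit flat limit: %llu bytes (4 GiB)\n", limit32);
            printf("64-bit flat limit: 2^64 bytes (16 EiB)\n");
            return 0;
        }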

    And obviously for consoles, “bits” were still a big marketing gimmick at the time, so calling the newer console generation “32-bit” was a big thing even though it doesn’t really mean anything (e.g. bits dropped from the 64-bit N64 to the 32-bit GameCube, because no one cared about bits anymore).