I was recently reading Tracy Kidder’s excellent book The Soul of a New Machine.

The author pointed out what a big deal the transition to 32-bit computing was.

However, in the last 20 years, I don’t really remember a big fuss being made over most computers going 64-bit as the de facto standard. Why is this?

  • noiserr@alien.topB · 10 months ago

    384 bits for the GeForce 4090

    384-bit is the memory bus width. AMD’s Hawaii (R9 290X) had a 512-bit bus back in 2013. Not to be confused with the width of the data types used for calculations.
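
    A minimal sketch of that distinction, assuming a typical 64-bit platform and using the commonly cited 4090 numbers (384-bit bus, 21 Gbps GDDR6X) purely as example figures: data-type width is what software sees, while bus width only shows up as memory bandwidth.

    ```c
    /* Sketch: "64-bit" for a CPU means the width of pointers and registers,
     * which software can observe directly. A GPU's "384-bit" figure is the
     * memory bus width, a hardware property visible only as bandwidth.
     * The 384-bit / 21 Gbps values are example figures for the RTX 4090. */
    #include <stdio.h>

    int main(void) {
        /* On a 64-bit platform, pointers and size_t are 8 bytes (64 bits). */
        printf("pointer width: %zu bits\n", sizeof(void *) * 8);
        printf("size_t width:  %zu bits\n", sizeof(size_t) * 8);

        /* Bus width determines peak bandwidth:
         * bandwidth = (bus width in bits / 8) * per-pin data rate. */
        const double bus_width_bits = 384.0;
        const double data_rate_gbps = 21.0; /* effective GDDR6X rate per pin */
        double bandwidth_gb_s = (bus_width_bits / 8.0) * data_rate_gbps;
        printf("peak memory bandwidth: %.0f GB/s\n", bandwidth_gb_s); /* ~1008 */
        return 0;
    }
    ```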