x86 came out in 1978.

21 years later, x64 came out in 1999.

We're three years overdue for a shift, and I don't mean ARM. Is there just no point to it? 128-bit computing is a thing and has been talked about since 1976, according to Wikipedia. Why hasn't it been widely adopted by now?

  • ET3D@alien.top · 1 year ago

    Lots of good responses regarding why 128-bit isn’t a thing, but I’d like to talk about something else.

    Extrapolating from two data points is folly. It simply can't work: you can't take two events, calculate the time between them, and then assume the next event will happen after the same interval.

    Besides, your data points are wrong. (Edit: this has also been mentioned in another response.)

    x86 (8086) came out in 1978 as a 16-bit CPU. 32-bit came with the 386 in 1985. x64, although described in 1999, was released in 2003.

    So now you have three data points: 1978 for 16-bit, 1985 for 32-bit and 2003 for 64-bit. Differences are 7 years and 18 years.

    Not that extrapolating from 3 points is good practice, but at least it's more meaningful. You could, for example, note that the move from 32-bit to 64-bit took about 2.5 times as long as the move from 16-bit to 32-bit. Multiply 18 years by 2.5 and you get 45. So the move from 64-bit to 128-bit would be expected around 2003 + 45 = 2048.
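
    To make that arithmetic concrete, here's a toy Python sketch of the same extrapolation. The release years and the rounded 2.5× factor come straight from the numbers above; nothing else is assumed:

    ```python
    # Release years of each x86 register-width jump (from the comment above).
    transitions = {16: 1978, 32: 1985, 64: 2003}

    gap_16_to_32 = transitions[32] - transitions[16]  # 7 years
    gap_32_to_64 = transitions[64] - transitions[32]  # 18 years

    # Assume each gap grows by the same factor as the last one.
    # 18 / 7 ≈ 2.57, rounded to 2.5 as in the text above.
    growth = 2.5
    predicted_gap = int(gap_32_to_64 * growth)  # 45 years

    print(f"128-bit 'expected' around {transitions[64] + predicted_gap}")
    # -> 128-bit 'expected' around 2048
    ```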

    This is nonsense, of course, but at least it’s a calculation backed by some data (which is still rather meaningless data).