• ZaadKanon69@alien.topB · 1 year ago

      Maybe, but it would be unnecessary because RDNA4 caps out at midrange, so GDDR6 would suffice. AMD decided to cut the RDNA4 high end to produce more AI chips and earn more money, and to give their engineers more time to get multiple GPU chiplets on one card working for RDNA5, which is where the real performance boom is. RDNA3 still has one graphics die; only the memory controller and cache are on separate chiplets.

      So the 7900XTX will remain AMD’s flagship until RDNA5.

      There might be some kind of refresh of the 7900XT(X) with slightly better performance and efficiency; maybe those would use GDDR7, if that proves possible and economical.

      The good news for current owners is that the 7900 cards have plenty of VRAM to last until RDNA5. The bad news is there will be no competition for the 5080 and 5090, so expect even higher MSRPs than the 4000 series. $2500 MSRP for a 32GB 5090 wouldn’t surprise me. And $1500 for the 5080, the “gaming flagship”.

      If you were waiting for next gen hoping value would improve vs the 4000 series… I hope you have even more patience.

      The moment I heard the news that the RDNA4 high end had been scrapped and the monster chiplet design moved to RDNA5, plus the high AI demand and lack of production capacity, I pulled the trigger on a 7900XT. Next gen is going to be absolutely bonkers on the Nvidia side, and nothing better will be released on the AMD side other than software improvements and maybe a refresh of the 7900 cards, but that’s it. This card with 20GB VRAM will last me until RDNA5/RTX6000.

      JayzTwoCents also made a video a while ago voicing his opinion that you should buy a GPU now because it’s only gonna get worse in the coming years. A situation arguably worse than the crypto boom, combined with a lack of competition for the 80 and 90 series and Nvidia’s Apple approach… Bad news.

      Intel won’t have a truly viable product for general gaming within this timeframe either. Even today their drivers are light-years behind both AMD and Nvidia, with performance all over the place depending on the individual game. And Intel too is making AI chips based on its GPUs. The consumer GPUs are like a proof of concept.

      • Jeep-Eep@alien.topB · 1 year ago

        It might be a way to improve perf, either straight up or by cutting wattage. And even with the big models scrapped - assuming a possible AI crash doesn’t have those tapeouts pulled from storage - they likely have RDNA4’s GDDR7 memory controller taped out, if the line was ever meant to use it.

        • ZaadKanon69@alien.topB · 1 year ago

          It would also increase cost, and Navi43/44 don’t need that extra performance. They will probably be slower than, or at best match, a 7800XT. On the flip side they’ll be dirt cheap too, probably with a good chunk of VRAM - really good low-to-midrange cards that actually work in games, unlike Intel’s.

          Unless AMD is already going for multiple graphics chiplets and they just slap four Navi43 chiplets onto one card to create a flagship. That would be pretty epic, and it’s an option still on the table; they already have a proper functional chiplet design for their AI cards. But I think we won’t see that until RDNA5.

    • Wfing@alien.topB · 1 year ago

      The current lineup doesn’t even use GDDR6X, and it’s selling like shit. There’s just no way.