• ThunderClap449@alien.topB · 10 months ago

    I’m fine with it. Consumers reap what they sow, basically. AMD is likely gonna drop high-end GPUs in general, if not dedicated GPUs completely.

    • lugaidster@alien.topB · 10 months ago

      I doubt this. They just might drop TSMC for them and go with Samsung. TSMC is way too expensive for big dies, given people’s reluctance to pay Nvidia-like prices for similar performance from AMD. At the end of the day, AMD doesn’t have a significant edge from a cost perspective. Chiplet benefits are cool, but AMD needs an interposer, so the cost advantage might not be as impressive, and the 6nm dies might not be as cheap given that they’re still manufactured at TSMC.

      So TL;DR: I think they might shift their focus away from the higher-end market, but I doubt they will abandon it entirely. They might just take a break from it.

    • DeeJayDelicious@alien.topB · 10 months ago

      I mean, we can debate “high-end”. By RDNA 5, we should have 4K @ 120 fps as a baseline for all dedicated GPUs. Where do you go after that in consumer GPUs?

      While there will always be a small enthusiast market for super-high-end GPUs, I’m not sure the mainstream will be interested in pushing 240 fps. Maybe Nvidia sees the writing on the wall, which is why they’re pivoting away from consumer-focused GPUs.

      And if AMD continues to serve us solid $300–600 dGPUs until then, I think that’s still a win. I don’t think the market for >$1,000 dGPUs is that large anyway.

      • ThunderClap449@alien.topB · 10 months ago

        I mean, all of that assumes requirements won’t keep increasing. Ray tracing artificially inflates performance requirements once you start hitting the ceiling of what’s possible, and the same will happen once RT itself starts getting maxed out.

        • chapstickbomber@alien.topB · 10 months ago

          To be fair, “highest-selling high-end GPU ever” for AMD is still not a lot compared to NV or their own midrange stuff.

          • ResponsibleJudge3172@alien.topB · 10 months ago

            Sorry, I misread. I thought you said highest-selling GPU, which is what I’ve also read elsewhere.

            Seems to me the 7800 XT is their best performer, but I’m not sure.

      • ThunderClap449@alien.topB · 10 months ago

        From the last few gens? Not even close to the best. Certainly not more than the Radeon 9000 series, HD 4000 series, HD 5000 series, HD 7000 series, or R9 200 series.

        The 7900 XTX, one year after launch, is at 0.19% on Steam.

        A year after the 6900 XT released, the 6900 series was at 1.19%, roughly six times the share.

        Their last high-end card before the 6900 was the 390X, which there’s no Steam hardware survey data for, but the 7970 was ahead of it… Yeah, not even close.

        • Pl4y3rSn4rk@alien.topB · 10 months ago

          Their last “high end” before the RX 6900 XT was the Radeon VII, and before that the Vega 64. Yeah, they could only compete with the RTX 2080 and GTX 1080 respectively, but the same goes for the RX 7900 XTX, which can only compete with the RTX 4080…

          PS: Also, the high-end AMD GPU before the Vega 64 was the R9 Fury X (the R9 390X was a 290X refresh that launched in the same period). It was quite competitive with the GTX 980 Ti, but its 4 GB of HBM and the necessity of water cooling limited its sales…

          • NoLikeVegetals@alien.topB · 10 months ago

            “quite competitive with the GTX 980 Ti”

            The Fury X was an instant no-buy for high-end 4K gamers, due to the measly 4GB of VRAM.

            Just as the RTX 4080 should be a no-buy for high-end 4K gamers, due to the measly 16GB of VRAM. In a year’s time, AAA RT-enabled games will suck up >16GB at 4K.

            • Pl4y3rSn4rk@alien.topB · 10 months ago

              Surely the 4080 might not age very well, but it’s very likely its RT performance will fall short before VRAM becomes an issue; even Alan Wake II limits its use of path tracing at max settings and still uses a decent amount of raster.

            • Pl4y3rSn4rk@alien.topB · 10 months ago

              Sadly they weren’t that impactful besides the Vega 56 (it competed very well with the GTX 1070, and Nvidia launched the GTX 1070 Ti because of it); they consumed too much power at stock because of overvolting, and they launched way too late…

              • ThunderClap449@alien.topB · 10 months ago

                Yep, they were honestly decent cards, just launched at the wrong time. Same issue nVidia had with the 400 series, minus the whole “overpriced to fuck, trying to scam customers” kind of deal they had with the benchmarking requirements for reviewers.

              • LittlebitsDK@alien.topB · 10 months ago

                Loved my Vega 56; it performed well, and a little undervolting fixed the power issue big time… didn’t feel like I was missing out for “not buying Nvidia”.

                • Pl4y3rSn4rk@alien.topB · 10 months ago

                  It really was a good GPU. Sadly, the first impression Vega gave wasn’t good: on top of launching a year late, it was overvolted and could barely reach the GTX 1080…

                  I would’ve gotten one if they weren’t so uncommon in my country; even Navi was a lot easier to find on the used market, so I ended up getting an RX 5700 that’s serving me very well!

        • Pristine_Pianist@alien.topB · 10 months ago

          Again, the Steam numbers aren’t accurate, as the data is gathered from a pool of people who opt in to the survey. That pool could be 500 or 5,000 people; we wouldn’t know (a quick sketch at the end of this subthread puts numbers on this).

          • NoLikeVegetals@alien.topB · 10 months ago

            Steam is also heavily biased towards Nvidia users. I’d like to see stats that exclude China, which is flooded with Nvidia GPUs, especially in its internet cafes. The other issue is that Steam seems to count the same cafe PC twice if two opted-in gamers log onto that same PC.
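
            To put numbers on the pool-size point above: a minimal Python sketch, assuming simple random sampling (an assumption on my part; Steam’s actual survey methodology is not public), of how the margin of error on a share as small as 0.19% scales with the size of the survey pool:

            ```python
            import math

            def margin_of_error(share: float, n: int, z: float = 1.96) -> float:
                """95% margin of error for a sampled proportion, assuming
                simple random sampling (a simplification; Steam's real
                methodology is not public)."""
                return z * math.sqrt(share * (1 - share) / n)

            # 0.19% is the 7900 XTX share quoted upthread; 500 and 5,000 are
            # the hypothetical pool sizes floated in the comment above.
            share = 0.0019
            for n in (500, 5_000, 1_000_000):
                print(f"n={n:>9,}: {share:.2%} +/- {margin_of_error(share, n):.2%}")
            ```

            With a pool of 500, the error bar (±0.38%) is twice the share itself, so a tiny pool really would make a 0.19% reading meaningless; the objection only fades once the sample reaches the hundreds of thousands.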

      • NoLikeVegetals@alien.topB · 10 months ago

        “Which is hilarious because 7900 XT(X) is the highest selling high end GPU they’ve ever made afaik”

        That sounds plausible, but only because the total addressable market for GPUs is so much bigger now.

        The real measure is the ratio of 7900 XTX sales to RTX 4080 sales, and likewise to the 4090.

        I’m pretty sure the 4090 is outselling the 7900 XTX by something like 20:1…
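
        For what it’s worth, a ratio claim like that can be loosely sanity-checked against the Steam shares quoted upthread. A minimal Python sketch; the 4090 share below is a hypothetical placeholder, not a figure from this thread:

        ```python
        # Loose sanity check of a sales-ratio claim, using Steam survey
        # shares as a proxy for installed base.
        xtx_share = 0.0019       # 7900 XTX share quoted upthread
        rtx_4090_share = 0.0095  # HYPOTHETICAL placeholder, not a real figure

        ratio = rtx_4090_share / xtx_share
        print(f"Implied installed-base ratio: {ratio:.0f}:1")  # 5:1 with these inputs
        # Caveats raised upthread still apply: regional bias, cafe
        # double-counting, and opt-in effects all skew survey shares.
        ```

        The method matters more than the output here: plug in real survey shares and an implied installed-base ratio falls out directly, though it still only proxies actual sales.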