• rW0HgFyxoJhYka@alien.topB · 11 months ago

      AMD kinda looks content to just copy whatever NVIDIA does and take their 10% market share by default, thanks to people who hate NVIDIA.

      • James20k@alien.topB · 11 months ago

        due to people who hate NVIDIA.

        I’ve always bought AMD personally, partly because of NVIDIA’s anticompetitive business practices, but largely because the actual hardware has always been better for the same price. E.g. I needed a GPU with the most VRAM/$ recently for some compute, and a 6700 XT was a clear win. I suspect a lot of people go AMD for basically this reason.

        Increasingly though, their software support is so bad it’s making it hard to justify buying a new AMD GPU even if it is cheaper.

    • BrainSweetiesss@alien.topB · 11 months ago

      They are too busy increasing their GPU prices after having one, maybe two, successful releases in the last 5 years.

    • MrPapis@alien.topB · 11 months ago

      In reality all this special sauce is cool, I think no one disagrees, but it’s far from most of gaming. Even in the games where it is cool, you’re still actively degrading performance by 2-4x. There’s definitely something to be said for a native non-RT image being better than upscaled+FG+RT/PT, but obviously at the cost of the very cool lighting/shadow tech. Which, let’s be honest, for the most part (path tracing notwithstanding) isn’t even better looking. It is more realistic, though, and I do appreciate that, but at 30-60 fps instead of 60-120? Eh, it isn’t a clear win, just different.

      My main games this year are Starfield, the CP2077 DLC, Diablo and Call of Duty. It’s one DLC of a game I’ve finished twice where Nvidia tech actually matters.

      I don’t give a shit about Alan Wake. The first one was a glorified tech demo. The second one seems more like a game, but it’s also mostly just a tech demo/indie game, and let’s be real, most people wouldn’t give a crap about it if it weren’t for the Nvidia promotion. No one cared about the old one, that’s for sure. People are told to care about this game by the green overlord, but it’s the worst excuse ever for Nvidia superiority. Doubly so because even Nvidia can’t run it properly. And if you actually want good performance, aka non-RT/PT, AMD is even outperforming Nvidia. The 7900 XTX being closer to the 4090 than the 4080 shows a real advantage you can bet on, instead of this tech that’s still very much vaporware/not really used a lot.

      CP2077 and Ratchet and Clank, those are 2 real games where Nvidia really triumphs with their special sauce. And are we really so blinded that we’re going to base purchasing decisions on a few outliers in a sea of it literally just not mattering?

      The answer is yes, because of FOMO, not because the Nvidia special sauce is anything much.

      • Relevant_Force_3470@alien.topB · 11 months ago

        Nvidia wipe their cocks all over AMD’s face in Cyberpunk. Not sure about the other games, as they’re a bit shite for my liking. COD I assume runs on a toaster.

        Alan Wake 2 is reviewing very well. It’s a good game that also utilises modern tech; just what us gamers crave. So it is very relevant.

        • MrPapis@alien.topB · 11 months ago

          Alan Wake is a 20-hour horror puzzle game. How many horror puzzle games actually appeal to a large number of people? It’s a far cry from being anything for the mass market. I know that’s how they market it, but it just isn’t. People will marvel at their 30 fps path-traced slideshow because they have to find joy in what they purchased, but by far most people would enjoy the game more with it off, you know, because then you can actually run at 60 fps or more, which for over a decade has been the absolute minimum for a high-end system.

          The 4090 can’t even do 1080p ultra. Saying “look how good they do” when they are barely running the game is just weird.

          And again, the 7900 XTX wipes the floor with the 4080 without the tech that makes the game run like crap on all hardware. It’s not like 25 fps is actually usable on 1000+ dollar hardware. That’s simply unacceptable performance for anyone but diehard Nvidia tech fans.

          • DarkLord55_@alien.topB · 11 months ago

            30 fps is far from a slideshow. And yes, I would be completely happy playing at 30 fps with path tracing in Alan Wake 2.

      • DarkLord55_@alien.topB · 11 months ago

        Except with ray tracing getting easier to run, rasterization is probably in its final days. Within maybe 2 or 3 generations of GPUs, I’d say most developers will mainly focus on RT/PT over raster, as it’s easier to design and develop with. They will probably still ship a raster mode, but it probably won’t look great and will be unoptimized.

        So, being faster at RT right now, Nvidia cards will age a lot better than their AMD counterparts, offsetting AMD’s cheaper prices.

        • MrPapis@alien.topB · 11 months ago

          This is completely faulty logic. What we are seeing is ever-increasing visual fidelity, and RT gets harder and harder to run. Just look at the old RT GPUs: they become irrelevant with every new generation, and that will get worse before it gets better. Maybe by the 7000 series your possible future will come to fruition, but today, and for the next 5 years, things are looking very weak, and every Nvidia GPU will become obsolete much faster instead of lasting better. It’s a point that isn’t true today, so why would it be tomorrow?

          Please think some more about it, you’re really not making sense.

    • James20k@alien.topB · 11 months ago

      The most bizarre thing is that things like XeSS are better than FSR. There’s literally no reason for Intel to be able to do a better job here, especially because their solution is way newer.

      I have no idea what’s going on on AMD’s software side of things, but chronic underinvestment seems to be the defining feature of the entire software space of their GPU department. Their hardware has traditionally been a lot more powerful in terms of raw underlying compute, they’ve just always done a much worse job of bringing any of it to bear.

      So you end up with FineWine™ memes, which might as well read “our driver department is so underfunded that we’ve dropped XX% of our cards’ performance to software issues”.

      • Speedstick2@alien.topB · 11 months ago

        The software is probably fine; it’s just that the software needs hardware AI acceleration in order to create the quality of image we expect from DLSS and XeSS.

        I’m quite frustrated with AMD for not having true Tensor-core equivalents on RDNA 3. They have AI accelerators, but those don’t have all the capabilities of Tensor cores that would allow them to do hardware-accelerated upscaling.

      • Relevant_Force_3470@alien.topB · 11 months ago

        Their software is indeed terrible. But they just seem so blinded on the hardware side by raw raster and MOAR VRAM. That makes no sense in modern games. Even Intel are doing a version of XeSS based on their AI cores. AMD is always just so far behind, software AND hardware.