• EmilMR@alien.topB
    11 months ago

    Ray reconstruction is the main difference maker, and I don’t think AMD can work around this one without hardware acceleration. Traditional denoisers have existed for ages, and with them it’s impossible to even attempt what Nvidia is doing.
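
    For context on what ray reconstruction replaces: a traditional real-time denoiser is hand-tuned temporal accumulation plus a spatial filter, with fixed heuristics rather than a trained network. A minimal NumPy sketch of that traditional approach (a generic illustration, not any engine’s actual denoiser; real ones use edge-aware filters guided by depth and normals):

    ```python
    import numpy as np

    def denoise(noisy, history, alpha=0.15, k=5):
        """One frame of a hand-tuned denoiser: temporal accumulation + blur.

        noisy, history: HxWx3 float arrays (1-sample-per-pixel radiance and
        the running average carried over from previous frames)."""
        # Temporal pass: exponential moving average across frames.
        acc = alpha * noisy + (1.0 - alpha) * history
        # Spatial pass: k x k box blur to hide residual noise.
        pad = k // 2
        padded = np.pad(acc, ((pad, pad), (pad, pad), (0, 0)), mode="edge")
        h, w, _ = acc.shape
        out = np.zeros_like(acc)
        for dy in range(k):
            for dx in range(k):
                out += padded[dy:dy + h, dx:dx + w]
        return out / (k * k), acc  # denoised frame, updated history
    ```

    The fixed blend weight and blur radius here are exactly what smear fine detail and ghost in motion; ray reconstruction swaps such heuristics for a network evaluated on the tensor cores, which is why it needs dedicated acceleration.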

  • JonWood007@alien.topB
    11 months ago

    I love how whenever you have these comparisons you always get these weirdo Nvidia people who act like DLSS is so much better, when they both look… exactly the same to me.

    • v8rumble@alien.topB
      11 months ago

      Hasn’t always been the case. It looks like Remedy did a good job tweaking/implementing it.

    • NewestAccount2023@alien.topB
      11 months ago

      I have an Nvidia card and can test each setting, and every one I’ve tested has FSR looking noticeably worse. Videos are no substitute for seeing it with your own eyes at 100+ fps, at native res, with no video compression.

    • Spartancarver@alien.topB
      11 months ago

      Important context here: this person is playing on a 22” 1080p screen lol

      No I’m not kidding

      • JonWood007@alien.topB
        11 months ago

        Yes. I literally paid $60 for it, if you must know. Not everyone is a rich frick with a 4090 and a screen that takes up your whole wall. Bet you also have some obscene sound system that pisses off the entire neighborhood too, while I use these ratty old $50 headphones.

    • Temporary-Map4810@alien.topB
      11 months ago

      Can someone please explain to me, apart from the shimmer which is clearly visible like OP said, how DLSS is a trillion times better and worth paying hundreds of dollars more for? Like, can you articulate it? Because I have an RTX 3050 Ti and I use both DLSS and FSR at 1080p. DLSS has fewer motion artifacts, but apart from that, I would not consider it a million times better to the point that I would buy a slower and more expensive card just to have DLSS.

      • conquer69@alien.topB
        11 months ago

        Because motion artifacts are the main downside of playing at a lower resolution. The constant pixel shimmering, ghosting and unstable image sucks ass, and DLSS greatly alleviates it.

        FSR2 still has all those visual artifacts, which makes upscaling from a lower resolution an all-around worse option for AMD cards. Even native-resolution FSR has them.

        People would rather render at 1080p and use DLSS to upscale to 4K than render at 1440p and upscale with FSR. That’s rendering about 78% more pixels and still outputting a worse result. The discrepancy is big enough to add in some RT options and still come out ahead in both visuals and performance.
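
        For the arithmetic behind that figure: at 4K output, DLSS Performance renders internally at 1080p and FSR Quality at 1440p (the standard 50% and 67% per-axis scale factors). A quick sketch:

        ```python
        # Internal render resolutions when outputting at 4K:
        dlss_performance = 1920 * 1080  # 2,073,600 pixels
        fsr_quality = 2560 * 1440       # 3,686,400 pixels

        extra = fsr_quality / dlss_performance - 1
        print(f"1440p renders {extra:.0%} more pixels than 1080p")  # -> 78%
        ```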

      • Vhirsion@alien.topB
        11 months ago

        Sad to see you get downvoted. Like, yeah, DLSS is better, but as you said, it’s not a trillion billion gazillion times better. I have had a 5700 XT, a 3090, and now a 7900 XTX; DLSS is good, but I wouldn’t buy a more expensive Nvidia card just to get it.

    • rW0HgFyxoJhYka@alien.topB
      11 months ago

      AMD kinda looks content to just copy whatever NVIDIA does and take up their 10% market share just by default due to people who hate NVIDIA.

      • James20k@alien.topB
        11 months ago

        “due to people who hate NVIDIA.”

        I’ve always bought AMD personally, partly because of Nvidia’s anti-competitive business practices, but largely because the actual hardware has always been better for the same price. E.g. I recently needed the GPU with the most VRAM per dollar for some compute work, and a 6700 XT was a clear win. I suspect a lot of people go AMD for basically this reason.

        Increasingly, though, their software support is so bad it’s making it hard to justify buying a new AMD GPU even if it is cheaper.

    • BrainSweetiesss@alien.topB
      11 months ago

      They are too busy increasing their GPU prices after having one, maybe two, successful releases in the last 5 years.

    • MrPapis@alien.topB
      11 months ago

      In reality, all this special sauce is cool, I think no one disagrees, but it’s far from most of gaming. Even in the games where it is cool, you’re still actively cutting your performance by 2-4x. There’s definitely something to be said for a native non-RT image being better than upscaled-FG-RT/PT, though obviously at the cost of the very cool lighting/shadow tech. Which, let’s be honest, for the most part (path tracing notwithstanding) isn’t even better looking. It is more realistic, and I do appreciate that, but at 30-60 fps instead of 60-120? Eh, it isn’t a clear win, just different.

      My main games this year are Starfield, the CP2077 DLC, Diablo, and Call of Duty. That’s one DLC of a game I’ve finished twice where Nvidia tech actually matters.

      I don’t give a shit about Alan Wake. The first one was a glorified tech demo. The second one seems more like a game, but it’s also mostly a tech demo/indie game, and let’s be real, most people wouldn’t give a crap about it if it weren’t for the Nvidia promotion. No one cared about the old one, that’s for sure. People are told to care about this game by the green overlord, but it’s the worst excuse ever for Nvidia superiority, doubly so because even Nvidia can’t run it properly. And if you actually want good performance, i.e. non-RT/PT, AMD is even outperforming Nvidia: the 7900 XTX being closer to the 4090 than to the 4080 shows a real advantage you can bet on, instead of tech that’s still very much vaporware / not really used a lot.

      CP2077 and Ratchet & Clank, those are two real games where Nvidia really trumps AMD with their special sauce. And are we really so blinded that we’re gonna make purchasing decisions off a few outliers in a sea of it literally just not mattering?

      The answer is yes, because of FOMO, not because the Nvidia special sauce is anything much.

      • DarkLord55_@alien.topB
        11 months ago

        Except with ray tracing getting easier to run, rasterization is probably in its final days: within maybe 2 or 3 generations of GPUs, I’d say most developers will mainly focus on RT/PT over raster, as it’s easier to design and develop with. They will probably still ship a raster mode, but it probably won’t look great and will be unoptimized.

        So by being faster at RT right now, Nvidia cards will age a lot better than their AMD counterparts, offsetting AMD’s cheaper prices.

        • MrPapis@alien.topB
          11 months ago

          This is completely false logic. What we are seeing is ever-increasing visual fidelity, and RT gets harder and harder to run. Just look at the old RT GPUs: they become irrelevant with every new generation, and that will get worse before it gets better. Maybe by the 7000 series your possible future will come to fruition, but today, and for the next 5 years, things are looking very weak, and every Nvidia GPU will become obsolete faster instead of lasting better. The point isn’t true today, so why would it be tomorrow?

          Please think some more about it, you’re really not making sense.

      • Relevant_Force_3470@alien.topB
        11 months ago

        Nvidia wipes their cocks all over AMD’s face in Cyberpunk. Not sure about the other games, as they’re a bit shite for my liking. COD I assume runs on a toaster.

        Alan Wake 2 is reviewing very well. It’s a good game that also utilises modern tech; just what us gamers crave. So it is very relevant.

        • MrPapis@alien.topB
          11 months ago

          Alan Wake is a 20-hour horror puzzle game. How many horror puzzle games actually appeal to a large number of people? It’s a far cry from being anything for the mass market; I know that’s how they market it, but it just isn’t. People will marvel at their 30 fps path-traced slideshow because they have to find joy in what they purchased, but by far most people would enjoy the game more with it off, you know, because you can actually run at 60 fps or more, which for over a decade has been the absolute minimum for a high-end system.

          The 4090 can’t even do 1080p ultra. Saying “look how good they do” when they are barely running the game is just weird.

          And again, the 7900 XTX wipes the floor with the 4080 without the tech that makes the game run like crap on all hardware. It’s not like 25 fps is actually usable on $1000+ hardware. That’s simply unacceptable performance for anyone but diehard Nvidia tech fans.

          • DarkLord55_@alien.topB
            11 months ago

            30 fps is far from a slideshow. And yes, I would be completely happy playing at 30 fps with path tracing in Alan Wake 2.

    • James20k@alien.topB
      11 months ago

      The most bizarre thing is that things like XeSS are better than FSR. There’s literally no reason for Intel to be able to do a better job here, especially because their solution is way newer.

      I have no idea what’s going on on AMD’s software side of things, but chronic underinvestment seems to be the defining feature of the entire software space of their GPU department. Their hardware has traditionally been a lot more powerful in terms of raw underlying compute; they’ve just always done a much worse job of bringing any of it to bear.

      So you end up with FineWine™ memes which might as well read “our driver department is so underfunded that we’ve dropped XX% of our cards’ performance to software issues”.

      • Relevant_Force_3470@alien.topB
        11 months ago

        Their software is indeed terrible. But they just seem so blinded on the hardware side by raw raster and MOAR vram. Makes no sense in modern games. Even Intel are doing a version of XeSS based on their AI cores. AMD is always just so far behind, software AND hardware.

      • Speedstick2@alien.topB
        11 months ago

        The software is probably fine; it’s just that the software needs hardware AI acceleration in order to produce the image quality we expect from DLSS and XeSS.

        I’m quite frustrated with AMD for not having true Tensor-equivalent cores on RDNA 3. They have AI accelerators, but those don’t have all the capabilities of Tensor cores that enable hardware-accelerated upscaling.
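
        To put rough numbers on why that hardware matters: an ML upscaler is essentially a stack of convolutions, i.e. a large amount of per-frame matrix math. A back-of-the-envelope sketch (the layer sizes here are hypothetical; DLSS’s real network isn’t public):

        ```python
        # Per-frame FLOPs for a small hypothetical convolutional upscaler at 4K.
        h, w = 3840, 2160
        layers = [(32, 32, 3)] * 5  # (in_channels, out_channels, kernel) per conv

        flops = sum(2 * h * w * cin * cout * k * k for cin, cout, k in layers)
        print(f"{flops / 1e12:.2f} TFLOPs per frame")  # ~0.76; ~46 TFLOP/s at 60 fps
        ```

        Sustaining that inside a few milliseconds of frame budget is what Tensor/XMX-style matrix units are for; FSR 2 sidesteps the cost entirely by using hand-written shader heuristics instead of a network.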

  • XXNameAlreadyTakenXX@alien.topB
    11 months ago

    So no differences, other than some mild shimmering on FSR 2.2 sometimes. I mean, just watch the videos and it’s a head scratcher. These “reviewers” need views. They are making it seem far worse than it actually is. I think Nvidia is paying them. Honestly.

    • I-Miss-RIF@alien.topB
      11 months ago

      If you can’t tell the difference between Ray Reconstruction and no Ray Reconstruction, that’s a you thing.

    • bootyjuicer7@alien.topB
      11 months ago

      Bro getting stupid high off the copium. DLSS is just superior in every single way, unfortunately. Everyone who has a DLSS-capable card can tell you that FSR is just objectively worse. You probably don’t have a GPU capable of trying both technologies side by side, so your only point of comparison is an extremely compressed YouTube video where you can’t possibly see the quality difference… Hardware-accelerated upscaling is always going to be better than a purely software approach.

    • Firefox72@alien.topB
      11 months ago

      FSR 2.2 is not even in the same universe as DLSS. A few static screenshots or light slow walking don’t change that.

      • lagadu@alien.topB
        11 months ago

        I disagree; AMD doesn’t really need to improve FSR upscaling for one simple reason, which you mentioned: XeSS exists and every modern GPU can use it.

    • picastchio@alien.topB
      11 months ago

      Change the YouTube resolution to 4K (or 1080p Enhanced if you have Premium) even if you don’t have a 4K monitor.

  • hardlyreadit@alien.topB
    11 months ago

    It’s kinda interesting this game isn’t getting backlash for not having Intel Arc GPUs on the spec list. Or XeSS.

    • Skulkaa@alien.topB
      11 months ago

      Most games don’t, unless sponsored by Intel. Arc has too little market share to bother including it. Most of the time you can assume the A750/A770 ≈ RX 7600 in raster, and a bit better in RT.

      • Tuhajohn@alien.topB
        11 months ago

        I just don’t understand why developers choose FSR instead of XeSS. Both run on every card, but XeSS is mostly just better. FSR is crap in most games.

      • McD-Szechuan@alien.topB
        11 months ago

        Holy smokes, I have no idea what that second sentence means, but it really reinforces the whole “I don’t need to know shit, /r/AMD will educate me” thing I’ve had going on since discovering this sub.

    • M-Kuma@alien.topB
      11 months ago

      Probably because, between EGS exclusivity and the system requirements, about 15 people are gonna buy it on PC.

    • dotjzzz@alien.topB
      11 months ago

      Why would you blame someone for not supporting less than 1% of the market? Intel may have sold close to 3% over the last year, but RTX 3000 and RX 6000 have been selling for triple that time and with a much higher global shipment base.

    • Method__Man@alien.topB
      11 months ago

      Intel doesn’t decide… that would be up to the developers.

      Intel could BEG the developers to put XeSS in, but in the end it’s up to the dev, NOT Intel.

    • rW0HgFyxoJhYka@alien.topB
      11 months ago

      “How come a game doesn’t consider ARC when ARC has less than 1% market share??”

      XeSS is another matter. However it sounds like the devs wanted the best graphics and therefore focused on the best technologies.

  • youssif94@alien.topB
    11 months ago

    The versions are confusing me again. Is DLSS 3.5 just the upscaler for ray tracing, or does it include frame gen? Shouldn’t it be FSR 2 vs DLSS 2?

    • Turtvaiz@alien.topB
      11 months ago

      DLSS 3.5 = DLSS, Frame Generation, and Ray Reconstruction

      But as you can see, DLSS (the upscaler) also has its own version called 3.5. In Cyberpunk 2077 you had “DLSS 3.5” because there was FG and RR, but the upscaler itself was version 3.1.

      It’s very clear and totally not confusing.

    • Ryoohki_360@alien.topB
      11 months ago

      3.5 is everything: DLSS Super Resolution, Frame Generation, and Ray Reconstruction. All 3 DLLs have the same version numbers now, so it lines up.
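
      If you want to check what a game actually ships, the three features live in separate DLLs next to the game executable; the file names below are the ones Nvidia distributes, but treat the feature mapping as an assumption, and the install path is hypothetical. A small Windows sketch (needs pywin32) that reads their file versions:

      ```python
      from pathlib import Path

      import win32api  # pywin32

      # One DLL per DLSS feature (mapping assumed from the shipped file names).
      DLLS = {
          "Super Resolution": "nvngx_dlss.dll",
          "Frame Generation": "nvngx_dlssg.dll",
          "Ray Reconstruction": "nvngx_dlssd.dll",
      }

      def file_version(path: Path) -> str:
          info = win32api.GetFileVersionInfo(str(path), "\\")
          ms, ls = info["FileVersionMS"], info["FileVersionLS"]
          return f"{ms >> 16}.{ms & 0xFFFF}.{ls >> 16}.{ls & 0xFFFF}"

      game_dir = Path(r"C:\Games\AlanWake2")  # hypothetical install path
      for feature, dll in DLLS.items():
          p = game_dir / dll
          print(feature, "->", file_version(p) if p.exists() else "not shipped")
      ```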

      • sudo-rm-r@alien.topB
        11 months ago

        DLSS Super Resolution is the worst name ever. Deep Learning Super Sampling Super Resolution. wtf

        • rW0HgFyxoJhYka@alien.topB
          11 months ago

          DLSS = brand name.

          But Deep Learning = AI

          Super Sampling = better image quality using SS

          Super Resolution = upscaling

          I mean it literally makes sense.

          Ray Reconstruction on the other hand…that is a denoiser, there’s no reconstruction.

        • Massive_Parsley_5000@alien.topB
          11 months ago

          DLSS doesn’t really mean anything anymore; it’s just a brand to NV now.

          Yes, the acronym literally means “Deep Learning Super Sampling”, but NV uses the DLSS brand these days for any performance-uplifting feature that utilizes the tensor cores in some way for gaming. That’s why you have the weirdness with the version numbers and such.

  • Vhirsion@alien.topB
    11 months ago

    Idk if it’s just me, but I literally see no difference apart from the minor shimmering on the fence.

    • nas360@alien.topB
      11 months ago

      It looks like a gimped implementation of FSR2. The native AA mode, which uses FSR purely as anti-aliasing, is very shimmery, which should not be the case at all since there is no upscaling.

      Don’t you just love sponsorship deals.

      • AludraScience@alien.topB
        11 months ago

        Yes, because AMD-sponsored games like RE4, Jedi Survivor, The Last of Us Part 1, and Starfield all have flawless FSR?

        /s if not obvious

      • wirmyworm@alien.topB
        11 months ago

        Yeah, on the other hand, The Witcher 3 on my PS5 has better image quality with 1440p FSR auto; all the foliage cleans up pretty nicely. FSR 2 needs more work to look good, I guess. This is an Nvidia-sponsored game, so they focused on DLSS; on the other side, Starfield didn’t even have Nvidia drivers at launch.

  • Soppywater@alien.topB
    11 months ago

    BUT THE SHIMMERING ON THE FENCE THAT WE HAD TO ZOOM IN TO SHOW YOU IS SO BAD IT’S UNPLAYABLE!!!1!1!! WHAT THE FUCK AMD, YOU’RE SO TERRIBLE