Ray reconstruction is the main difference maker, and I don’t think AMD can work around this one without acceleration. Traditional denoisers have existed for so long, and with them it’s impossible to even attempt what Nvidia is doing.
With this system requirement, Alan wakes; I sleep. 😅
I love how whenever you have these comparisons you always get these weirdo nvidia people who act like dlss is so much better when they both look…exactly the same to me.
Hasn’t always been the case. It looks like Remedy did a good job tweaking/implementing it.
I have an Nvidia card and can test each setting; every one I’ve tested has FSR looking noticeably worse. Videos are no substitute for seeing it with your own eyes at 100+ fps in native res with no video compression.
Because it does look better… in motion, look at the shimmering in the video—top right when she’s walking towards the fence.
Important context here: this person is playing on a 22” 1080p screen lol
No I’m not kidding
Yes. I literally paid $60 for it if you must know. Not everyone is rich frick with a 4090 and a screen that takes up your whole wall. Bet you also have some obscene sound system where you piss off the entire neighborhood too while I use these ratty old $50 headphones.
Can someone please explain to me, apart from the shimmer which is clearly visible like OP said, how DLSS is a trillion times better and worth paying hundreds of dollars more for? Like, can you articulate it? Because I have an RTX 3050 Ti and I use both DLSS and FSR at 1080p. DLSS has fewer motion artifacts, but apart from that, I would not consider it a million times better to the point that I would buy a slower and more expensive card just to have DLSS.
Because motion artifacts are the main downside of playing at a lower resolution. The constant pixel shimmering, ghosting and unstable image sucks ass, and DLSS greatly alleviates it.
FSR 2 still has all the visual artifacts, which makes upscaling from a lower resolution an all-around worse option on AMD cards. Even native-resolution FSR has them.
People would rather render at 1080p and use DLSS to upscale to 4K than render at 1440p and upscale with FSR. That’s rendering roughly 78% more pixels and still outputting a worse result. The discrepancy is big enough to add in some RT options and still come out ahead in both visuals and performance.
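A quick back-of-the-envelope check of that pixel-count claim, using the standard 1920×1080 and 2560×1440 render resolutions:

```python
# Render-resolution pixel counts for the 1080p-vs-1440p comparison.
px_1080p = 1920 * 1080   # 2,073,600 pixels
px_1440p = 2560 * 1440   # 3,686,400 pixels

extra = px_1440p / px_1080p - 1
print(f"1440p renders {extra:.0%} more pixels than 1080p")  # → 78%
```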
Sad to see you get downvoted. Like, yeah, DLSS is better, but as you said, it’s not a trillion billion gazillion times better. I’ve had a 5700 XT, a 3090, and now a 7900 XTX; DLSS is good, but I wouldn’t buy a more expensive Nvidia card just to get it.
AMD need to get their shit together as Nvidia are running away with quality, performance and ray tracing.
LOL
They’ve been destroying Nvidia on pure raster though, and that’s what matters most to the majority of gamers.
LOL
AMD kinda looks content to just copy whatever NVIDIA does and take up their 10% market share just by default due to people who hate NVIDIA.
due to people who hate NVIDIA.
I’ve always bought AMD personally, partly because of Nvidia’s anticompetitive business practices, but largely because the actual hardware has always been better for the same price. E.g. I needed a GPU with the most VRAM per dollar recently for some compute work, and a 6700 XT was a clear win. I suspect a lot of people go AMD for basically this reason.
Increasingly though, their software support is so bad it’s making it hard to justify buying a new AMD GPU, even if it is cheaper.
Yeah, pretty much this ^
They are too busy increasing their GPU prices after having one or maybe two successful releases in the last 5 years.
In reality, all this special sauce is cool, and I think no one disagrees, but it’s far from most of gaming. Even in the games where it is cool, you’re still actively degrading performance by 2-4x. There’s definitely something to be said for a native non-RT image being better than upscaled-FG-RT/PT, but obviously at the cost of the very cool lighting/shadow tech. Which, let’s be honest, for the most part (path tracing notwithstanding) isn’t even better looking. It is more realistic, though, and I do appreciate that, but at 30-60 fps instead of 60-120? Eh, it isn’t a clear win, just different.
My main games this year are Starfield, the CP2077 DLC, Diablo and Call of Duty. It’s one DLC of a game I’ve finished twice where the Nvidia tech actually matters.
I don’t give a shit about Alan Wake. The first one was a glorified tech demo. This second part seems more like a game, but it’s also mostly just a tech demo/indie game, and let’s be real, most people wouldn’t give a crap about it if it weren’t for the Nvidia promotion. No one cared about the old one, that’s for sure. People are told to care about this game by the green overlord, but it’s the worst excuse ever for Nvidia superiority, doubly so because even Nvidia can’t run it properly. And if you actually want good performance, aka non-RT/PT, AMD is even outperforming Nvidia. The 7900 XTX being closer to the 4090 than the 4080 shows a real advantage you can bet on, instead of this tech that’s still very much vaporware / not really used a lot.
CP2077 and Ratchet & Clank are two real games where Nvidia really trumps with their special sauce. And are we really so blinded that we’re gonna base purchasing decisions on a few outliers in a sea of it literally just not mattering?
The answer is yes, because of FOMO, not because the Nvidia special sauce is anything much.
Except with ray tracing getting easier to run, rasterization is probably in its final days. In probably 2 or 3 generations of GPUs, I’d say most developers will mainly focus on RT/PT over raster,
as it’s easier to design and develop with. They will probably still have a raster mode, but it probably wouldn’t look great and would be unoptimized.
So, being faster at RT right now, Nvidia cards will age a lot better than their AMD counterparts, offsetting AMD’s cheaper prices.
This is completely false logic. What we are seeing is ever-increasing visual fidelity, and RT gets harder and harder to run. Just look at the old RT GPUs: they become irrelevant with every new generation, and that will get worse before it gets better. Maybe by the 7000 series your possible future will come to fruition, but today and for the next 5 years things are looking very weak, and every Nvidia GPU will become obsolete much faster instead of lasting better. The point isn’t true today, so why would it be tomorrow?
Please think some more about it, you’re really not making sense.
My 2080ti is still doing fine in cyberpunk
Fine wine, lol
Nvidia wipe their cocks all over AMD’s face in Cyberpunk. Not sure about the other games, as they’re a bit shite for my liking. COD I assume runs on a toaster.
Alan Wake 2 is reviewing very well. It’s a good game that also utilises modern tech; just what us gamers crave. So it is very relevant.
Alan Wake is a 20-hour horror puzzle game. How many popular horror puzzle games actually appeal to a large amount of people? It’s a far cry from being anything for the mass market. I know that’s how they market it, but it just isn’t. People will marvel at their 30 fps path-traced slideshow because they have to find the joy in what they purchased, but by far most people would enjoy the game more with it off, you know, because you can actually run at 60 fps or more, which for over a decade has been the absolute minimum for a high-end system.
The 4090 can’t even do 1080p ultra; saying “look how good they do” when they are barely running the game is just weird.
And again, the 7900 XTX wipes the floor with the 4080 without the tech that makes the game run like crap on all hardware. It’s not like 25 fps is actually usable on $1000+ hardware. That’s simply unacceptable performance for anyone but diehard Nvidia tech fans.
30fps?
You need to go check out benchmarks…
Inform yourself then come back and we can talk.
30fps is far from a slideshow. And yes I would be completely happy playing at 30fps with path tracing in alanwake 2
The most bizarre thing is that XeSS is better than FSR. There’s literally no reason for Intel to be able to do a better job here, especially because their solution is way newer.
I have no idea what’s going on on AMD’s software side of things, but chronic underinvestment seems to be the defining feature of the entire software space of their GPU department. Their hardware has traditionally been a lot more powerful in terms of raw underlying compute; they’ve just always done a much worse job of bringing any of it to bear.
So you end up with FineWine™ memes, which might as well read “our driver department is so underfunded that we’ve dropped XX% of our cards’ performance to software issues”.
Their software is indeed terrible. But they just seem so blinded on the hardware side by raw raster and MOAR vram. Makes no sense in modern games. Even Intel are doing a version of XeSS based on their AI cores. AMD is always just so far behind, software AND hardware.
The software is probably fine; it’s just that the software needs hardware AI acceleration in order to create the quality image we expect from DLSS and XeSS.
I’m quite frustrated with AMD for not having true Tensor-equivalent cores on RDNA 3. They have AI accelerators, but those lack some of the capabilities of the Tensor cores that would allow hardware-accelerated upscaling.
So no differences, other than some mild shimmering on FSR 2.2 sometimes. I mean, just watch the videos and it’s a head scratcher. These “reviewers” need views. They are making it seem far worse than it actually is. I think Nvidia is paying them. Honestly.
If you can’t tell the difference between Ray Reconstruction/no Ray Reconstruction that’s a you thing.
I can see the difference, it just doesn’t really change much about the game.
Bro getting stupid high off the copium. DLSS is just superior in every single way, unfortunately. Everyone who has a DLSS capable card can tell you that FSR is just objectively worse. You probably don’t have a GPU capable of trying both technologies side by side, so your only point of comparison is an extremely compressed YouTube video where you can’t possibly see the quality difference… Upscaling on a hardware level is always going to be better than through software.
Don’t overdose on copium my dude
FSR 2.2 is not even in the same universe as DLSS. A few static screenshots or light slow walking don’t change that.
I disagree, AMD doesn’t really need to improve FSR upscaling for one simple reason, which you mentioned: XeSS exists and every modern gpu can use it.
Change the YouTube resolution to 4K (or 1080p Enhanced if you have Premium) even if you don’t have a 4K monitor.
It’s kinda interesting this game isn’t getting backlash for not having Intel Arc GPUs on the spec list. Or XeSS.
Most games don’t, unless sponsored by Intel. Arc has too little market share to bother including it. Most of the time you can assume the A750/770 ≈ RX 7600 in raster, a bit better in RT.
Well, XeSS very often beats FSR, so it makes sense to include it -> https://www.pcgamer.com/cyberpunk-2077-fsr-vs-xess/
RTX users should always use DLSS because it’s superior. TechSpot tested this in depth recently. I hope both AMD and Intel will be able to improve their upscaling tech.
I just don’t understand why developers choose FSR instead of XeSS. They both run on every card, but XeSS is mostly just better. FSR is crap in most games.
Holy smokes, I have no idea what that second sentence means, but it really reinforces the whole “I don’t need to know shit, /r/AMD will educate me” thing I’ve got going on since discovering this sub.
Probably because between EGS and the requirements about 15 people are gonna buy it on PC.
No budget for astroturfing.
Why would you blame someone for not supporting less than 1% of the market? Intel may have sold close to 3% during the last year; RTX 3000 and RX 6000 have been selling for triple that time and with a much higher global shipment base.
I don’t know, forcing DLSS and FSR as the only way to render things overshadows everything.
Intel doesn’t decide… that would be up to the developers.
Intel could BEG the developers to put XeSS in, but in the end it’s up to the dev, NOT Intel.
“How come a game doesn’t consider ARC when ARC has less than 1% market share??”
XeSS is another matter. However it sounds like the devs wanted the best graphics and therefore focused on the best technologies.
It does actually look the same 😭 I can’t believe comments were true lmao
The versions are confusing me again. Is DLSS 3.5 just the upscaler for ray tracing, or does it include frame gen? Shouldn’t it be FSR 2 vs DLSS 2?
It has everything: upscaling/super res, frame gen, ray reconstruction.
DLSS 3.5 = DLSS, Frame Generation, and Ray Reconstruction
But as you can see, DLSS also has a version called 3.5. In Cyberpunk 2077 you had DLSS 3.5 because there was FG and RR, but the DLSS upscaler itself was version 3.1.
It’s very clear and totally not confusing.
You can manually replace the dll files for DLSS
3.5 is everything: DLSS Super Resolution, Frame Generation and Ray Reconstruction. All three DLLs have the same version numbers now.
DLSS Super Resolution is the worst name ever. Deep Learning Super Sampling Super Resolution. wtf
DLSS = brand name.
But Deep Learning = AI
Super Sampling = better image quality using SS
Super Resolution = upscaling
I mean it literally makes sense.
Ray Reconstruction on the other hand…that is a denoiser, there’s no reconstruction.
DLSS doesn’t really mean anything anymore; it’s just a brand to NV now.
Yes, the acronym literally means, “Deep Learning Super Sampling”, but NV uses the DLSS brand for any performance uplifting features these days that utilize the tensor cores in some way for gaming. That’s why you have the weirdness with the version numbers and such.
Plus Reflex.
Does anyone know if I have to play the first one to understand this one?
Man is FSR 2.2 worse than 2.1? Wtf
I don’t even need to watch to know DLSS looks better
Idk if it’s just me, but I literally see no difference apart from the minor shimmering on the fence.
It looks like a gimped implementation of FSR 2. The native AA mode, using FSR as the AA, is very shimmery, which should not be the case at all since there is no upscaling.
Don’t you just love sponsorship deals.
Yes, because AMD-sponsored games like RE4, Jedi Survivor, The Last of Us Part 1, and Starfield all have flawless FSR?
/s if not obvious
Yeah, on the other hand, The Witcher 3 on my PS5 has better image quality with 1440p FSR auto. All the foliage cleans up pretty nicely. FSR 2 needs more work to look good, I guess. This is an Nvidia-sponsored game, so they focused on DLSS. On the other side, for example, Starfield didn’t even have Nvidia drivers at launch.
I wonder how easy it is to mod FSR 3 into a game if DLSS 3 is already in it.
BUT THE SHIMMERING ON THE FENCE THAT WE HAD TO ZOOM IN TO SHOW IT TO YOU IS SO BAD ITS UNPLAYABLE!!!1!1!! WHAT THE FUCK AMD YOURE SO TERRIBLE