Apart from the two games with a collective daily peak of concurrent players on Steam sitting at 150 poor human souls, where are the FSR3 games?
But scratch FSR3 - where is FSR3.1 that would properly work with VRR screens and therefore could ACTUALLY be beneficial for people who can clearly see non-VRR judder?
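(For context on why non-VRR judder is so visible: with vsync on a fixed-refresh panel, any frame rate that doesn't evenly divide the refresh rate forces frames to persist for unequal numbers of refresh cycles. A rough sketch with illustrative numbers, not tied to any particular game or driver:)

```python
import math

# Sketch: how long each rendered frame stays on a fixed-refresh screen
# with vsync on and no VRR. Illustrative numbers only.
def display_intervals(refresh_hz, fps, frames=8):
    """Count how many refresh cycles each frame is held on screen:
    a frame goes up at the first refresh tick after it is ready."""
    refresh = 1.0 / refresh_hz
    frame_time = 1.0 / fps
    intervals = []
    ready = 0.0   # when the next rendered frame finishes
    shown = 0.0   # refresh tick when the current frame went up
    for _ in range(frames):
        ready += frame_time
        next_tick = math.ceil(ready / refresh - 1e-9) * refresh
        intervals.append(round((next_tick - shown) / refresh))
        shown = next_tick
    return intervals

# 45 fps on a 60 Hz panel: frames persist for 2, 1, 1, 2, 1, 1... cycles -> judder.
print(display_intervals(60, 45, 6))   # -> [2, 1, 1, 2, 1, 1]
# With VRR the panel refreshes whenever a frame is ready, so every frame
# persists for the same wall-clock time and the judder disappears.
```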
Never had issues with VRR on the preview driver that enables frame generation on all games.
The thing is, neither AFMF nor FSR3 frame gen works with VRR at the moment. You can enable VRR, yes.
But it doesn’t work with it.
But it’s still beta and slated for 2024, so.
FSR works fine with VRR. Unless you mean FSR3 frame gen.
FSR3 is upscaling tech; frame gen is a feature of it, and AFMF is the driver-level version of frame gen.
When it’s done. AMD quickly added HDR support and is well aware of the vsync/VRR limitations.
Exactly this. I’d wager the current 2 game FSR3 release was intentional for dead games just so they could “release” it. Makes no sense to push FSR3 yet when Vsync and VRR are big issues, especially VRR. With the supposed GPUOpen release with proper support, that should be the real start of adoption.
Meanwhile I’m chilling here with a 7900xtx, ultrawide 1440p oled, raster native, max settings hitting 120-165fps on everything and it’s just beautiful. Don’t care about ai trash and marketing hype. Brute forcing through native raster the way I always have on an OLED is just beautiful and ray tracing on half a dozen games that I actually care about that have it does good enough. Paid $800 for the 7900xtx vs $1200 for a 4080 at the time.
Cyberpunk was great and it’s the only title I’d say I’d rather experience on an Nvidia alternative. Thankfully I played it twice after release (once immediately and again after it was patched up), and a 3rd time with the expansion. It’s a good game, but still, not good enough for me to want to play 50 times and quote benches on to justify a 4090 purchase.
Indeed, native + raster is the way to go, no upscaling or AI shit between my frames, the XTX shreds anything on my 3440x1440 monitor tho mine is VA, OLED isn’t very good if you do anything besides media consumption / gaming. RT is on 3090 / Ti level, so good enough for me for games like Metro Exodus Enhanced Edition. Especially because I got the XTX for below 1000 euros and 4080s are like 1300 - 1500 euros.
This is it. No doubt AI will have its time. We’re getting close to the nm wall with physical manufacturing limitations but I feel like everyone’s jumping the gun here. You don’t need DLSS to look good and if you do I’m sorry the game you bought was ported or programmed like shit and they used fsr or DLSS as a crutch. Maybe if everyone didn’t jump on that bandwagon we wouldn’t have that issue.
Ray tracing is neat but it’s still a niche. Despite everyone bashing AMD, they forget AMD powers the console market, which means they control the rate of adoption. With some niche exceptions, publishers care where the money is, and the console market is it. So expect games like Cyberpunk and path-traced Portal to be the exception, not the rule.

I simply don’t care about hype machines that drool over three or four games that are not good enough to play on repeat 500 times and advertise bench scores on Reddit as if this is the one true end to gaming. I play a ton of games front to back and most hardly support ray tracing, or don’t at all.
So I’ll continue to buy amd cards because they’re cheaper and press that which is still the spine of the industry…raster. The hype machine can come back at me when 90% of the industry is supporting ray tracing and FSR/DLSS (and they’re finished). Until then, from a purely gaming standpoint and under $1000us…7900xtx it is for brute force supremacy. F ai for now.
How can you complain about upscaling and “AI shit” with a VA monitor? The biggest downside of DLSS in some games is ghosting, but VA has way worse ghosting/smearing, as that’s how the panel works with fast transitions from dark to light scenes. Imo VA is probably the worst panel type for games/media.
It’s fine if you don’t mind it or don’t even notice it, but then that’d also apply to upscaling.
I have a 3440x1440 165hz VA monitor and a 2560x1440 144hz nano-IPS monitor. Between the 2 I do not notice any ghosting and the only situation in which I notice minor black smearing is when reading white text on a pitch black background, like on Reddit.
Upscaling makes the game noticeably blurry vs native, it loses some of that sharpness, clarity and fidelity in favor of better FPS, upscaling is amazing on an old card, but I don’t expect to use upscaling on a card I dropped close to 1 grand on.
Upscaling makes the game noticeably blurry vs native, it loses some of that sharpness, clarity and fidelity in favor of better FPS,
This is just a matter of sharpening, as current AA methods also incur heavy blur. Native presentation just tends to default to stronger sharpening than DLSS, which often defaults to no sharpening, but this is easily fixed. FSR often tends to be sharper than native (I’d say it’s often even overtuned) as it includes CAS, which is often better than the default sharpening pass at native.
Blur and clarity are not a good argument against DLSS/FSR, as it’s just a matter of what filter you apply to the game.
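(A quick illustration of the point that sharpness is a filter choice, not something upscaling inherently destroys. This is a bare-bones 1-D unsharp mask; real CAS/DLSS sharpening is adaptive per pixel, but the principle is the same:)

```python
# Minimal 1-D unsharp mask: sharpened = pixel + amount * (pixel - blurred).
# Illustrative only; real sharpeners like AMD CAS adapt the amount per pixel.
def unsharp_mask(img, amount=0.5):
    n = len(img)
    # 3-tap blur with edge clamping
    blurred = [(img[max(i - 1, 0)] + 2 * img[i] + img[min(i + 1, n - 1)]) / 4
               for i in range(n)]
    return [p + amount * (p - b) for p, b in zip(img, blurred)]

# A soft edge gains over/undershoot, which the eye reads as extra sharpness:
print(unsharp_mask([0.0, 0.0, 0.0, 1.0, 1.0, 1.0]))
# -> [0.0, 0.0, -0.125, 1.125, 1.0, 1.0]
```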
I’m not saying I got a “special screen” so to say. I’m just saying that I only notice the black smearing with white text on a black background, I don’t notice it anywhere else, not saying it isn’t present, I just don’t notice it 99% of the time, even having mainly gamed on my nano-IPS monitor before getting the ultrawide.
I’m just saying that upscaling makes games look blurry. How or why I don’t really care about. That’s why I don’t like it and frankly don’t expect to use it on an expensive card.
Anybody can see judder and tearing lol
I’m waiting too, to play The Witcher 3 with all of the RT goodness.
I think I saw CDPR on their presentation slide. I also think they mentioned Witcher 3 somewhere in their presentation material so it’s possible it might come out soon.
If I were a dev, I would wait until AMD fixes the VRR issue and only then roll out an update with it.
Freesync works for me in Forspoken with FSR3 frame gen on. I just have Vsync set to on in the driver instead of in-game.
Now that I think about it, maybe AMD decided to drop the FSR 3 beta on games we don’t even care about, so that the future games we actually care about get the better, more polished version. Fingers crossed.
AMD dropped the FSR 3 implementation on those games to meet the deadline they told us about when they first showed FSR 3. That was an appetizer from their side, and now we’re in the waiting game again: could be Q1 2024 as they told us, could be the end of 2024, no one knows. But Nvidia gained my respect more, as selling hardware with working features from day 1 is better for the consumer than selling hardware with upcoming features that aren’t here a year after release, and nobody knows when they will be.
Yep Nvidia definitely overcharges for their stuff but they can get away with it and it’s somewhat justified with the feature/software stack that they offer.
Even if AMD eventually matches Nvidia on features it always seems like it’s worse/incomplete versions and it can take months or years later for that stuff to even arrive.
I get that the discrete graphics space is probably low priority for AMD at this point but their effort in the last few years has been kind of pathetic. I honestly hope Intel continues to make strides in the market to become a better competitor for Nvidia.
Yes, fully agree. I was fully sold on AMD’s presentation when they launched the 7900 series, and when I had a choice between buying a 7900 XT and a 4070 Ti, I chose the 7900 XT because of the feature parity AMD promised us, and there was that big uproar about VRAM back then, as I remember. (Almost) a year later I’m here, without the proper FSR3.0 support they promised us, RT performance is there but it isn’t if we compare, and I sit and think that for my use case I should have chosen the 4070 Ti back then. Though, I love Adrenalin; that’s a piece of what I missed most when I was on my 1070, where instead of 3 apps that I needed to use I could use one.
I should have chosen the 4070 Ti back then
the VRAM issue is still real tho, at least for 4K and maybe even in the future for 1440p. You shouldn’t feel like you made a bad decision, at least.
That’s sad to hear tbh. I still don’t understand why Nvidia cuts down so heavily on VRAM. Well, wanting to move their 4090s is the most rational explanation.
Bro, I was in your same position a year ago, and still am. Energy consumption is 25% higher on the AMD side with only 5-10% more raster performance. I went for a full AMD build and I will never complain about CPU/mobo performance. But when I try to turn on RT and lose 70% of the performance… it’s heartbreaking. But I think the real problem isn’t on AMD’s side… look at Alan Wake 2: when you turn off hardware RT, it kicks in software RT and it’s really good… I know path tracing is the Ferrari of gaming, but I’d need a $1600 GPU to run it! So I’m pretty satisfied, and I really want to see how FSR will kick ass.
Even a 4070 can play Alan Wake 2 with PT.
Never buy a product based on future promises.
The Q1 2024 date is not for FSR3, it was for AFMF. FSR3 is already available for devs to implement. Sadly it’s been over a month and no more games added it yet which does not look good at all. Modders have added DLSS3 FG within hours of Starfield’s release yet no dev has managed to add FSR3 in weeks.
I wouldn’t put FSR3 in my game even if it was easy - it’s a lot of dev work laid out for what is essentially an alpha feature which certainly has a lot of issues that may be mistaken as bugs in your game or issues that are your fault from the 99% of users who don’t know every single quirk. It is absolutely not a “press button and have better experience” feature - it has tons of potential to ruin your gameplay and people might not understand how so or why.
Beta refers to feature complete, and features like VRR support are critical.
FSR3.2 or something if and when it works properly, yes people should jump on that.
And I think, sadly, we will not see any movement from devs this year either.
I am always rooting for AMD but this is entirely their fault. They do not seem to engage with devs or the community to promote FSR3 when we know it could be a game changer for many non-RTX gpu owners.
If they neglect it like the FSR 2 upscaling shimmering issues then more and more people will turn to Nvidia in the long run.
Not just non-RTX. All non 4000-series owners. FSR 3 has the potential to prolong the life of the 20 and 30-series. The RTX 3080 and 3090 may well increase in value when FSR 3 starts being delivered.
FSR framegen breaks both DLSS and Reflex, replacing DLSS with a much worse alternative and offering no replacement for Reflex even on AMD cards - a non-negotiable core feature for GPU-bound gaming which only becomes more mandatory with the added latency from frame gen.
There is no game where I would turn FSR framegen on, because it means either giving up those features or not having them present to begin with in a GPU-heavy game, which is not a good experience.
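(Rough arithmetic on why a Reflex-style latency reducer matters more with frame gen on: interpolation-based frame generation has to hold back the newest real frame so it can interpolate between it and the previous one, which adds at least one real frame-time of display latency. Illustrative numbers only:)

```python
# Illustrative: interpolation-based frame gen delays the newest real frame
# by roughly one real frame-time so a generated frame can be shown in between.
def added_latency_ms(real_fps):
    """Minimum extra display latency (ms) from buffering one real frame."""
    return 1000.0 / real_fps

# 60 real fps doubled to a "120 fps" output still adds ~16.7 ms of latency:
# motion looks smoother, but input response gets worse, not better.
print(round(added_latency_ms(60), 1))   # -> 16.7
```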
I mean, Nvidia was in the same boat with their RTX 20-series. Only a few ray tracing games, which all made poor use of it, like Shadow of the Tomb Raider and Battlefield V’s meh reflections. Let’s not even mention DLSS 1.0 being useless.
It’s a shame the two games are just not that interesting and barely had any hype buildup. They still need their Cyberpunk 2077 or Alan Wake 2 or Witcher 3/Portal/Metro Exodus remake.
They still need their Cyberpunk 2077 or Alan Wake 2 or Witcher 3/Portal/Metro Exodus remake.
That’s the thing. Where is AMD’s crown jewel to show off their product’s strengths? Is it … Starfield?
They made a big deal out of it before release. I was at gamescom and they namedropped AMD during the presentation.
Part of the “optimisation” gap at release was from always-enabled SAM vs. not-whitelisted ReBAR on Nvidia.
Not sure if that counts as optimized for AMD GPUs. :)
The game still looks like an outlier in CPU and GPU scaling, as if it’s full of engine bottlenecks and bugs.
It’s funny how Starfield, after getting DLSS and FG modded in, became a fairly nice showcase for Nvidia.
than selling hardware with upcoming features that aren’t here a year after release, and nobody knows when they will be
They also had the audacity to print FSR 3 on boxes for months already :D
Anyone remember when VRR stood for variable refresh rate? Pepperidge farm remembers.
Oh look, it’s the constant nvidia troll making obvious bait content, how unusual.
Hey OP — Your post has been removed for not being in compliance with Rule 3.
Be civil and follow site-wide rules; this means no insults, personal attacks, slurs, brigading, mass mentioning users or other rude behaviour
Discussing politics or religion is also not allowed on /r/AMD
Please read the rules or message the mods for any further clarification
I made the switch to AMD after about 3 generations of Nvidia and losing DLSS is tough. RT isn’t really my jam but FSR and their frame gen tech still need time to be ironed out.
While I expected their frame gen to have growing pains I would feel a lot better if they kept regularly updating the baseline FSR image quality too.
Otherwise quite happy with the raster performance.
Idk bro. I don’t use the whole alphabet’s worth of software these companies make. I set my POS to 1440p@120 frames, max out the settings, and just play random shit.
It’ll come out shortly after Nvidia has DLSS 4.0.