Another game with upscaling baked into all presets. “Free performance” they said. I expect frame hallucination tech to be included in these presets in 1-2 years tops
Upscaled 1080p, low settings, and only 30fps with an A750? At least the game has XeSS.
This gives me hope that Star Wars Outlaws will also have FSR 3. 🤌
60fps with balanced upscaling on a 4080/7900xtx … That’s not a good sign
FSR and DLSS were supposed to be optional; now they seem to be mandatory in every graphics configuration.
Damn. My 5700XT did Mirage fine on Medium/Low shadows. Now it's basement quality.
I'm more baffled as to why they're comparing an RX 5700 to a GTX 1070. I'd have thought they would have listed an RX 590 (instead of the 5700) or an RTX 2060 (instead of the 1070).
In 2023, the RX 5700 performs on par with or better than the RTX 2060 Super/2070. The 5600XT is more like a 2060 competitor.
Either the game runs like shit on AMD hardware, or the system requirements are wrong and they just write down the components that they tested with.
Another thing is that the game does software-based RT by default, which might be why the AMD GPUs suffer more compared to their Nvidia counterparts.
1070 doesn’t have hardware RT though.
It can do software-based RT just like every other modern GPU out there.
might be the reason why the AMD GPUs suffer more compared to their Nvidia counterparts.
I was pointing out that 1070 doesn’t have hardware RT either.
You don’t need RT hardware to do software RT. That’s what I’m saying.
And I’m saying 1070 doesn’t have hardware RT so it’s just as gimped as AMD cards when doing RT.
It's because the minimum settings are with software RT enabled, and thus they're using the fastest cards without hardware RT that can reach 1080p30 with FSR2.
They’re not, those are the cheapest/slowest GPUs that have 8 GB of VRAM. Clearly game devs are done pandering to people who bought Nvidia’s e-waste.
Enthusiast
1440p […] with FSR Quality […] 60 FPS

1440p + xxxx Quality upscaling is just another way to say 1080p, you cowards. It's 2023 and enthusiast level is 1080p @ 60 FPS.
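To put rough numbers on that (a sketch using AMD's published FSR 2 per-axis scale factors of 1.5x Quality, 1.7x Balanced, 2.0x Performance; the internal resolution at 1440p Quality actually lands a bit *below* 1080p):

```python
# FSR 2 per-axis scale factors per AMD's documentation.
FSR2_SCALES = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}

out_w, out_h = 2560, 1440  # 1440p output resolution

for mode, scale in FSR2_SCALES.items():
    # Internal render resolution = output resolution / scale factor per axis.
    w, h = round(out_w / scale), round(out_h / scale)
    print(f"1440p {mode}: renders internally at {w}x{h}")
```

So "1440p with FSR Quality" means a ~1707x960 internal render, upscaled.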
Looks pretty though with the amount of vegetation.
I’m surprised that they put the RTX 3080 and RX 6800 XT side by side in a game that has forced ray tracing. I guess that’s the advantage of being cross platform and AMD sponsored.
1440p + upscaling looks better than 1080p because you're on a 1440p monitor.
Do you even 1440p?
Spoken like someone who has never upscaled 1080p content on a 1440p screen. The pixel ratios are all off. It's fucked.
1440p is perfect for watching old 720p HD content though.
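The arithmetic behind the "pixel ratios" point, as an illustrative sketch: an integer scale factor maps each source pixel to a clean block of screen pixels, while a fractional factor forces blurry resampling.

```python
def describe_scaling(src_h: int, screen_h: int) -> str:
    # Vertical scale factor from source resolution to screen resolution.
    factor = screen_h / src_h
    if factor.is_integer():
        return f"{src_h}p on {screen_h}p: {factor:.3f}x (integer -> clean pixel blocks)"
    return f"{src_h}p on {screen_h}p: {factor:.3f}x (fractional -> resampled/blurry)"

print(describe_scaling(720, 1440))   # exact 2x, so 720p maps cleanly
print(describe_scaling(1080, 1440))  # 1.333x, so 1080p gets smeared
```

That's why 720p content looks fine on a 1440p panel while 1080p doesn't.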
Hopefully this is another Alan Wake situation where the game performs better than its system requirements would suggest. With this being a game based on a movie, I don't have super high hopes that that'll be the case, though. Going from movie to game, and vice versa, almost always has bad results.
Also, displaying system requirements with upscaling is a bad joke. This is basically saying you’re not getting native 1080p 60fps without a 6800xt or 3080. Upscaling is definitely becoming the crutch a lot of folks feared it would be.
NATIVE resolution gang ftw!
Well, folks, the game is basically software-ray-tracing an open world at all times, regardless of settings, using compute shaders from what I gather. I mean, what'd you expect from something pushing consoles to the brink? That it would not be equally demanding on PC?
With this game and Alan Wake 2 using software RT at all times and hardware RT for anything beyond "low", RT is the future.
At some point GPUs will be so powerful and game engines will have RT in the bag that RT will be used in basically all indie games…and then suddenly nobody will talk about RT anymore because RT is in everything for years.
People also forget that neither AMD nor Nvidia really offered a generational leap in GPU performance this gen. Sure, it's popular to call games "unoptimized" (I know many AAA releases are, not denying that), but the issue compounds with the fact that game devs couldn't have predicted a GPU generation this bad, caused by inflated crypto-boom margins a generation prior.
Wait, what? No performance improvement this gen? Bullshit. The RTX 4090 is literally two times faster than the RTX 3080, and the RTX 3090 is only 10% faster than the 3080; you do the math. The RTX 4080 is 50% faster than the RTX 3080.
RTX 4080 is 50% Faster than RTX 3080.
The MSRP is 71% higher too.
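Doing the math on both claims (a rough sketch, taking the performance ratios stated in this thread at face value, plus the $699 and $1199 launch MSRPs for the 3080 and 4080):

```python
# Performance ratios relative to an RTX 3080, as claimed above.
perf_4090 = 2.0   # 4090 ~ 2x a 3080
perf_3090 = 1.1   # 3090 ~ 10% faster than a 3080
perf_4080 = 1.5   # 4080 ~ 50% faster than a 3080

# Top-card gen-on-gen leap: 4090 vs 3090.
gen_leap = perf_4090 / perf_3090
print(f"4090 vs 3090: +{(gen_leap - 1) * 100:.1f}%")

# Price-adjusted: 4080 vs 3080 at launch MSRP.
msrp_3080, msrp_4080 = 699, 1199
price_ratio = msrp_4080 / msrp_3080
print(f"4080 MSRP vs 3080 MSRP: +{(price_ratio - 1) * 100:.1f}%")

perf_per_dollar = perf_4080 / price_ratio
print(f"4080 perf-per-dollar vs 3080: {(perf_per_dollar - 1) * 100:.1f}%")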
How do FSR 2 & 3 work with tech like ReShade?
ReShade has a pretty cool shader that will make any game into stereoscopic 3D. When it works it works well, but it doesn’t work in every game.
I’m curious because I would like to try it for this, since like 99.97% of Avatar’s appeal (shtick?) is the 3D.
I thought 3d was dead
It might be. AFAIK, neither Nvidia nor AMD still supports their implementation of it in their drivers.
But I’ve tried it on a few games and it is pretty cool to me, much stronger than almost any modern 3D movie (IMO why 3D has gone into hibernation), and I gather that Cameron designed the franchise around 3D. Plus the Avatar 1 game was in 3D. FWIW.
Looks capped at 60fps?
60 FPS WITH Frame Gen AND FSR balanced? God, that’s gonna be a horrible experience.
I really don’t like the direction we are heading.
If Anti-Lag is still locked by then, please enjoy your frame gen input lag.