https://tpucdn.com/review/alan-wake-2-performance-benchmark/images/performance-1920-1080.png
In other games that don’t use advanced current-gen feature sets, the 5700 XT performs on par with the RTX 3060 or 2070 Super.
So yeah, while it starts, it’s basically unplayable on this card. Turing, RDNA1’s older competitor, is still going strong.
With newer games, old tech will suffer more and more, and the AI hardware in the Nvidia cards will help them age better.
It also has significantly more stutters. Possibly shader compilation stutters.
No expert, but don’t mesh shaders reduce this significantly?
Could just be choking on geometry work that is straightforward with mesh shaders but complex with the fixed-function pipeline.
I’m glad we are finally getting the next gen featureset. The game truly looks incredible.
For the fps being delivered, I really do not think the game is that good-looking. Maybe it’s a matter of taste, but the game looks unoptimized given the performance vs. graphics.
In the case of AW2 I think it’s more a case of diminishing returns. You can look at the difference between 99% and 99.9% realistic as just a 0.9% improvement, or you can look at it as a reduction from 1% to 0.1% non-realistic, where the difference is an order of magnitude.
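To put numbers on that framing (the 99%/99.9% figures are purely illustrative, not measurements of anything), here’s the arithmetic:

```python
# Two ways to frame the same visual improvement.
before, after = 0.99, 0.999  # illustrative "realism" fractions

# Framed as an absolute gain in realism: looks tiny.
absolute_gain = after - before  # 0.009 -> "only 0.9% better"

# Framed as a reduction in remaining error: an order of magnitude.
error_before = 1 - before                 # 0.01 non-realistic
error_after = 1 - after                   # 0.001 non-realistic
error_ratio = error_before / error_after  # ~10x less "non-realism"

print(f"absolute gain: {absolute_gain:.3f}")  # 0.009
print(f"error reduced {error_ratio:.0f}x")    # 10x
```

Same change, but the second framing better matches why the last few percent of fidelity costs so much GPU time.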
I get where you’re coming from… Graphically it doesn’t look that much better than RE4 Remake, yet it performs substantially worse. However, that is the nature of diminishing returns.
I’m not completely sold on the art direction either. It’s not bad, but it doesn’t appeal to me as strongly as other games. I think Control is more attractive.
Well that blows ass for me. I was going to ride this generation out and maybe do a system rebuild when the next set of gpus launch. Guess I’m sticking with consoles.
And again, the 4070 Ti is slower than the 3090 everywhere…
Poor 4070 Ti buyers, card aging like milk already. But muh RT and DLSS!!! Won’t help ya when the card kills itself on missing bandwidth and low VRAM xD
The only benchmark it loses is at 4K native (no DLSS) ray-tracing / path-tracing, and the 3090 isn’t playable there either.
They really dropped the ball on the recommended specs. They said you would need a 3070 running DLSS performance mode to get 60 FPS at 1080p medium. This shows a 3070 getting over 60 fps at 1080p max with no DLSS. They made the game seem like it was going to be a technical dumpster fire before it came out.
I kind of wonder if the people making the recommended specs even tested with DLSS/FSR2, or if they just tacked that on there without thinking? (There’s a row for DLSS, so we have to put something in there).
I’m sitting here with my 2060, only 4 years old, wondering if I can even run the game.
The 2060 is generally faster than the 3050, but has less VRAM.
So you can run it, but you’ll need to turn settings down to avoid stutter.
Eh, that’s fine.
I’m kinda bummed that it’s looking like I’m gonna need a significant upgrade, but the modern cards aren’t performing as well as they should for the prices they’re asking.
Maybe I’m too far out of the loop, but consoles are looking more enticing.
Look for Black Friday deals. There was a 4060 deal yesterday for like $220. The 4060 at $300 is bullshit, but it’s a lot better at those discount prices.
And the continued RDNA2 sales are fantastic, too. We’ve seen $400 6800s for a while, and sub-$450 6800 XTs.
We’ve also seen things like $275 3060 Tis, though that’s been rarer.
Point being, RDNA2 and Ampere go on good sales regularly, and even RDNA3 and Lovelace are starting to come down. I’m expecting good holiday sales this year.
Thanks for the info. I was avoiding the 4060; I’d read that it’s not worth the $300 asking price, especially compared to the 3070 Ti.
The 4060 at 300 is bullshit
It is less than past 60-class GPUs, though 🤔
I don’t get how this runs at all. I thought mesh shading was massively different from the regular render pipeline. They actually bothered to implement both? Or is there some automatic fallback path it runs on, one that didn’t take much work to get going?
They actually bothered to implement both?
The dev already said they implemented both; it sounds like the fallback pipeline just isn’t something they want to support. They said it was buggy, significantly slower, and has to use more VRAM to emulate the mesh shading.
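A toy sketch of the decision the engine has to make at startup, in Python pseudologic (all names here are hypothetical, loosely modeled on how D3D12 exposes a mesh shader tier query; this is not Remedy’s actual code):

```python
from dataclasses import dataclass


@dataclass
class DeviceCaps:
    """Illustrative capability flags, like a D3D12 MeshShaderTier query."""
    supports_mesh_shaders: bool
    vram_gb: int


def pick_geometry_path(caps: DeviceCaps) -> str:
    """Choose a geometry pipeline the way the dev comment describes:
    prefer mesh shaders; otherwise emulate meshlets on the traditional
    vertex/index-buffer path, which per the dev is buggier, significantly
    slower, and needs extra VRAM for the pre-expanded geometry."""
    if caps.supports_mesh_shaders:
        return "mesh-shader path"
    return "fixed-function fallback (slower, more VRAM)"


print(pick_geometry_path(DeviceCaps(True, 8)))   # e.g. Turing/Ampere/RDNA2
print(pick_geometry_path(DeviceCaps(False, 8)))  # e.g. the RX 5700 XT
```

Which is why an unsupported GPU still launches the game, it just lands on the path nobody optimized.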
I like how nobody wins in this market.
If you bought Nvidia, you’re at baseline, if you bought RDNA3… If I say it, I will get banned :)
honestly… given the new technologies and whatnot, it’s impressive it can push 25 fps at max settings at 1080p after some custom tweaks. Even lowering some of the settings would mean it’s more than just playable. I’m not going to test it myself though. I’ll spend the $$ on the console version of AW2 lol, but that is pretty cool though.
The 3070 vs 2080 Ti comparison is interesting, because the 20 series supports DX12 Ultimate too, right?
Normally those two GPUs are pretty similar in performance, usually with a slight edge to the 3070, but in games that need a lot of VRAM the 2080 Ti is slightly better.
But in this test the 3070 is 15% faster, which is pretty significant.
RT is compute intensive. The heavier the RT, the more TFLOPs matter (at least until DLSS 3.5 starts to complicate things), up to the point where other bottlenecks like overall memory performance step in.
Could be dual-issue working well here. RDNA3 GPUs seem to be performing a bit better as well.
L1 throughput was one of the major changes in Ampere, so it could be related to that.
That is indeed pretty interesting. Yes, usually the 2080 Ti is a little bit below a 3070, but in this game it’s 14% slower. I guess some refinements were made with Ampere, and the higher throughput probably matters too. Still, not bad at all for a 5-year-old card, and it handily beats the 6700 XT as well. I think buyers got their money’s worth with the 2080 Ti. Its longevity is kinda insane, especially for ray tracing, helped by its 11 GB frame buffer.
Has anyone tried the game with a 6800 XT at 3440x1440? Is it playable with reasonable settings, preferably without FSR?
so much for fine wine
Why would you compare max settings?
Glad I swapped my 5700 XT for a 6700 XT back when it was a valuable mining card. RDNA1 always felt like an undercooked architecture; RDNA2 is where it really came into its own.