Looking at the advantages, the 7800 XT has:

  1. A 10% lower price

  2. 4 GB more VRAM

  3. 6% better raster performance.

The 4070 has:

1. Better frame gen

  2. Better upscaling via DLSS

  3. Better drivers

  4. Better RT performance

  5. CUDA for the few people that actually need it.

  6. Better power efficiency.

  7. The ability to use both DLSS and FSR. If a game only has DLSS, AMD users are screwed.

All in all I think the AMD card is still the underdog based on advantages, and it needs to be at least 15% cheaper to sway buyers to team red. For just a $50 price difference, the team green advantages are too stacked imo (rough math in the sketch below).
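
If you want to sanity-check that, here’s a minimal back-of-the-envelope sketch in Python. The prices are my own rough assumptions for US street prices, and the “feature premium” is just a hypothetical weight on the team green list, not a measured number:

```python
# Hypothetical value math - assumed prices, adjust to taste.
price_4070 = 550.0      # assumed 4070 street price
price_7800xt = 500.0    # ~10% lower, per the list above
raster_7800xt = 1.06    # ~6% raster lead (4070 = 1.00)

# Raster-per-dollar edge for the 7800 XT:
edge = (raster_7800xt / price_7800xt) / (1.0 / price_4070) - 1
print(f"7800 XT raster-per-dollar lead: {edge:.0%}")  # ~17%

# Treat the Nvidia feature stack as an effective performance premium
# and compute the 7800 XT price that breaks even against it:
for premium in (0.15, 0.25):
    breakeven = price_4070 * raster_7800xt / (1 + premium)
    print(f"+{premium:.0%} feature premium -> break-even ${breakeven:.0f}")
# At a ~25% premium the break-even is ~$466, about 15% below the 4070,
# which is roughly where the "needs to be 15% cheaper" claim lands.
```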

Edit: this is of course in the US market. Every market is different.

  • Inevitable_Hold_5816@alien.topB

    Hi guys

    Can someone technically competent please explain to all of us why AMD doesn’t attack the “elephant in the room”, give it priority, and close the gap vs. Nvidia?

    By this, I specifically mean the quality of its upscaling technology. Let’s make a quick list of the competitive landscape:

    1. Rasterization performance per dollar: Pricing is within AMD’s control, and due to its lower market share and slower inventory rotation, there are usually better offers at the retail level too. Across the board, AMD cards tend to offer 10-15% better rasterization performance than their direct Nvidia equivalents. However, should it choose to, Nvidia can easily close the gap with refreshes of its 4060/4070/4080 lines. AMD could release a hypothetical 7750/7850/7950 refresh, but the cards’ higher power consumption may cripple their headroom. This is a dangerous game for AMD to play, and continuously selling its latest cards at a discount is not sustainable.
    2. VRAM: AMD cards offer more VRAM than their Nvidia equivalents. This has a largely psychological effect: buyers feel that higher VRAM offers superior “future proofing”. Whether or not this turns out to be true depends on how quickly games’ requirements rise and on the timing of the user’s next GPU upgrade. However, this “AMD advantage” is fleeting - all Nvidia needs to do is release a refreshed line with higher VRAM. Imagine a 16GB “4070 Ti Super” with less than a 5% rasterization gap to the 7900XT but with DLSS3.5 and a 20% uplift in RT/PT.
    3. Ray tracing: Nvidia is ahead by at least one generation in hardware-accelerated ray tracing, and there is nothing AMD can do to improve RT rendering with its current generation of hardware. But it has one thing to its advantage: time. Current consoles are in their early years, and game developers will not go “all in” on RT tech. For the next few years, heavy hardware-based RT will be a niche market of a small number of games for PC gamers. This gives AMD some time to catch up - rasterization is most definitely not dead, and will not be for at least the next 3 to 5 years. Nvidia fans are disappointed that developers (most using UE5) are relying on software-based Lumen and other features instead of pushing the hardware side. There is a reason for this - developers are rational people. They want their games to run well on consoles and most mid-range PCs, and they don’t want their buyers to feel they are “missing out” - which is exactly what will happen if they go “all in” like AW2 and CB2077 did, both bankrolled by Nvidia cash. Ray Reconstruction is intrinsically linked to the adoption of ray tracing, so no further comment there.
    4. Frame generation: Let’s make a clear separation between upscaling and frame generation. They are different things. The frame generation component of FSR3 has one big issue, and that is VRR support. This will undoubtedly be fixed, as already announced by AMD. Aside from this, the FG component of FSR3 seems to be a robust technology that is remarkably close to Nvidia’s Frame Generation tech, with the extra bonus of being hardware agnostic. I’m also assuming Anti-Lag+ will be fixed and re-released by the end of the year. The main issue is that games need to be running at 60-70fps before enabling FG (see the frame-time sketch after this list), which may be hard without also enabling upscaling. And you cannot use FSR3 without also using FSR upscaling. Which brings us to the final and most pivotal point.
    5. Upscaling tech: Here we are at the elephant in the room. FSR2.2 upscaling quality is objectively and undeniably worse than DLSS2 and XeSS 1.2. The performance uplift is similar to Nvidia’s and better than Intel’s, but this does not matter. The Image Quality (IQ) downside is too big, particularly the in-motion shimmering and other types of artefacts. I can live with different sharpening levels, and I am glad that Native AA in FSR3 can compete with DLAA. But I cannot live with the motion artefacts… there are too many games and too many examples, and this is not some “Nvidia evil plan”. I am a big fan of AMD and I’ve been sporting Radeons in my systems since my Radeon 9800 Pro in 2004. But the IQ penalty has become unacceptable. I did not buy an $800 card to play like this.
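
    On point 4, here’s a crude Python model of why that 60-70fps base matters so much for frame generation. The “one extra frame of delay” is my simplifying assumption about how interpolation buffers frames, not a vendor-confirmed figure:

    ```python
    # Crude FG model: the interpolator has to hold back one real frame,
    # so added latency scales with the BASE frame time even though the
    # presented fps roughly doubles. Assumed model, not vendor data.
    def fg_estimate(base_fps: float):
        base_frametime_ms = 1000.0 / base_fps
        presented_fps = base_fps * 2           # idealized 2x interpolation
        added_latency_ms = base_frametime_ms   # >= one held-back frame
        return presented_fps, added_latency_ms

    for fps in (30, 60, 90):
        shown, lat = fg_estimate(fps)
        print(f"{fps} fps base -> ~{shown:.0f} fps shown, +{lat:.1f} ms latency")
    # 30 fps base: ~60 fps shown but +33.3 ms - feels awful.
    # 60 fps base: ~120 fps shown, +16.7 ms - hence "hit 60-70 fps first",
    # which in practice usually means enabling upscaling.
    ```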

    AMD’s FSR upscaling tech needs improvements - and fast. This is their Achilles’ heel. Running games natively at higher than 60fps is already difficult, and will become more so as additional UE5 games are released.

    Want to hit 60fps with a low-to-mid range card? Enable upscaling.

    Want to hit 100+fps reliably? Enable upscaling.

    Want to hit 100+fps with a low-to-mid range card? Enable upscaling to hit 60fps and then FG on top.

    Want to enjoy mid to high quality RT features? Enable upscaling, with FG on top depending on your GPU.

    Upscaling is already ubiquitous, and underpins all the other technologies: Frame Generation, Ray Tracing and Path Tracing.
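
    And to make concrete where the upscaling uplift comes from, here is a quick sketch using the published FSR2 preset scale factors (DLSS uses very similar ratios):

    ```python
    # FSR2 presets render at a fraction of output resolution per axis,
    # so the pixel count (and most of the shading cost) drops quadratically.
    presets = {"Quality": 1 / 1.5, "Balanced": 1 / 1.7, "Performance": 1 / 2.0}
    out_w, out_h = 2560, 1440  # example: 1440p output

    for name, scale in presets.items():
        w, h = int(out_w * scale), int(out_h * scale)
        print(f"{name:>11}: {w}x{h} internal ({scale * scale:.0%} of the pixels)")
    # Quality mode shades only ~44% of the pixels - that is the whole
    # frame-rate win, and exactly why the reconstruction quality problems
    # described above matter so much.
    ```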

    …and FSR2.2 is just poor. You can’t blame it on hardware, as XeSS offers objectively better quality on AMD cards. So AMD can and must improve it with urgency.

    Does it need to completely re-work FSR tech? Does it need to scrap it and copy XeSS? Does it need to offer less of a performance uplift and more quality? Does it need an Ultra Quality setting? I doubt it, as Native AA (i.e. sharpening without upscaling) does not remove the shimmering. Does it need to de-couple FSR upscaling from frame generation, so that we can use XeSS + FSR FG? I don’t know; this is why we need some expert insight.

    What I know is that UPSCALING QUALITY has become -the- critical factor, and Nvidia’s DLSS is its MAIN advantage nowadays - not raytracing.

    • Mother-Translator318@alien.topOPB

      Yup. You hit the nail on the head. Of course raw performance is still king, but features, especially upscaling, are the queen. Upscaling is extremely important now, to the point where it’s easily worth an extra 10% to get DLSS. On top of that, Nvidia users get both DLSS and FSR, meaning they are set no matter what. It’s a tough situation for AMD.

  • Turbotef@alien.topB

    Meh, still gonna buy a 7800 XT

    Better drivers? When will you guys drop that ridiculous myth of bad AMD drivers?

    My GTX 1070 and 1080 Ti had multiple driver errors, while I only had issues once on my old 280X.

    • Mother-Translator318@alien.topOPB

      AMD drivers are worse, though. Both the 7900 XTX and the 7900 XT had massive driver problems for almost 3 months after launch. I’m not saying Nvidia is perfect, but they have issues far less often.

      Also, buying a GPU right now is in general a horrible idea, as Nvidia is about to launch the Super cards. If AMD is smart, we should see some huge price drops; otherwise, if the performance uplift is similar to the 20-series Super cards, Nvidia will bury them.

  • Imaginary-Ad564@alien.topB

    Nvidia drivers suck IMO because of the shitty user interface.

    So I think drivers go to AMD for ease of use.

    ROCm is making CUDA irrelevant.

    AMD’s Frame Gen kinda destroys one of Nvidia’s main selling points for the 40 series, so that’s why the price should drop on the 40 series.

    On top of that, you can use AFMF in almost all games on 6000/7000 cards now, which is something Nvidia doesn’t offer.

    • akumian@alien.topB

      Clicking “update now” in an app is shitty? Haters gonna hate; they just can’t face the facts.

  • whatthetoken@alien.topB

    It’s selling well. It’s literally fire in new games. It’s already a good price for the performance.

    I can see why they won’t drop the price. Nvidia may make them, but I doubt it.

    • Dchella@alien.topB

      It’s crazy to hear that the 6800xt/3080 performance level is still ‘fire.’ Such a disappointing generation…

      • ger_brian@alien.topB

        The only area where the new generation is exciting is the high end. The 4090 offered a great jump from the previous generation and was a much better card than the 3090 was relative to the rest of the product portfolio.

      • kyralfie@alien.topB

        The 7800 XT is like 50% cheaper than the 6800 XT’s inflation-adjusted MSRP ($650 then is like $750 now - quick sanity check below). Plus, those cards only briefly appeared at that MSRP and sold for much more for most of their lifecycle. So yes, it’s quite a deal at this price.
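
        A quick sanity check on that inflation adjustment. The cumulative CPI figure is my own rough assumption; check BLS data for the exact number:

        ```python
        # Rough inflation adjustment, late 2020 -> late 2023.
        msrp_2020 = 650.0
        cumulative_inflation = 0.17  # assumed cumulative US CPI over the period
        print(f"${msrp_2020 * (1 + cumulative_inflation):.0f} in 2023 dollars")
        # prints ~$760, so "$650 then is like $750 now" holds up, give or take.
        ```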

      • drwhetfarts@alien.topB

        Nothing about 6800 XT/3080 performance is considered “fire” in 2023. Yet it’s fine for some people.

      • Cowstle@alien.topB

        The 980/390X not being “extremely good” the next generation was an exception, not the norm. Pascal was an amazing generation that was also totally one-sided because of how ridiculously good it was.

        Top-of-the-line GPUs tend to fall to ~2nd best once the next generation rolls around, and that’s still very good.

      • SUNTZU_JoJo@alien.topB

        Agreed. Got my 6800 XT at launch 3 years ago at MSRP… best GPU purchase I ever made (up there with my previous purchase of the RX 580 8GB Sapphire Nitro+ for £249, tax included).

  • saboglitched@alien.topB

    I saw a 4070 for $530 with a $20 promo code and with Alan Wake included on Newegg the other day. And when you see that the 4070 beats the 7900 XTX in path tracing natively in games like Alan Wake, never mind with things like DLSS 3 and 3.5, yeah, the 7800 XT needs to be at least 15-20% cheaper to make up for the RT hardware and software deficiencies and be the better buy imo.

    • ziplock9000@alien.topB

      Yes, the 4070 is better in 1% of games, whereas the 7800 XT has better raster performance in the other 99%. DLSS is just a nice-to-have.

      Raster performance will always be king, the first priority for a GPU.

      Very obvious cherry-picking.

      • Cats_Cameras@alien.topB

        Why settle for raster if you could have better lighting, though? E.g., the difference is night and day in games like Witcher 3 and Cyberpunk.

  • Redericpontx@alien.topB

    Honestly, if I were really that strapped for money and had to stick to that tight a budget, I’d personally go 7800 XT, simply because I pretty much only do raster gaming in MMOs and competitive games. Even when I play Minecraft, for example, I just use shaders and texture packs and don’t touch RT.

    Even then, an extra $80(?) for less raster and less VRAM feels like a kick in the nuts imo.

  • SRFoxtrot341_V2@alien.topB

    If the 7800 XT is already selling as well as expected, I don’t see why AMD should feel pressured into a price drop.

    It is good to see two decent cards being extremely competitive atm, unlike last gen, when the 6700 XT literally wiped the floor at $320/330 against the likes of the RTX 3060/Ti and even the 3070.

    • Murky-Fruit3569@alien.topB

      There is no such thing as “selling as expected” in this industry, my dude. You want to sell as much as possible. The fact that you are still selling an acceptable amount doesn’t mean you should rest easy. Just saying.

      • Verpal@alien.topB

        Pretty sure AMD is still limited by wafer supply. If AMD could scale up supply whenever they wanted, then sure, sell as much as possible - but desktop dGPUs are quite low priority compared to supplying consoles and OEMs.

      • riba2233@alien.topB

        They have a limited number of wafers from TSMC and they are selling through them, so no issues.

      • ProphetoftheOnion@alien.topB

        “Selling as expected” means AMD has a certain amount of wafer supply pre-ordered, and going beyond that might incur costs they won’t recoup.

        AMD isn’t ready to beat Nvidia this generation; most of their focus is on delivering the Epyc backlog. Graphics is getting more R&D time and funding, but this is mostly for packaging knowledge and AI growth.

        Packaging is AMD’s current bottleneck, and it doesn’t matter which wafers they use. From what I’ve heard, Epyc, Zen and Radeon chiplet designs get packaged using the same resources, and AMD would rather be packaging Epyc parts.

      • ApplicationCalm649@alien.topB

        That’s not how business works. Margin matters, too, not just volume.

        AMD is a company, not a sports team. They are in this to turn a profit. Investors don’t care if they chip away at Nvidia’s install base. They care if they make money.

        • ziplock9000@alien.topB

          You’re assuming that overall volume of profit will go down if they sell more at a lower price, which is VERY unlikely given the huge markups they have.

          • Dreadnerf@alien.topB

            You’re ignoring that they have better things to spend silicon on than GPUs. They could be - and are - letting GPUs idle while shovelling CPUs.

    • drwhetfarts@alien.topB

      I doubt the 7800 XT is selling well. The card was delayed for almost a year because AMD wanted to sell out 6800 and 6900 inventory, and the 7800 XT will be replaced next year by RDNA4, which will have no high-end SKUs but will still include an 8800 XT that will beat the 7800 XT for sure.

  • NoLikeVegetals@alien.topB

    Better drivers

    lol, spoken like someone who hasn’t used Nvidia and AMD drivers over the last 3 years.

  • vivu1@alien.topB

    Point #7 is not a point for Nvidia; the ability to use FSR exists because AMD is just that good about letting everyone use it. It’s just that Nvidia is so stingy that they won’t let their own technology, like DLSS 3, be used on their own products. I’d say RTX 3000 and 2000 cards can definitely run it, but Nvidia wants to justify the bad prices on their new GPUs somehow.

  • scubawankenobi@alien.topB

    CUDA for the few people that actually need it.

    Aka “AI use cases” = a massive growth market (the use cases keep growing).

    Not everyone “needs” AI, but it’s great to have the capability.

    Note: I’m a long-time team-red fan, and I finally gave up after wasting the better part of a year on high-end AMD cards (Vega 64 liquid cooled, other variant cards w/ 32GB VRAM, etc.) and never getting them to either A) work at all… or B) perform like anything but crap w/ ROCm.