• 1dgtlkey@alien.top · 1 year ago

    I love my 7900 XT. The only thing I dislike about it is FSR; it’s so obviously worse than DLSS it’s not even funny.

  • BucDan@alien.top · 1 year ago

    What are the odds AMD does a refresh with faster memory, like the 6900 XT to the 6950 XT?

      • Pl4y3rSn4rk@alien.top · 1 year ago

      Honestly, I’d rather they focus on the VOPD (dual-issue) implementation to raise overall performance in some scenarios; if they manage even a 10 to 20% uplift in gaming, that would be enough to make the 7900 XTX competitive with the 4090.

  • Todesfaelle@alien.top · 1 year ago

    I bought an XFX Merc 7900 XT for what works out to $730 USD, which put it about $110 cheaper than the cheapest 4070 Ti in my region, which also happened to be the incredibly ugly Zotac model.

    While I can live without pumping bigger RT numbers, it’s DLSS I have a hard time losing when I play something that benefits from upscaling, since FSR, in its current long-standing state, leaves much to be desired.

    There is a degree of buyer’s remorse since upscaling will only become more popular but I still feel pretty good about my decision to get it over the 4070 Ti for 1440p.

  • StewTheDuder@alien.top · 1 year ago

    7900 XT Taichi owner on a 7800X3D system. I chose it over a 4070 Ti after my 3070 Ti hit the VRAM wall at 3440x1440 in a few AAA titles; I wasn’t going to let Nvidia get me again. The card is an absolute beast. Half the time I find myself capping my frames to 75/90/120 depending on the game, and my system purrs while sipping power, with temps cooler than a polar bear’s toenails.

    RT performance is adequate, rarely even needing upscaling to hit 60+ fps at 1440 UW. Cyberpunk looks nice with RT, and I can run Ultra RT with XeSS and get a nice look and feel. But you know what feels great and looks great? Running the game maxed out, no RT, and getting 120+ fps on my DWF QD-OLED. OLED is a far bigger boost to fidelity than RT. Save some money on your GPU and get an OLED.

    By the time RT is standard, 3-4 years from now, I’ll be upgrading my GPU anyway. RT is nice in some games and I do like to turn it on from time to time, but we just ain’t there yet. This card is fast af, so upscaling isn’t big for me either. AMD does need to figure out the shimmering and blurry stuff, but right now, for what I need and what I’m getting? Happy as a clam. I have Nvidia in another system in my house and don’t have a “team”. I’ll probably look to upgrade my GPU around the 6000/9000 series releases and see where each company is at. Rooting for Intel too.

      • wirmyworm@alien.top · 1 year ago

      If you want to target 4K, instead target 1800p and use RSR to make the image look a bit sharper. It works really well, and in my experience you get a crisper image. With some TAA nowadays, native 4K can look a bit blurry imo.

  • Revolutionary-Land41@alien.top · 1 year ago

    Well, I went with AMD this time and bought an RX 7900 XT.

    Both brands have their pros and cons, but I will never ever buy a GPU again with the bare minimum of VRAM, like my RTX 3070 had.

  • NyanMusume@alien.top · 1 year ago

    7900 XT Merc 310 Black here. If I don’t mind letting it hit 100 °C, I can get fps out of it comparable to the 7900 XTX reference model. I paid around $700-ish with the free game and a big sale a few months ago. Currently I’m running it with clocks capped at 2200 MHz on the normal BIOS (instead of letting it auto-boost to 3000 on the max-power BIOS), and it stays between 60-75 °C hotspot. Very satisfied.

  • drummerdude41@alien.top · 1 year ago

    Here’s where things get fuzzy for, I think, a lot of consumers who are into competitive gaming. In the games I play, I will never use DLSS, FSR, or XeSS. These techs, especially the new frame-generation ones (DLSS 3 and FSR 3), add significant latency to the game. By design, DLSS 3 can’t avoid adding latency, because the software has to wait on two frames before it can generate a third AI frame between them. That still results in more frames, but it adds latency. I almost exclusively play competitive games, and that latency is a big no-no. I also want high frame rates, so things like RT don’t really matter to me. In this regard, AMD is a very compelling choice because it gets almost identical performance at a cheaper cost for what I use it for. Nvidia Reflex is the only software that potentially sways me personally, but fps limits under your monitor’s refresh rate while using G-Sync/FreeSync have a similar effect. So while I am a fraction of the GPU market, there is a market where DLSS and FSR are not the future.
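    The buffering argument above can be sketched as a toy model (my own illustrative Python, not anything from a vendor SDK; the frame times and the `interp_cost_ms` parameter are assumptions): an interpolation-based frame generator must hold the next real frame back before it can synthesize the in-between frame, so each real frame reaches the screen at least one real frame-time late.

```python
# Toy model of interpolation-based frame generation (DLSS 3 / FSR 3 style).
# The generator needs real frames N and N+1 before it can synthesize the
# in-between frame, so frame N+1 is displayed at least one real frame-time
# later than it would be without frame generation.
# All numbers are illustrative assumptions, not measured values.

def added_latency_ms(real_fps: float, interp_cost_ms: float = 0.0) -> float:
    """Extra display latency from buffering one real frame, plus any
    assumed cost of the interpolation pass itself."""
    frame_time_ms = 1000.0 / real_fps
    return frame_time_ms + interp_cost_ms

# At 60 real fps the output looks like ~120 fps, but each real frame is
# shown roughly one 16.7 ms frame-time later than without frame generation.
print(round(added_latency_ms(60), 1))   # -> 16.7
print(round(added_latency_ms(120), 1))  # -> 8.3
```

    Note the asymmetry this toy model highlights: the higher your real frame rate already is, the smaller the buffering penalty, which is why frame generation stings most in exactly the low-fps scenarios it is marketed for.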

      • wirmyworm@alien.top · 1 year ago

      Using my 7900 XT with an LG OLED as well, playing Ark Survival Ascended at good image quality on high or epic settings. How hot does yours get? In Ark, mine reached 84 °C according to MSI Afterburner.

      • blandhotsauce1985@alien.top · 1 year ago

      Got one of AMD’s branded reference models a few months back. I love it. I game at 1440p, so I’m glad to hear it’s decent at 4K.

  • adaa1262@alien.top · 1 year ago

    What’s going on in this sub lately with all the Nvidia fanboys acting like RT is necessary and FSR is a blurry mess?!

    No, we don’t care about RT, and FSR is fine; it depends on the game’s implementation (I haven’t had any issues with Alan Wake 2, what are you talking about?!).

    They also keep whining about Starfield as if AMD sabotaged the launch, like we haven’t seen Nvidia do it before (cough cough, GameWorks).

    And no, Starfield runs like shit because the game was made in Vulkan; they hired AMD to port it to DX12, and they didn’t have time to implement basic features like an FOV slider and other upscaling tech at launch.

    Also, you have to understand that Nvidia is starting to not care that much about the gaming division, releasing lackluster drivers; that’s why it runs like shit on Nvidia hardware.