• Rankzmajor@alien.topB · 10 months ago

    I’d like to see a rumor that they’ve solidified the manufacturing process!

    Prices coming down across the board… I can dream, I guess.

  • Tummybunny2@alien.topB · 10 months ago

    These will go nicely with DisplayPort 2.1 monitors that should be available by 2030, or perhaps even earlier.

    • fixminer@alien.topB · 10 months ago

      The refresh cards still use the same silicon, just with a different number of disabled functional units. Enabling DP 2.1 might require an expensive partial redesign, which Nvidia doesn’t really have any incentive to do, since people will buy the GPUs anyway.

      • capn_hector@alien.topB · 10 months ago

        What is different about AMD’s W7000 such that they can offer higher DP standards support than the RX 7000 consumer cards?

    • ThatOneLance@alien.topB · 10 months ago

      A “new” “major” feature to help sell the new cards. Let’s say you got a 4090, upgraded your panel, and then realized you’re going to need DP 2.1 because you got a 20k super-HD OLED monitor… time to drop money on that 5090.

      There is “Display Stream Compression”, or DSC, which I believe can help with the bandwidth constraints going from 1.4a to 2.1, but it is compression and may cause latency/quality loss. In general you just lose future-proofing, as AMD already has DP 2.1, meaning they can fully support the 540Hz monitors, for example, or any other high-quality panel.
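
      To put rough numbers on that (a minimal back-of-the-envelope sketch: blanking overhead is ignored and the ~3:1 DSC ratio is an assumption, not a spec figure):

      ```python
      # Does a 4K 240Hz 10-bit stream fit on DP 1.4a, with and without DSC?
      # Rough figures only: video blanking overhead is ignored.

      DP14A_PAYLOAD_GBPS = 32.4 * (8 / 10)  # HBR3 x4 lanes = 32.4 Gbps raw; 8b/10b coding -> ~25.9 Gbps usable

      def stream_gbps(width, height, hz, bits_per_channel=10, channels=3):
          """Uncompressed video data rate in Gbit/s, active pixels only."""
          return width * height * hz * bits_per_channel * channels / 1e9

      uncompressed = stream_gbps(3840, 2160, 240)  # ~59.7 Gbps
      with_dsc = uncompressed / 3                  # assuming the commonly cited ~3:1 "visually lossless" DSC ratio

      print(f"DP 1.4a payload : {DP14A_PAYLOAD_GBPS:5.1f} Gbps")
      print(f"4K240 10-bit raw: {uncompressed:5.1f} Gbps -> fits: {uncompressed <= DP14A_PAYLOAD_GBPS}")
      print(f"4K240 with DSC  : {with_dsc:5.1f} Gbps -> fits: {with_dsc <= DP14A_PAYLOAD_GBPS}")
      ```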

  • TimeGoddess_@alien.topB · 10 months ago

    I wonder why everyone holds AMD’s DisplayPort 2.1 in such high regard. It’s barely more bandwidth than HDMI 2.1, since it’s not the full UHBR20 at 80 Gbps:

    it’s UHBR13.5 at 54 Gbps on RDNA 3 vs 48 Gbps HDMI 2.1 on Ada GPUs.
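
    For what it’s worth, the raw link rates understate the gap a little, because the two standards use different line codings. A quick sketch of the usable payloads (my own approximation; FEC and blanking details are ignored):

    ```python
    # Approximate usable payload after line-coding overhead.
    links = {
        "DP 2.1 UHBR13.5 (RDNA 3)":  (54.0, 128 / 132),  # 128b/132b coding
        "DP 2.1 UHBR20 (full spec)": (80.0, 128 / 132),
        "HDMI 2.1 FRL (Ada)":        (48.0, 16 / 18),    # 16b/18b coding
        "DP 1.4a HBR3 (Ada)":        (32.4, 8 / 10),     # 8b/10b coding
    }

    for name, (raw_gbps, efficiency) in links.items():
        print(f"{name:26s} raw {raw_gbps:4.1f} Gbps -> ~{raw_gbps * efficiency:4.1f} Gbps payload")
    ```

    So in payload terms it’s roughly 52 vs 43 Gbps, a slightly wider gap than the headline 54 vs 48 suggests.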

    • althaz@alien.topB · 10 months ago

      Because HDMI is a pain in the ass and everybody who is using a PC is expecting to use DisplayPort.

      • kuddlesworth9419@alien.topB · 10 months ago

        The only time we use HDMI is when you want to connect your PC to a projector or a surround system, because the standard there is HDMI.

        • Haunting_Champion640@alien.topB · 10 months ago

          And unless I’m missing something, the ONLY way to get TrueHD 7.1 + ATMOS is via HDMI. The old optical standard (name escapes me) can’t do that.

          • kuddlesworth9419@alien.topB · 10 months ago

            Yeah, I think you are right, but all the standards and audio/video formats get very complicated very fast for me. With MPC-HC and MadVR I can play anything no problem, so that is OK. But if you actually want a surround receiver, and for the audio to work as intended, you need to use HDMI as far as I understand it.

      • Haunting_Champion640@alien.topB · 10 months ago

        > Because HDMI is a pain in the ass and everybody who is using a PC is expecting to use DisplayPort.

        Unless you’ve ascended to the PC-powered couch gamer. Don’t knock it until you try it!

        • PC

        • flagship OLED

        • comfy couch

        • ATMOS surround system

        Takes PC gaming to a whole other level. HDMI is nice for eARC (PC direct to display, audio feeds back to the AVR/amp via HDMI). It’s a clean, awesome setup. Also lets you run TrueHD 7.1 ATMOS for all those Linux ISOs :)

        • althaz@alien.topB · 10 months ago

          I have 5 PCs and 8 monitors in my home and they are all using DP except one that’s really old and uses DVI. DP is the standard connector for PC monitors.

          • Reddituser19991004@alien.topB · 10 months ago

            Everything in my home uses HDMI.

            I also don’t use monitors. I use an LG CX OLED as my main display, and sometimes hook a PC up to the regular LED bedroom TV.

            No interest in using displayport, HDMI is fine for the vast majority of use cases. Not sure why we need competing connectors. Frankly, not sure why we don’t all just switch to USB-C long term.

            There’s just no reason to not standardize with USB-C for every display.

      • filisterr@alien.topB · 10 months ago

        Not to mention that HDMI is a licensed port that is not fully implemented under Linux, while DP is a license-free port.

      • TimeGoddess_@alien.topB · 10 months ago

        I’m using it to connect my PC to my OLED TV and AV receiver. Way better than a monitor to me.

    • JohnExile@alien.topB · 10 months ago

      There was some kind of recent issue with a new fatty monitor requiring 2.1, right? I remember people rubbing it in Nvidia’s face over it. This article just reads funny, like dunking on Nvidia for not putting money into something that had literally zero tech available for it in the foreseeable future, and then laughing at them for adding it when there’s finally tech to use it.

        • TheRealBurritoJ@alien.topB · 10 months ago

          The display supports its full resolution and refresh rate over HDMI 2.1. It’s just not working right now at full rate over HDMI with either AMD or Nvidia graphics cards, for some reason.

          The table included in this article is misleading, as they’ve cropped off the original Korean text which states that it’s currently not working at 240Hz over HDMI 2.1 with AMD either but that they contacted AMD who said it would work with a future driver update. The only reason that NVIDIA is listed at 120Hz in that table is because Quasarzone didn’t get a reply from NVIDIA in time for publication.

          It’s unclear why no cards can do 240Hz over HDMI with that monitor when it’s in spec.

          • AK-Brian@alien.topB · 10 months ago

            The display only has one high refresh HDMI port (out of three), but that input is limited to 120Hz. It’s stamped on the shell and listed as such in the manual, so it would appear to be on Samsung.

            https://imgur.com/a/xx1PWyp

            240Hz on this display requires the use of DisplayPort, which is what makes it a perfect (well, imperfect) example.

      • Haunting_Champion640@alien.topB · 10 months ago

        > like dunking on Nvidia for not putting money into something that had literally zero tech available for it in the foreseeable future

        That’s not how this works. GPUs have an effective lifespan of 5-10 years, and hardware vendors look at what’s out there to set roadmaps and releases. Someone has to go first and make it available, and Nvidia skipping 2.1 on 4xxx is only slowing down next-gen display releases.

      • Dez_Moines@alien.topB · 10 months ago

        Probably because people expected a $1000+ GPU to have one year’s worth of incredibly foreseeable future-proofing built into it.

    • Stoicza@alien.topB · 10 months ago

      48 vs 54 is roughly 12.5% more bandwidth. In the PC hardware world, 12% isn’t often considered very close.

      That 12.5% increase in bandwidth allows this 8K monitor to be run easily at 240Hz with 10-bit color and DSC.
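
      Assuming the monitor in question is the 57-inch dual-UHD panel (7680×2160 is my assumption, and blanking overhead is ignored), a rough check of what 240Hz 10-bit needs once DSC is applied:

      ```python
      # Rough check: 7680x2160 @ 240Hz, 10-bit RGB, with an assumed ~3:1 DSC ratio.
      width, height, hz, bpc = 7680, 2160, 240, 10

      raw_gbps = width * height * hz * bpc * 3 / 1e9  # ~119 Gbps uncompressed
      dsc_gbps = raw_gbps / 3                         # ~40 Gbps after 3:1 DSC

      for link, payload_gbps in [("DP 2.1 UHBR13.5", 54.0 * 128 / 132),   # ~52 Gbps after 128b/132b coding
                                 ("HDMI 2.1 FRL",    48.0 * 16 / 18)]:    # ~43 Gbps after 16b/18b coding
          print(f"{link:16s} payload ~{payload_gbps:4.1f} Gbps, "
                f"DSC stream ~{dsc_gbps:4.1f} Gbps, headroom {payload_gbps - dsc_gbps:+5.1f} Gbps")
      ```

      By this rough math both links could carry the stream with DSC, though DP 2.1 leaves far more headroom once real timing overhead is added.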

      • TheRealBurritoJ@alien.topB · 10 months ago

        That monitor theoretically supports its full resolution and refresh rate over HDMI 2.1 too; the extra bandwidth isn’t the difference maker.

        Right now no cards seem to work with it at 240Hz over HDMI, but it’s listed as supported on the monitor’s end.

    • Kepler_L2@alien.topB · 10 months ago

      Actually RDNA3 dGPUs do have the full 80 Gbps bandwidth, but it’s artificially limited to 54 Gbps on consumer GPUs.

  • fixminer@alien.topB · 10 months ago

    I sure hope Blackwell has DP 2.1, anything else would be a bad joke. Ada should have already had it.

    • hackenclaw@alien.topB · 10 months ago

      I find it quite strange Nvidia didn’t try to upsell it with Ada GPUs, like only offering DP 2.1 on AD102 and AD103, with the lower end getting support next gen.

    • Haunting_Champion640@alien.topB · 10 months ago

      DP 2.1 is great and all, but is there any talk yet of “HDMI 3” or 2.2? Flagship OLED TVs won’t ever come with DP.

      I’m hoping by 2024 we can get an 8K120 TV that can also run in a 4K240 mode.

  • bubblesort33@alien.topB · 10 months ago

    Yes, we know. A lot of people wouldn’t shut up about the Nvidia GPUs not having it, as if it were that important.

    • TSP-FriendlyFire@alien.topB · 10 months ago

      But but but there’s that one Samsung monitor everyone trots out every time the subject is brought up! Obviously every 7000 series owner has one, though I can’t fathom what game even a 7900XTX could drive at 4K super ultrawide at 240Hz.

      • Haunting_Champion640@alien.topB · 10 months ago

        Nvidia: “No one has DP2.1 monitors, why ship it?”

        Monitor manufacturers: “No one has DP2.1 capable sources, why ship it?”

        Someone has to go first. This is how tech works, people.

        • Sexyvette07@alien.topB · 10 months ago

          He’s right, though. Even the 4090 isn’t powerful enough for 4K120 without upscaling and Frame Gen. Why put out 4K240 monitors if the newest-gen, essentially “Titan”-class GPU can’t even drive it? Just to say you did it first? It hardly seems like a good idea to sink all that capital just to be the first to offer a halo product that relatively few people will buy. There’s a significant risk to that strategy that could bite them in the end.

          There’s a natural progression to technology; there always has been. When the hardware becomes powerful enough, that’s when the next step forward will happen, especially with how much money is involved in researching and implementing new technologies. As of now, IIRC only one monitor is capable of pushing 4K240 anyway, and it should work with the 4090 with DSC by HDMI 2.1’s specifications. So either Samsung screwed up, or Nvidia needs to enable it at the driver level. Either way, it’s a non-issue, which I’m sure is why Nvidia didn’t put DP 2.1 on the 40 series. If they don’t have the power to drive the port now, they sure as hell won’t have it down the line.

          • Haunting_Champion640@alien.topB · 10 months ago

            > Even the 4090 isn’t powerful enough for 4K120 without upscaling and Frame Gen. Why put out 4K240 monitors if the newest-gen, essentially “Titan”-class GPU can’t even drive it?

            Because AI-upscaling and frame gen are here to stay. You’re going to have to accept it eventually.

    • capn_hector@alien.topB · 10 months ago

      There are barely even any monitors anyway.

      it’s like nvidia and the consoles: AMD can do whatever they want but the market penetration isn’t there until nvidia is onboard. Monitors are a low-margin high-volume business and you can’t support an advanced product that tops out at 10% addressable market.

      Let alone when that brand’s customers are notoriously “thrifty”…

      • ConsistencyWelder@alien.topB · 10 months ago

        It’s not just about what you need today, it’s also about what you need in a couple years. If I pay $1600+ for a video card you can rest assured I expect it to be used for more than a couple years. Skimping on the ports seems like a bizarre choice.

        • HorseFeathers55@alien.topB · 10 months ago

          So you need more than 165Hz, 10-bit, HDR, and 4K in a couple of years? Because that’s what HDMI 2.1 on a 4090 is running for me. I agree, they could have done better on the ports, but for the majority of users HDMI 2.1 has enough bandwidth, tbh.

          • Haunting_Champion640@alien.topB · 10 months ago

            > So you need more than 165Hz, 10-bit, HDR, and 4K in a couple of years?

            Yes. I want my dual-4K 32:9 240Hz display for coding, goddammit.

          • TSP-FriendlyFire@alien.topB · 10 months ago

            Not just need, but be capable of driving, too. Even a 4090 wouldn’t be able to run most games at the resolutions and refresh rates we’re talking about, and I doubt someone buying an insanely expensive monitor and the most expensive consumer GPU on the planet would then play games on low/mid settings.

        • Sexyvette07@alien.topB · 10 months ago

          HDMI 2.1 can do 4K240 with DSC. The fact is that there’s only ONE monitor even capable of 4K240, and there’s some issue preventing that specific monitor from doing it over HDMI. The only way that’s an Nvidia problem is if it’s not enabled at the driver level. Otherwise, that’s a Samsung problem.

          If the 4090 doesn’t have the power to push 4K240 now, it sure as hell won’t going forward once games become even more demanding. HDMI 2.1 can handle everything the 4090 is capable of. Should it have had DP 2.1? Yes. But acting like it’s holding back the 4090 is ridiculous. People are acting like that one monitor that’s having issues is representative of every 4K240 monitor going forward, and coming to the false conclusion that the 4090 isn’t capable of it. Once the issue with that specific Samsung monitor gets fixed, this all becomes a non-issue.

          Going back to the power issue, it can’t even do 4K240 without a shit ton of upscaling and Frame Gen except in older games anyway. The whole issue is ridiculously overblown, IMO.

        • capn_hector@alien.topB · 10 months ago

          > It’s not just about what you need today, it’s also about what you need in a couple years.

          I think this is a real tough argument even in the high-end monitor market. Isn’t your $700 or $1200 or $2500 or $3500 going to get you more in two years if you wait?

          Why not wait to see what the monitor market has to offer when Nvidia has cards to drive them?

          It literally is the ironic mirror image of AMD’s tech holding back the consoles. Just a funny coincidence of fate, a funny reversal.

  • OrionsTieClip@alien.topB · 10 months ago

    I feel like the number of people who care about being able to drive a display faster than 4K 120Hz is rather small; it’s a bit weird to dunk on NVIDIA for that.

      • OrionsTieClip@alien.topB · 10 months ago

        I personally care about the shit VRAM. I do ML research, and anything except the 3090 and maybe the 12 GB cards is stupid.

    • VIRT22@alien.topB · 10 months ago

      There’s a correlation between NVIDIA adopting a new tech standard and the number of monitor makers offering products that support it, since GeForce is the market leader.

      The 4090 can easily push 4K 240Hz in lighter games if paired with a high-end CPU, without DSC and at full 10-bit.

    • ranixon@alien.topB · 10 months ago

      Expensive cards will be connected to expensive monitors, and expensive monitors like the Samsung Neo G9 use this connector.

    • Firefox72@alien.topB · 10 months ago

      It might be niche for now, but Nvidia skimping on it in their 4000 series was just weird given the price of these products.

  • BroderLund@alien.topB · 10 months ago

    All fine and good. As I work with video, I’m more interested in them adding HW acceleration for HEVC 10-bit 4:2:2, a codec that a lot of cameras shoot in today: Sony, Canon, DJI and more. Intel Arc is the only GPU (and CPU with iGPU) that supports it as of now, and it has been supported since Intel 10th Gen, so well before the 30-series cards.

      • BroderLund@alien.topB · 10 months ago

        It is, for normal compressed video (8-bit 4:2:0). However, cameras capture at higher color quality (10-bit 4:2:2) for the purpose of color grading: the camera shoots in a log profile, and color grading is needed to get a good look. 10-bit 4:2:0 is supported by many GPUs, but that carries less color information, and many cameras only do 4:2:2, not 4:2:0, chroma subsampling. The difference isn’t big until you start color grading your videos; then the colors can get strange and create artifacts that you don’t want.
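
        To make the chroma-subsampling difference concrete, here’s a small illustration (my own sketch, not from the comment) of the average uncompressed bits stored per pixel:

        ```python
        # Average bits per pixel: one luma (Y) sample per pixel plus subsampled chroma (Cb, Cr).
        # 4:2:0 keeps chroma at quarter resolution, 4:2:2 at half, 4:4:4 at full.
        CHROMA_SAMPLES_PER_PIXEL = {"4:2:0": 0.25, "4:2:2": 0.5, "4:4:4": 1.0}

        def bits_per_pixel(subsampling, bit_depth):
            chroma = CHROMA_SAMPLES_PER_PIXEL[subsampling]
            return bit_depth * (1 + 2 * chroma)  # Y plane + two chroma planes

        for fmt, depth in [("4:2:0", 8), ("4:2:0", 10), ("4:2:2", 10), ("4:4:4", 10)]:
            print(f"{fmt} {depth}-bit: {bits_per_pixel(fmt, depth):4.1f} bits/pixel (uncompressed)")
        ```

        So 10-bit 4:2:2 carries twice the chroma information of 4:2:0, which is why the decoder needs a dedicated hardware path for it rather than reusing the 4:2:0 one.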

    • LittlebitsDK@alien.topB · 10 months ago

      So if you already have it in hardware, why would you need another piece of hardware to do the same thing? *puzzled look*

  • gitg0od@alien.topB · 10 months ago

    I need an RTX 5090; as a big VR user, VR mods and all, take my money, Nvidia!

  • wolvAUS@alien.topB · 10 months ago

    Can NVIDIA start normalizing larger VRAM capacities (>12GB VRAM)?

    I’d gladly pay for a card with 20GB of VRAM. The closest is a 4090 but it’s like $3K AUD over here.

    • althaz@alien.topB · 10 months ago

      I think this is just about a given, tbh. Nvidia GPUs are going to benchmark like shit in 2025 with low VRAM totals, now that we’re at the point where the games needing more VRAM are actually out.

      Although that’s mostly going to be a thing for the mid-range (and the low end, if they return to that market). For 4070+ GPUs the VRAM is weirdly low, but it doesn’t seem to cause any problems yet (and honestly I don’t see it being a problem until the next console gen).

  • Prestonality@alien.topB · 10 months ago

    Tech articles would have me believe this whole lack of DisplayPort 2.1 was a bigger problem with the 40 series than it was in reality.

  • Pablogelo@alien.topB · 10 months ago

    Weird article: it regards kopite7kimi as a quality leaker, but at the end says GDDR7 is “other rumors” which should be taken with a grain of salt, when it was hope who affirmed it.

    Btw, would this be N3B or N3E? Did anyone follow up on Nvidia’s orders?