• gravballe@alien.topB
    10 months ago

    And still high idle power usage (can't get mine under 15 W). My Nvidia / AMD cards idle around 4-5 W even with a screen attached… (I'm using an A380 in a server for video encoding/decoding.)

  • BlueKnight44@alien.topB
    10 months ago

    Where is Intel's future, honestly, without GPUs? Second-rate CPUs? Manufacturing silicon for everyone else? I just don't see how Intel continues as an industry leader without expanding into other segments like GPUs. They have sold off a bunch of other businesses in the last few years. Maybe I just don't understand their business enough.

  • brand_momentum@alien.topB
    10 months ago

    I never thought Intel would get rid of Arc dGPUs, but now the rise of AI could be a big, solid reason for shareholders to keep Arc going anyway.

    I don't see AMD beating Intel in AI.

    • dotjzzz@alien.topB
      10 months ago

      They need to beat the 4070 at the beginning of the cycle, or at least reach 4070 Ti level mid-cycle. And that's just the old x070 non-Ti performance tier to begin with.

  • yock1@alien.topB
    10 months ago

    Let's hope Intel continues. Nvidia seems to be concentrating on AI, which isn't good for gamers, and leaving a sole manufacturer (AMD) with a monopoly won't be good either.

      • yock1@alien.topB
        10 months ago

        "Gamers" as a whole, just in case of misunderstanding.

        Nvidia are earning a ton of money on AI hardware; they would be fools not to shift their manufacturing capacity towards AI and away from consumer graphics cards. This will make graphics cards scarcer and more expensive. Just look at what the crypto boom did, and still does: graphics cards cost an arm and a leg, and it will get much, much worse with the AI boom.

  • soggybiscuit93@alien.topB
    10 months ago

    Arc is realistically a bigger threat to AMD than it is to Nvidia. The second half of the 2020’s will be AMD and Intel competing over second place for desktop dGPUs.

    For mobile, Arc iGPUs, while obviously not matching dedicated GPUs, can realistically offer good-enough performance for people who only want light gaming; stepping up to a low-end dGPU just to make sure Minecraft, Fortnite, etc. can at least run may not be worth the extra cost.

    Either way, I think Intel's heavy focus on putting Arc in all of their Core Ultra CPUs, and on iGPUs generally, could be a bigger disruptor than their desktop dGPUs, at least in the near term.

    • Eitan189@alien.topB
      10 months ago

      Arc is no threat whatsoever to Nvidia, not unless Intel manages to scale the architecture up to enterprise-grade levels and develops something akin to the CUDA API.

      • soggybiscuit93@alien.topB
        10 months ago

        Intel’s competitor to CUDA is oneAPI and SYCL. Intel poses no threat to Nvidia GPUs in datacenter in the near term, but that doesn’t mean Intel won’t still secure contracts.

        Intel's biggest threat to Nvidia is in Nvidia's laptop dGPU volume segment. Arc offers synergies with Intel CPUs, a single vendor for both CPU and GPU for OEMs, and likely bundled discounts for them as well. A renewed focus on improving iGPUs also threatens some of Nvidia's low-end laptop dGPUs: customers no longer have to choose between a very poor iGPU and stepping up to a dGPU, and iGPUs will become good enough that some customers will simply skip a low-end mobile dGPU in the coming years.

        • Nointies@alien.topB
          10 months ago

          Not to mention that Intel could have consumer AI tech in nearly -every- laptop sold in 5 years with just an Intel iGPU, plus mini-PCs etc., especially if LNL pans out well. That's a scale of deployability that Nvidia simply cannot compete with.

        • NoiseSolitaire@alien.topB
          10 months ago

          A renewed focus on improving iGPUs also threatens some of Nvidia’s low end dGPUs in laptops - customers don’t have to choose between very poor performance iGPU or stepping up to a dGPU

          AMD has had iGPUs in laptops for a long time now, and has had the better CPUs for several of the past few years, yet laptops are still sold with Nvidia dGPUs even when they have decent AMD iGPUs.

          It might kill the lowest of the low-end laptop dGPUs, but I think Nvidia's pricing is doing that faster than Intel's success with Arc.

          • YNWA_1213@alien.topB
            10 months ago

            The issue with AMD laptops is availability and the mixing of generations under similar SKU numbers. There’s only a handful of Zen4 laptops in the wild, and they’re mixed in with Zen2 and Zen3 parts, leading to a confusing experience for the average buyer. So, people will either go for an Intel laptop, or find an Nvidia dGPU laptop for the ‘upgrade’.

    • Nointies@alien.topB
      10 months ago

      Not to mention that the inclusion of XMX cores in Arrow Lake, and presumably beyond, could provide XeSS upscaling similar to what DLSS does, all without a dGPU.

  • bubblesort33@alien.topB
    10 months ago

    So Smooth Sync is just not working in most games? Like what if you tried it in something totally unexpected like The Witcher 1 or 2? Something old, or something brand new? Is it a white list where they select which games to enable it for? Or a black list where they disable it for certain games exhibiting problems?