  • JustMrNic3@alien.top

    So soon???

    I hope the EU changes the rules so that vendors have to provide at least 10 years of support for their products, or open-source all the software so others can still provide the support!

  • OkraFit4982@alien.top

    To be frank, RDNA1 also belongs in this bracket; several recent updates have been total regressions in game performance, especially compared to the RADV drivers.

    • markthelast@alien.top

      It’s been a good run. We have been speculating for months about whether AMD was ending real support for Polaris/Vega, and it was only a matter of time.

      A while back, I retired my RX 580 for a Vega 56, and now I’ve got a used Vega Frontier Edition as my daily driver. Polaris/Vega were solid cards for their era, but AMD made their choice to step away with RDNA II/RDNA III. Hopefully, AMD doesn’t abandon RDNA I in 2024/2025 after the way they surprised everyone by retiring support for GCN I/II/III in 2021. With the end of Vega, the GCN era is done.

  • petron007@alien.top

    That sounds about right, considering the past couple of driver updates have been dog smack for my RX 480. I’m still running the ones from July, since they’re the only ones that are stable.

    With the RX 6600 coming down to around 120-140€ and RTX 20 series cards going for under 200€, it’s probably a good time for all of us Polaris owners to jump ship and get something more modern. 👍

    • doneandtired2014@alien.top

      Radeon VII wasn’t a lower tier die. Every Radeon VII ever sold was effectively a salvaged Instinct MI50 that was unable to be validated for that particular market segment. It was and remains AMD’s only equivalent to NVIDIA’s (very dead) Titan line of products (as all Titans were salvaged Quadros and Teslas).

      The jump from 14nm (which wasn’t really that much different from 16nm) to 7nm can’t be overstated. It was only slightly less of a leap than the one NVIDIA recently made when jumping from Samsung 8nm to TSMC N4 this generation (which was *massive*). VEGA 20 might be significantly smaller than VEGA 10, but it also packs 10% more transistors into that smaller surface area. Additionally, the memory interface is twice as wide in VEGA 20 (4096-bit) relative to VEGA 10 (2048-bit) because AMD doubled the HBM2 stacks from 2 to 4. HBM2 was/is insanely expensive compared to GDDR5, GDDR5X, GDDR6, and GDDR6X modules, so much so that Radeon VII’s VRAM cost *by itself* was comparable to the BOM board partners were paying to manufacture a complete RX 580.
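
      A back-of-the-envelope check of that density point, using approximate third-party figures for the two dies (transistor counts and die areas aren’t published by AMD in exactly this form, so treat the numbers as illustrative):

      ```python
      # Rough density comparison: VEGA 10 (14nm) vs. VEGA 20 (7nm).
      # Transistor counts (billions) and die areas (mm^2) are approximate public figures.
      dies = {
          "Vega 10 (14nm)": {"transistors_bn": 12.5, "area_mm2": 495},
          "Vega 20 (7nm)":  {"transistors_bn": 13.2, "area_mm2": 331},
      }

      for name, d in dies.items():
          density = d["transistors_bn"] * 1e3 / d["area_mm2"]  # million transistors per mm^2
          print(f"{name}: ~{density:.0f} MTr/mm^2")
      # Prints roughly 25 vs. 40 MTr/mm^2 -- about a 1.6x density jump from the node change.
      ```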

      All in all, it was an okay card. It wasn’t particularly good for gaming relative to its peers, but the same criticism could easily be made of VEGA 56 and 64. It was a phenomenal buy for content creators who needed gobs of VRAM but couldn’t afford the $2,500 NVIDIA was asking for the Titan V.

      • handymanshandle@alien.top

        I remember one of the primary driving factors behind the cost of the R9 Fury cards (Fury, Nano and Fury X, as well as stuff like the Radeon Pro Duo) being the ridiculous cost of HBM manufacturing. Given that it’s, well, stacked memory with little room for manufacturing defects, it was not cheap to produce.

        I want to say that this was also the primary reason that the RX Vega cards (Vega 56 and 64, more accurately) were cheaper than their Fury counterparts: fewer of those insanely expensive memory stacks means, well, a less expensive card. I could honestly see why AMD ended up dropping HBM for consumer graphics cards, as its ridiculous memory bandwidth advantage was diminished heavily by its buy-in cost and the rise in suitable gaming performance of GDDR5/6 memory, even if it meant that the cards consumed more power.
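
        For a rough sense of how much that bandwidth gap closed, here’s a quick sketch using commonly quoted bus widths and per-pin data rates (approximate figures, purely illustrative):

        ```python
        # Peak memory bandwidth (GB/s) = bus width in bits / 8 * per-pin data rate in Gbps.
        # Approximate public specs for a few representative cards.
        cards = {
            "R9 Fury X (HBM1, 4096-bit)":  (4096, 1.0),   # 4 HBM1 stacks
            "Vega 64 (HBM2, 2048-bit)":    (2048, 1.89),  # 2 HBM2 stacks
            "RX 5700 XT (GDDR6, 256-bit)": (256, 14.0),   # plain GDDR6
        }

        for name, (bus_bits, gbps_per_pin) in cards.items():
            bandwidth = bus_bits / 8 * gbps_per_pin
            print(f"{name}: ~{bandwidth:.0f} GB/s")
        # ~512, ~484, and ~448 GB/s respectively: a mainstream 256-bit GDDR6 card
        # lands within roughly 10% of Vega's two-stack HBM2.
        ```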

      • capn_hector@alien.top

        “Radeon VII wasn’t a lower tier die. Every Radeon VII ever sold was effectively a salvaged Instinct MI50 that was unable to be validated for that particular market segment.”

        Sure, but couldn’t they have made a bigger chip that performed even faster? Why did they reduce the size of the flagship? Why not move to the smaller node and also keep the die size the same?

        Yeah, it’d take architectural changes to GCN, but that’s not consumers’ problem; they’re buying products, not ideas.

        Isn’t that exactly what NVIDIA did with Ada? They shrank the node, but all the dies got much smaller, so an x80-tier product is now the same size as a 3060 or whatever.
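
        For what it’s worth, publicly listed die areas roughly bear that out (approximate numbers; the pairings below are my own illustration of the flagship-die shrink, not an official comparison):

        ```python
        # Approximate die areas (mm^2) from public listings: an older flagship-class die
        # next to the smaller-node part that took over a similar slot in the stack.
        pairs = [
            ("Vega 10 (14nm, Vega 64)", 495, "Vega 20 (7nm, Radeon VII)", 331),
            ("GA102 (Ampere, RTX 3080/3090)", 628, "AD103 (Ada, RTX 4080)", 379),
        ]

        for old_name, old_area, new_name, new_area in pairs:
            print(f"{old_name}: {old_area} mm^2 -> {new_name}: {new_area} mm^2 "
                  f"(~{new_area / old_area:.0%} of the older die)")
        ```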

  • taryakun@alien.top

    Radeon VII was released in February 2019, 4.5 years ago. That’s too early to drop support.

    • IrrelevantLeprechaun@alien.top

      No, it’s not too early. 4.5 years in the GPU industry is comparable to 10-25 years in actual time.

      There is zero justifiable reason to continue supporting such ancient hardware. I mean even Nvidia doesn’t support Turing anymore, let alone Pascal lmao.

    • LordTism@alien.top

      Same way I feel about RDNA 1 not getting driver-level frame gen, Anti-Lag, or any other fancy new driver-level features. It makes no sense, since all of those features will work on RDNA 1 if a dev implements them in a game. But at a driver level? Nope, we gotta go F ourselves, apparently. I’m really tempted to not go AMD for my next GPU. If they can’t even support a 4-year-old GPU, I’m not interested.

  • nyanmisaka@alien.top

    AMD has never been a software company. But NVIDIA is. Maxwell 2 and Pascal are still being supported.

  • RyanSmithAT@alien.top (OP)

    Nothing you guys haven’t already noticed with the last couple of Adrenalin driver releases for Windows. But AMD has finally officially commented on the matter (and this has yet to be posted on their website).

    Polaris and Vega aren’t legacy - AMD isn’t pulling ongoing driver support entirely - but they’re now in an “extended” support phase. Meaning they’ll mainly get bug fixes and irregular “functionality updates” that AMD decides to backport from their mainline (now RDNA-only) driver branch. AMD has not told me how long they intend this extended support period to last.

  • Astigi@alien.top

    Linux has way better open drivers and performance anyway, and not thanks to bloated and lazy AMD.
    Mesa is a blessing

  • knownbyfew_yt@alien.top

    Is this even ethical business practice?

    In my country, brand-new RX 580s are still being sold, and you also have to consider that AMD still makes new APUs with Vega iGPUs, so how can they discontinue game optimization updates for the Vega/Polaris series?

    • Jism_nl@alien.top

      There are no brand-new RX 580s. The card has been discontinued for a long time; you’re buying stock that’s been sitting on the shelf for a while. If you install the card now and download the latest drivers for it, 99.9% of everything works just fine.

  • bert_the_one@alien.top

    Well, when driver support ends for my RX 580, which is AMD’s most popular graphics card on Steam, I think I may take a good look at Intel Arc graphics cards, or at the equivalent AMD card, provided it’s priced right.

    • Swizzy88@alien.top

      Same here, I’m also looking at Arc, but I’ll wait for their new gen. Hopefully they’ll keep a similar pricing strategy.

    • RPSamCool44@alien.top

      You can generally get a 6700 XT on the cheap on the used market, depending on where you live.

      I retired my RX 590 for a 6700 XT that cost me $300 CAD.