• kyralfie@alien.topB

      Nope, again just priced badly. All of the Ada Lovelace chips are efficient and performant for what they are.

        • RephRayne@alien.topB

           Thanks to the crypto/AI price gouging, nVidia have been allowed to up-tier their GPUs. If the 4060 had been released as the 4050 it probably should’ve been, then the 4060 Ti could’ve lost the “Ti” part and they could’ve sold the 16GB version as the plain Ti, with the corresponding price corrections.

        • kyralfie@alien.topB

           The 4060 Ti is much more efficient, plus there’s frame generation. The difference in performance is not material, really. I know imagining is hard for some people, but please do try. If you can’t judge chips on merit without marketing names and prices, just imagine it cost $100 less and was called the 4060. Wouldn’t it be impressive then? So much more performant, so much more efficient.

           So, again, a great product but bad pricing.

          • TheEternalGazed@alien.topB

             The 4060 Ti has a 128-bit bus, which means it will age poorly in the long term. The 4060 was designed with planned obsolescence in mind.

             The 3060 Ti will last you longer, as future games will demand more memory bandwidth.
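             For a rough sense of what that bus difference means on paper, here’s a back-of-the-envelope sketch. The formula is the standard bus-width × per-pin data rate calculation; the specs are the commonly listed reference figures, so treat the exact numbers as assumptions:

             ```python
             # Peak memory bandwidth (GB/s) = bus width (bits) / 8 * per-pin data rate (Gbps).
             # Specs are the commonly listed reference figures (assumed, not measured here).
             cards = {
                 "RTX 3060 Ti": {"bus_bits": 256, "data_rate_gbps": 14},  # GDDR6
                 "RTX 4060 Ti": {"bus_bits": 128, "data_rate_gbps": 18},  # GDDR6
             }

             for name, spec in cards.items():
                 bandwidth = spec["bus_bits"] / 8 * spec["data_rate_gbps"]
                 print(f"{name}: {spec['bus_bits']}-bit bus -> {bandwidth:.0f} GB/s")

             # RTX 3060 Ti: 256-bit bus -> 448 GB/s
             # RTX 4060 Ti: 128-bit bus -> 288 GB/s
             ```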

            • capn_hector@alien.topB

               That’s what I said: the memory bandwidth is already baked into the numbers you see. The cache increases mean you don’t need as much raw memory bandwidth; it’s the same thing AMD did with RDNA2 (toy numbers sketched below).

               AMD reduced the memory bus by 25% on the 6700 XT and by 33% on the 6600 XT relative to their predecessors, so if you think that will cause those cards to age more poorly…
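               To put toy numbers on the cache argument: if a big last-level cache catches a chunk of the memory traffic, only the misses go out to DRAM, so the bandwidth the shaders effectively see is roughly DRAM bandwidth divided by the miss rate. The hit rates below are made-up illustrative values, not measurements:

               ```python
               # Toy model: effective bandwidth is roughly DRAM bandwidth / L2 miss rate.
               # Hit rates are illustrative assumptions, not measured values.
               def effective_bandwidth(dram_gb_s: float, l2_hit_rate: float) -> float:
                   return dram_gb_s / (1.0 - l2_hit_rate)

               # 3060 Ti-style: 448 GB/s DRAM, small 4 MB L2 -> assume a modest hit rate
               print(f"narrow cache: {effective_bandwidth(448, 0.20):.0f} GB/s effective")
               # 4060 Ti-style: 288 GB/s DRAM, big 32 MB L2 -> assume a much higher hit rate
               print(f"big cache:    {effective_bandwidth(288, 0.55):.0f} GB/s effective")
               ```

               Under those assumed hit rates the narrower card still comes out ahead; the usual caveat is that hit rates drop at higher resolutions, which is where narrow-bus cards give some of that back.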

              • TheEternalGazed@alien.topB

                RDNA 2 is dogshit as well, lmao. I’m not defending them either.

                 The cache increase didn’t do shit, since the previous-generation Ampere cards perform better at 4K than Ada Lovelace.