• anamethatisnt@lemmy.world · 1 day ago

    I can’t wait for the release in 2044! I hope 1440p is still all the rage when it launches!

    Serious note: I hope Intel stays in the dGPU market; we could use another player in the space.

    • inclementimmigrant@lemmy.world (OP) · 1 day ago

      Yeah, very glad that Intel has stayed in the market.

      It’s very refreshing to see a company release a reasonable budget GPU (though that term has been skewed so much over the years) that doesn’t completely suck and actually tries to run current-gen games.

      • adarza@lemmy.ca · 1 day ago

        very glad that Intel has stayed in the market

        For now, anyway. Nobody knows if the discrete GPU division will survive the leadership shakeup and the new CEO (when they find one).

        • fuckwit_mcbumcrumble@lemmy.dbzer0.com · 1 day ago

          It would be incredibly stupid for Intel to abandon the dGPU market after spending all this money on it. As long as Battlemage turns out alright (basically its only goal), I doubt it will go away.

          They cut the die size nearly in half so they’re no longer blowing a fuck ton of money on a $200 GPU. As long as utilization of the silicon goes up it should be fine.

          • mnemonicmonkeys@sh.itjust.works · 19 hours ago

            Apparently Intel is replacing Gelsinger because his plan to turn the company’s fortunes around is taking too long. My guess is the new CEO will sell off major parts of the company, and I doubt the dGPU division will be kept.

          • DarkThoughts@fedia.io · 1 day ago

            Yeah, I think giving up after just two generations would be a weird move. It’s not an easy market to enter and Intel knew that beforehand.

    • zib@lemmy.world · 1 day ago

      I haven’t been a huge fan of Intel’s CPUs for some time now, but I agree, there needs to be more GPU competition out there. I’ve been wanting to try out an Arc for a while; I’m just hoping the dGPU drivers are better than what they run for their integrated chips.

    • iAmTheTot@sh.itjust.works · 1 day ago

      This is a strange comment when the article is about the launch on December 12th. Maybe the joke went over my head?

  • secret300@lemmy.sdf.org · 18 hours ago

    I just bought an A750, and honestly I’m loving it. So far I’ve only had issues with the shadows in Skyrim, and in Vermintide 2.

  • doggle@lemmy.dbzer0.com · 19 hours ago

    Sick. I got an A770 LE when they launched. Buggy AF, but not bad performance when it decided to work. It currently lives as a dedicated AV1 encoder in a Plex server.
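
    For anyone curious what that looks like in practice, a hand-run hardware AV1 encode on an Arc card through FFmpeg’s VA-API path is roughly the line below (the render node, bitrate, and file names are placeholders, it assumes a recent FFmpeg build with the av1_vaapi encoder, and Plex runs its own transcoder internally, so treat this as a sketch rather than what Plex actually invokes):

      # decode and encode on the GPU; adjust the device node and bitrate to taste
      ffmpeg -hwaccel vaapi -hwaccel_output_format vaapi \
             -hwaccel_device /dev/dri/renderD128 \
             -i input.mkv -c:v av1_vaapi -b:v 6M -c:a copy output_av1.mkv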

  • Sanctus@lemmy.world · 1 day ago

    It’s official, the Intel Arc B580 GPU is launching on December 12, 2044

    So excited. Can’t wait.

    • golli@lemm.ee · 21 hours ago

      Isn’t this the same architecture that is also in their iGPUs? That should help keep them motivated to improve drivers even if they lose interest in dGPUs.

    • doggle@lemmy.dbzer0.com · 19 hours ago

      Been a while, but I played around with the A770 on Arch for a few months. It didn’t play nice with Proton, and even native games were hit and miss. Better support from Intel than Nvidia gives, but it’s a new platform, and Linux development was definitely taking a back seat to the Windows drivers, which were also a buggy mess.

      And basically nobody had the cards so if something didn’t work your options were to give up or become a computer graphics programming wizard and fix it all yourself from scratch.

      To answer the question: not really, no. The drivers themselves may have been fine, but who knows how any given software will handle a brand new GPU architecture.

    • kippinitreal@lemmy.world · 1 day ago

      As an aside: most companies working in embedded technologies do most of their work in Linux, or at least lean heavily on it, so why are Linux drivers so difficult to come by? Lack of customers seems unlikely, since they mostly have everything ready, right? Or is it cost cutting to avoid lengthy QA on another platform? That would be easy to sidestep by shipping a no-warranty driver version.

  • Zarxrax@lemmy.world · 1 day ago

    Slightly better performance than a 4060 and $50 cheaper. But the 4060 is about to be replaced with a newer model at this point, so is it actually a good deal? Questionable.

      • Sconrad122@lemmy.world · 21 hours ago

        And come out in 2026, after Nvidia finally gives up trying to convince everyone that their $600 5070 is the low end of the GPU market.

    • Blue_Morpho@lemmy.world · 1 day ago

      Nvidia says the $2000 5090 comes out in January. They haven’t even hinted at an announcement for the 5060 so it will be a very long time before it comes out.

    • iAmTheTot@sh.itjust.works · 1 day ago

      Might be a year before Nvidia’s next 60 class card is out. They usually release from highest spec to lowest.

    • inclementimmigrant@lemmy.world (OP) · 1 day ago

      Given that Nvidia is said to be upping the prices of their 50XX series, and that the current 4060 and 7600 cards are overpriced and only offer 8 GB of VRAM, which honestly is insufficient for modern games now, yeah, I do think this will offer decent value to budget gamers.

      • SupraMario@lemmy.world · 20 hours ago

        The 20xx series was expensive, so I skipped the 3xxx/4xxx generations and went back to AMD. Even though I got my 7900 XTX on sale, it was still insanely expensive for a GPU… where have the $500 flagship GPUs gone?

    • themoonisacheese@sh.itjust.works · 1 day ago

      Number for number, sure, if it’s actually available at that price.

      The problem is that Intel’s drivers sucked in the past, so they definitely have to prove themselves with this launch. I definitely wouldn’t be buying it release day if I needed a GPU.

      • DarkThoughts@fedia.io · 1 day ago

        I just can’t imagine the extra VRAM making such a difference in performance that it’s enough to play at 1440p, let alone on ultra. I have a 6650 XT, which is slightly slower than the targeted 4060 / 7600, and that thing struggles even at 1080p.

        • The Hobbyist@lemmy.zip · 1 day ago

          Check the video. It clearly shows how performance drops significantly the moment you run out of VRAM. That doesn’t mean performance will be perfect at 1440p; it means Intel is using 1440p as a competition ground, something the 8 GB cards fail at. Maybe Intel’s GPU isn’t great, but the 12 GB will probably make a difference (and Intel is perhaps staying quiet about 1080p because they’re likely to perform worse there).

  • dinckel@lemmy.world · 1 day ago

    Hopefully it will bring some decent generational improvements. The only thing I’m not a huge fan of is the 45% price increase over last gen, and that isn’t even putting used or discounted cards into consideration.

  • Jo Miran@lemmy.ml · 1 day ago

    I game at 1440p on a 1080ti. So what this tells me is that I don’t need to upgrade. Cool.

      • Jo Miran@lemmy.ml · 1 day ago

        I mostly play BG3 now but I was hard into Destiny 2. As long as I capped my FPS to match my monitor (so 120), I could crank it up to pretty much max. BG3 and Last Epoch I max out (still fps capped). Cyberpunk 2077 I didn’t bother with and play it on GeForce Now. Most other games I play are AA or indie and the 1080ti at 1440p handles them easily.

        Space Marine II is another that’s going on GeForce Now just because I want it on Ultra everything. So literally 95%+ of my library runs maxed at 1440p/120 on a 1080ti.