512GB of unified memory is insane. The price will be outrageous, but for AI enthusiasts it will probably be worth it.

  • rdri@lemmy.world · 19 hours ago

    can be configured up to 512GB, or over half a terabyte.

    Are you ok mate?

      • rottingleaf@lemmy.world · 13 hours ago

        That’s a retcon covering for hardware producers who exploited confusion between measurement units to advertise less as more.

        It’s nice to have consistent unit naming, but when the industry had already existed long enough with the old units, it looks like intentional harm for profit.

        • BorgDrone@lemmy.one · 3 hours ago

          How is it a retcon? Giga- as a prefix for 10⁹ has been part of the metric system since 1960. I don’t think anyone in the fledgling computer industry was talking about giga- or mega- anything at that time. Mega- as a prefix for 10⁶ has been in use since 1873, over 60 years before Claude Shannon even came up with the concept of a digital computer.

          If anything, the use of mega- and giga- to mean powers of 1024 is a retcon over previous usage.

        • sugar_in_your_tea@sh.itjust.works · 10 hours ago

          That’s not a retcon. Manufacturers were super inconsistent with using it, so we standardized the terminology. For example, floppy disks were advertised as 1.44MB, but have an actual capacity of 1440 KiB, which is 1.47 MB or 1.41 MiB.
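          A quick sketch of that arithmetic (plain Python, no assumptions beyond the prefix definitions):

```python
# The "1.44 MB" floppy label mixes prefixes: the real capacity is
# 1440 KiB (binary kilobytes), which is neither 1.44 MB nor 1.44 MiB.
bytes_total = 1440 * 1024           # 1,474,560 bytes

mb = bytes_total / 1000**2          # decimal megabytes
mib = bytes_total / 1024**2         # binary mebibytes

print(f"{mb:.2f} MB")               # 1.47 MB
print(f"{mib:.2f} MiB")             # 1.41 MiB
```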

          The standardization goes back to 1999, when the IEC officially adopted and published the standard.

          There was a federal lawsuit on the matter in California in 2020 that agreed with the IEC terminology.

          All of this was taken from this Wikipedia article if you’d like to read more. Since we have common usage, standards going back almost 30 years, and a federal US lawsuit all confirming the terminology difference between binary and decimal units, it really doesn’t seem like a retcon.

              • scratchee@feddit.uk · 7 hours ago

                Agreed, but do you pick the de facto standard of the entire industry (minus storage advertising), or the de jure standard of an outside body that has made very slight headway into a very resistant industry?

                The reality is that people will be confused no matter what you do, but at least fewer people will be confused if you ignore the mebibyte, because fewer people have even heard of it.

                • sugar_in_your_tea@sh.itjust.works · edited · 6 hours ago

                  You pick neither, and enforce correct usage of both in advertised products. Tech people will adapt, and non-tech people will be confused regardless (they still confuse megabytes/sec and megabits/sec, and that’s an 8x difference).
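                  To illustrate that bits-vs-bytes confusion, a minimal sketch (the "100 Mbps" figure is just an example):

```python
# An ISP advertising "100 Mbps" means 100 megabits per second.
# Dividing by 8 bits per byte gives the transfer rate users
# actually see in megabytes per second.
advertised_mbps = 100                    # megabits/sec, as advertised
megabytes_per_sec = advertised_mbps / 8  # megabytes/sec
print(megabytes_per_sec)                 # 12.5
```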

                  • scratchee@feddit.uk · 6 hours ago

                    Agreed, I’d be entirely fine with legal enforcement of the ISO definitions in advertising; no need to air historical dirty laundry outside the profession.