• HiddenLayer555@lemmy.ml · 2 days ago

    Also, games went from writing the most cleverly optimized code you’ve ever seen, squeezing every last drop of compute out of a 6502 CPU while fitting on a ROM cartridge, to not giving a single shit about any sort of efficiency, blowing up the install size with unused and duplicated assets, and literally making fun of anyone without the latest highest-end computer for being poor.

    • zalgotext@sh.itjust.works · 5 hours ago

      I know it’s a one-of-a-kind game, but it still amazes me that RollerCoaster Tycoon was released in 1999: a game where you could have hundreds of NPCs on screen at a time, with unique events and sound effects for each of those NPCs, physics simulations of roller coasters and rides, and terrain manipulation, all runnable on pretty basic hardware at the time. Today’s AAA games could never. I’m glad some indie games are still carrying the torch for small, efficient games that people can play on any hardware, though.

    • Sauerkraut@discuss.tchncs.de · 1 day ago

      Ah, back when game development was managed by game developers who were gamers themselves and prioritized quality over min-maxing shareholder profits…

      Or, another way to look at it: it was the market-takeover phase of capitalism, where capitalists are willing to operate at a loss to corner the market and create their own monopolies (see Nintendo, Google, Facebook, Amazon, etc.). But once market growth stalls out, they switch to the milking phase, or enshittification phase, of capitalism, where they prioritize profits over everything else.

      • CanadaPlus@lemmy.sdf.org · 8 hours ago

        IIRC software development, including games, was a pretty gritty industry last century too.

        It’s more a matter of having the luxury of space for bloat. (Most of the anti-user features are new, though.)

  • SuperSpruce@lemmy.zip · 2 days ago

    2024: What are you doing with 16GB RAM and 300% CPU at 5.4GHz?

    • Running some random process introduced with Windows 11 that adds literally nothing to the user’s experience other than heat and fan noise.
  • andrew_bidlaw@sh.itjust.works · 2 days ago

    Completely unconnected to OP, but oh fuck do I hate that Microsoft Excel couldn’t open two documents side by side before like 2017. They all opened in one instance of the app unless you launched another as an admin, and it even screamed at you that it couldn’t open files with the same name. W?T?F?

  • r00ty@kbin.life · 2 days ago

    It’s a different world now though. I could go into detail of the differences, but suffice to say you cannot compare them.

    Having said that, Windows lately seems to just be slow on very modern systems for no reason I can ascertain.

    I swapped back to Linux as my primary OS a few weeks ago and it’s just so snappy in terms of UI responsiveness. It’s not better in every way. But for sure I never sit waiting for Windows to decide to show me the context menu for an item in Explorer.

    Anyway in short, the main reason for the difference with old and new computer systems is the necessary abstraction.

    • ☆ Yσɠƚԋσʂ ☆@lemmy.ml (OP) · 2 days ago

      That’s complete nonsense I’m afraid. While abstractions are necessary, the bloat of modern software absolutely isn’t. A lot of the bloat isn’t fundamental, but a result of things growing through accretion, and people papering over legacy designs instead of starting fresh.

      The selection pressures of the industry do not favor efficiency. Software developers are able to write inefficient software and rely on hardware getting faster. Meanwhile, hardware manufacturers benefit from bloated software because it creates demand for new hardware.

      Phones are a perfect example of this in action. Most of the essential apps on the phone haven’t changed in any significant way in over a decade. Yet, they continue getting less and less performant without any visible benefit for the user. Imagine if instead, hardware stayed the same and people focused on optimizing software to be more efficient over the past decade.

      • r00ty@kbin.life · 2 days ago

        Except it’s not nonsense. I’ve worked in development through both eras. You need to develop in an abstracted way because there are so many variations on hardware to deal with.

        There is bloat, for sure. A lot of it is because it’s usually much better to use an existing library than reinvent the wheel, and the library needs to cover many more use cases than your own. I encountered this myself: I used a web library to work with releases on Forgejo and had it working generally, but then saw there was a dedicated library for it. The boilerplate to make that library work was more than the code I’d written to just make the web requests.
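        To make the comparison concrete, here’s a minimal sketch of the direct-request approach, in Python. The instance URL and repo names are made up, and a canned JSON response stands in for the actual HTTP call so the sketch stays self-contained; the endpoint shape follows the Gitea-compatible REST API that Forgejo exposes.

```python
import json

# Hypothetical sketch of "just make the web requests" against a
# Forgejo-style releases API. Host, owner, and repo names are made up.
API_ROOT = "https://forgejo.example.com/api/v1"

def releases_url(owner: str, repo: str) -> str:
    # Build the endpoint URL for listing a repository's releases.
    return f"{API_ROOT}/repos/{owner}/{repo}/releases"

def latest_tag(payload: str):
    # The endpoint returns a JSON array of releases, newest first,
    # each carrying a "tag_name" field among others.
    releases = json.loads(payload)
    return releases[0]["tag_name"] if releases else None

# A canned response stands in for an actual GET (via urllib or requests),
# keeping the sketch self-contained and offline.
sample = '[{"tag_name": "v1.2.0"}, {"tag_name": "v1.1.0"}]'
print(releases_url("alice", "widgets"))
print(latest_tag(sample))
```

        In real use you’d GET that URL with urllib or requests and feed the response body to the parser; the point is just how little code the direct route takes.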

        But that’s mostly size. The bloat in terms of speed is mostly in the operating system and hardware abstraction, I think. Not libraries, by and large.

        I’m also going to say that legacy systems being papered over doesn’t always make things slower. Where I work, I’ve worked on our legacy system for decades, and on the current product for probably the past 5-10 years. We still sell both. The legacy system is not the slower system.

          • r00ty@kbin.life · 2 days ago

            It does. It definitely does.

            If I write software for fixed hardware, with my own operating system designed for that fixed hardware, and you write software for a generic operating system that can work with many hardware configurations, mine runs faster every time. Every single time. That doesn’t make either better.

            This is my whole point. You cannot compare the Apollo software with a program written for a modern system. You just cannot.

            • boletus@sh.itjust.works · 2 days ago

              I’m not disagreeing that it’s different. It’s a more fair comparison to compare it to embedded software development, where you are writing low level code for a specific piece of hardware.

              I’m just saying that abstraction in general is not an excuse for the current state of computer software. Computers are so insanely fast nowadays that it makes no sense for Windows File Explorer and other such software to still be so sluggish.

  • Delzur@vegantheoryclub.org · 2 days ago

    With 16GB of RAM and 102% CPU, the computer shows you a UI on any underlying hardware and any monitor/TV/whatever; handles a mouse, keyboard, and sound; handles any hardware interrupt; probably fetches and sends stuff to the internet; and scans your disk to index files so you can search almost instantly through gigabytes of storage, whether it’s on USB sticks, SSDs, hard drives, or NVMe drives. And probably a lot of other stuff I’m forgetting. Meanwhile, the other thingy with 4KB of RAM did college math problems. Impressive for the time, yes, but that’s it.

    Yes, nowadays there is a lot of inefficiency, but that comparison does not, and never did, make sense.

    • SuperSpruce@lemmy.zip · 8 hours ago

      We had most of this with Windows 7 and probably XP as well. Those used a fraction of the RAM, disk space, and CPU time for largely the same effect as today.