• taiidan@slrpnk.net · 1 day ago

    I hate this hand-holding. Certainly use venvs for dev projects, but allow system-wide installations for those who want it. OSS has always been about giving you enough rope to hang yourself.

    • logging_strict@programming.dev · 1 day ago

      then they come after our guns, but spoons are always magically safe

      To all the fat slob, system-wide-installation cock-blocking PR submitters, I say:

      Ban spoons!

      Shooting ourselves in the foot is a God-given right! /nosarc

    • Eager Eagle@lemmy.world · 1 day ago

      Which you can still do. That said, the “correct” and less problematic way of installing packages should be easier than the alternative.

    • wewbull@feddit.uk · 1 day ago

      What really annoys me is that they purposely broke per-user and local installation. Fine, system-wide installation isn’t a good idea when it’s already managed by another package manager, but user installation is my domain.

      The reason they did this is because a package installed by the user can be active when a system tool is called and break the system tool. The distro developers went “Oh, we should force all user code into venvs so that our code is safe”.

      Completely and utterly backwards. The protected code needs to be inside the defensive wall. The user should be allowed to do anything, in the knowledge that they can’t inadvertently change the OS. When a system tool is called it should only have system libraries on its Python path.

      • Eager Eagle@lemmy.world · 1 day ago

        You still have the option to choose not to use a venv and risk breaking your user space.

        The changes make it harder to do by accident by encouraging use of a venv. Part of the problem is that pip install --user is not exactly in user space and may in fact break system packages, and as you wrote, the user shouldn’t be able to inadvertently change the OS.

        • taiidan@slrpnk.net · 18 hours ago

          Makes more sense and I agree, especially with the apparent ease of pip install --user. But there should be no barriers when the root user is used with pip install --system.

        • wewbull@feddit.uk · 1 day ago

          So the problem here is that you can inject code into a system Python process, because it runs with the user’s Python install location on its path.

          They’ve fixed the wrong “root cause”.
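
          A minimal sketch of why that injection is possible (standard CPython behaviour, illustrative only): the per-user site-packages directory that pip install --user writes to is added to sys.path of any interpreter the user runs, including distro-installed tools, when that directory exists and the tools don’t opt out with python -s or -I.

          ```python
          # Inspect how the per-user site-packages dir (where `pip install --user`
          # puts packages) relates to this interpreter's import path.
          import site
          import sys

          print("user site-packages:", site.getusersitepackages())
          print("user site enabled: ", site.ENABLE_USER_SITE)
          print("on sys.path:       ", site.getusersitepackages() in sys.path)
          ```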

  • gigachad@sh.itjust.works · 2 days ago

    “Soon, you won’t have a choice because major distros are adopting PEP 668. This will make pip install fail in the default system Python and show an error telling you to use a virtual environment.”

    Well, if this is true then why bother convincing people ;)
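
    For context, the mechanism behind PEP 668 is just a marker file: the distro drops a file named EXTERNALLY-MANAGED into the interpreter’s stdlib directory, and installers like pip refuse to install into that environment unless you are inside a virtual environment (or explicitly override with pip’s --break-system-packages). A minimal sketch of the check, not pip’s actual code:

    ```python
    # Detect a PEP 668 "externally managed" environment, roughly as installers do.
    import sys
    import sysconfig
    from pathlib import Path

    marker = Path(sysconfig.get_path("stdlib")) / "EXTERNALLY-MANAGED"
    in_venv = sys.prefix != sys.base_prefix  # a venv changes sys.prefix

    if marker.exists() and not in_venv:
        print("externally managed environment: use a virtual environment instead")
    ```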

    • Lucy :3@feddit.org · 2 days ago

      So … if I want to use a Python module like, for example, mcstatus in a live shell for convenience, I first need to create a venv, activate it, install the package, and then use it? And then either have dozens of venvs somewhere or remake them every time?

        • Lucy :3@feddit.org · 2 days ago

          Yes, but it has to be somewhere. I don’t want dozens of venv dirs in my homedir.

          • cd_slash_rmrf@programming.dev · 1 day ago

            Just to add to the other answers: no need to have them in your home dir (that sounds like it would suck). Use a tool like uv tool or pipx, or just manually create any venv you need under a path you choose, say $HOME/.cache/venvs/
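
            A minimal sketch of the manual variant (directory and package name are just examples; pipx and uv tool automate the same idea):

            ```python
            # Keep one venv per tool under ~/.cache/venvs and install the tool into it.
            import subprocess
            import venv
            from pathlib import Path

            VENV_ROOT = Path.home() / ".cache" / "venvs"  # any path you choose

            def tool_venv(package: str) -> Path:
                env_dir = VENV_ROOT / package
                if not env_dir.exists():
                    venv.create(env_dir, with_pip=True)
                    subprocess.run(
                        [str(env_dir / "bin" / "pip"), "install", package],
                        check=True,
                    )
                return env_dir

            # e.g. tool_venv("mcstatus"), then run ~/.cache/venvs/mcstatus/bin/mcstatus
            # (on Windows the venv's scripts live under Scripts\ rather than bin/)
            ```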

          • gigachad@sh.itjust.works · 2 days ago

              This article is about Python venvs with Docker. It should go without saying that I wouldn’t want to pollute the base installation on my local machine.

              But you can also just create one venv and install everything in there; no need to create dozens of venvs if you don’t want them.

            • logging_strict@programming.dev · 1 day ago

              The “one venv to rule them all” approach is not a viable solution.

              Some packages cause problems, one tactic is to isolate the problem package into a separate venv.

              For example:

              .venv/ – main venv for the dev setup

              .doc/.venv – Sphinx docs (now minimum py310)

              .wth – e.g. for the package restview, which has out-of-date dependencies

              Each venv has its respective requirements files. Some are associated with a dependency or optional dependency; the ones that aren’t are pin files.

              Let’s say there are easily a total of 20 requirements and constraints (pin) files.

              This mess sits in a freakin’ nasty multi-level hierarchy.

              Now imagine you are the author/maintainer of 10 packages. Can you be confident that one package’s requirements won’t conflict with another package’s requirements?

              Almost forgot, these packages are no longer maintained:

              pip-tools

              pip-requirements-parser

              … scary
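
              A rough sketch of driving such a layout; all paths and file names here are hypothetical, just to make the idea concrete. Each venv gets its own requirements file, installed against a shared constraints (pin) file via pip’s -c option:

              ```python
              # Hypothetical per-venv install step for a layout like the one above.
              import subprocess
              import venv
              from pathlib import Path

              ENVS = {
                  ".venv": "requirements/dev.txt",        # main dev venv
                  ".doc/.venv": "requirements/docs.txt",  # Sphinx docs venv
                  ".wth": "requirements/restview.txt",    # isolates the problem package
              }
              PINS = "requirements/pins.txt"              # shared constraints file

              for env_dir, req in ENVS.items():
                  path = Path(env_dir)
                  if not path.exists():
                      venv.create(path, with_pip=True)
                  subprocess.run(
                      [str(path / "bin" / "pip"), "install", "-r", req, "-c", PINS],
                      check=True,
                  )
              ```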

                • FizzyOrange@programming.dev · 1 day ago

                  Well, if you want to have Pip-installed tools available generally (e.g. until distros started screwing it up, pip was the best way to install CMake), the suggestion was to have a venv for the user that would be activated in your .bashrc or whatever.

                  I think that would work, but then what happens if you want to use a project-level venv, which is really what they’re designed for? If you create and activate a venv when you already have one activated, does it all work sensibly? My guess would be that it doesn’t.

  • OmegaLemmy@discuss.online · 2 days ago

    You already are forced to use a venv, but I fucking hate pip, and some projects don’t work in a venv. I don’t know why, they just don’t, and it sucks.

    • marcos@lemmy.world · 2 days ago

      That’s the thing, if everybody is forced to use a venv, those projects will either fix their shit or lose all of their userbase.

      • logging_strict@programming.dev · 1 day ago

        So these package maintainers are harboring magical charms and voodoo dolls which we lowly plebs just don’t know about?

        If these guys are so awesome, shouldn’t we be plying them with coke and blow and keeping ’em working on resolving our dependency resolution issues?

        They do have the secret sauce and are just holding it back from the rest of us.

          • logging_strict@programming.dev · 14 hours ago

            Would the coke and blow happen for that guy?

            Makes sense if I’m that guy.

            This question is about Python package funding. If world+dog no longer stresses over pip dependency resolution, isn’t that extremely valuable? So how do we go about getting that package permanently funded? No BS dangling a tiny carrot and insisting on strict justice (reporting milestones …). Then funding only happens for large projects.

            The question of package funding is very legitimate. I have a list of packages that are no longer maintained because funding never happened.

            I can subsist on crocodile tears. It’s a guilty pleasure.

            Meaning, if package funding never ever happens, and all that ever happens is never-ending articles/threads of Python devs whining about dependency resolution, I’m going to feed that.

            Personally, I’m not suffering from dependency resolution stress. Everyone else is.

            If the available solutions were sufficient, there would be no more articles with comment sections filled with war stories, tales of carnage, and loss.

            … always comes down to that one guy.

            Solve the Python author/maintainer funding issue!

            Then and only then will I market the package specifically targeted towards resolving pip dependency resolution issues for package (and app) maintainers.

  • Eager Eagle@lemmy.world · 2 days ago

    System-wide installation as it was implemented should stay in the past. I like the approach of pixi (a Conda alternative) here, where each system dependency lives in its own virtual bubble, so recreating and porting this software is a breeze.

    But if all you use can stay in a venv, just use one.

  • Oskar@mastodon.nu · 2 days ago

    @norambna Good points! 👍🏻 Especially since conflict resolution in pip sucks and it’ll happily install incompatible packages.

    • logging_strict@programming.dev · 1 day ago

      pip is great! It lets ya know when there are dependency conflicts.

      Up to us to learn how to deal with resolving dependency conflicts.

      There are those who attend the whining parade down main street.

      There are the very very few who write a package to resolve these issues.
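
      As a small aside, pip can also report such conflicts on demand; a minimal sketch, assuming pip is available in the environment being checked:

      ```python
      # `pip check` reports installed packages with broken or conflicting
      # dependencies and exits non-zero if it finds any.
      import subprocess
      import sys

      result = subprocess.run([sys.executable, "-m", "pip", "check"])
      print("conflicts found" if result.returncode else "no conflicts reported")
      ```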

  • subterfuge@lemmy.world · 2 days ago

    How will this affect command-line tools like azure-cli that get installed with pip in a container image? Will we be forced to append the venv to $PATH?

    • logging_strict@programming.dev · 1 day ago

      We need AA meetings

      Hello!

      My name is Billy Joe Jim Bob

      Hello Billy!

      I haven’t had a dependency conflict for the past 3 hours. The sleeping problems haven’t gone away. As I feel my eyelids droop, I keep thinking about each of my packages and imagining where the next unresolvable dependency conflict will emerge.

      Then i wake up covered in sweat.

      I can’t keep going on like this. Thank you for listening.

      Thank you for sharing Billy!