• Lucy :3@feddit.org · 3 days ago

    Yes, but it has to be somewhere. I don’t want dozens of venv dirs in my homedir.

    • cd_slash_rmrf@programming.dev · 2 days ago

      just to add to the other answers - no need to have them in your home dir (that sounds like it would suck). use a tool like uv tool or pipx, or just manually create any venv you need under a path you choose, say $HOME/.cache/venvs/
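
      for illustration, a rough sketch of both options (the venvs/ path extends the example above; ruff, requests, and scratch are just placeholders):

          # standalone CLI tools: uv/pipx manage an isolated venv per tool for you
          uv tool install ruff          # or: pipx install ruff

          # or a manually managed venv under a path you choose
          python3 -m venv "$HOME/.cache/venvs/scratch"
          "$HOME/.cache/venvs/scratch/bin/pip" install requests
          source "$HOME/.cache/venvs/scratch/bin/activate"   # activate when needed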

    • gigachad@sh.itjust.works · 3 days ago (edited)

      This article is about Python venvs using Docker. It should be clear that I wouldn’t want to pollute the base installation on my local machine.

      But you can just create one venv and install everything in there - no need for dozens of venvs, if that’s your concern.
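
      e.g. the single catch-all venv version (path and package names purely illustrative):

          python3 -m venv ~/.venvs/main          # one venv for everything
          ~/.venvs/main/bin/pip install requests rich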

      • logging_strict@programming.dev · 2 days ago (edited)

        The one venv to rule them all is not a viable solution.

        Some packages cause problems; one tactic is to isolate the problem package into a separate venv.

        For example:

        .venv/ – main venv for the dev setup

        .doc/.venv – Sphinx docs (now minimum py310)

        .wth – for e.g. restview, a package with out-of-date dependencies

        Each venv has its respective requirements files. Some are associated with a dependency or optional dependency; the ones that aren’t are pin files.
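
        As a sketch of that isolation tactic (the requirements file names are made up; the venv paths follow the layout above):

            # quarantine restview and its stale dependencies in their own venv
            python3 -m venv .wth
            .wth/bin/pip install restview

            # the main venv installs from its own requirements, held to the pins
            python3 -m venv .venv
            .venv/bin/pip install -r requirements/dev.txt -c requirements/pins.txt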

        Let’s say there are easily 20 requirements and constraints (pin) files in total.

        This mess sits in a freakin’ nasty multi-level hierarchy.

        Now imagine you are the author and maintainer of 10 packages. Can you be confident that one package’s requirements won’t conflict with another’s?

        Almost forgot: these packages are no longer maintained:

        pip-tools

        pip-requirements-parser

        … scary

          • FizzyOrange@programming.dev · 2 days ago

            Well, if you want to have pip-installed tools available generally (e.g. until distros started screwing it up, pip was the best way to install CMake), the suggestion used to be to have a per-user venv that gets activated in your .bashrc or whatever.

            I think that would work, but then what happens if you want to use a project-level venv, which is really what they’re designed for? If you create and activate a venv while you already have one activated, does it all work sensibly? My guess would be that it doesn’t.
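
            One way to actually test it (a sketch; the venv names are made up):

                python3 -m venv ~/venvs/user && source ~/venvs/user/bin/activate
                python3 -m venv .venv && source .venv/bin/activate
                which python        # which interpreter wins now?
                echo "$VIRTUAL_ENV"
                deactivate          # back in the user venv, or in no venv at all?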

            • Benjaben@lemmy.world · 1 day ago

              Oh! Hmm. That’s a good question and I really don’t know. So in other words (this is just how I’m organizing the thoughts in my own head, probably with some misunderstandings, so feel free to correct any you notice): your “system Python” is really an activated venv specified somewhere in your user config, and the question is, when you deliberately activate a distinct project venv on top of it, which Python executable and which collection of installed libraries get invoked while it’s active?

              On the one hand, I’ve never considered that, and it’s probably a mistake to make too many assumptions about how Python (and its tooling, pip included) interacts with the OS, because I know fuck all about that when I really think about it lol. On the other hand, one of the things I find pleasant about Python is that usually people much more informed and thoughtful than me have already weighed the several ways of dealing with whatever situation I’m thinking about, and have decided on a sensible default. But yep, idk. I originally just thought you misunderstood the idea of a venv lol - to my happy surprise, nope!
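
              fwiw, the interpreter can answer part of that itself - a quick stdlib-only check (assumes nothing beyond whatever venv is currently active):

                  # path of the interpreter that actually runs
                  python -c 'import sys; print(sys.executable)'
                  # True when running inside a venv (sys.prefix points at the venv,
                  # sys.base_prefix at the underlying installation)
                  python -c 'import sys; print(sys.prefix != sys.base_prefix)'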