acheong08 4 hours ago

Just a few months back I said I would never use uv. I was already used to venv and pip. No need for another tool I thought.

I now use uv for everything Python. The reason for the switch was a shared server where I did not have root and there were all sorts of broken packages/drivers and I needed pytorch. Nothing was working and pip was taking ages. Each user had 10GB of storage allocated and pip's cache was taking up a ton of space & not letting me change the location properly. Switched to uv and everything just worked

If you're still holding out, really just spend 5 minutes trying it out, you won't regret it.

  • tomjakubowski 2 hours ago

    The absolute killer feature for me of uv is that it's still compatible with all of my old venv-based workflows. Just run `uv venv`.

  • tetha 2 hours ago

    For me, the big key was: uv is so much easier to explain and especially use - especially for people who sometimes script something in python and don't do this daily.

    pip + config file + venv requires you to remember ~2 steps to get the right venv - create one and install stuff into it, and for each test run, script execution and such, you need to remember a weird shebang-format, or to activate the venv. And the error messages don't help. I don't think they could help, as this setup is not standardized or blessed. You just have to beat a connection of "Import Errors" to venvs into your brain.

    It's workable, but teaching this to people unfamiliar with it has reminded me how.. squirrely the whole tooling can be, for a better word.

    Now, team members need to remember "uv run", "uv add" and "uv sync". It makes the whole thing so much easier and less intimidating to them.
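
    A minimal sketch of that daily loop, assuming a fresh project (the package name is just an example):

        uv init myproj && cd myproj           # scaffolds pyproject.toml
        uv add requests                       # records the dep, creates .venv and uv.lock
        uv run python -c "import requests"    # runs inside the env, no activation needed
        uv sync                               # on another machine: recreate the env from uv.lock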

  • psychoslave 3 hours ago

    I wonder how it compares with a greener, more generalist tool like "mise", to which I migrated after using "asdf" for some time.

    • wrboyce 3 hours ago

      I use both! uv installed globally with mise, and uv tools can then be managed via “mise use -g pipx:foo”.

      • icedchai 10 minutes ago

        Same! I recently set up some dev environments with mise. My old ones are still using poetry, the new ones have uv. uv is incredibly snappy. It's like night and day!

  • yjftsjthsd-h 3 hours ago

    > Each user had 10GB of storage allocated and pip's cache was taking up a ton of space & not letting me change the location properly. Switched to uv and everything just worked

    Is it better about storage use? (And if so, how? Is it just good at sharing what can be shared?)

    • fsh 3 hours ago

      uv hardlinks identical packages, so adding virtual envs takes up very little space.
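
      A rough way to see this yourself, assuming a hardlink-capable filesystem (files with a link count above 1 are shared with the global cache rather than copied):

          find .venv -type f -links +1 | wc -l   # files in this venv shared with the cache
          uv cache dir                           # where the shared copies live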

      • snerbles 2 hours ago

        Unless you cross mount points, which uv will helpfully warn about.

        • codethief an hour ago

          In those situations, have you had any luck using UV_LINK_MODE=symlink? I eventually had to resort to `copy` mode because it seemed the folder names (hashes) uv created in the package cache were not fully deterministic. So sometimes my cache volume would change and my Docker build would break. :\

    • acheong08 an hour ago

      Both pip and uv cache packages under ~/.cache. uv lets you change that to /tmp and symlink instead of copying.
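
      For example (UV_CACHE_DIR and UV_LINK_MODE are documented uv settings; the path is just illustrative):

          UV_CACHE_DIR=/tmp/uv-cache UV_LINK_MODE=symlink uv pip install requests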

    • kissgyorgy 2 hours ago

      There is a global cache for all installed packages in the user home cache dir.

  • bmitc 8 minutes ago

    What has been holding me back on uv is my experience with Ruff. Ruff claims "Drop-in parity with Flake8, isort, and Black", but that is simply not true. At least for isort, Ruff only re-implemented what they wanted, and then asks you to use Ruff to call out to the old isort tool if a feature or setting is missing from the re-implementation. So what's the point? Ruff partially re-implemented many different existing tools and added some new ones, so using Ruff actually increases the number of tools you have to use, while still not doing everything that Pylint does.

    For moving to uv, I haven't heard a good story for what uv provides over Poetry rather than "is fast". The only unique thing that I am currently aware of is that uv can install Python itself, which gets rid of tools like Pyenv. I'm interested because of that, but "is fast" isn't enough of a reason.

  • oofbey an hour ago

    I love uv. The one gotcha I'll warn people about is: don't touch uvx. I've lost an embarrassing number of hours or days trying to figure out why nothing works properly or makes sense when I tried to run things with uvx. I guess I understand why it's there, but I think it's a built-in foot-gun and not well documented. But if you stay away from it, things work great.

    • jsmeaton 7 minutes ago

      What issues are you having with uvx? It replaces tools like pipx that set up implicit venvs to run specific tools. Works great for me.

    • maleldil 14 minutes ago

      uvx is fine. I use it to run executable packages all the time. What is your problem with it?

  • mistrial9 37 minutes ago

    Similar story recently with an experimental repo that starts with "it's so easy, just `$ uv a b c`". Under the hood it implies a lot of redundancy, but true enough, it worked fine and trouble-free on a standard GNU/Debian/Ubuntu host.

  • _vya7 2 hours ago

    I remember using pip and venv back in like 2009. Last time I checked, maybe 5 or 10 years ago, the recommendation of the community was generally to just use Docker instead of all these tools. Did that not catch on?

    • unclad5968 2 hours ago

      The advice seems to change every year. For a while it was venv, then pipenv, poetry, docker, and now uv. Maybe the ecosystem will settle on that but who knows.

  • ed_elliott_asc 2 hours ago

    I came here to comment that I don’t see any reason to bother - thanks for the comment, I will try it now!

  • espdev 4 hours ago

    > Just a few months back I said I would never use uv. I was already used to venv and pip. No need for another tool I thought

    Really? :)

    requirements.txt is just hell and torture. If you've ever used modern project/dependency management tools like uv, Poetry, PDM, you'll never go back to pip+requirements.txt. It's crazy and a mess.

    uv is super fast and a great tool, but still has roughnesses and bugs.

    • aequitas 3 hours ago

      Pip-tools + requirements.txt helped me survive the past few years. I also never thought I needed uv, but after all the talk about it I gave it a spin and never went back. It’s just so blazing fast and convenient.

    • kortex 2 hours ago

      We use uv to compile requirements.txt from pyproject.toml to get the locked versions.

          # Makefile (note: recipe lines must be indented with a real tab)
          compile-deps:
              uv pip compile pyproject.toml -o requirements.txt

          compile-deps-dev:
              uv pip compile --extra=dev pyproject.toml -o requirements.dev.txt

      • espdev 27 minutes ago

        What for? Support legacy CI/CD pipelines or something like that? uv.lock already contains locked versions of all dependencies plus a lot of other needed metadata.

    • _Algernon_ 3 hours ago

      pip also works with pyproject.toml. Sticking with requirements.txt is a self-imposed constraint.

polivier 5 hours ago

The first time I used `uv`, I was sure that I had made a mistake or typed something wrong because the process finished so much more quickly than anything I had ever experienced with `pip`.

  • tux3 4 hours ago

    I've sometimes had uv take up to 200ms to install packages, so you could feel a slight delay between pressing enter and the next shell prompt

    You don't have that problem with Poetry. You go make a cup of coffee for a couple minutes, and it's usually done when you come back.

    • Numerlor 3 hours ago

      It's funny when the exact same thing was probably said about pipenv and poetry

      • icedchai 9 minutes ago

        I've had poetry sit there for minutes resolving dependencies at a previous company. I thought something was broken... it probably was... but it did eventually complete.

  • baby 5 hours ago

    Same here lol! The experience is so smooth it doesn't feel like python

    • johnfn 3 hours ago

      That makes sense, because it's Rust. :)

  • augustflanagan 4 hours ago

    I just had this same experience last week, and was certain it wasn’t working correctly as well. I’m a convert.

  • nialse 5 hours ago

    Likewise. I was skeptical, then I tried it and won’t go back.

theLiminator 5 hours ago

uv and ruff are a great counterexample to all those people who say "never reinvent the wheel". Don't ever do it just for the sake of doing it, but if you have focused goals you can sometimes produce a product that's an order of magnitude better.

  • CrendKing an hour ago

    I believe most of the time this phrase is said to an inexperienced artisan who has no idea how the current system works, what its shortcomings are, and how to improve upon it. Think of an undergraduate student who tries to solve the Goldbach conjecture. Usually what ends up happening is that they either fail to reinvent the wheel, or reinvent the exact same wheel, which has no value. The phrase certainly does not apply to professionals.

  • eviks 5 hours ago

    They didn't reinvent the wheel, "just" replaced all the wood with more durable materials to make it handle rotation at 10 times the speed

    • socalgal2 4 hours ago

      I'd be curious to know exactly what changed. Python -> Rust won't make network downloads faster nor file I/O faster. My naive guess is that all the speed comes from choosing better algorithms and/or parallelizing things. Not from Python vs Rust (though if it's hard to parallelize in Python and easy in rust that would certainly make a difference)

      • ekidd 4 hours ago

        I've translated code from Ruby to Rust, and other code from Python to Rust.

        Rust's speed advantages typically come from one of a few places:

        1. Fast start-up times, thanks to pre-compiled native binaries.

        2. Large amounts of CPU-level concurrency with many fewer bugs. I'm willing to do ridiculous threading tricks in Rust I wouldn't dare try in C++.

        3. Much lower levels of malloc/free in Rust compared to some high-level languages, especially if you're willing to work a little for it. Calling malloc in a multithreaded system is basically like watching the Millennium Falcon's hyperdrive fail. Also, Rust encourages abusing the stack to a ridiculous degree, which further reduces allocation. It's hard to "invisibly" call malloc in Rust, even compared to a language like C++.

        4. For better or worse, Rust exposes a lot of the machinery behind memory layout and passing references. This means there's a permanent "Rust tax" where you ask yourself "Do I pass this by value or reference? Who owns this, and who just borrows it?" But the payoff for that work is good memory locality.

        So if you put in a modest amount of effort, it's fairly easy to make Rust run surprisingly fast. It's not an absolute guarantee, and there are a couple of traps for the unwary (like accidentally forgetting to buffer I/O, or benchmarking debug binaries).

      • the8472 4 hours ago

        NVMe hungers; keeping it fed is hard work. A serial read, decompress, checksum, write loop will leave it starved (QD<1) whenever you're doing anything but the last step. Disk IO isn't async unless you use io_uring (well, OK, writeback caches can be), so threads are almost a must to keep NVMe busy. Conversely, waiting on blocking IO (e.g. directory enumeration) will leave your CPU starved. Here too the answer is more threads.

      • jerpint 4 hours ago

        From just my observations they basically parallelized the install sequence instead of having it be sequential (among many other optimizations most likely)

      • physicsguy 3 hours ago

        The package resolution is a big part of it; it's effectively a constraint solver. I.e. package A requires package B constrained to 1.0 < X <= 2.x, and package B requires package C between... and so on and so on.

        Conda rewrote their package resolver for similar reasons.

      • jerf 2 hours ago

        It became a bit of a meme, especially in the web development space, that all programs are always waiting on external resources like networks, databases, disks, etc., and so scripting languages being slower than other languages doesn't matter and they'll always be as fast as non-scripting languages.

        Even on a single core, this turns out to be simply false. It isn't that hard to either A: be doing enough actual computation that faster languages are in fact perceptibly faster, even, yes, in a web page handler or other such supposedly-blocked computation or B: without realizing it, have stacked up so many expensive abstractions on top of each other in your scripting language that you're multiplying the off-the-top 40x-ish slower with another set of multiplicative penalties that can take you into effectively arbitrarily-slower computations.

        If you've never profiled a mature scripting language program, it's worth your time. Especially if nobody on your team has ever profiled it before. It can be an eye-opener.

        Then it turns out that for historical path reasons, dynamic scripting languages are also really bad at multithreading and using multiple cores, and if you can write a program that can leverage that you can just blow away the dynamic scripting languages. It's not even hard... it pretty much just happens.

        (I say historical path reasons because I don't think an inability to multithread is intrinsic to the dynamic scripting languages. It's just they all came out in an era when they could assume single core, it got ingrained into them for a couple of decades, and the reality is, it's never going to come fully out. I think someone could build a new dynamic language that threaded properly from the beginning, though.)

        You really can see big gains just taking a dynamic scripting language program and turning it into a compiled language with no major changes to the algorithms. The 40x-ish penalty off the top is often in practice an underestimate, because that number is generally from highly optimized benchmarks in which the dynamic language implementation is highly tuned to avoid expensive operations; real code that takes advantage of all the conveniences and indirection and such can have even larger gaps.

        This is not to say that dynamic scripting languages are bad. Performance is not the only thing that matters. They are quite obviously fast enough for a wide variety of tasks, by the strongest possible proof of that statement. That said, I think it is the case that there are a lot of programmers who have no idea how much performance they are losing in dynamic scripting languages, which can result in suboptimal engineering decisions. It is completely possible to replace a dynamic scripting language program with a compiled one and possibly see 100x+ performance improvements on very realistic code, before adding in multithreading. It is hard for that not to manifest in some sort of user experience improvement. My pitch here is not to give up dynamic scripting languages, but to have a more realistic view of the programming language landscape as a whole.

        • RhysU an hour ago

          > Then it turns out that for historical path reasons, dynamic scripting languages are also really bad at multithreading and using multiple cores...

          What would a dynamic scripting language look like that wasn't subject to this limitation? Any examples? I don't know of contenders in this design space but I am not up on it.

        • socalgal2 2 hours ago

          I'm not trying to suggest that you can't do faster computation in a lower-level language. But a package manager doesn't do much computation. It mostly downloads, decompresses, and writes files. Yes, it has to solve constraints, but that's not a bottleneck given most projects have at most a few hundred dependencies, not millions.

          I don't know python but in JavaScript, triggering 1000 downloads in parallel is trivial. Decompressing them, like in python, is calling out to some native function. Decompressing them in parallel in JS would also be trivial (no idea about python). Writing them in parallel is also trivial.

    • doug_durham an hour ago

      A big part of the "magic" is that there is a team of paid professionals maintaining and improving it. That's more important than it being written in Rust. If uv were forked it would devolve to the level of pip over time.

  • bmitc 5 minutes ago

    Ruff is actually a good example of the danger of rewrites. They rewrote tools but not all of the parts of the tools.

  • 0cf8612b2e1e 5 hours ago

    The history of Python package management is clear that everyone thinks they can do a better job than the status quo.

    • psunavy03 5 hours ago

      In this case, they were right.

    • nickelpro 4 hours ago

      uv is purely a performance improvement, it changes nothing about the mechanics of Python environment management or packaging.

      The improvements came from lots of work from the entire python build system ecosystem and consensus building.

      • 0cf8612b2e1e 3 hours ago

        Disagree in that uv makes switching out the underlying interpreter so straightforward. Becomes trivial to swap from say 3.11 to 3.12. The pybi idea.

        Sure, other tools could handle the situation, but being baked into the tooling makes it much easier to bootstrap different configurations.

        • nickelpro 3 hours ago

          Yes, it's faster and better than pyenv, but the mechanism it's using (virtual environments) is not a uv invention.

          uv does the Python ecosystem better than any other tool, but it's still the standard Python ecosystem as defined in the relevant PEPs.

          • pityJuke 3 hours ago

            Are the lock files standardised, or a uv-specific thing?

            • nickelpro 2 hours ago

              uv has both a uv-specific implementation, and support for standard PEP 751 lockfiles

      • globular-toast 3 hours ago

        Actually not true. One of the main differences with uv is you don't have to think about venvs any more. There's a talk about it from one of the authors at a recent PyCon here: https://www.youtube.com/watch?v=CV8KRvWKYDw (not the same talk I linked elsewhere in the thread).

        • nickelpro 2 hours ago

          How do you think uv works?

          It creates a venv. Note we're talking about the concept of a virtual environment here, PEP 405, not the Python module "venv".

    • henry700 4 hours ago

      Of course they do; this tends to happen when the history in question is one of hot flaming garbage.

  • mort96 5 hours ago

    Honestly "don't reinvent the wheel" makes absolutely no sense as a saying. We're not still all using wooden discs as wheels, we have invented much better wheels since the neolithic. Why shouldn't we do the same with software?

    • simonw 4 hours ago

      When asked why he had invented JSON when XML already existed, Douglas Crockford said:

      The good thing about reinventing the wheel is that you can get a round one.

      https://scripting.wordpress.com/2006/12/20/scripting-news-fo...

      • idle_zealot 2 hours ago

        You can get a round one. Or you can make yet another wonky shaped one to add to the collection, as ended up being the case with JSON.

        • simonw 2 hours ago

          What makes JSON wonky?

          Personally the only thing I miss from it is support for binary data - you end up having to base64 binary content which is a little messy.

          • idle_zealot an hour ago

            Quoted keys, strict comma rules, and very limited data types are the main ones. There are a host of others if you view it through the lens of human read/write, and a different set of issues if you view it as a machine data interface. Trying to combine the two seems fundamentally misguided.

        • psunavy03 an hour ago

          Insert the xkcd about 15 competing standards . . .

          • oblio 10 minutes ago

            Standards do die off, up to a point. XML is widely used but the last time I really had to edit it in anger working in DevOps/web/Python was a long time ago (10 years ago?).

            At this point XML is the backbone of many important technologies that many people won't use or won't use directly anymore.

            This wasn't the case circa 2010, when I doubt any dev could have really avoided XML for a bunch of years.

            I do like XML, though.

    • haiku2077 4 hours ago

      Right, wheels are reinvented every few years. Compare tires of today to the ones 20 years ago and the technology and capability is very different, even though they look identical to a casual eye.

      My primary vehicle has off-road capable tires that offer as much grip as a road-only tire would have 20-25 years ago, thanks to technology allowing Michelin to reinvent what a dual-purpose tire can be!

      • nightpool 3 hours ago

        > Compare tires of today to the ones 20 years ago and the technology and capability is very different, even though they look identical to a casual eye

        Can you share more about this? What has changed between tires of 2005 and 2025?

    • aalimov_ 4 hours ago

      I always took this saying as meaning that we don’t re-invent the concept of the wheel. For example the Boring company and Tesla hoping to reinvent the concept of the bus/train.. (iirc your car goes underground on some tracks and you get to bypass traffic and not worry about steering)

      A metal wheel is still just a wheel. A faster package manager is still just a package manager.

      • haiku2077 4 hours ago

        That's not how I've ever seen it used in practice. People use it to mean "don't build a replacement for anything functional."

  • jjtheblunt 5 hours ago

    > an order of magnitude better

    off topic, but i wonder why that phrase gets used rather than 10x which is much shorter.

    • BeetleB 3 hours ago

      Short answer: Because the base may not be 10.

      Long answer: Because if you put a number, people expect it to be accurate. If it was 6x faster, and you said 10x, people may call you out on it.

    • screye 4 hours ago

      It's meant to signify a step change. Order of magnitude change = no amount of incremental changes would make up for it.

      In common conversation, the multiplier can vary from 2x to 10x. In the context of some algorithms, orders of magnitude can be over the delta rather than absolutes, e.g. a field sees a 1.1x improvement over the previous 10 years; a change that delivers a 1.1x improvement by itself overshadows an order of magnitude more effort.

      For salaries, I've used order-of-magnitude to mean 2x. Good way to show a step change in a person's perceived value in the market.

    • bxparks 4 hours ago

      I think of "an order of magnitude" as a log scale. It means somewhere between 3.16X and 31.6X.

      • jjtheblunt 4 hours ago

        yeah that's what i meant with 10x, like it's +1 on the exponent, if base is 10. but i'm guessing what others are thinking, hence the question.

    • fkyoureadthedoc 5 hours ago

      - sounds cooler

      - 10x is a meme

      - what if it's 12x better

    • bmacho 3 hours ago

      Because it's not 10x?

    • chuckadams 4 hours ago

      Because "magnitude" has cool gravitas, something in how it's pronounced. And it's not meant to be precise, it just means "a whole lot more".

    • refulgentis 4 hours ago

      "10x" has been cheapened / heard enough / de facto, is a more general statement than a literal interpretation would indicate. (i.e. 10x engineer. Don't hear that much around these parts these days)

      Order of magnitude faces less of that baggage, until it does :)

      • psunavy03 an hour ago

        Would you say it faces . . . orders of magnitude less baggage?

    • neutronicus 4 hours ago

      5x faster is an order of magnitude because of rounding

larkost 3 hours ago

Just a warning in case others run into it: on very anemic systems (e.g.: AWS T2.micro running Windows, yes... I know...) uv will try to open too many simultaneous downloads, overloading things, resulting in timeouts.

You can use the UV_CONCURRENT_DOWNLOADS environment variable to limit this. In my case it needed to be 1 or 2; anything else would cause timeouts.
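
For example (POSIX shell shown; on Windows PowerShell set $env:UV_CONCURRENT_DOWNLOADS instead):

  UV_CONCURRENT_DOWNLOADS=1 uv sync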

An extreme case, I know, but I think that uv is too aggressive here (a download thread for every module), and should use aggregate speeds from each source server to auto-tune per-server threading.

  • ehsankia 2 hours ago

    Not extreme at all - a lot of people use the cheapest, smallest VPS for their hobby work. I know I do (albeit not AWS). Thanks for sharing; hope they improve the automatic detection there.

mh- 6 hours ago

Started using this recently for personal stuff on my laptop. When you're used to pip, it's just confusingly fast. More than once I thought maybe it didn't work because it returned too quickly..

leonheld 5 hours ago

I adore the

  uv add <mydependencies> --script mycoolscript.py
And then shoving

  #!/usr/bin/env -S uv run
on top so I can run Python scripts easily. It's great!

  • simonw 5 hours ago

    I built a Claude Project with special instructions just teaching it how to do this, which means it can output full scripts for me with inline dependencies based on a single prompt: https://simonwillison.net/2024/Dec/19/one-shot-python-tools/

    Claude 4's training cutoff date is March 2025 though, I just checked and it turns out Claude Sonnet 4 can do this without needing any extra instructions:

      Python script using uv and inline script dependencies
      where I can give it a URL and it scrapes it with httpx
      and beautifulsoup and returns a CSV of all links on
      the page - their URLs and their link text
    
    Here's the output, it did the right thing with regards to those dependencies: https://claude.ai/share/57d5c886-d5d3-4a9b-901f-27a3667a8581
    • sunaookami 4 hours ago

      Using your system instructions for uv for every LLM now since first seeing your post last year, thanks! It's insanely helpful just asking e.g. Claude to give me a python script for XYZ and just using "uv run". I also added:

        If you need to run these scripts, use "uv run script-name.py". It will automatically install the dependencies. Stdlibs don't need to be specified in the dependencies array.
      
      since e.g. Cursor often gets confused because the dependencies are not installed and it doesn't know how to start the script. The last sentence is for when LLMs get confused and want to add "json" for example to the dependency array.
    • varunneal 5 hours ago

      claude sonnet typically forgets about uv script syntax in my experience. I usually find myself having to paste in the docs every time. By default it wants to use uv project syntax.

  • oblio 5 minutes ago

    What's "mydependencies"? A config file?

  • jsilence 5 hours ago

    Using this trick with Marimo.io notebooks in app-mode.

    Instant reactive reproducible app that can be sent to others with minimal prerequisites (only uv needs to be installed).

    Such a hot combo.

  • kristjansson 5 hours ago

    e: I misread your example, disregard below irrelevant pattern matching of 'uv add --script' to 'uv add' in the project context!

    ~~That mutates the project/env in your cwd. They have a lot in their docs, but I think you’d like run --with or uv’s PEP723 support a lot more~~

    https://docs.astral.sh/uv/guides/scripts/

    • misnome 5 hours ago

      PEP723 support is exactly what the poster is using?

eats_indigo 5 hours ago

Love UV!

Also love Ruff from the Astral team. We just cut our linting + formatting across from pylint + Black to Ruff.

Saw lint times drop from 90 seconds to < 1.5 seconds. crazy stuff.

  • greatgib an hour ago

    Until the moment you realize that Ruff performs only a subset of Pylint's checks, and that very obvious mistakes can slip through easily - like code that can't run because of an obvious error.

    • oblio 4 minutes ago

      Which checks is it missing?

nrvn 4 hours ago

this is my new fav for running small executable scripts:

  #!/usr/bin/env -S uv --quiet run --script
  # /// script
  # requires-python = ">=3.13"
  # dependencies = [
  #     "python-dateutil",
  # ]
  # ///
  #
  # [python script that needs dateutil]
  • mdeeks 3 hours ago

    I really wish that hashbang line was something way WAY easier to remember like `#!/usr/bin/env uvx`. I have to look this up every single time I do it.

    • PufPufPuf 2 hours ago

      Sadly, hashbangs are technically limited: 1) they support only absolute paths, which makes /usr/bin/env (at its standardized location) necessary to look up the uv binary; and 2) they support only a single argument (everything after the space is passed as one arg, not parsed into multiple args like a shell would), which makes -S necessary to "S"plit the arguments. That's a feature of env itself, for this very use case.

      So there isn't really much to do to make it simpler.
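
      A quick way to see what -S changes, from a regular shell (GNU or BSD env):

          env 'uv run --script' foo.py      # fails: env looks for a program literally named "uv run --script"
          env -S 'uv run --script' foo.py   # works: -S splits the string into separate arguments first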

      • mdeeks an hour ago

        I wasn't really referring to env. I meant change the behavior of uvx. If the first argument passed to uvx is a file path, then execute it exactly the same way as `uv --quiet run --script` does.

        Or maybe create a second binary or symlink called `uvs` (short for uv script) that does the same thing.
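
        In the meantime, a tiny wrapper on your PATH gets you most of the way there (a sketch; `uvs` is just this thread's hypothetical name):

            #!/bin/sh
            # save as ~/.local/bin/uvs and chmod +x
            exec uv --quiet run --script "$@"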

pu_pe 5 hours ago

Tried uv a while ago and I was shocked by how fast and easy it is to use. There's basically no reason to use pip anymore, and if you're using only Python there's basically no reason to use conda either.

  • oceansky 5 hours ago

    It seems to make pyenv and poetry droppable too.

psunavy03 5 hours ago

I'm sold. Never going back to pip/twine/requirements.txt again if I don't have to. I have several projects that all share a common wheel hosted on an internal GitLab instance, and I was able to replace like 10 lines of YAML with just "uv build" and "uv publish." Importing is quick and easy, and you can see what your core dependencies are as opposed to everything just being in one mess of a requirements.txt.

6ak74rfy 4 hours ago

UV is fast, like FAST. Plus, it removes the need for pyenv (for managing different Python versions) and pip for me. Plus, no need to activate env or anything, `uv run ...` automatically runs your code through the env.

It's nice software.

  • nomel 4 hours ago

    > Plus, it removes the need for pyenv

    I don't see a way to set the current and global python/venv versions used to run scripts, so that when I type "python" it uses that version, without making an alias.

    • zbentley 25 minutes ago

      Two options other than aliases:

      1. Modify your PATH:

          export PATH="$(uv run python -BISc 'import sys; print(sys.base_exec_prefix)')/bin:$PATH"
      
      2. Put this in an executable file called "python" early in your PATH:

          #!/bin/sh
          exec uv run python "$@"
      
      Those are basically what pyenv does (via a symlink and PATH entry).

      The second option will always pick up the Python local to the project directory you're in, if any. The former (if you put it in your shell profile) will not.

    • adamckay 4 hours ago

      If they're your scripts (i.e. you're writing/editing them) then you can declare dependencies following the PEP 723 format and uv will respect that.

      https://docs.astral.sh/uv/guides/scripts/#declaring-script-d...

      • nomel an hour ago

        > uv run example.py

        I specifically want to run "python", rather than subcommands of some other command, since I often want to pass arguments to the python interpreter itself along with my script.

samsartor 2 hours ago

I'm in ML-land. I thought we were all hopelessly tied to conda. But I moved all my own projects to uv effortlessly and have never looked back. Now the first thing I do when pulling down another researcher's code is add a pyproject.toml (if they don't have one), `uv add -r`, and `uv run` off into the sunset. I especially like how good uv is with non-PyPI-published dependencies: GitHub, dumb folders, internal repos, etc.
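
For anyone curious, those non-PyPI sources all go through the same `uv add`; the names and URLs below are just examples:

  uv add "git+https://github.com/psf/requests"   # straight from a Git repo
  uv add ../some-local-package                   # a plain folder with a pyproject.toml
  uv add -r requirements.txt                     # import an existing requirements file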

esafak 33 minutes ago

It's not just the programs that are fast, but the company's response. I reported an issue on Discord and the founder fixed it right away, collaborating with me on the PR I created.

shpx 4 hours ago

For running scripts on my personal computer I really don't care for all the dependency management stuff. I just want a single, globally installed latest version of a library, like what pip does. I've never had the problem of an old script breaking, I guess because I just don't run that much software. These things for writing out explicit versions of everything and reinstalling libraries for every single project just add a lot of overhead, extra typing and an extra layer of complexity that I don't care for. I like just typing "pip install requests" and then being able to do "import requests" in any REPL or script for the next few years, occasionally running a Bash alias to upgrade all dependencies (which is a feature that pip incredibly still doesn't have, 14 years later).

I can see how if you've had issues with dependencies you would rave about systems that let you control down to the commit what an import statement actually means, but I like the system that requires the least amount of typing/thinking and I imagine I'm part of a silent majority.

  • chuckadams 4 hours ago

    I've been out of the python world for a while now, but I would think a global install should just be a special case of a local install, one that's shared among scripts, which is basically how npm and yarn work. I'm kind of surprised uv doesn't support something like this already. Maybe it should be up to the distro to base the global python package management on uv then?

    • shpx 4 hours ago

      I think it does support it, like this

        uv pip install --system requests
      
      but it's more typing. If I type 5 characters per second, making me also type "uv --system" is the same as adding 2 seconds of runtime to the actual command, except even worse because the chance of a typo goes up and typing takes energy and concentration and is annoying.
      • chuckadams 4 hours ago

        If only there were a way to alias commands, eh?
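
        For instance (the alias name is arbitrary):

            # in ~/.bashrc or ~/.zshrc
            alias pis='uv pip install --system'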

  • globular-toast 3 hours ago

    Word of warning: if you use Linux you can easily break your system like this. Many distros now stop you from modifying the system packages, but that's quite a recent development. You should look into doing user-level installs instead. Don't know about Mac. If you use Windows then you do you; it might break eventually, but probably not irrecoverably.

jauntywundrkind 2 hours ago

I ran into this via Aider, which centers its user experience on using uv as an installer: https://aider.chat/2025/01/15/uv.html

What was super unclear was how I develop locally with uv. Figuring out I needed `uv sync --extra` and then `uv run --project /opt/aider aider` was a lot of bumbling in the dark. I still struggle to find good references for everyday project use with uv.

It was amazing though. There were so many pyproject and other concerns that it just knew how to handle. I kept assuming I was going to have to do a lot more steps.

spapas82 an hour ago

I don't use uv for my (django) projects because my pip/venv workflow is good enough and has never failed me in 15 years.

However, I really like installing uv globally on my Windows systems and then using uvx to run stuff without caring about venvs or putting stuff on PATH.

zero-g 2 hours ago

What’s their motivation to build tools like uv and ruff? What’s their business model?

tfarias 2 hours ago

I switched to uv in an open source project I maintain as our builds were taking dozens of minutes to resolve dependencies using Poetry. The migration itself was pretty easy as both tools use pyproject.toml, so it was a matter of changing headers and minor adjustments.

After the switch, the same dependency resolution was done in seconds. This tool single-handedly made iteration possible again.

xnyan 5 hours ago

Worth using just for 'uv pip' as a replacement for pip on the grounds of being much much faster. I've completely switched to uv and I can't imagine ever going back to pip for this and lots of other outstanding features.
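
The `uv pip` interface mirrors pip's, so existing muscle memory mostly transfers:

  uv pip install -r requirements.txt
  uv pip freeze > requirements.txt
  uv pip list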

nchmy 3 hours ago

I've been happily using it for a long time, and rye before that.

Just today I set it up on 20 PCs in a computer lab that doesn't have internet, along with VS Code and some main packages. Just downloaded the files, made a PowerShell script, and it's all working great with Jupyter etc. Now to get kids to be interested in it...

neves 4 hours ago

It is a venture capital startup. If I start using uv, what's our protection against the company going rogue?

  • zffr 4 hours ago

    Why wouldn't you be able to switch back to using pip ?

    • nullhole 4 hours ago

      It seems like that'd work as long as you restrict yourself entirely to the pip interface. Stray outside of that, and you start accumulating a real dependency on uv itself.

    • kylecordes 4 hours ago

      If uv disappeared tomorrow, five projects would spring up to make compatible implementations of its functionality.

  • xyst 4 hours ago

    Community will fork it and move on. See the following examples:

    * Redis -> redict, valkey

    * elastic search -> opensearch

    * terraform -> opentofu

    (Probably a few more but those are the ones that come to mind when they "go rogue")

andy_ppp 2 hours ago

I have no understanding of python really but very glad I settled on uv. It’s such an improvement usability-wise over poetry or pip and it’s extremely fast too.

I have one complaint though, I want ./src to be the root of my python packages such that

> from models.attention import Attention

Works if I have a directory called models with a file called attention.py in it (and __init__.py) etc. The only way this seems to work correctly is if I set PYTHONPATH=./src

Surely the environment manager could set this up for me? Am I just doing it wrong?
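
For what it's worth, one way to avoid PYTHONPATH is to make the project itself installable from ./src, so `uv sync`/`uv run` put your packages on the import path. A minimal sketch, assuming the hatchling build backend:

  # append to pyproject.toml (sketch; adjust the package list to your layout)
  cat >> pyproject.toml <<'EOF'
  [build-system]
  requires = ["hatchling"]
  build-backend = "hatchling.build"

  [tool.hatch.build.targets.wheel]
  packages = ["src/models"]
  EOF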

  • andy_ppp 2 hours ago

    ChatGPT says two things to this: “uv is a package manager, not a file runner” (um, uv run seems to be one of the best parts of uv…) and “You’re doing things correctly, you are just hitting one of Python’s most notorious pain points: managing import paths cleanly”.

    I have read a few tickets saying uv won’t support this so everyone running my project will have to read the README first to get anything to run. Terrible UX.

tandav 5 hours ago

Still no option to store virtual envs outside projects after a year

https://github.com/astral-sh/uv/issues/1495

jhoechtl 3 hours ago

It's sad that Python-only tooling is apparently so incapable that you have to write it in a compiled language.

After so many years of optimization, pure Python still seems to be wishful thinking. Its AI/ML success also comes only from being a shim language around library calls.

  • globular-toast 3 hours ago

    So? It's the best language there is for shimming around library calls. Use the right tool for the job. There's no single language that can do it all.

lucideng 5 hours ago

UV solved any issue I had getting python to run on a machine.

Just `git clone someproject`, `uv run somescript.py`, then mic drop and walk away.

  • oezi 5 hours ago

    `uvx` directly from the repo also works nicely

kissgyorgy 2 hours ago

Astral's tools are so fast in general that when you first try them out, you double-check what went wrong, because you're sure nothing happened.

Same with uv. They do very nice tricks, like sending Range requests to download only the metadata part of the ZIP files on PyPI, resolving everything in memory, and only after that downloading the packages. No other package manager does this kind of crazy optimization.
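
You can reproduce the trick with plain curl: the ZIP central directory lives at the end of the file, so a suffix range is enough to learn what's inside (the wheel URL is a placeholder):

  # fetch only the final 64 KiB of a wheel; PyPI's file host honors Range requests
  curl -s -H "Range: bytes=-65536" "$WHEEL_URL" -o tail.bin
  # the end-of-central-directory record (bytes PK\x05\x06) should be in that tail
  python3 -c "print(open('tail.bin','rb').read().rfind(b'PK\x05\x06'))"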

bunderbunder 2 hours ago

uv is indeed fast and easy. But I'm also finding that the maintainers' efforts to make it work like Cargo mean it can be more difficult to use in more complex project structures. As sensible as Rust's project management ethos is, you're never going to escape Python's underlying design in a Python project, and friction between the two philosophies may not be avoidable.

One possible alternative is Pants. It's also written in Rust for performance, but has more flexibility baked into the design.

  • kortex 2 hours ago

    Pants and uv are two very different tools, with very different use-cases.

    uv is basically souped-up pip.

    Pants is an entire build/tooling system, analogous to something like Bazel. It can handle multiple dependency trees, multiple types of source code, building and packaging, even running tests.

IceHegel 2 hours ago

Biggest dev QoL improvement for me since TypeScript has been going all in on uv.

I switched everything over and haven’t looked back.

It’s everything I hoped poetry would be, but 10x less flakey.

incognito124 5 hours ago

uv is almost perfect. My only pet peeve is updating dependencies. Sometimes I just want to go "uv, bump all my dependencies to the latest versions possible while respecting their constraints". I still haven't found an elegant way to do this, but I have written a script that parses pyproject.toml, removes the deps, and invokes `uv add --upgrade` with them.

other than that, it's invaluable to me, with the best features being uvx and PEP 723

  • jmtulloss 5 hours ago

    Does `uv lock --upgrade` not do what you want?

    • incognito124 5 hours ago

      Unfortunately, no. Only `uv.lock` gets updated, but the dependencies in `pyproject.toml` are frozen at their original constraints.

      What I want is, if my project depends on `package1==0.4.0` and there are new versions of package1, for uv to try to install the newer version - and to do that a) for all the deps simultaneously, and b) without me explicitly stating the dependencies on the command line, since they're already written in pyproject.toml. An `uv refresh` of sorts.

      • hxtk 4 hours ago

        If you specify your constraints in pyproject.toml like this: `package1==0.4.0`; then that is the latest (and only) version satisfying your constraints. Not upgrading is expected behavior, because upgrading would violate constraints.

        pyproject.toml’s dependency list specifies compatibility: we expect the program to run with versions that satisfy constraints.

        If you want to specify an exact version as a validated configuration for a reproducible build with guaranteed functionality, well, that’s what the lock file is for.

        In serious projects, I usually write that dependency section by hand so that I can specify the constraints that match my needs (e.g., what is the earliest version receiving security patches or the earliest version with the functionality I need?). In unserious projects, I’ll leave the constraints off entirely until a breakage is discovered in practice.

        If `uv` is adding things with `==` constraints, that’s why upgrades are not occurring, but the solution is to relax the constraints to indicate where you are okay with upgrades happening.
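
        Concretely, the spec you hand to `uv add` is what lands in pyproject.toml (PEP 440 operators; quote them so the shell doesn't eat `>`):

            uv add 'package1>=0.4.0'   # any 0.4.0-or-newer release may be resolved
            uv add 'package1~=0.4.0'   # compatible release: >=0.4.0, <0.5.0
            uv add 'package1==0.4.0'   # exactly 0.4.0; upgrades can never happen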

        • incognito124 4 hours ago

          > ... the solution is to relax the constraints to indicate where you are okay with upgrades happening.

          Yeah, that's pretty much what I've been doing with my workaround script. And btw most of my projects are deeply unserious, and I do understand why one should not do that in any other scenario.

          Still, I dream of `uv refresh` :D

      • Eridrus 4 hours ago

        Why not depend on package1>=0.4.0 rather than specifying an explicit version? Then uv will upgrade it to the latest version.

        pyproject.toml is meant to encode the actual constraints for when your app will function correctly, not hardcode exact versions, which is what the lockfile is for.

        • IshKebab an hour ago

          Because then you don't get to use the new features in 0.5.0.

          Though I do think with Python in particular it's probably better to manually upgrade when needed, rather than opportunistically require the latest, because Python can't handle two versions of the same package in one venv.

      • wtallis 4 hours ago

        > What I want is, if my project depends on `package1==0.4.0` and there are new versions of package1, for uv to try to install the newer version.

        I think you're just specifying your dependency constraints wrong. What you're asking for is not what the `==` operator is for; you probably want `~=`.

      • gschizas 4 hours ago

        I think what you want is `uv sync --upgrade`

      • petters 4 hours ago

        You are writing your project file incorrectly. It's not a lock file

        • incognito124 4 hours ago

          I never, ever, write my project file[1]. uv {add,remove} is all I ever use.

          [1]: I do sometimes write the title or the description. But never the deps themselves

          • wtallis 3 hours ago

            Even using `uv add`, you don't have to limit yourself to declaring exact versions when your intention is to allow newer versions.

          • pests an hour ago

            You can specify bounds when using uv add:

            uv add 'example>=0.4.0'

            Then it will update as you are thinking.

  • ketozhang 4 hours ago

    You could either delete the .venv and recreate it or run `uv pip install --upgrade .`

    Much prefer not thinking about venvs.

    • incognito124 4 hours ago

      Actually, it won't work. I tried it and running `uv run script.py` just reinstalls the deps back... which is, I admit, the behaviour I expect and want as a user.

octo888 3 hours ago

uv has converted people in the same way Google search, Chrome, Git and SSDs did.

Fast is a massive factor.

I haven't used it much, but being so fast, I didn't even stop to think "is it perfect at dependency management?" "does it lack any features?".

linsomniac 2 hours ago

I'm setting up a new dev workstation, I'm trying NixOS, and I'm going to try living with uv for the bulk of my Python work. It's an experiment I've just started, but I've been using uv for ~6 months, it's been spectacular, and I think it's going to live up to my expectations.

codethief an hour ago

uv is great! One surprise I recently ran into, though, was that `uv run path/to/foo.py` doesn't look up the nearest pyproject.toml / .venv. Similarly, a script with a shebang `#!/usr/bin/env -S uv run` only executes correctly when executed from within the folder containing the .venv. Am I the only one finding this weird?

jimjag 4 hours ago

This has the same issue as so many package managers for Python, namely, it doesn't provide a way for --no-binary to remain sticky.

There are times when you do NOT want the wheel version to be installed (which is what --no-binary implements in pip), but so many package managers including uv don't provide that core, basic functionality. At least for those that do use pip behind the scenes, like pipenv, one can still use the PIP_NO_BINARY environment variable to ensure this.

So I'll not be migrating any time soon.

  • csl 4 hours ago

    Maybe not exactly what you need (sticky) but you can set UV_NO_BINARY=1

    See https://docs.astral.sh/uv/reference/environment/#uv_no_binar...
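
    i.e. per invocation rather than per project, using the lxml example from the sibling thread:

        UV_NO_BINARY=1 uv pip install lxml   # force building from sdists for this run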

    • jimjag 4 hours ago

      It helps, that's for sure. But this sort of knowledge should not live in the environment in any case. It should be part of the canonical package list, not hidden away elsewhere. The whole idea of a dependency manager is a centralized, consistent way to install everything you need, not something dependent on what values may or may not exist as an env var.

      • wtallis 3 hours ago

        Can you elaborate on the reasons why a package would need to declare that its dependencies must be installed from source rather than from pre-built binaries? I'm having trouble imagining a scenario where that capability would be used as anything other than a workaround for a deeper problem with how your dependencies are packaged.

      • csl 3 hours ago

        Yes, I can see how it would make sense to be able to set this in pyproject.toml (typically for a private package).

        uv is still quite new though. Perhaps you can open an issue and ask for that?

  • mixmastamyk 3 hours ago

    > There are times when you do NOT want the wheel version to be installed

    When, why? Should I be doing this?

    • jimjag 3 hours ago

      There are some wheels, 'lxml' for example, that bundle into their binaries external libraries that may be incompatible or older than what you would like. This can cause library conflicts.

johncole 4 hours ago

I love uv. I am a convert, I use it for everything. One area I find it incredible for: deployment. If I have to launch a docker container and install a lot of python packages it saves so much time and compute.

I also appreciate that it handles most package conflicts and constantly maintains the list of packages as you go. I have gotten myself into a hole or two with packages and dependencies; I can usually solve it by deleting the venv and just using uv to reinstall.

AJRF 4 hours ago

Is there something inherent about Rust that means it's faster at dep resolution than Python? Like, where is the speed-up coming from?

Or would it be possible to go this fast in python if you cared enough about speed?

Is it a specific thing that Rust has an amazing library for? Like networking or serde or something?

  • simonw 4 hours ago

    The biggest speed-up in uv comes from the way it uses caching and hard links. When you install a package into a virtual environment uv uses a hard link to a previously cached version rather than copying files.

    Using Rust is responsible for a lot of speed gains too, but I believe it's the hard linking trick (which could be implemented in any language) that's the biggest win.
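
    An easy way to feel the cache at work (the package name is just an example):

        uv venv /tmp/env1 && uv pip install --python /tmp/env1/bin/python numpy   # cold: downloads once into the cache
        uv venv /tmp/env2 && uv pip install --python /tmp/env2/bin/python numpy   # warm: links from the cache, near-instant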

  • tcdent 4 hours ago

    They came up with a faster resolution algorithm, in addition to implementing it in a faster language.

    pip could be made faster based on this, but maybe not quite as fast.

spennant 4 hours ago

I moved from pip to poetry a while back, but was introduced to uv recently while working on MCP stuff. I now use uv for everything and haven’t looked back.

  • decko 3 hours ago

    What made you switch from poetry?

oezi 5 hours ago

I continue to be puzzled why sometimes running uvx (uv tool run) will redownload all dependencies even though it just downloaded them for another tool. Downloading torch 15 times per day gets old, even on 500 Mbit/s.

  • holysoles 5 hours ago

    Might be worth opening an issue, the project states it aims to aggressively cache dependencies: https://docs.astral.sh/uv/concepts/cache

    Maybe that functionality isn't implemented the same way for uvx.

    You could try this equivalent command that is under "uv run" to see if it behaves differently: https://docs.astral.sh/uv/concepts/tools/#relationship-to-uv...

    • oezi 5 hours ago

      I haven't been able to find any kind of rhyme or reason to it, so I don't know how to explain when it happens or how to better debug it for a bug report.

  • veganjay 5 hours ago

    Initially, I used `uv tool run <name>`, but later discovered `uv tool install <name>`. The `install` command downloads dependencies once and caches them for future use. It is similar to how `pipx install <name>` works.

    e.g.

    $ uv tool install asciinema

    $ asciinema play example.cast

carlosdp 5 hours ago

I love uv, not just for local development, but it also makes it WAY easier to manage python environments you setup for running python workers / user code in the cloud.

marifjeren 2 hours ago

Seems a lot of people like this and are happy about it, but I for one am tired of the proliferation of python package management tools.

Many languages have multiple package management tools, but for most languages there are one or two really popular ones.

For python you just have to memorize this basically:

- Does the project have a setup.py? if so, first run several other commands before you can run it. python -m venv .venv && source .venv/bin/activate && pip install -e .

- else does it have a requirements.txt? if so python -m venv .venv && source .venv/bin/activate && pip install -r requirements.txt

- else does it have a pyproject.toml? if so poetry install and then prefix all commands with poetry run

- else does it have a pipfile? pipenv install and then prefix all commands with pipenv run

- else does it have an environment.yml? if so conda env create -f environment.yml and then look inside the file and conda activate <environment_name>

- else I have not had to learn the rules for uv yet

Thank goodness these days I just open up a cursor tab and say "get this project running"

  • kortex an hour ago

    uv handles most, if not all, of those cases.

    > - else does it have a pyproject.toml? if so poetry install and then prefix all commands with poetry run

    That's not even correct. Not all projects with pyproject.toml use poetry (but poetry will handle everything with a pyproject.toml)

    Just try uv first. `uv pip install .` should work in a large majority of cases.

    pipenv is on the way out. Bare `setup.py` is on the way out. `pyproject.toml` is the present and future, and the nice thing about it is that it's self-describing about the tooling used to package it.

    • marifjeren 35 minutes ago

      > That's not even correct. Not all projects with pyproject.toml use poetry

      I didn't say "all projects with pyproject.toml use poetry"

FL33TW00D 3 hours ago

uv has completely changed the experience of Python for me.

forrestthewoods 5 hours ago

Here’s my #1 complaint about uv: I’m new to the python ecosystem. I don’t know anything about pip or the existing tools. I would love for uv to at least have documentation and a user guide that doesn’t assume knowledge of the old bad tools that uv replaces.

Perhaps uv will continue its ascendancy and get there naturally. But I’d like to see uv be a little more aggressive with “uv native” workflows. If that makes sense.

  • wrs 4 hours ago

    That exists! [0] But IMHO the guides should be linked in big text as the first thing on the homepage. Right now you have to read through a bunch of meaningless comparative bullet points and reassurances then click a few times to get to the guides. If it weren’t for everyone telling me I need to switch, I might not have had the patience to find them.

    [0] https://docs.astral.sh/uv/guides/

egonschiele 5 hours ago

uv has become essential for me. conda and virtualenv never worked smoothly for me, but uv was easy and "just worked" from day 1.

colechristensen 5 hours ago

So how does Astral plan to make money?

  • simonw 5 hours ago

    Here's a loose answer to that question from uv founder Charlie Marsh last September: https://hachyderm.io/@charliermarsh/113103564055291456

    I don't want to charge people money to use our tools, and I don't want to create an incentive structure whereby our open source offerings are competing with any commercial offerings (which is what you see with a lot of hosted-open-source-SaaS business models).

    What I want to do is build software that vertically integrates with our open source tools, and sell that software to companies that are already using Ruff, uv, etc. Alternatives to things that companies already pay for today.

    An example of what this might look like (we may not do this, but it's helpful to have a concrete example of the strategy) would be something like an enterprise-focused private package registry. A lot of big companies use uv. We spend time talking to them. They all spend money on private package registries, and have issues with them. We could build a private registry that integrates well with uv, and sell it to those companies. [...]

    But the core of what I want to do is this: build great tools, hopefully people like them, hopefully they grow, hopefully companies adopt them; then sell software to those companies that represents the natural next thing they need when building with Python. Hopefully we can build something better than the alternatives by playing well with our OSS, and hopefully we are the natural choice if they're already using our OSS.

    • leobuskin 3 hours ago

      They are hijacking the entire Python ecosystem in a very smart way, that's all. At some point we'll probably find ourselves vendor locked-in, just because the initial offer was so appealing. Take a closer look at it: package manager, formatter/linter, types, LSP. What's left before it pokes CPython one way or another? Maybe a cloud-based IDE, some interesting WASM relationship (but RustPython is not there yet; they just don't have enough money). Otherwise, Astral is on a pretty straightforward path to a `touchdown` in a few years. It's both a blessing and a curse.

      Let's be honest, all attempts to bring up a CPython alternative have failed (niche boosters like PyPy are a separate story, but it's not up to date and not entirely exact). For some reason, people think that 1:1 compatibility is not critical and too costly to pursue (hello, all LLVM-based compilers). I think it's doable and there's a solid way to solve it. What if Astral thinks so too?

      • IshKebab an hour ago

        Honestly... I don't really care. If in 5 years they turn around and try to charge for uv we'll still be in a much better place than if we'd all stuck with the catastrofuck that is pip.

  • serjester 5 hours ago

    Anaconda makes on the order of 100M a year “solving” data science package management. I would argue it has a significantly worse product, attacking a much smaller part of the ecosystem.

    It seems easy to imagine Astral following a similar path and making a significant amount of money in the process.

    • wrs 4 hours ago

      In theory, Anaconda solves the next higher level of the Python package management nightmare, namely knowing what versions are compatible with each other. But that could presumably be done on top of uv.

    • colechristensen 5 hours ago

      Anaconda isn't free. I don't want to pay per-seat fees for slightly improved versions of open source tools which is why I'm very skeptical of Astral and uv.

      One day they're going to tell me I have to pay $10/month per user and add a bunch of features I really don't need just because nobody wants to prioritize the speed of pip.

      And most of that fee isn't going to go towards engineers maintaining "pip but faster", it's going to fund a bunch of engineers building new things I probably don't want to use, but once you have a company and paying subscribers, you have to have developers actively doing things to justify the cost.

      • serjester 4 hours ago

        Enterprises don't care about faster, but they do care an enormous amount about security. Astral is very well positioned here.

  • SSchick 5 hours ago

    Apparently VC(1) so far. I'd assume there will be LTS support contracts and tailored enterprise features down the line; for the moment it seems to be just a bunch of talented devs fixing problems they've been tired of / see as long-term existential threats to the Python ecosystem.

    1: https://old.reddit.com/r/Python/comments/12rk41t/astral_next...

renewiltord 4 hours ago

The Astral projects are all great. I hope the company finds a revenue stream in the future with some hosted something or the other because these tools are so useful I don't want them to become pay-for etc.

mikevm 4 hours ago

I think http://pixi.sh is much cooler because it supports conda environments, so you can install non-python packages as well (e.g. gcc).

  • stere0 19 minutes ago

    I was surprised not to see more praise earlier for the amazing work at pixi.sh too. In our team we use both: uv is great when not dealing with any underlying dependencies; Pixi is our fallback if that fails.

aanet 2 hours ago

Now.. can they make Python itself as fast as the package manager??

Vaslo 5 hours ago

Moved to this and have no need for anything else, especially since `uv pip install whatever` works with uv and is faster than pip (though I usually use uv add).

putna 5 hours ago

unfairly fast

gamegod 4 hours ago

Bragging that your program is faster than anything written in Python is a low bar, lol.

Also, it seems like a sign that even Python tooling needs to not be written in Python now to get reasonable performance.