build: added support for uv #241

Merged
mrava87 merged 41 commits into PyLops:dev from mrava87:build-uv on Feb 16, 2026

Conversation

@mrava87 (Contributor) commented Jan 31, 2026

This PR adds support for uv to PyProximal:

  • update pyproject.toml
  • add uv.lock*
  • add targets in Makefile
  • add build action in GitHub Actions (GA)
  • update readthedocs
  • currently, the uv environment and lock file are created with the following set of commands:

      uv venv --python 3.13
      uv lock
      uv sync --locked --all-extras --all-groups

It also modernizes the overall tooling:

  • move content of setup.cfg to pyproject.toml
  • switch from black/flake8/isort to ruff
  • add coverage support with upload to Codacy (as in PyLops)

Finally, it contains various improvements to the documentation:

  • add installation instructions for uv
  • use sphinx-design to have tabs in installation instructions for conda/pip/uv
  • add emojis to the main section titles

@mrava87 mrava87 marked this pull request as draft January 31, 2026 20:27
@mrava87 mrava87 marked this pull request as ready for review February 5, 2026 18:42
@mrava87 mrava87 requested a review from cako February 5, 2026 18:42

@cako left a comment

Looks really good! The website looks great. I left a few comments. Some improvements I'd consider:

  • Use nox for local testing across different Python versions
  • Retire Makefile (doesn't work natively on Windows, but nox does)
  • Default to uv, then conda, then pip.

Also, for the segmentation.py example to work, I need to crank up the maxiter to 30. Otherwise I get:

    r = _zeros._bisect(f, a, b, xtol, rtol, maxiter, args, full_output, disp)
    RuntimeError: Failed to converge after 20 iterations.

Why is this file still required?

@mrava87 (Contributor, Author) commented Feb 9, 2026

I guess same for requirements.txt?

For now I kept it to still allow creating an environment with pip, but I think we can just change the instructions to pip install .[dev]... I will do that and get rid of these files 😄

@mrava87 (Contributor, Author) commented Feb 9, 2026

OK, change of plan... I completely removed the requirements* files and any use/mention of them - 0235ecb, 308a007

We now have only conda and uv as the two alternative setups 😄 For pure pip venv-based environments we used to provide commands based on the requirements files; I tried using dependency-groups instead but that apparently does not work, so I think it is time to retire that route - if someone wants to use pip at their own risk, I am sure they can figure out how to install the required dependencies by hand 😉


Awesome! I think that's the right move. Have you tried the latest version of pip? It should support dependency groups.

@mrava87 mrava87 mentioned this pull request Feb 9, 2026
@mrava87 (Contributor, Author) commented Feb 10, 2026

Looks really good! The website looks great. I left a few comments. Some improvements I'd consider:

  • Use nox for local testing across different Python versions
  • Retire Makefile (doesn't work natively on Windows, but nox does)
  • Default to uv, then conda, then pip.

Also, for the segmentation.py example to work, I need to crank up the maxiter to 30. Otherwise I get:

    r = _zeros._bisect(f, a, b, xtol, rtol, maxiter, args, full_output, disp)
    RuntimeError: Failed to converge after 20 iterations.

@cako thanks a lot!

I tried to reply to the various comments inline and added links to the commits that hopefully address your suggestions 😄

Regarding the 4 points above, I'll reply here (and ask a few questions 😉):

  • nox: I tried to add nox support - ee375c7 - have a look and let me know what you think (an illustrative sketch of such a noxfile is shown right after this list). Compared to curvelets, I did however mostly focus on the test part, which I find cool: being able to run the tests for various Python versions with just one command... I did not add the precommit/lint session (I wasn't sure why one would not just run pre-commit with uv, or let it run automatically on git commit, given that you don't really want to run it on multiple versions?) and the same goes for the documentation build (the one I use with uv seems very robust - courtesy of ChatGPT - so again I didn't see what nox can bring to the table other than another way to do the same thing). As for the build session, I added it and it seems to work, but again I am not sure where it would fit in the grand scheme of things, as I see you don't use it in the CD GitHub action, and that is the only place where I usually build (and upload to PyPI).
  • Makefile: I decided to keep the Makefile as it still seems an easy way to make some of the commands a bit easier to run and framework-agnostic... and I have always tried to shy away from Windows anyway, as I believe it is the scientific-computing anti-pattern, so not working on Windows doesn't bother me much 😉
  • Default to uv, then conda, then pip: this is where I am most curious to hear what you think. In quite a few places in the PyLops documentation (and so also in PyProximal) we recommend conda for its ability to install packages like NumPy with the most performant BLAS backend - e.g. We highly encourage using the Anaconda Python distribution or its standalone package manager Conda. Especially for Intel processors, this ensures a higher performance with no.... Do we still believe in it? Or shall we suggest uv for its simplicity, speed, and platform-transferable benefits, despite the fact that (I suspect), being based on pip under the hood, it inherits all the 'issues' of pip - wheels vs binaries, no MKL by default, etc.?
  • segmentation.py: no idea why; I do not see this either locally with conda and uv (though I use OS X whilst I guess you use Linux?) or on RTD... I bumped the number of iterations anyway to be on the safe side.
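
For reference, a minimal sketch of the kind of noxfile meant above - illustrative only: the session name, Python versions and the uv venv backend are placeholder choices, not necessarily what ended up in ee375c7, and a reasonably recent nox with uv backend support is assumed:

    # noxfile.py - illustrative sketch, not the exact file added in ee375c7
    import nox

    # let nox create its per-session virtualenvs with uv
    nox.options.default_venv_backend = "uv"


    @nox.session(python=["3.11", "3.12", "3.13"])
    def tests(session):
        """Run the test suite on each Python version with a single `nox -s tests`."""
        # install the package with its dev extras into the session venv
        session.install("-e", ".[dev]")
        session.run("pytest")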

@cako commented Feb 13, 2026

  • nox: I tried to add nox support - ee375c7 - have a look and let me know what you think. Compared to curvelets, I did however mostly focus on the test part, which I find cool: being able to run the tests for various Python versions with just one command... I did not add the precommit/lint session (I wasn't sure why one would not just run pre-commit with uv, or let it run automatically on git commit, given that you don't really want to run it on multiple versions?) and the same goes for the documentation build (the one I use with uv seems very robust - courtesy of ChatGPT - so again I didn't see what nox can bring to the table other than another way to do the same thing). As for the build session, I added it and it seems to work, but again I am not sure where it would fit in the grand scheme of things, as I see you don't use it in the CD GitHub action, and that is the only place where I usually build (and upload to PyPI).

Makes sense! The benefit of nox for pre-commit is that you don't get fooled by a possibly stale local environment. It's a small issue though; I think nox is a lot more valuable for testing.

  • Makefile: I decided to keep the Makefile as it still seems an easy way to make some of the commands a bit easier to run and framework-agnostic... and I have always tried to shy away from Windows anyway, as I believe it is the scientific-computing anti-pattern, so not working on Windows doesn't bother me much 😉

Fair enough lol

  • Default to uv, then conda, then pip: this is where I am most curious to hear what you think. In quite a few places in the PyLops documentation (and so also in PyProximal) we recommend conda for its ability to install packages like NumPy with the most performant BLAS backend - e.g. We highly encourage using the Anaconda Python distribution or its standalone package manager Conda. Especially for Intel processors, this ensures a higher performance with no.... Do we still believe in it? Or shall we suggest uv for its simplicity, speed, and platform-transferable benefits, despite the fact that (I suspect), being based on pip under the hood, it inherits all the 'issues' of pip - wheels vs binaries, no MKL by default, etc.?

I don't think I've seen recent benchmarks on this, but pip has come a long way now that wheels exist and libraries are moving towards them. MKL is still a sore topic for sure, but again I don't know if it would make any difference for PyProximal. Worth a benchmark! Conda/mamba/pixi still have many benefits, but to be honest conda's slowness and issues with licensing have made me dump it entirely and stop recommending it. Its biggest benefit, handling dependencies in other languages, is not needed here. I think if we are worried about performance, the most obvious step would be GPU integration (I'd be happy to write some code for this, btw!). As it stands, I'd be happy to keep it as the easy-to-install-but-not-so-optimal default.

  • segmentation.py: no idea why; I do not see this either locally with conda and uv (though I use OS X whilst I guess you use Linux?) or on RTD... I bumped the number of iterations anyway to be on the safe side.

I'm on Ubuntu 22.04.4 with an Intel i7-8565U. This is my NumPy config:

Build Dependencies:
  blas:
    detection method: pkgconfig
    found: true
    include directory: /opt/_internal/cpython-3.14.0/lib/python3.14/site-packages/scipy_openblas64/include
    lib directory: /opt/_internal/cpython-3.14.0/lib/python3.14/site-packages/scipy_openblas64/lib
    name: scipy-openblas
    openblas configuration: OpenBLAS 0.3.31.dev  USE64BITINT DYNAMIC_ARCH NO_AFFINITY
      Haswell MAX_THREADS=64
    pc file directory: /project/.openblas
    version: 0.3.31.dev
  lapack:
    detection method: pkgconfig
    found: true
    include directory: /opt/_internal/cpython-3.14.0/lib/python3.14/site-packages/scipy_openblas64/include
    lib directory: /opt/_internal/cpython-3.14.0/lib/python3.14/site-packages/scipy_openblas64/lib
    name: scipy-openblas
    openblas configuration: OpenBLAS 0.3.31.dev  USE64BITINT DYNAMIC_ARCH NO_AFFINITY
      Haswell MAX_THREADS=64
    pc file directory: /project/.openblas
    version: 0.3.31.dev
Machine Information:
  build:
    cpu: x86_64
    endian: little
    family: x86_64
    system: linux
  host:
    cpu: x86_64
    endian: little
    family: x86_64
    system: linux
Python Information:
  path: /tmp/build-env-l8zbez0v/bin/python
  version: '3.14'
SIMD Extensions:
  baseline:
  - X86_V2
  found:
  - X86_V3
  not found:
  - X86_V4
  - AVX512_ICL
  - AVX512_SPR
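
For reference, a dump like the one above can be printed with NumPy's own introspection helper (a minimal snippet; show_config is part of the public NumPy API):

    import numpy as np

    # prints the build-time BLAS/LAPACK configuration of the installed NumPy
    np.show_config()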

@mrava87 (Contributor, Author) commented Feb 16, 2026

@cako thanks!

Alright:

  • segmentation.py: still not sure, but I am quite sure of one thing: numpy should not matter. If you have numba installed (as I would expect from the way uv is synced), the Segment solver should use numba for the bisection; if not, it falls back to scipy.optimize.bisect... given the error, I suspect you are hitting one of the messages here https://github.com/scipy/scipy/blob/main/scipy/optimize/_zeros_py.py#L4.... Now, any idea why you don't seem to be using numba? (A minimal, illustrative sketch of this kind of fallback is shown after this list.)

  • nox: makes sense. Still, I guess this does not apply to pre-commit running on git commit?

  • conda: agree, I haven't seen any recent benchmark either, so this belief may be outdated. As for performance, PyProximal is fully GPU-compatible; I have used it a lot in various research projects, running entire inversions on GPU... and yes, I agree that the little extra that MKL can bring over, say, OpenBLAS is nothing compared to what GPU vs CPU can bring 😉 I'll make a new PR where I make uv a first-class citizen and leave conda as the second option, since a lot of people still use it.
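
For concreteness, a minimal, illustrative sketch of the kind of numba-or-scipy fallback meant above - the helper names and the toy equation x**3 = c are placeholders, not PyProximal's actual Segment internals:

    # Illustrative numba-with-scipy-fallback bisection; the toy equation
    # x**3 = c stands in for the actual root-finding problem in Segment.
    from scipy.optimize import bisect

    try:
        from numba import njit

        @njit
        def _bisect_numba(a, b, c, xtol=1e-6, maxiter=30):
            # plain bisection on f(x) = x**3 - c, compiled with numba
            fa = a**3 - c
            for _ in range(maxiter):
                m = 0.5 * (a + b)
                fm = m**3 - c
                if fm == 0.0 or 0.5 * (b - a) < xtol:
                    return m
                if fa * fm < 0.0:
                    b = m
                else:
                    a, fa = m, fm
            return 0.5 * (a + b)

        HAS_NUMBA = True
    except ImportError:
        HAS_NUMBA = False


    def find_root(c, a=0.0, b=10.0, maxiter=30):
        """Solve x**3 = c on [a, b], preferring the numba path when available."""
        if HAS_NUMBA:
            return _bisect_numba(a, b, c, maxiter=maxiter)
        # scipy's bisect raises "RuntimeError: Failed to converge after N iterations"
        # when maxiter is too small, which is exactly the error reported above
        return bisect(lambda x: x**3 - c, a, b, xtol=1e-6, maxiter=maxiter)


    print(find_root(8.0))  # ~2.0 on either code path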

@mrava87 mrava87 merged commit a68434f into PyLops:dev Feb 16, 2026
17 of 18 checks passed
@mrava87 mrava87 deleted the build-uv branch February 16, 2026 21:29