Replies: 5 comments
-
From within an activated conda environment, the export describes both the conda and pip packages installed. By way of example, this is the environment created via:

```shell
$ conda create -p ./env37-conda-demo -c conda-forge python=3.7 flask
$ conda activate ./env37-conda-demo
(./env37-conda-demo) $ pip install pandas
```

And then the exported environment (via `conda env export`) is:

```yaml
name: /Users/jdouglass/workspace/phargogh/invest/env37-conda-demo
channels:
  - conda-forge
  - defaults
dependencies:
  - ca-certificates=2020.6.20=hecda079_0
  - certifi=2020.6.20=py37hc8dfbb8_0
  - click=7.1.2=pyh9f0ad1d_0
  - flask=1.1.2=pyh9f0ad1d_0
  - itsdangerous=1.1.0=py_0
  - jinja2=2.11.2=pyh9f0ad1d_0
  - libcxx=10.0.0=h1af66ff_2
  - libffi=3.2.1=h4a8c4bd_1007
  - markupsafe=1.1.1=py37h9bfed18_1
  - ncurses=6.1=h0a44026_1002
  - openssl=1.1.1g=h0b31af3_0
  - pip=20.1.1=py_1
  - python=3.7.6=cpython_h1fd5dd1_6
  - python_abi=3.7=1_cp37m
  - readline=8.0=hcfe32e1_0
  - setuptools=49.1.0=py37hc8dfbb8_0
  - sqlite=3.32.3=h93121df_0
  - tk=8.6.10=hbbe82c9_0
  - werkzeug=1.0.1=pyh9f0ad1d_0
  - wheel=0.34.2=py_1
  - xz=5.2.5=h0b31af3_0
  - zlib=1.2.11=h0b31af3_1006
  - pip:
    - numpy==1.19.0
    - pandas==1.0.5
    - python-dateutil==2.8.1
    - pytz==2020.1
    - six==1.15.0
prefix: /Users/jdouglass/workspace/phargogh/invest/env37-conda-demo
```

We should therefore be able to:
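For example, recreating the environment from the export (a sketch; it assumes the export above was saved to a file named `environment.yml`):

```shell
# Recreate the environment from the exported spec at a local prefix,
# then activate it (filename is illustrative).
conda env create -f environment.yml -p ./env37-conda-demo
conda activate ./env37-conda-demo
```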
Unresolved issues:
-
I can confirm that for 64-bit Windows builds, conda works for testing as currently implemented for the 64-bit Mac testing framework. I do NOT believe conda will work for 32-bit builds, as there haven't been 32-bit GDAL binaries in the gdal feedstock since around GDAL 2.2.4. To support 32-bit, we would still have to get GDAL from Gohlke.
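The availability claim above can be checked against the feedstock directly (a sketch; `conda search` accepts a `--platform` flag for querying another platform's packages):

```shell
# List which gdal builds conda-forge has published for 32-bit Windows
conda search -c conda-forge --platform win-32 gdal
```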
-
**Update**

Now that we've had a little while to let this sit, I thought it might make sense to revisit some of the things we've learned.

**Problem**

Always using the very latest versions of packages means that we're vulnerable to build failures whenever there's an unexpected issue with an upstream package update. It would be preferable to track our specific build dependencies in such a way that our CI services can reliably build and test against specific package versions. This would let us deliberately update packages when appropriate, and would also make it easier to determine which packages were used in a specific version of InVEST.

**Considerations**
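As a concrete anchor for these considerations, a fully pinned requirements file is the simplest form of this tracking (illustrative; the versions are taken from the pip section of the environment export earlier in the thread):

```
# requirements.txt -- illustrative exact pins
numpy==1.19.0
pandas==1.0.5
python-dateutil==2.8.1
pytz==2020.1
six==1.15.0
```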
Adding @dcdenu4 to this since he had expressed interest in looking into a solution. Did I miss anything from the above, @dcdenu4?
-
This issue came up again over in #635 (comment). @emlys has a good suggestion, specifically:

> Getting back to the age-old question of whether InVEST is a library or an application, I'm starting to lean more towards it being an application. And if that's the case, then we should be more prescriptive about what versions of packages can be installed.

Since part of the difficulty here is in the fact that our environment cannot be completely described by
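That difficulty is visible in the export earlier in the thread: a single environment mixes conda packages and pip packages, which one flat requirements file cannot express on its own. A minimal sketch of such a mixed spec (names and versions taken from that export):

```yaml
# Sketch: conda and pip dependencies in one environment spec
name: env37-conda-demo
channels:
  - conda-forge
dependencies:
  - python=3.7
  - flask=1.1.2
  - pip
  - pip:
    - pandas==1.0.5
```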
-
I think some of the solutions described above could be accomplished with a lockfile approach, such as https://conda.github.io/conda-lock/.

One recurring problem we have is that PyInstaller tends to be one step behind other libraries, since libraries do not ensure their own PyInstaller compatibility. So the typical pattern is, basically: we get the latest SciPy immediately after it is released, but a compatible PyInstaller release comes weeks later. If we used a lockfile, our CI would not install the latest available packages. But I could imagine some automation that periodically updates all packages and runs our full build and test workflows. Similar to our automated release process, it could open a PR with the updated lockfile. If all tests pass, we merge it; if they don't, we address the compatibility issues. I assume we can still have an
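The lockfile workflow itself might look something like this (a sketch of the conda-lock CLI; the file names are illustrative):

```shell
# Generate a multi-platform lockfile from a source environment spec
conda-lock -f environment.yml -p linux-64 -p osx-64 -p win-64

# In CI: create the environment from the lockfile instead of re-resolving
conda-lock install -p ./env conda-lock.yml
```

The point of the lockfile is that CI installs exactly what was locked, so an upstream release between the lock and the build cannot change what gets installed.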
-
We've had a couple of issues just in the past few months where a package that InVEST depends on was built in some faulty way. Recent examples of this include:

- `rtree`: builds include the `libspatialindex` DLL, but fail to handle the import path correctly. (Released on pypi.org.)
- `shapely` had a faulty build (we never did figure out exactly what went wrong).

The only way we could (sometimes) work around these issues was to explicitly forbid a specific package version in `requirements.txt`, which is like using a sledgehammer on a nail: functional, but crude. When a package is released on pypi.org, it's considered the single source of truth and it cannot be updated. This is not the case with Gohlke's packages, where he does sometimes rebuild a package and update it on his index. Only conda actually allows specific builds to be published and referred to for each package version (e.g. `flask=1.1.2=pyh9f0ad1d_0`, version plus build string).

So, can we use conda to produce more reliable builds? Here's one process that might work: whenever `requirements.txt` changes, we'll run a script to re-generate these derived files from `requirements.txt`. When these file changes go through the PR process, we'll verify that tests still work.

This pinning process would allow us to only update packages in our build/test environments when we actually need to, and would also allow us to verify that the build process works reliably with the packages specified.
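The `requirements.txt` "sledgehammer" exclusion boils down to a simple version test. A minimal sketch in plain Python (hypothetical helpers; real installers implement full PEP 440 semantics, including pre-releases and build metadata):

```python
def parse_version(v):
    # Split a dotted version string into a tuple of ints: "1.7.0" -> (1, 7, 0).
    return tuple(int(part) for part in v.split("."))


def allowed(installed, forbidden):
    # True unless the installed version exactly matches the forbidden one --
    # the "package!=X.Y.Z" pattern from requirements.txt, simplified.
    return parse_version(installed) != parse_version(forbidden)


print(allowed("1.7.1", "1.7.0"))  # True: any other version is fine
print(allowed("1.7.0", "1.7.0"))  # False: the broken release is excluded
```

Note what this cannot express: it forbids a *version*, not a *build* of that version, which is exactly why the conda version-plus-build-string pinning described above is the finer-grained tool.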