Nice, this could be useful because it creates an abstraction layer that prevents messing with the local host's Python versions and so on.
However, how can I avoid losing Python modules (for example, docker-py for handling Compose objects)?
I think adding them at the build phase might not be useful, since it can create a mess of modules... maybe a daemon approach, like CoreOS with a systemd unit, along with named volumes to preserve state across restarts?
If it's designed just for CI, I'd put them in the Dockerfile so all deps are shipped... but locally I'd like something customizable, wdyt? :)
BTW great job.
Thanks for your comment; I honestly hadn't thought about it.
A solution must surely be found. Adding dependencies at build time may be inconvenient, since it requires extending the image. However, I would reject the daemon approach; I don't think it's necessary for this kind of use.
Maybe the best option is to use a volume plus an ENV variable, and edit the entrypoint to install the missing dependencies at runtime.
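Something like this rough entrypoint sketch (the PIP_REQUIREMENTS variable and the /requirements.txt mount point are just hypothetical names, not anything this repo ships):

```sh
#!/bin/sh
# Hypothetical entrypoint sketch: install extra pip deps at runtime if requested.
# PIP_REQUIREMENTS and the /requirements.txt mount point are made-up names.
set -e

# Install from a mounted requirements file, if one is present
if [ -f /requirements.txt ]; then
    pip install -r /requirements.txt
fi

# Or install packages listed in an ENV variable,
# e.g. docker run -e PIP_REQUIREMENTS="docker-py" ...
if [ -n "$PIP_REQUIREMENTS" ]; then
    pip install $PIP_REQUIREMENTS
fi

exec "$@"
```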
Well, it really depends on the use case you want to address.
For CI (as mentioned in the repo's tagline), installing requirements at build time is the most reasonable solution, since you need a solid, immutable, and reproducible image.
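For instance, a minimal Dockerfile sketch of that approach (the base image and file names are placeholders, not this repo's actual layout):

```dockerfile
# Hypothetical CI sketch: bake all Python deps into the image at build time,
# so the resulting image is immutable and reproducible.
FROM python:3-alpine

# requirements.txt would list ansible plus extras such as docker-py
COPY requirements.txt /tmp/requirements.txt
RUN pip install --no-cache-dir -r /tmp/requirements.txt

ENTRYPOINT ["ansible-playbook"]
```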
For a local environment, things can be tricky: a named volume could be a solution, maybe combined with a requirements.txt file injected as a volume, or as a config (even though configs are only available in Swarm mode), that installs pip requirements or whatever at runtime.
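As a rough usage sketch, assuming an entrypoint like the one above (the image name, volume name, and site-packages path are guesses and depend on the actual image):

```sh
# Hypothetical local run: keep installed packages in a named volume so they
# survive restarts, and inject a local requirements.txt for the entrypoint.
# Adjust the site-packages path to the image's actual Python version.
docker run --rm -it \
    -v ansible-deps:/usr/local/lib/python3.7/site-packages \
    -v "$PWD/requirements.txt":/requirements.txt:ro \
    my-ansible-image ansible-playbook site.yml
```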
TBH I hate having to deal with Ansible in local environments, even though you can use virtualenv: that's why I'm interested in your project :)
Yes, I see. I can't figure out a case where you'd need Python dependencies on the local control machine with Ansible. For example, if you need docker-py, you'll almost always need to install it on the remote machine for the Ansible task that requires it.
When do you need to install these dependencies locally? Perhaps to develop Ansible modules?
If I could find a use case, the volume solution with the requirements.txt file would surely be the one I'd choose.