How does it handle Python modules? #2

Open
prometherion opened this issue Nov 5, 2018 · 3 comments

prometherion commented Nov 5, 2018

Nice, this could be useful because it creates an abstraction layer that avoids messing with the local host's Python versions and so on.

However, how can I avoid losing Python modules (for example, docker-py for handling Compose objects)?
I don't think adding them at build time would be useful, since it could create a mess of modules... maybe a daemon approach, like CoreOS does with a systemd unit, along with named volumes to preserve state across restarts?

If it's designed just for CI, I would put them in the Dockerfile so all deps are shipped... but locally I would like something customizable, wdyt? :)

BTW, great job.

gadiener added the enhancement label Nov 5, 2018
gadiener self-assigned this Nov 5, 2018

gadiener (Owner) commented Nov 5, 2018

Hi @prometherion,

Thanks for your comment, I honestly hadn't thought about it.

A solution certainly needs to be found; adding the dependencies at build time may be inconvenient, since it requires extending the image. However, I would rule out the daemon approach, I don't think it's necessary for this kind of use.

Maybe the best option is to use a volume plus an ENV variable, and have the entrypoint install the missing dependencies at runtime.

What do you think about that?
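A minimal sketch of that idea, assuming a custom entrypoint and a REQUIREMENTS_FILE variable (both are illustrative names, nothing in the image defines them today):

```sh
#!/bin/sh
# Hypothetical sketch of the proposed entrypoint: if a requirements file has
# been mounted into the container and REQUIREMENTS_FILE points at it, install
# the extra pip dependencies before handing control to the requested command.
set -e

if [ -n "${REQUIREMENTS_FILE:-}" ] && [ -f "$REQUIREMENTS_FILE" ]; then
    pip install --no-cache-dir -r "$REQUIREMENTS_FILE"
fi

exec "$@"
```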

prometherion (Author) commented

Well, it really depends on the use case you want to address.
For CI (as mentioned in the repo's tagline), installing requirements at build time is the best and most reasonable solution, since you need a solid, immutable and reproducible output.
For a local environment, things can be trickier: a named volume could be one solution, perhaps together with a requirements.txt file injected as a volume/config (even though configs are only available in Swarm mode) that installs the pip requirements, or whatever else, at runtime.
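For the CI case, a minimal sketch of baking the dependencies in at build time could look like the following (the base image tag and file names are placeholders, not taken from this repo):

```sh
# Hypothetical CI sketch: bake the pip dependencies into a derived image so
# the output is immutable and reproducible. The FROM tag is a placeholder.
cat > Dockerfile.ci <<'EOF'
FROM gadiener/ansible:latest
COPY requirements.txt /tmp/requirements.txt
RUN pip install --no-cache-dir -r /tmp/requirements.txt
EOF

docker build -f Dockerfile.ci -t my-ci/ansible-deps:latest .
```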

TBH I hate having to deal with Ansible in local environments, even if you can use virtualenv: that's the reason I'm interested in your project :)

gadiener (Owner) commented Nov 8, 2018

Yes, I understand. I can't figure out a case where you would need the Python dependencies on the local control machine with Ansible. For example, if you need docker-py, you will almost always need to install it on the remote machine to use the Ansible task that requires it.

When do you need to install these dependencies locally? Perhaps to develop Ansible modules?

If I could find a use case, the volume solution with the requirements.txt file would surely be the one I'd choose.
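Assuming the runtime-install entrypoint sketched above, local usage could then look roughly like this (image name and paths are placeholders):

```sh
# Hypothetical local usage of the volume + ENV approach: mount requirements.txt
# read-only and point the entrypoint at it so deps are installed on start.
docker run --rm \
  -v "$(pwd)/requirements.txt:/tmp/requirements.txt:ro" \
  -e REQUIREMENTS_FILE=/tmp/requirements.txt \
  -v "$(pwd):/playbooks" \
  gadiener/ansible:latest \
  ansible-playbook /playbooks/site.yml
```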

TBH I know we hate the same things :)
