Hacker News

Containerizing Python applications can really simplify things in that regard.


No, the problem is just moved inside the container.


In other words, it becomes the concern of the person shipping the code, rather than the concern of the person trying to run the code. That's exactly how it should be.


In other languages it's a problem for neither, which I think is the parent's point.


Still, the person trying to run the code has to set up Docker.

And if you're on Windows, your Docker host is in a virtual machine, so networking and volumes are not so simple anymore.

Replacing one kind of complexity by another is not a solution, it is a trade-off.


Are people still suffering through hosting Docker containers on Windows? Why would anyone do that at this point other than to comply with outdated, arbitrary IT policies?


Just an example of a platform-specific issue, even with Docker.


The problem with this is that someone else can't even run your program outside of a Docker container anymore. That doesn't seem ideal.


They also can’t run any program without a computer and an OS; there are some basic prerequisites to running software. Having a Docker/container host has become one of those prerequisites for many applications, but it actually reduces the headache of numerous other traditional prerequisites.


I don't want to have to run a simple Python program in a container for quick and simple development or testing. That's a failure of engineering discipline. By all means, do provide a Docker container and do use containers for actual deployments, but also make it easy for me to just use, say, pip-tools or whatever else your organization has standardized on for Python. If we're talking about something with complex C or C++ dependencies that's quite different. If it's just a few pip dependencies and there's no way for me to just run it reliably outside of a container, though, that's a result of not following best practices.


Agreed, I typically include a README as well as a requirements.txt so one can easily 'pip install -r requirements.txt' and then 'python app.py' to run simple apps without a bunch of rigamarole.
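For a simple app, that no-rigamarole path might look like the following (a sketch only; the package names in requirements.txt are illustrative, not from this thread):

```shell
# requirements.txt might contain pinned deps such as:
#   flask==3.0.0
#   requests==2.31.0
python -m venv .venv
. .venv/bin/activate          # .venv\Scripts\activate on Windows
pip install -r requirements.txt
python app.py
```

The virtualenv keeps the install isolated from the system Python, which covers most of what the container was doing for a pure-Python app.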


I probably misunderstood you -- apologies. I think we're 100% in agreement.


I do use Docker constantly in my job and projects, yes. Yet I do not believe, nor would I advocate, that it gets rid of the complexity.

Depending on the user's needs, your dockerized application will run on a different base distro. Alpine and musl for a small OS footprint? Or Debian (or debian-slim) for glibc compatibility?

Those concerns are the same with or without Docker. Docker makes things easy, just not those things, because that is not its purpose.


I typically specify these things in the Dockerfile - if the end user wants to modify the Dockerfile because they prefer Alpine over Debian... they've now taken responsibility for maintaining their customized Dockerfile and ensuring that everything runs as expected. This doesn't seem like something that would be encountered with any frequency in my experience, and you would technically have the same problem with or without Docker in the mix.
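A sketch of what pinning those choices in the Dockerfile might look like (base image tag and app layout here are assumptions, not taken from the thread):

```dockerfile
# Debian-slim base for glibc compatibility; an end user preferring a
# smaller footprint could swap in python:3.12-alpine (musl), but then
# validating the result is on them.
FROM python:3.12-slim

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "app.py"]
```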


In the professional world, your end user is either someone without the skills to write a Dockerfile, or another team whose responsibility does not include integrating your work.

The packager of an application is part of the project's team. It's not up to the user to package your application.


Docker with pip-tools is a great combination; you get deterministic builds easily.
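A minimal sketch of that workflow, assuming pip-tools is installed (file contents and image name are illustrative):

```shell
# requirements.in holds loose, top-level deps, e.g.:
#   flask
# pip-compile resolves the full dependency tree and pins it with hashes:
pip-compile --generate-hashes requirements.in   # writes requirements.txt
# The Dockerfile then installs exactly those pins, e.g.:
#   RUN pip install --require-hashes -r requirements.txt
docker build -t myapp .
```

Because the compiled requirements.txt pins every transitive dependency (and its hash), rebuilding the image later produces the same dependency set.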



