
I've found it really useful on embedded systems (e.g. Raspberry Pi). It takes a long time to install anything complicated, and you occasionally need to make kludges. Having a Dockerfile means (a) I can remember what I did to make everything work, and in what order (b) using a hub, I can easily duplicate the environment to another Pi without waiting overnight for all the applications to build.

Yes, you can clone the SD card, but I think it's cleaner to use a version-controlled Dockerfile. Otherwise you always need some master SD card to clone from (and keep track of a multi-GB image file), and you have to faff with resizing images if the new card is smaller.
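To make the idea concrete, a minimal version-controlled Dockerfile for a Pi might look something like this sketch; the base image tag and package list are illustrative assumptions, not the actual setup described above:

```dockerfile
# Hypothetical example: Debian-based image for a 32-bit Raspberry Pi
FROM arm32v7/debian:bookworm

# System dependencies live in the image, so the "kludges" are recorded
# in version control rather than on a master SD card
RUN apt-get update && apt-get install -y --no-install-recommends \
        python3 python3-pip python3-opencv \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /app
CMD ["python3", "main.py"]
```

Pushing the built image to a registry is what lets a second Pi pull it in minutes instead of rebuilding overnight.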

A fresh system install is then: flash Raspbian, update the system, set up some init scripts, install Docker, pull the image, and clone the latest version of the code from GitHub.
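As commands, that sequence might look roughly like the following; the image and repository names are placeholders I've invented, and the Docker convenience script is one common install route, not necessarily the one used here:

```shell
# Hypothetical fresh-install sequence for a new Pi
sudo apt-get update && sudo apt-get -y full-upgrade

# Install Docker via the official convenience script
curl -fsSL https://get.docker.com | sudo sh

# Pull the prebuilt environment instead of building overnight on the Pi
docker pull example-hub-user/drone-env:latest

# Fetch the latest application code
git clone https://github.com/example-user/drone-code.git
```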

This is also the approach embraced by Balena, who (conveniently) provide Docker base images for a bunch of common embedded systems.

Another reasonably big user-base is machine learning.



What stuff are you putting on your Pi through docker? Just getting started and interested to hear how others leverage it.


It's for a drone-based system. Python, OpenCV, and ROS are the main parts, plus some machine vision camera SDKs. I've also put in some optimised machine learning libraries which are a bit finicky to set up. ROS is an absolute mule to get right and I like having it in a closed-off place so it can't mess around with the rest of the system. I have the actual ROS workspace in a shared volume so it's persistent if I fiddle with the Dockerfile.

None of it really requires docker, but it's nice to have the whole environment encapsulated, and having a record of what I had to do to get some of these things to install is invaluable.

I have a shell script which launches the container (I just run a new one) every time the Pi boots.
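A boot script along these lines might look like the sketch below; the container name, image name, and volume paths are my assumptions, not the commenter's actual setup:

```shell
#!/bin/sh
# Hypothetical launcher, run at boot (e.g. from a systemd unit or /etc/rc.local)

# Remove any stale container left over from the previous boot,
# since a fresh one is started each time
docker rm -f drone-app 2>/dev/null || true

# The bind mount keeps the ROS workspace outside the container,
# so it persists across image rebuilds
docker run -d \
    --name drone-app \
    -v /home/pi/ros_ws:/root/ros_ws \
    example-hub-user/drone-env:latest
```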



