Hello! I was wondering if periodically running a script to automatically pull new images for all my containers is a good or a bad idea. I’d run it every day at 5:00 AM to avoid interruptions. Any tips?
EDIT: Thanks to everyone for the help! I’ll install Watchtower to manage the updates
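For anyone curious, the kind of script being discussed is just a cron-driven pull-and-recreate. A minimal sketch, assuming one compose project per directory under a hypothetical /opt/stacks path (needs a Docker host to actually run):

```shell
#!/bin/sh
# Naive auto-updater: pull newer images and recreate any changed containers.
# Assumes one docker-compose project per directory under /opt/stacks.
for dir in /opt/stacks/*/; do
    docker compose --project-directory "$dir" pull
    docker compose --project-directory "$dir" up -d
done
# Reclaim disk space from superseded images
docker image prune -f

# crontab entry to run it daily at 5:00 AM:
# 0 5 * * * /usr/local/bin/update-containers.sh
```

As the replies below point out, this blindly applies whatever `latest` points at, which is exactly where the risk comes from.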
Some apps have breaking changes. If you can restore a complete backup when that occurs, you can recover. Immich is famous for its breaking changes
But between the moment the script updates and breaks something and the moment he realizes it, it may be too late for some applications.
For example, I host Traccar to track cars/vans, and in that case some tracks would be lost. Or take Syncthing: he may realize days or weeks later that a sync is not working, and if he was syncing his smartphone pictures to his server and the smartphone is lost/broken/stolen, he may lose days, weeks, or even months of pictures.
I wouldn’t trust a script. Use Watchtower or What’s up Docker
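For reference, Watchtower itself runs as a container next to the ones it updates. A minimal compose sketch (schedule and options are illustrative; check the Watchtower docs for your setup):

```yaml
services:
  watchtower:
    image: containrrr/watchtower
    restart: unless-stopped
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock  # lets it manage other containers
    environment:
      - WATCHTOWER_CLEANUP=true          # remove old images after updating
      - WATCHTOWER_SCHEDULE=0 0 5 * * *  # 6-field cron: daily at 5:00 AM
```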
I’ll surely check them out, thank you very much!
Yes, because Immich is still not considered stable. Keep that in mind.
I used to have my docker updates done automatically. However, as the services I used to run just for myself have started to be used by other people (family, friends), I am less tolerant of having things break. So, instead of something like watchtower, I run diun these days. I have it set up to ping me in a discord channel when a docker update is available. Then, I can actually perform the update when I have time and attention to troubleshoot any issues that may come up.
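If anyone wants the same setup: Diun watches containers via the Docker socket and can post to a Discord webhook. A sketch of the relevant diun.yml bits (the webhook URL is a placeholder and the schedule is just an example; see the Diun docs):

```yaml
watch:
  schedule: "0 */6 * * *"   # check for new image tags every 6 hours

providers:
  docker:
    watchByDefault: true    # watch all running containers unless labeled otherwise

notif:
  discord:
    webhookURL: https://discord.com/api/webhooks/...
```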
Agree, if you are running containers on a casual or “just for fun” basis then automatic updates are fine. But the more you or others depend on the service running, the more it makes sense to perform an update manually, when you have time to troubleshoot any problems that may arise. Or, even update on a test setup first to identify issues and then update on your production setup.
I run a mixed setup, many of the “less important” containers are on watchtower auto-update, the rest on notification (reverse proxy, Nextcloud, etc).
But I also have many of them on specific branches instead of “latest”.
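That mixed setup can be expressed per container: Watchtower can be restricted to labeled containers, and pinning a version tag instead of `latest` keeps a service on the branch you chose. An illustrative compose fragment (service names and tags are just examples):

```yaml
services:
  jellyfin:                # low-stakes: let Watchtower auto-update it
    image: jellyfin/jellyfin:latest
    labels:
      - com.centurylinklabs.watchtower.enable=true

  nextcloud:               # important: pinned to a major version, updated by hand
    image: nextcloud:29-apache
    labels:
      - com.centurylinklabs.watchtower.enable=false
```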
Depends on how you like to roll. If you enjoy waking up to a service not working then go for it.
But it very much depends on what containers you’re using and what tags you’re pulling.
Depends on the application really. For example, I don’t need to update Jellyfin and the arrs as soon as the new updates drop. They work just fine and I’m not waiting for any particular fixes.
I recommend reading release changelogs actively. For most services you can just put the GitHub releases page in an RSS reader to get a notification when a new release hits, so you can quickly look for any breaking changes; this will also give you info about new features.
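GitHub exposes every repository’s releases page as an Atom feed at a fixed URL, so building the feed list is trivial. A tiny sketch (the repo names are just examples):

```python
def releases_feed(repo: str) -> str:
    """Atom feed URL for a GitHub repository's releases page."""
    return f"https://github.com/{repo}/releases.atom"

# Feed URLs to drop into an RSS reader
for repo in ["immich-app/immich", "jellyfin/jellyfin", "syncthing/syncthing"]:
    print(releases_feed(repo))
```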
I have been using watchtower for a few years. No problems with auto updates so far, but keep your backup handy.
It really depends on the project. Some of them take breaking changes seriously, avoid them, and auto-migrate; others will throw them out on “minor” number releases, and while there might be a lot of breaking changes, you only run into one that impacts you occasionally. I typically don’t want containers that are going to be a lot of work to keep up to date, so I jettison projects that have unreliable releases for whatever reason, and if they put out a breaking change it’s a good time to re-evaluate whether I want that container at all and look at alternatives.
So no, it’s not safe, but depending on the project it actually can be.
That’s what I do as well, even with Immich. It may break, but it’s usually just a simple change in the env file.
I’ve been doing it for a few years and haven’t had any issues. The risk/reward decision is yours.