Back in the day it was nice: apt-get update && apt-get upgrade and you were done.

But today every tool/service has its own way of being installed and updated:

  • docker:latest
  • docker:v1.2.3
  • custom script
  • git checkout v1.2.3
  • same but with custom migration commands afterwards
  • custom commands change from release to release
  • expects the update to be run as a specific user
  • requires updating the nginx config
  • updates its own default config, and the service depends on those config changes
  • expects new versions of other tools
  • etc.
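To make the pain concrete, here is a dry-run sketch of what one of those "git checkout plus migration" updates can look like. Everything in it (the service name, tag, paths, and manage.sh migrate command) is hypothetical; the function only echoes the steps instead of running them:

```shell
# Dry-run sketch of one "custom" update flow from the list above: a service
# deployed via git checkout that wants a specific user and a migration step.
# All names here are made up for illustration; nothing is executed, only echoed.
update_gitcheckout_service() {
  app=$1
  tag=$2
  echo "cd /opt/$app"
  echo "sudo -u $app git fetch --tags"
  echo "sudo -u $app git checkout $tag"
  echo "sudo -u $app ./manage.sh migrate   # release-specific step"
  echo "systemctl restart $app"
}

update_gitcheckout_service exampleapp v1.2.3
```

And that's just one service; the next one wants docker pull, the one after that a curl-piped install script.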

I self-host around 20 services like PieFed, Mastodon, PeerTube, Paperless-ngx, Immich, open-webui, Grafana, etc., and all of them have dependencies that need to be updated too.

And nowadays you can't really keep running an older version, especially when it's internet-facing.

So anyway, what are your strategies for keeping your sanity while keeping all your self-hosted services up to date?

    • BlackEco@lemmy.blackeco.com · 10 hours ago

      Yes, but when you use automerge you should usually have CI set up to make sure new versions don't break your software or deployment. How are you supposed to do that in a self-hosting environment?

      • tofu@lemmy.nocturnal.garden · 7 hours ago

        Ideally, you have at least two systems: test updates on the dev system and only then allow them into prod. So no automerge in prod in this case, or have it somehow check that dev worked first.

        Seeing which services are usually fine to update without intervention and tuning your Renovate config accordingly should be sufficient for a homelab imho.

        Given that most people are running :latest and just YOLO the updates with Watchtower, or don't automate at all, some granular control with Renovate is already a big improvement.
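        For example, a minimal renovate.json sketch of that kind of granular control, using Renovate's packageRules (the package names are just examples, not a recommendation):

        ```json
        {
          "$schema": "https://docs.renovatebot.com/renovate-schema.json",
          "extends": ["config:recommended"],
          "packageRules": [
            {
              "description": "Automerge minor/patch bumps for images that have been safe so far",
              "matchDatasources": ["docker"],
              "matchPackageNames": ["grafana/grafana", "ghcr.io/immich-app/immich-server"],
              "matchUpdateTypes": ["minor", "patch"],
              "automerge": true
            },
            {
              "description": "Never automerge major versions; review those by hand",
              "matchUpdateTypes": ["major"],
              "automerge": false
            }
          ]
        }
        ```

        Everything else just becomes a PR you look at when you have time.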