I work in tech and I’m constantly finding solutions to problems, often on other people’s tech blogs, that make me think “I should write that down somewhere”. Well, I want to actually start doing that, but I don’t want to pay someone else to host it.
I have a Synology NAS, a sweet domain name, and familiarity with both Docker and Cloudflare tunnels. Would I be opening myself up to a world of hurt if I hosted a publicly available website on my NAS using [insert simple blogging platform], in a Docker container and behind some sort of Cloudflare protection?
In theory that’s enough layers of protection and isolation, but I don’t know enough about it not to be paranoid about everything getting popped and giving an attacker access to the wider NAS.
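For concreteness, this is roughly the shape of what I have in mind. Ghost is just a stand-in for whatever blogging platform I end up picking, and the tunnel token would come from Cloudflare’s Zero Trust dashboard, so treat this as a sketch rather than a tested config:

```yaml
# docker-compose.yml (sketch only)
services:
  blog:
    image: ghost:5                    # placeholder blogging platform
    restart: unless-stopped
    environment:
      url: https://blog.example.com   # my actual domain would go here
    volumes:
      - ./blog-content:/var/lib/ghost/content
    # deliberately no "ports:" section; nothing is exposed on the NAS itself

  cloudflared:
    image: cloudflare/cloudflared:latest
    restart: unless-stopped
    command: tunnel --no-autoupdate run --token ${CLOUDFLARE_TUNNEL_TOKEN}
    # the tunnel's public hostname maps to http://blog:2368 inside the compose network
```

The appeal, if I understand the tunnel model correctly, is that cloudflared only makes outbound connections to Cloudflare, so I wouldn’t be forwarding any ports at the router at all.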
Update: Thanks for the replies, everyone, they’ve been really helpful and somewhat reassuring. I think I’m going to have a look at GitHub Pages and Cloudflare Pages as my first port of call for my needs.
You’ll be fine enough as long as you enable MFA on your NAS, and ideally configure it so that anything “fun”, like administrative controls or remote access, is only available on the local network.
Synology has sensible defaults for security, for the most part. Make sure you have automated updates enabled, even for minor updates, and ensure it’s configured to block multiple failed login attempts.
You’re probably not going to get hackerman poking at your stuff, but you will get bots trying to SSH in and log in to the WordPress admin console, even if you’re not using WordPress.
A good rule of thumb for securing computers is to minimize access/privilege/connectivity.
Lock everything down as far as you can, turn off everything that makes it possible to access it, and enable every tool for keeping people out or dissuading attackers.
Now you can enable port 443 on your NAS to be publicly available, and only that port, because you don’t need anything else.
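Synology’s firewall lives in the DSM GUI (under Control Panel > Security, last I checked), so there’s nothing to paste in, but the rule set you’re aiming for amounts to this, written here in generic Linux terms purely as an illustration:

```sh
# Illustration only: allow HTTPS in, drop everything else by default
iptables -A INPUT -i lo -j ACCEPT                                       # keep loopback working
iptables -A INPUT -m conntrack --ctstate ESTABLISHED,RELATED -j ACCEPT  # replies to traffic the NAS started
iptables -A INPUT -p tcp --dport 443 -j ACCEPT                          # the one thing you're exposing
iptables -P INPUT DROP                                                  # default-deny the rest
```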
Likewise, you can configure your router to forward only port 443 to your NAS.
It feels silly to say, but sometimes people think “my firewall is getting in the way, I’ll turn it off”, or “this one user needs read access to one file, so I’ll give every user on the system read/write/execute privileges to this folder and every subfolder”.
So as long as you’re basically sensible and use the tools available, you should be fine.
You’ll still poop a little the first time you see that 800 bots tried to break in. Just remember that they’re already doing that now; there’s just nothing listening to write down that they tried.
However, the person who suggested putting Cloudflare in front of GitHub Pages and using something like Hugo is a great example of “opening as few holes as possible” and “using the tools available”.
It’s what I do for my static sites, like my recipes and stuff.
You can get a GitHub Action configured that’ll build the site and deploy it whenever you push a commit, which is nice.
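The workflow itself is only a handful of lines. Something like this is the general shape; I’m assuming Hugo and the community peaceiris/actions-hugo setup action here, so check the current GitHub Pages docs rather than trusting it verbatim:

```yaml
# .github/workflows/deploy.yml -- rough shape of a Hugo to GitHub Pages deploy
name: Deploy blog
on:
  push:
    branches: [main]

permissions:
  contents: read
  pages: write      # needed to publish to GitHub Pages
  id-token: write   # needed by the Pages deploy action

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: peaceiris/actions-hugo@v3   # community action, pin a version you trust
        with:
          hugo-version: 'latest'
      - run: hugo --minify                # builds the site into ./public
      - uses: actions/upload-pages-artifact@v3
        with:
          path: ./public

  deploy:
    needs: build
    runs-on: ubuntu-latest
    environment:
      name: github-pages
    steps:
      - uses: actions/deploy-pages@v4
```

You also have to flip the repo’s Pages settings to “GitHub Actions” as the source, but after that it’s push-and-forget.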