$ cd lemmy-dir
$ du -sh *
456K lemmy-ui
15G pictrs
4.3G postgres
Guys, this is no longer funny. I feel literally chased by the “no space left” message. Please help, I don’t need those pics, I didn’t upload them.
Have you posted this question on the lemmy_admin community over on lemmy.ml? Or possibly joined their Matrix chat as linked on their GitHub project? I suspect you will be able to get much more targeted support directly from the team or their community, rather than the selfhosted community, which is more general to all kinds of self-hosting.
Thanks a lot, I was looking for this exact kind of community. Posted there <3
Did you get any solution?
Haha, I’m literally on it right now. My instance crashed a couple of hours ago because of it, so I emptied ~/.rustup to buy some time, but idk how to go about it from here. LPP didn’t do anything. That seems really curious, does literally everyone use S3?

Okay, you may not like this, but I rented a 1TB storage box from Hetzner for 3 euros a month, just to get that foot off my neck. It’s omega cheap and mountable via CIFS, so life is good for now. I’m still interested in what I described in the OP, and I even started scribbling some Python, but I’m too scared of fucking anything up as of now.
The annoying part in writing that script was discovering that the filenames on disk don’t match the filenames in the URLs. E.g., given this URL: https://lemmy.org.il/pictrs/image/e6a0682b-d530-4ce8-9f9e-afa8e1b5f201.png, you’d expect that somewhere inside volumes/pictrs you’d find e6a0682b-d530-4ce8-9f9e-afa8e1b5f201.png, right…? Well, that’s not how it works: the filenames on disk have the exact same format, but they don’t match the ones in the URLs.

So my plan was to find non-local posts in the post table, check whether the thumbnail_url column starts with lemmy.org.il (assuming that means my instance cached the image), then locate the file on disk by downloading it via the URL and scanning the pictrs directory for files that match the exact size in bytes of the downloaded file. Once found, compare their checksums to be sure it’s the same one, then delete it and delete its post entry in the database. When I get close to 1TB I’ll get back here for this idea… :P
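In case anyone picks this up later, here’s a rough Python sketch of that plan. Everything about the environment is an assumption you’d have to adapt: the database DSN, the pictrs volume living at volumes/pictrs, and the post table having local and thumbnail_url columns (check your schema). It only prints candidate matches, it doesn’t delete anything.

#!/usr/bin/env python3
"""Sketch of the cleanup idea above: find non-local posts whose thumbnail_url
points at this instance, download each thumbnail, and look for a file in the
pictrs volume with the same size and checksum. Prints candidates only."""
import hashlib
from pathlib import Path

import psycopg2   # pip install psycopg2-binary
import requests   # pip install requests

INSTANCE = "https://lemmy.org.il"           # your instance's base URL
PICTRS_DIR = Path("volumes/pictrs")         # assumed location of the pictrs volume
DB_DSN = "dbname=lemmy user=lemmy password=changeme host=localhost"  # assumed credentials


def sha256_of(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def find_on_disk(data: bytes) -> Path | None:
    """Scan the pictrs volume for a file of the same size, then confirm by checksum."""
    want_size = len(data)
    want_hash = hashlib.sha256(data).hexdigest()
    for path in PICTRS_DIR.rglob("*"):
        if path.is_file() and path.stat().st_size == want_size:
            if sha256_of(path) == want_hash:
                return path
    return None


def main() -> None:
    conn = psycopg2.connect(DB_DSN)
    with conn, conn.cursor() as cur:
        # Non-local posts whose cached thumbnail lives on this instance
        # (the `local` column is an assumption; adjust to your schema).
        cur.execute(
            """
            SELECT id, thumbnail_url
            FROM post
            WHERE local = false
              AND thumbnail_url LIKE %s
            """,
            (INSTANCE + "/pictrs/image/%",),
        )
        rows = cur.fetchall()

    for post_id, url in rows:
        resp = requests.get(url, timeout=30)
        if resp.status_code != 200:
            continue
        match = find_on_disk(resp.content)
        if match:
            # A real cleanup would delete `match` and the post row here.
            print(f"post {post_id}: {url} -> {match}")


if __name__ == "__main__":
    main()

The actual deletion of the file and the post row is deliberately left out, for exactly the reason above: I don’t want to break anything until I’m sure the size-plus-checksum matching is reliable.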