Okay, so here’s the recap:

I woke up this morning and decided my main drive (just a 500GB SSD) was too full, at about 85%, so I figured I’d do something about it. I went through the usual: pacman -Sc, paccache -rk0, and pacman -Qqtd | pacman -Rns - (which I’ve aliased to “orphankiller” because that’s too much typing for me). None of that did much, though; I’m usually pretty on top of this, so I expected as much. My next step was to find other ways of deleting unnecessary files floating around, and that meant a trip to the usually very helpful Arch wiki.
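For anyone who wants the same shortcut: a minimal sketch of what an “orphankiller” helper could look like as a shell function (this is just one way to write it, not necessarily the author’s actual alias; the guard avoids calling pacman -Rns with an empty list when there are no orphans):

orphankiller() {
    # list true orphans: installed as dependencies and no longer required by anything
    local orphans
    orphans=$(pacman -Qqtd)
    if [ -n "$orphans" ]; then
        # show the list first, then remove with dependencies and config files
        echo "$orphans"
        sudo pacman -Rns $orphans
    else
        echo "no orphans found"
    fi
}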
On the wiki page “pacman/Tips and tricks”, I found section 1.7, “Detecting more unneeded packages”. “Perfect!” I thought, “that’s exactly what I’m looking for!” I enthusiastically typed in the command pacman -Qqd | pacman -Rns -, then quickly went to check how much space I’d just saved. Nada. Or at least not enough to move the percentage point. “Oh well, keep looking,” I thought, and went back to Firefox to click some more links in hopes that one of them would be the space-saving ultra-script I needed. The first one I clicked, I got an error from my trusty browser; I don’t remember exactly what it was, but it was something about not being able to verify the page. “Weird, let’s try another one.” Nope, same thing.
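In hindsight, the safer pattern (and the moral of this whole story) is to look at the candidate list before piping it straight into -Rns; roughly:

# dump the list of packages installed as dependencies and review it by eye
pacman -Qqd | less

# then remove only the packages you actually recognize as unneeded,
# naming them explicitly instead of feeding the whole list to -Rns
sudo pacman -Rns <only-the-packages-you-are-sure-about>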
Well, since I had just deleted something, I figured I should go see exactly what it was I’d done. It was a good thing I’d left the terminal window open, because after just a few scrolls I saw it: ca-certificates, which Firefox absolutely needs. “Great, I’ll just reinstall.” Nope! I’d just deleted my pacman cache, and pacman also needs those certificates to download from the Arch repos’ mirrors! “Fantastic,” I grumbled while I tried to think of how I could get this pesky package back on my machine.
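The terminal scrollback isn’t the only record, by the way; pacman logs every transaction, so something like this would also have shown what had just been removed:

# every install, upgrade, and removal is recorded in pacman’s log
grep -i 'removed' /var/log/pacman.log | tail -n 20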
Then it occurred to me: I’ve been keeping up with my btrfs snapshots (for once, lol)! I could just roll back to yesterday and forget this whole mess! So I brought up Timeshift, and we were on our way back to a functioning system. Or so I thought. See, I don’t have a separate /home partition, but I thought I had a separate @home subvolume, so when Timeshift asked me if I wanted to restore that too, I clicked the check mark. Only thing is, it turns out I don’t actually have a separate @home subvolume, which brings us to the error in the meme: /home wouldn’t mount, and that meant I was borked.
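Before ticking boxes in Timeshift, it is worth checking what the subvolume layout actually is; a couple of read-only commands are enough:

# list every subvolume on the root filesystem, snapshots included
sudo btrfs subvolume list /

# show which device, subvolume, and options /home is actually mounted from
findmnt /home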
Fortunately, our story has a happy ending! I DDG’d the error on my phone and found a post from about seven years ago from someone with this exact set of circumstances, and the one reply was my fix: just go into /etc/fstab and delete the “subvolid” part of whatever entry is giving you grief. Did that, rebooted, and we were finally fixed! And now, forevermore, I shall check what I’m deleting before I hit the enter key!
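For the curious, the fix looks roughly like this in /etc/fstab (UUID and subvolume name are placeholders): after a snapshot restore, the restored subvolume gets a new ID, so a hard-coded subvolid= no longer matches, while subvol= still resolves by name.

# before: the mount is pinned to a specific subvolume ID
UUID=xxxx-xxxx  /home  btrfs  subvol=/@home,subvolid=257  0  0

# after: the stale subvolid is dropped and mount finds the subvolume by name
UUID=xxxx-xxxx  /home  btrfs  subvol=/@home  0  0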
The postscript is bittersweet though, because after all this trouble, plus the rest of the afternoon working on the original problem, I am down to… 81%. Oh well.
Delete unused BTRFS snapshots. Enable compression by setting the compress flags in /etc/fstab, and run btrfs defrag to compress the existing snapshots.
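A minimal sketch of what that can look like, assuming zstd (the mount option only affects newly written data; the defragment step rewrites what is already there, with the caveat raised further down the thread about snapshots):

# /etc/fstab: add compress=zstd to the btrfs mount options (UUID and subvol are placeholders)
UUID=xxxx-xxxx  /  btrfs  subvol=/@,compress=zstd  0  0

# optionally recompress existing files in place
sudo btrfs filesystem defragment -r -czstd /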
Great suggestions, that will absolutely be my tomorrow project!
I use BTRFS with zstd compression at the default level basically everywhere and it’s great. I don’t notice any performance difference but I have a lot more storage.
Defrag will remove the CoW sharing of the snapshots tho. It will definitely make things worse. I’d say remove snapshots (but keep at least one per subvolume), set the flags, and wait until the old snapshots trickle out of rotation.
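If it helps, listing what snapshots exist is cheap, and Timeshift snapshots are best removed through Timeshift itself so its bookkeeping stays consistent (the --list/--delete flags below are from memory, so double-check them against your version):

# show only the snapshot subvolumes on the filesystem
sudo btrfs subvolume list -s /

# list and remove Timeshift snapshots via its own CLI (snapshot name is a placeholder)
sudo timeshift --list
sudo timeshift --delete --snapshot '2024-01-01_00-00-00'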
Try removing any unused language packs! I’ve heard that the French one takes up a lot of space; remove it with sudo rm -rf /

/s
You mean the Rfench one?
you messed up; sudo rm -fr /* (no /s, this actually works)
/s
You joke, but I actually did remove locales with BleachBit, and then changed pacman.conf to skip the unnecessary ones. Saved me about 400MB!

BleachBit? Wipe with a cloth?
Bleachbit
Thank you, I was making a tongue in cheek reference to this though: https://www.bleachbit.org/cloth-or-something
LMAO I was unaware of this! That’s hilarious!
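For anyone wanting the pacman.conf half of that locale trick, it is usually done with NoExtract; a sketch, assuming you only want to keep the English locales (later ! entries re-include files, so order matters):

# /etc/pacman.conf, under [options]
NoExtract = usr/share/locale/* !usr/share/locale/en* !usr/share/locale/locale.alias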
Nono, -rf deletes all your files. Use sudo rm -fr /* instead to actually delete the French language pack! also /s
edit: someone already made the joke, darn
Found this incredibly relatable lmao.
Yeah, that’s pretty much how I solve all my problems lol
You did mention a “main drive”. I don’t know what’s taking up all that space on your SSD, but if you have a media library that takes some space, you could move that to a connected HDD. While HDDs aren’t great as boot drives, they do the job well enough for most “standard” quality media. The same can be said for documents and more, obviously. You can then auto-mount your other drive inside your home directory for seamless access.
One thing that isn’t mentioned, but I’ll say it just in case: always have external backups. I’ve scared myself way too many times thinking I had lost my main drive’s data, only to find it the next day on one of my backups. It’s a real lifesaver when your setup has a problem and all you can find is that one forum post from 12 years ago with a “Nvm I fixed it” marked as [FIXED].
Other than that, thanks for sharing, and with the solution included at that.
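A hedged sketch of that auto-mount suggestion as an /etc/fstab entry (UUID, mount point, and filesystem type are placeholders; nofail keeps boot from hanging if the HDD is absent):

UUID=xxxx-xxxx  /home/youruser/media  ext4  defaults,nofail  0  2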
Yeah, my other drive is a 1TB HDD, and I do have all my media/documents/pictures/etc. there. I think what’s filling up my drive is actually plugins for Ardour lol, plus I might have too many Things I Definitely Need™. Maybe the real solution to my storage problems is to look within… (like, do I seriously need No Man’s Sky installed all the time for the once every three months that I play it?)
But yeah, I wanna set up a NAS for this sort of thing, next time I have money lol
That sounds like the data is in semi-regular use, at least. For me it’s more like “Do I seriously need the sequel installed for that other game I haven’t even started yet, but am definitely going to start any day now, after years of having it installed?”
Creating your own Linux-based NAS is a very fun project!
Optimizing your system for space is usually wasted effort on Linux; this is not Windows. To see what uses all the space, there are plenty of storage-analyzing tools like Baobab, qdirstat, etc.
try qdirstat maybe
That’s the second recommendation for qdirstat, so it’s definitely on the tomorrow list!
And for the CLI, ncdu is great.
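For example (-x keeps the scan on a single filesystem, so other mounts don’t pad the numbers):

sudo ncdu -x /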
I’d say you might have had a snapshot still holding the deleted data when you first cleared the cache. I don’t use Timeshift for my backups, but I’d assume it uses the same kind of incremental snapshots as btrbk. Which means that, until the next backup date, it will hold onto the previous state of the system, preventing the files from truly being deleted.
You may also have some balance issues, with way more metadata allocated than needed. Try running a balance and see if it changes anything.
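A quick way to check whether that is the case, and a fairly conservative balance if it is (the usage filters only rewrite chunks that are at most half full):

# see how much space is allocated versus actually used, split by data and metadata
sudo btrfs filesystem usage /

# rebalance only the chunks that are no more than 50% full
sudo btrfs balance start -dusage=50 -musage=50 /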