Hi, once in a while I try to clean up my tabs. The first thing I do is use “merge all windows” to put all tabs into one window.
This often causes a memory clog, and Firefox gets stuck in that state for 10-20 minutes.
I have recorded one such instance.
I have tried using the “discard all tabs” add-on; unfortunately, it also gets frozen by the memory clog.
Sometimes I will just reboot my PC as that is faster.
Unfortunately, killing Firefox this way does not save the new tab arrangement, so when I start Firefox again it has 20+ windows open, which I merge again, and then it clogs again!
So far the only solution I have found is to just wait out the 20 minutes.
Once the “memory clog” has passed, it runs just fine.
I would like better control over tab discarding, and maybe some way of limiting bloat. For instance, I would rather keep a lower number of undiscarded YouTube tabs, as they seem to be insanely bloated.
For most other websites, I would like the contents to never be discarded.
In my ideal world, tabs would be frozen and saved to disk permanently, rather than the browser assuming that discarded tabs can always be reloaded, as if websites were going to exist forever and discarding a tab were just like clearing a cache.
I’m really curious about the workflow you have that needs that many tabs. How do the History and Bookmarks functions fall short of what you need?
It’s easier to use Google than the bookmarks manager, which can’t even search the text inside the pages. I do often dump all those thousands of tabs into a bookmarks folder, but I have never once gone back into that enormous pile to fetch something, since it would take hours to find anything again. I have no use for the history either: a gigantic, alphabetically ordered list of everything I have seen in the last 7 days. Again, it’s easier to just use Google.
The one thing that is better and faster than Google is not closing the tabs that may contain the stuff I need.
Of course, it’s not really possible to search the text body of open tabs, unless you search them one by one.
But I’m going to ask for only one computing miracle at a time!
What I’d recommend, given that you seem set on not changing your workflow, is to locally download the pages you have open with httrack, wget or a similar application. That would let you search all your tabs and their contents very quickly without Google, and the pages would load faster because they would not need to be re-downloaded, which, if I understand correctly, is what Firefox is trying to do at some level.
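As a rough sketch of what I mean (it assumes you have exported your open-tab URLs to a plain urls.txt, one per line, which Firefox won’t do for you by itself, and it just uses Python’s urllib instead of calling httrack or wget, which is enough to show the idea):

```python
# Rough sketch: download each open-tab URL once, then search the saved copies.
# "urls.txt" and the "archive" folder are assumptions for this example,
# not something Firefox produces on its own.
import pathlib
import re
import urllib.request

URL_LIST = pathlib.Path("urls.txt")    # one URL per line, exported by hand
ARCHIVE = pathlib.Path("archive")      # where the local copies end up


def save_pages() -> None:
    """Fetch every URL in the list and store the raw HTML on disk."""
    ARCHIVE.mkdir(exist_ok=True)
    for i, url in enumerate(URL_LIST.read_text().splitlines()):
        url = url.strip()
        if not url:
            continue
        try:
            with urllib.request.urlopen(url, timeout=30) as resp:
                (ARCHIVE / f"page_{i:05d}.html").write_bytes(resp.read())
        except OSError as err:
            print(f"skipped {url}: {err}")


def search_pages(term: str) -> list[pathlib.Path]:
    """Naive case-insensitive full-text search over the saved files."""
    pattern = re.compile(re.escape(term), re.IGNORECASE)
    return [p for p in sorted(ARCHIVE.glob("*.html"))
            if pattern.search(p.read_text(errors="ignore"))]


if __name__ == "__main__":
    save_pages()
    for hit in search_pages("whatever you are looking for"):
        print(hit)
```

httrack or wget would give you much nicer mirrors (images, CSS, linked pages); the point is just that once the files are on disk, searching them locally is trivial.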
Thanks, I didn’t know that one.
I have been experimenting with a transparent proxy like Squid, or something like ArchiveBox, to create static pages on the fly and load those instead.
But so far I haven’t made anything seamless and pleasant to use. It would have to be at least as low-friction as using Google.
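The kind of glue I have been playing with looks roughly like this (it assumes ArchiveBox is installed and on PATH, and that the open-tab URLs are already exported to a urls.txt; that export step is exactly the part that isn’t seamless yet):

```python
# Rough sketch: feed exported tab URLs into a local ArchiveBox collection
# so each page gets a static snapshot. "tab-archive" and "urls.txt" are
# assumptions for this example; neither comes from Firefox itself.
import pathlib
import subprocess

COLLECTION = pathlib.Path("tab-archive")   # ArchiveBox data directory
URL_LIST = pathlib.Path("urls.txt")        # one URL per line, exported by hand

COLLECTION.mkdir(exist_ok=True)
if not any(COLLECTION.iterdir()):
    # First run only: turn the empty folder into an ArchiveBox collection.
    subprocess.run(["archivebox", "init"], cwd=COLLECTION, check=True)

# Hand the whole URL list to ArchiveBox on stdin so it snapshots each page.
subprocess.run(["archivebox", "add"], cwd=COLLECTION,
               input=URL_LIST.read_text(), text=True, check=True)
```

What is still missing is the transparent “load the local copy instead of the live site” part, which is where it stops being low friction.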
I am going to try using Mixtral 8x7B to perform natural-language search over my archives and pull tabs from the collection of all pages I have ever seen. But that’s still a long way away from being operational!
…has Google still been giving you the same results recently? That is an extremely weak link in your setup, to me. If you insist on using search, you’d be better off looking at a locally run search engine like peARs, or something similar, over locally downloaded and indexed files, and it’ll be waaaay more reliable than an LLM here.
Google is giving me increasingly poor results; I am looking into deploying SearXNG locally.
I really would like to operate my own local crawler and sorting algorithm.
I will check out the peARs you mentioned!
If you need offline versions of the sites, you can save them with SingleFileZ.