Started getting this message when accessing Reddit. I use LibreWolf as a browser, which does indeed provide a more generic user agent to combat fingerprinting, but nothing out of the ordinary (Mozilla/5.0 (Windows NT 10.0; rv:109.0) Gecko/20100101 Firefox/119.0). Is anyone else experiencing this?
Edit: seems to have resolved itself. Thanks for confirming I wasn’t doing anything wrong. Let’s hope this isn’t some new algorithm that tests for insufficient fingerprinting so Reddit can kick out ad-resistant users.
I like the sentiment, but it is incredibly naive to think that there aren’t crawlers scraping every ounce of data they can from Lemmy. While Lemmy itself may not be collecting user data (depending on who hosts your home instance of choice), other valuable data can still be collected, particularly for training LLMs.
If you can access it, the data scrapers have already crawled it.
Especially considering the post from a little while ago showing that admins of Lemmy instances can see what individual users are upvoting and downvoting. There’s nothing to stop a bad actor from setting up an instance and just harvesting data.
Everyone can see that via kbin. For example, here’s your comment.
(not sure about downvotes… you might have to work a bit more to see those.)
What else can admin see?
T H E F U T U R E
Why use a scraper on Lemmy when bots can federate and have the data sent to them directly?
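For what it’s worth, Lemmy federates over ActivityPub, so a subscribed server receives posts and comments as JSON activities pushed to its inbox — no scraping required. A minimal sketch of pulling the interesting fields out of such a payload (the sample activity below is illustrative and simplified, not captured from a real instance):

```python
import json

# Illustrative ActivityPub "Create" activity, shaped like the ones a
# federating server would receive in its inbox (simplified; hypothetical
# instance and user names).
payload = json.dumps({
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Create",
    "actor": "https://example-instance.test/u/alice",
    "object": {
        "type": "Note",
        "content": "Hello fediverse",
        "published": "2023-07-01T12:00:00Z",
    },
})

def extract_note(raw: str) -> dict:
    """Pull the author and text out of a federated Create/Note activity."""
    activity = json.loads(raw)
    if activity.get("type") != "Create":
        raise ValueError("not a Create activity")
    obj = activity["object"]
    return {"author": activity["actor"], "content": obj["content"]}

print(extract_note(payload))
```

The point being: once a server is federated, this structured data arrives already parsed and attributed, which is far cheaper than crawling HTML.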
deleted by creator
That’s exactly what Helen’s lunch would say.
That’s why I just say dumb shit on here.