Slack trains machine-learning models on user messages, files, and other content without explicit permission. The training is opt-out, meaning your private data will be leeched by default.
It’s a safe bet that if you’ve put something on the internet, it’s been scraped by a bot for training by now. I don’t like that, for the record; I’m just saying I’m not surprised at this point. Companies are morally bankrupt.
I don’t know why everyone is suddenly shocked — scraper bots have been collecting text for many years now, LONG before LLMs came onto the scene.
I agree, but it’s one thing if I post to public places like Lemmy or Reddit and it gets scraped.
It’s another thing if my private DMs or private channels are being scraped and put into a database that will most likely be outsourced for data prep before training.
Not only that, but the trained model will have internalized information that should give any cybersecurity expert anxiety. If users know how to manipulate the model, they could get it to divulge some of that information.