Slack recently updated its terms and conditions to allow the feeding of private user data to AI models, a practice also seen in other companies like Figma.

Breakdown

In his post, 'you are being brain-drained', Michal examines this change and argues that moves like it raise serious concerns about the privacy and security of sensitive information, such as NDA-protected content, company secrets, and financial reports, that routinely flows through workplace tools.

Key Points

  • Slack and other companies are utilizing private user data, including sensitive information, to train AI models.

  • Opting out of this data sharing requires sending an email request to Slack, a cumbersome process compared with a simple in-app setting.

  • The practice of feeding private data to AI models raises ethical and privacy concerns, especially when it involves confidential information.

Highlights

"They did it quietly - almost in the middle of the night - by just changing their terms and conditions."

"Of course, the right way would be to Opt In instead of out and features like that should never be enabled by default."

Offering an opt-out option is a step in the right direction, but more transparent and user-friendly processes are needed to genuinely protect user data privacy and security.