Slack’s Controversial AI Training Data Collection Policy
Slack, the popular workplace communication tool, has come under fire over its policy of collecting user data for AI model training. The policy, recently surfaced by Duckbill Group executive Corey Quinn, has raised concerns about privacy and data security.
Opt-Out Process
Organizations that wish to opt out of Slack’s AI training data collection must email Slack’s support team. The email must include the organization’s workspace URL and a subject line that reads “Slack Global model opt-out request.”
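The request described above can be sketched as a drafted email. This is a minimal illustration only: the recipient address and the sending mechanics are assumptions, so verify the current contact details against Slack's own Privacy Principles before sending anything.

```python
from email.message import EmailMessage

# Assumed recipient -- confirm the current address in Slack's
# published Privacy Principles before actually sending.
SLACK_OPTOUT_ADDRESS = "feedback@slack.com"


def build_optout_email(workspace_url: str, sender: str) -> EmailMessage:
    """Draft the opt-out request the policy describes: the exact
    required subject line plus the organization's workspace URL."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = SLACK_OPTOUT_ADDRESS
    # The subject line must match Slack's required wording verbatim.
    msg["Subject"] = "Slack Global model opt-out request"
    msg.set_content(
        "Please opt our organization out of global model training.\n"
        f"Workspace URL: {workspace_url}\n"
    )
    return msg


msg = build_optout_email("https://example-co.slack.com", "owner@example-co.com")
print(msg["Subject"])
```

The draft could then be handed to any standard mail transport (for example `smtplib.SMTP.send_message`); the point of the sketch is simply that the subject line is fixed wording and the workspace URL is the one required piece of body content.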
Data Used in Model Training
Slack collects a wide range of user data for its AI training models, including messages, content, files, and usage information. The company says data handled by its premium generative AI tools is excluded from this process.
Slack’s Privacy Policies
Slack’s privacy policies have been criticized as inconsistent and misleading. The company states that it “can’t access underlying content during model training,” yet the same policy permits training on user data by default, without explicit permission. Slack also claims that “Your data is your data. We don’t use it to train Slack AI.” This is misleading: Slack does use customer data to train its global models, and only its premium generative AI tools are excluded.
Slack’s AI Training Data Collection Policy: A Deeper Dive
Opt-Out Process
The opt-out process is straightforward but requires action at the organizational level rather than by individual users. The workspace owner must email Slack, include the workspace URL, and use the exact subject line “Slack Global model opt-out request.”
Data Used in Model Training
Slack’s global models are trained on messages, content, files, and other usage information from customer workspaces. Notably, data handled by the premium generative AI tools is excluded.
Slack’s Privacy Policies
Inconsistent Statements
Slack’s privacy documents contain conflicting statements. One asserts that Slack cannot access underlying content during model training, while another permits data use for training without explicit consent. The inconsistency makes it difficult for customers to know what actually happens to their data.
Marketing Claims
Slack’s marketing claim that “Your data is your data. We don’t use it to train Slack AI” is misleading. The company’s own privacy policy states that customer data is used to train its global models; only the premium Slack AI product is excluded.
Corey Quinn’s Discovery
Corey Quinn’s close reading of Slack’s Privacy Principles brought the issue to light. His public post on X sparked widespread discussion and raised awareness of the policy.
Slack’s Response
Slack defended its practices, saying the models are used to enhance platform functionality, such as improving search results. The company did not, however, address the misleading wording in its privacy policies.
Engadget’s Inquiry
Engadget reached out to Slack for comment but did not receive a response, further fueling speculation and concern.
Implications
Slack’s data collection practices have significant implications for user privacy. Employees have little control over how their data is used, and the company’s confusing privacy policies leave room for misinterpretation. Businesses may reconsider their use of Slack as a result, especially where data privacy is a paramount concern.
Conclusion
Slack’s AI training data collection policy has sparked a necessary conversation about data privacy. While the company’s intentions may be to improve its platform, the inconsistent statements in its privacy policies and the lack of transparency around data usage are troubling. Organizations should weigh these implications carefully before adopting or continuing to use Slack, and ensure their data privacy concerns are adequately addressed.