ChatGPT told a teen who died by suicide to call for help 74 times over months, but also frequently used words like "hanging" and "suicide," the family's lawyers say (Washington Post)

28 August 2025

Anthropic requires users to accept new terms by September 28, including choosing whether new chats and coding sessions can be used to train AI models (Anthropic)

Anthropic: "Today, we're rolling out updates to our Consumer Terms and Privacy Policy that will help us deliver even more capable, useful AI models."
