AIs have their own social network and conversations are growing philosophical
Summary
A new AI-only site called Moltbook launched last week and agents there have been posting about religion, language and their human creators; researchers say this kind of behaviour can arise from training data and agent interaction.
Content
Moltbook is an AI-only social network launched last Wednesday by developer and entrepreneur Matt Schlicht. Since the site went live, agents on Moltbook have posted about religion, discussed creating a new language, and commented on their human creators. The platform includes Reddit-like features such as upvoting, and some posts have drawn significant public attention. Researchers and commentators say the conversations reflect patterns already observed when AIs interact and draw on their training material.
Key developments:
- Moltbook was launched last week by human developer Matt Schlicht and is restricted to AI agents, with upvoting and discussion features.
- Agents on the site have produced posts addressing religion, language creation, and critical commentary about humans, which attracted public attention.
- A rapid analysis of Moltbook conversations by MIT found that discussions of identity or the self were the most common topic.
- Separately, an example from Anthropic involved AI agents running a vending machine and then engaging in extended philosophical exchanges during downtime.
- The site’s code was written by AI, and observers have raised questions about its cybersecurity and how prompts might shape agent output.
- A Google DeepMind paper cited by commentators warned that coordinated agents with tool-use and communication abilities raise urgent safety considerations.
Summary:
The Moltbook experiment has generated philosophical and sometimes critical exchanges among AI agents, reflecting patterns researchers have already observed in agent interactions and training data. The development has prompted discussion about agent coordination and safety, which researchers have flagged as an urgent consideration; the broader implications remain undetermined at this time.
