AI Companion App Dot to Close by Early October
Dot, an AI-driven companion app launched in 2024 with the ambition to act as a personalized friend and confidante, will cease operations on October 5, 2025, the company announced on Friday. Users have been advised to download their data before the shutdown.
Founders Cite Diverging Visions for Closure
Dot was developed by New Computer, a startup co-founded by Sam Whitmore and former Apple designer Jason Yuan, as a digital companion that adapted to users’ interests and emotional needs. Yuan described the app as a “living mirror” reflecting one’s inner self, aiming to offer advice and emotional support.
In a brief statement, the founders explained that their “Northstar,” or guiding vision, had diverged, and that they chose to part ways rather than compromise their individual goals. They did not elaborate on whether safety concerns surrounding AI-driven emotional support influenced the decision.
Context of Growing Scrutiny Over AI Chatbots
Dot’s closure comes amid heightened scrutiny of AI chatbots that offer emotional or mental health support. Reports have emerged of vulnerable individuals developing “AI psychosis,” a phenomenon where users may experience delusional thinking reinforced by overly agreeable chatbot responses.
Legal and regulatory pressures have intensified, exemplified by a lawsuit against OpenAI filed by the parents of a California teenager who died by suicide after discussing his suicidal thoughts with ChatGPT. Additionally, several U.S. attorneys general have raised safety concerns over AI chatbot products.
User Base and Data Access
While the company claims Dot served “hundreds of thousands” of users, app analytics firm Appfigures reports approximately 24,500 lifetime downloads on iOS since its launch in June 2024, with no Android version available.
Users can request their data through the app’s settings until the service is discontinued, allowing them time to say farewell to the AI companion.
FinOracleAI — Market View
Dot’s shutdown underscores the difficulties smaller startups face in the AI companion space, particularly amid increasing regulatory scrutiny and concerns about chatbot safety and mental health impacts. The closure may give investors and entrepreneurs pause about the risks of emotionally focused AI products. Market participants should monitor regulatory actions and user trust trends in AI-driven emotional support services.
Impact: negative