
DCVC DTOR 2025: Chatbots shouldn’t replace human connection

Chat-based generative AI models such as OpenAI’s GPT‑5, Google’s Gemini, and Anthropic’s Claude are changing the way we search for knowledge. And customized AI models based on similar machine-learning technology are giving researchers new ways to find long-hidden patterns in biology and many other fields. But there are certain things chatbots can’t and shouldn’t be used for. We’re not fans of — or investors in — companies touting generative models as replacements for humans in the creative arts, counseling, therapy, or clinical care.

The 2025 edition of the DCVC Deep Tech Opportunities Report, released in June, explains the global challenges we see as the most critical and the possible solutions we hope to advance through our investing. This is a condensed version of Section 2.2 of the report — in which we chronicle a “shiny object” we see as a distraction from more important work.

2024 was the year large AI models went multimodal. In December OpenAI released Sora, a paid ChatGPT feature that generates lifelike short videos based on text prompts like “In a pastel bathroom with a rubber ducky, an adorable dragon made entirely of shampoo bubbles.” Days later Google’s DeepMind rolled out a similar video-generation tool called Veo 2, and smaller companies like Pika and Luma AI offered their own variations. In essence, these tools make high-quality computer-generated graphics available to average users, in part by taking over many of the tasks traditionally performed by human visual effects artists.

Make no mistake, generative videos can be arrestingly realistic. Who doesn’t enjoy seeing Rockefeller Center overrun by golden retrievers? But what these eye-popping videos are usually missing is the spark of human emotion or inspiration. “AI can be an enabler,” says DCVC general partner Ali Tamaseb. “It can help a moviemaker create visual effects. But ultimately you still need the human to write the story and to direct the movie.” (After the 2023 Writers Guild of America strike, the big Hollywood studios agreed, signing a contract that says AI models aren’t writers and that nothing they create can be considered literary material.)

A similar point applies to AI models meant to function as stand-ins for psychologists, therapists, or physicians, such as Pi, Woebot, Wysa, Youper, talk2us, or even ChatGPT. Some of these models are trained on validated therapeutic techniques such as cognitive behavioral therapy or mindfulness meditation. And the chat screen itself can be a useful space for self-reflection, almost like a paper journal that talks back to you. But these models can be alarmingly sycophantic, and they obviously lack true empathy; the insights of a stochastic parrot are, by definition, canned. It should be no surprise that chatting with a bundle of weighted features and equations, even one with trillions of parameters, is less nourishing than connecting with a single human soul.

“I think what a patient wants in a therapeutic or medical encounter is the actual human being, not necessarily the conversation,” Tamaseb says. “When you go see the doctor, you get 20 percent better just thanks to the placebo effect. An LLM can offer much more content than your therapist or doctor might, but I don’t think it will ever replace the impact of having a real human talking back to you.”

That’s not to say that we don’t see a need for better treatments for depression and anxiety, conditions that, beyond their inherent misery, cost the world’s economies 12 billion lost workdays and $1 trillion in lost productivity every year. AI-based therapy offers a seemingly affordable and scalable way to help people learn better coping skills — but it’s a technology that’s still in its nascent stages, lacking clinical validation or even consistent reporting standards. In short, AI-based creative tools and AI-based therapy are fields that will no doubt attract venture-scale investments, but for now we plan to steer away, especially when new technologies purport to replace what’s deeply and essentially human. AI is extremely useful in high-touch professions as an adjunct to human experience and skill — but we hope it will never come to be seen as a valid substitute.
