
There has always been a certain kind of pressure in the offices along Menlo Park’s Sand Hill Road: the low hum of high stakes, of people building things with enormous consequences and enormous valuations without always knowing which matters more. But the clinicians who see those offices’ workers on their calendars report that something has changed in the nearby therapy rooms. The conversations have shifted. The anxieties are different. The old Silicon Valley worries, such as imposter syndrome, burnout, and the never-ending cycle of iteration, are still there, but they now coexist with something harder to name. Something closer to fear.
Menlo Park-based psychotherapist Candice Thompson has been struggling to put it into words. She dates the change in tone to the past three to six months, and about 80% of her patients work on or with artificial intelligence. The marker she keeps returning to is one she used to treat as a clinical red flag. It used to be, according to Thompson, that someone could say, “This is the end of the world,” and it was obviously psychosis.
| Category | Detail |
|---|---|
| Location of Crisis | San Francisco Bay Area / Silicon Valley — particularly Menlo Park and surrounding tech hubs |
| Key Therapists Cited | Candice Thompson (Menlo Park), Alex Oliver-Gans (San Francisco), Harvey Lieberman (New York) |
| Patient Composition | ~80% of Thompson’s patients work on or with AI; ~40% of Oliver-Gans’s patients work in AI |
| Crisis Timeline | Therapists date the surge to the past 3–6 months (late 2025 into 2026) |
| Pew Survey Finding (2025) | 52% of U.S. workers worried about AI’s impact on jobs; 32% believe it will lead to fewer jobs |
| APA Survey (July 2025) | 38% of workers fear AI will make some or all of their job duties obsolete |
| AI-Related Layoffs (2025) | ~55,000 U.S. layoffs attributed to AI; 1.2 million total job cuts in 2025 |
| MIT Study Finding | AI can already replace approximately 11% of the U.S. labor market |
| Bay Area Tech Layoffs (2025) | 35,000+ tech layoffs in the Bay Area alone (per Layoffs FYI) |
| Therapist Strike | 2,000+ Kaiser Permanente mental health workers in Northern California struck in March 2026 over AI automation concerns |
| Reference | CNBC |
She hears the same words now and thinks, “These aren’t hallucinations.” These are real concerns that deserve to be taken seriously. That distinction, between irrational catastrophizing and a reasonable response to genuinely uncertain circumstances, is what makes this mental health moment unique. The people in her office are not imagining threats. They are building the things that endanger them.
Therapists across the Bay Area increasingly describe, in similar terms, a crisis that revolves around this paradox. Alex Oliver-Gans, a San Francisco psychotherapist whose clientele is roughly 40% AI workers, points to the physical reality of what his clients do day to day: putting in 60 or 70 hours a week in settings designed specifically to contemplate catastrophic outcomes and build defenses against them. Working week after week in a culture that alternates between existential caution and messianic optimism takes a toll that doesn’t show up in performance reviews. It shows up in therapy practices. Solidly booked ones.
The financial picture adds another layer. Even employees with competitive benefits and large salaries, people who by most external standards are succeeding, describe a persistent feeling of disposability. Thompson’s clients face constant pressure to demonstrate their worth, even as the tools they are helping to build raise questions about whether that worth will endure. And contrary to its promise of efficiency, AI has made many people’s workloads heavier rather than lighter: because the tools are meant to make everything faster, output expectations have risen to match. “It’s causing more work for my clients than actually making their work more efficient,” Thompson said. The irony is not lost on the people living it.
Although the anxiety is concentrated in the Bay Area, this is not solely a Bay Area story. According to a 2025 Pew Research survey, 52% of American workers are worried about how AI will affect their jobs, and 32% think it will lead to fewer jobs overall. In a July 2025 survey by the American Psychological Association, 38% of workers said they feared AI could make some or all of their job duties obsolete. These are not fringe numbers.
Therapists from Denver to New York describe patients voicing similar themes: disorientation about which skills still matter, shock at the speed of change, and an uneasy sense that the rules of professional stability that once guided career decisions no longer apply. These themes represent the working emotional reality of a very large portion of the American labor force. The phrase “a fear of becoming obsolete” is striking, says New York clinical psychologist Harvey Lieberman, because it doesn’t sound like a workplace grievance. It sounds existential. It is.
The data on actual displacement supports these concerns. According to the consulting firm Challenger, Gray & Christmas, AI played a role in almost 55,000 U.S. layoffs in 2025. An MIT study found that AI can already replace about 11% of the American labor market. Salesforce’s CEO said the company had let go of 4,000 customer service representatives because AI was handling half of that workload. Accenture and Lufthansa made similar disclosures. Sora, OpenAI’s generative video platform, was shut down in late March. Meta cut 10% of its Reality Labs workforce. The layoffs are real, they are accumulating, and the people still employed at these companies are watching them closely and drawing their own conclusions about what will happen to the roles that remain.
From the outside, all of this suggests that Silicon Valley is going through something it hasn’t experienced much of before: genuine uncertainty about its own future. The tech sector has always had pivots, collapses of every kind, and layoffs. But there was typically a coherent narrative underneath, a better product, a smarter model, or a more effective process, that let surviving employees place themselves within a logical trajectory. What makes AI disruption different is that it calls into question the expertise of the very people driving it.
A software engineer who spent ten years honing the skills that made them invaluable is now watching some of the work they contributed to being absorbed by tools. It is possible that this resolves the way past technology transitions have, by creating more roles than it eliminates. Or this one might be different. The therapists’ offices of Menlo Park are crowded with professionals who build these systems and are genuinely unsure which outcome to bet on.
Complicating matters further, many employees are turning to AI chatbots to cope with their anxiety about AI, a pattern Thompson finds problematic in practice. She says about 25% of her clients bring up conversations they’ve had with AI when talking about their feelings. Some of what the chatbots provide is harmless. Some isn’t. She has seen instances where AI tools encouraged codependent behaviors or advised clients to tolerate relationship behaviors that a qualified clinician would flag immediately.
On her practice homepage, Thompson makes clear that she is “AI-free”; two years ago this would have seemed strange, but today it reads as a deliberate professional stance. Whether that framing becomes commonplace in the therapy sector or remains a niche marketing strategy is still an open question. But the fact that it exists at all, that therapists feel compelled to advertise their distance from the tools shaping their patients’ professional lives, says a great deal about the moment. The people building AI are worried about it. And they are paying other people to help them figure out what to do.
