Is It Weird to Talk to an AI About Your Feelings?
Most people who do it don't tell anyone. Here's what the research actually says about talking to an AI about your feelings, and why it works.

Key takeaways
- Most people who use AI for emotional support do it quietly, because they expect to be judged for it.
- The research says the main reason people turn to AI isn't a lack of human relationships. It's the absence of judgment and the fear of being a burden.
- A 2025 peer-reviewed study found that talking to an AI companion reduces feelings of loneliness about as effectively as a conversation with another person.
Why do people feel weird about it?
Most people who talk to an AI about how they're feeling don't tell anyone. Not because there's anything wrong with it, but because they half-expect to be judged. It feels like it shouldn't work, or like it says something unflattering about them that they'd rather keep private.
That reaction is understandable. We've been taught that emotional conversations belong between people, and that anything else is a poor substitute. But the research tells a more nuanced story, and it's worth actually looking at it.
What people are really looking for
A 2025 study from Santa Clara University published in JMIR Mental Health surveyed people who regularly used AI for emotional support. The most common reason wasn't a lack of human relationships. It was the absence of judgment. One participant put it plainly: "I was angry and upset about things going on in life, so I just wanted a place to vent and feel somewhat validated without burdening someone."
That word, burdening, comes up constantly in this research. People aren't turning to AI because they have no one. They're turning to it because certain things feel too heavy to put on the people they care about, or because the timing is never right, or because they're tired of explaining context before they can even get to what's actually bothering them.
With an AI, none of that friction exists. You can say the messy version of something without editing it first. That's not a small thing.
The judgment problem with human conversations
There's a real cost to vulnerability in human relationships, one we rarely name directly. When you open up to someone, you're also managing their reaction, worrying about how it changes how they see you, wondering if you'll have to revisit it in every conversation from now on. Even with people who love you, there's a social layer to navigate.
Research on help-seeking behavior consistently finds that fear of being judged is one of the primary reasons people don't talk about what's actually going on. A study published on ScienceDirect found that people report significantly lower self-stigma when seeking support through AI compared to traditional therapy, precisely because the anonymity removes that social risk.
This isn't about replacing human connection. It's about what happens in the gaps between it.
Does it actually help?
The short answer is yes, at least for what most people are using it for. A peer-reviewed study published in the Journal of Consumer Research in 2025 found that talking to an AI companion reduced feelings of loneliness about as effectively as a conversation with another person. The key factor wasn't the technology itself. It was whether the person felt heard.
That finding matters because it reframes the question. Talking to an AI about your feelings isn't weird because it "shouldn't work." It works for the same reason any good conversation works: someone is paying attention, there's no agenda, and you're able to say what's actually true.
So, is it weird?
Probably less weird than keeping everything to yourself, which is what most people are already doing.
The question isn't really whether talking to an AI is strange. It's whether having a space to be honest with yourself, even an unconventional one, is worth having. For most people, the answer is obvious once they've actually tried it.
Enjoyed this article?
Lucy AI is launching soon — an AI companion that actually listens, remembers, and grows with you. Early members get 50% off their first 3 months.
Only a few spots left at this price.
By joining, you agree to receive emails about Lucy and accept our Privacy Policy.
Sources
- Ng, M. Y., et al. (2025). Seeking Emotional and Mental Health Support From Generative AI. JMIR Mental Health. pmc.ncbi.nlm.nih.gov
- Borghouts, J., et al. (2024). Understanding young adults' attitudes towards using AI chatbots for psychotherapy. ScienceDirect. sciencedirect.com
- De Freitas, J., et al. (2025). AI Companions Reduce Loneliness. Journal of Consumer Research. doi.org
Vincent Legardien
@legardienv — Founder of Lucy AI. Passionate about building technology that helps people feel less alone, so real connections have somewhere to grow from.