Oura’s AI Chatbot Really Makes You Think—About Yourself

by oqtey
Lots of apps are getting built-in AI features these days, and they’re often disappointing. They’ll summarize (sometimes incorrectly) the same data that’s already available in charts or graphs elsewhere in the app. But the AI advisor recently added to the Oura ring’s app takes a different approach, one that I’ve come to appreciate in the few weeks since its launch. Instead of just reporting data, it asks questions. It asks you to do a little analysis, a little introspection. And I think Oura is really onto something here.

Some of the questions the Oura Advisor has asked me

I’ll admit that, at first, I was mainly interested in what the Advisor could tell me. Anytime I asked it a question, it would give an answer but then bounce it back to me. How was I feeling? What had I tried lately? These seemed like dodges, not insights.

The Advisor will also pipe up with some extra questions from time to time, in a notification on your phone. “Your sedentary time has decreased to 6h 11m,” it told me one day. “How are you feeling about your movement?” If you tap on the notification, it will start a conversation with you about that topic. 

Here are some of the questions it’s asked me lately: 

  • (After noting some poor HRV numbers recently) “How do you feel about your recovery practices, and is there anything you’d like to adjust?” 

  • (After I told it I had been sick) “How are you feeling about your overall recovery and balance in daily routines?” 

  • (After reporting my recent stress scores) “How are you feeling about managing stress this week?” 

  • (After suggesting relaxation methods) “Do any of these resonate with you?”

One day, the Advisor even explained its strategy to me. “Thinking back on the last few days, how have you felt about your sleep quality? Self-reflection can reveal insights about your priorities and help you adjust your routines. If you’re up for it, sharing your thoughts could open the door to valuable information that could enhance your rest even further.”

Fine. I answered the question in good faith, telling the bot about something that I know had been affecting my sleep—that I like to have a little wind-down time in the evening, and that this has lately been turning into revenge procrastination where I try to claw back a little relaxation or enjoyment even when I know it’s eating into my sleep time. 

“It’s understandable to want extra relaxation time after a busy day,” it said. It then congratulated me on some small improvements I’d made, and suggested the incredibly obvious advice of starting my wind-down routine a little earlier. Then it asked me: “How does that sound to you?”

I know it’s not telling me anything I couldn’t have told it. The Advisor is just restating my own concerns in a gentle, curious manner. But, goddammit, I think it’s helping. 



Why asking questions is so powerful

When we look to someone else to solve our problems—be they an app or a human being, like a therapist—we generally already have the information we need. We just need to go through the process of setting our thoughts in order. What is most important? What should we do next? What tools do we already have that can help us? 

Since this process doesn’t require new information, just thinking through what we already have, it doesn’t actually matter if the thing we’re talking to is a dumb robot who knows nothing about us. One of the best demonstrations of this is a program written in the 1960s, the famous chatbot Eliza. 

Inspired by Rogerian psychotherapy, all the Eliza bot did was turn your own statements into questions, occasionally recalling something from earlier in the conversation, and from time to time asking whether this related to your mother. Eliza wasn’t AI in any sense of the word, just a bit of code simple enough that it could be written into a webpage or hidden as an Easter egg in a text editor. You can try out a simple version of Eliza here.
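To give a sense of just how simple the technique is, here’s a minimal sketch of an Eliza-style responder. This is not Weizenbaum’s original script, just an illustration of the core trick: match a statement against a few patterns, swap first-person words for second-person ones, and bounce the statement back as a question.

```python
import re

# First-person words and their second-person reflections.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your",
    "am": "are", "i'm": "you're", "mine": "yours",
}

# (pattern, response template) pairs, tried in order.
PATTERNS = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"i (.*)", re.I), "Why do you say you {0}?"),
]

DEFAULT = "Tell me more."

def reflect(fragment: str) -> str:
    """Swap first-person words for their second-person counterparts."""
    return " ".join(REFLECTIONS.get(word.lower(), word)
                    for word in fragment.split())

def respond(statement: str) -> str:
    """Turn the user's statement back into a question, Eliza-style."""
    statement = statement.strip().rstrip(".!")
    for pattern, template in PATTERNS:
        match = pattern.match(statement)
        if match:
            return template.format(reflect(match.group(1)))
    return DEFAULT
```

Feed it “I feel anxious about my sleep” and it answers “Why do you feel anxious about your sleep?” — no understanding required, just string surgery. A real Eliza script has many more patterns and some memory, but the mechanism is the same.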

When I studied for my personal training certification, I had to learn a lot about motivational interviewing, a technique that evolved from Rogerian, person-centered therapy. The idea is to help a person with their “behavior change” (eating better, exercising more, etc.) by getting them to talk about their own motivation for making the change. You don’t tell them what to do; you let them tell themselves.

As long as you play along with Oura’s AI—actually answering the questions—you can have this experience anytime you want, without having to talk to an actual therapist or trainer. The Advisor is more sophisticated than Eliza, remembering things you told it a few days ago and having access to your data from the ring’s sensors. But it uses data summaries as a jumping-off point, rather than expecting you to be impressed that a bot can read your data at all. Oura recognizes that the value of its Advisor is not in having all the answers, but in having plenty of good questions.
