At CES 2026, robotics fell into two categories: expensive humanoids designed for factories, and novelty toys. And there were many toys. They’re an obvious early application of LLMs and computer vision: they talk robotically, have a camera, and operate with an app. They are exactly what you would expect. Ollobot caught my eye because it’s designed to be more like a pet, communicating emotionally with its eyes and expressive, cute sounds. It’s meant to live with a family, learn their routines, and record moments that matter, without being ordered to do so.

Ollobot is the first consumer robot from a Shenzhen-based company led by founder and managing director Lyn Feng, a serial entrepreneur whose background spans solar energy, fashion, hospitality, and technology. Feng acquired an AI and audio technology company two years ago and rebuilt it around a single idea: that the most useful home robot would not look or behave like a human, but like something warm, expressive, and nonthreatening.

Physically, Ollobot is small and mobile, riding on wheels rather than legs, with a flexible neck, expressive eyes, flippers for arms, and a smooth egg-like body. It is designed to be touched and hugged, not operated. Feng describes it as “a pet and a family member,” not a device. The robot responds through body language, facial expressions, and a custom nonverbal sound system closer to R2-D2 than Alexa. It does not speak in sentences unless necessary, a deliberate design choice meant to avoid the feeling of a talking appliance.

Under the hood, Ollobot is an AI-enabled computer with cameras, microphones, environmental sensors, and onboard memory. It recognizes family members, learns habits, and adapts its personality over time. According to the company’s internal design framework, Ollobot tracks dimensions such as emotional neediness, curiosity, activity level, and sociability, adjusting how it behaves based on long-term interaction rather than presets.

One of the most distinctive features is memory. Ollobot automatically identifies what it considers “high-value moments,” such as birthdays, gatherings, and achievements, and begins recording photos and videos without being asked. Those memories are then organized in a companion app that functions as a diary, offering parents and caregivers a running record of daily life. All data is stored locally on a removable memory module designed to be transferred to future devices, an unusual approach at a time when most consumer AI products default to the cloud.

Feng says the goal is not surveillance or constant documentation, but continuity. “Technology reacts, but it doesn’t create moments with you,” she told me during our interview at CES. Ollobot is meant to notice when something meaningful is happening and quietly step in, rather than wait for a command.

Pricing is expected to range from roughly $1,200 to $2,000, depending on size, with additional revenue coming from accessories and outfits. Feng plans an international Kickstarter launch this summer, with a lower-cost entry model aimed at younger children also in development. The company has invested more than $6 million in development so far and operates as part of a publicly listed Chinese firm, giving it more financial stability than many first-time hardware startups.

Walking the CES floor, I saw dozens of robots that wanted to be people. Ollobot wants to be something else entirely. It is deliberately limited in what it can do physically. It does not open doors or fetch objects. It does not vacuum your house or manage your calendar. What it does do is notice you, remember you, and respond in ways that feel personal rather than programmed.

That focus may explain why the Ollobot booth stayed crowded throughout the show. Parents, retailers, and distributors from Europe, the U.S., and Asia lingered not because the robot demonstrated a flashy trick, but because it felt emotionally legible. Children instinctively treated it like a pet. Adults imagined where it might fit into their homes.

CES is full of electronic AI toys. Ollobot is rare in that it promises presence, not tech. For me, it was one of the most thoughtful consumer robots at the show, and one of the few I could imagine buying not as a gadget, but as something my granddaughters might grow up with.

Charlie Fink is the author of the AR-enabled books “Metaverse,” (2017) and “Convergence” (2019). In the early 90s, Fink was EVP & COO of VR pioneer Virtual World Entertainment. He teaches at Chapman University in Orange, CA.