Neon: Is the world ready for artificial humans?

Everyone following CES 2020 seems to be fascinated by Neon's artificial human technology. Yes, the intangible Neon is more viral than Charmin's toilet paper robot. Samsung-backed startup Star Labs has been extremely cryptic in its introduction of Neon, parading its lifelike avatars at the world's largest tech show but sharing little about how they work or why they are on display.

These avatars, which vary in age and race, look so realistic that they could be mistaken for video streams of actual humans. In reality, the Neons are conceptual chatbots designed to converse and convey emotions, and they will learn to become more human-like over time thanks to an AI infrastructure that will be announced soon.

Confused? Don't worry, so was I until I got to see Neon for myself during its second day in the wild at CES 2020.

The first question, of course, is "What is Neon?" The answer: a computer-generated avatar that looks like a human. Unlike AI assistants such as Alexa, however, each Neon is its own entity. There is no catch-all "Hey, Neon" command, so nothing keeps a Neon from developing its own unique personality.

In fact, Pranav Mistry, CEO of Star Labs, assigned each avatar on display an individual human name (and even an occupation). In a Neon demonstration I attended at CES, Mistry explained that the project began by extracting face maps from existing videos. Using landmarks and keypoints, he developed code to manipulate facial features in the graphic image. Eventually, this program, called Core R3, was trained well enough to create faces and bodies, and that is what he showed off at CES.
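Star Labs has not shared any of Core R3's code, so purely as an illustration of the general idea Mistry described, extracting facial landmarks (keypoints) from existing video, here is a minimal sketch built on the open-source OpenCV and MediaPipe libraries. The video path is a placeholder, and none of this reflects Neon's actual pipeline.

```python
# Minimal sketch: extract facial landmarks ("keypoints") from a video,
# the kind of face-map data Mistry described as the starting point.
# Uses open-source MediaPipe Face Mesh and OpenCV -- NOT Star Labs' Core R3.
import cv2
import mediapipe as mp

def extract_face_landmarks(video_path: str):
    """Yield a list of (x, y, z) landmark tuples for each video frame."""
    face_mesh = mp.solutions.face_mesh.FaceMesh(
        static_image_mode=False,   # treat the input as a video stream
        max_num_faces=1,
        refine_landmarks=True,     # adds iris landmarks for finer detail
    )
    cap = cv2.VideoCapture(video_path)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB; OpenCV decodes frames as BGR.
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            result = face_mesh.process(rgb)
            if result.multi_face_landmarks:
                landmarks = result.multi_face_landmarks[0].landmark
                yield [(lm.x, lm.y, lm.z) for lm in landmarks]
    finally:
        cap.release()
        face_mesh.close()

# Example: collect per-frame landmark sets from an existing clip.
# "interview_clip.mp4" is a placeholder path, not a real Neon asset.
if __name__ == "__main__":
    frames = list(extract_face_landmarks("interview_clip.mp4"))
    print(f"Extracted landmarks for {len(frames)} frames")
```

In a system like the one Mistry described, per-frame keypoints of this sort would presumably be the raw material a model learns from before it can generate convincing faces and bodies on its own.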

Neons can replicate human movement and facial expressions: one can be directed to raise a single eyebrow or to reproduce a full emotion such as joy or confusion. When I saw the technology in action, a Neon employee was using a tablet to control four different avatars.

However, that employee could only control one Neon at a time. Those not under manual control were set to an "automatic" mode and simply carried on existing, just like humans. One Neon, a yoga instructor, was stretching; another, wearing jeans and white sneakers, was flipping through a book. Had I not seen the manual controls a moment earlier, I would have thought I was watching a looped video recording.
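Star Labs did not explain what that tablet app actually sends to each avatar, so the following is a purely hypothetical sketch of what per-feature control might look like: individual expression parameters (such as a brow raise) that can be set directly or blended toward a named emotion preset. Every parameter and preset name here is invented for illustration.

```python
# Hypothetical sketch of per-feature avatar control -- NOT Neon's real API.
# All parameter and preset names are invented for illustration only.
from dataclasses import dataclass, fields

@dataclass
class Expression:
    brow_raise: float = 0.0   # 0.0 = neutral, 1.0 = fully raised
    smile: float = 0.0
    eye_open: float = 0.5
    jaw_drop: float = 0.0

# Invented emotion presets an operator could snap or blend toward.
PRESETS = {
    "joy": Expression(brow_raise=0.3, smile=0.9, eye_open=0.7),
    "confusion": Expression(brow_raise=0.8, smile=0.1, eye_open=0.6, jaw_drop=0.2),
}

def blend(current: Expression, target: Expression, amount: float) -> Expression:
    """Linearly interpolate every expression parameter toward a target."""
    amount = max(0.0, min(1.0, amount))
    values = {
        f.name: getattr(current, f.name) * (1 - amount) + getattr(target, f.name) * amount
        for f in fields(Expression)
    }
    return Expression(**values)

# A single-feature tweak ("raise one eyebrow") versus a partial emotion blend.
neutral = Expression()
eyebrow_only = Expression(brow_raise=1.0)
half_joy = blend(neutral, PRESETS["joy"], 0.5)
print(eyebrow_only, half_joy)
```

Again, this is only meant to make the idea of per-feature control concrete; Star Labs has said nothing about how its tablet app or avatar parameters actually work.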

Neons can also talk. The code pairs a highly robotic third-party voice with the avatar, letting a Neon speak basic phrases such as "Hello." While I was impressed that it could do so in a variety of languages, the choppy animation while speaking spoiled the avatar's attempt to pass as an actual human being.
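Star Labs did not say which third-party voice it plugged in. As a rough illustration of generating basic phrases in several languages with an off-the-shelf synthetic voice, here is a minimal sketch using the open-source gTTS library; it is an assumed stand-in, not Neon's actual stack.

```python
# Minimal sketch: canned multilingual phrases from an off-the-shelf
# text-to-speech engine (gTTS) -- an assumed stand-in for whatever
# third-party voice Star Labs used, not Neon's actual pipeline.
from gtts import gTTS

PHRASES = {
    "en": "Hello.",
    "es": "Hola.",
    "ko": "안녕하세요.",
}

for lang, phrase in PHRASES.items():
    tts = gTTS(text=phrase, lang=lang)
    # Audio files the avatar's facial animation would need to sync to.
    tts.save(f"greeting_{lang}.mp3")
    print(f"Saved greeting_{lang}.mp3")
```

The point is simply that the voice is an off-the-shelf component; the hard part, as the demo made clear, is syncing convincing facial animation to it.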

In short, Neon's communication features are clearly in their infancy.

Neon today is far from perfect. Although it reflects emotions with uncanny accuracy, its ability to communicate at this early stage feels robotic and disrupts the illusion of humanity. Neon is therefore looking for a partner to help build its AI component, which is already in development under the name Spectra.

Once Spectra is out in the world, Neon will begin using its memory to learn emotions and personalities. But this will take time: even without the AI layer, powering a Neon already requires a large amount of computing power, on the order of 128 cores.

It seems we are being introduced to Neon at this early stage so that we understand how the avatars are made before they become our best friends, or even hotel concierges and health care providers.

When the technology becomes mainstream, it will have ethical implications; the only way Neon will succeed is if users fully understand how it works through demonstrations like the one being given at CES.

Be sure to check out our CES 2020 hub for the latest announcements and hands-on impressions from Las Vegas.
