Can Avatars become sentient?

Can Avatars have their own emotions and intelligence?

That will mean they do not act merely as a vector for a living person, but become ‘aware’ in their own right…

I recently did a deep dive into the metaverse to see what all the fuss was about. I did not plan to spend much time there as I suspected it was early days, with a lot of technical work ahead.

I was not wrong.

So, is it really going to be that big? For sure…but not now, some time in the future.

Moving Teams meetings into a 3D environment with the ability to replace your face with a copycat Avatar is fine and fun – but not nearly as much fun as working with Xzistor robots in the metaverse.

One of the first robots I designed to run the Xzistor Concept brain model on was a simple differential-drive simulated robot in a 3D ‘learning confine’. It was just some C++ and OpenGL code (and a good couple of late nights, I will not lie) and there it was – a simple robot moving about in a 3D room. And immediately it – I mean ‘Simmy’ – started to learn like a baby.
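For readers curious what the core of such a simulated robot looks like, here is a minimal C++ sketch (not the original Xzistor code – all names and values are illustrative) of the differential-drive update that moves a two-wheeled robot like Simmy around its confine:

```cpp
#include <cmath>

// Pose of the robot in the 2D floor plane of the 3D room.
struct Pose { double x, y, theta; };

// Advance the pose by dt seconds given left/right wheel speeds (m/s)
// and the distance between the wheels (track, in metres).
Pose stepDifferentialDrive(Pose p, double vl, double vr,
                           double track, double dt) {
    double v = 0.5 * (vl + vr);    // forward speed of the body centre
    double w = (vr - vl) / track;  // yaw rate: speed difference turns the body
    p.theta += w * dt;
    p.x += v * std::cos(p.theta) * dt;
    p.y += v * std::sin(p.theta) * dt;
    return p;
}
```

Equal wheel speeds drive the robot straight ahead; opposite speeds spin it in place – which is all a learning brain needs to explore a room.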

Here is a legacy pic (screengrab) from one of the first simulations about 22 years ago.

Legacy pic of Simmy – note archaic MS Office icons in top right corner!

Simmy learned by reinforcing all actions that led to solving a set of simple emotions. With a bit of initial help it quickly learned to navigate to the food source and push the correct button to open it. It also learned to avoid the walls as this made for some painful encounters. What was exciting about this robot was that it was given visceral (body) sensations – it had its own little body map in its brain – and these were then used as simple emotions to make it constantly ‘feel’ good or ‘feel’ bad. It was quickly evident that Simmy was really ‘feeling’ pain when bumping into the walls.
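A minimal sketch of this reinforcement idea, using assumed names and a single scalar ‘discomfort’ signal rather than the full Xzistor emotion machinery: each action’s learned strength grows by how much it made the robot ‘feel’ better, and shrinks when it made things worse (as a wall bump would).

```cpp
#include <map>
#include <string>

// Toy action learner: reinforce actions in proportion to the relief
// they produced in an internal 'feel bad' signal (0 = content).
struct ActionLearner {
    std::map<std::string, double> strength;  // learned action preferences
    double rate = 0.1;                       // learning rate

    void reinforce(const std::string& action,
                   double discomfortBefore, double discomfortAfter) {
        double relief = discomfortBefore - discomfortAfter;  // +ve = felt better
        strength[action] += rate * relief;
    }
};
```

Opening the food source reduces discomfort, so that action is strengthened; bumping a wall raises it, so the strength goes negative and the action is avoided.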

It was a big kick to see the facial expressions on this little robot – a simple frown or smile reflex based on the average internal emotional state.
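The reflex itself can be as simple as thresholding the average emotional state; here is a toy version, with the thresholds chosen purely for illustration:

```cpp
#include <numeric>
#include <string>
#include <vector>

// Map the average of the internal emotion values
// (positive = pleasant, negative = unpleasant) to a facial expression.
std::string expressionFor(const std::vector<double>& emotions) {
    if (emotions.empty()) return "neutral";
    double mean = std::accumulate(emotions.begin(), emotions.end(), 0.0)
                  / emotions.size();
    if (mean > 0.1)  return "smile";
    if (mean < -0.1) return "frown";
    return "neutral";
}
```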

A later refined version of the crude initial 3D simulation – and an Xzistor robot that can easily be let loose onto the streets of the Metaverse.

I still see people struggling to understand how one can provide robots with emotions – and I do not mean just mathematical ‘reward’ states to satisfy homeostatic imbalances. For me, emotions must include somatic body states that make the robot ‘feel good’ and ‘feel bad’. The trick to doing this is explained in my short guide below:

Click on image

Simmy also allowed me to put the ‘intelligence engine’ that forms part of the Xzistor Concept brain model to the test. I could turn this intelligence engine ON or OFF so that the little virtual robot either learnt like an animal (Pavlov’s dogs) or like a human (actually thinking to derive answers from previous experiences). This approach not only offered a way to define intelligence in a scientific manner, but also provided an easy way to implement intelligence in robots.

The simplest test of intelligence I could inflict upon my intrepid little robot was to secretly change the button that opens the food source from the GREEN button to the ORANGE button. After trying the GREEN button, Simmy figured out it should actually be the ORANGE button without any help from me. This was quite an exciting moment, as one could actually see Simmy ‘think’ about it, and it proved that the intelligence machinery was working correctly.
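The button-swap experiment can be illustrated with a toy value-learning sketch (an assumption-laden stand-in for the actual intelligence engine, not a copy of it): repeated failure on GREEN drives its expected value below ORANGE’s, so the greedy choice flips without any outside help.

```cpp
#include <map>
#include <string>

// Toy button chooser: track an expected success value per button and
// always pick the more promising one; observations pull values toward
// the observed outcome (1 = food source opened, 0 = it did not).
struct ButtonChooser {
    std::map<std::string, double> value{{"GREEN", 1.0}, {"ORANGE", 0.5}};
    double rate = 0.5;

    std::string choose() const {
        return value.at("GREEN") >= value.at("ORANGE") ? "GREEN" : "ORANGE";
    }
    void observe(const std::string& button, bool opened) {
        double target = opened ? 1.0 : 0.0;
        value[button] += rate * (target - value[button]);
    }
};
```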

This intelligence algorithm also provided the robot with the ability to understand ‘context’ which many AI researchers feel is still missing from current robot brain models. All of this is explained in my short (and surprisingly simple!) guide below.

Click on image

Building Avatars that have their own emotions and intelligence will merely require me to drop this 3D simulated robot into somebody else’s metaverse and perhaps steer it a few times past the food and water sources (and other Satiation sources – read my guides). In this way a little virtual Xzistor robot will learn by itself to navigate around its 3D environment. It will constantly keep on learning… and make new friends.

Make new friends?

The first thing these Xzistors (I guess they will take exception if I call them Avatars) will need to do is to see other objects and virtual characters. For this I will use a simple method called CameraView which provides the view of the 3D confine as seen by the simulated robot. This will be processed as an optic sense so that Simmy can see and recognize objects and other Avatars. Simmy will quickly learn to ‘appreciate’ friendly Avatars that share their food, water, etc. and befriend those that are FUN to play with!
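How a CameraView frame might be reduced to a crude optic sense can be sketched like this, assuming an RGB pixel buffer of the kind OpenGL’s glReadPixels returns (the colour thresholds and names are invented for illustration):

```cpp
#include <cstdint>
#include <vector>

struct RGB { std::uint8_t r, g, b; };

// Scan the frame for distinctly green pixels (e.g. the food button) and
// report 'seen' when they cover at least minFraction of the view.
bool seesGreenObject(const std::vector<RGB>& frame, double minFraction) {
    if (frame.empty()) return false;
    int hits = 0;
    for (const RGB& px : frame)
        if (px.g > 150 && px.r < 100 && px.b < 100) ++hits;  // 'green enough'
    return hits >= minFraction * frame.size();
}
```

Recognizing other Avatars would of course need far richer processing than a colour count, but the principle – the rendered view fed back in as a sense – is the same.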

The metaverse creates the perfect test ground or ‘sandbox’ where these Xzistor robots can be allowed to learn and become more intelligent without concerns about super-intelligent robots harming humans. If Simmy gets fed up with Avatars hogging the food or water source and starts throwing punches at them (yes, we can also provide aggression as an emotion), we can always just push the RESET button on either the robot or the game.

Of course, Simmy has tactile sensing – how else could it feel pain when walking into walls? This tactile interaction with objects and Avatars in the metaverse will obviously not be physical, but ‘calculated’. But Simmy won’t ever know the difference. We did design Simmy to ‘hear’ sounds and words, but it cannot smell and taste…yet!
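A ‘calculated’ touch can be nothing more than a geometric overlap test. Here is a sketch under the assumption of a circular robot body and axis-aligned walls, with the penetration depth standing in for the strength of the simulated pain signal:

```cpp
#include <algorithm>
#include <cmath>

struct Circle { double x, y, r; };                    // robot body
struct Box    { double minX, minY, maxX, maxY; };     // wall segment

// Returns 0 when the robot is clear of the wall, otherwise how deeply
// it has penetrated it - usable directly as a pain intensity.
double contactDepth(const Circle& c, const Box& b) {
    double nx = std::clamp(c.x, b.minX, b.maxX);  // nearest point on the box
    double ny = std::clamp(c.y, b.minY, b.maxY);
    double dist = std::hypot(c.x - nx, c.y - ny);
    return std::max(0.0, c.r - dist);
}
```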

Building an Xzistor-type virtual robot for the metaverse brings numerous simplifications. The main advantage is that there is no need for costly body parts, motors, batteries, cameras, sensors, onboard processors, etc. that need integration. We can go back to anthropomorphic (humanlike) shapes, and it is no issue to make them keep their balance and not fall over obstacles. This might sound trivial, but a large part of why Bill Gates never saw his 2007 promise of a ‘robot in every home’ come true was the science-fiction-led notion, held by many at the time, of a ‘Jeeves’ butler robot – a home robot that would have spent much of its time tripping over carpets or toys – and which would have regularly fallen through the glass-topped coffee table.

What would have made much more sense at the time – and Gates alluded to such ‘peripheral devices’ in his article – was an Amazon-type storage robot: basically a box that runs on rails up the wall, across the ceiling and to the kitchen to fetch beer and peanuts and bring them back to the sofa – without getting wise or matey.

Science fiction has both inspired and misdirected many human pursuits of the future. Elon Musk punts his vision of humans becoming a multi-planet species – but building an expanded space station orbiting Earth will be much more practical than setting up camp on Mars. A simple engineering risk, safety and cost-benefit analysis should quickly point this out.

At the same time, the ambitious endeavors of these inspiring individuals are what keep me going!

Is the metaverse just another distant dream by tech drivers gone mad? Or will we one day move into a reality other than the physical realm we have come to know – much like the worlds portrayed in the movie The Matrix? The question would be a practical one: can we ever produce enough server semiconductors to run all these live 3D simulations? And will we be able to generate enough power to drive these electrical worlds and the cryptocurrencies that will undoubtedly fuel them?

I think we will find a way to achieve all of this.

The metaverse will steadily grow and become our main reality. In time it will become just too much trouble to engage with the physical world, where we have to dress up to go to work, be quietly judged by our body mass, shape, looks and apparel brands – and be condemned for occasionally, accidentally passing wind while forgetting we are not on our home laptops with the mute button on. I firmly believe virtual reality and the metaverse are where it is at – it is where the naked ape is headed next!

One blue-sky project we proposed years ago was to release an Xzistor robot ‘copy’ of oneself into the metaverse.

How will this be achieved?

Without going into too much detail here, a Wizard App can be developed that asks an individual a few questions to score the individual’s general personality traits and preferences: temper, compassion, fears, favorite pastimes, sports, foods, games, likes and dislikes, values, and details about the required attributes of a future dating partner – physical (brunette, blonde, etc.) and interests (food, sport, games, leisure activities, etc.). The Wizard App will then translate these preferences into lower-tier emotion engine indices to create a virtual Xzistor robot brain that can broadly represent the individual in the metaverse. Of course it is not going to be very accurate, but imagine checking back after work to see who your Xzistor virtual ‘copy’ robot had hooked up with in the metaverse while you were away – and who the real people behind these Xzistor robots or Avatars are.
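One plausible (and entirely hypothetical) shape for that translation step: questionnaire scores from 0 to 10 rescaled into emotion-engine gains between 0 and 1. The trait names and the simple linear mapping are assumptions for illustration only.

```cpp
#include <algorithm>
#include <map>
#include <string>

// Rescale 0-10 questionnaire answers into per-trait gains in [0, 1]
// that a lower-tier emotion engine could weight its emotions by.
std::map<std::string, double>
traitsToEmotionGains(const std::map<std::string, int>& answers) {
    std::map<std::string, double> gains;
    for (const auto& [trait, score] : answers)
        gains[trait] = std::clamp(score, 0, 10) / 10.0;  // out-of-range answers are clipped
    return gains;
}
```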

It could start a whole new way of virtual dating!

Will we one day see Xzistors and Avatars getting married? Or will humans marry them in these mysterious virtual worlds? Who knows – your guess is as good as mine.

But when it comes to the metaverse – never say never!

Ano

Rocco Van Schalkwyk (alias Ano) is the founder of the Xzistor LAB (www.xzistor.com) and inventor of the Xzistor Concept brain model. Also known as the ‘hermeneutic hobbyist’, he has developed a functional brain model that can provide robots and virtual agents with real intelligence and emotions.
