Can this radio detect your mood and play songs to match?

Would we get on better with clever machines if they understood what mood we were in?

Many roboticists and computer engineers seem to think so, because they're always trying to make their creations more human.

Take Solo, the "emotional radio", for example. A wall-mounted device that resembles a large clock, it features a liquid crystal display at its centre. When you approach it, the pictogram face shows a neutral expression.

But it then takes a photo of your face, a rod or antenna on the side cranks into life, and the LCD indicates that it's thinking.

"When it's doing this, it's analysing different features of your face and deciding how happy, sad or angry you are," explains Mike Shorter, senior creative technologist at the Liverpool-based design and innovation company, Uniform, Solo's creator.

"It will then start to reflect your mood through music."

If Solo thinks you look happy, it will play you an upbeat number like Hey Ya! by OutKast. A more downbeat expression may cue up Everybody Hurts by R.E.M.

Your reward for being angry could be a dose of Motörhead.
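Stripped to its essentials, the logic Uniform describes is a classifier feeding a lookup: score each emotion from the face, then play whatever the highest-scoring mood maps to. The Python sketch below is purely illustrative, not Solo's actual code; the scores, function names and the Motörhead placeholder (the article names only the band) are all assumptions.

```python
# Hypothetical sketch of Solo-style mood-to-music selection.
# A real system would derive the scores from a facial-analysis
# model; here they are supplied directly for illustration.

PLAYLISTS = {
    "happy": "Hey Ya! by OutKast",
    "sad": "Everybody Hurts by R.E.M.",
    "angry": "something loud by Motörhead",  # band only; track assumed
}

def pick_track(scores: dict) -> str:
    """Return the playlist entry for the highest-scoring emotion."""
    mood = max(scores, key=scores.get)
    return PLAYLISTS[mood]

# Example: a face the analyser reads as mostly happy.
print(pick_track({"happy": 0.7, "sad": 0.2, "angry": 0.1}))
# -> Hey Ya! by OutKast
```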

Solo's makers envisage their smart radio not just playing music to suit your mood, but altering your mood too.

Say you've been driving for a long time: it could recognise signs of tiredness on your face and play upbeat music to pep you up.

The study of how to make computers and machines more empathetic is known as affective computing, and examples of supposedly emotionally intelligent gadgets have been springing up around the world.

Japan's Softbank Robotics has been plugging its Nao and Pepper robots for a while now.

The 1.2m (4ft) tall cute humanoid, Pepper, developed jointly with French robotics firm Aldebaran, has been deployed in hospitals, shopping centres, banks and train stations.

Toddler-sized Nao (59cm) has been used in schools to help children with autism, and in the paediatric units of hospitals.

Softbank is also behind the "emotion engine" within the Honda NeuV (pronounced new-vee), an automated electric concept car unveiled at this year's Consumer Electronics Show in Las Vegas.

This AI-driven technology - combining biometric sensors and cameras - will try to detect drivers' emotions and learn from the actions the driver takes as a result.

So angry drivers who are driving rashly and erratically, for example, might be encouraged to calm down. The AI might even reduce the car's power temporarily, or switch to autonomous mode, until they've cooled off.
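Graded interventions like that amount to simple threshold logic. The sketch below is a guess at the shape of such a system, not Honda's "emotion engine"; every threshold, name and action is assumed for illustration.

```python
# Illustrative threshold logic for an emotion-aware driving
# assistant. Thresholds and actions are invented for this sketch;
# Honda has not published how the NeuV actually decides.

def respond(anger_score: float, driving_erratically: bool) -> str:
    """Escalate the intervention as detected anger rises."""
    if anger_score > 0.8 and driving_erratically:
        return "switch to autonomous mode until the driver calms down"
    if anger_score > 0.8:
        return "temporarily reduce engine power"
    if anger_score > 0.5:
        return "suggest calming music and softer cabin lighting"
    return "no intervention"

print(respond(0.9, True))   # -> switch to autonomous mode ...
print(respond(0.6, False))  # -> suggest calming music ...
```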

This "network assistant" will check on the driver's emotional well-being - making music recommendations based on mood, changing the lighting scheme, and even triggering mood-enhancing scents.

Boston-based Affectiva has developed "emotion recognition software" called Affdex that monitors the minute changes in our facial expressions when we're watching adverts, TV programmes or films.

The AI software has learned from studying nearly four million faces - and their changing expressions - from more than 75 countries.

Companies such as Sony are using the software to test how audiences respond to film trailers, and market research agencies such as Millward Brown are using it to measure responses to clients' TV ads.

Affectiva, which emerged from Massachusetts Institute of Technology's Media Lab, is similar to Emotient, another company teaching computers how to recognise expression and emotion. Emotient was bought by Apple last year.

Misreading the situation?

But while emotion-reading tech might be all the rage at the moment, does it actually work?

David Lane, professor of autonomous systems engineering at Heriot-Watt University in Edinburgh, points out that mistakes made by affective computing applications could have serious consequences.

"There's lots of research in this field with robots sensitive to gesture, tone of voice, eye expressions and so on, but one of the issues is getting it right," he says.

"If Siri or some other voice-activated assistant on your phone fails to give you the football results, you have alternatives, but if a critical, affective computing function fails, that will cause serious frustration at the very least.

"Put simply, if it doesn't work, people will switch off."

Christian Madsbjerg, a founding partner of "human science" consultancy ReD Associates, is concerned that affective applications are "built to Western, Japanese or Chinese models, and emotions are different in other cultures".

He also points out that our bodies, and their physical context, are crucial to our moods and reactions.

"An emotional response to a given commercial in the warm, dark room of the focus group may have no relation to the way that same commercial is perceived at home or on a subway platform," he argues.

A violinist soloing at Carnegie Hall at a high point in her career may be feeling exultant, but her face won't show it, he says, because she's concentrating so hard. A robot would struggle to interpret her "frozen" facial expression, he maintains.

Solo's creators admit that the radio doesn't always read emotions correctly.

And even Pepper the robot gets it wrong sometimes.

"After a few late nights and being in a somewhat grumpy mood, Pepper added 10 to 12 years on to my age when she evaluated it," says Carl Clement, a founder of Emotion Robotics, a UK-based partner with Softbank in Europe.

Solo, the emotional radio, might just manage a wry smile at that. And possibly play Frank Sinatra's Young at Heart?