As artificial intelligence advances in virtually every field, one of its biggest challenges is emotional intelligence: having a computer analyze and recognize people's emotions by studying their gestures, their movements and even their heartbeat.
In Argentina, the first experiments in recognizing emotions in a classroom setting are already underway with university students, using cognitive computing.
The germ of this project came out of a particular situation after a course. "Last year we completed an IAE program that works with management teams on strategy and execution issues in organizations, where intense tension is generated and everyone becomes very sensitized. One of the students in that group was the director of a technology consultancy, and I called him to think about a project together," says Roberto Vasollo, professor of strategy at IAE and project manager at the institution. With Carlos Farfán, director of the project management and technology consulting firm Practia, they agreed to work on a process that would measure what happens in the classroom, especially in programs where learning involves a strong emotional commitment and wide ranges of tension and attention are managed. Microsoft also took part in the pilot project, analyzing the data and interpreting the emotion measurements. A physician and a cognitive neuroscientist joined the team as well.
How does the system work?
"In principle, there are the people, the learning process they undertake and the physical space where all of this happens. We designed how to set up the equipment needed to record the students throughout the activities, following the teacher's initial plan, in that physical space, in the least intrusive way possible," says Juan Echagüe, director of research and development at Practia. Cameras, microphones and heart rate monitors capture the class by combining their signals, and all the information is sent to the cloud.
Images and sounds are prepared, and Microsoft's cognitive services are used to identify people and their emotions from the images, and to transcribe the audio, recording who said what and when. "The results of this analysis are uploaded to the teacher's computer, and he uses the dashboards together with his own experience to weigh, rightly or wrongly, all this information, and thus gain a deeper insight into what was going on in the group at every moment of the class," Echagüe adds.
Practia had already developed a "cognitive mirror" in 2017 to interpret the emotions of the person standing in front of it. The company then set up a "MakerSpace" in the organization, conceived both as a physical space and as a practice where different technologies are available: virtual, augmented and mixed reality equipment, 3D printing, sensors for artificial intelligence projects, IoT and drones. After the experience of those projects and the networking with IAE, the system emerged as a whole.
Transfer to the classroom
The test was carried out, with the students' approval, in the classrooms of the program run with Singularity University and in a classroom of the Master's program. "Some expressed great interest and began to propose ways to apply it to the management of organizational talent, but this has not been possible yet, because we are in the experimental phase. In class, we avoided making specific individual references; we looked at the team and the classroom as a learning system," Vasollo describes.
The Practia and Microsoft teams approached the project from three vectors of analysis: facial recognition and the interpretation of emotions on faces; the transcription of what is said in class and who says it (using speech recognition technologies and speech-to-text); and heart rate monitoring (via a smartwatch). According to the experts, this helps academics analyze student behavior more effectively, keep track of what goes on in class, and self-evaluate their coursework and teaching style, aided by statistics and the available cognitive computing.
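Before any dashboard can be built, the three streams described above (per-frame emotion scores, a timestamped transcript, heart rate samples) have to be aligned on a common timeline. The article does not describe Practia's actual implementation, so the following is only a minimal illustrative sketch of that fusion step, with all names and data structures hypothetical: it buckets two of the streams into fixed time windows and reports, per student and window, the dominant emotion and the average heart rate.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class FaceReading:       # hypothetical per-frame output of a vision service
    t: float             # seconds since the start of class
    student: str
    emotions: dict       # e.g. {"happiness": 0.8, "neutral": 0.2}

@dataclass
class HeartReading:      # hypothetical smartwatch sample
    t: float
    student: str
    bpm: int

def summarize(face, heart, window=60.0):
    """Bucket both streams into fixed windows; for each (window, student)
    pair return the dominant emotion and the average heart rate."""
    buckets = defaultdict(lambda: {"emotions": defaultdict(float), "bpm": []})
    for r in face:
        key = (int(r.t // window), r.student)
        for emo, score in r.emotions.items():
            buckets[key]["emotions"][emo] += score
    for r in heart:
        key = (int(r.t // window), r.student)
        buckets[key]["bpm"].append(r.bpm)
    out = {}
    for key, data in buckets.items():
        dominant = (max(data["emotions"], key=data["emotions"].get)
                    if data["emotions"] else None)
        avg_bpm = (sum(data["bpm"]) / len(data["bpm"])
                   if data["bpm"] else None)
        out[key] = (dominant, avg_bpm)
    return out

face = [FaceReading(10, "ana", {"happiness": 0.8, "neutral": 0.2}),
        FaceReading(70, "ana", {"anger": 0.6, "neutral": 0.4})]
heart = [HeartReading(15, "ana", 72), HeartReading(75, "ana", 95)]
report = summarize(face, heart)
```

With this sample input, minute 0 would summarize as ("happiness", 72.0) and minute 1 as ("anger", 95.0); a real system would add the transcript stream and far more robust alignment, but the windowing idea is the same.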
For the first tests, a trace of faces and voices was generated. "We could see, for example, how they showed different emotions during the hour and a half of class, where attention and distraction alternated, and moments of happiness when students and teacher lightened the mood of the class," they explain.
The privacy of emotions
Echagüe points out that everyone involved was willing to participate, and adds: "In general, the notion of intimacy as we know it today is being redesigned, challenged and rebuilt as part of the digital transformation process that we are going through as a society, and that is going through us." "Education and retail are two of the first places where we plan to deploy systems that recognize the emotions of those present and make their experience and time more valuable," he says.
For their part, the institutions imagine that this type of measurement can have multiple impacts on the way they teach. "It is clear that the way management teams learn is changing, and this will help improve the organizational dynamics of strategic thinking and of coping with technological disruption. It also allows us to give feedback to the teacher. What interests us most is improving distance learning: seeing how group dynamics are created when you work online. Future work will require this technology: seeing what happens with a person you do not see face to face, their gestures, the variation in their voice, and so on. Not only the teacher, but also their teammates."
Experimental for the moment
Large technology companies such as Microsoft, IBM and Amazon offer, among their digital services, emotion recognition systems that infer what people feel based on the analysis of eight main emotions expressed on faces: for example, a frown for anger, or upturned lips for joy.
As Practia and IAE emphasize, this is a test, and it is still far from being used outside an experimental setting. The assumption that we can infer how people feel from their expressions raises questions in the scientific field, which argues that there is no evidence to justify using these systems, or making decisions, based on them alone. "Companies can detect a frown, but that's not the same as detecting anger," Lisa Feldman Barrett, professor of psychology at Northeastern University, told The Verge. Feldman Barrett and five other scientists spent two years reviewing over 1,000 articles, concluding that emotions are expressed in so many different ways that it is difficult to reliably infer how people feel from their facial gestures.
Among the evidence, for example, is that people frown only about 30% of the time they are angry: that is, 7 times out of 10, their anger cannot be guessed from their expression. The work does not deny that there are "typical" expressions, or that they are an important part of how societies communicate; but it rejects the idea that they carry the weight of an indisputable mark of what people feel.
Beyond the technologies already mentioned, there are also companies focused exclusively on measuring emotions through software and cognitive computing. In addition to analyzing gestures, the company Affectiva experiments with recognizing other parameters, such as speech analysis and eye and body movements, among others. On the study's results, Barrett concludes that while these kinds of measurements may improve and become increasingly accurate, the important thing is to understand that emotions and their measurement will always be varied, complex and situational, and thus exceed what can be measured by any single system.