Years ago, we looked to the newspapers for job vacancies. We would carefully mail our handwritten resumes to the employer's address, commute to the interview venue, wait outside the interviewer's office with apprehension and excitement, and then check our mailboxes daily for weeks, hoping for the appointment letter.

Times have changed. We apply for a job at the click of a button, our resumes are screened by applicant tracking systems (ATS), we give our interviews sitting in front of a screen answering questions posed by an AI, and we are notified on the same portal about our selection. The journey from application to selection is a matter of a few clicks. Beyond recruitment, HR teams are now heavily delegating performance management and leave management to technology as well. With these repetitive, clerical tasks off their shoulders, HR managers can actually be the real 'human resource managers' and explore the real pain points of the organization.

Now the question arises: is technology reliable enough to replace human intervention in activities like shortlisting the future business leaders to be onboarded into the company? How can technology map human competencies with the utmost accuracy? Can AI predict how emotionally stable an employee will be in difficult situations? Can it predict how good the employee will be at maintaining team cohesion? What is the probability that the employee will maintain a long-term association with the company? Can a non-living entity predict who the future CEO of the company will be?

Well, if you ask me, the answer to many of the above questions is already yes, and for the rest it will likely be affirmative within the next couple of years. That is technology for us! So far I have looked at this topic through the lens of an HR professional; however, let us also consider some other perspectives.

I am addicted to watching the Netflix science-fiction series "Lost in Space". Interestingly, it shows how an AI robot from an alien planet is advanced enough to decipher human emotions, especially those of the little boy who rescues it. The robot makes its own decisions and saves everyone in times of crisis. In the movie Her, the protagonist, Theodore, installs a conversational AI operating system named Samantha. Samantha talks to him in a sensitive, warm and human voice, making her indistinguishable from any other human female. Theodore eventually falls in love with Samantha, who is just a computerized voice.

AI designers today are incorporating emotions into new applications. At its developer conference in May 2018, Google demonstrated Duplex, an extension of its Google Assistant that can make phone calls in situations where the listener feels he or she is interacting with a real human being. To delve a little deeper into the topic, let us understand the difference between human and machine emotions.

What are human emotions?

Human emotions have played a long evolutionary role in the survival of our species as a whole. An emotion is either the spontaneous expression of an internal thought process or a reaction to an external stimulus. For example, I feel happy at being able to solve a difficult problem. Here, the happiness is the result of my internal thought process: satisfaction with my efforts in solving the problem and contentment with the results obtained. When we cross a busy road, there is a constant fear of being involved in an accident. This 'fear' response is evoked by prior knowledge of road accidents. Here, fear is an integral component of our survival, as we take action accordingly.

Microsoft's AI tools can identify a person's emotions through facial recognition alone. Its Seeing AI app helps visually impaired people identify the emotion of the person standing in front of them simply by taking that person's photo; the tool predicts the person's emotional state by analyzing the image.
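To give a concrete flavour of what such facial emotion recognition looks like in code, here is a minimal sketch. It assumes the open-source DeepFace library and a placeholder image file "person.jpg"; this is purely illustrative and is not how Seeing AI itself is implemented.

# Minimal facial-emotion sketch (illustrative only).
# Assumes: pip install deepface, and a local photo named "person.jpg".
from deepface import DeepFace

result = DeepFace.analyze(img_path="person.jpg", actions=["emotion"])

# Depending on the library version, the result may be a list with one
# entry per detected face or a single dictionary.
face = result[0] if isinstance(result, list) else result
print("Dominant emotion:", face["dominant_emotion"])
print("Emotion scores:", face["emotion"])

Under the hood, libraries like this run a face detector followed by a classifier trained on labelled facial expressions.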

What are machine emotions?

Marvin Minsky, one of the founding fathers of AI, said, "The question is not whether intelligent machines can have any emotions, but whether machines can be intelligent without any emotions." Although machines do not yet think, react or respond the way we do, they use logic to simulate emotions, which enables them to interact with humans in more appropriate ways.

In his book "How to Create a Mind", Ray Kurzweil argues that, in theory, any neural process can be digitally reproduced in a computer. If a machine is equipped with adequate sensors, sensory feelings such as feeling hot or cold can be simulated from the environment. However, some physiological feelings, like hunger and tiredness, are triggered by hormones and our digestive system, and such feelings are difficult to reproduce in a machine. The challenge will be to incorporate richer sensory feedback that enables machines to experience a wider range of feelings and emotions.

How can machines replicate human speech?

The ability of a machine to replicate natural-sounding speech is an ongoing challenge. Personal assistants like Siri (Apple), Alexa (Amazon) and Google Assistant use text-to-speech software to create a convenient interface with users. These systems stitch together words and phrases from pre-recorded files of a particular voice. To change the voice, a new set of audio files is required containing every possible word the device might need to communicate.
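To illustrate this concatenative idea in the simplest possible terms, here is a toy Python sketch. It is my own simplification, not how Siri, Alexa or Google Assistant are actually built, and the clip paths are placeholders; it assumes every word has been pre-recorded as a WAV file with identical audio parameters.

import wave

# Toy concatenative text-to-speech: stitch pre-recorded word clips
# into a single utterance. Assumes all clips share the same sample
# rate, channel count and sample width. Paths are placeholders.
WORD_CLIPS = {
    "hello": "clips/hello.wav",
    "world": "clips/world.wav",
}

def speak(sentence: str, out_path: str = "utterance.wav") -> None:
    frames, params = [], None
    for word in sentence.lower().split():
        path = WORD_CLIPS[word]  # KeyError here means the word was never recorded
        with wave.open(path, "rb") as clip:
            if params is None:
                params = clip.getparams()
            frames.append(clip.readframes(clip.getnframes()))
    with wave.open(out_path, "wb") as out:
        out.setparams(params)
        for chunk in frames:
            out.writeframes(chunk)

speak("hello world")

Real systems go much further, smoothing the joins between clips or, increasingly, generating the waveform directly with neural networks rather than concatenating recordings.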

Lyrebird, a Canada-based company, has created an AI system that learns to mimic a person's voice by analyzing speech recordings and their text transcripts. It can mimic almost any voice, and it can generate completely new sentences that carry the different intonations and emotions of each voice.

How does AI detect emotions?

Voice-recognition AI software is slowly learning to detect human emotions through speech intonation and pauses, while apps like Seeing AI detect emotions through voice, body language and facial expressions. One deep learning program has even been claimed to predict, with around 90% accuracy, whether a person has criminal tendencies just from their facial features, though such claims remain controversial. In 2016, Apple bought Emotient, a start-up whose technology reads facial expressions. Technologies like these can also be used in the retail segment, where a customer's interest can be read from their body language, a useful tool for store assistants looking to approach customers with products they might want.
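To show how emotion might be inferred from intonation rather than from the words themselves, here is a small sketch that summarizes each audio clip with MFCC features and trains a simple classifier. The file paths and labels are hypothetical, and production systems rely on far larger datasets and deep neural networks; this is only a sketch of the general idea.

import librosa
import numpy as np
from sklearn.svm import SVC

# Hypothetical labelled training clips: (path to WAV file, emotion label).
TRAINING_CLIPS = [
    ("clips/happy_01.wav", "happy"),
    ("clips/angry_01.wav", "angry"),
    ("clips/neutral_01.wav", "neutral"),
]

def features(path: str) -> np.ndarray:
    # Summarize a clip as the mean of its MFCCs, a crude proxy for
    # the spectral shape of the speaker's intonation.
    signal, sr = librosa.load(path, sr=None)
    mfcc = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=13)
    return mfcc.mean(axis=1)

X = np.array([features(path) for path, _ in TRAINING_CLIPS])
y = [label for _, label in TRAINING_CLIPS]

classifier = SVC(kernel="rbf", probability=True).fit(X, y)
print(classifier.predict([features("clips/unknown.wav")]))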

Will technology replace HR functions in the future?

I do not think so. Earlier, a recruiter had to sift through piles of resumes, shortlist candidates and schedule interviews. Now that effort-intensive work is taken over by AI, which leaves room for HR managers to engage with candidates effectively through regular communication and a personal touch. A psychometric map of a candidate can be created from his or her emotional responses as simulated situational questions are put forth, and pre-hire assessments will become easier and more effective with the help of AI-based video analysis.

But will any candidate ever be comfortable without a human to talk to? Speaking to a chatbot or a virtual assistant in front of a screen does not make for a lively conversation. Deep down, a candidate still wants the listener to comprehend and emotionally connect with their stories and answers. And while an employee wants his or her grievance to be recorded, he or she would rather talk it out with a live human being than with a machine. So, until our human need to connect with another human being fades away, human resource management will be handled by humans rather than robots.

What lies in the future?

In the book Homo Deus, author Yuval Noah Harari writes, "Humans are essentially a collection of biological algorithms shaped by millions of years of evolution." This suggests that non-organic algorithms like artificial intelligence can replicate, and even surpass, the organic algorithms of humans. I leave the question open for further brainstorming: "Can machines judge a person better than a human can?"
