Get ready for AI “meeting bots” capable of reading human body language in the boardroom
Monday, April 30, 2018
Voice recognition technology has been generating a lot of attention of late. Smart devices decked out to deliver voice-prompted assistance are entering the market in vast numbers, and the future applications of this technology are still somewhat unknown.
Significant investment is being directed towards the development of artificial intelligence (AI) technology, and it’s possible we will soon be working alongside voicebots.
This is something Rowan Trollope, collaboration leader at Cisco, reflected on in a guide to meeting-based AI on Medium last year: AI meeting bots are set to make their way into workplaces over time, evolving to the point of being able to “eventually anticipate our needs, and understand interpersonal and even company-wide dynamics”.
“Growth in machine intelligence is inevitable, and will lead to more perceptive and intuitive voicebots,” he writes.
“Soon enough, these AIs will be joining our work teams.”
Trollope believes the intelligence will need to evolve over five broad stages.
Welcome to the meeting bots
Meeting bots will initially arrive in the vein of “command-and-control bots” like those already being used today. They will be equipped with a limited vocabulary and awareness of context, and will need to be explicitly activated.
“They will make meetings less painful to join, by simplifying the mechanical procedures we all hate, like dialing complex conferencing numbers,” Trollope writes.
Beyond basic voice recognition
Bots will then start to emerge equipped with an understanding of context.
“On-demand Level 2 meeting bots will be able to make basic linguistic connections, keep track of what’s happening in a meeting (who’s in a meeting, which file is being displayed) and will be able to operate more of the minutiae of work interactions,” Trollope predicts.
The bots are listening in
Bots will then increasingly be able to listen in on meetings and draw on external sources, such as company and domain-specific knowledge bases, to formulate analysis, taking on “some of the cognitive processing of meeting content”, Trollope writes.
“This level of meeting bot will listen to meetings and will be able to discern the topics discussed.
“It will offer its analysis after a meeting ends – which might help you remember key points raised in a meeting.”
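A crude way to picture this “discern the topics discussed” step is frequency analysis over a meeting transcript. Real Level 3 systems would lean on proper natural-language processing and knowledge bases; the word counting below, with its made-up stopword list and sample transcript, is a stand-in for illustration only.

```python
# Rough sketch of the Level 3 idea: after a meeting ends, scan the
# transcript and surface the most-discussed terms as "topics".
from collections import Counter

STOPWORDS = {"the", "a", "an", "and", "to", "of", "we", "on", "for", "is", "in"}

def top_topics(transcript: str, n: int = 3) -> list[str]:
    """Return the n most frequent non-stopword terms in the transcript."""
    words = [w.strip(".,?!").lower() for w in transcript.split()]
    counts = Counter(w for w in words if w and w not in STOPWORDS)
    return [word for word, _ in counts.most_common(n)]

transcript = (
    "We need to finalise the budget. The budget review is due Friday. "
    "Marketing wants budget sign-off before the launch, and the launch "
    "date depends on the review."
)
print(top_topics(transcript))
```

Even this toy version captures the article's point: the analysis is offered after the meeting ends, as a reminder of the key points raised.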
Trollope believes we are five years away from ‘Level 3’ bots being reliable enough to commence shipping in products.
Proactive and understanding of human intent
Bots “accurate enough in their understanding of human intent” will be able to proactively participate in meetings in real time, and in the future could potentially understand non-verbal interpersonal communication as well.
“How AI will use human communication that may be at odds with what people actually say, though, will be an interesting challenge for developers,” Trollope writes.
Bringing teams together
‘Level 5’ bots will have the capacity to take a more holistic view and bring together separate teams.
“This level of bot is aware of overlapping meeting topics, workers’ individual skillsets and the projects that people are working on across the company, based on data gleaned not just from content of meetings, but also from social network analysis that includes chat and email data mining,” Trollope writes.
“A Level 5 bot might be aware of overarching company goals, and could suggest team members for projects, and make introductions between people based on goals, project needs and personal compatibility.”
Whether or not we are ever willing to accept a “robo-boss”, Trollope says that, if we extend artificial intelligence, machine learning and social data mining trend lines, “it’s inevitable that we’ll be able to create this capability”.