If you want to predict the hottest future B2B IT trends, one of the most effective methods is to look at the current consumer IT trends. One example that illustrates this point is from 2007, the year the first iPhone was unveiled and smartphones started gaining popularity among consumers. Within a couple of years, progressive companies like Intel embraced BYOD (bring your own device) in the workplace.
We’re now seeing a similar phenomenon with voice technology. Although it’s been nearly six years since Amazon released Alexa, smart speaker use has surged this year. AdsWizz, for example, reports that in-home smart speaker use is up 100% since the COVID-19 pandemic hit earlier this year.
At the recent VOICE Global livestream, presentations from companies like Arria NLG demonstrated that it’s not just consumers who are embracing voice technology and conversational AI.
On the B2B front, friendly user experiences overcome employee reluctance to use apps for everyday decision-making. According to research from Salesforce, 53% of service organizations expect to use chatbots within the next 18 months, a 136% growth rate. In a related Adobe Analytics study, 91% of business decision-makers surveyed said they are already making significant investments in voice technologies, and 94% said they plan to increase their investments in 2020. Brands see the incredible potential for natural language technologies, with 66% strongly agreeing that voice can help drive conversion and increase revenue, and 71% agreeing that it improves the user experience.
Not All Voice Technologies Are Equal
If you’ve used a voice assistant for any length of time, you’ve likely run into limitations that caused you to revert to manually opening an app or searching for information in a browser. The same breakdown occurs in a business environment when the voice technology can’t interpret the user’s request correctly, requires multiple follow-up questions or simply provides the wrong information.
For ISVs and software vendors looking to capitalize on this trend, it’s imperative to use AI that meets these four criteria:
- Smart. Some voice AI requires the user to be very precise with each request and to use the same key phrases each time a subsequent request is made. This becomes very cumbersome for the user and leads to a poor experience. A “smarter” AI platform is programmed with domain knowledge, which enables your VDA (virtual digital assistant) to answer your next question before you even ask it. This feature also allows the VDA to understand synonyms, acronyms and other sophisticated language features (e.g., idioms, slang) that comprise human-like interactions. It also filters unimportant (and uninteresting) information from the VDA’s response, which further builds user confidence.
- Personalized. One of the best ways to ensure user adoption and create a positive customer experience is to enable users to engage with technology as naturally as possible. For instance, rather than responding the same way to each user, a voice AI platform should react uniquely to each user based on the user’s preferences.
- Contextual. In regular conversations, a user may stop mid-sentence to reword a phrase or expand on a point. The AI platform should be able to distinguish between the new information and the old without requiring the user to start over from the beginning and repeat the critical details from the original question. What makes an AI platform contextual is its ability to handle “under-specification” — parts of a dialogue that are omitted because they are predictable. This capability is also known as multi-turn conversation technology. For example, if a salesperson asks, “What were my sales in May?” a less advanced AI will need further clarification to determine which year is meant. A more advanced AI platform equipped with multi-turn technology, on the other hand, can apply contextual awareness to infer that the speaker is asking about the current year.
- Accessible. By understanding natural language commands, an advanced voice AI platform makes it easier for non-technical users to interact with their dashboard whenever they need to be hands-free, such as in the board room, on their way to work, in a warehouse or a hospital.
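The “contextual” criterion above can be sketched as a small dialogue-state tracker. Everything here — the class name, the slot names, the year-inference rule — is a hypothetical illustration of under-specification handling, not any vendor’s implementation:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DialogueState:
    """Carries slot values forward so follow-up questions can be under-specified."""
    slots: dict = field(default_factory=dict)

    def resolve(self, query_slots: dict) -> dict:
        """Merge newly spoken slots over remembered ones; infer the year if omitted."""
        spoken = {k: v for k, v in query_slots.items() if v is not None}
        merged = {**self.slots, **spoken}
        merged.setdefault("year", date.today().year)  # "May" with no year -> this year
        self.slots = merged
        return merged

state = DialogueState()
# Turn 1: "What were my sales in May?" -- month given, year inferred from context
turn1 = state.resolve({"metric": "sales", "month": "May", "year": None})
# Turn 2: "And in June?" -- metric and year carried over from the previous turn
turn2 = state.resolve({"month": "June"})
```

The second turn never mentions “sales” or a year, yet the tracker resolves both — which is exactly the behavior that spares users from restating every detail.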
Natural Language Technologies Are the Linchpins Between Data and Actionable Intelligence
Creating conversationally intelligent applications requires multiple natural language technologies, such as natural language query (NLQ), natural language processing (NLP) and natural language generation (NLG). Only when conversational AI pairs NLQ on the human side with NLP and NLG on the machine side can it produce the dynamic, multi-turn conversations that feel human.
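A toy pipeline makes the pairing concrete: a question goes in through an NLQ layer, is executed against the data, and comes back out through an NLG layer as a sentence. All function names and the keyword-matching “parser” below are invented for illustration; real NLQ and NLG engines are far more sophisticated:

```python
def nlq_parse(question: str) -> dict:
    """NLQ side: map a natural-language question to a structured query (toy keyword matcher)."""
    q = question.lower()
    return {
        "metric": "sales" if "sales" in q else "revenue",
        "region": "EMEA" if "emea" in q else "all",
    }

def run_query(query: dict, table: list) -> float:
    """Data side: execute the structured query against tabular records."""
    rows = [r for r in table if query["region"] in ("all", r["region"])]
    return sum(r[query["metric"]] for r in rows)

def nlg_respond(query: dict, value: float) -> str:
    """NLG side: render the numeric result back into fluent language."""
    scope = "across all regions" if query["region"] == "all" else f"in {query['region']}"
    return f"Your {query['metric']} {scope} came to ${value:,.0f}."

data = [{"region": "EMEA", "sales": 120000}, {"region": "APAC", "sales": 80000}]
query = nlq_parse("What were my sales in EMEA?")
answer = nlg_respond(query, run_query(query, data))
```

Here `answer` reads “Your sales in EMEA came to $120,000.” — the round trip from question to data to sentence that the NLQ/NLP/NLG pairing describes.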
NLG plays a vital role in mining structured data from a CRM or business intelligence dashboard. NLG technology providers like Arria NLG, for example, make it easier to connect voice assistants and chatbots to business intelligence dashboards. This AI streamlines access to insights and eliminates the need for employees to manually generate reports or search through spreadsheets to find even the smallest pieces of data.
Arria’s narrative APIs allow developers to immediately add the power of language to applications, dashboards, and websites.
After analyzing all underlying data in a CRM system or business intelligence platform, Arria instantly produces responses as contextual narratives that are almost indistinguishable from analyses authored by human subject matter experts.
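In practice, wiring a dashboard to a narrative API usually means POSTing the underlying data to a generation endpoint and receiving fluent text back. The endpoint URL, payload shape and headers below are purely illustrative — a sketch of the pattern, not Arria’s documented API:

```python
import json
from urllib import request

def build_narrative_request(api_url: str, api_key: str, rows: list) -> request.Request:
    """Package dashboard rows as a JSON POST to a (hypothetical) narrative endpoint."""
    payload = {"data": rows, "options": {"tone": "analytical"}}
    return request.Request(
        api_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Illustrative endpoint and key; a real integration would use the vendor's own.
req = build_narrative_request(
    "https://example.com/v1/narratives",
    "demo-key",
    [{"month": "May", "sales": 120000}],
)
```

Sending `req` with `urllib.request.urlopen` would return the generated narrative; the point is that the integration surface is an ordinary JSON-over-HTTPS call, which is what makes these services easy to bolt onto existing dashboards.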
Narrating the essential data points within a CRM or BI dashboard provides intelligence well beyond two-dimensional visuals. It also gives the user nearly unlimited drill-down capabilities with multiple dimensions from any number of external data feeds.
As the de facto communications layer of the AI stack, NLG gives machines the ability to respond to queries intelligently and instantly, with dynamic, relevant answers that correspond directly to what the user asked.
Why is this important?
In its report How may A.I. assist you?, KPMG states: “One of the most visible applications of AI is conversational agents — chatbots and intelligent assistants that interact with people via voice or text channels, on devices such as smartphones, automotive infotainment consoles, and smart speakers. Having conversations with AI is becoming routine for consumers — and soon it will be for employees, too. In the workplace, conversational agents can help workers interact more seamlessly with each other, streamline office operations, execute internal processes, and deliver data more efficiently.”
Pioneers like Arria NLG extend the utility and ubiquity of data analytics, allowing users to have literal, fluid conversations with their data, accessing actionable intelligence on-demand.
For ISVs, integrating BI and other solutions with natural language technology can create a huge competitive differentiator. One example is Arria NLG Studio, which features a low-code, RESTful API architecture at ingestion and presentation that removes complexity, allowing seamless integration with a variety of systems and data formats. It also supports multiple deployment options for cloud, on-premise or hybrid environments.
Bringing natural language technologies to the enterprise opens up a new and more natural way for businesses to unlock actionable insights hidden within their siloed data sets that can lead to new revenue opportunities. In the early phases of adoption, the technology becomes a novelty that’s a nice change from the old way of having to type queries and wait for written responses.
The more human-like an ISV can make the user experience, the more likely the technology will become a must-have, confirmed by exceptional productivity gains, deeper business insights and the agility to adapt without sacrificing that productivity.