Explainable AI: Why It’s Important to You and Your Clients

Explainable AI not only delivers a decision or prediction but also gives users confidence by explaining how the solution was determined.


Artificial intelligence (AI) adoption is growing. McKinsey research in 2021 shows that 56 percent of organizations leverage AI in at least one area, up from 50 percent in 2020. Businesses most commonly use AI in customer service, product development, marketing, and sales, and they see a positive impact on the bottom line. Of the respondents to the McKinsey survey, 27 percent attribute at least 5 percent of earnings before interest and taxes to AI. However, concerns over AI risks and how to manage them effectively may be standing in the way of those benefits for some companies. One of those risks is whether users will trust and adopt the solution, and it can be addressed with explainable AI (XAI).

What Is Explainable AI?

AI algorithms are designed and trained (and retrained) to work accurately. However, it's typically not clear to users how an algorithm works. McKinsey research from 2020 found that this ambiguity led to lower-than-predicted adoption rates because users didn't trust the system. That's not difficult to understand. A manager responsible for customer service and satisfaction won't want to put faith in a solution that doesn't deliver results as well as a human agent would. Furthermore, if AI is controlling the starts and stops of heavy equipment that could seriously injure its operators, those employees will want to know it will come through in the clutch.

Michael Wu, Chief AI Strategist at PROS, sees explainable AI as “the last nudge to get executive buy-in.” “As more companies embrace a digital-first way of doing business, explainable AI will emerge as a major catalyst to get executive adoption of AI-based tools,” he says. “Initially, the fear factor will persist – companies will still be afraid to relinquish any control to decision-making AI. However, those who use XAI to overcome this fear will gain significant efficiency and competitive advantage in the market, leaving those formerly paralyzed by fear with no other choice but to adopt AI solutions themselves.”

How to Make Explainable AI Work

XAI solutions include processes and methods that allow humans to understand outputs and gauge their accuracy. They make the steps the AI solution takes to arrive at a decision transparent, point out whether the algorithm is prone to errors, and explain how to correct them. In a Towards Data Science article, Pranay Dave points out that there are several ways to achieve explainable AI, ranging from simple to complex. For example, the solution you develop may use data visualization to show how the algorithm was trained and which factors were included in the decision. The user can compare that information to the real-world situation. You can also use the machine learning model itself to explain predictions. For example, a logistic regression model's coefficients illustrate which factors carry more weight in decisions, and a decision tree model shows the decision path that led to the output. Neural networks can likewise surface the threshold values of the factors that went into a prediction, advising as well as explaining.
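As a minimal sketch of the logistic regression approach, the snippet below trains a tiny model on synthetic data and reads the learned weights back out as an explanation. The feature names and data are purely illustrative assumptions, not taken from any real system; a production project would use an established library rather than hand-rolled gradient descent.

```python
import numpy as np

# Toy data: two hypothetical factors behind a yes/no prediction
# (names and data are illustrative assumptions only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
# In this synthetic world, factor_0 truly matters three times
# more than factor_1.
y = (3 * X[:, 0] + 1 * X[:, 1]
     + rng.normal(scale=0.5, size=200) > 0).astype(float)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Fit logistic regression with plain gradient descent.
w = np.zeros(2)
for _ in range(2000):
    grad = X.T @ (sigmoid(X @ w) - y) / len(y)
    w -= 0.5 * grad

# The learned weights double as the explanation: the larger the
# magnitude, the more that factor drove the prediction.
for name, weight in zip(["factor_0", "factor_1"], w):
    print(f"{name}: weight = {weight:.2f}")
```

Because the weights are inspected directly, a user can see at a glance that the first factor dominates the model's decisions, which is exactly the kind of transparency the article describes.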

Alternatively, you can standardize explainable AI with SHAP (SHapley Additive exPlanations), which can explain any model. Its analyses and visualizations show how different factors add up to the outcome or prediction.
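To make the "factors add up to the prediction" idea concrete, here is a brute-force Shapley value computation for a small hypothetical model. It enumerates every feature coalition and, as a simplifying assumption, replaces features outside a coalition with their background mean; the real shap library uses more sophisticated estimators, but the additive property it relies on is the same.

```python
from itertools import combinations
from math import factorial
import numpy as np

def shapley_values(predict, x, background):
    """Exact Shapley values by enumerating feature coalitions.
    Features outside a coalition are set to the background mean
    (a simplifying assumption for illustration)."""
    n = len(x)
    base = background.mean(axis=0)

    def value(coalition):
        z = base.copy()
        z[list(coalition)] = x[list(coalition)]
        return predict(z)

    phi = np.zeros(n)
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for k in range(n):
            for S in combinations(others, k):
                # Standard Shapley coalition weight |S|!(n-|S|-1)!/n!
                weight = (factorial(len(S)) * factorial(n - len(S) - 1)
                          / factorial(n))
                phi[i] += weight * (value(S + (i,)) - value(S))
    return phi

# Hypothetical linear model: here the contributions recover the
# weighted inputs exactly, and they sum to the gap between the
# prediction and the baseline prediction (SHAP's additivity).
w = np.array([2.0, -1.0, 0.5])
predict = lambda z: float(w @ z)
background = np.zeros((1, 3))
x = np.array([1.0, 1.0, 1.0])
phi = shapley_values(predict, x, background)
print(phi)
```

The per-feature contributions in `phi` are what SHAP plots visualize: each factor's additive push toward or away from the final prediction.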

AI developers also need to consider their users, not just their technology. The people using the XAI system may be physicians leveraging the solution to assess risks to a patient, or a manufacturing manager responsible for machine health and employee safety. They are skilled in their areas of expertise but not necessarily in data science and AI. Your challenge is to find the best way to make how your AI solution works transparent to the people who will use it.

Why You Need to Do What You Can to Increase AI Adoption

Your clients are turning to you for solutions to challenges and pain points that can help them differentiate their businesses. You need to deliver solutions their employees will use – and trust – so that clients see the results and ROI they're looking for. Doing so builds stronger relationships with your clients, strengthens your company's brand reputation, and contributes to your business growth. Add value to your solutions with explainable AI that gives everyone confidence in decision-making.

Bernadette Wilson

Bernadette Wilson, a DevPro Journal contributor, has 19 years of experience as a journalist, writer, editor, and B2B marketer.

