Gartner’s research predicts that people will increasingly rely on the output of AI solutions. By 2022, 30% of consumers in mature markets will rely on AI to decide what they eat, what they wear, or where they live, according to the analyst firm.
Relying on AI to tell me where I should live may be fine, because I have seen enough data points to validate the “truth” in AI-based recommendations. But in many other areas it may be hard to accept the truth proposed by AI. And that is precisely what makes AI a hard sell.
Why would you trust the recommendations made by an AI engine, especially when it is a black box? Prospects and customers at #codedataio have voiced similar concerns. They are interested in making the leap, but the AI black box makes it hard for them to rely on our recommendations. The “trust factor” will certainly improve over time, as they gather more data points to validate the impact of our recommendations on their businesses.
That’s where Explainable AI (XAI) comes in. We are following how the space evolves and will add tools as they become available to make XAI part of #codaai. Until then, we let your data scientists dive into the data behind the recommendations.
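To make the idea concrete, here is a minimal sketch of one widely used model-agnostic XAI technique: permutation feature importance, which scores each input feature by how much randomly shuffling it degrades the model’s accuracy. The dataset and model here are illustrative stand-ins, not the #codaai recommendation engine.

```python
# Minimal permutation-importance sketch using scikit-learn.
# Assumption: any classifier and tabular dataset would do; we use
# a bundled sklearn dataset so the example is self-contained.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Shuffle each feature column in turn and measure the drop in test
# accuracy; a large drop means the model leaned heavily on that feature.
result = permutation_importance(model, X_test, y_test,
                                n_repeats=10, random_state=0)

# Print the five most influential features.
ranked = sorted(zip(X.columns, result.importances_mean),
                key=lambda pair: -pair[1])
for name, score in ranked[:5]:
    print(f"{name}: {score:.3f}")
```

Techniques like this don’t open the black box itself, but they give data scientists a defensible answer to “why did the model say that?”, which is exactly the trust gap described above.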
To read more on #explainableai, see the article below.
https://www.computerweekly.com/news/252457364/Explainable-AI-How-and-why-did-the-AI-say-true