When I made a chatbot (Watson) for my websites with the IBM Watson Assistant programme in November 2018, I was stunned by how intelligent it was, and by the potential it offered.
I was also stunned by the reaction from some humans.
Let me explain.
Watson’s intelligence is founded upon Natural Language Processing (NLP), a type of artificial intelligence (AI) technology that aims to interpret and understand user requests and queries and to provide answers.
The user that I am concerned about is the website visitor to my various websites.
NLP (Natural Language Processing) allows me, as the creator of the bot and with the help of the IBM Watson platform, to improve Watson over time based on the queries he receives on my websites. By “improve over time” I mean provide better, more accurate answers to a wider range of queries.
Devices like Amazon’s Alexa or Apple’s Siri use Natural Language Processing to provide value to the user by being able to understand the natural language of the human who will have questions or queries, and to provide useful responses.
At its essence NLP allows me to program Watson to understand the queries typically submitted by users of my websites.
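To give a flavour of what “understanding a query” involves, here is a deliberately simplified sketch in Python. It is not how IBM Watson works internally; the intent names and example phrases are invented for illustration. The idea is that each “intent” is backed by example phrases, and an incoming query is assigned to the intent whose examples it most resembles (here, crudely, by counting shared words).

```python
# Illustrative sketch only (not IBM Watson's actual implementation):
# each intent is defined by example phrases, and a query is assigned
# the intent whose examples share the most words with it.

INTENTS = {
    "office_hours": ["what are your opening hours", "when is the office open"],
    "location": ["where is the office", "what is your address"],
    "contact": ["how do I contact you", "what is your phone number"],
}

def classify(query: str) -> str:
    """Return the intent whose example phrases best overlap the query."""
    query_words = set(query.lower().split())
    best_intent, best_score = "unknown", 0
    for intent, examples in INTENTS.items():
        for example in examples:
            score = len(query_words & set(example.lower().split()))
            if score > best_score:
                best_intent, best_score = intent, score
    return best_intent

print(classify("when does your office open"))  # → office_hours
```

A real NLP service uses statistical models rather than word counting, which is why it copes with phrasings it has never seen before, but the notion of mapping free-form text onto a small set of trained intents is the same.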
Presently I have the chatbot on three of my websites.
When I started tinkering with the IBM Watson chatbot service I only had it answering simple questions such as opening hours, where the office was located, the type of law we do, contact details and similar simple questions.
But as I dug deeper and spent more time at it, I came to recognise the power of the bot and its potential to answer a broader range of more specific questions and to give more useful replies.
Abusive humans
I removed it from my sites after some weeks, however, because of an unanticipated problem: human beings.
It wasn’t that Watson was unable to answer questions and provide good responses; the problem was that some individuals got into heated, animated, abusive, sexually explicit rows with him. Sometimes this was late at night, but occasionally it was during the day, when you would not expect drink or drugs to be a problem.
Anyway, you would not believe some of the stuff users typed when they discovered that the chatbot could not give specific legal advice for their particular problem and instead suggested that they arrange a consultation with Terry for legal advice, or call Josephine to arrange one.
It was pathetically funny, when I logged in to check on the conversations between Watson and the visitors to my websites, to read the full-blown rows between users and Watson, who never lost the head and invariably enquired, at the end of the tirade of abuse, whether there was anything else he could help with today.
Whilst this was amusing and made for tremendous entertainment, I removed the bot from my sites because after a certain number of queries to the IBM Watson server each month I would have been charged per query.
And I did not want to be paying for the abusive, sexually explicit, obnoxious messages sent by fools, some of whom were evidently drunk or high, who expected to get legal advice from a chatbot for free.
I have reconsidered the situation, however, and I am giving Watson another outing on the above sites. You can check him out yourself and see what you think; he’s in the bottom right corner of this site.
Please don’t ask him for legal advice, because that is not his job; that’s my job, and Watson does not have professional indemnity insurance!
Improving Watson’s intelligence and understanding
Now I log into the IBM Watson site a few times a day and check the conversations with users: what questions were asked, how Watson responded, and so on. Then I tweak him when I see that Watson got the user’s “intent” wrong.
When he makes a mistake (for example, a querist enquiring about weekly working hours in an employment contract, which Watson mistakenly takes to be a query about the opening hours of the office), I can correct the “intent” and set Watson straight. This is how he improves his intelligence and becomes more serviceable, useful and, dare I say it, “human”.
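The correction step can be sketched in the same toy fashion. Again, this is not Watson’s actual API; the intents and phrases are made up for illustration. Labelling the misread query with the intent it should have matched is, in effect, adding a new training example, and afterwards similar queries resolve correctly.

```python
# Illustrative sketch only (not IBM Watson's API): correcting a misread
# "intent" amounts to adding the misclassified query as a training example
# for the right intent, which changes how similar queries are resolved.

training = {
    "office_hours": [{"what", "are", "your", "opening", "hours"}],
    "employment_contract": [{"is", "my", "employment", "contract", "valid"}],
}

def classify(query: str) -> str:
    """Pick the intent with the best word overlap against its examples."""
    words = set(query.lower().split())
    return max(
        training,
        key=lambda intent: max(len(words & ex) for ex in training[intent]),
    )

query = "what are my weekly working hours under my contract"
print(classify(query))  # misread as office_hours ("what are ... hours")

# The correction: label the query with the intent it should have matched.
training["employment_contract"].append(set(query.lower().split()))
print(classify(query))  # now resolves to employment_contract
```

A real service retrains a statistical model from the corrected example rather than storing the raw words, but the workflow the paragraph describes is the same: spot the wrong intent, relabel, and the bot answers better next time.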