More and more of the housing providers and local authorities I talk to have recognised that their customers want self-service capability and that a chatbot is the best way to provide it. From providing automated customer service for rent queries to raising a repair, chatbots offer an opportunity to serve customers on multiple digital channels and free up contact centre staff to deal with more complex or sensitive customer problems.
Chatbots can help your contact centre deliver an immediate and responsive service 24/7. They respond to customers, engage them and answer their queries instantly. A good chatbot will automate business transactions and enable tenants to manage their own affairs without the delay and hassle of contacting the customer care team.
Most of us find engaging with a contact centre frustrating, and expectations of our service providers only seem to be increasing. 22 per cent of millennials say they would stop engaging with a brand after one bad interaction, while 67 per cent of respondents to a recent survey by desk.com expect the quickest response to come from using chat to engage with customer support.
Social housing should take pride in how highly it regards customer service and how it places customers at the heart of everything it does. As a sector, we need to take the issues concerning the ethics of AI and chatbots seriously. However, the ethics of chatbots are complex; they cover a range of topics including data ownership, privacy, transparency and abuse.
Rob High, the CTO of IBM Watson, was recently featured in a Forbes article on the subject. The article – “Ethics and Artificial Intelligence with IBM Watson’s Rob High” – reported that the only way for AI to be ethical is for it also to be transparent. High advised that when people interact with a chatbot, they need to know they are talking to a chatbot and not a live person.
Ethics should be at the foundation of how AI is used. This ranges from facial recognition to driverless cars to customer profiling and we should also apply it to how chatbots are built and how customer data is used in any machine learning algorithms. Your chatbot is an extension of your customer care team and how a chatbot behaves will almost certainly influence the perception your tenants have of their landlord. If the chatbot, and indeed the landlord, is unethical then it leads to distrust from residents and potential litigation problems. Ethical chatbots, on the other hand, promote brand loyalty and encourage a relationship built on trust.
Putting the customer first
When an organisation builds a chatbot, it must decide who the bot will serve; does it serve its own needs, or the needs of the customer? For social landlords, the aim tends to be to reduce contact centre call volumes and enable customer self-service across multiple digital channels. Lower call volumes let contact centre agents spend more time with, and respond faster to, the tenants who need or prefer to talk. On that reading, the chatbot is clearly there to serve the needs of tenants. If, however, the chatbot exists to act as a barrier between tenants and the help they need, or simply to cut costs, then its design and purpose should be reconsidered.
In general, an ethical organisation must always put the needs of its customers before its own; I’d expect this always to be the case for a local authority or housing provider. That means providing a product that can automate business processes, such as checking a rent balance or making a payment, rather than one implemented quickly as a box-ticking exercise, such as an FAQ bot with a handful of questions and canned responses. Users should have the option to give feedback about the chatbot so that issues can be identified, ethical behaviour maintained and overall customer satisfaction improved. Bots that use algorithms and machine learning to book repairs or make recommendations should be subject to regular health checks for the same reason.
Are you talking to a human or a chatbot?
Establishing trust between machines and humans works much like building trust between humans. A brand can build trust by aligning expectations with reality, learning from its mistakes and correcting them, listening to customer feedback and being transparent.
Transparency is a critical consideration when designing a customer service chatbot. It comes down to a simple question: is it obvious whether users are talking to a human or a machine? Customers can usually tell the difference between the two, and they expect brands to be honest about it. Customers hardly expect a chatbot to be perfect, but they do want to know what it can and cannot do.
When dealing with sensitive information such as moves, finance and anti-social behaviour (ASB), or when updating contact details, you must have security checks in place.
A tenant should have the option to speak to a real person if the bot is unable to give them the response or service they need, either by transferring to live chat or arranging a call-back.
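The two principles above, disclosing that the user is talking to a bot and handing over to a person when the bot cannot help, can be sketched in code. This is a minimal illustration, not a real product: the function names, the confidence threshold and the escalation wording are all assumptions for the sake of the example.

```python
from typing import Optional

# Assumed cut-off: below this confidence the bot stops guessing and escalates.
FALLBACK_THRESHOLD = 0.4

def greet() -> str:
    # Transparency: disclose up front that this is a machine, say what it
    # can do, and explain how to reach a person.
    return ("Hi, I'm the housing team's automated assistant - a chatbot, "
            "not a person. I can help with rent balances, payments and "
            "repairs. Type 'agent' at any time to reach a real person.")

def offer_handover() -> str:
    # Escalation path: live chat or a call-back, per the article's advice.
    return ("I'm not able to help with that, so I'll pass you to our team. "
            "Would you prefer live chat or a call-back?")

def handle(intent: Optional[str], confidence: float) -> str:
    # Hand over when the user asks for a person, when no intent was
    # recognised, or when the recogniser is not confident enough.
    if intent is None or intent == "agent" or confidence < FALLBACK_THRESHOLD:
        return offer_handover()
    return f"OK, let's sort out your {intent} request."
```

The key design choice is that escalation is always available and is triggered automatically on low confidence, so the bot never becomes a dead end.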
How should a chatbot handle privacy?
The protection and privacy of user data is vital for the modern interconnected world. Laws that protect users’ data, such as GDPR, are a prime example of how important user privacy has become.
When developing a chatbot, the ethics of user privacy must be considered. Doing so helps to answer questions such as:
- Where is the data within the chat transcript stored?
- Can the conversations with a chatbot be studied to improve and optimise the user experience?
- How long should the chat transcript be kept?
- How are tenants authenticated?
- If a complaint is raised via the chatbot, who will see this?
As with so many things, transparency is the best course of action here. The chatbot must ensure the privacy of users’ information during interactions; in effect, an unspoken confidentiality agreement between the user and the bot. This means the bot should encrypt communications and delete transcripts of chats within a reasonable timeframe.
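The retention point can be made concrete with a small sketch: keep transcripts only for a fixed window and purge the rest. The 90-day window and the in-memory list are assumptions for illustration; a real system would enforce the policy in its data store (a scheduled DELETE, for instance) and pick a retention period justified under GDPR.

```python
from datetime import datetime, timedelta, timezone

# Assumed retention policy; the right figure depends on your lawful basis
# for processing and should be agreed with your data protection officer.
RETENTION = timedelta(days=90)

def purge_expired(transcripts, now=None):
    """Return only the transcripts still within the retention window.

    `transcripts` is a list of (started_at, text) tuples with
    timezone-aware timestamps. In production this would be a database
    deletion job, not an in-memory filter.
    """
    now = now or datetime.now(timezone.utc)
    return [(ts, text) for ts, text in transcripts if now - ts <= RETENTION]
```

Making the window an explicit, documented constant is also a transparency measure: it lets you tell tenants exactly how long their conversations are kept.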
Ethics must be at the heart of every action a business takes. Given that chatbots are still relatively new, it’s likely that more ethical concerns will become apparent over time. Housing providers must continue learning from the emerging cases and continue building their guiding principles and ethical standards.
When in doubt, side with the customer and offer transparency.
Scott Summers is the co-founder of Fuzzlab.