Housing Technology

Housing | IT | Telecoms | Business | Ecology


A code of ethics for chatbots

More and more of the housing providers and local authorities I talk to have recognised that their customers want self-service capability and that a chatbot is the best way to provide it. From providing automated customer service for rent queries to raising a repair, chatbots offer an opportunity to serve customers on multiple digital channels and free up contact centre staff to deal with more complex or sensitive customer problems.

Chatbots can help your contact centre deliver an immediate and responsive service 24/7. They respond to customers, engage them and answer their queries instantly. A good chatbot will automate business transactions and enable tenants to manage their own affairs without the delay and hassle of contacting the customer care team.

Most of us find engaging with a contact centre frustrating, and our expectations of service providers seem to be rising. Some 22 per cent of millennials say they would stop engaging with a brand after one bad interaction, while 67 per cent of respondents to a recent survey by desk.com expect the quickest response when using chat to engage with customer support.

Social housing should take pride in how highly it regards customer service and how it places customers at the heart of everything it does. As a sector, we need to take the issues concerning the ethics of AI and chatbots seriously. However, the ethics of chatbots are complex; they cover a range of topics including data ownership, privacy, transparency and abuse.

Rob High, the CTO of IBM Watson, was recently featured in a Forbes article on the subject. The article – “Ethics and Artificial Intelligence with IBM Watson’s Rob High” – argued that the only way for AI to be ethical is for it also to be transparent. High advised that when people interact with a chatbot, they need to know they are talking to a chatbot and not a live person.

Ethics should be at the foundation of how AI is used, from facial recognition to driverless cars to customer profiling, and the same applies to how chatbots are built and how customer data is used in any machine-learning algorithms. Your chatbot is an extension of your customer care team, and how it behaves will almost certainly influence tenants’ perception of their landlord. If the chatbot, and indeed the landlord, behaves unethically, the result is distrust from residents and potential litigation. Ethical chatbots, on the other hand, promote brand loyalty and encourage a relationship built on trust.

Putting the customer first

When an organisation builds a chatbot, it must decide who the bot will serve: its own needs or the needs of the customer? For social landlords, the aim tends to be to reduce contact centre call volumes and enable customer self-service across multiple digital channels. By reducing call volumes, contact centre agents can spend more time with, and respond faster to, those tenants who need or prefer to talk. As such, it’s clear that the chatbot is there to serve the needs of tenants. If, on the other hand, the chatbot is there to act as a barrier between tenants and the help they need, or simply to cut costs, then its design and purpose should be reconsidered.

In general, an ethical organisation must always put the needs of its customers before its own; I’d expect this always to be the case for a local authority or housing provider. That means providing a product that can automate business processes, such as checking a rent balance or making a payment, rather than one implemented quickly as a box-ticking exercise, such as an FAQ bot with a handful of questions and canned responses. Users should have the option to give feedback about the chatbot so that issues can be identified, ethical behaviour maintained and overall customer satisfaction improved. Bots that use algorithms and machine learning to book repairs or make recommendations should be subjected to regular health checks to meet this need.
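To make the feedback and health-check idea concrete, here is a minimal sketch in Python. All the names (`FeedbackLog`, the 1–5 rating scale, the 3.5 threshold) are hypothetical illustrations, not a real product’s API:

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class FeedbackLog:
    """Collects per-conversation ratings so the chatbot's health can be reviewed."""
    ratings: list = field(default_factory=list)

    def record(self, rating: int) -> None:
        # Accept ratings on a simple 1-5 satisfaction scale
        if not 1 <= rating <= 5:
            raise ValueError("rating must be between 1 and 5")
        self.ratings.append(rating)

    def needs_review(self, threshold: float = 3.5, minimum_samples: int = 10) -> bool:
        """Flag the bot for a health check when average satisfaction drops."""
        if len(self.ratings) < minimum_samples:
            return False  # not enough data to judge yet
        return mean(self.ratings) < threshold

log = FeedbackLog()
for r in [5, 4, 2, 3, 2, 3, 2, 4, 2, 3]:
    log.record(r)
print(log.needs_review())  # average is 3.0, below the 3.5 threshold
```

A scheduled review of this flag, alongside spot-checks of actual transcripts, would give the “regular health check” the article calls for.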

Are you talking to a human or a chatbot?

Establishing trust between machines and humans works much like building trust between humans. A brand builds trust by aligning expectations with reality, learning from and correcting its mistakes, listening to customer feedback and being transparent.

Transparency is a critical consideration when designing a customer service chatbot. It comes down to a simple question: is it obvious whether users are talking to a human or a machine? Customers can usually tell the difference between the two, and they expect brands to be honest about it. Customers don’t expect a chatbot to be perfect, but they do want to know what it can and cannot do.
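One simple way to deliver that transparency is for the bot to open every conversation by declaring what it is and what it can do. The sketch below is illustrative only; the bot name and capability list are made-up examples:

```python
def greeting(bot_name: str, capabilities: list) -> str:
    """Open every conversation by disclosing that the user is talking to a bot
    and stating plainly what it can and cannot help with."""
    topics = ", ".join(capabilities)
    return (
        f"Hi, I'm {bot_name}, an automated assistant (not a person). "
        f"I can help with: {topics}. "
        "For anything else, I'll connect you to a member of the team."
    )

print(greeting("RentBot", ["rent balances", "repairs", "payments"]))
```

A disclosure like this satisfies Rob High’s advice earlier in the article: the tenant knows from the first message that they are talking to a machine, and what its limits are.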

When dealing with sensitive information such as moves, finance and anti-social behaviour (ASB), or when updating contact details, you must have security checks in place.

A tenant should have the option to speak to a real person if the bot is unable to give them the response or service they need, either by transferring to live chat or arranging a call-back.
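The two rules above, security checks before sensitive topics and a human handoff when the bot is stuck, can be sketched as a simple routing function. Everything here (the intent names, the return values) is a hypothetical illustration rather than any particular platform’s API:

```python
# Intents the bot treats as sensitive; these require identity verification first
SENSITIVE_INTENTS = {"moves", "finance", "asb", "update_contact_details"}

def route(intent: str, authenticated: bool, bot_can_answer: bool) -> str:
    """Decide how to handle a message: answer it, verify identity first,
    or hand the tenant over to a human agent."""
    if intent in SENSITIVE_INTENTS and not authenticated:
        return "verify_identity"      # run security checks before proceeding
    if not bot_can_answer:
        return "transfer_to_human"    # live chat or an arranged call-back
    return "answer"

print(route("finance", authenticated=False, bot_can_answer=True))       # verify_identity
print(route("rent_balance", authenticated=True, bot_can_answer=False))  # transfer_to_human
```

The key design point is that the human-handoff path is always reachable: the bot never becomes a dead end between the tenant and the contact centre.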

How should a chatbot handle privacy?

The protection and privacy of user data is vital in the modern, interconnected world. Laws that protect users’ data, such as the GDPR, show just how important user privacy has become.

When developing a chatbot, the ethics of user privacy must be considered. That means answering questions such as:

  • Where is the data within the chat transcript stored?
  • Can the conversations with a chatbot be studied to improve and optimise the user experience?
  • How long should the chat transcript be kept?
  • How are tenants authenticated?
  • If a complaint is raised via the chatbot, who will see this?

As with so many things, transparency is the best course of action here. The chatbot must ensure the privacy of users’ information during interactions; in effect, an unspoken confidentiality agreement between the user and the bot. This means the bot should encrypt communications and delete transcripts of chats within a reasonable timeframe.
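Deleting transcripts “within a reasonable timeframe” can be enforced with a scheduled purge. The sketch below assumes a 90-day retention window purely for illustration; the right figure depends on your own data-protection policy:

```python
from datetime import datetime, timedelta, timezone

# Assumed retention policy; set this to match your data-protection rules
RETENTION = timedelta(days=90)

def purge_expired(transcripts: dict, now: datetime = None) -> dict:
    """Return only the transcripts still within the retention window;
    everything older is dropped (i.e. deleted from the store)."""
    now = now or datetime.now(timezone.utc)
    return {cid: ts for cid, ts in transcripts.items() if now - ts <= RETENTION}

now = datetime(2020, 3, 1, tzinfo=timezone.utc)
store = {
    "chat-1": now - timedelta(days=10),   # recent, kept
    "chat-2": now - timedelta(days=120),  # past retention, deleted
}
print(sorted(purge_expired(store, now)))  # ['chat-1']
```

Running a purge like this on a schedule, alongside encrypted transport, gives the “unspoken confidentiality agreement” some operational teeth.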

Ethics must be at the heart of every action a business takes. Given that chatbots are still relatively new, it’s likely that more ethical concerns will become apparent over time. Housing providers must continue learning from the emerging cases and continue building their guiding principles and ethical standards.

When in doubt, side with the customer and offer transparency.

Scott Summers is the co-founder of Fuzzlab.

See More On:

  • Vendor: Fuzzlab
  • Topic: Infrastructure
  • Publication Date: 074 - March 2020
  • Type: Contributed Articles
