The future of chatbots and online chat in financial services

Where a chatbot makes decisions that could affect a customer's financial interests, the customer should be informed of the chatbot's underlying characteristics. Any limitations of particular features may also need to be disclosed.

In many cases, companies will need to rely on external vendors to develop, deliver, integrate and update their chatbot solutions and online or in-app chat platforms. Financial services companies will need to take into account the commitments the service provider has made.

Performance indicators and testing regimes will need to be carefully designed. Evaluating only the number of customer requests answered successfully without a human agent intervening is unlikely to be enough. Consideration should also be given to data quality standards and the processes governing the development and use of the system, and to the safeguards in place to protect against unfair bias and against software maintenance issues. Such issues can arise where the bot relies on different data sources to feed machine learning models, has hidden dependencies on third-party tags, or incorporates unstable code resulting from rapid iteration and experimentation.
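To make this concrete, the sketch below computes a basic "containment rate" (the share of sessions resolved without a human agent) and breaks it down by customer segment as a crude check for uneven performance. It is a minimal illustration only; the ChatSession structure and its field names are assumptions for the example, not a prescribed standard.

```python
from dataclasses import dataclass

@dataclass
class ChatSession:
    """One customer conversation; fields are illustrative assumptions."""
    resolved: bool            # was the customer's request answered?
    escalated_to_agent: bool  # did a human agent have to intervene?
    customer_segment: str     # e.g. a product or demographic segment

def containment_rate(sessions: list[ChatSession]) -> float:
    """Share of sessions resolved without a human agent intervening."""
    if not sessions:
        return 0.0
    contained = sum(
        1 for s in sessions if s.resolved and not s.escalated_to_agent
    )
    return contained / len(sessions)

def containment_by_segment(sessions: list[ChatSession]) -> dict[str, float]:
    """Break containment down by segment to surface uneven performance
    that a single headline figure would hide (a crude bias check)."""
    by_segment: dict[str, list[ChatSession]] = {}
    for s in sessions:
        by_segment.setdefault(s.customer_segment, []).append(s)
    return {seg: containment_rate(group) for seg, group in by_segment.items()}
```

A single headline containment figure can mask a chatbot that serves some customer groups far worse than others, which is why a per-segment breakdown, alongside data quality checks, belongs in any testing regime.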

Chatbot technology is changing rapidly, so businesses will want to retain the flexibility to switch providers if another vendor develops a product that better meets their needs. When it comes to using live agents online to answer customer queries, companies will need to consider the appropriate arrangements to have in place, especially if live agents are provided by an outsourced service provider.

When a financial services company deploys chatbots or uses live agents online, it is highly likely that some personal data will be processed. Financial services firms will need to consider GDPR compliance issues and the legal grounds for processing. If relying on customer consent to process personal data, companies will need to consider how that consent is obtained. Under the GDPR, consent is only valid if it is “freely given, specific, informed and unambiguous”. One way to obtain customer consent is to present the consent mechanism alongside a privacy notice.

Compliance with consent requirements can impact the customer experience. Financial services firms should consider whether it would be less disruptive to rely on “legitimate interests” as a lawful ground for processing. Under the GDPR, controllers do not need to obtain consent from users where they have a legitimate interest in processing the data and can demonstrate that the processing is necessary to achieve that interest. If companies are relying on legitimate interests as a ground for processing, they should consider preparing a legitimate interests assessment. This will help them determine that “legitimate interests” is an appropriate legal basis for the intended data processing and can help demonstrate compliance with the GDPR accountability principle.

With regard to lawful, fair and transparent processing, measures must be taken to inform the customer of the processing concerned, the personal data collected via the chatbot, and how that data is used by the company and the chatbot. This can be achieved by including the information in the privacy notice and implementing measures to bring it to the attention of customers before they engage with the chatbot.
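By way of illustration, a pre-chat consent gate might look like the sketch below. The message text, button labels, placeholder URL and in-memory store are assumptions for the example; the substantive point is that the notice is surfaced, and an affirmative choice is recorded with a timestamp, before the bot engages.

```python
from datetime import datetime, timezone

PRIVACY_NOTICE_URL = "https://example.com/privacy"  # placeholder URL

def request_chat_consent(user_id: str) -> dict:
    """Build the pre-chat message shown before the bot engages.
    Consent is an explicit, affirmative act, not a pre-ticked box."""
    return {
        "user_id": user_id,
        "text": (
            "Before we start, please note this chat is handled by an "
            f"automated assistant. Our privacy notice: {PRIVACY_NOTICE_URL}"
        ),
        "buttons": ["I agree", "No thanks"],
    }

def record_consent(user_id: str, choice: str, store: dict) -> bool:
    """Persist the consent decision with a timestamp, so the firm can
    later demonstrate when and how consent was obtained."""
    granted = choice == "I agree"
    store[user_id] = {
        "granted": granted,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    return granted
```

Recording the decision and its timestamp, rather than just proceeding, is what supports the accountability principle if the lawful basis is later questioned.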

Financial services firms also need to consider intellectual property risks when using chatbots. In most situations, the company will engage a vendor to deploy the solution, so the ownership and development of any intellectual property should be considered from the outset of any vendor agreement. Businesses need to consider whether they should own the intellectual property in a solution provided by a third party, so that it can be developed further on their behalf, or whether a change of vendor would instead require a new solution to be installed.

Problems have arisen with chatbots that use machine learning tools. It is important to establish who is responsible if a chatbot responds to a customer in a way that could cause them harm. If a chatbot uses abusive language or provides offensive responses, there is a risk that the customer may bring a defamation claim against the business or the service provider. To avoid these situations, it is important that the solution is rigorously tested and its outputs moderated before launch. Companies should ensure that there are strong Software as a Service (SaaS) agreements or licence agreements with all service providers.
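As a rough illustration of output moderation, a firm might screen each candidate reply before it is sent, as in the sketch below. The hand-written blocklist and fallback wording are placeholders invented for the example; a production system would more likely rely on a maintained moderation service or classifier, with escalation to a human agent on a hit.

```python
import re

# Illustrative blocklist only; real deployments would use a maintained
# moderation service or classifier rather than a hand-written list.
BLOCKED_PATTERNS = [
    re.compile(p, re.IGNORECASE) for p in (r"\bidiot\b", r"\bstupid\b")
]

FALLBACK_REPLY = (
    "Sorry, I can't help with that. Let me connect you to an agent."
)

def moderate_reply(candidate: str) -> str:
    """Screen a bot-generated reply before it reaches the customer,
    substituting a safe fallback (and an escalation path) on a hit."""
    if any(p.search(candidate) for p in BLOCKED_PATTERNS):
        return FALLBACK_REPLY
    return candidate
```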

Chatbots can deliver cost savings

The seamless integration of chatbots is likely to become a major part of companies’ customer engagement strategies. Many large financial firms are already moving towards answering the majority of customer queries via online chat platforms and chatbots. This will significantly reduce costs for businesses and the time spent handling customer calls.

Written by Carrie McMeel of Pinsent Masons.
