Virtual Assistant Update

 

We recently published “Virtual Assistant Update.” It’s a broad and not too deep update on virtual assistant technologies, products, suppliers, and markets from the perspective of the five leading suppliers: [24]7, Creative Virtual, IBM, Next IT, and Nuance. These are the leaders because they:

  • Have been in the virtual assistant business for some time (from 16 years for [24]7 via its acquisition of IntelliResponse to four years for IBM).
  • Have attractive and useful virtual assistant technology.
  • Offer virtual assistant products that are widely used and well proven.
  • Want to be in the virtual assistant business and have company plans and product plans to continue.

The five suppliers are quite diverse. There’s the public $80 billion IBM and the public $2 billion Nuance. Then there are the private [24]7, a venture-backed company big on acquisitions, and the more closely held Creative Virtual and Next IT. Despite these big corporate-level differences, the five’s virtual assistant businesses are quite similar. Roughly, they’re all about the same size, and the five compete as equals to acquire and retain virtual assistant business.

By the way, over the past 12 to 24 months, business has been good for all five suppliers. Customer growth has been very good across the board. Our suppliers have expanded into new markets and have introduced new and/or improved products.

Natural Language Processing and Machine Learning

Technologies are quite similar, too. All five have built their virtual assistant offerings with the same core technologies: Natural Language Processing (NLP) and machine learning.

Virtual Assistants use NLP to recognize intents of customer requests. NLP implementations usually comprise an engine that processes customer requests using an assortment of algorithms to parse and understand the words and phrases in a customer’s request. An NLP engine’s processing is guided by customizable and/or configurable deployment-specific mechanisms such as language models, grammars, and rules. These mechanisms accommodate the vocabularies of a deployment’s business, products, and customers.

Virtual assistants use machine learning technology to match actual customer requests with anticipated customer requests and then to select the content or execute the logic associated with the anticipated requests. (Machine learning algorithms learn from and then make predictions on data. Algorithms learn from training. Analysts/scientists train them with sample, example, or typical deployment-specific input then with feedback or supervision on correct and incorrect predictions. A trained algorithm is a deployment-specific machine learning model. The accuracy of models can improve with additional and continuing training. Some machine learning implementations are self-learning.)
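The training-and-matching loop described above can be sketched in a few lines. This is a toy illustration only, not any supplier’s implementation; the intents, sample utterances, and bag-of-words matching below are our own assumptions standing in for a real NLP engine and trained model.

```python
from collections import Counter
import math

# Hypothetical deployment-specific training samples: utterances labeled
# with the intents they anticipate. Real deployments use far richer data.
TRAINING = {
    "make_payment": ["I want to pay my bill", "how do I make a payment",
                     "pay my account balance"],
    "reset_password": ["I forgot my password", "reset my login password",
                       "can't log in to my account"],
}

def vectorize(text):
    """Bag-of-words count vector for an utterance."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# "Training" here just precomputes one vector per intent from its samples.
MODELS = {intent: vectorize(" ".join(samples))
          for intent, samples in TRAINING.items()}

def classify(request):
    """Match an actual customer request to the closest anticipated intent."""
    return max(MODELS, key=lambda intent: cosine(vectorize(request), MODELS[intent]))
```

Feedback or supervision would adjust TRAINING and rebuild MODELS, which is the (much simplified) sense in which a model’s accuracy improves with continuing training.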

Complex and Sophisticated Work: Consultant-led or Consultant-assisted

The work to adapt NLP and machine learning technology implementations for virtual assistant deployments is sophisticated and complex. This is work for experts: scientists, analysts, and developers in languages, data, and algorithms. The approach to this work differentiates virtual assistant suppliers and products. The approach drives virtual assistant product selection. Here’s what we mean.

All the virtual assistant suppliers have built tools and packaged predefined resources to make the work simpler, faster, and more consistent. Some suppliers have built tools for the experts, and these suppliers have also built consulting organizations with the expertise to use their tools. Successful deployments of their virtual assistant offerings are consultant-led: they require the services of the suppliers’ (or the suppliers’ partners’) consulting organizations.

Some suppliers have built tools that further abstract the work and make it possible for analysts, business users, and IT developers to deploy. While these suppliers have also built consulting organizations with expertise in virtual assistant technologies and in their tools, successful deployments of their virtual assistant offerings are consultant-assisted and may even approach self-service.

So, a key factor in the selection of a virtual assistant product is deployment approach: consultant-led or consultant-assisted. Creative Virtual, Next IT, and Nuance offer consultant-led virtual assistant deployments. [24]7 and IBM offer consultant-assisted deployments. For example, IBM Watson Virtual Agent includes tools that make it easy to deploy virtual assistants. In the Illustration below, we show the workspace wherein analysts specify the virtual assistant’s response to the customer request to make a payment. Note that the possible responses leverage content, tools, and facilities packaged with the product.

ibm watson va illos

© 2017 IBM Corporation

Illustration 7. This Illustration shows the Watson Virtual Agent workspace for specifying responses from the bot/virtual assistant.

 

Which is the better approach? Consultant-assisted is our preference, but we’ve learned over our long years of research and consulting that deployment approach is a function of corporate style, personality, and culture. Some businesses and organizations give consultants the responsibility for initial and ongoing technology deployments. Some businesses want to do it themselves. For virtual assistant software, corporate style could very well be a key factor in product selection.


Evaluating Customer Service Products

Framework-based, In-depth Product Evaluation Reports

We recently published our Product Evaluation Report on Desk.com, Salesforce’s customer service offering for small and mid-sized businesses. “Desk” is a very attractive offering with broad and deep capabilities. It earns good grades on our Customer Service Report Card, including Exceeds Requirements grades in Knowledge Management, Customer Service Integration, and Company Viability.

We’re confident that this report provides input and guidance to analysts in their efforts to evaluate, compare, and select customer service products, and we know that it provides product assessment and product planning input for the product’s managers. Technology analysts and product managers are the primary audiences for our reports. We research and write to help exactly these roles. Like all of our Product Evaluation Reports about customer service products that include multiple apps—case management, knowledge management, web self-service, communities, and social customer service—it’s a big report, more than 60 pages.

Big is good. It’s their depth and detail that makes them so. Our research for them always includes studying a product’s licensed admin, user, and, when accessible, developer documentation, the manuals or online help files that come with a product. We read the patents or patent applications that are a product’s technology foundation. Whenever offered, we deploy and use the products. (We took the free 30-day trial of Desk.) We’ll watch suppliers’ demonstrations, but we rely on the actual product and its underlying technologies.

On the other hand, we’ve recently been hearing from some, especially product marketers charged to review report drafts (we never publish without the supplier’s review), that the reports are too big. Okay. Point taken. Perhaps it is time to update our Product Evaluation Framework, the report outline, to produce shorter, more actionable reports: reports with no less depth and detail but with less descriptive content and more salient analytic content. It’s also time to tighten up our content.

Product Evaluation Reports Have Two Main Parts

Our Product Evaluation Reports have had two main parts: Customer Service Best Fit and Customer Service Technologies. Customer Service Best Fit “presents information and analysis that classifies and describes customer service software products…speed(ing) evaluation and selection by presenting easy to evaluate characteristics that can quickly qualify an offering.” Customer Service Technologies examines the implementations of a product’s customer service applications and their foundation technologies as well as its integration and reporting and analysis capabilities. Here’s where the reports’ depth and detail (and most of the content) reside. Going forward, we’ll continue with this organization.

Streamlining Customer Service Best Fit

We will revamp and streamline Customer Service Best Fit, improving naming and emphasizing checklists. The section will now have this organization:

  • Applications, Channels, Devices, Languages
  • Packaging and Licensing
  • Supplier and Product
  • Best Prospects and Sample Customers
  • Competitors

Applications, Channels, Devices, Languages are lists of key product characteristics, characteristics that quickly qualify a product for deeper consideration. More specifically, applications are the sets of customer service capabilities “in the box” with the product—case management, knowledge management, and social customer service, for example. Channels are assisted-service, self-service, and social. We list apps within supported channels to show how what’s in the box may be deployed. Devices are the browsers and mobile devices the product supports for internal users and for end customers. Languages are two lists: one for the languages the product supports for its administration and internal users and one for the languages it supports for end customers.

Packaging and Licensing presents how the supplier offers the product, the fees that it charges for the offerings, and the consulting services available and/or necessary to help licensees deploy the offerings.

Supplier and Product presents high-level assessments of the supplier’s and the product’s viability. For the supplier, we present history, ownership, staffing, financial performance, and customer growth. For the product, we present history, current development approach, release cycle, and future plans.

Best Prospects and Sample Customers are lists of the target markets for the product—the industries, business sizes, and geographies wherein the product best fits. This section also covers the current customer base for the product, lists typical/sample customers within those target markets, and, if possible, presents screen shots of their deployments.

Competitors lists the product’s closest competitors, its best alternatives. We’ll also include a bit of analysis explaining what makes them the best alternatives and where the subject product has differentiators.

Tightening-up Customer Service Technologies

Customer Service Technologies is the key value-add and most significant differentiator of our Product Evaluation Reports. It’s why you should read our reports, but, as we mentioned, it’s also the main reason why they’re big.

We’ve spent years developing and refining the criteria of our Evaluation Framework. The criteria are the results of continuing work with customer service products and technologies and of our complementary work with the people who are products’ prospects, licensees, suppliers, and competitors. We’re confident that we evaluate the technologies of customer service products by the most important, relevant, and actionable criteria. Our approach creates common, supplier-independent and product-independent analyses. These analyses enable the evaluation and comparison of similar customer service products and result in faster and lower-risk selection of a product that best fits a set of requirements.

However, we have noticed that the descriptive content that forms the basis for our analyses has gotten a bit lengthy and repetitive (repeating information in Customer Service Best Fit). We plan to tighten up Customer Service Technologies content and analysis in these ways:

  • Tables
  • Focused Evaluation Criteria
  • Consistent Analysis
  • Reporting

Too much narrative and analysis has crept into Tables. We’ll make sure that Tables are bulleted lists with little narrative and no analysis.

Evaluation criteria have become too broad. We’ve been including detailed descriptions and analyses of related and supported resources along with the resource that’s the focus of the evaluation. For example, when we describe and analyze the details of a case model, we’ll no longer also describe and analyze the details of user and customer models. Rather, we’ll just describe the relationships between the resources.

Our analyses will have three sections. The first will summarize what’s best about a product. The second will present additional description and analysis where Table content needs further examination. The third will be “Room for Improvement,” areas where the product is limited. This approach will make the reports more actionable and more readable as well as shorter.

In reporting, we’ll stop examining instrumentation, the collection and logging of the data that serves as report input. The presence (or absence) of reports about the usage and performance of customer service resources is really what matters. So, we’ll call the criterion “Reporting” and we’ll list the predefined reports packaged with a product in a Table. We’ll discuss missing reports and issues in instrumentation in our analysis.

Going Forward

Our Product Evaluation Report about Microsoft Dynamics CRM Online Service will be the first to be written on the streamlined Framework. Expect it in the next several weeks. Its Customer Service Best Fit section really is smaller. Each of its Customer Service Technologies sections is smaller, too, more readable and more actionable as well.

Here’s the graphic of our Product Evaluation Framework, reflecting the changes that we’ve described in this post.

Slide1

Please let us know if these changes make sense to you and please let us know if the new versions of the Product Evaluation Reports that leverage them really are more readable and more actionable.

Salesforce Service Cloud

Evaluation of Service Cloud Winter ’15

This week’s report is our evaluation of Salesforce Service Cloud and its collection of tightly integrated but variously packaged and priced features and add-on products—Service Cloud, itself, for case management and contact center support, Salesforce Knowledge for knowledge management, Live Agent for chat, Social Studio for social customer service, and Salesforce Communities for communities and for customer self-service. Winter ’15 is the current release of the offering and the release that we evaluated in this report.

The offering earns an excellent evaluation against the criteria of our Framework for Customer Service Applications. We found no areas where significant improvement is required.

We had last published an evaluation of Service Cloud Winter ’13 on January 24, 2013. Winter ’15 is the sixth of the regular cycle of Winter, Spring, and Summer releases since that date. Every new release has included significant new and/or improved capabilities.

Salesforce Communities – a New Platform for Customer Self-Service

Salesforce Communities is one of the new capabilities in Winter ’15. It packages an attractive set of facilities, facilities that let customers perform a wide range of collaboration and self-service activities and tasks. However, none of these facilities use new technology; all of them have been existing features of Salesforce applications. What’s new and what’s innovative is their use as the platform for customer self-service. With Communities, Salesforce.com has extended the customer service provider-centric, web content-intensive self-service of portals with social and collaborative self-service that lets customers (and customer service agents) answer and solve customers’ questions and problems. Here’s what we mean.

Customers can use Communities’ packaged, portal-style facilities to perform these self-service tasks:

  • Search a Salesforce Knowledge knowledgebase to find existing answers and solutions for similar questions and problems.
  • Browse a hierarchy of “Topics” to find existing answers and solutions to their problems in the knowledgebase or within community content.
  • Create new Service Cloud Cases when they can’t find answers and/or solutions via searching or browsing a knowledgebase, or by browsing Topics and community content.
  • Note that during the case creation process, Communities uses Automatic Knowledge Filtering, a Salesforce Knowledge feature that automatically suggests knowledgebase Articles relevant to the content of the fields of the new Case.
  • Contact support for escalation to assisted-service.

Customers can also use Communities’ packaged social and collaborative facilities to perform self-service tasks:

  • Post their questions or problems on a threaded, post-and-reply forum to solicit answers and solutions from other customers or from customer service staff members who monitor community activity. Note that Communities’ threads are implemented with Salesforce Chatter Feeds. Feeds are Twitter-like stacks of posts and replies/comments.
  • Search post-and-reply Feeds to find existing answers and solutions or previously posted questions and problems and replies/comments about them.

You may have read these lists of bullet points and said, “So what? There’s nothing new here. We already have these facilities on our portal and on our community.” Exactly right, but that separate portal-and-community approach forces customers to go to two places to find answers and solutions. Based on the experience that you’ve given them, they go to one place or the other depending on the type of question or problem they have or on the quality and usefulness of the answers and solutions that they’ve found. Salesforce Communities gives customers one place to go for self-service answers and solutions. One place, not two, makes it easier and faster for them to do business with you and makes it easier and more efficient for you to do business with them.

community.seagate.com

For example, Seagate Technology LLC, the provider of hard disk drives and storage solutions based in Cupertino, CA, has a Salesforce Communities-based self-service site. Its home page is shown in the screen shot below.

seagate blog1

As a Mac user needing some advice on drives for backups, I clicked on the Mac Storage Topic and was taken to the Mac Storage products page shown below in the next screen shot. This page presents a combined list of questions, (Salesforce Knowledge) Articles, Solved Questions, Unsolved Questions, and Unanswered Questions in the center, with a drop-down at the top of the list to filter the presentation. Links to product-specific pages are at the left.

seagate blog 2

At the bottom of the Mac Storage Product Page are links to additional customer service facilities, including “Get Help from Support.” We show them in the screen shot below.

seagate blog 3

The Seagate community offers a complete set of easy-to-use self-service facilities. Community-style self-service gives customers everything they need for customer service—finding answers and solutions or getting assisted-service when answers and solutions don’t exist or can’t be found.

Tools and Templates

By the way, Salesforce Communities includes tools and reusable templates that can make it easy and fast to deploy customer self-service communities. Community Designer is the toolset for building and managing the web pages of Communities deployments. Community Designer can also customize the three web page templates packaged with Communities: Koa, Kokua, and Napili. For example, the web pages for the Koa self-service template contain facilities that let customers search for or navigate to Salesforce Knowledge Articles by categories called Topics, or contact support if they can’t find answers or solutions.

Salesforce.com is changing and improving self-service with Salesforce Communities. What a good idea!


Framework for Evaluating Customer Service Products

This week’s report is a new version of our Framework for Evaluating Customer Service Software Products. We had two goals for its design. First, we wanted your evaluation, comparison, and selection processes to be simpler and faster. Second, we wanted shorter and more actionable Product Review Reports. The new Framework eliminates evaluation criteria that do not differentiate. For example, we no longer analyze and evaluate web content management for a product’s self-service and assisted-service UIs. These UIs have become a bit static. They’re configurable and localizable, but they’re no longer as customizable and manageable as they had been. The new Framework also decreases the number of factors (sub-criteria) that we consider within an evaluation criterion. For example, the Knowledge Management criterion now has two factors: Knowledge Model and Knowledge Management Services. The previous version of the Framework examined these and six others.

We also added a criterion—Case Management. When we began evaluating customer service products back in 1993, we felt that case management, while a critical customer service process, was well understood, did not differentiate, and was not really customer-centric. We’ve changed our point of view. We still believe that the purpose for customer service is answering customers’ questions and solving customers’ problems. However, we also recognize that at the point in time that a customer asks a question or poses a problem you might not have an answer or solution available. You create a case to represent that question or problem, your process to resolve the case is a process to find or develop an answer or solution, and its resolution is, itself, the answer or solution. Our evaluation of case management considers four factors that focus on a product’s packaged services and tools for performing the tasks of the case management process. The process includes finding and using case resolutions in communities and social networks.

Customer Service Best Fit and Customer Service Technologies are the Framework’s two top-level evaluation criteria. Customer Service Best Fit presents information and analysis that classifies and describes customer service software products. Customer Service Technologies examines the implementation of a product’s customer service applications. The graphic below shows the Framework, its top-level criteria, and their sub-criteria.

framework

We plan to use the Framework to evaluate every type of customer service product within our current research—case management, knowledge management, virtual assistant, and social network monitoring, analysis, and interaction. The Customer Service Best Fit criterion applies very nicely to any product. The application of the Customer Service Technologies criterion is product-type dependent. Look for our Product Review Report on Salesforce Service Cloud. It will be the first against the new Framework. Based on the draft of that report, the Framework works very nicely.

Voices of Customers

With this week’s report, the 4Q2013 Customer Service Update, we complete our tenth year of quarterly updates on the leading suppliers and products in customer service. These updates have focused on the factors that are important in the evaluation, comparison, and selection of customer service products:

  • Customer Growth
  • Financial Performance
  • Product Activity
  • Company Activity

Drawing from the framework of our reports: for Company Activity, we cover company-related announcements, press releases, and occurrences that are important to our analysis of quarterly performance. In 4Q2013, three of our suppliers, Creative Virtual, KANA, and Nuance, published the results of surveys that they had conducted or sponsored over the previous several months. All of the surveys were about customer service, and the answers to survey questions demonstrated customers’ approach, behavior, preferences, and issues in their attempts to get service from the companies with which they’ve chosen to do business. The responses to these surveys are the Voices of the Customers for and about customer service. This is wonderful stuff.

Now, to be sure, suppliers conduct surveys for market research and marketing purposes. Suppliers’ objectives for surveys are using the Voice of the Customer to prove/disprove, validate, demonstrate, or even promote their products, services, or programs. Certainly, all of the surveys our suppliers published achieved those objectives. For this post, though, let’s focus on the broader value of the surveys: the Voice of the Customer for Customer Service.

Surveys

The objectives in many of the surveys represent the activities that customers perform, the steps that customers follow to get customer service from the companies with which they choose to do business. By getting customer service, we mean getting answers to their questions and (re)solutions to their problems. Ordering our examination and analysis of the surveys in customers’ typical sequence of these steps organizes them into a Customer Scenario. Remember that a Customer Scenario is the sequence of activities that customers follow to accomplish an objective that they want or need to achieve. For a customer service Customer Scenario, customers typically:

  • Access Customer Service. Customers log in to their accounts or to the customer service section of their companies’ web sites, or call their companies’ contact centers and get authenticated to speak with customer service agents.
  • Find Answers and (Re)solutions. Use self-service, social-service, virtual-assisted-service, and/or assisted-service facilities to try to help themselves, seek the help of their peers, or seek the help of customer service agents for answers and (re)solutions.
  • Complain. If customers cannot get answers or (re)solutions using these facilities, they complain to their companies.

Here, in Table 1 below, are the surveys that examine how customers perform these activities and how companies support those activities. Note that these surveys are a subset of the surveys published by our suppliers; not all of their surveys mapped directly to customer activities. Note also that our analyses of survey results are based on the content of the press releases about the surveys. This content is a bit removed from the actual survey data.

Sponsor | Survey Objective | Activity | Respondents
Nuance | Privacy and security of telephone credentials | Access | Smartphone users
Nuance | Telephone authentication issues and preferences | Access | US consumers
KANA | Email response times for customer service | Find answers and (re)solutions | N/A
KANA | Twitter response times for customer service | Find answers and (re)solutions | N/A
Nuance | Resolving problems using web self-service | Find answers and (re)solutions | Web self-service users, 18–45 years old
Nuance | Issues with web self-service | Find answers and (re)solutions | Windstream Communications customers
KANA | Usage of email vs. telephone for complaints | Complain | N/A
KANA | Customer communication channels for complaints | Complain | UK consumers
KANA | Customer complaints | Complain | US consumers, 18 years old and older

Table 1. We list and describe customer service surveys published by KANA and Nuance during 4Q2013 in this Table.

Let’s listen closely to the Voices of the Customers as they perform the activities of the customer service Customer Scenario. For each of the surveys in the Table, we’ll present the published survey results, analyze them, and suggest what businesses might do to help customers perform the activities faster, more effectively, and more efficiently.

Access

If questions and problems are related to their accounts, before customers can ask questions or present problems, they have to be authenticated on the customer service system that handles and manages questions and problems. Authentication requires usernames and passwords, login credentials. In these times of rampant identity theft, security of credentials has become critically important.

Nuance’s surveys on privacy and security of telephone credentials and on telephone authentication shed some light on customers’ issues with authentication.

  • 83 percent of respondents are concerned or very concerned about the misuse of their personal information.
  • 85 percent of respondents are dissatisfied with current telephone authentication methods.
  • 49 percent of respondents stated that current telephone authentication processes are too time consuming.
  • 67 percent of respondents have more than eleven usernames and passwords.
  • 80 percent of respondents use the same login credentials across all of their accounts.
  • 67 percent of respondents reset their login credentials between one and five times per month.

Yikes! Consumers spend so much time and effort managing and, then, using their credentials. We’ve all experienced the latest account registration pages that grade our new or reset passwords from “weak” to “strong” and reject our weakest passwords. While strong passwords improve the security of our personal data, they’re hard to remember and they increase the time we spend in their management.
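The weak-to-strong grading that these registration pages apply can be approximated with a few simple checks. The scoring rules below are our own illustrative assumptions, not any site’s actual policy.

```python
import re

def grade_password(pw):
    """Grade a password from "weak" to "strong" (illustrative rules only)."""
    score = 0
    if len(pw) >= 8:
        score += 1  # reasonable length
    if re.search(r"[a-z]", pw) and re.search(r"[A-Z]", pw):
        score += 1  # mixed case
    if re.search(r"\d", pw):
        score += 1  # contains digits
    if re.search(r"[^A-Za-z0-9]", pw):
        score += 1  # contains punctuation/symbols
    return ["weak", "weak", "fair", "good", "strong"][score]
```

A registration page would reject passwords that grade “weak” and nudge customers toward “strong,” which is exactly the trade-off noted above: better security, harder to remember.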

In voice biometrics, Nuance offers technology to address many of these issues. On voice devices, after a bit of training, customers simply say, “My voice is my password,” to authenticate account access based on voiceprints, which are unique to an individual.

Find Answers and (Re)solutions

KANA’s surveys on email response times for customer service and Twitter response times for customer service examine response times for “inquiries.” When customers make inquiries, they’re looking for answers or (re)solutions. In the surveys, KANA found:

  • According to Call Centre Association members, response times to email inquiries were greater than eight hours for 59 percent of respondents and greater than 24 hours for 27 percent of respondents.
  • According to a survey by Simply Measured, a social analytics company, the average response time to Twitter inquiries was 5.1 hours, and response times were less than one hour for 10 percent of respondents.

While it’s dangerous to make cross-survey analyses, it seems reasonable to conclude that customer service is better on Twitter than on email. That’s not surprising. Companies have become very sensitive to the public shaming by dissatisfied customers on Twitter. They’ll allocate extra resources to monitoring social channels to prevent the shame. Customers win.

However, remember that these are independent surveys. The companies that deliver excellent customer service on Twitter might also deliver excellent customer service on email and the companies that deliver not so excellent customer service on email might also deliver not so excellent customer service on Twitter. The surveys were not designed to gather this data. That’s the danger of cross-survey analysis.

If your customers make inquiries on both email and social channels, then you should deliver excellent customer service on both. Email management systems and social listening, analysis, and interaction systems, both widely used and well proven customer service applications, can help. These are systems that should be in every business’s customer service application portfolio.

Email management systems help businesses manage inquiries that customers make via email. These systems have been around for well more than ten years, helping businesses respond to customers’ email inquiries. Businesses configure them to respond to common and simple questions and problems automatically and to assign stickier questions and problems to customer service staff. Business policies are the critical factor in determining response times to customers’ email inquiries.
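A minimal sketch of that configuration, under our own assumptions (the rule keywords, canned replies, and queue name below are hypothetical), might look like:

```python
# Keyword rules for common, simple inquiries that can be answered automatically.
AUTO_RESPONSE_RULES = [
    ("password reset", "To reset your password, visit your account page."),
    ("store hours", "Our stores are open 9am to 9pm, Monday through Saturday."),
]

def handle_email(subject, body):
    """Auto-respond to common inquiries; assign stickier ones to staff."""
    text = (subject + " " + body).lower()
    for keywords, reply in AUTO_RESPONSE_RULES:
        if keywords in text:
            return ("auto", reply)
    # Response time for queued inquiries is then governed by business policy,
    # e.g., a service-level target measured in hours.
    return ("queue", "customer-service-tier-1")
```

The rules handle the simple cases instantly; the queue is where business policy, not technology, determines how long customers wait.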

Social listening, analysis, and interaction systems have been around for about five years. They help businesses filter the noise of the social web to identify Tweets and posts that contain questions and problems and the customers who Tweet and post them. These systems then include facilities to interact with Tweeters and posters or to send the Tweets and posts to contact center apps for that interaction.
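The filtering step can be sketched as a keyword screen over a stream of posts. The signal list below is an illustrative assumption; real systems use far more sophisticated classification.

```python
# Phrases that suggest a post contains a customer service question or problem.
SERVICE_SIGNALS = ("help", "broken", "refund", "not working", "how do i", "problem")

def filter_posts(posts):
    """posts: list of (author, text) pairs; return the service-relevant subset."""
    matches = []
    for author, text in posts:
        if any(signal in text.lower() for signal in SERVICE_SIGNALS):
            # These would be routed to interaction facilities or contact center apps.
            matches.append((author, text))
    return matches
```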

Find Answers and (Re)solutions Using Web Self-Service

Nuance’s surveys about web self-service really show the struggles of customers trying to help themselves to answers and (re)solutions.

In the survey about consumers’ experiences with web self-service, the key findings were:

  • 58 percent of consumers do not resolve their issues
  • 71 percent of consumers who do not resolve their issues spend more than 30 minutes trying
  • 63 percent of consumers who do resolve their issues spend more than 10 minutes trying

In Nuance’s survey of Windstream Communications’ customers about issues with web self-service, the key findings were:

  • 50 percent of customers who did not resolve their issues escalated to a live agent
  • 71 percent of customers prefer a virtual assistant over static web self-service facilities

The most surprising and telling finding of these surveys was the time and effort that customers expend trying to find answers and (re)solutions using web self-service facilities. Thirty minutes without finding an answer or a solution is a very long time. Customers really want to help themselves.

By the way, Windstream’s customers’ preference for a virtual assistant is not a surprise. Windstream Communications, a Little Rock, AR networking, cloud-computing, and managed services provider, has deployed Nina Web, Nuance’s virtual agent offering for the web. Wendy, Windstream’s virtual agent, uses Nina Web’s technology to help answer customers’ questions and solve their problems. The finding is a proof point for the value of virtual agents in delivering customer service. Companies in financial services, healthcare, and travel as well as in telecommunications have improved their customer service experiences with virtual agents. We cover the leading virtual agent suppliers—Creative Virtual, IntelliResponse, Next IT, and Nuance—in depth. Check out our Product Evaluation Reports to find the virtual agent technology best for your business.

Complain

Customers complain when they can’t get answers to their questions and (re)solutions to their problems. KANA’s surveys about complaints teach a lot about customers’ behavior, preferences, and experiences.

  • In KANA’s survey on usage of email or telephone channels for complaints, 42 percent of survey respondents most frequently use email for complaints and 36 percent use the telephone for complaints.
  • In KANA’s survey of UK consumers on communications channels for complaints, 25 percent of UK adults used multiple channels to make complaints. Fifteen percent of their complaints were made face-to-face.

The surprising finding in these surveys is the high percentage of UK consumers willing to take the time and make the effort to complain face-to-face. Those customers must have had very significant issues, and they were very serious about getting them resolved.

The key results in KANA’s survey about customer complaints by US consumers were:

  • On average, US consumers spend 384 minutes (6.4 hours) per year lodging complaints
  • In the most recent three years, 71 percent of US consumers have made a complaint. On average, they make complaints six times per year and spend one hour and four minutes resolving each complaint.
  • Thirty-nine percent of US consumers use the telephone channel to register their complaints. Thirty-three percent use email. Seven percent use social media.
  • Millennials complained most frequently—80 percent of 25-to-34-year-old respondents made complaints. Millennials are also the most likely to complain on multiple channels—39 percent of them do.
  • Survey respondents had to restate their complaints (retell their stories) 69 percent of the time as responsibility for handling their complaints was reassigned. On average, consumers retold their stories three times before their issues were resolved, and 27 percent of consumers used multiple channels for the retelling.

The surprising findings in this survey are the time, volume, and frequency of complaints. More than six hours a year complaining? Six complaints every year? Yikes!
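The survey's figures are internally consistent: six complaints a year at one hour and four minutes apiece works out to the reported annual total.

```python
# Check the KANA survey arithmetic reported above.
complaints_per_year = 6
minutes_per_complaint = 60 + 4   # one hour and four minutes

total_minutes = complaints_per_year * minutes_per_complaint
print(total_minutes)        # 384 minutes per year, as reported
print(total_minutes / 60)   # 6.4 hours
```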

No surprise about the low usage of social channels to register complaints. Customers want to bring their complaints directly to their sources. They may vent on the social web, but they bring their complaints directly to the companies that can resolve them.

Lastly and most significantly, it’s just so depressing to learn that businesses are still making customers retell their stories as their complaints cross channels and/or get reassigned or escalated. We’ve been hearing this issue from customers for more than 20 years. Customers hate it.

Come on, businesses. All the apps in your customer service portfolios package the facilities you need to eliminate this issue—transcripts of customers’ activities in self-service apps on the web and on mobile devices, threads of social posts, transcripts of customers’ conversations with virtual agents, and, most significantly, case notes. Use these facilities. You’ll shorten the time to solve problems and resolve customers’ complaints. Your customers will spend less time trying to get answers and (re)solutions (and more time using your products and services or buying new ones).

4Q2013 Was a Good Quarter for Customer Service

By the way, Customer Service had a good quarter in 4Q2013. Customer growth was up. Financial performance was up as a result. Product activity was very heavy. Nine of our ten suppliers made product announcements. Company activity was light. Five suppliers did not make any company announcements. Most significantly, KANA was acquired by Verint. And of course, three suppliers published customer service surveys.

A Good Quarter for Customer Service in 3Q2013

This week, continuing our tenth year of quarterly updates on the suppliers and products in customer service, we published our 3Q2013 Customer Service Update Report. Just a reminder, these reports examine customer service suppliers and their products along the dimensions of customer growth, financial performance, product activity, and company activity. We currently cover ten leading customer service suppliers. They lead in overall market influence and share, in market segment influence and share, and/or in product technology and innovation.

3Q2013 was a good quarter for customer service. Customer growth was up, and improved customer growth resulted in improved financial performance. Product activity was light. Six of our suppliers did not make any product announcements, but remember that third quarters are summer quarters; they’re rarely big for products. Company activity was also on the light side, but what activity we did see was highlighted by expansion into new markets by four of our suppliers. That’s a key customer service trend and a solid indicator of customer service growth in the quarters ahead. Here’s a bit more detail:

  • On July 17, IntelliResponse and BolderView, a Melbourne, AU-based consultancy specializing in virtual agent solutions for large enterprises in utilities, banking, technology, higher education and government markets, jointly announced that BolderView had become a value-added reseller of IntelliResponse VA for Australia and New Zealand. Within the release, IntelliResponse also announced the opening of its own office in Sydney, AU.
  • On September 5, KANA and Wipro jointly announced a partnership that will apply Wipro’s consulting, systems integration, and insurance industry expertise and experience to accelerate deployments of KANA Enterprise for large global insurers and financial services providers. The companies will form a dedicated, joint deployment team to work on customer deployments.
  • On September 17, Clarabridge announced the expansion of its global operations into Latin America. A sales team will use Miami, FL offices and will leverage Clarabridge’s partnerships with Accenture, Deloitte, and Salesforce.com initially to focus on opportunities in Argentina, Brazil, Chile, Colombia, Mexico, and Peru.
  • On September 25, Moxie announced the expansion of its operations in Europe. The expansion includes opening an office in Reading, UK, forming partnerships with Spitze & Company in Denmark and IZO in Spain, and appointing Andrew Mennie General Manager for EMEA.

This expansion is a win for customer service suppliers, a win for their customers, and a win for their customers’ customers.

It’s already a win for customer service suppliers. For example, Moxie claims to have doubled its European customer base in the last six months. New customers include Allied Irish Bank and the British Army. IntelliResponse and BolderView recently launched “Olivia,” their first joint virtual agent deployment. Olivia is the virtual agent for Optus, Australia’s second largest telecommunications provider. And Creative Virtual, a UK-based virtual agent software supplier that we’ve been covering in our quarterly reports for the past four quarters, recently announced Sabine, the Dutch-speaking virtual agent for NIBC Direct, the online retail unit of The Hague, NL-based bank. Sabine’s deployment is supported from Creative Virtual’s new Amsterdam office. See Sabine at the bottom right of NIBC Direct’s home page.


Expansion demonstrates the strength and viability of customer service suppliers. Their products have reached the level of maturity and reliability that their deployment “far from home” carries little or no risk. They have the resources to open offices and hire the staff to promote, sell, and support their products in new markets. And they recognize the potential for new and additional business in those markets.

Our suppliers’ customers and their (end) customers in Australia and New Zealand, Latin America, and Europe benefit, too. Customer service applications like Clarabridge Analyze, a CEM (Customer Experience Management) app; Creative Virtual V-Person and IntelliResponse VA (Virtual Agent), virtual agent apps; and Moxie Social Knowledgebase, a social customer service app, have been proven to lower cost to serve and to improve customer experiences. Companies in the expanded markets that deploy these apps will have more satisfied, more profitable customers. These apps will help answer customers’ questions and solve customers’ problems more quickly and more easily.

We’ve been ready for this expansion. Language support has long been a criterion in our frameworks for evaluating customer service applications. We examine the languages that the apps support for internal users and the globalization/localization facilities to deploy the apps to end customers. Generally, we’ve found that most customer service apps can be localized to support locale-specific deployments. On the other hand, the tools and reporting capabilities for internal users tend to be implemented and supported only in English.

2Q2013 Customer Service Stars

This week, continuing our tenth year of quarterly updates on the suppliers and products in customer service, we published our 2Q2013 Customer Service Update Report. These reports examine customer service suppliers and their products along the dimensions of customer growth, financial performance, product activity, and company activity. We currently cover eleven leading customer service suppliers. They lead in overall market influence and share, in market segment influence and share, and/or in product technology and innovation.

For 2Q2013, overall customer service performance was mixed but three of our suppliers—Clarabridge, IntelliResponse, and Salesforce.com—earned Customer Service Stars for the quarter. Very briefly, Clarabridge is a privately owned firm based in Reston, VA that was founded in 2005. Clarabridge offers a suite of VoC applications. IntelliResponse is a privately owned firm based in Toronto, ON that was founded in 2000. IntelliResponse offers a suite of virtual agent products. Salesforce.com is a public (NYSE: CRM) firm based in San Francisco, CA that was founded in 1999. The company has a broad product line that includes Salesforce Service Cloud, which provides case management, knowledge management, contact center, and web self-service applications.

So, what’s a Customer Service Star? Well, since 2009, we’ve been awarding Customer Service Stars for excellent quarterly performance balanced across those dimensions of customer growth, financial performance, products, and company activity. (Since 2010, we’ve also been awarding Customer Service Stars for the year—same criteria across four quarters.) It’s not easy to earn a Customer Service Star and we take awarding them pretty seriously. Here are the award criteria:

  • Customer growth. We examine significant quarter-over-quarter acquisition of new customers and additional business from existing customers.
  • Financial performance. We examine quarterly revenue improvement as reported for public companies or as we estimate for private companies based on customer growth, customer base, and pricing.
  • Products. We examine new products and new versions in a quarter.
  • Company activity. We examine new M&A, partnerships, branding, patents, organization, and facilities in a quarter.

Typically, we award one Customer Service Star for a quarter. Frequently, we award none. Three in a quarter is a big deal, especially when many of our suppliers did not have a good quarter. Here’s how Clarabridge, IntelliResponse, and Salesforce.com earned their Customer Service Stars for 2Q2013:

Customer growth and financial performance

  • On a base of approximately 250 customer accounts, Clarabridge acquired 10 to 15 new customers and did additional business with 55 to 65 existing customers, driving excellent financial performance.
  • On a base of approximately 160 customer accounts, IntelliResponse acquired eight new customers and did additional business with six existing customers, driving very good financial performance.
  • On a base of approximately 165,000 customer accounts, growth in subscription and support revenue indicated that Salesforce.com acquired approximately 21,000 new customer accounts. We estimate that around 20 percent of them licensed customer service products. Total revenue increased by more than seven percent to $957 million.

Products

  • Clarabridge made one product announcement in 2Q2013: Clarabridge 6.0, a major new version of its VoC application suite.
  • IntelliResponse made two product announcements in 2Q2013: OFFERS, a marketing application that delivers targeted offers within a virtual agent’s answers and VOICES, a Voice of the Customer analytic application. Both apps integrate with IntelliResponse Virtual Agent, “IR’s” virtual agent offering.
  • Salesforce.com made four product announcements: Salesforce Mobile Platform Services, mobile application development tools and programs for building and deploying Android, iOS, HTML5, and hybrid applications; Social.com, a new social advertising application; Salesforce Communities, a community application; and a suite of G2C (Government to Citizen) solutions for federal, state, and local agencies, all built on Salesforce.com general-purpose apps.

Company activity

  • Clarabridge made three company announcements: a new corporate logo, web site, and brand for its products; a new General Counsel; and a partnership with Brandwatch for collection and analysis of social data.
  • IntelliResponse was awarded a U.S. patent for its answer matching technology.
  • Salesforce.com made three company announcements: an agreement with NTT to build a cloud-computing data center in the UK, the acquisition of ExactTarget, a marketing automation/campaign management supplier, and the appointment of a new President and Vice Chairman.

Props to all three for an excellent quarter!

We know all three of the companies and their current customer service product offerings very well. We published a product evaluation of Clarabridge Analyze, Clarabridge Collaborate, and Clarabridge Engage against our Framework for Customer Social-Service on March 28, 2013. We published a product evaluation of IntelliResponse Virtual Agent (VA) against our Framework for Customer Virtual Assisted-Service on May 9, 2013. We published product evaluations of Salesforce Service Cloud against our Framework for Cross-Channel Customer Service on January 24, 2013 and of Salesforce Marketing Cloud Radian6 against our Framework for Customer Social-Service on August 1, 2013.

The three suppliers also made it easy for us to do our research for these product evaluations. All three gave us trial versions of their products as well as access to product documentation. For Clarabridge and IntelliResponse, we also read their appropriate patents and patent applications.

We usually publish our Quarterly Customer Service Update reports early in the third and last month of calendar quarters. IntelliResponse and Salesforce.com run on fiscal years that end on January 31. Their fiscal quarters end a month later than calendar quarters.

In a few weeks, we’ll begin research on our 3Q2013 Customer Service Update Report. Third quarters are summer quarters, quarters when the software business (and many other businesses) typically, shall we say, relaxes. But we hope that a Customer Service Star or two will shine.