Like many enterprises, ServiceNow has been incorporating artificial intelligence (AI) into its internal systems and customer-facing products for years. But when OpenAI’s ChatGPT emerged a year ago, everything changed — fast.
Suddenly, what had been machine learning — or “analytical AI” that could produce recommendations based on financial, sales, and marketing data — became something anyone could query in natural language. A brand-new employee could ask the corporate generative AI (genAI) application for an answer to an in-depth client question. Seasoned employees could ask the platform for information about company benefits or how to get a new laptop.
Chris Bedi joined ServiceNow in September 2015 and serves as the company’s chief digital information officer. Prior to joining ServiceNow, he spent almost four years as CIO of JDS Uniphase Corp. (JDSU), where he was responsible for IT, facilities, and indirect procurement. Before that, Bedi held various positions at VeriSign between 2002 and 2011, including CIO, vice president of corporate development, and vice president of human resource operations.
When he joined ServiceNow, the company was earning about $800 million a year in revenue. Today, its revenue tops $8 billion, and it has more than 20,000 employees. Bedi has also gone all-in on AI.
ServiceNow is now implementing genAI through an internal pilot program. Leveraging its own platform and third-party LLMs, the company has gone live with 15 genAI pilots across multiple departments, including customer service, IT, HR, and sales.
Those trials are focused on driving better customer and employee experiences through improved self-service, agent productivity, automated marketing lead management, and text-to-code software development.
Bedi recently spoke with Computerworld and explained why he sees the introduction of ChatGPT and genAI as a watershed moment for enterprises, and why he worries less about what could go wrong and more about whether he’s creating an environment where the technology can advance as fast as its capabilities allow.
The following are excerpts from that interview.
When did your company begin using AI on any level? “I joined in September 2015, and I remember meeting with our machine learning team as part of my onboarding. So, we’ve been doing machine-learning applications since as early as 2015. As you can imagine, in 2015 a lot of this was more pilots and science projects.
“Over the years, we’ve scaled it tremendously. The industry hasn’t really settled on a term. What do we call the AI that existed before genAI? I just call it analytical AI. If you think about it, it’s infusing machine learning into all of our important ranking, rating, or recommendation [engines] on where revenue is going to end up, the possibility that a sales deal is going to close, the likelihood that we could have a customer doing this. We’ve been doing this for a long time.
“And, we’ve been building a lot of AI into the ServiceNow platform as well — whether it’s NLU, NLP, machine-learning-based calculation mechanisms, risk ratings, routing of customer cases, etc. Even before genAI, we were using ML to automatically assess processes to find bottlenecks and inefficiencies. Once ChatGPT came out, we very quickly started moving to experiments, and now those experiments are in production.
“Most leading CIOs did something similar. If I call it analytical AI, they’ve been at it for a while. What genAI and ChatGPT have done is kind of woken up the rest of the C-suite to this whole AI thing that most leading CIOs have been on for a while.”
Can you give me a definition for analytical AI? “I struggle with it, too. We were doing AI before ChatGPT came along, but we were calling it supervised machine learning, unsupervised machine learning, natural language understanding, natural language query, natural language processing. All those names were under the category of AI. Now we have genAI, so do we call it traditional AI? Do we call it analytical AI because most of it has to do with numbers? So, when I say analytical AI, it’s a placeholder term for all the AI that came before ChatGPT. So, again: NLU, NLP, NLQ, virtual agents, process mining, RTA… a basket of stuff.”
What changed in November 2022 when ChatGPT was released? “I think what changed in November, and I ask myself this: Is this a metaverse moment, where we’re all enamored with the tech? Or is this a blockchain moment, where we’re all searching for use cases? Or is this more of an ‘iPhone moment,’ which is really going to change almost everything we know?
“I feel like this is more like the iPhone moment. What genAI allows us to do — with large language models as the underpinning — is reimagine what defines a language. Coding we’ve now defined as a language. Obviously, there’s text-based content, whether it’s resumes or summaries of security incidents; all of that is a language. We’ve been doing things on the NLP [natural language processing] side with products for a few years.
“So, we’ve been partnering with Hugging Face on StarCoder, developing large language models on the product side of the house. And obviously, I serve as customer zero for all of our products, which is why we’ve been able to release products so quickly. I think we were one of the first to offer real working [AI-infused] products to the marketplace.”
Can you offer up some of the top use cases for genAI at ServiceNow — both internally and externally? “Right now, we have 15 use cases that are live and using genAI. I’d simplify those into four general use cases. One is around customer/employee experience. The industry has been after case deflection and employee self-service for a while now. That’s not a new concept. I think what genAI did is give that function a step up in terms of effectiveness.
“If you think about searching for an answer on any customer support site, including our own, you’ll now get a genAI response. Think of the difference between a search-engine response, where you get a bunch of links, and a genAI response, where you get the information you’re actually looking for.
“Even in 10 weeks, we’ve seen a 3% to 4% jump in case deflection rates on our customer support site. We’re seeing very similar results on employee self-service, as employees need to know things like: ‘Who’s my benefits provider? How do I get a new laptop? How do I get a new travel card? Is this thing worthy of a press release?’
“All those answers that are buried in…corporate policies and documents — genAI makes them instantly accessible. So, customer and employee service is number one.
“The second use case is…agent productivity. An agent could be an IT agent, or someone in HR operations, customer support, or finance. How do you help an agent be more productive by analyzing large sets of information quickly and summarizing it for them so they can get to the heart of the answer quickly?
“On the flip side of that, as they’re managing their work and handing it off from one person to another or resolving it, they need to send a nice summary to the customer; genAI can write that summary for them. And we’ve actually seen, with what we’ve deployed, that 70% of agents are accepting those genAI summaries with minimal edits. We actually measure ‘minimal.’
“We’ve seen an increase in cases solved for agents per week. We’ve seen shortened durations of the time it takes to resolve cases. So any way you slice it, the productivity boost is starting and we’re in the early innings. I know we’ll get better.
“One other measure we’re looking at, which I think is really important, is the sentiment of the people using genAI. Take my own shop, which is using AI for its IT agents: I think 56% have already said…this thing is a boost to their productivity. That sentiment is hard to get to because we’re so wedded to our current ways of working.
“The third use case is accelerating digital transformation. So, text-to-code is real. Text-to-workflow is real. What we’ve seen is a 26% rate of our software developers accepting what genAI provides them from a text-to-code standpoint. For people outside the industry, 26% may seem small, but as a practitioner, I’m really pleased with that number. If you pull on that thread, that’s 26% more lines of code that can be written without a human having to do it. Pull on the thread a little more and you’ve got a 26% productivity bump in one of the scarcest talent areas regardless of industry — software engineers. And it’s only going to get better.
“I would say there’s work to do on sentiment and adoption. People have been working a certain way for decades. GenAI is very new and, like with anything new, it’s going to take some time for the adoption rate to climb to the point where it’s like you or me listening to Spotify for music or something like that.
“Even when genAI works, I think the adoption rate will be a bit slower than leaders like me want. That’s the standard recipe of change management, training, and skill-set development on all those things.
“The fourth use case is around…how to help a human become an instant expert. If you think about us as a high-growth software company, we want to serve our customers in the best way. We have lots of innovation coming out of our platform every month; keeping up with that in the interest of serving our customers is pretty hard. I think about a new person joining ServiceNow. If you remember the movie “The Matrix” when he plugs that thing into the back of his head and he instantly learns how to fly a helicopter — that’s the vision I have.
“So, we took all our product documentation, from high-level value messaging to low-level product expectations — how do you configure this — along with every RFP response, every sales presentation, and every sales training, and indexed it in large language models. So now, if someone joins ServiceNow on a Monday, and on Tuesday morning someone from [our customer] FedEx asks how our solutions help with operational technology risk management, that new employee can quickly go to a portal, type in the question, and get a very intelligent response. Think about that context of ‘instant expert’: it can apply to any persona. We just happened to apply it first to the persona that serves the customer, but we’re going to roll that out across the organization.”
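The “instant expert” workflow Bedi describes is, in broad strokes, what the industry calls retrieval-augmented generation: index a corpus of documents, retrieve the passages most relevant to a question, and hand them to a large language model along with the question. The Python sketch below is purely illustrative; the embedding model, sample documents, and the answer_with_llm() helper are hypothetical stand-ins, not ServiceNow’s actual pipeline.

```python
# Minimal retrieval-augmented Q&A sketch (illustrative only).
# Assumes sentence-transformers and numpy are installed; the model name,
# document list, and answer_with_llm() are hypothetical stand-ins.
import numpy as np
from sentence_transformers import SentenceTransformer

documents = [
    "Product docs: configuring operational technology risk management.",
    "RFP response: how the platform addresses OT asset discovery.",
    "Sales training: value messaging for risk and compliance.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")
# Embed the corpus once; normalized vectors let a dot product act as cosine similarity.
doc_vectors = model.encode(documents, normalize_embeddings=True)

def top_matches(question: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the question."""
    q = model.encode([question], normalize_embeddings=True)[0]
    scores = doc_vectors @ q
    return [documents[i] for i in np.argsort(scores)[::-1][:k]]

def answer_with_llm(question: str, context: list[str]) -> str:
    # Placeholder: in practice this would call a large language model
    # with the retrieved context prepended to the question.
    return f"[LLM answer to {question!r} grounded in {len(context)} documents]"

question = "How do your solutions help with operational technology risk management?"
print(answer_with_llm(question, top_matches(question)))
```

In production, the document store, retrieval index, and model would be managed services rather than in-memory lists, but the shape of the flow is the same.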
Explain what the term ‘customer zero’ means in terms of rolling out new technology. Are you saying you roll it out for internal operations first? “Two things: I’m talking about deploying a lot of this for internal and customer-facing use cases, but as a software provider we’re offering a lot of genAI products to the marketplace. When I say customer zero, I mean we use all that same technology to power our own business. We use it to scale ourselves, and…the productivity gains we get from using our own platform are self-evident. So, by customer zero, I mean we use every one of our products internally and prove out the value before our customers use them, to make sure we’re confident on the technical side and the business side, change management, etc.”
What feedback are you hearing from your peers and customers about the current landscape for IT decision makers when it comes to genAI? “Because ServiceNow serves about 90% of the Fortune 500, I have the privilege of talking to a number of Fortune 500 CIOs every week. I’d say there are three camps:
- One camp is saying, ‘We have to spend the next three or four months figuring out governance, figuring out security, figuring out all the underpinnings and a super-solid architecture, before I start doing pilots and putting stuff into production.’
- A second camp is saying, ‘Yep, the tech works, but I need to see a real ROI before I invest material human capital or dollars into this.’
- A third camp believes this is inevitable. ‘We need to just get on with it. We don’t measure the ROI [on an established technology like] email, and genAI in the workplace will be as common as that. So, the faster we get moving, the better.’
“Again, none of the viewpoints are wrong. Everybody is probably doing some of each, but the common underpinning is that everyone is doing something around genAI. Most CIOs I’m talking to have been asked by their CEO or C-suite to have a genAI strategy for each department.”
When it comes to corporate policies, standards, and oversight, what have you instituted to ensure the safe and ethical use of genAI, especially in light of President Biden’s recent executive order? “With President Biden’s new rules, I think that’s a great step in the right direction. We have a commitment to the responsible and ethical use of AI models.
“So, we’re supportive of the new executive order. But we also have an AI ethics and governance council…to protect our employees and customers from bias, cybersecurity vulnerabilities, and data privacy risks, and even to ensure transparency in the user experience. If someone is getting a genAI answer to their question, we want them to know it was generated by a machine.
“We are very focused on it. We have an AI governance and ethics committee, which is cross-functional in nature. It’s legal, it’s my organization, it’s our product organization, it’s cyber. We test all that out before releasing anything to the market or our own employees.”