Syncing with ServiceNow

Unlocking ServiceNow Lens: On-Screen Intelligence Meets Generative AI

XenTegra Episode 49

In Episode 49 of Syncing with ServiceNow, host Andy Whiteside is joined by Mike Sabia and John Dahl for an in-depth discussion on ServiceNow Lens—a powerful new capability that uses generative AI to extract and act on data directly from your screen.

The team explores how Lens empowers support agents and developers by instantly capturing error messages, screenshots, and log files, then automatically generating incident tickets, filling in scripts, and suggesting next actions.

You’ll learn:

  • How Lens differs from traditional data fabric and RPA tools
  • Real-world use cases for troubleshooting and client support
  • Key features like custom prompts, multi-source capture, and secure session handling
  • Where Lens sits on the spectrum between LLM and agentic AI
  • The potential (and limits) of AI in de-escalating and resolving support tickets faster

Whether you’re curious about ServiceNow's AI roadmap or looking to enhance user productivity, this episode is packed with insights.

Tune in and see how on-screen intelligence is transforming the way we work in real time.

WEBVTT

1
00:00:02.560 --> 00:00:12.260
Andy Whiteside: Hello, everyone, and welcome to Episode 49 of Syncing with ServiceNow. I'm your host, Andy Whiteside. I've got Mike Sabia and John Dahl with me. Mike, how's it going out there in California?

2
00:00:13.842 --> 00:00:15.619
Andy Whiteside: Sorry, it's Colorado.

3
00:00:16.070 --> 00:00:19.550
Andy Whiteside: That's what I meant to say. Just came out as California.

4
00:00:20.150 --> 00:00:27.980
Mike Sabia: It's going well. I was just talking with John this morning that we're gonna switch temperatures this coming weekend. He's gonna be only in the nineties; I'm gonna be in the hundreds.

5
00:00:28.300 --> 00:00:29.499
Mike Sabia: Usually the opposite.

6
00:00:30.110 --> 00:00:32.660
Andy Whiteside: Yeah. It's the dry 100, though, right?

7
00:00:32.970 --> 00:00:33.720
Mike Sabia: It is.

8
00:00:34.300 --> 00:00:37.120
Andy Whiteside: So you go outside and sizzle, but then you can find the shade and cool off.

9
00:00:38.360 --> 00:00:41.530
Mike Sabia: And Colorado really isn't that bad when it's hot, if you're in the shade.

10
00:00:41.530 --> 00:00:43.220
Andy Whiteside: What altitude are you at?

11
00:00:44.390 --> 00:00:46.979
Mike Sabia: I'm around 5,100 or so.

12
00:00:46.980 --> 00:00:47.720
Andy Whiteside: Okay.

13
00:00:48.150 --> 00:00:48.740
Mike Sabia: Around, there.

14
00:00:48.740 --> 00:00:54.440
Andy Whiteside: That's up there. I know it gets a lot higher. Another 5,000 feet. But that's pretty high, as far as I'm concerned.

15
00:00:54.913 --> 00:00:59.639
Andy Whiteside: Yeah. John Dahl, how's it going? You're in Texas, I wanna say Houston.

16
00:01:00.230 --> 00:01:09.030
John Dahl: Houston, Texas. And we're doing just fine here. The humidity is down to a reasonable 80%, I think.

17
00:01:09.750 --> 00:01:15.220
Andy Whiteside: And ours was like that last week, actually, for a little bit. And it was, you know, it was warm.

18
00:01:15.840 --> 00:01:21.130
Andy Whiteside: It's that time of year, though. Can't really complain about it. We know it's coming. It's not like it's a surprise. We always act like it's a surprise every year.

19
00:01:21.300 --> 00:01:22.059
Andy Whiteside: I'm a surprise.

20
00:01:22.770 --> 00:01:25.619
John Dahl: It makes you enjoy the pools and lakes that much more.

21
00:01:25.920 --> 00:01:27.090
Andy Whiteside: Yeah, for sure.

22
00:01:27.740 --> 00:01:32.970
Andy Whiteside: So guys, you, Mike and John, brought a topic today, and it's from a blog entitled

23
00:01:33.210 --> 00:01:37.320
Andy Whiteside: Scan, Understand and Activate on-screen data

24
00:01:37.826 --> 00:01:44.060
Andy Whiteside: with Lens. Mike, what's this all about, and why'd you pick this one?

25
00:01:44.450 --> 00:01:49.190
Mike Sabia: Well, first I'll say that when I first saw the word Lens, a couple

26
00:01:49.570 --> 00:02:16.929
Mike Sabia: of weeks or months ago, I thought it was talking about Google Lens and how that would interface with ServiceNow. And there are some parallels, but this is definitely ServiceNow Lens. If you're familiar with Google Lens, I used it a couple of months ago when I went out of the country to scan a menu, and it immediately translated it, which was fantastic. And there are some parallels here with ServiceNow, where you could be doing your job, and you have some data which is not in ServiceNow, and

27
00:02:17.120 --> 00:02:33.480
Mike Sabia: briefly, ServiceNow has the ability to use their Lens product to look at that data, whether it be a log file or an email, and pull in the appropriate data, and maybe even do a little bit more: assess it, and based on the prompts, digest it and make recommendations.

28
00:02:34.250 --> 00:02:40.009
Andy Whiteside: Interesting. So what would you call this? I'm probably getting ahead of myself here. Is this generative AI, or is this agentic AI?

29
00:02:42.170 --> 00:02:51.470
Mike Sabia: I mean, there have been capabilities before Lens, with RPA, robotic process automation, where they allow for you to

30
00:02:52.230 --> 00:03:05.769
Mike Sabia: scan a PDF or go to a website that doesn't have an API and get that data. I'd say it's a little further than that. And the article, this particular blog, even mentions that the generative AI actually goes and looks at the data and actually

31
00:03:06.780 --> 00:03:17.009
Mike Sabia: looks to see what it is. It could be as simple as, you know, deciding what the short description of the ticket is, but it could, you know, pull the appropriate information. So I'd definitely say it's on the generative AI side.

32
00:03:17.140 --> 00:03:22.350
Andy Whiteside: Okay, that's what I would guess coming into it. John, what are you excited about as far as this blog goes?

33
00:03:23.100 --> 00:03:50.079
John Dahl: Well, ServiceNow has been pushing this idea of workflow data fabric, where you're trying to bring more of your enterprise data into availability for the various processes in ServiceNow. But that workflow data fabric has traditionally been more static: you define it in the enterprise, and you just want to use it in the instance. Lens allows you to really do more ad hoc work. So Mike talked about a log file, but this could actually be just an error message that pops up on your screen.

34
00:03:50.210 --> 00:04:17.140
John Dahl: You can click and activate Lens. It'll look at that error message, and it can generate the incident right from there. So this is more ad hoc, things that are happening right now: I need to capture this, and I need to act on it. And it could be from a website that you're viewing, it could be from an email that you're looking at, it could be from just about anything, including images. It will extract the information from the image, and it will use the generative AI capabilities, the analysis capabilities, to derive meaning from it.

35
00:04:17.149 --> 00:04:28.359
Andy Whiteside: Yeah. So I'm gonna hold off on my question around how this applies to the platform and how it impacts business and enterprise. Maybe that'll be discovered through the conversation. But I'm very curious

36
00:04:28.509 --> 00:04:32.509
Andy Whiteside: as to what this is gonna mean for, you know, ServiceNow customers.

37
00:04:33.451 --> 00:04:35.299
Andy Whiteside: Mike, this first section

38
00:04:35.489 --> 00:04:42.619
Andy Whiteside: tries to explain, with a picture included, what Lens is. Can you do your best to articulate what this is covering?

39
00:04:43.604 --> 00:04:57.310
Mike Sabia: I mean, it's very similar to what John already said. It's the idea that you are on a ticket, you scan that information, or that image or the log file, and with the LLM it understands what the important

40
00:04:57.580 --> 00:05:06.530
Mike Sabia: portions of that are, and then it can, you know, create that ticket for you, or I guess by extension it could do orchestration based on that.

41
00:05:07.410 --> 00:05:21.429
Andy Whiteside: So is this something where I, as a service person, would have a ticket, and Lens would take it and digest it for me and give me a good starting foundation? Or is this something where Lens is gonna take it, solve the problem, tell me what it did, and tell me to close the ticket, or maybe even close the ticket?

42
00:05:23.050 --> 00:05:27.469
Mike Sabia: I think, as with anybody getting into generative AI, you probably need to take some

43
00:05:27.620 --> 00:05:40.759
Mike Sabia: first steps first, before you run with it, you know. Focus on understanding what it is and creating tickets. And then, as you get familiar and comfortable with that, go to that next stage of, hey, what can we do to orchestrate and automate this?

44
00:05:41.440 --> 00:05:44.380
Andy Whiteside: So that's how I look at AI, or generative AI specifically,

45
00:05:44.730 --> 00:05:52.089
Andy Whiteside: at least now. At the moment it doesn't do the job for you. It might be able to soon. In fact, it might could do it now, but at a minimum, it's a coach.

46
00:05:53.370 --> 00:06:20.480
Mike Sabia: Well, I mean, that kind of goes back to, you know, is it an LLM, or is it agentic? And what we've seen the last six months or year, however long it's been, is the LLM digests that information, or you query it and it gives that information, and agentic is where it kind of does it on your behalf. Somebody submits a ticket, and without a user choosing to use it, it goes and does the analysis, pulls the orchestration lever, and makes it happen. So I think this is a little bit more on the

47
00:06:20.570 --> 00:06:36.080
Mike Sabia: hands-on side, and less agentic, in that the product, and particularly this article, focuses on, hey, I'm doing this now, I want to do that. But, you know, it can extend further. It could maybe make recommendations. It could maybe even kick some of those items off.

48
00:06:36.660 --> 00:06:40.399
Andy Whiteside: You know, here's what I'm most excited about, and it's gonna sound really kind of silly.

49
00:06:40.620 --> 00:06:45.169
Andy Whiteside: I have tried since my days at Microsoft on a support desk

50
00:06:45.510 --> 00:07:12.610
Andy Whiteside: to coach people into, yes, solving the problem as far as they know, but always asking for buy-in and acceptance from the person who opened the ticket, did I solve your problem, before just closing the ticket. I've given up on training human beings to do that. I hope that AI can be trained to do that, and that part of the final process is: did I solve your problem? And if the answer is anything other than yes, I need to know what I need to do better, or what I need to do.

51
00:07:12.610 --> 00:07:16.869
Mike Sabia: I think, you know, people who are setting this up will absolutely,

52
00:07:17.000 --> 00:07:29.229
Mike Sabia: where appropriate, ask those questions. You know, if it's something that's, you know, maybe some log file, and it does it automatically, maybe it'll do it without interaction. But if it's initiated by a customer, yeah, of course it's gonna ask, hey, did I solve your problem?

53
00:07:29.530 --> 00:07:30.120
Andy Whiteside: Yeah.

54
00:07:30.540 --> 00:07:35.309
Andy Whiteside: Hey, John, what did Mike miss in kind of the general coverage of what this is?

55
00:07:36.550 --> 00:07:41.039
John Dahl: Well, again, it's really about extracting meaning from

56
00:07:41.310 --> 00:07:44.269
John Dahl: the kinds of data that aren't already in ServiceNow.

57
00:07:44.440 --> 00:08:00.630
John Dahl: So as long as you know how to respond to that data that you're extracting, there's a lot that you can do with it. It really is designed around helping a user be more productive with what they're already doing.

58
00:08:00.940 --> 00:08:18.159
John Dahl: And it has the same caveats that Now Assist does, right? Make sure that you understand what it's doing, and that you can confirm or correct what's there. And it gives you a chance to actually correct it, to put in custom instructions in the prompt before

59
00:08:18.440 --> 00:08:25.519
John Dahl: it processes the information. So it really is designed around making a user more efficient at what they're already doing.

60
00:08:27.400 --> 00:08:35.890
Andy Whiteside: And you know, AI that works, that's pretty much the basics of it these days, and it's just making us all faster and better and more efficient.

61
00:08:37.400 --> 00:08:42.290
Andy Whiteside: All right, Mike. You may have already covered this, but walking through the article, it says, how does it work? So how does Lens work?

62
00:08:42.480 --> 00:08:57.059
Mike Sabia: So this is kind of like a practical explanation of it. You would be on an incident, because that's what we've been talking about. You would say, hey, I wanna create with Lens, and there's a button, and it would initiate the Lens product to come up.

63
00:08:57.180 --> 00:09:15.392
Mike Sabia: You point it to whatever screen has that error message or screenshot, and then you hit analyze, and it does the analysis and scans the log messages and the like. And then it fills out that incident with the appropriate details. And then, per what John just said, it also allows

64
00:09:17.230 --> 00:09:24.679
Mike Sabia: a suggested resolution, if prompted for. And you can, you know, adjust the query on what you're looking to get

65
00:09:25.080 --> 00:09:27.549
Mike Sabia: out of the record that you are scanning.

66
00:09:28.050 --> 00:09:28.840
Andy Whiteside: So

67
00:09:29.450 --> 00:09:46.100
Andy Whiteside: Interesting. Because obviously we do a lot of end user computing, and we have a product called ControlUp. I could see this being a world where it can read and see what's on the screen. Meanwhile, it's grabbing real-time data and historical data from ControlUp, using all that to, within seconds, give an analysis of what's going on.

68
00:09:47.800 --> 00:10:14.770
John Dahl: Even if you're talking about providing support to an end user and you've got a screen share up, right? Screen shares are essentially images in terms of what the agents can see. They can't easily select and capture that information. Well, this allows you to capture that information right off the screen share and enter it into the ticket, including taking the error code that's being displayed, potentially looking up knowledge articles automatically that refer to it, and so on.

69
00:10:14.770 --> 00:10:19.880
Andy Whiteside: Yeah, no, this is great. That real-time troubleshooting use case

70
00:10:20.240 --> 00:10:35.829
Andy Whiteside: where, not only, okay, let's say I'm not even doing a screen capture. It just sees what's on the screen in parallel with me, the support person. It's reading the data, real-time analytics, either showing it to me or not, and it's saying, hey, here's the three things I think are going wrong here, what do you think?

71
00:10:36.940 --> 00:10:38.899
Andy Whiteside: That sounds like nirvana, really.

72
00:10:40.141 --> 00:10:46.389
Andy Whiteside: Key features of Lens. Mike, I'm gonna walk you through these one at a time. The first one is on-demand data extraction.

73
00:10:48.140 --> 00:10:54.499
Mike Sabia: Yeah, I mean, being able to pull that data out of whatever source you're looking at: screenshot, log file, image.

74
00:10:55.070 --> 00:10:55.670
Andy Whiteside: Yeah.

75
00:10:56.080 --> 00:10:59.049
Andy Whiteside: Instant form or script filling.

76
00:10:59.050 --> 00:11:06.149
Mike Sabia: So it takes that data that it just extracted and then fills out your ticket for you. Short description, pertinent information.

77
00:11:06.300 --> 00:11:17.910
Andy Whiteside: All right. So I didn't think about this part. Not only is it suggesting to the support person what's wrong, or just doing it, it's literally putting in the notes simultaneously, which, as someone who's done support in my past,

78
00:11:18.180 --> 00:11:20.980
Andy Whiteside: doing both those things at the same time is kind of a struggle.

79
00:11:23.020 --> 00:11:25.579
Andy Whiteside: Support for custom prompts.

80
00:11:26.150 --> 00:11:30.859
Mike Sabia: So rather than simply scanning it and it determining what you want, you might want to say, hey.

81
00:11:31.240 --> 00:11:36.639
Mike Sabia: look at the error log, or the error message rather, or summarize

82
00:11:37.190 --> 00:11:39.520
Mike Sabia: the key points in this message.

83
00:11:39.750 --> 00:11:44.170
Mike Sabia: Whatever you want the Lens to gather out of that data.

84
00:11:44.170 --> 00:11:47.100
Andy Whiteside: Yeah. Multi-capture.

85
00:11:47.430 --> 00:11:55.220
Mike Sabia: Capture data from more than one source, and then, you know, bring it together. It's not just a single screenshot or single log file. It can look at multiple items.

86
00:11:55.220 --> 00:11:59.640
Andy Whiteside: Yeah, that true multi-threaded, looking everywhere simultaneously,

87
00:11:59.790 --> 00:12:04.609
Andy Whiteside: things a human could never begin to keep up with. Standalone mode.

88
00:12:05.150 --> 00:12:09.689
Mike Sabia: It can be on the PC, it can be on your mobile device, anything on the screen.

89
00:12:10.890 --> 00:12:11.260
Andy Whiteside: Good.

90
00:12:11.260 --> 00:12:17.170
Mike Sabia: Add that prompt, get that initial media analysis. So it's available in that standalone mode.

91
00:12:17.560 --> 00:12:19.200
Andy Whiteside: Secure user session.

92
00:12:21.130 --> 00:12:33.339
Mike Sabia: Well, you can put controls around who is able to use it and what permissions they need in order to do it. And then, you know, put some structure around what happens with that data, who can access it.

93
00:12:34.520 --> 00:12:35.240
Andy Whiteside: Oh, once you...

94
00:12:35.240 --> 00:12:39.119
Mike Sabia: Scan it, and then have somebody else be able to go in there and look at that information that you scanned.

95
00:12:39.510 --> 00:12:41.849
Andy Whiteside: And then finally, the last one: multi-persona usage.

96
00:12:42.580 --> 00:12:59.510
Mike Sabia: The fact that it's useful for different people. It could be useful to the requester, the developer, who, or let me not jump ahead, the requester who's submitting it, the agent who's fulfilling it, or even a developer to say, hey, look at this, and maybe,

97
00:12:59.700 --> 00:13:03.340
Mike Sabia: you know, create a script to do something automatically.

98
00:13:03.510 --> 00:13:08.490
Andy Whiteside: Yeah, all right, John, you heard all those. Any of these you want to double-click on and explain further?

99
00:13:10.508 --> 00:13:33.511
John Dahl: Well, I tend to like the help that it gives you for support, right? So as you're working through something with a customer, and especially if you're watching them reproduce the issue that you're supporting, you can simply grab those snapshots along the way, and it compiles it all together and says, okay, here's what I observed, and here's what I think you should do about it.

100
00:13:36.840 --> 00:13:44.419
Andy Whiteside: So, guys, I'm going to give you both the opportunity to answer this question. If you had had this somewhere in your past,

101
00:13:44.730 --> 00:13:50.150
Andy Whiteside: give me one example where you think it would have really just been very helpful. John, go ahead.

102
00:13:51.160 --> 00:14:06.309
John Dahl: Well, as soon as you said it, it kind of triggered an old nightmarish thing for me, right? So as you're trying to write notes and listen to the customer, you're writing notes a few seconds behind what the customer is saying,

103
00:14:06.430 --> 00:14:20.379
John Dahl: and it's difficult to pay attention and hear the new content as you're writing notes about the old content. I think this would have made it a lot easier to be able to focus on the customer and make sure they understood that I care about what they're telling me.

104
00:14:20.550 --> 00:14:22.579
Andy Whiteside: Yeah, Mike, how about you?

105
00:14:23.600 --> 00:14:26.390
Mike Sabia: I think John's point is great, and it's not just

106
00:14:26.530 --> 00:14:47.479
Mike Sabia: something that would have helped us in the past, but helps us today and tomorrow. If we were having a workshop with a customer, being able to, you know, grab a screenshot of the reverse demo of their current product to see what maybe some categories are on that record, being able to pull that information out rather than having to, you know, type it in yourself, or maybe use some tool that'll do some OCR, some

107
00:14:47.490 --> 00:14:57.009
Mike Sabia: optical character recognition. I would say that's where I'm looking to use it now. I'm sure I could come up with a number of examples in the past.

108
00:14:57.418 --> 00:15:18.469
Mike Sabia: I want to say that it would help me pull the proper questions out of a user. We've all been in conversations where they present information and you have to, like, you know, then ask the appropriate questions. I'm not quite sure it's there yet, but that would be my hope, that it goes even further to suggest questions based on that data.

109
00:15:18.900 --> 00:15:33.740
Andy Whiteside: How about this one? I assume you guys have both done support somewhere along the way. You know, some of the time you're just triaging and, more importantly, trying to de-escalate the situation. What if you could do that and simultaneously work on the ticket? To me that seems like it could have some huge benefits as well.

110
00:15:35.830 --> 00:15:36.500
Mike Sabia: You get to have that

111
00:15:36.500 --> 00:15:46.161
Mike Sabia: conversation while, you know, with just a couple of clicks you say, look at this and analyze it. So your focus is on the customer rather than the ticket. I could definitely see that could help out.

112
00:15:47.460 --> 00:15:48.679
Andy Whiteside: Yeah. Yeah. Go ahead.

113
00:15:48.680 --> 00:16:12.059
Mike Sabia: You know, that could sometimes give a customer a sense that you could do everything at once, and that's not always possible. You have to be able to say, hey, this is something new, I want to make sure I give you the correct and best answer. So, you know, oftentimes we'll still take that offline. But to have that tool, to John's point, you know, take that information, grab that information, maybe summarize that information as we're going, so you can, you know, focus on the customer. That's key.

114
00:16:12.710 --> 00:16:23.010
Andy Whiteside: I think your comment about not doing everything at one time is true, it's impossible. However, it's fair to say, with tools like this, we're going to be able to do a whole lot more simultaneously than ever before.

115
00:16:24.240 --> 00:16:24.890
Andy Whiteside: Yeah.

116
00:16:25.100 --> 00:16:30.570
Andy Whiteside: All right, well, that kind of wraps this up. Mike, anything we didn't cover in this topic that you want to bring up?

117
00:16:31.054 --> 00:16:51.389
Mike Sabia: Of course, anything that's LLM-related has licensing impacts. You know, ServiceNow, with Now Assist, puts a certain number of calculations on certain actions, and sometimes, because it has to query the LLM and then requery it, they put a budget of, hey, this is gonna take five actions to do this type of thing. I'm curious to see how this works out,

118
00:16:51.400 --> 00:17:03.570
Mike Sabia: and with any LLM situation you need to work out an ROI. And for our customers who maybe listen to this, if you have questions or interest in this product, please reach out to us, and we'll have some further discussion.

119
00:17:04.650 --> 00:17:06.489
Andy Whiteside: John, anything we didn't cover you want to bring up?

120
00:17:07.466 --> 00:17:34.809
John Dahl: Yeah. As always, there are caveats related to any kind of AI solution. With the normal AI integrations that ServiceNow has promoted, you've understood ahead of time what you're defining, whereas this is more ad hoc. It's important to understand that you don't want to capture sensitive information and provide that as source material for the LLM to train on and potentially give to other people. So just be aware of what information you're exposing.

121
00:17:35.469 --> 00:17:36.309
Andy Whiteside: You know, but just

122
00:17:36.310 --> 00:17:55.849
Andy Whiteside: I wasn't gonna bring that up, because it was kind of a downer on the topic, but I appreciate that you did, because I had the same thought: okay, I get it, you can see my screen and what I'm telling you my error code is and what I'm working on. You can also see everything else on my screen and capture it in real time, with lots of magnitude associated with what you saw and what you could do with it. Mike, go ahead.

123
00:17:56.230 --> 00:18:06.210
Mike Sabia: Well, I would say that that's always an issue. Anybody could do a screenshot at any point and find, you know, information that's in the background. That's one of the reasons why we tend to share an application rather than the whole desktop.

124
00:18:06.970 --> 00:18:07.670
Mike Sabia: But

125
00:18:09.260 --> 00:18:28.359
Mike Sabia: Going to what John said about giving data to the LLM, ServiceNow does have its own LLM, so everything stays within ServiceNow, as long as you have that agreement. So it's not like it's going out to ChatGPT, though, you know, ServiceNow does allow you to integrate with those external LLMs as well.

126
00:18:29.250 --> 00:18:51.680
Andy Whiteside: You know, I had an interesting moment in our CRM last night, I think it was. A customer asked me about a model number, and I plugged it into Microsoft Copilot. Not only did it give me the information on the internet about that model of device, it also went into our system and pulled a quote from a month ago for the same customer, and said, oh, it's just like the one you sold a month ago. And I was like, oh my God, that was unbelievably fast and powerful!

127
00:18:53.270 --> 00:18:53.870
John Dahl: Yep.

128
00:18:53.870 --> 00:18:57.509
Andy Whiteside: No, gentlemen, I appreciate it. I look forward to doing it again another week.

129
00:18:58.450 --> 00:19:00.130
Mike Sabia: Thank you. Thanks, Andy.