Salesforce Simplified
Dive into the expansive world of Salesforce with Derek Cassese, a seasoned expert with years of experience both within Salesforce and at XenTegra. In each episode, Derek unveils actionable insights, shares insider secrets, and imparts best practices rooted in his deep industry knowledge. Whether you're just starting out or are a seasoned Salesforce pro, tune in to demystify the platform and discover ways to make it more user-friendly for all. Join us on our journey to simplify Salesforce for everyone.
Salesforce Simplified: 5 Things Architects Should Know About Einstein
If you’re a Salesforce architect, your customers have probably started to ask you about generative AI. To maintain their trust, you’ll need to understand what the Einstein 1 Platform is, the unique advantages it provides, and how it’s different from other Salesforce products. This blog discusses five key concepts that architects need to know about Einstein now.
Host: Andy Whiteside
Co-host: Derek Cassese
WEBVTT
1
00:00:02.450 --> 00:00:15.852
Andy Whiteside: Hello! Welcome to Episode 8 of Salesforce Simplified. I'm your host, Andy Whiteside, and as always, I have Derek Cassese with me. Derek, let me get the commercial out of the way real quick. So XenTegra is committed to helping people be successful with Salesforce,
2
00:00:16.410 --> 00:00:40.600
Andy Whiteside: and that is not a small task. There is a lot to do, and we're gonna talk about some stuff today that will be relevant to that topic. But we believe that there is an opportunity to raise the bar as a Salesforce partner for both the customer who has invested in the platform and Salesforce, who has developed the platform. I say that as a customer of Salesforce and other partners: the opportunity to do it better
3
00:00:40.600 --> 00:00:52.920
Andy Whiteside: is there, because we've had to do it for ourselves. That's what led to starting the practice and bringing Derek on. Derek, what's the most exciting thing that you talked to a customer about in the last, I don't know, 5 business days, 7 days?
4
00:00:53.987 --> 00:00:58.752
Derek Cassese: The most exciting thing, let's see. So I've talked to a customer
5
00:00:59.680 --> 00:01:05.870
Derek Cassese: I had to explain what change sets were to them, and they didn't really understand what that was, because
6
00:01:06.060 --> 00:01:14.310
Derek Cassese: they don't typically do that. So I got to explain to them, you know, the importance of sandboxes as it pertains to an org, and
7
00:01:14.420 --> 00:01:22.759
Derek Cassese: I went and just showed them, and then, you know, you could see the aha moment happen. And they're like, oh, that's what that is. Okay, cool.
8
00:01:22.940 --> 00:01:29.270
Derek Cassese: And they've had, I mean, multiple people in and out, and nobody's ever taken the time to explain that to them. So it's pretty.
9
00:01:29.270 --> 00:01:35.305
Andy Whiteside: You're just proving my point right there, and I wouldn't know what change sets are either. But it, I mean, kind of makes sense when you start talking about it. But
10
00:01:35.550 --> 00:01:41.980
Andy Whiteside: Not only is it about knowing what it is, but also having some degree of holding yourself accountable to following it.
11
00:01:42.660 --> 00:01:43.280
Derek Cassese: Yes.
12
00:01:44.260 --> 00:01:45.990
Derek Cassese: yeah, yeah, I mean.
13
00:01:46.340 --> 00:01:54.660
Derek Cassese: and that could be another, that'll be a podcast for a later date: getting into change sets and the new DevOps Center, which is pretty cool.
14
00:01:54.800 --> 00:01:56.350
Derek Cassese: Yeah. So.
15
00:01:56.610 --> 00:02:00.219
Andy Whiteside: Let's see. So the topic you brought for today
16
00:02:03.080 --> 00:02:07.729
Andy Whiteside: 5 things architects should know about Einstein.
17
00:02:07.830 --> 00:02:11.249
Andy Whiteside: written by Shoshone Poleski.
18
00:02:11.250 --> 00:02:12.160
Derek Cassese: Susannah.
19
00:02:12.620 --> 00:02:14.796
Andy Whiteside: Oops. Suzanne Susannah. Sorry.
20
00:02:15.340 --> 00:02:17.423
Derek Cassese: I believe it's Susannah Plaisted.
21
00:02:19.320 --> 00:02:22.540
Andy Whiteside: And so, Derek, this is good, 'cause I've been waiting,
22
00:02:22.560 --> 00:02:24.369
Andy Whiteside: well before you joined us.
23
00:02:24.590 --> 00:02:39.019
Andy Whiteside: and now since you've been here. I hear all this talk about generative AI, and I'm thinking that little Salesforce Einstein guy is finally gonna be really relevant. And I think that's kind of what we're gonna cover today: how this Einstein thing, which I love, that Salesforce gives,
24
00:02:39.150 --> 00:02:51.150
Andy Whiteside: you know, these cartoon logo characters to these technologies, these topics. And I love that Einstein is the AI one that's been around for a while, but now it's gonna actually start doing stuff for me, right?
25
00:02:51.400 --> 00:02:54.993
Derek Cassese: Yeah, big time. Yeah, I'm excited about it. It's,
26
00:02:55.710 --> 00:03:05.200
Derek Cassese: as we go through this, it'll all start to make sense. But it's definitely really powerful. And, you know, it took me a little bit to really
27
00:03:05.360 --> 00:03:09.729
Derek Cassese: have my aha moment with this, because, you know, sometimes in what we do,
28
00:03:09.800 --> 00:03:16.820
Derek Cassese: sometimes you see tech being done just to do it. It's almost like, okay, that's cool, but I don't really need that.
29
00:03:16.960 --> 00:03:19.549
Derek Cassese: But this is different. This is,
30
00:03:19.670 --> 00:03:28.179
Derek Cassese: like, the generative AI capabilities that it's giving to the Salesforce platform are pretty transformational. So I'm pretty excited about it.
31
00:03:28.180 --> 00:03:41.420
Andy Whiteside: Yeah. And hopefully the business is gonna see that all these years we've been trying to get them onto a platform is going to pay off, and it's gonna pay off exponentially. And I can't think of a better example than Einstein and Salesforce.
32
00:03:41.750 --> 00:03:42.796
Derek Cassese: Yeah. Yup.
33
00:03:43.500 --> 00:03:53.049
Derek Cassese: Yeah. And part of what we're gonna talk about, too. So anybody that was at Dreamforce in September of last year, you know, the big announcement was the Einstein 1 Platform.
34
00:03:53.705 --> 00:04:00.610
Derek Cassese: So Salesforce is going full bore ahead with, you know, AI as kind of their leading
35
00:04:01.222 --> 00:04:07.150
Derek Cassese: message to customers, and it brings along with it a lot of topics that we'll get into here.
36
00:04:07.420 --> 00:04:08.150
Derek Cassese: Yep.
37
00:04:08.712 --> 00:04:14.370
Andy Whiteside: Alright. The first topic is: Einstein is hyper-focused on trust. Derek, what does this mean? Why is it relevant?
38
00:04:14.810 --> 00:04:16.260
Derek Cassese: Yeah. So
39
00:04:17.010 --> 00:04:18.040
Derek Cassese: the
40
00:04:18.870 --> 00:04:25.140
Derek Cassese: number one value for Salesforce is trust. That's why they have trust.salesforce.com.
41
00:04:25.150 --> 00:04:29.080
Derek Cassese: By the way, in his book, you know, Behind the Cloud,
42
00:04:29.450 --> 00:04:38.420
Derek Cassese: Benioff discusses how that was kind of groundbreaking at the time, when they were like, you know what, we're gonna have outages, why don't we just put it up there and be totally transparent about it?
43
00:04:38.590 --> 00:04:44.089
Derek Cassese: And I found that really fascinating. So anybody at any time can go to trust.salesforce.com
44
00:04:44.210 --> 00:04:51.610
Derek Cassese: and see if there's any issues with the platform. So trust is a very, very important piece
45
00:04:52.110 --> 00:04:57.150
Derek Cassese: of the relationship between Salesforce and their customers. And
46
00:04:57.690 --> 00:05:09.380
Derek Cassese: it's also really important when we get into generative AI, because now we're in a situation where you gotta understand all the moving pieces, right? And so what this is talking about is how
47
00:05:09.960 --> 00:05:22.160
Derek Cassese: it mentions that, you know, the core value is trust, but also that you've got this concept within the Einstein 1 Platform, which is a rebranding of the Salesforce platform with a focus on AI.
48
00:05:23.006 --> 00:05:24.199
Derek Cassese: it the
49
00:05:24.210 --> 00:05:32.240
Derek Cassese: And there's this thing called the Einstein Trust Layer, right? It's a variety of apps and services that kinda come together that
50
00:05:32.797 --> 00:05:37.390
Derek Cassese: do things to protect the customer and their data and content.
51
00:05:37.849 --> 00:05:45.760
Derek Cassese: And, you know, one of the things that it does is it can mask personally identifiable information, PII.
52
00:05:45.850 --> 00:05:53.097
Derek Cassese: So if you're sending your Salesforce data into an LLM, a large language
53
00:05:53.670 --> 00:05:55.450
Derek Cassese: model,
54
00:05:56.310 --> 00:06:03.380
Derek Cassese: the specifics of me, like Derek Cassese, and, you know, anything that is personally identifiable, can be masked
55
00:06:03.610 --> 00:06:06.420
Derek Cassese: so that it's not actually being stored in that model.
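To make the masking idea concrete, here is a minimal sketch of the concept in Python. The patterns, placeholder tokens, and mask_pii function are invented for illustration; the actual Einstein Trust Layer performs this masking inside the platform before a prompt reaches an external LLM.

```python
import re

# Hypothetical illustration of the masking idea only; not the Trust Layer's
# real implementation. PII is replaced with placeholder tokens before the
# prompt leaves the trust boundary.
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def mask_pii(prompt: str) -> str:
    """Replace recognizable PII in a prompt with masked tokens."""
    masked = prompt
    for label, pattern in PII_PATTERNS.items():
        masked = pattern.sub(f"[{label}_MASKED]", masked)
    return masked

if __name__ == "__main__":
    raw = "Follow up with derek@example.com at 555-867-5309 about the renewal."
    print(mask_pii(raw))
    # -> Follow up with [EMAIL_MASKED] at [PHONE_MASKED] about the renewal.
```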
56
00:06:07.560 --> 00:06:14.560
Derek Cassese: It's also hosted on Hyperforce. So that is the relationship of Salesforce and
57
00:06:14.840 --> 00:06:16.300
Derek Cassese: Microsoft.
58
00:06:16.750 --> 00:06:18.699
Derek Cassese: and it allows
59
00:06:18.710 --> 00:06:19.830
Derek Cassese: the
60
00:06:20.140 --> 00:06:22.469
Derek Cassese: use of generative AI
61
00:06:23.290 --> 00:06:43.049
Derek Cassese: without leaving the infrastructure, since Salesforce has moved the majority of orgs over to Hyperforce. So now you're actually able to leverage AI without leaving that infrastructure, from within Hyperforce, which has already been vetted, because that's where you're running your Salesforce platform anyway. So it's all kind of combined into this one single layer.
62
00:06:43.468 --> 00:06:49.319
Derek Cassese: And that's, you know, that's what they're calling the Einstein Trust Layer. It's a pretty important piece of this entire discussion.
63
00:06:50.094 --> 00:06:58.009
Derek Cassese: It says here, you know, it also supports the use of external models. And this is really important, too, right? So
64
00:06:58.522 --> 00:07:04.340
Derek Cassese: it only allows the use of external models that have agreed to a zero-retention policy.
65
00:07:04.470 --> 00:07:08.759
Derek Cassese: So that's important, because what that means is companies like OpenAI
66
00:07:10.510 --> 00:07:25.150
Derek Cassese: that exist within the trust boundary of Salesforce, if you're using their model, if you choose, hey, I want to use OpenAI, they have contracts and agreements that they're not saving any of your data to help train their model,
67
00:07:25.270 --> 00:07:32.990
Derek Cassese: right? So they're not retaining any of that, so you know where your data is going and how it's being used. You know that PII is being masked.
68
00:07:33.070 --> 00:07:38.169
Derek Cassese: And all this is kind of building that trust that you're giving Salesforce,
69
00:07:38.230 --> 00:07:46.699
Derek Cassese: so that you can then turn around and use this technology. They're also soon going to support bring-your-own-model.
70
00:07:46.950 --> 00:07:48.070
Derek Cassese: So
71
00:07:48.220 --> 00:07:57.569
Derek Cassese: You know, there are customers out there that are building their own large language models, and they could bring their own, right? And that's important because, and I'll just kinda go back a couple of years,
72
00:07:58.527 --> 00:08:06.179
Derek Cassese: When I was at Salesforce I got the opportunity to work with the team on the blockchain product, which I thought was
73
00:08:06.470 --> 00:08:09.659
Derek Cassese: really really cool, very promising. But
74
00:08:10.221 --> 00:08:16.478
Derek Cassese: part of the issue was like, what chain do you use? You know there's a lot of blockchains out there, and without going down that rabbit hole
75
00:08:16.750 --> 00:08:34.730
Derek Cassese: I felt like that was part of the issue, and then, you know, it subsequently phased out. But I think that this is being done correctly, where you're not locked into using one particular model when it comes to the LLM that you want to use for these technologies, right? And so
76
00:08:34.860 --> 00:08:37.670
Derek Cassese: all this allows you to
77
00:08:38.110 --> 00:08:44.030
Derek Cassese: use this AI, right, that's being protected. And you've got a trust layer.
78
00:08:44.150 --> 00:08:50.740
Derek Cassese: But then they're adding this one magical piece, and I'll see if this reminds you of anything, but it's
79
00:08:50.750 --> 00:08:59.199
Derek Cassese: the ability to keep the people in the loop, as they're saying. So, the ability to give your users the ability to
80
00:08:59.390 --> 00:09:14.230
Derek Cassese: give feedback on the responses from these models. So, like a thumbs up, thumbs down, was this, you know, a good response, and that feeds back into the system and helps refine what's being provided to the users.
81
00:09:14.230 --> 00:09:17.979
Andy Whiteside: It's a non-intrusive, real-time feedback loop, a feedback model.
82
00:09:17.980 --> 00:09:18.640
Derek Cassese: Yep.
83
00:09:18.840 --> 00:09:20.940
Andy Whiteside: Yeah, yeah, I think I've heard of that before.
84
00:09:22.050 --> 00:09:29.410
Andy Whiteside: That's an inside joke. People listening don't know we're talking about Zoom. Derek wanted to invent something similar about 10 years ago in the IT space, where it was just
85
00:09:29.440 --> 00:09:35.159
Andy Whiteside: this constant little button you could hit on the screen, saying, I'm happy, I'm happy, I'm happy. Oh, I'm unhappy, I'm unhappy.
86
00:09:35.170 --> 00:09:39.160
Andy Whiteside: Yeah, if something was worth, you know, adding comments to, you could do it.
87
00:09:39.790 --> 00:09:42.540
Derek Cassese: Yeah, just, you know, it's a point in time. Like,
88
00:09:42.680 --> 00:09:57.370
Derek Cassese: if you collect feedback at the time the user experiences either a good experience or a bad experience, you're gonna get really good feedback. But if you send a survey like two days later, it's not gonna be that good, in my opinion. People forget, they may not even fill it out. So
89
00:09:58.650 --> 00:10:01.899
Andy Whiteside: They've moved on. They're past the point of caring whether it was good or bad at that point.
90
00:10:01.900 --> 00:10:02.970
Derek Cassese: Exactly. Yeah.
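As a rough illustration of the thumbs-up/thumbs-down loop Derek describes, here is a hypothetical sketch of capturing feedback at the moment of the experience. The event shape and in-memory log are placeholders, not the Einstein Trust Layer's actual feedback API.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical in-memory store; a real implementation would persist these
# events so they can be used to refine prompts and future responses.
@dataclass
class FeedbackEvent:
    response_id: str
    rating: str          # "thumbs_up" or "thumbs_down"
    comment: str = ""
    captured_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

feedback_log: list[FeedbackEvent] = []

def record_feedback(response_id: str, rating: str, comment: str = "") -> None:
    """Capture feedback at the moment of the experience, not days later."""
    if rating not in {"thumbs_up", "thumbs_down"}:
        raise ValueError("rating must be 'thumbs_up' or 'thumbs_down'")
    feedback_log.append(FeedbackEvent(response_id, rating, comment))

record_feedback("resp-001", "thumbs_up", "Email draft was on point.")
record_feedback("resp-002", "thumbs_down", "Tone was too formal.")
print(len(feedback_log))  # 2 events ready to feed back into refinement
```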
91
00:10:03.237 --> 00:10:21.160
Andy Whiteside: Derek, the next section is about how Data Cloud is a crucial part of the Einstein 1 Platform. You know, I think I've heard people talking about data lakes and data clouds several times over the past week, and how important it's gonna be. Salesforce is offering up one that can be used certainly with its platforms, and beyond, it seems like.
92
00:10:21.500 --> 00:10:27.671
Derek Cassese: Yeah. And you also hear, like, data lakehouse, that's another term that will be passed around here.
93
00:10:28.370 --> 00:10:33.539
Derek Cassese: And that's a much deeper conversation as far as, like, data lake versus data lakehouse,
94
00:10:33.934 --> 00:10:37.825
Derek Cassese: data lake being like very unstructured. I think I have that right.
95
00:10:38.340 --> 00:10:47.779
Derek Cassese: But anyway, what they're saying is, like, Data Cloud. So again, another really big announcement when we were at Dreamforce was that Salesforce was giving all their customers on
96
00:10:47.790 --> 00:10:52.139
Derek Cassese: particular versions, and I think it was, like, Enterprise and up, they were giving them free Data Cloud,
97
00:10:52.846 --> 00:10:57.489
Derek Cassese: and so free Data Cloud means a certain number of free credits.
98
00:10:58.510 --> 00:11:02.050
Derek Cassese: But free Data Cloud to start
99
00:11:02.780 --> 00:11:07.360
Derek Cassese: digging into that technology, because it's a crucial aspect of this, in that,
100
00:11:09.900 --> 00:11:22.160
Derek Cassese: you know, all the data that you're gonna be using for this generative AI is going to be collected, and it's going to be sitting in the Data Cloud platform, which is still part of the Salesforce platform.
101
00:11:23.128 --> 00:11:26.300
Derek Cassese: But it allows like extremely large
102
00:11:28.060 --> 00:11:31.179
Derek Cassese: like database sizes right? And so
103
00:11:31.800 --> 00:11:39.890
Derek Cassese: you know, you can basically get the performance out of this stuff that you typically wouldn't have been able to get without Data Cloud; you can get the performance that you need.
104
00:11:39.970 --> 00:11:47.169
Derek Cassese: But what they're saying here is that all their generative AI capabilities are built using Data Cloud infrastructure on Hyperforce.
105
00:11:47.330 --> 00:11:52.320
Derek Cassese: Right? And so, again, you're gonna keep hearing this Hyperforce, which allows for extreme,
106
00:11:53.068 --> 00:11:55.260
Derek Cassese: like petabyte scale data.
107
00:11:55.320 --> 00:12:07.079
Derek Cassese: So, you know, you can ingest just an insane amount of data into Data Cloud, and then, you know, work with structured and unstructured data across these disparate systems. So, for example,
108
00:12:07.539 --> 00:12:16.520
Derek Cassese: maybe you have data in ServiceNow, maybe you have data in Salesforce, maybe you have data in Microsoft, maybe you have data in other areas, which you probably all do.
109
00:12:16.806 --> 00:12:19.019
Derek Cassese: You can have all of that
110
00:12:19.120 --> 00:12:25.200
Derek Cassese: in data cloud, and then start, you know, generating these responses that know about all of that
111
00:12:25.250 --> 00:12:26.720
Derek Cassese: which is pretty.
112
00:12:27.310 --> 00:12:31.960
Derek Cassese: That's pretty impressive, right? And it's very useful information,
113
00:12:32.766 --> 00:12:36.030
Derek Cassese: To be able to just have a conversation right with
114
00:12:36.210 --> 00:12:40.410
Derek Cassese: this type of AI and the fact that
115
00:12:41.390 --> 00:12:47.150
Derek Cassese: you know, the fact that Data Cloud is this crucial part, which is what they're saying in this article, is a key differentiator
116
00:12:47.454 --> 00:12:51.869
Derek Cassese: to what Salesforce is doing with AI as opposed to some of these other companies.
117
00:12:52.020 --> 00:12:52.740
Derek Cassese: Up
118
00:12:52.930 --> 00:12:59.689
Derek Cassese: might not, you know, that might be, hey, go use Snowflake, go use something else. But this is kind of an all-encompassing solution
119
00:13:00.220 --> 00:13:03.619
Derek Cassese: for customers to really start embracing this
120
00:13:04.047 --> 00:13:06.970
Derek Cassese: and so you know, and data cloud.
121
00:13:07.250 --> 00:13:12.989
Derek Cassese: It's interesting, because we've already done a podcast on that, right, where we went through all the names that it's had.
122
00:13:13.470 --> 00:13:24.419
Derek Cassese: So if you're listening and you haven't heard that, go back and listen to that one, where we discuss the fact that it's not necessarily a brand new product. It's had a bunch of names, but it's evolved significantly
123
00:13:24.440 --> 00:13:27.980
Derek Cassese: from a use case and a reach perspective.
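To illustrate the multi-source point Derek makes above (data sitting in ServiceNow, Salesforce, Microsoft, and elsewhere), here is a generic sketch of harmonizing records by account so a generated response can draw on the full picture. The system names, field names, and mapping functions are invented for illustration and are not Data Cloud's actual ingestion API.

```python
from typing import Any

# Generic, hypothetical mappers from two source systems into a common shape.
def from_crm(record: dict[str, Any]) -> dict[str, Any]:
    return {"account": record["AccountName"], "source": "crm", "detail": record["StageName"]}

def from_itsm(record: dict[str, Any]) -> dict[str, Any]:
    return {"account": record["company"], "source": "itsm", "detail": record["project_status"]}

def unify(crm_rows: list[dict], itsm_rows: list[dict]) -> dict[str, list[dict]]:
    """Group everything by account so an AI prompt can be grounded in the full picture."""
    unified: dict[str, list[dict]] = {}
    for row in [from_crm(r) for r in crm_rows] + [from_itsm(r) for r in itsm_rows]:
        unified.setdefault(row["account"], []).append(row)
    return unified

profile = unify(
    crm_rows=[{"AccountName": "Company ABC", "StageName": "Negotiation"}],
    itsm_rows=[{"company": "Company ABC", "project_status": "At Risk"}],
)
print(profile["Company ABC"])  # both the open opportunity and the at-risk project
```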
124
00:13:30.840 --> 00:13:41.350
Andy Whiteside: Derek, the next section is: Einstein supports multiple models, and it's multi module, model, modal. Sorry, let me start over and try again.
125
00:13:41.780 --> 00:13:46.170
Andy Whiteside: Einstein supports multiple models, and it's multi
126
00:13:46.620 --> 00:13:47.590
Andy Whiteside: modal.
127
00:13:48.495 --> 00:13:59.560
Derek Cassese: Yeah, that's kind of a weird thing. And real quick, before I jump into that, one thing I didn't mention as well is that one of the other features of Data Cloud, and this is really important, is the grounding piece of it.
128
00:13:59.600 --> 00:14:03.950
Derek Cassese: And what that means is, you can ask a generative
129
00:14:04.070 --> 00:14:06.320
Derek Cassese: AI a question.
130
00:14:06.680 --> 00:14:20.909
Derek Cassese: and it may be very ambiguous when it comes back with a response. But if you've grounded it with, like, company name, like the data that you already have in Salesforce, right, you can feed it that information, because it already has that information, and it can give a really, really refined response.
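Here is a minimal sketch of that grounding idea: the prompt is filled in with record data the platform already holds, so the model answers about your specific account rather than in general. The call_llm stub and record fields are placeholders, not a real Salesforce API.

```python
# Hypothetical sketch of grounding a prompt with CRM data the org already has.
def call_llm(prompt: str) -> str:
    # Stand-in for a real model call made behind the trust layer.
    return f"(model response based on: {prompt[:60]}...)"

GROUNDED_TEMPLATE = (
    "You are a sales assistant. Using only the record below, suggest a next step.\n"
    "Account: {account_name}\n"
    "Open opportunity: {opportunity_name} ({stage}, closes {close_date})\n"
    "Last activity: {last_activity}\n"
)

record = {
    "account_name": "Company ABC",
    "opportunity_name": "FY25 Renewal",
    "stage": "Negotiation",
    "close_date": "2024-06-30",
    "last_activity": "Pricing call two weeks ago",
}

print(call_llm(GROUNDED_TEMPLATE.format(**record)))
```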
131
00:14:21.010 --> 00:14:26.729
Derek Cassese: So that's also another benefit of Data Cloud. But as far as what you're saying here, right, it supports multiple models
132
00:14:26.920 --> 00:14:31.430
Derek Cassese: and multimodal right? And what we talked about in the beginning was how
133
00:14:31.630 --> 00:14:34.300
Derek Cassese: they're not just locking you into a single.
134
00:14:35.040 --> 00:14:40.700
Derek Cassese: you know, a single LLM, right? It's not just
135
00:14:41.090 --> 00:14:54.189
Derek Cassese: the Salesforce-hosted ones, like CodeGen. I know there's Anthropic, I think, I might say that wrong, or Cohere. There's a lot of them out there, a ton of them. But,
136
00:14:54.820 --> 00:15:03.629
Derek Cassese: you know, if they've established the trust with Salesforce like OpenAI has, then you can leverage them within this type of an arrangement.
137
00:15:04.028 --> 00:15:11.260
Derek Cassese: They also said that if you developed your own, then you'll be able to connect it to Salesforce with the bring-your-own-model capability. So,
138
00:15:11.810 --> 00:15:12.710
Derek Cassese: you know.
139
00:15:13.380 --> 00:15:16.439
Derek Cassese: But you know, supporting multiple models
140
00:15:16.470 --> 00:15:19.659
Derek Cassese: is their core approach for AI,
141
00:15:19.900 --> 00:15:26.889
Derek Cassese: and what that allows you to do is use what you want, use the best-of-breed model, share them across,
142
00:15:27.611 --> 00:15:30.499
Derek Cassese: you know, your solutions, and then
143
00:15:30.940 --> 00:15:35.239
Derek Cassese: use those, you know, when you build your prompts.
144
00:15:35.530 --> 00:15:44.199
Derek Cassese: And that's a whole other discussion about prompt engineering, which is this new role that's been birthed out of generative AI.
145
00:15:44.630 --> 00:15:47.290
Derek Cassese: But it's the art of creating
146
00:15:47.360 --> 00:15:51.849
Derek Cassese: the prompts that you send to a generative AI model.
147
00:15:52.000 --> 00:15:53.230
Derek Cassese: And
148
00:15:53.390 --> 00:16:06.250
Derek Cassese: the more specific and the more information that you put into a prompt, the better the response is going to be for the user. And what Salesforce has is Prompt Studio, and different models will work with that.
149
00:16:06.980 --> 00:16:08.140
Derek Cassese: Does that make sense?
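To show what "multiple models behind one prompt" might look like, here is a hypothetical sketch of a model registry that routes the same prompt to a hosted model, an external zero-retention model, or a bring-your-own model. The provider names and generate callables are placeholders, not Salesforce's actual configuration.

```python
from typing import Callable

# Hypothetical registry: each entry stands in for a model that has been
# approved for use (hosted, external zero-retention, or bring-your-own).
MODEL_REGISTRY: dict[str, Callable[[str], str]] = {
    "hosted-default": lambda p: f"[hosted model] {p[:40]}...",
    "external-zero-retention": lambda p: f"[external model] {p[:40]}...",
    "byom-internal": lambda p: f"[customer's own model] {p[:40]}...",
}

def run_prompt(prompt: str, model_name: str) -> str:
    """Route one prompt to the chosen model; swapping models doesn't change the prompt."""
    generate = MODEL_REGISTRY[model_name]
    return generate(prompt)

prompt = "Summarize the open cases for Company ABC in two sentences."
for name in MODEL_REGISTRY:
    print(name, "->", run_prompt(prompt, name))
```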
150
00:16:09.353 --> 00:16:15.800
Andy Whiteside: I just keep thinking, every time I hear someone say that, that's where the whole prompt engineering skill set's gonna come in.
151
00:16:16.030 --> 00:16:19.640
Andy Whiteside: But yeah, it depends on what's behind the curtain as well.
152
00:16:19.860 --> 00:16:20.560
Derek Cassese: Yep.
153
00:16:20.980 --> 00:16:35.519
Derek Cassese: Yeah. And the other thing is that, you know, I was reading this and I was like, this is kind of crazy, right? So it doesn't just support one modality of content generation, right? So when they say multimodal,
154
00:16:35.740 --> 00:16:54.640
Derek Cassese: it's not just generating text for an email. And that was something I was joking with you about, that every demo I saw seemed to be just generating an email. And then all of a sudden, I saw them generate code in VS Code, you know, in Visual Studio Code, for developers, which kind of blew my mind as a developer that
155
00:16:55.470 --> 00:16:58.440
Derek Cassese: can't stand having to come up with test classes
156
00:16:58.760 --> 00:17:03.149
Derek Cassese: like, the ability to have this generate a test class is just,
157
00:17:03.660 --> 00:17:05.690
Derek Cassese: you know, it's like Christmas morning
158
00:17:06.539 --> 00:17:21.700
Derek Cassese: And then you can even have it generate configurations and flows in Salesforce. So, like, it's not just, hey, so-and-so, I want to have a meeting with you. It's a different context within everything that's being done in the Salesforce platform, which is pretty impressive.
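As a small taste of the code-generation use case Derek is excited about, here is the kind of boilerplate test an assistant can draft. It is shown as a Python unit test purely for illustration; in Salesforce the equivalent would be an Apex test class, and the function under test here is invented.

```python
import unittest

def apply_discount(amount: float, percent: float) -> float:
    """Toy business method standing in for the code you'd normally hand-test."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(amount * (1 - percent / 100), 2)

class ApplyDiscountTest(unittest.TestCase):
    def test_standard_discount(self):
        self.assertEqual(apply_discount(200.0, 10), 180.0)

    def test_zero_discount_is_identity(self):
        self.assertEqual(apply_discount(99.99, 0), 99.99)

    def test_invalid_percent_raises(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

if __name__ == "__main__":
    unittest.main()
```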
159
00:17:24.023 --> 00:17:27.780
Derek Cassese: And then here's where this gets really interesting. So
160
00:17:29.670 --> 00:17:33.760
Derek Cassese: the article goes in and says that it's, you know, all this becomes even more relevant
161
00:17:33.790 --> 00:17:37.120
Derek Cassese: when we prepare for autonomous agents.
162
00:17:37.890 --> 00:17:47.120
Derek Cassese: And I was like, oh, this is really cool, right? It's the next wave of AI: autonomous agents that leverage these LLMs.
163
00:17:47.160 --> 00:17:57.149
Derek Cassese: I always stumble saying LLMs. But not to predict the next word, right, or sentence, but to predict the next action.
164
00:17:58.460 --> 00:18:08.131
Derek Cassese: So just let that sink in for a minute. So let's just say you're having a conversation with Salesforce Einstein Copilot. Everybody seems to be using "copilot,"
165
00:18:08.690 --> 00:18:10.860
Derek Cassese: and you're talking about,
166
00:18:11.300 --> 00:18:16.140
Derek Cassese: you know, the example they give, let's just say we're talking about, you want to get this opportunity closed.
167
00:18:16.170 --> 00:18:22.819
Derek Cassese: Well, it may actually understand, based on historical trending and all this; it may know what to do next.
168
00:18:23.050 --> 00:18:26.319
Derek Cassese: Right? It may automatically set up tasks
169
00:18:26.400 --> 00:18:28.929
Derek Cassese: or create a campaign
170
00:18:29.210 --> 00:18:32.130
Derek Cassese: for you? But
171
00:18:32.250 --> 00:18:35.219
Derek Cassese: they basically say, you know,
172
00:18:35.230 --> 00:18:42.509
Derek Cassese: so long as you've trained the agent on the types of tasks you need to reach your goal, then the AI will orchestrate,
173
00:18:42.570 --> 00:18:47.229
Derek Cassese: you know, getting that done, right? There's always going to be, like, the human in the middle.
174
00:18:47.230 --> 00:18:47.660
Andy Whiteside: Yeah.
175
00:18:47.994 --> 00:18:54.679
Derek Cassese: So they kind of address the whole, like, this isn't gonna replace people, but it's gonna augment them significantly.
176
00:18:54.680 --> 00:19:05.306
Andy Whiteside: Yeah, I know you said the word "copilot," and I hate the fact that it's being used everywhere, but it's the most applicable word, where it's not in charge. It's there to help and assist you.
177
00:19:07.000 --> 00:19:12.229
Derek Cassese: Yep, and they can't use "assistant," because that's been used everywhere, right? So
178
00:19:12.594 --> 00:19:15.289
Derek Cassese: I guess you know that we'll probably see many name changes.
179
00:19:15.290 --> 00:19:19.449
Andy Whiteside: I actually like "copilot," because it really can fly the plane for you.
180
00:19:20.001 --> 00:19:29.189
Andy Whiteside: But in general, you're gonna fly the plane, and you're gonna make sure the copilot's doing it right. It's not just there to help you; it could actually do it. But
181
00:19:29.310 --> 00:19:29.909
Andy Whiteside: you're
182
00:19:30.690 --> 00:19:33.790
Andy Whiteside: You're gonna make sure it's doing it right, or you're gonna take over as needed.
183
00:19:34.330 --> 00:19:37.874
Derek Cassese: Yeah, yeah. And in here they reference another article,
184
00:19:38.180 --> 00:19:52.850
Derek Cassese: 'cause it says, you know, again, like I said, Einstein Copilot isn't fully autonomous. There's still going to be humans in the loop. But to go deeper on it, there's another article about that which links off of this blog, which is located on Medium.com.
185
00:19:53.660 --> 00:19:54.500
Derek Cassese: so.
186
00:19:54.970 --> 00:20:00.389
Andy Whiteside: Alright, the next topic here is: Einstein will change the way we think about application lifecycle management.
187
00:20:03.590 --> 00:20:05.112
Derek Cassese: Yup, yeah. So
188
00:20:05.770 --> 00:20:07.160
Derek Cassese: you know. And this one.
189
00:20:08.210 --> 00:20:14.459
Derek Cassese: this is kind of an interesting one, too. So as an architect, or somebody designing a system,
190
00:20:15.240 --> 00:20:24.160
Derek Cassese: and I'm like this a lot, too, right? I'm used to writing code and then creating tests, like I just mentioned, but I'm expecting a particular response,
191
00:20:24.360 --> 00:20:27.420
Derek Cassese: and if it's not that response, then the test fails.
192
00:20:27.700 --> 00:20:28.540
Derek Cassese: But
193
00:20:28.930 --> 00:20:29.740
Derek Cassese: with.
194
00:20:30.120 --> 00:20:43.079
Derek Cassese: you know, and that's a predictable result, right? And that's kind of why you test it; you need to have a predictable result based on inputs and all that good stuff. What they're saying here is that with generative AI, it's non-deterministic. So the response could be different
195
00:20:43.840 --> 00:20:47.329
Derek Cassese: every time the prompt is sent. It could just be a different
196
00:20:47.510 --> 00:20:48.520
Derek Cassese: answer.
197
00:20:48.580 --> 00:20:56.639
Derek Cassese: which is a reason why we can't guarantee that that answer is actually gonna be what you want. It could be totally irrelevant, based on how the system is set up.
198
00:20:57.146 --> 00:21:04.609
Derek Cassese: which is why, with all these tools, like with grounding and all that good stuff, you're trying to get it to be as close to relevant to what you're asking for. But it's,
199
00:21:04.780 --> 00:21:08.210
Derek Cassese: it's not like as predictable as
200
00:21:08.940 --> 00:21:11.929
Derek Cassese: what we're used to right? And so
201
00:21:13.180 --> 00:21:16.199
Derek Cassese: and, you know, when you think about application lifecycle
202
00:21:17.800 --> 00:21:19.549
Derek Cassese: and unit testing.
203
00:21:20.050 --> 00:21:25.399
Derek Cassese: there's a whole concept that they bring up here, which is this ethical review,
204
00:21:25.880 --> 00:21:27.439
Derek Cassese: to make sure that.
205
00:21:27.860 --> 00:21:38.119
Derek Cassese: like, you understand, to me, to look at the use cases so that you understand what you're building and how the AI is going to be used,
206
00:21:39.870 --> 00:21:44.270
Derek Cassese: and then you build your templates with the prompts, and you've tested them in the sandbox.
207
00:21:44.470 --> 00:21:48.469
Derek Cassese: Yeah, you still need to think about the response, because this is not like
208
00:21:48.770 --> 00:21:53.169
Derek Cassese: A plus B equals C every time. It's,
209
00:21:53.250 --> 00:21:58.410
Derek Cassese: you know, not deterministic. So the rollout plan has to be such that
210
00:21:58.640 --> 00:22:16.599
Derek Cassese: you know, you have a review, you develop, and then you roll it out to a pilot group. You get feedback, you fine-tune, you roll it back out, you fine-tune, and then you roll it out to the masses, which may not be the traditional way that you're doing, you know, lifecycle management.
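One way to reason about testing non-deterministic output, as Derek describes, is to assert properties of a response rather than an exact string. The checker below is a hypothetical sketch with made-up rules, not a Salesforce testing tool.

```python
def validate_generated_email(text: str, account_name: str) -> list[str]:
    """Return a list of problems; an empty list means the draft passes review."""
    problems = []
    if account_name not in text:
        problems.append("draft never mentions the account")
    if len(text.split()) > 200:
        problems.append("draft is longer than the 200-word guideline")
    if "guarantee" in text.lower():
        problems.append("draft uses prohibited language ('guarantee')")
    return problems

draft_a = "Hi team at Company ABC, following up on last week's event..."
draft_b = "We guarantee results."  # a different, worse generation of the same prompt
print(validate_generated_email(draft_a, "Company ABC"))  # []
print(validate_generated_email(draft_b, "Company ABC"))  # two problems flagged
```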
211
00:22:17.378 --> 00:22:19.430
Derek Cassese: And they have, like, you know, they have this
212
00:22:19.580 --> 00:22:22.729
Derek Cassese: generative AI release plan diagram in here,
213
00:22:23.250 --> 00:22:33.199
Derek Cassese: which, by the way, is in the format of Well-Architected. So anybody that doesn't know about that, go to Salesforce Architects and read up on Well-Architected. It's very interesting.
214
00:22:33.820 --> 00:22:37.489
Derek Cassese: But I digress. It talks about project definition,
215
00:22:37.660 --> 00:22:40.509
Derek Cassese: configure, test, pilot review,
216
00:22:40.580 --> 00:22:46.089
Derek Cassese: beta review, that type of stuff, right? So it is going to change the way you think about rolling out,
217
00:22:46.290 --> 00:22:48.540
Derek Cassese: you know, net new applications and stuff like that.
218
00:22:51.620 --> 00:23:01.260
Andy Whiteside: Derek, section 5 here talks about how it has a consumption-based model. I hear a lot about how much it's gonna cost and how much power is gonna be used. How do people
219
00:23:01.350 --> 00:23:03.699
Andy Whiteside: financially justify it?
220
00:23:03.700 --> 00:23:07.320
Derek Cassese: Yeah, so this is a different one, too, especially
221
00:23:07.680 --> 00:23:11.430
Derek Cassese: because, you know, Salesforce has not been a consumption-based
222
00:23:11.540 --> 00:23:12.490
Derek Cassese: solution,
223
00:23:12.980 --> 00:23:19.169
Derek Cassese: never has been until now. And so the way that Salesforce is doing it, which is kind of following
224
00:23:21.180 --> 00:23:29.749
Derek Cassese: following along with the majority of the providers, is they use what they call credits, and they're consumed based on the length of the prompts
225
00:23:29.860 --> 00:23:40.950
Derek Cassese: sent into the LLM and the outputs generated back. So, depending on how complex the prompt is or not, and how complex the answer is or not,
226
00:23:41.510 --> 00:23:50.120
Derek Cassese: you'll be using a certain number of credits. And that's exactly what Salesforce has given to all customers to play with. They've given them,
227
00:23:50.742 --> 00:23:53.349
Derek Cassese: so like a certain number of credits
228
00:23:53.630 --> 00:23:56.509
Derek Cassese: to just kind of get used to it. And
229
00:23:57.540 --> 00:23:59.700
Derek Cassese: you know, it's it's really
230
00:23:59.710 --> 00:24:06.000
Derek Cassese: that's real high level. So you would buy a certain number of credits based on what you want to do.
231
00:24:06.020 --> 00:24:14.840
Derek Cassese: But that is a conversation to have with, you know, a partner like us, in conjunction with the Salesforce team,
232
00:24:14.860 --> 00:24:20.899
Derek Cassese: to really dig into what you're trying to do, so that you size it appropriately, because I know that
233
00:24:21.160 --> 00:24:24.967
Derek Cassese: there are things you can do that use up a lot of credits,
234
00:24:26.000 --> 00:24:38.550
Derek Cassese: and then there are, you know, simple things like imports that don't. And so it's really important to understand. That's kind of out of scope for this discussion, 'cause that gets really into the weeds on the licensing of it. But I think the takeaway here is that,
235
00:24:39.350 --> 00:24:47.470
Derek Cassese: unlike everything else, this is a consumption-based setup when it comes to Data Cloud. And it's something to be aware of.
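To show the shape of the consumption-based model Derek describes, here is a back-of-the-envelope estimator. The rates are entirely made-up placeholders, not Salesforce's actual credit math; real sizing is the conversation with a partner and the Salesforce account team that Derek mentions.

```python
def estimate_credits(prompt_tokens: int, response_tokens: int,
                     credits_per_1k_prompt: float = 1.0,
                     credits_per_1k_response: float = 2.0) -> float:
    """Longer prompts and longer generated answers consume more credits (illustrative rates only)."""
    return (prompt_tokens / 1000) * credits_per_1k_prompt + \
           (response_tokens / 1000) * credits_per_1k_response

monthly_requests = 5000
per_request = estimate_credits(prompt_tokens=800, response_tokens=400)
print(f"~{per_request:.2f} credits per request, ~{per_request * monthly_requests:,.0f} per month")
```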
236
00:24:50.510 --> 00:24:54.349
Andy Whiteside: And then they bring it all together, right? The conclusion of
237
00:24:54.450 --> 00:24:57.449
Andy Whiteside: why people should be interested in AI
238
00:24:57.480 --> 00:24:59.889
Andy Whiteside: specifically as it relates to Salesforce, too.
239
00:25:00.200 --> 00:25:14.359
Andy Whiteside: It was kind of teed up, from my perspective. It's super easy, because, well, maybe I've just been living in the sales and marketing world too long, but there's so much we could do more efficiently and better in both marketing and sales that Salesforce, to me, is the platform where
240
00:25:14.690 --> 00:25:22.740
Andy Whiteside: there's a lot to be accomplished, because you can never come close to doing it all. AI is gonna give us a chance to do more, better.
241
00:25:23.890 --> 00:25:26.279
Derek Cassese: Yeah, and faster and more efficiently. And,
242
00:25:26.806 --> 00:25:28.840
Derek Cassese: yeah, I mean the, the.
243
00:25:29.010 --> 00:25:36.209
Derek Cassese: the overall takeaways of this are, you know, those five points, right? It's built on a trusted framework
244
00:25:36.280 --> 00:25:38.039
Derek Cassese: with Data Cloud in the middle,
245
00:25:38.260 --> 00:25:40.130
Derek Cassese: supporting multiple models
246
00:25:40.150 --> 00:25:45.139
Derek Cassese: in different areas of what you do. Right? That's that multimodal setup.
247
00:25:45.240 --> 00:25:46.330
Derek Cassese: And
248
00:25:46.930 --> 00:26:01.639
Derek Cassese: it's different, so it needs to be thought of as different. With the application lifecycle stuff, that's different, and the licensing is different. So, you know, all this stuff needs to be taken into account. But it's important to understand these things, because
249
00:26:02.050 --> 00:26:03.779
Derek Cassese: this is gonna be.
250
00:26:04.770 --> 00:26:13.810
Derek Cassese: in my opinion, this is going to be the differentiator for the customers that are able to move quickly and get ahead of their competition
251
00:26:14.323 --> 00:26:19.160
Derek Cassese: faster, and sell more, and build more pipeline, and be successful.
252
00:26:19.250 --> 00:26:20.839
Derek Cassese: Because if you think about it.
253
00:26:21.890 --> 00:26:25.449
Derek Cassese: and if you've ever written a report in Salesforce,
254
00:26:25.470 --> 00:26:30.099
Derek Cassese: you're not sitting there staring at your screen because you want to write a report.
255
00:26:30.210 --> 00:26:34.259
Derek Cassese: You're sitting there spending time doing that because you want an answer to something.
256
00:26:34.530 --> 00:26:35.990
Derek Cassese: whatever that may be.
257
00:26:36.390 --> 00:26:36.990
Andy Whiteside: Yep.
258
00:26:36.990 --> 00:26:44.770
Derek Cassese: Imagine if you just, literally, had somebody that knew all the answers, knew everything in your Salesforce environment, and you could just say, hey, you know,
259
00:26:45.130 --> 00:26:50.029
Derek Cassese: how many customers have done this in the past 90 days versus last year.
260
00:26:50.310 --> 00:26:53.239
Derek Cassese: and they were able to give you that answer in like literally
261
00:26:54.080 --> 00:27:01.340
Derek Cassese: 30 seconds or 10 seconds. Imagine what you could do with that, right? And that's an easy example.
262
00:27:02.620 --> 00:27:04.800
Derek Cassese: imagine just saying, hey.
263
00:27:05.470 --> 00:27:07.830
Derek Cassese: you know, create
264
00:27:08.160 --> 00:27:14.690
Derek Cassese: a funny email for my focused customers about last week's event
265
00:27:16.210 --> 00:27:17.610
Derek Cassese: and send it in a week.
266
00:27:18.520 --> 00:27:22.849
Derek Cassese: Just stuff like that, right? I mean, you could think of all these different
267
00:27:23.170 --> 00:27:24.640
Derek Cassese: use cases.
268
00:27:25.000 --> 00:27:35.530
Derek Cassese: And that's really just making one individual more efficient. And those are just really high level. I mean, there's a lot of other use cases for this. The ability to
269
00:27:36.218 --> 00:27:41.290
Derek Cassese: just have conversations with the data, I think, is really fascinating. We do that a little bit
270
00:27:42.139 --> 00:27:45.729
Derek Cassese: in analytics, but not to this level.
271
00:27:45.780 --> 00:27:51.879
Derek Cassese: And then when you bring in other systems, you know, it gets really, really interesting. So,
272
00:27:51.920 --> 00:27:54.529
Derek Cassese: you know, you could talk about, hey? You know.
273
00:27:54.940 --> 00:28:05.890
Derek Cassese: let's just say you have projects going on that are stored in ServiceNow, and assets, but yet you also have open opportunities, and you could just say, you know,
274
00:28:06.150 --> 00:28:15.289
Derek Cassese: what is the status of Company ABC's project, and is that gonna potentially cause issues with any open opportunities? Boom. And then it gives you all those answers and correlates what's happening
275
00:28:15.700 --> 00:28:21.779
Derek Cassese: Right? And the way that you ask those questions, those are the prompts, and those prompts are grounded with your data,
276
00:28:21.970 --> 00:28:24.720
Derek Cassese: nobody else's. The data isn't coming from some,
277
00:28:24.760 --> 00:28:30.879
Derek Cassese: you know, manufactured set of artificial data. It's coming from your actual data that you created in salesforce.
278
00:28:31.340 --> 00:28:37.919
Derek Cassese: So it's fascinating stuff. It's, I think, gonna change the way that these tools are used.
279
00:28:38.649 --> 00:28:46.719
Derek Cassese: There are going to be those that embrace it, there are going to be those that don't, and time will tell who gets left behind on some of this stuff. But,
280
00:28:47.040 --> 00:28:49.640
Derek Cassese: you know this, this stuff is here. It's here to stay.
281
00:28:50.390 --> 00:29:15.740
Andy Whiteside: Yeah, this is my third podcast of the day, and every single one of them talked about, you know, AI and where it's going. This has to be so applicable. We talked about crypto and things like that; they are real, but they're not as applicable. This is applicable everywhere. You helped me out immensely just now with the idea that I'm gonna be able to just tell Salesforce what I want for a report, instead of having to go through and figure out all the filters myself,
282
00:29:15.995 --> 00:29:22.779
Andy Whiteside: and then have to go back days and days afterwards: oh, I should have made the filter like this, not that. At least, at the minimum, get me a good starting point.
283
00:29:23.530 --> 00:29:26.210
Derek Cassese: Yeah, just have a conversation with it. Right? I mean.
284
00:29:26.320 --> 00:29:37.190
Derek Cassese: reports are just a tool, because you want to get an answer to something. That's the only reason you run a report, and then filters are just because, you know, you want that in different ways.
285
00:29:37.650 --> 00:29:44.539
Derek Cassese: Those things are going to change. I mean, this is a much, much better user interface to the same data that those reports are running on.
286
00:29:45.510 --> 00:29:46.700
Derek Cassese: Yeah, right?
287
00:29:46.890 --> 00:29:52.059
Derek Cassese: So, you know, the other interesting piece of this is that,
288
00:29:52.760 --> 00:29:55.369
Derek Cassese: you know, with Salesforce,
289
00:29:55.640 --> 00:29:57.000
Derek Cassese: you could be
290
00:29:57.210 --> 00:30:05.659
Derek Cassese: a 5-employee customer on Salesforce with Sales Cloud, and you could use the same AI capabilities that,
291
00:30:06.110 --> 00:30:14.460
Derek Cassese: you know, a hundred-thousand-employee company is using. And so it's getting this technology in the hands of anybody that wants to use it,
292
00:30:14.770 --> 00:30:22.049
Derek Cassese: leveling the playing field. And I think that's, you know, it's not like, hey, you need 2.3 million dollars to use this stuff. You know, you don't.
293
00:30:23.028 --> 00:30:45.010
Andy Whiteside: Yeah, funny you said leveling the playing field. The visual that I was coming up with in my head is, it's like that show, you know, Are You Smarter Than a 5th Grader, and having the ability to phone a friend for every single question they ask, and how much higher the likelihood of getting it right would be. So you're not just phoning a friend, you're phoning the most intelligent friend you could ever have.
294
00:30:45.380 --> 00:30:46.010
Derek Cassese: Yeah.
295
00:30:46.840 --> 00:30:47.400
Andy Whiteside: Oh!
296
00:30:47.670 --> 00:30:56.580
Derek Cassese: Yeah, and, you know, again, that's just the "you have a question and you need an answer" use case, right? It's not the,
297
00:30:57.070 --> 00:31:08.040
Derek Cassese: you know, the harder ones where it's like, hey, make sure that we don't have any duplicates in here, and if we do have duplicates, let me know why, and when it was created, and who, you know, all that stuff. There's just so much you can do that,
298
00:31:08.540 --> 00:31:12.999
Derek Cassese: you know, it is kind of like the sky's the limit. But there are limits right now.
299
00:31:13.220 --> 00:31:24.140
Derek Cassese: And there are limits just because, you know, it's still evolving. It's still new. You know, some of this stuff is new, like Copilot, you know, like the prompting, all that stuff is new. But,
300
00:31:24.490 --> 00:31:32.910
Derek Cassese: you know, the idea is to embrace this and not be afraid of it. Be educated on it. Understand that there's trust behind the way Salesforce is doing it.
301
00:31:34.105 --> 00:31:35.010
Derek Cassese: And
302
00:31:35.970 --> 00:31:38.609
Derek Cassese: you know, underneath it all, though, is the data
303
00:31:38.810 --> 00:31:40.350
Derek Cassese: It's really important that
304
00:31:40.560 --> 00:31:43.030
Derek Cassese: people start thinking now about their data,
305
00:31:43.786 --> 00:31:47.599
Derek Cassese: and data integrity, you know, and maybe this is a
306
00:31:47.960 --> 00:31:51.830
Derek Cassese: little sneak peek, but data integrity might be a topic of another discussion.
307
00:31:52.700 --> 00:31:55.410
Andy Whiteside: Well, it's core to the whole thing.
308
00:31:56.020 --> 00:31:56.810
Andy Whiteside: Yep.
309
00:31:57.880 --> 00:32:02.469
Andy Whiteside: Derek, I appreciate it. What did we not cover that you would want to add, maybe, to this topic, if anything?
310
00:32:04.190 --> 00:32:09.100
Derek Cassese: Just that, if anybody's listening to this that is a Salesforce customer that didn't realize they
311
00:32:10.020 --> 00:32:13.330
Derek Cassese: have some Data Cloud, if they are on, like, Enterprise or up,
312
00:32:13.350 --> 00:32:19.169
Derek Cassese: to reach out, because you do, and we can help you do a quick pilot, or understand what you've got.
313
00:32:20.600 --> 00:32:38.900
Andy Whiteside: Yeah. Going back to my previous commercial I talked about in the very beginning: we need to help you. I don't care who you are, there's an opportunity to at least have a discussion about where a value-added partner who's in it for the right reasons might be able to help with your current and future Salesforce implementations, including
314
00:32:38.910 --> 00:32:53.730
Andy Whiteside: Data Cloud and AI-related technologies that are coming, period. They're coming. And as Derek pointed out, you've gotta be prepared and get your team prepared to use them, because if you don't, you're gonna be getting behind instantly.
315
00:32:53.730 --> 00:32:59.440
Derek Cassese: Yeah. And also, I will say, a lot of this stuff is new, right? And
316
00:32:59.680 --> 00:33:13.960
Derek Cassese: so if you're listening and I said something that you don't agree with, I'd love to hear that, too, because we're all learning here, right? And so further conversations on these topics are welcome. I would enjoy that, because, you know, as these things evolve,
317
00:33:14.326 --> 00:33:21.403
Derek Cassese: it's hard to keep up with, and that's, you know, part of why we do these podcasts: to help people just, you know, in the car or wherever they are, listen to stuff and just get some of it.
318
00:33:22.001 --> 00:33:33.809
Derek Cassese: But again, you know, a lot of this stuff is new and coming out as we go about our day throughout these years, too. It's 2024 right now, so it'd be interesting to listen to this in 5 years and see how much it's changed.
319
00:33:34.536 --> 00:33:39.660
Andy Whiteside: And it will. We'll get it wrong, but it won't be for lack of trying to connect the dots.
320
00:33:39.660 --> 00:33:40.465
Derek Cassese: Absolutely.
321
00:33:41.270 --> 00:33:44.030
Andy Whiteside: Alright, sir, thank you, and we'll do this again in 2 weeks.
322
00:33:44.450 --> 00:33:44.976
Derek Cassese: And ready.