Salesforce loves AI as much as it loves CRM.
On Tuesday, to coincide with its Dreamforce conference, Salesforce launched Einstein 1, a platform that makes it easier for companies to use data within Data Cloud to build AI-powered apps and experiences.
Data Cloud (Salesforce’s newish name for its customer data platform) lets marketers unify data from across Salesforce apps and other sources connected via APIs, including Slack, Google Workspace, Microsoft 365 and the data visualization platform Tableau.
Marketers can also connect to outside vendors through MuleSoft, which allows developers to build APIs for enterprise apps.
(Salesforce spent a combined $22.2 billion to acquire MuleSoft in 2018 and Tableau in 2019.)
The Einstein 1 platform includes a conversational assistant called Einstein copilot, which lets marketers automate tasks with generative AI using customer data from Data Cloud. Marketers can either tap into templated prompts or build their own through the Einstein copilot studio.
Let me get that for you
Like Microsoft’s similarly named AI-powered assistant and related products on the market, Einstein copilot generates answers, recommendations and content in response to natural language questions and prompts.
What differentiates Einstein copilot from those tools is “the connectivity to data,” said Patrick Stokes, EVP and GM of platform at Salesforce, speaking at a press conference on Monday.
“We’ve worked really hard with our data cloud [and] with Einstein to provide a lot of optionality for customers so they can connect data into Salesforce and model it, but it doesn’t have to live in Salesforce,” Stokes said. “We’ve really spent a lot of time building that foundation so we can bring this into customer applications.”
Salesforce built Einstein copilot using Einstein GPT, which relies on machine learning to analyze large CRM data sets and generate content.
Einstein GPT was also the basis for Marketing GPT and Commerce GPT, two ChatGPT-inspired chatbots released by Salesforce earlier this year to automate segment creation, personalized content, journey creation, product descriptions and other related tasks.
Workflowing
In practice, Einstein copilot allows marketers and sellers to chat with their data in natural language without having to know code.
For example, the tool can flag when B2B prospects are surfing a company’s website, which is data that typically wouldn’t be available to a salesperson, said Sanjna Parulekar, VP of product marketing at Salesforce.
Einstein can summarize every past interaction with a prospect before the salesperson reaches out, then automatically transcribe the call, summarize it in real time and update the contact’s record.
(No mention, however, of the ongoing lawsuit filed in August by a former Salesforce executive claiming Salesforce doesn’t have the technology to process and organize customer data in real time.)
Marketers and salespeople can also use the Einstein copilot studio to build their own custom prompts and AI-driven actions that aren’t offered by Salesforce out of the box.
“You can add those customizations, you can make it your own, you can A/B test any number of versions and then roll it out to your entire [team],” said Clara Shih, CEO of Salesforce AI.
The trust gap
But the flip side of any conversation about the fun stuff people can do with AI is concern about the risks.
“Every CEO is thinking about AI and knows that it has the promise and potential to drive growth,” Shih said.
But there’s also an AI trust gap: More than half (52%) of consumers don’t trust the safety and security of generative AI. “And for good reason,” Shih said.
Generative AI has a tendency to hallucinate and produce biased information, incomplete results and sometimes straight-up incorrect outputs.
To manage the security issue, Einstein copilot will operate within a so-called “trust layer” that involves secure data retrieval from Data Cloud. That should reduce the risk of hallucination, Shih said.
Salesforce also obfuscates any sensitive or proprietary information before it’s sent to the language models and runs regular audits to make sure the AI is behaving itself.
“Twenty-five years ago, we created a way for companies to put their customer data in the cloud securely and safely, and 10 years ago we created a way for companies to use predictive AI safely and securely,” Shih said. “We’re going to do it again for generative AI.”