With ChatGPT and large language models (LLMs) having their moment in the hype cycle, it is important to explore how these generative AI tools will make the leap from consumer applications to real business value.
We caught up with Eugenio Cassiano, Senior Vice President of the Strategy & Innovation Group at Celonis, and Cong Yu, Vice President of Engineering, AI and Knowledge at Celonis, to talk about LLMs like ChatGPT and their intersection with process mining. You’ll hear more about this topic in the next edition of Celonis Labs Beyond later this year.
Here's a look at the key points from our conversation:
Biggest impact may be behind the scenes. Yu noted that a chat interface on top of search has captured the imagination of the tech sector, but the impact on the enterprise is likely to be elsewhere. For instance, code generation, including for proprietary programming languages, may be accelerated by LLMs. Yu said:
"If you think about how LLMs are trained, they're learning statistical patterns, and doing a very good job at that, from the vast amount of content on the Web. This includes common programming languages like Python and SQL, which has led to GitHub Copilot. In the enterprise setting, however, there are a lot of proprietary languages being adopted for specific business needs. For example, Celonis uses Process Query Language, a SQL-like language that is uniquely tailored to scalable and efficient process mining. While this is not as visible as a chat interface, we believe the impact of LLMs, with infused process knowledge, on PQL generation could be a game-changer: it will make the job of our value engineers, as well as our partners and other users outside of the company, much easier, and significantly shorten the value journeys for our customers."
Just one part of a broader process mining innovation equation. Cassiano said that LLMs can make things easier for customers and Celonis, but they are just one part of the innovation equation. "We are on an exciting journey and we have already started with the introduction of Object-Centric Process Mining (OCPM). This is basically the next stepping stone of our journey," he said.
With OCPM, Cassiano said "the next logical step is to create a common ontology across all our customers." This ontology would capture process knowledge, including benchmarking and recommendations, he added. LLMs could be a natural interface into that process knowledge.
What do ChatGPT and LLMs lack? Knowledge. Yu said publicly available LLMs "don't have the process knowledge and the business knowledge that we alone as Celonis have." He elaborated:
"We deeply understand all this structured data sitting inside of our clients’ data and operational systems, so the key thing for us is to combine the structured process knowledge with the capability of large language models, and put them together for much greater impact."
What would an interface combining LLMs and process mining look like? Yu said LLMs could make it easier to use natural language for charting in PQL. "Then you'd have a dialogue interface to explore your process data," said Yu.
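As a rough illustration of such a dialogue interface, the sketch below keeps a running history of questions and generated queries so that follow-up questions stay grounded in earlier turns. It reuses the hypothetical `question_to_pql` helper from the sketch above and assumes a placeholder `run_query` function; neither is a real Celonis API.

```python
# Sketch of a dialogue loop over process data, reusing the hypothetical
# question_to_pql helper above. run_query is also a placeholder, not a real API.

def run_query(pql: str):
    """Placeholder: execute a PQL expression against the process data model."""
    raise NotImplementedError

def explore(schema: dict[str, list[str]]) -> None:
    history: list[tuple[str, str]] = []  # earlier (question, PQL) turns
    while True:
        question = input("Ask about your process (blank line to quit): ").strip()
        if not question:
            break
        # Feed earlier turns back in so follow-ups like "now split that by region"
        # are interpreted in context.
        context = "\n".join(f"Q: {q}\nPQL: {p}" for q, p in history)
        pql = question_to_pql(f"{context}\n{question}".strip(), schema)
        history.append((question, pql))
        print(pql)
        print(run_query(pql))
```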
The need for a business translation layer. Yu said a business translation layer could take the process knowledge within Celonis and make it more consumable by an LLM:
"The translation layer is essentially about how do you take a large language model that doesn't understand the enterprise data you care about and convert it into a model that actually understands, for example, you have an SAP ECC schema for your databases, and can actually answer questions with clear understanding what an invoice is."
Ultimately, the challenge is developing an LLM that is aware of the enterprise setting and understands the business data, he said. "We are standardizing the process knowledge with a common ontology so you actually learn from many different companies what the process happy path should be," said Yu.
Cassiano said the combination of ontology and process knowledge would become an abstraction layer that could power an LLM interface. “To succeed, ontologies need to have a business domain and industry by design aspect. At Celonis, we believe strongly in defining common ontologies that can serve customers across the same industries and business domains,” said Cassiano.
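As a thought experiment, a single entry in such a shared ontology might carry the industry, business domain, reference happy path, and benchmark figures for a process, which an LLM interface could then draw on. The dataclass below is an assumption made for illustration, not a Celonis data structure, and the example values are invented.

```python
# Hypothetical shape of one entry in a shared process ontology; not a Celonis
# data structure, just an illustration of the kind of knowledge an LLM
# interface could draw on.

from dataclasses import dataclass, field

@dataclass
class ProcessConcept:
    name: str                      # e.g. "Accounts Payable"
    industry: str                  # e.g. "Retail"
    business_domain: str           # e.g. "Finance"
    happy_path: list[str]          # reference sequence of activities
    benchmarks: dict[str, float] = field(default_factory=dict)  # KPI -> target

ACCOUNTS_PAYABLE = ProcessConcept(
    name="Accounts Payable",
    industry="Retail",
    business_domain="Finance",
    happy_path=["Create PO", "Receive Goods", "Receive Invoice",
                "Match Invoice", "Pay Invoice"],
    benchmarks={"touchless_invoice_rate": 0.75, "days_payable_outstanding": 45.0},
)
```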
The human using Celonis will be in charge, not LLMs. "The driver will always be the user of Celonis—the human that is in charge and assisted by a generative AI navigator," he said. For enterprise use cases, this navigator and assistant theme will be important. "We don't want to diminish the human aspect, which I believe is very important," he said.