"Unlocking the power of Artifi­cial Intel­l­igence: How Machine Learning and Knowl­edge Graphs work together."

By: Tanuja Gupta, Solutions Architect

Artificial Intelligence (AI) has been a buzzword for quite some time now. From natural language processing to Machine Learning, AI has been applied in various domains, and Scania has scaled up the Machine Learning side of AI, using these capabilities to solve day-to-day problems. However, one aspect of AI that Scania has started to investigate is semantic reasoning.

I define semantic reasoning as the process of logically deriving conclusions from the knowledge, or ontology, in a Knowledge Graph that is encoded in a machine-understandable format.

 

So today we discuss how Knowledge Graphs (KGs) and Machine Learning (ML) can work wonders together, why this combination is critical to the success of AI applications, and where Scania is on this journey.

Semantic reasoning and the difference between ML and Knowledge Graphs

Semantic reasoning with Knowledge Graphs has been around for quite some time. However, it has not received as much attention as other AI technologies, and in my opinion this is because semantic reasoning requires a Knowledge Graph, or an ontology, that represents the domain knowledge to be built first. Creating and maintaining such a graph can be challenging, especially since most of our customers today need a quick solution. The benefits, however, are immense. Knowledge Graphs help AI applications understand the context of the questions users ask and provide more accurate and relevant results. While doing my Data Scientist course at Scania, I came to appreciate how important it is for anyone doing analysis to understand the context and meaning of the data: we were five people in the team, and we had five different interpretations of the same data. Imagine basing all your analysis on assumptions.

Let's now try to understand the difference between ML & Knowledge Graph approaches. Machine Learning is a type of artificial intelligence that involves training a computer program to learn from data. The program is fed with a large amount of data, and it uses statistical techniques to learn patterns and relationships in the data. The program can then make predictions or decisions based on what it has learned from the data.

On the other hand, learning with Knowledge Graphs involves representing knowledge about a domain in a structured way using a Knowledge Graph or ontology. A Knowledge Graph is like a map of the domain, with entities as nodes and relationships between entities as edges. Learning with Knowledge Graphs means using logical rules and reasoning to derive new knowledge from the existing graph, which we call inferencing.
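To make inferencing concrete, here is a minimal sketch in Python using the rdflib and owlrl libraries (the entity and class names are my own illustration). A single subclass axiom lets the reasoner derive a fact that was never stated explicitly:

```python
from rdflib import Graph, Namespace, RDF, RDFS
from owlrl import DeductiveClosure, RDFS_Semantics

EX = Namespace("http://example.org/")
g = Graph()
g.bind("ex", EX)

# Ontology (TBOX): every Truck is a Vehicle
g.add((EX.Truck, RDFS.subClassOf, EX.Vehicle))

# Data (ABOX): one explicit fact about an individual
g.add((EX.MyTruck, RDF.type, EX.Truck))

# Apply RDFS reasoning: inferred triples are added to the graph
DeductiveClosure(RDFS_Semantics).expand(g)

# The reasoner has derived that MyTruck is also a Vehicle,
# even though we never stated that directly.
print((EX.MyTruck, RDF.type, EX.Vehicle) in g)  # True
```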

How does it work practically?

To understand this better, I would like to illustrate the difference using the example of a restaurant recommendation system. So, a Machine Learning-based recommendation system would be trained on a large dataset of user reviews and ratings of restaurants. It would learn patterns and relationships in the data, such as which restaurants are popular, which types of cuisine are preferred, and which price ranges are more popular. Based on this learning, the recommendation system would suggest restaurants to users based on their preferences.
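As a rough sketch of the Machine Learning route, the snippet below uses plain Python with NumPy on a tiny made-up rating matrix (a real system would use far more data and a proper recommendation library). It learns user-to-user similarity from ratings and suggests a restaurant that similar users liked:

```python
import numpy as np

# Rows = users, columns = restaurants; 0 means "not rated yet"
ratings = np.array([
    [5, 3, 0, 1],   # user 0
    [4, 0, 0, 1],   # user 1
    [1, 1, 5, 4],   # user 2
], dtype=float)
restaurants = ["Trattoria", "Sushi Bar", "Taco Place", "Diner"]

def cosine(a, b):
    """Cosine similarity between two rating vectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)

def recommend(user: int) -> str:
    """Recommend the unrated restaurant best liked by similar users."""
    sims = np.array([cosine(ratings[user], ratings[other])
                     for other in range(len(ratings))])
    sims[user] = 0.0                      # ignore self-similarity
    scores = sims @ ratings               # similarity-weighted ratings
    scores[ratings[user] > 0] = -np.inf   # skip already-rated places
    return restaurants[int(np.argmax(scores))]

print(recommend(1))  # user 1 gets a pick based on users with similar taste
```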

 

On the other hand, a Knowledge Graph-based restaurant recommendation system excels at structuring restaurant-related information within a coherent web of knowledge. This Knowledge Graph connects entities like restaurants, cuisines, price ranges, and user preferences, allowing for rich semantic relationships. For instance, it links restaurants to cuisines, indicating which establishments offer specific types of cuisine. What sets this approach apart is its ability to employ logical rules and reasoning to deduce valuable insights, such as identifying popular restaurants based on their cuisine and location, and to provide tailored recommendations based on user preferences. Traditional RDBMS systems, on the other hand, would struggle to deliver such subtle, context-aware recommendations due to their limitations in handling complex semantic relationships and reasoning.
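And here is a hedged sketch of the Knowledge Graph route, again with rdflib and invented entities: instead of learning statistical weights, we state facts explicitly and let a SPARQL query combine cuisine, location, and the user's preferences.

```python
from rdflib import Graph, Namespace, RDF

EX = Namespace("http://example.org/")
g = Graph()
g.bind("ex", EX)

# ABOX facts: a restaurant, its cuisine and location, and one user's preferences
g.add((EX.LaTrattoria, RDF.type, EX.Restaurant))
g.add((EX.LaTrattoria, EX.servesCuisine, EX.Italian))
g.add((EX.LaTrattoria, EX.locatedIn, EX.Sodertalje))
g.add((EX.Alice, EX.prefersCuisine, EX.Italian))
g.add((EX.Alice, EX.livesIn, EX.Sodertalje))

# Recommend restaurants matching both the user's preferred cuisine and her city
query = """
PREFIX ex: <http://example.org/>
SELECT ?restaurant WHERE {
    ex:Alice ex:prefersCuisine ?cuisine ;
             ex:livesIn        ?city .
    ?restaurant ex:servesCuisine ?cuisine ;
                ex:locatedIn     ?city .
}
"""
for row in g.query(query):
    print(row.restaurant)   # -> http://example.org/LaTrattoria
```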

Furthermore, traditional RDBMS systems would face challenges in adapting to evolving data structures and accommodating new attributes, making it difficult to handle the dynamic nature of restaurant recommendations. They would also struggle to incorporate external data sources or adapt to changing user preferences and restaurant trends in real time, which a Knowledge Graph handles far more naturally.

How do we work with this at Scania?

So now let's place this in the context of Scania. Scania produces many different types of products, and different software applications and departments hold a lot of data about our production processes, the raw materials we use, the equipment in use, and the products that we produce.

To help us make sense of all this data, we (domain experts together with an Ontology Architect) start by identifying the key concepts in the domain, such as raw materials, products, production processes, and equipment.

Then we create nodes in the Knowledge Graph for each of these concepts and connect them together with relationships. For example, we might connect a raw material node to a production process node to indicate that the raw material is used in that process.
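As a minimal sketch of what those nodes and relationships could look like (the namespace, class, and property names below are my own illustration, not Scania's actual ontology):

```python
from rdflib import Graph, Namespace, RDF

SC = Namespace("http://example.org/production/")   # hypothetical namespace
g = Graph()
g.bind("sc", SC)

# Concepts become typed nodes...
g.add((SC.SteelCoil_42, RDF.type, SC.RawMaterial))
g.add((SC.Stamping, RDF.type, SC.ProductionProcess))
g.add((SC.PressLine_7, RDF.type, SC.Equipment))
g.add((SC.CabPanel, RDF.type, SC.Product))

# ...and relationships become edges between them
g.add((SC.Stamping, SC.usesRawMaterial, SC.SteelCoil_42))
g.add((SC.Stamping, SC.performedOn, SC.PressLine_7))
g.add((SC.Stamping, SC.produces, SC.CabPanel))

print(g.serialize(format="turtle"))
```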

As we continue to add more data to the Knowledge Graph, we start to see patterns and relationships that we didn't notice before. For example, we might discover that a certain raw material is causing problems in multiple production processes, or that a certain piece of equipment is particularly efficient at producing certain types of products. 

 

With this knowledge, I think we can make better decisions about how to optimize our production processes, which raw materials to use, and which equipment to invest in. We can also use the Knowledge Graph to quickly answer questions about our production processes, such as which products use certain raw materials or which equipment is responsible for a particular step in the production process.
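For instance, a question like "which products use this raw material?" becomes a short SPARQL query, continuing with the illustrative production graph g built in the sketch above:

```python
# Continuing with the production graph g from the previous sketch
query = """
PREFIX sc: <http://example.org/production/>
SELECT ?product WHERE {
    ?process sc:usesRawMaterial sc:SteelCoil_42 ;
             sc:produces        ?product .
}
"""
for row in g.query(query):
    print(row.product)   # -> http://example.org/production/CabPanel
```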

Project example of using Knowledge Graphs and AI

So now let's talk about an actual running project that our team, in collaboration with others, is actively working on:

 

"A risk analysis scenario involving the impact of a natural calamity, such as an earthquake, on a supplier"

 

The current challenge we face in the purchasing department is obtaining detailed and comprehensive information about this scenario. Unfortunately, our purchasers' only sources of information are news reports and public announcements. While these sources provide general insights about earthquake occurrences and potential consequences, they often lack the specific and detailed data necessary for a thorough analysis of the supplier's situation. To address this issue, we have proposed the use of a Knowledge Graph approach. By incorporating diverse data sources and creating a comprehensive understanding of the scenario, we aim to identify issues in the supply chain.

 

At present, we are creating the ontology (TBOX) for this use case in collaboration with domain experts. Simultaneously, we are populating the Knowledge Graph's ABOX with relevant data extracted from various sources (in-house applications, the data lake, and Google/external APIs). The TBOX includes entities such as the supplier, encompassing their manufacturing facilities, supply chain, and location. Additionally, it includes data about the earthquake itself, such as magnitude and location.
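To give a feel for what such a TBOX can contain, here is a small, simplified sketch in Turtle loaded via rdflib; the classes and properties are my own illustration, not the actual ontology we are building:

```python
from rdflib import Graph

# A simplified, illustrative slice of the TBOX, expressed in Turtle
tbox = """
@prefix ex:   <http://example.org/risk/> .
@prefix owl:  <http://www.w3.org/2002/07/owl#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .

ex:Supplier              a owl:Class .
ex:ManufacturingFacility a owl:Class .
ex:Earthquake            a owl:Class .

ex:hasFacility  a owl:ObjectProperty ;
    rdfs:domain ex:Supplier ;
    rdfs:range  ex:ManufacturingFacility .

ex:locatedIn    a owl:ObjectProperty .
ex:hasMagnitude a owl:DatatypeProperty ;
    rdfs:domain ex:Earthquake .
"""

g = Graph()
g.parse(data=tbox, format="turtle")
print(len(g), "TBOX statements loaded")
```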

 

The goal of this endeavour is to fully populate the Knowledge Graph, enabling us to overcome the challenge of integrating data from news reports into our supply chain analysis. Moreover, by leveraging the power of the Knowledge Graph, we can pave the way for future capabilities.  
For example, once the Knowledge Graph is fully populated, we will be able to conduct simulations and analyze different scenarios. This will allow us to model the potential consequences of the earthquake on the supplier's manufacturing facilities, supply chain, and departments. Such analysis will provide us with insights to assess the overall risk associated with conducting business with that supplier in the context of an earthquake.
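Once the ABOX is populated, parts of such a scenario analysis can boil down to queries along these lines. This is a hedged, self-contained sketch with invented data and a deliberately simplistic "same region" notion of impact:

```python
from rdflib import Graph, Namespace, Literal, RDF
from rdflib.namespace import XSD

EX = Namespace("http://example.org/risk/")
g = Graph()
g.bind("ex", EX)

# Tiny illustrative ABOX: one supplier, one facility, one earthquake
g.add((EX.SupplierA, RDF.type, EX.Supplier))
g.add((EX.SupplierA, EX.hasFacility, EX.PlantNorth))
g.add((EX.PlantNorth, EX.locatedIn, EX.RegionX))
g.add((EX.QuakeEvent1, RDF.type, EX.Earthquake))
g.add((EX.QuakeEvent1, EX.locatedIn, EX.RegionX))
g.add((EX.QuakeEvent1, EX.hasMagnitude, Literal(6.4, datatype=XSD.decimal)))

# Which suppliers have a facility in a region hit by a strong earthquake?
query = """
PREFIX ex: <http://example.org/risk/>
SELECT DISTINCT ?supplier ?facility WHERE {
    ?supplier ex:hasFacility  ?facility .
    ?facility ex:locatedIn    ?region .
    ?quake    a ex:Earthquake ;
              ex:locatedIn    ?region ;
              ex:hasMagnitude ?magnitude .
    FILTER(?magnitude >= 6.0)
}
"""
for row in g.query(query):
    print(row.supplier, "may be affected via", row.facility)
```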

Summary and my conclusions

In conclusion, my journey as a Data Scientist highlighted the synergy between Machine Learning and Knowledge Graphs, emphasizing their importance in AI. Bridging unstructured data and structured knowledge demands a multidisciplinary approach involving data engineering, ontology design, and domain expertise. I'm intrigued by the evolving role of Large Language Models (LLMs) in enhancing natural language understanding within Knowledge Graphs. I'm motivated by the potential of combining Knowledge Graphs, Machine Learning, and LLMs to revolutionize user experiences and AI's future possibilities. This drives me to contribute to this ever-evolving AI landscape.
