An evaluation of BERT applied for AIOps

Research output: Contribution to conference › Poster › peer-review


Abstract

BERT (Bidirectional Encoder Representations from Transformers) is a masked-language model often used for natural language processing (NLP) applications such as text classification, named entity recognition, and sentiment analysis. Integrating BERT with AIOps (Artificial Intelligence for IT Operations) helps improve the analysis and processing of IT-related data. IT ticket log analysis and classification is one possible application of BERT for AIOps. Log data created by IT systems, applications, and infrastructure components can be large and complex, making it difficult to extract relevant insights. Considering this, BERT helps to analyse IT service management (ITSM) records to discover patterns and anomalies and extract relevant information, such as error messages, timestamps, and key performance indicators (KPIs). BERT can assist IT teams in promptly identifying and resolving issues, optimising system performance, and proactively preventing incidents or major outages.
The current study is an industrial-based NLP project, intended to provide an advanced multi-modal analytics approach to help predict major incidents in real time in the IT infrastructure of a well-known large corporation operating in the fintech sector, allowing for preventative measures to be implemented. We have used the BERT model to detect behaviour patterns of IT systems and applications based on historical ITSM data. We then utilise this model to detect anomalies in real-time data streams. Our main aim is to provide incident categorisation and prioritisation. By evaluating incident descriptions, BERT automatically prioritises occurrences based on severity, impact, and urgency. For implementing BERT for IT incident prediction, we have prepared the data by pre-processing and tokenising the text using the BERT tokeniser. This involves breaking the text into individual words and assigning them unique numerical IDs. BERT's input representation also includes segment IDs and attention masks, which help the model to understand the context and relationships between different words in the input text.
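The input preparation described above can be illustrated with a minimal pure-Python sketch. The toy vocabulary, the `encode` helper, and the example incident text below are all illustrative placeholders, not the project's actual BERT tokeniser (which uses WordPiece sub-word splitting over a vocabulary of roughly 30,000 entries):

```python
# Minimal sketch of BERT-style input preparation: token IDs, segment IDs,
# and an attention mask. The vocabulary and helper below are illustrative,
# not the real BERT WordPiece tokeniser.

VOCAB = {"[PAD]": 0, "[UNK]": 100, "[CLS]": 101, "[SEP]": 102,
         "disk": 1, "failure": 2, "on": 3, "server": 4, "high": 5, "latency": 6}

def encode(text: str, max_len: int = 8):
    """Map words to IDs, add the [CLS]/[SEP] special tokens, pad to a fixed
    length, and build the segment IDs and attention mask."""
    ids = ([VOCAB["[CLS]"]]
           + [VOCAB.get(w, VOCAB["[UNK]"]) for w in text.lower().split()]
           + [VOCAB["[SEP]"]])[:max_len]
    attention_mask = [1] * len(ids) + [0] * (max_len - len(ids))  # 1 = real token, 0 = padding
    segment_ids = [0] * max_len  # single-sentence input, so every token is segment 0
    ids += [VOCAB["[PAD]"]] * (max_len - len(ids))
    return ids, segment_ids, attention_mask

ids, segments, mask = encode("disk failure on server")
print(ids)   # [101, 1, 2, 3, 4, 102, 0, 0]
print(mask)  # [1, 1, 1, 1, 1, 1, 0, 0]
```

The attention mask tells the model which positions are real tokens versus padding, while the segment IDs distinguish the two sentences in BERT's paired-input tasks (all zeros here, since an incident description is a single segment).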
Next, the dataset is split into training, validation, and testing sets. The training set is used to train the BERT model using a technique called fine-tuning, which involves training the model on the specific task of incident prediction using the labelled incident reports. During fine-tuning, the weights of the pre-trained BERT model are adjusted to optimise the model's performance on the incident prediction task and to learn the underlying patterns and relationships in the text. This involves adjusting the model's parameters to minimise the difference between its predictions and the actual incident labels in the training data. After fine-tuning, the BERT model can predict the likelihood of an incident occurring for new, unseen text data: the pre-processed text is passed through the model, which outputs a probability score indicating the likelihood of an incident occurring. Finally, the performance of the BERT model is evaluated on the testing set using metrics such as accuracy and the AUC-ROC score.
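The evaluation step can be sketched as follows. The labels and probability scores below are made-up placeholders, not results from the study; the AUC-ROC is computed via the rank (Mann-Whitney) formulation, which is equivalent to what libraries such as scikit-learn report:

```python
# Sketch of evaluating a binary incident predictor: accuracy and AUC-ROC
# from the probability scores a fine-tuned model would emit on the test set.
# Labels and scores are illustrative placeholders, not project results.

def accuracy(labels, probs, threshold=0.5):
    """Fraction of examples where thresholding the score matches the label."""
    preds = [1 if p >= threshold else 0 for p in probs]
    return sum(p == y for p, y in zip(preds, labels)) / len(labels)

def auc_roc(labels, probs):
    """AUC-ROC via the rank formulation: the probability that a randomly
    chosen positive example scores higher than a randomly chosen negative."""
    pos = [p for p, y in zip(probs, labels) if y == 1]
    neg = [p for p, y in zip(probs, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 0, 1, 0, 1]            # 1 = a major incident occurred
probs  = [0.9, 0.2, 0.7, 0.6, 0.8]  # model's probability scores
print(accuracy(labels, probs))  # 0.8
print(auc_roc(labels, probs))   # 1.0
```

Accuracy depends on the chosen decision threshold, whereas AUC-ROC summarises ranking quality across all thresholds, which is why both are reported together for imbalanced tasks like incident prediction.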
Our research aids IT operations teams in concentrating their efforts and resources more effectively by streamlining incident management processes. Besides BERT, we have compared state-of-the-art transformer models such as ERNIE and RoBERTa. The results demonstrate a significant improvement in reducing mean time to resolution (MTTR) for IT incident outages.
Original language: English
Publication status: Published online - 24 May 2023
Event: The First UK AI Conference 2023 - Turing AI Fellowship Event - Natural Science Museum, London, United Kingdom
Duration: 24 May 2023 - 25 May 2023
Conference number: 1st
https://uk-ai.org/ukai2023/
