TensorFlow is the machine-learning framework the SPOC dev team used to build a story-estimation tool for the Now Platform. The tool is an example of supervised machine learning, where the model is trained on examples that contain labels. Before we walk through the tasks involved in solving this specific problem, let’s dive into the algorithm itself and the overall approach.

**Neural Network algorithm (NN)**

A Neural Network (NN) is an abstract structure that tries to mimic the behavior of elementary brain components – neurons. In general, a network is made up of many units, each able to communicate with neurons in the adjacent layers. The most basic type of NN is built with only 2 layers of neurons, called “Input” and “Output”.

In our case, we also used 2 hidden layers, located between “In” and “Out”. In a sense, the hidden layers are where the NN stores information about how important each neuron activated by the “In” signal is to a specific “Out” neuron. This knowledge is expressed through special values called “weights”.

**Neural Network weights**

Each neuron has as many weights as there are neurons that can propagate a signal to it. During the training process, the NN adjusts each weight according to how much it contributes to the final accuracy, across different combinations of “In” signals and the “Out” values correlated with them. This learning step is called backpropagation.

Analyzing this structure in the context of our task: the number of neurons in the “In” layer equals the number of words defined in the dictionary set. The last, “Out” layer has only 8 elements, corresponding to the 8 different possible answers interpreted as estimation values.
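As a sketch of these dimensions, assuming a hypothetical dictionary of 5,000 words and two hidden layers of 64 neurons each (the actual sizes are not given in this article), the weight-matrix shapes would be:

```python
# Hypothetical layer sizes: the dictionary size and hidden-layer widths
# below are illustrative assumptions, not the team's actual values.
DICT_SIZE = 5000   # "In" neurons = number of words in the dictionary
HIDDEN = [64, 64]  # two hidden layers
OUTPUT = 8         # "Out" neurons = 8 possible estimate values

sizes = [DICT_SIZE] + HIDDEN + [OUTPUT]
# Each layer's weight matrix maps the previous layer onto the next one,
# so each neuron holds one weight per neuron that can signal it.
weight_shapes = [(sizes[i + 1], sizes[i]) for i in range(len(sizes) - 1)]
print(weight_shapes)  # [(64, 5000), (64, 64), (8, 64)]
```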

The NN that results from this step has:

- a defined input and output size for each hidden layer,
- properly selected activation functions (not discussed here, to keep this article as simple as possible),
- adjusted (trained) weights,

and is called a model.

From a mathematical perspective, the model is more or less just an ordered set of matrices. For our network with two hidden layers, its output can be written as:

y = f3(A3 · f2(A2 · f1(A1 · x + B1) + B2) + B3)

Where:

- x is the input vector,
- the f-s are different activation functions,
- the A-s are matrices where the weights are stored,
- the B-s are vectors where the biases are stored.
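This composition can be sketched in plain numpy. The random weights, the layer sizes, and the choice of ReLU for the hidden layers and softmax for the output are all assumptions here – the article deliberately does not name the activation functions:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Ordered set of weight matrices (A) and bias vectors (B).
# Sizes are illustrative: 5000 dictionary words, two hidden layers of 64.
A1, B1 = rng.normal(size=(64, 5000)) * 0.01, np.zeros(64)
A2, B2 = rng.normal(size=(64, 64)) * 0.01, np.zeros(64)
A3, B3 = rng.normal(size=(8, 64)) * 0.01, np.zeros(8)

x = rng.integers(0, 2, size=5000).astype(float)  # binary input vector

# y = f3(A3 · f2(A2 · f1(A1 · x + B1) + B2) + B3)
y = softmax(A3 @ relu(A2 @ relu(A1 @ x + B1) + B2) + B3)
print(y.shape)  # (8,) – one probability per possible estimate value
```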

**7-step story estimation with TensorFlow**

The solution our dev team built with TensorFlow applies the algorithm described above to estimate Now Platform stories in terms of time and resources. Its architecture consists of the following steps:

**STEP 1: Data preparation and preprocessing**

The data prepared and preprocessed in our case are the acceptance criteria and description of every story. We removed stories written in a language other than English, and also filtered out stories with an estimate (which is our label) higher than 8 hours, to obtain a proper training data set.
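A minimal sketch of this filtering step, assuming each story is a record with hypothetical `language` and `estimate` fields (the real tool reads stories from the Now Platform):

```python
# Hypothetical story records; the field names are illustrative assumptions.
stories = [
    {"text": "Add login button", "language": "en", "estimate": 4},
    {"text": "Dodaj przycisk",   "language": "pl", "estimate": 2},
    {"text": "Rewrite importer", "language": "en", "estimate": 16},
]

# Keep only English stories whose label (estimate) is at most 8 hours.
training_set = [
    s for s in stories
    if s["language"] == "en" and s["estimate"] <= 8
]
print(len(training_set))  # 1
```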

**STEP 2: Dictionary development**

The dictionary is built with the use of Natural Language Processing methods, and is meant to establish the occurrence of certain words in the acceptance criteria and description of a story. The words are filtered by length (longer than 3 characters) and part of speech (nouns and verbs). The following NLP methods are used:

- normalization,
- tokenization,
- lemmatization.

The output is the resulting dictionary of words.
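A minimal sketch of such a dictionary build. The lemma table and the part-of-speech set below are simplified stand-ins labeled as assumptions; in practice an NLP library such as NLTK or spaCy would perform lemmatization and POS tagging:

```python
import re

texts = [
    "The user clicks the Submit button and sees a confirmation.",
    "Clicking Submit stores the confirmation records.",
]

# Simplified stand-ins for real NLP resources (assumptions, not the
# team's actual pipeline): a tiny lemma table and a noun/verb whitelist.
LEMMAS = {"clicks": "click", "clicking": "click", "sees": "see",
          "stores": "store", "records": "record"}
NOUNS_AND_VERBS = {"click", "submit", "button", "see",
                   "confirmation", "store", "record"}

dictionary = []
for text in texts:
    normalized = text.lower()                    # normalization
    tokens = re.findall(r"[a-z]+", normalized)   # tokenization
    for tok in tokens:
        lemma = LEMMAS.get(tok, tok)             # lemmatization
        # Keep words longer than 3 characters that are nouns or verbs.
        if len(lemma) > 3 and lemma in NOUNS_AND_VERBS and lemma not in dictionary:
            dictionary.append(lemma)

print(dictionary)
```

Note how “see” is dropped by the length filter and duplicates collapse to one entry each.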

**STEP 3: Model development**

The model is developed with the use of machine learning. In this step, it learns the structure of the training data set in order to make predictions about unseen data. This in turn requires deciding on the:

- number of neural network layers,
- number of epochs,
- loss and gradient function,
- optimization algorithm.
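For a multi-class problem like this one, a typical loss choice is categorical cross-entropy – the article does not name the team's exact choice, so this is an illustrative assumption:

```python
import numpy as np

def cross_entropy(probs, label):
    """Categorical cross-entropy for a single example.

    probs: predicted probability for each of the 8 estimate classes.
    label: index (0-7) of the true estimate class.
    """
    return -np.log(probs[label])

probs = np.array([0.05, 0.05, 0.6, 0.1, 0.1, 0.05, 0.03, 0.02])
print(round(cross_entropy(probs, 2), 3))  # 0.511 – high prob on true class
print(round(cross_entropy(probs, 7), 3))  # 3.912 – low prob on true class
```

The optimizer then nudges the weights in the direction that lowers this loss, epoch after epoch.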

**STEP 4: Training loop**

The training loop starts with each description and each set of acceptance criteria being translated separately into binary vectors. A binary vector marks which dictionary words occur in the text (see the picture below). The vectors are processed by the neural network together with their labels (estimate values), and once the model is trained and “proven”, it is ready to make predictions on unseen data.
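The text-to-vector translation might be sketched like this (the dictionary and the sample text are illustrative; in the real tool the dictionary comes from step 2):

```python
# Illustrative dictionary; in the real tool it is built in step 2.
dictionary = ["click", "submit", "button", "confirmation", "store", "record"]

def to_binary_vector(text, dictionary):
    """Mark with 1 every dictionary word that occurs in the text."""
    tokens = text.lower().split()
    return [1 if word in tokens else 0 for word in dictionary]

vec = to_binary_vector("Click the Submit button", dictionary)
print(vec)  # [1, 1, 1, 0, 0, 0]
```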

**STEP 5: Value prediction**

To start predicting values, the description and acceptance criteria are again translated separately into binary vectors. Each vector is then processed by the neural network, which returns a vector of probabilities for the 8 possible answers (numbers representing hours from 1 to 8).

**STEP 6: Estimation**

The final values (the number of hours estimated by the TensorFlow model) for the acceptance criteria and the description are then chosen using a mathematical approximation based on the values neighboring the element with the highest probability. They are calculated with the following formula:

Where:

- Pn represents the probability of the n-th element (1 ≤ n ≤ 8),
- i is the position of the element with the highest probability.
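The original formula image is not reproduced here. Based on the description, one plausible form is a probability-weighted average over the highest-probability element and its immediate neighbors – the sketch below is a reconstruction under that assumption, not the team's exact equation:

```python
def estimate_hours(probs):
    """Probability-weighted average of the top element and its neighbors.

    probs[n-1] is P_n, the probability of an estimate of n hours (1 <= n <= 8).
    NOTE: this is a plausible reconstruction of the approximation described
    in the text, not the team's exact formula.
    """
    i = max(range(1, 9), key=lambda n: probs[n - 1])  # position of the max
    neighbors = [n for n in (i - 1, i, i + 1) if 1 <= n <= 8]
    total = sum(probs[n - 1] for n in neighbors)
    return sum(n * probs[n - 1] for n in neighbors) / total

probs = [0.05, 0.10, 0.50, 0.20, 0.05, 0.05, 0.03, 0.02]
print(estimate_hours(probs))  # a little above 3, pulled up by P_4 > P_2
```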

**STEP 7: Final value**

The final value is obtained by combining the values from the acceptance criteria and the description. It is delivered with the formula below:

The multipliers (0.6 and 0.4) were determined as a result of story preprocessing.
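As a one-line sketch of this combination – which multiplier applies to which partial value is not stated explicitly in the article, so the assignment below is an assumption:

```python
def final_estimate(ac_hours, desc_hours, w_ac=0.6, w_desc=0.4):
    """Combine the two partial estimates with fixed multipliers.

    The 0.6/0.4 split comes from the article; which weight applies to
    which value is an assumption in this sketch.
    """
    return w_ac * ac_hours + w_desc * desc_hours

print(final_estimate(4.0, 6.0))  # 0.6*4 + 0.4*6 ≈ 4.8 hours
```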

**Story estimation for your organization**

If you face any challenges related to project estimation or incident management, consider AI and our Now Platform solution to:

- estimate the delivery time of your projects, epics and stories, from onset to final outcome,
- automate incident handling with classification, reporting and management tools, and automatically assign incidents to a category, group, or responsible person.

Our team is at your disposal.

Authors:

Marceli Tokarski – ServiceNow Developer,

Anna Sobkowiak – ServiceNow Developer,

Adam Goździewski – ServiceNow Consultant,

Agnieszka Milecka – Marketing Manager.
