How To Ingest Demo Data as a Data Source
Ingesting Demo Data
Step 1: Log in to the workspace. The Home screen is shown below.

Step 2: Click the cfxGenie app. The initial screen displays an empty summary (as shown below):

Step 3: Job Overview
Click the '+' icon at the top-right corner to add data sources. The 'Job Overview' screen is displayed as shown below:

Click 'Next' at the bottom of the 'Job Overview' screen. This takes you to the next step, 'Job Properties'.
Step 4: Job Properties
The 'Job Properties' screen captures the 'Job Details', including the Data Source that you want to ingest into the system. This screen is shown below:

Enter the job details (example input for the job details screen is provided below).
Name: Enter the job name
Description: Enter a job description (optional)
File Source: Select a data source. The application currently supports three types of data sources: Demo Data, File Upload, and ServiceNow. This section covers using the Demo Data provided by CloudFabrix. Select the 'Demo Data' box on the screen, then select the dataset type: Application Tickets, Cloud Tickets, and/or Infrastructure Tickets, depending on the nature of your tickets/incidents ('Application Tickets' is selected by default).
Acceptable Mean Resolution Time: Select the 'Acceptable Mean Resolution Time', which is used for reporting and cost analysis.
Note: The 'Acceptable Mean Resolution Time' may vary based on your business requirements, IT incidents/tickets, and NOC/SOC recommendations.
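As a rough illustration of how an acceptable mean resolution time threshold can feed reporting and cost analysis, the sketch below flags tickets that exceed it. The ticket data, field names, and threshold are hypothetical, not cfxGenie's actual schema.

```python
# Hypothetical sketch: using an acceptable mean resolution time (MTTR)
# threshold for reporting. Ticket records and field names are illustrative.
ACCEPTABLE_MTTR_HOURS = 4.0  # assumed value chosen on the screen

tickets = [
    {"id": "T1", "resolution_hours": 2.5},
    {"id": "T2", "resolution_hours": 6.0},
    {"id": "T3", "resolution_hours": 3.5},
]

mean_mttr = sum(t["resolution_hours"] for t in tickets) / len(tickets)
breaches = [t["id"] for t in tickets
            if t["resolution_hours"] > ACCEPTABLE_MTTR_HOURS]

print(f"Mean resolution time: {mean_mttr:.1f}h")
print(f"Tickets over the acceptable MTTR: {breaches}")
```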
Click 'Next' to go to the next screen.
Step 5: User Inputs

The 'User Inputs' screen maps the underlying data source fields for the demo data. Mandatory fields are populated automatically by the application. You can update the other fields if required, or leave the defaults as they are. Mapping additional fields on this screen allows the application to generate additional reports, for example by sub-ticket type or ticket classification.
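Conceptually, the mapping captured on this screen can be thought of as a dictionary from application fields to source columns. The sketch below is purely illustrative; all field and column names are assumptions, not cfxGenie's actual schema.

```python
# Illustrative field mapping of the kind a 'User Inputs' screen captures.
# Keys are assumed application fields; values are assumed source columns.
field_mapping = {
    # mandatory fields (auto-populated by the application)
    "ticket_id": "number",
    "description": "short_description",
    "created_at": "opened_at",
    "resolved_at": "closed_at",
    # optional fields that enable additional reports
    "sub_ticket_type": "subcategory",
    "classification": "u_classification",
}

mandatory = {"ticket_id", "description", "created_at", "resolved_at"}
optional = sorted(set(field_mapping) - mandatory)
print("Optional fields mapped:", optional)
```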
Click 'Next' to go to the next screen.
Step 6: Clustering Inputs
Select the appropriate mapping field(s) to cluster the input fields. If the fields were mapped on the previous screen, they are populated automatically. These mappings are important for clustering analysis.
Note: Clustering is the task of dividing data points into groups such that points in the same group are more similar to one another than to points in other groups. In other words, objects are grouped on the basis of similarity and dissimilarity, using ML/AI algorithms.
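As a generic illustration of similarity-based grouping (not CloudFabrix's actual algorithm), the sketch below clusters ticket descriptions greedily by text similarity using the standard library's difflib. The descriptions and threshold are made up for the example.

```python
# Minimal similarity-based clustering sketch: tickets whose descriptions
# are similar enough to a cluster's first member join that cluster.
from difflib import SequenceMatcher

def similarity(a, b):
    # Ratio of matching characters between the two strings (0.0 to 1.0)
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

descriptions = [
    "Database connection timeout on app server",
    "DB connection timeout on application server",
    "User unable to reset password",
    "Password reset link not working",
]

THRESHOLD = 0.5  # assumed cut-off; real clustering is tuned per dataset
clusters = []    # each cluster is a list of similar descriptions
for desc in descriptions:
    for cluster in clusters:
        if similarity(desc, cluster[0]) >= THRESHOLD:
            cluster.append(desc)
            break
    else:
        clusters.append([desc])

print(f"{len(clusters)} clusters found")
```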

Click 'Next' to navigate to the 'Cost Inputs' screen.
Step 7: Cost Inputs
Cost inputs have a direct impact on cost analysis, so update the L1, L2, and L3 rates per hour. CloudFabrix provides default values in line with industry standards, but you can modify these rates to suit your IT/business needs. The screen is shown below.

Once the L1, L2, and L3 rates and the currency type are updated, click 'Finish'. The data is ingested into the application, and ML algorithms are applied to generate the necessary analytics data.
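As a rough sketch of how per-hour L1/L2/L3 rates drive the cost analysis, the example below totals resolution cost by support tier. The rates and ticket data are illustrative assumptions, not CloudFabrix's actual defaults.

```python
# Hypothetical cost analysis: total cost = hours spent * tier rate.
rates_per_hour = {"L1": 30.0, "L2": 50.0, "L3": 80.0}  # assumed rates

# (support tier, hours spent) per resolved ticket -- illustrative data
tickets = [("L1", 2.0), ("L1", 1.5), ("L2", 3.0), ("L3", 4.0)]

total_cost = sum(rates_per_hour[tier] * hours for tier, hours in tickets)
print(f"Total resolution cost: {total_cost:.2f}")
```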
Step 8: Depending on the data size, it may take some time before reports are accessible. Click the 'refresh' button on the job screen (top right-hand side). As soon as the job is completed, the status changes to 'Completed', as shown below.
