Splunk – Introduction to Reporting – Alerts – Dashboards

Splunk introduction - notes!

Splunk is often described as a Google-like search engine for logs. Correlation of data from many sources is one of the key reasons to use Splunk.

Flexible data pipeline - almost any type of data can be brought into the platform, extracted, formatted, and made searchable.

Quick search, time normalization, and a powerful query language set it apart from competitors.

Ad hoc search - generally considered inefficient compared with other types of searches.
You are exploring to find a problem and may or may not find it after the search; if many people run ad hoc searches at the same time, efficiency can suffer. It is done to get a feel for the data and to pin down the issue we are looking for - both issues that are already known and a few the user is seeing for the first time.

Scheduled search - you already know the problem, so you search at fixed time intervals, which is efficient. Schedule these runs so that the impact on the system stays low. | Real-time search - searches data as it happens and has a heavy impact on the environment; do not perform any real-time search without approval.

Licensing model - earlier, Splunk charged on the amount of data being brought in, so teams usually filtered the data before indexing, which hurt correlation.

Workload pricing model - charges are based on the compute used on the platform, not on the data loaded in; the more computation you use, the higher the licensing cost.

  1. Major features of Splunk Enterprise:
  2. Index - a bucket of data. As data enters, it is inspected, matched to a source type, and broken into single events, which are timestamped and stored in Splunk indexes so they can be searched. Access can be scoped per index, e.g. network logs to one index, application logs to another.
  3. Indexes can also drive data retention, e.g. 30 days, 60 days.
  4. A single Splunk search can span different source types.
  5. Search - monitor - alert.
  6. One can create alerts and monitor specific conditions.
  7. Allows you to collect results into visualizations and dashboards.

Web Interface

  1. Apps - sit on top of a Splunk instance; an app can also be called a workspace.
  2. Roles - decide what a user can see, do, or interact with.
    1. Administrator: the most powerful role in the list; can install apps, ingest data, and create knowledge objects for all users.
    2. Power User: can create and share knowledge objects for users of an app and run real-time searches; this is generally the role people get in order to create alerts and dashboards.
    3. User: can see only their own knowledge objects and those shared with them.

After logging in to Splunk Enterprise, two apps are present by default, and many more can be picked from Splunkbase!

  1. Home app - manages other apps and gives a quick space to create a custom dashboard as the default.
    Admins can also add apps from the Home app.
  2. Search & Reporting app: provides the default interface for searching and analyzing data, and has these components:
    1. Splunk bar -> edit settings -> view messages -> monitor the progress of search jobs
    2. App bar
    3. Search bar - used to run searches
    4. Time range picker - limits events to a specific window, e.g. 60 min, 1 day, 4 days - do not run overly long time-range searches
    5. Histogram - shows the events occurring in the selected period
    6. How-to-search panel
    7. Data Summary button, which breaks the data down by:
      1. host (IP address, domain name)
      2. source (path/filename)
      3. source type (classification of data)
  3. Table view-
  4. Search history - old searches can be found with the filter option and re-run again; you can also see, across a specific timeline, how many runs have been made.
  5. Rolling over (hovering on) part of an event highlights it, and the highlighted text can be added to the search.
  6. For example, "failed password" can be added to the search this way; clicking the highlight again removes it from the search.
  7. Drop-down for event actions.
  8. Events can be expanded by clicking the arrow next to them. The data is in key-value pairs.
  9. The admin team will have to do the field extractions, so that the key-value pairs are extracted into the proper format. Manual extractions based on expressions limit filtering in the later parts of the search, so as a best practice do as much searching and filtering in the base search as possible.
  10. Keyword search - e.g. the keyword "error" is given to Splunk and it searches across all events for that keyword.
  11. Raw results (e.g. text lifted from a PDF) may not be formatted as expected; to format results properly, the pipe character | is used to pass them to formatting commands.
  12. table command - the fields mentioned are shown as a table.
  13. fields command - removes fields or orders fields in a particular way.
  14. top - finds the most common values of the given field, with count and percentage distribution.
  15. top results are easy to visualize - just click on Visualization over the data searched.
  16. rare - the opposite of top.
  17. stats - enables users to calculate statistics.
  18. sum - a stats function that totals a numeric field.
  19. AS - a clause that renames the result field.
  20. BY - a clause that groups results, e.g. count by field.
  21. eval - used to create an extra column with a default value or a formula evaluated from the values, e.g. eval abdc=if(x<5000, 8000, abdc)
  22. timechart - buckets the results over the time range selected in the time picker.
  23. span - can be used with the timechart command to chunk the time intervals, e.g. for trends.
  25. Base search - everything before the pipe; mostly index, source, source type, host.
  26. Transforming search - everything after the pipe | symbol, written after the base search.
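Putting several of the commands above together, here is a hedged SPL sketch; the index, sourcetype, and field names are hypothetical:

```
index=web sourcetype=access_combined
| stats count AS hits BY host
| sort - hits
| eval level=if(hits<5000, "normal", "high")
```

The base search is everything before the first pipe; each command after a pipe transforms the results of the previous step.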

Search Processing language

  1. Wildcard * - e.g. fail* matches failed, failure, or fails; a wildcard after the string is more efficient than one at the front.
  2. AND, NOT, OR -
    1. e.g. failed password is treated as failed AND password
    2. e.g. failed OR password displays all combinations
    3. The order of evaluation is NOT, then OR, then AND
    4. Parentheses are used to control the order of evaluation
    5. "Failed password" is generally quoted to search for the exact phrase
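As an example combining booleans, a wildcard, parentheses, and a quoted phrase (the sourcetype name is hypothetical):

```
sourcetype=linux_secure (fail* OR invalid) NOT success "password"
```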

Features and terms used in day-to-day Splunk use:

  • Shared search jobs
  • Export results - raw, CSV, XML, JSON
  • Search modes - Fast (no field discovery)
  • Verbose - discovers as many fields as it can
  • Smart (the default mode) - adjusts field discovery based on the type of search
  • Timeline - visual representation of event counts over time; clicking a segment of the timeline shows the events generated at that time.

What is an event? A single timestamped entry; the time shown is based on the time zone in the user's account. The bottom row shows the selected fields.

Other factors used can be noted below:

Add to Search

icon - to open in a new browser window

Clicking on highlighted text can add or remove to search

event actions

field actions

Search Processing language

Wild cards - *

Search terms are not case sensitive

AND, OR, NOT can be used to combine multiple terms, e.g. US OR CA

Order of evaluation: NOT, OR, AND (in that order of preference)

Double quotes " are used to search for exact phrases; a backslash-escaped quote \" matches a literal quote character inside a phrase.

What are commands, functions, clauses, arguments in search terms?

These describe how we want to search - the foundation of search queries.

Commands - what we want to do with the search results: create charts, compute statistics, format output

Functions - explain how we want to compute and evaluate the results

Arguments - the variables or values we apply to the functions

Clauses - how we want the results grouped or defined
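To make these concrete, here is a hedged SPL sketch; the sourcetype and field names are hypothetical. In it, stats is the command, sum() is the function, price is its argument, and AS and BY are clauses:

```
sourcetype=access_combined | stats sum(price) AS total BY host
```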

Below terms can be used in the search

Index

host

Source type

Stats

Count

visits

search Visits >1
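The terms above fit together into one query; a hedged sketch (index, sourcetype, and field names are hypothetical):

```
index=web sourcetype=access_combined
| stats count AS Visits BY host
| search Visits > 1
```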

There are certain admin consoles that not all users may have access to.

Splunk Specialist – List of Roles and Responsibilities

Splunk Specialist with good IT infrastructure skills, in multi-platform environments, ideally familiar with Linux. There are several innovative projects in Splunk, and various companies are looking for qualified administrators with Splunk experience and/or certification. Main responsibilities:
  • Participate in all Splunk company initiatives, both internal projects and customer mandates.
  • Install and configure the necessary components to collect data from DB, log files, API, etc. to Splunk.
  • Install, configure, administer Splunk Enterprise on Windows and Linux.
  • Support Splunk updates.
  • Monitor and identify performance issues.
  • Perform data onboarding in Splunk: data collection, filtering, and transformation (source types, inputs, transforms, etc.);
  • Build use cases: advanced SPL, dashboards, reports, alerts, etc.
  • Always continue to develop product knowledge and act as a product expert.
  • Document best practices.
Qualifications required:
  • Integrating data from various sources (DB, log files, APIs, etc.) into Splunk (on prem or cloud);
  • Experience in CIM modeling in Splunk.
  • Experience in managing indexes and knowledge objects in Splunk.
  • Experience working with cloud offerings such as Azure or AWS.
  • Knowledge of basic security concepts.
  • Experience in access management (RBAC model) in Splunk.
  • Valuable experience in AIX, Linux (RedHat, CentOS) systems administration (permissions management, security (including TLS/SSL), debugging, etc.);
  • Exceptionally good experience in Splunk user support and training.
  • Good knowledge of system virtualization.
  • Good knowledge of server infrastructure.
  • Knowledge of storage, operating systems and networking.
  • Knowledge of Splunk Enterprise Security is an asset.

Python Programming – Operators and data types

The most important foundation in Python is understanding topics like statements, keywords, identifiers, operators, data types, methods, classes, objects, etc.

Let's see the concepts below on the Operators to begin with,

Take 2 variables A and B, with A = 5 and B = 10.

Arithmetic -> Addition +, Subtraction -, Multiplication *, Division /, Modulo %, Floor Division //, Exponentiation **

A+B ->5+10=15

A-B -> 5-10 = -5

Conditional (comparison) operators - less than <, greater than >, less than or equal to <=, greater than or equal to >=, not equal to !=, equal to ==

Boolean data type - True, False (capitalized in Python)

Logical operators - and, or, not (lowercase in Python)

Membership operators - in, not in

Identity operators - is, is not
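The operators above can be tried directly in the Python REPL; a short sketch using the A = 5 and B = 10 values from the notes:

```python
# Arithmetic operators with A = 5 and B = 10 (values from the notes).
A, B = 5, 10

print(A + B)    # 15
print(A - B)    # -5
print(B / A)    # 2.0  (true division always returns a float)
print(B % A)    # 0    (modulo: remainder)
print(B // A)   # 2    (floor division)
print(A ** 2)   # 25   (** is exponentiation)

# Conditional (comparison) operators return booleans
print(A < B)    # True
print(A == B)   # False

# Logical operators are lowercase keywords: and, or, not
print(A < B and B <= 10)   # True

# Membership operators: in, not in
print(5 in [5, 10])        # True

# Identity operators: is, is not
print(A is not None)       # True
```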

 

Appium – Mobile App Automation – Introduction

Appium (appium.io) is an open-source tool used to automate mobile applications on iOS and Android, with support for desktop/Windows platforms as well. It can be used to automate native, web, and hybrid mobile applications:

native - an application built for a mobile OS like iOS or Android
web - an application that runs in a mobile web browser
hybrid - a native application that hosts web content inside a native container or embedded browser

Appium uses the Selenium WebDriver library, and client scripts can be written in multiple programming languages: Java, Python, C#, Ruby, PHP, JavaScript, and Robot Framework.

History: created in 2011 by Dan Cuellar under the name iOSAuto (originally written in C#); made open source under the Apache license in 2012; in 2013, Sauce Labs agreed to support Appium development.

Design of Appium: a Selenium WebDriver script is converted, via the WebDriver wire protocol (JSON wire protocol), into HTTP REST-based requests. These are sent to the Appium server, which runs the requests and drives the automation using the native automation framework on the mobile application in the mobile device. More info is in the Appium documentation here: https://appium.io/docs/en/about-appium/intro/

To install Appium on Windows, the option we are going to use is via Node.js. In the command line, run node -v to check the installed version of Node.js.
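To illustrate the design described above, the sketch below builds (but does not send) the kind of JSON body an Appium client POSTs to the server's session endpoint. The capability names follow the W3C WebDriver convention with the "appium:" vendor prefix, but the exact values here (device name, app path, server URL) are hypothetical and depend on your device and driver:

```python
import json

# Sketch only: the JSON body an Appium client would POST to
# http://127.0.0.1:4723/session to start a session. No request is sent.
new_session_body = {
    "capabilities": {
        "alwaysMatch": {
            "platformName": "Android",
            "appium:automationName": "UiAutomator2",
            "appium:deviceName": "emulator-5554",   # hypothetical device
            "appium:app": "/path/to/app.apk",       # hypothetical app path
        }
    }
}

# The client library serializes this to JSON before sending it over HTTP.
payload = json.dumps(new_session_body)
print(payload)
```

Client libraries such as the Appium Python client build and send requests like this for you; the point here is only that every client command ultimately becomes an HTTP request with a JSON body.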


 

If not available, download the node.js from the official website, Download | Node.js (nodejs.org)

now in command prompt type node -v or npm -v for checking on the versions of node and npm(node package manager)

 

Node and npm are now installed successfully on the machine.

To see the location of the installed npm and node on Windows:

where npm

where node
install appium
npm install -g appium

To verify Appium installation
appium --version or appium -v

to start appium just type in the command prompt as appium

to stop, press Ctrl+C and the Appium server will shut down

Alternatively, install Appium with the Appium Desktop client: based on the OS you are using, select the necessary package and install it.
To start the server, click Start.

Appium Doctor (GitHub - appium/appium-doctor: Tool to verify appium installation)
is the next to be installed: npm install -g appium-doctor

to check if appium-doctor is installed, use appium-doctor --version

to check for android
appium-doctor --android


https://www.browserstack.com/guide/appium-with-python-for-app-testing

Java – Object Oriented Programming – Polymorphism

Polymorphism: one method name used across multiple methods that differ in parameter data types or number of arguments; when called, the method is picked based on the arguments passed in. Example 1: parent class:
//Polymorphism
//Compile-time polymorphism
//Method overloading
//Methods differ in data types and number of arguments
public class Calculator
{
    public static void main(String[] args)
    {
        Calculator calc = new Calculator();
        calc.add(10, 20);
        calc.add(10, 20, 30);
        calc.add(10, 20, 30.5f);
    }

    public void add(int a1, int a2)
    {
        System.out.println(a1 + a2);
    }

    public void add(int a1, int a2, int a3)
    {
        System.out.println(a1 + a2 + a3);
    }

    public void add(int a1, int a2, float a3)
    {
        System.out.println(a1 + a2 + a3);
    }
}
//Method overriding
//Run-time polymorphism

public class Scicalc extends Calculator
{
    public static void main(String[] args)
    {
        Scicalc scicalc = new Scicalc();
        scicalc.add(100, 100);
    }

    //Intentional addition of the method below to override the parent's add
    public void add(int a1, int a2)
    {
        if (a1 > 100 && a2 > 100)
        {
            System.out.println(a1 + a2);
        }
        else
        {
            System.out.println("Please enter numbers greater than 100");
        }
    }
}
Example 2:
//If a class is made final, it cannot be inherited; adding final before
//"class" below would stop Child3 from extending this class.

public class Parent3
{
    //The final keyword makes sure the variable is not updated in the
    //child class; the child class uses only the parent class value
    //and it is not overridden.
    final int pocket_money = 5;

    public static void main(String[] args)
    {
        Parent3 parent3 = new Parent3();
        parent3.watchTV();
    }

    public final void watchTV()
    {
        System.out.println("LG");
    }
}
Below is the child class:
//Method overriding
//Dynamic binding
//Run-time polymorphism - on inherited code
public class Child3 extends Parent3
{
    public static void main(String[] args)
    {
        //Regular way of creating an object of the parent class:
        //Parent3 parent3 = new Parent3();

        //Parent object reference for a child object: a parent object
        //reference can only call methods of the child class that also
        //exist in the parent class.
        Parent3 parent3 = new Child3();
        parent3.watchTV();
        //parent3.coding();
        //Child3 child3 = new Child3();
        //child3.watchTV();
        //new Child3().watchTV();
        System.out.println(parent3.pocket_money);
        //A value cannot be assigned to an already declared final field:
        //parent3.pocket_money = 10;
    }

    //watchTV() is final in Parent3, so it cannot be overridden:
    //public void watchTV()
    //{
    //    System.out.println("Smart TV");
    //}

    public void coding()
    {
        System.out.println("coding");
    }
}

Collection

First, let's see the disadvantages of arrays that collections address:

  1. Arrays need continuous (contiguous) memory.
  2. Unused array memory is wasted.
  3. The array size must be declared upfront and cannot grow.

A collection is basically a collection of objects. The core interfaces are all contracts:

  1. Set - e.g. playing cards - no order is maintained, and no duplicate elements are allowed.
  2. List - e.g. a grocery list - insertion order is kept, and duplicates are allowed!
  3. Map - key-value pairs; each key maps to a value.
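The same contracts exist in other languages; as a quick cross-language sketch, Python's built-in set, list, and dict behave like the Set, List, and Map interfaces described above:

```python
# Set: no duplicates, no guaranteed order (like java.util.Set)
cards = {"ace", "king", "ace"}
print(len(cards))          # 2 -- the duplicate "ace" is dropped

# List: insertion order kept, duplicates allowed (like java.util.List)
groceries = ["milk", "eggs", "milk"]
print(groceries)           # ['milk', 'eggs', 'milk']

# Map: key -> value pairs (like java.util.Map)
prices = {"milk": 2, "eggs": 3}
print(prices["milk"])      # 2
```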

JavaScript

Basics to see how JavaScript works:

Open Browser-> Console-> see basics of JavaScript

Comments in JavaScript are given by //

//This is a comment

Basic data types

All of the below are considered numbers:

integers -> 10

floating point (decimal) -> 12.0

negative numbers -> -13.5

String -> Ex:1 "hello World" Ex:2 "12"

Boolean -> true , false

undefined and null are also basic data types in JavaScript

clear() - used to clear the console

Apache JMeter Introduction – Basic tests

Apache JMeter is an Apache project used for load testing, to evaluate the performance of applications. Let's get started with a few simple steps to create a simple test and understand the basic features of JMeter.

Accessing JMeter - the package can be downloaded from the Apache JMeter - Download Apache JMeter website, extracted, and saved in a preferred location from where it can be used. The necessary prerequisites are listed on the website.

Open the tool on Windows by navigating to the extracted path and double-clicking the batch file. Once the tool is opened, navigate to File -> New -> Test Plan, add the necessary info, and save the file.

New Test Plan creation in Apache JMeter

Add necessary info on to the test plan creation.

Once the Test Plan is created, right-click on it and create a Thread Group, defining the total number of users, ramp-up time, etc. to be used in the test.

Adding the users: here there are 10 users, and the ramp-up period (the time over which the user count is steadily increased so the simulated users start performing similar actions) is 20 seconds.
Adding Samplers

Adding listeners to see the results: here we add two listeners, View Results Tree and View Results in Table. Once the test plan is run, the information below can be seen.

View Results Tree

 

View Results in Table

 

Notes:

  1. Create a test plan
    1. View results
      1. Results tree
      2. Results in table
    2. Assertion
    3. Timer
    4. Listener
    5. Thread Group
      1. Ramp up
      2. Ramp down
    6. Heap dump
    7. Enable Debug
    8. HTML Report Viewer
      1. Log Level
        1. Trace
    9. Start->Remote->Stop+All
    10. Test/Sampler-> FTP,HTTP, JDBC, API

Python – Automation – P1- Install robotframework

robotframework.org - a Python-based framework with a keyword-driven approach!
Ex (keyword-driven style):
Open Browser    url    chrome
Input Text    id    text info
Close Browser
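Filled out into a complete file, a minimal sketch could look like the below. This assumes the SeleniumLibrary package is installed (pip install robotframework-seleniumlibrary); the URL, locator, and text are hypothetical:

```
*** Settings ***
Library    SeleniumLibrary

*** Test Cases ***
Open Page And Type
    Open Browser    https://example.com    chrome
    Input Text      id:search    text info
    Close Browser
```

Save it as example.robot and run it with: robot example.robot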

install on windows OS

1. As a prerequisite, Python must be installed on the system where you are going to create the automation scripts. Once it is installed,
check that pip is available.

2. pip install robotframework


3. pip install --upgrade robotframework


4. pip install robotframework==5.0.1

To check if Robot Framework is correctly installed, please use the commands below!
pip freeze


pip list


pip show robotframework


pip check robotframework

If you want to uninstall Robot Framework: pip uninstall robotframework

To check on the version of the Robot framework

robot --version

Set the environment variables in PATH so Python is accessible from any location on the system where the code/file is saved.