Denis Jakus

Demystifying Tech & Management

CTO x Unique People

IT Consultant


CTO Note

Unlocking Excellence: A Deep Dive into Software Development Testing

01 October 2023

Most of the time, TDD is completely overlooked, for numerous reasons!

Whether reasonable (say, the client doesn’t want to pay for it) or not (quoting developers: “It’s useless and we waste too much time on tests instead of on implementation”), one thing that pops up a lot is bugs 🕷️.

During my professional software development career, I have to admit, most of the time I was not writing tests.

Unfortunately, I have to say!

That was because senior management always wanted to prioritize implementation over stability and customer satisfaction, and because of the inability to upsell this kind of service.

I will write about TDD from the developer’s perspective, as well as from the CTO’s perspective: how to structure and define the QA team, its roles, and its processes.

TDD – Test-driven Development

TDD should be a must on all types of projects, as it provides a much more stable and enjoyable process for both the developer and the end customer.

Test-Driven Development (TDD) is a software development approach in which test cases are written to specify and validate what the code will do. In simple terms, test cases for each piece of functionality are created and run first; if a test fails, new code is written to make it pass, keeping the code simple and free of defects.

TDD starts with designing and developing tests for every small piece of functionality of an application, and it instructs developers to write new code only when an automated test has failed. This also avoids duplication of code.
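
As a minimal sketch of that red-green cycle, assuming a hypothetical `slugify` function (invented for illustration, not from this article): the test is written first and fails until the implementation makes it pass.

```python
# TDD sketch: test_slugify is written first ("red"), then the minimal
# implementation below is added until the test passes ("green").

def slugify(title: str) -> str:
    """Turn a post title into a URL-friendly slug."""
    return "-".join(title.lower().split())

def test_slugify():
    # Acceptance criteria expressed as executable assertions
    assert slugify("Hello World") == "hello-world"
    assert slugify("  CTO   Note  ") == "cto-note"

test_slugify()
print("all tests passed")
```

Only after the test is green do you refactor, keeping the test as a safety net.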

Test Methods


Black-Box-Test

  • No knowledge of source code
  • Functional test against interface specification
  • Test cases extend specification
  • Usage: System test, acceptance test, part of unit-tests

White-Box-Tests (aka Glass-Box-Test)

  • Tests based on source code
  • Control-flow-oriented
  • Compliance with specifications not proved
  • Test-Cases derived from the inner behavior and state of the test subject
  • Usage: Proving correctness, debugging

Gray-Box-Test

  • Combines Black-Box-Tests and White-Box-Tests

Test Types

(User) Acceptance-Tests / System-Tests

  • Start of specification phase … Start of implementation phase
  • Executable acceptance criteria tests for user story
  • Goal: Demonstrate user interaction and fitness for use cases and non-functional requirements
  • Does not include a proof of correctness of code

Integration-Tests

  • Start of implementation phase … End of implementation phase
  • Interaction of software units and integration into larger systems (stubs and mocks)
  • Goal: Test interoperability and communication of software units and use cases
  • In most cases: no proof of correctness of the software units and system

Unit-Tests

  • Start of implementation phase … End of implementation-phase
  • Testing of small units: single method and class (constructors & methods)
  • Dependencies to other units are mocked
  • Goal: Proof of correctness (against specification)
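
A minimal unit-test sketch in which the dependency on another unit is mocked; the `Checkout` class and payment-gateway names are hypothetical, invented for illustration:

```python
from unittest.mock import Mock

class Checkout:
    """Unit under test; depends on an external payment gateway."""
    def __init__(self, gateway):
        self.gateway = gateway

    def pay(self, amount):
        if amount <= 0:
            raise ValueError("amount must be positive")
        return self.gateway.charge(amount)

# The gateway dependency is mocked, so only Checkout itself is tested.
gateway = Mock()
gateway.charge.return_value = "ok"

checkout = Checkout(gateway)
assert checkout.pay(100) == "ok"
gateway.charge.assert_called_once_with(100)
print("unit test passed")
```

Mocking the collaborator keeps the test fast and ensures a failure points at the unit itself, not at its dependencies.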

Performance-, Stress- and other Tests (non-functional requirements)

Test Driven Development in Scrum projects

Always break down features into stories

  • Always try to break down big stories into smaller stories
  • Acceptance criteria are a must for all backlog items
    • Features contain coarse-grained acceptance criteria
    • (User) Stories contain acceptance criteria that can be tested
  • For every user story define
    • Executable acceptance tests
    • User acceptance test description for manual testing
    • Unit tests
    • Integration tests
  • For every technical story define
    • Executable acceptance tests
    • Unit tests
    • Integration tests
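
As a sketch of what “executable acceptance tests” for a user story can look like, assuming a hypothetical story (“a registered user can log in”) with an invented `login` function and user store:

```python
# Hypothetical user story: "As a registered user, I can log in with
# valid credentials and am rejected with invalid ones."

USERS = {"alice": "s3cret"}  # stand-in for a real user store

def login(username: str, password: str) -> bool:
    """Return True only for a known user with the right password."""
    return USERS.get(username) == password

# Executable acceptance criteria for the story:
assert login("alice", "s3cret") is True   # valid credentials accepted
assert login("alice", "wrong") is False   # invalid password rejected
assert login("bob", "s3cret") is False    # unknown user rejected
print("acceptance tests passed")
```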

Build pipelines

All stages of a build should consist of something similar to this:

  • Preparation – this stage generates all the environment variables needed for further stages
  • Clean and compile – executes cleaning and compiling
  • Unit tests – executes unit tests
  • Integration tests – executes integration tests
  • OWASP – executes the OWASP dependency check
  • Build submodule – executes all the stages up to publishing for submodules needed later by dependent modules
  • Build – executes the build
  • SonarQube – executes inspection of code quality with static analysis to detect bugs and code smells
  • Helm lint – executes linting of the Helm charts
  • Docker/Helm – executes the Docker build and generates Helm charts
  • NexusIQ – executes analysis of the Docker images
  • Publish – pushes Helm charts and Docker images to the corresponding repository
  • Deploy – creates the namespace and deploys artifacts to OpenShift (or another K8s solution)
  • System tests – executes all the system tests against the deployed environment
  • Clean – executes cleaning of the environment
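
A declarative Jenkins pipeline sketch covering a few of the stages above; the shell commands and script paths are illustrative assumptions, not a definitive pipeline:

```groovy
pipeline {
    agent any
    stages {
        // Generate environment variables for later stages (hypothetical script)
        stage('Preparation')       { steps { sh './scripts/prepare-env.sh' } }
        stage('Clean & Compile')   { steps { sh 'mvn clean compile' } }
        stage('Unit tests')        { steps { sh 'mvn test' } }
        stage('Integration tests') { steps { sh 'mvn verify -Pintegration' } }
        stage('SonarQube')         { steps { sh 'mvn sonar:sonar' } }
        stage('Docker/Helm')       { steps { sh 'docker build -t app:latest .' } }
        stage('Deploy')            { steps { sh 'helm upgrade --install app ./chart' } }
    }
}
```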

We should also have our own package registry configured, in which the most-reused components live, so it’s easy and fast to set up or develop a new project.

Developer measurement

The easiest way to start measuring a developer’s performance is to incorporate SonarQube and track all the metrics it provides. According to that, we can get a clearer picture of a developer’s work.

Also, according to research on defect rates, rates under 5% are generally considered acceptable. For example, if a quality check finds 350 defective calls out of 10,000: defect rate = (350 / 10,000) x 100 = 0.035 x 100 = 3.5%, which is within the acceptable range.

According to McConnell’s research, the industry average defect rate is around 1-25 bugs for every 1,000 lines of code.
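
The defect-rate arithmetic above can be sketched as:

```python
def defect_rate(defects: int, total: int) -> float:
    """Defect rate as a percentage of total inspected items."""
    return defects * 100 / total

# Example from the text: 350 defective calls out of 10,000 inspected
rate = defect_rate(350, 10_000)
assert rate == 3.5
print(f"defect rate = {rate}%")  # under the 5% threshold, so acceptable
```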


SonarQube is a Code Quality Assurance tool that collects and analyzes source code and provides reports for the code quality of your project. It combines static and dynamic analysis tools and enables quality to be measured continually over time.

SonarScanners running in Jenkins can automatically detect branches and pull requests in certain jobs. You don’t need to explicitly pass the branch or pull request details.

SonarQube Quality Issues

When a piece of code does not comply with a rule, an issue is logged on the snapshot. An issue can be logged on a source file or a unit test file. There are 3 types of issues: Bugs, code smells, and vulnerabilities.

Bug

An issue that represents something wrong in the code. If this has not broken yet, it will, and will probably break at the worst possible moment. This needs to be fixed.

Code smell

A maintainability-related issue in the code. Leaving it as-is means that, at best, developers maintaining the code will have a harder time than they should when making changes.

Vulnerability

A security-related issue that represents a backdoor for attackers.

Test Coverage

Test coverage reports and test execution reports are important metrics in assessing the quality of your code. Test coverage reports tell you what percentage of your code is covered by your test cases. Test execution reports tell you which tests have been run and their results.

Test Coverage for Java projects

For Java projects, SonarQube directly supports the JaCoCo coverage tool.

Test Coverage for JS/TS projects

For JS/TS projects, SonarQube directly supports all coverage tools that produce reports in the LCOV format.
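
As a minimal sketch, a `sonar-project.properties` fragment pointing SonarQube at an LCOV report (the project key and paths are illustrative):

```properties
# Project identity (illustrative values)
sonar.projectKey=my-app
sonar.sources=src
sonar.tests=test
# Point SonarQube at the LCOV report produced by the JS/TS test runner
sonar.javascript.lcov.reportPaths=coverage/lcov.info
```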

Now that we have an overview of testing from the developer’s perspective, let’s look at how to set up a QA team and what the overall steps of building one are.

QA/Testing part

As a rule in software development, you should have a Test Plan prepared as part of Sprint Planning. This means the plan includes the specification of tests for the acceptance criteria of each user story:

– what to test

– how to test

– what will be the criteria for pass or fail

The team should use one of the test-automation tools; the most recent recommendation is Cypress.

Workflow

The typical workflow expected of a Software Test Engineer (QA) is as follows:

  1. PM creates a user story in the backlog and writes Acceptance Criteria
  2. During Backlog Refinement, you must carefully analyze functionality in the ticket, including what and how to test
  3. As a Tester, you should write the specification for that user story including the tests during the sprint, and include them in the Test Plan (after that you can automate everything in the Regression Suite, for example in Cypress)
  4. On every merged PR, the Cypress tests run and you receive an email with the results; if you see a FAIL, you analyze it manually and report a bug
  5. In case of reporting a bug, the Tester puts a bug into the backlog
  6. If there are new features, the Tester tests them manually and then automates them in future iterations
  7. A bug in the backlog goes into refinement like any other bug if it is not urgent

Tooling

There are dedicated Test Management tools that plug into JIRA, for example XRAY, Zephyr Scale, and TestRail.

All of these tools have all the concepts needed for the Test Process:

– Test Plan

– Test Suite

– Test Collection

– Test Case

– Test Execution

– Test Report

The benefits of using any of these tools are that they have everything worked out very well. For example, you can attach issues from that tool to each user story: test case, test plan, test report, etc.

And as stated before these are plugins that integrate excellently with JIRA.

How to Create a Test Plan


In this section, I will provide bullet points on how to create a proper Test Plan. But first, let’s answer what a Test Plan is.

A Test Plan is a document that includes information about the testing strategy for a software product, the objectives, the time and available resources, the assessment process, and the deliverables. It gives us an idea about the efforts that will be necessary to create a quality product and thus helps estimate the necessary amount of time for validation.

The Test Plan is the main responsibility of the Test Manager.

According to IEEE 829 (Standard for Software and System Test Documentation), there are certain steps to create it:

Analyze the product

  1. Who will use the product?
  2. What is it used for?
  3. How will it work?
  4. What software/hardware does the product use?

Design the Test Strategy

  1. Define Scope of Testing
  2. Define Testing Type (Unit, API, Integration, System, Agile)
  3. Document Risk & Issues
  4. Define Test Logistics (Who, When)

Define the Test Objectives

Define Test Criteria

Resource Planning

Plan Test Environment

Schedule & Estimation

Determine Test Deliverables

  1. Test deliverables provided before the testing phase:
    • Test Plan document
    • Test Case documents
    • Test Design specifications
  2. Test deliverables provided during the testing:
    • Test Scripts
    • Simulators
    • Test Data
    • Test Traceability Matrix
    • Error logs and execution logs
  3. Test deliverables provided after the testing cycle is over:
    • Test Results/reports
    • Defect Report
    • Installation/test procedure guidelines
    • Release notes

QA Team

What are the roles and the expected amount of knowledge needed for each position in the QA team?

Positions:

  • Test Manager
  • Senior position
  • Mid position
  • Junior position

Test Manager

The Test Manager is responsible for leading the testing team and plays a central role in it.

The Test Manager takes full responsibility for the test team’s achievements. The role encompasses test advocacy, resource management, and managing problems related to launching the tests. The Test Manager should also have knowledge of at least one programming language.

The Test Lead / Manager is responsible for:

  • Heading the testing team and directing it to successful completion; this is the chief responsibility of the Testing Team’s leader.
  • Defining the scope of delivery testing for each release within the stage of each delivery.
  • Deploying and managing resources for testing.
  • Arranging the proper tests and metrics for testing the product in question with the Testing Team.
  • Installing, tracking, and controlling the testing process for any project.
  • Creating a Test Plan.
  • Defining an overall template with checklists for the testing process.

Some challenges that will probably arise while leading a project:

  • Not enough time to test
  • Not enough resources to test
  • The project budget is low, and the schedule is too tight
  • Testing teams are not always in one place
  • The requirements are too complex to check and validate

Senior position

This position expects that the QA engineer is able to:

  • Set up the whole process
  • Include test management tools within the environment
  • Set up the automation framework
  • Set up CI/CD
  • Write (script) basic test utilities as helper tools
  • Apply broader knowledge of systems, engineering practices, etc.

Mid position

This position expects that the QA engineer is able to:

  • Do the automation/scripting
  • Keep an overview of the complete process
  • Work with multiple technologies
  • Set up a CI/CD pipeline for tests, etc.
  • Use reporting tools

Junior position

This position expects that the QA engineer is able to:

  • Write tests
  • Understand the technology being tested
  • Debug using at least the browser DevTools
  • Create an issue with a detailed description
  • Use the basics of reporting tools

With this, we will wrap up today’s slightly longer CTO Note.

Have a lovely weekend!

Denis J.
