Software Requirement Reviews

Review the degree of maturity of the definition, specification and management of the software requirements.
The aim is to ensure a common understanding between the customer and the development team.
Recommendations are given on the following criteria:

  • Identification of functional and non-functional requirements
  • Quality of the requirements definition and specification
  • Requirements management throughout the development cycle
  • Product roadmap maintenance practices
  • Change control administration
  • Mechanisms to measure requirements

Design Reviews

The architecture and the use of standards in the design are verified.
The aim is to detect and identify nonconformities and improvement aspects of the design before moving on to coding.

Code Inspections

Identify non-conformities in the source code.
The goal is to detect and identify software anomalies, including errors and deviations from standards and specifications.
The developers are shown the benefits of using standards and how to avoid systematic defects.
A set of evaluation criteria is defined, and recommendations are generated for the development teams about:

  • Programming logic
  • Coding standards
  • Code documentation (headers, comments)
  • Imports
  • Initialization
  • Method call parameters
  • Use of nested structures
  • Modularity
  • Code reuse mechanisms
  • Error handling


Functional software testing

We have proprietary methodologies that use specialized techniques to develop reusable test designs from the requirements, verifying that the product meets user needs and behaves correctly on unexpected inputs. Tests are carried out through experimental operation of the product and include emergency-path testing.
Smoke Test

  • This type of test is used to validate code changes that arrive in new releases or patches before they are incorporated into the product baseline. After code reviews, smoke testing is the most useful method for identifying software defects.
  • These tests are designed to confirm that the code changes work as expected and do not destabilize the existing build.
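A smoke test exercises only the critical paths to confirm a build is stable enough for further testing. A minimal sketch in Python, assuming a hypothetical application with create and fetch operations:

```python
# Minimal smoke-test sketch (hypothetical create/fetch operations).
# A smoke test checks that the critical paths run at all, not edge cases.

def create_user(db, name):
    user_id = len(db) + 1
    db[user_id] = {"name": name}
    return user_id

def fetch_user(db, user_id):
    return db.get(user_id)

def smoke_test():
    db = {}
    uid = create_user(db, "alice")   # core path 1: creation succeeds
    assert uid == 1
    user = fetch_user(db, uid)       # core path 2: retrieval succeeds
    assert user["name"] == "alice"
    return "smoke test passed"

print(smoke_test())
```

If any core path fails, the build is rejected before deeper (and more expensive) test suites are run.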

Regression Testing
It means running tests, or a subset of them, on an updated version of the software to ensure quality after new features are added. Its purpose is to ensure that:

  • Defects identified in prior test executions have been corrected.
  • The changes made have not introduced new defects or reintroduced prior ones.
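One common way to implement this is to re-run a recorded baseline of input/output pairs against the updated code and flag any behavior change. A sketch, with a hypothetical `discount` function under test:

```python
# Regression-test sketch: re-run recorded cases against the updated
# implementation and report any behavior change. All names are hypothetical.

def discount(price, tier):
    # The function under test in its updated version.
    rate = {"gold": 0.2, "silver": 0.1}.get(tier, 0.0)
    return round(price * (1 - rate), 2)

# Baseline of (inputs, expected output) captured from the previous release.
BASELINE = [
    ((100.0, "gold"), 80.0),
    ((100.0, "silver"), 90.0),
    ((100.0, "none"), 100.0),
]

def run_regression():
    # Each failure records the inputs, the expected value, and the actual one.
    return [(args, expected, discount(*args))
            for args, expected in BASELINE
            if discount(*args) != expected]

print(run_regression())  # an empty list means no regressions
```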

System level comprehensive testing

The objective of these tests is to evaluate the flow of the system end to end, from the starting point of the process to its completion, including the generation of reports and indicators for the most representative processes. This type of testing can fully evaluate a single data set through reception, validation, processing, consolidation, and publication of information.

Usability Testing

The ISO 9126-1 standard defines usability as the ability of a software product to be understood, learned, used and attractive to the user.
GreenSQA ensures that failures in the user interface applications are identified and corrected promptly.
In addition to testing in real scenarios, GreenSQA offers usability testing using design paradigms established by the industry.

Integration Testing

These tests are needed to verify that the interfaces among the modules of an integrated solution are correct.
GreenSQA has incorporated three best-practice Integration Testing strategies into its methodologies. These are applied per project, depending on which one fits each case best:

  • Top-down: integration and testing begin with the modules at the highest levels of abstraction, and the lower levels are integrated incrementally.

  • Bottom-up: integration and testing begin with the modules at the lowest levels of abstraction, and the higher levels are integrated incrementally.

  • Big bang: everything is integrated and tested at the same time.
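In the top-down strategy, lower-level modules that are not yet integrated are replaced by stubs. A sketch, with hypothetical `checkout` and tax-service modules:

```python
# Top-down integration sketch: verify the higher-level module first,
# using a stub for the not-yet-integrated lower-level module.
# All module and function names are hypothetical.

def tax_service_stub(amount):
    # Stub standing in for the real lower-level tax module.
    return amount * 0.10

def checkout(amount, tax_service):
    # Higher-level module under test; depends on a lower-level service.
    return round(amount + tax_service(amount), 2)

# Step 1: integrate and verify the top level against the stub.
assert checkout(100.0, tax_service_stub) == 110.0

# Step 2: once the real lower-level module exists, swap it in incrementally.
def real_tax_service(amount):
    return amount * 0.10 if amount < 1000 else amount * 0.15

assert checkout(100.0, real_tax_service) == 110.0
print("top-down integration steps passed")
```

The bottom-up strategy is the mirror image: the low-level modules are tested first with temporary drivers that call them.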

GreenSQA performs communication testing through remote and local devices, depending on the project conditions, to determine that the interfaces among the system components are working properly.

Acceptance testing

Its aim is to validate the integrity of the system by executing tests of business flows (basic and/or complex). These tests are performed together with the functional users.


Concurrency testing

These tests allow the simulation of concurrent transactions and ensure product quality in extreme situations of use.
Stress testing

  • Concurrent transactions are simulated over the time periods defined in the test design.
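Concurrent-transaction simulation can be sketched with threads hammering a shared resource and a final consistency check. A minimal example, assuming a hypothetical account with a lock-protected deposit operation:

```python
# Concurrency-test sketch: many threads perform transactions against a
# shared account, then the final balance is checked for lost updates.
# The Account class is a hypothetical system under test.
import threading

class Account:
    def __init__(self):
        self.balance = 0
        self._lock = threading.Lock()

    def deposit(self, amount):
        with self._lock:  # the read-modify-write must be atomic
            current = self.balance
            self.balance = current + amount

def stress(account, n_threads=20, ops_per_thread=1000):
    def worker():
        for _ in range(ops_per_thread):
            account.deposit(1)
    threads = [threading.Thread(target=worker) for _ in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return account.balance

acct = Account()
print(stress(acct))  # 20 threads x 1000 deposits of 1 = 20000
```

If the lock is removed, interleaved read-modify-write sequences can lose updates, which is exactly the class of defect this kind of test exposes.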

Compatibility testing

It aims to ensure transparent operation across the different devices and interfaces (e.g., browsers) considered in development. It is also applicable to reports.

Load testing

Its objective is to assess the compliance of a system or component with specific performance requirements, such as the file sizes that transit through the system. This is typically done using an automated test tool to simulate a large number of users, load, and volume of information, and to monitor the performance of the hardware.
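A simple load test fires many simulated requests concurrently, records response times, and checks them against the performance requirement. A sketch, where the service, payload size, and threshold are all hypothetical:

```python
# Load-test sketch: run simulated requests in parallel, measure response
# times, and compare the average against a performance requirement.
import time
import statistics
from concurrent.futures import ThreadPoolExecutor

def handle_request(payload_kb):
    # Stand-in for the system under test; cost grows with payload size.
    time.sleep(payload_kb / 10000.0)
    return payload_kb

def timed_request(payload_kb):
    t0 = time.perf_counter()
    handle_request(payload_kb)
    return time.perf_counter() - t0

def load_test(n_requests=50, payload_kb=100, max_avg_seconds=0.5):
    with ThreadPoolExecutor(max_workers=10) as pool:
        durations = list(pool.map(timed_request, [payload_kb] * n_requests))
    avg = statistics.mean(durations)
    return avg <= max_avg_seconds, avg

ok, avg = load_test()
print(ok)
```

In practice a dedicated tool (e.g., JMeter or Locust) plays the role of the thread pool here, but the pass/fail logic against a stated requirement is the same.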

Migration testing

This type of testing has two perspectives:
Data Migration:

  • This type of testing is performed to support the data migration process, verifying that the data at the destination complies with the defined specifications. It normally includes data maturation testing, which verifies the behavior of the migrated data in the new system; this test is performed after the system has been certified as functionally stable.

Application Migration:

  • This type of testing is designed to test the migration of functionality from one system to another, validating that the expected behavior is maintained. It often involves running, on the new system, functional tests that were designed for the previous system.
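A common data-migration check compares record counts and per-record content between source and destination. A sketch, using hypothetical in-memory tables and content hashes:

```python
# Data-migration verification sketch: compare record counts and per-record
# content between source and destination datasets (hypothetical tables).
import hashlib
import json

def record_hash(record):
    # Canonical hash so field ordering does not affect the comparison.
    return hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()

def verify_migration(source, destination):
    issues = []
    if len(source) != len(destination):
        issues.append(f"count mismatch: {len(source)} vs {len(destination)}")
    src = {r["id"]: record_hash(r) for r in source}
    dst = {r["id"]: record_hash(r) for r in destination}
    for rid, h in src.items():
        if rid not in dst:
            issues.append(f"missing record {rid}")
        elif dst[rid] != h:
            issues.append(f"content mismatch for record {rid}")
    return issues

source = [{"id": 1, "name": "ana"}, {"id": 2, "name": "luis"}]
dest = [{"id": 2, "name": "luis"}, {"id": 1, "name": "ana"}]
print(verify_migration(source, dest))  # an empty list means verified
```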

Performance Analysis – Application Profiling

Performance Analysis, also known as Profiling, is measuring the performance of an application in an environment, through information collected during its execution. Its main objective is to identify bottlenecks and determine which components could be optimized in order to improve response times, memory consumption and processor loading, among others.
Profiling is used during the initial software development stages as a testing method to identify and correct potential faults or design errors early. Detecting these errors in production increases costs and effort and delays time to market.
This process is done in three basic steps:

  • Initial application performance measurement.
  • Analysis of potential problems (e.g., coding, memory consumption, processing).
  • Measurement of the application's performance thresholds.
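The first of these steps can be sketched with Python's built-in `cProfile` module, which collects timing information during execution and reports where the time is spent (the workload here is a hypothetical hot spot):

```python
# Profiling sketch using Python's built-in cProfile: collect timing data
# during execution, then inspect the hottest entries.
import cProfile
import io
import pstats

def slow_concat(n):
    # Hypothetical workload: repeated string concatenation is a
    # classic quadratic-time hot spot.
    s = ""
    for i in range(n):
        s += str(i)
    return s

profiler = cProfile.Profile()
profiler.enable()
slow_concat(20000)
profiler.disable()

out = io.StringIO()
stats = pstats.Stats(profiler, stream=out)
stats.sort_stats("cumulative").print_stats(5)  # show the top 5 entries
report = out.getvalue()
print("slow_concat" in report)  # the hot function appears in the report
```

The report ranks functions by cumulative time, which is how bottleneck candidates are identified before optimization work begins.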

Main types of testing for performance analysis:

  • Load Testing
    The behavior of an application is measured under normal and atypical demands on resources, in quantity, frequency, or data volume. The results make it possible to determine response time, process loading, work per unit of time, and resource usage.
  • Spike Testing
    This test measures the behavior under a drastic change in loading due to increased user concurrency.
  • Stress Testing
    This test is used to identify the “breaking point” of an application by systematically increasing user concurrency. The objective is to guarantee correct operation of the application under extreme conditions of concurrency or computational resources.
  • Soak Testing
    The purpose of this test is to determine the capacity of an application to support the expected load continuously and to ensure that it behaves exactly as expected over a length of time without memory leaks.

Test Automation

It consists of using software to run previously designed tests unattended (without human intervention). It applies especially to situations in which the same manual test must be repeated frequently for maintenance and/or scalability of the information system.
For the preparation of automated test cases, the generic test model defined by GreenSQA is recommended. It is based on the design of test “scripts” that are sustainable and reusable, to facilitate the development of test “suites” intended for regression.
The different approaches used are:

  • Capture/ Playback
  • Data Driven
  • Keyword Driven
  • Modularity Testing
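The Data Driven approach separates the test logic from the test data: one script runs against every row of a data table. A sketch, with a hypothetical login function as the system under test:

```python
# Data-driven automation sketch: one test script, many data rows.
# The same steps run against every row of the table (hypothetical login).

TEST_DATA = [
    # (username, password, expected_result)
    ("alice", "correct-pw", True),
    ("alice", "wrong-pw", False),
    ("", "any", False),
]

def login(username, password):
    # Stand-in for the system under test.
    return username == "alice" and password == "correct-pw"

def run_data_driven_suite():
    results = []
    for username, password, expected in TEST_DATA:
        actual = login(username, password)
        results.append(actual == expected)
    return all(results)

print(run_data_driven_suite())
```

Adding a new test case is then a matter of adding a data row, with no change to the script itself; the Keyword Driven approach extends the same idea by also encoding the test steps as data.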


Ethical Hacking

Ethical hacking is work performed by a security analyst on behalf of a company.

The objective is to discover security holes in computer systems, helping the company protect itself and find solutions before they are exploited by malicious hackers.


Contact us

Carrera 85b N° 1446
El Ingenio II
Cali – Valle

1809 W Jetton Av 33606
Tampa Florida
