
Article: Smoke Test Your Offshore Provider by Erik Boelen

  • 04/03/2011
  • Posted by EuroSTAR

Nowadays, outsourcing and offshoring are the way to go. More and more companies are turning to offshore third-party companies to perform part of their development activities.

First, let’s focus on outsourcing itself. According to Wikipedia, ‘Outsourcing became part of the business lexicon during the 1980s and often refers to the delegation of non-core operations from internal production to an external entity specializing in the management of that operation. The decision to outsource is often made in the interest of lowering firm costs, redirecting or conserving energy directed at the competencies of a particular business, or to make more efficient use of worldwide labour, capital, technology and resources.’

When looking at an outsourcing process with an offshore provider, six major phases can be distinguished. The process described below assumes that the offshore partner has already been selected. Vendor selection is a process in itself and is therefore not addressed in this article.

[Figure: the six phases of the outsourcing process with an offshore provider]

This article focuses on the last phase of this process, where the delivered code is assessed to determine whether it can be accepted into the in-house testing phase.

1.1    Smoke testing to assess test readiness

As mentioned in the introduction, a procedure needs to be put in place to determine the test readiness of the software delivered by the offshore vendor. A widely used technique for this is smoke testing.

What is smoke testing?

The term smoke testing was first used in the plumbing industry, where smoke was blown into the pipes; if no smoke came out, the pipes and the joints between them were in order. This principle has since been adapted to the software world.

Wikipedia defines smoke testing as follows: “In software testing, a smoke test is a collection of written tests that are performed on a system prior to being accepted for further testing. This is also known as a build verification test. This is a “shallow and wide” approach to the application. The tester “touches” all areas of the application without getting too deep, looking for answers to basic questions like, “Can I launch the test item at all?”, “Does it open to a window?”, “Do the buttons on the window do things?”. There is no need to get down to field validation or business flows. If you get a “No” answer to basic questions like these, then the application is so badly broken, there’s effectively nothing there to allow further testing. These written tests can either be performed manually or using an automated tool. When automated tools are used, the tests are often initiated by the same process that generates the build itself.”

The Smoke Test Suite

In order to structure the smoke testing, a Smoke Test Suite can be set up.

A Smoke Test Suite is a set of test cases, possibly including functional test cases, automated test cases, unit test cases and others, that need to run against each build to ensure that the build is stable and of high quality. The smoke test suite is a standard part of the auto-build and deployment process for the software packages delivered by the offshore partner. The Project Lead, together with the Test Lead and other team members, must identify the test cases to incorporate into the smoke test suite and iteratively add test cases every development cycle.

The Smoke Test Suite is communicated to the offshore partner ahead of delivery, so that they know the quality expectations for their deliverables.

A Smoke Test Suite has the following goals:

  • Point out the major problems with a small number of test scenarios
  • Serve as entry criteria for in-house testing of the software package
  • Assess the test readiness of the software package; it does not show whether the product is ready for shipment
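The “shallow and wide” checks described above can be sketched in code. The following is a minimal, self-contained illustration, not a prescribed implementation; the `App` stub and the check names are assumptions standing in for a real delivered build.

```python
# Illustrative sketch of a smoke test suite: one shallow check per
# area of the application, never field validation or business flows.

class App:
    """Stand-in for the delivered build (illustrative, not real)."""

    def launch(self):
        return True  # "Can I launch the test item at all?"

    def open_window(self, name):
        return name in {"main", "settings"}  # "Does it open to a window?"

    def click(self, button):
        return button == "ok"  # "Do the buttons on the window do things?"


def run_smoke_suite(app):
    # Each entry is (area, shallow check); a single "No" answer means
    # the build is too broken to allow further testing.
    checks = [
        ("launch", lambda: app.launch()),
        ("main window", lambda: app.open_window("main")),
        ("ok button", lambda: app.click("ok")),
    ]
    results = {name: bool(check()) for name, check in checks}
    return results, all(results.values())


results, accepted = run_smoke_suite(App())
```

A build is handed on to the next test phase only when `accepted` is true; any failing area is reported back by name.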


1.2    Practical Approach

When actually setting up and executing the Smoke Test Suite, there is no significant difference from a regular test preparation and test execution process. The major differences are the depth of the test cases and the timing. For obvious reasons, the Smoke Test Suite is executed right before the execution of the other test phases. It does not, however, precede the preparation of those test phases.

Include the Smoke Test Suite as a stage in the test process

It is important that the smoke tests are defined as a stage of the current test process. By treating them as a test stage, all the standard test preparation and execution procedures and processes can be applied to the Smoke Test Suite and its results.

Prepare the Smoke Test Suite

The Smoke Test Suite – as a minimum – must include the following:

  • Common scenarios (e.g. registration/account creation/login)
  • Core features
  • Features affecting performance (e.g., search takes more than 20 seconds to return the result set)
  • Test cases covering bugs that have shown up as a result of changes to other parts of the code (regression)
  • Back-end functionality (e.g., workflow, email alerts) that cannot be verified through the user interface
  • Integration with external subsystems (the Smoke Test Suite needs to ensure that the integration points were not broken by the new build)
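The minimum contents listed above can be checked mechanically when each smoke case is tagged with the category it covers. A sketch follows; the category labels and case names are illustrative assumptions, not part of the article.

```python
# The six minimum content categories of a Smoke Test Suite,
# as listed in the article (labels are illustrative).
REQUIRED_CATEGORIES = {
    "common",       # common scenarios (registration, login, ...)
    "core",         # core features
    "performance",  # features affecting performance
    "regression",   # bugs caused by changes elsewhere in the code
    "backend",      # back-end functionality not visible in the UI
    "integration",  # integration points with external subsystems
}

# A candidate suite: each case is tagged with one category.
suite = [
    {"name": "login",              "category": "common"},
    {"name": "create order",       "category": "core"},
    {"name": "search timing",      "category": "performance"},
    {"name": "regression pack",    "category": "regression"},
    {"name": "email alert sent",   "category": "backend"},
    {"name": "payment gateway up", "category": "integration"},
]

def missing_categories(suite):
    """Return the required categories not covered by any case."""
    covered = {case["category"] for case in suite}
    return REQUIRED_CATEGORIES - covered

gaps = missing_categories(suite)  # empty set: the minimum is met
```

A non-empty result flags gaps to close before the suite is communicated to the offshore partner.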

Execute the Smoke Test Suite

Smoke Test Suite execution must include:

  • Manual verification of the test cases
  • Execution of unit test cases, possibly using unit test tools
  • Regression testing, possibly using tools
  • Performance testing, including review of output logs
  • Execution of custom tools to verify back-end functionality
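The performance part of the execution above can be as simple as timing a call against a budget, following the “search takes more than 20 seconds” example earlier. The sketch below uses an illustrative `search` stub; the budget value is the one from the article’s example.

```python
# Sketch: folding a performance check into smoke execution by timing
# a call against a budget. search() is a stand-in, not a real API.
import time

def search(query):
    time.sleep(0.01)  # stand-in for the real search call
    return ["result for " + query]

def within_budget(fn, budget_seconds):
    """Run fn once and report whether it finished within the budget."""
    start = time.perf_counter()
    fn()
    elapsed = time.perf_counter() - start
    return elapsed <= budget_seconds

# Article's example threshold: search must not take more than 20 s.
ok = within_budget(lambda: search("smoke"), budget_seconds=20.0)
```

In practice the timing output would also be written to a log for the review of output logs mentioned above.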

1.3    Measure the outcome

In order to complete the process of assessing the test readiness, we need to set up metrics and acceptance criteria. The metrics – and the expected results – mainly involve the testing results of the Smoke Test Suite, and the acceptance criteria are about the delivery of the software package by the offshore vendor in general.

Acceptance Criteria

What needs to be delivered together with the code in order to be accepted for smoke testing?

  • Development Validation Test Plan
    A test plan in which the offshore vendor states the test strategy and the test scenarios used for development validation
  • Proof of tests executed
    A log of the tests executed and their results
  • Release notes attached to the code, containing the following information:
      • Functional (functionality)
          • Planned functionality delivered
          • Functionality postponed
          • Included functionality delayed from a previous release
          • Unplanned additional functionality
          • Change requests delivered
          • Defects fixed
      • Non-functional (security, reliability, performance, usability, portability)
          • Planned non-functional attributes delivered
          • Postponed non-functional changes
          • Included non-functional attributes delayed from a previous release
          • Unplanned changes to non-functional attributes
          • Change requests delivered
          • Defects fixed
      • Issues to be resolved
          • Known open defects per severity
          • Other issues


Test Scenario Density

The Test Scenario Density (TSD) shows the number of test scenarios in relation to the number of requirements delivered in the software package, indicating whether all requirements are covered by test scenarios. A detailed report should accompany this metric, showing which requirements are covered and which are not.


The result of this metric should be at least 90% for the Smoke Test Suite; this way, it can be stated that almost every requirement is covered by at least one test scenario. The requirements include both the functional and the non-functional requirements.

Defect Density

Defect Density (DD) shows the relation between the number of defects found during the Smoke Test Suite execution and the lines of code of the delivered software package.

The defect density – based on the Smoke Test Suite – should not be higher than 5%.
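A sketch of the DD computation follows. The article gives only the 5% ceiling, so expressing the metric as defects per lines of code as a percentage is an interpretation, and the numbers are illustrative.

```python
# Sketch of the Defect Density metric: defects found during Smoke
# Test Suite execution relative to the delivered lines of code,
# expressed as a percentage (normalisation is an assumption).
def defect_density(defects_found, lines_of_code):
    return 100.0 * defects_found / lines_of_code

# Illustrative figures: 12 defects against 1000 delivered lines.
dd = defect_density(defects_found=12, lines_of_code=1000)  # 1.2, under the 5% ceiling
```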


The spread over the different degrees of severity must be as follows.

[Table: allowed defect spread per severity level]

Combination TSD – DD

It is the combination of both metrics that shows the actual result of the Smoke Test Suite. Both metrics need to be in the acceptable range in order to assess the test readiness of the software package delivered by the offshore vendor.

1.4    Conclusion

No matter how good the relationship with the offshore vendor, or how strict the service level agreements may be, there always needs to be a software evaluation moment, during which the in-house test readiness of the software is assessed. This article described using Smoke Test Suites to perform this assessment in a structured manner. Basically, it comes down to selecting a precise set of test cases from every test level involved and executing those at the moment the new software arrives. Based on the results of this Smoke Test Suite – measured by metrics – the software package is or is not accepted for in-house testing. By communicating about this with your offshore vendor, the result will be a better mutual understanding of your needs and, in the end, a higher-quality product.

Who is Erik Boelen?

Erik Boelen is a passionate test professional with his own company, qa consult. As a test coordinator and test consultant, he has helped a large number of companies set up effective test processes. In addition to specializing in agile testing, Erik challenges traditional views of testing to better serve his clients’ specific needs. After each engagement, Erik translates what he has learned into specialized training materials, generalizing them to help other organizations deal with their own problems. Erik can be reached at [email protected].

