Unit 11

Installation Test
The philosophy of installation test is to demonstrate that the system has been correctly installed and is operational. Installation test is performed by demonstrating a subset of the total system requirements. Test engineering interacts with the other project team members (hardware and software). The installation test is designed to be a repeatable process and is documented. The installation test is conducted at the target site with the formal test procedures reused as much as possible. The software load build is under the care of configuration management. Installation testing is considered black-box testing; there are no drivers, stubs, simulators, stimulators, etc. installed. The software load build is the code and hardware that will be delivered. The dotted-line box in the Installation Test graphic below conveys that a subset of requirements will be tested.

[Figure: Installation Test]


The test manager should manage the activities of the system engineers and software engineers in setting up a check-off procedure that records, at a detailed level of activities, the installation of the hardware and software components; a sketch of such a check-off record follows the list below. On a Government contract the client or the acceptance agency witnesses this. The activities of the acceptance agency are to:

  1. Supply the number of personnel the client wants to witness the test
  2. Determine the place, time, and duration of this test
  3. Determine the schedule and budget, because the client will generally want to witness the acceptance test, which also demonstrates that the goals and objectives of installation test are satisfied
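
As an illustration of the check-off procedure described above, the minimal sketch below records hardware and software installation items with sign-off fields. The component names and fields are hypothetical examples, not part of any standard form.

    # Minimal sketch of an installation check-off record.
    # Component names and fields are hypothetical examples.

    checkoff = [
        # (component, type, installed, verified_by, witnessed_by)
        ("Operator workstation",        "hardware", True,  "R. Jones", "Client rep A"),
        ("Database server",             "hardware", True,  "R. Jones", "Client rep A"),
        ("Application load build v2.3", "software", False, "",         ""),
    ]

    def incomplete_items(rows):
        """Return the components that still need installation or sign-off."""
        return [component for (component, _, installed, verified, witnessed) in rows
                if not (installed and verified and witnessed)]

    print("Remaining items:", incomplete_items(checkoff))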

The installation test activities are the same as those of function and system test. The emphasis in installation test is different, but the activities are the same. Test engineering should reuse existing test documentation for installation test.

Test Plan
The test plan will be unique for this phase but very similar to previous test plans in format. The test plan document should contain the test groups of requirements, which are reused from function and system test. The naming and numbering of these requirements should be preserved so that the requirements can be easily identified. A table format referencing the original document should be developed for traceability. If test engineering is only using a subset of the previous test groups, a requirements traceability matrix probably does not gain them anything. If all the test groups are being reused, a requirements traceability matrix should be included in the test plan.
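
As an illustration of such a table, the sketch below keeps a small traceability matrix that ties each reused requirement back to its original document and test case. The requirement IDs, document names, and test case IDs are hypothetical examples.

    # Minimal sketch of a requirements traceability table for installation test.
    # Requirement IDs, document names, and test case IDs are hypothetical examples.

    traceability = [
        # (requirement ID, source document, original test group, reused test case)
        ("REQ-INST-001", "System Test Plan v1.2",   "Group 3: Startup",             "TC-SYS-031"),
        ("REQ-INST-002", "Function Test Plan v1.0", "Group 7: External Interfaces", "TC-FUN-072"),
        ("REQ-INST-003", "System Test Plan v1.2",   "Group 9: Shutdown",            "TC-SYS-095"),
    ]

    def print_matrix(rows):
        """Print the traceability matrix as a fixed-width table."""
        header = ("Requirement", "Source Document", "Original Test Group", "Test Case")
        for requirement, document, group, case in [header] + rows:
            print(f"{requirement:<14} {document:<26} {group:<30} {case}")

    print_matrix(traceability)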

Test cases
The test case document should contain the test cases. The test cases are reused from the function and system test. The naming and numbering of the test cases should be preserved to allow the requirements to be identified. A table format referencing the original document should be developed for traceability. If test engineering is only running a subset of previous test cases, a requirements traceability matrix probably does not gain them anything. If all test cases are being rerun, a requirements traceability matrix should be included in the test cases document. If the original test case format contained an entry for begin and end time, this information, in coordination with duration, personnel, and hardware resources, should be used to determine how many test cases can be executed during the installation test. If the installation test is approached as a dry run of acceptance test, test engineering may be able to generate a single test plan and test cases document that covers both phases.
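
As a rough illustration of that sizing exercise, the sketch below estimates how many reused test cases fit in a fixed on-site test window. The durations, overhead, and window length are hypothetical examples.

    # Rough sizing sketch: how many reused test cases fit in the installation window.
    # Durations (in minutes) and the available window are hypothetical examples.

    test_case_durations = {
        "TC-SYS-031": 25,   # durations derived from begin/end times recorded in earlier phases
        "TC-FUN-072": 40,
        "TC-SYS-095": 15,
        "TC-FUN-110": 60,
    }

    available_window = 2 * 8 * 60   # e.g. two 8-hour days on site, in minutes
    setup_overhead = 20             # per-test-case setup and logging time, in minutes

    selected, used = [], 0
    # Run the shortest test cases first so as many as possible are demonstrated.
    for case, duration in sorted(test_case_durations.items(), key=lambda item: item[1]):
        cost = duration + setup_overhead
        if used + cost <= available_window:
            selected.append(case)
            used += cost

    print(f"{len(selected)} of {len(test_case_durations)} test cases fit:", selected)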

Test Execution: Conducting a Dry Run
Before the formal test execution begins, test engineering should conduct a dry run. The test manager should plan time for both the system engineers and software engineers to check out their installation procedures. Test engineering will also run a set of test cases to verify the installations. The dry run during installation, with respect to software test, is less of an issue than in the previous test phases, because the test cases for this phase are a subset of test cases from previous phases. However, if they are not, and a new set of test cases is being introduced, the dry run needs to be planned and conducted in accordance with the dry run activities given in the previous lesson.

If test engineering used simulators, stimulators, or emulators in place of a hardware component in a previous phase of testing, it should plan to conduct dry run activities in which those test cases are rerun against the real system components. Plan on problems with external interfaces the first time they are used in a live test. Test engineering should verify that the software and hardware are correctly installed by executing critical paths that exercise all software components.

If the dry run does not go well, rework the schedule so that the dry run activities can be completed successfully. Test engineering should not put off problems found during the dry run, because they will reappear during the acceptance test.

Conducting Test Execution
The installation test is conducted at the client’s site. Test engineering should follow the procedures from the dry runs and record the information required by the exit criteria. Test engineering should address the following issues of concern: access times, schedule limits, hardware resources, and accessibility of the test bed.

The installation test techniques, tools, activities, and regression test scope are the same as for function test. Refer back to the function test section for those activities.

The installation test entry criteria and input are:

  1. Successful completion of system regression test
  2. Completion of system installation plan
  3. Intact hardware & software test environment
  4. All hardware under CM control
  5. Installed formal configuration build

The installation test exit criteria and output are:

  1. Hardware installation check-out sheet is complete
  2. Software installation check-out sheet is complete
  3. 100% of test cases have been executed
  4. 100% of test cases are successful or written agreement with client indicating acceptance
  5. All problem reports resolved or a written agreement with client indicating acceptance
  6. Test report completed and delivered to the client
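
The exit criteria above lend themselves to a simple end-of-test check. The sketch below is a minimal illustration; the record structures, status values, and waiver handling are hypothetical examples.

    # Minimal sketch of an installation test exit-criteria check.
    # Record structures and status values are hypothetical examples.

    test_cases = [
        {"id": "TC-SYS-031", "executed": True, "result": "pass", "waived": False},
        {"id": "TC-FUN-072", "executed": True, "result": "fail", "waived": True},  # client accepted in writing
    ]
    problem_reports = [
        {"id": "PR-014", "resolved": True,  "waived": False},
        {"id": "PR-015", "resolved": False, "waived": True},   # written client agreement
    ]

    all_executed = all(tc["executed"] for tc in test_cases)
    all_passed_or_waived = all(tc["result"] == "pass" or tc["waived"] for tc in test_cases)
    all_reports_closed = all(pr["resolved"] or pr["waived"] for pr in problem_reports)

    print("Exit criteria met:", all_executed and all_passed_or_waived and all_reports_closed)
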
Philosophy of Acceptance Test
The philosophy of acceptance test is to demonstrate that the system meets its intended requirements in the target environment; the requirements demonstrated are a subset of the total requirements. The test engineering team is independent from the other project team members (hardware and software). The acceptance test is a formal process with witnesses. The acceptance test is designed to be a repeatable process and is documented. The acceptance test is conducted at the target site with the formal test procedures reused as much as possible. The software load build is under the care of configuration management. Acceptance testing is considered black-box testing; there are no drivers, stubs, simulators, stimulators, etc. installed. The software load build is all the code and hardware that will be delivered. The dotted-line box in the Acceptance Test figure conveys that a subset of requirements will be tested. The client or the acceptance agency always witnesses the acceptance test.

[Figure: Acceptance Test]

The acceptance test activities are the same as those of function and system test. The emphasis during acceptance test is different, but the activities are the same. The test engineers should reuse existing test documentation in acceptance test.

Acceptance test preparation emphasizes reuse of the test plan and test cases. The goal of the acceptance test phase is to demonstrate that the product requirements have been met so the client will accept delivery of the product as complete. The acceptance criteria must be demonstrated to satisfy the client that the product works as stated in the requirements documents. The mechanism for demonstrating the criteria is a combination of the acceptance test documents. The client approves these documents, they are baselined, and the acceptance criteria are established.

Test plan
The test plan is unique for this phase but very similar to previous test plans in format. The test plan document contains the test groupings of requirements. The requirements should be reused from function and system test. The naming and numbering of the requirements should be preserved from the other test phases for ease of identification. Test engineering should build a traceability matrix to reference the original document. If test engineering is using a subset of previous test groupings, a requirements traceability matrix probably does not gain them anything. If all test groupings are being reused, a requirements traceability matrix should be included in the test plan.

Test cases
The Test Case Document should contain the test cases. The test cases are reused from the function and system test. The naming and numbering of the test cases should be preserved to allow the requirements to be identified. A table format referencing the original document should be developed for traceability. If test engineering is only running a subset of previous test cases, a requirements traceability matrix probably does not gain them anything. If all test cases are being rerun, a requirements traceability matrix should be included in the test cases document. If the original test case format contained an entry for begin and end time, this information, in coordination with duration, personnel, and hardware resources, should be used to determine how many test cases can be executed during the acceptance test.

Acceptance Test: Conducting a Dry Run
Before the formal acceptance test execution begins, test engineering should conduct a dry run. The dry run may not be necessary if the installation tests were conducted and satisfy the criteria. If installation test was not conducted, test engineering must plan a dry run of some nature. The degree and extent of the dry run will depend not only on resources, but on the likelihood of encountering problems such as external interface failures or incorrect software and/or hardware installation.

The test manager should plan time for both the system engineers and software engineers to check out their installation procedures. Test engineering will also run a set of test cases to verify the installations. The dry run during installation, with respect to software test, is less of an issue than in the previous test phases, because the test cases for this phase are a subset of test cases from previous phases. However, if they are not, and a new set of test cases is being introduced, the dry run needs to be planned and conducted in accordance with the dry run activities given in the previous lesson.

If test engineering used simulators, stimulators, or emulators in place of a hardware component in a previous phase of testing, it should plan to conduct dry run activities in which those test cases are rerun against the real system components. Plan on problems with external interfaces the first time they are used in a live test. Test engineering should verify that the software and hardware are correctly installed by executing critical paths that exercise all software components.

Acceptance test techniques, tools, activities, and regression test scope are the same as for function test. Refer back to the function test section for those activities.

The acceptance test entry criteria and input are:

  1. Successful completion of installation test and regression test
  2. Intact and operational hardware and software test environment
  3. Installed formal configuration build

The acceptance test exit criteria and output are:

  1. 100% of test cases have been executed
  2. 100% of test cases are successful or written agreement with client indicating acceptance
  3. All problem reports resolved or a written agreement with client indicating acceptance
  4. Test report completed and delivered to client
  5. The client has accepted the system

The test manager and test engineers should have a set of forms for recording the test phases. Instructions for completing the forms, and the forms themselves, are provided as examples:

  1. Attendance record for formal qualification testing (FQT)
  2. Test procedure data sheet
  3. Test activity log
  4. Test result summary log
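
As an illustration of the kind of information these forms capture, the minimal sketch below defines a test activity log entry and derives a result summary from it. The field names and values are hypothetical examples, not taken from any specific FQT form.

    # Minimal sketch of a test activity log entry and a test result summary.
    # Field names and values are hypothetical examples, not from a specific FQT form.

    from collections import Counter
    from dataclasses import dataclass

    @dataclass
    class ActivityLogEntry:
        test_case_id: str
        start_time: str           # e.g. "09:15"
        end_time: str
        tester: str
        witness: str              # client or acceptance agency representative
        result: str               # "pass", "fail", or "waived"
        problem_report: str = ""  # problem report number, if one was written

    log = [
        ActivityLogEntry("TC-SYS-031", "09:15", "09:40", "J. Smith", "Client rep A", "pass"),
        ActivityLogEntry("TC-FUN-072", "09:50", "10:35", "J. Smith", "Client rep A", "fail", "PR-016"),
    ]

    # Test result summary: counts by result, for the summary log delivered with the test report.
    summary = Counter(entry.result for entry in log)
    print(dict(summary))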

© January 1, 2006 James C. Helm, PhD., P.E.