Tik-76.115 Software project

Streaming Media Proxy for Mobile Communication

Master Test Plan

 
Author(s): Eero Tuomikoski, Risto Sarvas
Version:  3.0
Phase: T4
Latest update: 20.3.2001
Document state: Published
Inspected by:
Inspection date:

Executive Summary

This document is the master test plan of the SMOXY project [SMOXY]. It contains a description of the overall testing process, the testing schedule, and initial plans for acceptance and system testing.

Acceptance testing measures the goals of the whole project described in the Project Plan [PROJPLAN] and the Requirement Specification [REQSPEC]. These high-level targets are set by the customer. Therefore, acceptance testing must be considered primarily from the customer's point of view, as a final approval of the quality of the delivered product. Due to the research-oriented nature of the project, one of the goals is comprehensive documentation. Acceptance testing also requires that the results of system testing be presented; this is one criterion for a successful exit of the project.

The acceptance test plan must be inspected by the customer in order to agree on common goals with the project team. The actual testing at the end of the project may also be performed together with a representative of the customer, if that is seen feasible; this is decided by the customer. The functional requirements stated in the Requirement Specification [REQSPEC] and revised in the Functional Specification [FUNCSPEC] are not targets of acceptance testing. The testing of these features is covered in detail by system testing.

The purpose of system testing is to let the customers and users check that the system fulfills their actual needs documented in the Requirement Specification [REQSPEC]. It concentrates on the non-functional requirements such as installability, usability, maintainability, modularity and documentation of the system. It is characteristic of all these features that measurable key factors are hard to define.

System testing is performed as black-box testing. In this approach, it is assumed that the implementation details of the system are hidden and unknown to the tester(s). Testing concentrates on the interfaces of the system, especially on its externally observable functionality and input-output behavior. In practice, this means that the tester takes the roles of the end user and the system operator.


Table of Contents

1. Introduction
2. Testing environment
3. Training
4. Process
5. Acceptance test cases
6. System test cases
7. Integration testing
8. Module testing
9. Exit criteria
10. Risks
11. References


1. Introduction

1.1 Scope

The purpose of this document is to define the testing process that the end product of the SMOXY project [SMOXY] will go through before delivery to the customer. The primary outputs of acceptance and system testing are the test summary reports, which present the outcome of the end product's evaluation. The evaluation is performed based on this document, which describes the acceptance testing environment, acceptance test cases, system test cases, responsibilities and success criteria. The document also defines the overall structure of testing (the V-model).

The end product of this project consists of an architecture for a streaming media proxy and a prototype that implements a set of the key nodes in that architecture. A major contribution is the project documentation: it defines the end user scenarios and use cases, the requirements, the functional and technical specifications, and the test plans and reports. Due to the research-oriented approach of the project, the quality of this documentation is considered essential. The design decisions, and the alternatives that were rejected, must be justified and documented. In other words, there is more emphasis on the documentation than on the actual program code developed. This emphasis is required by the customer.

The intended audience of this document is the customer, course personnel and project members.

1.2 Acronyms and definitions

All project specific terms and definitions are listed in the Glossary [GLOSSARY].

1.3 Changes from T2

In this version of the test plan, all testing has been rescheduled due to the test manager's one-month vacation. The testing planned for T3 has been moved to the beginning of T4, and the testing effort for phase T4 has therefore been doubled. In addition to the rescheduling, the acceptance test criteria have been defined. Guidelines for integration and module testing and their respective exit criteria are also given.

1.4 Changes from T3

The most significant change in this version is the refinement of the system test cases. They are now detailed enough that a person external to the group can verify the correctness of the system. An additional layer of test suites was introduced for both integration and module testing; the most important aspect of these is that they are fully automated with test scripts. Integration tests are targeted at inter-module functional verification, while module-level testing concentrates on the basic building blocks. Other changes include numerous smaller corrections, e.g. in chapter 2, 'Testing environment', which are mostly clarifications. Chapter 10, 'Risks', was updated to address the realized risks from the testing point of view.

2. Testing environment

2.1 Hardware

For testing, a standard Intel platform is used to host SMOXY. Testing also requires client devices. A laptop with a WLAN network card is used for that purpose. If this is not available, a standard PC with an Internet connection can be used.

2.2 Software

SMOXY runs on top of the Linux operating system.

Burana is used for error reporting when the project team finds it feasible. The target is set to start using it at the beginning of phase T4. 

Phase T4: considering the minimal benefits and the additional bureaucracy involved, the team decided not to use the Burana software for error reporting. Instead, the information is stored in the test reports as described in chapter 4.2, 'Defect handling'.

2.3 Tools

Testing tools are partly created by the project (mostly scripts, when found necessary), and readily available tools are also used. These are selected in phase T4.

The Netscape browser (4.0 or higher) is used as the client software, running on the Win98/WinNT operating system. A media player supporting the mpeg and avi formats (e.g. RealPlayer or Windows Media Player) is also required.

The CTC++ code coverage tool may be used if the white-box testing method is applied. This is decided when planning the testing of each phase individually.

Phase T4: All integration and module tests were automated. See chapters 7 and 8 for details.

3. Training

All project members are briefed about the testing methods used in this project. This information sharing guarantees that all participants are aware of the testing process and of the requirements it sets.

No other training is seen necessary at this stage. If new testing tools are introduced, more training may be scheduled if needed.

4. Process

4.1 Description

In this project, SMOXY software testing is considered from the very beginning and performed according to the V-model presented in picture 1. The testing process is organized in levels, which reflect the different viewpoints of testing at different levels of detail: the level of abstraction decreases as we move down the V. The motivation for using the V-model in testing is to locate software failures as early as possible.
Picture 1. V-model of software testing

At the bottom, testing concentrates on the individual modules and their functionality at the level of source code. The modules are developed independently of each other and can therefore be tested separately. Modules are not stand-alone programs, so special software must be created to test them; often, this software is a side-product of the implementation. Modules are tested against their design documents. Only the critical modules are tested in this project; these are defined in each phase separately. The end product of module testing is the corresponding test report.

Integration testing seeks to combine the different modules into a single subsystem and verify the correctness of their interoperability. The focus is on their interfaces and co-operation. Even if components work independently, new defects may be revealed when they are put together. Integration testing can be performed in two ways: by assembling all the modules together for execution at once, or by composing the system with a bottom-up approach. The latter, incremental way of organizing integration testing is preferred in this project, since it is easier to locate errors in smaller subsystems. Integration testing relies on the functional specification and the technical documentation, concentrating on the architecture point of view. In this project, integration testing covers SMOXY excluding clients and other external components, and the end product is a summarizing report.

At the system testing level, the whole system is tested. The entry criterion is that most of the software has already been covered by the testing of the previous levels. System testing looks after issues that may be overlooked at the previous, software-intensive testing levels. Typically, these include poor overall performance, capacity requirements, configuration, and recovery capability in error situations. These are mostly the non-functional requirements described in the requirement specification, but some of them are also traceable to the functional specification. Going beyond SMOXY, system testing includes external components, such as the clients, in its scope. A completed system test round is one of the prerequisites of acceptance testing.

In this project, acceptance testing verifies the high-level project goals. The emphasis is on the customer's point of view. The scope of acceptance testing was already discussed at the beginning of this document.

Black-box testing is preferred over white-box testing. In black-box testing, only the specified, externally visible functionality of the tested object is used. In other words, the system is started with some input x, the system's output f(x) is measured and compared against the predefined, specified output y. If f(x) = y, the system has passed the test.
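
As a purely illustrative sketch of this pass/fail decision, the following C++ fragment compares an observed output against the specified one. The function systemUnderTest() is a hypothetical stand-in and does not correspond to any real SMOXY interface.

    #include <iostream>
    #include <string>

    // Hypothetical stand-in for the tested object: maps an input x to an output f(x).
    // In real black-box testing this would be the externally observable behaviour of SMOXY.
    std::string systemUnderTest(const std::string& x)
    {
        return "f(" + x + ")";
    }

    // Black-box check: run the system with input x and compare the observed output f(x)
    // against the predefined, specified output y.
    bool blackBoxCheck(const std::string& x, const std::string& y)
    {
        const std::string observed = systemUnderTest(x);
        const bool passed = (observed == y);
        std::cout << (passed ? "OK  " : "NOK ") << "input=" << x
                  << " observed=" << observed << " expected=" << y << '\n';
        return passed;
    }

    int main()
    {
        // The test passes only if the observed output equals the specified output.
        return blackBoxCheck("x", "f(x)") ? 0 : 1;
    }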

If it is felt necessary, the project may perform white-box testing of certain key blocks of the code. Going through all of the software at the source code level is considered too exhausting an effort with the current resources of the project team, so white-box testing will be done only on selected areas if so decided later.

The white-box testing is partially covered by selected inspections of the source code. Code inspections are arranged for the critical parts of the software. These are identified in each phase separately.

4.2 Defect handling

The purpose of all testing and validation is to systematically discover defects, missing functionality and unfulfilled requirements in the developed software. Every found defect must ultimately lead to a correction of the behavior. In the SMOXY project, this is achieved by iterating the same steps at all levels of testing: a defect is found and reported, the report is prioritized, the defect is fixed, and the fix is analyzed and verified.

The following template is used to report any irregularities found while running system, integration or module test cases. The information needed to resolve or reject a defect report is recorded together with the original report.

Date:            [yyyy-mm-dd]
Test identifier: [unique test case id]
Status:          [OK, NOK]
Priority:        [H, M, L]
Tester:          [tester id]

Problem:
[Describe how the problem occurred, what it was, whether it can be repeated, anything special in the environment, or whether the test case itself is faulty.]

Fix date:
[yyyy-mm-dd]

Fixed by:
[guru id]

Analysis:
[Describe the actual problem in detail. How was it fixed? Was it actually not a problem? If it was not fixed, explain why. Was it a problem in third-party software?]
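
A purely hypothetical example of a filled-in report follows (the identifier, dates and contents are illustrative only, not an actual defect):

Date:            2001-03-15
Test identifier: TEST-PP-02
Status:          NOK
Priority:        M
Tester:          [tester id]

Problem:
Smoxy responded with a packet of type EPP_OK instead of EPP_ERROR when an invalid user id was sent. The problem is repeatable in the standard testing environment.

Fix date:
2001-03-16

Fixed by:
[guru id]

Analysis:
The user id check accepted empty ids. The check was corrected and the test case now passes.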

 

4.3 Resources and timing

Picture 2. Testing effort scheduling.

Starting from the bottom of the V-model, module testing is planned and performed by the persons who develop the corresponding source code. This is feasible because these are the persons who have the most knowledge of the required functionality. On the other hand, it violates the basic principle that testing must be done independently, since programmers are biased by their own solutions and blind to their own errors. However, in this project there are no resources to arrange external testing of the modules. As at every testing level, it must be documented what was tested and what the results were. Module testing starts at the end of project phase T3, when the first source code blocks are created, and it continues until the last component is finished in phase T4. In this project, module testing is the responsibility of the developers of the modules.

Integration testing starts during phase T4, when module-tested components become available. For this, the persons testing the software are different from those developing it, if the division of work allows that. It is very beneficial to have developers test each other's code, since they have the needed experience of the structure of the software; full-time testers often lack this information, especially if the knowledge transfer has not fully taken place yet. Integration testing is an incremental process starting with the basic skeleton of the software and ending up with the full system. Therefore, it lasts until the end of the last implementation phase, T4, and is performed by a person responsible for integration testing (to be named).

For system testing, the primary resources are the persons not so heavily involved with the source code development. Having a somewhat more external point of view, they do not bear the weight of ownership. The use of testing resources from outside the project could be achieved by temporarily exchanging persons with another project group; the feasibility of this idea is being studied. System test planning was started at the end of phase T2, while the actual system testing is performed in phase T4, with rescheduled work extending into phase LU. System testing is the responsibility of the system tester (to be named).

Acceptance testing also involves the customer, in the delivery phase LU. It is in the customer's interest to oversee the final testing rounds in order to become familiar with the end product and to evaluate the quality of the product and, more importantly, the quality of the project documentation. This includes estimating the depth, the level of research, and the value of the documentation to the customer. In practice, acceptance testing starts already at the end of phase T4 and ends a few weeks before the end of the whole project.

At the end of the phase T2, the project presents the first demo of the software. A separate design document discusses also the testing aspects of the demo [DEMODOC].

The need for code inspections is evaluated in each phase separately. For more detailed resource allocation in testing, the project plan is revised in the beginning of each phase to describe the division of work.

5. Acceptance test cases

This chapter presents the pass criteria for the project's end product, as defined in the Requirement Specification [REQSPEC]. The pass criteria are divided mainly between the documentation and the tested prototype. All returned documentation is in English and in HTML format. UML is used in the technical documentation. To pass the acceptance test, all of the following requirements (5.1 - 5.4) must be fulfilled.

5.1 Scenarios and use cases

The customer reads through and accepts the Use Case and Scenarios documents. These two documents are not required by the course and are therefore presented as a separate test case.

5.2 Architecture documentation

The customer reads through and accepts all of the following documents. These are the documents required by the course and directly linked to the designed architecture. The version of each document is given in parentheses.

5.3 Prototype of the architecture

The executable prototype is tested against the tests defined in the test plan. These tests are called System tests. The prototype passes the System tests within the defined pass criteria. The pass criteria are documented in the test plan. The customer has accepted the test plan prior to the testing.

5.4 Organized documentation

In the Requirement Specification [REQSPEC], there is a strong emphasis on the documentation of the project. Therefore this acceptance test is designed to test the organization and presentation of all documentation provided by the project group. The project group must provide the documentation in HTML format, and it must include all the documentation mentioned in test cases 5.1 and 5.2. In addition to these documents, the Project Plan is presented, together with special documentation concerning the development process of the system, i.e. how certain decisions were made and which alternatives were discarded.

6. System test cases

6.1 Introduction

The purpose of system testing is to verify the product against the requirements in the Functional Specification [FUNCSPEC]. In system testing, all parts of the system (including client devices, content servers and databases) are brought together, under the assumption that the software-intensive integration and module testing has already been performed. Secondly, system testing aims to reveal problems that arise from the interoperation of the different parts. Furthermore, it reveals problems that are not quantitative in nature; the most significant of these in this project is the usability perspective.

The system test cases are defined in such detail that they can be executed without a tremendous effort and without deeper knowledge of how the SMOXY system works. This is done bearing in mind the customer's need to verify the correctness of the system. Their second purpose is to facilitate the external testing effort performed by the opponent group at the beginning of phase LU. System tests are labor-intensive due to the low level of automation.

System testing does not attempt to solve defects that are encountered in third-party software used in testing.

Installation is not tested, because the implementation is not yet available; installation testing is scheduled for the last phase, LU. Therefore, the tester must be provided with a preinstalled version of SMOXY.
 
System test parameters (values are filled in before testing; the expected format is given in brackets):

Parameter          Value                 Format
CLIENT             Netscape 4 or higher  [browser]
SMOXY                                    [ip]
ADMIN_ACCOUNT                            [id:passwd]
FRONTDOOR                                [ip:port]
DIRECTUI           n/a                   [url]
PORTAL             n/a                   [url]
CONF_UTIL                                [web pages]
USER_ACCOUNT                             [id:passwd]
LOGS                                     [collection of files]
SMALL_STREAM                             [url]
MEDIUM_STREAM                            [url]
LARGE_STREAM                             [url]
COLOR_STREAM                             [url]
INCORRECT_STREAM                         [url]
AVI_STREAM                               [url]
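
A purely illustrative example of filled-in values for one test round (all addresses, accounts and URLs below are hypothetical placeholders, not actual project settings):

CLIENT             Netscape 4.7 on WinNT          [browser]
SMOXY              192.0.2.10                     [ip]
ADMIN_ACCOUNT      admin:secret                   [id:passwd]
FRONTDOOR          192.0.2.10:8080                [ip:port]
DIRECTUI           n/a                            [url]
PORTAL             n/a                            [url]
CONF_UTIL          http://192.0.2.10/conf/        [web pages]
USER_ACCOUNT       tester1:test                   [id:passwd]
LOGS               /var/log/smoxy/*               [collection of files]
SMALL_STREAM       http://example.com/small.mpg   [url]
MEDIUM_STREAM      http://example.com/medium.mpg  [url]
LARGE_STREAM       http://example.com/large.mpg   [url]
COLOR_STREAM       http://example.com/color.mpg   [url]
INCORRECT_STREAM   http://example.com/broken.mpg  [url]
AVI_STREAM         http://example.com/clip.avi    [url]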

6.2 Test cases

Each test case is identified by an individual identifier. The priority states the relative importance of an individual test. It is recommended that the test cases be executed in the presented order.
 
Category: SYSTEM    Identifier: TEST-ADMIN-01    Priority: Medium    Refers to requirement: FUNC-ADMIN-01
Description

Start SMOXY.

Input

Log in to the [SMOXY] host with [ADMIN_ACCOUNT]. Enter the commands 'smoxy.x' and 'proxyd.x localhost 6999 ~/usr/db' in different windows. (Stopping SMOXY requires entering CTRL-C in these windows.)

Output

SMOXY has started. [LOGS] show timestamps for this event.

 

Category: SYSTEM    Identifier: TEST-PROFILE-01    Priority: Medium    Refers to requirement: FUNC-PORTAL-03
Description

The user configures his profile in SMOXY. Changes are applied by the SMOXY system.

Input

Go to [CONF_UTIL]. Enter your [USER_ACCOUNT] information. Define the client to have a black-and-white display. Save the modification. Reload your profile.

Output

The profile must contain the modification on the second loading.

 

Category: SYSTEM    Identifier: TEST-PROFILE-02    Priority: Medium    Refers to requirement: FUNC-PORTAL-03
Description

The user tries to make a non-permitted change in his profile in SMOXY. Changes are not applied by the SMOXY system.

Input

Go to [CONF_UTIL]. Enter your [USER_ACCOUNT] information. Select a non-permitted combination of filters. Save the modification.

Output

The system must not allow the modification to be applied.

 

Category: SYSTEM    Identifier: TEST-ASPROXY-01    Priority: Low    Refers to requirement: FUNC-PORTAL-02
Description

SMOXY must require the user to authenticate himself. With correct user id/password combination, SMOXY returns the requested page.

Input

Configure your [CLIENT] to use the proxy settings [FRONTDOOR]. Enter [SMALL_STREAM] into the location field of your [CLIENT]. SMOXY returns with a proxy authentication window. Enter the [USER_ACCOUNT] parameters.

Output

Access to the SMOXY system is granted and the requested page is returned. [LOGS] show that user was identified.

 

Category: SYSTEM    Identifier: TEST-ASPROXY-02    Priority: Low    Refers to requirement: FUNC-PORTAL-02
Description

SMOXY must require the user to authenticate himself. With incorrect user id/password combination, the usage of SMOXY is denied.

Input

Verify that your [CLIENT] uses proxy settings [FRONTDOOR]. Connect to [SMALL_STREAM]. SMOXY returns with a proxy authentication window. Enter an incorrect password for [USER_ACCOUNT].

Output

Access to the SMOXY system is not granted. [LOGS] show that user was not identified.

 

Category: SYSTEM    Identifier: TEST-ASPROXY-03    Priority: High    Refers to requirement: FUNC-PORTAL-01
Description

SMOXY is configured to be used as a proxy and a video stream is fetched.

Input

Verify that your [CLIENT] uses proxy settings [FRONTDOOR]. Enter [SMALL_STREAM] to your [CLIENT]. SMOXY returns with a proxy authentication window. Enter [USER_ACCOUNT] parameters.

Output

The video stream is delivered, and it is displayed. [LOGS] show that stream was delivered through SMOXY. The size of the received file must be the same as the original.

 

Category: SYSTEM    Identifier: TEST-FRONTDOOR-01    Priority: High    Refers to requirement: n/a
Description

Recovery from a non-existent server.

Input

Connect to a URL that points to a server which does not exist.

Output

A proper error message is displayed in the client.

 

Category: SYSTEM    Identifier: TEST-FRONTDOOR-02    Priority: High    Refers to requirement: n/a
Description

Request cancelled by the user. SMOXY system performs a proper cleanup.

Input

Connect to fetch [SMALL_STREAM]. During the download, cancel action by stopping or shutting down the client.

Output

SMOXY must recognize losing the client. Further connections are accepted and processed without interference.

 

Category: SYSTEM    Identifier: TEST-FRONTDOOR-03    Priority: High    Refers to requirement: n/a
Description

Fetching a medium-sized video stream.

Input

Verify that your [CLIENT] uses proxy settings [FRONTDOOR]. Enter [MEDIUM_STREAM] location to your [CLIENT]. SMOXY returns with a proxy authentication window. Enter [USER_ACCOUNT] parameters.

Output

The video stream is delivered, and it is displayed. [LOGS] show that stream was delivered through SMOXY.

 

Category: SYSTEM    Identifier: TEST-FRONTDOOR-04    Priority: High    Refers to requirement: n/a
Description

Fetching a large video stream.

Input

Verify that your [CLIENT] uses proxy settings [FRONTDOOR]. Enter [LARGE_STREAM] location to your [CLIENT]. SMOXY returns with a proxy authentication window. Enter [USER_ACCOUNT] parameters.

Output

The video stream is delivered, and it is displayed. [LOGS] show that stream was delivered through SMOXY.

 

Category: SYSTEM    Identifier: TEST-ENGINE-01    Priority: High    Refers to requirement: FUNC-CONTROL-01, FUNC-CONTROL-03
Description

Connection to the external user profile storage is tested. SMOXY is able to retrieve the profile corresponding to the user's identity, when a request originating from this user is received.

Input

Go to [CONF_UTIL]. Log in with [USER_ACCOUNT]. Modify the profile to accept only black-and-white streams. Save the modifications. Fetch [COLOR_STREAM].

Output

A black-and-white stream must be received. [LOGS] show that the stream was delivered through SMOXY.

 

Category: SYSTEM    Identifier: TEST-ENGINE-02    Priority: High    Refers to requirement: FUNC-ENGINE-01, FUNC-CONTROL-03
Description

Conversion from one video stream format to another, avi to mpeg, is tested. In the user profile, the client is required to receive all streams in mpeg format. This implies that SMOXY must convert the outgoing stream to mpeg regardless of the incoming stream type.

Input

Fetch [AVI_STREAM].

Output

The user receives an mpeg stream. This is visible in RealPlayer, and [LOGS] show that the stream was delivered through SMOXY.

 

Category: SYSTEM    Identifier: TEST-ENGINE-03    Priority: High    Refers to requirement: n/a
Description

The recovery from a broken stream is tested. A proper stream has been modified to contain corrupted data.

Input

Fetch [INCORRECT_STREAM].

Output

SMOXY must recognize the error situation and recover.

 

Category: SYSTEM    Identifier: TEST-PORTAL-01    Priority: High    Refers to requirement: FUNC-PORTAL-01
Description

The tester is provided with a web page that contains a single link pointing to a stream. The link has a special format that actually connects to a SMOXY frontdoor, but this is not visible to the tester. Selecting the link results in the stream being delivered to the client. The stream is delivered as it is.

Input

Verify that your [CLIENT] does not use proxy settings. Use your browser [CLIENT] to select the single link from web page [PORTAL] that points to [SMALL_STREAM].

Output

The video stream is delivered, and it is displayed. [LOGS] show that stream was delivered through SMOXY.

 

Category: SYSTEM    Identifier: TEST-DIRECTUI-01    Priority: High    Refers to requirement: FUNC-DIRECTUI-01
Description

The tester is provided with a web page [DIRECTUI] that contains an input field. The tester enters the [SMALL_STREAM] location. The information is posted to a SMOXY frontdoor, but this is not visible to the tester. The stream is delivered (unmodified) to the client and displayed.

Input

Verify that your [CLIENT] does not use proxy settings. Go to [DIRECTUI]. Fetch a video stream by giving a URL and choosing 'Get it'.

Output

The user gets the requested stream. [LOGS] show that stream was delivered through SMOXY.

 

7. Integration testing

7.1 Introduction

Integration testing is done in two parts: the first part starts immediately at the beginning of phase T4, and the second part approximately in the middle of phase T4. The first part tests the integration of the key modules defined in the first part of module testing; naturally, the integration testing of these modules starts only after their module testing. The second part tests the modules of the first integration tests and, in addition, the next most important modules. The main difference between the second integration test and system testing is that in system testing actual databases, clients etc. are used, whereas in integration testing these external components are represented by stubs. The integration testing is designed in more detail at the beginning of phase T4.

7.2 Test cases

All integration tests are automated; they consist of executable programs. This is a valuable asset when using an iterative approach to software development, since regression testing is required when new features are introduced gradually. The lifetime of implementation bugs is expected to shorten.

The first set of integration tests covers the proprietary packet protocol (PP) between a Frontdoor and the Smoxy Engine. This is a traditional client-server setup, with the Frontdoor acting as the client and the Engine as the server.

The second set of tests verifies the functionality when the Database is integrated with the Frontdoor and the Engine.

7.2.1 First set

The first set of test cases covers the testing of the Smoxy Engine, which consists of the modules network.h, connection.h, parallel.h and stream.h. For this purpose, a test implementation of the Frontdoor was created. The Frontdoor sends messages to the Smoxy Engine with the packet protocol (PP). The purpose of the created Frontdoor is to test the strength of the server-side implementation of PP. Smoxy's capability to handle parallel requests is also tested. See chapter 4.1.1.1 in [TECHSPEC] for details.
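
The following is a minimal C++ sketch of such a test driver, roughly corresponding to TEST-PP-01 below. The packet layout (a one-byte type, a two-byte payload length in network byte order, and the payload) and the numeric type values are assumptions made for illustration only; the authoritative definition of PP is in chapter 4.1.1.1 of [TECHSPEC], and the address and port are placeholders.

    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <unistd.h>
    #include <cstdint>
    #include <cstdio>
    #include <cstring>
    #include <string>
    #include <vector>

    // Assumed packet type values; the real values are defined in [TECHSPEC].
    enum { EPP_USERID = 1, EPP_OK = 2, EPP_ERROR = 3 };

    // Assumed wire format: 1-byte type, 2-byte payload length (network order), payload.
    static std::vector<unsigned char> buildPacket(unsigned char type, const std::string& payload)
    {
        std::vector<unsigned char> pkt;
        pkt.push_back(type);
        const std::uint16_t len = htons(static_cast<std::uint16_t>(payload.size()));
        const unsigned char* lenBytes = reinterpret_cast<const unsigned char*>(&len);
        pkt.insert(pkt.end(), lenBytes, lenBytes + sizeof(len));
        pkt.insert(pkt.end(), payload.begin(), payload.end());
        return pkt;
    }

    int main()
    {
        // Connect to the Smoxy Engine (address and port are placeholders).
        const int fd = socket(AF_INET, SOCK_STREAM, 0);
        sockaddr_in addr;
        std::memset(&addr, 0, sizeof(addr));
        addr.sin_family = AF_INET;
        addr.sin_port = htons(6999);
        inet_pton(AF_INET, "127.0.0.1", &addr.sin_addr);
        if (fd < 0 || connect(fd, reinterpret_cast<const sockaddr*>(&addr), sizeof(addr)) < 0) {
            std::perror("connect");
            return 2;
        }

        // TEST-PP-01: send a packet of type EPP_USERID with a valid user id
        // and expect a response packet of type EPP_OK.
        const std::vector<unsigned char> pkt = buildPacket(EPP_USERID, "valid-user-id");
        send(fd, pkt.data(), pkt.size(), 0);

        unsigned char replyType = 0;
        const bool ok = (recv(fd, &replyType, 1, 0) == 1) && (replyType == EPP_OK);
        std::printf("TEST-PP-01 %s\n", ok ? "OK" : "NOK");

        close(fd);
        return ok ? 0 : 1;
    }

In the actual test suite, each TEST-PP case would be a similar small program or script whose exit status can be collected by the automation.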

Category: INTEGRATION    Identifier: TEST-PP-01    Priority: Medium    Refers to requirement: n/a
Description

Successful client identification.

Input

Send a [VALID_USERID] using packet of type EPP_USERID.

Output

Smoxy must respond with packet of type EPP_OK.

 

Category: INTEGRATION    Identifier: TEST-PP-02    Priority: Medium    Refers to requirement: n/a
Description

Unsuccessful client identification.

Input

Send an [INVALID_USERID] using packet of type EPP_USERID.

Output

Smoxy must respond with packet of type EPP_ERROR.

 

Category: INTEGRATION    Identifier: TEST-PP-03    Priority: Medium    Refers to requirement: n/a
Description

Send a query to Smoxy to deliver a stream identified with a valid host:port combination and request message.

Input

Send a packet of type EPP_STREAM_LOCATION containing [VALIDCS:PORT]. Send a second packet of type EPP_STREAM_REQUEST containing a valid HTTP request.

Output

Smoxy must respond with EPP_STREAM_DATA packets. The last packet of transmission must be EPP_EOT.

 

Category: INTEGRATION    Identifier: TEST-PP-04    Priority: Medium    Refers to requirement: n/a
Description

Send a query to Smoxy to deliver a stream identified with an invalid host:port combination and request message.

Input

Send a packet of type EPP_STREAM_LOCATION containing [INVALIDCS:PORT]. Send a second packet of type EPP_STREAM_REQUEST containing a valid HTTP request.

Output

Smoxy must respond with packet of type EPP_ERROR.

 

Category: INTEGRATION    Identifier: TEST-PP-05    Priority: Medium    Refers to requirement: n/a
Description

Send a stream to Smoxy that it can handle.

Input

Send a packet of type EPP_STREAM_DIRECTLY to Smoxy, followed by a packet of type EPP_STREAM_TYPE with data [KNOWN_STREAM].

Output

Smoxy must respond with packet of type EPP_OK.

 

Category: INTEGRATION    Identifier: TEST-PP-06    Priority: Medium    Refers to requirement: n/a
Description

Send a stream to Smoxy that it cannot handle.

Input

Send a packet of type EPP_STREAM_DIRECTLY to Smoxy, followed by a packet of type EPP_STREAM_TYPE with data [UNKNOWN_STREAM].

Output

Smoxy must respond with packet of type EPP_ERROR.

 

Category: INTEGRATION    Identifier: TEST-PP-07    Priority: Medium    Refers to requirement: n/a
Description

Perform TEST-PP-05. Continue sending stream data until end of stream.

Input

Send EPP_STREAM_DATA packets and terminate with EPP_EOT packet.

Output

Smoxy must respond with EPP_STREAM_DATA packets. The last packet of transmission must be EPP_EOT.

 

Category: INTEGRATION    Identifier: TEST-PP-08    Priority: Medium    Refers to requirement: n/a
Description

Perform TEST-PP-05. Send user identification, which violates the protocol messaging sequence.

Input

Send EPP_USERID.

Output

Smoxy must respond with packet of type EPP_ERROR.

 

Category: INTEGRATION    Identifier: TEST-PP-09    Priority: Medium    Refers to requirement: n/a
Description

Use two frontdoors to run parallel requests.

Input

Run test TEST-PP-05 with two frontdoors.

Output

Smoxy must respond with EPP_STREAM_DATA packets to both frontdoors. The last packet of each transmission must be EPP_EOT.

7.2.2 Second set

Not available yet due to the pending implementation of Database in SMOXY.

8. Module testing

8.1 Introduction

Module testing is also divided into two parts. First, the key modules (about three modules) are tested as soon as the tests are designed at the beginning of T4; these modules are then integration tested in the first part of integration testing. After that, the rest of the modules are tested, and their integration is tested in the second part of integration testing. The actual modules and their tests are defined in more detail at the beginning of phase T4.

8.2 Test cases

Tested modules: connection.h, network.h

Testing is based on creating test programs that use the functions defined in the module header files. Both tested modules are used for communicating over TCP sockets and form the core of the SMOXY system. A straightforward server implementation was created to act as a peer for both modules. With this setup it was possible, for example, to test the transparency of the modules: when data is sent to a server that echoes all received data back, it can be verified that the data remained unchanged.
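
A minimal C++ sketch of such a transparency check is shown below. It uses plain POSIX sockets rather than the project's connection.h and network.h wrappers, whose exact interfaces are specified in [TECHSPEC]; the echo server address and port are placeholders.

    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <sys/types.h>
    #include <unistd.h>
    #include <cstdio>
    #include <cstring>

    // Send a known buffer to an echo server and verify that exactly the same bytes
    // come back, i.e. that the transport path does not modify the data.
    int main()
    {
        const char out[] = "SMOXY transparency test payload 0123456789";

        const int fd = socket(AF_INET, SOCK_STREAM, 0);
        sockaddr_in addr;
        std::memset(&addr, 0, sizeof(addr));
        addr.sin_family = AF_INET;
        addr.sin_port = htons(7000);                  // placeholder echo server port
        inet_pton(AF_INET, "127.0.0.1", &addr.sin_addr);
        if (fd < 0 || connect(fd, reinterpret_cast<const sockaddr*>(&addr), sizeof(addr)) < 0) {
            std::perror("connect");
            return 2;
        }

        send(fd, out, sizeof(out), 0);

        // Read back as many bytes as were sent and compare them with the original.
        char in[sizeof(out)] = {0};
        std::size_t got = 0;
        while (got < sizeof(out)) {
            const ssize_t n = recv(fd, in + got, sizeof(out) - got, 0);
            if (n <= 0) break;
            got += static_cast<std::size_t>(n);
        }
        const bool ok = (got == sizeof(out)) && (std::memcmp(out, in, sizeof(out)) == 0);
        std::printf("MODULE transparency check %s\n", ok ? "OK" : "NOK");

        close(fd);
        return ok ? 0 : 1;
    }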

The test cases are automated with scripts and presented in the Test Report [TESTREPORT] document.

9. Exit criteria

9.1 Acceptance test exit criteria

The acceptance test report summarizes the results of testing at the end of the project. It is noted here that, due to the nature of the course (it ends at a predefined date), there is no iteration round after the presentation of the acceptance test results. As a countermeasure, a representative of the customer is involved in the acceptance testing during phase LU. This reduces the risk that defects found in acceptance testing would be left unresolved.

9.2 System test exit criteria

There are two exit criteria for system testing.

[1] There MUST NOT be any failing system tests of priority "HIGH" (chapter 6).

[2] All system test cases MUST be executed at least once in phase T4.

System testing is accepted by the project manager. This is enough, since one of the acceptance tests (approved by the customer) requires that the results of system testing are presented.

9.3 Integration test exit criteria

Phase T3: The test manager (Eero Tuomikoski) decides when the integration testing is passed. The programmers (Jukka Fiskari and Kari Jaatinen) are responsible for providing the test code and helping the test manager with the testing. After the test cases have been designed, concrete exit criteria can be given.

Phase T4: The exit criterion for integration testing is that 80% of the test cases must be executed successfully.

9.4 Module test exit criteria

Phase T3: The test manager (Eero Tuomikoski) decides when each module has passed its tests. The programmers (Jukka Fiskari and Kari Jaatinen) are responsible for providing the test code and for testing the modules under the test manager's supervision. After the test cases have been designed, concrete exit criteria can be given.

Phase T4: The exit criterion for module testing is that 80% of the test cases must be executed successfully.
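
To make the 80% criterion used in 9.3 and 9.4 concrete, the following sketch computes the pass rate from a results file. The file format (one 'test-id OK|NOK' pair per line) is an assumption about how the automation scripts could record their results, not a format defined elsewhere in the project.

    #include <fstream>
    #include <iostream>
    #include <string>

    // Reads a results file with one "test-id OK|NOK" pair per line and checks
    // whether at least 80% of the executed test cases passed.
    int main(int argc, char** argv)
    {
        std::ifstream results(argc > 1 ? argv[1] : "results.txt");
        int total = 0;
        int passed = 0;
        std::string id, status;
        while (results >> id >> status) {
            ++total;
            if (status == "OK") {
                ++passed;
            }
        }
        if (total == 0) {
            std::cout << "No test results found.\n";
            return 2;
        }
        const double rate = 100.0 * passed / total;
        std::cout << passed << "/" << total << " test cases passed (" << rate << "%)\n";
        return rate >= 80.0 ? 0 : 1;   // exit criterion: at least 80% of executed cases pass
    }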

10. Risks

10.1 Risk analysis

[1] The testing effort is too large compared to the schedule.
Countermeasure: Plan well beforehand, prioritize, drop something and reschedule.

[2] Testing does not find critical defects.
Countermeasure: Use the V-model, which proceeds from the bottom up. Divide testing into manageable tasks. Communicate actively with other team members and try to identify the critical parts. Use an iterative approach.

[3] Testing tools are missing or testers don't know how to use them.
Countermeasure: Select tools in advance and arrange training.

[4] The test manager takes a one-month vacation (this risk has already materialized).
Countermeasure: The testing is rescheduled, and the testing effort in phase T4 is doubled.

10.2 Realized risks

Phase T3: The person responsible for testing (Eero Tuomikoski) was on vacation for the month of January 2001; therefore the schedule has been changed from the one presented at the end of phase T2. Due to this rescheduling, the T4 phase is heavily loaded with testing.

Phase T4: The decision to automate all integration and module test cases took more time than estimated at the beginning of the phase. This explains the relatively small total number of tests. Especially in integration testing, effort has been concentrated on issues that are more likely to fail than to succeed; this explains the lack of test cases for functionality that is already well proven to be correct. These are addressed again in phase LU. The second realized risk was that the Database design and implementation were not available early enough to leave time to prepare test cases for it.

11. References

[DEMODOC] SMOXY Phase T2 Demo Documentation
http://amnesia.tky.hut.fi/smoxy/t2/eTLA_DemoDoc_T2.html
[FUNCSPEC] SMOXY Functional Specification
http://amnesia.tky.hut.fi/smoxy/t4/eTLA_FuncSpec_T4.html
[GLOSSARY] SMOXY Glossary
http://amnesia.tky.hut.fi/smoxy/t3/glos.html
[PROJPLAN] SMOXY Project Plan
http://amnesia.tky.hut.fi/smoxy/t4/eTLA_PPlan_T4.html
[REQSPEC] SMOXY Requirement Specification
http://amnesia.tky.hut.fi/smoxy/t3/eTLA_reqs_T3.html
[SMOXY] SMOXY Home Page
http://amnesia.tky.hut.fi/smoxy/
[TECHSPEC] SMOXY Technical Specification
http://amnesia.tky.hut.fi/smoxy/t4/eTLA_TechSpec_T4.html
[TESTREPORT] SMOXY Test Report
http://amnesia.tky.hut.fi/smoxy/t4/eTLA_TestReport_T4.html