Tik-76.115 Test Plan, v5.1

Group: Hayabusa, Work: Monrovia, Responsible: Jarmo Mäki

Last modified 2001-04-24 - Jarmo Mäki

Change log


Table of Contents

1. Introduction
1.1 Purpose and scope
1.2 Product and environment
1.3 Definitions, Terminology and Abbreviations
1.4 References
2. Environment requirements
2.1 Hardware
2.2 Software
2.2.1 Server
2.2.2 Workstation
2.2.3 Palm
2.3 Security
2.4 Tools
3. Staff and their education
3.1 Staff
3.1.1 Code writers
3.1.2 System testers
3.2 Training
4. Test categories
4.1 Module testing
4.1.1 Responsibilities for module testing
4.1.2 Test sequence for module testing
4.2 Integration testing
4.2.1 Responsibilities for integration testing
4.2.2 Test sequence for integration testing
4.3 System testing
4.3.1 Functional testing
4.3.1.1 Architecture
4.3.1.2 Game requirements
4.3.2 Performance testing
4.3.2.1 Server performance
4.3.2.2 Client performance
4.3.3 Usability testing
4.3.4 Responsibilities for system testing
4.3.5 Test sequence for system testing
5. Test Report
5.1 Module and integration testing
5.2 System testing
6. Running test cases
6.1 Test grades
6.2 Methods and techniques
6.3 Scope
6.4 Restrictions
6.5 Testing application related parts
6.6 Testing the system
6.6.1 Functionality, ergonomics and clarity
6.7 Testing the recovery of the system
6.8 Testing the performance
6.9 Regression testing
7. Passing and failing tests
7.1 Passing a test
7.2 Failing a test
7.3 Requirements to interrupt a test
7.4 Requirements to carry on a test
7.5 Requirements to finish a test
8. Risk control
8.1 Methods
8.2 Goals
8.3 Informal list of possible risks
8.4 Risk scenarios
8.5 The worst cases
8.6 Summary
9. Schedule
9.1 Schedule for T2
9.2 Schedule for T3
9.3 Schedule for T4
9.4 Schedule for T5
10. Acceptors
10.1 Tests and test cases
10.2 Final version
Appendix A


1. Introduction

1.1 Purpose and scope

This document is the test plan for the Monrovia project, which was ordered by Mgine Technologies (formerly known as Done Wireless Oy). It describes how the testing is carried out.

1.2 Product and environment

Monrovia is a multi-player, turn-based game platform for Palm clients. Its purpose is to show whether it is feasible to build this kind of a system.

1.3 Definitions, Terminology and Abbreviations

Palm
A handheld computer that is described in the Functional Specification.
TCP/IP
The wide-area-networking protocol suite that makes the Internet work. [DICT: TCP/IP]
SSH
A Unix shell program for logging into and executing commands on a remote computer. [DICT: SSH]
MUD
Multi-User Dungeon. It is a class of virtual reality experiments accessible via the Internet. These are real-time chat forums with structure; they have multiple 'locations' like an adventure game, and may include combat, traps, puzzles, magic, a simple economic system and the capability for characters to build more structure onto the database that represents the existing world. [DICT: MUD]
POSE
Palm OS Emulator. A Palm emulator provided by Palm. The emulator runs on a Windows machine. The Linux version is too unstable to use at the moment.
KVM
K Virtual Machine. The Java virtual machine used to run the J2ME code. The virtual machine is installed on the Palm device. There is also a Linux version of the program, which can be used to test the client software instead of POSE.

1.4 References

[DICT: keyword]
Some of the definitions were taken from dictionaries in DICT (<URL: http://www.dict.org/>) with the given keyword. (2000-11-01).

2. Environment requirements

2.1 Hardware

The hardware required for testing the different releases of the game consists of the Monrovia server at Mgine and a standard PC with 100 MB of free hard disk space and otherwise reasonable equipment. The PC should have access to the Internet.
The Monrovia server is described in the Technical Specification document.
A handheld Palm computer is needed especially in the system tests.

2.2 Software

2.2.1 Server

The operating system of the server is Red Hat Linux 6.2. There should be an SSH daemon running on the server and the J2SE libraries installed. The software parts being tested are copied to the ~/hayabusa directory and can be accessed from there.

2.2.2 Workstation

A standard PC is required for tests that use the Palm emulator. The operating system of the PC is Windows 95/98/2000/NT.
The PC requires an SSH client program to establish the terminal connection to the server. The user copies the client software, including Palm OS Emulator version 3.0a8, from the Monrovia server. If the emulator included in the Sun KVM distribution is used, it must be installed on the workstation.

2.2.3 Palm

The operating system of the Palm is Palm OS.

2.3 Security

We use SSH authentication to connect to the Monrovia server. The workstations used to test the game are mostly our home computers, which have no public access. The server is backed up on tape every day, so if something vital is lost it can be restored from tape.

2.4 Tools

The test personnel use a test report to log test runs. The purpose of the test report is to collect precise information about the test cases: when each test was carried out, who performed it, how long it took, what its result was and whether it passed or not.
The map which is generated for the tests can be found in ~/hayabusa/monrovia/game/data.
Tools provided by the course, such as Burana and Tirana, are used to report errors and working hours.

3. Staff and their education

Persons who run the test cases are members of our group. The final version of the game will also be tested by employees of Mgine. Mgine employees perform only functional and usability tests.

3.1 Staff

The test personnel are divided into code writers and system testers.

3.1.1 Code writers

Code writers test that the code they have written is correct. They are responsible for module testing as described in chapter 4. Code writers also perform the integration tests.

3.1.2 System testers

There is a dedicated test group to test the whole system as described in chapter 4. The system testers test that the game works as described in the requirements specification.

3.2 Training

The members of our group do not require any special training to perform the test cases. The Mgine employees will be carefully instructed in what they should do, and if technical training is needed, it will be provided within the bounds of our resources.

4. Test categories

4.1 Module testing

Module testing is done by using debug messages to check that the written code produces the wanted results. These debug messages can be enabled and disabled by changing the value of a boolean variable in the code.
Another important requirement for module testing is that the code must compile cleanly.
All modules specified in the Technical Specification will be tested by the code writers.
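
As an illustration, a debug flag of the kind described above could look like the following minimal sketch. The class and flag names are ours, not necessarily those used in the Monrovia code.

    public class Debug {
        // Set to false to silence all debug output before a release build.
        public static final boolean ON = true;

        public static void msg(String text) {
            if (ON) {
                System.out.println("[debug] " + text);
            }
        }
    }

A module would then call, for example, Debug.msg("move parsed") at the points it wants to verify. Because ON is a compile-time constant, guards written as if (Debug.ON) { ... } at the call site are removed as dead code when the flag is false.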

4.1.1 Responsibilities for module testing

Code writers carry out the module testing. Code writers are Anssi Kanninen, Christian Jalio, Ilpo Nyyssönen and Joni Pajarinen.

4.1.2 Test sequence for module testing

Module testing is done in T2, T3, T4 and T5 phases. Module testing is performed before integration testing and system testing.

4.2 Integration testing

Integration testing aims to verify that the server side of the protocol interoperates with the client side of the protocol interface. If required, the integration of the Monrovia components on the demonstration equipment will also be tested.
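
As a sketch of what such an interoperability check could look like, the client-side test below opens a TCP connection to the server, sends one request and prints the reply. The host name, port number and the "PING" message are illustrative only; the real Monrovia protocol messages are defined in the Technical Specification.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.io.PrintWriter;
    import java.net.Socket;

    public class ProtocolSmokeTest {
        public static void main(String[] args) throws Exception {
            // Hypothetical host and port; replace with the real server address.
            Socket s = new Socket("monrovia.example.org", 4000);
            PrintWriter out = new PrintWriter(s.getOutputStream(), true);
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(s.getInputStream()));

            out.println("PING");          // hypothetical request message
            String reply = in.readLine(); // one-line reply from the server
            System.out.println("server replied: " + reply);
            s.close();
        }
    }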

4.2.1 Responsibilities for integration testing

The persons responsible for the integration tests are Ilpo Nyyssönen, Anssi Kanninen and Jarmo Mäki.

4.2.2 Test sequence for integration testing

In the T2 phase we don't have any integration testing because there is only one module. In the T3, T4 and T5 phases integration testing starts right after module testing has finished: first we test the integration of the server and client modules, and after system testing we test, if required, the integration of the Monrovia components on the demonstration equipment.

4.3 System testing

The objective of system testing is to ensure that all required functions work properly and that the GUI is easy to learn and use. System testing also includes performance testing and usability testing.

4.3.1 Functional testing

Functional testing is divided into two categories. The test cases are described in more detail in the test report.

4.3.1.1 Architecture

The system testers test that the game can be played by 10 players at the same time.
The effect of connection breaks should be tested. This means that the test persons test what happens when:

4.3.1.2 Game requirements

The following requirements will be tested by system testers:

4.3.2 Performance testing

4.3.2.1 Server performance

The performance of the server will be tested. This means that the game should run smoothly without any major lag. The performance of the server can easily be tested by making simple tests locally on the server and by using the GUI. Testing the server also includes checking how much memory, processor time and hard disk space are used.
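
One simple way to watch the server-side memory figure mentioned above, assuming the server runs on J2SE, is to log the JVM's own memory use with the standard Runtime API. The sketch below is illustrative, not part of the actual server.

    public class MemoryProbe {
        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            long used = rt.totalMemory() - rt.freeMemory();
            System.out.println("JVM heap in use: " + (used / 1024) + " kB"
                    + " of " + (rt.totalMemory() / 1024) + " kB");
        }
    }

Processor and hard disk usage are easiest to check locally on the server with the operating system's own tools.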

4.3.2.2 Client Performance

The capacity of the client will be tested. These tests ensure that the system does not overload the Palm.
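
A minimal sketch of a client-side check, assuming the Runtime API available in the KVM: it estimates how much memory an operation allocates by comparing free memory before and after. The loop is only a placeholder for a real client operation such as loading the map.

    public class ClientMemoryCheck {
        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            rt.gc();                        // settle the heap before measuring
            long before = rt.freeMemory();

            StringBuffer buf = new StringBuffer(); // placeholder operation
            for (int i = 0; i < 1000; i++) {
                buf.append('#');
            }

            long after = rt.freeMemory();
            System.out.println("operation used about " + (before - after)
                    + " of " + rt.totalMemory() + " bytes");
        }
    }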

4.3.3 Usability testing

The system testers will run tests to show how easy it is to learn to play the game. This is basically done by letting persons who do not belong to the group play the game.
They also test that the map is easy to understand, that possible delays in the network do not disturb game play, and that the limited resources of the Palm have been taken into consideration.

4.3.4 Responsibilities for system testing

Persons responsible for all system testing are Jarmo Mäki, Juha Vainio and Oskari Mertalo.

4.3.5 Test sequence for system testing

System testing is done in phases T3, T4 and T5 after the integration of the modules has been finished. In the T3 phase we will perform functional testing, but not usability and performance testing. In phases T4 and T5 we will also perform usability and performance testing.

5. Test report

The results from module, integration and system testing are gathered in a formal style into the test report.
The test report and Burana are the most important sources of information for the code writers when they fix errors, and therefore both should be filled out with care.
The logs of the server will be stored in a safe place in case the code writers need to examine them later.

5.1 Module and integration testing

The test cases are listed in the test report. The results of module and integration testing are also written into the test report, so that it is possible to check later what has been tested and what has not. It is not important to write down exactly how a test was performed; the prime goal is to show that we really have performed the tests satisfactorily and that we cannot be accused of sloppy work.
The following information is written into the test report:

5.2 System testing

The system test cases are listed in the test report. The results of system testing are written into the test report. The code writers should be able to repeat the tests themselves later, so it is necessary to write down the exact process of each test.
The following information is written into the test report:

6. Running test cases

Many tests need to be run several times, and therefore approximately 10 hours of resources need to be allocated for testing in each phase. This does not include the resources needed to create the test cases.

6.1 Test grades

In the system tests we use grades to indicate how well a test was passed:
A+: The tested part of the system worked perfectly.
A: The tested part of the system passed the test without any major problems.
A-: The tested part of the system worked as specified in the requirements specification, but there was something that did not please the test person.
B: The tested part failed. The problem is minor and the fix can wait.
C: The tested part failed. There is a major problem and it needs to be fixed as soon as possible.

6.2 Methods and techniques

When performing system tests, the tester first performs the tests listed in the system test part of the test report. When he has finished the listed tests, he spontaneously tests things he considers important. The idea is to first make sure that all the tests created beforehand are run, and then let the test person try things out by himself.
If a test case fails, it is very important that the test person knows how to repeat the operation which caused the failure. Tests are repeated a couple of times so that intermittent faults are also detected.

6.3 Scope

The testing covers enough that the current version can be released without any major problems. Minor problems which only hinder, but do not prevent, the demonstration are acceptable. The final version should not have any bugs or errors left. Our goal is to get only A+ grades in the system test part when the final version is tested.

6.4 Restrictions

The personnel required to perform the system tests are two members of the Hayabusa group who have not participated in creating the game and one member of the Hayabusa group who is familiar with the system.
All versions should be tested for the first time at least two weeks before they are demonstrated, so that there is enough time to make fixes.

6.5 Testing application related parts

The testing of the application related parts is done before writing any code, to prevent unnecessary code from being written.
The following application related parts are tested:

6.6 Testing the system

6.6.1 Functionality, ergonomics and clarity

The functionality, ergonomics and clarity of the GUI will be tested as explained in chapter 4.3.1.

6.7 Testing the recovery of the system

The recovery of the GUI will be tested as explained in chapter 4.3.1 and in the test report.

6.8 Testing the performance

The performance of the Palm will not be tested until the end of T3.
The performance of the server will be tested as described in chapter 4.3.2 and in the test report.
The performance of the emulator has been tested earlier and it is known to work properly.

6.9 Regression testing

If parts of the system are updated, a regression test will be run to ensure that the other parts of the system still work properly. The regression test ensures that the whole system works properly. There is no written regression test at the moment, so the test personnel have to co-operate with the code writers to figure out what must be tested again.
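
The harness could be as simple as the following sketch, which runs a fixed list of named checks and reports pass/fail. Since no regression test exists yet, the check shown here (that the generated map data is in place, see chapter 2.4) is only an example of what the test personnel and the code writers might agree on.

    public class RegressionTest {

        // Example check: the generated map data directory exists (chapter 2.4).
        static boolean mapDataExists() {
            return new java.io.File(System.getProperty("user.home")
                    + "/hayabusa/monrovia/game/data").exists();
        }

        public static void main(String[] args) {
            String[]  names   = { "map data exists" };
            boolean[] results = { mapDataExists() };

            int failed = 0;
            for (int i = 0; i < names.length; i++) {
                System.out.println((results[i] ? "PASS  " : "FAIL  ") + names[i]);
                if (!results[i]) failed++;
            }
            System.out.println(failed == 0 ? "Regression test passed."
                    : failed + " check(s) failed.");
        }
    }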

7. Passing and failing tests

This chapter defines the criteria to pass and fail a test.

7.1 Passing a test

A system test case is passed if it gets at least grade A-. Thus, to pass the whole system testing, all test cases must get at least grade A-.
Module and integration test cases are passed if the test case is graded as "passed". Module and integration tests are passed if all test cases are passed.
If all tests are passed, then it is possible to come out with some kind of a release.
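
The pass rule can be stated precisely as a small sketch; the grade names follow chapter 6.1, and the class itself is illustrative, not part of the system.

    public class PassRule {
        // Higher rank = better grade; the order follows chapter 6.1.
        static int rank(String grade) {
            if (grade.equals("A+")) return 4;
            if (grade.equals("A"))  return 3;
            if (grade.equals("A-")) return 2;
            if (grade.equals("B"))  return 1;
            return 0; // "C"
        }

        // A single system test case passes with grade A- or better.
        static boolean casePassed(String grade) {
            return rank(grade) >= rank("A-");
        }

        // The whole system testing passes only if every case passes.
        static boolean systemTestingPassed(String[] grades) {
            for (int i = 0; i < grades.length; i++) {
                if (!casePassed(grades[i])) return false;
            }
            return true;
        }
    }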

7.2 Failing a test

A system test case fails if it gets grade B or C. If one system test case fails, then the whole system testing also fails. Integration and module test cases fail if they are graded as "failed". If one module or integration test case fails, then the whole module or integration testing also fails.

7.3 Requirements to interrupt a test

A system test will be interrupted if an error which causes immediate interference to the system has been found, for example if the system tests cannot be continued before the error has been fixed. Interrupting a system test requires the approval of the project manager.
In case of an unexpected event the project manager can interrupt a test based on his judgement.

7.4 Requirements to carry on a test

After an error has been fixed, the project manager decides whether the test should be started from the beginning or whether it can be continued from where it was interrupted. It is also possible to start the test from the beginning and run only the test cases which are thought to be affected by the fix.

7.5 Requirements to finish a test

A test is finished when all test cases have been tested.
In case of an unexpected event the project manager can end a test based on his judgement.

8. Risk control

8.1 Methods

The test plan uses the same methods as described in the Project Plan.

8.2 Goals

Testing is completed on time.
The demo matches the customer's requirements.

8.3 Informal list of possible risks

Too many bugs found and no time to fix them
The work is destroyed (described in the project plan)
Test cases are too complicated to be performed
The GUI is never completed
Some of the tests fail and therefore other tests cannot be executed
Unrealistic testing schedule
Problems in the development environment
Defined tests cannot be executed
Teaching test persons takes too long
Key person becomes mad/quits/leaves/becomes addicted to alcohol
Communication problems inside the group
Critical part of software is delayed and another has to wait until it is finished
The test plan is clear enough but the test persons don't know what to test

8.4 Risk scenarios

The risk scenarios can be found in Appendix A.

Summary of the risk scenarios:
Scenario 1: If a key person becomes disabled, somebody else has to do his work.
Scenario 2: If the members don't get along, communication is lousy and unnecessary mistakes are made, then somebody has to fix the problems.
Scenario 3: If a part of the system is not ready, then some tests cannot go ahead, and one solution is to make changes to the testing plan.
Scenario 4: If a part of the system is not ready, then some tests cannot go ahead, and one solution is to leave some tests out.
Scenario 5: If tests are performed more slowly than expected, then one solution is to leave some tests out.
Scenario 6: If tests are performed more slowly than expected, then one solution is to work harder.
Scenario 7: If it takes longer than expected to train the test personnel, then more work has to be done.
Scenario 8: If it takes longer than expected to train the test personnel, then the test personnel are not trained properly.
Scenario 9: If equipment breaks down during tests, the software is (if needed) restored from backups and new equipment is bought.
Scenario 10: If it takes longer than expected to make the GUI for the demo, we will have to leave some features out.

How likely and critical the scenario situations are:

8.5 The worst cases

We choose the worst cases from the table and analyze them. Then we make backup plans in case the risks materialize.
Scenarios 2, 3, 4 and 10 seem to be the worst cases.

Scenario 2:
The members of the team cannot be changed anymore and therefore it is very important that everybody tries to make the communication between team members successful. Plans need to be very accurate.
Scenarios 3,4:
If time is available, members should try to do things beforehand or help other members of the team to do their jobs.
Scenario 10:
Everybody should try to help the code writers do their job. The GUI is the most important part of the demonstration, and everybody should understand that if there is a problem someone cannot solve, we have to help him.

8.6 Summary

It seems likely that some part of the system will be delayed and that the others will have to wait until it has been completed. Some solutions are shown above, but the fact is that if we want to stay on schedule, we have to work hard.

9. Schedule

9.1 Schedule for T2

Module testing: 11.11.2000 - 6.12.2000
Integration and system testing are not done.

9.2 Schedule for T3

Module and integration testing: 16.12.2000 - 1.2.2001
System testing: 1.2.2001 - 10.2.2001

9.3 Schedule for T4

Module and integration testing: 17.2.2001 - 11.3.2001
System testing: 11.3.2001 - 18.3.2001

9.4 Schedule for T5

Module and integration testing: 24.3.2001 - 15.4.2001
System testing: 15.4.2001 - 22.4.2001

10. Acceptors

10.1 Tests and test cases

The passing of the tests is approved as follows:

10.2 Final version

Project manager Juha Vainio approves the final version to be released.

Appendix A