Computer Games Design

Adventures in Procedural Content Generation - Adam Speers

Dungeon Generator: Part 8

Testing Methodology

Testing is a crucial step in the development process to ensure optimum quality and performance (testbytes, 2017). Using a defined testing methodology allows bugs to be identified and ensures that documentation of the product is maintained (testbytes, 2017).

The following test methods will be employed (testbytes, 2017):

• Clean room testing - This helps to ensure the reliability of the gaming software. Based on mathematical and statistical reasoning, it aims to produce a product with minimal errors (testbytes, 2017).

• Functionality testing - This is done to confirm that the product works in accordance with the design specifications, and aims to identify errors that affect the user experience (Torres, 2019). For the level generator, good indicators of level design bugs include stuck spots, sticky spots, map holes, invisible walls, and missing geometry (Levy & Novak, 2009).

Objectives of the testing (Toy testing, 2020):

• To find and fix any defects or bugs created during development.

• To ensure the product meets the design goals and specifications.

• To provide customers with a quality product and increase confidence in the provider.

Clean Room Testing

Tests have been conducted to ensure the tile type matching algorithm performs as expected. A test harness has been created within Unity (TileSpawnerExtended.cs) which accepts a collection of direction inputs. For each set of inputs, the geometry match found using the scriptable objects (TileMatchExtended.cs and TileTypeMatchSimple.cs) is spawned into the scene. The geometry for each match is rendered, including the following debug text:

• Direction Labels

• Direction Inputs

• Matched Tile Type

• Matched Rotation

Further details regarding the matching algorithm can be found in the blog post Dungeon Generator: Part 6.

A detailed spreadsheet (TileTesting.xlsx, tab TilePicker_Testing) showing all the tests conducted and the associated results can be found in the project testing folder. All known permutations were tested (240 tests), representing 100% test coverage.
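To illustrate the kind of matching logic under test, here is a minimal Python sketch of a direction-signature lookup. The project's actual implementation lives in the C# scriptable objects (TileMatchExtended.cs and TileTypeMatchSimple.cs); the tile names and open-direction signatures below are assumptions for illustration only.

```python
# Illustrative sketch only: each base tile type is described by the set
# of open (walkable) directions, and a match is found by rotating that
# signature in 90-degree steps until it equals the requested inputs.
BASE_TILES = {
    "tile_deadend_0":   frozenset({"N"}),
    "tile_corridor_0":  frozenset({"N", "S"}),
    "tile_corner_0":    frozenset({"N", "E"}),
    "tile_tjunction_0": frozenset({"N", "E", "S"}),
    "tile_cross_0":     frozenset({"N", "E", "S", "W"}),
}

ROTATE_90 = {"N": "E", "E": "S", "S": "W", "W": "N"}

def rotate(directions, times):
    """Rotate a set of open directions clockwise by 90 degrees * times."""
    for _ in range(times):
        directions = frozenset(ROTATE_90[d] for d in directions)
    return directions

def match_tile(open_dirs):
    """Return (tile_type, rotation_degrees) whose rotated signature
    matches the requested open directions, or None if no match exists."""
    target = frozenset(open_dirs)
    for name, base in BASE_TILES.items():
        for turns in range(4):
            if rotate(base, turns) == target:
                return name, turns * 90
    return None

# e.g. a tile open to the E and S matches tile_corner_0 rotated by 90 degrees
```

Exhaustively feeding every direction permutation through such a lookup, and rendering the matched geometry with its debug text, is the spirit of the clean room test harness described above.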

Please see below for some example results from the tests for tile type "tile_corner_0".

Figure 8.1: Tile_corner_0 - 0 - 90

Figure 8.2: Tile_corner_0 - 180 - 270

Figure 8.3: Tile_corner_0 - Base rotation

Figure 8.4: Test results - 0 rotation

Figure 8.5: Test results - 90 rotation

Figure 8.6: Test results - 180 rotation

Figure 8.7: Test results - 270 rotation

Functional Testing

A set of 50 control tests has been run to measure the performance of the dungeon generator. The tests checked that a valid dungeon was generated, that the dungeon produced showed the desired characteristics, and that the number of features requested matched the input variables. All tests were run from within Unity using the scene "Maze Testing" and used common variable settings in the controlling script "MazeGeneratorExtended" (Figure 8.8).

A detailed spreadsheet (TileTesting.xlsx, tab MazeGenerator_Testing) showing all the tests conducted and the associated results can be found in the project testing folder. For each test the following information was recorded; where numbers are indicated in brackets, these are the expected or default values. If all criteria are met, the test is marked "Passed"; deviations from requested values are marked "Check"; and errors in geometry or rendering, or a failure to generate, are marked "Failed":

• TestNo (1 - 50)

• Width (15)

• Height (15)

• No. Rooms Requested (5)

• Max Possible Rooms

• No. Rooms Made

• No. Loops Requested (5)

• Max Possible Loops

• No. Loops made

• Sections Requested (3)

• Sections Made

• No. Doors (2)

• No. Keys (2)

• Keys before doors? (Y / N)

• Exit at Max Dist? (Y / N)

• Max Dist

• Exit DeadEnd? (Y / N)

• All geometry Ok? (Y / N)

• Test Result (Passed / Failed / Check)
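As a hedged illustration of how each recorded row might be reduced to a Passed / Check / Failed verdict, the following Python sketch mirrors the spreadsheet columns above. The record type, its field names, and the exact classification boundaries are assumptions for illustration; they are not project code.

```python
# Hypothetical sketch: one spreadsheet row as a record, classified per
# the stated convention (geometry/generation errors -> Failed, deviations
# from requested values -> Check, otherwise Passed).
from dataclasses import dataclass

@dataclass
class MazeTestRecord:
    test_no: int
    rooms_requested: int
    rooms_made: int
    loops_requested: int
    loops_made: int
    sections_requested: int
    sections_made: int
    keys_before_doors: bool
    exit_at_max_dist: bool
    exit_dead_end: bool
    geometry_ok: bool

def classify(r: MazeTestRecord) -> str:
    # Errors in geometry, rendering, or generation fail outright.
    if not r.geometry_ok:
        return "Failed"
    # Any deviation from requested/expected values warrants manual review.
    deviations = (
        r.rooms_made != r.rooms_requested
        or r.loops_made != r.loops_requested
        or r.sections_made != r.sections_requested
        or not (r.keys_before_doors and r.exit_at_max_dist and r.exit_dead_end)
    )
    return "Check" if deviations else "Passed"
```

Under this sketch, a run like Test No.9 below (4 loops made against 5 requested, everything else as expected) classifies as "Check" rather than "Failed", matching the convention used in the spreadsheet.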

Figure 8.8: Maze Generator Script Settings

Test Results - Summary

Following the completion of 50 tests (Figure 8.9):

49 - Passed

1 - Check

0 - Failed

Check: Test No.9 - (5 loops requested, 4 candidates, 4 created)

• Loop Rules: A loop path candidate must have a walkable path tile to either both N & S or E & W. Loop tile candidates cannot have a connection to a different section of the maze and cannot connect to a tile marked as containing a door.

• Analysis: The generator has created a maze with a strong 'river' characteristic that follows a rough 'N' shape. Beginning in the south-west and moving north, the path forks in the northernmost row; a tributary moves west and then south. However, crucially, this path is now designated as section 2, preventing any looping back to the initial northern path. The critical path continues to move east and then south; again, as it is now within section 2, loops back to the initial path are blocked. Near the southernmost row the section changes to 3 before moving north to the finish. The change in section number for the final northward run prevents any looping back to the central section 2.

• Result: Passed - No error found; the generated geometry (Figure 8.10) simply did not have 5 possible loop candidates available within the current rule constraints.
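The loop rules above can be sketched in code. This is an illustrative Python version only: the grid representation (a dict of (x, y) coordinates to tiles), the Tile fields, and the coordinate convention (y - 1 as north) are all assumptions; the project's generator is the C# script MazeGeneratorExtended.

```python
# Hedged sketch of the loop-candidate rule: a wall position qualifies
# when the tiles on opposite sides (N & S, or E & W) are both walkable,
# belong to the same maze section, and neither is flagged as a door.
from dataclasses import dataclass

@dataclass
class Tile:
    walkable: bool
    section: int
    has_door: bool = False

def is_loop_candidate(grid, x, y):
    """Return True if carving a loop at (x, y) is permitted by the rules."""
    def ok_pair(a, b):
        ta, tb = grid.get(a), grid.get(b)
        if not (ta and tb and ta.walkable and tb.walkable):
            return False
        if ta.section != tb.section:    # would bridge two different sections
            return False
        if ta.has_door or tb.has_door:  # door tiles block loop connections
            return False
        return True
    # N & S pair, or E & W pair (y - 1 treated as north in this sketch).
    return ok_pair((x, y - 1), (x, y + 1)) or ok_pair((x - 1, y), (x + 1, y))
```

Applied to the geometry of Test No.9, a check like this would reject every wall adjoining the section 1/2 and 2/3 boundaries, which is why only 4 loop candidates existed.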


Figure 8.9: Test Results

Figure 8.10: MazeTest_09

Test Results - Sample output

Below is a sample of results for the first 10 test cases. A complete set of result screenshots is available in the project folder.


Levy, L. & Novak, J. (2009). Game Development Essentials: Game QA and Testing. New York: Delmar Cengage Learning.

testbytes (2017). A Guide on Game Testing Methodology. Available at: (Accessed: 25 March 2020).

Torres, A. (2019). 5 Different Types of Game Testing Techniques. Available at: (Accessed: 25 March 2020).

Toy testing (2020). Software Testing – Why is it Important and Why is it Done?. Available at: (Accessed: 25 March 2020).

Wilson, D. (2009). Quality Quality Assurance: A Methodology for Wide-Spectrum Game Testing. Available at: (Accessed: 25 March 2020).