Effectiveness of Different Test Case Prioritization Methods Based on Coverage Criteria

CS 206 Project, Fall 2012

Handed Out: October 30, 2012
Due Date: November 29, 2012
Grade Weight: 15% of total course grade
Note: This project can be completed in groups of two.

Overview

In this project, you will compare the effectiveness of different kinds of test case prioritization methods in exposing faults via multiple coverage criteria. You are provided with a set of benchmark programs, a set of test cases for each benchmark, and a set of faults for each benchmark.


Download Required Files

First, download the required files for this project here: benchmarks.tar.gz (download size: approximately 1.8 MB)

Next, uncompress the downloaded file by executing the command: "tar -xvzf benchmarks.tar.gz".


Benchmark Programs

You will be provided with a set of 7 relatively small benchmark programs written in C. Each program is associated with a set of faults and a set of test cases:

Program Name    # of Available Faults    # of Available Test Cases
tcas                     41                       1590
totinfo                  23                       1026
schedule                  9                       2634
schedule2                 9                       2679
printtokens               7                       4072
printtokens2              9                       4057
replace                  31                       5542

General Directory Structure for Benchmarks

All benchmark programs are contained within a folder named "benchmarks". Within this folder, there is a subdirectory for each of the seven benchmark programs.

Within the subdirectory for a benchmark program, you will find the following folders and files:

Compiling and Running Each Benchmark Program

All benchmark programs take some input values (either command-line parameters, the name of an input file, or both), and produce output that is written to the standard output stream (by default, the screen). The following table shows how to compile each program, how to execute the compiled program using the first test case specified in the associated "universe.txt" test case file, and the names of any input file directories that are required by the associated test cases. Note that the test cases assume that any necessary input file directories are contained in the current working directory when running the benchmark program.

tcas
  Compile:                gcc -g -o tcas tcas.c
  Example command:        tcas 1258 1 0 897 174 7253 1 629 500 0 0 1
  Input file directories: (none)

totinfo
  Compile:                gcc -g -o totinfo totinfo.c -lm
  Example command:        totinfo < universe/jkADl.mat
  Input file directories: universe

schedule
  Compile:                gcc -g -o schedule schedule.c
  Example command:        schedule 5 1 1 < input/dat027
  Input file directories: input

schedule2
  Compile:                gcc -g -o schedule2 schedule2.c
  Example command:        schedule2 5 1 1 < input/dat027
  Input file directories: input

printtokens
  Compile:                gcc -g -o printtokens printtokens.c
  Example command:        printtokens < inputs/newtst122.tst
  Input file directories: inputs

printtokens2
  Compile:                gcc -g -o printtokens2 printtokens2.c
  Example command:        printtokens2 < inputs/newtst122.tst
  Input file directories: inputs

replace
  Compile:                gcc -g -o replace replace.c -lm
  Example command:        replace '@|' 'E)m' < input/ruin.1373
  Input file directories: input, moni, temp-test
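Since each benchmark has hundreds or thousands of test cases, you will want to script their execution rather than typing each command by hand. The sketch below assumes (this is an assumption, not something the handout specifies) that each line of a "universe.txt" file holds the command-line arguments for one test case, optionally followed by an input redirection of the form "< path/to/file", matching the example commands in the table above.

```python
import shlex
import subprocess

def build_command(program, universe_line):
    """Split one test-case line into an argument list and an optional
    stdin-redirection path.  Assumes (hypothetically) the format:
    arguments, optionally followed by '< input-file'; a '<' appearing
    inside a quoted argument is not handled by this simple sketch."""
    stdin_path = None
    if "<" in universe_line:
        args_part, _, stdin_part = universe_line.partition("<")
        stdin_path = stdin_part.strip()
    else:
        args_part = universe_line
    return [program] + shlex.split(args_part), stdin_path

def run_test(program, universe_line):
    """Run one test case and return its standard output as a string.
    Must be invoked with the benchmark's input directories present in
    the current working directory, as the handout notes."""
    argv, stdin_path = build_command(program, universe_line)
    stdin = open(stdin_path) if stdin_path else None
    try:
        result = subprocess.run(argv, stdin=stdin,
                                capture_output=True, text=True)
        return result.stdout
    finally:
        if stdin:
            stdin.close()
```

For example, the replace test case from the table would be parsed as the argument list ['./replace', '@|', 'E)m'] with stdin redirected from input/ruin.1373.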


Coverage Criteria

We will consider two different kinds of coverage criteria in this project.

Obtaining Coverage Information

For each test case associated with each benchmark program, you must collect the coverage information for each of the two coverage criteria listed above. The coverage information can be collected using the UNIX tool gcov.

Here are some useful links for the usage of gcov:
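As a starting point, the typical gcov workflow is: recompile the benchmark with coverage instrumentation (e.g. "gcc --coverage -g -o tcas tcas.c"), run one test case, and then run "gcov -b tcas.c" to produce an annotated tcas.c.gcov file (the -b flag adds branch information). In that annotated file, each line has the form "count:lineno:source", where the count field is "-" for non-executable lines and "#####" for executable lines that were never run. The following sketch extracts the set of executed source lines from such a file; the sample text in the usage note is illustrative, not taken from the benchmarks.

```python
def executed_lines(gcov_text):
    """Return the set of source line numbers executed at least once,
    given the text of a gcov-annotated .gcov file.  Lines whose count
    field is '-' (non-executable) or '#####' (executable but never
    run) are excluded; header lines use line number 0 and are also
    skipped because their count field is '-'."""
    covered = set()
    for line in gcov_text.splitlines():
        parts = line.split(":", 2)
        if len(parts) < 3:
            continue
        count, lineno = parts[0].strip(), parts[1].strip()
        if count not in ("-", "#####") and lineno.isdigit():
            covered.add(int(lineno))
    return covered
```

Remember to remove or reset the .gcda counter files between test cases; otherwise gcov reports cumulative coverage across all runs rather than per-test coverage.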


Test Case Prioritization

Test case prioritization techniques schedule test cases in an execution order according to some criterion. Here are the prioritization methods you will use:

For more details on the above methods, you may refer to the following papers.

References:


Detailed Requirements

The following tasks will need to be performed to complete this project.

  1. Create the following test suites
  2. Evaluate the fault-exposing potential of each test suite
  3. Report your experimental results and observations
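The handout does not name a specific measure of fault-exposing potential; one standard metric in the test case prioritization literature is APFD (Average Percentage of Faults Detected), which rewards orderings that reveal faults earlier. If you adopt it (an assumption on our part), it can be computed as 1 - (TF_1 + ... + TF_m)/(n*m) + 1/(2n), where n is the number of tests, m the number of faults, and TF_i the 1-based position of the first test that exposes fault i:

```python
def apfd(order, detects):
    """APFD for one test ordering.
    order:   list of test ids in execution order.
    detects: maps test id -> set of faults that test exposes.
    Assumes every fault is exposed by at least one test in the order."""
    n = len(order)
    faults = set().union(*detects.values())
    m = len(faults)
    first = {}  # fault -> position of first detecting test (1-based)
    for pos, test in enumerate(order, start=1):
        for fault in detects.get(test, ()):
            first.setdefault(fault, pos)
    return 1 - sum(first[f] for f in faults) / (n * m) + 1 / (2 * n)
```

For instance, with five tests where fault A is first exposed at position 1 and fault B at position 3, APFD = 1 - (1 + 3)/(5*2) + 1/10 = 0.7; an ordering that exposes both faults earlier scores higher.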


Implementation and Submission

Programming Languages to Use

We would prefer that you use either C/C++ or Java to implement your solution to this project. However, if you choose another language (e.g., a scripting language), that is fine as well. Whatever language you use, please also submit, along with your programs, a description file (README) explaining how to compile and run them.

Items to Submit

Please submit the following items for this project to ltan003@cs.ucr.edu. Put all of your files into one directory and compress it into a single zip/tgz/tar.gz archive; please do not include executables in your submission: