Unit and Functional Tests in a Nutshell
Taking inspiration from the work I did on Deformetrica, I'm going to show the difference between unit and functional tests.
Unit and functional tests can be written with or without a framework, but to simplify the work and give the project a solid infrastructure, using a framework is highly recommended unless you want to reinvent the wheel.
For the examples I'll use C++11 and the Google Test framework.
Unit Tests
A unit test checks the result produced by a function or a class method.
You can think of a unit test like a mathematical function:
the function (or method) f, for a given input X, always returns the output Y.
Y = f(X)
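For instance, a pure function like the following always produces the same output for the same input (a trivial function of my own, just for illustration):

int multiply(int a, int b) {
  return a * b;  /* no hidden state: the result depends only on the inputs */
}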
From the programming point of view, you can consider examples like these:
- ASSERT_EQ(5*2, 10)
  assert equal: the two expressions, left and right, must be equal
- ASSERT_EQ(multiply(5,2), 10)
- ASSERT_TRUE(is_db_connected_to(host))
  assert true: the condition inside the assert must be true
- ASSERT_GT(new_value, old_value)
  assert greater than: the left expression must be greater than the right
- ASSERT_GE(random(0.0f, 1.0f), 0.0f)
  assert greater or equal: the left expression must be greater than or equal to the right
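Wrapping one of these asserts in a complete Google Test case is straightforward; a minimal sketch, using the hypothetical multiply function above:

TEST(MultiplyTest, ReturnsProduct) {
  /* Same input, same output: these assertions must hold on every run */
  ASSERT_EQ(multiply(5, 2), 10);
  ASSERT_EQ(multiply(-3, 3), -9);
}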
The goal of unit tests is to ensure that functions (or methods) always return the same values for a given input!
Do you know why functional programming is so trendy?
Because the concept itself gives confidence to its users (programmers & companies): the behavior of a function is strongly defined.
If you're familiar with OO programming, you know that, due to the complexity of the interactions between classes, the same method of an object can return different values depending on the context, and it's hard to test all the possible contexts.
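A contrived sketch of the problem (a hypothetical class, not Deformetrica code):

class Counter {
  int count_ = 0;
public:
  int next() { return ++count_; }
};

Counter c;
c.next();  /* returns 1 */
c.next();  /* returns 2: same call, different result, because of hidden state */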
So, unit tests allow you to validate the mathematical definition of functions (or methods).
Let's see two examples:
1. Check XML input from a file
To check that your XML reader works fine, simple code like the following can make your software much more robust.
TEST_F(TestReadParametersXML, SparseDiffeoParameters) {
  /* A missing file must yield a null result */
  auto paramDiffeos = readSparseDiffeoParametersXML("fake_file");
  ASSERT_TRUE(paramDiffeos.IsNull());
  /* A valid file must be parsed and must contain the expected values */
  paramDiffeos = readSparseDiffeoParametersXML(UNIT_TESTS_DIR"/io/data/paramDiffeos.xml");
  ASSERT_FALSE(paramDiffeos.IsNull());
  ASSERT_EQ(paramDiffeos->GetKernelWidth(), 1.5);
  ASSERT_EQ(paramDiffeos->GetKernelType(), "Exact");
  ASSERT_EQ(paramDiffeos->GetNumberOfTimePoints(), 20);
}
2. Check the difference between two matrices
Using the basic asserts provided by the Google framework, we can create new asserts able to check complex data.
void AbstractTestKernelPrecision::ASSERT_MATRIX_EQ(const MatrixType &X,
                                                   const MatrixType &Y,
                                                   const std::string &msg,
                                                   const ScalarType error) {
  /* The two matrices must have the same shape */
  ASSERT_EQ(X.rows(), Y.rows()) << msg;
  ASSERT_EQ(X.cols(), Y.cols()) << msg;
  for (int i = 0; i < X.rows(); ++i) {
    for (int j = 0; j < X.cols(); ++j) {
      ScalarType x = X(i, j);
      ScalarType y = Y(i, j);
      /* Each element must match within the tolerance, using the relative
         error or, when it is smaller, the absolute error */
      ASSERT_LE(std::min(std::abs(x - y) / std::abs(x), std::abs(x - y)), error) << msg << " on position (" << i << "," << j << ")";
    }
  }
}
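A custom assert like this can then be called from any test of the fixture; a hypothetical usage sketch (assuming MatrixType can be constructed from a rows/cols pair):

TEST_F(AbstractTestKernelPrecision, ExactConvolution) {
  MatrixType expected(3, 3);  /* to be filled with pre-computed reference values */
  MatrixType computed(3, 3);  /* to be filled by the kernel under test */
  /* ... fill the two matrices ... */
  ASSERT_MATRIX_EQ(expected, computed, "exact kernel convolution", 1e-6);
}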
Functional Tests
A functional test checks the results produced by a subgroup of functionalities of the software.
Unlike unit tests, there is no common way to write functional tests.
Here I show you the strategy I implemented for Deformetrica; the idea is:
1. lean on the Google Test framework in order to have an automatic way to run and check assertions
2. use the Unix fork() strategy to clone the running process and assign to the child the task of running a specific subgroup of functionalities
3. create a config.ini file to define the subgroups of functionalities that must be checked
4. collect the results of each child process and create a final report
1. Lean on the Google Test framework
As with the unit tests, the code can be built and run in the standard way the Google Test framework offers.
The LoadAndRun test below will be automatically managed by the framework.
TEST_F(TestFunctional, LoadAndRun) {
  /* CODE HERE */
}
2. The fork() strategy
Like much software, Deformetrica reads XML files and runs different processes depending on the input.
These different processes represent the subgroups of functionalities that must be checked.
The idea is to run deformetrica in a child subprocess, exactly as it would be run from the command line.
auto deformetrica_run = [](const std::string& args, const std::string& path, const std::string& stdout_file, const std::string& stderr_file) {
  pid_t pid;
  auto child = [&]() {
    /* Split the command line into argc/argv, as a shell would do */
    std::vector<std::string> strs;
    boost::split(strs, args, boost::is_any_of("\t "));
    int argc = strs.size();
    char *argv[argc + 1];
    int i = 0;
    for (auto &str : strs)
      argv[i++] = (char *) str.c_str();
    argv[argc] = nullptr;  /* argv is conventionally null-terminated */
    /* CHANGING THE CURRENT WORKING DIRECTORY */
    bfs::current_path(path);
    /* Redirect std::cout/std::cerr to the given log files (the locals cannot be
       named stdout/stderr, which are macros from <cstdio>) */
    std::ofstream out_stream(stdout_file);
    std::ofstream err_stream(stderr_file);
    std::streambuf *coutbuf = std::cout.rdbuf();
    std::streambuf *cerrbuf = std::cerr.rdbuf();
    std::cout.rdbuf(out_stream.rdbuf());
    std::cerr.rdbuf(err_stream.rdbuf());
    /* Run deformetrica */
    deformetrica(argc, argv);
    /* Restore std::cout/std::cerr to the previous streambufs */
    std::cout.rdbuf(coutbuf);
    std::cerr.rdbuf(cerrbuf);
    exit(0);
  };
  auto father = [&]() {
    /* Wait for the child process to terminate */
    int returnStatus;
    waitpid(pid, &returnStatus, 0);
  };
  /* fork() returns 0 in the child and the child's pid in the parent */
  pid = fork();
  pid ? father() : child();
};
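Later on (see the full code at the end), the lambda is invoked with the command line and the paths assembled from the config.ini entries:

deformetrica_run(cmdline.str(), working_directory, log_file, err_file);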
3. The config.ini file
Using the config.ini file, it is possible to define the arguments (argc, argv) passed to the program at run time, just as from the command line.
[Test1]
use_cuda = YES
use_double_precision = YES
tolerance = 1e-5
path = atlas/image/2d/digits
exec = deformetrica atlas 2D model.xml data_set.xml optimization_parameters.xml
state-compare = deformetrica-state.bin
[Test2]
use_cuda = NO
use_double_precision = YES
tolerance = 1e-7
path = registration/image/2d/snowman
exec = deformetrica registration 2D model.xml data_set.xml optimization_parameters.xml
state-compare = deformetrica-state.bin
#..etc..
The code that reads the ini file is straightforward thanks to the boost::ptree library.
...
bpt::ptree pt;
bpt::ini_parser::read_ini(FUNCTIONAL_TESTS_DIR"/configure.ini", pt);
std::size_t test_counter = 0;
while (++test_counter < 1000) {
  /* EXTRACTION INI DATA */
  std::stringstream ss;
  ss << "Test" << test_counter;
  /* Skip this counter value if no [TestN] section exists in the ini file */
  if (pt.find(ss.str()) == pt.not_found()) continue;
  ...
}
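Once a [TestN] section is found, each field can be extracted with a typed get, as the full code at the end of the post does (condensed here):

std::string exec = pt.get<std::string>(ss.str() + ".exec");
float ini_tolerance = pt.get<float>(ss.str() + ".tolerance");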
4. Collect the results & final report
Inside the while loop, the lambda function deformetrica_run is called to execute the test defined in config.ini.
All the generated data is checked and collected to verify that the results are equivalent (within a delta precision) to pre-computed reference data.
...
try {
  try {
    /* Running deformetrica in a fork */
    deformetrica_run(cmdline.str(), working_directory, log_file, err_file);
  } catch (...) {
    TEST_COUT << "Current test has crashed" << std::endl;
    good_run = false;
    continue;
  }
  if (!file_exist(output_state_file)) {
    TEST_COUT << "Current test has not produced the state-file [" << output_state_file + "]" << std::endl;
    good_run = false;
    continue;
  }
  def::utils::DeformationState df1, df2;
  try {
    df1.load(compare_binary);
  } catch (std::exception& err) {
    TEST_COUT << err.what() << " - Error while reading the [" << compare_binary << "] file" << std::endl;
    throw;  /* rethrow the original exception, avoiding a slicing copy */
  }
  try {
    df2.load(output_state_file);
  } catch (std::exception& err) {
    TEST_COUT << err.what() << " - Error while reading the [" << output_state_file << "] file" << std::endl;
    throw;
  }
  /* Compare the freshly computed state with the pre-computed reference */
  if (df1.compare(df2, ini_tolerance))
    TEST_COUT << "Functional tests PASSED" << std::endl;
  else
    TEST_COUT << "Functional tests NOT PASSED" << std::endl;
} catch (...) {
  good_run = false;
}
...
A simple report, in the style of a classic unit test, is printed.
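For a single test, the output could look like this (an illustrative example assembled from the TEST_COUT messages above, with made-up paths):

[          ] Running functional tests [Test1]
[          ] Tolerance [1e-05]
[          ] Log file: [/tmp/ab12-cd34-ef56/log.txt]
[          ] Functional tests PASSED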
Conclusion
Unit and functional tests can save your time and your business!
Do it with solid frameworks and do not reinvent the wheel.
Enjoy!
...ah, I almost forgot: here is the whole code.
/***************************************************************************************
 *
 *  Deformetrica
 *
 *  Copyright Inria and the University of Utah. All rights reserved. This file is
 *  distributed under the terms of the Inria Non-Commercial License Agreement.
 *
 ***************************************************************************************/
#include "TestFunctional.h"
#include <iostream>
#include <fstream>
#include <memory>
#include <cstdio>
#include <thread>
#define BOOST_NO_CXX11_SCOPED_ENUMS
#include <boost/filesystem.hpp>
#undef BOOST_NO_CXX11_SCOPED_ENUMS
#include <boost/property_tree/ptree.hpp>
#include <boost/property_tree/ini_parser.hpp>
#include <boost/algorithm/string.hpp>
#include <unistd.h>
#include <stdlib.h>
#include <src/support/utilities/SerializeDeformationState.h>
#include <src/launch/deformetrica.h>
namespace bpt = boost::property_tree;
namespace bfs = boost::filesystem;
namespace testing {
// TEST_COUT is taken from https://stackoverflow.com/questions/16491675/how-to-send-custom-message-in-google-c-testing-framework/29155677
namespace internal {
enum GTestColor {
  COLOR_DEFAULT,
  COLOR_RED,
  COLOR_GREEN,
  COLOR_YELLOW
};
extern void ColoredPrintf(GTestColor color, const char* fmt, ...);
}  // namespace internal
}  // namespace testing

#define PRINTF(...) do { testing::internal::ColoredPrintf(testing::internal::COLOR_GREEN, "[          ] "); testing::internal::ColoredPrintf(testing::internal::COLOR_YELLOW, __VA_ARGS__); } while(0)

// C++ stream interface
class TestCout : public std::stringstream {
 public:
  ~TestCout() {
    PRINTF("%s", str().c_str());
  }
};

#define TEST_COUT TestCout()
namespace def {
namespace test {

void TestFunctional::SetUp() {
  Test::SetUp();
}

#ifdef USE_CUDA
bool use_cuda = true;
#else
bool use_cuda = false;
#endif

#ifdef USE_DOUBLE_PRECISION
bool use_double_precision = true;
#else
bool use_double_precision = false;
#endif
TEST_F(TestFunctional, LoadAndRun) {
  auto file_exist = [](const std::string& f) {
    std::ifstream infile(f);
    return infile.good();
  };

  auto deformetrica_run = [](const std::string& args, const std::string& path, const std::string& stdout_file, const std::string& stderr_file) {
    pid_t pid;
    auto child = [&]() {
      /* Split the command line into argc/argv, as a shell would do */
      std::vector<std::string> strs;
      boost::split(strs, args, boost::is_any_of("\t "));
      int argc = strs.size();
      char *argv[argc + 1];
      int i = 0;
      for (auto &str : strs)
        argv[i++] = (char *) str.c_str();
      argv[argc] = nullptr;  /* argv is conventionally null-terminated */
      /* CHANGING THE CURRENT WORKING DIRECTORY */
      bfs::current_path(path);
      /* Redirect std::cout/std::cerr to the given log files (the locals cannot be
         named stdout/stderr, which are macros from <cstdio>) */
      std::ofstream out_stream(stdout_file);
      std::ofstream err_stream(stderr_file);
      std::streambuf *coutbuf = std::cout.rdbuf();
      std::streambuf *cerrbuf = std::cerr.rdbuf();
      std::cout.rdbuf(out_stream.rdbuf());
      std::cerr.rdbuf(err_stream.rdbuf());
      /* Run deformetrica */
      deformetrica(argc, argv);
      /* Restore std::cout/std::cerr to the previous streambufs */
      std::cout.rdbuf(coutbuf);
      std::cerr.rdbuf(cerrbuf);
      exit(0);
    };
    auto father = [&]() {
      /* Wait for the child process to terminate */
      int returnStatus;
      waitpid(pid, &returnStatus, 0);
    };
    /* fork() returns 0 in the child and the child's pid in the parent */
    pid = fork();
    pid ? father() : child();
  };
  // Check if '--replace-state-file' is present among the command-line arguments
  auto replace_state_file = std::find(args.begin(), args.end(), "--replace-state-file") != args.end();
  if (replace_state_file)
    TEST_COUT << "Replace state file ON" << std::endl;
  else
    TEST_COUT << "Replace state file OFF" << std::endl;

  bpt::ptree pt;
  bpt::ini_parser::read_ini(FUNCTIONAL_TESTS_DIR"/configure.ini", pt);

  std::string tag;
  std::size_t test_counter = 0;
  bool good_run = true;

  while (++test_counter < 1000) {
    /* EXTRACTION INI DATA */
    std::stringstream ss;
    ss << "Test" << test_counter;
    if (pt.find(ss.str()) == pt.not_found()) continue;

    tag = ss.str() + ".use_cuda";
    std::string ini_use_cuda = def::support::utilities::strtolower(pt.get<std::string>(tag));
    ASSERT_TRUE(ini_use_cuda == "no" || ini_use_cuda == "yes") << "Wrong value for " << tag;

    TEST_COUT << "_____________________________________________________________" << std::endl;

    if (ini_use_cuda == "yes" && use_cuda == false) {
      TEST_COUT << "Skip the [Test" << test_counter << "] : CUDA IS NOT AVAILABLE" << std::endl;
      continue;
    }

    tag = ss.str() + ".use_double_precision";
    std::string ini_use_double_precision = def::support::utilities::strtolower(pt.get<std::string>(tag));
    ASSERT_TRUE(ini_use_double_precision == "no" || ini_use_double_precision == "yes") << "Wrong value for " << tag;

    if (ini_use_double_precision == "yes" && use_double_precision == false) {
      TEST_COUT << "Skip the [Test" << test_counter << "] : DOUBLE PRECISION IS NOT AVAILABLE" << std::endl;
      continue;
    }

    tag = ss.str() + ".tolerance";
    float ini_tolerance = pt.get<float>(tag);
    ASSERT_GE(ini_tolerance, 0.0) << "Wrong value for " << tag;
    EXPECT_LT(ini_tolerance, 10.0) << "Tolerance is too big " << tag;

    tag = ss.str() + ".path";
    std::string path = pt.get<std::string>(tag);
    tag = ss.str() + ".exec";
    std::string exec = pt.get<std::string>(tag);
    tag = ss.str() + ".state-compare";
    std::string state_compare = pt.get<std::string>(tag);

    auto compare_binary = FUNCTIONAL_TESTS_DIR"/" + path + "/" + state_compare;
    if (!file_exist(compare_binary)) {
      TEST_COUT << "Skip the [Test" << test_counter << "] : Missing serialization binary file [" << compare_binary + "]" << std::endl;
      continue;
    }
    /* CREATING TMP DIRECTORY */
    boost::system::error_code ec;
    bfs::path test_root(bfs::unique_path(bfs::temp_directory_path() / "%%%%-%%%%-%%%%"));
    ASSERT_TRUE(bfs::create_directory(test_root, ec)) << "Failed creating " << test_root << ": " << ec.message() << std::endl;

    std::string output_state_file = test_root.string() + "/output-state-file.bin";
    auto working_directory = FUNCTIONAL_TESTS_DIR"/" + path;
    auto log_file = test_root.string() + "/log.txt";
    auto err_file = test_root.string() + "/error.txt";

    std::stringstream cmdline;
    cmdline << exec
            << " --output-state-file=" << output_state_file
            << " --output-dir=" << test_root.string();

    TEST_COUT << "Running functional tests [" << ss.str() + "]" << std::endl;
    TEST_COUT << "Tolerance [" << ini_tolerance << "]" << std::endl;
    TEST_COUT << "Log file: [" << log_file << "]" << std::endl;
    TEST_COUT << "Error file: [" << err_file << "]" << std::endl;
    TEST_COUT << "Temporary Output Directory: [" << test_root.string() << "]" << std::endl;
    TEST_COUT << "Exec: [" << cmdline.str() << "]" << std::endl;
    try {
      try {
        /* Running deformetrica in a fork */
        deformetrica_run(cmdline.str(), working_directory, log_file, err_file);
      } catch (...) {
        TEST_COUT << "Current test has crashed" << std::endl;
        good_run = false;
        continue;
      }
      if (!file_exist(output_state_file)) {
        TEST_COUT << "Current test has not produced the state-file [" << output_state_file + "]" << std::endl;
        good_run = false;
        continue;
      }
      def::utils::DeformationState df1, df2;
      try {
        df1.load(compare_binary);
      } catch (std::exception& err) {
        TEST_COUT << err.what() << " - Error while reading the [" << compare_binary << "] file" << std::endl;
        throw;  /* rethrow the original exception, avoiding a slicing copy */
      }
      try {
        df2.load(output_state_file);
      } catch (std::exception& err) {
        TEST_COUT << err.what() << " - Error while reading the [" << output_state_file << "] file" << std::endl;
        throw;
      }
      if (df1.compare(df2, ini_tolerance))
        TEST_COUT << "Functional tests PASSED" << std::endl;
      else
        TEST_COUT << "Functional tests NOT PASSED" << std::endl;
    } catch (...) {
      good_run = false;
    }
    if (replace_state_file) {
      try {
        bfs::copy_file(output_state_file, compare_binary, bfs::copy_option::overwrite_if_exists);
        TEST_COUT << "binary state file replaced" << std::endl;
      } catch (...) {
        TEST_COUT << "binary state file can't be replaced" << std::endl;
        good_run = false;
      }
    }
  }

  TEST_COUT << "_____________________________________________________________" << std::endl;
  ASSERT_TRUE(good_run);
}

}  // namespace test
}  // namespace def