Google C++ Testing Framework AdvancedGuide

Updated  Jul 23, 2013 by  [email protected]

Advanced topics on using Google C++ Testing Framework.

Contents:

  • More Assertions

  • Explicit Success and Failure
  • Exception Assertions
  • Predicate Assertions for Better Error Messages
    • Using an Existing Boolean Function
    • Using a Function That Returns an AssertionResult
    • Using a Predicate-Formatter
  • Floating-Point Comparison
    • Floating-Point Macros
    • Floating-Point Predicate-Format Functions
  • Windows HRESULT assertions
  • Type Assertions
  • Assertion Placement
  • Teaching Google Test How to Print Your Values
  • Death Tests
    • How to Write a Death Test
    • Regular Expression Syntax
    • How It Works
    • Death Tests And Threads
    • Death Test Styles
    • Caveats
  • Using Assertions in Sub-routines
    • Adding Traces to Assertions
    • Propagating Fatal Failures
      • Asserting on Subroutines
      • Checking for Failures in the Current Test
  • Logging Additional Information
  • Sharing Resources Between Tests in the Same Test Case
  • Global Set-Up and Tear-Down
  • Value Parameterized Tests
    • How to Write Value-Parameterized Tests
    • Creating Value-Parameterized Abstract Tests
  • Typed Tests
  • Type-Parameterized Tests
  • Testing Private Code
    • Static Functions
    • Private Class Members
  • Catching Failures
  • Getting the Current Test's Name
  • Extending Google Test by Handling Test Events
    • Defining Event Listeners
    • Using Event Listeners
    • Generating Failures in Listeners
  • Running Test Programs: Advanced Options
    • Selecting Tests
      • Listing Test Names
      • Running a Subset of the Tests
      • Temporarily Disabling Tests
      • Temporarily Enabling Disabled Tests
    • Repeating the Tests
    • Shuffling the Tests
    • Controlling Test Output
      • Colored Terminal Output
      • Suppressing the Elapsed Time
      • Generating an XML Report
    • Controlling How Failures Are Reported
      • Turning Assertion Failures into Break-Points
      • Disabling Catching Test-Thrown Exceptions
      • Letting Another Testing Framework Drive
    • Distributing Test Functions to Multiple Machines
  • Fusing Google Test Source Files
  • Where to Go from Here
Now that you have read the Primer and learned how to write tests using Google Test, it's time to learn some new tricks. This document will show you more assertions as well as how to construct complex failure messages, propagate fatal failures, reuse and speed up your test fixtures, and use various flags with your tests.

More Assertions

This section covers some less frequently used, but still significant, assertions.

Explicit Success and Failure

These three assertions do not actually test a value or expression. Instead, they generate a success or failure directly. Like the macros that actually perform a test, you may stream a custom failure message into them.

    SUCCEED();

Generates a success. This does NOT make the overall test succeed. A test is considered successful only if none of its assertions fail during its execution.

Note: SUCCEED() is purely documentary and currently doesn't generate any user-visible output. However, we may add SUCCEED() messages to Google Test's output in the future.

FAIL();
ADD_FAILURE();
ADD_FAILURE_AT("file_path", line_number);

FAIL() generates a fatal failure, while ADD_FAILURE() and ADD_FAILURE_AT() generate a nonfatal failure. These are useful when control flow, rather than a Boolean expression, determines the test's success or failure. For example, you might want to write something like:

switch(expression) {
  case 1: ... some checks ...
  case 2: ... some other checks ...
  default: FAIL() << "We shouldn't get here.";
}

Note: you can only use FAIL() in functions that return void. See the Assertion Placement section below for more information.

    Availability: Linux, Windows, Mac.

    Exception Assertions

These assertions verify that a piece of code throws (or does not throw) an exception of the given type:

Fatal assertion Nonfatal assertion Verifies
ASSERT_THROW(statement, exception_type); EXPECT_THROW(statement, exception_type); statement throws an exception of the given type
ASSERT_ANY_THROW(statement); EXPECT_ANY_THROW(statement); statement throws an exception of any type
ASSERT_NO_THROW(statement); EXPECT_NO_THROW(statement); statement doesn't throw any exception

Examples:

ASSERT_THROW(Foo(5), bar_exception);

EXPECT_NO_THROW({
  int n = 5;
  Bar(&n);
});

Availability: Linux, Windows, Mac.

Predicate Assertions for Better Error Messages

Even though Google Test has a rich set of assertions, they can never be complete, as it's impossible (nor a good idea) to anticipate all scenarios a user might run into. Therefore, sometimes a user has to use EXPECT_TRUE() to check a complex expression, for lack of a better macro. This has the problem of not showing you the values of the parts of the expression, making it hard to understand what went wrong. As a workaround, some users choose to construct the failure message by themselves, streaming it into EXPECT_TRUE(). However, this is awkward especially when the expression has side effects or is expensive to evaluate.

Google Test gives you three different options to solve this problem:

Using an Existing Boolean Function

If you already have a function or functor that returns bool (or a type that can be implicitly converted to bool), you can use it in a predicate assertion to get the function arguments printed for free:

Fatal assertion Nonfatal assertion Verifies
ASSERT_PRED1(pred1, val1); EXPECT_PRED1(pred1, val1); pred1(val1) returns true
ASSERT_PRED2(pred2, val1, val2); EXPECT_PRED2(pred2, val1, val2); pred2(val1, val2) returns true
... ... ...

In the above, predn is an n-ary predicate function or functor, and val1, val2, ..., valn are its arguments. The assertion succeeds if the predicate returns true when applied to the given arguments, and fails otherwise. When the assertion fails, it prints the value of each argument. The arguments are always evaluated exactly once.

Here's an example. Given

// Returns true iff m and n have no common divisors except 1.
bool MutuallyPrime(int m, int n) { ... }
const int a = 3;
const int b = 4;
const int c = 10;

the assertion EXPECT_PRED2(MutuallyPrime, a, b); will succeed, while the assertion EXPECT_PRED2(MutuallyPrime, b, c); will fail with a message that shows the values of b and c.

Note: If you see a compiler error "no matching function to call" when using ASSERT_PRED* or EXPECT_PRED*, see the FAQ for how to resolve it.

    Availability: Linux, Windows, Mac
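As a self-contained sketch of how these pieces fit together in a test file (the helper implementation and test name below are hypothetical, not taken from the guide):

#include "gtest/gtest.h"

// Hypothetical helper: returns true iff m and n have no common divisors except 1.
bool MutuallyPrime(int m, int n) {
  for (int k = 2; k <= m && k <= n; ++k) {
    if (m % k == 0 && n % k == 0) return false;
  }
  return true;
}

TEST(PredicateAssertionTest, PrintsArgumentsOnFailure) {
  const int a = 3;
  const int b = 4;
  const int c = 10;
  EXPECT_PRED2(MutuallyPrime, a, b);  // Succeeds.
  // EXPECT_PRED2(MutuallyPrime, b, c) would fail and print the values of b and c.
  EXPECT_FALSE(MutuallyPrime(b, c));
}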

    Using a Function That Returns an AssertionResult

While EXPECT_PRED*() and friends are handy for a quick job, the syntax is not satisfactory: you have to use different macros for different arities, and it feels more like Lisp than C++. The ::testing::AssertionResult class solves this problem.

An AssertionResult object represents the result of an assertion: whether it's a success or a failure, and an associated message. You can create an AssertionResult using one of these factory functions:

namespace testing {

// Returns an AssertionResult object to indicate that an assertion has
// succeeded.
AssertionResult AssertionSuccess();

// Returns an AssertionResult object to indicate that an assertion has
// failed.
AssertionResult AssertionFailure();

}

You can then use the << operator to stream messages into the AssertionResult object.

To provide more readable messages in Boolean assertions (e.g. EXPECT_TRUE()), write a predicate function that returns AssertionResult instead of bool. For example, if you define IsEven() as:

::testing::AssertionResult IsEven(int n) {
  if ((n % 2) == 0)
    return ::testing::AssertionSuccess();
  else
    return ::testing::AssertionFailure() << n << " is odd";
}

instead of:

bool IsEven(int n) {
  return (n % 2) == 0;
}

the failed assertion EXPECT_TRUE(IsEven(Fib(4))) will include the predicate's message ("3 is odd") along with the usual actual/expected output, telling you why the check failed rather than just that it failed.

If you want informative messages in EXPECT_FALSE and ASSERT_FALSE as well, stream a message into the success result too:

::testing::AssertionResult IsEven(int n) {
  if ((n % 2) == 0)
    return ::testing::AssertionSuccess() << n << " is even";
  else
    return ::testing::AssertionFailure() << n << " is odd";
}

Then a failed EXPECT_FALSE(IsEven(Fib(6))) will print "8 is even" along with the failure, rather than leaving you to work out the value yourself.

Availability: Linux, Windows, Mac.
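For a quick, runnable sketch of the idea (test name invented; IsEven() is the informative version defined above):

#include "gtest/gtest.h"

::testing::AssertionResult IsEven(int n) {
  if ((n % 2) == 0)
    return ::testing::AssertionSuccess() << n << " is even";
  return ::testing::AssertionFailure() << n << " is odd";
}

TEST(AssertionResultTest, EvenAndOdd) {
  EXPECT_TRUE(IsEven(4));   // Passes; if it failed, the output would include "... is odd".
  EXPECT_FALSE(IsEven(7));  // Passes; if it failed, the output would include "... is even".
}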

Using a Predicate-Formatter

If you find the default message generated by (ASSERT|EXPECT)_PRED* and (ASSERT|EXPECT)_TRUE unsatisfactory, or some arguments to your predicate do not support streaming to ostream, you can instead use the following predicate-formatter assertions to fully customize how the message is formatted:

Fatal assertion Nonfatal assertion Verifies
ASSERT_PRED_FORMAT1(pred_format1, val1); EXPECT_PRED_FORMAT1(pred_format1, val1); pred_format1(val1) is successful
ASSERT_PRED_FORMAT2(pred_format2, val1, val2); EXPECT_PRED_FORMAT2(pred_format2, val1, val2); pred_format2(val1, val2) is successful
... ... ...

The difference from the previous two groups of macros is that instead of a predicate, (ASSERT|EXPECT)_PRED_FORMAT* take a predicate-formatter: a function or functor that receives both the argument values and the source-code text of the argument expressions, and returns a ::testing::AssertionResult. As an example, let's improve the failure message of the MutuallyPrime() check used with EXPECT_PRED2():

// Returns the smallest prime common divisor of m and n,
// or 1 when m and n are mutually prime.
int SmallestPrimeCommonDivisor(int m, int n) { ... }

// A predicate-formatter for asserting that two integers are mutually prime.
::testing::AssertionResult AssertMutuallyPrime(const char* m_expr,
                                               const char* n_expr,
                                               int m,
                                               int n) {
  if (MutuallyPrime(m, n))
    return ::testing::AssertionSuccess();

  return ::testing::AssertionFailure()
      << m_expr << " and " << n_expr << " (" << m << " and " << n
      << ") are not mutually prime, " << "as they have a common divisor "
      << SmallestPrimeCommonDivisor(m, n);
}

With this predicate-formatter, we can use

EXPECT_PRED_FORMAT2(AssertMutuallyPrime, b, c);

to generate the message

b and c (4 and 10) are not mutually prime, as they have a common divisor 2.

As you may have realized, many of the built-in assertions introduced earlier are special cases of (EXPECT|ASSERT)_PRED_FORMAT*; in fact, most of them are defined using it.

Availability: Linux, Windows, Mac.

Floating-Point Comparison

Comparing floating-point numbers is tricky. Due to round-off errors, it is very unlikely that two floating-point values will match exactly, so ASSERT_EQ's naive comparison usually doesn't work. Because floating-point values can span a huge range, no single fixed absolute error bound is appropriate either; whether an error is acceptable depends on the magnitude of the values involved. In general it makes more sense to compare by a relative error bound or in terms of Units in the Last Place (ULPs), and Google Test provides assertions that do the latter as a reasonable default. For the full background, see this article on float comparison.

    Floating-Point Macros

    Fatal assertion Nonfatal assertion Verifies
    ASSERT_FLOAT_EQ(expected, actual); EXPECT_FLOAT_EQ(expected, actual); the two float values are almost equal
    ASSERT_DOUBLE_EQ(expected, actual); EXPECT_DOUBLE_EQ(expected, actual); the two double values are almost equal

    By "almost equal", we mean the two values are within 4 ULP's from each other.

    The following assertions allow you to choose the acceptable error bound:

    Fatal assertion Nonfatal assertion Verifies
ASSERT_NEAR(val1, val2, abs_error); EXPECT_NEAR(val1, val2, abs_error); the difference between val1 and val2 doesn't exceed the given absolute error

Availability: Linux, Windows, Mac.
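As a brief illustration (the test name and values are invented for this sketch), the ULP-based and absolute-error macros might be used like this:

#include <cmath>
#include "gtest/gtest.h"

TEST(FloatingPointTest, AlmostEqual) {
  const double pi = 3.14159265358979323846;
  // ULP-based comparison: suitable when the values should agree to full precision.
  EXPECT_DOUBLE_EQ(2.0, std::sqrt(4.0));
  // Absolute-error comparison: suitable when you know an acceptable tolerance.
  EXPECT_NEAR(pi, 22.0 / 7.0, 0.002);
}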

Floating-Point Predicate-Format Functions

Some floating-point operations are useful, but not used often enough to deserve their own macros. To avoid an explosion of new macros, we provide them as predicate-format functions that can be used with the predicate assertion macros (e.g. EXPECT_PRED_FORMAT2):

EXPECT_PRED_FORMAT2(::testing::FloatLE, val1, val2);
EXPECT_PRED_FORMAT2(::testing::DoubleLE, val1, val2);

Verifies that val1 is less than, or almost equal to, val2. You can replace EXPECT_PRED_FORMAT2 in the above with ASSERT_PRED_FORMAT2.

Availability: Linux, Windows, Mac.

Windows HRESULT assertions

    These assertions test for HRESULT success or failure.

    Fatal assertion Nonfatal assertion Verifies
    ASSERT_HRESULT_SUCCEEDED(expression); EXPECT_HRESULT_SUCCEEDED(expression); expression is a success HRESULT
ASSERT_HRESULT_FAILED(expression); EXPECT_HRESULT_FAILED(expression); expression is a failure HRESULT

The generated output contains the human-readable error message associated with the HRESULT code returned by expression.

You might use them like this:

CComPtr<IShellDispatch2> shell;
ASSERT_HRESULT_SUCCEEDED(shell.CoCreateInstance(L"Shell.Application"));
CComVariant empty;
ASSERT_HRESULT_SUCCEEDED(shell->ShellExecute(CComBSTR(url), empty, empty, empty, empty));

    Availability: Windows.

    Type Assertions

You can call the function

::testing::StaticAssertTypeEq<T1, T2>();

to assert that the types T1 and T2 are the same. The function does nothing if the assertion is satisfied. If the types are different, the call fails to compile, and the compiler error message will likely (depending on the compiler) show you the actual types. This is mainly useful inside template code.

Caveat: When used inside a member function of a class template or a function template, StaticAssertTypeEq<T1, T2>() is effective only if the function is instantiated. For example, given:

template <typename T> class Foo {
 public:
  void Bar() { ::testing::StaticAssertTypeEq<int, T>(); }
};

the code:

void Test1() { Foo<bool> foo; }

will not generate a compiler error, as Foo<bool>::Bar() is never actually instantiated. Instead, you need:

void Test2() { Foo<bool> foo; foo.Bar(); }

to cause a compiler error.

Availability: Linux, Windows, Mac.

Assertion Placement

You can use assertions in any C++ function. In particular, it doesn't have to be a method of the test fixture class. The one constraint is that assertions that generate a fatal failure (FAIL* and ASSERT_*) can only be used in void-returning functions. This is a consequence of Google Test not using exceptions; placing one in a non-void function produces a confusing compile error such as "error: void value not ignored as it ought to be".

If you need fatal assertions in a function that returns a value, one option is to return the value through an out parameter instead; just make sure the out parameter holds a sensible value even when the function returns prematurely. If changing the function's signature is not an option, use assertions that generate non-fatal failures, such as ADD_FAILURE* and EXPECT_*, instead.

Note: According to the C++ standard, constructors and destructors are not void-returning functions, so you may not use fatal assertions in them; you'll get a compilation error if you try. A simple workaround is to move the body into a private void-returning method. Also be aware that a fatal assertion failure in a constructor or destructor does not terminate the current test; it only returns from that function early, possibly leaving the object in a partially-constructed or partially-destructed state.
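Here is a minimal sketch of that constraint, with invented helper names: the void-returning helper may use a fatal assertion, while the value-returning one sticks to non-fatal checks:

#include <vector>
#include "gtest/gtest.h"

// OK: fatal assertions (ASSERT_*) are allowed here because the helper returns void.
static void GetFirstElement(const std::vector<int>& v, int* out) {
  ASSERT_FALSE(v.empty()) << "vector has no elements";
  *out = v[0];
}

// This function returns a value, so it must use non-fatal assertions only.
static int SafeFirstElement(const std::vector<int>& v) {
  EXPECT_FALSE(v.empty()) << "vector has no elements";
  return v.empty() ? 0 : v[0];
}

TEST(AssertionPlacementTest, HelpersReportFailures) {
  std::vector<int> v(1, 42);
  int first = 0;
  GetFirstElement(v, &first);  // A fatal failure here aborts GetFirstElement() only.
  EXPECT_EQ(42, first);
  EXPECT_EQ(42, SafeFirstElement(v));
}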

Teaching Google Test How to Print Your Values

When a test assertion such as EXPECT_EQ fails, Google Test prints the argument values to help you debug. It does this using a user-extensible value printer, which knows how to print built-in C++ types, native arrays, STL containers, and any type that supports the << operator. For other types, it prints the raw bytes in the value and hopes you can figure it out.

Because the printer is extensible, you can teach it to print your particular type better than a byte dump. To do that, define << for your type:

#include <iostream>

namespace foo {

class Bar { ... };

// We want Google Test to be able to print instances of this.
// It's important that the << operator is defined in the SAME
// namespace that defines Bar.  C++'s look-up rules rely on that.
::std::ostream& operator<<(::std::ostream& os, const Bar& bar) {
  return os << bar.DebugString();  // whatever needed to print bar to os
}

}  // namespace foo

Sometimes, this might not be an option: your team may consider it bad style to have a << operator for Bar, or Bar may already have a << operator that doesn't do what you want (and you cannot change it). If so, you can instead define a PrintTo() function like this:

#include <iostream>

namespace foo {

class Bar { ... };

// It's important that PrintTo() is defined in the SAME
// namespace that defines Bar.  C++'s look-up rules rely on that.
void PrintTo(const Bar& bar, ::std::ostream* os) {
  *os << bar.DebugString();  // whatever needed to print bar to os
}

}  // namespace foo

If you have defined both << and PrintTo(), the latter is the one Google Test uses. This allows you to customize how the value appears in test output without affecting code that relies on the << operator.

If you want to print a value x using Google Test's value printer yourself, just call ::testing::PrintToString(x), which returns a std::string:

vector<pair<Bar, int> > bar_ints = GetBarIntVector();

EXPECT_TRUE(IsCorrectBarIntVector(bar_ints))
    << "bar_ints = " << ::testing::PrintToString(bar_ints);

    Death Tests

    In many applications, there are assertions that can cause application failure if a condition is not met. These sanity checks, which ensure that the program is in a known good state, are there to fail at the earliest possible time after some program state is corrupted. If the assertion checks the wrong condition, then the program may proceed in an erroneous state, which could lead to memory corruption, security holes, or worse. Hence it is vitally important to test that such assertion statements work as expected.

    Since these precondition checks cause the processes to die, we call such tests death tests. More generally, any test that checks that a program terminates (except by throwing an exception) in an expected fashion is also a death test.

    Note that if a piece of code throws an exception, we don't consider it "death" for the purpose of death tests, as the caller of the code could catch the exception and avoid the crash. If you want to verify exceptions thrown by your code, see Exception Assertions.

    If you want to test EXPECT_*()/ASSERT_*() failures in your test code, see Catching Failures.

    How to Write a Death Test

    Google Test has the following macros to support death tests:

    Fatal assertion Nonfatal assertion Verifies
ASSERT_DEATH(statement, regex); EXPECT_DEATH(statement, regex); statement crashes with the given error
ASSERT_DEATH_IF_SUPPORTED(statement, regex); EXPECT_DEATH_IF_SUPPORTED(statement, regex); if death tests are supported, verifies that statement crashes with the given error; otherwise verifies nothing
ASSERT_EXIT(statement, predicate, regex); EXPECT_EXIT(statement, predicate, regex); statement exits with the given error and its exit code matches predicate

where statement is a statement that is expected to cause the process to die, predicate is a function or function object that evaluates an integer exit status, and regex is a regular expression that the stderr output of statement is expected to match. Note that statement can be any valid statement (including a compound statement) and doesn't have to be an expression. As usual, the ASSERT variants abort the current test function, while the EXPECT variants don't.

Note: "crashes" here means that the process terminates with a non-zero exit status: either it called exit() or _exit() with a non-zero value, or it was killed by a signal. If statement terminates the process with exit code 0, use EXPECT_EXIT instead, which also lets you constrain the exit status more precisely.

A predicate must accept an int and return a bool; the death test succeeds only if the predicate returns true. Google Test defines a few predicates that handle the most common cases:

::testing::ExitedWithCode(exit_code)

This expression is true if the program exited normally with the given exit code.

::testing::KilledBySignal(signal_number)  // Not available on Windows.

This expression is true if the program was killed by the given signal.

To write a death test, simply use one of the above macros inside your test function. For example,

TEST(MyDeathTest, Foo) {
  // This death test uses a compound statement.
  ASSERT_DEATH({ int n = 5; Foo(&n); }, "Error on line .* of Foo()");
}

TEST(MyDeathTest, NormalExit) {
  EXPECT_EXIT(NormalExit(), ::testing::ExitedWithCode(0), "Success");
}

TEST(MyDeathTest, KillMyself) {
  EXPECT_EXIT(KillMyself(), ::testing::KilledBySignal(SIGKILL),
              "Sending myself unblockable signal");
}

    verifies that:

    • calling Foo(5) causes the process to die with the given error message,
    • calling NormalExit() causes the process to print "Success" to stderr and exit with exit code 0, and
    • calling KillMyself() kills the process with signal SIGKILL.

    The test function body may contain other assertions and statements as well, if necessary.

    Important: We strongly recommend you to follow the convention of naming your test case (not test) *DeathTest when it contains a death test, as demonstrated in the above example. The Death Tests And Threads section below explains why.

If a test fixture class is shared by normal tests and death tests, you can use typedef to introduce an alias for the fixture class and avoid duplicating its code:

class FooTest : public ::testing::Test { ... };

typedef FooTest FooDeathTest;

TEST_F(FooTest, DoesThis) {
  // normal test
}

TEST_F(FooDeathTest, DoesThat) {
  // death test
}

Availability: Linux, Windows (requires MSVC 8.0 or above), Cygwin, and Mac.

Regular Expression Syntax

On POSIX systems (e.g. Linux, Cygwin, and Mac), Google Test uses the POSIX extended regular expression syntax in death tests. To learn about this syntax, you may want to read its Wikipedia entry.

On Windows, Google Test uses its own simple regular expression implementation. It lacks many features you can find in POSIX extended regular expressions. For example, we don't support union ("x|y"), grouping ("(xy)"), brackets ("[xy]"), and repetition count ("x{5,7}"), among others. Below is what we do support (A denotes a literal character, a period (.), or a single \\ escape sequence; x and y denote regular expressions.):

    c matches any literal character c
    \\d matches any decimal digit
    \\D matches any character that's not a decimal digit
    \\f matches \f
    \\n matches \n
    \\r matches \r
    \\s matches any ASCII whitespace, including \n
    \\S matches any character that's not a whitespace
    \\t matches \t
    \\v matches \v
    \\w matches any letter, _, or decimal digit
    \\W matches any character that \\w doesn't match
    \\c matches any literal character c, which must be a punctuation
    \\. matches the . character
    . matches any single character except \n
    A? matches 0 or 1 occurrences of A
    A* matches 0 or many occurrences of A
    A+ matches 1 or many occurrences of A
    ^ matches the beginning of a string (not that of each line)
    $ matches the end of a string (not that of each line)
xy matches x followed by y

To help you determine which capability is available on your system, Google Test defines the macro GTEST_USES_POSIX_RE=1 when it uses POSIX extended regular expressions, or GTEST_USES_SIMPLE_RE=1 when it uses the simple version. If you want your death tests to work in both cases, you can either #if on these macros or use only the more limited syntax.

How It Works

Under the hood, ASSERT_EXIT() spawns a new process and executes the death test statement in that process. The details depend on the platform and the variable ::testing::GTEST_FLAG(death_test_style), which is initialized from the command-line flag --gtest_death_test_style.

• On POSIX systems, fork() (or clone() on Linux) is used to spawn the child, after which:
  • If the variable's value is "fast", the death test statement is immediately executed.
  • If the variable's value is "threadsafe", the child process re-executes the unit test binary just as it was originally invoked, but with some extra flags that cause just the single death test under consideration to be run.
• On Windows, the child is spawned using the CreateProcess() API, and re-executes the binary to run just the single death test, much like the "threadsafe" mode on POSIX.

Other values for the variable are illegal and will cause the death test to fail. The flag currently defaults to "fast", but we reserve the right to change that, so don't rely on the default.

In either case, the parent process waits for the child to complete, and checks that (1) the child's exit status satisfies the predicate, and (2) the child's stderr matches the regular expression. If the death test statement runs to completion without dying, the child process will nonetheless terminate, and the assertion fails.

Death Tests And Threads

The reason for the two death test styles is thread safety. Due to well-known problems with forking in the presence of threads, death tests should be run in a single-threaded context, yet sometimes that can't be arranged (for example, statically-initialized modules may start threads before main() is reached). Google Test has three features intended to raise awareness of threading issues:

1. A warning is emitted if multiple threads are running when a death test is encountered.
2. Test cases with a name ending in "DeathTest" are run before all other tests.
3. It uses clone() instead of fork() to spawn the child process on Linux (clone() is not available on Cygwin and Mac), as fork() is more likely to cause the child to hang when the parent process has multiple threads.

It's perfectly fine to create threads inside a death test statement; they are executed in a separate process and cannot affect the parent.

Death Test Styles

    The "threadsafe">::testing::FLAGS_gtest_death_test_style = "threadsafe";

You can do this in main() to set the style for all death tests in the binary, or in individual tests. The flag is saved before each test and restored afterwards, so you need not do that yourself. For example:

TEST(MyDeathTest, TestOne) {
  ::testing::FLAGS_gtest_death_test_style = "threadsafe";
  // This test is run in the "threadsafe" style:
  ASSERT_DEATH(ThisShouldDie(), "");
}

TEST(MyDeathTest, TestTwo) {
  // This test is run in the "fast" style:
  ASSERT_DEATH(ThisShouldDie(), "");
}

int main(int argc, char** argv) {
  ::testing::InitGoogleTest(&argc, argv);
  ::testing::FLAGS_gtest_death_test_style = "fast";
  return RUN_ALL_TESTS();
}

    Caveats

The statement argument of ASSERT_EXIT() can be any valid C++ statement. If it leaves the current function via a return statement or by throwing an exception, the death test is considered to have failed. Some Google Test macros may return from the current function (e.g. ASSERT_TRUE()), so be sure to avoid them in statement.

Since statement runs in a child process, any in-memory side effects it causes (modifying a variable, releasing memory, etc.) are not observable in the parent process. In particular, memory released inside a death test will still appear leaked to a heap checker running in the parent.

Also note that, due to an implementation detail, you cannot place multiple death test assertions on the same line; doing so fails to compile with an unobvious error message.

Using Assertions in Sub-routines

    Adding Traces to Assertions

If a test sub-routine is called from several places, when an assertion inside it fails, it can be hard to tell which invocation of the sub-routine the failure is from. You can alleviate this problem using extra logging or custom failure messages, but that usually clutters up your tests. A better solution is to use the SCOPED_TRACE macro:

    SCOPED_TRACE(message);

where message can be anything streamable to std::ostream. This macro causes the current file name, line number, and the given message to be appended to every failure message in its scope. The effect is undone when control leaves the current lexical scope.

For example,

10: void Sub1(int n) {
11:   EXPECT_EQ(1, Bar(n));
12:   EXPECT_EQ(2, Bar(n + 1));
13: }
14:
15: TEST(FooTest, Bar) {
16:   {
17:     SCOPED_TRACE("A");  // This trace point will be included in
18:                         // every failure in this scope.
19:     Sub1(1);
20:   }
21:   // Now it won't.
22:   Sub1(9);
23: }

could result in messages like these:

path/to/foo_test.cc:11: Failure
Value of: Bar(n)
Expected: 1
  Actual: 2
   Trace:
path/to/foo_test.cc:17: A

path/to/foo_test.cc:12: Failure
Value of: Bar(n + 1)
Expected: 2
  Actual: 3

Without the trace, it would be hard to tell which invocation of Sub1() the two failures come from. (You could stream the value of n into each assertion in Sub1(), but that's tedious.)

Some tips on using SCOPED_TRACE:

1. With a suitable message, it's often enough to use SCOPED_TRACE at the beginning of a sub-routine, instead of at each call site.
2. When calling sub-routines inside a loop, make the loop iterator part of the message so you know which iteration the failure is from.
3. If the line number of the trace point is enough to identify the invocation, you can simply pass "" as the message.
4. You can nest SCOPED_TRACE; all active trace points are included in failure messages, in reverse order of entry.

Availability: Linux, Windows, Mac.

Propagating Fatal Failures

A common pitfall when using ASSERT_* and FAIL* is not understanding that when they fail they only abort the current function, not the entire test. For example, the following test will segfault:

void Subroutine() {
  // Generates a fatal failure and aborts the current function.
  ASSERT_EQ(1, 2);
  // The following won't be executed.
  ...
}

TEST(FooTest, Bar) {
  Subroutine();
  // The intended behavior is for the fatal failure
  // in Subroutine() to abort the entire test.
  // The actual behavior: the function goes on after Subroutine() returns.
  int* p = NULL;
  *p = 3;  // Segfault!
}

Since Google Test doesn't use exceptions, it is technically impossible to make the fatal failure propagate out of Subroutine() automatically. To alleviate this, Google Test provides two solutions: the (ASSERT|EXPECT)_NO_FATAL_FAILURE assertions and the HasFatalFailure() function, described in the following two subsections.

Asserting on Subroutines

    As shown above, if your test calls a subroutine that has an ASSERT_* failure in it, the test will continue after the subroutine returns. This may not be what you want.

    Often people want fatal failures to propagate like exceptions. For that Google Test offers the following macros:

    Fatal assertion Nonfatal assertion Verifies
ASSERT_NO_FATAL_FAILURE(statement); EXPECT_NO_FATAL_FAILURE(statement); statement doesn't generate any new fatal failures in the current thread

Only failures in the thread that executes the assertion are checked; if statement creates new threads, failures in those threads are ignored.

Examples:

ASSERT_NO_FATAL_FAILURE(Foo());

int i;
EXPECT_NO_FATAL_FAILURE({
  i = Bar();
});

Availability: Linux, Windows, Mac. Assertions from multiple threads are currently not supported.

Checking for Failures in the Current Test

HasFatalFailure() in the ::testing::Test class returns true if an assertion in the current test has suffered a fatal failure. This allows functions to catch fatal failures in a sub-routine and return early.

class Test {
 public:
  ...
  static bool HasFatalFailure();
};

The typical usage, which basically simulates the behavior of a thrown exception, is:

TEST(FooTest, Bar) {
  Subroutine();
  // Aborts if Subroutine() had a fatal failure.
  if (HasFatalFailure())
    return;
  // The following won't be executed.
  ...
}

If HasFatalFailure() is used outside of TEST(), TEST_F(), or a test fixture, you must add the ::testing::Test:: prefix, as in:

if (::testing::Test::HasFatalFailure())
  return;

Similarly, HasNonfatalFailure() returns true if the current test has at least one non-fatal failure, and HasFailure() returns true if the current test has at least one failure of either kind.

Availability: Linux, Windows, Mac.

Logging Additional Information

In your test code, you can call RecordProperty("key", value) to log additional information, where value can be either a string or an int. The last value recorded for a key will be emitted to the XML output if you specify one. For example, the test

TEST_F(WidgetUsageTest, MinAndMaxWidgets) {
  RecordProperty("MaximumWidgets", ComputeMaxUsage());
  RecordProperty("MinimumWidgets", ComputeMinUsage());
}

will output XML like this:

...
<testcase name="MinAndMaxWidgets" status="run" time="6" classname="WidgetUsageTest"
          MaximumWidgets="12" MinimumWidgets="9" />
...

    Note:

• RecordProperty() is a static member of the Test class. If it is used outside of the TEST body and the test fixture class, it must be prefixed with ::testing::Test::.
• key must be a valid XML attribute name and cannot conflict with the attributes already used by Google Test (name, status, time, and classname).

Availability: Linux, Windows, Mac.

Sharing Resources Between Tests in the Same Test Case

Google Test creates a new test fixture object for each test in order to make tests independent and easier to debug. However, sometimes tests use resources that are expensive to set up, making the one-copy-per-test model prohibitively expensive.

If the tests don't change the resource, there's no harm in sharing a single copy. In addition to per-test set-up/tear-down, Google Test therefore supports per-test-case set-up/tear-down: declare the shared resources as static members of the fixture, and define static void SetUpTestCase() and static void TearDownTestCase() functions (note the capital U in SetUpTestCase). Google Test calls SetUpTestCase() before running the first test in the test case and TearDownTestCase() after running the last one. Remember that the test order is undefined, so your code can't depend on one test preceding or following another, and the tests must either not modify the shared state or restore it before finishing.

Here's an example of per-test-case set-up and tear-down:

class FooTest : public ::testing::Test {
 protected:
  // Per-test-case set-up.
  // Called before the first test in this test case.
  // Can be omitted if not needed.
  static void SetUpTestCase() {
    shared_resource_ = new ...;
  }

  // Per-test-case tear-down.
  // Called after the last test in this test case.
  // Can be omitted if not needed.
  static void TearDownTestCase() {
    delete shared_resource_;
    shared_resource_ = NULL;
  }

  // You can define per-test set-up and tear-down logic as usual.
  virtual void SetUp() { ... }
  virtual void TearDown() { ... }

  // Some expensive resource shared by all tests.
  static T* shared_resource_;
};

T* FooTest::shared_resource_ = NULL;

TEST_F(FooTest, Test1) {
  ... you can refer to shared_resource here ...
}

TEST_F(FooTest, Test2) {
  ... you can refer to shared_resource here ...
}

      Availability: Linux, Windows, Mac.

      Global Set-Up and Tear-Down

Just as you can do set-up and tear-down at the test level and the test case level, you can also do it at the test program level. First, subclass the ::testing::Environment class to define a test environment, which knows how to set up and tear down:

class Environment {
 public:
  virtual ~Environment() {}
  // Override this to define how to set up the environment.
  virtual void SetUp() {}
  // Override this to define how to tear down the environment.
  virtual void TearDown() {}
};

Then, register an instance of your environment class with Google Test by calling the ::testing::AddGlobalTestEnvironment() function:

Environment* AddGlobalTestEnvironment(Environment* env);

Now, when RUN_ALL_TESTS() is called, it first calls the SetUp() method of the environment object, then runs the tests if there was no fatal failure, and finally calls TearDown() of the environment object. It's OK to register multiple environment objects; their SetUp() methods are called in registration order and their TearDown() methods in reverse order. Note that Google Test takes ownership of the registered environment objects, so do not delete them yourself.

You should call AddGlobalTestEnvironment() before RUN_ALL_TESTS() is called, probably in main(). If you use gtest_main, the call needs to happen before main() starts. One way to do this is to define a global variable like this:

::testing::Environment* const foo_env = ::testing::AddGlobalTestEnvironment(new FooEnvironment);

However, we strongly recommend writing your own main() and calling AddGlobalTestEnvironment() there: relying on the initialization of global variables makes the code harder to read, and the initialization order of globals from different translation units is unspecified, which can cause problems when environments registered in different files depend on one another.

Availability: Linux, Windows, Mac.
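For example, a minimal sketch (FooEnvironment is a hypothetical name) of a concrete environment registered from your own main():

#include "gtest/gtest.h"

class FooEnvironment : public ::testing::Environment {
 public:
  virtual ~FooEnvironment() {}
  // Called once before the first test runs.
  virtual void SetUp() { /* e.g. start a server, open a connection, ... */ }
  // Called once after the last test finishes.
  virtual void TearDown() { /* release whatever SetUp() acquired */ }
};

int main(int argc, char** argv) {
  ::testing::InitGoogleTest(&argc, argv);
  // Google Test takes ownership of the registered environment object.
  ::testing::AddGlobalTestEnvironment(new FooEnvironment);
  return RUN_ALL_TESTS();
}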

Value Parameterized Tests

Value-parameterized tests allow you to test your code with different parameters without writing multiple copies of the same test.

Suppose you write a test for your code and then realize that your code is affected by the value of a Boolean command-line flag.

TEST(MyCodeTest, TestFoo) {
  // A code to test foo().
}

Usually people factor the test code into a function that takes the flag value as a parameter, then call it once per value:

void TestFooHelper(bool flag_value) {
  flag = flag_value;
  // A code to test foo().
}

TEST(MyCodeTest, TestFoo) {
  TestFooHelper(false);
  TestFooHelper(true);
}

But this setup has serious drawbacks. First, when an assertion fails, it's unclear which value of the parameter caused it, unless you stream a clarifying message into every assertion. Second, you have to write one such helper per test, which doesn't scale.

Value-parameterized tests let you write the test once and then instantiate and run it with an arbitrary number of parameter values. They are also handy when you want to test different implementations of an OO interface, or to run your code over various inputs (data-driven testing).

How to Write Value-Parameterized Tests

To write value-parameterized tests, first define a fixture class. It must be derived from both ::testing::Test and ::testing::WithParamInterface<T> (the latter is a pure interface), where T is the type of your parameter values. For convenience, you can just derive from ::testing::TestWithParam<T>, which itself derives from both. T can be any copyable type; if it's a raw pointer, you are responsible for managing the lifespan of the pointed-to values.

class FooTest : public ::testing::TestWithParam<const char*> {
  // You can implement all the usual fixture class members here.
  // To access the test parameter, call GetParam() from class
  // TestWithParam<T>.
};

// Or, when you want to add parameters to a pre-existing fixture class:
class BaseTest : public ::testing::Test {
  ...
};
class BarTest : public BaseTest,
                public ::testing::WithParamInterface<const char*> {
  ...
};

Then, use the TEST_P macro to define as many test patterns using this fixture as you want. The _P suffix stands for "parameterized" or "pattern", whichever you prefer.

TEST_P(FooTest, DoesBlah) {
  // Inside a test, access the test parameter with the GetParam() method
  // of the TestWithParam<T> class:
  EXPECT_TRUE(foo.Blah(GetParam()));
  ...
}

TEST_P(FooTest, HasBlahBlah) {
  ...
}

      Finally, you can use INSTANTIATE_TEST_CASE_P to instantiate the test case with any set of parameters you want. Google Test defines a number of functions for generating test parameters. They return what we call (surprise!) parameter generators. Here is a summary of them, which are all in the testing namespace:

      Range(begin, end[, step]) Yields values {begin, begin+step, begin+step+step, ...}. The values do not include end. step defaults to 1.
      Values(v1, v2, ..., vN) Yields values {v1, v2, ..., vN}.
      ValuesIn(container)andValuesIn(begin, end) Yields values from a C-style array, an STL-style container, or an iterator range [begin, end). container, begin, and end can be expressions whose values are determined at run time.
      Bool() Yields sequence {false, true}.
Combine(g1, g2, ..., gN) Yields all combinations (the Cartesian product, for the math savvy) of the values generated by the N generators. This is only available if your system provides the <tr1/tuple> header; see the comments in include/gtest/internal/gtest-port.h for how to override the detection if needed.

For more details, see the comments at the definitions of these functions in the source code. A short usage sketch of some of these generators follows below.
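Here is a brief sketch, with invented fixture and test names, of how Range() and Bool() might be used:

#include "gtest/gtest.h"

// A hypothetical fixture parameterized by an int.
class QueueSizeTest : public ::testing::TestWithParam<int> {};

TEST_P(QueueSizeTest, SizeIsNonNegative) {
  EXPECT_GE(GetParam(), 0);
}

// Range(0, 10, 2) yields 0, 2, 4, 6, 8 (the end value 10 is excluded).
INSTANTIATE_TEST_CASE_P(EvenSizes, QueueSizeTest,
                        ::testing::Range(0, 10, 2));

// A hypothetical fixture parameterized by a bool, instantiated with Bool().
class VerboseFlagTest : public ::testing::TestWithParam<bool> {};

TEST_P(VerboseFlagTest, AcceptsEitherValue) {
  const bool verbose = GetParam();
  EXPECT_TRUE(verbose || !verbose);
}

INSTANTIATE_TEST_CASE_P(BothValues, VerboseFlagTest, ::testing::Bool());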

For example, the following statement will instantiate tests from the FooTest test case, each with the parameter values "meeny", "miny", and "moe":

INSTANTIATE_TEST_CASE_P(InstantiationName,
                        FooTest,
                        ::testing::Values("meeny", "miny", "moe"));

      To distinguish different instances of the pattern (yes, you can instantiate it more than once), the first argument to INSTANTIATE_TEST_CASE_P is a prefix that will be added to the actual test case name. Remember to pick unique prefixes for different instantiations. The tests from the instantiation above will have these names:

      • InstantiationName/FooTest.DoesBlah/0 for "meeny"
      • InstantiationName/FooTest.DoesBlah/1 for "miny"
      • InstantiationName/FooTest.DoesBlah/2 for "moe"
      • InstantiationName/FooTest.HasBlahBlah/0 for "meeny"
      • InstantiationName/FooTest.HasBlahBlah/1 for "miny"
      • InstantiationName/FooTest.HasBlahBlah/2 for "moe"

      You can use these names in --gtest_filter.

This statement will instantiate all tests from FooTest again, this time with parameter values "cat" and "dog":

const char* pets[] = {"cat", "dog"};
INSTANTIATE_TEST_CASE_P(AnotherInstantiationName, FooTest,
                        ::testing::ValuesIn(pets));

Please note that INSTANTIATE_TEST_CASE_P will instantiate all tests in the given test case, whether their definitions come before or after the INSTANTIATE_TEST_CASE_P statement. You can find more examples in the samples shipped with Google Test.

Availability: Linux, Windows, Mac.

Creating Value-Parameterized Abstract Tests

In the above, we define and instantiate FooTest in the same source file. Sometimes you may want to define value-parameterized tests in a library and let other people instantiate them later. This pattern is known as abstract tests. As an example, when designing an interface you can write a standard suite of abstract tests (perhaps using a factory function as the test parameter) that all implementations of the interface are expected to pass. Whoever implements the interface can then instantiate your suite to get all the interface-conformance tests for free.

To define abstract tests, organize your code like this:

1. Put the definition of the parameterized test fixture class (e.g. FooTest) in a header file, say foo_param_test.h. Think of this as declaring your abstract tests.
2. Put the TEST_P definitions in foo_param_test.cc, which includes foo_param_test.h. Think of this as implementing your abstract tests.

Once they are defined, you can instantiate them by including foo_param_test.h, invoking INSTANTIATE_TEST_CASE_P(), and linking with foo_param_test.cc. You can instantiate the same abstract test case multiple times, possibly in different source files.

Typed Tests

Suppose you have multiple implementations of the same interface and want to make sure that all of them satisfy some common requirements. Or, you may have defined several types that are supposed to conform to the same "concept" and want to verify that. In both cases, you want the same test logic repeated for different types.

Typed tests let you repeat the same test logic over a list of types. You only need to write the test logic once, although you must know the type list when writing typed tests. Here's how to do it.

First, define a fixture class template, parameterized by a type, derived from ::testing::Test:

template <typename T>
class FooTest : public ::testing::Test {
 public:
  ...
  typedef std::list<T> List;
  static T shared_;
  T value_;
};

Next, associate a list of types with the test case; the tests will be repeated for each type in the list:

typedef ::testing::Types<char, int, unsigned int> MyTypes;
TYPED_TEST_CASE(FooTest, MyTypes);

The typedef is necessary for the TYPED_TEST_CASE macro to parse correctly; otherwise the compiler would think that each comma in the type list introduces a new macro argument.

Then, use TYPED_TEST() instead of TEST_F() to define a typed test for this test case. You can repeat this as many times as you want:

TYPED_TEST(FooTest, DoesBlah) {
  // Inside a test, refer to the special name TypeParam to get the type
  // parameter.  Since we are inside a derived class template, C++ requires
  // us to visit the members of FooTest via 'this'.
  TypeParam n = this->value_;

  // To visit static members of the fixture, add the 'TestFixture::'
  // prefix.
  n += TestFixture::shared_;

  // To refer to typedefs in the fixture, add the 'typename TestFixture::'
  // prefix.  The 'typename' is required to satisfy the compiler.
  typename TestFixture::List values;
  values.push_back(n);
  ...
}

TYPED_TEST(FooTest, HasPropertyA) { ... }

You can see samples/sample6_unittest.cc for a complete example.

Availability: Linux, Windows (requires MSVC 8.0 or above), Mac.

Type-Parameterized Tests

Type-parameterized tests are like typed tests, except that they don't require you to know the list of types ahead of time. Instead, you can define the test logic first and instantiate it with different type lists later, even multiple times in the same program.

If you are designing an interface or concept, you can define a suite of type-parameterized tests to verify properties that any valid implementation should have. The author of each implementation can then simply instantiate the suite with their type to verify conformance, without writing similar tests repeatedly. Here's how to do it.

First, define a fixture class template, as we did with typed tests:

template <typename T>
class FooTest : public ::testing::Test {
  ...
};

Next, declare that you will define a type-parameterized test case:

TYPED_TEST_CASE_P(FooTest);

The _P suffix stands for "parameterized" or "pattern", whichever you prefer. Then, use TYPED_TEST_P() to define a type-parameterized test. You can repeat this as many times as you want:

TYPED_TEST_P(FooTest, DoesBlah) {
  // Inside a test, refer to TypeParam to get the type parameter.
  TypeParam n = 0;
  ...
}

TYPED_TEST_P(FooTest, HasPropertyA) { ... }

Now the tricky part: you need to register all test patterns using the REGISTER_TYPED_TEST_CASE_P macro before you can instantiate them. The first argument of the macro is the test case name; the rest are the names of the tests in this test case:

REGISTER_TYPED_TEST_CASE_P(FooTest,
                           DoesBlah, HasPropertyA);

Finally, you are free to instantiate the pattern with the types you want. If you put the above code in a header file, you can #include it in multiple C++ source files and instantiate it multiple times.

typedef ::testing::Types<char, int, unsigned int> MyTypes;
INSTANTIATE_TYPED_TEST_CASE_P(My, FooTest, MyTypes);

To distinguish different instances of the pattern, the first argument to INSTANTIATE_TYPED_TEST_CASE_P is a prefix that will be added to the actual test case name. Remember to pick unique prefixes for different instances.

In the special case where the type list contains only one type, you can write that type directly, without Types<...>:

INSTANTIATE_TYPED_TEST_CASE_P(My, FooTest, int);

You can see samples/sample6_unittest.cc for a complete example.

Availability: Linux, Windows (requires MSVC 8.0 or above), Mac.

Testing Private Code

If you change your software's internal implementation, your tests should not break as long as the change is not observable by users. Therefore, per the black-box testing principle, most of the time you should test your code through its public interfaces.

If you still find yourself needing to test internal implementation code, consider whether there's a better design that wouldn't require you to do so. If you absolutely have to test non-public code, there are two cases to consider:

• Static functions (not the same as static member functions!) or unnamed namespaces, and
• Private or protected class members.

Static Functions

Both static functions and definitions/declarations in an unnamed namespace are only visible within the same translation unit. To test them, you can #include the entire .cc file being tested in your *_test.cc file. (#including .cc files is not a good way to reuse code; you should not do this in production code!)

A better approach, however, is to move the private code into the foo::internal namespace, where foo is the namespace your project normally uses, and put the private declarations in a *-internal.h file. Your production .cc files and your tests are allowed to include this internal header, but your clients are not. This way, you can fully test your internal implementation without leaking it to your clients.

Private Class Members

Private class members are only accessible from within the class or by friends. To access a class' private members, you can declare your test fixture as a friend of the class and define accessors in the fixture. Tests using the fixture can then access the private members through those accessors. Note that even though your fixture is a friend of the production class, your tests are not automatically friends of it, as they are technically defined in sub-classes of the fixture.

Another option is to refactor the private members into an implementation class declared in a *-internal.h file that only your tests (not your clients) may include; this is the Pimpl (Private Implementation) idiom.

Or, you can declare an individual test as a friend of your class by adding this line in the class body:

FRIEND_TEST(TestCaseName, TestName);

For example,

// foo.h
#include "gtest/gtest_prod.h"  // Defines FRIEND_TEST.

class Foo {
  ...
 private:
  FRIEND_TEST(FooTest, BarReturnsZeroOnNull);
  int Bar(void* x);
};

// foo_test.cc
...
TEST(FooTest, BarReturnsZeroOnNull) {
  Foo foo;
  EXPECT_EQ(0, foo.Bar(NULL));  // Uses Foo's private member Bar().
}

Pay special attention when your class is defined in a namespace: if you want your test fixtures and tests to be friends of the class, they must be defined in exactly the same namespace. For example, if the code to be tested looks like:

namespace my_namespace {

class Foo {
  friend class FooTest;
  FRIEND_TEST(FooTest, Bar);
  FRIEND_TEST(FooTest, Baz);
  ... definition of the class Foo ...
};

}  // namespace my_namespace

Your test code should be something like:

namespace my_namespace {

class FooTest : public ::testing::Test {
 protected:
  ...
};

TEST_F(FooTest, Bar) { ... }
TEST_F(FooTest, Baz) { ... }

}  // namespace my_namespace

      Catching Failures

      If you are building a testing utility on top of Google Test, you'll want to test your utility. What framework would you use to test it? Google Test, of course.

      The challenge is to verify that your testing utility reports failures correctly. In frameworks that report a failure by throwing an exception, you could catch the exception and assert on it. But Google Test doesn't use exceptions, so how do we test that a piece of code generates an expected failure?

      "gtest/gtest-spi.h" contains some constructs to do this. After #including this header, you can use

      EXPECT_FATAL_FAILURE(statement, substring);

      to assert that statement generates a fatal (e.g. ASSERT_*) failure whose message contains the given substring, or use

      EXPECT_NONFATAL_FAILURE(statement, substring);

      if you are expecting a non-fatal (e.g. EXPECT_*) failure.

      For technical reasons, there are some caveats:

      1. You cannot stream a failure message to either macro.
      2. statement in EXPECT_FATAL_FAILURE() cannot reference local non-static variables or non-static members of this object.
      3. statement in EXPECT_FATAL_FAILURE() cannot return a value.
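For illustration, here is a minimal sketch (the checked helper is hypothetical) of verifying that a utility reports the expected non-fatal failure:

#include "gtest/gtest.h"
#include "gtest/gtest-spi.h"

// A hypothetical test utility built on top of Google Test.
void ExpectPositive(int n) {
  EXPECT_GT(n, 0) << "value must be positive";
}

TEST(MyUtilityTest, ReportsNonPositiveValues) {
  // Passes iff the statement produces exactly one non-fatal failure
  // whose message contains the given substring.
  EXPECT_NONFATAL_FAILURE(ExpectPositive(-5), "must be positive");
}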

Note: Google Test is designed with threads in mind. Once the synchronization primitives in "gtest/internal/gtest-port.h" have been implemented, Google Test will become thread-safe, meaning that you can then use assertions in multiple threads concurrently. Before that, however, Google Test only supports single-threaded usage. Once thread-safe, EXPECT_FATAL_FAILURE() and EXPECT_NONFATAL_FAILURE() will capture failures in the current thread only. If statement creates new threads, failures in these threads will be ignored. If you want to capture failures from all threads instead, you should use the following macros:

      EXPECT_FATAL_FAILURE_ON_ALL_THREADS(statement, substring);
EXPECT_NONFATAL_FAILURE_ON_ALL_THREADS(statement, substring);

Availability: Linux, Windows, Mac.

Getting the Current Test's Name

Sometimes a function may need to know the name of the currently running test. For example, you may be using the SetUp() method of your test fixture to set a golden file name based on which test is running. The ::testing::TestInfo class has this information:

namespace testing {

class TestInfo {
 public:
  // Returns the test case name and the test name, respectively.
  //
  // Do NOT delete or free the return value - it's managed by the
  // TestInfo class.
  const char* test_case_name() const;
  const char* name() const;
};

}  // namespace testing

To obtain a TestInfo object for the currently running test, call current_test_info() on the UnitTest singleton object:

// Gets information about the currently running test.
// Do NOT delete the returned object - it's managed by the UnitTest class.
const ::testing::TestInfo* const test_info =
    ::testing::UnitTest::GetInstance()->current_test_info();
printf("We are in test %s of test case %s.\n",
       test_info->name(), test_info->test_case_name());

current_test_info() returns a null pointer if no test is running. In particular, you cannot find the test case name in SetUpTestCase(), TearDownTestCase(), or functions called from them (where you know the test case name implicitly anyway).

Availability: Linux, Windows, Mac.

Extending Google Test by Handling Test Events

Google Test provides an event listener API to let you receive notifications about the progress of a test program and about test failures. The events you can listen to include the start and end of the test program, a test case, or a test method, among others. You may use this API to augment or replace the standard console output, replace the XML output, or provide a completely different form of output, such as a GUI or a database. You can also use test events as checkpoints to implement a resource leak checker, for example.

Defining Event Listeners

To define an event listener, subclass either testing::TestEventListener or testing::EmptyTestEventListener. The former is an (abstract) interface, where each pure virtual method can be overridden to handle a test event (for example, when a test starts, the OnTestStart() method will be called). The latter provides an empty implementation of all methods in the interface, so a subclass only needs to override the methods it cares about.

When an event is fired, its context is passed to the handler function as an argument. The following argument types are used:

• UnitTest reflects the state of the entire test program,
• TestCase has information about a test case, which can contain one or more tests,
• TestInfo contains the state of a test, and
• TestPartResult represents the result of a test assertion.

An event handler function can examine the argument it receives to find out interesting information about the event and the test program's state. Here's an example:

class MinimalistPrinter : public ::testing::EmptyTestEventListener {
  // Called before a test starts.
  virtual void OnTestStart(const ::testing::TestInfo& test_info) {
    printf("*** Test %s.%s starting.\n",
           test_info.test_case_name(), test_info.name());
  }

  // Called after a failed assertion or a SUCCEED() invocation.
  virtual void OnTestPartResult(
      const ::testing::TestPartResult& test_part_result) {
    printf("%s in %s:%d\n%s\n",
           test_part_result.failed() ? "*** Failure" : "Success",
           test_part_result.file_name(),
           test_part_result.line_number(),
           test_part_result.summary());
  }

  // Called after a test ends.
  virtual void OnTestEnd(const ::testing::TestInfo& test_info) {
    printf("*** Test %s.%s ending.\n",
           test_info.test_case_name(), test_info.name());
  }
};

      Using Event Listeners

To use the event listener you have defined, add an instance of it to the Google Test event listener list (represented by the class TestEventListeners - note the "s" at the end of the name) in your main() function, before calling RUN_ALL_TESTS():

int main(int argc, char** argv) {
  ::testing::InitGoogleTest(&argc, argv);
  // Gets hold of the event listener list.
  ::testing::TestEventListeners& listeners =
      ::testing::UnitTest::GetInstance()->listeners();
  // Adds a listener to the end.  Google Test takes the ownership.
  listeners.Append(new MinimalistPrinter);
  return RUN_ALL_TESTS();
}

There's only one problem: the default test result printer is still in effect, so its output will mingle with the output from your minimalist printer. To suppress the default printer, just release it from the event listener list and delete it:

  ...
  delete listeners.Release(listeners.default_result_printer());
  listeners.Append(new MinimalistPrinter);
  return RUN_ALL_TESTS();

Now, sit back and enjoy a completely different output from your tests. For more details, see the event listener sample shipped with Google Test.

You may append more than one listener to the list. When an OnXyzStart or OnTestPartResult event is fired, the listeners receive it in the order they appear in the list (since new listeners are added to the end, the default text printer and the default XML generator receive the event first). An OnXyzEnd event is received by the listeners in the reverse order. This allows output produced by listeners added later to be framed by output from listeners added earlier.

Generating Failures in Listeners

You may use failure-raising macros (EXPECT_*(), ASSERT_*(), FAIL(), etc.) when processing an event, with some restrictions:

1. You cannot generate any failure in OnTestPartResult() (otherwise it would cause OnTestPartResult() to be called recursively).
2. A listener that handles OnTestPartResult() is not allowed to generate any failure.

When you add listeners to the list, put listeners that handle OnTestPartResult() before listeners that can generate failures. This ensures that failures generated by the latter are attributed to the right test by the former. A sample of a failure-raising listener is included in the Google Test distribution.

      Running Test Programs: Advanced Options

Google Test test programs are ordinary executables. Once built, you can run them directly and affect their behavior via environment variables and/or command-line flags. For the flags to work, your program must call ::testing::InitGoogleTest() before calling RUN_ALL_TESTS(). To see a list of supported flags and their usage, run your test program with the --help flag.

If an option is specified both by an environment variable and by a flag, the latter takes precedence. Most of the options can also be set or read in code: to access the value of the command-line flag --gtest_foo, write ::testing::GTEST_FLAG(foo). A common pattern is to set the value of a flag before calling ::testing::InitGoogleTest() to change its default value:

int main(int argc, char** argv) {
  // Disables elapsed time by default.
  ::testing::GTEST_FLAG(print_time) = false;

  // This allows the user to override the flag on the command line.
  ::testing::InitGoogleTest(&argc, argv);

  return RUN_ALL_TESTS();
}

      Selecting Tests

This section shows the options for selecting which tests to run.

Listing Test Names

Sometimes it is necessary to list the available tests in a program before running them, so that a filter may be applied if needed. Passing the --gtest_list_tests flag overrides all other flags and lists tests in the following format:

TestCase1.
  TestName1
  TestName2
TestCase2.
  TestName

None of the listed tests are actually run if the flag is provided. There is no corresponding environment variable for this flag.

Availability: Linux, Windows, Mac.

Running a Subset of the Tests

By default, a Google Test program runs all tests the user has defined. Sometimes you want to run only a subset of the tests (for example, when debugging or quickly verifying a change). If you set the GTEST_FILTER environment variable or the --gtest_filter flag to a filter string, Google Test will only run the tests whose full names (in the form of TestCaseName.TestName) match the filter.

The format of a filter is a ':'-separated list of wildcard patterns (the positive patterns), optionally followed by a '-' and another ':'-separated pattern list (the negative patterns). A test matches the filter if and only if it matches any of the positive patterns and none of the negative patterns. A pattern may contain '*' (matches any string) or '?' (matches any single character). For convenience, the filter '*-NegativePatterns' can also be written as '-NegativePatterns'.

For example:

• ./foo_test Has no flag, and thus runs all its tests.
• ./foo_test --gtest_filter=* Also runs everything, due to the single match-everything * value.
• ./foo_test --gtest_filter=FooTest.* Runs everything in test case FooTest.
• ./foo_test --gtest_filter=*Null*:*Constructor* Runs any test whose full name contains either "Null" or "Constructor".
• ./foo_test --gtest_filter=-*DeathTest.* Runs all non-death tests.
• ./foo_test --gtest_filter=FooTest.*-FooTest.Bar Runs everything in test case FooTest except FooTest.Bar.

Availability: Linux, Windows, Mac.

      Temporarily Disabling Tests

If you have a broken test that you cannot fix right away, you can add the DISABLED_ prefix to its name. This will exclude it from execution. This is better than commenting out the code or using #if 0, as disabled tests are still compiled (and thus won't rot).

If you need to disable all tests in a test case, you can either add DISABLED_ to the front of the name of each test, or add it to the front of the test case name. For example, the following tests won't be run by Google Test, even though they will still be compiled:

// Tests that Foo does Abc.
TEST(FooTest, DISABLED_DoesAbc) { ... }

class DISABLED_BarTest : public ::testing::Test { ... };

// Tests that Bar does Xyz.
TEST_F(DISABLED_BarTest, DoesXyz) { ... }

Note: This feature should only be used for temporary pain relief. You still have to fix the disabled tests at a later date. As a reminder, Google Test prints a banner warning you if a test program contains any disabled tests.

Availability: Linux, Windows, Mac.

Temporarily Enabling Disabled Tests

To include disabled tests in test execution, just invoke the test program with the --gtest_also_run_disabled_tests flag or set the GTEST_ALSO_RUN_DISABLED_TESTS environment variable to a value other than 0. You can combine this with the --gtest_filter flag to further select which disabled tests to run.

Availability: Linux, Windows, Mac.
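If you prefer to set this up in code rather than in the environment, one possible sketch is to change the flag's default before InitGoogleTest(), just as with print_time earlier; the command line can still override it:

#include "gtest/gtest.h"

int main(int argc, char** argv) {
  // Run DISABLED_ tests too by default; --gtest_also_run_disabled_tests=0
  // on the command line can still override this.
  ::testing::GTEST_FLAG(also_run_disabled_tests) = true;
  ::testing::InitGoogleTest(&argc, argv);
  return RUN_ALL_TESTS();
}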

Repeating the Tests

Once in a while you'll run into a test whose result is hit-or-miss. Perhaps it will fail only 1% of the time, making it rather hard to reproduce the bug under a debugger. This can be a major source of frustration.

      The --gtest_repeat flag allows you to repeat all (or selected) test methods in a program many times. Hopefully, a flaky test will eventually fail and give you a chance to debug. Here's how to use it:

$ foo_test --gtest_repeat=1000
Repeat foo_test 1000 times and don't stop at failures.

$ foo_test --gtest_repeat=-1
A negative count means repeating forever.

$ foo_test --gtest_repeat=1000 --gtest_break_on_failure
Repeat foo_test 1000 times, stopping at the first failure. This is especially useful when running under a debugger: when the test fails, it will drop into the debugger and you can then inspect variables and stacks.

$ foo_test --gtest_repeat=1000 --gtest_filter=FooBar
Repeat the tests whose name matches the filter 1000 times.

If your test program contains global set-up/tear-down code registered using AddGlobalTestEnvironment(), it will be repeated in each iteration as well, as the flakiness may be in it. You can also specify the repeat count by setting the GTEST_REPEAT environment variable.

Availability: Linux, Windows, Mac.

Shuffling the Tests

You can specify the --gtest_shuffle flag (or set the GTEST_SHUFFLE environment variable to 1) to run the tests in a random order. This helps to reveal bad dependencies between tests.

By default, Google Test uses a random seed calculated from the current time, so you get a different order every time. The console output includes the seed value, so you can reproduce an order-related failure later. To specify the seed explicitly, use the --gtest_random_seed=SEED flag (or set the GTEST_RANDOM_SEED environment variable), where SEED is an integer between 0 and 99999. The seed value 0 is special: it tells Google Test to calculate the seed from the current time, which is the default behavior.

If you combine this with --gtest_repeat=N, Google Test will pick a different random seed and re-shuffle the tests in each iteration.
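The same settings can be made programmatically; a small sketch (the seed value is arbitrary):

#include "gtest/gtest.h"

int main(int argc, char** argv) {
  // Shuffle with a fixed seed so that a failing order can be reproduced;
  // --gtest_shuffle and --gtest_random_seed on the command line still win.
  ::testing::GTEST_FLAG(shuffle) = true;
  ::testing::GTEST_FLAG(random_seed) = 12345;
  ::testing::InitGoogleTest(&argc, argv);
  return RUN_ALL_TESTS();
}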

Availability: Linux, Windows, Mac.

Controlling Test Output

This section teaches how to tweak the way test results are reported.

Colored Terminal Output

Google Test can use colors in its terminal output to make it easier to spot the separation between tests and whether tests passed. You can set the GTEST_COLOR environment variable or the --gtest_color command-line flag to yes, no, or auto (the default) to enable colors, disable colors, or let Google Test decide. When the value is auto, Google Test uses colors if and only if the output goes to a terminal and (on non-Windows platforms) the TERM environment variable is set to xterm or xterm-color.

Availability: Linux, Windows, Mac.

Suppressing the Elapsed Time

By default, Google Test prints the time it takes to run each test. To suppress that, run the test program with the --gtest_print_time=0 command-line flag. Setting the GTEST_PRINT_TIME environment variable to 0 has the same effect.

Availability: Linux, Windows, Mac.

Generating an XML Report

Google Test can emit a detailed XML report to a file in addition to its normal textual output. The report contains the duration of each test, and thus can help you identify slow tests.

To generate the XML report, set the GTEST_OUTPUT environment variable or the --gtest_output flag to the string "xml:path_to_output_file", which will create the file at the given location. You can also just use the string "xml", in which case the output goes to test_detail.xml in the current directory. If you specify a directory, Google Test creates the XML file in that directory, named after the test executable (e.g. foo_test.xml for test program foo_test); if the file already exists, Google Test picks a different name (e.g. foo_test_1.xml) to avoid overwriting it.

The report format is based on the junitreport Ant task and can be parsed by popular continuous build systems like Hudson. Since that format was originally intended for Java, a little interpretation is required to make it apply to Google Test tests, as shown here:

<testsuites name="AllTests" ...>
  <testsuite name="test_case_name" ...>
    <testcase name="test_name" ...>
      <failure message="..."/>
      <failure message="..."/>
      <failure message="..."/>
    </testcase>
  </testsuite>
</testsuites>

• The root <testsuites> element corresponds to the entire test program.
• <testsuite> elements correspond to Google Test test cases.
• <testcase> elements correspond to Google Test test functions.

For instance, the following program

TEST(MathTest, Addition) { ... }
TEST(MathTest, Subtraction) { ... }
TEST(LogicTest, NonContradiction) { ... }

could generate this report:

<?xml version="1.0" encoding="UTF-8"?>
<testsuites tests="3" failures="1" errors="0" time="35" name="AllTests">
  <testsuite name="MathTest" tests="2" failures="1" errors="0" time="15">
    <testcase name="Addition" status="run" time="7" classname="">
      <failure message="Value of: add(1, 1)&#x0A;  Actual: 3&#x0A;Expected: 2" type=""/>
      <failure message="Value of: add(1, -1)&#x0A;  Actual: 1&#x0A;Expected: 0" type=""/>
    </testcase>
    <testcase name="Subtraction" status="run" time="5" classname="">
    </testcase>
  </testsuite>
  <testsuite name="LogicTest" tests="1" failures="0" errors="0" time="5">
    <testcase name="NonContradiction" status="run" time="5" classname="">
    </testcase>
  </testsuite>
</testsuites>

Things to note:

• The tests attribute of a <testsuites> or <testsuite> element tells how many test functions the Google Test program or test case contains, while the failures attribute tells how many of them failed.
• The time attribute expresses the duration of the test, test case, or entire test program in milliseconds.
• Each <failure> element corresponds to a single failed Google Test assertion.
• Some JUnit concepts don't apply to Google Test, so the report contains some dummy elements and attributes; you can safely ignore these.

Availability: Linux, Windows, Mac.

Controlling How Failures Are Reported

        Turning Assertion Failures into Break-Points

When running test programs under a debugger, it's very convenient if the debugger can catch an assertion failure and automatically drop into interactive mode. Google Test's break-on-failure mode supports this behavior. To enable it, set the GTEST_BREAK_ON_FAILURE environment variable to a value other than 0, or use the --gtest_break_on_failure command-line flag.

Availability: Linux, Windows, Mac.

Disabling Catching Test-Thrown Exceptions

Google Test can be used either with or without exceptions enabled. If a test throws a C++ exception or (on Windows) a structured exception (SEH), by default Google Test catches it, reports it as a test failure, and continues with the next test method. This maximizes the coverage of a test run; it also prevents an uncaught exception from producing a pop-up window on Windows, so the tests can run unattended.

When debugging test failures, however, you may want exceptions to be handled by the debugger so you can examine the call stack when one is thrown. To achieve that, set the GTEST_CATCH_EXCEPTIONS environment variable to 0, or use the --gtest_catch_exceptions=0 flag when running the tests.

        Availability: Linux, Windows, Mac.

        Letting Another Testing Framework Drive

        If>#include "gtest/gtest.h" int main(int argc, char** argv) { ::testing::GTEST_FLAG(throw_on_failure) = true; // Important: Google Test must be initialized. ::testing::InitGoogleTest(&argc, argv); ... whatever your existing testing framework requires ... }

With that, you can use Google Test assertions in addition to the assertions your testing framework provides, for example:

void TestFooDoesBar() {
  Foo foo;
  EXPECT_LE(foo.Bar(1), 100);     // A Google Test assertion.
  CPPUNIT_ASSERT(foo.IsEmpty());  // A native assertion.
}

If a Google Test assertion fails, it will print an error message and throw an exception, which will be treated as a failure by your host testing framework. If you compile your code with exceptions disabled, a failed Google Test assertion will instead exit your program with a non-zero code, which also signals a test failure to your test runner. Setting ::testing::GTEST_FLAG(throw_on_failure) = true; in your main(), as shown above, is what enables this behavior.

Availability: Linux, Windows, Mac.

Distributing Test Functions to Multiple Machines

If you have more than one machine you can use to run a test program, you might want to run the test functions in parallel and get the result faster. We call this technique sharding, where each machine is called a shard.

Google Test is compatible with test sharding. To take advantage of this feature, your test runner (not part of Google Test) needs to do the following:

1. Allocate a number of machines (shards) to run the tests.
2. On each shard, set the GTEST_TOTAL_SHARDS environment variable to the total number of shards. It must be the same for all shards.
3. On each shard, set the GTEST_SHARD_INDEX environment variable to the index of the shard. Different shards must be assigned different indices, which must be in the range [0, GTEST_TOTAL_SHARDS - 1].
4. On each shard, run the same test program. Google Test reads the two environment variables and picks a subset of the test functions to run. Across all shards, each test function in the program will be run exactly once.
5. Wait for all shards to finish, then collect and report the results.

For example, suppose you have a test program foo_test that contains the following five test functions:

TEST(A, V)
TEST(A, W)
TEST(B, X)
TEST(B, Y)
TEST(B, Z)

and you have three machines at your disposal. To run the test functions in parallel, you would set GTEST_TOTAL_SHARDS to 3 on all machines, set GTEST_SHARD_INDEX to 0, 1, and 2 on the machines respectively, and then run the same foo_test on each machine. Google Test reserves the right to change how the work is distributed across the shards, but one possible outcome is that machine #0 gets A.V and B.X, machine #1 gets A.W and B.Y, and machine #2 gets B.Z.

Availability: Linux, Windows, Mac.

Fusing Google Test Source Files

Google Test's implementation consists of many files. Sometimes you may want them packaged up into two files (a .h and a .cc) instead, so that you can easily copy them to a new machine and start hacking there. For this we provide an experimental Python script, fuse_gtest_files.py, in the scripts/ directory. Assuming you have Python installed, just go to that directory and run

python fuse_gtest_files.py OUTPUT_DIR

and you should see an OUTPUT_DIR directory being created with files gtest/gtest.h and gtest/gtest-all.cc in it. These files contain everything you need to use Google Test. Just copy them to anywhere you want and you are ready to write tests. You can use the scripts/test/Makefile file as an example of how to compile your tests against them.

Where to Go from Here

Congratulations! You've now learned more advanced Google Test tools and are ready to tackle more complex testing tasks. If you want to dive even deeper, read the Frequently-Asked Questions.
