classification
Title: unittest TestResult wasSuccessful returns True when there are unexpected successes
Type: behavior Stage: resolved
Components: Versions: Python 3.4
process
Status: closed Resolution: fixed
Dependencies: Superseder:
Assigned To: gregory.p.smith Nosy List: gregory.p.smith, michael.foord, python-dev, serhiy.storchaka, slattarini
Priority: normal Keywords: patch

Created on 2014-01-07 19:25 by gregory.p.smith, last changed 2022-04-11 14:57 by admin. This issue is now closed.

Files
File name Uploaded Description
issue20165-gps01.diff gregory.p.smith, 2014-01-08 09:29
Messages (8)
msg207582 - (view) Author: Gregory P. Smith (gregory.p.smith) * (Python committer) Date: 2014-01-07 19:25
Python 3.3.3+ (3.3:28337a8fb502+, Jan  7 2014, 01:32:44)
[GCC 4.6.3] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import unittest
>>> r = unittest.result.TestResult()
>>> r.wasSuccessful()
True
>>> r.addUnexpectedSuccess("weird!")
>>> r.wasSuccessful()
True

An unexpected success is not a good thing and indicates a problem.

The wasSuccessful() method should include a check for len(self.unexpectedSuccesses) == 0 as part of its condition.
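A minimal sketch of what that condition could look like (hypothetical; the real method lives in Lib/unittest/result.py and tracks more state than shown here):

class TestResult:
    def __init__(self):
        self.failures = []             # (test, formatted traceback) pairs
        self.errors = []               # (test, formatted traceback) pairs
        self.unexpectedSuccesses = []  # tests marked expectedFailure that passed

    def wasSuccessful(self):
        # Treat an unexpected success as a problem, not a pass.
        return (len(self.failures) == len(self.errors) ==
                len(self.unexpectedSuccesses) == 0)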
msg207583 - (view) Author: Gregory P. Smith (gregory.p.smith) * (Python committer) Date: 2014-01-07 19:26
somewhat related - https://code.google.com/p/unittest-ext/issues/detail?id=22
msg207614 - (view) Author: Michael Foord (michael.foord) * (Python committer) Date: 2014-01-07 23:41
I'm pretty sure this has been debated before (and the status quo is the result). Trying to find the issue.
msg207629 - (view) Author: Michael Foord (michael.foord) * (Python committer) Date: 2014-01-07 23:58
Hmmm... TestTools has wasSuccessful return False on an unexpected success [1] and I can't find an issue for any previous discussion.

I don't use unexpected success, so I have no particular horse in this race, but it seems more logical that wasSuccessful should return False when there are unexpected successes.


[1] https://code.launchpad.net/~jml/testtools/unexpected-success-2/+merge/42050
msg207677 - (view) Author: Gregory P. Smith (gregory.p.smith) * (Python committer) Date: 2014-01-08 09:29
I'm not comfortable changing this for 2.7 or 3.3 in case some code unfortunately depends on this behavior.  But as it is, it does seem like the kind of thing that can hide problems (tests that pass when they are not expected to).

Here's a patch for 3.4 (minus the documentation update that'll need to explicitly mention this with a versionchanged::).
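For illustration, a small self-contained example of how an unexpected success gets recorded (the class and test names are made up):

import unittest

class Demo(unittest.TestCase):
    @unittest.expectedFailure
    def test_known_bug(self):
        # The "known bug" has quietly been fixed, so this assertion now
        # passes and the test is recorded as an unexpected success.
        self.assertEqual(1 + 1, 2)

result = unittest.TestResult()
unittest.defaultTestLoader.loadTestsFromTestCase(Demo).run(result)
print(result.unexpectedSuccesses)  # [<... test_known_bug ...>]
print(result.wasSuccessful())      # True today; False with the patch applied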
msg207734 - (view) Author: Stefano Lattarini (slattarini) Date: 2014-01-09 12:52
Since I too was bitten by this issue, I'd like to support Gregory's
request, and report my rationale for changing the current behaviour.

With the current behaviour, we could see (and I *have* seen) scenarios
like this:
	
  1. A test exposing a known bug is written, and the test is marked
     as "expected failure".
	
  2. Some refactoring and code improvements follow.
	
  3. They improve the overall behaviour/correctness of the program
     or library under test, to the point of fixing the bug "behind
     the scenes".
	
  4. The developer doesn't notice the fix though, since the testing
     framework doesn't warn him "strongly enough" of the fact that the
     test marked as expected failure has begun to pass. (To reiterate,
     the current behaviour of just printing a warning saying "some test
     passed unexpectedly" on the standard output is not a strong enough
     warning; it's easy to miss, and worse, it's *certain* to be missed
     if the tests are run by some CI system or a similar wrapper --
     those would only report failures signalled by a non-zero exit
     status; see the sketch at the end of this message.)
	  
  5. Without noticing that the issue has been fixed, the developer does
     some further changes, which again re-introduce the bug, or a
     similar one that the test still marked as "expected failure" could
     easily catch.
	  
  6. That test starts to fail again; but since it has remained marked
     as "expected failure" all along, the fact is not reported to the
     developer in any way.  So the bug has managed to sneak back in,
     unnoticed.

In addition to this rationale, another (weaker) reason to change the
existing behaviour would be the "principle of least surprise".  Among
other widely used testing frameworks (for Python, or language-agnostic),
many of those that support the concept of "expected failure" will
throw hard errors by default when a test marked as expected failure
starts to pass; among these are:

  * Python libs:
    - py.test (http://pytest.org)
    - Nose (https://pypi.python.org/pypi/nose/1.3.0)

  * Language-agnostic protocols/frameworks:
    - the TAP protocol (the standard in the Perl world)
      (http://en.wikipedia.org/wiki/Test_Anything_Protocol)
    - the Automake "Simple Tests" framework
      (http://www.gnu.org/software/automake/manual/html_node/Tests.html)
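To make point 4 concrete, here is a minimal sketch of the kind of CI wrapper that only inspects the exit status (the layout is made up; unittest.main() exits non-zero only when wasSuccessful() returns False, so with the current behaviour an unexpected success still yields status 0):

import subprocess
import sys

# Run the test suite exactly as a CI job would; only the exit status is
# inspected, so a "1 unexpected success" line printed by unittest is lost.
status = subprocess.call([sys.executable, "-m", "unittest", "discover"])
print("BUILD FAILED" if status != 0 else "BUILD OK")
sys.exit(status)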
msg208530 - (view) Author: Roundup Robot (python-dev) (Python triager) Date: 2014-01-20 09:11
New changeset 1e75ab9fd760 by Gregory P. Smith in branch 'default':
Fixes Issue #20165: The unittest module no longer considers tests marked with
http://hg.python.org/cpython/rev/1e75ab9fd760
msg408695 - (view) Author: Serhiy Storchaka (serhiy.storchaka) * (Python committer) Date: 2021-12-16 11:56
See also issue22815.
History
Date User Action Args
2022-04-11 14:57:56  admin             set     github: 64364
2021-12-16 11:56:09  serhiy.storchaka  set     nosy: + serhiy.storchaka
                                               messages: + msg408695
2014-01-20 09:13:00  gregory.p.smith   set     status: open -> closed
                                               resolution: fixed
                                               stage: patch review -> resolved
2014-01-20 09:11:28  python-dev        set     nosy: + python-dev
                                               messages: + msg208530
2014-01-09 12:52:32  slattarini        set     nosy: + slattarini
                                               messages: + msg207734
2014-01-08 09:29:41  gregory.p.smith   set     files: + issue20165-gps01.diff
                                               versions: - Python 2.7, Python 3.3
                                               messages: + msg207677
                                               keywords: + patch
                                               stage: needs patch -> patch review
2014-01-07 23:58:29  michael.foord     set     messages: + msg207629
2014-01-07 23:41:03  michael.foord     set     messages: + msg207614
2014-01-07 21:04:49  gregory.p.smith   set     nosy: + michael.foord
2014-01-07 19:26:35  gregory.p.smith   set     messages: + msg207583
2014-01-07 19:25:21  gregory.p.smith   create