
doctest should support fixtures #49149

Closed
dalloliogm mannequin opened this issue Jan 9, 2009 · 11 comments
Labels
tests Tests in the Lib/test dir type-feature A feature request or enhancement

dalloliogm mannequin commented Jan 9, 2009

BPO 4899
Nosy @tim-one, @rhettinger, @terryjreedy, @abalkin

Note: these values reflect the state of the issue at the time it was migrated and might not reflect the current state.


GitHub fields:

assignee = 'https://github.com/tim-one'
closed_at = <Date 2014-06-27.20:11:16.828>
created_at = <Date 2009-01-09.16:18:59.943>
labels = ['type-feature', 'tests']
title = 'doctest should support fixtures'
updated_at = <Date 2014-06-27.20:11:16.827>
user = 'https://bugs.python.org/dalloliogm'

bugs.python.org fields:

activity = <Date 2014-06-27.20:11:16.827>
actor = 'terry.reedy'
assignee = 'tim.peters'
closed = True
closed_date = <Date 2014-06-27.20:11:16.828>
closer = 'terry.reedy'
components = ['Tests']
creation = <Date 2009-01-09.16:18:59.943>
creator = 'dalloliogm'
dependencies = []
files = []
hgrepos = []
issue_num = 4899
keywords = []
message_count = 11.0
messages = ['79477', '79495', '79513', '79743', '79754', '79757', '79788', '113362', '113381', '221693', '221716']
nosy_count = 7.0
nosy_names = ['tim.peters', 'rhettinger', 'terry.reedy', 'belopolsky', 'LambertDW', 'dalloliogm', 'BreamoreBoy']
pr_nums = []
priority = 'normal'
resolution = 'rejected'
stage = 'resolved'
status = 'closed'
superseder = None
type = 'enhancement'
url = 'https://bugs.python.org/issue4899'
versions = ['Python 3.2']


dalloliogm mannequin commented Jan 9, 2009

The doctest module should have support for fixtures (e.g. setUp and
tearDown methods).

It is tedious to be forced to re-import all the needed modules in every
function tested with doctest. For example, when you have to document
functions that open files, you have to import StringIO or tempfile and
then create a file in every docstring, while this could be done once in
a fixture.
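For reference, doctest's unittest integration already exposes fixture hooks: doctest.DocTestSuite accepts setUp and tearDown callables that receive the running DocTest. A minimal sketch, using an in-memory module as an illustrative stand-in for a real one:

```python
import doctest
import types
import unittest

# Illustrative module whose doctest relies on a name ("sample") injected
# by a fixture, instead of importing or creating it in the docstring.
mod = types.ModuleType("mod")
mod.__doc__ = """
>>> sample
'ACGT'
"""

def set_up(test):
    # Called before the doctest runs; test.globs is the doctest's namespace.
    test.globs["sample"] = "ACGT"

suite = doctest.DocTestSuite(mod, setUp=set_up)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # prints True
```

The same setUp/tearDown hooks exist on doctest.DocFileSuite for doctests kept in text files.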

@dalloliogm dalloliogm mannequin added tests Tests in the Lib/test dir type-feature A feature request or enhancement labels Jan 9, 2009

lambertdw mannequin commented Jan 9, 2009

I disagree. The purpose of __doc__ is to explain functionality all at
once. This command idiom is useful:

$ python -c 'from a_module import thing; help(thing)'

The doctest module is a lightweight nicety that helps verify whatever is
suitable for it. The sufficiently simple algorithms in my code have
docstrings that are the complete test and explanation. For others I
provide both a docstring and unit tests. But for many I explain the
arguments, the output, and possibly the algorithm in a docstring; tests
and use-case examples reside in the module's unit tests.

I'm in the "choose the correct tool for the job; Python comes with a
full tool bag" camp.

@rhettinger
Contributor

I concur with David. This is not in the spirit of the doctest module.
We already have the heavyweight unittest module as an alternative when
more firepower is needed. Also, adding more infrastructure to this
already lengthy module will make it harder to learn, use, and remember.

Tim, I recommend rejecting this request.


dalloliogm mannequin commented Jan 13, 2009

I was proposing to adopt doctest in the Biopython project (modules for
bioinformatics in Python, http://biopython.org/).

Doctest is very useful for documenting modules that will be used by many
other people: for example, there are many different file formats in
bioinformatics, and it is very useful to include an example of the file
to be parsed in the documentation of a file parser.
Look at my code here:

http://github.com/dalloliogm/biopython---popgen/blob/980419dbc0666e2578c2486dab1fef23ccfbb72c/src/PopGen/Gio/TpedIO.py

However, it is very inconvenient to have to create a file-like object
in every docstring, especially when you want to document the methods of
a class. It would be useful if at least the doctests of a class's
methods could share the objects created in the main doctest of the
class.

Say I have a class called FastaIO (FASTA is a file format for
sequences). This class would have many methods: format, to_dict (which
returns a dictionary of the sequences included in the file), and many
others. The main docstring of the class would show an example of a
FASTA file and how to create an instance of FastaIO.
It is silly to have to repeat that setup (creating a FastaIO instance)
in every method's docstring. Moreover, it is harder to maintain and
more error prone (imagine a class with one hundred methods).


lambertdw mannequin commented Jan 13, 2009

My goodness, that's the starting base sequence of gene 38c, chromosome 4
of the Columbian caldera cricket! But seriously...

  1. The relevant part of the docstring is this, and this is how it
     should read (your argument being "if doctest provided a setUp
     framework, my docstring would look like this!"):

     def TpedIterator(handle):
         '''
         Iterates over a TPed file handle.
         Returns Marker objects.
             Tped_stream = open('cricket.sequence', 'r')
             ti = TpedIterator(Tped_stream)
             for marker in ti:
                 use(marker)
         '''

  2. (With the caveat that I am unfamiliar with your project.) You should
     choose terminology appropriate for your project. A computer scientist
     would expect a "file handle" to be an integer. What you call "handle"
     is clearly a "stream" object and therefore uncommonly named. Since
     the file objects are more likely to come from computer science than
     from the biological realm, you should stick with "stream".

  3. We agree: "Don't Repeat Yourself". The last two chunks of your file
     enable doctest. I'll guess that similar lines may be found repeated
     throughout your sources. Instead of internal support, write a single
     test script that provides external support. It would process named
     files with unittest and doctest[, and customtests]:

         $ python -c 'import test' glob

     There may be a Python library for this. I can't guide you easily
     because I built and use my own framework. Nor have I bothered to
     figure out how Python runs its own installation tests.

  4. Yes, unit tests are quite appropriate for your project. When you
     move your docstring tests to unit tests, try to isolate the tests.
     For instance, the test you've shown couples the TpedIterator with
     the string representation of a Marker object.

  5. Is your system really so simple that it's useful to run
     interactively? I may be out of touch, but I script most of my code
     and tests because I make so many errors and module changes. In other
     words, is your interactive docstring example a reasonable use case?
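The single external test-script idea above can be sketched as a tiny runner that aggregates doctest results across modules. The in-memory demo module below is a hypothetical stand-in for real project files; a real runner would import modules by name or glob source files:

```python
import doctest
import types

def run_doctests(modules):
    """Run doctest over each module object; return the total failure count."""
    total_failed = 0
    for mod in modules:
        total_failed += doctest.testmod(mod, verbose=False).failed
    return total_failed

# Demo with an in-memory module standing in for real project modules.
demo = types.ModuleType("demo")
demo.__doc__ = """
>>> 2 + 2
4
"""
print(run_doctests([demo]))  # prints 0
```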


lambertdw mannequin commented Jan 13, 2009

For unit tests I recommend two things instead of a change to doctest:
a decoupled strict test to prove that the iterator works, and this class
to publish,

class Tped_use_cases(...):

    def test_Marker_iteration(self):
        '''
        Illustrative code adapted from what is now your doctest
        '''

...


abalkin commented Jan 13, 2009

The OP's problem, i.e. the need to re-import modules in every docstring,
can easily be addressed by injecting the necessary names using the
extraglobs argument to doctest.testmod().

I like the following trick:

def getextraglobs():
    import StringIO, tempfile, ...
    return locals()

doctest.testmod(extraglobs=getextraglobs())

This, however, does not solve the problem of cleanup after examples, and
some way to specify a tearDown fixture would be helpful, particularly
because cleanup code in examples rarely improves documentation.
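A runnable sketch of this extraglobs trick (on Python 3, io replaces StringIO; the in-memory module and its docstring are illustrative):

```python
import doctest
import types

def getextraglobs():
    # Names imported here become available in every doctest example.
    import io, tempfile
    return locals()

# Illustrative module whose doctest uses io without importing it itself.
mod = types.ModuleType("mod")
mod.__doc__ = '''
>>> stream = io.StringIO("a\\nb\\n")
>>> stream.readline()
'a\\n'
'''

result = doctest.testmod(mod, extraglobs=getextraglobs())
print(result.failed)  # prints 0
```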

@terryjreedy
Member

Tim, any opinion on whether this should be kept open or not?


tim-one commented Aug 9, 2010

I stopped understanding doctest the last time it was rewritten - it got far more generalized than I ever intended already. It's up to the younger generation to decide how much more inscrutable to make it now ;-)


BreamoreBoy mannequin commented Jun 27, 2014

I suggest this is closed as "won't fix" as David and Raymond are against it and Tim says he no longer understands doctest.

@terryjreedy
Member

I am closing this both for the general reasons already given and for the lack of a proposed API that could be programmed. Hence there is no example code that would run, and no specifics to approve. If someone wanted to pursue this, python-ideas would be a better place.

One of the general reasons given is that the example does not demonstrate a real need for the proposed fixtures. My take on the example code is this.

The input to TpedIterator is an iterable of lines. A list of lines will work as well as an open file. I would create a file of example lists and import the one needed for each docstring. For the example given:

def TpedIterator(lines):
    '''Yield Marker for each line of lines.
    >>> from biopython.sample_data import Tped
    >>> for marker in TpedIterator(Tped):
    ...    print(marker)
    Marker rs10000543, 2 individuals
    Marker rs10000929, 2 individuals
    Marker rs10002472, 2 individuals
    '''
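A self-contained sketch of this shared-sample-data approach, with hypothetical names standing in for biopython.sample_data and the Marker class:

```python
import doctest

# Stand-in for a shared sample_data module holding example input; the
# real code would do "from biopython.sample_data import Tped" instead.
TPED_SAMPLE = ["rs10000543 A G", "rs10000929 C T"]

def marker_names(lines):
    """Yield the marker name from each TPED-style line.

    >>> for name in marker_names(TPED_SAMPLE):
    ...     print(name)
    rs10000543
    rs10000929
    """
    for line in lines:
        yield line.split()[0]

# Run just this function's doctest, sharing the sample data via globs.
runner = doctest.DocTestRunner(verbose=False)
for test in doctest.DocTestFinder().find(
        marker_names,
        globs={"marker_names": marker_names, "TPED_SAMPLE": TPED_SAMPLE}):
    runner.run(test)
print(runner.failures)  # prints 0
```

Keeping the sample input in one importable place is what keeps each method's docstring down to the one-line import plus the example itself.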

@ezio-melotti ezio-melotti transferred this issue from another repository Apr 10, 2022