This issue tracker has been migrated to GitHub, and is currently read-only.
For more information, see the GitHub FAQs in Python's Developer Guide.

Title: Default value incorrectly read with unittests on Windows & macOS but not Linux
Type: Stage:
Components: macOS, Windows Versions: Python 3.10, Python 3.9, Python 3.8, Python 3.7
Status: open Resolution:
Dependencies: Superseder:
Assigned To: Nosy List: ned.deily, paul.moore, ronaldoussoren, samuelmarks, steve.dower, tim.golden, zach.ware
Priority: normal Keywords:

Created on 2021-02-12 00:42 by samuelmarks, last changed 2022-04-11 14:59 by admin.

Messages (2)
msg386845 - (view) Author: Samuel Marks (samuelmarks) * Date: 2021-02-12 00:42
I made a couple of commits to try and fix this on GitHub Actions (I was unable to replicate it locally), and ended up with a very simple fix for Windows:

Completely removing the default value. The only explanation I can come up with is that a different test in the same class, which set the default value to something else, changed the function's default value.
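The original function and tests aren't shown here, but the behaviour described, where one test's change to a default leaks into another test, is consistent with how Python evaluates default arguments once at function definition time. A minimal sketch with hypothetical names (not the reporter's actual code):

```python
def greet(names=[]):  # the default list is created once, when the function is defined
    names.append("hello")
    return names

# The first call mutates the shared default object...
first = greet()
# ...and the second call observes that mutation instead of a fresh list.
second = greet()
assert first is second               # both calls returned the same list object
assert second == ["hello", "hello"]  # the mutation persisted across calls
```

If one test mutates such a shared default (or reassigns `greet.__defaults__`), every later test in the process sees the changed value, which can produce platform- or test-order-dependent failures like the one described.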

This was all very confusing, and I can only think it is a bug in Python, or in GitHub Actions' deployment thereof.

PS: The macOS builds are still failing with the previous issue :\
msg386869 - (view) Author: Steve Dower (steve.dower) * (Python committer) Date: 2021-02-12 18:15
I'm afraid there's nowhere near enough context in your post for us to look into this.

Can you provide some code that you think *should* work but does not? Ideally in a post (as in, it should be that short) or as an attachment?
Date                 User         Action  Args
2022-04-11 14:59:41  admin        set     github: 87369
2021-02-12 18:15:12  steve.dower  set     messages: + msg386869
2021-02-12 00:42:30  samuelmarks  create