Author jaraco
Recipients Anthony Sottile, Zac Hatfield-Dodds, domdfcoding, jaraco, miss-islington
Date 2021-05-30.16:11:08
Thanks Zac for your input.

> Just chiming in with a plea to slow down the rate of changes to importlib.metadata - I understand that you want to tidy up the API, but even deprecations cause substantial work downstream.

It would be difficult to go much slower. Are you suggesting delaying the deprecation warning? My rationale for not delaying it is that the backport makes it possible to support the newer APIs all the way back to some end-of-life Pythons. Delaying the warning would only delay the inevitable: most projects will ultimately have to endure the toil of transitioning their code and relying on backports to support older Pythons.

Still, projects have the option to use the older APIs indefinitely by pinning the backport, or to delay their adoption of the newer APIs by suppressing the warning. There's a lot of flexibility to limit the toil.
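For projects that treat warnings as errors, suppression can be scoped narrowly. A minimal sketch (the `legacy_call` stand-in is hypothetical; the message filter targets the actual "SelectableGroups dict interface is deprecated" text that importlib.metadata emits):

```python
import warnings


def legacy_call():
    # Stand-in for a call into the deprecated dict-style API;
    # real code would index entry_points() by group name instead.
    warnings.warn(
        "SelectableGroups dict interface is deprecated. Use select.",
        DeprecationWarning,
        stacklevel=2,
    )


with warnings.catch_warnings():
    # Warnings-as-errors, as Hypothesis runs its test suite.
    warnings.simplefilter("error")
    # Later filters take precedence, so this carve-out wins over "error"
    # for just this one deprecation; everything else still raises.
    warnings.filterwarnings(
        "ignore",
        category=DeprecationWarning,
        message="SelectableGroups dict interface",
    )
    legacy_call()  # suppressed, does not raise
```

The `message` argument is matched as a regex against the start of the warning text, so the filter stays pinned to this specific deprecation rather than silencing all DeprecationWarnings.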

What approach would you recommend?

> Would it really be so bad to support the older APIs until they go end-of-life in CPython?

At this point, I believe the code is compatible with all known use cases and it's conceivable the compatibility layer could be supported in Python releases until Python 3.9 is EOL. Is that what you're proposing? Would that help the Hypothesis case (or others)? My instinct is the value proposition there is small.

> For example, in Hypothesis we care a great deal about compatibility with all supported Python versions (including treating warnings as errors) and try to minimize the hard dependencies.  As a result, our entry-points handling looks like this...

Project maintainers are of course free to treat warnings as errors, but since deprecation warnings are the primary mechanism for an upstream API to signal to downstream consumers that something is changing, projects that transform the default behavior of warnings should expect an increased amount of toil.

I suspect that the Hypothesis project could achieve forward compatibility within its constraints by vendoring `backports.entry_points_selectable`, and thus re-using a shared implementation of the conditional import dance. I've added a note to that project's README indicating the option. The implementation you have seems suitable, though.
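The conditional import dance typically looks something like the following sketch (the `load_plugins` helper is illustrative, not part of any of these libraries):

```python
try:
    # Python 3.10+ ships the selectable entry_points API natively.
    from importlib.metadata import entry_points
except ImportError:
    # Older Pythons fall back to the backport, which must be declared
    # as a dependency (or vendored, per the suggestion above).
    from importlib_metadata import entry_points


def load_plugins(group):
    """Load every entry point registered under *group*."""
    try:
        # Selectable API (Python 3.10+ / importlib_metadata 3.6+).
        eps = entry_points(group=group)
    except TypeError:
        # Legacy stdlib API (Python 3.8/3.9): entry_points() takes no
        # arguments and returns a dict of group name -> entry points.
        eps = entry_points().get(group, [])
    return [ep.load() for ep in eps]
```

Vendoring `backports.entry_points_selectable` collapses the inner try/except as well, since the backport presents the selectable interface on every supported Python.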

> Change can be worth the cost, but there *is* a cost and the packaging ecosystem is already painfully complex and fragmented.  Compatibility should be a foundational principle, not an optional extra _if someone presents a compelling use case!_

Agreed: compatibility is a foundational principle. Compatibility was built into the first release of this new behavior (importlib_metadata 3.6). Thanks to Anthony's feedback in the PR and extensive exploration and rewrites, the solution presented there has a fairly straightforward transition path and a clean separation of concerns. The case reported above, where compatibility was not achieved, is an acknowledged missed concern, and I'm happy to invest the time to restore that compatibility if it's worth the trouble. The reason I'd thought it wasn't worth the trouble is that, contrary to Anthony's claim, no user has reported an issue with index-based access on Distribution.entry_points results in the three months this functionality has been in place, which is why a note about the incompatibility seemed sufficient (after the fact).
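For downstream code, the index-based pattern is also easy to avoid without waiting on a compatibility shim. A hedged sketch (the `entry_points_for` helper is hypothetical; iteration and attribute access work on both the older list result and the newer EntryPoints type, whereas positional indexing is the pattern at issue):

```python
from importlib import metadata


def entry_points_for(dist_name, group):
    """Return *group* entry points for *dist_name* without relying on
    positional indexing into Distribution.entry_points
    (e.g. dist.entry_points[0])."""
    dist = metadata.distribution(dist_name)
    # Filtering by the .group attribute works across versions.
    return [ep for ep in dist.entry_points if ep.group == group]
```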

I'll proceed with adding compatibility for this reported case, although I worry that won't satisfy the concerns. Is there a satisfactory solution that doesn't involve reverting the changes? Is there an approach that meets the goals of the change with less disruption?

> I'm also seriously concerned that you take GitHub links as an indication of who is affected.  Python is very very widely used, including in environments that don't feed much back into the GitHub-open-source space, and I think it's important to avoid breaking things for low-visibility users too.

I certainly recognize that GitHub links and reports are only one indicator of one segment of the user base, but they're the sole indicator these projects have to solicit user concerns. That's why I pinned the issue reported about the DeprecationWarning and used that forum to express concern and support for users' issues and to present a variety of approaches that any number of users could adopt. I wanted to increase the visibility of the issue and limit the difficulty of adapting to the intentional deprecation.

I mainly rely on GitHub reports and +1s on those reports as an indication of an issue's impact. I use GitHub links as a means of networking. It was Anthony who suggested the links were an indication of a widespread issue; I only meant to contrast that concern with other breakages (in my experience) that showed dozens of links to affected issues. Linked issues are a secondary indicator at best.

I do expect that if users have an issue, they will report it through python/importlib_metadata or bpo, but I also expect that the absence of reports demonstrates stability. At least, that's been my experience across the hundreds of projects I've developed on SourceForge, Bitbucket, GitLab, and GitHub.

After employing defensive, empathetic programming, developing docs, and writing comprehensive tests, what approaches should I be taking to solicit user concerns, other than maintaining an open forum for issues and responding to them promptly and with proactive solutions?