Author lemburg
Recipients cgohlke, larry, lemburg, paul.moore, steve.dower, tim.golden, zach.ware
Date 2015-08-17.18:42:29
Message-id <55D22AFA.2050801@egenix.com>
In-reply-to <1439828678.57.0.835733327529.issue24872@psf.upfronthosting.co.za>
Content
On 17.08.2015 18:24, Steve Dower wrote:
> 
> We can't have Python run VCredist because it requires administrative privileges and installing Python does not require those. This is actually one of the biggest complaints about the current installer, but because it looks like the design and not a bug it only comes up when you ask people directly what they'd like to change. (The most common bug is pip failing to install/uninstall, which is what you see on the issue tracker.)

Why not? People certainly trust the Python installer more than
some script which might want to do this afterwards in order to
get a package working. After all, it is signed by the PSF and
comes with certificate-backed verification of authenticity.

> The other benefits are that PYDs compiled with VS 2016+ (or whatever it's called) with these options will work with stock Python 3.5 on a clean machine - so if your extension links to python3.dll you can build once for all Python versions 3.5 and later and know that it will just copy and run. If you're building from source and only have a newer compiler than VC14, you will be able to build successfully. Also, we could decide to build some future Python 3.5.x with a newer compiler as it should not break any existing extensions (after much discussion and testing, obviously, but it is only feasible if we properly hide the VC version now).

This would only work for extensions using the stable Python ABI.
The standard approach is to rebuild extensions for every minor
release, since the full Python ABI is not guaranteed to stay
backwards compatible.

> To achieve these (apart from the last point) with VCredist, we need to be installing the current and all future versions of the redist with Python. Hence a time machine is required, and I haven't come up with any good way to simulate a time machine here.

See above. This is really not a very common use case. If you search
for Py_LIMITED_API (which has to be defined to enable the stable
ABI), you'll hardly find any references to code using it.
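
For reference, opting into the stable ABI looks roughly like this
(a minimal sketch; the module name "demo" and its contents are made
up for illustration):

    /* Defining Py_LIMITED_API before including Python.h restricts
       the module to the stable ABI; the resulting .pyd then links
       against python3.dll instead of the version-specific DLL. */
    #define Py_LIMITED_API 0x03050000   /* require Python >= 3.5 */
    #include <Python.h>

    static PyObject *
    hello(PyObject *self, PyObject *args)
    {
        return PyUnicode_FromString("hello from the stable ABI");
    }

    static PyMethodDef methods[] = {
        {"hello", hello, METH_NOARGS, "Return a greeting."},
        {NULL, NULL, 0, NULL}
    };

    static struct PyModuleDef moduledef = {
        PyModuleDef_HEAD_INIT, "demo", NULL, -1, methods
    };

    PyMODINIT_FUNC
    PyInit_demo(void)
    {
        return PyModule_Create(&moduledef);
    }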

>> not when even MS itself recommends against doing this.
> 
> I have separate advice (also from MS, from the same people who have been quoted previously, but in private conversations so I can't link to it) that if we want any chance of our plugin architecture being VC version independent, this is what we have to do. I'm emailing again to get more specific advice, but the official guidance has always been biased by wanting people to get the latest tools (which is why VC9 disappeared until my team made the case that sometimes people can't upgrade). We're still pushing hard to make this an acknowledged use case, and don't be surprised if at some point in the future official advice says "if you allow plugins, do what CPython did to help your users and developers".

Regardless of marketing strategies, the fact that you have to
reinstall Python and all extensions whenever there is a bug in the
CRT is really the main argument against static linking.

Static linking of the CRT is normally only used in situations where
you want single-file executables without external dependencies,
e.g. for working on arbitrary Windows systems without having to
install anything. It's a valid use case, but not a general-purpose
one.
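
(In that scenario the whole point is something like

    rem /MT selects the static CRT, so app.exe carries no CRT DLL
    rem dependency and can be copied to a clean machine as-is.
    cl /MT app.c

with "app.c" standing in for whatever stand-alone tool is being
built.)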

> The "/MT" == "statically linked" equivalence is an oversimplification in the presence of link-time code generation ("/GL" and "/LTCG" options), as we can take .obj or .lib files compiled with /MT and still use dynamic linking. The difference is we have to do it explicitly, which is what the "/nodefaultlib:libucrt.lib ucrt.lib" options do. If we add concrt140, msvcp140 and vccorlib140 to that as well (and probably the rest of the versions redistributables) then all of them will continue to be dynamically linked.
>
> So basically, all the existing static libs could be built with /MT and still have their dependencies dynamically linked if that's what the final linker decides to do. In any case, they probably need to be rebuilt with VC14 unless the authors have very carefully kept a clean API, in which case it may as well be a DLL.

You lost me there. What's the advantage of using /MT when you then
add all of the CRT libs to the set of libs which are dynamically
linked?
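
(As far as I can tell, the combination being described amounts to
something like this on the command line; the file names are made up:

    rem Compile with the static CRT selected ...
    cl /c /MT /GL mymodule.c
    rem ... then override that default at link time and pull in the
    rem dynamic UCRT instead:
    link /DLL /LTCG /NODEFAULTLIB:libucrt.lib ucrt.lib ^
        python35.lib mymodule.obj /OUT:mymodule.pyd

)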

Just to clarify:

If I want to ship a C extension compiled for Python 3.5 which links
to an external DLL on the system, I will have to tell the users of
the extension to first run VCredist in order for them to be able to
use the extension on their system, since Python 3.5 will not ship
with the necessary CRT DLLs, correct?
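
(One way to check what a built extension actually requires is to
inspect its imports, e.g.

    rem Lists the DLLs the .pyd depends on; ucrtbase.dll or
    rem api-ms-win-crt-*.dll entries mean a dynamic UCRT dependency.
    dumpbin /dependents mymodule.pyd

with "mymodule.pyd" standing in for the extension in question.)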

> Because we're talking about a plugin architecture here, I think it's actually advantageous to use static linking of the C++ runtime. The main difference from the C runtime is that the C++ runtime does not have to be shared with the host - Python - and the advantage is that state will not be shared with other plugins that also happen to use the same version of C++ (or worse, a different version with the same name, and now we have a conflict).

Python C extensions are much more than a plugin architecture. They
connect Python to the whole world of available C libraries out
there, not just the ones you can statically compile.

> I appreciate the problems this causes when trying to link in 3rd-party dependencies, but given a choice between making life easier for developers or users I have to side with the users (while doing as much as I possibly can to make developers lives easier). People installing wheels from Christoph's page or PyPI should be able to expect it to work. When pip grows extensions I'll certainly look into writing an extension for specifying, detecting and getting the required VC redistributable, but I don't want core Python to be burdened with shipping the full set of distributables.

Same here, and even more so: I would like users to get a Python
installation they can use out of the box, without having to worry
about DLL hell. That includes being able to install Python C
extensions with external DLL dependencies without first having to
run VCredist, and it includes getting CRT bug fixes by means of OS
updates rather than complete reinstalls of Python and all extensions.

Developers can work around these things, but if we end up requiring
them to redistribute VCredist with every single binary package that
has external dependencies, just because Python itself doesn't ship
the necessary DLLs, then something is clearly broken for Python on
Windows.

Users will also likely not grant pip admin privileges to run
VCredist for them (or ask their sys admins to allow pip to do this),
so that idea is not going to work out in practice.

Alternatively, we tell users to install VCredist by hand in case
they plan to use Python with C extensions. Possible, but not very
user-friendly either.
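
(That manual step would be something like

    rem Silent install of the VC redistributable; needs elevation.
    vc_redist.x64.exe /install /quiet /norestart

with the exact installer name depending on the VC version.)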

With Python 2.7 none of this is an issue, creating yet another
roadblock for upgrades :-(

I understand that you would like us to get rid of the compiler
version dependency, and I appreciate your work in that direction,
but chances are high that this will not actually work out in
practice. Backwards compatibility is already hard; forward
compatibility is really, really difficult to achieve without
a time machine.

BTW: Do you know whether Python 3.5 will still work as a Windows
service without the MSVCR140 DLL being available in the system
folder?