Classification
Title: Bezout and Chinese Remainder Theorem in the Standard Library?
Type: enhancement Stage: resolved
Components: Library (Lib) Versions: Python 3.9
Process
Status: closed Resolution: rejected
Dependencies: Superseder:
Assigned To: mark.dickinson Nosy List: Dennis Sweeney, mark.dickinson, rhettinger, tim.peters
Priority: normal Keywords:

Created on 2020-02-16 21:40 by Dennis Sweeney, last changed 2020-05-17 08:53 by mark.dickinson. This issue is now closed.

Messages (5)
msg362106 - (view) Author: Dennis Sweeney (Dennis Sweeney) * Date: 2020-02-16 21:40
Should something like the following go in the standard library, most likely in the math module? I know I had to use such a thing before pow(a, -1, b) worked, but Bezout is more general. And many of the easy Stack Overflow implementations of CRT congruence-combining neglect the case where the divisors are not coprime, so that's an easy thing to miss.

def bezout(a, b):
    """
    Given integers a and b, return a tuple (x, y, g),
    where x*a + y*b == g == gcd(a, b).
    """
    # Apply the Extended Euclidean Algorithm:
    # use the normal Euclidean Algorithm on the RHS
    # of the equations
    #     u1*a + v1*b == r1
    #     u2*a + v2*b == r2
    # But carry the LHS along for the ride.
    u1, v1, r1 = 1, 0, a
    u2, v2, r2 = 0, 1, b

    while r2:
        q = r1 // r2
        u1, u2 = u2, u1-q*u2
        v1, v2 = v2, v1-q*v2
        r1, r2 = r2, r1-q*r2
        assert u1*a + v1*b == r1
        assert u2*a + v2*b == r2

    if r1 < 0:
        u1, v1, r1 = -u1, -v1, -r1

    # a_coefficient, b_coefficient, gcd
    return (u1, v1, r1)

def crt(cong1, cong2):
    """
    Apply the Chinese Remainder Theorem:
        Given two compatible congruences (a1, n1), (a2, n2),
        return a single congruence (a, n) such that
        x == a1 (mod n1) and x == a2 (mod n2)
        both hold iff x == a (mod n).

    Not all congruences are compatible. For example, there
    are no solutions to x == 1 (mod 2) and x == 2 (mod 4).
    For congruences (a1, n1), (a2, n2) to be compatible, it
    is sufficient, but not necessary, that gcd(n1, n2) == 1.
    """
    a1, n1 = cong1
    a2, n2 = cong2
    c1, c2, g = bezout(n1, n2)
    assert n1*c1 + n2*c2 == g
    
    if (a1 - a2) % g != 0:
        raise ValueError(f"Incompatible congruences {cong1} and {cong2}.")
    
    lcm = n1 // g * n2
    rem = (a1*c2*n2 + a2*c1*n1)//g
    return rem % lcm, lcm

assert crt((1,4),(2,3)) == (5, 12)
assert crt((1,6),(7,4)) == (7, 12)
msg362135 - (view) Author: Mark Dickinson (mark.dickinson) * (Python committer) Date: 2020-02-17 10:47
> Should something like the following go in the standard library, most likely in the math module?

I'm not keen. Granted that the math module has exceeded its original remit of "wrappers for libm", but even so, I'd prefer to try to limit it to a basic set of building blocks. For me, things like CRT and xgcd go beyond that.

I'd suggest that for now, the right place for this sort of thing would be a PyPI library for elementary number theory. That library could probably include primality testing, basic factoring, continued fractions, primitive root finding, and other elementary number theory topics.
msg362506 - (view) Author: Mark Dickinson (mark.dickinson) * (Python committer) Date: 2020-02-23 12:25
I'm inclined to close this. Raymond, Tim: thoughts?
msg362534 - (view) Author: Tim Peters (tim.peters) * (Python committer) Date: 2020-02-23 19:44
Ya, I'll close this, channeling that Raymond would also reject it for now (although he's certainly free to reopen it).

There's nothing wrong with adding such functions, but the `math` module is already straying too far from its original intent as a home for floating-point functions, nearly all of which had counterparts in `cmath`.  _If_ we had a, say, `imath` module, it would be a much easier sell.

These are just "too specialized".  I'd put them in the same class as, say, a new function to compute the Jacobi symbol.  Very handy but only for a vanishingly small percentage of Python users.

Probably _the_ most common use for xgcd (or egcd, either of which I suggest are better names than `bezout` - "extended gcd" is descriptive and commonly used) is for finding modular inverses, but `pow()` does that now directly.
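For example, the three-argument pow() handles the modular-inverse case directly (since Python 3.8):

```python
# Modular inverse via three-argument pow() (Python 3.8+):
inv = pow(3, -1, 7)
assert inv == 5          # 3 * 5 == 15 == 1 (mod 7)

# A non-invertible base raises ValueError:
try:
    pow(2, -1, 4)
except ValueError:
    pass                 # gcd(2, 4) != 1, so no inverse exists
```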

For Chinese remainder, I'd suggest two changes:

1. Generalize to any number of bases.  The 2-base case is common in toy RSA implementations, but in that context a more efficient way is generally used, based on precomputing "helper constants" specific to the modulus's 2 factors.

2. Since the same bases are generally used over & over while only the remainders change, the function signature would be more usable if it accepted a sequence of remainders, and a sequence of corresponding bases, as two distinct arguments.
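A sketch of what those two changes might look like (the names `ext_gcd` and `crt` are illustrative, not an agreed API; the combining step is the same 2-base formula from the original post, folded over the two sequences):

```python
def ext_gcd(a, b):
    """Return (x, y, g) with x*a + y*b == g == gcd(a, b)."""
    u1, v1, r1 = 1, 0, a
    u2, v2, r2 = 0, 1, b
    while r2:
        q = r1 // r2
        u1, u2 = u2, u1 - q*u2
        v1, v2 = v2, v1 - q*v2
        r1, r2 = r2, r1 - q*r2
    if r1 < 0:
        # Normalize so the returned gcd is nonnegative.
        u1, v1, r1 = -u1, -v1, -r1
    return u1, v1, r1

def crt(remainders, moduli):
    """
    Combine any number of congruences x == r (mod m), given as
    two parallel sequences of remainders and moduli, into a
    single congruence (a, n).  Raises ValueError if the system
    has no solution.
    """
    a, n = 0, 1
    for a2, n2 in zip(remainders, moduli):
        c1, c2, g = ext_gcd(n, n2)
        if (a - a2) % g:
            raise ValueError("incompatible congruences")
        lcm = n // g * n2
        a = (a*c2*n2 + a2*c1*n) // g % lcm
        n = lcm
    return a, n
```

With this signature the moduli sequence can be reused across calls while only the remainders change, e.g. crt([1, 2], [4, 3]) == (5, 12).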
msg369105 - (view) Author: Mark Dickinson (mark.dickinson) * (Python committer) Date: 2020-05-17 08:53
Related (imath module discussions): #37132, #40028.
History
Date                 User            Action  Args
2020-05-17 08:53:20  mark.dickinson  set     messages: + msg369105
2020-02-23 19:44:10  tim.peters      set     status: open -> closed
                                             resolution: rejected
                                             messages: + msg362534
                                             stage: resolved
2020-02-23 12:25:22  mark.dickinson  set     assignee: mark.dickinson
                                             messages: + msg362506
                                             nosy: + tim.peters, rhettinger
2020-02-17 10:47:40  mark.dickinson  set     nosy: + mark.dickinson
                                             messages: + msg362135
2020-02-16 21:40:39  Dennis Sweeney  create