
Author pino
Recipients Arfrever, benjamin.peterson, pino, pitrou, skrah, zbysz
Date 2012-11-06.19:55:21
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <1352231722.75.0.26398583104.issue13669@psf.upfronthosting.co.za>
In-reply-to
Content
*_MAX constants are usually defined only when the system declares that a maximum limit for that resource exists and is already known at compile time.
Python should simply not rely on XATTR_LIST_MAX and XATTR_SIZE_MAX being defined, but should grow the buffers as needed until *getxattr / *listxattr either succeed or fail with errno != ERANGE; this already seems to be the case, although the two *_MAX constants are used as a kind of upper-bound limit.

Instead of using short hard-coded lists of sizes to try, what about starting from an initial size (e.g. 128 for getxattr and 256 for listxattr) and growing the buffer each iteration, either by adding the same value or by doubling the size?
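The retry loop being proposed could look roughly like the following sketch (in Python for brevity; the real change would be in posixmodule.c). The helper name fetch_with_growing_buffer and the fake_getxattr stand-in are illustrative, not part of any existing API; the stand-in only mimics the ERANGE behaviour of the real *getxattr / *listxattr calls.

```python
import errno

def fetch_with_growing_buffer(fetch, initial_size=128, max_size=1 << 20):
    """Call fetch(bufsize) with a growing buffer until it succeeds.

    fetch must return its result, or raise OSError with errno.ERANGE
    when bufsize is too small -- mirroring *getxattr / *listxattr.
    The max_size cap is an arbitrary safety bound for this sketch.
    """
    size = initial_size
    while True:
        try:
            return fetch(size)
        except OSError as e:
            if e.errno != errno.ERANGE or size >= max_size:
                raise
            size *= 2  # doubling strategy; adding a fixed increment also works

# Hypothetical stand-in for getxattr: the attribute value is 300 bytes,
# so the loop must grow the buffer 128 -> 256 -> 512 before succeeding.
value = b"x" * 300

def fake_getxattr(bufsize):
    if bufsize < len(value):
        raise OSError(errno.ERANGE, "buffer too small")
    return value

assert fetch_with_growing_buffer(fake_getxattr) == value
```

Doubling keeps the number of retries logarithmic in the final size, whereas adding a fixed increment keeps memory overshoot bounded; either avoids depending on XATTR_SIZE_MAX being defined.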
History
Date                 User  Action  Args
2012-11-06 19:55:22  pino  set     recipients: + pino, pitrou, benjamin.peterson, Arfrever, zbysz, skrah
2012-11-06 19:55:22  pino  set     messageid: <1352231722.75.0.26398583104.issue13669@psf.upfronthosting.co.za>
2012-11-06 19:55:22  pino  link    issue13669 messages
2012-11-06 19:55:21  pino  create