
classification
Title: When concatenating strings, I think it is better to use += than to join a list
Type: enhancement Stage: resolved
Components: asyncio Versions:
process
Status: closed Resolution: wont fix
Dependencies: Superseder:
Assigned To: Nosy List: asvetlov, remi.lapeyre, serhiy.storchaka, ys19991, yselivanov
Priority: normal Keywords: patch

Created on 2020-07-08 14:49 by ys19991, last changed 2022-04-11 14:59 by admin. This issue is now closed.

Pull Requests
URL Status Linked
PR 21397 closed ys19991, 2020-07-08 14:54
Messages (5)
msg373310 - (view) Author: Wansoo Kim (ys19991) * Date: 2020-07-08 14:49
Hello

I think it's better to use += than str.join() when concatenating strings.

This is more intuitive than other methods.

Also, I personally think it is not good for one variable to change to another type during runtime.

https://github.com/python/cpython/blob/b26a0db8ea2de3a8a8e4b40e69fc8642c7d7cb68/Lib/asyncio/base_events.py#L826

If you look at the link above, `msg` starts out as a list but ends up as a str.
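
For illustration only (a hypothetical sketch, not the linked base_events.py code), the two styles being compared look roughly like this:

def describe_join(name, extra=None):
    # Collect the pieces in a list and join them once at the end.
    parts = [name]
    if extra:
        parts.append(extra)
    return ' '.join(parts)

def describe_concat(name, extra=None):
    # Build the same string with +=; msg stays a str throughout.
    msg = name
    if extra:
        msg += ' ' + extra
    return msg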
msg373312 - (view) Author: Rémi Lapeyre (remi.lapeyre) * Date: 2020-07-08 15:04
Hi Wansoo, using += instead of str.join() is less performant. Concatenating n strings with + will create and allocate n new strings, while str.join() will carefully look ahead, allocate the correct amount of memory, and do all the concatenation at once:


➜  ~ python3 -m timeit -s 's = ""' 'for i in range(1_000_000): s += "foo\n"'
5 loops, best of 5: 107 msec per loop
➜  ~ python3 -m timeit -s 'l = ["foo"]*1_000_000' '"\n".join(l)'
20 loops, best of 5: 9.96 msec per loop


It's a common idiom that you will meet a lot in Python.
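
The same comparison can be reproduced in a standalone script with the timeit module; this is only a sketch, and the absolute numbers will differ from the shell runs above:

import timeit

def concat_plus(n=100_000):
    s = ""
    for _ in range(n):
        s += "foo\n"               # may reallocate and copy on each iteration
    return s

def concat_join(n=100_000):
    return "\n".join(["foo"] * n)  # allocates the final string once

print("+=   :", timeit.timeit(concat_plus, number=10))
print("join :", timeit.timeit(concat_join, number=10))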
msg373316 - (view) Author: Andrew Svetlov (asvetlov) * (Python committer) Date: 2020-07-08 16:29
Remi is correct.
Closing the issue.
msg373320 - (view) Author: Serhiy Storchaka (serhiy.storchaka) * (Python committer) Date: 2020-07-08 17:05
In this particular case the number of concatenations is limited, the resulting string is usually short, and the code is not performance critical (it is the __repr__ implementation). So there is no significant advantage of one way over the other, and neither way is obviously wrong. In such cases the status quo wins.
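
As a hedged illustration of the kind of code in question (a hypothetical __repr__, not the actual asyncio implementation), either style is fast enough when only a few short fragments are involved:

class Server:
    def __init__(self, sockets=None, serving=False):
        self.sockets = sockets or []
        self.serving = serving

    def __repr__(self):
        # A handful of short fragments: join() and += perform about the same here.
        parts = [f'<{type(self).__name__}']
        if self.serving:
            parts.append('serving')
        if self.sockets:
            parts.append(f'sockets={self.sockets!r}')
        return ' '.join(parts) + '>'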
msg373322 - (view) Author: Wansoo Kim (ys19991) * Date: 2020-07-08 17:19
Well... to be honest, I'm a little confused. bpo-41244 and this issue seem to be complete opposites. I'm not used to the Python community yet because it hasn't been long since I joined.

You're saying that if a particular method is not dramatically better, we prefer to keep the existing one as it is, right?

Your comment was very helpful to me. Maybe I can learn things one by one like this.

Thank you very much.
History
Date User Action Args
2022-04-11 14:59:33  admin  set  github: 85414
2020-07-08 17:19:11  ys19991  set  messages: + msg373322
2020-07-08 17:05:41  serhiy.storchaka  set  nosy: + serhiy.storchaka
                                           messages: + msg373320
2020-07-08 16:29:48  asvetlov  set  status: open -> closed
                                    resolution: wont fix
                                    stage: patch review -> resolved
2020-07-08 16:29:31  asvetlov  set  messages: + msg373316
2020-07-08 15:04:38  remi.lapeyre  set  nosy: + remi.lapeyre
                                        messages: + msg373312
2020-07-08 14:54:44  ys19991  set  keywords: + patch
                                   stage: patch review
                                   pull_requests: + pull_request20545
2020-07-08 14:49:19  ys19991  create