Because `.format()` is a method on an instantiated `str` object in `e`, it
must return the same type, so that additional str methods can be stacked on
after it, like `.format(u'hi').decode()`. Whereas the % string
interpolation is a binary operation, so, like addition, the more
general of the two operand types wins, which is why `b` and `c` come out
as `unicode`.
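That asymmetry can be demonstrated with a small Python 3 sketch (numeric addition standing in for `%`, since Python 3 no longer has a separate `unicode` type): a binary operator promotes to the more general operand type, while a method's result type follows the object it was called on.

```python
# A binary operation (like + or %) looks at both operands and
# promotes to the more general type: int + float gives float.
total = 1 + 2.5
print(type(total))     # <class 'float'>

# A method call belongs to one object, so its result type follows
# that object: calling .upper() on a str always gives back a str.
shout = "hi".upper()
print(type(shout))     # <class 'str'>
```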
> On May 17, 2017, at 2:41 PM, Craig Rodrigues wrote:
>
> Hi,
>
> While cleaning up some code during Python 2 -> Python 3 porting,
> I switched some code to use str.format(), and I found this behavior:
>
> Python 2.7
> ==========
> a = "%s" % "hi"
> b = "%s" % u"hi"
> c = u"%s" % "hi"
> d = "{}".format("hi")
On Wed, May 17, 2017 at 02:41:29PM -0700, Craig Rodrigues wrote:
> e = "{}".format(u"hi")
[...]
> type(e) == str
> The confusion for me is why is type(e) of type str, and not unicode?
I think that's one of the reasons why the Python 2.7 string model is (1)
convenient to those using purely ASCII
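The "convenient to those using purely ASCII" point comes from Python 2's implicit coercion: a byte string mixed with `unicode` was silently decoded as ASCII, which worked fine until real non-ASCII data showed up. Python 3 refuses the implicit mix; a minimal sketch of the contrast:

```python
# Python 3 never mixes bytes and str implicitly; you must decode.
try:
    "hi" + b"hi"    # Python 2 would silently coerce; Python 3 raises
except TypeError as exc:
    print("mixing refused:", exc)

# The explicit spelling works in both worlds:
print("hi" + b"hi".decode("ascii"))   # hihi
```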
Hi,
While cleaning up some code during Python 2 -> Python 3 porting,
I switched some code to use str.format(), and I found this behavior:
Python 2.7
==========
a = "%s" % "hi"
b = "%s" % u"hi"
c = u"%s" % "hi"
d = "{}".format("hi")
e = "{}".format(u"hi")
f = u"{}".format("hi")
type(a) == str
type(b) == unicode
type(c) == unicode
type(d) == str
type(e) == str
type(f) == unicode
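For what it's worth, Python 3 dissolves the question entirely: there is a single text type, so all six expressions produce a `str`. A quick check, runnable under Python 3 (where the `u''` prefix is accepted but redundant):

```python
# Re-running the six examples under Python 3: the str/unicode
# split is gone, so every result is the one text type, str.
a = "%s" % "hi"
b = "%s" % u"hi"       # u'' is legal but redundant in Python 3
c = u"%s" % "hi"
d = "{}".format("hi")
e = "{}".format(u"hi")
f = u"{}".format("hi")
print(all(type(x) is str for x in (a, b, c, d, e, f)))  # True
```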