[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-22 Thread Rob Cliffe via Python-ideas



On 11/11/2019 17:10:40, C. Titus Brown wrote:

Hi folks,

moderator here. I’d (strongly) suggest no further replies, unless there’s 
something Python specific to discuss. I’ll put the list into emergency 
moderation for a bit.

thanks,
—titus

Agreed.
The OP used APL for a number of years, and fell in love with it.
He has translated this love into a mystical esteem for "Notation" (it
has never been made clear what that means) and a belief that Python
ought to use a left arrow instead of the walrus operator, and ultimately
instead of the "=" (assignment) symbol.
There have been many knowledgeable and insightful posts to this list 
(much more
sophisticated than I could ever have written!) _seriously_ attempting to 
address his concerns.
But the bottom line is this: modern programmers find the extra symbols
used by APL obscure, and by the OP's own admission they take "a year or
two" to become familiar with.
In a nutshell: APL is a dinosaur; the world has moved on.
Python is always alive to adopting ideas from other programming languages.
But in this case "We've heard what you say.  No thank you".
Rob Cliffe


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-22 Thread Rob Cliffe via Python-ideas



On 06/11/2019 17:05:21, Martin Euredjian via Python-ideas wrote:
  One has to use APL for real work and for at least a year or two in 
order for your brain to make the mental switch necessary to understand 
it.  Just messing with it casually isn't good enough.  Lots of 
inquisitive people have messed with it, but they don't really 
understand it.




No offence, but my honest off-the-cuff reaction:
The above could be interpreted as
"APL is a difficult language to learn, it takes at least a year or two 
of real work with it in order for your brain to make the mental switch 
necessary to understand it.
As opposed to more intuitive languages, such as ... I don't know ... I'm 
sure there's one beginning with P."
In today's fast-moving world we can't afford that "year or two" before
we become really productive.  Nor can we afford to write code that other
people need "a year or two" before they can read fluently.


Disclosure: I have no experience with APL.

I did click on your "Notation as a Tool of Thought" link, but I didn't 
get very far with it before my eyes glazed over looking at all the 
unfamiliar symbols.
(This although I call myself a mathematician of sorts - not someone who 
throws a fit at the sight of equations or Greek letters.)


Yes, it would have been nice if <- had been included in the ASCII 
character set when it was developed in the 1960s;
then all programming languages could use <- for assignment and = for 
equality.  (Where is the time machine when you need it?)

But regrettably, we are where we are.
And having to type Alt-[ for every assignment - virtually necessitating
the use of both hands - would IMO be a significant inconvenience.

Best wishes
Rob Cliffe


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-12 Thread David Mertz
Yeah. Maybe I should have the conceal rule match the regex ' *:=' rather
than just ':='.  That's easy enough with the plugin.
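
A quick illustration of the difference between the two patterns, with
plain Python standing in for the conceal match (the sample line here is
made up):

    import re

    line = "while (block := f.read(8192)):"

    # Matching only ':=' leaves the preceding space outside the match:
    print(re.sub(r":=", "\u2190", line))    # while (block ← f.read(8192)):

    # Matching ' *:=' also consumes any spaces just before the operator:
    print(re.sub(r" *:=", "\u2190", line))  # while (block← f.read(8192)):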

On Tue, Nov 12, 2019, 12:12 PM Mike Miller 
wrote:

>
> On 2019-11-11 16:13, David Mertz wrote:
> > I implemented this discussed arrow operator in vim with conceal plugin.
> This is
> > an example given in PEP 572.  It looks perfectly fine.  It also does not
> require
> > ANY change to Python-the-language.  It just means that I can type ':'
> followed
> > by '=' to get that, rather than type 'Alt+Shift', '2', '1', '9', '0'.
> So fewer
> > keystrokes. No chording.  Easier to type.  And what gets saved to disk
> is good
> > old plain ASCII.
>
> I like your solution and think it looks great, though perhaps you forgot
> the
> space behind it?  I'm not a huge fan of how modern Python is putting
> colons
> everywhere so this helps a tiny bit.
>
> > I don't hate how it looks, but I really, really don't get how it's
> supposed to
> > "transform my thinking about coding" to have a slightly different glyph
> on
> > screen.
>
> Probably would need several, as CB mentioned below.  Still, debatable.
>
> > I mean, as shown in this example and a previous one I posted a
> > screenshot of, I think it's cute and geeky to use a few math symbols in
> the same
> > way in my editor.  I've been doing that for a few years, and it never
> got beyond
> > "slightly cute."
>
> Guessing there were a few rare curmudgeons who didn't think we needed
> lowercase
> letters before ascii and still a few who don't want syntax highlighting
> either.
> I realize we're hitting the land of diminishing returns on text, but once
> features are gained I know I don't want to go back.
>
> For example, I use many useful Unicode symbols in my text strings and
> console output.  Billions of folks are using non-Latin alphabets right
> now because Python 3 makes it easy.  All modern systems can handle them,
> why not?  And input is not a significant issue, though it depends on the
> block.


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-12 Thread Mike Miller


On 2019-11-11 16:13, David Mertz wrote:
I implemented this discussed arrow operator in vim with conceal plugin.  This is 
an example given in PEP 572.  It looks perfectly fine.  It also does not require 
ANY change to Python-the-language.  It just means that I can type ':' followed 
by '=' to get that, rather than type 'Alt+Shift', '2', '1', '9', '0'.  So fewer 
keystrokes. No chording.  Easier to type.  And what gets saved to disk is good 
old plain ASCII.


I like your solution and think it looks great, though perhaps you forgot the 
space behind it?  I'm not a huge fan of how modern Python is putting colons 
everywhere so this helps a tiny bit.


I don't hate how it looks, but I really, really don't get how it's supposed to 
"transform my thinking about coding" to have a slightly different glyph on 
screen.  


Probably would need several, as CB mentioned below.  Still, debatable.

I mean, as shown in this example and a previous one I posted a 
screenshot of, I think it's cute and geeky to use a few math symbols in the same 
way in my editor.  I've been doing that for a few years, and it never got beyond 
"slightly cute."


Guessing there were a few rare curmudgeons who didn't think we needed
lowercase letters before ASCII, and there are still a few who don't want
syntax highlighting either.  I realize we're hitting the land of
diminishing returns on text, but once features are gained I know I don't
want to go back.


For example, I use many useful Unicode symbols in my text strings and
console output.  Billions of folks are using non-Latin alphabets right
now because Python 3 makes it easy.  All modern systems can handle them,
why not?  And input is not a significant issue, though it depends on the
block.
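
(A trivial example of what I mean -- ordinary Python 3 on any modern
terminal:)

    # Unicode in plain Python 3 strings and console output just works:
    print("π ≈ 3.14159, Δt = 0.5 s, 10 °C → 50 °F")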



[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-12 Thread Marko Ristin-Kaufmann
Hi,

>  I mean, as shown in this example and a previous one I posted a screenshot
> of, I think it's cute and geeky to use a few math symbols in the same way
> in my editor.  I've been doing that for a few years, and it never got
> beyond "slightly cute."
>

I would second this. I actually find it less readable if the font does
not provide nice arrows. It reminds me of Scala and the "=>" symbol: the
right implication arrow was barely readable in most common Ubuntu fonts.

Cheers,
Marko



[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-11 Thread Greg Ewing

On 12/11/19 4:10 am, Random832 wrote:

well *of course* the goal was not to slow down
actual production of text, but this does not imply the method by which
"speeding up by preventing jams" was to be achieved was not by slowing down
the physical process of pressing keys.


That wasn't the method, though -- the method was to ensure that
frequent letter pairs were separated in the type basket, so that
they were less likely to collide with each other when used in
quick succession. At least that seems the most plausible
explanation to me -- nobody really knows for sure.

--
Greg


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-11 Thread Christopher Barker
On Mon, Nov 11, 2019 at 4:16 PM David Mertz  wrote:

>  I really, really don't get how it's supposed to "transform my thinking
> about coding" to have a slightly different glyph on screen.
>

I agree here. This thread got really caught up in issues like "how do I
type that?", but I don't think that was the OP's point. He was arguing
for "Notation" -- but I, at least, have no idea what that means.

He made two specific proposals in the OP:

1) Use a single "left arrow" symbol in place of the two ascii-char := for
the assignment expression operator.

2) phase out the regular old assignment expression altogether eventually.

These are quite independent really.

But that was a lot of work, and a big old thread, if the point was
simply to use one non-ASCII symbol for one operator -- that alone
clearly won't be "transformative".

So I *think* the real point to "notations" is really to have  a lot more
operators - each with its own meaning -- *maybe* in place of some
operator overloading. See the other recent thread -- if you want a operator
that means "merge these two dicts", use a new one, rather than trying to
reuse +  -- which has some merit. After all, a number of the objections to
the dict addition proposal was that dict merging is pretty different than
numerical addition, and thus people might get confused.

If you were to introduce a lot more operators, you would get more compact
code, and you probably would want to use a wider range of symbols than
ascii provides, so as to avoid the "ascii soup" to.
referred to.
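
(As a sketch of the "dedicated operator" idea using only today's Python --
the Env class and the choice of '|' here are purely illustrative, not a
proposal:)

    class Env(dict):
        """A dict that dedicates '|' to merging instead of reusing '+'."""
        def __or__(self, other):
            merged = Env(self)
            merged.update(other)   # right-hand side wins on conflicts
            return merged

    defaults = Env(colour="red", size=10)
    overrides = Env(size=12)
    print(defaults | overrides)   # {'colour': 'red', 'size': 12}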

And I think there is some merit to the "more operators" -- that's exactly
why @ was added -- folks doing matrix calculations really wanted to be able
to write:

A @ B (or A * B) rather than:

np.dot(A, B)
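
(A minimal demonstration, assuming NumPy is installed:)

    import numpy as np

    A = np.array([[1, 2], [3, 4]])
    B = np.array([[5, 6], [7, 8]])

    # PEP 465 added '@' (via __matmul__) so the code reads like the math:
    assert np.array_equal(A @ B, np.dot(A, B))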

It's the same reason we have all the other operators, rather than making
everything a function call. Would we really want:

index(a_list, i)

or

slice(a_sequence, i, j, step)

Making the code look like the math on a blackboard has its advantages.

However, as pointed out in this thread, even in math notation the same
(or similar) notation has different meanings in different contexts,
making it very hard to take that approach in a general-purpose
programming language.

So there is a limit -- making everything an operator would be worse
than ASCII soup; it would be hieroglyphics to most of us, like complex
math is to people outside the domain it's used in.

I think we need a mixture of operators and named functions, and that
Python has the balance about right as it is.

The other issue at hand is overloading vs. new operators -- and
particularly in a dynamic language, there's something to be said for
more operators rather than overloading -- but I'm really not sure about
that: more than a handful more, and I think I'd get very confused, even
if I could figure out how to type them.

-CHB
-- 
Christopher Barker, PhD

Python Language Consulting
  - Teaching
  - Scientific Software Development
  - Desktop GUI and Web Development
  - wxPython, numpy, scipy, Cython


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-11 Thread David Mertz
I implemented this discussed arrow operator in vim with conceal plugin.
This is an example given in PEP 572.  It looks perfectly fine.  It also
does not require ANY change to Python-the-language.  It just means that I
can type ':' followed by '=' to get that, rather than type 'Alt+Shift',
'2', '1', '9', '0'.  So fewer keystrokes. No chording.  Easier to type.
And what gets saved to disk is good old plain ASCII.

I don't hate how it looks, but I really, really don't get how it's supposed
to "transform my thinking about coding" to have a slightly different glyph
on screen.  I mean, as shown in this example and a previous one I posted a
screenshot of, I think it's cute and geeky to use a few math symbols in the
same way in my editor.  I've been doing that for a few years, and it never
got beyond "slightly cute."

https://www.dropbox.com/s/vtwd1grlnml8sz2/Screenshot%202019-11-11%2019.02.58.png?dl=0
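
(For reference, this is the flavor of PEP 572 example being concealed --
the pattern and data below are made up:)

    import re

    pattern = re.compile(r"\d+")
    data = "order 66"

    # Bind and test in a single step -- the walrus from PEP 572:
    if (match := pattern.search(data)) is not None:
        print("Found:", match.group(0))   # Found: 66

    # On disk this stays plain ASCII; the editor merely *displays*
    # ':=' as an arrow.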

On Mon, Nov 11, 2019 at 5:23 PM Mike Miller 
wrote:

>
> On 2019-11-10 12:50, Martin Euredjian via Python-ideas wrote:
> > I have found that trying to explain the value of true notation to people
> who
> > lack the experience and training is always a losing proposition.  I'm
> already
> > regretting having started this thread, simply because I know how this
> works.
>
>
> Reminds me of the "you can't tell people anything," post:
>
>  http://habitatchronicles.com/2004/04/you-cant-tell-people-anything/
>
>  "What’s going on is that without some kind of direct experience to
> use as a
>  touchstone, people don’t have the context that gives them a place in
> their
>  minds to put the things you are telling them."
>
>
> I found the thread interesting despite the many "how to type it?" replies.
> Don't be too discouraged.
>
> -Mike


-- 
Keeping medicines from the bloodstreams of the sick; food
from the bellies of the hungry; books from the hands of the
uneducated; technology from the underdeveloped; and putting
advocates of freedom in prisons.  Intellectual property is
to the 21st century what the slave trade was to the 16th.


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-11 Thread Mike Miller


On 2019-11-10 12:50, Martin Euredjian via Python-ideas wrote:
I have found that trying to explain the value of true notation to people who 
lack the experience and training is always a losing proposition.  I'm already 
regretting having started this thread, simply because I know how this works.  



Reminds me of the "you can't tell people anything," post:

http://habitatchronicles.com/2004/04/you-cant-tell-people-anything/

"What’s going on is that without some kind of direct experience to use as a
touchstone, people don’t have the context that gives them a place in their
minds to put the things you are telling them."


I found the thread interesting despite the many "how to type it?" replies. 
Don't be too discouraged.


-Mike


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-11 Thread C. Titus Brown
Hi folks,

moderator here. I’d (strongly) suggest no further replies, unless there’s 
something Python specific to discuss. I’ll put the list into emergency 
moderation for a bit.

thanks,
—titus

> On Nov 11, 2019, at 9:05 AM, Ricky Teachey  wrote:
> 
>  
> I have found that trying to explain the value of true notation to people who 
> lack the experience and training is always a losing proposition.  I'm already 
> regretting having started this thread, simply because I know how this works.  
> Frankly, it's almost like trying to engage with a religious person while 
> trying to discuss the lack of evidence for the existence of supernatural 
> beings.  They "know" what they "know" and it is a very rare case that someone 
> is actually going to get out of that box and comprehend what you are saying.
> 
> 
> 
> I am done with this thread.  It has received nothing but close-minded 
> hostility.  Which is fine.  I understand.  That's the way the world works.  
> I've seen this kind of thing happen in many domains, not just programming.
> 
>  
> 
> I intend this response in the most friendly and kind way possible:
> 
> Approaching discussion in such a manner -- i.e., with the assumption that 
> other people, who see things differently from your (in your view) grounded, 
> logical, thought-through, and deeply held standpoint, must "lack" things like 
> experience, training, or comprehension (as in the example of your 
> interlocution with religious people) if they continue to differ with you -- 
> could be one reason people have reacted with what you are interpreting as 
> hostility.
> 
> People often naturally react in a hostile way when they detect that the 
> person they are talking with believes they are lacking in some way.
> 
> Furthermore, speaking as a religious person who is coming from a tradition 
> that is deeply introspective, and has grappled for centuries with other 
> points of view, I suggest that disparaging the ability of religious people to 
> "comprehend" your views doesn't make the point you think it does. Might want 
> to find a new example. My two cents.
> 
> ---
> Ricky.
> 
> "I've never met a Kentucky man who wasn't either thinking about going home or 
> actually going home." - Happy Chandler
> 


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-11 Thread Ricky Teachey
> I have found that trying to explain the value of true notation to people
> who lack the experience and training is always a losing proposition.  I'm
> already regretting having started this thread, simply because I know how
> this works.  Frankly, it's almost like trying to engage with a religious
> person while trying to discuss the lack of evidence for the existence of
> supernatural beings.  They "know" what they "know" and it is a very rare
> case that someone is actually going to get out of that box and comprehend
> what you are saying.
>
> I am done with this thread.  It has received nothing but close-minded
> hostility.  Which is fine.  I understand.  That's the way the world works.
> I've seen this kind of thing happen in many domains, not just programming.
>

I intend this response in the most friendly and kind way possible:

Approaching discussion in such a manner -- i.e., with the assumption that
other people, who see things differently from your (in your view)
grounded, logical, thought-through, and deeply held standpoint, must
"lack" things like experience, training, or comprehension (as in the
example of your interlocution with religious people) if they continue to
differ with you -- could be one reason people have reacted with what you
are interpreting as hostility.

People often naturally react in a hostile way when they detect that the
person they are talking with believes they are lacking in some way.

Furthermore, speaking as a religious person who is coming from a tradition
that is deeply introspective, and has grappled for centuries with other
points of view, I suggest that disparaging the ability of religious people
to "comprehend" your views doesn't make the point you think it does. Might
want to find a new example. My two cents.

---
Ricky.

"I've never met a Kentucky man who wasn't either thinking about going home
or actually going home." - Happy Chandler


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-11 Thread Martin Euredjian via Python-ideas
 > These thousands of words of repeating claims with weird non sequitur 
 > digressions seem to amount to 

I am done with this thread.  It has received nothing but close-minded 
hostility.  Which is fine.  I understand.  That's the way the world works.  
I've seen this kind of thing happen in many domains, not just programming.
If I had the power to delete this entire thread, I would.  I actually regret 
daring to suggest there might be a different way to do things, not to change 
the entire language, but rather to solve the problem cleanly and elegantly with 
the introduction of a single symbol rather than piling on stuff.  I love Python 
and will continue to use it, including the walrus operator.  Life goes on.

Admin:  If you have a way to just delete this entire thread, please do so.  It 
was a waste of time for all involved.

Thank you,
-Martin


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-11 Thread Anders Hovmöller


> On 11 Nov 2019, at 17:05, Richard Damon  wrote:
> 
> On 11/11/19 10:10 AM, Random832 wrote:
>>> On Mon, Nov 11, 2019, at 03:22, Greg Ewing wrote:
>>> On 11/11/19, 12:41 PM, Richard Damon wrote:
 it was DESIGNED to be inefficient (that was one of its design goals, to
 slow typesetters down to be slower than the machine they were working
 on).
>>> This is most likely a myth, see https://en.wikipedia.org/wiki/QWERTY
>> This is a nice rhetorical trick: "Contrary to popular belief, the QWERTY 
>> layout was not designed to slow the typist down,[5] but rather to speed up 
>> typing by preventing jams." - well *of course* the goal was not to slow down 
>> actual production of text, but this does not imply the method by which 
>> "speeding up by preventing jams" was to be achieved was not by slowing down 
>> the physical process of pressing keys. (And the argument that having keys on 
>> alternating hands speeds things up is related to modern touch-typing 
>> techniques, and has little to do with the environment in which QWERTY was 
>> originally designed).
> 
> Yes, Someone on the Internet is wrong! https://xkcd.com/386/
> 
> My memory of the full story is that, YES, keeping some combinations
> apart rather than together let the machines go faster and perhaps let
> touch typists go faster, but then they often went a bit too fast even
> then, so many of the common letters were moved from the home row or to
> weak fingers to slow the typist down a bit to match the machine. This
> was the impetus for the development of alternate keyboards, like the
> Dvorak, which were designed to be faster for a trained typist to use.
> 
> This gets to the key point of my comment: even though the Dvorak
> keyboard has been shown to be superior to the standard QWERTY keyboard in
> a number of studies (like your claims that a symbolic notation is
> superior to 'ASCII Soup'), a major hindrance to its being adopted is the
> existing infrastructure.

And some studies have shown no or only an insignificant advantage, and
the original study was a fraud. I think we can safely let it go.

It's way more important that there is a standard.


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-11 Thread Richard Damon
On 11/11/19 10:10 AM, Random832 wrote:
> On Mon, Nov 11, 2019, at 03:22, Greg Ewing wrote:
>> On 11/11/19, 12:41 PM, Richard Damon wrote:
>>> it was DESIGNED to be inefficient (that was one of its design goals, to
>>> slow typesetters down to be slower than the machine they were working
>>> on).
>> This is most likely a myth, see https://en.wikipedia.org/wiki/QWERTY
> This is a nice rhetorical trick: "Contrary to popular belief, the QWERTY 
> layout was not designed to slow the typist down,[5] but rather to speed up 
> typing by preventing jams." - well *of course* the goal was not to slow down 
> actual production of text, but this does not imply the method by which 
> "speeding up by preventing jams" was to be achieved was not by slowing down 
> the physical process of pressing keys. (And the argument that having keys on 
> alternating hands speeds things up is related to modern touch-typing 
> techniques, and has little to do with the environment in which QWERTY was 
> originally designed).

Yes, Someone on the Internet is wrong! https://xkcd.com/386/

My memory of the full story is that, YES, keeping some combinations
apart rather than together let the machines go faster and perhaps let
touch typists go faster, but then they often went a bit too fast even
then, so many of the common letters were moved from the home row or to
weak fingers to slow the typist down a bit to match the machine. This
was the impetus for the development of alternate keyboards, like the
Dvorak, which were designed to be faster for a trained typist to use.

This gets to the key point of my comment: even though the Dvorak
keyboard has been shown to be superior to the standard QWERTY keyboard
in a number of studies (like your claims that a symbolic notation is
superior to 'ASCII Soup'), a major hindrance to its being adopted is the
existing infrastructure.

If, for some reason, every keyboard in the world were destroyed, every
driver disappeared, and everyone forgot all their training on keyboard
use, the replacement keyboard might well be something like the Dvorak
keyboard, but that isn't going to happen.

In the same way, perhaps a graphics-based language might be the choice
if all programming languages and tools disappeared and had to be built
up fresh (though on a system reboot like that, simplicity would be
important).

Due to the infrastructure situation, I don't see an existing language
making the jump from being 'ASCII' based to graphics based suddenly. One
option is to develop an environment for programming that is 'graphical'
in its entry and display, with the actual program file still being the
classical ASCII language, so it still interfaces with the existing
tools. As that tool demonstrated improvements in programmer
productivity, it would gather more users, and perhaps create the demand
for such an environment to be considered 'normal', and thus the language
would be able to make moves based on that.
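
(A toy sketch of that first option in Python -- a display layer that
prettifies glyphs on screen while the stored source stays ASCII; the
mapping below is hypothetical:)

    # Hypothetical display layer: pretty glyphs on screen, ASCII on disk.
    DISPLAY = {":=": "\u2190", "<=": "\u2264", ">=": "\u2265", "!=": "\u2260"}

    def prettify(source: str) -> str:
        for ascii_op, glyph in DISPLAY.items():
            source = source.replace(ascii_op, glyph)
        return source

    print(prettify("if (n := len(xs)) >= 2 and xs[0] != xs[1]: ..."))
    # -> if (n ← len(xs)) ≥ 2 and xs[0] ≠ xs[1]: ...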

The other option is to create a new language, perhaps derived from an
existing one, built around the new graphical idea. Being a fresh start,
it won't be held back by existing infrastructure in its design, just in
its availability. Such a language would need to build its following on
its merits, despite the fact that its use makes much of the existing
tooling hard to apply to it.

-- 
Richard Damon


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-11 Thread Anders Hovmöller


> On 11 Nov 2019, at 15:26, Rhodri James  wrote:
> 
> On 10/11/2019 20:50, Martin Euredjian via Python-ideas wrote:
>> It does, it's an extension of the reality that, after so many
>> decades, we are still typing words on a text editor.  In other words,
>> my comment isn't so much about the mechanics and editors that are
>> available as much as the fact that the way we communicate and define
>> the computational solution of problems (be it to other humans or the
>> machine that will execute the instructions) is through typing text
>> into some kind of a text editor.
> 
> You seem to be stuck on the idea that symbols (non-ASCII characters) are 
> inherently more expressive than text, specifically that a single symbol is 
> easier to comprehend and use than a composition of several symbols. This is a 
> lovely theory.  Unfortunately it's wrong.

I'm gonna bet it's correct in some limited cases. Like APL's sort
functions: way easier to understand directly than "ascending" and
"descending". Maybe it's just that I'm a non-native English speaker. But
I feel the same way towards my native "stigande" and "fallande", so I
don't think so.


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-11 Thread Random832
On Mon, Nov 11, 2019, at 03:22, Greg Ewing wrote:
> On 11/11/19, 12:41 PM, Richard Damon wrote:
> > it was DESIGNED to be inefficient (that was one of its design goals, to
> > slow typesetters down to be slower than the machine they were working
> > on).
> 
> This is most likely a myth, see https://en.wikipedia.org/wiki/QWERTY

This is a nice rhetorical trick: "Contrary to popular belief, the QWERTY layout 
was not designed to slow the typist down,[5] but rather to speed up typing by 
preventing jams." - well *of course* the goal was not to slow down actual 
production of text, but this does not imply the method by which "speeding up by 
preventing jams" was to be achieved was not by slowing down the physical 
process of pressing keys. (And the argument that having keys on alternating 
hands speeds things up is related to modern touch-typing techniques, and has 
little to do with the environment in which QWERTY was originally designed).


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-11 Thread Rhodri James

On 10/11/2019 20:50, Martin Euredjian via Python-ideas wrote:

It does, it's an extension of the reality that, after so many
decades, we are still typing words on a text editor.  In other words,
my comment isn't so much about the mechanics and editors that are
available as much as the fact that the way we communicate and define
the computational solution of problems (be it to other humans or the
machine that will execute the instructions) is through typing text
into some kind of a text editor.


You seem to be stuck on the idea that symbols (non-ASCII characters) are 
inherently more expressive than text, specifically that a single symbol 
is easier to comprehend and use than a composition of several symbols. 
This is a lovely theory.  Unfortunately it's wrong.


We don't read character by character, it turns out.  We read whole 
lexical units in one go.  So '→', ':=' and 'assign' all take the same 
amount of effort to recognise.  What we learn to recognise them as is 
another matter, and familiarity counts there.


I'm not a cognitive psychologist so I can't point you at any of the 
relevant papers for this, but I can assure you it's true.  I've been 
through the experiments where words were flashed up on a screen for a 
fiftieth of a second (eye persistence time, basically), and we could all 
recognise them perfectly well no matter how long they were.  (There 
probably are limits but we didn't hit them.)  A quirk of my brain is 
that unlike my classmates I couldn't do the same with numbers -- with a 
very few exceptions like powers of two, numbers are just collections of 
digits to me in a way that words *aren't* collections of letters.


Yes, I was a mathematician.  Why do you ask? :-)

--
Rhodri James *-* Kynesim Ltd


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-11 Thread Greg Ewing

On 11/11/19, 12:41 PM, Richard Damon wrote:

it was DESIGNED to be inefficient (that was one of its design goals, to
slow typesetters down to be slower than the machine they were working
on).


This is most likely a myth, see https://en.wikipedia.org/wiki/QWERTY

--
Greg


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-10 Thread David Mertz
These thousands of words of repeating claims with weird non sequitur
digressions seem to amount to "I wish Python used hard-to-enter Unicode
characters instead of words on normal keyboards" -- as far as I can tell,
because human brains, apparently, cannot make sense of the two-character
symbol `:=` but could somehow easily comprehend the exact same operation
using the back-arrow that I have no idea how to enter here.

And yes, I know that Martin said there was some complex chord of keys in
one particular IDE that would enter that arrow character.  I don't remember
what the combo is, but I'm sure it's possible to learn it.

FWIW, as I often point out in similar threads, I actually use vim's
conceal plugin so that my Python code looks like it has funny characters
inside it.  But it doesn't really; the screen just shows those for
certain character patterns, and the beautiful ASCII gets saved to disk
as regular Python.  Martin could easily use that plugin (or something
similar for his own editor) to visually transform ':=' into '←'.

Just install something like that, and know that the keystrokes to enter
the funny back-arrow are ':' followed by '='.  Simpler even than the
chording in whatever APL IDE.

On Sun, Nov 10, 2019 at 3:53 PM Martin Euredjian via Python-ideas <
python-ideas@python.org> wrote:

> > This has nothing to do with representation or input via text
>
> It does, it's an extension of the reality that, after so many decades, we
> are still typing words on a text editor.  In other words, my comment isn't
> so much about the mechanics and editors that are available as much as the
> fact that the way we communicate and define the computational solution of
> problems (be it to other humans or the machine that will execute the
> instructions) is through typing text into some kind of a text editor.
>
> When I say "text" I mean "a through z, numbers and a few symbols that were
> found on mechanical typewriters in the 1960's".  My shorthand for that is
> ASCII, which isn't quite accurate in that the set symbols contained in the
> sets where the most significant bits are "000" and "001" (7 bit ASCII) are
> not used other than CR, LF and HT.
>
> So, for the most part, programming, for the last 60 years or so --over
> half a century-- has been limited to the characters and symbols found on a
> 60 year old typewriter.
>
> For some reason this evokes the lyrics from a Pink Floyd song, "Got
> thirteen channels of sh** on the T.V. to choose from".  The advances in
> computation since the 1960's have been immense, and yet we pretend that it
> is OK to limit ourselves to a 60 year old keyboard in describing and
> programming the next generation of AI systems that will reach unimaginable
> levels of complexity, power and capabilities.
>
> As I have mentioned in another comment, having had this experience, I
> fully understand how people who do not have the benefit of having
> communicated with computers, not just symbolically, but through a very
> different paradigm as well, simply cannot see what I am describing.  It's
> hard to find an analogy that can easily represent this without some common
> shared perspective.  I found that music can be that tool.  Of course, that
> requires classical training at a level sufficient enough to, for example,
> read and "see" the music when presented with a score.
>
> Now, it's easy to say "I can do that" when presented with something like
> this and maybe have a rudimentary understanding of it:
>
> https://www.youtube.com/watch?v=MeaQ595tzxQ
>
>
> It is something quite different when presented with something like this,
> without a "play" button, even if annotated:
>
>
> http://buxtonschool.org.uk/wp-content/uploads/2017/04/Annotated-Bach-Brandenburg-score.pdf?LMCL=ruk_U7
>
>
> I have found that trying to explain the value of true notation to people
> who lack the experience and training is always a losing proposition.  I'm
> already regretting having started this thread, simply because I know how
> this works.  Frankly, it's almost like trying to engage with a religious
> person while trying to discuss the lack of evidence for the existence of
> supernatural beings.  They "know" what they "know" and it is a very rare
> case that someone is actually going to get out of that box and comprehend
> what you are saying.
>
> BTW, there are some interesting efforts out there, like this:
>
> https://www.youtube.com/watch?v=1iTPLgfmFdI
>
> Once you dig into these truly interesting examples you end-up discovering
> that notation still has a significant edge.
>
>
> -Martin
>
>



[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-10 Thread Andrew Barnert via Python-ideas
On Nov 10, 2019, at 20:50, Martin Euredjian via Python-ideas 
 wrote:
> 
> > This has nothing to do with representation or input via text
> 
> It does, it's an extension of the reality that, after so many decades, we are 
> still typing words on a text editor. 

And how else would you want to enter code?

APL is words and symbols in a text editor.

Python, C++, JS, Haskell, Mathematica, Julia, etc. are also words and symbols 
in a text editor. The only difference is a couple dozen fewer symbols, and I’m 
not sure why you think that makes a transformative difference.

Meanwhile, unlike APL, some of these languages have options like Jupyter 
notebooks and similar tools that allow you to organize that text into cells and 
paste images between the cells or generate them from your code or even include 
inline live displays, but apparently that doesn’t impress you at all.

You’ve agreed that graphical programming languages, where you connect up
components by drawing lines between them, are useless for
general-purpose programming.

So what exactly are you suggesting we should have instead of text?

And in what way is experience with APL relevant to it?

> In other words, my comment isn't so much about the mechanics and editors that 
> are available as much as the fact that the way we communicate and define the 
> computational solution of problems (be it to other humans or the machine that 
> will execute the instructions) is through typing text into some kind of a 
> text editor.  
> 
> When I say "text" I mean "a through z, numbers and a few symbols that were 
> found on mechanical typewriters in the 1960's".  My shorthand for that is 
> ASCII, which isn't quite accurate in that the set symbols contained in the 
> sets where the most significant bits are "000" and "001" (7 bit ASCII) are 
> not used other than CR, LF and HT.  

Right, most programming languages make do with 80-odd characters, while
APL uses about 100, most of the extras being variations on letters.

Although actually most languages—including Python, but not including APL—let 
you use a few thousand other characters for your function names and other 
identifiers. But apparently that isn’t interesting to you, it’s only those few 
dozen extra characters being used as builtins that matters.
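
(Concretely, the identifier part already works today, per PEP 3131:)

    # Python 3 identifiers may use non-ASCII letters:
    π = 3.141592653589793

    def área(radio):
        return π * radio ** 2

    print(área(2))   # 12.566370614359172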

So, why?

> So, for the most part, programming, for the last 60 years or so --over half a 
> century-- has been limited to the characters and symbols found on a 60 year 
> old typewriter.

And adding another shift key to add one more bank of a couple dozen makes a 
difference how?

And if you want something that can input thousands of characters… well, what 
would that look like? Have you used CJK keyboards? They don’t have thousands of 
keys, because nobody could use that with human fingers. Instead, either you 
have a bunch of extra shifts, or you enter effectively two letters and a number 
for each character. That’s not any more expressive, it’s slower and clumsier.

> As I have mentioned in another comment, having had this experience, I fully 
> understand how people who do not have the benefit of having communicated with 
> computers, not just symbolically, but through a very different paradigm as 
> well, simply cannot see what I am describing.  It's hard to find an analogy 
> that can easily represent this without some common shared perspective.  I 
> found that music can be that tool.  Of course, that requires classical 
> training at a level sufficient enough to, for example, read and "see" the 
> music when presented with a score.

You keep bringing up music as a comparison, but music notation has far fewer 
characters, and they’ve been unchanged for even longer than text punctuation. 
The advantage of music is a 2D notation, not more characters.

And the disadvantage of music notation is the same disadvantage of everything 
besides text: nobody’s come up with a nice way of entering it that’s even 
remotely smooth enough that it doesn’t force you to think about the editor 
instead of the music. When I want to generate notation, I don’t use a notation 
editor, I play something into a sequencer, edit it in the piano roll interface, 
and then convert to notation and tweak a few last things, because nothing else 
is usable. And if I want to do it on my phone, I just can’t do it at all.

Just like math, where it’s easier to read notation because it’s 2D, but the 
best way to create that notation is to type in code or AsciiMath or TeX and 
render that to an equation. 

> I have found that trying to explain the value of true notation to people who 
> lack the experience and training is always a losing proposition.  I'm already 
> regretting having started this thread, simply because I know how this works.  
> Frankly, it's almost like trying to engage with a religious person while 
> trying to discuss the lack of evidence for the existence of supernatural 
> beings.

No, what you’re trying to do is engage with a Christian by telling him that his 
silly 2000-year-old trinity 

[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-10 Thread Richard Damon
Martin,

I think one thing you need to realize is that just being a better idea
doesn't make it easy to get implemented. There is a LOT of inertia in
infrastructure, and overcoming it can be nearly impossible.

A great example of that is your typical keyboard: if you were to create
a new keyboard now, this is absolutely not how you would want to
organize it, because it really seems illogical and inefficient. In fact,
it was DESIGNED to be inefficient (that was one of its design goals: to
slow typesetters down to be slower than the machine they were working
on). Why do we still use it? There have been numerous attempts to
replace it, and the answer is the inertia of infrastructure. It would
cost WAY too much to just scrap all the existing keyboards and replace
them with new ones (both materially and in training), and the gains
aren't good enough, and the costs are still too high, to try to phase in
a transition. (Not all systems will support a new input method, as there
are costs to supporting it, so demand needs to be proved; but if you
still need to keep the skill of using a classical keyboard, a better
keyboard isn't going to be that much better, so it isn't worth the
effort.)

One big issue in programming is that programmers have gotten used to
being able to use a number of different tools for programming (a given
programmer doesn't use many, but many are used across different
programmers). This is part of the existing infrastructure, and this
infrastructure is largely based on simple 'ASCII' input. For a language
to be based on lots of different characters not on the keyboard, it
needs either to restrict the user to a small set of tools that support
the language, or there needs to be a common input method that is
available in all the common tools.

APL got away with this, because it started out that way, and started
before people got used to having a variety of tools. It is also one
reason it is relegated to being a niche language.

The infrastructure argument basically makes it very hard for an existing
language to move from being 'ASCII' based to being symbolic based.


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-10 Thread Martin Euredjian via Python-ideas
 > This has nothing to do with representation or input via text

It does, it's an extension of the reality that, after so many decades, we are 
still typing words on a text editor.  In other words, my comment isn't so much 
about the mechanics and editors that are available as much as the fact that the 
way we communicate and define the computational solution of problems (be it to 
other humans or the machine that will execute the instructions) is through 
typing text into some kind of a text editor.  

When I say "text" I mean "a through z, numbers and a few symbols that were 
found on mechanical typewriters in the 1960's".  My shorthand for that is 
ASCII, which isn't quite accurate in that the set symbols contained in the sets 
where the most significant bits are "000" and "001" (7 bit ASCII) are not used 
other than CR, LF and HT.  

So, for the most part, programming, for the last 60 years or so --over half a 
century-- has been limited to the characters and symbols found on a 60 year old 
typewriter.
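
(To put a number on that typewriter repertoire -- nothing beyond the
standard library:)

    import string

    # The whole "typewriter" repertoire: 7-bit ASCII's printable characters.
    print(len(string.printable))   # 100, counting space and \t \n \r \v \f
    print(string.punctuation)      # the few dozen symbols languages build on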

For some reason this evokes the lyrics from a Pink Floyd song, "Got thirteen 
channels of sh** on the T.V. to choose from".  The advances in computation 
since the 1960's have been immense, and yet we pretend that it is OK to limit 
ourselves to a 60 year old keyboard in describing and programming the next 
generation of AI systems that will reach unimaginable levels of complexity, 
power and capabilities.  

As I have mentioned in another comment, having had this experience, I fully 
understand how people who do not have the benefit of having communicated with 
computers, not just symbolically, but through a very different paradigm as 
well, simply cannot see what I am describing.  It's hard to find an analogy 
that can easily represent this without some common shared perspective.  I found 
that music can be that tool.  Of course, that requires classical training at a 
level sufficient enough to, for example, read and "see" the music when 
presented with a score.

Now, it's easy to say "I can do that" when presented with something like this 
and maybe have a rudimentary understanding of it:

https://www.youtube.com/watch?v=MeaQ595tzxQ 


It is something quite different when presented with something like this, 
without a "play" button, even if annotated:

http://buxtonschool.org.uk/wp-content/uploads/2017/04/Annotated-Bach-Brandenburg-score.pdf?LMCL=ruk_U7
 

I have found that trying to explain the value of true notation to people who 
lack the experience and training is always a losing proposition.  I'm already 
regretting having started this thread, simply because I know how this works.  
Frankly, it's almost like trying to engage with a religious person while trying 
to discuss the lack of evidence for the existence of supernatural beings.  They 
"know" what they "know" and it is a very rare case that someone is actually 
going to get out of that box and comprehend what you are saying.

BTW, there are some interesting efforts out there, like this:

https://www.youtube.com/watch?v=1iTPLgfmFdI

Once you dig into these truly interesting examples you end up
discovering that notation still has a significant edge.


-Martin



[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-10 Thread Andrew Barnert via Python-ideas
On Nov 10, 2019, at 08:23, Stephen J. Turnbull 
 wrote:
> 
> Martin Euredjian via Python-ideas writes:
> 
>> Another interesting example is had in some of my work with real
>> time embedded systems.  There are plenty of cases where you are
>> doing things that are very tightly related to, for example, signals
>> coming into the processor though a pin; by this I mean real-time
>> operations where every clock cycle counts.  One of the most
>> critical aspects of this for me is the documentation of what one is
>> doing and thinking.  And yet, because we insist on programming in
>> ASCII we are limited to text-based comments that don't always work
>> well at all.  In my work I would insert a comment directing the
>> reader to access a PDF file I placed in the same directory often
>> containing an annotated image of a waveform with notes.
> 
> This has nothing to do with representation or input via text, though.
> Emacs has displayed formatted output along with or instead of code
> since the mid-90s or so, invariably based on a plain-text protocol and
> file format.  (The point is not that Emacs rulez, although it
> does. ;-)  It's that a program available to anyone on commodity
> hardware could do it.  I imagine the capability goes back to the Xerox
> Alto in the mid-70s.)
>  > It would be amazing if we could move away from text-only
>  > programming and integrate a rich environment where such
>  > documentation could exist and move with the code.
> 
> We've been *able* to do so for decades (when was WYSIWYG introduced?),
> and even if you hate Emacs, there are enough people and enough
> variation among them that if this were really an important thing, it
> would be important in Emacs.  It's not.

Well, we know that some of this really is important because thousands of people 
are already doing it with Jupyter/iPython notebooks, and with 
Mathematica/Wolfram, and so on. Not only can I paste images in as comments 
between the cells, I can have a cell that displays a graph or image inline in 
the notebook, or (with SymPy) an equation in nice math notation. And all of 
that gets saved together with the code in the cells.

I wouldn’t use it for all my code (it’s not the best way to write a web service 
or an AI script embedded in a game), but I believe many people who mostly do 
scientific computing do.

But that’s how we know that we don’t need to change our languages to enable 
such tools. Anyone who would find it amazing is already so used to doing it 
every day that they’ve ceased to be amazed.

> I would offer the Lisp family as a higher-level counterexample.  In
> Lisp programming, I at least tend to avoid creating idioms in favor of
> named functions and macros, or flet and labels (their local versions).

Haskell might be an even more relevant example.

In Haskell, you can define custom operators as arbitrary strings of the symbol 
characters. You can also use any function as an infix operator by putting it in 
backticks, and pass any operator around as a function by putting it in parens, 
so deciding what to call it is purely a question of whether you want your 
readers to see map `insert` value or map |< value. (Well, almost; you can’t 
control the precedence of backticked functions.)

And that often involves a conscious tradeoff between how many times you need to 
read the code in the near future vs. how likely you are to come back to it 
briefly after 6 months away from it, or between how many people are going to 
work on the code daily vs. how many will have to read it occasionally. The |< 
is more readable once you internalize it, at least if you’re doing a whole lot 
of inserting and chaining it up with other operations. But once it’s gone out 
of your head, it’s one more thing to have to relearn before you can understand 
the code.

Python effectively makes that choice for me, erring on the side of future 
readability. I think that’s very close to the right choice 80% of the time—and 
not having to make that choice makes coding easier and more fluid, even if I 
sacrifice a bit the other 20% of the time. This is similar to other tradeoffs 
that Haskell lets me make but Python makes for me (e.g., point-free or 
lambda—Python only has the lambda style plus explicit partial).
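
A sketch of that last contrast (add is just an illustrative helper):

    from functools import partial

    def add(x, y):
        return x + y

    # Haskell would also allow the point-free spelling (add 1); Python
    # makes you pick one of these two explicit forms instead:
    inc_lambda = lambda n: add(1, n)
    inc_partial = partial(add, 1)

    assert inc_lambda(41) == inc_partial(41) == 42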

Of course Haskell uses strings of ASCII symbols for operators. If you used 
strings of Unicode symbols, that would probably change the balance. I could 
imagine that there would be cases where no string of ASCII symbols will make 
sense to me in 6 months but a Unicode symbol would. (But only if we could 
ignore the input problem—which we can’t.) I don’t think the set of APL symbols 
is particularly intuitive in that sense. (If I were using it every day, a 
special rho variation character meaning reshape would make sense, but if I 
hadn’t seen it in 6 months it would be no more meaningful than any Haskell 
operator line noise.) But maybe there are symbols that would be more 
intuitively meaningful, or at least that would be closer,

[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-10 Thread Andrew Barnert via Python-ideas
On Nov 10, 2019, at 08:00, Stephen J. Turnbull 
 wrote:
> 
> Andrew Barnert via Python-ideas writes:
>>> On Nov 7, 2019, at 19:59, Chris Angelico  wrote:
>>> 
>>> And I do the same with the operators that you disparagingly call
>>> "ASCII soup". I touch type them. What's the difference, other than
>>> that I can transfer my knowledge of typing English?
>> 
>> Well, there’s also the fact that you can touch type them into
>> Mail.app and pine and a StackOverflow answer box and a blog comment
>> and a general purpose text editor and get the exact same result you
>> get in PyCharm or PyDev. That’s a pretty huge difference, which the
>> OP is ignoring.
> 
> I don't think there would be a difference nowadays.  I type Japanese
> encoded in Unicode into Mail.app and Emacs and Nano and web forms all
> the time; they're perfectly happy with Unicode.  

Sure, they’re all happy with Unicode; the question is how you type it.

On a Mac, I can use a system input method to type Japanese characters, and it 
works the same way in every app (including the console), but iOS and Windows 
and X11 have different input methods, and if I sit down at someone else’s 
laptop it’s not likely to have the IM configured. (Or I can use emacs native 
IMs, which work across platforms, but only work within emacs.)

But it’s worse than that. MacOS, iOS, Android, Windows, and most Linux distros 
come with a Japanese IM that you just have to know how to enable and configure, 
but I don’t think any of them come with an APL symbol IM. So if I want to type 
APL source code, I have to find a third-party IM, or maybe even write one. (Or 
I have to buy an APL IDE or configure emacs, and not be able to type code 
anywhere else, including on my phone.)

And that wouldn’t change if Python used the APL arrow for assignment. I’m sure 
PyCharm would have a shortcut for it, and anyone who uses emacs could figure 
out how to set that up, and Pythonista would have an arrow key on its bar above 
the virtual keyboard, but none of that would enable me to type Python code into 
an email message or a StackOverflow answer or even a terminal REPL session.

> The question is what
> will the *audience* do with those unfamiliar symbols?

If we’re talking about APL-like symbol density, that’s an issue. But just 
adding the arrow for assignment wouldn’t commit us to that. It’s possible that 
the happy medium of readability lies somewhere between Python and APL, and 
that’s what Python would approach over the next decade. (Python already 
approximates a hybrid between English pseudocode and math notation; it would 
just be moving along that spectrum.) And the audience would have no problem 
with that—novices would learn the arrow for assignment in the very first 
lesson, and they’d learn how to use iota instead of range (and the best 
practices for when to do so) a few lessons in, and so on. If it really is more 
readable, people would learn to read it.

Of course it’s arguable that Python is already so close to the sweet spot that 
there’s no benefit to be gained there. But I don’t think that’s something 
that’s immediately self-evident. If it weren’t for the input problem, arrow 
might well be better than equals, a few extra operators might be helpful, etc. 
It’s the input that dooms that possibility, not the readability.


___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/7IK5SGRUEUJPLJE2I4GBOC46UWTUHHUL/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-09 Thread Stephen J. Turnbull
Martin Euredjian via Python-ideas writes:

 > Another interesting example is had in some of my work with real
 > time embedded systems.  There are plenty of cases where you are
 > doing things that are very tightly related to, for example, signals
 > coming into the processor though a pin; by this I mean real-time
 > operations where every clock cycle counts.  One of the most
 > critical aspects of this for me is the documentation of what one is
 > doing and thinking.  And yet, because we insist on programming in
 > ASCII we are limited to text-based comments that don't always work
 > well at all.  In my work I would insert a comment directing the
 > reader to access a PDF file I placed in the same directory often
 > containing an annotated image of a waveform with notes.

This has nothing to do with representation or input via text, though.
Emacs has displayed formatted output along with or instead of code
since the mid-90s or so, invariably based on a plain-text protocol and
file format.  (The point is not that Emacs rulez, although it
does. ;-)  It's that a program available to anyone on commodity
hardware could do it.  I imagine the capability goes back to the Xerox
Alto in the mid-70s.)

 > It would be amazing if we could move away from text-only
 > programming and integrate a rich environment where such
 > documentation could exist and move with the code.

We've been *able* to do so for decades (when was WYSIWYG introduced?),
and even if you hate Emacs, there are enough people and enough
variation among them that if this were really an important thing, it
would be important in Emacs.  It's not.

Note that I do not deny that it is a *thing*.  I just can't agree that
it would be amazing -- I would be neither surprised nor particularly
overwhelmingly enabled by it, and I don't know anybody else who would
find it especially enabling.  (Again, I don't deny that *you* would.)

 > The famous one-liner solutions are not neat because they are
 > one-liners, they are interesting because they become idioms, words,
 > with a meaning.  Your brain sees that and knows what that line is
 > doing.

This is absolutely true.  As you point out earlier, it's also true of
*fingers*, and it's really no harder to type the ligature "<-" than it
is to type the chord "Alt-Meta-DoubleBucky-<" (unless you're a
professional pianist).  I agree with those who say that "←" is hard
to read, but I think that's a font issue: we should fix the fonts.

Keyboards are harder.  Labels aren't enough: it really annoys me when
I have to look at the keyboard for some rarely typed Japanese
characters whose Roman input format I don't know.

 > Again, this isn't what happens to a newbie, of course.

Of course it does.  That's exactly why I have always consistently
flubbed certain phrases in old hymns translated to Japanese: the use
of particles (suffixes equivalent to English prepositions) has changed
over time.  I don't know how natives handle it, but to me it's really
difficult precisely because I have a limited repertoire of idioms, I
can't even read the older versions easily, even though they use the
modern glyphs.  (Bach German lyrics in Fraktur are perversely easier
*because* I don't have any German idioms!)  Japanese members of my
choirs are sympathetic, but they also don't understand why I don't
"get better".  *They* catch on fast enough, and my Japanese is
good enough to occasionally fool people on the phone. :-)

 > The closest Python example of this I can provide would be list
 > comprehensions you reach for all the time.

I think even closer is the walrus operator itself.
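
For example, a runnable sketch of the idiom (Python 3.8+; io.StringIO stands 
in for any file-like source):

    import io

    source = io.StringIO("one\ntwo\nthree\n")

    # Bind and test in one expression; the pattern reads as a single
    # gesture once it has become an idiom.
    while (line := source.readline()):
        print(line.strip())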

But there are Python counterexamples, as well.  Some of them are
arguably compromises that satisfy nobody (lambdas that can wrap
expressions but not suites).  Others are debatable (I don't know how
he feels now, but there was a time that Guido wrote he wished he could
deprecate augmented assignments in favor of a compiler optimization),
and Guido has never been able to accept increment operators, though
for at least the first decade I participated here they were a frequent
request.

I would offer the Lisp family as a higher-level counterexample.  In
Lisp programming, I at least tend to avoid creating idioms in favor of
named functions and macros, or flet and labels (their local versions).
(Curiously, I guess, I do make liberal use of lambdas as arguments to
functionals rather than use a label.)

I'm beginning to regret using the words "idiom" and "counterexample"
(even though they were introduced by others), but I don't have better
ones.  By "counterexample", I think that what I'm trying to argue here
is that yes, human beings do have the ability, and a very useful
ability, to create and use idioms.  But these idioms can also be
usefully compressed into names in some contexts.  The choice of which
to use is a matter of style, of course, but also a matter of audience.
Use of names is more compact, and among "experts" (ie, those who
already know and use those names) the

[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-09 Thread Stephen J. Turnbull
Andrew Barnert via Python-ideas writes:
 > On Nov 7, 2019, at 19:59, Chris Angelico  wrote:
 > > 
 > > And I do the same with the operators that you disparagingly call
 > > "ASCII soup". I touch type them. What's the difference, other than
 > > that I can transfer my knowledge of typing English?
 > 
 > Well, there’s also the fact that you can touch type them into
 > Mail.app and pine and a StackOverflow answer box and a blog comment
 > and a general purpose text editor and get the exact same result you
 > get in PyCharm or PyDev. That’s a pretty huge difference, which the
 > OP is ignoring.

I don't think there would be a difference nowadays.  I type Japanese
encoded in Unicode into Mail.app and Emacs and Nano and web forms all
the time; they're perfectly happy with Unicode.  The question is what
will the *audience* do with those unfamiliar symbols?

Steve
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/XBTETZLJTSRDROWRA4RBWS3U6QIPLBV2/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-07 Thread Chris Angelico
On Fri, Nov 8, 2019 at 11:16 AM Andrew Barnert via Python-ideas
 wrote:
>
> On Nov 7, 2019, at 19:59, Chris Angelico  wrote:
> >
> > And I do the same with the operators that you disparagingly call
> > "ASCII soup". I touch type them. What's the difference, other than
> > that I can transfer my knowledge of typing English?
>
> Well, there’s also the fact that you can touch type them into Mail.app and 
> pine and a StackOverflow answer box and a blog comment and a general purpose 
> text editor and get the exact same result you get in PyCharm or PyDev. That’s 
> a pretty huge difference, which the OP is ignoring.
>

Well - yes. However, if I regularly needed to type certain
tokens, I would set them up with my Compose key, which works
with (nearly) every X11 app. But not everyone has a convenient way to
type arbitrary characters across all apps.
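
For reference, a sketch of what such an entry can look like in ~/.XCompose 
(X11; exact file location and default tables vary by setup):

    include "%L"                             # keep the system compose table
    <Multi_key> <less> <minus> : "←" U2190   # Compose, <, -  ->  left arrow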

ChrisA
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/XE6U3RTSYC4GY7LN3A2PKP5JWL2JZEP5/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-07 Thread Andrew Barnert via Python-ideas
On Nov 7, 2019, at 19:59, Chris Angelico  wrote:
> 
> And I do the same with the operators that you disparagingly call
> "ASCII soup". I touch type them. What's the difference, other than
> that I can transfer my knowledge of typing English?

Well, there’s also the fact that you can touch type them into Mail.app and pine 
and a StackOverflow answer box and a blog comment and a general purpose text 
editor and get the exact same result you get in PyCharm or PyDev. That’s a 
pretty huge difference, which the OP is ignoring.

___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/3AKBVF3OQ2MXHHCRUJ3TQYKTPYL4HZNN/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-07 Thread Greg Ewing

Abe Dillon wrote:
> I don't disagree that infix notation is more readable because humans
> have trouble balancing brackets visually.


I don't think it's just about brackets, it's more about
keeping related things together. An expression such as

   b**2 - 4*a*c

can be written unambiguously without brackets in a variety
of less-readable ways, e.g.

   - ** b 2 * 4 * a c

That uses the same number of tokens, but I think most people
would agree that it's a lot harder to read.
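
The same contrast can be written out in Python (a sketch using the operator 
module):

    from operator import mul, pow, sub

    a, b, c = 1, 5, 6
    infix = b**2 - 4*a*c
    prefix = sub(pow(b, 2), mul(4, mul(a, c)))  # same tokens, nested calls
    assert infix == prefix == 1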

> The main concern of math notation seems to be limiting ink or chalk use
> at the expense of nearly all else (especially readability).


Citation needed, that's not obvious to me at all.

Used judiciously, compactness aids readability because it
allows you to see more at once. I say "judiciously" because
it's possible to take it too far -- regexes are a glaring
example of that. Somewhere there's a sweet spot where you
present enough information in a well-organised way in a
small enough space to be able to see it all at once, but
not so much that it becomes overwhelming.

> Why is exponentiation or log not infixed?  Why so many different ways
> to represent division or differentiation?


Probably the answer is mostly "historical reasons", but I
think there are good reasons for some of those things persisting.
It's useful to have addition, multiplication and exponentiation
all look different from each other. During my Lisp phase, what
bothered me more than the parentheses (I didn't really mind those)
was that everything looked so bland and uniform -- there were no
visual landmarks for the eye to latch onto.

Log not infix -- it's pretty rare to use more than one base for
logs in a given expression, so it makes sense for the base to
be implied or relegated to a less intrusive position.

I can only think of a couple of ways mathematicians represent
division (÷ is only used in primary school in my experience)
and one of them (negative powers) is kind of unavoidable once
you generalise exponentiation beyond positive integers.

I'll grant that there are probably more ways to represent
differentiation than we really need. But I think we would lose
something if we were restricted to just one. Newton's notation
is great for when you're thinking of functions as first-class
objects. But Leibnitz makes the chain rule blindingly obvious,
and when you're solving differential equations by separating
variables it's really useful to be able to move differentials
around independently. Other notations have their own advantages
in their own niches.

> Something persisting because it works does not imply any sort of
> optimality.


True, but equally, something having been around for a long time
doesn't automatically mean it's out of date and needs to be
replaced. Things need to be judged on their merits.

> A good way to test this is to find a paper with heavy use of
> esoteric math notation and translate that notation to code. I think
> you'll find the code more accessible. I think you'll find that even
> though it takes up significantly more characters, it reads much quicker
> than a dense array of symbols.


In my experience, mathematics texts are easiest to read when the
equations are interspersed with a good amount of explanatory prose,
using words rather than abbreviations.

But I wouldn't want the equations themselves to be written more
verbosely. Nor would I want the prose to be written in a programming
language. Perhaps ironically, the informality of the prose makes
it easier to take in.

> I spent a good few weeks trying to make sense of the rather short book
> "Universal Artificial Intelligence" by Marcus Hutter because he relies
> so heavily on symbolic notation. Now that I grasp it, I could explain it
> much more clearly in much less time to someone with much less background
> than I had going in to the book.


But your explanation would be in English, not a formal language,
right?

--
Greg
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/TQM3HKSL7DFIA5OXRO4CNSVAGRHBYL5J/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-07 Thread Andrew Barnert via Python-ideas
On Nov 7, 2019, at 22:35, MRAB  wrote:
>> 
> There was a version of APL on the Sinclair QL, which, IIRC, replaced the 
> symbols with keywords. I don't know how well it did.

The OP started the thread complaining about J, which is a much more systematic 
ASCIIfication of APL carefully designed by half the core APL team. If he hates 
that, I’m pretty sure he wouldn’t be happy with Sinclair APL.

___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/OS4A46MDN2NBXJZUJTAICBZWEYOR5SFC/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-07 Thread MRAB

On 2019-11-07 20:30, Paul Moore wrote:

On Thu, 7 Nov 2019 at 18:59, Chris Angelico  wrote:


On Fri, Nov 8, 2019 at 5:40 AM Martin Euredjian via Python-ideas
 wrote:
>
> > Was your use of APL on a machine with a dedicated APL keyboard?
>
> I've done both.  In the early '80's it was not uncommon to find terminals 
> with APL keyboards.  IBM, DEC, Tektronix and others made them.  Once the IBM PC era 
> took hold most of APL was done with either a card you'd place in front of your 
> keyboard or stickers you'd add to the front of the then thick keycaps.
>
> Here's reality:  It isn't that difficult at all to mentally map a bunch of 
> symbols to a standard keyboard.  It's a bit clunky at first but you learn very, 
> very quickly, I would venture to guess that one could reach for the most common 
> APL symbols with ease within a day.
>

Here's another very VERY important reality: once you've done something
for multiple years, you are usually *terrible* at estimating how
difficult it truly is. Your brain understands what you're doing and
has no difficulty with it, so you think that it's an easy thing to do.
Is it? Maybe; maybe not. But unless you have watched a brand new APL
programmer, you can't see how hard it actually is. Or in this case,
perhaps not a brand-new APL programmer, but someone who has (say) six
months of experience.


And another very important reality here is that Python is used by a
lot of people who would not class themselves as professional
programmers. It's used by schoolchildren to learn about computers.
It's used by graphic designers as an embedded language. It's used by
gamers writing mods for games. The list goes on. Many of those people
have NO INTEREST in learning to program Python efficiently. An awful
lot won't learn any Python, they'll just copy some code off the web
and fiddle with it to get the results they want. They just want to get
a job done, and for them, even a single non-standard character is
probably a major barrier. They certainly aren't going to put stickers
on their keys, or use a reference card to know how to type operators.

To be blunt, there's good reasons APL never took off with the general
programming community. If we are to learn any lessons about the good
features in APL, we need to understand those reasons and accept their
validity first. And I'm pretty certain that "weird character set"
would turn out to be one of them...

There was a version of APL on the Sinclair QL, which, IIRC, replaced the 
symbols with keywords. I don't know how well it did.

___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/5T2WHOF6VMGH4KNSADAC6ZX2XV5QG4GU/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-07 Thread Paul Moore
On Thu, 7 Nov 2019 at 18:59, Chris Angelico  wrote:
>
> On Fri, Nov 8, 2019 at 5:40 AM Martin Euredjian via Python-ideas
>  wrote:
> >
> > > Was your use of APL on a machine with a dedicated APL keyboard?
> >
> > I've done both.  In the early '80's it was not uncommon to find terminals 
> > with APL keyboards.  IBM, DEC, Tektronix and other made them.  Once the IBM 
> > PC era took hold most of APL was done with either a card you'd place in 
> > front of your keyboard or stickers you'd add to the front of the then thick 
> > keycaps.
> >
> > Here's reality:  It isn't that difficult at all to mentally map a bunch of 
> > symbols to a standard keyboard.  It's a bit clunky at first but you learn 
> > very, very quickly, I would venture to guess that one could reach for the 
> > most common APL symbols with ease within a day.
> >
>
> Here's another very VERY important reality: once you've done something
> for multiple years, you are usually *terrible* at estimating how
> difficult it truly is. Your brain understands what you're doing and
> has no difficulty with it, so you think that it's an easy thing to do.
> Is it? Maybe; maybe not. But unless you have watched a brand new APL
> programmer, you can't see how hard it actually is. Or in this case,
> perhaps not a brand-new APL programmer, but someone who has (say) six
> months of experience.

And another very important reality here is that Python is used by a
lot of people who would not class themselves as professional
programmers. It's used by schoolchildren to learn about computers.
It's used by graphic designers as an embedded language. It's used by
gamers writing mods for games. The list goes on. Many of those people
have NO INTEREST in learning to program Python efficiently. An awful
lot won't learn any Python, they'll just copy some code off the web
and fiddle with it to get the results they want. They just want to get
a job done, and for them, even a single non-standard character is
probably a major barrier. They certainly aren't going to put stickers
on their keys, or use a reference card to know how to type operators.

To be blunt, there's good reasons APL never took off with the general
programming community. If we are to learn any lessons about the good
features in APL, we need to understand those reasons and accept their
validity first. And I'm pretty certain that "weird character set"
would turn out to be one of them...

Paul
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/IDGZTZNK4CFJZCW2HWIHQFQ4OH2SKDZC/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-07 Thread Abe Dillon
>
> > Why not use a more consistent notation like add(x, y) instead of x +
> > y when we know addition is a function and all other functions (usually)
> > follow the f(x, y) notation?
> > Because math is old.
> No, it's because infix notation is *more readable* than function
> notation when formulas become complex.


I think addition was a bad example of inconsistent notation, but I did
hedge that statement with:

> That's not the only reason, of course, but it is a pretty big reason.


I don't disagree that infix notation is more readable because humans have
trouble balancing brackets visually. However, I maintain that readability
doesn't seem to be the main concern of math notation. The main concern of
math notation seems to be limiting ink or chalk use at the expense of
nearly all else (especially readability). Why is exponentiation or log not
infixed? Why so many different ways to represent division or
differentiation?

> It has persisted because it works, not because mathematicians are stuck in
> their ways.


Something persisting because it works does not imply any sort of
optimality. A good way to test this is to find a paper with heavy use of
esoteric math notation and translate that notation to code. I think you'll
find the code more accessible. I think you'll find that even though it
takes up significantly more characters, it reads much quicker than a dense
array of symbols.

I spent a good few weeks trying to make sense of the rather short book
"Universal Artificial Intelligence" by Marcus Hutter because he relies so
heavily on symbolic notation. Now that I grasp it, I could explain it much
more clearly in much less time to someone with much less background than I
had going in to the book.

On Thu, Nov 7, 2019 at 1:44 AM Greg Ewing 
wrote:

> Abe Dillon wrote:
> > Why not use a more consistent notation like add(x, y) instead of x +
> > y when we know addition is a function and all other functions (usually)
> > follow the f(x, y) notation?
> > Because math is old.
>
> No, it's because infix notation is *more readable* than function
> notation when formulas become complex. It has persisted because
> it works, not because mathematicians are stuck in their ways.
>
> Having said that, it's relatively rare that mathematicians make
> up entirely new symbols -- they're more likely to repurpose
> existing ones. E.g. "+" is used for addition-like operations on
> a very wide variety of things -- numbers, vectors, matrices,
> tensors, quantum states, etc. etc. Mathematics is quite
> Python-like in that way.
>
> --
> Greg
> ___
> Python-ideas mailing list -- python-ideas@python.org
> To unsubscribe send an email to python-ideas-le...@python.org
> https://mail.python.org/mailman3/lists/python-ideas.python.org/
> Message archived at
> https://mail.python.org/archives/list/python-ideas@python.org/message/FUHAOCNKANGDQEZBPRRKC7NTHV4JIVNU/
> Code of Conduct: http://python.org/psf/codeofconduct/
>
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/LFFEJPZK5ER35LG5D5LVS6ZBJDWEYL65/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-07 Thread Chris Angelico
On Fri, Nov 8, 2019 at 5:40 AM Martin Euredjian via Python-ideas
 wrote:
>
> > Was your use of APL on a machine with a dedicated APL keyboard?
>
> I've done both.  In the early '80's it was not uncommon to find terminals 
> with APL keyboards.  IBM, DEC, Tektronix and others made them.  Once the IBM 
> PC era took hold most of APL was done with either a card you'd place in front 
> of your keyboard or stickers you'd add to the front of the then thick keycaps.
>
> Here's reality:  It isn't that difficult at all to mentally map a bunch of 
> symbols to a standard keyboard.  It's a bit clunky at first but you learn 
> very, very quickly, I would venture to guess that one could reach for the 
> most common APL symbols with ease within a day.
>

Here's another very VERY important reality: once you've done something
for multiple years, you are usually *terrible* at estimating how
difficult it truly is. Your brain understands what you're doing and
has no difficulty with it, so you think that it's an easy thing to do.
Is it? Maybe; maybe not. But unless you have watched a brand new APL
programmer, you can't see how hard it actually is. Or in this case,
perhaps not a brand-new APL programmer, but someone who has (say) six
months of experience.

> How do we learn to touch-type?  By typing.  At first you look at the keyboard 
> all the time.  I never do any more.  I am typing this by looking at the 
> screen, I haven't looked at the keyboard even once this entire time.  You can 
> do that with APL just fine, it's easy.
>
> When I was actively using the language every day I touch typed APL, didn't 
> even think about it.  Which is also another powerful thing.  Once you get to 
> that point expressing ideas computationally is not unlike playing music on a 
> piano, it just flows.
>

And I do the same with the operators that you disparagingly call
"ASCII soup". I touch type them. What's the difference, other than
that I can transfer my knowledge of typing English?

> It is my opinion that this is so because we are still typing words into text 
> editors.  I do not, by any means, imply that programming graphically is the 
> solution.  I do a lot of FPGA work, mostly designing complex high speed real 
> time image processing hardware.  I have tried graphical tools for FPGA work 
> and they have never really worked well at all.  In this case my go-to tool 
 > ends up being Verilog or even lower register-level hardware description.  I 
> can't tell you what form this "next generation" approach to programming 
 > should take other than having the belief, due to my experience with APL, 
> that the introduction of symbols would be of potentially great value.
>

Oh you're absolutely right that graphical programming is not the
solution.  We're still typing words *because typing words is still the
best way to do things*. There have been many MANY point-and-click
programming tools developed (the first one I ever met was back in the
90s, a codegen tool in VX-REXX), and while they are spectacular tools
for bringing a smidgen of programming to a non-programmer (say, giving
an artist the ability to drag and drop instruction blocks around to
create a complex animation sequence), they are *not* a replacement for
text-based coding.

> A simple example of this might be a state machine driving the menu system of 
> an embedded system with a simple multi-line LCD display and a few buttons and 
> knobs for a control panel.  I've done control panels with two dozen displays, 
> a couple hundred buttons and two dozen encoder/knobs.  Once you start looking 
> at what it takes to design something like that, code it and then support it 
> through iterations, feature changes and general code maintenance it becomes 
> VERY obvious that typing words on a text editor is absolutely the worst way 
> to do it.  And yet we insist on being stuck inside an ASCII text editor for 
> our work.  From my perspective, in 2019, it's just crazy.
>

No actually, it's not so obvious to me. Convince me. Show me that
typing words is "absolutely the worst".

> Another interesting example is had in some of my work with real time embedded 
> systems.  There are plenty of cases where you are doing things that are very 
> tightly related to, for example, signals coming into the processor though a 
> pin; by this I mean real-time operations where every clock cycle counts.  One 
> of the most critical aspects of this for me is the documentation of what one 
> is doing and thinking.  And yet, because we insist on programming in ASCII we 
> are limited to text-based comments that don't always work well at all.  In my 
> work I would insert a comment directing the reader to access a PDF file I 
> placed in the same directory often containing an annotated image of a 
> waveform with notes.  It would be amazing if we could move away from 
> text-only programming and integrate a rich environment where such 
> documentation could exist and move with the code.
>

Ehh, if you want to use

[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-07 Thread Martin Euredjian via Python-ideas
 > Was your use of APL on a machine with a dedicated APL keyboard?
I've done both.  In the early '80's it was not uncommon to find terminals with 
APL keyboards.  IBM, DEC, Tektronix and others made them.  Once the IBM PC era 
took hold most of APL was done with either a card you'd place in front of your 
keyboard or stickers you'd add to the front of the then thick keycaps.
Here's reality:  It isn't that difficult at all to mentally map a bunch of 
symbols to a standard keyboard.  It's a bit clunky at first but you learn very, 
very quickly, I would venture to guess that one could reach for the most common 
APL symbols with ease within a day.
How do we learn to touch-type?  By typing.  At first you look at the keyboard 
all the time.  I never do any more.  I am typing this by looking at the screen, 
I haven't looked at the keyboard even once this entire time.  You can do that 
with APL just fine, it's easy.

When I was actively using the language every day I touch typed APL, didn't even 
think about it.  Which is also another powerful thing.  Once you get to that 
point expressing ideas computationally is not unlike playing music on a piano, 
it just flows.
I still use APL today, but more as a powerful calculator than anything else.  
Among other things, I work in robotics, where doing quick linear algebra 
calculations comes in handy.  Other than that, APL --for good reasons-- is 
pretty much a dead language.  That's not to say there aren't concepts in there 
that warrant consideration.  The power of notation is not appreciated by most 
programmers because there really isn't anything like APL out there.  I know 
people won't accept this because it is human nature to resist change or new 
ideas, but the truth is the way we express our ideas in computational terms is 
rather primitive.  
It is my opinion that this is so because we are still typing words into text 
editors.  I do not, by any means, imply that programming graphically is the 
solution.  I do a lot of FPGA work, mostly designing complex high speed real 
time image processing hardware.  I have tried graphical tools for FPGA work and 
they have never really worked well at all.  In this case my go-to tool ends up 
being Verilog or even lower register-level hardware description.  I can't tell 
you what form this "next generation" approach to programming should take other 
than having the belief, due to my experience with APL, that the introduction 
of symbols would be of potentially great value.
I look at ideas such as designing and defining state machines.  I've done a ton 
of that work in both hardware (FPGA's) and software (ranging from embedded 
systems in Forth, C and C++ to desktop and web applications in various 
languages).  I've had to develop custom tools to make the task of designing, 
coding and maintaining such state machines easier than manually typing walls of 
text consisting of nested switch() statements or whatever the language allows.  
A simple example of this might be a state machine driving the menu system of an 
embedded system with a simple multi-line LCD display and a few buttons and 
knobs for a control panel.  I've done control panels with two dozen displays, a 
couple hundred buttons and two dozen encoder/knobs.  Once you start looking at 
what it takes to design something like that, code it and then support it 
through iterations, feature changes and general code maintenance it becomes 
VERY obvious that typing words on a text editor is absolutely the worst way to 
do it.  And yet we insist on being stuck inside an ASCII text editor for our 
work.  From my perspective, in 2019, it's just crazy.
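
For what it's worth, the usual Python counter to those walls of nested 
switch() statements is to turn the machine into data; a minimal sketch, with 
all state and event names invented for illustration:

    # One table entry per transition, instead of nested switch/if blocks.
    TRANSITIONS = {
        ("home", "knob_cw"):      ("settings", "highlight_next"),
        ("home", "btn_ok"):       ("settings", "enter_submenu"),
        ("settings", "btn_back"): ("home", "redraw_home"),
    }

    def step(state, event):
        # Unknown events leave the state unchanged and trigger no action.
        return TRANSITIONS.get((state, event), (state, None))

    assert step("home", "btn_ok") == ("settings", "enter_submenu")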
Another interesting example is had in some of my work with real time embedded 
systems.  There are plenty of cases where you are doing things that are very 
tightly related to, for example, signals coming into the processor though a 
pin; by this I mean real-time operations where every clock cycle counts.  One 
of the most critical aspects of this for me is the documentation of what one is 
doing and thinking.  And yet, because we insist on programming in ASCII we are 
limited to text-based comments that don't always work well at all.  In my work 
I would insert a comment directing the reader to access a PDF file I placed in 
the same directory often containing an annotated image of a waveform with 
notes.  It would be amazing if we could move away from text-only programming 
and integrate a rich environment where such documentation could exist and move 
with the code.
Anyhow, not suggesting, by any stretch of the imagination, that these things 
are a necessity for Python.  You asked an important and interesting question 
and I wanted to give you an answer that also exposed some of my perspective 
beyond this insignificant question of an assignment operator.
> I'd like to get some information on how much of that productivity was 
> demonstrated on a system with a conventionalkeyboard.
To address this directly, the case for somethi

[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-07 Thread Paul Moore
On Thu, 7 Nov 2019 at 00:16, Martin Euredjian via Python-ideas
 wrote:
> Sorry, notation is far more powerful.  As I said in one of my other notes, 
> people who have not had the unique experience of using something like APL for 
> non-trivial development work simply don't get it.

Was your use of APL on a machine with a dedicated APL keyboard? I know
the old IBM terminals had dedicated APL symbol keys. The point has
been made here a few times that typing extended characters reliably is
hard, but you haven't as yet responded to that (as far as I can see).
While I haven't used APL much, I do a reasonable amount of
mathematical writing, and I find that it's frustratingly difficult to
express mathematical ideas on a computer, because the need to find
ways of writing the notation (whether it's by looking up Unicode
symbols, or remembering notation like LaTeX) breaks the flow of ideas.

So while I won't dispute that writing APL may have been highly
productive for you, I'd like to get some information on how much of
that productivity was demonstrated on a system with a conventional
keyboard. Without good evidence that the productivity gains you're
suggesting can be achieved on the input devices that real-world Python
users have available, one of your main arguments in favour of this
change is significantly weakened.

Paul
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/47LKB24BM7VDNX5E2PIHR5NN72T3MD2Y/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread Greg Ewing

Abe Dillon wrote:
> Why not use a more consistent notation like add(x, y) instead of x +
> y when we know addition is a function and all other functions (usually)
> follow the f(x, y) notation?
> 
> Because math is old.


No, it's because infix notation is *more readable* than function
notation when formulas become complex. It has persisted because
it works, not because mathematicians are stuck in their ways.

Having said that, it's relatively rare that mathematicians make
up entirely new symbols -- they're more likely to repurpose
existing ones. E.g. "+" is used for addition-like operations on
a very wide variety of things -- numbers, vectors, matrices,
tensors, quantum states, etc. etc. Mathematics is quite
Python-like in that way.
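
Python exhibits the same repurposing directly (a sketch; Vec2 is a toy class):

    print(2 + 3)          # integer addition -> 5
    print([1, 2] + [3])   # sequence concatenation -> [1, 2, 3]

    class Vec2:
        def __init__(self, x, y):
            self.x, self.y = x, y
        def __add__(self, other):  # "+" repurposed for vector addition
            return Vec2(self.x + other.x, self.y + other.y)

    v = Vec2(1, 2) + Vec2(3, 4)
    print(v.x, v.y)       # -> 4 6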

--
Greg
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/FUHAOCNKANGDQEZBPRRKC7NTHV4JIVNU/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread Greg Ewing

Mike Miller wrote:
> There is:
> 
>   U+2B32 ⬲ LEFT ARROW WITH CIRCLED PLUS
> 
> But there would need to be more.


At this point you're creating a mini-language by combining symbols,
which the OP seems to be against, since he describes things like
":=" and "<-" condescendingly as "ascii soup".

--
Greg
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/DXOWWKOGVEPUGQ3N3LQ7MMV2ICJVDVBD/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread Abe Dillon
>
> In that context something like the Conway Game of Life in APL demo should
> inspire an interested party in exploring further.  None of the tools he
> uses in the demo are difficult to comprehend, particularly if you have a
> background in basic Linear Algebra (another foundational element of APL).


My reaction was "burn it with fire!". It looks like a programming language
designed around code golf. I've seen enough code in that vein to hate it
with a passion. I could tell that you thought the game of life video would
win some favor; I can only speak for myself, but I suspect it's had the
opposite effect intended.

> The difference is that regex looks like )(*&)(*&)(*^)(*&^ which means
> nothing.  Your brain has a mapping for what these symbols mean already.
> Ascribing new meaning to a new mish-mash of them breaks that mental mapping
> and model


I don't think my brain has much of a model for what * and ^ mean. I don't
think it's an over-loading problem. Again verbal expressions make much more
sense to me even though they use letters that are used everywhere. Even
though they require ~10x the number of characters, I can read verbal
expressions 50x faster.
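
For what it's worth, Python's re module already offers a middle ground between 
dense symbols and words (a sketch; both patterns are equivalent):

    import re

    dense = re.compile(r"[+-]?\d+(\.\d+)?")
    verbose = re.compile(r"""
        [+-]?          # optional sign
        \d+            # integer part
        (\.\d+)?       # optional fractional part
        """, re.VERBOSE)

    assert dense.fullmatch("-3.14") and verbose.fullmatch("-3.14")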

> What is "real notation"
> Maybe the right term is "domain specific notation".  I suspect you know
> very well what I mean and are simply enjoying giving me a hard time.


No. It was a 100% legit question. I think our perspectives are far out of
sync if you think that was a facetious question.

> APL is such a powerful language. APL is also a powerfully write-only
> language.
> Easy answer:  That Reddit commenter is simply ignorant.  This is silly.


It's the same impression I got from watching the videos you linked and
trying to learn a bit of APL. From looking around it doesn't seem like an
unpopular view. You can call everyone ignorant all you want. Python
programmers could just as easily ignore all the people that talk about how
slow Python is. It would be a detriment to the community to stick their
head in the sand and ignore detractors. If it makes sense to you and a tiny
fraction of the world and looks like complete madness to everyone else,
then maybe your appraisal of it being super easy to understand and pick up
and read and write isn't universal.

I think you mentioned having a physics background? I could see how a
language that adopts a bunch of super-esoteric mathematical notation would
appeal to you. I'm sure the Navier-Stokes equations map quite nicely to
APL. When I see that upside-down capital "L" symbol, my brain NOPEs out.

> Look, APL is, for all intents and purposes, dead for general usage today.
> Yet both IBM and Dyalog sell high end interpreters, with Dyalog getting
> $2,500 PER YEAR (I believe IBM is similarly priced).
> https://www.dyalog.com/prices-and-licences.htm#devlicprice
> https://www.ibm.com/us-en/marketplace/apl2
> So, clearly this would not exist if the language was useless or if it was
> "write-only" as that genius on Reddit opined.


It absolutely would if some of your code base relies on some indecipherable
APL that someone wrote back in the 70s and nobody wants to touch out of
fear that a nuclear sub will sink somewhere. Otherwise, someone would have
translated that bit of code decades ago and been done with it.


On Wed, Nov 6, 2019 at 7:33 PM Martin Euredjian via Python-ideas <
python-ideas@python.org> wrote:

> > This distinction between notation and soup seems pretty subjective.
>
> Yes and no.  Math looks like hieroglyphics to most people.  We are talking
> about professional programmers here.  In that context something like the
> Conway Game of Life in APL demo should inspire an interested party in
> exploring further.  None of the tools he uses in the demo are difficult to
> comprehend, particularly if you have a background in basic Linear Algebra
> (another foundational element of APL).
>
> It's like learning a language that does not use Latin script, say, Hebrew
> or Greek.  At first nothing makes sense, yet it doesn't take very long for
> someone to recognize the characters, attach them to sounds and then make
> words, badly at first and better with time.
>
> Note that I am not proposing a complete APL-ization of Python.  My only
> observation was that the judicious introduction of a single new symbol for
> assignment would solve the problem that now requires "=" and ":=".  This is
> far more elegant.  You don't break old code --ever-- even when you phase
> out the use of "=" in future versions because replacing "=" with the new
> symbol is an elementary process.
>
>
> >> Another example of ASCII soup is regex.
> > That's interesting, I feel the same way. I can read most code pretty
> quickly, but as soon as I hit a regex it takes me 50x as long to read
>
> That's it!  You got it!  The difference is that regex looks like
> )(*&)(*&)(*^)(*&^ which means nothing.  Your brain has a mapping for what
> these symbols mean already.  Ascribing new meaning to a 

[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread Chris Angelico
On Thu, Nov 7, 2019 at 12:35 PM Martin Euredjian via Python-ideas
 wrote:
> >> Another example of ASCII soup is regex.
> > That's interesting, I feel the same way. I can read most code pretty 
> > quickly, but as soon as I hit a regex it takes me 50x as long to read
>
> That's it!  You got it!  The difference is that regex looks like 
> )(*&)(*&)(*^)(*&^ which means nothing.  Your brain has a mapping for what 
> these symbols mean already.  Ascribing new meaning to a new mish-mash of them 
> breaks that mental mapping and model, which means that it requires 50 or 100 
> times the cognitive load to process and comprehend.  You are forced to keep a 
> truly unnatural mental stack in your head as you parse these infernal 
> combinations of seemingly random ASCII to figure out their meaning.
>
> Notation changes that, if anything for one simple reason:  It establishes new 
> patterns, with punctuation and rhythm and your brain can grok that.  Don't 
> forget that our brains have evolved amazing pattern matching capabilities, 
> symbols, notation, take advantage of that, hence the deep and wide history of 
> humanity using symbols to communicate.  Symbols are everywhere, from the 
> icons on your computer and phone to the dashboard of your car, signs on the 
> road, math, music, etc.
>

The asterisk is commonly interpreted to mean multiplication:

>>> 3 * 5
15
>>> "abc" * 4
'abcabcabcabc'

In a regex, it has broadly the same meaning. It allows any number of
what came before it. That's broadly similar to multiplication.
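
A quick runnable illustration of that analogy:

    import re

    # '*' as repetition in a regex: zero or more of what precedes it.
    assert re.fullmatch(r"(abc)*", "abcabcabcabc")
    # '*' as repetition on a string: a fixed count of what precedes it.
    assert "abc" * 4 == "abcabcabcabc"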

How is that somehow "not notation", yet you can define arbitrary
symbols to have arbitrary meanings and it is "notation"? What's the
distinction?

ChrisA
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/Z4VVNT55TLLJYZ5V2ZG5WT3XHNHIVECN/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread Martin Euredjian via Python-ideas
 > This distinction between notation and soup seems pretty subjective.
Yes and no.  Math looks like hieroglyphics to most people.  We are talking 
about professional programmers here.  In that context something like the Conway 
Game of Life in APL demo should inspire an interested party in exploring 
further.  None of the tools he uses in the demo are difficult to comprehend, 
particularly if you have a background in basic Linear Algebra (another 
foundational element of APL).  

It's like learning a language that does not use Latin script, say, Hebrew or 
Greek.  At first nothing makes sense, yet it doesn't take very long for someone 
to recognize the characters, attach them to sounds and then make words, badly 
at first and better with time.

Note that I am not proposing a complete APL-ization of Python.  My only 
observation was that the judicious introduction of a single new symbol for 
assignment would solve the problem that now requires "=" and ":=".  This is far 
more elegant.  You don't break old code --ever-- even when you phase out the 
use of "=" in future versions because replacing "=" with the new symbol is an 
elementary process.

>> Another example of ASCII soup is regex.
> That's interesting, I feel the same way. I can read most code pretty
> quickly, but as soon as I hit a regex it takes me 50x as long to read
That's it!  You got it!  The difference is that regex looks like 
)(*&)(*&)(*^)(*&^ which means nothing.  Your brain has a mapping for what these 
symbols mean already.  Ascribing new meaning to a new mish-mash of them breaks 
that mental mapping and model, which means that it requires 50 or 100 times the 
cognitive load to process and comprehend.  You are forced to keep a truly 
unnatural mental stack in your head as you parse these infernal combinations of 
seemingly random ASCII to figure out their meaning.
Notation changes that, if anything for one simple reason:  It establishes new 
patterns, with punctuation and rhythm and your brain can grok that.  Don't 
forget that our brains have evolved amazing pattern matching capabilities, 
symbols, notation, take advantage of that, hence the deep and wide history of 
humanity using symbols to communicate.  Symbols are everywhere, from the icons 
on your computer and phone to the dashboard of your car, signs on the road, 
math, music, etc.

> What is "real notation" 
Maybe the right term is "domain specific notation".  I suspect you know very 
well what I mean and are simply enjoying giving me a hard time.  No problem.  
Thick skin on this side.
> APL is such a powerful language. APL is also a powerfully write-only language.
Easy answer:  That Reddit commenter is simply ignorant.  This is silly.

> APL doesn't strike me as pragmatic in any sense.

Look, APL is, for all intents and purposes, dead for general usage today.  Yet 
both IBM and Dyalog sell high end interpreters, with Dyalog getting $2,500 PER 
YEAR (I believe IBM is similarly priced).

https://www.dyalog.com/prices-and-licences.htm#devlicprice

https://www.ibm.com/us-en/marketplace/apl2

So, clearly this would not exist if the language was useless or if it was 
"write-only" as that genius on Reddit opined.
That said, outside of certain application domains I would not recommend anyone 
consider using APL.  The language, as I said before, was ahead of its time and 
the people behind it truly sucked at marketing and expanding popularity for 
more reasons than I care to recount here.  I was very involved in this 
community in the 80's.  I knew Ken Iverson and the other top fliers in the 
domain.  I even published a paper back in '85 along with a presentation at an 
ACM/APL conference.  And still, I would say, no, not something anyone should 
use today.  Learn?  Yes, absolutely, definitely.  It's an eye opener but not 
much more than that.
Real APL development stopped a long time ago.  Maybe one day someone with the 
right ideas will invent NextGenerationAPL or something like that and give it a 
place in computing.

That is not to say that some of the concepts in APL have no place in other 
languages or computing.  For example, list comprehensions in Python have a very 
close link to the way things are done in APL.  They almost feel APL-like 
constructs to someone with experience in the language.

Here's another interesting APL resource that serious practitioners have used 
for decades (with some memorizing useful idioms):

https://aplwiki.com/FinnAplIdiomLibrary

Yes, if you program APL professionally you can read this and it does not look 
like ASCII soup.

For example this is a downward (largest to smallest) sort of a:
b←⍒a

      a ← ⍳20
      a
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20
      b ← ⍒a
      b
20 19 18 17 16 15 14 13 12 11 10 9 8 7 6 5 4 3 2 1
I just showed you another symbol, "⍳", the index generator, which is loosely 
equivalent to range() in Python.
So "⍳20" and range(1,21) generate similar results.  In APL parlance, it's a 
vector.
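
For comparison, a rough Python spelling of that session (a sketch):

    a = list(range(1, 21))        # ⍳20, the index generator
    b = sorted(a, reverse=True)   # the downward sort shown above
    print(b)                      # 20 19 18 ... 3 2 1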

The difference 

[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread Andrew Barnert via Python-ideas
On Nov 7, 2019, at 01:04, Martin Euredjian via Python-ideas 
 wrote:
> 
> >> No professional thinks that "a = some_object" results in a bucket being 
> >> filled with whatever the object might contain. 
> > That's exactly how variables work in many common languages like C
> 
> Nope, not true.  Bytes, words, integers and that's about it.  Anything else 
> is a pointer to a relevant data structure somewhere in memory.  I think I can 
> say this it true of the vast majority of languages.  Exceptions are cases 
> like Python and, say, Objective-C, or, in general, where the philosophy is 
> that everything is an object.  Nobody is filling buckets with anything every 
> time there's an assignment, at least no language I have ever used. 

Almost every part of this is seriously wrong.

In C, filling a struct-shaped bucket with a struct-shaped value is exactly what 
happens when you initialize, assign to, or pass an argument to a struct-typed 
variable. Early C severely restricted when you could do such things, but that’s 
always how it worked when it was allowed, and in modern C it’s allowed almost 
everywhere you’d want it to be. If you want to pass around a pointer in C, you 
have to be explicit on both ends—use a pointer-to-struct-typed variable, and 
use the & operator on the value (and then you have to use the * operator to do 
anything with the pointer variable). And the right way to think about it is not 
reference semantics, but filling a pointer-shaped bucket with a pointer-shaped 
value. (Or, actually, not just “pointer” but specifically “pointer to some 
specific type”; you can’t portably stick a pointer-to-nullary-int-function 
value in a pointer-to-double value.)

And this is key to the whole notion of lvalue semantics, where variables (among 
other things) are typed buckets with identity for storing values (that are just 
bit patterns), as opposed to namespace semantics, where variables (among other 
things) are just names for typed values with identity that live wherever they 
want to live. After `a=b` in a namespace language, `a is b` is true, because a 
and b are two names for the one value in one location; in an lvalue language, 
there usually is no such operator, but if there were, `a is b` would still be 
false, because a and b are two distinct locations that contain distinct copies 
of the same value, because what `a=b` means is filling the a bucket with the 
value in the b bucket. (In fact, in C, even `a==b` might not be true if, say, 
`a` is a uint16 variable and b is a uint32 variable holding a value over 
65535.)
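
The namespace half of that contrast is easy to demonstrate from Python itself
(a minimal sketch):

    a = [1, 2, 3]
    b = a           # bind a second name to the same object
    print(a is b)   # True: two names, one object, one location
    b.append(4)
    print(a)        # [1, 2, 3, 4] -- the change is visible through either name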

And it’s not like lvalue semantics was a dead-end bad idea from C that other 
languages abandoned. C++ took lvalue semantics much further than C, and built 
on struct assignment with notions like assignment operators, that let your type 
T customize how to fill a T-shaped bucket with whatever kind of value you want. 
C# and Swift built even further on that by syntactically distinguishing value 
types (that act like C structs) and reference types (that act sort of like 
Python objects).

Meanwhile, while most “everything is an object” languages like Python, 
Smalltalk, and Ruby (but not Objective C, which is about as far from 
everything-is-an-object as possible) use namespace semantics, they’re hardly 
the only languages that do; so do, for example, plenty of impure functional 
languages that aren’t at all OO.

Meanwhile, “bytes, words, integers, and… anything else is a [implicit] pointer” 
is a kind of clunky manual optimization used in various 80s languages, and 
borrowed from there into Java, but much less common in newer languages. It’s 
hardly a universal across languages, much less some kind of a priori necessity.
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/MDGUMGS23ZB4Y4YHFE6RUYUX3FAIENW2/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread Chris Angelico
On Thu, Nov 7, 2019 at 11:15 AM Martin Euredjian via Python-ideas
 wrote:
> Anyhow, I'll repeat, I am nobody and I am certainly not going to change the 
> Python world.  Sleep well knowing this is just another random moron saying 
> something he probably should not have said.
>

You keep saying that, yet you posted to Python-Ideas about a "missed
opportunity". Clearly you have an opinion on what the language
*should* have done. If you truly believed your opinion to be
meaningless, you would have posted this rant onto Facebook or
something instead :)

Respect yourself as much as you respect everyone else. We can discuss
and debate even though this *exact* ship has sailed; there will be
future questions of language design. :)

ChrisA
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/TYKDUTTAFJGZFXKT6AEFLAWACQ2E6WZB/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread Abe Dillon
>
> Nope, not true.


Yes indeed. It is true.

> Bytes, words, integers and that's about it.


All the primary data types and structs and enums and typedefs. That leaves
pointers, arrays, and... function pointers if you don't count those as
pointers. All of which misses the point completely.

In C, you have to declare variables with a type (even if it's a pointer)
and the program allocates memory for that variable (even if it's a pointer)
to be stored (like a bucket). If it's a basic type or struct, the memory
allocated for it depends on the type. The program has to allocate memory
*for the variable* based on what that variable is supposed to hold (even if
it's a pointer).
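
A quick Python counterpoint to that fixed, typed-bucket model (a sketch):

    a = 7             # a names an int object
    a = "seven"       # now the same name refers to a str; no declared type,
    a = [0] * 10**6   # no fixed-size slot -- only the binding changes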

I really don't want to explain the mechanics of C here. I know how pointers
work. You seem to be avoiding the point on purpose.

On Wed, Nov 6, 2019 at 6:05 PM Martin Euredjian via Python-ideas <
python-ideas@python.org> wrote:

> >> No professional thinks that "a = some_object" results in a bucket
> being filled with whatever the object might contain.
> > That's exactly how variables work in many common languages like C
>
> Nope, not true.  Bytes, words, integers and that's about it.  Anything
> else is a pointer to a relevant data structure somewhere in memory.  I
> think I can say this it true of the vast majority of languages.  Exceptions
> are cases like Python and, say, Objective-C, or, in general, where the
> philosophy is that everything is an object.  Nobody is filling buckets with
> anything every time there's an assignment, at least no language I have ever
> used.  At the end of the day, it's a pointer to a chunk-o-memory with a
> header describing what's in there, how many, etc.  In more complex cases
> it's a pointer to a linked list or a pointer to a chunk of memory filled
> with pointers to other chunks of memory.  This has been the case from
> almost the beginning of time.  Let's put it this way, I was doing this kind
> of thing when I was programming IMSAIs with toggle switches and keeping
> track of variable names, memory locations and contents on a notebook by
> hand, paper and pencil.
>
>
> -Martin
>
>
> On Wednesday, November 6, 2019, 02:01:27 PM PST, Abe Dillon <
> abedil...@gmail.com> wrote:
>
>
> Question:  Where did APL's  "←" operator come from?
>
> Doesn't matter. If your notation can't stand on its own without a history
> lesson, then it's not great.
>
>
> A number of APL's elements came from a notation developed to describe the
> operation of IBM processors back in the 1960's.  In many ways it meant
> "this name is assigned to this object", to paraphrase your statement.
>
> In "many ways"? Not exactly? How does this make it better? It still sounds
> counterintuitive. If it really means "this name references this object",
> why not a left arrow?
>
> I mean, how does "a = 23" which is read "a is equal to 23" or "a =
> some_object" which is literally read "a is equal to some_object" say "a is
> a label that is attached to 23" or "a is a label that is attached to
> some_object"?
>
> I would say that it's not perfect notation. I read "a = 23" as "a is 23"
> in the same way that I would say "Abe is my name". It doesn't describe the
> relationship well, but it's an acceptable, pragmatic use of familiar
> notation and it doesn't run *counter to* the actual mechanics of the
> language. As far as symbols go, an arrow to the left would be closest to
> representing the mechanics of the language, "=" is a compromise, and "←" is
> backwards. The object doesn't refer to the name, the name refers to the
> object.
>
> No professional thinks that "a = some_object" results in a bucket being
> filled with whatever the object might contain.
>
> That's exactly how variables work in many common languages like C and is
> actually a common misconception even among people with years of experience
> with Python.
>
> Check out this Stack Overflow question that asks what the difference is and
> the confused comments that follow
> saying there is no difference or even the fact that the question was closed
> because people didn't understand what it was asking. It's pretty bonkers to
> me that professional programmers don't know the difference or understand a
> very basic question, but it's the world we live in.
>
> In fact, one could very well make the argument that using the "="
> character for this operation is misleading because the left side is not set
> to be equal to the right side.  Even worse, these pointers in Python are
> immutable.
>
> The "=" doesn't imply immutability. That's actually what the walrus
> operator ":=" implies. "pi := circumference/diameter" means "pi is defined
> as the ratio of the circumference to the diameter".
>
> For the most part, implying that the name equals the object it references
> is fine. Implying that you're somehow putting the object into the variable
> or attaching the object to the variable (instead of the other way around) is
> backwards.

[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread Abe Dillon
>
> > The fact that I can enter Python code as plain text is even more useful
> > than music and math.
> Sure, because you speak English.  Go talk to a kid who has to learn this
> in, I don't know, Egypt or China.


The same process that made mathematical notation standard across the world
is happening for English right now. I'm sure there are better languages
that could have become the global standard just as there may have been
better notation systems that might have been better than the mathematical
notation we ended up with (the use of "i" and name "imaginary numbers" is
quite unfortunate in my opinion, but at least we're not using Roman
numerals...). Why would you praise the ubiquity of mathematical notation
and scorn English developing the same property?

On Wed, Nov 6, 2019 at 6:16 PM Martin Euredjian via Python-ideas <
python-ideas@python.org> wrote:

> > Really? Because I’ve been using ABC notation for music for decades
>
> Try writing an orchestral suite in ASCII and see how well it goes.
> C'mon.  I know people use tablature and similar ideas.  Sure.  So what?
>
> > The fact that I can enter Python code as plain text is even more useful
> than music and math.
>
> Sure, because you speak English.  Go talk to a kid who has to learn this
> in, I don't know, Egypt or China.
>
> Sorry, notation is far more powerful.  As I said in one of my other notes,
> people who have not had the unique experience of using something like APL
> for non-trivial development work simply don't get it.  This is very much a
> conservative reaction (I do not mean this in the political sense at all) in
> that people are always more comfortable with things they know staying the
> same.
>
> I suspect that, had I not been pulled from a FORTRAN class by my
> APL-evangelist Physics professor in college I would likely react similarly
> if someone told me it might be better to consider notation as a way to
> improve computational expression of ideas.  I get it.  I am not oblivious
> to the absolute fact that context is crucially important here and very few
> people without the appropriate context are open minded enough to consider
> new ideas without becoming passionately involved in a negative way.
>
> Anyhow, I'll repeat, I am nobody and I am certainly not going to change
> the Python world.  Sleep well knowing this is just another random moron
> saying something he probably should not have said.
>
> I now see this is an intellectually welcoming community, which completely
> explains the walrus operator and other issues.
>
>
> Thanks anyway,
>
> -Martin
>
>
>
>
> On Wednesday, November 6, 2019, 02:22:00 PM PST, Andrew Barnert <
> abarn...@yahoo.com> wrote:
>
>
> On Nov 6, 2019, at 18:05, Martin Euredjian via Python-ideas <
> python-ideas@python.org> wrote:
>
> >
> > No, using "<--" is going in the wrong direction.  We want notation, not
> ASCII soup.  One could argue even walrus is ASCII soup.  Another example of
> ASCII soup is regex.  Without real notation one introduces a huge cognitive
> load.  Notation makes a massive difference.  Any classically trained
> musician sees this instantly.  If we replaced musical notation with
> sequences of two or three ASCII characters it would become an
> incomprehensible mess.
>
>
> Really? Because I’ve been using ABC notation for music for decades, and
> it’s never felt like an incomprehensible mess to me. While it doesn’t look
> as nice as normal musical notation, it’s even easier to learn, and it
> doesn’t take that long to be able to read it. And the fact that I can enter
> it a lot faster—and, more importantly, that I can enter it in exactly the
> same way in any email client, text editor, etc.—makes it hugely useful.
> When I’m out somewhere and think of a riff, I can just pull out my phone,
> fire up the Notes app, and type in the ABC. That’s visual enough for me to
> get it out of my consciousness and then read it off and hear it in my head
> so I can come up with a bassline to go with it. Or to read back a few days
> later and play it (after a couple seconds of processing, which admittedly
> isn’t as good as musical notation—but a lot better than nothing, which is
> what I’d have if I insisted on musical notation or nothing).
>
> For some reason, ABC is something that only musicologists learn, not
> musicians, unless you happen to get lucky and find it by accident. But many
> other musicians come up with their own idiosyncratic systems for the same
> purpose.
>
> You also mentioned math. The last visual editor for math I didn’t
> absolutely hate was my HP42. And I’d still rather use ASCIIMathML over
> that, much less something like Microsoft Word’s equation editor. It’s great
> to have something like MathJax on-the-fly rendering, but when I don’t have
> that, I can read x_0, or (-b +- sqrt(b^2 – 4ac))/(2a). Some things are too
> complicated for ASCIIMathML, but for them, I’ll use TeX; there’s no way I’d
> figure it out in a symbolic editor without a whole lot of painful trial and
> error.

[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread Alexandre Brault

On 2019-11-06 12:05 p.m., Martin Euredjian via Python-ideas wrote:


Typing these symbols isn't a problem at all.  For example, in 
NARS2000, a free APL interpreter I use, the assignment operator "←" is 
entered simply with "Alt + [".  It takes seconds to internalize this 
and never think about it again.  If you download NARS2000 right now 
you will know how to enter "←" immediately because I just told you how 
to do it.  You will also know exactly what it does. It's that simple.


I'm not nearly qualified enough to speak to the rest of your message, 
but I should point out that I still have no idea how to type that arrow. 
The keystroke for [ on my keyboard already includes the Alt key, so Alt + [ 
does nothing in NARS2000.


Alex

___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/72HBYSXYIEXJKTJLPP7EVJ2C5MDCVSSR/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread Martin Euredjian via Python-ideas
 > Really? Because I’ve been using ABC notation for music for decades
Try writing an orchestral suite in ASCII and see how well it goes.  C'mon.  I 
know people use tablature and similar ideas.  Sure.  So what?
> The fact that I can enter Python code as plain text is even more useful than 
> music and math.

Sure, because you speak English.  Go talk to a kid who has to learn this in, I 
don't know, Egypt or China.  

Sorry, notation is far more powerful.  As I said in one of my other notes, 
people who have not had the unique experience of using something like APL for 
non-trivial development work simply don't get it.  This is very much a 
conservative reaction (I do not mean this in the political sense at all) in 
that people are always more comfortable with things they know staying the same.
I suspect that, had I not been pulled from a FORTRAN class by my APL-evangelist 
Physics professor in college I would likely react similarly if someone told me 
it might be better to consider notation as a way to improve computational 
expression of ideas.  I get it.  I am not oblivious to the absolute fact that 
context is crucially important here and very few people without the appropriate 
context are open minded enough to consider new ideas without becoming 
passionately involved in a negative way.

Anyhow, I'll repeat, I am nobody and I am certainly not going to change the 
Python world.  Sleep well knowing this is just another random moron saying 
something he probably should not have said.
I now see this is an intellectually welcoming community, which completely 
explains the walrus operator and other issues.

Thanks anyway,
-Martin



On Wednesday, November 6, 2019, 02:22:00 PM PST, Andrew Barnert 
 wrote:  
 
 On Nov 6, 2019, at 18:05, Martin Euredjian via Python-ideas 
 wrote:
> 
> No, using "<--" is going in the wrong direction.  We want notation, not ASCII 
> soup.  One could argue even walrus is ASCII soup.  Another example of ASCII 
> soup is regex.  Without real notation one introduces a huge cognitive load.  
> Notation makes a massive difference.  Any classically trained musician sees 
> this instantly.  If we replaced musical notation with sequences of two or 
> three ASCII characters it would become an incomprehensible mess.

Really? Because I’ve been using ABC notation for music for decades, and it’s 
never felt like an incomprehensible mess to me. While it doesn’t look as nice 
as normal musical notation, it’s even easier to learn, and it doesn’t take that 
long to be able to read it. And the fact that I can enter it a lot faster—and, 
more importantly, that I can enter it in exactly the same way in any email 
client, text editor, etc.—makes it hugely useful. When I’m out somewhere and 
think of a riff, I can just pull out my phone, fire up the Notes app, and type 
in the ABC. That’s visual enough for me to get it out of my consciousness and 
then read it off and hear it in my head so I can come up with a bassline to go 
with it. Or to read back a few days later and play it (after a couple seconds 
of processing, which admittedly isn’t as good as musical notation—but a lot 
better than nothing, which is what I’d have if I insisted on musical notation 
or nothing).

For some reason, ABC is something that only musicologists learn, not musicians, 
unless you happen to get lucky and find it by accident. But many other 
musicians come up with their own idiosyncratic systems for the same purpose.

You also mentioned math. The last visual editor for math I didn’t absolutely 
hate was my HP42. And I’d still rather use ASCIIMathML over that, much less 
something like Microsoft Word’s equation editor. It’s great to have something 
like MathJax on-the-fly rendering, but when I don’t have that, I can read x_0, 
or (-b +- sqrt(b^2 – 4ac))/(2a). Some things are too complicated for 
ASCIIMathML, but for them, I’ll use TeX; there’s no way I’d figure it out in a 
symbolic editor without a whole lot of painful trial and error.
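
(For reference, that quadratic-root expression spelled out in TeX:)

    x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}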

The fact that I can enter Python code as plain text is even more useful than 
music and math.

Sure, in your IDE, alt-[ types an arrow. But what happens when you type alt-[ 
in your iPhone mail app, in emacs or vi over an ssh session to a deployed 
server, in web text boxes like the the answer box on StackOveflow, in your 
favorite editor’s markdown mode, in a WYSIWYG word processor, or even just in 
someone else’s IDE? And I do all those things a lot more often with source code 
than I do with music or equations.

How are you entering the arrows in these emails? Do you fire up your IDE, type 
an arrow, and then copy and paste it into your webmail or something like that? 
If you were on your phone instead of your laptop, would you go search the web 
for the Unicode arrow and copy and paste from there? 

Does any of that really seem as easy to you as typing := the exact same way in 
every editor in the world?

I wouldn’t mind an editor that had the same kind of “presentation mode” as 
ma

[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread Abe Dillon
>
> No, using "<--" is going in the wrong direction.  We want notation, not
> ASCII soup.

This distinction between notation and soup seems pretty subjective. What is
the difference between soup and notation? In my mind it has a lot to do
with familiarity. I watched that video about programming Conway's Game of
Life in APL and it looks like an incomprehensible soup of symbols to me.

> Another example of ASCII soup is regex.

That's interesting, I feel the same way. I can read most code pretty
quickly, but as soon as I hit a regex it takes me 50x as long to read and I
have to crack open a reference because I can never remember the notation.
Luckily someone came up with a solution called verbal expressions, which trade
hard-to-remember symbols for easy-to-understand words! (though I think the
Python implementation smacks of Java idioms)

I'm sure there are people who work with regular expressions on such a
regular basis that they've become fluent, but when you require such deep
immersion in the language before the symbols make sense to you, it's a huge
barrier to entry. You can't then act all confused about why your favorite
language never caught on.

> Without real notation one introduces a huge cognitive load.

What is "real notation". This sounds like a no-true-Scotsman fallacy
. Has everyone on this
message board been communicating with fake ASCII notation this entire time?
Cognative load can come from many different places like:

   1. Having to remember complex key combinations just to get your thoughts
   into code
   2. Having to memorize what each of thousands of symbols do because
   there's no way to look them up in a search engine
   3. Knowing no other notation system that even slightly resembles APL.
   I mean, I know some esoteric mathematics, but I've never seen anything
   that looks even remotely like:
   life←{↑1 ⍵∨.∧3 4=+/,¯1 0 1∘.⊖¯1 0 1∘.⌽⊂⍵}
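
(That one-liner computes one generation of Conway's Game of Life. For contrast,
a rough NumPy rendering of the same computation -- a sketch, assuming a
wrap-around grid of 0/1 cells as in the APL demo:)

    import numpy as np

    def life_step(board):
        # Sum the eight neighbors by rolling the grid, the NumPy cousin of
        # the ¯1 0 1 rotations (∘.⊖ and ∘.⌽) in the APL one-liner.
        neighbors = sum(np.roll(np.roll(board, dy, axis=0), dx, axis=1)
                        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                        if (dy, dx) != (0, 0))
        # Exactly 3 neighbors -> birth; 2 and already alive -> survival.
        # (The APL version tests 3 and 4 because it counts the cell itself.)
        return ((neighbors == 3) | (board.astype(bool) & (neighbors == 2))).astype(int)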

A big part of Python's philosophy is that you read code way more often than
you write code, so we should optimize readability. As one Reddit commenter
put it:

> APL is such a powerful language. APL is also a powerfully write-only
> language.


And I don't even fully agree there because it somehow manages to be almost
as difficult to write.

> Typing these symbols isn't a problem at all.  For example, in NARS2000, a
> free APL interpreter I use, the assignment operator "←" is entered simply
> with "Alt + [".  It takes seconds to internalize this and never think about
> it again.


For some people. I, myself, have a learning disability and often need to
look at my keyboard. The relationship between "←" and "[" doesn't seem
obvious at all.

> If you download NARS2000 right now you will know how to enter "←"
> immediately because I just told you how to do it.  You will also know
> exactly what it does.  It's that simple.


You know what's even simpler and requires even less cognitive load?  Typing
ASCII characters...

> The other interesting thing about notation is that it transcends language.


The word "notation" refers to symbols, abbreviations, and short-hand that
make up domain-specific languages. Nothing about notation "transcends"
language, notation is a component of language. Math is the study of
patterns. Mathematical notation is what we use to write the language of
patterns, to describe different patterns and communicate ideas about
patterns. There used to be different mathematical languages based on
culture, just like spoken languages. There's no magical property that made
Roman numerals or Arabic numerals just make sense to people from other
cultures, they had to learn each others notation just like any other
language and eventually settled on Arabic numerals. Maybe things would have
gone differently if the Mayans had a say.

> It has been my experience that people who have not had the experience
> rarely get it


A pattern I've seen in my experience is that some person or group will put
forth a pretty good idea, and others become dogmatic about that idea, lose
sight of pragmatism, and try to push the idea beyond its practical
applicability. I'm not saying this is you. I haven't yet read the Ken
Iverson paper (I will). My suspicion at this point and after seeing the APL
code demos is that there's probably plenty of good ideas in there, but APL
doesn't strike me as pragmatic in any sense.

On Wed, Nov 6, 2019 at 11:06 AM Martin Euredjian via Python-ideas <
python-ideas@python.org> wrote:

> Thanks for your feedback.  A few comments:
>
> > I do not consider these two things conceptually equivalent. In Python
> the identifier ('a' in this case) is just label to the value
>
> I used APL professionally for about ten years.  None of your objections
> ring true.  A simple example is had from mathematics.  The integral symbol
> conveys and represents a concept.

[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread Andrew Barnert via Python-ideas
On Nov 6, 2019, at 21:53, Martin Euredjian via Python-ideas 
 wrote:
> 
> I've had this kind of a conversation with many people in the 30+ years since 
> I learned APL and 20+ years since I stopped using it professionally.  It has 
> been my experience that people who have not had the experience rarely get it, 
> and, sadly, more often than not, they become hostile to the implication that 
> there might actually be a better way to translate ideas into computer 
> executable code.

I haven’t used APL professionally, but I’ve heard Iverson talk about it, and 
read articles that are supposed to sell me on it, and I think I get the point 
of it. And the point is that it gets you about halfway to where Haskell does 
and then leaves you there.

In traditional languages you have to write loops and then turn your functions 
inside out. In Python/R/Mathematica/C++/etc. you can do things array-wise, but 
only the things the language/library designer thought of. (Well, you can use 
vectorize/frompyfunc/etc., which does have the right semantics, but it doesn’t 
look as friendly as the stuff NumPy comes with, and it’s slow.) GLSL has 
different limitations, and you often do have to think about the parallel 
“virtual loops” to do anything nontrivial. APL doesn’t have either of those 
problems; you can—and are actively encouraged to—think about how to combine 
operations independently from how to apply the combined operation.
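
(As a concrete aside, the frompyfunc escape hatch mentioned above looks like
this -- a sketch, slow but semantically general:)

    import numpy as np
    from fractions import Fraction

    # Lift an arbitrary Python function to an elementwise ufunc over object
    # arrays, so it also works on bigints, Fractions, Decimals, ...
    fadd = np.frompyfunc(lambda a, b: a + b, 2, 1)
    xs = np.array([Fraction(1, 3), Fraction(1, 6), Fraction(1, 2)], dtype=object)
    print(fadd.reduce(xs))   # 1 -- a +/-style fold over non-native numbers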

But Haskell encourages that too—and further abstraction beyond it. Of course 
like J, its operators are strings of ASCII symbols (or identifiers in 
backticks), but that doesn’t change what the abstractions are, just how they’re 
spelled. For example, it’s cool that in APL I can lift + to sum just by writing 
+/, so I can sum an array of ints with +/x. But what if x is an array of 
bigints or fractions or decimal64s? As far as I know, there’s no way to 
implement such things in a way that + works on them. I know there’s a box 
operator that acts like an array by reading from stdin, but can I write 
something similar that reads from an open text file, parses it as CSV, parses 
the third column of each row as an int, and pass those ints to +/ too? What if 
I want to sum an array of maybe ints, or a maybe array of ints, and get back a 
maybe int? What if I want to pass around + or +/ or / as first-class objects? 
Can I even apply an operator to an operator, or do I only get second-order 
rather than arbitrarily-higher-order functionality? Can I write a new operator 
that unlifts and relifts any function in a way that turns this one from sum to 
running-sums?
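
(For what it's worth, that last question has a direct Python spelling -- a
loose analogue, not APL:)

    from functools import reduce
    from itertools import accumulate
    from operator import add

    nums = [3, 1, 4, 1, 5]
    print(reduce(add, nums))            # 14 -- the +/ fold, i.e. sum
    print(list(accumulate(nums, add)))  # [3, 4, 8, 9, 14] -- the running-sums relift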

Is there something about all of those limitations that crystallizes your 
thinking differently than having a more abstract and less restricted version of 
the same abstractions? That’s not inconceivable, but it doesn’t seem likely a 
priori, and I’ve never heard an argument for it, and none of the examples APL 
fans have shown me have convinced me.

And here’s the thing: sometimes Haskell is the right tool for the job, and even 
when it isn’t, learning it made me a better programmer even when I’m using 
other languages—but that doesn’t mean I want Python to be more like Haskell. 
(Well, maybe occasionally in a few minor ways, but I don’t want Python to be 
static-type-driven, or to be pure immutable, or to curry all functions and 
encourage point-free style, or to be lazy and make me trust an optimizer to 
figure out when to reify values, or to let me define hundreds of arbitrary 
operators, etc.) So, what’s different about APL that you want Python to be more 
like APL?
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/3ZF7D5R3ET5EERTM3HHCLWLSAEE7QE3T/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread Martin Euredjian via Python-ideas
>> No professional thinks that "a = some_object" results in a bucket being
>> filled with whatever the object might contain.
> That's exactly how variables work in many common languages like C
Nope, not true.  Bytes, words, integers and that's about it.  Anything else is 
a pointer to a relevant data structure somewhere in memory.  I think I can say 
this is true of the vast majority of languages.  Exceptions are cases like 
Python and, say, Objective-C, or, in general, where the philosophy is that 
everything is an object.  Nobody is filling buckets with anything every time 
there's an assignment, at least no language I have ever used.  At the end of 
the day, it's a pointer to a chunk-o-memory with a header describing what's in 
there, how many, etc.  In more complex cases it's a pointer to a linked list or 
a pointer to a chunk of memory filled with pointers to other chunks of memory.  
This has been the case from almost the beginning of time.  Let's put it this 
way, I was doing this kind of thing when I was programming IMSAIs with toggle 
switches and keeping track of variable names, memory locations and contents on 
a notebook by hand, paper and pencil.

-Martin

On Wednesday, November 6, 2019, 02:01:27 PM PST, Abe Dillon 
 wrote:  
 
 
Question:  Where did APL's  "←" operator come from?

Doesn't matter. If your notation can't stand on its own without a history 
lesson, then it's not great.
 
A number of APL's elements came from a notation developed to describe the 
operation of IBM processors back in the 1960's.  In many ways it meant "this 
name is assigned to this object", to paraphrase your statement.
In "many ways"? Not exactly? How does this make it better? It still sounds 
counterintuitive. If it really means "this name references this object", why 
not a left arrow?


I mean, how does "a = 23" which is read "a is equal to 23" or "a = some_object" 
which is literally read "a is equal to some_object" say "a is a label that is 
attached to 23" or "a is a label that is attached to some_object"?
I would say that it's not perfect notation. I read "a = 23" as "a is 23" in the 
same way that I would say "Abe is my name". It doesn't describe the 
relationship well, but it's an acceptable, pragmatic use of familiar notation 
and it doesn't run *counter to* the actual mechanics of the language. As far as 
symbols go, an arrow to the left would be closest to representing the mechanics 
of the language, "=" is a compromise, and "←" is backwards. The object doesn't 
refer to the name, the name refers to the object.


No professional thinks that "a = some_object" results in a bucket being filled 
with whatever the object might contain. 
That's exactly how variables work in many common languages like C and is 
actually a common misconception even among people with years of experience with 
Python.

Check out this Stack Overflow question that asks what the difference is and the 
confused comments that follow saying there is no difference or even the fact 
that the question was closed because people didn't understand what it was 
asking. It's pretty bonkers to me that professional programmers don't know the 
difference or understand a very basic question, but it's the world we live in.


In fact, one could very well make the argument that using the "=" character for 
this operation is misleading because the left side is not set to be equal to 
the right side.  Even worse, these pointers in Python are immutable. 
The "=" doesn't imply immutability. That's actually what the walrus operator 
":=" implies. "pi := circumference/diameter" means "pi is defined as the ratio 
of the circumference to the diameter".

For the most part, implying that the name equals the object it references is 
fine. Implying that you're somehow putting the object into the variable or 
attaching the object to the variable (instead of the other way around) is 
backwards.


Someone coming from a whole range of languages sees the "=" sign to mean 
something very different.  For example, there are a bunch of languages where 
incrementing or performing math on the pointer's address is normal and 
fundamental to the language.  So, "=" in Python is not equal to "=" in many 
languages.  Why are we using the same symbol and creating this confusion? 
For many many reasons.


 If your response is something like "people learn the difference", well, you 
just made my point.  People learn.
That's not sufficient justification. If it were, you could use that same logic 
to justify adding any symbol to any language. I find musical notation woefully 
lacking. There's no way to denote clapping or snapping fingers or the glottal 
stops that Regina Spektor is so fond of. Maybe I should add a fish symbol, a 
duck symbol, and the guy walking sideways from Egyptian Hieroglyphics to the 
standard musical notation to represent those sounds. People will learn the 
difference, right?


I've had this kind of a conversation with many people in the 30+ years since I 
learned APL and 20+ years since I stopped using it professionally.

[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread Andrew Barnert via Python-ideas
On Nov 6, 2019, at 19:52, Mike Miller  wrote:
> 
>> On 2019-11-06 05:40, Andrew Barnert via Python-ideas wrote:
>> While we’re at it, when you replace both = and := with an arrow, what do you 
>> do with += and the other augmented assignments? I can’t think of a 
>> single-character symbol that visually represents that meaning. If you leave 
>> it as + followed by an arrow, or try to come up with some new digraph, now 
>> we have the worst of both worlds, Unicode soup: operators that are digraphs 
>> and not visually meaningful while also not being typeable.
> 
> There is:
> 
>  U+2B32  ⬲  LEFT ARROW WITH CIRCLED PLUS

Once we go to Unicode and lots of operators, I doubt it’ll be long before we 
use circled plus for something. At which point this is pretty misleading as a 
meaning for “plus then assign back” rather than “circle-plus then assign back”.

> But there would need to be more.  I didn't find any obvious for: -=

So going to Unicode and lots of operators doesn’t actually save us from symbol 
soup at all (unless we want to give up functionality the language already has), 
it just adds new problems on top of the soup problem.

___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/FPKMDIZOPS5K5LEJ3VFEZP7AGK4JAR5P/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread Chris Angelico
On Thu, Nov 7, 2019 at 8:47 AM Greg Ewing  wrote:
>
> Andrew Barnert via Python-ideas wrote:
> > On Nov 6, 2019, at 08:59, Chris Angelico  wrote:
> >
> >> No, because "x <-- y" is already legal syntax
> >
> > You could handle that by making the grammar more complicated.
>
> Or just have the tokeniser treat "<--" as a single token, the
> same way that it treats "<=" as a single token rather than
> "<" followed by "=". It would be a backwards-incompatible
> change (if you really wanted "less than minus minus something"
> you'd have to put a space in somewhere) but replacing the
> assignment operator is already a much bigger one.
>

To clarify: I wasn't saying that it's fundamentally impossible to have
these kinds of parsing rules, but that it's backward incompatible.
Notably, even though this syntax is fairly unlikely to come up, it
means that anyone using "<--" as an assignment operator will have to
worry about older Python versions misinterpreting it. If you create a
brand new operator out of something that's currently invalid syntax,
then it's easy - you get an instant compilation error on an older
interpreter. With this, it might sometimes result in a runtime
NameError or TypeError, and even worse, might just silently do the
wrong thing. That's why Python 3.9 still won't let you write "except
ValueError, IndexError:" - you *have* to parenthesize the tuple,
because the comma syntax had a different meaning in Python 2 (the
"except Exception as name:" syntax was backported to 2.6/2.7 but the
older syntax is of course still valid). There is no way that you can
accidentally run your code on the wrong Python and have it silently
assign to IndexError instead of catching two types.
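
Both behaviors are easy to check on a current interpreter (a quick sketch):

    x, y = 5, 3
    print(x <-- y)   # legal today: parsed as x < (-(-y)), i.e. 5 < 3 -> False

    try:
        [][0]
    except (ValueError, IndexError):   # the tuple must be parenthesized in Python 3
        print("caught")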

ChrisA
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/EN3YZ7MUSOFBEX3QTT5OYCPDZCQVWN3I/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread Abe Dillon
>
> Question:  Where did APL's  "←" operator come from?
>
Doesn't matter. If your notation can't stand on its own without a history
lesson, then it's not great.


> A number of APL's elements came from a notation developed to describe the
> operation of IBM processors back in the 1960's.  In many ways it meant
> "this name is assigned to this object", to paraphrase your statement.

In "many ways"? Not exactly? How does this make it better? It still sounds
counterintuitive. If it really means "this name references this object",
why not a left arrow?

> I mean, how does "a = 23" which is read "a is equal to 23" or "a =
> some_object" which is literally read "a is equal to some_object" say "a is
> a label that is attached to 23" or "a is a label that is attached to
> some_object"?

I would say that it's not perfect notation. I read "a = 23" as "a is 23" in
the same way that I would say "Abe is my name". It doesn't describe the
relationship well, but it's an acceptable, pragmatic use of familiar
notation and it doesn't run *counter to* the actual mechanics of the
language. As far as symbols go, an arrow to the left would be closest to
representing the mechanics of the language, "=" is a compromise, and "←" is
backwards. The object doesn't refer to the name, the name refers to the
object.

> No professional thinks that "a = some_object" results in a bucket being
> filled with whatever the object might contain.

That's exactly how variables work in many common languages like C and is
actually a common misconception even among people with years of experience
with Python.

Check out this Stack Overflow question that asks what the difference is and
the confused comments that follow
saying there is no difference or even the fact that the question was closed
because people didn't understand what it was asking. It's pretty bonkers to
me that professional programmers don't know the difference or understand a
very basic question, but it's the world we live in.
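
The confusion usually reduces to aliasing versus copying, something like this
(my own illustration, not the linked question):

    a = [1, 2, 3]
    b = a        # assignment: b is just another name for the same list
    c = a[:]     # slicing builds a new list -- a shallow copy
    a.append(4)
    print(b)     # [1, 2, 3, 4]
    print(c)     # [1, 2, 3]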

> In fact, one could very well make the argument that using the "=" character
> for this operation is misleading because the left side is not set to be
> equal to the right side.  Even worse, these pointers in Python are
> immutable.

The "=" doesn't imply immutability. That's actually what the walrus
operator ":=" implies. "pi := circumference/diameter" means "pi is defined
as the ratio of the circumference to the diameter".
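
(For the record, Python's walrus is an expression-level binding rather than a
mathematical definition; a minimal sketch of the use case PEP 572 added it for:)

    import random

    # bind and test in a single expression
    while (roll := random.randint(1, 6)) != 6:
        print("rolled", roll)
    print("rolled the 6")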

For the most part, implying that the name equals the object it references
is fine. Implying that you're somehow putting the object into the variable
or attaching the object to the variable (instead of the other way around)
is backwards.

> Someone coming from a whole range of languages sees the "=" sign to mean
> something very different.  For example, there are a bunch of languages
> where incrementing or performing math on the pointer's address is normal
> and fundamental to the language.  So, "=" in Python is not equal to "=" in
> many languages.  Why are we using the same symbol and creating this
> confusion?

For many many reasons.

> If your response is something like "people learn the difference", well,
> you just made my point.  People learn.

That's not sufficient justification. If it were, you could use that same
logic to justify adding any symbol to any language. I find musical notation
woefully lacking. There's no way to denote clapping or snapping fingers or
the glottal stops that Regina Spektor is so fond of. Maybe I should add a
fish symbol, a duck symbol, and the guy walking sideways from Egyptian
Hieroglyphics to the standard musical notation to represent those sounds.
People will learn the difference, right?

> I've had this kind of a conversation with many people in the 30+ years
> since I learned APL and 20+ years since I stopped using it professionally.

 Oh, really? You programmed APL for 10 years?! Did you go to Yale Mr.
Kavanaugh? You can cut the arguments from Authority. They're worth nothing.

> Look, we don't have to agree, and, frankly, you seem to be getting rattled.

I'm genuinely curious what makes you think I'm "rattled"? I'm not.


On Wed, Nov 6, 2019 at 2:54 PM Martin Euredjian via Python-ideas <
python-ideas@python.org> wrote:

> > I don't think you understood the point about APL's arrow assignment
> operator being counterintuitive in Python.
>
> I understood this just fine.  I happen to think your argument in this
> regard is neither sound nor valid.
>
> Question:  Where did APL's  "←" operator come from?
>
> A number of APL's elements came from a notation developed to describe the
> operation of IBM processors back in the 1960's.  In many ways it meant
> "this name is assigned to this object", to paraphrase your statement.
>
> I mean, how does "a = 23" which is read "a is equal to 23" or "a =
> some_object" which is literally read "a is equal to some_object" say "a is
> a label that i

[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread Greg Ewing

Andrew Barnert via Python-ideas wrote:

On Nov 6, 2019, at 08:59, Chris Angelico  wrote:


No, because "x <-- y" is already legal syntax


You could handle that by making the grammar more complicated.


Or just have the tokeniser treat "<--" as a single token, the
same way that it treats "<=" as a single token rather than
"<" followed by "=". It would be a backwards-incompatible
change (if you really wanted "less than minus minus something"
you'd have to put a space in somewhere) but replacing the
assignment operator is already a much bigger one.

--
Greg
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/SK2UXJ4VLHWCVJN54Q4ZN37BO3ER33B6/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread Chris Angelico
On Thu, Nov 7, 2019 at 7:21 AM Abe Dillon  wrote:
>> None of your objections ring true.  A simple example is had from 
>> mathematics.  The integral symbol conveys and represents a concept.  Once 
>> the practitioner is introduced to the definition of that symbol, what it 
>> means, he or she uses it.  It really is as simple as that; this is how our 
>> brains work.  That's how you recognize the letter "A" as corresponding to a 
>> sound and as part of words.  This is how, in languages such as Chinese, 
>> symbols, notation, are connected to meaning.  It is powerful and extremely 
>> effective.
>
> I don't think you understood the point about APL's arrow assignment operator 
> being counterintuitive in Python. In Python: variables are names assigned to 
> objects *not* buckets that objects are stored in. Using a notation that 
> implies that objects are assigned to variables encourages a broken 
> understanding of Python's mechanics.
>

For the record, I actually don't think that this is much of a problem.
We have notions of "assignment" and "equality" that don't always have
the exact same meanings in all contexts, and our brains cope. The
sense in which 2/3 is equal to 4/6 is not the same as the one in which
x is equal to 7, and whether or not you accept that the (infinite) sum
of all powers of two is "equal to" -1 depends on your interpretation
of equality. You might think that after an assignment, the variable is
equal to that value - but you can say "x = x + 1" and then x will not
be equal to x + 1, because assignment is temporal in nature. People
will figure this out regardless of the exact spelling of either
equality or assignment.

> As to the role of ML and AI in all of this: These are tools that will allow 
> greater abstraction. Assuming more symbols will greatly enhance programing in 
> the future is like assuming that more opcodes will greatly enhance programing 
> in the future. AI and ML, if anything, will allow us to define the problems 
> we want to solve in something much closer to natural language and let the 
> computers figure out how that translates to code. What kind of code? Python? 
> C++? APL? x86? RISC-V? Who cares?!
>

Agreed. I would suggest, though, that this isn't going to be anything
new. It's just a progression that we're already seeing - that
programming languages are becoming more abstract, more distant from
the bare metal of execution. Imagine a future in which we dictate to a
computer what we want it to do, and then it figures out (via AI) how
to do it... and now imagine what a present day C compiler does to
figure out what sequence of machine language instructions will achieve
the programmer's stated intention. Here's a great talk discussing the
nature of JavaScript in this way:

https://www.destroyallsoftware.com/talks/the-birth-and-death-of-javascript

AI/ML might well be employed *already* to implement certain
optimizations. I wouldn't even know; all I know is that, when I ask
the computer to do something, it does it. That's really all that
matters!

ChrisA
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/PFYPVNG4NJDJITUNXOZBMCS5OK6Q2ZSP/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread Martin Euredjian via Python-ideas
> I don't think you understood the point about APL's arrow assignment operator
> being counterintuitive in Python.
I understood this just fine.  I happen to think your argument in this regard is 
neither sound nor valid.

Question:  Where did APL's  "←" operator come from?

A number of APL's elements came from a notation developed to describe the 
operation of IBM processors back in the 1960's.  In many ways it meant "this 
name is assigned to this object", to paraphrase your statement.

I mean, how does "a = 23" which is read "a is equal to 23" or "a = some_object" 
which is literally read "a is equal to some_object" say "a is a label that is 
attached to 23" or "a is a label that is attached to some_object"?

This is no different from the concept of pointers.  A pointer stores an address 
to some data structure somewhere.  No professional thinks that "a = 
some_object" results in a bucket being filled with whatever the object might 
contain.  It's a pointer.  We are assigning a pointer.  We are storing an 
address that points to where the data lives.

In fact, one could very well make the argument that using the "=" character for 
this operation is misleading because the left side is not set to be equal to 
the right side.  Even worse, these pointers in Python are immutable.  Someone 
coming from a whole range of languages sees the "=" sign to mean something very 
different.  For example, there are a bunch of languages where incrementing or 
performing math on the pointer's address is normal and fundamental to the 
language.  So, "=" in Python is not equal to "=" in many languages.  Why are we 
using the same symbol and creating this confusion?  

If your response is something like "people learn the difference", well, you 
just made my point.  People learn.

I've had this kind of a conversation with many people in the 30+ years since I 
learned APL and 20+ years since I stopped using it professionally.  It has been 
my experience that people who have not had the experience rarely get it, and, 
sadly, more often than not, they become hostile to the implication that there 
might actually be a better way to translate ideas into computer executable 
code.  That's just reality and I am not going to change it.

Look, we don't have to agree, and, frankly, you seem to be getting rattled.  I 
want no part of that.  I didn't come here to change the Python universe.  Like 
I said, I am nobody, so, yeah, forget it.  Don't waste your time on me or my 
ridiculous ideas.  I just wanted to share an opinion, worthless as it might be.

Thanks,
-Martin


On Wednesday, November 6, 2019, 12:18:21 PM PST, Abe Dillon 
 wrote:  
 
 
I used APL professionally for about ten years.
Yes, you've stated that already.


None of your objections ring true.  A simple example is had from mathematics.  
The integral symbol conveys and represents a concept.  Once the practitioner is 
introduced to the definition of that symbol, what it means, he or she uses it.  
It really is as simple as that; this is how our brains work.  That's how you 
recognize the letter "A" as corresponding to a sound and as part of words.  
This is how, in languages such as Chinese, symbols, notation, are connected to 
meaning.  It is powerful and extremely effective.
I don't think you understood the point about APL's arrow assignment operator 
being counterintuitive in Python. In Python: variables are names assigned to 
objects *not* buckets that objects are stored in. Using a notation that implies 
that objects are assigned to variables encourages a broken understanding of 
Python's mechanics.

A simple example is had from mathematics.  The integral symbol conveys and 
represents a concept.  Once the practitioner is introduced to the definition of 
that symbol, what it means, he or she uses it.  It really is as simple as that; 
this is how our brains work.  That's how you recognize the letter "A" as 
corresponding to a sound and as part of words.  This is how, in languages such as 
Chinese, symbols, notation, are connected to meaning.  It is powerful and 
extremely effective.
The fact that people learn and then become comfortable with symbols doesn't 
imply that choosing which symbols to adopt into a language is trivial. You can 
follow the evolution of languages over time and find that they often eject 
characters that serve little use or cause confusion, like the English character 
"thorn". 


The use of notation as a tool for thought is a powerful concept that transcends 
programming.  Mathematics is a simple example. So is music.  Musical notation 
allows the expression of ideas and massively complex works as well as their 
creation.  In electronics we have circuit diagrams, which are not literal 
depictions of circuits but rather a notation to represent them, to think about 
them, to invent them.
You don't need to convince people of the power of abstraction or the utility of 
domain-specific languages. Such a general statement doesn't support the 
adoption of any specific change.

[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread Abe Dillon
>
> I used APL professionally for about ten years.

Yes, you've stated that already.

> None of your objections ring true.  A simple example is had from
> mathematics.  The integral symbol conveys and represents a concept.  Once
> the practitioner is introduced to the definition of that symbol, what it
> means, he or she uses it.  It really is as simple as that; this is how our
> brains work.  That's how you recognize the letter "A" as corresponding to a
> sound and as part of words.  This is how, in languages such as Chinese,
> symbols, notation, are connected to meaning.  It is powerful and extremely
> effective.

I don't think you understood the point about APL's arrow assignment
operator being counterintuitive in Python. In Python: variables are names
assigned to objects *not* buckets that objects are stored in. Using a
notation that implies that objects are assigned to variables encourages a
broken understanding of Python's mechanics.

> A simple example is had from mathematics.  The integral symbol conveys and
> represents a concept.  Once the practitioner is introduced to the
> definition of that symbol, what it means, he or she uses it.  It really is
> as simple as that; this is how our brains work.  That's how you recognize
> the letter "A" as corresponding to a sound and as part of words.  This is
> how, in languages such as Chinese, symbols, notation, are connected to
> meaning.  It is powerful and extremely effective.

The fact that people learn and then become comfortable with symbols doesn't
imply that choosing which symbols to adopt into a language is trivial. You
can follow the evolution of languages over time and find that they often
eject characters that serve little use or cause confusion, like the English
character "thorn".

> The use of notation as a tool for thought is a powerful concept that
> transcends programming.  Mathematics is a simple example. So is music.
> Musical notation allows the expression of ideas and massively complex works
> as well as their creation.  In electronics we have circuit diagrams, which
> are not literal depictions of circuits but rather a notation to represent
> them, to think about them, to invent them.

You don't need to convince people of the power of abstraction or the
utility of domain-specific languages. Such a general statement doesn't
support the adoption of any specific change. You might as well be
advocating for adding Egyptian hieroglyphics to musical notation. We don't
need a lecture on the importance of abstract notation each time a new
syntax is proposed.

> The future of computing, in my opinion, must move away --perhaps not
> entirely-- from ASCII-based typing of words.  If we want to be able to
> express and think about programming at a higher level we need to develop a
> notation.  As AI and ML evolve this might become more and more critical.

I strongly disagree with this.
First of all, mathematical notation, which programming borrows heavily from,
highly favors compactness over clarity. It uses Greek and Latin symbols that
mean different things depending on the field. It uses both left and right
super and sub-scripts sometimes for naming conventions, sometimes to denote
exponentiation. It uses dots and hats and expressions that sit below and/or
above symbols (like in "limit" notation or summations) and all sorts of
other orientations and symbol modifications that are almost impossible to
look up, infix and prefix and postfix notation. It makes picking up any
given mathematical paper a chore to comprehend because so much context is
assumed and not readily accessible.

Why not use a more consistent notation like add(x, y) instead of x + y when
we know addition is a function and all other functions (usually) follow the
f(x, y) notation?
Because math is old. It predates the printing press and other tools that
make more explicit and readable notation possible. It was much more
important, hundreds of years ago, that your ideas be expressible in a
super-concise form, even to the detriment of readability. That's not the
only reason, of course, but it is a pretty big reason. I submit that most
mathematical papers would benefit from having their formulas re-written in
something like a programming language, with more explicit variable names and
consistent notation.
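
Both spellings are in fact available in Python today via the standard
operator module, which makes the comparison concrete:

    from operator import add, mul

    x, y = 2, 3
    infix = x + y * 2             # conventional notation
    prefix = add(x, mul(y, 2))    # fully consistent f(x, y) notation
    assert infix == prefix == 8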

As to the role of ML and AI in all of this: These are tools that will allow
greater abstraction. Assuming more symbols will greatly enhance programming
in the future is like assuming that more opcodes will greatly enhance
programming in the future. AI and ML, if anything, will allow us to define
the problems we want to solve in something much closer to natural language
and let the computers figure out how that translates to code. What kind of
code? Python? C++? APL? x86? RISC-V? Who cares?!

That's all I have time for, for now; I may pick this up later.

On Wed, Nov 6, 2019 at 11:06 AM Martin Euredjian via Python-ideas <
python-ideas@python.org> wrote:

> Thanks for your feedback.  A few comments:

[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread Chris Angelico
On Thu, Nov 7, 2019 at 4:05 AM Martin Euredjian via Python-ideas
 wrote:
> I used APL professionally for about ten years.  None of your objections ring 
> true.  A simple example is had from mathematics.  The integral symbol conveys 
> and represents a concept.  Once the practitioner is introduced to the 
> definition of that symbol, what it means, he or she uses it.  It really is as 
> simple as that; this is how our brains work.  That's how you recognize the 
> letter "A" as corresponding to a sound and as part of words.  This is how, in 
> languages such as Chinese, symbols, notation, are connected to meaning.  It 
> is powerful and extremely effective.
>
> The use of notation as a tool for thought is a powerful concept that 
> transcends programming.  Mathematics is a simple example. So is music.  
> Musical notation allows the expression of ideas and massively complex works 
> as well as their creation.  In electronics we have circuit diagrams, which 
> are not literal depictions of circuits but rather a notation to represent 
> them, to think about them, to invent them.
>

At this point, you've solidly established the need for notation. Yes,
I think we all agree; in fact, programming *in general* is a matter of
finding a notation to represent various concepts, and then using that
notation to express more complex concepts.

> The future of computing, in my opinion, must move away --perhaps not 
> entirely-- from ASCII-based typing of words.  If we want to be able to 
> express and think about programming at a higher level we need to develop a 
> notation.  As AI and ML evolve this might become more and more critical.
>

But this does not follow. English, as a language, is almost entirely
representable within ASCII, and we don't hear people saying that they
can't express their thoughts adequately without introducing "ő" and
"火"; people just use more letters. There's no fundamental reason that
Python is unable to express the concept of "assignment" without
reaching for additional characters.

> APL, sadly, was too early.  Machines of the day were literally inadequate in 
> almost every respect.  It is amazing that the language went as far as it did. 
>  Over 30+ years I have worked with over a dozen languages, ranging from low 
> level machine code through Forth, APL, Lisp, C, C++, Objective-C, and all the 
> "modern" languages such as Python, JS, PHP, etc.  Programming with APL is a 
> very different experience.  Your mind works differently.  I can only equate 
> it to writing orchestral scores in the sense that the symbols represent very 
> complex textures and structures that your mind learns to imagine and 
> manipulate in real time.  You think about spinning, crunching, slicing and 
> manipulating data structures in ways you never really think about when using 
> any other language.  Watch the videos I link to below for a taste of these 
> ideas.
>

Please, explain to me how much better Python would be if we used "≤"
instead of "<=". If I'm reading something like "if x <= y: ...", I
read the two-character symbol "<=" as a single symbol.
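
That reading matches how Python itself treats it; a quick check with the
standard tokenize module shows "<=" arriving at the parser as one token:

    import io
    import tokenize

    for tok in tokenize.generate_tokens(io.StringIO("if x <= y: pass").readline):
        if tok.type == tokenize.OP:
            # prints "<= LESSEQUAL" and ": COLON" -- each a single token
            print(tok.string, tokenize.tok_name[tok.exact_type])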

> Anyhow, obviously the walrus operator is here to stay.  I am not going to 
> change anything.  I personally think this is sad and a wasted opportunity to 
> open a potentially interesting chapter in the Python story; the mild 
> introduction of notation and a path towards evolving a richer notation over 
> time.
>
> > Second point, I can write := in two keystrokes, but I do not have a 
> > dedicated key for the arrow on my keyboard. Should '<--' also be an 
> > acceptable syntax?
>
> No, using "<--" is going in the wrong direction.  We want notation, not ASCII 
> soup.  One could argue even walrus is ASCII soup.  Another example of ASCII 
> soup is regex.  Without real notation one introduces a huge cognitive load.  
> Notation makes a massive difference.  Any classically trained musician sees 
> this instantly.  If we replaced musical notation with sequences of two or 
> three ASCII characters it would become an incomprehensible mess.
>

The trouble with an analogy to music is that it would take a LOT more
than 2-3 ASCII characters to represent a short section of musical
score. A closer analogy would be mathematics, where the typical
blackboard-friendly notation contrasts with the way that a programming
language would represent it. The problem in mathematical notation is
that there simply aren't enough small symbols available, so they have
to keep getting reused (Greek letters in particular end up getting a
lot of different meanings).

When your notation is built on an expectation of a two-dimensional
sketching style, it makes a lot of sense to write a continued fraction
with lots of long bars and then a diagonal "..." at the end, or to
write an infinite sum with a big sigma at the beginning and some small
numbers around it to show what you're summing from and to, etc, etc.
When your notation is built on the expectation of a keyboard and lines
of text, it makes ju

[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread David Mertz
Unfortunately, my device does not display LEFT ARROW WITH CIRCLED PLUS.
Nor, obviously, do I have any way to enter it easily.

On Wed, Nov 6, 2019, 2:05 PM Mike Miller  wrote:

>
> On 2019-11-06 05:40, Andrew Barnert via Python-ideas wrote:
> > While we’re at it, when you replace both = and := with an arrow, what do
> you do with += and the other augmented assignments? I can’t think of a
> single-character symbol that visually represents that meaning. If you leave
> it as + followed by an arrow, or try to come up with some new digraph, now
> we have the worst of both worlds, Unicode soup: operators that are digraphs
> and not visually meaningful while also not being typeable.
>
> There is:
>
>U+2B32   ⬲   LEFT ARROW WITH CIRCLED PLUS
>
> But there would need to be more.  I didn't find any obvious one for: -=
>
> -Mike
>
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/IZVX762E353YYQGFIHI4HKVQJ3IXZEHG/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread Mike Miller


On 2019-11-06 05:40, Andrew Barnert via Python-ideas wrote:

While we’re at it, when you replace both = and := with an arrow, what do you do 
with += and the other augmented assignments? I can’t think of a 
single-character symbol that visually represents that meaning. If you leave it 
as + followed by an arrow, or try to come up with some new digraph, now we have 
the worst of both worlds, Unicode soup: operators that are digraphs and not 
visually meaningful while also not being typeable.


There is:

  U+2B32   ⬲   LEFT ARROW WITH CIRCLED PLUS

But there would need to be more.  I didn't find any obvious one for: -=

-Mike


___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/RBEZYYBQYQX6ZYF32I4OKJUZ7SW5Y6K5/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread Mike Miller
It's too late for this one, but I'd be open to allowing Unicode operators.  It 
is always pooh-poohed here, but there are numerous input solutions:



- Having enabled AltGr and a Compose key on my US keyboard, I can type symbols
  like ©, —, …, ø, ə with two or three keystrokes, and diacritics too: café.
  I do this often.

- Use an ASCII notation, such as >= for ≥, or escapes like \eAExpression,
  \eSetUnion, or \u, that are rendered with a tool like "go fmt" or black.

- Word processors typically have a Symbols dialog for such occasions.

- There are simple websites such as http://unicode-search.net/ for finding
  obscure symbols. Python has unicodedata; it would be simple to wire it up.
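
As a sketch of that last point, the lookup itself is a one-liner with the
standard unicodedata module (wiring it into an editor or formatter is the
hypothetical part):

    import unicodedata

    print(unicodedata.lookup("LEFTWARDS ARROW"))   # ←
    print(unicodedata.name("←"))                   # LEFTWARDS ARROW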


-Mike

P.S. Ligatures are another solution from a different angle: perhaps your 
favorite editor could show ← for :=, though it might need a custom font.




On 2019-11-06 09:05, Martin Euredjian via Python-ideas wrote:

Thanks for your feedback.  A few comments:


___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/LG7HPLIYCWR7D3PC7ADZZG4JT3SXIMU3/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread Martin Euredjian via Python-ideas
 Thanks for your feedback.  A few comments:
> I do not consider these two things conceptually equivalent. In Python the 
> identifier ('a' in this case) is just a label to the value
I used APL professionally for about ten years.  None of your objections ring 
true.  A simple example is had from mathematics.  The integral symbol conveys 
and represents a concept.  Once the practitioner is introduced to the 
definition of that symbol, what it means, he or she uses it.  It really is as 
simple as that; this is how our brains work.  That's how you recognize the 
letter "A" as corresponding to a sound and as part of words.  This is how, in 
languages such as Chinese, symbols, notation, are connected to meaning.  It is 
powerful and extremely effective.

The use of notation as a tool for thought is a powerful concept that transcends 
programming.  Mathematics is a simple example. So is music.  Musical notation 
allows the expression of ideas and massively complex works as well as their 
creation.  In electronics we have circuit diagrams, which are not literal 
depictions of circuits but rather a notation to represent them, to think about 
them, to invent them.
The future of computing, in my opinion, must move away --perhaps not entirely-- 
from ASCII-based typing of words.  If we want to be able to express and think 
about programming at a higher level we need to develop a notation.  As AI and 
ML evolve this might become more and more critical.  
APL, sadly, was too early.  Machines of the day were literally inadequate in 
almost every respect.  It is amazing that the language went as far as it did.  
Over 30+ years I have worked with over a dozen languages, ranging from low 
level machine code through Forth, APL, Lisp, C, C++, Objective-C, and all the 
"modern" languages such as Python, JS, PHP, etc.  Programming with APL is a 
very different experience.  Your mind works differently.  I can only equate it 
to writing orchestral scores in the sense that the symbols represent very 
complex textures and structures that your mind learns to imagine and manipulate 
in real time.  You think about spinning, crunching, slicing and manipulating 
data structures in ways you never really think about when using any other 
language.  Watch the videos I link to below for a taste of these ideas.
Anyhow, obviously the walrus operator is here to stay.  I am not going to 
change anything.  I personally think this is sad and a wasted opportunity to 
open a potentially interesting chapter in the Python story; the mild 
introduction of notation and a path towards evolving a richer notation over 
time.
> Second point, I can write := in two keystrokes, but I do not have a dedicated 
> key for the arrow on my keyboard. Should '<--' also be an acceptable syntax?
No, using "<--" is going in the wrong direction.  We want notation, not ASCII 
soup.  One could argue even walrus is ASCII soup.  Another example of ASCII 
soup is regex.  Without real notation one introduces a huge cognitive load.  
Notation makes a massive difference.  Any classically trained musician sees 
this instantly.  If we replaced musical notation with sequences of two or three 
ASCII characters it would become an incomprehensible mess.

Typing these symbols isn't a problem at all.  For example, in NARS2000, a free 
APL interpreter I use, the assignment operator "←" is entered simply with "Alt 
+ [".  It takes seconds to internalize this and never think about it again.  If 
you download NARS2000 right now you will know how to enter "←" immediately 
because I just told you how to do it.  You will also know exactly what it 
does.  It's that simple.
The other interesting thing about notation is that it transcends language.  So 
far all conventional programming languages have been rooted in English.  I 
would argue there is no need for this, since mathematical and musical notations 
have demonstrated that they transcend spoken languages; a programming notation 
could do the same.  Notation isn't just a tool for thought, it adds a universal 
element that is impossible to achieve in any other way.

Anyhow, again, I am not going to change a thing.  I am nobody in the Python 
world.  Just thought it would be interesting to share this perspective because 
I truly think this was a missed opportunity.  If elegance is of any importance, 
having two assignment operators when one could do the job and also evolve the 
language in the direction of an exciting and interesting new path is, at the 
very least, inelegant.  I can only ascribe this to few, if any, of the people 
involved in this process having any real experience with APL.  One has to use APL for 
real work and for at least a year or two in order for your brain to make the 
mental switch necessary to understand it.  Just messing with it casually isn't 
good enough.  Lots of inquisitive people have messed with it, but they don't 
really understand it.

I encourage everyone to read this Turing Award presentation:

"Notation as a Tool of Thought" b

[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread Andrew Barnert via Python-ideas
On Nov 6, 2019, at 08:59, Chris Angelico  wrote:
> 
> No, because "x <-- y" is already legal syntax (it applies unary minus
> to y twice, then checks whether x is less than the result).

You could handle that by making the grammar more complicated. I don’t think 
it’s a good idea at all, but I think it could be done. Without working through 
the details, I think all you’d need is a rule that the assignment statement 
production is tried before the expression production. This would mean that any 
statement made up of a result-ignoring less-than comparison between an 
expression simple enough to be a target and a double-negated expression can 
only be written by putting a space after the < symbol, but that probably 
affects not a single line of code ever written.

But (even assuming I’m right about that), it would mean a new and unique rule 
that you have to internalize to parse Python in your head, or you’ll stumble 
every time you see <-. And I don’t think people would internalize it.

A big part of the reason Python is readable is that its grammar is simple 
(compared to anything but Lisp or Smalltalk), and usually obviously 
unambiguous, to humans, not just to parser programs. Not that Python doesn’t 
already have a few rules that people don’t completely internalize. (I don’t 
know exactly the rules for when you can leave parens off a genexpr, a yield, 
maybe even a tuple; I only know them well enough to write code without pausing, 
and to read 99.99% of the code anyone writes without pausing, and there 
probably are constructions that would be legal if anyone ever wrote them that 
would momentarily throw me for a loop.) But in each case, the advantage is so 
huge (imagine having to write `x, y = (y, x)` everywhere…) that it’s clearly 
worth it. In this case, the advantage would be tiny (instead of having to learn 
that assignment is spelled :=, as in many other languages and non-code 
contexts, I get to learn that assignment is spelled <-, as in many other 
languages and non-code contexts?). So it's definitely not worth it.

___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/6JLVFMZO5KMBLNK72XK7DNKU6JRNY37J/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread Andrew Barnert via Python-ideas
On Nov 6, 2019, at 01:46, martin_05--- via Python-ideas 
 wrote:
> 
> 
> Still, the idea of two assignment operators just didn't sit well with me. 
> That's when I realized I had seen this kind of a problem nearly thirty years 
> ago, with the introduction of "J". I won't get into the details unless 
> someone is interested, I'll just say that J turned APL into ASCII soup. It 
> was and is ugly and it completely misses the point of the very reason APL has 
> specialized notation; the very thing Iverson highlighted in his paper [0].

J didn’t invent having multiple related operators; J was trying to fix the 
problems that were created by APL having multiple related operators. You may 
not like its solution, but then you have to come up with a different solution 
that at least tries to solve them.

Chris already raised the typeability problem.
And let’s pretend for the sake of argument that the display problem has been 
solved.

The remaining problem is that APL had way too many different operators. Too 
many operators to fit into ASCII almost inherently means too many to fit into a 
programmer’s head (and especially a programmer who also works in other 
languages and comes back to APL every so often).

J attempted to solve this by making much heavier and more systematic use of 
operator modifiers. I don’t think it was all that successful in making the 
language easy to keep in your head, but it was enough to inspire other 
languages. We have the elementwise prefix in math languages, Haskell’s banks of 
operators organized as if they had modifiers even though they don’t—and, best 
of all, the discovery that thanks to types, in many cases you don’t actually 
need more operators. In Python, and in C++, I can just add two arrays with 
plain old +, and this is almost never confusing in practice. As it turns out, 
you never miss having three or four complete sets of operators; at most you 
miss matrix multiplication (and maybe exponentiation) and a distinction between 
cross and dot for vectors, so we only needed to add 0 to 2 operators rather 
than tripling the number of operators or adding a way to modify or combine 
operators.
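
A sketch of that point, assuming numpy as the array library (the @ operator
itself arrived via PEP 465 in Python 3.5):

    import numpy as np

    a = np.array([1, 2, 3])
    b = np.array([10, 20, 30])
    print(a + b)    # [11 22 33] -- plain + reused elementwise, no new operators
    print(a @ b)    # 140 -- the one genuinely new operator: matrix/dot product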

And I don’t think it’s a coincidence that most array-heavy programming these 
days is not done in either APL or J, or even modern languages like Julia that 
try to incorporate their best features, but in Python and C++ (and shader 
languages based on C++) that completely avoided the problem rather than 
creating it and then trying to solve it.

And that’s why := is not a crisis for Python. Python doesn’t have way too many 
operators, and isn’t in danger of getting anywhere near there. Python adds one 
or two new operators per decade, and that only if you cheat by including things 
like if-else and := as operators when they actually aren’t while ignoring the 
removal of `. And most of the operators are words rather than symbols (if-else 
rather than ?:, yield and await which I’m sure APL would have found symbolic 
ways to spell, etc.). If we keep going at the current pace, by the end of the 
century, we’ll have used up either $ or ` and added one more digraph and three 
more keywords… which is fine.

While we’re at it, when you replace both = and := with an arrow, what do you do 
with += and the other augmented assignments? I can’t think of a 
single-character symbol that visually represents that meaning. If you leave it 
as + followed by an arrow, or try to come up with some new digraph, now we have 
the worst of both worlds, Unicode soup: operators that are digraphs and not 
visually meaningful while also not being typeable.

___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/HX7ORVM4XC74ASOZ7NP3BAP6OVD43HIA/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-06 Thread Chris Angelico
On Wed, Nov 6, 2019 at 6:57 PM Richard Musil  wrote:
> Second point, I can write := in two keystrokes, but I do not have a dedicated 
> key for the arrow on my keyboard. Should '<--' also be an acceptable syntax?

No, because "x <-- y" is already legal syntax (it applies unary minus
to y twice, then checks whether x is less than the result).
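
That parse is easy to confirm with the standard ast module (output wrapped
for readability):

    import ast

    print(ast.dump(ast.parse("x <-- y", mode="eval")))
    # Expression(body=Compare(left=Name(id='x', ctx=Load()), ops=[Lt()],
    #     comparators=[UnaryOp(op=USub(), operand=UnaryOp(op=USub(),
    #         operand=Name(id='y', ctx=Load())))]))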

ChrisA
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/CWX5LMPRZQH4T55JX6TNMTTL2ZAJWJHA/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-05 Thread Richard Musil
On Wed, Nov 6, 2019 at 5:32 AM martin_05--- via Python-ideas <
python-ideas@python.org> wrote:

> In other words, these two things would have been equivalent in Python:
>
> a ← 23
>
> a = 23
>

I do not consider these two things conceptually equivalent. In Python the
identifier ('a' in this case) is just a label to the value. I can imagine
"let 'a' point to the value of 23 now" and write it this way: "a --> 23",
but "a <-- 23" does give the impression that 23 points to, or is somehow fed
into, 'a'. This may give false expectations to those who are coming to
Python from another language and might expect "l-value" behavior in
Python.
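
Concretely, nothing is ever fed into 'a'; a short sketch in today's Python:

    a = [1, 2]
    b = a           # b now labels the same object; no value flows into b
    a = [3, 4]      # rebinding a does not touch the object that b labels
    print(b)        # [1, 2]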

Second point, I can write := in two keystrokes, but I do not have a
dedicated key for the arrow on my keyboard. Should '<--' also be an
acceptable syntax?

Richard
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/273JBQ6DHIR2QWW5YT6BFWTAGZN7OFUT/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-05 Thread Alex Walters
The arrow ...which I will not copy and paste, to really hammer home the point 
that it's not on my fairly standard US keyboard... doesn't look like assignment; 
it looks like a comparison operator.

> -Original Message-
> From: martin_05--- via Python-ideas 
> Sent: Tuesday, November 5, 2019 7:47 PM
> To: python-ideas@python.org
> Subject: [Python-ideas] Python should take a lesson from APL: Walrus
> operator not needed
> 
> During a recent HN discussion about the walrus operator I came to realize yet
> another advantage of notation. I used APL professionally for about ten years,
> which made it an obvious source of inspiration for an example that, in my
> opinion, demonstrates why the Python team missed a very valuable
> opportunity to take this wonderful language and start exploring the judicious
> introduction of notation as a valuable tool for thought (borrowing from Ken
> Iverson's APL paper with that title [0]).
> 
> To simplify, I'll define the desire for this walrus operator ":=" as "wanting 
> to
> be able to make assignments within syntax where it was previously
> impossible":
> 
> if x = 5   # was impossible
> 
> # and now
> 
> if x := 5  # makes it possible
> 
> A more elaborate example given in the PEP goes like this:
> Current:
> 
> reductor = dispatch_table.get(cls)
> if reductor:
>     rv = reductor(x)
> else:
>     reductor = getattr(x, "__reduce_ex__", None)
>     if reductor:
>         rv = reductor(4)
>     else:
>         reductor = getattr(x, "__reduce__", None)
>         if reductor:
>             rv = reductor()
>         else:
>             raise Error(
>                 "un(deep)copyable object of type %s" % cls)
> 
> Improved:
> 
> if reductor := dispatch_table.get(cls):
>     rv = reductor(x)
> elif reductor := getattr(x, "__reduce_ex__", None):
>     rv = reductor(4)
> elif reductor := getattr(x, "__reduce__", None):
>     rv = reductor()
> else:
>     raise Error("un(deep)copyable object of type %s" % cls)
> 
> At first I thought, well, just extend "=" and be done with it. The HN thread
> resulted in many comments against this idea. The one that made me think
> was this one [1]:
> 
> "These two are syntactically equal and in Python there's no
> way a linter can distinguish between these two:
> 
> if reductor = dispatch_table.get(cls):
> if reductor == dispatch_table.get(cls):
> 
> A human being can only distinguish them through careful inspection.
> The walrus operator not only prevents that problem, but makes
> the intent unambiguous."
> 
> Which is a perfectly valid point. I get it.
> 
> Still, the idea of two assignment operators just didn't sit well with me. 
> That's
> when I realized I had seen this kind of a problem nearly thirty years ago, 
> with
> the introduction of "J". I won't get into the details unless someone is
> interested, I'll just say that J turned APL into ASCII soup. It was and is 
> ugly and
> it completely misses the point of the very reason APL has specialized
> notation; the very thing Iverson highlighted in his paper [0].
> 
> Back to Python.
> 
> This entire mess could have been avoided by making one simple change that
> would have possibly nudged the language towards a very interesting era,
> one where a specialized programming notation could be evolved over time
> for the benefit of all. That simple change would have been the introduction
> and adoption of APL's own assignment operator: "←"
> 
> In other words, these two things would have been equivalent in Python:
> 
> a ← 23
> 
> a = 23
> 
> What's neat about this is that both human and automated tools (linters, etc.)
> would have no problem understanding the difference between these:
> 
> if reductor ← dispatch_table.get(cls):
> if reductor == dispatch_table.get(cls):
> 
> And the larger example would become this:
> 
> if reductor ← dispatch_table.get(cls):
>     rv ← reductor(x)
> elif reductor ← getattr(x, "__reduce_ex__", None):
>     rv ← reductor(4)
> elif reductor ← getattr(x, "__reduce__", None):
>     rv ← reductor()
> else:
>     raise Error("un(deep)copyable object of type %s" % cls)
> 
> This assignment operator would work everywhere and, for a period of time,
> the "=" operator would be retained. The good news is that old code could be
> updated with a simple search-and-replace. In fact, code editors could even
> display "=" as "←" as an option. The transition to only allowing "←" (and
> perhaps other symbols) could be planned for Python 4.
> 
> Clean, simple and forward-looking. That, to me, is a good solution. Today we
> have "=" and ":=" which, from my opinionated perspective, does not
> represent progress at all.
> 
> [0] http://www.eecg.toronto.edu/~jzhu/csc326/readings/iverson.pdf
> 
> [1] https://news.ycombinator.com/item?id=21426338
> 

[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-05 Thread Greg Ewing

martin_05--- via Python-ideas wrote:

The transition to only allowing "←" (and
perhaps other symbols) could be planned for Python 4.


Requiring non-ASCII characters in the core language would be a
very big change, especially for something as ubiquitous as
assignment. Much more justification than just "it looks nicer
than the walrus operator" would be required.

--
Greg
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/B24MQYTBW2T3ONAY4NAO6KRE6RHEFETZ/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Python should take a lesson from APL: Walrus operator not needed

2019-11-05 Thread Chris Angelico
On Wed, Nov 6, 2019 at 3:28 PM martin_05--- via Python-ideas
 wrote:
> Back to Python.
>
> This entire mess could have been avoided by making one simple change that 
> would have possibly nudged the language towards a very interesting era, one 
> where a specialized programming notation could be evolved over time for the 
> benefit of all. That simple change would have been the introduction and 
> adoption of APL's own assignment operator: "←"
>

How do people type that operator?

ChrisA
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/RG2B57U6IPZ6NGMOZFFH6GHZBYJMTGHO/
Code of Conduct: http://python.org/psf/codeofconduct/