Re: [Tutor] origins bootstrapped.
On 22/11/2018 06:05, Steven D'Aprano wrote:
> I don't know of any non-free (free as in beer, or free as in speech)
> implementations of Python. Can you elaborate?

There are several commercial distributions (as opposed to
implementations) of Python; that may be what Avi has in mind.
Some of these are commercial IDEs that include Python as part of an
integrated bundle - I think Blackadder is one such - and others are
uber-distros like Enthought Entropy(?), which is a "supported" distro
for scientific work, rather like Anaconda. Others are in the movie
industry, where Python is either tied to a particular tool or, again,
to a support arrangement.

The implementations are the standard open source code, but the
distribution is paid for, with the value-add either in the toolset,
the packaging or the support.

But maybe Avi means something different...

-- 
Alan G
Author of the Learn to Program web site
http://www.alan-g.me.uk/
http://www.amazon.com/author/alan_gauld
Follow my photo-blog on Flickr at:
http://www.flickr.com/photos/alangauldphotos

_______________________________________________
Tutor maillist  -  Tutor@python.org
To unsubscribe or change subscription options:
https://mail.python.org/mailman/listinfo/tutor
Re: [Tutor] origins bootstrapped.
On Wed, Nov 21, 2018 at 11:31:59AM -0500, Avi Gross wrote:
> Alan has been involved with Python for a long time so he has more to offer
> historically.

I've been involved with Python for a long time too. What exactly are
you trying to say?

> Can I ask a question that I really want an opinion on? As a preface, I see
> some think python as a formal language is being pushed by industry in
> directions that may not meld as well for its use in other contexts like for
> teaching students.

I think there is always going to be tension between the needs of
different users. Beginners need simplicity; expert, experienced
programmers need power; and both have very different ideas of what
"readable code" means.

I don't think Python is being pushed in any direction by "industry".
It is evolving according to the needs of the programmers who use it,
some of whom may work for some industry or another.

> How much of that is due to it being a relatively open and
> free product? There are plenty of other applications that you pay for and
> thus have to be responsive to the buyers to remain in business. Python has
> many implementations including some freer than others.

I don't know of any non-free (free as in beer, or free as in speech)
implementations of Python. Can you elaborate?

> Yet it has gone
> through a bit of a bifurcation and many would like to see 2.X retained and
> others wish everyone should migrate. Is there room for a smaller core
> language that remains good for teaching purposes and that is small enough to
> fit in a Raspberry Pi, while other versions are of industrial strength? Do we
> already sort of have some of that?

Standard CPython is light enough to run on fairly low-powered devices,
including the Raspberry Pi. For an even smaller footprint, you can use
MicroPython, which will run on embedded devices, although μPy does make
some compromises which mean that it is not a fully compliant Python
implementation.
There are, or were, other small implementations:

- Pippy, Python for Palm (probably unmaintained by now...)
- Python for S60, for the Nokia S60 platform (likewise...)
- PythonCE, for Windows CE (who still uses WinCE?)
- PyMite, for embedded devices
- Python-iPod
- Py4A and QPython (Android)
- TinyPy
- PyPad, for the iPad
- Pycorn, Python running on bare hardware with no OS

> I was thinking of how many languages and environments have been looking at
> working using parallelism.
[...]
> It definitely is worth doing but does everyone need it especially for
> teaching an intro class?

Who teaches threading and parallelization in introductory classes?

-- 
Steve
Re: [Tutor] origins bootstrapped.
> On Nov 21, 2018, at 10:31, Avi Gross wrote:
>
> Is there room for a smaller core
> language that remains good for teaching purposes and that is small enough to
> fit in a Raspberry Pi, while other versions are of industrial strength? Do we
> already sort of have some of that?

What comes stock on a Pi is more than sufficient (there's plenty of
room for "standard" Python 2 and Python 3).

MicroPython (https://micropython.org/) fits that category nicely for
microcontrollers, and Adafruit's version of it, CircuitPython, has a
strong following:

https://www.adafruit.com/circuitpython

These have been great for letting people learn not only Python, but
how to physically interact with the world outside the computer.

-- 
David Rock
da...@graniteweb.com
Re: [Tutor] origins bootstrapped.
On 11/21/18 5:54 PM, Alan Gauld via Tutor wrote:
> On 21/11/2018 16:31, Avi Gross wrote:
>> An obvious speedup might be had by starting up N threads with each opening
>> one file and doing what I said above into one shared process with N
>> variables now available. But will it be faster?
>
> Trying to calculate (or guess) this kind of thing in
> advance is near impossible. The best solution is to
> prototype and measure, making sure to do so on typical
> data volumes.
>
> That having been said, if you know (or discover) that
> you definitely need parallelism then it's definitely worth
> revisiting the design to ensure the data structures
> and overall workflow are optimised for a parallel approach.
>
>> ...I will pause and simply say that I opted not to bother
>> as the darn program finished in 5 or 10 seconds.
>
> Exactly so.
> As the famous quote says, "Premature optimisation is..."
>
>> For heavy industrial uses, like some of the applications in the cloud
>> dealing with huge problems, it may well be worth it.
>
> In many cases it's the only practical solution.
> Almost all of my industrial programming has involved
> multiprocessing and threading. Almost none (I think one) of my
> personal programming projects has needed it.

People play all kinds of parallelism tricks with Python because Python
has a certain Impediment Which Shall Remain Nameless (except I'm
certain someone will mention it).

Anyway, it's one thing to try to decompose a massive problem; that's
interesting on a certain level (see some of the talks companies like
Google have done on scaling their services) but is really hard to
replicate at home. But another use for non-linear programming, if you
want to call it that, is a task that just needs a different programming
model. That's where the async stuff with coroutines and event loops,
which has been beefed up recently, is quite interesting.
Even very simple programs can run into cases where it may make sense,
usually when there are things you have to wait for and you want to be
able to do other work while waiting.

I finally got around to watching David Beazley's talk from a couple of
years ago, and it was pretty impressive - something I'd flagged as
"watch later" I don't know how long ago, more than a year at least.
Wish I'd watched it earlier now!

I hate posting those obscure YouTube links where people don't know what
they are clicking on, so search for this title if interested:

David Beazley - Python Concurrency From the Ground Up
(it's from the 2015 PyCon)
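[Editor's note] The coroutine-and-event-loop style mentioned above can be sketched in a few lines with the standard library's asyncio (a toy illustration, not taken from the talk; Beazley's talk builds an event loop from scratch rather than using asyncio):

```python
import asyncio

async def fetch(name, delay):
    # Simulate waiting on I/O; while this coroutine sleeps,
    # the event loop runs the other tasks.
    await asyncio.sleep(delay)
    return f"{name} done"

async def main():
    # The three "downloads" overlap, so the total wall time is
    # roughly 0.3s (the longest delay), not 0.6s (the sum).
    return await asyncio.gather(
        fetch("a", 0.1), fetch("b", 0.2), fetch("c", 0.3))

print(asyncio.run(main()))
```

asyncio.gather returns the results in the order the coroutines were passed, regardless of which finished first.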
Re: [Tutor] origins bootstrapped.
On 21/11/2018 16:31, Avi Gross wrote:
> Alan has been involved with Python for a long time so he has more to offer
> historically.

I'm not so sure about that; several folks on this list have been around
longer than me. And I don't follow the main comp.lang.python list that
closely. I'm simply giving my perspective, for whatever that may be
worth.

> OK, horrible analogy but interesting naming. So some say Guido started with
> learning his ABC and then became educated enough to understand Monty Python
> and reach for the holy grail.

Made me laugh out loud!

> Back when my group was programming in C, I was sent to Denver for a class in
> Lex/Yacc to learn how to use C libraries that now look primitive. One was a
> lexical analyzer and the other sort of a parser somewhat rudely named as Yet
> Another Compiler-Compiler.

Still powerful tools, and in active use in several projects. They were
great for quickly bootstrapping a small bespoke language.

> some think python as a formal language is being pushed by industry in
> directions that may not meld as well for its use in other contexts like for
> teaching students. How much of that is due to it being a relatively open and
> free product?

I think that's true, but not necessarily bad. It just takes the
language in a different direction. And as you said, that happens in
many projects: they start as one thing and end up someplace entirely
different.

I remember one project that started out as a network management system
for a fairly obscure protocol and wound up as both a customer service
system for our global corporate clients and part of the monitoring
system for the English Channel Tunnel! Very different applications of
the same root code base.

> ...Is there room for a smaller core
> language that remains good for teaching purposes and that is small enough to
> fit in a Raspberry Pi, while other versions are of industrial strength? Do we
> already sort of have some of that?

We sort of have that.
Python 3 certainly works well on the Pi. We could certainly have a
smaller language for teaching, but then we had that in ABC and nobody
used it. Students don't like learning stuff that they can't use in the
real world. And if you want purity for beginners we already have Logo,
Scheme, Squeak/Scratch and a few others. But none of those really work
well in the wider world, which is why I still recommend Python, warts
and all.

> I was thinking of how many languages and environments have been looking at
> working using parallelism. Most people simply have no need

Absolutely, and for beginners a single thread is more than enough to
cope with.

> I was thinking about the little project I mentioned the other day. Should
> some of it be done in parallel using methods available?

It sounded a lot like a job for the map-reduce paradigm, which is
parallel where it can be and sequential where it should be...

> An obvious speedup might be had by starting up N threads with each opening
> one file and doing what I said above into one shared process with N
> variables now available. But will it be faster?

Trying to calculate (or guess) this kind of thing in advance is near
impossible. The best solution is to prototype and measure, making sure
to do so on typical data volumes.

That having been said, if you know (or discover) that you definitely
need parallelism then it's definitely worth revisiting the design to
ensure the data structures and overall workflow are optimised for a
parallel approach.

> ...I will pause and simply say that I opted not to bother
> as the darn program finished in 5 or 10 seconds.

Exactly so.
As the famous quote says, "Premature optimisation is..."

> For heavy industrial uses, like some of the applications in the cloud
> dealing with huge problems, it may well be worth it.

In many cases it's the only practical solution.
Almost all of my industrial programming has involved multiprocessing
and threading.
Almost none (I think one) of my personal programming projects has
needed it.

-- 
Alan G
Author of the Learn to Program web site
http://www.alan-g.me.uk/
http://www.amazon.com/author/alan_gauld
Follow my photo-blog on Flickr at:
http://www.flickr.com/photos/alangauldphotos
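[Editor's note] Alan's map-reduce remark can be illustrated with the standard library's multiprocessing.Pool - a toy sketch with made-up data, not Avi's actual project:

```python
from multiprocessing import Pool

def count_words(text):
    # The "map" step: each chunk is handled independently,
    # so chunks can be processed in parallel.
    return len(text.split())

if __name__ == "__main__":
    chunks = ["a b c", "d e", "f g h i"]  # stand-ins for real input files
    with Pool(processes=2) as pool:
        counts = pool.map(count_words, chunks)  # parallel "map"
    total = sum(counts)                         # sequential "reduce"
    print(total)  # 9
```

As Alan says above, whether this beats a plain `sum(map(count_words, chunks))` depends entirely on the data volume - prototype and measure.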
[Tutor] origins bootstrapped.
Alan has been involved with Python for a long time so he has more to
offer historically.

I don't see some things as either/or. You can start with one major
motivation and it morphs from a one-celled creature like an amoeba to a
complex vertebrate like a python, which needs modules added so it can
walk around better. OK, horrible analogy but interesting naming. So
some say Guido started with learning his ABC and then became educated
enough to understand Monty Python and reach for the holy grail. OK,
even worse. Time to get serious.

I have seen this on many projects, not just programming languages and
environments. Something fairly simple is imagined, then perhaps
prototyped. Someone may notice that what was created may be used in
another way if perhaps expanded a bit. Someone then realizes they now
have functionality that can be used to create something else, in a form
of bootstrapping. After a while they have a collection of tools that
can be combined to make something more complex.

The biological analogy above can be an example. No, I am not saying
that a distant ancestor of a snake like a python was an amoeba. But
they do share common ancestors they have both diverged from, with the
amoeba remaining a single-celled organism and the python descending
from something that became multi-cellular, then differentiated into
having different kinds of cells in tissues and organs, and became a
somewhat integrated whole that is possibly more than the sum of its
parts.

The ABC analogy is also obvious. Once an alphabet is chosen and
provisional meanings given to each letter, it can grow and even adjust
to making words and sentences and even seemingly endless streams of
consciousness like some of my messages. Python was built on top of
other achievements that some people were learning from.
There were many steps along the way, from building machines programmed
one byte at a time in binary (I hated a class that made me do that, as
one error meant starting over) to various levels where a compiler and
then an interpreter would parse things.

We have been discussing using regular expressions. Much of a language
like Python is having bits and pieces of code written in ASCII or
Unicode be parsed, using hopefully unambiguous rules, into tokens that
can be made into decision trees or whatever data structure. That
depends on being able to look for and find some sort of pattern in
strings. I am not sure what Python and others use, but it may be tools
similar to string search or regular expressions that allow them to
bootstrap.

Back when my group was programming in C, I was sent to Denver for a
class in Lex/Yacc to learn how to use C libraries that now look
primitive. One was a lexical analyzer and the other sort of a parser,
somewhat rudely named as Yet Another Compiler-Compiler. But today, what
do most people use? Our tools improve, often by being a wrapper to
older tools, and so on for multiple levels. New functionality is added
too.

Can I ask a question that I really want an opinion on? As a preface, I
see some think Python as a formal language is being pushed by industry
in directions that may not meld as well for its use in other contexts,
like for teaching students. How much of that is due to it being a
relatively open and free product? There are plenty of other
applications that you pay for and thus have to be responsive to the
buyers to remain in business. Python has many implementations,
including some freer than others. Yet it has gone through a bit of a
bifurcation, and many would like to see 2.X retained while others wish
everyone would migrate. Is there room for a smaller core language that
remains good for teaching purposes and that is small enough to fit in a
Raspberry Pi, while other versions are of industrial strength? Do we
already sort of have some of that?
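[Editor's note] The tokenising step described above can be sketched with the standard library's re module - a toy lexer for illustration only, nothing like CPython's real tokenizer:

```python
import re

# Toy token specification: (name, regex) pairs, illustrative only.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("NAME",   r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=]"),
    ("SKIP",   r"\s+"),
]
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(source):
    # Walk the string, yielding (kind, text) pairs and skipping whitespace.
    for match in MASTER.finditer(source):
        if match.lastgroup != "SKIP":
            yield (match.lastgroup, match.group())

print(list(tokenize("x = y + 42")))
# [('NAME', 'x'), ('OP', '='), ('NAME', 'y'), ('OP', '+'), ('NUMBER', '42')]
```

This is essentially what Lex did, driven by a table of regular expressions; a Yacc-style parser would then turn the token stream into a tree.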
I was thinking of how many languages and environments have been looking
at working with parallelism. Most people simply have no need for the
complication.

When you add the ability to do multiprocessing within an application
using something like threads, you spend lots of time making sure you
literally lock down shared resources so they are used serially. You
need to make sure race conditions do not lock up all your threads at
once. Lots of added overhead that is only worth it if you gain in the
process. Add multiple cores in your CPU, and you may need to handle
more complications as threads are then actually running in parallel,
perhaps still sharing a common memory. Allow it to use multiple
processors around the world, and you need even more elaborate control
structures to synchronize all that. It definitely is worth doing, but
does everyone need it, especially for teaching an intro class?

I was thinking about the little project I mentioned the other day.
Should some of it be done in parallel using the methods available? One
part of the problem was to read in N files into N pandas DataFrame
objects. I knew that I/O tends to be fairly slow and most programs take
a nap while waiting. I
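[Editor's note] A minimal sketch of the N-files-in-N-threads idea, using the standard library's concurrent.futures. The generated sample CSV files and csv.reader stand in for Avi's real inputs and pandas:

```python
from concurrent.futures import ThreadPoolExecutor
import csv, os, tempfile

# Make three small sample CSV files to stand in for the real inputs.
tmpdir = tempfile.mkdtemp()
paths = []
for i in range(3):
    path = os.path.join(tmpdir, f"file{i}.csv")
    with open(path, "w", newline="") as f:
        f.write("name,value\nx,1\n")
    paths.append(path)

def load_rows(path):
    # I/O-bound work: the interpreter lock is released while waiting
    # on the disk, so the threads can genuinely overlap.
    with open(path, newline="") as f:
        return list(csv.reader(f))

# One thread per file; results come back in the same order as `paths`.
with ThreadPoolExecutor(max_workers=len(paths)) as pool:
    tables = list(pool.map(load_rows, paths))

print(len(tables))  # 3
```

With pandas you would use pandas.read_csv in place of load_rows. Whether the overlap actually helps should, as Alan says elsewhere in this thread, be settled by prototyping and measuring, not guessing.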