On 30/08/2021 00.47, Peter Otten wrote:
> On 29/08/2021 12:13, dn via Python-list wrote:
>> On 29/08/2021 20.06, Peter Otten wrote:
>> ...
>>> OK, maybe a bit complicated... but does it pay off if you want to
>>> generalize?
>>>
>>>>>> def roll_die(faces):
>>>         while True:
>>>             yield random.randrange(1, 1 + faces)
>>>
>>>>>> def hmt(faces, dies):
>>>         for c, d in enumerate(zip(*[roll_die(faces)]*dies), 1):
>>>             if len(set(d)) == 1: return c, d
>>
>> Curiosity:
>> why not add dies as a parameter of roll_die()?
>
> Dunno. Maybe because I've "always" [1] wanted a version of
> random.randrange() that generates values indefinitely. It would need to
> check its arguments only once, thus leading to some extra
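For the record, the variant I was idly wondering about - folding dies into the generator itself, so each next() yields a complete throw - might look something like this. A sketch only: roll_dice is my own name, not anything from your post, and I've renamed dies to dice while I'm at it:

```python
import random

def roll_dice(faces, dice):
    # Generate complete throws indefinitely: each value yielded is a
    # tuple of `dice` independent rolls of a `faces`-sided die.
    while True:
        yield tuple(random.randrange(1, 1 + faces) for _ in range(dice))

def hmt(faces, dice):
    # Count throws until every die in a throw shows the same face.
    for count, throw in enumerate(roll_dice(faces, dice), 1):
        if len(set(throw)) == 1:
            return count, throw

count, throw = hmt(6, 2)
print(count, throw)
```

One generator instead of zip(*[...]*dies), at the cost of your single-responsibility separation - which rather proves the SRP point below.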
Each code-unit should do one job, and do it well! SRP...

>> Efficiency:
>> - wonder how max( d ) == min( d ) compares for speed with the set()
>> type constructor?
>
> I did the simplest thing, speed was not a consideration. If it is, and
> dies (sorry for that) is large I'd try
>
> first = d[0]
> all(x == first for x in d)  # don't mind one duplicate test
>
> For smaller numbers of dice I'd unpack (first, *rest) inside the for
> loop. But it's a trade-off, you'd have to measure if/when it's better
> to go through the whole tuple in C.

For larger numbers of dice, and presuming a preference for an
inner-function which rolls only a single die per call: would it be more
efficient to test each individual die's value against the first,
immediately after its (individual) roll/evaluation (rinse-and-repeat for
each die thereafter)? This, on the grounds that the first mis-match
obviates the need to examine (or even roll) any other die/dice in the
collection.

OTOH the simulation of rolling n-number of dice, as would happen in the
real-world, would be broken by making the computer's algorithm more
efficient (rolling until the first non-equal value is 'found'). Does
that mean the realism of the model dies? (sorry - no, I'm not sorry -
you deserved that!) Does one "roll" a die, or "shake the dice"???

We don't know the OP's requirements wrt execution-efficiency. However,
as you (also) probably suffered, such exercises regularly feature in
stats and probability courses. Sometimes 'the numbers' are quite large,
in order to better-illustrate ("smooth") distribution characteristics,
etc.

I have a love/hate relationship with questions of Python and
'efficiency'. Today, I discovered that using a cut-down Linux version
(on AWS) was actually slower than using a full-fat distribution - upon
analysis, my friendly 'expert' was able to point the finger at the way
the two distros compile/prepare/distribute the(ir) Python interpreter.
(I'm glad he thought such investigation 'fun'!)
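To make that early-exit question concrete, here's a sketch of what I mean (function names are mine; assuming the single-die inner function):

```python
import random

def roll_die(faces):
    # Roll exactly ONE die per call.
    return random.randrange(1, 1 + faces)

def hmt_early_exit(faces, dice):
    # Count throws until all dice match. Within a throw, stop rolling at
    # the first mismatch: all() short-circuits, so the remaining dice of
    # that throw are never rolled at all.
    count = 0
    while True:
        count += 1
        first = roll_die(faces)
        if all(roll_die(faces) == first for _ in range(dice - 1)):
            return count

print(hmt_early_exit(6, 2))
```

Efficient, yes - but exactly as described above, it is no longer a faithful simulation of shaking all the dice at once.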
All of which further complicates the business of design, given we
already know of situations where approach-a will run faster than
approach-b on your machine, yet the comparison may be reversed on mine.
This discussion forms a sub-set of that: when to use the built-in
functions (implemented in C) because they are (claimed to be) more
efficient than another approach - and when one approach using a built-in
function might be faster than another 'built-in'/C-coded approach.
("small things amuse small minds" - mind how you describe my mind!)

"Bottom line": I prefer to think of Python's "efficiency" as reflected
in the amount of my time that is needed in order to complete a project!

>> - alternately len( d ) < 2?
>> - or len( d ) - 1 coerced to a boolean by the if?
>> - how much more efficient is any of this (clever thinking!) than the
>> OP's basic, simpler, and thus more readable, form?
>
> It really isn't efficiency, it's a (misled?) sense of aesthetics where
> I've come to prefer
>
> - for-loops over while, even when I end up with both to get the
> desired for
>
> - enumerate() over an explicit counter even though there is the extra
> unpack, and you still need to initialize the counter in the general
> case:
>
> for i, item in enumerate([]): pass
> print(f"There are {i+1} items in the list.")  # Oops

Next thing you'll be using for-else... [insane giggling]

It's interesting how we arrive at these views (as a trainer I spend a
lot of time trying to detect how learners build their mental maps, or
"models", of each topic). I've always had a clear
'formula'/rule/hobgoblin: if the number of loops can be predicted, use
'for'; otherwise use 'while'. Of course, Python alters that view because
it offers a for-each, which means that I don't need to know the number
of loops, only that the loop will cycle through each item in the
iterable. It used to be a far simpler world!
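Since the which-built-in-is-faster question keeps coming up: a rough timeit sketch of the candidate equality tests from above. The numbers will vary by machine, Python build, and len(d) - which is rather the point:

```python
from timeit import timeit

# A worst case for short-circuiting: all values equal,
# so every test must inspect the whole tuple.
d = (3,) * 100

tests = {
    "set":     lambda: len(set(d)) == 1,
    "min/max": lambda: min(d) == max(d),
    "all":     lambda: all(x == d[0] for x in d),
}

for name, fn in tests.items():
    print(f"{name:8} {timeit(fn, number=10_000):.4f}s")
```

On my machine the generator-expression version trails the C-level ones for this all-equal case - but swap in a tuple whose second element already differs, and the ranking changes. Measure, don't guess.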
That said, I really miss the option of a while controlling the loop with
a pre-condition AND a repeat...until controlling the loop with a
post-condition - the former enabling >=0 loops; the latter requiring at
least one!

Using enumerate() may be a matter of aesthetics (English-English
spelling!). However, it is a basic Python idiom. It is as much a tool in
our coding as a fork/spoon/chop-sticks/fingers are to someone eating.
(yes, I'm feeling hungry)

At times I feel a 'siren call' to use some of the powerful, let's call
them "one-liner", tools, because it is 'fun' to bend my mind around the
challenge. However, most such code tends to lose readability in some
sort of inverse proportion to its power - it leaves the clarity and
simplicity of Python-space and heads towards APL, Lisp, et al.

The 'problem' is that as a coder increases his/her knowledge (learns to
harness 'the force'), the code-structures which count as one "chunk" of
thought expand. For example, it can be difficult to see (remember) that
although comprehensions may have become a single 'unit' of thought to a
skilled practitioner, for others they are (still) opaque, and
'mountains' to climb. Thus the 'balance' between 'power' and
readability; and between using techniques which demand Python-expertise
and thus carry an expectation that less-capable/-experienced programmers
will 'improve their game' (to be able to contribute to 'this' project,
or future re-use/maintenance). (see also "Zen of Python")

The 'level' then becomes a convention - just as much as 'do we adhere to
PEP-8 naming' (etc). The important point is not what the convention is,
but that the team has arrived at a pragmatic agreement, eg will we use
comprehensions or explicit loops. Your team might say "of course",
whereas mine says "can't cope with that"...

BTW you already know this, but I'm not writing only to you! (don't fret,
you're still 'special'...)
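(Coming back to the missing repeat...until for a moment: the usual Python work-around is the 'loop-and-a-half' - body first, post-condition test at the bottom. A sketch, keeping with the dice theme:)

```python
import random

# Emulating repeat...until: the body is guaranteed to execute at least
# once, and the exit test sits at the bottom of the loop.
rolls = []
while True:
    rolls.append(random.randrange(1, 7))  # body: runs >= 1 times
    if rolls[-1] == 6:                    # "until" a six is rolled
        break

print(len(rolls), "roll(s) needed")
```

Workable, but the intent ('at least once, until X') is encoded in the shape of the loop rather than stated by a keyword - which is exactly what I miss.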
>> English language 'treachery':
>> - one die
>> - multiple dice
>
> You might have inferred that I knew (or had looked up) the singular of
> dice, so this is but a momentary lapse of reason. It hurts me more
> than you, trust me. Not as much as going on record with confusing
> they're and their, but still ;)

Reason? Logic? Consistency? The English language. Surely you jest...

Having learned/lived many languages, struggling mightily with some,
feeling comfortable with others; I am enormously grateful (greatly
grateful?) that English was one of my home-languages, and that I didn't
have to "learn" it!

I do enjoy 'playing' with it though, and as pointed-out annually
(?semesterly), making students groan in response to my
weak/pathetic/Dad-joke/seven-year-old's humor (in live-lectures) at
least gives feedback that they are 'there' and paying some attention.
(alternately, it's Pavlovian-conditioning - BTW: yes, I'm still
hungry...)

Please consider the die/dice as 'educational', and a reflection of my
confusion of comprehension when trying to infer meaning whilst reading
the code-example - rather than arrogance or placing undue demand on you.

>> (probably not followed in US-English (can't recall), particularly on
>> computers running the Hollywood Operating System).
>
> I've come to the conclusion that International English is hopelessly
> and inevitably broken. That's the price native speakers have to pay
> for having they're (oops, I did it again!) language used as lingua
> franca.

If English really set out to be the world's lingua franca, why didn't it
come up with its own, a particularly (and peculiarly) English word for
the concept???

I've spent quite some time looking into the idea of 'International
English' - to attempt to find 'an English' to employ in our course
materials used world-wide. Even 'native speakers' can't agree on what is
'English'!
When the question is widened to include the broader majority of the
world, the concept of a 'core' of English becomes impossible to define
or encapsulate...

Even being a 'native speaker' doesn't necessarily help: George Bernard
Shaw described "Britain and America are two nations divided by [the use
of] a common language"! He was Irish - and I recall another quotation
along the lines of: the Irish learn English in order to hurl insults.
Sigh! Let's not be racist though - or would that be lingual-ist(?)

Advice when learning French was "always expect exceptions" (apologies, I
went looking for a source, in English or French - but Python-references
crowded the first couple of web-search 'hits'). I enjoy the word-picture
in Spanish: "Hijo de tigre, no siempre sale pintado. Siempre ahi esa
excepción." (The tiger's cub isn't always born (ready-)painted
(colored/striped). There is always an exception!) So much more apropos
(to use another (?)English expression) than that "a leopard NEVER
changes its spots"!

>> Continuous Education:
>> Thanks for the reminder that enumerate() can be seeded with a "start"
>> value!
>
> [1] I think I've suggested reimplementing the whole module in terms of
> generators -- can't find the post though.

Can't we assume lazy-evaluation - if only by virtue of the example
equivalent-code illustration in the docs? (it features a "yield")
(https://docs.python.org/3/library/functions.html?highlight=enumerate#enumerate)

Must admit, once the use of generators started to multiply
release-after-release (both in Python code and 'under the hood'), and
tools like range()/xrange() were made more memory-efficient, etc, I came
to assume that pretty much all such constructs had become "lazy".
(Careless assumption. Don't know!)
-- 
Regards,
=dn
-- 
https://mail.python.org/mailman/listinfo/python-list