On 7/22/2011 6:41 PM, Casey Ransberger wrote:
> I did this dance too... Hmm... Seems the Mac installer comes with some kind of translation tool that's advertised to be able to output MPEG, maybe we can use that to save others the trouble of installing the Real client.


yeah...
even on Windows, RealPlayer was giving me trouble (initially it was crashing, ...).
I got it playing, but am still having audio problems...

> If I figure out that I can handle the conversion without spending any money, would folks have interest in the artifact produced?


the big downside of MPEG-1 though is that it has a poor size/quality tradeoff.

FWIW, XViD AVI is probably better, or maybe OGM/Theora?...


> This is a fun talk, I'm only about halfway through it, but I must admit having cracked up when he said, (and I have to paraphrase, because I don't have it in front of me,) "...if we were physicists, and we didn't understand what Newton did [slight dramatic pause] we should be shot."


yeah...

in general it was a good video, and seemed to mostly make sense.


(sorry if I wander a bit with all this).

also the comment about people confusing their specific views for reality as a whole: I generally agree with this, as I run into this problem a lot in dealing with people.

(maybe I can be accused of that sometimes?... I try to keep an open mind WRT possibilities, though admittedly I am a little less optimistic about ideas which seem unlikely to be workable or useful in the near term, or whose adoption would be very costly or would necessitate fundamental re-engineering that seems unlikely to happen within a reasonable time-frame, or which would not likely offer much advantage or could more likely be detrimental, ...).

but, I have gotten into lots of arguments with people where, at the same time, I don't believe in a single necessarily-correct worldview, but still choose to uphold a certain set of views which seem most likely correct (after all, at the end of the day it does little good to doubt and twiddle away all one's time in speculation; most often one will just pick something good enough and run with it).

although, in an "ontological" sense, I have no particular need for any particular ontology, as "my world" just doesn't work that way.


hmm... people don't really call me "smart" all that often, but people are apparently sometimes impressed by my ability to recall piles of technical and other trivia. sometimes a big part of thinking is being able to pull up a lot of related information from memory and then walk the links between it all, to find problems, derive solutions, ...

but, it is nothing magic. most of this is trivia I have run across at one time or another on sites like Wikipedia.


I suspect I mostly see the present and the past, and the future is generally a bit anomalous except in cases where it can be predicted via graphs or similar. granted, it is possible an xSTP-style psychology may have something to do with this.

I distrust people thinking too much about the "future", as much of the time it drifts off into fantasies and lala-land; go too far and one may as well just be watching Star Trek or Gundam or similar... or, if these people are something like my dad, it turns into big elaborate ideas ("plans") which rarely go anywhere, as he doesn't bother to actually do anything beyond maybe trying to boss people around and get them to do busywork.

but, sometimes (seemingly more often in my case though) it is amusing how someone can get something, or something will happen, and previously implausible-seeming ideas can suddenly become something that can actually work. it is the magic of things just falling into place on their own and working out well.


making things small: well, yes, I guess one can do this.
but, if one can pull off the same thing with less thinking by just writing a large glob of code, well, that seems to work fairly well too.

it just isn't really clear what the practical gain of striving for this level of minimalism is.
unless the goal is minimalism for the sake of minimalism?
(this may require further evaluation).


maybe information can be distilled down somewhat, but what of the cost of putting that information beyond the grasp of mere humans? say, math-like forms which people can look at and be like "now what is this exactly"? then large piles of words may be needed to give context and application details (what it is and what it can be used for, ...).

it is like me trying to understand SSA form and SSA-based transforms. I can see how it works, but can't ever fully get my head around how to make it work. so, meanwhile, my code generators generally tend to work via a mix of stack machines and good old procedural logic, and this seems to work (apart from my main codegen being buggy...).
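
to give an idea of what I mean by the stack-machine style, here is a minimal sketch (made-up names, nothing from my actual codegen): walk the expression tree post-order and emit push/op instructions as they come, no SSA or register allocation anywhere:

/* tiny sketch of stack-machine style code generation (hypothetical).
   the expression tree is walked post-order and the "instructions"
   are simply printed as they would be emitted.                     */
#include <stdio.h>

typedef struct Expr Expr;
struct Expr {
    char  op;           /* '+', '*', ... or 0 for a constant leaf */
    int   value;        /* used when op == 0 */
    Expr *lhs, *rhs;
};

static void emit_expr(Expr *e)
{
    if (e->op == 0) {                  /* leaf: push the constant */
        printf("    push %d\n", e->value);
        return;
    }
    emit_expr(e->lhs);                 /* operands first (post-order) */
    emit_expr(e->rhs);
    switch (e->op) {                   /* then the operator itself */
    case '+': printf("    add\n"); break;
    case '*': printf("    mul\n"); break;
    }
}

int main(void)
{
    /* (2 + 3) * 4 */
    Expr two   = { 0, 2, 0, 0 };
    Expr three = { 0, 3, 0, 0 };
    Expr four  = { 0, 4, 0, 0 };
    Expr sum   = { '+', 0, &two, &three };
    Expr prod  = { '*', 0, &sum, &four };
    emit_expr(&prod);                  /* push 2 / push 3 / add / push 4 / mul */
    return 0;
}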

this is why programming can be relatively easy, but things like math are very difficult and beyond the grasp of many people (though some people claim math is easier than programming, which is itself a mystery...).


I don't personally use lex or yacc or similar, as I didn't really see the gain (replacing an otherwise reasonably straightforward task with opaque "voodoo magic" tools).

if one can do a C-like parser in C using recursive descent in maybe a few kloc, what is the big deal? just write the parser in C using recursive descent.
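
roughly the shape it takes; here is a toy recursive-descent sketch for a little expression grammar (made-up names, obviously not the actual BGBScript/C parser, which builds an AST and does a lot more):

/* minimal recursive-descent expression parser (sketch).
   grammar:  expr   := term (('+'|'-') term)*
             term   := factor (('*'|'/') factor)*
             factor := NUMBER | '(' expr ')'                        */
#include <stdio.h>
#include <stdlib.h>
#include <ctype.h>

static const char *p;              /* current position in the input */

static void skipws(void) { while (isspace((unsigned char)*p)) p++; }

static double parse_expr(void);

static double parse_factor(void)
{
    skipws();
    if (*p == '(') {
        p++;                            /* consume '(' */
        double v = parse_expr();
        skipws();
        if (*p == ')') p++;             /* consume ')' */
        return v;
    }
    return strtod(p, (char **)&p);      /* NUMBER */
}

static double parse_term(void)
{
    double v = parse_factor();
    for (;;) {
        skipws();
        if      (*p == '*') { p++; v *= parse_factor(); }
        else if (*p == '/') { p++; v /= parse_factor(); }
        else return v;
    }
}

static double parse_expr(void)
{
    double v = parse_term();
    for (;;) {
        skipws();
        if      (*p == '+') { p++; v += parse_term(); }
        else if (*p == '-') { p++; v -= parse_term(); }
        else return v;
    }
}

int main(void)
{
    p = "1 + 2 * (3 + 4)";
    printf("%g\n", parse_expr());       /* prints 15 */
    return 0;
}

a real parser would return syntax-tree nodes rather than evaluating on the spot, but the structure (one function per grammar rule, each consuming tokens and calling the others) is the same.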

actually, most of my parsers (BGBScript / C/Java/C#/BS2, as well as others such as for XML, ASM, ...) are mostly built on copy-pasted versions of a lot of the same basic code.

actually, often, one can implement something by copy-pasting a lot of the code around and specializing it for each task. apart from sometimes having to copy/paste patches between the copies, it can work fairly well.

usually larger things end up being consolidated though, as does code where the core aspects have mostly separated out from the use-case-specific details. other times, the copies diverge notably (becoming independent and unrelated entities).

fairly common bits of code and algorithms may also just be kept in memory (memorized), and typed out as needed, ...


I have sometimes wondered if math people end up essentially memorizing large numbers of common patterns which they recall and invoke as needed, combined with whatever logic is needed for which operations to perform and when (as well as however these people effectively keep track of variable scope, ...).

I have observed somewhat that the output of math people doing their thing (partly from notes from trying, in vain, to pass such a class) happens to sort of resemble the output of a compiler optimizer (potentially, similar algorithms are at play?...).

never mind that maybe I place into classes higher than I can actually handle, maybe because for the placement tests I used a more general "make math work" strategy: endlessly fiddling with the expressions in one's head and re-evaluating them until they work (guess and check). sadly, this strategy is often unusably slow in general (it may take many minutes of head-grinding per problem).

well, there is also imagining the graph or probing for the answer, ...

in high-school math classes, "solving" most things was simply a matter of recognizing the expression pattern, performing a little arithmetic on the constants, and having the answer (like the quadratic formula and so on). basically A->B pattern matching and replacement, typically all done in a single step (almost sort of like a regex or sed or similar, but with arithmetic...).
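
as code, the quadratic case is basically just this sort of thing (a toy example; the names here are made up, it is just "match the a/b/c constants, do a bit of arithmetic on them, done"):

/* the "recognize the pattern, do arithmetic on the constants" idea,
   for a*x^2 + b*x + c = 0 only (compile with -lm).                 */
#include <stdio.h>
#include <math.h>

static void solve_quadratic(double a, double b, double c)
{
    double d = b * b - 4.0 * a * c;     /* discriminant */
    if (d < 0) { printf("no real roots\n"); return; }
    printf("x = %g or x = %g\n",
           (-b + sqrt(d)) / (2.0 * a),
           (-b - sqrt(d)) / (2.0 * a));
}

int main(void)
{
    solve_quadratic(1, -5, 6);          /* x^2 - 5x + 6 = 0 -> 3 and 2 */
    return 0;
}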

the scary part, though, is when one finds that none of the usual strategies work, like one has encountered something far more sinister.


or such...


