Re: VHS and Betamax

2006-04-13 Thread Denny Valliant
> > Socially? As in people don't get out enough and meet in
> > meat-space? Or our interactions, even on-line?
>
> Well I was thinking more about social infrastructure. Things like bank
> personnel being able to help you in any way if their computers go out.
> I understand why it's become that way, but it'd be nice to think that
> there would be some kind of backup system, like they could make a
> phone call to another central authority for the bank who could verify
> your available funds, etc. Or that the personnel at your own bank


This is exactly what that article about loose abstractions was
getting at.
See, the problem never goes away, no matter the level. Say there was
a backup system. Nothing says that that backup system will accomplish
its goal.  Even if we go back to the days of paper, what happens when
they are all out of "official" forms?

To be clear, the human element is the only way to get around the
disconnect.  People are just lazy, and it's easier to say "the
computer is down" than call up Joe, have him dig up the paperwork,
etc., etc. But I feel in my heart of hearts that we don't NEED
computers to survive. Or paperwork, for that matter.

At least until we go borg. :-) It bugs the hell out of me, but "who you
know" still matters . If you were chums with the main bank dude,
you'll see a different aspect of the whole process than when you're
just joe schmoe.  I don't see that changing for a bit, if ever.
Guess it keeps people cordial. Eh. My jury is still out on that.

And all that aside, there are times when "the system is down",
plain and simple. If you realize that, you can contain the leaks,
so to speak. Damn the circular nature of this idea. ;-)

I'll have to check out the Asylum book. Heard enough people
whose thoughts I appreciate mention it.  Someday I'll take
advantage of Amazon or something similar for a book list.
The little "note" in my PDA just isn't cutting it anymore. :-)

> our code generators are for more specific purposes, that's true, but I
> still shy away from it on the thinking that it's liable to lead to
> handcuffing me in some way, re: what you just described about the old
> mac OS or the couple of examples above. Although there have been a
> couple of occasions on which I felt there was too much abstraction in
> something, more frequently I find myself being limited and/or required
> to do more work as a result of a lack of abstraction.


I think the real pain comes when you have no way to translate the
abstraction.  I don't see much difference, so long as you have a "getter"
and a "setter" of generated content, between a function that gets and
sets things in memory, vs. on-disk.

Like my attempt at using the single-table-per-data-type idea: I had an
inkling that I might have to use DB tables at some point, whether
to extract/import data (why I first wrote in/out) or as a storage medium.
I just made sure to have relatively easy ways of transforming to and from
my "preferred format" and other more standard formats.

So long as you can do that, you can use pipes, and transform data
into whatever format you like. Within reason, I guess. ;-)
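
Something like this, roughly (a totally made-up sketch, not my actual
code -- the function names and the idea of using WDDX as the "standard"
side are just for illustration):

<!--- hypothetical "getter"/"setter" pair: the preferred format is just a
      CF struct, the "standard" format is a WDDX packet (plain XML text)
      that can be piped anywhere -- a file, a DB column, another app --->
<cffunction name="toStandard" returntype="string" output="false">
    <cfargument name="record" type="struct" required="true">
    <cfset var packet = "">
    <cfwddx action="cfml2wddx" input="#arguments.record#" output="packet">
    <cfreturn packet>
</cffunction>

<cffunction name="fromStandard" returntype="struct" output="false">
    <cfargument name="packet" type="string" required="true">
    <cfset var record = structNew()>
    <cfwddx action="wddx2cfml" input="#arguments.packet#" output="record">
    <cfreturn record>
</cffunction>

As long as both directions exist, it doesn't matter much whether the
"real" copy lives in memory, on disk, or in a table somewhere.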

Abstractions are ideas. You can make a wrapper for something and
voila, you abstracted it - if you wrote the wrapper "right", right being
in a form that gives you the most use. Even if you just did the same
stuff the thing being wrapped did, it's still an abstraction of sorts.
Just by the fact that it's now "in your language", so to speak.

I guess it's more about how good the abstraction is than abstraction itself.
"How good" is a many-factored equation, involving opinions and
other riff-raff.  A "good" abstraction may be spot on for one person,
and not even close to being "done to a turn" for another. Or I could
have the whole idea of abstraction wrong. Or a weird definition.
I'm thinking along the lines of a representation of something that is
easier to "do stuff" with than the "actual" thing.

> But I really have to give them credit for
> making my life easier. :)


Same here. Of course I'm one to talk, I used to look down on
people who wrote code in Basic. It's hard not to stick to that
mentality.  "But you could write yer put pixel 300x  faster in
ASM!" :-)
Don't get me wrong, if you have a big enough "snippets" lib,
I think you could be pretty fast, even with low level stuff.

But I like how you can replace the built-in putpixel with
your "optimized" ASM routine. Seemed like it was the
best of both worlds. When I realized that, it all sorta...
Seems sorta like 6 of one... maybe not so much the
"framework", as it is the mind behind the code... Eh.
What this has to do with anything... =-p

> Oh there's certainly value in understanding some of the lower theory,
> but an individual programmer can only learn so much. Not that an
> individual programmer can't learn whatever they want to learn, but you
> know, there are only so many hours in a day. :) In the long-run, you
> have to weigh the value of having that low level insight agai

Re: VHS and Betamax

2006-04-13 Thread Denny Valliant
> I'll have to pick up a copy of it... Seems somewhat similar to a novel
> I wrote which can be found here: http://www.turnkey.to/ike/613.htm


Heh. Judging by your introduction... the age of spiritual machines had
a good bit along the lines of "this stuff is far out, man, right now! In X
years, it's gonna be outta sight!"

If you think that's a hokey idea, you'd really think that book is hokey.

It is kind of impressive when you think about it... all that LISP stuff and
whatnot. Weird programming languages, all funky like. :-)

> The site's kinda crude, I know. :)
> I haven't done anything with it in several years.


I dig writing, man, it's cool that you put it up.  'tis pretty funny so far.
:Denny




Re: VHS and Betamax

2006-04-13 Thread Denny Valliant
> > I guess my point is that I can see the point of not
> > worrying too much, but I don't buy into believing
> > things will change in the future. Arg! Of course things
> > will change, but, um. Well... I guess don't go around
> > smoking, thinking that, "what the hell, by the time I
> > get cancer, we'll have the technology to..."  Or you
> > can, hell, I do sometimes, and it could be true, but
> > the safest bet is to avoid smoking. :-/
>
> I don't think that's actually analogous to my development process.


Oh, I wasn't implying that it was. Just talk'n man, I've seen your stuff,
it's good.  Hell, you'd shudder if you saw my stuff. *shudder*

> So I'm not exactly running around saying "oh just throw whatever
> abstraction you want in there, 'cause the hardware will support it in
> 20 years" (and even if I were, the Cancer analogy is bad because we
> have no real proof of progress toward a cure for cancer, whereas we
> have definitive and continual proof of the progress of our hardware).


(-: It was a bad analogy. I'd take the progress of hardware, and throw
in the progress of software. Can you BELIEVE the rate of development?
The warez are evolving at a rate roughly proportional to the hardware.

> What I'm saying is that there is a need to consider the progress of
> hardware when evaluating the long-term viability of abstractions in
> our software. Something that is not viable in today's market because
> its poor performance makes it unable to support a profitable
> application may very well be a major bread-winner in twenty years for
> its ability to help programmers produce more "agile" code. Thus it's
> also bad form to discount an idea that failed once before for
> performance reasons without first testing it in combination with new
> hardware and new complementary technologies/techniques.


I don't know about the 20-year span... but I totally agree. And I guess,
if you think about it, there might be tons of good ideas out there that
are 10-20 years old and were just ahead of their time. Hmmm. Ideas
5 years old, for sure, probably.

> Take one look at the gamer market and tell me that
> > people aren't still concerned with shaving .2
> > off of some random shade routine.
> > :-) Thank god. I kind of dig that.
>
> Some of us. :) But I don't work in the gaming market... and I'm glad I
> don't. :)


Heh. Depends on what area. I could settle for one of those heads
who just goes to E3 or whatever and tells everyone how cool everything
is, and, wow, look at this swag! =]

That's a market that programs for stuff that's not even available at
the time, or so I hear. Maybe not programs for it, but it's dependent on
the fact that the technology will keep improving.




Re: VHS and Betamax

2006-04-12 Thread S . Isaac Dealey
> Hey Isaac, ever read "the age of spiritual machines"?
> :D

Just looked at some reviews on Amazon.com

I'll have to pick up a copy of it... Seems somewhat similar to a novel
I wrote which can be found here: http://www.turnkey.to/ike/613.htm

The site's kinda crude, I know. :)
I haven't done anything with it in several years.

I must have been in a bad mood when I wrote that introduction to the
book too, 'cause the last few paragraphs are pretty caustic. :P


s. isaac dealey 434.293.6201
new epoch : isn't it time for a change?

add features without fixtures with
the onTap open source framework

http://www.fusiontap.com
http://coldfusion.sys-con.com/author/4806Dealey.htm




Re: VHS and Betamax

2006-04-12 Thread S . Isaac Dealey
Let's try this again. :)

> I don't know. Doesn't CF take care of query caching? ;-)

It can. It certainly does have a built-in mechanism for caching them.
I'm not personally very fond of the built-in mechanism, so I've
created my own mechanism for caching queries (which isn't tremendously
different, but does provide a little more control).
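
(For anybody following along at home, the built-in mechanism I mean is
the cachedwithin attribute -- something along these lines, with made-up
query and datasource names:)

<!--- built-in query caching: CF holds the result in memory and re-uses
      it for identical queries for up to 30 minutes --->
<cfquery name="qProducts" datasource="myDSN"
         cachedwithin="#createTimeSpan(0,0,30,0)#">
    SELECT  product_id, name, price
    FROM    products
    ORDER BY name
</cfquery>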

> I guess my point is that I can see the point of not
> worrying too much, but I don't buy into believing
> things will change in the future. Arg! Of course things
> will change, but, um. Well... I guess don't go around
> smoking, thinking that, "what the hell, by the time I
> get cancer, we'll have the technology to..."  Or you
> can, hell, I do sometimes, and it could be true, but
> the safest bet is to avoid smoking. :-/

I don't think that's actually analogous to my development process.

I do actually spend quite a bit of time working on the optimization of
my applications, but I stay away from much examination of the more
extreme "low level" ideas. For example, unless I have some very
compelling reason (it could happen, though I don't think it's likely)
I won't avoid using a loop to concatenate some strings if it makes
coding an application easier. Examining my queries and caching
routines (as Dave Watts suggested) for ways to optimize my
applications is however an assumed part of my development process.
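
(A throwaway example of the sort of thing I mean -- not from any real
app, and the query name is made up: the plain loop is the easy version,
and the StringBuffer variant is the low-level version I usually don't
bother with unless the strings get huge.)

<!--- the easy way: plain CF concatenation in a loop (re-copies the
      whole string on every pass, a la Shlemiel the painter) --->
<cfset csv = "">
<cfloop query="qProducts">
    <cfset csv = csv & qProducts.name & ",">
</cfloop>

<!--- the low-level way: lean on java.lang.StringBuffer so each append
      is cheap -- only worth the noise for really big strings --->
<cfset buf = createObject("java", "java.lang.StringBuffer").init()>
<cfloop query="qProducts">
    <cfset buf.append(qProducts.name & ",")>
</cfloop>
<cfset csv = buf.toString()>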

So I'm not exactly running around saying "oh just throw whatever
abstraction you want in there, 'cause the hardware will support it in
20 years" (and even if I were, the Cancer analogy is bad because we
have no real proof of progress toward a cure for cancer, whereas we
have definitive and continual proof of the progress of our hardware).

What I'm saying is that there is a need to consider the progress of
hardware when evaluating the long-term viability of abstractions in
our software. Something that is not viable in today's market because
its poor performance makes it unable to support a profitable
application may very well be a major bread-winner in twenty years for
its ability to help programmers produce more "agile" code. Thus it's
also bad form to discount an idea that failed once before for
performance reasons without first testing it in combination with new
hardware and new complementary technologies/techniques.

> Take one look at the gamer market and tell me that
> people aren't still concerned with shaving .2
> off of some random shade routine.
> :-) Thank god. I kind of dig that.

Some of us. :) But I don't work in the gaming market... and I'm glad I
don't. :)

> Hey Isaac, ever read "the age of spiritual machines"?
> :D

Nope, hadn't heard of it until just now. :)

s. isaac dealey 434.293.6201
new epoch : isn't it time for a change?

add features without fixtures with
the onTap open source framework

http://www.fusiontap.com
http://coldfusion.sys-con.com/author/4806Dealey.htm




Re: VHS and Betamax

2006-04-12 Thread S . Isaac Dealey
>> Socially I think we allow computers to hand-cuff us.

> Socially? As in people don't get out enough and meet in
> meat-space? Or our interactions, even on-line?

Well I was thinking more about social infrastructure. Things like bank
personnel being able to help you in any way if their computers go out.
I understand why it's become that way, but it'd be nice to think that
there would be some kind of backup system, like they could make a
phone call to another central authority for the bank who could verify
your available funds, etc. Or that the personnel at your own bank
might be able to manually give you a withdrawal -- right now all they
can do on paper is take a deposit. I think that generally speaking
once the computers go down they don't even have access to the cash
drawer at most banks. I could be wrong about that, but that's my
feeling.

This is just one example -- although the majority of examples stem
from poor user interfaces rather than from over-dependence on machines
in general. There's one example of a poor user interface in The
Inmates Are Running the Asylum: a system for displaying movies to
different passengers on an intercontinental flight. The system
required (hand-cuffed) the stewardesses to collect the money and enter
it into the system before the passenger could be given their movie,
which the average programmer thinks is perfectly logical (and I don't
blame us). The problem is that that assumption is made with a lack of
understanding of the environment the stewardesses work in - an
environment in which it's much easier just to give the passenger the
movie and then collect the cash later. The interface was so
problematic for them that the stewardesses would regularly sabotage
the system so that nobody could get any of the movies (and hence the
airline lost a lot of money), because being handcuffed by the interface
in that way was such a hassle for them.

>> Well, maybe if the optimization is as swell as it's said
>> > to be.  I can't help but feel that even as smart as
>> > computers are, there are areas that a human could see a
>> > "pattern" before the computer could. Or whatever.
>>
>> Is this a response to my comment about why I'm not
>> bothered by the fact that the ColdFusion server generates
>> java code? (another good example of which is that the
>> server used to generate C++ code (at least I thought I
>> remembered somebody saying such), and my knowledge
>> of C++ wasn't helpful when I worked with ColdFusion
>> then either)

> More along the lines of "over helpful" generation.  The
> old mac OS always kind of bugged me.  It was SO arcane
> to do underlying stuff, ya know? I guess that was cool
> too.  But you ever feel so abstracted that you are no
> longer in control? I guess that's bad design or
> interface or something more than abstraction...

Ahh... Rarely. :) But that's another limitation of generated code.
It's one thing to have an entire language that is an abstraction such
as CFML, but then, the amount of time that goes into that code
generator is astronomical in comparison to the amount of time that
goes into developing the code generators that we would use. Of course,
our code generators are for more specific purposes, that's true, but I
still shy away from it on the thinking that it's liable to lead to
handcuffing me in some way, re: what you just described about the old
mac OS or the couple of examples above. Although there have been a
couple of occasions on which I felt there was too much abstraction in
something, more frequently I find myself being limited and/or required
to do more work as a result of a lack of abstraction.

> The computer still doesn't know the goal (yet), so it
> has to consider all options, picking what it thinks you
> want. Your comment about hoping the Macromedia engineers
> thought about this stuff... some dude somewhere put some
> logic in there, there is no law of nature stating that it
> doesn't matter once you're at a higher level. Man, that made
> sense.

Yeah, to date the people responsible for ColdFusion seem to have been
pretty effective at identifying good abstractions (for the majority of
their market) and making them work well. Not that I haven't had the
occasional complaint. :) But I really have to give them credit for
making my life easier. :)

> Maybe you're right, and it's a moot point, but I think
> understanding something to its core is worthy. Actually
> considering the difference of running "the same code" on
> a 64 bit or a 32 bit.  It's nice to know it should "just
> work", but I really like that intuitive guess type stuff
> that happens when you start understanding the nature of
> something.

> "How did you know to look there to fix that?"
> "I dunno, it just made sense."

Oh there's certainly value in understanding some of the lower theory,
but an individual programmer can only learn so much. Not that an
individual programmer can't learn whatever they want to learn, but you
know, there are only so many hours in a day. :)

Re: VHS and Betamax

2006-04-12 Thread Denny Valliant
On 4/11/06, Dave Francis <[EMAIL PROTECTED]> wrote:

> Isaac, I wish it were otherwise, but with multi-core, multi-threaded
> processors/processes, I'm not sure that people CAN optimize software any
> longer. At least, at the code level. It just isn't cost-effective to spend
> days trying to tweak a block of code down to 200 cycles from 220 cycles.
> (unless you're John Carmack!)
>
> I can remember the days when it did matter (I wrote Assembler on a 360/40,
> circa 1967), but I just don't believe that that's the case any longer.


You get 30k people all using that app at the same time and what? :-)

I guess the whole low level code thing is too focused.  Like, I guess what I
was talking about was more like the single query with multiple joins.

(That link to Shlemiel the painter was so funny. Maybe it was
just my state of mind, but when I got to the part about the 300 meter
brush, I just lost it (the link to the "actual numbers" or whatnot))

I don't know. Doesn't CF take care of query caching? ;-)

I guess my point is that I can see the point of not worrying too much,
but I don't buy into believing things will change in the future. Arg! Of
course things will change, but, um. Well... I guess don't go around
smoking, thinking that, "what the hell, by the time I get cancer, we'll
have the technology to..."  Or you can, hell, I do sometimes, and it
could be true, but the safest bet is to avoid smoking. :-/

Not really a point as such, I guess, but take it from my dad and
muscle cars... there will always be optimization to be done.
Whether it really makes things faster is beside the point. =P

The link to the thing about strings was more conceptual than not.

Take one look at the gamer market and tell me that people aren't
still concerned with shaving .2 off of some random shade routine.
:-) Thank god. I kind of dig that.

Hey Isaac, ever read "the age of spiritual machines"?
:D





Re: VHS and Betamax

2006-04-11 Thread Denny Valliant
> Socially I think we allow computers to hand-cuff us.


Socially? As in people don't get out enough and meet in
meat-space? Or our interactions, even on-line?

> Well, maybe if the optimization is as swell as it's said
> > to be.  I can't help but feel that even as smart as
> > computers are, there are areas that a human could see a
> > "pattern" before the computer could. Or whatever.
>
> Is this a response to my comment about why I'm not bothered by the
> fact that the ColdFusion server generates java code? (another good
> example of which is that the server used to generate C++ code (at
> least I thought I remembered somebody saying such), and my knowledge
> of C++ wasn't helpful when I worked with ColdFusion then either)


More along the lines of "over helpful" generation.  The old mac OS always
kind of bugged me.  It was SO arcane to do underlying stuff, ya know? I
guess that was cool too.  But you ever feel so abstracted that you are no
longer in control? I guess that's bad design or interface or something more
than abstraction...

The computer still doesn't know the goal (yet), so it has to consider all
options, picking what it thinks you want. Your comment about hoping
the Macromedia engineers thought about this stuff... some dude somewhere
put some logic in there, there is no law of nature stating that it
doesn't matter once you're at a higher level. Man, that made sense.

Maybe you're right, and it's a moot point, but I think understanding
something to its core is worthy. Actually considering the difference of
running "the same code" on a 64 bit or a 32 bit.  It's nice to know it
should "just work", but I really like that intuitive guess type stuff that
happens when you start understanding the nature of something.

"How did you know to look there to fix that?"
"I dunno, it just made sense."

Sometimes stuff doesn't work, even tho we're told at the high level
it should.  Then what. :-P  Ya gotta dig in. If you dig that kind of
stuff. I guess you could also just say, "hey person whose thing my
thing doesn't work with, why aren't you 'standard'?". Or wait until
the person whose job it is to do that part figures it out.


> > Guess the argument about optimization has some validity,
> > yet I can't help see history repeat itself. Every few
> > years there's this idea that it doesn't matter, we're
> > getting bigger, faster processors, more RAM, etc.. Yet
> > the real idea is to conserve energy. Sorta. I guess make
> > less go further.  That's never going to change, no
> > matter how much power there is. It's the nature of
> > power - corruption and responsibility aside.
>
> No not entirely. The issue is that we're still in transition. The
> hardware progress is not as fast as many of us would like and
> sometimes we jump the gun with regard to wanting to be able to have
> the Star Trek computer that we just tell what to do and it does it. So
> if I build an application today and I fail to optimize it , then my
> application is going to be slow in comparison to another application
> which accomplishes the same task. (Incidentally I spend quite a bit of
> my programming time thinking about the optimization of my software --
> I may not always get it right, but I do have a reasonable handle on
> the concepts.)
>
> Skip forward 20 years.


I guess our ideas of optimization are different.

When I think of optimization, it's not necessarily "speed".  There are
so many areas to optimize, many of which have nothing to do with
processors or memory.  And much optimization is useful later on.

I would think.  At least it seems kind of evolving, or whatever.


> All that being said of course, anyone can screw up a good thing and
> it's not very difficult to accomplish. There are lots of times that


Ha!! That kills me. Listen to this:
I had the bright idea that instead of having tables with different data
types, I'd have tables all of one data type, and use a key and another
table to keep track of what was where or whatever.

Long story short, it's death by a thousand queries.  I had to make
some cache tables in the end, just to keep it all together. Bleh.
I'd had some SQL-generating stuff already tho, so the cache wasn't
too hard to wrangle. And now everything is a lot faster, so long as
I can get my cache-keeper-up-to-dater working optimally.
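
To give you an idea (made-up table and column names, and way simpler
than the real mess), just pulling one "record" back together meant a
join per attribute -- or a query per attribute on the lazy days:

<!--- hypothetical reassembly of one "product" out of single-type tables --->
<cfquery name="qProduct" datasource="myDSN">
    SELECT  e.entity_id,
            sName.string_value  AS product_name,
            nPrice.number_value AS product_price
    FROM    entities e
            INNER JOIN string_values sName
                    ON sName.entity_id = e.entity_id
                   AND sName.attribute_key = 'name'
            INNER JOIN number_values nPrice
                    ON nPrice.entity_id = e.entity_id
                   AND nPrice.attribute_key = 'price'
    WHERE   e.entity_type = 'product'
</cfquery>

Multiply that by every attribute on every record on a page and you can
see where the thousand queries came from. The cache tables basically
flatten it all back out to one row per thing.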

> Sorta saying it's all data, but some data is much
> > easier to parse than other data is. By "much" I
> > mean astronomically.
>
> Oh. Okay... Yes, admittedly. :)
> Hence much of the reason behind XML.


Indeed. I thought that was all the reason. ;-)

An object on disk, an object in mem... um. let me try to think
of how to express what I'm thinking.



RE: VHS and Betamax

2006-04-11 Thread S . Isaac Dealey
> -Original Message-
> From: S. Isaac Dealey [mailto:[EMAIL PROTECTED]
> Sent: Tuesday, April 11, 2006 1:11 PM
> To: CF-Talk
> Subject: Re: VHS and Betamax

>>No not entirely. The issue is that we're still in
>>transition. The hardware progress is not as fast
>>as many of us would like and sometimes we jump the
>>gun with regard to wanting to be able to have the
>>Star Trek computer that we just tell what to do and
>>it does it. So if I build an application today and
>>I fail to optimize it, then my application is going
>>to be slow in comparison to another application
>>which accomplishes the same task. (Incidentally I
>>spend quite a bit of my programming time thinking
>>about the optimization of my software -- I may not
>>always get it right, but I do have a reasonable
>>handle on the concepts.)

> With genuine respect:

Thanks, although I always assume respect is given. :)

Having said that I realize that's an opportunity for someone to poke
at me for being disrespectful. :)

> I'm not sure that being in transition is relevant
> - we're always going to be. More horsepower
> generally seems to engender more complex
> applications rather than faster ones.

While that's true, feature complexity doesn't inherently solve human
need. Particularly with software development (as compared to physical
product production) there's a sort of illusion of software being
cheaper than it is to make because the business people involved get
sort of swept away by the lack of material costs. They don't see the
development lifecycle and the planning costs the way we do down here
in the trenches. That same sort of lack of understanding at the top
frequently engenders a lack of research into the usefulness of the end
product. I.e. if it costs "nothing" to make, then why not just make
it? :) The end result is often a lot of guess-work rather than
spending the time and money up-front to research what people need.
Thus we end up with all those thousands of nifty little features in MS
Office products that less than 1% of the product's target market
actually use. :P

Now couple that with the evolution of hardware. As the hardware
becomes more efficient in today's market, software manufacturers
continue to add features because the efficiency of the hardware gives
them more room to add them. This is true, and I don't debate that it's
still important to add features, since I have plenty of examples of
applications currently that even at their most efficient are still
performing more slowly than we'd like, or that still lack certain
features because those features are simply too inefficient to be
practical for daily use with today's hardware.

Set up ColdFusion developer edition at home with a copy of any
anti-virus application with the auto-file scan enabled. Watch how long
it takes the CF Server service to start up. :) Webservices as another
example are only viable now because the hardware will support them --
we'd have never even tried webservices if we still had to serve
everything on the 14.4k modems we had in the early 90's.

So what I'm getting at here is that although we can't see it
currently, I believe (and I certainly could be wrong) that the demand
for features will eventually taper off (although I'm certain it will
never be completely abolished) as the hardware becomes more efficient
in the same way that the demand for efficiency in our existing feature
sets is tapering off.

There is only a certain amount of complexity that will be useful to
the average person (avoiding the word "user" here), beyond which any
given application begins to delve into a niche market with a much
smaller user base, but who as a result of being in that niche market
will then also want a certain specific sub-set of features which will
vary from the sub-set of features desired by another niche using a
similar application. Sure we could just cram both niches into a single
application, but it's better for the person using it if they can get
something that's not so cavalier about the needs of their niche.
There's a good example of this in The Inmates Are Running the Asylum
by Alan Cooper where he describes the car that's designed for everyone
-- a convertible mini-van with a big bed for hauling lumber. :)

I submit to you as a real-world example of that sort of tapering of
feature complexity, the common pocket-calculator. :) I know of three
essential versions of this thing currently, the basic calculator
(performs arithmetic) for people doing their grocery shopping, the
scientific calculator (includes a few extra features like the log
button) for people in mathematically complex professions such as
architects, and the programmable compu-calculator (for mathematicians,
trig and calculus students). Note that ea

RE: VHS and Betamax

2006-04-11 Thread Dave Francis
-Original Message-
From: S. Isaac Dealey [mailto:[EMAIL PROTECTED]
Sent: Tuesday, April 11, 2006 1:11 PM
To: CF-Talk
Subject: Re: VHS and Betamax


>No not entirely. The issue is that we're still in transition. The
>hardware progress is not as fast as many of us would like and
>sometimes we jump the gun with regard to wanting to be able to have
>the Star Trek computer that we just tell what to do and it does it. So
>if I build an application today and I fail to optimize it, then my
>application is going to be slow in comparison to another application
>which accomplishes the same task. (Incidentally I spend quite a bit of
>my programming time thinking about the optimization of my software --
>I may not always get it right, but I do have a reasonable handle on
>the concepts.)

With genuine respect:

I'm not sure that being in transition is relevant - we're always going to
be. More horsepower generally seems to engender more complex applications
rather than faster ones.

Isaac, I wish it were otherwise, but with multi-core, multi-threaded
processors/processes, I'm not sure that people CAN optimize software any
longer. At least, at the code level. It just isn't cost-effective to spend
days trying to tweak a block of code down to 200 cycles from 220 cycles.
(unless you're John Carmack!)

I can remember the days when it did matter (I wrote Assembler on a 360/40,
circa 1967), but I just don't believe that that's the case any longer.








Re: VHS and Betamax

2006-04-11 Thread S . Isaac Dealey
>> The comment I forgot to make in that post was that really
>> I think the trick is to find a handful of technologies you
>> like that are on the up-swing (since they all rise and
>> fall) and stick with them as long as you can. This should
>> make it easier to add other complementary skills as the
>> demand for them increases and mitigate the risks involved
>> in devaluation of any individual skill as a result of
>> increased supply or waning demand. As an individual I'm
>> personally probably more invested in ColdFusion than
>> anything else, potentially over-invested actually.

> Ok. But your heavy involvement in CF perpetuates it as
> well.

Yes. :)

>> I guess we shouldn't be too reliant on computers,
>> > neh? (-:
>>
>> My opinions on that subject tend to be pretty unpopular.
>> :)

> Swinging  to what side? More so or less?

Socially I think we allow computers to hand-cuff us.

>> I still put forth that generated code is generated code,
>> > why shy away from generated code? So long as it's well
>> > formatted (don't look at me, you saw my regex ;) you
>> > should be ok, I reckon.
>>
>> It's a question of who's generating it and why. :)
>>
>> To me the fact that my coldfusion templates or CFC's are
>> generated Java is transparent. I know it's there, but I
>> don't have to care too much about what's being generated,
>> beyond knowing that it is generated and having some
>> understanding of the problems that can be caused by that.

> Well, maybe if the optimization is as swell as it's said
> to be.  I can't help but feel that even as smart as
> computers are, there are areas that a human could see a
> "pattern" before the computer could. Or whatever.

Is this a response to my comment about why I'm not bothered by the
fact that the ColdFusion server generates java code? (another good
example of which is that the server used to generate C++ code (at
least I thought I remembered somebody saying such), and my knowledge
of C++ wasn't helpful when I worked with ColdFusion then either)

> Guess the argument about optimization has some validity,
> yet I can't help see history repeat itself. Every few
> years there's this idea that it doesn't matter, we're
> getting bigger, faster processors, more RAM, etc.. Yet
> the real idea is to conserve energy. Sorta. I guess make
> less go further.  That's never going to change, no
> matter how much power there is. It's the nature of
> power - corruption and responsibility aside.

No not entirely. The issue is that we're still in transition. The
hardware progress is not as fast as many of us would like and
sometimes we jump the gun with regard to wanting to be able to have
the Star Trek computer that we just tell what to do and it does it. So
if I build an application today and I fail to optimize it, then my
application is going to be slow in comparison to another application
which accomplishes the same task. (Incidentally I spend quite a bit of
my programming time thinking about the optimization of my software --
I may not always get it right, but I do have a reasonable handle on
the concepts.)

Skip forward 20 years.

Twenty years from now if you load up the same two applications, you
won't be able to tell the difference between them. Yes, one of them is
still inefficient / slow, but to the human person using them, there is
no tangible difference, because advances in hardware cause the slow
application to perform as quickly as the efficient one. So both
applications have the same value (including monetary value) in the
market.

When we optimize software, we're not doing that for the future, we're
doing that to compete in today's market, and because "today's market"
is always becoming "tomorrow's market" that means we're always
shooting at a moving target, so there becomes this balancing act
between how much time we spend optimizing an application and making it
blazingly fast today, and how much time we carve away from the
optimization game in favor of tasks that will be more important in the
market in years to come, such as usability, extensibility and new
features.

Extensibility in particular is one of those "future value" prospects.
Pretty much without fail, something which makes your application
extensible will cost you some efficiency. In some cases it may be
thoughtful application of XML, which as Joel points out in the
article you posted is always going to be slow compared to a database
(in today's market, and for a good while yet - although if you store
the xml in a file and/or in memory you can get some of that back by
not needing a network trip through the database port), other times it
may be the division of logical functionality into separate objects
which then have to be instantiated. Don't get me wrong, I love
objects, but object instantiation is always slower than using
something that exists already in memory. Thus each time you find an
object doing too much and you separate it into two or more objects to
handle different tasks, you're increasing the l

Re: VHS and Betamax

2006-04-11 Thread Denny Valliant
> The comment I forgot to make in that post was that really I think the
> trick is to find a handful of technologies you like that are on the
> up-swing (since they all rise and fall) and stick with them as long as
> you can. This should make it easier to add other complementary skills
> as the demand for them increases and mitigate the risks involved in
> devaluation of any individual skill as a result of increased supply or
> waning demand. As an individual I'm personally probably more invested
> in ColdFusion than anything else, potentially over-invested actually.


Ok. But your heavy involvement in CF perpetuates it as well.

> I guess we shouldn't be too reliant on computers,
> > neh? (-:
>
> My opinions on that subject tend to be pretty unpopular. :)


Swinging  to what side? More so or less?

> I still put forth that generated code is generated code,
> > why shy away from generated code? So long as it's well
> > formatted (don't look at me, you saw my regex ;) you
> > should be ok, I reckon.
>
> It's a question of who's generating it and why. :)
>
> To me the fact that my coldfusion templates or CFC's are generated
> Java is transparent. I know it's there, but I don't have to care too
> much about what's being generated, beyond knowing that it is generated
> and having some understanding of the problems that can be caused by
> that.


Well, maybe if the optimization is as swell as it's said to be.  I can't
help but feel that even as smart as computers are, there are areas
that a human could see a "pattern" before the computer could. Or
whatever. Guess the argument about optimization has some validity,
yet I can't help but see history repeat itself. Every few years there's this
idea that it doesn't matter, we're getting bigger, faster processors,
more RAM, etc. Yet the real idea is to conserve energy. Sorta.
I guess make less go further.  That's never going to change, no matter
how much power there is. It's the nature of power - corruption and
responsibility aside.

You may have seen this, but I liked it a lot:
http://www.joelonsoftware.com/articles/fog000319.html

> well be. Thus a sweeping change would need to occur in each bean or
> require a change to the generator and a re-build of its generated
> code. I personally find it easier to simply create objects which are
> flexible enough to not require generation and use composition or
> inheritance to allow me to make sweeping changes. The sweeping change
> then is a line or two of code, instead of a larger modification to the
> generator and a rebuild.


So what, you use the root Java "Object"? ;-) Seriously, you have to store
the information somewhere - I don't know of any ESP-generated code.

I get what you're saying tho (as much as I'm able. Concepts!=use).
I really dig the ideas encapsulated in "The Pragmatic Programmer".

Good stuff, doesn't matter what language. All old-hat for most I'm sure,
but I'd sorta been through the "bad" that knowledge helps avoid, and
I liked the analogies/stories.  Nice having it all in that format.

> But that's coming from someone who, using line breaks and
> > MS Word, smashed several hundred pages of mish-mash into
> > a C^HSV. They weren't commas, so I deleted the C, see?
> > :P  It would be a dream to get a bunch of programmatically
> > generated documents compared to that.  Programs you can
> > reverse engineer, whatnot. People are so random, sorta.
> > [...]
>
> Ya lost me. :)


Sorta saying it's all data, but some data is much easier to parse
than other data is. By "much" I mean astronomically.

> Or it is, but that whole butterfly in Tibet or
> > whatever, ya know? The right thing at the right
> > time, and bang, you're father of some type of
> > legacy. And conversely, some other legacy never
> > occurs.
>
> I'm familiar with the concept of the "butterfly effect", although I'm
> not certain what connection you were trying to make.


I guess...  that there is an element of randomness in evolution.
Yet at the same time I'm saying we have a stake in our destiny,
so... bleh. doesn't make sense 'cept generally. if then. :)

> In practice the cf-community list mostly talks about politics.


This is more fun. If it bugs anyone, I'll try to keep it more CF
centric.
:D




Re: VHS and Betamax

2006-04-10 Thread S . Isaac Dealey
>> Although I certainly don't base my tech decisions
>> solely on market influences I do think it's
>> important to consider them. A person who knows
>> some XML is certainly more valuable in today's
>> programming market than they would have been 10
>> years ago (how old is XML anyway?). In another
>> 10 years that skill may have continued to become
>> more valuable or it may be less valuable due to
>> increased supply of programmers who are
>> proficient with XML.

> Hmmm. I see your point.  Take it a bit further and
> add in the whole "global economy" type deal- and
> what's easier to transport than the Internet? once
> it's connected, so to speak... computers are pretty
> close to the internet. We're pretty close to
> computers... hmmm...

The comment I forgot to make in that post was that really I think the
trick is to find a handful of technologies you like that are on the
up-swing (since they all rise and fall) and stick with them as long as
you can. This should make it easier to add other complementary skills
as the demand for them increases and mitigate the risks involved in
devaluation of any individual skill as a result of increased supply or
waning demand. As an individual I'm personally probably more invested
in ColdFusion than anything else, potentially over-invested actually.

> I guess we shouldn't be too reliant on computers,
> neh? (-:

My opinions on that subject tend to be pretty unpopular. :)

>> There will always be work to be done, just like stuff
>> > will always be built on other stuff. The real meat and
>> > potatoes are in the "sum is more than the parts" type
>> > deals.  I don't think language or popularity have much
>> > to do with it, sorta.
>>
>> I tend to agree. I think ColdFusion as a technology does
>> a good job of encouraging synergies.

> No doubt. It's like java, but even more fun. :-)

> I still put forth that generated code is generated code,
> why shy away from generated code? So long as it's well
> formatted (don't look at me, you saw my regex ;) you
> should be ok, I reckon.

It's a question of who's generating it and why. :)

To me the fact that my coldfusion templates or CFC's are generated
Java is transparent. I know it's there, but I don't have to care too
much about what's being generated, beyond knowing that it is generated
and having some understanding of the problems that can be caused by
that. The ColdFusion server actually doesn't allow me to look at or
modify the generated code myself -- I can't reverse engineer those
generated files or make changes to them except by proxy in my CFML
code. So the end result is that although I'm using generated code, I'm
not working with generated code in any way.

This is significantly different from the form of bean generators that
have become popular in today's "ColdFusion is java-lite" community (or
as I understand it in the Java community as well). In this community,
you definitely do care how the code is generated, and you're likely to
make modifications to the generator if not to the generated code. The
generator then produces a slew of beans (I don't like these in general
anyway, preferring value objects), which are all the same or may as
well be. Thus a sweeping change would need to occur in each bean or
require a change to the generator and a re-build of its generated
code. I personally find it easier to simply create objects which are
flexible enough to not require generation and use composition or
inheritance to allow me to make sweeping changes. The sweeping change
then is a line or two of code, instead of a larger modification to the
generator and a rebuild.
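
A bare-bones version of what I mean (hypothetical, and obviously way
simpler than anything real):

<!--- base.cfc : one generic value object instead of a pile of generated
      beans; a sweeping change happens here, once --->
<cfcomponent>
    <cfset variables.instance = structNew()>

    <cffunction name="setValue" returntype="void" output="false">
        <cfargument name="field" type="string" required="true">
        <cfargument name="value" type="any" required="true">
        <cfset variables.instance[arguments.field] = arguments.value>
    </cffunction>

    <cffunction name="getValue" returntype="any" output="false">
        <cfargument name="field" type="string" required="true">
        <cfreturn variables.instance[arguments.field]>
    </cffunction>
</cfcomponent>

<!--- product.cfc : nothing generated, it just inherits the behavior --->
<cfcomponent extends="base">
    <!--- only product-specific logic goes here --->
</cfcomponent>

Then if I need to add, say, validation, it's a line or two in base.cfc
rather than a change to a generator and a rebuild of everything it spat
out.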

There are cases in which I've built code "generators" although not of
the sort that are popular of late. I generally view it as a last
resort. One example is the skinning aspects of the Blogs onTap
application. A user provides a layout for their blog in the form of an
XHTML template (a default is provided), which is then converted into a
collection of about 4 CFML templates. The conversion uses XSL to strip
undesirable content (primarily CFML and JavaScript) and prepends a
cfimport tag to the beginning of the document, which allows
the templates to use several pre-defined custom tags for displaying
different elements of the blog's layout such as the calendar, author
image and various links. This allows the blog owner to have some
limited control over dynamic elements without giving them free rein
to use CFML code. This code generation is more specific/controlled and
IMO less potentially frustrating than bean generation.
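
In rough pseudo-form, the skinning step looks something like this (the
file names, paths and the XSL are stand-ins, not the actual Blogs onTap
code):

<!--- read the blog owner's XHTML layout plus an XSL sheet that strips
      cfml/javascript and keeps only the approved markup --->
<cfset skinDir = expandPath("./skins/demo/")>
<cffile action="read" file="#skinDir#layout.xhtml" variable="xhtmlSrc">
<cffile action="read" file="#skinDir#stripUnsafe.xsl" variable="xslSrc">

<!--- run the transform, then prepend the cfimport so the generated
      template can use the pre-defined custom tags (calendar, etc.) --->
<cfset cleaned = xmlTransform(xhtmlSrc, xslSrc)>
<cfset generated = '<cfimport taglib="/blogsontap/tags" prefix="blog">'
                   & chr(13) & chr(10) & cleaned>

<!--- write out one of the handful of generated CFML templates --->
<cffile action="write" file="#skinDir#display.cfm" output="#generated#">

The person skinning the blog never touches the generated files directly,
which is the "specific/controlled" part.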

> But that's coming from someone who, using line breaks and
> MS Word, smashed several hundred pages of mish-mash into
> a C^HSV. They weren't commas, so I deleted the C, see?
> :P  It would be a dream to get a bunch of programmatically
> generated documents compared to that.  Programs you can
> reverse engineer, whatnot. People are so random, sorta.
> [...]

Ya lost me. :)

>> > Note my use of "security" as "work" - as I believe
>> > "