I'm trying to wrap my brain around this idea.  In theory it should be
faster, but the lookup table would be such a huge array, and it would
add another conditional, that once the mechanics are built it may
actually be slower.  But my idea is this:

If I compare record 20 to record 1030, find them to be comparable, and
calculate the distance between them, why should I re-compare record 1030
to record 20 when its turn comes in the loop?  That comparison has
already been made.  So if I stored the distance in a huge 38k x 38k array
(or vector), and the first thing I check is whether a distance has already
been recorded in that array, then I already know the pair is comparable
and don't need to run any more conditionals or recalculate the distance.
That logic cuts the number of trips through the conditionals and
calculations in half, at the expense of a very large array and one extra
conditional on at least half of the comparisons.
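A minimal sketch of the idea (in TypeScript rather than ActionScript, and assuming the records are simple 2D points with a Euclidean distance; the record type, comparability test, and distance formula are all illustrative placeholders, not your actual code):

```typescript
// Hypothetical sketch of the memoized-distance idea; names are illustrative.
type Record2D = { x: number; y: number };

const NOT_COMPUTED = -1; // sentinel meaning "no distance stored yet"

function pairwiseDistances(records: Record2D[]): Float64Array {
  const n = records.length;
  // Flattened n x n cache. For 38k records this is ~1.44 billion cells,
  // roughly 11.5 GB as 64-bit floats (about half that as 32-bit), which
  // is exactly the memory cost in question.
  const cache = new Float64Array(n * n).fill(NOT_COMPUTED);

  for (let i = 0; i < n; i++) {
    for (let j = 0; j < n; j++) {
      if (i === j) continue;
      // The extra conditional: if the mirrored pair (j, i) was already
      // computed, reuse it and skip the comparability test and the math.
      if (cache[j * n + i] !== NOT_COMPUTED) {
        cache[i * n + j] = cache[j * n + i];
        continue;
      }
      // Placeholder for the real comparability check and distance calc.
      const dx = records[i].x - records[j].x;
      const dy = records[i].y - records[j].y;
      cache[i * n + j] = Math.sqrt(dx * dx + dy * dy);
    }
  }
  return cache;
}
```

One thing worth noting: if the loop visits every (i, j) pair in order anyway, starting the inner loop at j = i + 1 gives the same halving with no giant array and no extra branch; the cache only earns its keep when pairs arrive in an unpredictable order and you can't restructure the loop.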

Do you think it is worth trying this scenario? 



--
View this message in context: 
http://apache-flex-users.2333346.n4.nabble.com/Workers-and-Speed-tp13098p13202.html
Sent from the Apache Flex Users mailing list archive at Nabble.com.
