Here is the description of this problem:

Professor Diogenes has n supposedly identical VLSI chips that in principle are capable of testing each other. The professor's test jig accommodates two chips at a time. When the jig is loaded, each chip tests the other and reports whether it is good or bad. A good chip always reports accurately whether the other chip is good or bad, but the answer of a bad chip cannot be trusted.
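Assuming this is the standard version of the problem (strictly more than half the chips are good, a good chip reports truthfully, and a bad chip may answer anything), here is a sketch in Python of the classic pairing reduction. The jig simulator `make_test` and the bad-chip behavior in it (always answering "good") are assumptions for demonstration; the algorithm itself works for any bad-chip behavior.

```python
def make_test(chips):
    """chips[i] is True if chip i is good. Returns a jig function:
    test(i, j) -> (i's report about j, j's report about i).
    A good chip reports truthfully; a bad chip's answer is arbitrary
    (here, as one assumed behavior, a bad chip always answers "good")."""
    def test(i, j):
        return (chips[j] if chips[i] else True,
                chips[i] if chips[j] else True)
    return test

def find_good_chip(indices, test):
    """Return the index of one good chip, assuming strictly more than
    half the chips in `indices` are good. Uses O(n) tests in total."""
    while len(indices) > 1:
        if len(indices) % 2 == 1:
            # Odd count: test one candidate against everyone else.
            # Good chips are a strict majority, so at least half
            # "good" votes means the candidate really is good.
            cand, rest = indices[0], indices[1:]
            votes_good = sum(test(j, cand)[0] for j in rest)
            if votes_good >= len(rest) // 2:
                return cand
            indices = rest          # candidate is bad: discard it
            continue
        kept = []
        for a, b in zip(indices[0::2], indices[1::2]):
            ra, rb = test(a, b)
            if ra and rb:           # both said "good": keep one chip
                kept.append(a)
        indices = kept              # good chips are still a strict majority
    return indices[0]

chips = [True, False, True, True, False, True, False]   # 4 good, 3 bad
jig = make_test(chips)
print("index of a good chip:", find_good_chip(list(range(len(chips))), jig))
```

The invariant is that a pair where one chip is good and the other bad always gets discarded (the good chip truthfully reports "bad"), so among the kept representatives good chips still outnumber bad ones; each round at least halves the count, giving O(n) tests overall.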
Big O notation only gives the proportionality of the running time for a particular algorithm. For an approximation you can assume some hypothetical machine and calculate the time taken using the costs of loads, stores, etc. You can also simply measure the total execution time on the actual platform.
Well, there is no easy answer. You'll have to fix a computing model (the RAM model, etc.) and then count the number of operations. Memory hierarchy and parallelism make it nearly impossible to come up with an exact number for random data on a modern computer, but you could see it as the total cost of loads, stores, and other operations.
I know how the complexity of an algorithm is calculated, but how does this relate to the time it takes? Let's say I have 25000 random numbers I'd like to sort with selection sort. Now how could I use Big O notation to calculate the time taken to sort these numbers? I mean, I understand it's O(n²), but not how that translates into actual time.
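As the replies above say, Big O alone can't give you seconds, because the hidden constant depends on your machine and language. One practical sketch in Python (the calibration size of 2000 is an assumption; any small n works) is to measure that constant empirically and extrapolate by the n² ratio:

```python
import random
import time

def selection_sort(a):
    # Classic O(n^2) selection sort: repeatedly swap the minimum of the
    # unsorted suffix into the front of that suffix.
    for i in range(len(a)):
        m = min(range(i, len(a)), key=a.__getitem__)
        a[i], a[m] = a[m], a[i]

# Big O says time is roughly c * n^2, where c depends on your machine,
# language, and compiler -- not on n. Measure c at a small size, then
# extrapolate to the size you care about.
n_small = 2000                      # calibration size (an assumption)
data = [random.random() for _ in range(n_small)]
t0 = time.perf_counter()
selection_sort(data)
c = (time.perf_counter() - t0) / n_small ** 2

n_big = 25000
print(f"estimated time to selection-sort {n_big} numbers: "
      f"{c * n_big ** 2:.2f} s on this machine")
```

This only predicts the order of magnitude: caching and interpreter overhead make the constant drift a bit with n, which is exactly the "no easy answer" point made above.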
Case 1
Start:
0 1 1
1 1 1
1 1 1
After first scan:
0 1 0
1 1 1
0 1 1
After second scan:
0 1 0
0 1 1
0 1 1
After third scan:
0 0 0
0 1 1
0 1 1
Case 2
Start:
1 1 0
1 1 1
0 1 1
After first scan:
0 1 0
1 1 1
0 1 0
After second scan:
0 0 0
0 1 1
0 0 0
After third scan:
0 0 0
0 1 0
0 0 0
Hope this helps.
Hi Dave,
Can you explain your algo for these 2 cases...
0 1 1    1 1 0
1 1 1    1 1 1
1 1 1    0 1 1
Please explain it to me in steps, because we tried the same problem and can't get it done without using extra space. (We used 1 extra space, if the