@Pralaybi: You've got it right. Since h is proportional to the log of the
smaller number, we can also say that the complexity is O(log^2(smaller
number)).
Dave
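(One way to read the O(log^2) bound, assuming h here is the number of digits
of the smaller number: Euclid's algorithm does O(h) iterations, and each %
on numbers of at most h digits costs O(h) digit operations with schoolbook
division, so the total cost is O(h^2) = O(log^2(smaller number)) digit
operations.)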
On Friday, November 23, 2012 2:09:01 PM UTC-6, pralaybi...@gmail.com wrote:
@Dave: Could you please correct me if I am wrong here.
1) So we are looking at the worst case, and that happens when m and n
are consecutive Fibonacci numbers, which are mutually prime to each other.
2) It takes 5 iterations to reduce the number of digits in the smaller of
m and n by one. Assuming the …
@Bharat: See, e.g.,
http://en.wikipedia.org/wiki/Euclidean_algorithm#Algorithmic_efficiency.
Dave
On Sunday, November 18, 2012 12:03:33 AM UTC-6, bharat wrote:
@Dave: Can you please explain the solution you gave?
How can you say the Fibonacci sequence produces the worst case?
On Fri, Nov 16, 2012 at 11:15 AM, Shruti Gupta wrote:
Don, thanks for the solution. Can you please explain how a+b gets reduced by
at least 25%?
@Sonesh: Not so. The worst case for Euclid's algorithm is when the numbers
are consecutive Fibonacci numbers. So (89,55) --> (55,34) --> (34,21) -->
(21,13) --> (13,8) --> (8,5): it took 5 steps to reduce the number of
digits of the first number from 2 to 1. Asymptotically, the number of
digits of the smaller number decreases by one about every five steps.
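As an illustration of the Fibonacci worst case (my own sketch, not from the
thread; gcd_steps is a hypothetical helper):

#include <stdio.h>

/* Count Euclid iterations; consecutive Fibonacci inputs maximize this. */
static int gcd_steps(int a, int b)
{
    int steps = 0;
    while (b != 0) {
        int t = a % b;
        a = b;
        b = t;
        steps++;
    }
    return steps;
}

int main(void)
{
    /* (89,55) -> (55,34) -> ... -> (8,5) -> ... -> (1,0): 9 steps total,
       the first 5 of which reduce 89 (2 digits) to 8 (1 digit). */
    printf("%d\n", gcd_steps(89, 55));   /* prints 9 */
    return 0;
}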
You see, in each step of the recursion the number of digits in *n* decreases
by one (at least in the worst case), so you can work out the complexity from that.
Sorry, I said that wrong. a+b is reduced by 25% over any TWO
iterations.
Look at these cases:
2a <= b : b % a < a and 2a <= b, so b is decreased by at least
half, so a+b is decreased by at least 25%.
a < b < 2a : b will become b-a, which is less than b/2, decreasing a+b
by at least 25%.
a == b : the gcd is found on the next iteration.
@Don: How can you say that a+b reduces by at least 25%?
In each iteration b isn't changed in (a+b); b just shifts to a, and a gets
changed to a%b. ==> a+b will become at most 2b-1. We can't deduce any
relation between a+b and 2b-1.
Why do we need the % of reduction in (a+b) in each iteration?
…
Tail recursion is no different from a loop, so the complexity is the
same as an iterative solution like this:

int gcd(int a, int b)   // loop body reconstructed: the archive cut it off
{
    while (b != 0) { int t = a % b; a = b; b = t; }
    return a;
}

On Tuesday, November 13, 2012, Shruti wrote:
> hi
>
> Can anyone help me find out the time complexity of the recursive gcd algorithm
> (using Euclid's algo), i.e., for the following program:
>
> int …
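The program in the question is cut off above; presumably it is the standard
recursive Euclid, something like this sketch:

/* Assumed reconstruction of the recursive gcd the question asks about: */
int gcd(int a, int b)
{
    if (b == 0)
        return a;
    return gcd(b, a % b);   /* tail call, hence equivalent to a loop */
}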
@atul:
At first look, even I thought the same: O(log N). And this may be
true for the given precision.
*[-- the following may not be related to the given question -- but can you
comment on this viewpoint?]*
I am thinking that the complexity is dependent on the level of
precision required …
The question in this post has not yet been answered, i.e., about the complexity.
Seems O(log n) to me.
Correct me if I am wrong.
On Mon, Jan 16, 2012 at 8:53 AM, Gene wrote:
I'm sorry for not being specific. I meant it doesn't converge for all
roots, e.g. cube root.
To find sqrt(a), this is equivalent to Newton's Method with
f(x) = x^2 - a.
Newton is:
x_{i+1} = x_i - f(x_i) / f'(x_i).
So you have
x_{i+1} = x_i - (x_i^2 - a) / (2 x_i) = (x_i + a/x_i) / 2,
which is just what the Babylonian method says to do.
Newton's method roughly doubles the number of significant digits at each iteration.
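For concreteness, a minimal sketch of the iteration (my own example; the
starting guess and the 1e-12 tolerance are arbitrary choices):

#include <math.h>
#include <stdio.h>

/* Babylonian / Newton iteration for sqrt(a), a > 0. */
double bab_sqrt(double a)
{
    double x = (a > 1.0) ? a : 1.0;   /* any x_0 > 0 converges here */
    double prev;
    do {
        prev = x;
        x = 0.5 * (x + a / x);        /* x_{i+1} = (x_i + a/x_i) / 2 */
    } while (fabs(x - prev) > 1e-12 * x);
    return x;
}

int main(void)
{
    printf("%.12f\n", bab_sqrt(2.0));  /* 1.414213562373... */
    return 0;
}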
@Gene: Actually, Newton's Method for sqrt(a), where a > 0, also
sometimes called Heron's Method, converges for every initial guess
x_0 > 0. This is not true generally for Newton's Method, but it is true
for Newton's Method applied to f(x) = x^2 - a.
Dave
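(One way to see the convergence claim: by the AM-GM inequality,
x_{i+1} = (x_i + a/x_i)/2 >= sqrt(x_i * a/x_i) = sqrt(a) for any x_i > 0,
so after one step the iterate is at or above sqrt(a); and whenever
x_i >= sqrt(a), we get x_{i+1} <= x_i. So the sequence decreases
monotonically to sqrt(a) from x_1 on, whatever x_0 > 0 was.)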
Hi Atul,
You're correct. I was commenting on the original poster's words "I
guess its NlogN for balanced tree as it is traversing N nodes for H
times," which is not correct.
The general result is that the run time will be O(N) for any balance
rule where the ratio R of left to right subtree sizes …
@Gene: I guess you are considering a balanced tree.
What if the tree is left-skewed or right-skewed?
Then the height of the tree will be H = number of nodes.
So we will get 1 + 2 + 3 + ... + H,
so the recurrence will be T(n) = T(n-1) + O(n),
hence the complexity will be O(N^2).
Correct me if I am missing something.
Thanks
My advice is don't guess. Do the math.
To print level L of a perfectly balanced tree, you will traverse 2^L
- 1 nodes.
The outer loop that prints levels 1 .. H will therefore traverse
sum_{L=1..H} (2^L - 1) = 1 + 3 + 7 + ... + (2^H - 1) = 2^(H+1) - 2 - H.
Let N = 2^H - 1 be the number of nodes in the tree. Then the sum is
2N - H, which is O(N).
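A sketch of the kind of code being analyzed (my reconstruction; the thread
never shows it):

#include <stdio.h>

struct node { int data; struct node *left, *right; };

/* Print all nodes at a given level. Reaching level L visits every
   node above it too -- 2^L - 1 nodes in a perfect tree. */
void printGivenLevel(struct node *root, int level)
{
    if (root == NULL)
        return;
    if (level == 1)
        printf("%d ", root->data);
    else {
        printGivenLevel(root->left, level - 1);
        printGivenLevel(root->right, level - 1);
    }
}

/* The outer loop described above: print levels 1 .. H. */
void printLevelOrder(struct node *root, int height)
{
    for (int level = 1; level <= height; level++)
        printGivenLevel(root, level);
}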
I guess it's N log N for a balanced tree, as it is traversing N nodes H
times, i.e., the height of the tree, which is log N for a balanced tree and
N for a skewed tree. So it should be O(N log N) and O(N^2).
On Nov 20, 9:27 am, tech coder wrote:
> @sravanreddy001
> complexity is O(N^2); whether the tree is balanced or not doesn't …
It's level order traversal, but it's O(n^2): you search the entire tree to
check whether each node's level is i (say 0 <= i <= height). I guess you can
use a stack or queue to get O(n) complexity.
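For example, a queue-based version that touches each node exactly once (a
sketch; the fixed-size array queue and the MAX bound are my simplifications):

#include <stdio.h>

#define MAX 1024   /* assumed upper bound on the node count */

struct node { int data; struct node *left, *right; };

/* O(n) level order: each node is enqueued and dequeued once. */
void levelOrder(struct node *root)
{
    struct node *queue[MAX];
    int head = 0, tail = 0;
    if (root == NULL)
        return;
    queue[tail++] = root;
    while (head < tail) {
        struct node *n = queue[head++];
        printf("%d ", n->data);
        if (n->left)
            queue[tail++] = n->left;
        if (n->right)
            queue[tail++] = n->right;
    }
}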
On Nov 19, 2:58 pm, shady wrote:
> this doesn't seem like level order printing, because you are simply
> printing the tree starting with the children as the root node.
Try it once. It is level order, only done in a DFS way.
Yes, it is O(n^5).
On Sep 22, 8:53 am, siva viknesh wrote:
@Dave ... thanks, dude. So it should be O(n^5). Am I right?
On Sep 22, 8:19 am, Dave wrote:
@Siva: Work from the inside out, using the identities
sum from i = 1 to n (i) = n*(n+1)/2
sum from i = 1 to n (i^2) = n*(n+1)*(2*n+1)/6
sum from i = 1 to n (i^3) = n^2*(n+1)^2/4
sum from i = 1 to n (i^4) = n^5/5 + n^4/2 + n^3/3 - n/30
Dave
On Sep 21, 10:03 pm, siva viknesh wrote:
Somebody please reply...
On Sep 21, 10:53 pm, sivaviknesh s wrote:
> for (i = 0; i < ...; i++) for (j = 0; j < ...; j++) for (k = 0; k < ...; k++) sum++;
>
> Is it n^5 log n, i.e. n * (n^2 log n) * (n^2 log n) ???
>
> Correct me if I'm wrong. Also, can anybody tell some easy approach to tackle
> these questions? I worked o…
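The loop bounds above were eaten by the archive (the '<' comparisons were
lost as HTML). A hypothetical reconstruction that is consistent with Dave's
identities (up to i^4) and with the O(n^5) answer:

/* Guessed bounds -- the original code is lost in the archive. */
long long count_ops(int n)
{
    long long sum = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < i * i; j++)
            for (int k = 0; k < j; k++)
                sum++;
    /* Inside out: the k loop runs j times; the j loop contributes
       sum_{j < i^2} j ~ i^4 / 2; the i loop then gives
       sum_i i^4 / 2 ~ n^5 / 10, i.e. O(n^5). */
    return sum;
}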
For all practical purposes most implementations are constant time.
Can anyone say how to calculate the complexity of a recurrence relation?
@asutosh:
All your efforts to write some code would be worthless if it can't work for
some inputs... if it can, then no one bothers about time
complexity.
Since it doesn't for large inputs, we have to write some efficient code
(although I agree with you that it's painful :P).
On Aug 17, 9:45 pm, Dave wrote:
For 30 years, I developed mathematical software (linear equation
solvers, eigenvalue routines, fast Fourier transforms, etc.) on a wide
variety of high-performance computers with interesting architectures.
For those, the optimal-order algorithms are well known. My principal
goal was to implement a …
Linked list:
Insertion at beginning and end: O(1), i.e. constant time (assuming a tail
pointer is kept for the end).
Insertion at an arbitrary location (based on some condition): O(n), as
you may have to traverse up to n elements to get to element x in
the list.
Same goes for deletion. Since linked lists are dynamic, other than
changing one pointer …
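A minimal sketch of the two cases (my own example, not from the thread):

#include <stdlib.h>

struct node { int data; struct node *next; };

/* O(1): insert at the head -- only one pointer changes. */
struct node *push_front(struct node *head, int value)
{
    struct node *n = malloc(sizeof *n);
    n->data = value;
    n->next = head;
    return n;
}

/* O(n): insert after the first node matching a condition --
   in the worst case we walk the whole list to find it. */
void insert_after_match(struct node *head, int match, int value)
{
    for (struct node *p = head; p != NULL; p = p->next)
        if (p->data == match) {
            struct node *n = malloc(sizeof *n);
            n->data = value;
            n->next = p->next;
            p->next = n;
            return;
        }
}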
Don't know if the O-notation is also defined for complex functions.
Well, if it isn't, here's a possibility (please correct me if I am
wrong here):
For sqrt(x) to be real, x needs to be > 0
=> log(log(m)) / log(m) > 0.
But we also know that log(m) > log(log(m)) for all values of m for
which …
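(For example, at m = e^e, about 15.15, log(log(m)) = 1 and log(m) = e, so
the ratio is 1/e, roughly 0.37 -- strictly between 0 and 1 -- and it only
shrinks as m grows, so sqrt((log log m) / (log m)) stays real and is o(1).)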
Hi Ray,
Can you do this without using the Master theorem?
I also need to find the time complexity of problems like:
T(m) = 2T(m/2) + O( m^2 * sqrt((log log m) / (log m)) )
Basically I need a solution without using the Master theorem.
Regards,
Phani
On 6/12/07, Ray <[EMAIL PROTECTED]> wrote:
I think it's O(n).
Because the order of sqrt((log log m) / (log m)) is less than m's.
For T(n) = a T(n/b) + f(n):
1. If O(n^(log_b a)) > O(f(n)):
   T(n) = O(n^(log_b a))
2. If O(n^(log_b a)) = O(f(n)):
   T(n) = O(lg(n) * f(n))
3. If O(n^(log_b a)) < O(f(n)):
   T(n) = O(f(n))
The problem fits the 1st case.
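(Worked out for the recurrence in question, assuming f(m) =
sqrt((log log m) / (log m)): a = 2, b = 2, so n^(log_b a) = n^1 = n. Since
f(m) tends to 0, it is polynomially smaller than n, so case 1 applies and
T(m) = O(m).)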
Adrian, yes, you are right. Let T(1) = 1.
On 6/12/07, Adrian Godong <[EMAIL PROTECTED]> wrote:
You should provide the limit/point where T(m) is constant.
Say T(1) = 1, or something else. Only then we can calculate the time
complexity.
On 6/12/07, Phanisekhar B V <[EMAIL PROTECTED]> wrote:
>
> How can I calculate the time complexity of the following problem?
> T(m) = 2T(m/2) + O( sqrt((log log m) / (log m)) )
Hello,
Look, the best case of a circular queue is O(1) and the worst case is O(n),
so the average case should be between these two. So if (n+1)/2 is the average
case, then the average case will still be O(n).
Please check it; maybe I am wrong.
Bye
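(For what it's worth, the arithmetic behind (n+1)/2: if a search touches i
elements with probability 1/n for each i = 1..n, the average is
(1/n) * sum_{i=1..n} i = (n+1)/2, which is indeed O(n).)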
I wonder whether the average case here means the average successful search
case, the average unsuccessful search case, or the total average search
case?
programming wrote:
> Thanks for the advice. I think I am right about the circular queue being
> a(n) = (n+1)/2, you agree?
Is this also the average case for a tree of degree 4?
1.)
If you're talking about search using a tree with each node having degree 4, then
the best-case complexity is indeed O(1). Why? The first node (the
root) could be the one that you're looking for.
3.)
Yes indeed. Since your tree doesn't seem to have any branching logic
(like a BST does), the complexity …