Complexity of algorithm


Allan Ebdrup

Hi
I'm trying to calculate the complexity of an algorithm. What does the sum
1*log(1) + 2*log(2) + ... + (n-1)*log(n-1) + n*log(n)
give in O() terms?
I'm guessing it's O(log(n)*n^2), but is it?

Kind Regards,
Allan Ebdrup
 
James Curran said:
Assuming that the log() function is itself O(1), then the complexity
should be O(n).

The log part has at least O(log(n)) complexity, and the 1+2+3+...+n part is O(n^2).
 
The log part has at least O(log(n)) complexity

Whatever gave you that idea? Consider the following:
f(n) = n * 1000000000. What is the complexity of f(n)? O(1 billion * n)? O(n)? Actually, it's just O(1), because there is exactly one step to be done to calculate the value. There will always be just one step regardless of the value of n.

and the 1+2+3+...+n part is O(n^2)

Again, nonsense. Its *result* is on the order of n^2, but the process to calculate it involves n steps, and is therefore O(n):

// Sum 1..n with a simple loop: n additions, so O(n) steps.
int sum = 0;
for (int i = 1; i <= n; ++i)
{
    sum += i;
}


In fact, that particular summation is what I was thinking of when I mentioned an algorithm of O(1). Summing 1...n as I did above is O(n), but as Gauss showed as a child, the same answer can be achieved via:

f(n) = n * (n + 1) / 2

which is O(1) (the number of steps is unrelated to the value of n).
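
To make the contrast concrete, here is a minimal sketch of the two approaches (illustrative only; it assumes n is small enough that the sum fits in a 64-bit integer):

#include <cassert>
#include <cstdint>

// O(n): add up every term 1..n in a loop.
std::int64_t sum_loop(std::int64_t n)
{
    std::int64_t sum = 0;
    for (std::int64_t i = 1; i <= n; ++i)
        sum += i;
    return sum;
}

// O(1): Gauss's closed form n*(n+1)/2 -- a fixed number of operations
// no matter how large n is.
std::int64_t sum_closed_form(std::int64_t n)
{
    return n * (n + 1) / 2;
}

int main()
{
    assert(sum_loop(1000) == sum_closed_form(1000)); // both yield 500500
    return 0;
}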


--
Truth,
James Curran
[erstwhile VC++ MVP]

Home: www.noveltheory.com Work: www.njtheater.com
Blog: www.honestillusion.com Day Job: www.partsearch.com
 
James Curran said:
Whatever gave you that idea? Consider the following:
f(n) = n * 1000000000. What is the complexity of f(n)? O(1 billion * n)? O(n)? Actually, it's just O(1), because there is exactly one step to be done to calculate the value. There will always be just one step regardless of the value of n.

Again, nonsense. Its *result* is on the order of n^2, but the process to calculate it involves n steps, and is therefore O(n):

// Sum 1..n with a simple loop: n additions, so O(n) steps.
int sum = 0;
for (int i = 1; i <= n; ++i)
{
    sum += i;
}

In fact, that particular summation is what I was thinking of when I mentioned an algorithm of O(1). Summing 1...n as I did above is O(n), but as Gauss showed as a child, the same answer can be achieved via:

f(n) = n * (n + 1) / 2

which is O(1) (the number of steps is unrelated to the value of n).

Well, you seem to have misunderstood the question. The sum I'm adding up represents steps in an algorithm, so when I write n*log(n) that means O(n*log(n)); the sum represents calling an algorithm that's O(n*log(n)) n times, with 1 to n elements.
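
For concreteness, the pattern is roughly the following sketch (std::sort stands in for the O(n*log(n)) algorithm; the names are illustrative, not the actual code in question):

#include <algorithm>
#include <cstddef>
#include <vector>

// Sketch of the pattern: run an O(k*log(k)) sort for every size k = 1..n,
// so the total work is 1*log(1) + 2*log(2) + ... + n*log(n) comparisons.
void sort_increasing_prefixes(std::vector<int>& data)
{
    for (std::size_t k = 1; k <= data.size(); ++k)
    {
        // std::sort on the first k elements is O(k*log(k)).
        std::sort(data.begin(), data.begin() + k);
    }
}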

Kind Regards,
Allan Ebdrup
 
So, let's see if I get this straight. When you stated your question, you actually meant that you want to calculate the complexity of an algorithm which does the sum:
1*f(1) + 2*f(2) + ... + (n-1)*f(n-1) + n*f(n), where f(n) has a complexity of O(n*log(n)).

If that is the case (you are sorting increasingly larger arrays?), then the overall complexity would be O(log(n)*n^2).


--
Truth,
James Curran
[erstwhile VC++ MVP]

Home: www.noveltheory.com Work: www.njtheater.com
Blog: www.honestillusion.com Day Job: www.partsearch.com
 
Allan,

I believe you are correct. The complexity is O(n^2*log(n)).

Let g(x) = x*log(x)

and

Let f(n) = Sum[g(x), x=1..n] = 1*log(1) + 2*log(2) + ... + n*log(n)

The following inequality is immediately obvious (the last term alone gives the lower bound; n terms, each at most n*log(n), give the upper bound):

n*log(n) < f(n) < n^2*log(n)

That narrows things down a little bit, but we need to go further. We
can do this by integrating g(x) to approximate f(x).

So f(x) = Integral[g(x), dx] = (x^2/2)*log(x) - (x^2/4) + E

I used integration by parts to integrate x*log(x). E is an error term which I believe you can bound, and convince yourself is negligible, by using the Euler-Maclaurin formula.
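
For reference, the integration by parts works out as follows (taking log as the natural log; a different base only changes a constant factor, which O() ignores):

\int x \log x \, dx
    = \frac{x^2}{2} \log x - \int \frac{x^2}{2} \cdot \frac{1}{x} \, dx
    = \frac{x^2}{2} \log x - \frac{x^2}{4} + C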

Therefore, f(n) has O(n^2*log(n)) complexity.
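
A quick numerical sanity check (just a sketch, not a proof): the exact sum divided by the approximation above should settle near 1 as n grows.

#include <cmath>
#include <cstdio>

// Compare the exact sum 1*log(1) + ... + n*log(n) (natural log) against
// the integral approximation (n^2/2)*log(n) - (n^2/4) for a few values of n.
int main()
{
    for (long n = 1000; n <= 1000000; n *= 10)
    {
        double sum = 0.0;
        for (long k = 1; k <= n; ++k)
            sum += k * std::log(static_cast<double>(k));

        const double approx =
            (static_cast<double>(n) * n / 2.0) * std::log(static_cast<double>(n))
            - static_cast<double>(n) * n / 4.0;

        // The ratio stays close to 1, consistent with O(n^2*log(n)) growth.
        std::printf("n = %ld  sum/approx = %f\n", n, sum / approx);
    }
    return 0;
}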

Perhaps someone who is much better at math can verify this?

Brian
 
Brian said:
So f(x) = Integral[g(x), dx] = (x^2/2)*log(x) - (x^2/4) + E

For those not using a fixed-width font:

f(x) = Integral[g(x), dx] = (x^2/2)*log(x) - (x^2/4) + E
 
James Curran said:
So, let's see if I get this straight. When you stated your question, you actually meant that you want to calculate the complexity of an algorithm which does the sum:
1*f(1) + 2*f(2) + ... + (n-1)*f(n-1) + n*f(n), where f(n) has a complexity of O(n*log(n)).

If that is the case (you are sorting increasingly larger arrays?), then the overall complexity would be O(log(n)*n^2).

Yes, as I wrote, I wanted to know:
what does the sum
1*log(1) + 2*log(2) + ... + (n-1)*log(n-1) + n*log(n)
give in O() terms?

Kind Regards,
Allan Ebdrup
 
James Curran said:
You actually meant that you want to calculate the complexity of an algorithm which does the sum:
1*f(1) + 2*f(2) + ... + (n-1)*f(n-1) + n*f(n), where f(n) has a complexity of O(n*log(n)).

If that is the case (you are sorting increasingly larger arrays?), then the overall complexity would be O(log(n)*n^2).

Actually, I'm not doing the sum; I'm sorting increasingly larger arrays, as you state (or rather, another programmer was), but I've found a different approach that's O(n*log(n)).

Kind Regards,
Allan Ebdrup
 