One of the algorithms I encountered is the sweep line algorithm,
which is quite useful in computational geometry.
For this kind of algorithm the input is n objects, but the running
time can also be sensitive to the size of the output.
For example, if we want to report all intersections among n segments,
the brute-force worst case is O(n^2), because every pair has to be
checked.
But with the classic sweep line (Bentley-Ottmann) it is
O((n+k)log n), where k is the number of intersections reported
(more involved algorithms get this down to O(n log n + k)).

I think the key idea here is that the original input has size n, but
the algorithm generates extra events for itself as it runs, and the
number of those extra events is directly or indirectly tied to the
output size. For example, the intersection algorithm goes like this:
  if there are no intersections, good, the sweep is O(n log n);
  but every intersection point it finds is output, and at the same
time it becomes a new event that the sweep has to process like the
rest of the input in order to find the remaining intersections.

Therefore, output-sensitive here means the output goes back and
enlarges the input the algorithm still has to process, as the sketch
below shows.
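Here is a rough Python sketch of that feedback loop. It is my own
illustration, not full Bentley-Ottmann: it assumes general position
(no vertical segments, no shared endpoints, no three segments through
one point) and re-sorts a plain list as the sweep status instead of
keeping a balanced tree, so it does not reach the O((n+k)log n)
bound. The part to look at is schedule(), where each crossing found,
i.e. the output, is pushed back into the event queue as more work:

import heapq

def crossing(s1, s2):
    # Interior intersection point of two segments, or None.
    (x1, y1), (x2, y2) = s1
    (x3, y3), (x4, y4) = s2
    d = (x2 - x1) * (y4 - y3) - (y2 - y1) * (x4 - x3)
    if d == 0:
        return None                       # parallel
    t = ((x3 - x1) * (y4 - y3) - (y3 - y1) * (x4 - x3)) / d
    u = ((x3 - x1) * (y2 - y1) - (y3 - y1) * (x2 - x1)) / d
    if 0 < t < 1 and 0 < u < 1:
        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
    return None

def sweep_intersections(segments):
    EPS = 1e-9
    segs = [tuple(sorted(s)) for s in segments]   # left endpoint first
    events = []                                   # (x, kind, data)
    for i, ((xl, _), (xr, _)) in enumerate(segs):
        heapq.heappush(events, (xl, 0, i))        # kind 0: segment starts
        heapq.heappush(events, (xr, 2, i))        # kind 2: segment ends
    status, found = [], set()

    def y_at(i, x):
        (x1, y1), (x2, y2) = segs[i]
        return y1 + (y2 - y1) * (x - x1) / (x2 - x1)

    def schedule(i, j, x_now):
        # THE FEEDBACK STEP: a discovered crossing becomes a new event.
        key, p = (min(i, j), max(i, j)), crossing(segs[i], segs[j])
        if p and p[0] > x_now and key not in found:
            heapq.heappush(events, (p[0], 1, key))   # kind 1: crossing

    while events:
        x, kind, data = heapq.heappop(events)
        if kind == 0:
            status.append(data)
        elif kind == 2:
            status.remove(data)
        else:
            found.add(data)                       # report the intersection
        # Re-sort by y just right of x and test adjacent pairs. A real
        # implementation keeps a balanced tree and only tests the few
        # neighbours that actually changed.
        status.sort(key=lambda s: y_at(s, x + EPS))
        for a, b in zip(status, status[1:]):
            schedule(a, b, x)
    return found

segs = [((0, 0), (4, 4)), ((0, 4), (4, 0)), ((5, 0), (6, 1))]
print(sweep_intersections(segs))   # -> {(0, 1)}

In the real algorithm, if k = 0 the queue only ever holds the 2n
endpoint events and the run stays near O(n log n); each intersection
adds exactly one more event, which is how k enters the bound.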

On Oct 23, 9:46 pm, Sticker <[EMAIL PROTECTED]> wrote:
> I used to come across algorithms whose running time is related to
> the size of the output. One kind of such algorithm is the pattern
> mining algorithms in the data mining area. For these algorithms, how
> far they run depends on the number of patterns present in the input
> data.
>
> The problem is how to calculate the time complexity of this type of
> algorithm; I am just not used to doing this. Any clue or pointer to
> tutorials would be helpful to me. Thanks very much.

