On 17 May, 17:30, Jeffrey Martin <360cit...@gmail.com> wrote:
> that reminds me, is there any way to limit the number of CP's per pair?

I think the CPs are found anyway. Once all likely candidates are lined
up, the best ones are kept. Some CPGs have a built-in option to limit
the number of CPs per pair to a certain number, but cpfind doesn't
implement this (please correct me if I'm wrong). With apsc, the
relevant parameter is

--maxmatches <matches>  Output no more than this many control points
                        per image pair (default: 25, zero means unlimited)

but cpfind only offers

--minmatches <int>            Minimum matches    (default : 6)

What the intended use of this parameter is escapes me - if it can't
find any matches, how would it fabricate the requested minimum of six?
Anyway, once the CPs are there, you can try to throw away the ones
that are 'worse' than others. The problem is which quality criterion
to use. One obvious criterion is the 'CP distance', and my simple
top_cps_only script does just that:

http://bazaar.launchpad.net/~kfj/+junk/script/view/head:/main/top_cps_only.py
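The core of the idea - keep only the k best CPs per image pair, ranked
by distance - can be sketched in a few lines of Python. This is a
simplified illustration with a made-up in-memory CP representation, not
the actual script (which works on real .pto project files):

```python
from collections import defaultdict

def keep_best_cps(cps, max_per_pair):
    """Keep at most max_per_pair control points per image pair,
    preferring the ones with the smallest CP distance."""
    by_pair = defaultdict(list)
    for cp in cps:
        # normalize the pair key so (1, 0) and (0, 1) count as one pair
        key = tuple(sorted((cp["img1"], cp["img2"])))
        by_pair[key].append(cp)
    kept = []
    for pair_cps in by_pair.values():
        pair_cps.sort(key=lambda cp: cp["distance"])  # best (smallest) first
        kept.extend(pair_cps[:max_per_pair])
    return kept

# toy example: three CPs between images 0 and 1, keep the best two
cps = [
    {"img1": 0, "img2": 1, "distance": 4.2},
    {"img1": 1, "img2": 0, "distance": 0.7},
    {"img1": 0, "img2": 1, "distance": 1.9},
]
best = keep_best_cps(cps, 2)
print(sorted(cp["distance"] for cp in best))  # [0.7, 1.9]
```

Note that this says nothing about where the kept CPs sit in the image,
which is exactly the even-spread problem mentioned below.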

The other criterion is an even spread of CPs, which is mathematically
more demanding; I haven't dealt with that issue, so top_cps_only may
leave a bunch of CPs very close to each other, which is not really
what you want. You can also use cpclean (or click on 'Clean CPs')
until you have roughly the desired number left - the tool itself
doesn't offer to keep only a certain number, but removes a certain
percentage. I think it takes even spread into account, though, and
will keep a minimum number per pair as well.
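In practice that means running cpclean repeatedly until the CP count
looks right - a rough sketch (the -o flag writes the result to a new
project file; file names here are just placeholders):

```shell
# each pass removes a percentage of outlier CPs;
# repeat until the remaining number suits you
cpclean -o cleaned_1.pto project.pto
cpclean -o cleaned_2.pto cleaned_1.pto
```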

While I'm going on about CPGs - it's also worth noting that using
cpfind (and, for that matter, panomatic) without --linearmatch is
particularly bad with many images, since both CPGs will look at all
image pairs, so processing time is quadratic with the number of
images. apsc on the other hand will keep all feature points from all
images in memory and then do a global search, so it is much better
than quadratic (I'm not sure about the precise mathematics, but my gut
feeling is N log N for the global search). I hope these are all just
teething problems - cpfind claims descent from panomatic and has
therefore inherited some of its idiosyncrasies, but I hope it'll
continue evolving at its current pace, and now that the feature
detection seems to be running very well indeed, maybe other parts of
it can be improved.
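To put rough numbers on that growth - assuming all-pairs matching
examines N*(N-1)/2 image pairs, while linear matching with a
neighbourhood of one examines only N-1:

```python
def all_pairs(n):
    # every image is matched against every other image: quadratic growth
    return n * (n - 1) // 2

def linear_pairs(n, window=1):
    # each image is matched only against the next `window` images
    # in shooting order: growth is linear in n
    return sum(min(window, n - 1 - i) for i in range(n))

for n in (12, 24, 100):
    print(f"{n:>4} images: {all_pairs(n):>5} pairs all-pairs, "
          f"{linear_pairs(n):>3} pairs linear")
```

At two dozen images that is 276 pairs versus 23; at a hundred images
it is already 4950 versus 99, which is why the all-pairs approach
stops being bearable.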

The look-at-all-pairs method has its merits (badly matched images are
more likely to have some CPs found for them), and if you only have up
to, say, two dozen images, processing time is still bearable. So my
point, in short, is: if you can't use --linearmatch and have many
images, try apsc instead of cpfind.
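In command-line terms, that advice looks roughly like this (binary
names may vary between installs; autopano-sift-c is commonly what the
apsc package provides, and the file names are placeholders):

```shell
# images in shooting order: linear matching keeps cpfind's runtime linear
cpfind --linearmatch -o project.pto project.pto

# unordered images: apsc's global search scales better than
# all-pairs matching, and can cap the CPs per pair as well
autopano-sift-c --maxmatches 25 project.pto img_*.jpg
```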

> i fully agree, generating too many CP's makes problems but sometimes it is
> necessary to jack up the sensitivity of cpfind to find any matches at all!
> so there is a problem here waiting to be solved :-)))

In my experience, having a large number of CPs is also helpful when
calibrating lenses, for multi-lens panoramas and for handheld shots
with parallax problems. And as far as ramping up cpfind's sensitivity
is concerned, rejoice: with a (very recent) fix, the --fullscale
option now seems to work just fine.
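So if cpfind finds too few CPs with its default downscaled analysis,
it may be worth trying something like:

```shell
# analyse images at full resolution instead of the downscaled default
cpfind --fullscale -o project.pto project.pto
```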

Kay

-- 
You received this message because you are subscribed to the Google Groups 
"Hugin and other free panoramic software" group.
A list of frequently asked questions is available at: 
http://wiki.panotools.org/Hugin_FAQ