Re: Data smoothing algorithms? - Thank you all

2005-05-04 Thread Bengt Richter
On Wed, 04 May 2005 16:01:07 GMT, Dennis Lee Bieber <[EMAIL PROTECTED]> wrote:

>On Tue, 03 May 2005 16:28:34 GMT, Dennis Lee Bieber
><[EMAIL PROTECTED]> declaimed the following in comp.lang.python:
>
>
>>  It's not going to be easy, though... 
>
>   Yes, talking to myself... I crawled through one of my texts (at
>work) yesterday. This is incomplete -- I wasn't going to copy the entire
>chapter -- but may serve as an example of some of the complexity that
>goes into some of the entries found in almanacs:
>
>From: "Spherical Astronomy" [Robin M. Green; 1985 Cambridge University] 
>
>"The 1980 theory of nutation contains 106 terms both in longitude and
>the obliquity" 
>
>Displacement of true celestial pole from mean pole, /principal/ terms
>only -- lunar caused nutation
>
>Nutation in longitude = -17".1996 sin omega 
>-1".3187 sin (2F - 2D + 2 omega) 
>-0".2274 sin (2F - 2 omega) 
>Nutation in obliquity = 9".2025 cos omega 
>+ 0".5736 cos (2F - 2D + 2 omega) 
>+ 0".0927 cos (2F - 2 omega) 
>
>Where: 
>omega = mean longitude of the node (I presume of the moon)
>F = mean argument from node (moon) 
>D = mean elongation from sun (moon) 
>
>Periods of interest: 
>18.6 year lunar (movement of the node), 
>6 month solar, 
>14 day lunar... 
>
>26000 year luni-solar precession 
>
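Incidentally, the principal terms quoted above drop straight into code. A
sketch, taking the coefficients and arguments exactly as quoted (the mean
elements omega, F and D are assumed to be supplied in radians; the full
1980 theory has 106 terms, so this is only the crudest approximation):

```python
import math

def principal_nutation(omega, F, D):
    """Principal lunar terms of the 1980 nutation theory, as quoted above.

    Arguments are the Moon's mean elements in radians; returns the
    nutations in longitude and in obliquity, both in arcseconds."""
    dpsi = (-17.1996 * math.sin(omega)
            - 1.3187 * math.sin(2*F - 2*D + 2*omega)
            - 0.2274 * math.sin(2*F - 2*omega))
    deps = (9.2025 * math.cos(omega)
            + 0.5736 * math.cos(2*F - 2*D + 2*omega)
            + 0.0927 * math.cos(2*F - 2*omega))
    return dpsi, deps
```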
I wonder why the original post, which I presume was

http://mail.python.org/pipermail/python-list/2005-April/278752.html

doesn't show up in google groups, but can seemingly only be found indirectly
by google search on "smoothing algorithms" site:python.org giving one main post

http://mail.python.org/pipermail/python-list/2005-April/278871.html

which is a reply with no immediate apparent parent. And then going to the
sorted-by-thread index, where you can find the original.

Maybe it's because there was an html attachment (which I didn't "wget" to 
investigate)?

Anyway, the original post sounds like the OP was really just looking for better
numbers than in some text tables he found, and not really for a way of
estimating better numbers based on flawed data (though that was apparently what
he thought his best option was, using the rounded text tables as data). I'd bet
that is not his best option, especially since distributions of actual roundoff
errors can be weird. There must be tons of telescope-pointing and
planetarium-driving software out there that can do similar stuff. And if that's
not accurate enough, the relevant newsgroup crowd will be able to advise, I'd
bet.

Regards,
Bengt Richter
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Data smoothing algorithms? - Thank you all

2005-05-03 Thread Anthra Norell
To contribute to this interesting discussion, which, after having provided
practical solutions, has become academic: I believe we are dealing with a
particularly innocuous case of noise infection, innocuous on account of the
noise being conspicuously distinct from the signal. The signal is the orbital
and rotational motion of the earth. Its period is one year and it is
perfectly smooth in that it is not subject to disturbance, such as impacts
of large asteroids. The noise is rounding error with a period of one day. If
I plot the data and look at the plot from a distance far enough to make the
jitter invisible, what I see is a smooth sinusoidal kind of curve. That is
the signal. Looking at the curve close up, what I see is similar to a
trample path. A trample path is a line through a landscape from one place to
another, traced by the footprints of bipeds walking comfortably, that is,
without the ambition to place their feet exactly on the line like tightrope
walkers, so the prints scatter to either side of it. Looking at the trample
path, I can easily disregard that lateral scatter and perceive the line it
traces as its lateral median. Applying the analogy, I look at my plot as if
it were a trample path; I can then reconstruct a close approximation of the
signal by drawing a lateral median along the dots. It helps to know that the
signal is perfectly smooth, because that allows me to identify the lateral
deviation of each dot as an artifact of rounding, to be discarded. Using a
tracing ruler flexible enough to follow the signal but too stiff to follow
the rounding errors, I could draft a very good line plot.
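A "flexible ruler" of this sort is what a smoothing filter does. A minimal
sketch, using a centered moving average (the window half-width of 3 is an
arbitrary choice, not anything from the discussion):

```python
def smooth(data, half_width=3):
    """Centered moving average: replace each point by the mean of its
    neighbours within +/- half_width samples (the window simply shrinks
    at the two ends of the series)."""
    out = []
    for i in range(len(data)):
        lo = max(0, i - half_width)
        hi = min(len(data), i + half_width + 1)
        window = data[lo:hi]
        out.append(sum(window) / len(window))
    return out
```

A wider window gives a stiffer ruler: it flattens more of the rounding
jitter, but also starts to flatten the signal itself if pushed too far.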
  Clearly, this kind of reconstruction is curve fitting. The result is
not a formula, but quiet output data mapped from noisy input data. I should
not expect any of my data points to come out exactly right. But that isn't
the goal of the exercise. The goal is to narrow down the margin of error to
consistently fall below a critical limit.
  Interpolation would be artificially increasing the number of data
points by guessing the most likely values between established ones. On the
line plot, the flexible ruler would have done a continuous interpolation
together with the curve fit. Working with numbers, I'd need an interpolation
algorithm to find intermediate data. As my data points happen to pertain to
midnight of each day, I would interpolate if I also needed data for, say,
twelve noon each day. The quality of interpolated data, quite obviously,
depends on the quality of the established adjacent data. That's why I don't
think interpolating noisy data reduces the noise.
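In code, the midnight-to-noon case reduces to the simplest interpolation
there is; a sketch (plain linear interpolation, i.e. each noon value taken
as the mean of the two adjacent midnight values):

```python
def noon_values(midnight_values):
    """Linearly interpolate a daily midnight series to twelve noon:
    each noon value is the mean of the two surrounding midnights, so a
    series of n midnights yields n - 1 noons."""
    return [(a + b) / 2.0
            for a, b in zip(midnight_values, midnight_values[1:])]
```

As argued above, this inherits whatever noise the adjacent points carry,
so it belongs after the smoothing step, not before it.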
  " ... gravitational perturbations, for more than two isolated bodies,
can NOT be predicted via simple polynomials." I have heard about that and
didn't think it would ever pertain to my activities. So I relegated it to
the cerebral regions labeled 'Conversation Topics'.
  " ... the JPL planetary ephemeris file(s) and a copy of Meeus
("Astronomical Algorithms" I believe)... Then basically write the code to
extract the data from the JPL ephemeris and convert to your desired
reference point (that is basically what is done to create these almanacs, in
the first place." This is a vey useful suggestion. Thanks a lot.

Frederic





Re: Data smoothing algorithms? - Thank you all

2005-04-30 Thread Anthra Norell
Thank you all for your solutions! The moving average filter will surely do.
I will take a closer look at SciPy, though. The doc is impressive. I believe
it's curve fitting I am looking for rather than interpolation. There's a
chapter on that too.
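For what it's worth, that kind of curve fit can be sketched with SciPy's
curve_fit. The sine model, its parameters, and the simulated data below are
all illustrative assumptions, not taken from the almanac:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical model: one sine with unknown amplitude, period, phase and
# offset -- a stand-in for the declination-difference curve, not the real
# astronomical series.
def model(t, amp, period, phase, offset):
    return amp * np.sin(2.0 * np.pi * t / period + phase) + offset

days = np.arange(365.0)
true = model(days, 0.4, 365.25, 1.0, 0.0)   # made-up "clean" signal
noisy = np.round(true, 1)                   # simulate rounding to table precision

# Least-squares fit; p0 is a rough initial guess for the parameters.
popt, _ = curve_fit(model, days, noisy, p0=(0.5, 365.0, 0.5, 0.0))
fitted = model(days, *popt)                 # smooth reconstruction
```

With a year of daily points, the fit averages the rounding jitter away and
the reconstructed curve lands well inside the rounding step.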

Frederic


- Original Message -
From: "Larry Bates" <[EMAIL PROTECTED]>
Newsgroups: comp.lang.python
To: 
Sent: Friday, April 29, 2005 9:02 PM
Subject: Re: Data smoothing algorithms?


> Sounds like what you are looking for is spline interpolation.
> Given a set of datapoints, it passes spline curves through
> each point, giving you smooth transitions.  Did a lot of this
> in Fortran MANY years ago.
>
> Google turned up:
>
> http://www.scipy.org/documentation/apidocs/scipy/scipy.interpolate.html
>
> http://cmp.felk.cvut.cz/~kybic/thesis/pydoc/bigsplines.html
>
> http://www.mirror5.com/software/plotutils/plotutils.html
>
> Good Luck
> Larry Bates
>
> John J. Lee wrote:
> > "Anthra Norell" <[EMAIL PROTECTED]> writes:
> >
> >
> >>Hi,
> >>
> >>The following are differences of solar declinations from one day to
> >>the next (never mind the unit). Considering the inertia of a
> >>planet, any progress of (apparent) celestial motion over regular
> >>time intervals has to be highly regular too, meaning that a plot
> >>cannot be jagged. The data I googled out of Her Majesty's Nautical
> >>Almanac are merely of nautical precision and that, I suppose, is where
> >>the jitter comes in. There's got to be algorithms out there to iron
> >>it out. If it were a straight line, I could do it. But this, over
> >>the whole year, is a wavy curve, something with a dominant sine
> >>component. Suggestions welcome.
> >
> >
> > The important thing is to have a (mathematical, hopefully) model of
> > how you expect the data to vary with time.  Start from there, and
> > then, for example, use regression to fit a curve to the data.
> >
> > The "Numerical Recipes" (Press et al.) book is popular and IMHO is a
> > good place to learn about these things (comes in several language
> > flavours, including Fortran and C -- sadly no Python AFAIK), though
> > the implementations aren't a great choice for serious "production"
> > use, according to those in the know.
> >
> > OTOH, there are quick and dirty methods that don't involve any model
> > worth speaking of -- and Press et al. covers those too :-)
> >
> >
> > John
> >
> --
> http://mail.python.org/mailman/listinfo/python-list
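
The spline suggestion, in terms of present-day SciPy, might look like the
sketch below (the data and names are illustrative). Note that an
interpolating spline passes through every point, noise included; the
smoothing variant is scipy.interpolate.UnivariateSpline with a smoothing
factor s > 0:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Illustrative data: a smooth curve sampled once per unit interval.
x = np.arange(10.0)
y = np.sin(x / 3.0)

spline = CubicSpline(x, y)            # passes exactly through each point
dense_x = np.linspace(0.0, 9.0, 91)   # resample at ten times the density
dense_y = spline(dense_x)             # smooth transitions between points
```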
