Re: [Scilab-users] "Smoothing" very localised discontinuities in (scilab: to exclusive) curves.

2016-04-04 Thread Tim Wescott
I wish you'd made this response to the list.

Here's another thought, to address your overall problem:

* Find out what flavor of splines FEMM uses (there are subtleties).
* Figure out what Scilab calls them (naturally, everyone uses
  different names -- particularly when there are multiple countries
  and/or disciplines involved)
* Get your bad data point problem beaten to the ground
* Use Scilab to find the minimum number of points, and their
  locations, which, when smoothed with the right kind of
  splines, give an acceptably close match to your B-H curve
  (see the sketch after this list). Determining what's
  "acceptably close" is your job, of course.
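
Something along these lines in Scilab would let you score a candidate subset.
The h, b and idx values below are just illustrative stand-ins for your data,
and "monotone" is only one of splin's spline types -- use whichever matches
what FEMM actually does:

h   = [0 50 100 200 400 800 1600 3200];       // field intensity H (A/m), made up
b   = [0 0.30 0.60 0.90 1.10 1.25 1.35 1.40]; // flux density B (T), made up
idx = [1 3 5 6 7 8];                          // candidate subset of points
d   = splin(h(idx), b(idx), "monotone");      // spline through the subset only
bf  = interp(h, h(idx), b(idx), d);           // evaluate it at every original H
mprintf("worst-case deviation = %g T\n", max(abs(bf - b)));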

You may find that minimizing the number of points helps the speed, or you
may not -- it may just be that FEMM's numerical solver doesn't like a
large second derivative in the B-H data, in which case you'll just need
to smile and accept long solver times.

On Mon, 2016-04-04 at 08:43 -0800, scilab.20.browse...@xoxy.net wrote:
> Tim,
> 
> (sorry for the earlier "Tom"; finger trouble not name recognition problems :)
> 
> > If your FEM software can't handle non-monotonic data, can it handle
> > noisy data?
> 
> That's a good (and open) question.
> 
> The software is FEMM (www.femm.info) which is widely held in high esteem.
> 
> When entering BH curve values into the materials definitions for those with 
> non-linear magnetic properties, the program simply won't allow you to enter 
> non-monotonic data; hence my need to remove these discontinuities before I 
> can progress my project.
> 
> Once given data it will accept, it then manipulates the data to allow it to 
> ... "do its thang". It's easier (and far more accurate) for me to quote David 
> Meaker here than try to paraphrase him:
> 
> "Since FEMM interpolates between your B-H points using cubic splines, it is 
> possible to get a bad curve if you haven’t entered an adequate number of 
> points. “Weird” B-H curves can result if you have entered too few points 
> around relatively sudden changes in the B-H curve. Femm “takes care of” bad 
> B-H data (i.e. B-H data that would result in a cubic spline fit that is not 
> single-valued) by repeatedly smoothing the B-H data using a three-point 
> moving average filter until a fit is obtained that is single-valued. This 
> approach is robust in the sense that it always yields a single-valued curve, 
> but the result might be a poor match to the original B-H data. It may also be 
> important to note that FEMM extrapolates linearly off the end of your B-H 
> curve if the program encounters flux density/field intensity levels that are 
> out of the range of the values that you have entered. This extrapolation may 
> make the material look more permeable than it “really” is at high flux 
> densities. You have to be careful to enter enough B-H values to get an 
> accurate solution in highly saturated structures so that the program is 
> interpolating between your entered data points, rather than extrapolating. 
> Also in the nonlinear parameters box is a parameter, φhmax. For nonlinear 
> problems, the hysteresis lag is assumed to be proportional to the effective 
> permeability. At the highest effective permeability, the hysteresis angle is 
> assumed to reach its maximal value of φhmax. This idea can be represented by 
> the formula: φ_h(B) = ( µ_eff(B) / µ_eff,max ) · φ_hmax"
> 
> So the first goal is to get data that FEMM will accept. After that there are 
> other criteria that may require the raw data to be further manipulated.
> 
> I've demonstrated that 'inconveniently real-world' data can interact badly, in 
> (at least) two different ways, with FEMM's manipulation and use of that data 
> in its tensor equations:
> 
> 1) Unless the final points approach a slope of 0 very closely, it can 
> extrapolate the data to produce ludicrously high B values. (E.g. the 'best' I 
> achieved was 67.7 T!)
> 
> 2) Unless the data values work with FEMM's interpolations rather than 
> against them, it can result in extraordinarily long processing times as the 
> Newton-Raphson iterations oscillate and converge very slowly. 
> 
> The former is fairly easy to deal with by simply appending a few artificial 
> values to the dataset that ensure a very near 0 slope.
> 
> The latter is far harder to get a handle on. One of the reasons I'm being 
> quite adamant about making as little change to the real data as possible is to 
> give me a chance to determine the absolute minimum requirements for 
> ensuring accurate simulations with minimal run times.
> 
> The typical approach to the latter problem seems to be to approximate the 
> data using some well-behaved mathematical approximation (1-exp( tanh ) or 
> similar), and feed those values as the materials definition. 
> 
> The problem with that is that it throws away much of the subtlety of the 
> data, which for silicon steel laminates doesn't matter, but for 
> amorphous/nano-crystalline metals can mask important parts of their 
> desirable behaviours.
> 
> 

Re: [Scilab-users] "Smoothing" very localised discontinuities in (scilab: to exclusive) curves.

2016-04-04 Thread scilab . 20 . browseruk
Tim,

(sorry for the earlier "Tom"; finger trouble not name recognition problems :)

> If your FEM software can't handle non-monotonic data, can it handle
> noisy data?

That's a good (and open) question.

The software is FEMM (www.femm.info) which is widely held in high esteem.

When entering BH curve values into the materials definitions for those with 
non-linear magnetic properties, the program simply won't allow you to enter 
non-monotonic data; hence my need to remove these discontinuities before I can 
progress my project.

Once given data it will accept, it then manipulates the data to allow it to ... 
"do its thang". It's easier (and far more accurate) for me to quote David 
Meaker here than try to paraphrase him:

"Since FEMM interpolates between your B-H points using cubic splines, it is 
possible to get a bad curve if you haven’t entered an adequate number of 
points. “Weird” B-H curves can result if you have entered too few points around 
relatively sudden changes in the B-H curve. Femm “takes care of” bad B-H data 
(i.e. B-H data that would result in a cubic spline fit that is not 
single-valued) by repeatedly smoothing the B-H data using a three-point moving 
average filter until a fit is obtained that is single-valued. This approach is 
robust in the sense that it always yields a single-valued curve, but the result 
might be a poor match to the original B-H data. It may also be important to 
note that FEMM extrapolates linearly off the end of your B-H curve if the 
program encounters flux density/field intensity levels that are out of the range 
of the values that you have entered. This extrapolation may make the material 
look more permeable than it “really” is at high flux densities. You have to be 
careful to enter enough B-H values to get an accurate solution in highly 
saturated structures so that the program is interpolating between your entered 
data points, rather than extrapolating. Also in the nonlinear parameters box is 
a parameter, φhmax. For nonlinear problems, the hysteresis lag is assumed to 
be proportional to the effective permeability. At the highest effective 
permeability, the hysteresis angle is assumed to reach its maximal value of 
φhmax. This idea can be represented by the formula: 
φ_h(B) = ( µ_eff(B) / µ_eff,max ) · φ_hmax"
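
For reference, that repeated three-point smoothing is easy to mimic in Scilab.
A rough sketch of the idea on made-up B values (not FEMM's actual code):

b = [0 0.30 0.62 0.58 0.90 1.10 1.25];                // made-up B values with one dip
while or(diff(b) < 0)                                 // repeat until single-valued
    bs = b;
    bs(2:$-1) = (b(1:$-2) + b(2:$-1) + b(3:$)) / 3;   // 3-point moving average
    b = bs;
end
disp(b)                                               // smoothed, now non-decreasing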

So the first goal is to get data that FEMM will accept. After that there are 
other criteria that may require the raw data to be further manipulated.

I've demonstrated that 'inconveniently real-world' data can interact badly, in 
(at least) two different ways, with FEMM's manipulation and use of that data in 
its tensor equations:

1) Unless the final points approach a slope of 0 very closely, it can 
extrapolate the data to produce ludicrously high B values. (E.g. the 'best' I 
achieved was 67.7 T!)

2) Unless the data values work with FEMM's interpolations rather than against 
them, it can result in extraordinarily long processing times as the 
Newton-Raphson iterations oscillate and converge very slowly. 

The former is fairly easy to deal with by simply appending a few artificial 
values to the dataset that ensure a very near 0 slope.
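
For example (purely illustrative numbers), the sort of artificial tail I mean:

h = [0 50 100 200 400 800];          // measured H (A/m) -- made-up numbers
b = [0 0.30 0.60 0.90 1.10 1.25];    // measured B (T)   -- made-up numbers
h = [h, h($) * [2 4 8]];             // push H well past the measured range
b = [b, b($) + 1e-3 * [1 2 3]];      // tiny B increments => near-zero dB/dH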

The latter is far harder to get a handle on. One of the reasons I'm being quite 
adamant about making as little change to the real data as possible is to give 
me a chance to determine the absolute minimum requirements for ensuring 
accurate simulations with minimal run times.

The typical approach to the latter problem seems to be to approximate the data 
using some well-behaved mathematical approximation (1-exp( tanh ) or 
similar), and feed those values as the materials definition. 
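
That is, something of this sort (illustrative values only, not a real material):

H    = 0:100:3200;               // field intensity H (A/m), made-up range
Bsat = 1.4;  H0 = 400;           // made-up fitting parameters
B    = Bsat * tanh(H / H0);      // smooth, strictly monotonic analytic curve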

The problem with that is that it throws away much of the subtlety of the data, 
which for silicon steel laminates doesn't matter, but for 
amorphous/nano-crystalline metals can mask important parts of their desirable 
behaviours.

Indeed, my entire foray into learning Scilab is to try to establish a good 
method for pre-processing these kinds of curves, one that strikes a balance 
between horrendous run times and artificial substitutes that ignore those 
subtleties.

And yes. You've certainly given me food for thought :)



> 
> I'm taking "smoothing" to mean "filtering" here.
> 
> In general the filtering problem is to decide on the characteristics of
> your desired signal, the characteristics of your noise, and then make
> some mathematical transformation that minimizes the effects of anything
> with noise-like characteristics while retaining as much of anything with
> desired-signal-like characteristics.
> 
> This has two salient corollaries: first, you have to know what the
> characteristics of your signal and noise are; and second, the more that
> your signal resembles your noise, the less of it you will be able to
> recover.
> 
> You can spin off from there in a number of different directions, leading
> you to a number of different filtering techniques.
> 
> It appears that you're doing your 

Re: [Scilab-users] "Smoothing" very localised discontinuities in (scilab: to exclusive) curves.

2016-04-04 Thread Serge Steer
If your data are regularly sampled along the y axis you can use the 
sgolay filter 
(http://en.wikipedia.org/wiki/Savitzky%E2%80%93Golay_filter); otherwise 
the loess regression 
(http://cran.r-project.org/doc/contrib/Fox-Companion/appendix-nonparametric-regression.pdf) 
may be tried.


Both methods, and others like medianfilter, sdfilter, ..., are available 
in the CWA Scilab module...
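
If the module is not to hand, a bare-bones Savitzky-Golay-style smoother can
also be written directly in Scilab. A sketch, assuming regularly spaced
samples, a window of 2*k+1 points and a local polynomial of degree p (all the
names and test data below are illustrative):

function ys = sgolay_smooth(y, k, p)
    // Smooth y with a degree-p polynomial fitted by least squares
    // over a sliding window of 2*k+1 samples (Savitzky-Golay idea).
    y  = y(:);
    n  = size(y, "*");
    ys = y;                        // end points are left untouched here
    t  = (-k:k)';                  // local abscissa within the window
    A  = ones(t);                  // design matrix: [1, t, t^2, ..., t^p]
    for q = 1:p
        A = [A, t.^q];
    end
    for i = (k+1):(n-k)
        c = A \ y(i-k:i+k);        // least-squares polynomial coefficients
        ys(i) = c(1);              // fitted value at the window centre (t = 0)
    end
endfunction

// Example on noisy made-up data: 7-point window, quadratic fit.
yn = sin(linspace(0, %pi, 50))' + 0.05 * rand(50, 1, "normal");
ys = sgolay_smooth(yn, 3, 2);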


Serge Steer



Re: [Scilab-users] "Smoothing" very localised discontinuities in (scilab: to exclusive) curves.

2016-04-04 Thread Rafael Guerra
If your data is not recorded in real-time, you can sort it (along the x-axis),
but this does not imply that the "y(x) function" will become monotonic. See
below.
 
As suggested by Stephane Mottelet, see a 3-point median filter solution below,
applied to data similar to yours:
 
 
M = [1.0  -0.2;
     1.4   0.0;
     2.1   0.2;
     1.7   0.45;
     2.45  0.5;
     2.95  0.6;
     2.5   0.75;
     3.0   0.8;
     3.3   1.2];
x0 = M(:,1);
y0 = M(:,2);
clf();
plot2d(x0, [y0 y0], style=[5 -9]);
[x, ix] = gsort(x0, 'g', 'i');   // sort input x-axis into increasing order
y = y0(ix);
k = 1;                           // median filter half-length
n = length(x);
x(2:n+1) = x; y(2:n+1) = y;      // pad both ends by duplicating the edge samples
x(1) = x(2); y(1) = y(2);
x(n+2) = x(n+1); y(n+2) = y(n+1);
n = length(x);
for j = 1:n
    j1 = max(1, j-k);
    j2 = min(n, j+k);
    ym(j) = median(y(j1:j2));
end
plot2d(x, ym+5e-3, style=[3], leg="3-point median filtering@"); // small vertical shift for display purposes
 
 
 
This gets rid of obvious outliers but does not guarantee a monotonic output
(idem for the more robust LOWESS technique, which can be googled).
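
If a non-decreasing output is required afterwards, one crude extra pass (just
an illustration, not necessarily appropriate for B-H data) is a running
maximum over the ym computed above; it guarantees monotonicity at the cost of
flattening any remaining dips:

for j = 2:length(ym)
    ym(j) = max(ym(j), ym(j-1));   // carry the running maximum forward
end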
 
Rafael
 
-Original Message-
From: users [mailto:users-boun...@lists.scilab.org] On Behalf Of
scilab.20.browse...@xoxy.net <mailto:scilab.20.browse...@xoxy.net> 
Sent: Monday, April 04, 2016 1:05 PM
To: users@lists.scilab.org <mailto:users@lists.scilab.org> 
Subject: Re: [Scilab-users] "Smoothing" very localised discontinuities in
(scilab: to exclusive) curves.
 
Yes.
 
C:\Motor>graphRdat T HB1M_Core25_No_Field_No_Epoxy_800Am.rdat
   s = splin( h', b', 'monotone' );
!--error 999
splin: Wrong value for input argument #1: Not (strictly) increasing or +-inf
detected.
at line  22 of exec file called by :
 
Since there are no inf values in the data, that kind of implies that it requires
monotonic input in order to produce monotonic output; which ain't so useful.
 
That said, I get that same error message whichever variation of the splin()
function I try
 
Which suggests there's something wrong with my data, but that's stupid 'cos the
data is real.
The math has to adapt to the data, not the other way around.
 
 


Re: [Scilab-users] "Smoothing" very localised discontinuities in (scilab: to exclusive) curves.

2016-04-04 Thread scilab . 20 . browseruk
Yes.

C:\Motor>graphRdat T HB1M_Core25_No_Field_No_Epoxy_800Am.rdat
s = splin( h', b', 'monotone' );
!--error 999
splin: Wrong value for input argument #1: Not (strictly) increasing or +-inf 
detected.
at line  22 of exec file called by :

Since there are no inf values in the data, that kind of implies that it 
requires monotonic input in order to produce monotonic output; which ain't so 
useful.

That said, I get that same error message whichever variation of the splin() 
function I try.

Which suggests there's something wrong with my data, but that's stupid 'cos the 
data is real.
The math has to adapt to the data, not the other way around.
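
For what it's worth, the check that splin appears to be tripping over is
essentially this (a sketch on made-up, non-monotone values):

h = [1.0 1.4 2.1 1.7 2.45 2.95 2.5 3.0 3.3];   // made-up, non-monotone H values
bad = find(diff(h) <= 0);                      // index i means h(i+1) <= h(i)
disp(bad)                                      // -> 3. 6. : the offending samples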




> -Original Message-
> From: scilab.browseruk.b28bd2e902.jrafaelbguerra#hotmail@ob.0sg.net
> Sent: Mon, 4 Apr 2016 11:20:13 +0200
> To: users@lists.scilab.org
> Subject: Re: [Scilab-users] "Smoothing" very localised discontinuities in
> (scilab: to exclusive) curves.
> 
> Hi Buk.
> 
> Have you tried Scilab's cubic splines using the "monotone" option?
> 
> Regards,
> Rafael
> 
> -Original Message-
> From: users [mailto:users-boun...@lists.scilab.org] On Behalf Of
> scilab.20.browse...@xoxy.net
> Sent: Sunday, April 03, 2016 9:09 PM
> To: users@lists.scilab.org
> Subject: [Scilab-users] "Smoothing" very localised discontinuities in
> curves.
> 
> Hi,
> 
> The data I'm dealing with is experimentally produced, and thus contains
> occasional, localised discontinuities (inflections) that I need to remove
> before the data is suitable for use in FEM modeling software, which
> requires that it be strictly monotonic. The attachment shows the full
> curve plus a close-up of a couple of examples of the type of
> discontinuity I need to deal with.
> 
> I haven't yet decided whether to simply omit points (thus connect A to F
> & G to
> J) or whether to retain the same number of points by interpolating new
> points
> onto that line as shown in red.
> 
> I've looked at and played with several of the smoothing, convolution and
> interpolation routines that Scilab provides, but (besides the fact that I
> don't understand the output some of them produce) they also seem to
> affect the data more than I would like. Some seem to introduce a 'phase
> shift'; others smooth out larger-scale bumps in the curve that need to be
> retained; and others generate many extra points, which I don't think is
> helpful since the FEM software is going to do its own interpolations
> anyway.
> 
> 
> But the bit I'm asking about here is: how do I detect points A and G?
> 
> Any thoughts or pointers as to a) the algorithm to use; b) how to
> implement it
> in SciLab?
> 
> Cheers, Buk.
> 
> 
> 




