Weighting a linear fit with a Gaussian

I am having trouble finding what I am looking for in a way that can be applied in Igor without writing a new fitting program. As the title states, I want to modify the normal curve-fitting scheme so that each point's contribution to the least-squares fit (LSF) is weighted by its distance from the central point according to a Gaussian curve. The reason for doing this is that I am trying to calculate a faux derivative of a raw set of data, using the elementary definition of a derivative: at its core, a derivative is just the slope of the tangent at each point of a curve. My data changes at a fairly smooth rate, so at this point I have made a panel that lets you include N points on either side of the point being investigated and fits a small line to that subset of points. This works pretty well for very smooth parts of the data, but at "sharp" changes in slope the slope isn't accurately represented. The "goodness of slope" is verified by taking a trapezoidal integral of the derivative and comparing it to the original data.
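To make the scheme concrete (this is just the idea in NumPy, not my actual Igor panel code; the test data are invented):

```python
import numpy as np

def window_slope(x, y, n):
    """Estimate dy/dx at each point by fitting a line to the
    2n+1 points centred on it (fewer points near the edges)."""
    dydx = np.empty_like(y, dtype=float)
    for i in range(len(y)):
        lo, hi = max(0, i - n), min(len(y), i + n + 1)
        dydx[i] = np.polyfit(x[lo:hi], y[lo:hi], 1)[0]  # slope of the local line
    return dydx

x = np.linspace(0, 10, 101)
y = x ** 2 / 100                      # smooth test data
d = window_slope(x, y, 5)

# "goodness of slope" check: the trapezoidal integral of d
# should roughly reproduce the original data
recon = np.concatenate(([0.0], np.cumsum((d[1:] + d[:-1]) / 2 * np.diff(x))))
```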

My desire is to weight points closer to the single point being investigated more heavily than those farther away. I have seen the /W and /I parameters in the CurveFit help, but those want a wave of standard deviations or standard errors (/W says standard error, while /I says standard deviation; I would like a clarification on which is actually used, since those two things are not the same). Is there a way to bridge my desire for Gaussian-weighted contributions with an artificially created wave of standard deviations (errors?) that will mimic the effect I am seeking?

Hopefully this idea is expressed clearly enough.

-Cory

So you want to fit a line to X, Y data using a weighting wave that has a Gaussian profile centred at the ith point? That is easily done by creating a weighting wave and assigning values with a wave assignment:

weightwave=Gauss(Xwave[p], centre, width)

Provided you don't need statistically realistic estimates of the uncertainties in the fit parameters, this approach should work: it's the relative magnitude of the values in the weighting wave that matters.
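If it helps to see the same math outside Igor, here is a minimal NumPy sketch (the data, centre index, and width are made up for illustration). np.polyfit's w argument likewise multiplies the residuals, i.e. it behaves like 1/sigma:

```python
import numpy as np

x = np.linspace(0, 10, 101)
y = np.sin(x)                         # stand-in data
i0, width = 50, 0.5                   # point of interest and Gaussian width

# Gaussian weighting wave, largest at the point being investigated
w = np.exp(-0.5 * ((x - x[i0]) / width) ** 2)

# w multiplies the residuals, so larger w means more influence on the fit
slope, intercept = np.polyfit(x, y, 1, w=w)
# slope should approximate cos(x[i0]), the local derivative of sin
```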

But I have to wonder whether your problem might be better tackled with some alternative strategy. For instance, if the data can be fit with a polynomial function, the derivative at any point can be obtained analytically by differentiating the polynomial. Or perhaps you could apply binomial smoothing to a copy of the data and then use the Differentiate operation.
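To illustrate the polynomial route (in NumPy for brevity; the data and degree are invented): fit once, then differentiate the coefficients analytically:

```python
import numpy as np

x = np.linspace(0, 10, 101)
y = x ** 3 - 2 * x                   # stand-in for the raw data

coefs = np.polyfit(x, y, 3)          # fit a cubic to the whole data set
dcoefs = np.polyder(coefs)           # differentiate the polynomial analytically
dydx = np.polyval(dcoefs, x)         # exact derivative of the fitted model
```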

Thank you for the polynomial fit suggestion. I had not thought of that, and it is a very clever idea.

As for the Gaussian wave, the issue is not making a Gaussian-distributed weighting wave, it's applying it within Igor's fitting scheme. Since Igor requires a "wave of standard errors" as the weighting wave, I am unsure how to translate a Gaussian distribution of 0-to-1 weights into an equivalently impactful wave of standard errors.

In reply to by Dolbash

The default for weighting is 1/sd. That is, unless you specify otherwise by setting the flag /I=1, the weighting for data point i is proportional to the value of weight i.

try this:

make test=x^2/100
display test
duplicate test weight
weight=gauss(x, 60, 5)
CurveFit/X=1 line test /W=weight /D

 

In reply to by Dolbash

There are no changes here! As it says in your screenshot, the default (no flag) is the same as /I=0, meaning the weighting wave contains the reciprocal of the standard deviation.

Least squares minimizes the sum of the squared residuals, where residual i is (fit(Xi)-Yi)/sigma(Yi). If the weight Wi is specified as 1/sigma(Yi), then you minimize the sum of squares of Wi*(fit(Xi)-Yi). A weight of zero therefore excludes a point from the fit, and with increasing weight a point has increasing influence on the fit.
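That zero-weight behaviour is easy to check numerically. A small NumPy sketch (invented data) where zeroing one weight removes an outlier's influence entirely:

```python
import numpy as np

x = np.arange(10, dtype=float)
y = 2 * x + 1
y[4] = 100.0                  # one gross outlier

w = np.ones_like(x)
w[4] = 0.0                    # zero weight: the point is excluded from the fit

slope, intercept = np.polyfit(x, y, 1, w=w)
# the underlying line y = 2x + 1 is recovered (to numerical precision)
```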

I imagine you could also reach your goal with one or the other of these two approaches (which, it seems, you are likely implementing with a for(ic...) loop?).

Method 1 - Smooth + Differentiate

* Smooth the data (the advantage being that you can use any of the various methods in the Smooth operation to control how the smoothing is applied).

* Take the discrete derivative of the smoothed curve
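Method 1 can be sketched in NumPy (binomial smoothing is just repeated 1-2-1 convolution; the signal, noise level, and pass count here are invented):

```python
import numpy as np

def binomial_smooth(y, passes=10):
    """Repeated 1-2-1 convolution, i.e. binomial smoothing."""
    for _ in range(passes):
        p = np.pad(y, 1, mode="edge")
        y = (p[:-2] + 2 * p[1:-1] + p[2:]) / 4
    return y

x = np.linspace(0, 10, 201)
rng = np.random.default_rng(0)
y = np.sin(x) + 0.02 * rng.normal(size=x.size)   # noisy test data

dydx = np.gradient(binomial_smooth(y), x)        # discrete derivative of smoothed data
```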

Method 2 - FFT + Multiply + IFFT

The procedure is outlined in this reference.

https://math.mit.edu/~stevenj/fft-deriv.pdf
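A hedged NumPy sketch of that spectral-derivative recipe, assuming the data are periodic on a uniform grid (signal invented):

```python
import numpy as np

n = 256
L = 2 * np.pi
x = np.arange(n) * L / n
y = np.sin(3 * x)                              # smooth, periodic test signal

k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)     # angular wavenumbers
dY = 1j * k * np.fft.fft(y)                    # differentiate: multiply by ik
dY[n // 2] = 0.0                               # zero the Nyquist bin, per the PDF
dydx = np.fft.ifft(dY).real
```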

All methods suffer either from over-smoothing discontinuities or from instability in the derivatives. To understand why, consider that what you want is a method that removes high-frequency variation except where it "matters". Coming up with a generalized definition of "where it matters" is very difficult.

You might look into smoothing with Loess (see the Loess operation).

You might look into a "smoothing spline". In Igor 8, see the Interpolate2 operation.

Both Loess and a smoothing spline take into account a measure of the deviation of the smoothed data from the input data. Loess basically uses your technique of computing fitted slopes, but has a sophisticated measure of variance from the input data to inform how the fitting is done. The documentation for the Loess operation has links to a couple of example experiments that you may wish to look at.
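As a rough illustration of the local-fit idea behind Loess (a NumPy toy version only: no tricube kernel, no robustness iterations, all parameters invented):

```python
import numpy as np

def local_linear_slope(x, y, span=0.5):
    """Slope of a Gaussian-weighted straight-line fit at every point:
    the windowed-fit idea with smooth weights instead of a hard cutoff."""
    out = np.empty_like(y, dtype=float)
    for i in range(len(x)):
        w = np.exp(-0.5 * ((x - x[i]) / span) ** 2)
        out[i] = np.polyfit(x, y, 1, w=w)[0]   # w multiplies the residuals
    return out

x = np.linspace(0, 10, 101)
y = np.sin(x)
d = local_linear_slope(x, y)      # approximates cos(x)
```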

Be sure to take into account that you are looking at the data and saying, "I know what it should look like". That's not always a scientifically rigorous determination :)