Number of digits

Hello,

At the moment I'm programming a procedure to import data from a general text file. There is just one small problem.
The data has the form
1399
1398.2
1397.4
..

The step width between the data points differs between text files.
Igor should calculate the difference between the first two data points, which should be 0.8, but Igor calculates 0.800049.
For other text files with a step width of 0.05, Igor calculates, for example, 0.0499992 or 0.0500031. This is a problem because when I scale my waves with this factor, the small deviations add up.
Is there a simple trick to solve this problem?

Best regards
This is because you are dealing with floating-point numbers. Decimal numbers often do not have an exact binary representation; this is a consequence of computers using binary to represent decimal values. In your case, work out the difference between the first and last points and scale appropriately. SetScale has an option for inclusive scaling (/I).
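For example, something along these lines (the wave names are just placeholders; xRaw is assumed to hold the values loaded from the text file and dataWave the wave to be scaled):

// Inclusive scaling: the x values of the first and last points are set exactly,
// so rounding in the step width cannot accumulate across the wave
SetScale/I x, xRaw[0], xRaw[numpnts(xRaw)-1], "", dataWave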

or you could do:

(last - first) * (p / (numpnts - 1)) + first
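As a sketch of that manual approach (names are placeholders), written as a function:

// Builds an x wave from the exact first and last values and the number of points;
// p is Igor's built-in point index inside a wave assignment
Function MakeXWave(first, last, npnts)
	Variable first, last, npnts

	Make/O/D/N=(npnts) xWave
	xWave = (last - first) * (p / (npnts - 1)) + first
End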
This is a consequence of finite-precision floating-point arithmetic, which in general does not give exact results. Finite precision can represent only a small subset of the real numbers exactly; the vast majority are represented approximately.

You can get better precision by loading the data into double-precision waves (the /D flag of the LoadWave operation). However, it will still not be exact in all cases.
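For example (no file name is given here, so Igor presents the Open File dialog):

LoadWave/G/D    // /G: general text format, /D: create double-precision waves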

Single-precision floating point stores 24 binary bits of precision, which is about 7 decimal digits. Double-precision floating point stores 53 binary bits, which is about 15 decimal digits.

Here are some commands that illustrate this issue:
Make/N=2 SP = {1388.0, 1388.8}      // Single-precision floating point
Make/N=2/D DP = {1388.0, 1388.8}    // Double-precision floating point
Edit SP, DP
Print SP[1] - SP[0]     // single-precision difference, default display precision
Print/D SP[1] - SP[0]   // same single-precision difference, full-precision display
Print DP[1] - DP[0]     // double-precision difference, default display precision
Print/D DP[1] - DP[0]   // same double-precision difference, full-precision display