New book about Igor programming

For quite a while there was no book about Igor programming, and it was hard for my students to reach a certain level.
I got the feeling that many people actually had problems with programming in Igor, so I decided to write a little book about it.

It is now available on Amazon.

I tried to write the book from a more general perspective (I used C/C++ and Python in the past), discussing things like code encapsulation, graphical user interfaces, and regular expressions at a beginner's level (some people are completely unaware that Igor even supports concepts like regular expressions).
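As a small taste of the latter, Igor has built-in regular expression support via GrepString() and SplitString; here is a minimal sketch of my own (not taken verbatim from the book):

function demoRegEx()
	// GrepString returns 1 if the string matches the regular expression
	print GrepString("sample_042.dat", "[0-9]+")		// prints 1
	// SplitString extracts the captured group into a string variable
	string numPart
	SplitString /E="([0-9]+)" "sample_042.dat", numPart
	print numPart										// prints "042"
end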

And several new (and nice) features of Igor 7 are mentioned as well.
Great! I just ordered a copy. Hoping it will get more people in my lab programming in Igor.
Wonderful to see such a resource in print!

--
J. J. Weimer
Chemistry / Chemical & Materials Engineering, UAH
Hi,

I too have ordered a copy and look forward to it.

Maybe we should form a book club ;)


Andy
Thanks for the positive feedback! It is actually really nice to see that the community cares!
I skimmed through the book last weekend; it is a good read for the Igor beginner.

@MSchimd: I've sent my longer feedback to the email provided in the book.
Hi,

I got the book and read it over the weekend. Very nice. I even learned/appreciated a thing or two.

I will provide some additional feedback directly.

I think there is an opportunity for a discussion about how to teach getting the most out of Igor Pro when the target user is most likely a scientist or engineer by day who needs a tool to get things done. This is in contrast to programmers being the primary audience for some other programs/languages.

The hybrid interface is a unique feature which I use often for ad hoc analysis. I also find myself creating a user interface, even just for myself, when I am exploring data, something Igor Pro again does very well. So I think there may be common workflows that could serve as a teaching method to bring new users up the learning curve faster.
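To make that concrete, here is a minimal sketch (my own, not from the book) of the kind of throwaway interface I mean: a panel with a single control that rescales a wave while you watch the graph.

function QuickExplore()
	make /o/n=100 exploreData = sin(x/10)			// some toy data to look at
	display exploreData
	NewPanel /W=(100,100,320,160) /N=ExplorePanel
	SetVariable scaleVar, title="scale", value=_NUM:1, proc=ScaleVarProc
end

function ScaleVarProc(sva) : SetVariableControl
	STRUCT WMSetVariableAction &sva
	if (sva.eventCode == 1 || sva.eventCode == 2)	// mouse up or Enter key
		wave w = exploreData
		w = sva.dval * sin(x/10)					// rescale in place
	endif
	return 0
end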

Andy
Dear Igor users,
I updated the book when Igor 8 was released and included another example that shows how to do a curve fit with a neural network. Because I think neural networks are awesome, I list this example below, so that everyone with an older version of the book can see it as well.
Run this module with

  1. NeuNet#teach(150)

  2. NeuNet#run()

  3. NeuNet#show()

  4. Call NeuNet#run() several more times to play around a bit


The parameter 150 is the number of training sets, which I found works best for this NN. Note that getting the number of training sets and the learning rate right can be a bit finicky; play around yourself.

#pragma moduleName = NeuNet

static constant N = 70
static constant low = 0
static constant high = 1
static constant width = 0.01


static function teach(M)
	
	variable M 				// number of parameter sets for training
	variable i
															
	make /o/d/n=(M,2) TrainingParameters		
	wave par = TrainingParameters
			
	// for simplicity, use random parameters in a reasonable range
	// first column: amplitude
	// second column: position
	// then, each row contains a full parameter set
	par[][0]=0.1 + 0.8*(0.5+enoise(0.5))    			//[0.1 ; 0.9]
	par[][1]=0.5 + enoise(0.45)					//[0.05 ; 0.95]
	
	// generate the curves of the training parameters
	make /o/d/n=(M,N) TrainingCurves			
	wave tc = TrainingCurves
	SetScale /I y, low, high, tc  	 		// note the normalization to [0,1]
	
	// store them in rows, not in columns
	for (i=0; i<M; i+=1)		
		tc[i][] = par[i][0]*exp(-(y-par[i][1])^2/width)
	endfor
	
	//now the neural network will learn the 
	//connection between parameters and curveshape

	NeuralNetworkTrain nhidden=50, input=tc, output=par
	
	// the result of this learning process will be saved in two 
	// waves M_weights1 and M_weights2
	// these waves contain all the necessary information for running the network

end


static function run()
	
	// ------------ 
	// make an arbitrary test curve
	make /o/d/n=(N) sampleWave		// number of points has to be the same
									// as in the training set!
	wave sW = sampleWave
	SetScale /I x,low,high, sW
	
	// neural networks are better at interpolating than extrapolating:
	// use smaller ranges than in the training set
	variable randomHeight = 0.2 + 0.6*(0.5+enoise(0.5))		//[0.2 ; 0.8]
	variable randomLoc = 0.5 + enoise(0.25)				//[0.25 ; 0.75]
	
	sW = randomHeight*exp(-(x-randomLoc)^2/width)
	sW += gnoise(0.01)

	// ------------
	// make references to the output waves of the training session
	wave W1 = M_weights1
	wave W2 = M_weights2
	
	// run the neural network 
	NeuralNetworkRun input=sW, weightsWave1=W1, weightsWave2=W2	
	
	// ------------
	// draw the result
	// the wave W_NNResults is automatically created by the neural network
	wave NNRes = W_NNResults   
	
	make /D /O /N=(N) NNCurve
	wave NNC = NNCurve
	SetScale /I x,low, high, NNC
	
	NNC = NNRes[0]*exp(-(x-NNRes[1])^2/width)
	
end


static function show()

	// call this function only after NeuNet#run() has been executed at least once
	// so that all the waves actually exist

	wave sW = sampleWave
	wave NNC = NNCurve
	
	Display sW
	AppendToGraph NNC
	
end
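
If you want everything in one go, you could also append a small driver like this to the module (my own addition, not part of the original example); after compiling, NeuNet#demo() reproduces steps 1 to 3 from above:

// my addition: run the full train/run/show cycle in one call
static function demo()
	teach(150)
	run()
	show()
end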