NeuralNetworkTrain variability

Does the NeuralNetworkTrain function start with random interconnection weights? I'm running the following command:

NeuralNetworkTrain Iterations=1e6, MinError=1e-8, Momentum=0.075, LearningRate=0.05, nhidden=18, input=trainingDataWave, output=trainingResultWave

Using the same input and output data, I get different RMS errors from run to run after 1e6 iterations (deleting the interconnection weight waves between runs), and therefore, I guess, slightly different interconnection weights. Is this not designed to be totally repeatable?

I'm not too worried, as the RMS errors on my training data are respectably small, but just wanted to check I wasn't missing something fundamental here...

Thanks in advance for reassurance or for pointing out my stupidity.
Hello jdorsey,

Neural network training starts by initializing the array of weights using a pseudo-random number generator. If you are concerned about repeatability, you have the option of providing your own set of weights (see the weightsWave1 and weightsWave2 keywords).
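[Not Igor code, and not the actual NeuralNetworkTrain implementation - just a minimal Python sketch of the idea above, using a hypothetical `train_net` toy network fitting sin(x). Seeding the generator that draws the initial weights (equivalent to supplying your own weight waves) makes a run exactly repeatable; a different seed gives a different starting point and hence a slightly different end point:]

```python
import numpy as np

def train_net(seed, nhidden=18, lr=0.05, iterations=2000):
    """Toy 1-hidden-layer net trained by batch gradient descent.
    Initial weights come from a seeded generator, so the same seed
    reproduces the run bit for bit."""
    rng = np.random.default_rng(seed)
    x = np.linspace(0.0, np.pi, 40).reshape(-1, 1)   # toy training data
    y = np.sin(x)
    w1 = rng.normal(scale=0.5, size=(1, nhidden))    # seeded initial weights
    b1 = np.zeros(nhidden)
    w2 = rng.normal(scale=0.5, size=(nhidden, 1))
    b2 = np.zeros(1)
    for _ in range(iterations):
        h = np.tanh(x @ w1 + b1)                     # forward pass
        err = (h @ w2 + b2) - y
        dh = (err @ w2.T) * (1.0 - h * h)            # backprop through tanh
        w2 -= lr * (h.T @ err) / len(x)
        b2 -= lr * err.mean(axis=0)
        w1 -= lr * (x.T @ dh) / len(x)
        b1 -= lr * dh.mean(axis=0)
    h = np.tanh(x @ w1 + b1)
    return float(np.sqrt((((h @ w2 + b2) - y) ** 2).mean()))  # final RMS error

# Same seed gives an identical RMS error; different seeds end up
# at (slightly) different weights and errors, as observed above.
```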

As for getting "different" results: if you treat this as an optimization problem, you should note that completing N iterations does not assure you of either a unique or a converged solution. It is prudent to run subsequent tests to make sure that the resulting network actually works as you expect.
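[One simple such test, again as a hedged Python sketch on a hypothetical toy net rather than the real operation: re-run with a larger iteration budget. If the error is still falling, the run stopped because it hit the iteration cap, not because it converged to MinError.]

```python
import numpy as np

def rms_after(iterations, seed=0, nhidden=18, lr=0.05):
    """RMS training error of a toy 1-hidden-layer net after a fixed
    number of batch gradient-descent steps (fixed seed for comparability)."""
    rng = np.random.default_rng(seed)
    x = np.linspace(0.0, np.pi, 40).reshape(-1, 1)
    y = np.sin(x)
    w1 = rng.normal(scale=0.5, size=(1, nhidden))
    b1 = np.zeros(nhidden)
    w2 = rng.normal(scale=0.5, size=(nhidden, 1))
    b2 = np.zeros(1)
    for _ in range(iterations):
        h = np.tanh(x @ w1 + b1)
        err = (h @ w2 + b2) - y
        dh = (err @ w2.T) * (1.0 - h * h)
        w2 -= lr * (h.T @ err) / len(x)
        b2 -= lr * err.mean(axis=0)
        w1 -= lr * (x.T @ dh) / len(x)
        b1 -= lr * dh.mean(axis=0)
    h = np.tanh(x @ w1 + b1)
    return float(np.sqrt((((h @ w2 + b2) - y) ** 2).mean()))

# A smaller error at 2N iterations than at N means the N-iteration
# run had not converged; the iteration cap stopped it, not MinError.
```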

If you are going to provide your own starting weights, it is tempting to feed the results of a previous training run back into the operation. At this point, if your momentum parameter is small, your solution is likely to remain in the same local minimum, so it is not at all certain that you gain much by repeat training.
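[The warm-start idea can be sketched the same way - again a hypothetical Python toy net with a small momentum term, not the real operation. Passing the weights from a previous run back in continues the descent within the same basin: the error does not get worse, but it only refines the minimum you were already in.]

```python
import numpy as np

def train(params=None, iterations=1000, nhidden=18, lr=0.05,
          momentum=0.075, seed=0):
    """One training run of a toy net; pass `params` from a previous
    run to warm-start instead of drawing fresh random weights."""
    x = np.linspace(0.0, np.pi, 40).reshape(-1, 1)
    y = np.sin(x)
    if params is None:                        # fresh pseudo-random start
        rng = np.random.default_rng(seed)
        params = [rng.normal(scale=0.5, size=(1, nhidden)), np.zeros(nhidden),
                  rng.normal(scale=0.5, size=(nhidden, 1)), np.zeros(1)]
    w1, b1, w2, b2 = (p.copy() for p in params)
    vel = [np.zeros_like(p) for p in (w1, b1, w2, b2)]   # momentum terms
    for _ in range(iterations):
        h = np.tanh(x @ w1 + b1)
        err = (h @ w2 + b2) - y
        dh = (err @ w2.T) * (1.0 - h * h)
        grads = [(x.T @ dh) / len(x), dh.mean(axis=0),
                 (h.T @ err) / len(x), err.mean(axis=0)]
        for p, v, g in zip((w1, b1, w2, b2), vel, grads):
            v *= momentum                     # small momentum: stays in basin
            v -= lr * g
            p += v                            # in-place weight update
    h = np.tanh(x @ w1 + b1)
    rms = float(np.sqrt((((h @ w2 + b2) - y) ** 2).mean()))
    return [w1, b1, w2, b2], rms

# Warm-starting from a previous run's weights continues the same
# descent, so the error does not increase - but it rarely escapes
# the local minimum the first run settled into.
```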

WaveMetrics, Inc.
Thanks for that - explains everything. I was aware that it'll probably take something like an infinite number of iterations for the weights to completely stabilise. The repeat training was just part of an exercise in optimising the momentum and learning rate. If it's starting in a different state each time, then I'm impressed at the consistency with which it keeps heading towards the same solution / set of weights.