2 thoughts on “In Which I Trained A Neural Network :)”

  1. Eric, I can’t track down your real email, so I’ll leave a comment here and hope for the best.

    The other night I mentioned other neural network structures that deal better with discrete data, but I couldn’t remember the names then. A little web research yielded these memory-joggers: Hopfield nets, Boltzmann machines, and restricted Boltzmann machines. I would look particularly at lectures/slides and such from Geoffrey Hinton of U. Toronto; there is a lot available on the web. (I’m prejudiced: I initially learned about neural networks from him 20+ years ago.) I have not used them personally, but there is a fair amount of literature on experience with them. There is even some software for Hopfield networks and Boltzmann annealing in Python. Books? Even if I could find my old books on the topic, they’d be badly out of date.

    I hope these hints are helpful. In my experience, the standard feed-forward, back-propagation neural networks don’t work well with discrete data, largely, I think, because training depends on partial derivatives of the outputs with respect to the inputs, and when the inputs are only 0s and 1s those derivatives are hard to assess. These other types of networks use an entirely different approach to training, as I recall, and might yield better results.
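
    To make that “entirely different approach to training” concrete, here is a minimal sketch of a Hopfield network in Python with NumPy. This is my own toy illustration, not the Python software mentioned above; the function names and the two stored patterns are invented for the example. The point is that “training” is a one-shot Hebbian sum of outer products, with no gradients anywhere, so binary inputs pose no problem:

      import numpy as np

      # Toy Hopfield network: store binary (+/-1) patterns with a
      # one-shot Hebbian rule, then recall them by letting the state
      # settle. No gradients or back-propagation involved.

      def train_hopfield(patterns):
          """Hebbian learning: average of outer products, no self-loops."""
          n = patterns.shape[1]
          W = np.zeros((n, n))
          for p in patterns:
              W += np.outer(p, p)
          np.fill_diagonal(W, 0)
          return W / len(patterns)

      def recall(W, probe, sweeps=5, seed=0):
          """Asynchronous updates: set each unit to the sign of its local field."""
          rng = np.random.default_rng(seed)
          state = probe.copy()
          for _ in range(sweeps):
              for i in rng.permutation(len(state)):
                  state[i] = 1 if W[i] @ state >= 0 else -1
          return state

      # Store two 8-unit patterns, then recover one from a corrupted probe.
      patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                           [1, 1, 1, 1, -1, -1, -1, -1]])
      W = train_hopfield(patterns)
      noisy = patterns[0].copy()
      noisy[:2] *= -1                     # flip two bits
      print(recall(W, noisy))             # settles back onto patterns[0]

    Recall is just repeated threshold updates that pull a corrupted probe back into the nearest stored attractor, which is exactly the kind of discrete-data behavior back-propagation struggles with.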

    That’s it for now. If you have any questions, my email is landau@ricksoft.com. And I’ll see you at the next Boston Python meeting, I hope.

    rick
