19 Replies to “Lecture – 27 Learning : Neural Networks”

  1. To finish up: a lazy, arrogant, or useless teacher is the kind that throws only abstract math at the students. A *good* teacher can use metaphor and analogy to help students picture in their minds what the math is expressing. I’ve seen this before in programming courses and in math courses. People may be able to use the tools, but they don’t gain an intuitive feel for the material without different views, both mathematical and non-mathematical, of the subject matter.

  2. You think chain rule calculus is baby math? To a mathematician it obviously is, but most people, even those who might have degrees but don’t use calculus every day, get rusty. I doubt you could derive the chain rule from first principles, even if you know how to use it in a mechanical way. But the key conceptual ideas behind the power of backprop have to do with the nonlinear sigmoid and its differentiability, in relation to the credit assignment problem and functional superposition.
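[Editor’s note: the point about the sigmoid’s differentiability being central to backprop can be made concrete. A minimal sketch (not from the lecture; the numbers are made up) showing why the sigmoid is convenient: its derivative can be written in terms of its own output, so credit can be assigned backwards through the chain rule using only each unit’s activation.]

```python
import math

def sigmoid(x):
    """Logistic sigmoid: squashes any real input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(x):
    """sigma'(x) = sigma(x) * (1 - sigma(x)): the derivative is
    expressible via the unit's own output, which is what makes
    backprop's credit assignment cheap to compute."""
    s = sigmoid(x)
    return s * (1.0 - s)

# Chain rule in one step: for y = sigmoid(w * a),
# dy/dw = sigma'(w * a) * a  (illustrative values below)
w, a = 0.5, 2.0
dy_dw = sigmoid_derivative(w * a) * a
```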

  3. And what I am saying is that I understand all of that, but if you cannot tolerate even baby math, there are many things which you won’t understand on this topic (including, in particular, the very subjects you raised). If you do not mind that, then so be it, but a lot of people assume at least a marginally mathematical mindset.

  4. I am saying that a mathematics student trying to improve upon back-propagation will only do so from the view of treating it as a differential mathematical machine. They will not know or understand its relationship to what’s gone on before, nor what it’s intended to model, which is parallel distributed processing in brains. Stupid approaches led to dead ends like perceptrons and newer support vector machines, which are mathematical circle jerks by people who don’t know brain science.

  5. What are you even talking about? I would say I’m living proof that knowing the flaws of back-propagation from a purely mathematical perspective is possible (this shouldn’t be hard to believe). Please stop calling people names and complaining about math. The math in this video is of a trivial nature.

  6. Typical 3rd world mentality. Middle East schools are similar in that they focus only on the harder sciences and stupidly think those are the only ones that matter. Indians are forgetting their intellectual history of a broad spectrum of ideas. To properly understand learning in general, you need to understand psychology, neuroscience, and cognitive science before you can fully grasp that backpropagation is just one way of doing it. You know you’re not that bright.

  7. This video is average quality. The relatively archaic explanation methods and strong mathematical bias place burdens on the learner to be good at math and able to follow the topics in an order that’s not necessarily the easiest.

  8. Seems like the professor is very confused… He also skips topics by making the excuse “I will tell in a mo” and then forgetting. Not good for newbies 🙁 . One must read a book to learn the basics before watching this video.

  9. I used to teach this 20 years ago, but with a more visual method, using color interactive graphics (not as developed as now, but still quite helpful) to illustrate the different steps and show the convergence over time. Regarding BIAS: it does NOT matter whether you use + or −, as long as you are consistent throughout the learning and operation phases.

  10. According to the famous book “Neural Networks: A Comprehensive Foundation”, in = W·A + BIAS. Other books say that the threshold in biological neurons has a negative value, but in artificial ones it may be positive, in which case it is usually referred to as a bias. And that is what makes me confused!!!! If you have any explanation you can help me. Thank you for responding.

  11. I can’t understand one thing: in some books I find that in = sum(w·a) + bias, and in other ones in = sum(w·a) − bias !!! Which of them is correct? Please help.
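[Editor’s note: as the reply above points out, the two textbook conventions are the same quantity once you set bias = −threshold; only the sign bookkeeping differs. A minimal sketch with made-up weights and activations (not from any of the books cited) showing the two forms agree:]

```python
# Two conventions for a unit's net input:
#   in = sum(w * a) + bias        ("plus bias" convention)
#   in = sum(w * a) - threshold   ("minus threshold" convention)
# They coincide whenever bias == -threshold.
weights = [0.4, -0.2, 0.7]       # hypothetical weights
activations = [1.0, 0.5, -1.0]   # hypothetical input activations
threshold = 0.3
bias = -threshold                # the conversion between conventions

weighted_sum = sum(w * a for w, a in zip(weights, activations))
in_with_bias = weighted_sum + bias
in_with_threshold = weighted_sum - threshold

assert in_with_bias == in_with_threshold  # identical by construction
```

So neither book is wrong; you just have to stay consistent with one convention through training and use, which is exactly the point made in reply 9.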
