Lecture – 27 Learning : Neural Networks

Lecture Series on Artificial Intelligence by Prof. P. Dasgupta, Department of Computer Science & Engineering, IIT Kharagpur. For more Courses visit nptel.iitm.ac.in


19 Responses to Lecture – 27 Learning : Neural Networks

  1. Baha Thabet says:

    very helpful, many thanks :)

  2. pavan sughosh says:

    Wonderful video for understanding the Intelligent Character Recognition (ICR) module. The nice classical music at the end sums it all up :)

  3. nightowl8936 says:

    To finish up. A lazy, arrogant, or useless teacher is the kind that throws only abstract math at the students. A *good* teacher can use metaphor and analogy to help the students picture in their minds what the math is expressing. I’ve seen this before in programming courses and in math courses. People may be able to use the tools, but they don’t gain an intuitive feel for the stuff without different views, both mathematical and non-mathematical, of the subject matter.

  4. nightowl8936 says:

    You think chain rule calculus is baby math? To a mathematician it obviously is, but most people, even those with degrees who don’t use calculus every day, will get rusty. I doubt you could derive the chain rule from first principles, even if you know how to use it in a mechanical way. But the key conceptual ideas for understanding the power of backprop have to do with the nonlinear sigmoid and its differentiability, in relation to the credit assignment problem and functional superposition.
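The differentiability the comment above points to is what makes backpropagation mechanically cheap: the logistic sigmoid has the derivative σ′(x) = σ(x)·(1 − σ(x)), so the gradient at each unit falls out of its already-computed activation. A minimal sketch in Python (function names are my own, not from the lecture):

```python
import math

def sigmoid(x):
    """Logistic sigmoid: 1 / (1 + e^-x)."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(x):
    """sigma'(x) = sigma(x) * (1 - sigma(x)) -- reuses the forward-pass value."""
    s = sigmoid(x)
    return s * (1.0 - s)

# Sanity check: compare against a central finite-difference approximation.
h = 1e-6
for x in (-2.0, 0.0, 1.5):
    numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2.0 * h)
    assert abs(sigmoid_derivative(x) - numeric) < 1e-6
```

This closed-form derivative is exactly what the chain rule multiplies through, layer by layer, when assigning credit to hidden units.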

  5. idunnononame says:

    And what I am saying is that I understand all of that, but if you cannot tolerate even baby math, there are many things which you won’t understand on this topic (including, in particular, the very subjects you raised). If you do not mind that, then so be it, but a lot of people assume at least a marginally mathematical mindset.

  6. nightowl8936 says:

    I am saying that a mathematics student trying to improve upon back-propagation will only do so by viewing it as a differential mathematical machine. They will not know or understand its relationship to what has gone on before, nor what it is intended to model, which is parallel distributed processing in brains. Stupid approaches led to dead ends like perceptrons and newer support vector machines, which are mathematical circle jerks by people who don’t know brain science.

  7. idunnononame says:

    What are you even talking about? I would say I’m living proof that knowing the flaws of back-propagation from a purely mathematical perspective is possible (this shouldn’t be hard to believe). Please stop calling people names and complaining about math. The math in this video is of a trivial nature.

  8. CassandraAbbey says:

    This was traditionally used to refer to a network or circuit of biological neurons.

  9. bayrees says:

    @Norman60Fahrer How many times did (do) you confuse the gas pedal with the brake pedal when driving at high speeds?

  10. nightowl8936 says:

    Typical 3rd world mentality. Middle East schools are similar in that they focus only on the harder sciences and stupidly think those are the only ones that matter. Indians are forgetting their intellectual history of a broad spectrum of ideas. To properly understand learning in general, you need to understand psychology, neuroscience, and cognitive science before you can fully grasp that backpropagation is just one way of doing it. You know you’re not that bright.

  11. venkatarun95 says:

    IITan’s are expected to know Maths!!

  12. nightowl8936 says:

    This video is average quality. The relatively archaic explanation methods and strong mathematical bias place burdens on the learner to be good at math and to follow the topics in an order that’s not necessarily the easiest.

  13. raptor12143 says:

    It seems like the professor is very confused… He also skips topics with the excuse “I will tell you in a moment” and then forgets. Not good for newbies :( . One must read a book to learn the basics before watching this video.

  14. robextra0 says:

    I used to teach this 20 years ago, but with a more visual method, using color interactive graphics (not as developed as now, but still quite helpful) to understand the different steps and show the convergence over time. Regarding BIAS: it does NOT matter whether you use + or -, as long as you are consistent throughout the learning and operation phases.

  15. SalsaTiger83 says:

    we can look forward to the khanacademy version of this 😉

  16. Faisal Satti says:

    Great video and useful lecture material.

  17. sacaafrica says:

    According to the famous book “Neural Networks: A Comprehensive Foundation”, in = WA + bias. Other books say that the threshold in biological neurons has a negative value, but in artificial neurons it may be positive; in that case it is usually referred to as a bias. And that is what makes me confused!!!! If you have any explanation, you can help me. Thank you for responding.

  18. theoark says:

    consider the case of bias being negative.

  19. sacaafrica says:

    I can’t understand one thing: in some books I find that in = sum(w·a) + bias, and in others in = sum(w·a) − bias !!! Which of them is the correct one? Please help.
