Lec-3 Gradient Descent Algorithm

Lecture Series on Neural Networks and Applications by Prof. S. Sengupta, Department of Electronics and Electrical Communication Engineering, IIT Kharagpur. Fo…

This entry was posted in Neural Networks. Bookmark the permalink.

17 Responses to Lec-3 Gradient Descent Algorithm

  1. Paul Donnelly says:

    Two slopes define a plane in 3D, not a 3D line. A constant bias input would move the plane up or down.
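
    In equation form, a two-input linear unit computes y = w1·x1 + w2·x2 + b: the two weights are the slopes along the two input axes, and the bias b shifts the whole plane up or down without tilting it. A minimal NumPy sketch of this (the variable names are illustrative, not from the lecture):

        import numpy as np

        w = np.array([0.5, -0.3])  # two slopes, one per input axis
        b = 1.0                    # constant bias: shifts the plane vertically

        def plane(x):
            # Output of a two-input linear unit: a plane in 3D.
            return w @ x + b

        # Raising b moves every point on the plane up by the same amount.
        x = np.array([2.0, 4.0])
        print(plane(x))  # 0.5*2 - 0.3*4 + 1.0 = 0.8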

  2. Rene Dekker says:


  3. kameshwar99 says:

    Professor Sengupta’s lectures are lucid and easy to follow. Good work.

  4. iomega15 says:

    d[t-y]/dy = -1

  5. David Adler says:

    At 30:00 I’m not sure I understand why the derivative of the error becomes MINUS (t-y). Should it not just be (t-y)?
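
    The sign comes from the chain rule. Assuming the usual squared error E = ½(t − y)², differentiating with respect to y gives dE/dy = (t − y) · d(t − y)/dy = (t − y) · (−1) = −(t − y), which is exactly what the comment above (d[t-y]/dy = -1) is pointing out.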

  6. imattavarela says:

    It was a mistake. You just have to consider o as i and i as j from that point onward.

  7. Rhembot says:


  8. asim naupane says:

    Great thanks

  9. Hala Eid says:

    How can we download those lectures?

  10. shakirahmed1979 says:

    Thank you very much

  11. Mahesh Hegde says:

    Hey everyone, a reminder that the free online machine learning class offered by Stanford University has begun.

  12. Neil Dundon says:

    Arrrrgh. Can you please keep your subscripts consistent? It adds needless confusion to an otherwise very clear lecture series.

  13. Ted Flethuseo says:

    How did he get from Wij to Woi?

  14. waterloocompengg says:

    Professor Sengupta, you are doing an amazing job!! I really like your lectures.

  15. 7139448993499432 says:

    @Dattatreya007 Is it a plane in 3D?

  16. Paul zhang says:

    Hey, is that Hebb’s rule? The gradient descent algorithm, I mean. Hebb’s rule is w = w + kxy and this is w = w + kx(t-y); are they basically the same?
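
    The two rules are related but not the same: Hebb’s rule w = w + kxy strengthens a weight whenever input and output co-occur, while the delta (gradient descent) rule w = w + kx(t − y) is error-driven, so the update shrinks to zero once the output matches the target. A minimal NumPy sketch of the difference (names like k, x, t are illustrative, not from the lecture):

        import numpy as np

        k = 0.1                    # learning rate
        x = np.array([1.0, 0.5])   # input vector
        w = np.zeros(2)            # weights

        def hebb_update(w, x, y):
            # Hebb's rule: w = w + k*x*y -- grows whenever x and y are both active
            return w + k * x * y

        def delta_update(w, x, t):
            # Delta rule: w = w + k*x*(t - y) -- stops changing once y == t
            y = w @ x
            return w + k * x * (t - y)

        # Repeated delta updates drive y = w @ x toward the target t;
        # repeated Hebbian updates keep growing w as long as x and y are nonzero.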

  17. stefandye says:

    Video is not playing at all.
