
Why Gradient Descent for Optimization?


@TarunSingh wrote:

Hello folks,

I have a question about the optimization technique used for updating the weights. People generally use gradient descent for optimization, whether it's plain SGD or an adaptive variant. Why can't we use other techniques, such as Newton-Raphson? Please help.

Thanks in advance
Tarun Singh
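
For context, here is a minimal sketch (not from the original post) contrasting the two update rules being asked about: a plain gradient-descent step, which needs only the gradient, and a Newton-Raphson step, which also needs the Hessian. The toy least-squares loss, the data, the learning rate, and the iteration count below are all illustrative assumptions.

```python
# Minimal sketch: gradient descent vs. Newton-Raphson on a toy
# least-squares loss L(w) = ||Xw - y||^2 / (2n).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))            # toy design matrix (n=100, d=3)
y = X @ np.array([1.0, -2.0, 0.5])       # toy targets from known weights
n = len(y)

# Gradient descent: uses only first-order information (the gradient),
# so each step is cheap -- O(n * d) -- but many steps may be needed.
w_gd = np.zeros(3)
lr = 0.5                                 # assumed learning rate
for _ in range(200):
    grad = X.T @ (X @ w_gd - y) / n
    w_gd -= lr * grad

# Newton-Raphson: also uses second-order information (the Hessian).
# For this quadratic loss a single step lands on the exact minimizer,
# but forming and solving with the d x d Hessian costs O(d^2) memory
# and O(d^3) time, which is what makes it impractical when d is the
# number of weights in a large neural network.
w_newton = np.zeros(3)
grad = X.T @ (X @ w_newton - y) / n
hess = X.T @ X / n
w_newton -= np.linalg.solve(hess, grad)

print("gradient descent:", np.round(w_gd, 3))
print("newton-raphson:  ", np.round(w_newton, 3))
```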
