
b/m gradient calculation #7

@gtrevi

Hi,
the (2/N) factor could be pulled out of the for loop, since it sits outside the summation in the partial-derivative equations, correct?
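For reference, this is the derivative I mean (a minimal sketch, assuming the usual mean-squared error used in this kind of gradient-descent example):

```latex
E(m, b) = \frac{1}{N}\sum_{i=1}^{N}\bigl(y_i - (m x_i + b)\bigr)^2
\qquad\Longrightarrow\qquad
\frac{\partial E}{\partial b} = -\frac{2}{N}\sum_{i=1}^{N}\bigl(y_i - (m x_i + b)\bigr),
\quad
\frac{\partial E}{\partial m} = -\frac{2}{N}\sum_{i=1}^{N} x_i\bigl(y_i - (m x_i + b)\bigr)
```

Since 2/N does not depend on i, it can be applied once after the summation instead of on every term.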
So something like this:

```python
def step_gradient(b_current, m_current, points, learningRate):
    b_gradient = 0
    m_gradient = 0
    N = float(len(points))
    for i in range(0, len(points)):
        x = points[i, 0]
        y = points[i, 1]
        b_gradient += -(y - ((m_current * x) + b_current))         # (2/N) outta here
        m_gradient += -x * (y - ((m_current * x) + b_current))     # (2/N) outta here
    new_b = b_current - (learningRate * ((2/N) * b_gradient))      # (2/N) to be used here
    new_m = m_current - (learningRate * ((2/N) * m_gradient))      # (2/N) to be used here
    return [new_b, new_m]
```
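As a quick numerical sanity check of the equivalence (the toy data and the NumPy usage are mine, purely for illustration; the repo's actual data loading is unchanged):

```python
import numpy as np

# Toy data, made up only for this check: columns are x and y.
points = np.array([[1.0, 2.0], [2.0, 4.1], [3.0, 5.9], [4.0, 8.2]])
b0, m0 = 0.5, 1.5
N = float(len(points))
x, y = points[:, 0], points[:, 1]

# (2/N) applied to every term inside the sum, as in the current code.
b_grad_inside = np.sum(-(2 / N) * (y - (m0 * x + b0)))
m_grad_inside = np.sum(-(2 / N) * x * (y - (m0 * x + b0)))

# (2/N) factored out and applied once after the sum, as proposed above.
b_grad_outside = (2 / N) * np.sum(-(y - (m0 * x + b0)))
m_grad_outside = (2 / N) * np.sum(-x * (y - (m0 * x + b0)))

print(np.isclose(b_grad_inside, b_grad_outside))  # True
print(np.isclose(m_grad_inside, m_grad_outside))  # True
```

Both forms give the same gradients up to floating-point rounding, so the only difference is doing one multiplication per update instead of one per data point.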
