Commit c4e0d21: Update README.md
1 parent 8db74aa
File tree: 1 file changed (+4 −1 lines)

README.md (+4 −1)
@@ -1,8 +1,11 @@
 # Exponential-regression-using-Gradient-descent
 This repository contains code that performs exponential regression using a gradient descent optimizer.
 Consider the exponential regression $y = a e^{bx}$; after the log-transformation the equation becomes $\log(y) = \log(a) + bx$.
-Gradient Descent algorithm:
+
 $SSE = \sum{(\log(y) - \log(a) - bx)^2} / 2n$
+
+Gradient Descent algorithm:
+
 Step 1: Initialize the weights ($\log(a)$ and $b$) with random values and calculate the error (SSE).

 Step 2: Calculate the gradient, i.e. the change in SSE when the weights ($\log(a)$ and $b$) are changed by a very small value from their original randomly initialized values. This helps us move the values of $\log(a)$ and $b$ in the direction in which SSE is minimized.
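
For reference, the gradient described in Step 2 has a closed form. Differentiating the SSE above with respect to each weight (a routine calculus step, spelled out here as an aid rather than taken from the repository's code) gives $\partial SSE/\partial \log(a) = -\frac{1}{n}\sum{(\log(y) - \log(a) - bx)}$ and $\partial SSE/\partial b = -\frac{1}{n}\sum{x(\log(y) - \log(a) - bx)}$. Each gradient descent update then moves the weights against these gradients, e.g. $\log(a) \leftarrow \log(a) - \eta \cdot \partial SSE/\partial \log(a)$ for a learning rate $\eta$.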
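
A minimal NumPy sketch of the two steps above might look as follows; the function name `fit_exponential`, the learning rate, and the iteration count are illustrative assumptions, not taken from the repository:

```python
import numpy as np

# Minimal sketch of the README's procedure. NumPy, the function name,
# the learning rate, and the iteration count are all assumptions made
# for illustration; the repository's actual code may differ.

def fit_exponential(x, y, lr=0.05, n_iters=5000):
    """Fit y = a * exp(b * x) by gradient descent on log(y) = log(a) + b*x."""
    log_y = np.log(y)  # log-transformation of the targets
    n = len(x)

    # Step 1: initialize the weights (log(a) and b) with random values
    rng = np.random.default_rng(0)
    log_a, b = rng.normal(size=2)

    for _ in range(n_iters):
        # residuals of the log-transformed model
        r = log_y - log_a - b * x
        # Step 2: gradients of SSE = sum(r**2) / (2n) w.r.t. log(a) and b
        grad_log_a = -np.sum(r) / n
        grad_b = -np.sum(x * r) / n
        # move log(a) and b in the direction that decreases SSE
        log_a -= lr * grad_log_a
        b -= lr * grad_b

    return np.exp(log_a), b

# Illustrative usage on synthetic data drawn from y = 2 * exp(0.5 * x)
x = np.linspace(0.0, 2.0, 50)
y = 2.0 * np.exp(0.5 * x)
a_hat, b_hat = fit_exponential(x, y)
print(a_hat, b_hat)  # should come out close to a = 2 and b = 0.5
```

Working with $\log(a)$ rather than $a$ keeps the transformed model linear in the weights, so the SSE surface is quadratic and plain gradient descent behaves well; $a$ is recovered at the end by exponentiating.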
