# Gradient decreases, but mean squared error function still very high

Page 1 of 1

## 1 Replies - 775 Views - Last Post: 05 May 2017 - 04:10 AM

### #1 moonman239

• New D.I.C Head

Reputation: 0
• Posts: 10
• Joined: 24-March 15

# Gradient decreases, but mean squared error function still very high

Posted 03 May 2017 - 06:19 PM

Hi, I'm a newbie at neural networks. I'm trying to build one from scratch, using backpropagation & gradient descent. My dataset is a bunch of triangles, and I'm using the NN to approximate the hypotenuse. While my gradient-descent function appears to be doing what it's supposed to be doing, I consistently end up with an error no less than 50 units, when the maximum hypotenuse in my dataset is approx. 14 units.

With that said, here's my Python code:
```
#!/usr/local/bin/python3
# Module: neuralnetwork.py
# Approximates the Pythagorean theorem using a neural network.
import random
import math

# create training and testing datasets
class rightTriangle:
    height = 0.00
    width = 0.00

    def hypotenuse(self):
        return (self.height**2 + self.width**2)**(1/2)

triangles = []
for i in range(0, 300):
    triangle = rightTriangle()
    triangle.height = random.random() * 9 + 1
    triangle.width = random.random() * 9 + 1
    triangles.append(triangle)

trainingTriangles = []
for i in range(0, 200):
    trainingTriangles.append(triangles[i])
testingTriangles = []
for i in range(201, 300):
    testingTriangles.append(triangles[i])

class Neuron:
    inputWeights = [1, 1]
    numInputs = 2
    inputBiases = [1, 1]

    # Calculate the output given a set of inputs.
    def activate(self, inputs):
        sum = 0
        for i in range(0, self.numInputs - 1):
            sum = sum + self.inputWeights[i] * inputs[i] + self.inputBiases[i]
        function = 1/(1 + math.exp(-sum))
        return function

    # Adjust weights and biases so as to minimize the cost (error) function
    # for a given dataset.
    def train(self, inputs):
        h = 10  # Constant used to compute gradient.
        for weightNum in range(0, self.numInputs - 1):
            for i in range(0, 199):
                # Compute the gradient (aka. derivative) for the given inputs & weight.
                currentCost = meanSquaredError(inputs)
                self.inputWeights[weightNum] += h
                gradient = (meanSquaredError(inputs) - currentCost) / h
                self.inputWeights[weightNum] -= h
                # Multiply the gradient by a learning factor.
                learningFactor = 10
                # Subtract weight by scaled gradient.
        for biasNum in range(0, self.numInputs - 1):
            for i in range(0, 199):
                # Compute the gradient (aka. derivative) for the given inputs & bias.
                currentCost = meanSquaredError(inputs)
                self.inputBiases[biasNum] += h
                gradient = (meanSquaredError(inputs) - currentCost) / h
                self.inputBiases[biasNum] -= h
                # Multiply the gradient by a learning factor.
                learningFactor = 10
                # Subtract bias by scaled gradient.

inputNeuron = Neuron()
hiddenNeuron = Neuron()
outputNeuron = Neuron()

# Function to predict the hypotenuse given a height (inputs[0]) and width (inputs[1])
def predict(inputs):
    inputNeuronOutput = inputNeuron.activate(inputs)
    hiddenNeuronOutput = hiddenNeuron.activate([inputNeuronOutput, 0])
    outputNeuronOutput = outputNeuron.activate([hiddenNeuronOutput, 0])
    return outputNeuronOutput

# Function to calculate the mean squared error for a given set of triangles.
def meanSquaredError(triangles):
    sum = 0
    for triangle in triangles:
        errorSquared = (triangle.hypotenuse() - predict([triangle.width, triangle.height]))**2
        sum = sum + errorSquared
    return sum / len(triangles)

# Train our neural network.
# Test the neural network.
print("Mean squared error: " + str(meanSquaredError(testingTriangles)))
```

Is my problem due to an improper use of backpropagation, or does my hidden layer need more neurons?
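For context, the update loop in `train` estimates each derivative with a one-sided finite difference, `(cost(w + h) - cost(w)) / h`. Here is a minimal, standalone sketch of that estimate on a toy function (the function and step sizes below are illustrative only, not part of the network above):

```python
# Toy illustration of a one-sided finite-difference gradient estimate.
# f here is an arbitrary example function, not the network's cost.
def finite_difference(f, w, h):
    """Estimate df/dw at w using step size h."""
    return (f(w + h) - f(w)) / h

f = lambda w: (w - 3.0) ** 2   # true derivative at w=5 is 2*(5-3) = 4

# A small step tracks the true derivative closely.
small = finite_difference(f, 5.0, 1e-6)
# A large step gives a badly biased estimate: (f(15)-f(5))/10 = (144-4)/10.
large = finite_difference(f, 5.0, 10.0)

print(small, large)  # small is close to 4.0, large is 14.0
```

The quality of the estimate depends heavily on the step size `h`, since the one-sided formula has an error term proportional to `h`.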


## Replies To: Gradient decreases, but mean squared error function still very high

### #2 Skydiver

• Code herder

Reputation: 6071
• Posts: 20,894
• Joined: 05-May 12

## Re: Gradient decreases, but mean squared error function still very high

Posted 05 May 2017 - 04:10 AM

Instead of just running the code, have you stepped through it with pen and paper or a text editor nearby to record values? Do those values match up with the manual computations you did when you designed your network?

Or are you treating everything like a black box and you are now wondering what else to throw into the black box?

