
Neural Network In 5 Minutes | What Is A Neural Network? | How Neural Networks Work | Simplilearn

Last summer my family and I visited Russia. Even though none of us could read Russian, we did not have any trouble finding our way around, all thanks to Google's real-time translation of Russian signboards into English. This is just one of the several applications of neural networks. Neural networks form the base of deep learning, a subfield of machine learning where the algorithms are inspired by the structure of the human brain. Neural networks take in data, train themselves to recognize the patterns in this data, and then predict the outputs for a new set of similar data.

Let's understand how this is done. Let's construct a neural network that differentiates between a square, a circle, and a triangle. Neural networks are made up of layers of neurons; these neurons are the core processing units of the network. First we have the input layer, which receives the input. The output layer predicts our final output. In between exist the hidden layers, which perform most of the computations required by our network. Here's an image of a circle. This image is composed of 28 by 28 pixels, which makes 784 pixels in total. Each pixel is fed as input to each neuron of the first layer.
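
To make the 28 x 28 = 784 input step concrete, here is a minimal NumPy sketch of flattening an image into the 784 values the input layer receives; the random image is a placeholder, not the circle from the video.

```python
import numpy as np

# Hypothetical 28x28 grayscale image (random values stand in for real pixels).
image = np.random.rand(28, 28)

# Flatten the 28x28 grid into a single vector of 28 * 28 = 784 pixel values,
# one value per neuron in the input layer.
inputs = image.reshape(-1)
print(inputs.shape)  # (784,)
```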

Neurons of one layer are connected to neurons of the next layer through channels. Each of these channels is assigned a numerical value known as a weight. The inputs are multiplied by the corresponding weights, and their sum is sent as input to the neurons in the hidden layer. Each of these neurons is associated with a numerical value called the bias, which is then added to the input sum. This value is then passed through a threshold function called the activation function. The result of the activation function determines if the particular neuron will get activated or not. An activated neuron transmits data to the neurons of the next layer over the channels. In this manner, the data is propagated through the network. This is called forward propagation.
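
Here is a minimal sketch of that weighted-sum, bias, and activation step for a single layer; the sigmoid activation, the hidden-layer size of 16, and the random weights and biases are illustrative assumptions rather than details from the video.

```python
import numpy as np

def sigmoid(z):
    # One common activation function; the video only says "threshold function",
    # so sigmoid is an illustrative choice here.
    return 1.0 / (1.0 + np.exp(-z))

inputs = np.random.rand(784)        # the 784 pixel values
weights = np.random.randn(16, 784)  # one weight per channel into each of 16 hidden neurons
biases = np.random.randn(16)        # one bias per hidden neuron

# Multiply the inputs by the corresponding weights, sum them up, add the bias,
# then pass the result through the activation function.
hidden = sigmoid(weights @ inputs + biases)
print(hidden.shape)  # (16,)
```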

In the output layer, the neuron with the highest value fires and determines the output. The values are basically probabilities. For example, here the neuron associated with the square has the highest probability, hence that is the output predicted by the neural network.
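
One common way to turn the output layer's values into probabilities and pick the winning neuron is a softmax followed by an argmax; the video does not name the function, so treat this sketch and its made-up values as an assumption.

```python
import numpy as np

def softmax(z):
    # Turn raw output-layer values into probabilities that sum to 1.
    e = np.exp(z - np.max(z))
    return e / e.sum()

classes = ["square", "circle", "triangle"]
output_values = np.array([2.1, 1.3, 0.4])   # made-up output-layer values

probabilities = softmax(output_values)
prediction = classes[int(np.argmax(probabilities))]  # the highest-value neuron "fires"
print(probabilities, prediction)                     # here the network would predict "square"
```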

Of course, just by looking at it, we know our neural network has made a wrong prediction. But how does the network figure this out?

Note that our network is yet to be trained. During this training process, along with the input, our network also has the output fed to it. The predicted output is compared against the actual output to realize the error in prediction. The magnitude of the error indicates how wrong we are, and the sign suggests whether our predicted values are higher or lower than expected. The arrows here give an indication of the direction and magnitude of change required to reduce the error. This information is then transferred backward through our network; this is known as backpropagation. Now, based on this information, the weights are adjusted. This cycle of forward propagation and backpropagation is iteratively performed with multiple inputs. The process continues until our weights are assigned such that the network can predict the shapes correctly in most of the cases. This brings our training process to an end.
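
As a rough sketch of that forward-propagation and backpropagation cycle, here is a tiny gradient-descent loop for a single-layer classifier; the random stand-in data, the layer size, the softmax cross-entropy error, and the learning rate are all illustrative assumptions, not details from the video.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in data: 300 flattened 28x28 "images" and their shape labels
# (0 = square, 1 = circle, 2 = triangle); random values purely for illustration.
X = rng.random((300, 784))
y = rng.integers(0, 3, size=300)

# A single layer of weights and biases mapping 784 pixels to 3 output neurons.
W = rng.normal(scale=0.01, size=(784, 3))
b = np.zeros(3)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

learning_rate = 0.1
for epoch in range(100):
    # Forward propagation: predicted probabilities for every training input.
    probs = softmax(X @ W + b)

    # Compare the predicted output against the actual output to get the error.
    targets = np.zeros_like(probs)
    targets[np.arange(len(y)), y] = 1.0
    error = probs - targets              # carries both magnitude and sign

    # Backpropagation (for this single layer): turn the error into gradients.
    grad_W = X.T @ error / len(y)
    grad_b = error.mean(axis=0)

    # Adjust the weights and biases in the direction that reduces the error.
    W -= learning_rate * grad_W
    b -= learning_rate * grad_b
```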

You might wonder how long this training process takes. Honestly, neural networks may take hours or even months to train, but time is a reasonable trade-off when compared to their scope. Let us look at some of the prime applications of neural networks. Facial recognition: cameras on smartphones these days can estimate the age of a person based on their facial features.

This is neural networks at play, first differentiating the face from the background and then correlating the lines and spots on your face to a possible age. Forecasting: neural networks are trained to understand the patterns and detect the possibility of rainfall or a rise in stock prices with high accuracy. Music composition: neural networks can even learn patterns in music and train themselves enough to compose a fresh tune.

So here's a question for you: which of the following statements does not hold true? A) Activation functions are threshold functions. B) Error is calculated at each layer of the neural network. C) Both forward propagation and backpropagation take place during the training process of a neural network. D) Most of the data processing is carried out in the hidden layers. Leave your answers in the comments section below; three of you stand a chance to win Amazon vouchers, so don't miss it.

With deep learning and neural networks, we are still taking baby steps. The growth in this field has been foreseen by the big names: companies such as Google, Amazon, and Nvidia have invested in developing products such as libraries, predictive models, and intuitive GPUs that support the implementation of neural networks. The question dividing the visionaries is on the reach of neural networks: to what extent can we replicate the human brain? We'd have to wait a few more years to give a definite answer. But if you enjoyed this video, it would only take a few seconds to like and share it. Also, if you haven't yet, do subscribe to our channel and hit the bell icon, as we have a lot more exciting videos coming up. Fun learning till then!
