Consider a neural network with two hidden layers: p = 4 input units, 2 units in the first hidden layer, 3 units in the second hidden layer, and a single output.
(a) Draw a picture of the network, similar to Figures 10.1 or 10.4.
(b) Write out an expression for f(x), assuming ReLU activation functions. Be as explicit as you can!
(c) Now plug in some values for the coefficients and write out the value of f(x).
(d) How many parameters are there?



Answer:
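For part (d), the count follows layer by layer: each of the 2 first-layer units has 4 weights plus a bias (10 parameters), each of the 3 second-layer units has 2 weights plus a bias (9), and the output unit has 3 weights plus a bias (4), giving 23 in total. The forward pass for parts (b) and (c) can be sketched in NumPy; the coefficient values below are made up purely to illustrate part (c), not given by the exercise:

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

# Hypothetical coefficients (any values would do for part (c)).
W1 = np.array([[ 0.1, -0.2,  0.3,  0.0],
               [ 0.5,  0.1, -0.4,  0.2]])  # 2 x 4 weights, first hidden layer
b1 = np.array([0.1, -0.1])                 # 2 biases
W2 = np.array([[ 0.2, -0.3],
               [ 0.4,  0.1],
               [-0.5,  0.2]])              # 3 x 2 weights, second hidden layer
b2 = np.array([0.0, 0.1, -0.2])            # 3 biases
W3 = np.array([[0.3, -0.1, 0.2]])          # 1 x 3 weights, output layer
b3 = np.array([0.05])                      # 1 bias

def f(x):
    a1 = relu(W1 @ x + b1)     # first hidden layer: A_k = ReLU(w_k0 + sum_j w_kj x_j)
    a2 = relu(W2 @ a1 + b2)    # second hidden layer, fed the first layer's activations
    return (W3 @ a2 + b3)[0]   # linear output unit

n_params = sum(w.size for w in (W1, b1, W2, b2, W3, b3))
print(n_params)                # 23
print(f(np.array([1.0, 2.0, -1.0, 0.5])))
```

Note that the output layer here is linear (no ReLU), matching the single quantitative output in the exercise; the parameter count is independent of the coefficient values chosen.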
