
iii) hyperbolic tangent function:

tanh(*v*)  (5)
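For illustration, the tanh activation in Eq. (5) can be evaluated directly; it squashes any net input *v* into the interval (−1, 1), saturating for large |*v*| (the helper name below is ours, not from the source):

```python
import math

def tanh_activation(v):
    """Hyperbolic tangent activation, Eq. (5): maps any net input v to (-1, 1)."""
    return math.tanh(v)

print(tanh_activation(0.0))  # 0.0 (tanh is odd and passes through the origin)
print(tanh_activation(2.0))  # ~0.964 (already close to the saturation value 1)
```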

In practical applications, a neural network often consists of several neurons in several layers. A schematic diagram of a three-layer neural network is given in Fig. 4, where *X*i (*i* = 1, ..., *n*) represents the inputs to the network (e.g., functions of wind and water levels); *Y*i (*i* = 1, ..., *m*) represents the outputs of neurons in the hidden layer; and *Z*i (*i* = 1, ..., *p*) represents the outputs of the neural network, such as water levels and currents in and around coastal inlets. The layer that produces the network output is called the *output layer*, while all other layers are called *hidden layers*. The weight matrix connected to the network inputs is called the *input weight* matrix, while the weight matrices coming from layer outputs are called *layer weight* matrices.
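The forward pass through such a three-layer network can be sketched as follows; the dimensions (*n* = 2 inputs, *m* = 3 hidden neurons, *p* = 1 output) and all weight values are hypothetical, chosen only to show how the input weight matrix and layer weight matrix are applied:

```python
import math

def layer(inputs, weights, biases):
    """One fully connected layer with tanh activation.
    weights[j][i] connects input i to neuron j."""
    return [math.tanh(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

X = [0.5, -0.2]                              # inputs X_i (n = 2)
W1 = [[0.1, 0.4], [-0.3, 0.2], [0.5, -0.1]]  # input weight matrix (m x n)
b1 = [0.0, 0.1, -0.1]
W2 = [[0.2, -0.5, 0.3]]                      # layer weight matrix (p x m)
b2 = [0.05]

Y = layer(X, W1, b1)   # hidden-layer outputs Y_i
Z = layer(Y, W2, b2)   # network outputs Z_i
print(len(Y), len(Z))  # 3 1
```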

Multiple-layer neural networks using backpropagation training algorithms are popular in neural network modeling (Hagan et al., 1995) because of their ability to recognize patterns and relationships between nonlinear signals. The term backpropagation usually refers to the manner in which the gradients of the weights are computed for nonlinear multi-layer networks. A neural network must be trained to determine the values of the weights that will produce the correct outputs. Mathematically, the training process is similar to approximating a multivariable function, *g*(*X*), by another function, *G*(*W*, *X*), where *X* = [*x*1, *x*2, ..., *x*n] is the input vector and *W* = [*w*1, *w*2, ..., *w*n] is the coefficient or weight vector. The training task is to find the weight vector *W* that makes *G*(*W*, *X*) the best approximation of *g*(*X*).

Fig. 4. A three-layer feed-forward neural network for multivariate signal processing.
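The training idea described above can be sketched for the simplest possible case: a single tanh neuron *G*(*w*, *b*; *x*) = tanh(*wx* + *b*) fitted by gradient descent to an assumed target *g*(*x*) = tanh(2*x*). The target function, learning rate, and initial weights are illustrative only, not from the source:

```python
import math

def g(x):
    """Assumed target function to be approximated by the network."""
    return math.tanh(2.0 * x)

data = [-1.0 + 0.1 * k for k in range(21)]  # training inputs in [-1, 1]
w, b, lr = 0.5, 0.0, 0.1                    # illustrative initial weights, learning rate

for epoch in range(500):
    for x in data:
        z = math.tanh(w * x + b)
        err = z - g(x)
        grad = err * (1.0 - z * z)  # backpropagated gradient: tanh'(v) = 1 - tanh(v)^2
        w -= lr * grad * x          # chain rule: dv/dw = x
        b -= lr * grad              # chain rule: dv/db = 1

print(round(w, 2), round(b, 2))  # w should end up close to 2, b close to 0
```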
