Recurrent neural network softsensor model

An RNN is chosen to estimate the biomass concentration because of its strong capability of capturing the dynamic information underlying the input-output data pairs. The configuration of an RNN is problem-specific. In this study, an extended, fully-connected RNN, known as the Williams-Zipser network [37,91], is used for on-line biomass estimation in the fermentation process because of the "dynamically rich" nature of this kind of network. A suitable RNN topology is first selected using simulation data generated by a mathematical model; the selected topology is then re-trained on experimental data. This fine-tuning is necessary to make the RNN adaptable to the real environment.

The structure of the proposed neural softsensor is given in Fig. 4.1. The inputs of the neural sensor are the feed rate F, the volume V and the dissolved oxygen concentration DO, which are all continuously available. The output of the sensor gives the estimated biomass concentration.

Fig. 4.1. Structure of the proposed recurrent neural softsensor.

This neural network consists of tapped delay lines (TDLs), one hidden layer, one output neuron, feed-forward paths and feedback paths. All connections may be multiple paths. In order to enhance the dynamic behavior of the sensor, the outputs of the output layer (output feedback) and of the hidden layer (activation feedback) are connected to the input layer through TDLs. The output of the i-th neuron in the hidden layer is of the form:

h_i(t) = f_h\left( \sum_{j=0}^{n_a} W^{I}_{ij}\, p(t-j) + \sum_{k=1}^{n_b} W^{R}_{ik}\, y(t-k) + \sum_{l=1}^{n_c} W^{H}_{il}\, h_l(t-1) + b^{h}_{i} \right)

where p is the neural network input, y is the neural network output and h_i is the i-th hidden neuron's output; b^h_i is the bias of the i-th hidden neuron; n_a, n_b and n_c are the number of input delays, the number of output feedback delays and the number of hidden neurons, respectively; f_h(·) is a sigmoidal function; W^I_{ij} is the weight connecting the j-th delayed input to the i-th hidden neuron; W^R_{ik} is the weight connecting the k-th delayed output feedback to the i-th hidden neuron; W^H_{il} is the weight connecting the l-th hidden neuron's fed-back activation to the i-th hidden neuron.
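The hidden-layer equation above can be sketched as a single update step. The matrix shapes and the choice of tanh as the sigmoidal activation are assumptions for illustration, not the book's exact implementation:

```python
import numpy as np

def hidden_step(p_hist, y_hist, h_prev, W_I, W_R, W_H, b_h):
    """One-step evaluation of the hidden layer.
    p_hist: delayed inputs p(t), p(t-1), ..., p(t-n_a)    shape (n_a+1, n_in)
    y_hist: fed-back outputs y(t-1), ..., y(t-n_b)        shape (n_b,)
    h_prev: hidden activations h(t-1)                     shape (n_c,)
    W_I: (n_c, n_a+1, n_in), W_R: (n_c, n_b), W_H: (n_c, n_c), b_h: (n_c,)
    """
    a = (np.einsum('ijk,jk->i', W_I, p_hist)  # sum_j W_I[i,j] . p(t-j)
         + W_R @ y_hist                       # sum_k W_R[i,k] * y(t-k)
         + W_H @ h_prev                       # sum_l W_H[i,l] * h_l(t-1)
         + b_h)
    return np.tanh(a)                         # f_h: sigmoidal activation
```

Each of the three sums maps directly onto one term of the equation; the activation-feedback term uses the hidden activations from the previous time step.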

Only one neuron is placed in the output layer, so the output is:

y(t) = \sum_{m=1}^{n_c} W^{Y}_{m}\, h_m(t) + b^{Y}

where W^Y_m is the weight connecting the m-th hidden neuron's output to the output neuron and b^Y is the output neuron bias.
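Putting the hidden-layer and output equations together, the softsensor can be run over a time series of [F, V, DO] measurements. The class name, layer sizes, random initialization and the tanh-hidden/linear-output split below are illustrative assumptions, not the book's trained network:

```python
import numpy as np
from collections import deque

class RecurrentSoftsensor:
    """Minimal forward sketch of the Williams-Zipser-style softsensor."""
    def __init__(self, n_in=3, n_a=2, n_b=1, n_c=4, seed=0):
        rng = np.random.default_rng(seed)
        self.W_I = rng.normal(0, 0.1, (n_c, n_a + 1, n_in))  # delayed-input weights
        self.W_R = rng.normal(0, 0.1, (n_c, n_b))            # output-feedback weights
        self.W_H = rng.normal(0, 0.1, (n_c, n_c))            # activation-feedback weights
        self.b_h = np.zeros(n_c)
        self.W_Y = rng.normal(0, 0.1, n_c)                   # hidden-to-output weights
        self.b_Y = 0.0
        # Tapped delay lines: newest value at index 0, oldest dropped automatically.
        self.p_tdl = deque([np.zeros(n_in)] * (n_a + 1), maxlen=n_a + 1)
        self.y_tdl = deque([0.0] * n_b, maxlen=n_b)
        self.h = np.zeros(n_c)

    def step(self, p):
        self.p_tdl.appendleft(np.asarray(p, dtype=float))    # p(t), p(t-1), ...
        P = np.stack(self.p_tdl)
        a = (np.einsum('ijk,jk->i', self.W_I, P)
             + self.W_R @ np.array(self.y_tdl)
             + self.W_H @ self.h + self.b_h)
        self.h = np.tanh(a)                                  # sigmoidal hidden layer
        y = float(self.W_Y @ self.h + self.b_Y)              # single linear output neuron
        self.y_tdl.appendleft(y)                             # feed estimate back for t+1
        return y

sensor = RecurrentSoftsensor()
estimates = [sensor.step([F, V, DO]) for F, V, DO in
             [(0.1, 5.0, 0.8), (0.12, 5.1, 0.75), (0.15, 5.2, 0.7)]]
```

The deques play the role of the TDLs: each `step` shifts the input and output-feedback histories by one sample, so the same call can be made once per sampling instant as new F, V and DO measurements arrive.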
