## Description

MPPT control in a grid-connected PV array has long been an active research area, since the goal is to transfer the maximum available power into the distribution lines. With the evolution of artificial intelligence, this problem is reaching a new level. In this work we first use a neural network (NN) to track the maximum power generated by the PV array, and then update the NN with an optimization method, the gravitational search algorithm (GSA). Previously, particle swarm optimization (PSO) was used for this purpose. The optimization process adjusts the NN's input weights and bias values to reduce its error. As discussed in the previous chapter, NN training is itself an iterative process that changes the input weights and biases to minimize the mean square error (MSE). It uses a backpropagation loop based on the Levenberg-Marquardt algorithm. This algorithm searches locally, which means it does not guarantee convergence to the global minimum: it may skip combinations of input weights and biases that would reduce the MSE further. To avoid this issue we adopt the gravitational search algorithm (GSA), explained in the previous chapter. GSA is based on the movement of celestial bodies, and in our case the positions of these agents are the NN's input weights and biases.

The output of the NN is calculated layer by layer. For a single-hidden-layer feedforward network such as the one used here, it takes the standard form y = f2(W2 · f1(W1 · x + b1) + b2), where x is the input vector, W1 and b1 are the input weights and biases, W2 and b2 are the output-layer weights and biases, and f1, f2 are the layer transfer functions.

The number of input weights and biases depends on the number of hidden neurons and layers. The GSA algorithm is responsible for tuning these values.
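As an illustrative check (shown in Python rather than the project's MATLAB code), the count of values GSA must tune can be worked out from the layer sizes used in this work: 2 inputs (temperature and irradiance), 20 hidden neurons (as set later in the toolbox), and 1 output (the duty cycle). The function name here is our own, not part of the project:

```python
def nn_param_count(n_in, n_hidden, n_out):
    """Count the tunable parameters of a single-hidden-layer network:
    input->hidden weights, hidden biases, hidden->output weights,
    and output biases."""
    return n_in * n_hidden + n_hidden + n_hidden * n_out + n_out

# 2 inputs, 20 hidden neurons, 1 output, as in this work:
# 2*20 + 20 + 20*1 + 1 = 81 values for GSA to tune
print(nn_param_count(2, 20, 1))  # -> 81
```

So each GSA agent is, in effect, a point in an 81-dimensional search space.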

Before tuning the neural network we must write the code for the NN, and even before that the most important thing is the *data*. We collected 25,000 samples for this work, generated by executing the model without the NN under ideal conditions. The collected data consists of temperature, irradiance and duty cycle, of which temperature and irradiance are the inputs to the NN and duty cycle is the output.

We can generate the NN code with the MATLAB toolbox. This makes our work easier: only the tuning and parameter settings are left to us.

**Automatic NN Script Generation by MATLAB toolbox**

The neural network toolbox requires a training dataset so that the network can learn the behavior of the data. We do not have separate datasets for training and testing, so we divided the available dataset randomly in a 70/30 ratio: 70% for network training and 30% for testing.
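A minimal sketch of this random 70/30 division (shown in Python for illustration; the toolbox performs the equivalent split internally with `dividerand`, and the function name below is ours):

```python
import random

def split_70_30(n_samples, seed=0):
    """Shuffle sample indices and split them 70/30 for training/testing,
    mirroring the random division used by the NN toolbox."""
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)
    cut = n_samples * 7 // 10  # integer arithmetic avoids float rounding
    return idx[:cut], idx[cut:]

# 25,000 samples, as collected in this work
train_idx, test_idx = split_70_30(25000)
print(len(train_idx), len(test_idx))  # -> 17500 7500
```

With 25,000 samples this gives 17,500 training and 7,500 testing samples.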

MATLAB provides a neural network toolbox that can be used for several purposes; a network trained this way can be deployed as a standalone application, or a script can be generated for further use and modification. We used this facility to speed up our work. The user interface of the NN toolbox can be opened with the `nnstart` command in MATLAB's command window.

Figure: MATLAB’s NN toolbox interface

The figure shows the interface opened by this command. Since our task is to recognize the pattern relating the recorded temperature and irradiance (inputs) to the duty cycle (output), we use the pattern recognition app, which leads to a page for choosing the input data and target data. These datasets are picked from the MATLAB workspace, so they must already be loaded there.

After choosing the data division for training and testing, the network is created, which leads to a page where the user can enter the number of hidden neurons. We set the number of hidden neurons to 20. The next figure shows that page.

Figure: NN toolbox UI for entering hidden neurons

The network is then trained on the loaded dataset and tested with the remaining 30% of the data. After training, the mean square error is computed and displayed on the user interface. The NN trained and tested by this toolbox can then be exported as a MATLAB script, which is what our work requires. The figure shows the corresponding option in the NN toolbox interface.

Figure: NN toolbox interface to generate the required NN script

In this way we can easily obtain the MATLAB code for the NN used for MPPT in the PV grid. The generated MATLAB script is shown below. It was developed at https://free-thesis.com; similar work was used in https://free-thesis.com/product/house-price-prediction/.

```matlab
function [net,NN] = nntrain(input,target)
% This script assumes these variables are defined:
%   input  - input data.
%   target - target data.
x = input';
t = target';
t(t==2) = 0;

% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. Suitable in low memory situations.
trainFcn = 'trainbr';  % Bayesian regularization backpropagation.

% Create a Pattern Recognition Network
hiddenLayerSize = 20;
net = patternnet(hiddenLayerSize);

% Choose Input and Output Pre/Post-Processing Functions
% For a list of all processing functions type: help nnprocess
net.input.processFcns  = {'removeconstantrows','mapminmax'};
net.output.processFcns = {'removeconstantrows','mapminmax'};

% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivide
net.divideFcn  = 'dividerand';  % Divide data randomly
net.divideMode = 'sample';      % Divide up every sample
net.divideParam.trainRatio = 70/100;
% net.divideParam.valRatio = 15/100;
net.divideParam.testRatio  = 30/100;

% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse';  % Mean squared error

% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform','plottrainstate','ploterrhist', ...
    'plotconfusion','plotroc'};

% Train the Network
[net,tr] = train(net,x,t);

% Test the Network
y = net(x);
e = gsubtract(t,y);
NN.performance = perform(net,t,y);
tind = vec2ind(t);
yind = vec2ind(y);
NN.percentErrors = sum(tind ~= yind)/numel(tind);

% Recalculate Training, Validation and Test Performance
trainTargets = t .* tr.trainMask{1};
valTargets   = t .* tr.valMask{1};
testTargets  = t .* tr.testMask{1};
NN.trainPerformance = perform(net,trainTargets,y);
NN.valPerformance   = perform(net,valTargets,y);
NN.testPerformance  = perform(net,testTargets,y);
NN.y = y; NN.e = e; NN.tr = tr;
[NN.tpr,NN.fpr,~] = roc(t,y);

% View the Network
view(net)

% Plots
% Uncomment these lines to enable various plots.
% figure, plotperform(tr)
% figure, plottrainstate(tr)
% figure, ploterrhist(e)
% figure, plotconfusion(t,y)
% figure, plotroc(t,y)
end
```

**Tuning of NN by Optimization Algorithm**

Neural network tuning is done to achieve high accuracy and a low mean square error when generating the duty cycle, so that maximum power can be transferred from the PV array to the grid. To achieve this we apply GSA optimization to tune the NN's weights and biases. In every optimization task an objective function must be defined that computes the target value, the MSE in our case. This objective function is called in each iteration, once for each agent. For this purpose we define the NN's input weights and biases as variables in Simulink. MATLAB Simulink looks up those variable values in the workspace, so after optimization it picks up the tuned values and executes the model. In this way an externally tuned NN is finally embedded in the main PV grid distribution system for maximum power transfer.
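The agent-based search described above can be sketched in a few lines. The following is a minimal, simplified GSA in Python for illustration only: `gsa_minimize` is our own name, and the sphere function in the usage example stands in for the NN's MSE objective; the project itself runs this loop in MATLAB against the Simulink model.

```python
import numpy as np

def gsa_minimize(objective, dim, n_agents=10, iters=50, lo=-1.0, hi=1.0, seed=0):
    """Minimal gravitational search: agents are candidate weight/bias
    vectors; lower-cost agents get larger mass and attract the others."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, (n_agents, dim))   # agent positions
    V = np.zeros((n_agents, dim))              # agent velocities
    G0, eps = 2.0, 1e-12
    best_x, best_f = None, np.inf
    for t in range(iters):
        f = np.array([objective(x) for x in X])
        i = int(f.argmin())
        if f[i] < best_f:
            best_f, best_x = float(f[i]), X[i].copy()
        # Masses from fitness: lower cost -> heavier agent
        worst, best = f.max(), f.min()
        m = (worst - f + eps) / (worst - best + eps)
        M = m / m.sum()
        G = G0 * np.exp(-20.0 * t / iters)     # decaying gravity constant
        # Force on each agent from every other agent (acceleration form:
        # the agent's own mass cancels, so only the attractor's mass appears)
        A = np.zeros_like(X)
        for a in range(n_agents):
            for b in range(n_agents):
                if a == b:
                    continue
                d = np.linalg.norm(X[b] - X[a]) + eps
                A[a] += rng.random() * G * M[b] * (X[b] - X[a]) / d
        V = rng.random((n_agents, dim)) * V + A  # random inertia + pull
        X = np.clip(X + V, lo, hi)               # stay inside search bounds
    return best_x, best_f

# Usage: a sphere function standing in for the NN's MSE objective
best_x, best_f = gsa_minimize(lambda x: float(np.sum(x**2)), dim=4)
```

In the actual workflow the objective would write a candidate weight/bias vector into the MATLAB workspace, run the Simulink model, and return the resulting MSE.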
