Credit Card Fraud Detection using Neural Network

$100.00

Credit card fraud detection is an emerging risk field, driven by the growing presence of users on the internet. With the introduction of the Digital India movement, online payments and money transfers have increased, and with them a group of people who defraud online activities. The need for credit card fraud detection and prevention is therefore paramount. In our work we propose a novel algorithm to detect credit card fraud. The method uses a machine learning algorithm as its core, together with an evolutionary optimization algorithm to improve the performance of the neural network (NN).

The downloaded package will contain:

  1. Complete MATLAB code for GSA and SA optimized NN
  2. Documentation File

Note: We don't claim the documentation file to be plagiarism-free, nor do we support copying this code for your academic submission. It is provided to ease the pain of starting to write code from scratch. We suggest modifying the code for your own work.


Description

NN training is an iterative process that changes the input weights and biases to achieve the minimum mean square error (MSE). It uses a backpropagation loop based on the Levenberg-Marquardt algorithm. This algorithm searches locally, which means it does not guarantee convergence to the global minimum: it may skip some combinations of input weights and biases that would reduce the MSE further. To avoid this issue we adopt the optimisation method named Gravitational Search Algorithm (GSA), which was explained in the previous chapter. It is based on the movement of celestial bodies, and in our case the positions of these agents are the input weights and biases. The output of the NN is calculated by the formula

output=input*IW+B

where IW are the input weights and B are the biases. The number of input weights and biases depends on the number of hidden layers, and the GSA algorithm tunes these values. For this purpose the neural network is first created in MATLAB; that network is then used by the optimisation algorithm. We used the German Credit dataset downloaded from the UCI Machine Learning Repository [30]. This dataset contains 20 attributes along with a label of good or bad: label 1 marks a non-fraud case, and label 2 a fraud case. Since our proposed optimised neural network requires a numeric dataset, the numeric version of this dataset, available at the same web link, is used.
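As a minimal sketch of this preparation step (the file name, split logic, and hidden-layer size are illustrative, not the exact code shipped in the package):

```matlab
% Load the numeric German credit dataset (attributes + label in last column).
data = load('german.data-numeric');        % illustrative file name
X = data(:, 1:end-1);                      % attribute columns
T = data(:, end);                          % labels: 1 = good, 2 = bad

% Random 70/30 split for training and testing.
n   = size(X, 1);
idx = randperm(n);
nTr = round(0.7 * n);
Xtr = X(idx(1:nTr), :);      Ttr = T(idx(1:nTr));
Xte = X(idx(nTr+1:end), :);  Tte = T(idx(nTr+1:end));

% Create and train a feed-forward network
% (trainlm, i.e. Levenberg-Marquardt, is the default training function).
net = feedforwardnet(10);                  % hidden layer size is illustrative
net = train(net, Xtr', Ttr');
```

The trained `net` is the network whose weights and biases GSA then tunes.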

Neural Network optimization by GSA

The proposed work tunes the NN to obtain high accuracy and low mean square error. To achieve this aim we use GSA optimisation to tune the NN's weights and biases. In every optimisation task an objective function must be set which calculates the target value, the MSE in our case. This objective function is called in each iteration, for each agent in that iteration. Since the neural network is already created and trained in the previous step, it is not necessary to recreate it every time the objective function is called: the objective function updates the pre-trained NN's weights and biases, which are 251 in number, and calculates the MSE for that set of weights and biases. The developed objective function snippet is shown below.

function [performance,net] = Objective_function(L, ~, net, input, target)
% L      - candidate weights and biases (one GSA agent's position)
% input  - input data
% target - target data
% (the second argument is an ignored placeholder)

x = input';
t = target';
t(t==2) = 0;                 % map label 2 (bad/fraud) to 0
net = setwb(net, L);         % set the NN weights and biases using values in 'L'

% Test the network
y = net(x);
e = gsubtract(t, y);         % error vector (t - y)
performance = perform(net, t, y);   % MSE for this set of weights and biases
end

GSA is based on its agents' movements, and an agent's position is represented by the weight and bias values. The number of coordinates of an agent's position equals the total number of input weights and biases, which in our case is 251. These weights and biases form the positions of all agents used in the optimisation and are updated as per the equations quoted in chapter 3. They can be fetched from the generated neural network using the MATLAB function 'getwb' and, after updating, written back to the NN with 'setwb'. The significance of GSA terminology for NN tuning is provided in table 4.3.
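The getwb/setwb round trip described above can be checked with a short snippet (assuming `net` is a trained network created as in the previous section; the perturbation is illustrative):

```matlab
wb  = getwb(net);            % fetch current weights and biases as one vector
dim = numel(wb);             % dimension of the GSA search space (251 in our case)

% Perturb and write back, as GSA does for each agent position update.
wbNew = wb + 0.01 * randn(dim, 1);   % illustrative perturbation
net   = setwb(net, wbNew);
```

This is exactly how each agent's position is mapped onto the network before the objective function evaluates its MSE.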

Table 4.3: Significance of GSA terminology in NN tuning

GSA term                                                    | In NN tuning significance
Agent's position                                            | Input weights and biases
Dimension of optimisation (number of variables to be tuned) | Total number of input weights and biases
Update in the agents' positions                             | Change the values of weights and biases to move towards minimum MSE

A complete step-by-step algorithm is explained below.

  1. Load the German credit card fraud dataset in numeric format and split it randomly in a 70/30 ratio for training and testing of the neural network.

  2. Generate the NN script to create and train the network whose weights and biases are to be optimised.

  3. Initialise the GSA parameters: number of iterations, number of agents, initial G0 and alpha. Pass the previously created network into GSA to obtain the dimension of the weights and biases.

  4. Randomly initialise the input weights and biases to give an initial seed to the GSA optimisation. These must lie within the boundaries given in the next chapter.

  5. Call the objective function to update the neural network's weights and biases and calculate the MSE for those values using the testing dataset.

  6. To update the agents' positions, calculate the force and mass using the equations of chapter 3.

  7. The new updated position is obtained from the formula

     x(t+1) = x(t) + v(t+1)

     where the velocity v(t+1) is calculated from the acceleration, which is based on the force and mass computed in the previous step.

  8. For this new position, i.e. the new values of weights and biases, the objective function is called again and the MSE is saved.

  9. The set of weights and biases giving the minimum MSE out of the previous two sets is carried forward for updating.

  10. This process continues until all iterations are completed.

  11. The final minimum MSE is obtained, and the corresponding set of weights and biases is used as the final NN weights and biases. This gives a lower MSE than the conventional NN and the Simulated Annealing (SA) tuned NN reported previously by Khan A. et al. [26].
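The steps above can be sketched as a simplified GSA loop (a sketch, not the full code shipped in the package; the parameter values are illustrative, `input` and `target` are assumed to hold the test data, and the mass/force equations follow the standard GSA formulation referred to in chapter 3):

```matlab
nAgents = 20; nIter = 100; G0 = 100; alpha = 20;   % illustrative GSA parameters
dim = numel(getwb(net));                  % number of weights and biases (251 here)
X = rand(nAgents, dim) * 2 - 1;           % agent positions, seeded in [-1, 1]
V = zeros(nAgents, dim);                  % agent velocities
best = inf; bestWb = getwb(net);          % best MSE and its weight/bias vector

for t = 1:nIter
    % Evaluate the MSE for every agent (step 5 / step 8).
    fit = zeros(nAgents, 1);
    for i = 1:nAgents
        fit(i) = Objective_function(X(i,:)', [], net, input, target);
    end
    [fmin, imin] = min(fit);
    if fmin < best, best = fmin; bestWb = X(imin,:)'; end

    % Masses from fitness: smaller MSE gives larger mass (step 6).
    m = (max(fit) - fit) / (max(fit) - min(fit) + eps);
    M = m / sum(m);
    G = G0 * exp(-alpha * t / nIter);     % decaying gravitational constant

    % Resultant acceleration on each agent from pairwise attraction.
    A = zeros(nAgents, dim);
    for i = 1:nAgents
        for j = 1:nAgents
            if i ~= j
                R = norm(X(i,:) - X(j,:));
                A(i,:) = A(i,:) + rand * G * M(j) * (X(j,:) - X(i,:)) / (R + eps);
            end
        end
    end

    % Velocity and position updates (step 7): v = rand*v + a; x = x + v.
    V = rand(nAgents, dim) .* V + A;
    X = X + V;
end

net = setwb(net, bestWb);                 % final tuned network (step 11)
```

The inner evaluation loop reuses the pre-trained network, so only `setwb` and a forward pass are executed per agent, which keeps each iteration cheap.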

By following this methodology, the percentage improvement of the GSA-tuned NN over the other algorithms is as follows.

Table: % improvement of GSA-tuned NN over other algorithms

Metric | GSA vs SA (%) | GSA vs NN (%)
AUC    | 13.43         | 25
MSE    | 4             | 24
