A CNN study based on selected deep features for grapevine leaves classification

Data Set

The dataset used in this project concerns the recognition of grapevine leaf types. Besides the fruit, the leaves of the grapevine are also used, and their price and application depend on the variety. It is not feasible to separate these leaves manually on a large scale, so we try to sort the leaves automatically using the data we have.

Step 1:

In this step we split the data into training and test sets with a ratio of 80 to 20. To do so, we first load the data: we have a zip file that must be downloaded and then extracted. Since the project runs in Google Colab, we save the data in a temporary folder.

image2

Now we want to split the data randomly in the ratio of 80 to 20. We first shuffle the data and then copy 80% of the images to the train folder and the remaining 20% to the test folder (we do this separately for each class).

image1
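The per-class 80/20 split described above can be sketched as follows. This is a minimal version using only the standard library; the folder names (`src_dir` containing one subfolder per class, `dst_dir` receiving `train/` and `test/`) are assumptions, not the project's actual paths.

```python
import os
import random
import shutil

def split_dataset(src_dir, dst_dir, train_frac=0.8, seed=42):
    """Copy each class's images into train/ and test/ subfolders (80/20)."""
    rng = random.Random(seed)
    for cls in sorted(os.listdir(src_dir)):
        cls_path = os.path.join(src_dir, cls)
        if not os.path.isdir(cls_path):
            continue
        files = sorted(os.listdir(cls_path))
        rng.shuffle(files)                      # shuffle before splitting
        n_train = int(len(files) * train_frac)  # 80% to train, rest to test
        for split, names in (("train", files[:n_train]),
                             ("test", files[n_train:])):
            out = os.path.join(dst_dir, split, cls)
            os.makedirs(out, exist_ok=True)
            for name in names:
                shutil.copy(os.path.join(cls_path, name),
                            os.path.join(out, name))
```

Shuffling with a fixed seed keeps the split reproducible across runs.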

Step 2:

Now we want to increase the number of training images using data augmentation. Using the relevant libraries, we first rotate, flip, zoom, etc. the data, because this helps the model learn more and keeps it from depending on the angle of the images. Note that we do not change the colors of the data, because the colors of two leaves from two different classes may be similar and could cause the model to learn incorrectly. We also note that augmentation is not applied to the test data.

image3
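One way to express this geometry-only augmentation is with Keras preprocessing layers; this is a sketch, not necessarily the exact layers or parameters the project used.

```python
import tensorflow as tf
from tensorflow import keras

# Geometric-only augmentation: rotation, flips, zoom. No color jitter,
# since leaf color is deliberately left untouched (see the note above).
augment = keras.Sequential([
    keras.layers.RandomRotation(0.1),   # up to about +/-36 degrees
    keras.layers.RandomFlip("horizontal_and_vertical"),
    keras.layers.RandomZoom(0.2),
])

# Apply to the training pipeline only; the test set stays unaugmented:
# train_ds = train_ds.map(lambda x, y: (augment(x, training=True), y))
```

Passing `training=True` ensures the random transforms are actually applied; at inference time these layers are identity functions.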

Step 3:

In this step, we want to classify the data using different CNN architectures (built with Keras). We first build the model and then feed it the training and validation data (both drawn from the training set) so that learning can take place.

image4

In the above model, the CNN is built with three convolutional layers; the activation of the first three layers is relu and the activation of the last layer is sigmoid. As can be seen in the results, the training levels off around epoch 15, and the accuracy rises to 98%!:

image6 image5
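A model matching that description (three relu conv layers, sigmoid output) might look like the sketch below. The class count and input resolution are assumptions, not values taken from the repository.

```python
from tensorflow import keras

NUM_CLASSES = 5        # assumption: number of leaf classes in the dataset
IMG_SIZE = (128, 128)  # assumption: input resolution after resizing

# Three conv blocks with relu, then a sigmoid output layer, as described.
model = keras.Sequential([
    keras.layers.Input(shape=IMG_SIZE + (3,)),
    keras.layers.Conv2D(32, 3, activation="relu"),
    keras.layers.MaxPooling2D(),
    keras.layers.Conv2D(64, 3, activation="relu"),
    keras.layers.MaxPooling2D(),
    keras.layers.Conv2D(128, 3, activation="relu"),
    keras.layers.MaxPooling2D(),
    keras.layers.Flatten(),
    keras.layers.Dense(NUM_CLASSES, activation="sigmoid"),
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```

Training would then be `model.fit(train_data, validation_data=val_data, epochs=15)` with the split from Step 1.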

The confusion matrix is also as follows:

image7
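The confusion matrix can be computed from the trained model's predictions; a sketch with scikit-learn, assuming one-hot labels:

```python
import numpy as np
from sklearn.metrics import confusion_matrix

def evaluate_confusion(model, x_test, y_test):
    """Confusion matrix from model predictions.

    Assumes one-hot labels; the predicted class is the argmax over
    the output units.
    """
    y_pred = np.argmax(model.predict(x_test), axis=1)
    y_true = np.argmax(y_test, axis=1)
    return confusion_matrix(y_true, y_pred)
```

Row i, column j of the result counts test images of true class i predicted as class j, so off-diagonal mass shows where classes are confused.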

Step 4:

In this step, let's use pretrained models available on the Internet and in packages: a) VGG16, b) Inception.

VGG16:

We run the model with epoch = 25. According to the plot, the accuracy on the training data goes up to 70% and on the test data up to 50%; based on the F1 score, the final accuracy is 52%:

image9 image8

Inception:

We run the model with epoch = 25 and observe that its accuracy is better, in some cases going up to 70%:

image10

The red line corresponds to val_accuracy and the blue one to accuracy. Its confusion matrix is as follows; it shows that the accuracy is not good in most cases.

image11
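Both pretrained backbones follow the same transfer-learning pattern; a sketch assuming a frozen ImageNet backbone with a small classification head (the class count and head design are assumptions):

```python
from tensorflow import keras

NUM_CLASSES = 5            # assumption: number of leaf classes
IMG_SHAPE = (224, 224, 3)  # VGG16's default input size

def build_transfer_model(base_name="vgg16", weights="imagenet"):
    """Frozen pretrained backbone plus a small classification head.

    Pass base_name="inception" to use InceptionV3 instead of VGG16.
    """
    if base_name == "vgg16":
        base = keras.applications.VGG16(include_top=False,
                                        weights=weights,
                                        input_shape=IMG_SHAPE)
    else:
        base = keras.applications.InceptionV3(include_top=False,
                                              weights=weights,
                                              input_shape=IMG_SHAPE)
    base.trainable = False  # keep the ImageNet features fixed
    model = keras.Sequential([
        base,
        keras.layers.GlobalAveragePooling2D(),
        keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

With the backbone frozen, only the final Dense layer is trained over the 25 epochs, which keeps each run cheap.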

Step 5:

In this step, we use autoencoder networks for applications such as denoising or dimensionality reduction. Here, the autoencoder is used to denoise the data. First, we add noise to the data, then fit the model using the training and validation data. With the epoch count set to 10, after fitting the model we reach a loss of 40%, which is not a bad value, but not a good one either, because the data is decoded somewhat ambiguously.

image12 image13
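A minimal convolutional denoising autoencoder sketch, with a Gaussian-noise helper; the image size, layer widths, and noise level are assumptions rather than the project's actual settings.

```python
import numpy as np
from tensorflow import keras

IMG_SHAPE = (64, 64, 3)  # assumption: images resized for the autoencoder

# Small convolutional autoencoder: encode, then decode back to input size.
inp = keras.layers.Input(shape=IMG_SHAPE)
x = keras.layers.Conv2D(32, 3, activation="relu", padding="same")(inp)
x = keras.layers.MaxPooling2D()(x)                      # encoder bottleneck
x = keras.layers.Conv2D(16, 3, activation="relu", padding="same")(x)
x = keras.layers.UpSampling2D()(x)                      # decoder
out = keras.layers.Conv2D(3, 3, activation="sigmoid", padding="same")(x)
autoencoder = keras.Model(inp, out)
autoencoder.compile(optimizer="adam", loss="mse")

def add_noise(images, stddev=0.1, seed=0):
    """Corrupt images in [0, 1] with Gaussian noise, clipped back to [0, 1]."""
    rng = np.random.default_rng(seed)
    noisy = images + rng.normal(0.0, stddev, images.shape)
    return np.clip(noisy, 0.0, 1.0)

# Training pairs are (noisy input, clean target):
# autoencoder.fit(add_noise(x_train), x_train, epochs=10,
#                 validation_data=(add_noise(x_val), x_val))
```

Fitting noisy inputs against clean targets is what makes this a denoiser rather than a plain reconstruction autoencoder.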

Step 6:

We execute the pipeline several times with different random seeds and report the average results. After running the model with different seeds, the results are as follows: in some cases we reached an accuracy of over 80%, which is an excellent accuracy.

image14
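The seed-averaging loop can be sketched generically; `train_and_eval` is a hypothetical callable standing in for the full train-then-test pipeline from the earlier steps.

```python
import statistics

def average_over_seeds(train_and_eval, seeds=(0, 1, 2, 3, 4)):
    """Run the full train/evaluate pipeline once per seed and average.

    `train_and_eval` is any callable taking a seed and returning a test
    accuracy (hypothetical; plug in the model training from the steps above).
    Returns the mean and standard deviation over the runs.
    """
    scores = [train_and_eval(seed) for seed in seeds]
    return statistics.mean(scores), statistics.stdev(scores)
```

Reporting the standard deviation alongside the mean shows how sensitive the result is to the random seed.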

Step 7:

Finally, we use 10-fold cross-validation and report the accuracy for each fold. We merge the training and validation data together and use this new training set for this part. We are careful that the test data is not used in the cross-validation process.

image15 image16 image17
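A 10-fold loop over the merged train+validation data could look like this; `build_model_fn` is a hypothetical factory (e.g. returning the CNN from Step 3), and the held-out test set stays outside the loop, as noted above.

```python
import numpy as np
from sklearn.model_selection import KFold

def cross_validate(build_model_fn, x, y, n_splits=10):
    """10-fold cross-validation over the merged train+validation data.

    A fresh model is built for each fold so no weights leak between folds.
    Returns the mean accuracy and the per-fold accuracies.
    """
    accuracies = []
    kf = KFold(n_splits=n_splits, shuffle=True, random_state=42)
    for train_idx, val_idx in kf.split(x):
        model = build_model_fn()
        model.fit(x[train_idx], y[train_idx], epochs=15, verbose=0)
        _, acc = model.evaluate(x[val_idx], y[val_idx], verbose=0)
        accuracies.append(acc)
    return np.mean(accuracies), accuracies
```

Rebuilding the model inside the loop is the important detail: reusing one model across folds would let earlier folds' training contaminate later evaluations.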
