# MNIST-Matlab

**Repository Path**: tang_wan_qiang/MNIST-Matlab

## Basic Information

- **Project Name**: MNIST-Matlab
- **Description**: Using data augmentation and MatConvNet, create a robust CNN that achieves 99.21% accuracy on noisy, rotated validation data.
- **Primary Language**: Unknown
- **License**: MIT
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2020-02-21
- **Last Updated**: 2020-12-19

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

# MNIST-Matlab

Using data augmentation and MatConvNet, create a robust CNN that achieves **99.21% accuracy on noisy, rotated validation data.**

**Quick Note:** _This is not meant to be a forkable, out-of-the-box implementation._ The designed architectures are uploaded for reference; however, the framework for testing the architectures (data loading, architecture compilation, etc.) is not included in this repository. Please visit [MatConvNet](http://www.vlfeat.org/matconvnet/) for further help with the details.

## Requirements

* [**MATLAB_R2018A**](https://www.mathworks.com/downloads/web_downloads/select_release?mode=gwylf)
  * With the recommended packages installed.
* [**MatConvNet**](http://www.vlfeat.org/matconvnet/)

## Results

Below is a table summary of the results for the provided architectures.

*(figure: results summary table)*

### Baseline

The baseline architecture achieved a **98.47%** classification accuracy on the clean validation data. Below is a visualization of the training cycle of 20 epochs.

*(figure: baseline training cycle, 20 epochs)*

This architecture is contained in `cnn_init_baseline.m`; a generic MatConvNet SimpleNN sketch appears at the end of this README.

### Filters-Dropout

The filters-dropout architecture achieved a **98.54%** classification accuracy on the clean validation data. When trained with augmented (X-Y shifting, rotation) data, this increased to **99.21%** accuracy on the _dirty_ validation set. Below is a visualization of the latter training cycle of 16 epochs.

*(figure: robust training cycle, 16 epochs)*

This architecture is contained in `cnn_init_filters_dropout.m`.

### Data Augmentation

The script `augment_data.m` uses the `imageDataAugmenter` object from the Neural Network Toolbox (see its _Deep Learning Image Classification_ documentation). The training set is doubled by appending augmented (X-Y shifted, rotated) copies of the images; a minimal sketch of this step is shown below.
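The sketch below illustrates the doubling step under stated assumptions: `images` is a `28x28x1xN` single array, `labels` is an `Nx1` label vector, and the shift/rotation ranges are placeholders rather than the values used in `augment_data.m`. The `augment` call may also require a MATLAB release newer than the R2018a listed above.

```matlab
% Hypothetical sketch of the augmentation/doubling step; augment_data.m may differ.
% Assumptions: images is 28x28x1xN single, labels is Nx1; ranges are placeholders.
augmenter = imageDataAugmenter( ...
    'RandRotation',     [-15 15], ...   % rotation range in degrees (assumed)
    'RandXTranslation', [-3 3],   ...   % horizontal shift in pixels (assumed)
    'RandYTranslation', [-3 3]);        % vertical shift in pixels (assumed)

augmented = zeros(size(images), 'like', images);
for i = 1:size(images, 4)
    % Apply an independent random shift/rotation to each digit image
    augmented(:, :, :, i) = augment(augmenter, images(:, :, :, i));
end

% Double the dataset: original images followed by their augmented copies
images = cat(4, images, augmented);
labels = cat(1, labels, labels);
```

Pre-generating the copies (rather than augmenting on the fly) keeps the augmented set fixed across epochs, which matches the "doubled" dataset described above.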
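Since the surrounding training framework is not included (see the Quick Note), the following is a minimal sketch of what a MatConvNet SimpleNN initialization file such as `cnn_init_baseline.m` typically looks like. It follows MatConvNet's public `cnn_mnist_init.m` example; the filter counts, dropout rate, and training options here are assumptions, not the values used by the architectures in this repository.

```matlab
% Hypothetical SimpleNN sketch in the style of MatConvNet's cnn_mnist_init.m.
% Filter counts, dropout rate, and training options are assumptions.
function net = cnn_init_sketch()
f = 1/100;                                   % weight initialization scale
net.layers = {};
net.layers{end+1} = struct('type', 'conv', ...
    'weights', {{f*randn(5,5,1,20,'single'), zeros(1,20,'single')}}, ...
    'stride', 1, 'pad', 0);
net.layers{end+1} = struct('type', 'pool', 'method', 'max', ...
    'pool', [2 2], 'stride', 2, 'pad', 0);
net.layers{end+1} = struct('type', 'conv', ...
    'weights', {{f*randn(5,5,20,50,'single'), zeros(1,50,'single')}}, ...
    'stride', 1, 'pad', 0);
net.layers{end+1} = struct('type', 'pool', 'method', 'max', ...
    'pool', [2 2], 'stride', 2, 'pad', 0);
net.layers{end+1} = struct('type', 'conv', ...                % acts as a fully connected layer
    'weights', {{f*randn(4,4,50,500,'single'), zeros(1,500,'single')}}, ...
    'stride', 1, 'pad', 0);
net.layers{end+1} = struct('type', 'relu');
net.layers{end+1} = struct('type', 'dropout', 'rate', 0.5);   % omit for a baseline without dropout
net.layers{end+1} = struct('type', 'conv', ...
    'weights', {{f*randn(1,1,500,10,'single'), zeros(1,10,'single')}}, ...
    'stride', 1, 'pad', 0);
net.layers{end+1} = struct('type', 'softmaxloss');

% Meta parameters consumed by MatConvNet's training loop
net.meta.inputSize = [28 28 1];
net.meta.trainOpts.learningRate = 0.001;
net.meta.trainOpts.numEpochs = 20;
net.meta.trainOpts.batchSize = 100;
end
```

A network defined this way can be trained with MatConvNet's `cnn_train` and evaluated with `vl_simplenn`; see the MatConvNet documentation linked above for the full workflow.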