Found 1,774 repositories (showing 30)
hereismari
Handwritten digits classification from MNIST with TensorFlow on Android; Featuring Tutorial!
LinguoLi
A tutorial for MNIST handwritten digit classification using sklearn, PyTorch and Keras.
qiyaoliang
Recent advances in many fields have accelerated the demand for solving classification, regression, and detection problems from few 2D images/projections. At the heart of many of these modern techniques are neural networks, implemented with deep learning algorithms. In our neural network architecture, we embed a dynamically programmable quantum circuit, acting as a hidden layer, to learn the parameters needed to correctly classify handwritten digits from the MNIST database. By starting small and making incremental improvements, we reach ~95% accuracy identifying previously unseen digits from 0 to 7 using this architecture!
ataturhan21
A complete solution for the MNIST handwritten digit classification challenge using PyTorch, including data exploration, model training, and Kaggle submission generation.
This project implements a Convolutional Neural Network (CNN) to recognize handwritten digits (0–9) using the MNIST dataset. The model is trained on labeled image data, achieving high accuracy in digit classification, and demonstrates the practical application of deep learning techniques in computer vision.
RafayKhattak
Simple MNIST Handwritten Digit Classification using Pytorch
Classification and segmentation of the MNIST dataset given as point-set input. Classification: the program classifies handwritten digits, each given as a sample of 100 points in a 2-dimensional field. The architecture is based on a Stanford paper on PointNet, which is especially efficient for 3D point-cloud classification. The PointNet classification accuracy is 92.86%. Segmentation: this extends the classification net to label segments within the point set. The program receives a handwritten digit given as a sample of 200 points in a 2-dimensional field, where 100 of the points are sampled from the digit itself and the rest are "background" points that are not part of the digit. The program classifies each point into one of the 2 segments, returning whether it is part of the digit or part of the background. The PointNet segmentation accuracy is 97.65%.
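The key idea the description relies on is PointNet's permutation invariance: every point passes through the same shared MLP, and a symmetric max-pool collapses the point dimension so the order of the 100 input points cannot matter. A minimal numpy sketch of that forward pass, with untrained toy weights (all weight names and sizes here are illustrative, not from the repository):

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def pointnet_forward(points, W1, W2, W3):
    """Classify a point cloud of shape (n_points, 2).

    Each point goes through the same (shared) MLP; a max-pool over the
    point axis yields an order-invariant global feature, which a final
    linear layer maps to 10 class scores."""
    h = relu(points @ W1)       # (n_points, 64)  shared per-point MLP
    h = relu(h @ W2)            # (n_points, 128)
    g = h.max(axis=0)           # (128,) global feature, permutation-invariant
    return g @ W3               # (10,) class scores

# Toy random weights; a real model would learn these by training.
W1 = rng.normal(size=(2, 64)) * 0.1
W2 = rng.normal(size=(64, 128)) * 0.1
W3 = rng.normal(size=(128, 10)) * 0.1

cloud = rng.normal(size=(100, 2))            # a digit sampled as 100 2-D points
scores = pointnet_forward(cloud, W1, W2, W3)

# Shuffling the points leaves the scores unchanged.
shuffled = cloud[rng.permutation(100)]
assert np.allclose(scores, pointnet_forward(shuffled, W1, W2, W3))
```

The max-pool is what makes the architecture a set function: any permutation-equivariant per-point transform followed by a symmetric pooling gives the same output for any ordering of the input points.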
chandan450
This is a machine learning model that classifies handwritten digits from 0 to 9.
darshanbagul
Implementation of handwritten digit classification models trained on MNIST dataset and understanding the No Free Lunch Theorem by testing on USPS Dataset
This project uses PySide6 to package a Qt program for PyTorch-based MNIST handwritten digit classification, suitable for demonstrating convolutional neural networks in a cognitive course
darshanbagul
Implementation of Handwritten digits classification from MNIST on Android using Keras and TensorFlow.
ayooshkathuria
Classification of MNIST Handwritten Digits Database using Deep Learning
Ronny-22-Code
This repository presents my project "Handwritten-Digit-Classification" using the MNIST dataset. The project applies the KNN algorithm, achieving a recognition accuracy of around 91–93%. The results were obtained by first training on the mnist_train dataset and then testing on the mnist_test dataset to recognize handwritten digits.
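The KNN approach described above has no training step beyond storing the data: each test image is labeled by a majority vote among its k nearest training images. A minimal sketch (the toy 2-D points stand in for 784-dimensional flattened MNIST vectors; the function name is ours, not the repository's):

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    """Predict each test point's label by majority vote among its
    k nearest training points (Euclidean distance on raw pixels)."""
    preds = []
    for x in X_test:
        dists = np.linalg.norm(X_train - x, axis=1)   # distance to every training image
        nearest = y_train[np.argsort(dists)[:k]]      # labels of the k closest
        preds.append(np.bincount(nearest).argmax())   # majority vote
    return np.array(preds)

# Toy stand-in for flattened MNIST images: two well-separated clusters.
X_train = np.array([[0., 0.], [0., 1.], [10., 10.], [10., 11.]])
y_train = np.array([0, 0, 1, 1])
preds = knn_predict(X_train, y_train, np.array([[0., 0.5], [10., 10.5]]), k=3)
# preds -> array([0, 1])
```

On real MNIST one would vectorize the distance computation (or use `sklearn.neighbors.KNeighborsClassifier`), since a Python loop over 10,000 test images is slow.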
MNIST Handwritten Digit Classification and Recognition Using Convolutional Neural Network (CNN) Deep Learning
nex3z
TensorFlow Serving model for handwritten digits classification from MNIST.
Rushi589
Handwritten Digit Recognition is a deep learning project that uses a Convolutional Neural Network (CNN) to accurately identify digits (0–9) from handwritten images. It leverages the MNIST dataset to train and evaluate the model for real-time digit classification.
NvsYashwanth
MNIST handwritten digit classification using PyTorch
coxy1989
MNIST handwritten digit classification with Clojure.
SheidaAbedpour
This project implements a Multilayer Perceptron (MLP) using the PyTorch library for the MNIST handwritten digits dataset.
Hunterdii
DigiPic-Classifier is a powerful image classification app built with Streamlit. It features two models: CIFAR-10 Object Recognition to classify objects like airplanes, cars, animals, and more, and MNIST Digit Classification for recognizing handwritten digits. With a sleek interface and real-time predictions, DigiPic-Classifier offers a seamless experience.
Handwriting styles vary from person to person, even for a single letter or digit, but the digits share similarities and unique features that help humans recognize them visually. In this project we perform digit classification so that a machine can learn these unique features and the layout of individual digits. We applied Histogram of Oriented Gradients (HOG) and Local Binary Patterns (LBP) to the images to extract digit features, and used Support Vector Machine (SVM), K-Nearest Neighbors (KNN), and supervised neural network algorithms for classification. Each of these algorithms has pros and cons in terms of time consumption, dataset size, and accuracy. We achieved 98% classification accuracy using a 3-layer neural network.
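The HOG step mentioned above summarizes each image cell by a histogram of gradient orientations weighted by gradient magnitude, so the classifier sees stroke directions rather than raw pixels. A minimal dependency-free sketch of that idea (in practice one would use `skimage.feature.hog`; the cell size and bin count here are illustrative):

```python
import numpy as np

def hog_features(img, cell=7, bins=9):
    """Minimal HOG-style descriptor: per-cell histograms of unsigned
    gradient orientation, weighted by gradient magnitude."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180        # unsigned orientation in [0, 180)
    h, w = img.shape
    feats = []
    for i in range(0, h - cell + 1, cell):
        for j in range(0, w - cell + 1, cell):
            m = mag[i:i + cell, j:j + cell].ravel()
            a = ang[i:i + cell, j:j + cell].ravel()
            hist, _ = np.histogram(a, bins=bins, range=(0, 180), weights=m)
            feats.append(hist / (np.linalg.norm(hist) + 1e-6))  # per-cell normalization
    return np.concatenate(feats)

img = np.zeros((28, 28))
img[:, 14] = 1.0                 # a vertical stroke
f = hog_features(img)            # 28/7 = 4 cells per side -> 16 cells x 9 bins
```

The resulting feature vector (here 144 values for a 28x28 image) is what gets fed to the SVM, KNN, or neural network classifiers.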
ArenD100
# Final_Project

## English Alphabet Machine Learning Model

This project utilizes the EMNIST-letters image database, an extension of the MNIST database of handwritten digits. EMNIST datasets can be found at: https://www.nist.gov/node/1298471/emnist-dataset. MNIST datasets can be found at: http://yann.lecun.com/exdb/mnist/. This project imported data by pip-installing the emnist (and mnist) libraries. Note - the "data/python-mnist" folder is from sorki/python-mnist on GitHub; that repository can also be used for EMNIST testing, but this project did not use it.

#### Notes on EMNIST 'Letters' Dataset

"The EMNIST Balanced dataset contains a set of characters with an equal number of samples per class. The EMNIST Letters dataset merges a balanced set of the uppercase and lowercase letters into a single 26-class task. The EMNIST Digits and EMNIST MNIST dataset provide balanced handwritten digit datasets directly compatible with the original MNIST dataset. The EMNIST Letters dataset seeks to further reduce the errors occurring from case confusion by merging all the uppercase and lowercase classes to form a balanced 26-class classification task. In a similar vein, the EMNIST Digits class contains a balanced subset of the digits dataset containing 28,000 samples of each digit."

## About Model

This model uses the EMNIST 'Letters' dataset. Refer to 'EMNIST.ipynb' for a detailed look at the model. The model trained to 95% accuracy and tested at 91% accuracy.
```
model.fit(X_train, y_train, batch_size=128, epochs=10, shuffle=True, verbose=2)
```

```
Epoch 10/10
1248000/1248000 - 19s - loss: 0.1345 - acc: 0.9512
```

```
model_loss, model_accuracy = model.evaluate(X_test, y_test, verbose=2)
print(f"Loss: {model_loss}, Accuracy: {model_accuracy}")
```

```
20800/20800 - 2s - loss: 0.4131 - acc: 0.9130
Loss: 0.41310153188356846, Accuracy: 0.9129807949066162
```

## Predicting the Model

A: 1, B: 2, C: 3, D: 4, E: 5, F: 6, G: 7, H: 8, I: 9, J: 10, K: 11, L: 12, M: 13, N: 14, O: 15, P: 16, Q: 17, R: 18, S: 19, T: 20, U: 21, V: 22, W: 23, X: 24, Y: 25, Z: 26

This model was tested on a couple of letters from the test set; both were correctly predicted. It was also tested on some images collected from the internet (from https://graphemica.com), which are in the 'images' folder. The first image was predicted incorrectly, but the same letter in a different font was predicted correctly; the second imported image appears more similar to the EMNIST dataset.

The model was also tried on newly handwritten data (written and imported myself) to compare the EMNIST model against actual handwriting. The results were not as expected. An image of each capital letter of the alphabet was photographed and uploaded, in 4 sets: Pencil, Pen, Sharpie, and Marker. The goal was to determine which writing utensil would test most accurately against the EMNIST set; however, the imported pictures were all sideways, and rotating them made the images much lighter and illegible, so every one tested failed to predict (only one of each utensil was predicted in EMNIST.ipynb). A self-created model based on the 26 images uploaded for each utensil was created, only to achieve an accuracy of 0. More photos would need to be taken, and better pixel conversions done to get from 4D down to 2D.

## Conclusions

The EMNIST model itself runs extremely well. The model run on API images needs more testing but predicted correct results. The actual handwritten data needs to be redone to produce better predictions; this is a case of bad data in, bad data out with the self-made model. Better-quality pictures need to be taken and uploaded so that rotation by the 'pillow' library is not needed.
Implemented and compared the performance of the classifiers below using cross-validation and error metrics: linear classifier, K-nearest neighbor classifier, RBF neural network, and 1- and 2-hidden-layer neural networks.
bsiegelwax
MNIST classification of handwritten digits on a quantum computing simulator using OpenQASM.
matiascaputti
🎥 Real-time digit classification using the MNIST handwritten digit database, OpenCV, and Python.
patankaraditya1
Developed a 5-layer Sequential Convolutional Neural Network using Keras with a TensorFlow backend for digit recognition, trained on the MNIST dataset. Adjusted parameters such as kernel size, activation function, and optimizer properties to find the best fit, obtaining an accuracy of 97.12%. Performed data augmentation such as image scaling, flips, and rotation to avoid overfitting, increasing the accuracy to 98.02%. Compared the accuracy to KNN, Logistic Regression, and Random Forest, which had accuracies of 96.37%, 91.22%, and 96.19% respectively.
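The data augmentation step described above multiplies the effective training set by applying label-preserving transforms to each image. A minimal numpy sketch of that idea (this covers flips, 90° rotations, and small shifts to stay dependency-free; the small-angle rotations and scaling mentioned in the description would need e.g. `scipy.ndimage` or Keras's `ImageDataGenerator`):

```python
import numpy as np

def augment(img, rng):
    """Return a randomly flipped, rotated (multiple of 90 degrees),
    and slightly shifted copy of a 2-D image."""
    out = img
    if rng.random() < 0.5:
        out = np.fliplr(out)                         # horizontal flip
    out = np.rot90(out, k=int(rng.integers(0, 4)))   # 0/90/180/270 degree rotation
    dy, dx = rng.integers(-2, 3, size=2)
    return np.roll(out, shift=(dy, dx), axis=(0, 1)) # shift by up to 2 pixels

rng = np.random.default_rng(0)
img = np.zeros((28, 28))
img[10:18, 12:16] = 1.0                              # a toy "digit"
batch = np.stack([augment(img, rng) for _ in range(8)])  # 8 augmented copies
```

All three transforms preserve the total pixel mass and the 28x28 shape, so augmented images can be fed to the same network unchanged. (Note that for real digits, flips and 180° rotations can change the label, e.g. 6 vs 9, so in practice one restricts the transform set.)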
mkisantal
A neural network for MNIST handwritten digit classification. Implemented from scratch in MATLAB.
prateek54
This project demonstrates how to use TensorFlow Mobile on Android for handwritten digits classification from MNIST.
PhucHuwu
Handwritten digit classification system with custom neural networks from scratch. 96.53% accuracy on MNIST with interactive GUI for real-time testing.
The goal of this project is to label images of the 10 handwritten digits "zero" through "nine". The images are 28 by 28 pixels (MNIST dataset) and are represented as a vector x of dimension 784 by listing all pixel values in raster-scan order. The labels 0, 1, 2, ..., 9 correspond to the 10 classes written in the images. There are 3000 training cases, containing 300 examples of each of the 10 classes.

PROBLEM 1: Read an input file. Each line contains 785 comma-delimited numbers: the first 784 values, between 0.0 and 1.0, are the pixel values (black-and-white images), and the last number is the class label (0 for digit 0, 1 for digit 1, etc.).

PROBLEM 2: Implement the backpropagation algorithm in a zero-hidden-layer neural network (weights between input and output nodes). The output layer should be a softmax over the 10 classes of handwritten digits (e.g. architecture: 784 > 10). Your backprop code should minimize the cross-entropy function for this multi-class classification problem (categorical cross-entropy).

PROBLEM 3: Extend your code from problem 2 to support a single-hidden-layer neural network with N hidden units (e.g. architecture: 784 > 10 > 10). The hidden units should use sigmoid activations.

PROBLEM 4: Extend your code from problem 3 (using cross-entropy error) to implement a 2-hidden-layer neural network, starting with a simple architecture containing N hidden units in each layer (e.g. architecture: 784 > 10 > 10 > 10). The hidden units should use sigmoid activations.

PROBLEM 5: Extend your code from problem 4 to implement different activation functions, passed as a parameter. All activations (except the final layer, which should remain a softmax) must be changed to the passed activation function.

PROBLEM 6: Extend your code from problem 5 to implement momentum with your gradient descent. The momentum value will be passed as a parameter. Your function should perform "epoch" number of epochs and return the resulting weights.
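For the zero-hidden-layer case (problem 2), backpropagation reduces to one clean gradient: with softmax outputs and categorical cross-entropy, the gradient of the loss with respect to the logits is simply `probabilities - one_hot_targets`. A minimal numpy sketch of that training loop, under the assumption of full-batch gradient descent (the toy 4-feature data stands in for the 784-dimensional MNIST vectors):

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)      # subtract row max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train_softmax(X, y, n_classes, lr=0.5, epochs=200):
    """Zero-hidden-layer net (inputs > softmax outputs) trained with
    batch gradient descent on categorical cross-entropy."""
    n, d = X.shape
    W = np.zeros((d, n_classes))
    b = np.zeros(n_classes)
    Y = np.eye(n_classes)[y]                  # one-hot targets
    for _ in range(epochs):
        P = softmax(X @ W + b)                # forward pass: class probabilities
        grad = (P - Y) / n                    # dL/dlogits for softmax + cross-entropy
        W -= lr * X.T @ grad                  # backprop into the weights
        b -= lr * grad.sum(axis=0)
    return W, b

# Toy separable data: class i is centered on the i-th basis vector.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(np.eye(4)[i], 0.1, size=(30, 4)) for i in range(3)])
y = np.repeat(np.arange(3), 30)
W, b = train_softmax(X, y, n_classes=3)
acc = ((X @ W + b).argmax(axis=1) == y).mean()
```

Problems 3-6 build on this by inserting sigmoid hidden layers (whose local gradient is `h * (1 - h)`), chaining the gradient backwards through each layer, and replacing the plain update with a momentum update such as `v = m * v - lr * grad; W += v`.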