Under Development
Please note: This section is a work in progress. Information and features are subject to change and may be incomplete.

Custom Neural Network for MNIST
A hand-crafted neural network for MNIST digit recognition, built from the ground up in Python with NumPy. This project was undertaken to demonstrate and solidify an understanding of the fundamental concepts of machine learning without relying on high-level frameworks like TensorFlow or PyTorch.
abstract
This project details the implementation of a custom neural network for digit recognition on the classic MNIST dataset. The entire network, including layers, activation functions, and the backpropagation algorithm, was built using only the NumPy library.
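As a rough sketch of what such a from-scratch building block looks like, the snippet below shows a minimal fully connected layer with a sigmoid activation written in plain NumPy. The class and variable names are illustrative assumptions, not the project's actual code.

```python
import numpy as np

def sigmoid(z):
    # Element-wise logistic activation.
    return 1.0 / (1.0 + np.exp(-z))

class DenseLayer:
    """A single fully connected layer: a = sigmoid(W @ x + b)."""

    def __init__(self, n_inputs, n_outputs, rng=np.random.default_rng(0)):
        # Small random weights and zero biases, stored as plain NumPy arrays.
        self.W = rng.normal(0.0, 0.1, size=(n_outputs, n_inputs))
        self.b = np.zeros((n_outputs, 1))

    def forward(self, x):
        # x has shape (n_inputs, batch_size); cache values for backpropagation.
        self.x = x
        self.z = self.W @ x + self.b
        self.a = sigmoid(self.z)
        return self.a
```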
approach
The core objective was to build a functional neural network without high-level ML libraries. This required working through the mathematics of the forward and backward passes, implementing the backpropagation algorithm by hand, and managing weights and biases manually.
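Continuing the illustrative DenseLayer sketch above, a hand-written backward pass might look like the following. The gradient formulas follow from the chain rule for a sigmoid layer; the method name and in-place update style are assumptions rather than the project's exact implementation.

```python
    # Continuing the hypothetical DenseLayer class from the previous sketch.
    def backward(self, grad_out, learning_rate):
        # grad_out is dLoss/dA for this layer's output, shape (n_outputs, batch).
        batch_size = self.x.shape[1]

        # Chain rule through the sigmoid: dA/dZ = a * (1 - a).
        grad_z = grad_out * self.a * (1.0 - self.a)

        # Gradients with respect to the parameters, averaged over the batch.
        grad_W = (grad_z @ self.x.T) / batch_size
        grad_b = grad_z.sum(axis=1, keepdims=True) / batch_size

        # Gradient to pass back to the previous layer.
        grad_x = self.W.T @ grad_z

        # Plain gradient-descent update, applied in place.
        self.W -= learning_rate * grad_W
        self.b -= learning_rate * grad_b
        return grad_x
```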
results
After tuning hyperparameters such as the learning rate and the number of hidden neurons, the custom-built network achieved a respectable 85% accuracy on the MNIST test set, demonstrating the viability of the from-scratch approach.
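To show where hyperparameters like the learning rate and the hidden-layer size enter, here is a hypothetical training loop built on the sketches above. The specific values, the random stand-in data, and the loss gradient are assumptions for illustration; they are not the tuned settings or data pipeline that produced the reported accuracy.

```python
import numpy as np

rng = np.random.default_rng(0)

hidden_neurons = 64      # units in the hidden layer (a tunable hyperparameter)
learning_rate = 0.5      # gradient-descent step size (a tunable hyperparameter)
epochs = 5
batch_size = 32

hidden = DenseLayer(784, hidden_neurons)   # 28x28 MNIST images, flattened
output = DenseLayer(hidden_neurons, 10)    # one output unit per digit class

for epoch in range(epochs):
    # Stand-in batch of random data; the real project would iterate over MNIST here.
    x_batch = rng.random((784, batch_size))
    labels = rng.integers(0, 10, size=batch_size)
    y_batch = np.eye(10)[:, labels]        # one-hot targets, shape (10, batch)

    pred = output.forward(hidden.forward(x_batch))   # forward pass
    grad = pred - y_batch                            # squared-error output gradient
    # Backward pass: propagate gradients and update both layers in place.
    hidden.backward(output.backward(grad, learning_rate), learning_rate)
```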
educational
This serves as an excellent learning project for anyone seeking to understand the fundamental mechanics of neural networks. It provides clear, practical insight into how data flows through a network and how learning occurs via gradient descent and backpropagation.