# Classifying Digits with Logistic Regression
*Personal Projects · Data Science · Python*
## Overview
A digit classification project using PyTorch to implement logistic regression on the MNIST dataset, demonstrating foundational deep learning concepts including forward pass, backpropagation, and optimization.
## Key Achievements
- Achieved 82.6% accuracy on the test set after 5 epochs of training
- Implemented a complete PyTorch training pipeline (data loading, loss computation, optimization)
## Implementation
### Custom Logistic Regression Class
Built a PyTorch model with:
- Constructor: Initializes a linear layer mapping the 784 input pixels (28x28 flattened) to the 10 output classes
- Forward Method: Applies the linear transformation to produce class scores (logits); the softmax is applied implicitly by the cross-entropy loss
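A minimal sketch of such a model (class and parameter names here are illustrative, not the project's actual code):

```python
import torch
import torch.nn as nn

class LogisticRegression(nn.Module):
    """Logistic regression as a single linear layer: 784 inputs -> 10 class logits."""
    def __init__(self, input_size=784, num_classes=10):
        super().__init__()
        self.linear = nn.Linear(input_size, num_classes)

    def forward(self, x):
        # Return raw logits; nn.CrossEntropyLoss applies log-softmax internally
        return self.linear(x)
```

Returning logits rather than probabilities is the idiomatic pairing with PyTorch's cross-entropy loss, which expects unnormalized scores.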
### Training Pipeline
- Data Loading: Batched DataLoaders over the MNIST training and test sets
- Loss Function: Cross-entropy for multiclass classification
- Optimizer: Stochastic Gradient Descent with learning rate 0.001
- Training Loop: Forward pass -> calculate loss -> backward pass -> update weights
## Hyperparameters
| Parameter | Value | Reason |
|---|---|---|
| Input Size | 784 | 28x28 px flattened |
| Num Classes | 10 | Digits 0-9 |
| Epochs | 5 | Training iterations |
| Batch Size | 100 | Memory efficiency |
| Learning Rate | 0.001 | Standard SGD |
## Training Process
```python
import torch.nn.functional as F

# For each epoch:
for epoch in range(num_epochs):
    for images, labels in trainingData:
        # flatten each 28x28 image to a 784-dim vector
        images = images.view(-1, 28 * 28)

        # reset gradients to 0
        optimizer.zero_grad()

        # forward pass
        output = model(images)
        loss = F.cross_entropy(output, labels)

        # backward pass
        loss.backward()

        # update weights
        optimizer.step()
```

*Figure: digit classification results.*
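The reported test accuracy would come from an evaluation pass like this sketch; the model and test loader below are stand-ins (a fresh linear layer and random data) so the example runs on its own, whereas the project would use the trained model and the MNIST test split:

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Stand-ins for the trained model and MNIST test loader (illustrative only)
model = nn.Linear(784, 10)
testData = DataLoader(TensorDataset(torch.randn(200, 1, 28, 28),
                                    torch.randint(0, 10, (200,))),
                      batch_size=100)

correct = 0
total = 0
with torch.no_grad():                      # no gradients needed for evaluation
    for images, labels in testData:
        images = images.view(-1, 28 * 28)  # flatten to (batch, 784)
        predicted = model(images).argmax(dim=1)  # most likely class per image
        total += labels.size(0)
        correct += (predicted == labels).sum().item()

accuracy = 100.0 * correct / total
print(f"Test accuracy: {accuracy:.1f}%")
```

With random stand-in data the accuracy is near chance; on the trained MNIST model this computation is what yields the figure quoted above.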

## Technologies
Python, PyTorch, MNIST, Stochastic Gradient Descent, Cross-Entropy Loss