I've been diving deep into machine learning whenever I can carve out time, building everything from LLMs from scratch to a location‑scoring system during an internship. I've been living in
I'm most confident with
- A tiny GPT‑style Transformer trained from scratch in PyTorch on a 1.5 MB corpus of public‑domain Aristotle: character‑level next‑token prediction, multiple model sizes, full training logs, loss curves, and sample generations.
- Simple PyTorch neural networks (an MLP and a CNN) trained on the Kaggle digit‑recognizer dataset (MNIST in CSV format). The goal is practice with neural nets, not leaderboard sniping.
- A surprisingly complex terminal Battleship with a fully custom text UI, used as a sandbox for code organization, modular design, and concise Python.
- An end‑to‑end ML pipeline running from EDA (class balance, correlations, KDEs) to a logistic regression baseline, then tuned gradient boosting models and interpretability with SHAP.
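The character‑level setup behind the Aristotle GPT project can be sketched without PyTorch: build a vocabulary from the corpus, encode text to integer IDs, and cut shifted windows for next‑token prediction. The names `CharTokenizer` and `make_pairs` are illustrative, not taken from the repo.

```python
# Sketch of character-level tokenization and next-token training pairs,
# as used when training a tiny GPT-style model on a raw text corpus.

class CharTokenizer:
    def __init__(self, corpus: str):
        chars = sorted(set(corpus))
        self.stoi = {c: i for i, c in enumerate(chars)}
        self.itos = {i: c for c, i in self.stoi.items()}

    def encode(self, text: str) -> list:
        return [self.stoi[c] for c in text]

    def decode(self, ids: list) -> str:
        return "".join(self.itos[i] for i in ids)


def make_pairs(ids: list, block_size: int):
    """Yield (context, target) windows; targets are inputs shifted by one."""
    for i in range(len(ids) - block_size):
        yield ids[i : i + block_size], ids[i + 1 : i + block_size + 1]


corpus = "the unexamined life is not worth living"
tok = CharTokenizer(corpus)
ids = tok.encode(corpus)
assert tok.decode(ids) == corpus  # encoding round-trips losslessly
```

The shifted-by-one target is the whole trick: the model sees `block_size` characters and is trained to predict each position's successor.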
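The Kaggle digit‑recognizer CSV stores each image as a label followed by 784 pixel values (a flattened 28×28 grid). A minimal loading sketch, with a `StringIO` toy row standing in for `train.csv` and a hypothetical `load_rows` helper:

```python
# Parse Kaggle digit-recognizer rows: label first, then pixel columns,
# scaled from [0, 255] to [0, 1] as is typical before feeding an MLP.
import csv
import io

sample = "label,pixel0,pixel1,pixel2\n7,0,128,255\n"  # truncated toy row

def load_rows(fh):
    reader = csv.reader(fh)
    next(reader)  # skip the header line
    for row in reader:
        label = int(row[0])
        pixels = [int(v) / 255.0 for v in row[1:]]
        yield label, pixels

label, pixels = next(load_rows(io.StringIO(sample)))
```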
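For the Battleship text UI, the core rendering problem is turning board state into a labeled grid. A small sketch, with an illustrative `render_board` and symbol scheme ('.' unknown, 'X' hit, 'o' miss), not the project's actual code:

```python
# Render a square board from a dict of (row, col) -> symbol,
# with letter column labels and numbered rows.

def render_board(size: int, shots: dict) -> str:
    header = "   " + " ".join(chr(ord("A") + c) for c in range(size))
    rows = [header]
    for r in range(size):
        cells = " ".join(shots.get((r, c), ".") for c in range(size))
        rows.append(f"{r + 1:>2} {cells}")
    return "\n".join(rows)

print(render_board(5, {(0, 0): "X", (2, 3): "o"}))
```

Keeping rendering as a pure function of state like this is what makes the rest of the game logic easy to test.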
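The logistic‑regression baseline in the pipeline can be sketched with plain gradient descent on binary cross‑entropy, so the mechanics are visible (the project itself presumably uses a library such as scikit‑learn):

```python
# Batch gradient descent for binary logistic regression. For each sample,
# the gradient of the cross-entropy loss is (p - y) * x, which is what
# gets accumulated below.
import math

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=500):
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        grad_w = [0.0] * len(w)
        grad_b = 0.0
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi
            for j, xj in enumerate(xi):
                grad_w[j] += err * xj
            grad_b += err
        w = [wj - lr * gj / len(X) for wj, gj in zip(w, grad_w)]
        b -= lr * grad_b / len(X)
    return w, b

def predict(w, b, xi) -> int:
    return 1 if sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) >= 0.5 else 0

# Toy separable data: class 1 when the single feature is large.
X = [[0.0], [1.0], [2.0], [3.0]]
y = [0, 0, 1, 1]
w, b = fit_logistic(X, y)
```

A baseline like this anchors the later gradient‑boosting results: if the tuned models barely beat it, the extra complexity isn't paying for itself.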