---
license: mit
datasets: []
language:
- en
model_name: Micrograd AutoGrad Engine
library_name: pytorch
tags:
- micrograd
- autograd
- backpropagation
- neural-networks
- andrej-karpathy
---
# Micrograd AutoGrad Engine: Backpropagation Implementation

This repository contains an implementation of **Backpropagation** built on a small **AutoGrad Engine**, inspired by the **Micrograd** video by Andrej Karpathy. It explores the foundations of training neural networks, implementing the key operations from scratch.
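The heart of such an engine is a scalar value object that records the operations that produced it, so gradients can later be propagated backwards through the resulting computation graph via the chain rule. The sketch below is a minimal, illustrative version in the spirit of Karpathy's micrograd, not this repository's exact code; the `Value` class and its methods are assumptions for illustration.

```python
class Value:
    """A scalar that tracks how it was produced, so gradients can flow back.

    Minimal illustrative sketch in the spirit of micrograd; not the
    repository's exact code.
    """

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None  # how to push out.grad into the inputs
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))

        def _backward():
            # d(out)/d(self) = d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad

        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))

        def _backward():
            # product rule: each input's gradient is scaled by the other input
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad

        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        topo, visited = [], set()

        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)

        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()
```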
## Overview

- **Manual Backpropagation**: Building intuition for how gradients are computed by applying the chain rule by hand (see the worked snippet after this list).
- **Implementation Notebooks**: Step-by-step code for implementing and understanding backpropagation and related concepts.
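As shown below, the manual exercise amounts to running a forward pass, calling `backward()`, and checking the resulting gradients against derivatives worked out by hand; this snippet assumes the illustrative `Value` class sketched above.

```python
# L = a*b + b, so by hand: dL/da = b = -3.0 and dL/db = a + 1 = 3.0
a = Value(2.0)
b = Value(-3.0)
L = a * b + b

L.backward()
print(a.grad)  # -3.0, matches dL/da = b
print(b.grad)  #  3.0, matches dL/db = a + 1
```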
## Documentation

For a better reading experience and detailed notes, visit my **[Road to GPT Documentation Site](https://muzzammilshah.github.io/Road-to-GPT/Micrograd/)**.
> **💡 Pro Tip**: The site provides an interactive and visually rich walkthrough of the notes and code, and is the recommended way to explore this project.
## Acknowledgments

Notes and implementations inspired by the **Micrograd** video by [Andrej Karpathy](https://karpathy.ai/).
For more of my projects, visit my [Portfolio Site](https://muhammedshah.com).