I'm Adam. If you don't know who I am, my work is still at least an order of magnitude away from echoing my name. Soon it will. :)
Pinned
- Vit-on-small-data (Public)
  The lightest Vision Transformer (ViT) trained from scratch to achieve 93.37 ± 0.07% top-1 accuracy on CIFAR-10 within just 50 epochs.
  Python · 4
- training_models_from_scratch (Public)
  Training tiny models from scratch, using NumPy in code and linear algebra on a piece of paper.
  Python · 11
- optimizers-from-scratch (Public)
  Training models with different optimizers implemented in NumPy only: SGD, Momentum, NAG, Adagrad, RMSProp, and Adam. Includes a benchmark against PyTorch's built-in optimizers.
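As a flavor of what a NumPy-only optimizer looks like, here is a minimal sketch of an Adam update step (the function name and driver loop are hypothetical illustrations, not code from the repo):

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update; returns the new param and moment estimates."""
    m = beta1 * m + (1 - beta1) * grad        # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # second-moment estimate
    m_hat = m / (1 - beta1 ** t)              # bias correction for step t
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v

# Toy problem: minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3)
w = np.array([0.0])
m, v = np.zeros_like(w), np.zeros_like(w)
for t in range(1, 5001):
    grad = 2.0 * (w - 3.0)
    w, m, v = adam_step(w, grad, m, v, t, lr=0.01)
# w converges toward the minimizer at 3.0
```

The bias-correction terms are what distinguish Adam from plain RMSProp-with-momentum: without them, the zero-initialized moment estimates bias early steps toward zero.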
- BatchNorm-Interactive-Playground (Public)
  Interactive visualization simulating how the learnable parameters gamma and beta in BatchNorm affect internal covariate shift (ICS) and speed up convergence.
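To make the role of gamma and beta concrete, here is a minimal NumPy sketch of a BatchNorm forward pass (an assumed illustration, not the playground's code): after normalization, each feature has mean ≈ beta and standard deviation ≈ gamma.

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    """Normalize each feature over the batch axis, then rescale with gamma, beta."""
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)  # zero-mean, unit-variance per feature
    return gamma * x_hat + beta            # learnable shift/scale the network can tune

rng = np.random.default_rng(0)
x = rng.normal(5.0, 2.0, size=(256, 4))   # batch of 256 samples, 4 features
out = batchnorm_forward(x, gamma=np.full(4, 2.0), beta=np.full(4, 1.0))
# each output feature now has mean ~= 1.0 (beta) and std ~= 2.0 (gamma)
```

Setting gamma = sqrt(var) and beta = mu would recover the raw input, which is why BatchNorm does not reduce the network's expressive power.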
- COCO-CONVERTER (Public)
  A CLI that converts CSV files and image folders into a JSON file with COCO annotations, then builds a custom dataset from it, ready for training with PyTorch.
  Python · 5
- Transformer-from-scratch (Public)
  An elaborate Transformer implementation with a detailed explanation.
  Python · 1
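The core operation any Transformer implementation builds on is scaled dot-product attention. A minimal NumPy sketch of that one op, under the standard formulation (not code taken from the repo):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V, the core op of a Transformer layer."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                # similarity of each query to each key
    scores -= scores.max(axis=-1, keepdims=True)   # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # softmax: each row sums to 1
    return weights @ V, weights                    # weighted average of values

rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))  # 3 queries, head dim 8
K = rng.normal(size=(5, 8))  # 5 keys
V = rng.normal(size=(5, 8))  # 5 values
out, w = scaled_dot_product_attention(Q, K, V)  # out: (3, 8), w: (3, 5)
```

The 1/sqrt(d_k) scaling keeps the dot products from growing with head dimension, which would otherwise push the softmax into near-one-hot saturation.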