Learn AI & Machine Learning
Coming Soon: Neural networks, LLMs, and intelligent applications
Topics: AI, Machine Learning, LLMs, PyTorch, RAG
Build intelligent systems that learn from data. From classical machine learning algorithms through deep neural networks to modern LLM applications — understand how AI works and build with it.
This course covers ML fundamentals, PyTorch, transformer architectures, RAG pipelines, and deploying AI-powered applications in production.
Start Here — Learning Roadmap
A suggested path from zero to mastery. Follow these steps in order:
- Learn Python and math foundations — Ensure solid Python skills plus basics of linear algebra, calculus, and probability/statistics
- Understand classical ML algorithms — Learn linear/logistic regression, decision trees, SVMs, and k-means clustering with scikit-learn
- Master model evaluation — Understand train/test splits, cross-validation, precision/recall, ROC curves, and overfitting vs underfitting
- Build neural networks from scratch — Implement forward/backward propagation, gradient descent, and loss functions to understand the fundamentals
- Learn PyTorch — Build models with tensors, autograd, nn.Module, DataLoaders, and training loops in the dominant DL framework
- Study deep learning architectures — Understand CNNs for vision, RNNs/LSTMs for sequences, and the attention mechanism
- Master transformers and LLMs — Learn self-attention, positional encoding, tokenization, and how GPT/BERT architectures work
- Fine-tune and prompt-engineer — Adapt pre-trained models with LoRA/QLoRA, write effective prompts, and use few-shot learning
- Build RAG applications — Create retrieval-augmented generation pipelines with embeddings, vector databases, and LangChain/LlamaIndex
- Deploy AI in production — Serve models with vLLM/TGI, build APIs, handle inference at scale, and monitor model performance
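The evaluation ideas in step 3 of the roadmap can be made concrete in a few lines of plain Python before reaching for scikit-learn. A minimal sketch; the function names here are illustrative, not from any library:

```python
import random

def precision_recall(y_true, y_pred):
    """Precision = TP / (TP + FP); recall = TP / (TP + FN)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

def train_test_split(rows, test_frac=0.25, seed=0):
    """Shuffle, then hold out a fraction for testing: the first habit
    under 'master model evaluation'."""
    rows = rows[:]  # copy so the caller's list is untouched
    random.Random(seed).shuffle(rows)
    cut = int(len(rows) * (1 - test_frac))
    return rows[:cut], rows[cut:]
```

For example, `precision_recall([1, 1, 0, 0], [1, 0, 1, 0])` returns `(0.5, 0.5)`: one true positive, one false positive, one false negative.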
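The forward/backward propagation loop from step 4 fits in a short NumPy script. A toy sketch that trains a one-hidden-layer network on XOR; the layer size, learning rate, iteration count, and variable names are arbitrary choices for illustration, not a recipe:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 8 sigmoid units, random init
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros((1, 1))
lr = 0.5
losses = []

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(np.mean((out - y) ** 2))  # mean squared error

    # backward pass: chain rule applied layer by layer
    d_out = 2 * (out - y) / len(X) * out * (1 - out)
    dW2 = h.T @ d_out; db2 = d_out.sum(0, keepdims=True)
    d_h = d_out @ W2.T * h * (1 - h)
    dW1 = X.T @ d_h; db1 = d_h.sum(0, keepdims=True)

    # gradient descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

preds = (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
```

Writing the gradients by hand once makes PyTorch's autograd (step 5) feel like a convenience rather than magic: `loss.backward()` computes exactly these chain-rule products for you.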
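The retrieval half of the RAG pipeline in step 9 reduces to: embed the documents, embed the query, rank by cosine similarity. A toy sketch using bag-of-words counts as a stand-in for learned embeddings; a real pipeline would use an embedding model and a vector database, and every name below is hypothetical:

```python
import math
from collections import Counter

def embed(text):
    """Stand-in 'embedding': a sparse word-count vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=2):
    """Return the k documents most similar to the query."""
    q = embed(query)
    scored = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return scored[:k]

docs = [
    "PyTorch tensors support automatic differentiation",
    "Vector databases store embeddings for similarity search",
    "Transformers rely on self-attention over token sequences",
]
top = retrieve("which database stores embeddings", docs, k=1)
```

The generation half then stuffs `top` into the LLM prompt as context; frameworks like LangChain and LlamaIndex (listed below) wrap both halves.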
Official & Core Documentation
- PyTorch Tutorials — Official deep learning tutorials from basics to advanced models (All levels)
- Hugging Face Documentation — Transformers, datasets, model hub, and inference API reference (Intermediate)
- TensorFlow Documentation — Google’s ML framework with tutorials, guides, and pre-trained models (Intermediate)
- OpenAI API Documentation — GPT, DALL-E, and embeddings API reference and cookbook (Intermediate)
- Anthropic API Documentation — Claude API, tool use, prompt engineering guides, and best practices (Intermediate)
- LangChain Documentation — LLM application framework for chains, agents, and RAG pipelines (Intermediate)
- LlamaIndex Documentation — Data framework for building RAG and LLM-powered applications (Intermediate)
- scikit-learn User Guide — Classical ML algorithms with theory, examples, and parameter tuning (Beginner)
- AI Engineer Roadmap — Visual step-by-step guide to the AI engineering learning path (Beginner)
GitHub Awesome Lists & Curated Collections
- awesome-machine-learning — ML frameworks, libraries, and software organized by language (66k+ stars)
- awesome-deep-learning — Deep learning tutorials, projects, and communities (24k+ stars)
- Awesome-LLM — Curated list of Large Language Model resources, papers, and tools (22k+ stars)
- awesome-generative-ai — Generative AI tools, works, models, and references (6k+ stars)
- awesome-production-machine-learning — Open source libraries to deploy, monitor, version, and scale ML (17k+ stars)
- llm-course — Free roadmap and Colab notebooks to get into Large Language Models (42k+ stars)
- awesome-ml-courses — Free machine learning and AI courses with video lectures
Interactive Courses & Hands-On Platforms
Free Courses
- fast.ai — Practical deep learning course, free and project-first with PyTorch and fastai (Intermediate)
- Google Machine Learning Crash Course — Free ML fundamentals with TensorFlow exercises (Beginner)
- Hugging Face NLP Course — Free NLP with Transformers, hands-on with real models and datasets (Intermediate)
- DeepLearning.AI Short Courses — Free bite-sized courses on LLMs, RAG, prompt engineering, and AI agents (All levels)
- Learn PyTorch (Zero to Mastery) — Hands-on PyTorch learning with code-first approach and notebooks (Beginner)
University & MOOC Courses
- Stanford CS229 — Machine Learning — Andrew Ng’s legendary ML course with free lecture notes and videos (Intermediate)
- Stanford CS231n — CNNs for Visual Recognition — Deep learning for computer vision with assignments (Advanced)
- Stanford CS224n — NLP with Deep Learning — NLP course covering word vectors through transformers (Advanced)
- MIT 6.S191 — Introduction to Deep Learning — Fast-paced intro to deep learning with TensorFlow labs (Intermediate)
- Duke — Introduction to Machine Learning — Free self-paced course covering CNNs, RNNs, and NLP with PyTorch (Beginner)
Practice & Challenges
- Kaggle Competitions — Real ML challenges with datasets, notebooks, and leaderboards (All levels)
- LLM Course Notebooks — Free Colab notebooks for fine-tuning, quantization, and RAG (Intermediate)
- Hugging Face Spaces — Deploy and share ML demos with Gradio or Streamlit for free (Intermediate)
Video Courses & YouTube Channels
Structured Course Playlists
- Andrej Karpathy — Neural Networks: Zero to Hero — Build GPT from scratch in one of the most widely recommended deep learning lecture series available (Intermediate)
- 3Blue1Brown — Neural Networks — Visual, intuitive neural network explanations (Beginner)
- DeepLearning.AI (Andrew Ng) — ML courses, LLM tutorials, and industry interviews (All levels)
- freeCodeCamp — Machine Learning with Python — Full ML course covering neural networks, CNNs, and NLP (Beginner)
Individual Creators & Channels
- Yannic Kilcher — ML paper reviews and in-depth explanations of cutting-edge research (Advanced)
- Two Minute Papers — Accessible AI research summaries with visual demonstrations (Beginner)
- AI Explained — Clear breakdowns of the latest AI developments, papers, and benchmarks (Intermediate)
- StatQuest with Josh Starmer — ML and statistics concepts explained simply with animations (Beginner)
- Jeremy Howard (fast.ai) — Practical deep learning insights from the fast.ai creator (Intermediate)
Books & Long-Form Reading
Free Online Books
- Deep Learning (Goodfellow et al.) — The foundational deep learning textbook, free to read online (Advanced)
- Speech and Language Processing (3rd Ed.) — Free draft by Jurafsky & Martin covering NLP through transformers and LLMs (Advanced)
- Dive into Deep Learning — Interactive deep learning textbook with code in PyTorch, TensorFlow, and JAX (Intermediate)
- Neural Networks and Deep Learning — Free online book covering backpropagation and deep network training (Intermediate)
Essential Paid Books
- Build a Large Language Model (From Scratch) — Understand LLMs by building a transformer with PyTorch (Advanced, Paid)
- AI Engineering — Chip Huyen’s guide to building AI systems in production (Advanced, Paid)
- Hands-On Machine Learning (3rd Ed.) — Practical ML with scikit-learn, Keras, and TensorFlow (Intermediate, Paid)
- Natural Language Processing with Transformers — By Hugging Face engineers, covers fine-tuning and inference (Intermediate, Paid)
- The LLM Engineer’s Handbook — Prompt engineering, fine-tuning, RAG, and deployment patterns (Intermediate, Paid)
Community, Practice & News
Forums & Discussion
- Hugging Face Community — Forum for Transformers, model sharing, and ML discussion
- r/MachineLearning — Research papers, discussions, and industry news (2.8M+ members)
- r/LocalLLaMA — Active community for running and fine-tuning open-source LLMs locally
- MLOps Community — Community for ML engineering, deployment, and production best practices
Newsletters & Blogs
- The Batch (DeepLearning.AI) — Andrew Ng’s weekly AI newsletter with curated news and original articles
- Ahead of AI — Sebastian Raschka’s newsletter on ML, deep learning, and AI research
- Latent Space — Newsletter, podcast, and community for AI engineers covering LLMs and tooling
- TLDR AI — Daily newsletter delivering AI, ML, and data science updates (620k+ subscribers)
- The Rundown AI — Daily AI newsletter covering developments, trends, and practical applications (750k+ subscribers)
Ecosystem Resources
- Papers With Code — ML papers with code implementations, benchmarks, and state-of-the-art leaderboards
- Kaggle Discussion Forums — Competition strategies, data science Q&A, and notebook sharing
- Arxiv Sanity — Browse, search, and filter ML research papers from arXiv
Tools & Environments
- Google Colab — Free cloud notebooks with GPU/TPU access for training models
- Hugging Face Model Hub — Repository of 500k+ pre-trained models ready for inference and fine-tuning
- Weights & Biases — ML experiment tracking, visualization, and model registry (free for individuals)
- Ollama — Run open-source LLMs locally on your machine with a simple CLI
- LM Studio — Desktop app for downloading and running LLMs locally with a chat interface
- vLLM — High-throughput LLM serving engine with PagedAttention for production inference
- Gradio — Build and share ML demos with a web interface in a few lines of Python
- Lightning AI — Platform for building, training, and deploying AI models with free GPU access
Cheat Sheets & Quick References
- PyTorch Cheat Sheet — Official quick reference for tensors, autograd, models, and training loops
- Machine Learning Cheat Sheet (Stanford) — Comprehensive visual reference for ML algorithms and concepts from CS229
- Deep Learning Cheat Sheet (Stanford) — Visual reference for CNNs, RNNs, training tips, and optimization from CS230
- Transformer Architecture Diagram — Jay Alammar’s visual guide explaining attention and transformer internals step by step
- Prompt Engineering Guide — Comprehensive guide to prompt engineering techniques for LLMs
Research & Staying Current
- Papers With Code — State of the Art — Track state-of-the-art results across ML benchmarks and tasks
- Arxiv Sanity — Browse, search, and filter the latest ML research papers from arXiv
- Distill.pub — Clear, visual explanations of machine learning concepts through interactive articles
- The Gradient — Long-form essays on AI research, policy, and industry trends from Stanford researchers
- Sebastian Raschka’s Blog — In-depth articles on deep learning, LLMs, and research methodology