
Learn AI & Machine Learning

Coming Soon

Neural networks, LLMs, and intelligent applications

AI Machine Learning LLMs PyTorch RAG

Build intelligent systems that learn from data. From classical machine learning algorithms through deep neural networks to modern LLM applications — understand how AI works and build with it.

This course covers ML fundamentals, PyTorch, transformer architectures, RAG pipelines, and deploying AI-powered applications in production.

Start Here — Learning Roadmap

A suggested path from zero to mastery. Follow these steps in order:

  1. Learn Python and math foundations — Ensure solid Python skills plus basics of linear algebra, calculus, and probability/statistics
  2. Understand classical ML algorithms — Learn linear/logistic regression, decision trees, SVMs, and k-means clustering with scikit-learn
  3. Master model evaluation — Understand train/test splits, cross-validation, precision/recall, ROC curves, and overfitting vs underfitting
  4. Build neural networks from scratch — Implement forward/backward propagation, gradient descent, and loss functions to understand the fundamentals
  5. Learn PyTorch — Build models with tensors, autograd, nn.Module, DataLoaders, and training loops in the dominant DL framework
  6. Study deep learning architectures — Understand CNNs for vision, RNNs/LSTMs for sequences, and the attention mechanism
  7. Master transformers and LLMs — Learn self-attention, positional encoding, tokenization, and how GPT/BERT architectures work
  8. Fine-tune and prompt-engineer — Adapt pre-trained models with LoRA/QLoRA, write effective prompts, and use few-shot learning
  9. Build RAG applications — Create retrieval-augmented generation pipelines with embeddings, vector databases, and LangChain/LlamaIndex
  10. Deploy AI in production — Serve models with vLLM/TGI, build APIs, handle inference at scale, and monitor model performance
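The evaluation concepts in step 3 can be made concrete by computing precision, recall, and F1 by hand from a set of predictions. This is a minimal pure-Python sketch with hypothetical toy labels; in practice you would use library implementations such as those in scikit-learn, but writing the counts out once makes the metrics much harder to confuse.

```python
# Precision, recall, and F1 computed from raw true/predicted labels.
def precision_recall_f1(y_true, y_pred, positive=1):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0  # of predicted positives, how many were right
    recall = tp / (tp + fn) if tp + fn else 0.0     # of actual positives, how many were found
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Toy data: 4 actual positives, 4 actual negatives.
y_true = [1, 1, 1, 0, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 1, 0, 1, 0]
p, r, f = precision_recall_f1(y_true, y_pred)  # each 0.75 here
```

Note that precision and recall trade off against each other: lowering a classifier's decision threshold typically raises recall while lowering precision, which is exactly what a ROC or precision-recall curve visualizes.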
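Step 4's "build it from scratch" advice can be sketched in a few dozen lines of NumPy: a one-hidden-layer network trained on XOR with hand-written forward and backward passes. The architecture, learning rate, and iteration count here are illustrative choices, not a recommendation.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

lr, losses = 0.5, []
for _ in range(2000):
    # Forward pass: hidden layer (tanh), then sigmoid output.
    h = np.tanh(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(np.mean((out - y) ** 2))

    # Backward pass: chain rule applied by hand, layer by layer.
    d_out = 2 * (out - y) / len(X) * out * (1 - out)
    dW2 = h.T @ d_out; db2 = d_out.sum(axis=0)
    d_h = d_out @ W2.T * (1 - h ** 2)   # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ d_h; db1 = d_h.sum(axis=0)

    # Gradient descent step.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

Once each gradient here is understood, PyTorch's `autograd` (step 5) is recognizable as automating exactly this backward pass.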
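The self-attention mechanism named in step 7 fits in a handful of NumPy lines: project the inputs to queries, keys, and values, score every token pair, softmax the scores, and take a weighted mix of the values. The shapes below (4 tokens, dimension 8) and random projection matrices are toy assumptions for illustration; real transformers add multiple heads, masking, and learned weights.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])          # scaled pairwise similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over each row
    return weights @ V, weights                      # weighted mix of values

rng = np.random.default_rng(42)
X = rng.normal(size=(4, 8))                          # 4 toy token embeddings
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out, weights = self_attention(X, Wq, Wk, Wv)
```

Each row of `weights` sums to 1, so every output token is a convex combination of all the value vectors: that is the sense in which each token "attends" to the whole sequence.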
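The retrieval half of the RAG pipelines in step 9 reduces to: embed the query, rank stored documents by cosine similarity, and hand the top hits to the LLM as context. The "embeddings" below are hand-made 3-dimensional toy vectors purely for illustration; a real pipeline would get them from an embedding model and store them in a vector database behind a framework like LangChain or LlamaIndex.

```python
import math

def cosine(a, b):
    """Cosine similarity: angle between vectors, ignoring magnitude."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical document store: name -> toy embedding vector.
docs = {
    "pytorch_intro":  [0.9, 0.1, 0.0],
    "rag_pipelines":  [0.1, 0.9, 0.2],
    "model_serving":  [0.0, 0.2, 0.9],
}

def retrieve(query_vec, k=2):
    ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
    return ranked[:k]

top = retrieve([0.2, 0.8, 0.1])  # a query vector "about" RAG
```

The retrieved document names (or their text) would then be interpolated into the LLM prompt, which is what grounds the model's answer in your own data rather than its training set.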

Official & Core Documentation

GitHub Awesome Lists & Curated Collections

Interactive Courses & Hands-On Platforms

Free Courses

University & MOOC Courses

Practice & Challenges

  • Kaggle Competitions — Real ML challenges with datasets, notebooks, and leaderboards (All levels)
  • LLM Course Notebooks — Free Colab notebooks for fine-tuning, quantization, and RAG (Intermediate)
  • Hugging Face Spaces — Deploy and share ML demos with Gradio or Streamlit for free (Intermediate)

Video Courses & YouTube Channels

Structured Course Playlists

Individual Creators & Channels

  • Yannic Kilcher — ML paper reviews and in-depth explanations of cutting-edge research (Advanced)
  • Two Minute Papers — Accessible AI research summaries with visual demonstrations (Beginner)
  • AI Explained — Clear breakdowns of the latest AI developments, papers, and benchmarks (Intermediate)
  • StatQuest with Josh Starmer — ML and statistics concepts explained simply with animations (Beginner)
  • Jeremy Howard (fast.ai) — Practical deep learning insights from the fast.ai creator (Intermediate)

Books & Long-Form Reading

Free Online Books

Essential Paid Books

Community, Practice & News

Forums & Discussion

  • Hugging Face Community — Forum for Transformers, model sharing, and ML discussion
  • r/MachineLearning — Research papers, discussions, and industry news (2.8M+ members)
  • r/LocalLLaMA — Active community for running and fine-tuning open-source LLMs locally
  • MLOps Community — Community for ML engineering, deployment, and production best practices

Newsletters & Blogs

  • The Batch (DeepLearning.AI) — Andrew Ng’s weekly AI newsletter with curated news and original articles
  • Ahead of AI — Sebastian Raschka’s newsletter on ML, deep learning, and AI research
  • Latent Space — Newsletter, podcast, and community for AI engineers covering LLMs and tooling
  • TLDR AI — Daily newsletter delivering AI, ML, and data science updates (620k+ subscribers)
  • The Rundown AI — Daily AI newsletter covering developments, trends, and practical applications (750k+ subscribers)

Ecosystem Resources

  • Papers With Code — ML papers with code implementations, benchmarks, and state-of-the-art leaderboards
  • Kaggle Discussion Forums — Competition strategies, data science Q&A, and notebook sharing
  • Arxiv Sanity — Browse, search, and filter ML research papers from arXiv

Tools & Environments

  • Google Colab — Free cloud notebooks with GPU/TPU access for training models
  • Hugging Face Model Hub — Repository of 500k+ pre-trained models ready for inference and fine-tuning
  • Weights & Biases — ML experiment tracking, visualization, and model registry (free for individuals)
  • Ollama — Run open-source LLMs locally on your machine with a simple CLI
  • LM Studio — Desktop app for downloading and running LLMs locally with a chat interface
  • vLLM — High-throughput LLM serving engine with PagedAttention for production inference
  • Gradio — Build and share ML demos with a web interface in a few lines of Python
  • Lightning AI — Platform for building, training, and deploying AI models with free GPU access

Cheat Sheets & Quick References

Research & Staying Current

  • Papers With Code — State of the Art — Track state-of-the-art results across ML benchmarks and tasks
  • Arxiv Sanity — Browse, search, and filter the latest ML research papers from arXiv
  • Distill.pub — Clear, visual explanations of machine learning concepts through interactive articles
  • The Gradient — Long-form essays on AI research, policy, and industry trends from Stanford researchers
  • Sebastian Raschka’s Blog — In-depth articles on deep learning, LLMs, and research methodology