Maths under the hood of ML/Neural Networks/LLMs

Girish Kurup
Aug 3, 2024

--

All the math fundamentals for AI (including a backpropagation primer)

Here is an overview of the math underlying AI; a short, illustrative Python sketch for each topic follows the list.

  • Linear Algebra (Vectors, Plotting Vectors, Norms)
  • Differential Calculus (Differentiating a Function, Partial Derivatives, Gradients, Jacobians, Hessians)
  • Probability Theory (Random Variables, Central Limit Theorem, Expectation, Variance, Conditional Probability)
  • Probability Distributions and their PDFs/CDFs (Bernoulli, Gaussian, Poisson, Uniform, T-distribution, Chi-squared, Exponential)
  • Partial Derivatives of Standard Layers/Loss Functions (Sigmoid, tanh, ReLU, Logistic Regression, Support Vector Machines/Hinge Loss, Convolutional Layers, Batchnorm: Staged Computation, Batchnorm: Gradient Expression)
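To make the linear algebra topic concrete, here is a minimal NumPy sketch of vectors and norms; the vector values are arbitrary and chosen only for illustration.

```python
import numpy as np

# Two vectors in R^3 (values made up for illustration).
u = np.array([1.0, 2.0, 2.0])
v = np.array([3.0, 0.0, 4.0])

# Vector addition and scalar multiplication.
w = 2.0 * u + v                  # [5., 4., 8.]

# Dot product and the norms mentioned above.
dot = np.dot(u, v)               # 1*3 + 2*0 + 2*4 = 11
l2 = np.linalg.norm(u)           # Euclidean (L2) norm: sqrt(1 + 4 + 4) = 3
l1 = np.linalg.norm(u, ord=1)    # L1 norm: |1| + |2| + |2| = 5

print(w, dot, l2, l1)
```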
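For the differential calculus topic, a sketch that approximates partial derivatives (and hence the gradient) with central differences; the toy function f(x, y) = x²y is an assumption made purely for illustration, with known gradient (2xy, x²).

```python
import numpy as np

def f(x):
    # Toy scalar function f(x, y) = x^2 * y; its true gradient is
    # (2xy, x^2), so at (3, 2) we expect (12, 9).
    return x[0] ** 2 * x[1]

def numerical_gradient(f, x, h=1e-6):
    # Central-difference approximation of each partial derivative;
    # stacking them gives the gradient vector.
    grad = np.zeros_like(x)
    for i in range(len(x)):
        step = np.zeros_like(x)
        step[i] = h
        grad[i] = (f(x + step) - f(x - step)) / (2 * h)
    return grad

print(numerical_gradient(f, np.array([3.0, 2.0])))  # ~[12., 9.]
```

The same trick, applied row by row to a vector-valued function, approximates a Jacobian; applied to the gradient itself, it approximates a Hessian.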
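For probability theory, a sketch illustrating the Central Limit Theorem by simulation: means of many exponential draws (a deliberately non-Gaussian choice) concentrate around the expectation, with variance shrinking as 1/n.

```python
import numpy as np

rng = np.random.default_rng(0)

# Means of n draws from Exp(1), repeated over many trials. The CLT says
# these sample means are approximately Gaussian for large n.
n, trials = 1_000, 5_000
sample_means = rng.exponential(scale=1.0, size=(trials, n)).mean(axis=1)

# Exp(1) has expectation 1 and variance 1, so the sample means should be
# roughly N(1, 1/n).
print(sample_means.mean())   # ~1.0
print(sample_means.var())    # ~1/1000 = 0.001
```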
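For the distributions, a sketch of evaluating PDFs/CDFs (and PMFs for the discrete cases), assuming SciPy is available; the parameter values are arbitrary.

```python
from scipy import stats

x = 1.0

# Density and distribution-function values for a few of the
# distributions listed above.
print(stats.norm.pdf(x, loc=0, scale=1))   # Gaussian PDF at 1 (~0.2420)
print(stats.norm.cdf(x))                   # P(X <= 1) for N(0, 1) (~0.8413)
print(stats.expon.pdf(x))                  # Exponential(rate=1) PDF (e^-1)
print(stats.t.cdf(x, df=5))                # Student-t CDF with 5 dof
print(stats.chi2.cdf(x, df=3))             # Chi-squared CDF with 3 dof
print(stats.poisson.pmf(2, mu=3))          # Poisson is discrete: PMF at k=2
print(stats.bernoulli.pmf(1, p=0.7))       # Bernoulli PMF: P(X = 1) = 0.7
```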
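Finally, for the backpropagation primer, a sketch of the closed-form derivatives of sigmoid, tanh, and ReLU, sanity-checked against a finite-difference approximation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Closed-form derivatives used in backpropagation:
#   sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x))
#   tanh'(x)    = 1 - tanh(x)^2
#   relu'(x)    = 1 if x > 0 else 0
def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1.0 - s)

def tanh_grad(x):
    return 1.0 - np.tanh(x) ** 2

def relu_grad(x):
    return (x > 0).astype(float)

# Sanity check against a central difference at x = 0.5: the two numbers
# printed on each line should agree to several decimal places.
x, h = 0.5, 1e-6
print(sigmoid_grad(x), (sigmoid(x + h) - sigmoid(x - h)) / (2 * h))
print(tanh_grad(x), (np.tanh(x + h) - np.tanh(x - h)) / (2 * h))
```

These are exactly the local derivatives that get multiplied together along the chain rule during a backward pass.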

--

Written by Girish Kurup

Passionate about writing. I am a technology and data science enthusiast. Reach me at girishkurup21@gmail.com.
