Stephane Bersier · aiconversations.hashnode.dev · Nov 1, 2024
Reparametrization sensitivity, with ChatGPT
The conversation has been condensed and edited for clarity. User: One issue with vanilla gradient descent is that the gradient direction is sensitive to parametrization. Are there practical methods to avoid this issue? Does natural gradient descent h...
Natural Gradient Descent
Deepak Kumar Mohanty · kumarblog-1.hashnode.dev · Oct 26, 2024
Finding the Best-Fit Line in Linear Regression – Manual Minimization vs. Gradient Descent
When working with linear regression, the goal is to identify the best-fit line that captures the relationship between your input (independent) variable x and output (dependent) variable y. This line, represented by the equation: $$h_{\theta}(x) = \th...
best fit line
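The truncated equation in that excerpt is the standard simple linear regression hypothesis, h_θ(x) = θ_0 + θ_1·x. As a minimal sketch of the comparison the post title describes (manual, closed-form minimization versus gradient descent), the NumPy snippet below fits both ways on toy data; the data, learning rate, and iteration count are illustrative assumptions, not code from the article.

```python
import numpy as np

# Toy data roughly following y = 2x + 1 with noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 5, 50)
y = 2.0 * x + 1.0 + rng.normal(0, 0.3, size=x.shape)

# Gradient descent on the mean squared error for h_theta(x) = theta0 + theta1 * x.
theta0, theta1, lr = 0.0, 0.0, 0.02
for _ in range(5000):
    err = theta0 + theta1 * x - y
    theta0 -= lr * 2.0 * err.mean()          # d(MSE)/d(theta0)
    theta1 -= lr * 2.0 * (err * x).mean()    # d(MSE)/d(theta1)

# "Manual" minimization: closed-form least-squares fit for comparison.
theta1_cf, theta0_cf = np.polyfit(x, y, deg=1)

print(f"gradient descent: theta0={theta0:.3f}, theta1={theta1:.3f}")
print(f"closed form:      theta0={theta0_cf:.3f}, theta1={theta1_cf:.3f}")
```

Both routes should land on essentially the same intercept and slope; the gradient-descent version simply gets there iteratively.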
TJ Gokken · tjgokken.com · Oct 23, 2024
From Theory to Reality: Understanding Machine Learning with ML.NET
Machine learning uses algorithms to learn. Great, but how? How does it use these algorithms and which algorithm does what? Let’s break down some of the core concepts that power these intelligent algorithms, starting with one of the most important: Gr...
35 reads · Machine Learning Fundamentals · Machine Learning
Gerard Sans · ai-cosmos.hashnode.dev · Oct 7, 2024
Beyond Gradient Descent: Theoretical Analysis
Audience: ML Engineers, NLP Practitioners, AI Researchers. Keywords: NLP, Transformers, Gradient Descent, Optimisation, Euclidean Geometry, Non-Euclidean Geometry, Hyperbolic Geometry, Elliptic Geometry, High-Dimensional Data, Discrete Data, Symbolic ...
60 reads · nlp transformers
Sai Aneesh · lhcee3.hashnode.dev · Aug 15, 2024
Stochastic Gradient Descent
Stochastic Gradient Descent (SGD) is a cornerstone algorithm in the realm of machine learning. Its simplicity, efficiency, and effectiveness have made it a go-to choice for optimizing a wide range of models, from linear regression to deep neural netw...
Deep Learning
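For readers who have not seen SGD in code, here is a rough sketch of the idea the post's title refers to: parameters are updated from one randomly chosen example at a time rather than from the full batch. The linear model, learning rate, and synthetic data below are my own illustrative assumptions, not code from the article.

```python
import numpy as np

# Synthetic linear data with known weights.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + rng.normal(0, 0.1, size=200)

w = np.zeros(3)
lr = 0.01
for epoch in range(20):
    for i in rng.permutation(len(X)):           # visit examples in random order
        grad = 2.0 * (X[i] @ w - y[i]) * X[i]   # gradient of the squared error on one sample
        w -= lr * grad                          # update immediately: the "stochastic" part

print(w)  # should approach [1.5, -2.0, 0.5]
```

The per-sample updates are noisy, but they are cheap, which is why SGD scales to datasets far too large for full-batch gradients.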
Utkal Kumar Das · ukc.hashnode.dev · Jul 23, 2024
Understanding Linear Regression in Machine Learning
Regression in ML is a supervised learning algorithm that computes a relationship between dependent and independent variables. It is most often used to predict an output (in most cases a number) from multiple possible outputs. There are ...
71 reads · Linear Regression
Ojas Arora · ojas1423.hashnode.dev · Jul 20, 2024
Machine Learning Basics: Simple Guide to Regression and Classification for Beginners
Machine learning (ML) is transforming industries by giving systems the ability to automatically learn and improve from experience without being explicitly programmed. This guide will introduce key concepts and techniques that form the backbone of ...
10 likes · Machine Learning
Juan Carlos Olamendy · juancolamendy.hashnode.dev · Jul 15, 2024
Backpropagation in Deep Learning: The Key to Optimizing Neural Networks
Have you ever wondered how neural networks learn? Have you ever wondered how your smartphone recognizes your face or how virtual assistants understand your voice? The secret lies in a powerful algorithm called backpropagation. Imagine trying to teach...
Machine Learning
Rashid Ul Haq · rashid-ul-haq.hashnode.dev · Jul 6, 2024
Gradients: The Building Blocks of Backpropagation in TensorFlow
In a neural network, backpropagation is essential for error minimization. It involves calculating the partial derivatives, or gradients, of the loss function with respect to trainable parameters. Manually computing and implementing these derivatives ...
10 likes · Deep Learning Unplugged · TensorFlow
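The mechanism that excerpt alludes to is TensorFlow's automatic differentiation via tf.GradientTape. Here is a minimal sketch of recording a computation and asking for the gradients of a loss with respect to trainable variables; the tiny linear model and mean-squared-error loss are my own illustrative choices, not the article's code.

```python
import tensorflow as tf

# One trainable weight and bias for a toy linear model.
w = tf.Variable(3.0)
b = tf.Variable(1.0)
x = tf.constant([1.0, 2.0, 3.0])
y_true = tf.constant([2.0, 4.0, 6.0])

# Record the forward pass so the tape can differentiate through it.
with tf.GradientTape() as tape:
    y_pred = w * x + b
    loss = tf.reduce_mean(tf.square(y_pred - y_true))  # mean squared error

# Partial derivatives of the loss with respect to the trainable parameters.
dw, db = tape.gradient(loss, [w, b])
print(dw.numpy(), db.numpy())
```

In a training loop these gradients would be passed to an optimizer (e.g. via apply_gradients) instead of being printed.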
Vamshi A · vamshi.co · Jun 23, 2024
Understanding Batch and Stochastic Gradient Descent and Normal Equation for Linear Regression
In this article I'll discuss three things: Batch Gradient Descent, Stochastic Gradient Descent, and the Normal Equation. Gradient Descent: gradient descent in machine learning is used to find the values of a function's parameters (coefficients), in ...
stochastic-gradient-descent
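As a sketch of the comparison that post sets up, the snippet below fits the same linear model with batch gradient descent (every update uses the whole dataset) and with the normal equation, θ = (XᵀX)⁻¹Xᵀy. The data, learning rate, and iteration count are illustrative assumptions, not taken from the article.

```python
import numpy as np

# Design matrix with a bias column plus one feature; targets from known coefficients.
rng = np.random.default_rng(2)
X = np.column_stack([np.ones(100), rng.uniform(0, 5, 100)])
theta_true = np.array([4.0, 3.0])
y = X @ theta_true + rng.normal(0, 0.5, size=100)

# Batch gradient descent: each step uses the gradient over the full dataset.
theta = np.zeros(2)
lr = 0.01
for _ in range(5000):
    grad = (2.0 / len(y)) * X.T @ (X @ theta - y)
    theta -= lr * grad

# Normal equation: closed-form solution (solving the system instead of inverting explicitly).
theta_ne = np.linalg.solve(X.T @ X, X.T @ y)

print("batch GD:       ", theta)
print("normal equation:", theta_ne)
```

The two estimates should agree closely; the normal equation is exact but scales poorly with many features, while gradient descent trades exactness for cheaper per-step cost.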