Sai Aneesh · lhcee3.hashnode.dev · Aug 15, 2024
Stochastic Gradient Descent
Stochastic Gradient Descent (SGD) is a cornerstone algorithm in the realm of machine learning. Its simplicity, efficiency, and effectiveness have made it a go-to choice for optimizing a wide range of models, from linear regression to deep neural netw...
Tags: Deep Learning
Utkal Kumar Das · ukc.hashnode.dev · Jul 23, 2024
Understanding Linear Regression in Machine Learning
Regression in ML is a supervised learning algorithm that computes a relationship between dependent and independent variables. It is most often used to predict an output from multiple possible outputs (in most cases a number). There are ...
Tags: Linear Regression
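A minimal sketch of the idea in this excerpt: fitting a relationship between one independent variable and a numeric output with scikit-learn. The data and variable names below are illustrative, not taken from the article.

```python
# Minimal linear regression example: predict a numeric output from one input.
# The data points here are made up for demonstration.
import numpy as np
from sklearn.linear_model import LinearRegression

X = np.array([[1.0], [2.0], [3.0], [4.0]])   # independent variable
y = np.array([2.1, 4.0, 6.2, 7.9])           # dependent variable

model = LinearRegression().fit(X, y)
print(model.coef_, model.intercept_)          # learned slope and intercept
print(model.predict([[5.0]]))                 # predicted output for a new input
```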
Ojas Arora · ojas1423.hashnode.dev · Jul 20, 2024
Machine Learning Basics: Simple Guide to Regression and Classification for Beginners
Machine learning (ML) is transforming industries by providing systems the ability to automatically learn and improve from experience without being explicitly programmed. This guide will introduce key concepts and techniques that form the backbone of ...
Tags: Machine Learning
Juan Carlos Olamendy · juancolamendy.hashnode.dev · Jul 15, 2024
Backpropagation in Deep Learning: The Key to Optimizing Neural Networks
Have you ever wondered how neural networks learn? Have you ever wondered how your smartphone recognizes your face or how virtual assistants understand your voice? The secret lies in a powerful algorithm called backpropagation. Imagine trying to teach...
Tags: Machine Learning
Rashid Ul Haq · rashid-ul-haq.hashnode.dev · Jul 6, 2024
Gradients: The Building Blocks of Backpropagation in TensorFlow
In a neural network, backpropagation is essential for error minimization. It involves calculating the partial derivatives, or gradients, of the loss function with respect to trainable parameters. Manually computing and implementing these derivatives ...
Tags: Deep Learning Unplugged, TensorFlow
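The excerpt refers to TensorFlow computing these gradients automatically instead of by hand; a small sketch of that mechanism with tf.GradientTape is shown below, using a toy loss and parameter rather than anything from the post.

```python
# Automatic gradient computation in TensorFlow with tf.GradientTape.
# The variable and loss are toy values for illustration only.
import tensorflow as tf

w = tf.Variable(3.0)                 # a trainable parameter
with tf.GradientTape() as tape:
    loss = (w - 1.0) ** 2            # simple quadratic loss

grad = tape.gradient(loss, w)        # d(loss)/dw, computed automatically
print(grad.numpy())                  # 4.0, since d/dw (w - 1)^2 = 2(w - 1)
```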
Vamshi A · vamshi.study · Jun 23, 2024
Understanding Batch and Stochastic Gradient Descent and Normal Equation for Linear Regression
In this article I'll be discussing 3 things: Batch Gradient Descent, Stochastic Gradient Descent, and the Normal Equation. Gradient descent in machine learning is used to find the values of a function's parameters (coefficients), in ...
Tags: stochastic-gradient-descent
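For the third item in that list, the normal equation gives the least-squares coefficients in closed form, $\theta = (X^\top X)^{-1} X^\top y$. A rough NumPy sketch with synthetic data (not the article's own code):

```python
# Closed-form least squares via the normal equation: theta = (X^T X)^{-1} X^T y.
# Synthetic data; a column of ones models the intercept term.
import numpy as np

X = np.column_stack([np.ones(5), np.arange(5, dtype=float)])  # [1, x] design matrix
y = np.array([1.0, 3.1, 4.9, 7.2, 9.0])

theta = np.linalg.solve(X.T @ X, X.T @ y)   # solve the linear system instead of inverting
print(theta)                                 # [intercept, slope]
```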
Vamshi A · vamshi.study · Jun 16, 2024
Step-by-Step Guide to Linear Regression from Scratch
This is beginner-friendly: you don't need any prerequisites, and I'll explain every single line of code and mathematical logic as simply as possible, so jump right in! There are 3 things I'll be covering in this article: Linear Regression, Mat...
Tags: Machine Learning
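A condensed sketch of the "from scratch" approach this post's title describes: fitting a slope and intercept by batch gradient descent on a mean-squared-error loss. The data, learning rate, and iteration count below are illustrative choices, not the article's.

```python
# Linear regression from scratch: batch gradient descent on the MSE loss.
# Data and hyperparameters are chosen for illustration.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([3.0, 5.1, 6.9, 9.2])   # roughly y = 2x + 1

w, b, lr = 0.0, 0.0, 0.01
for _ in range(5000):
    error = (w * x + b) - y
    w -= lr * 2 * np.mean(error * x)  # dMSE/dw
    b -= lr * 2 * np.mean(error)      # dMSE/db

print(w, b)   # approximately the slope and intercept of the data
```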
Sunghoon Kim · ssuhoon.hashnode.dev · Jun 12, 2024
Applying Gradient Descent (경사하강법 활용)
Linear regression analysis: given data made up of n variables, linear regression is the problem of finding the linear model that best represents it. When the number of equations exceeds the number of variables (n > m), the Moore-Penrose pseudoinverse can be used to find the linear regression equation for that linear model. A linear model can also be found using gradient descent. Finding linear regression coefficients with gradient descent: gradient descent can also be used with models other than linear ones...
Tags: 인공지능 기초 (AI Basics), Mathematics
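A brief sketch of the pseudoinverse route mentioned in that excerpt, using np.linalg.pinv on a system with more equations than unknowns; the data is synthetic and not from the post.

```python
# Least-squares linear regression via the Moore-Penrose pseudoinverse.
# X has more rows (equations) than columns (unknowns); data is synthetic.
import numpy as np

X = np.column_stack([np.ones(6), np.linspace(0.0, 5.0, 6)])  # design matrix with intercept column
y = np.array([0.9, 2.1, 2.9, 4.2, 5.1, 5.8])

beta = np.linalg.pinv(X) @ y   # X^+ y minimizes ||X beta - y||
print(beta)                    # [intercept, slope]
```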
Sunghoon Kim · ssuhoon.hashnode.dev · Jun 11, 2024
Gradient Descent Basics (경사하강법 기초)
What is differentiation? Differentiation is a tool for measuring how a function's value changes as a variable moves, and it is the technique used most often in optimization. Taking the difference between the value of the function at a given point $x$ and at $x+h$, the point reached by moving a distance $h$ from $x$, and dividing it by the distance moved $h$ gives what is called the rate of change (slope). The derivative is defined as the limit of this rate of change. In Python, derivatives can be computed with sympy.diff. $$f'(x) = \lim_...
Tags: 인공지능 기초 (AI Basics), Mathematics
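The truncated formula is the standard limit definition of the derivative, $f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}$. A tiny example in that spirit with sympy.diff, on a function chosen here purely for illustration:

```python
# Symbolic differentiation with sympy.diff, compared against the
# limit-of-the-rate-of-change definition described in the excerpt.
import sympy as sp

x, h = sp.symbols('x h')
f = x**2 + 3*x

print(sp.diff(f, x))                               # 2*x + 3
print(sp.limit((f.subs(x, x + h) - f) / h, h, 0))  # same result via the limit definition
```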
Juan Carlos Olamendy · juancolamendy.hashnode.dev · May 29, 2024
Real World ML - Understanding Batch Size. Train Faster and Better Deep Learning Models
Have you ever spent days fine-tuning a deep learning model, only to see no difference in its performance? A couple of days ago, I was helping a friend of mine fine-tune a classifier using a CNN. We spent a couple of hours adjusting every hyper-parame...
Tags: Machine Learning