
MLG 007 Logistic Regression
Try a walking desk to stay healthy while you study or work!
Full notes at ocdevel.com/mlg/7. See Andrew Ng Week 3 Lecture Notes
Overview
- Logistic Function: A sigmoid function that transforms the linear regression output (the logit) into a probability between 0 and 1.
- Binary Classification: Logistic regression deals with binary outcomes, assigning either 0 or 1 based on a threshold (e.g., 0.5).
- Error Function: Uses the negative log likelihood (log loss) to measure how far predicted probabilities are from the true labels.
- Gradient Descent: Optimizes the model by adjusting weights to minimize the error function (a minimal sketch follows this list).
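A minimal NumPy sketch of those pieces, assuming vectorized inputs; the function names (`sigmoid`, `log_loss`, `gradient_step`) are illustrative, not from the episode:

```python
import numpy as np

def sigmoid(z):
    """Squash a logit (linear-regression output) into a probability in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def log_loss(y, y_hat):
    """Negative log likelihood of the labels, averaged over the examples."""
    eps = 1e-12                        # guard against log(0)
    y_hat = np.clip(y_hat, eps, 1 - eps)
    return -np.mean(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))

def gradient_step(X, y, w, learning_rate=0.1):
    """One gradient-descent update of the weights for logistic regression."""
    y_hat = sigmoid(X @ w)             # predicted probabilities
    grad = X.T @ (y_hat - y) / len(y)  # gradient of the log loss w.r.t. w
    return w - learning_rate * grad
```

Repeating `gradient_step` until the loss stops improving is the whole training loop.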
- Classification: Predicts a discrete label (e.g., a cat or dog).
- Regression: Predicts a continuous outcome (e.g., house price).
- Train on a dataset of house features, with labeled data, to predict whether a house is 'expensive'.
- Through training and gradient descent, the model learns to categorize houses as 0 (not expensive) or 1 (expensive); see the sketch below.
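A hypothetical version of that example using scikit-learn; the feature values and labels below are invented for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Made-up labeled data: [square_feet, bedrooms] -> 1 (expensive) or 0 (not).
X = np.array([[800, 2], [1200, 3], [2500, 4], [3100, 5], [900, 2], [2800, 4]])
y = np.array([0, 0, 1, 1, 0, 1])

model = LogisticRegression(max_iter=1000)
model.fit(X, y)                          # training runs gradient-based optimization

house = np.array([[2000, 3]])
print(model.predict_proba(house)[:, 1])  # P(expensive), the sigmoid output
print(model.predict(house))              # thresholded at 0.5 -> 0 or 1
```

`predict_proba` exposes the sigmoid output; `predict` applies the 0.5 threshold to produce the discrete 0/1 label.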
- Neurons in Neural Networks: Act as building blocks; logistic regression is used to create neurons for more complex models like neural networks.
- Composable Functions: Demonstrates the compositional nature of machine learning algorithms, where functions are built on other functions (e.g., logistic built on linear; see the sketch below).
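To make the composition concrete, here is a small sketch (the weights are arbitrary) of a neuron as sigmoid composed on linear, and of neurons composed into a tiny network:

```python
import numpy as np

def linear(x, w, b):
    """The linear-regression core: weighted sum of inputs plus a bias."""
    return np.dot(w, x) + b

def neuron(x, w, b):
    """Logistic regression as a neuron: sigmoid composed on top of linear."""
    return 1.0 / (1.0 + np.exp(-linear(x, w, b)))

# Two neurons feed a third: functions built on functions.
x = np.array([0.5, -1.2])
h1 = neuron(x, np.array([0.8, -0.4]), 0.1)
h2 = neuron(x, np.array([-0.3, 0.9]), -0.2)
out = neuron(np.array([h1, h2]), np.array([1.5, -1.1]), 0.0)
print(out)  # a probability from the composed, neural-network-style model
```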