## Notes

- After the AI winter, AI was revived by more realistic applications (e.g., recommenders), better computers, more data, and somewhat better algorithms
- Reactions to AI split between fear and optimism; we've survived every technological revolution so far
- AI is the next stage of evolution. It’s our duty to create AI
- ML stages: predict, error, learn (each stage is labeled in the gradient descent sketch below)
- An algorithm is the code that describes a machine learning process
- Gradient descent and linear regression are the basic building blocks
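A minimal sketch of those stages: gradient descent fitting a 1-D linear regression. The data, learning rate, and iteration count are illustrative, not from the notes.

```python
import numpy as np

# Toy data: y = 2x + 1 plus noise (fabricated for illustration)
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 100)
y = 2 * x + 1 + rng.normal(0, 0.1, 100)

w, b = 0.0, 0.0  # parameters to learn
lr = 0.1         # learning rate (a hyperparameter)

for _ in range(500):
    y_hat = w * x + b                 # predict
    error = y_hat - y                 # error
    w -= lr * 2 * (error * x).mean()  # learn: step down the gradient of MSE
    b -= lr * 2 * error.mean()

print(w, b)  # should land near 2 and 1
```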
- Logistic regression is a bit of a misnomer: it's a classifier that applies the logistic (sigmoid) function to a linear regression's output
- A logit is the log-odds, not a probability; the sigmoid turns a logit into a probability
- The objective function is the function used in the predict step; in logistic regression it's the sigmoid
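A quick numeric check of the logit/sigmoid relationship (the 0.8 is an arbitrary example value):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

p = 0.8
logit = np.log(p / (1 - p))  # log-odds, about 1.386
print(sigmoid(logit))        # back to 0.8
```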
- Gradient descent descends the error surface toward the point where the error is lowest
- The 3 maths are linear algebra, statistics, and calculus
- Cooking analogy: linear algebra is the mise en place, stats are the recipe or cookbook, and calculus is the actual cooking
- Deep learning is closer to *automating* a mental task
- Deep learning stacks shallow learning algorithms (e.g., logistic regression units, each producing a logit)
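One way to picture the stacking, as a sketch with made-up weights: each unit below is a little logistic regression, and the next layer consumes the previous layer's outputs.

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

x = np.array([0.5, -1.2, 3.0])          # one input example (fabricated)

# Layer 1: three logistic-regression-like units
W1 = np.array([[0.2, -0.4, 0.1],
               [0.7,  0.3, -0.2],
               [-0.5, 0.6, 0.9]])       # made-up weights
h = sigmoid(W1 @ x)                     # each row acts as one shallow unit

# Layer 2: one more logistic unit stacked on top
w2 = np.array([1.0, -1.0, 0.5])
print(sigmoid(w2 @ h))                  # final probability-like output
```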
- GPA is a good analogy for PCA: it compresses many grades into one summary number, the way PCA compresses many features into a few components
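A sketch of that analogy using scikit-learn (assumed available), squeezing four fabricated per-student grades down to one GPA-like number:

```python
import numpy as np
from sklearn.decomposition import PCA

grades = np.array([[3.0, 3.2, 2.8, 3.1],   # fabricated per-course grades
                   [3.9, 3.7, 4.0, 3.8],
                   [2.1, 2.4, 2.0, 2.2]])

summary = PCA(n_components=1).fit_transform(grades)
print(summary)  # one summary component per student
```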
- The kernel trick is like looking at a dataset from a different angle (or through colored goggles) so a better solution becomes visible
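A minimal illustration of the "different angle" idea: these 1-D points can't be split by any single threshold, but lifting x to (x, x²), which a polynomial kernel does implicitly, makes them separable with a line. The data is invented.

```python
import numpy as np

x = np.array([-2, -1, 1, 2])      # 1-D inputs
y = np.array([1, 0, 0, 1])        # classes interleave: no threshold on x works

phi = np.column_stack([x, x**2])  # lift to 2-D: the "different angle"
print(phi)                        # now the line x2 > 2.25 splits the classes
```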
- Markov property: the next state depends only on the current state, not on earlier history
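A tiny two-state Markov chain simulation; the transition probabilities are invented, and the next state is drawn from the current state alone:

```python
import numpy as np

rng = np.random.default_rng(1)
P = np.array([[0.9, 0.1],    # P[i, j] = chance of moving from state i to j
              [0.5, 0.5]])   # invented transition probabilities

state = 0
for _ in range(10):
    state = rng.choice(2, p=P[state])  # uses the current state only
    print(state, end=" ")
```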
- NLP is a good specialization. It’s everywhere.
- 3 levels from lowest to highest: parts, tasks, goals (see the spaCy sketch after this list)
    - Parts are corpora, lexicons, etc.
    - Tasks are named entity recognition and part-of-speech tagging
    - Goals are text classification, sentiment analysis, spell check
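At the task level, if spaCy and its small English model are installed (an assumption, not something the notes specify), POS tagging and NER look like this:

```python
import spacy

nlp = spacy.load("en_core_web_sm")  # assumes the model is downloaded
doc = nlp("Apple is opening a store in Paris.")

for token in doc:                   # task: part-of-speech tagging
    print(token.text, token.pos_)

for ent in doc.ents:                # task: named entity recognition
    print(ent.text, ent.label_)
```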
- A GPU gives a 20-100x performance gain
- Nvidia provides CUDA
- Use pandas to get and munge data, then hand off to NumPy for numeric transformations
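The usual hand-off, sketched with an assumed file name and column names:

```python
import pandas as pd
import numpy as np

df = pd.read_csv("data.csv")                 # hypothetical file
df = df.dropna(subset=["height", "weight"])  # munge: drop incomplete rows

X = df[["height", "weight"]].to_numpy()      # hand off to NumPy
X = (X - X.mean(axis=0)) / X.std(axis=0)     # numeric transformation
```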
- In CNNs, window (kernel) size and stride go hand in hand in filter definitions
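Their standard relationship is the output-size arithmetic (assuming no padding):

```python
def conv_output_size(n, window, stride):
    """Output length of a 1-D convolution over n inputs, no padding."""
    return (n - window) // stride + 1

print(conv_output_size(28, window=3, stride=1))  # 26
print(conv_output_size(28, window=3, stride=2))  # 13
```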
- A hyperparameter is any setting the human chooses rather than one the model learns
- When feature scaling: normalization always maps values to [0, 1]; standardization rescales to mean 0 with unit standard deviation. Standardization is more common
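Both rescalings side by side on a fabricated feature column:

```python
import numpy as np

x = np.array([2.0, 4.0, 6.0, 10.0])               # fabricated feature column

normalized = (x - x.min()) / (x.max() - x.min())  # into [0, 1]
standardized = (x - x.mean()) / x.std()           # mean 0, unit stdev

print(normalized)    # [0.   0.25 0.5  1.  ]
print(standardized)
```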