11 OCT 2017 Step by Step @ Metis (San Francisco, CA)
21 SEP 2016 Forms (w/ Carolyn Strauss) @ Conference on Complex Systems (Amsterdam, NL)
9 AUG 2016 Form and Movement in Complex Systems @ PyLadies (San Francisco, CA) video
4 JUN 2016 On People, Cubes and Snacks @ AlterConf (San Francisco, CA) video
STEP BY STEP: When and how to use stochastic optimization
- Intro to stochastic optimization
- In-depth look at Stochastic Gradient Descent, Particle Swarm Optimization and Simulated Annealing
- Implementation and optimization walkthrough in Python
- Industry applications
You’ve got your model up and running, and now you’d like to tune your hyperparameters. If you’ve gotten to this point in the process, it means you’ve successfully cleaned your data, engineered your features, coded up your model, identified your objective function, and are now looking to optimize. You could cancel your evening plans to manually explore all hyperparameter combinations, or programmatically iterate through them with something like grid search, but you want to try something else. Something that will produce impressive results in a fraction of the time. Enter stochastic optimization.
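To make the contrast concrete, here is a minimal sketch of stochastic hyperparameter search: instead of exhaustively sweeping a grid, we randomly sample combinations and keep the best one. The objective function and hyperparameter names below are hypothetical stand-ins for whatever your model's validation loss actually is.

```python
import random

# Hypothetical objective: validation loss as a function of two
# hyperparameters (learning rate and regularization strength).
# In practice this would train and evaluate your model.
def validation_loss(lr, reg):
    return (lr - 0.1) ** 2 + (reg - 0.01) ** 2

def random_search(n_trials=200, seed=0):
    """Sample hyperparameter combinations stochastically and keep the best."""
    rng = random.Random(seed)
    best_params, best_loss = None, float("inf")
    for _ in range(n_trials):
        # Sample log-uniformly over [1e-4, 1] for each hyperparameter.
        lr = 10 ** rng.uniform(-4, 0)
        reg = 10 ** rng.uniform(-4, 0)
        loss = validation_loss(lr, reg)
        if loss < best_loss:
            best_params, best_loss = (lr, reg), loss
    return best_params, best_loss

params, loss = random_search()
```

The key trade-off: a grid of 200 points pins you to a fixed resolution per axis, while 200 random samples explore each hyperparameter at 200 distinct values, which tends to matter when only a few hyperparameters dominate.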
This presentation will build on what you already know about parameter optimization to dive deeper into the world of stochastic optimization. In particular, we’ll explore stochastic gradient descent, simulated annealing, and particle swarm optimization - cultivating intuition on how these algorithms work, when they can be applied, what their underlying topologies look like, and how you can get the best performance out of them. If possible, bring models you’re working on, so you can relate the content directly to what you’re building. I’ll be available afterwards (and by e-mail) to ensure you have what you need to implement the optimizers we discuss. Code examples will be presented in Python, but I’ll also provide libraries and resources in R.
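As a taste of what the talk covers, here is a sketch of one of the three algorithms, simulated annealing, on a hypothetical toy objective (not the talk's actual code): it sometimes accepts worse moves with probability exp(-delta / temperature), cooling over time so it can escape local minima early on and settle later.

```python
import math
import random

# Toy multimodal objective (an assumption for illustration):
# x^2 gives a global bowl, 10*sin(3x) adds local minima.
def objective(x):
    return x ** 2 + 10 * math.sin(3 * x)

def simulated_annealing(x0=5.0, temp=10.0, cooling=0.99, steps=2000, seed=0):
    rng = random.Random(seed)
    x, fx = x0, objective(x0)
    best_x, best_fx = x, fx
    for _ in range(steps):
        candidate = x + rng.gauss(0, 0.5)   # propose a random neighbor
        fc = objective(candidate)
        delta = fc - fx
        # Always accept improvements; accept worse moves with a
        # probability that shrinks as the temperature cools.
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            x, fx = candidate, fc
            if fx < best_fx:
                best_x, best_fx = x, fx
        temp *= cooling                      # geometric cooling schedule
    return best_x, best_fx

best_x, best_fx = simulated_annealing()
```

The cooling rate and neighbor step size are the knobs that matter most here: cool too fast and the search freezes in a local minimum, too slowly and it wanders without converging.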