WSP 363 — Introduction to Mathematical Optimization (ICME Workshop)
Big data has changed the way we work, live, and play. Data science—developing and testing models and algorithms—helps us gain knowledge for ourselves and provide insights to others.
Introduction to Mathematical Optimization is one of ten workshops included in Fundamentals of Data Science, a series of one-day workshops offered by the Stanford Institute for Computational and Mathematical Engineering (ICME). Fundamentals of Data Science provides an introduction to multiple aspects of data science for those who are new to the field and those seeking to broaden their education and skills in data science. Students can sign up for one workshop or for several throughout the week. Students who complete four workshops will qualify for the Stanford ICME Fundamentals of Data Science Summer Workshops Certificate of Completion.
These workshops are not eligible for tuition discounts through Stanford Continuing Studies, but ICME offers discounts for eligible affiliates and partners. See below for details:
Stanford staff and full-time Stanford students: You may be eligible for a tuition discount if you register directly through ICME. The tuition for Stanford staff is $100 and the tuition for full-time Stanford students is $75. For more information and to register with these discounts, visit https://sto.stanfordtickets.org/icme2020/homepage.
ICME partners and affiliates qualify for a discount. If you think you qualify and have not received a discount code separately, email firstname.lastname@example.org. You will need to register through ICME's website.
Workshop: Introduction to Mathematical Optimization
Instructor: Kevin Carlberg, AI Research Science Manager, Facebook Reality Labs; Affiliate Associate Professor of Applied Mathematics and Mechanical Engineering, University of Washington
Mathematical optimization underpins many applications in science and engineering, as it provides a set of formal tools to compute the ‘best’ action, design, control, or model from a set of possibilities. In data science and machine learning, mathematical optimization is the engine of model fitting. This workshop will provide an overview of the key elements of this topic (unconstrained optimization, constrained optimization, convex optimization, and optimization for model fitting), and will have a practical focus, with participants formulating and solving optimization problems early and often using standard modeling languages and solvers. By introducing common models from machine learning and other fields, this workshop aims to make participants comfortable with optimization modeling so that they may use it for rapid prototyping and experimentation in their own work.
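To give a flavor of formulating and solving a constrained optimization problem with a standard solver (this is an illustrative sketch, not material from the workshop itself), here is a small linear program solved with SciPy: maximize x + 2y subject to x + y ≤ 4, x + 3y ≤ 6, and x, y ≥ 0. Since linprog minimizes, we negate the objective.

```python
from scipy.optimize import linprog

# Maximize x + 2y  ->  minimize -(x + 2y)
c = [-1, -2]

# Inequality constraints A_ub @ [x, y] <= b_ub:
#   x +  y <= 4
#   x + 3y <= 6
A_ub = [[1, 1], [1, 3]]
b_ub = [4, 6]

# Both variables nonnegative
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x)  # optimal point (3, 1), objective value 5
```

The solver finds the optimum at the intersection of the two constraint lines, illustrating the general workflow of the workshop: write down variables, objective, and constraints, then hand the problem to an off-the-shelf solver.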
Topics to be discussed in this workshop include: formulating optimization problems; fundamentals of constrained and unconstrained optimization; convex optimization; optimization methods for model fitting in machine learning; optimization in Python using SciPy and CVXPY; and in-depth Jupyter Notebook examples from machine learning, statistics, and other fields.
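As a hedged sketch of what "optimization for model fitting" looks like in practice (an assumed example, not taken from the workshop notebooks), the following fits a linear model by minimizing the sum of squared residuals with scipy.optimize.minimize:

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic data: y = X @ true_w plus a little noise
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
true_w = np.array([2.0, -1.0])
y = X @ true_w + 0.01 * rng.normal(size=50)

def loss(w):
    """Sum of squared residuals of the linear model."""
    r = X @ w - y
    return r @ r

# Unconstrained convex problem; BFGS converges quickly here
res = minimize(loss, x0=np.zeros(2), method="BFGS")
print(res.x)  # approximately [2, -1]
```

The same problem has a closed-form least-squares solution, but casting model fitting as "minimize a loss over parameters" is exactly the formulation that generalizes to the machine learning models covered in the workshop.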
Prerequisite: Students should be comfortable with linear algebra, differential multivariable calculus, and basic probability and statistics. Experience with Python will be helpful, but not required.
Please note: Although the enrollment limit for this workshop is set to 30 Continuing Studies students, this course is designed for the entire Stanford community, and enrolled Continuing Studies students will be joined in the classroom by Stanford graduate and undergraduate students. Students should expect a large class.