Search Results

Video
Before the advent of computers around 1950, optimization centered either on small-dimensional problems solved by looking at zeroes of first derivatives and signs of second derivatives, or on infinite-dimensional problems about curves and surfaces. In both cases, "variations" were employed to understand how a local solution might be characterized. Computers changed the picture by opening the possibility of solving large-scale problems involving inequalities, instead of only equations. Inequalities had to be recognized as important because the decisions to be optimized were constrained by the need to respect many upper or lower bounds on their feasibility. A new kind of mathematical analysis, beyond traditional calculus, had to be developed to address these needs. It built first on the convexity of sets and functions, but went on to remarkably broad and successful concepts of variational geometry, subgradients, subderivatives, and variational convergence that reach well beyond convexity. This talk will explain these revolutionary developments and why they were essential.
Event date: 1/11/2022
Speaker: Prof. Terry Rockafellar (University of Washington)
Hosted by: Department of Applied Mathematics
 Subjects:
 Mathematics and Statistics
 Keywords:
 Mathematical optimization; Computer science -- Mathematics; Convex sets; Convex functions
 Resource Type:
 Video
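The subgradient concept named in the abstract above can be illustrated with a minimal sketch (illustrative only, not from the talk): subgradient descent on the convex but nondifferentiable function f(x) = |x|, where ordinary calculus fails at the kink.

```python
# Subgradient descent on the convex, nondifferentiable f(x) = |x|.
# At x = 0 the derivative does not exist, but any g in [-1, 1] is a
# subgradient: f(y) >= f(x) + g * (y - x) for all y.

def subgradient_abs(x):
    """Return one subgradient of |x| at x."""
    if x > 0:
        return 1.0
    if x < 0:
        return -1.0
    return 0.0  # any value in [-1, 1] is valid at the kink


def subgradient_descent(x0, steps=100):
    x = x0
    for k in range(1, steps + 1):
        g = subgradient_abs(x)
        x -= g / k  # diminishing step size, standard for subgradient methods
    return x


x_star = subgradient_descent(5.0)  # approaches the minimizer x = 0
```

The diminishing step size 1/k is what makes the method converge despite the kink; a fixed step would oscillate around the minimizer forever.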

Video
Adaptive computation is of great importance in numerical simulations. The ideas behind adaptive computation date back to the adaptive finite element methods of the 1970s. In this talk, we shall first review some recent developments in adaptive methods along with applications. Then, we will propose a deep adaptive sampling method for solving PDEs, where deep neural networks are utilized to approximate the solutions. In particular, we propose failure-informed PINNs (FI-PINNs), which can adaptively refine the training set with the goal of reducing the failure probability. Compared with the neural network approximation obtained with uniformly distributed collocation points, the proposed algorithms can significantly improve the accuracy, especially for low-regularity and high-dimensional problems.
Event date: 18/10/2022
Speaker: Prof. Tao Tang (Beijing Normal University-Hong Kong Baptist University United International College)
Hosted by: Department of Applied Mathematics
 Subjects:
 Mathematics and Statistics
 Keywords:
 Adaptive computing systems; Mathematical models; Sampling (Statistics); Differential equations, Partial -- Numerical solutions
 Resource Type:
 Video
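The residual-driven refinement idea described in the abstract above can be sketched in a few lines (a toy, with a made-up residual function standing in for the PDE residual of a trained network): propose candidate collocation points, rank them by residual, and add the worst offenders to the training set.

```python
import random

# Toy residual-based adaptive sampling on [0, 1]: keep the candidate
# collocation points where the (hypothetical) PDE residual is largest.
# `residual` is a stand-in for |N[u_theta](x)| from a trained network;
# it is made to peak near x = 0.9 to mimic a low-regularity region.

def residual(x):
    return 1.0 / (0.01 + (x - 0.9) ** 2)  # large near x = 0.9


def adaptive_refine(train_set, n_candidates=1000, n_keep=50, seed=0):
    rng = random.Random(seed)
    candidates = [rng.random() for _ in range(n_candidates)]
    # rank candidates by residual and append the worst offenders
    worst = sorted(candidates, key=residual, reverse=True)[:n_keep]
    return train_set + worst


refined = adaptive_refine([])  # new points cluster near the residual peak
```

The refined set concentrates where the approximation is failing, which is exactly the effect that makes adaptive sampling pay off for low-regularity problems.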

Video
Convex Matrix Optimization (MOP) arises in a wide variety of applications. The last three decades have seen dramatic advances in the theory and practice of matrix optimization because of its extremely powerful modeling capability. In particular, semidefinite programming (SDP) and its generalizations have been widely used to model problems in applications such as combinatorial and polynomial optimization, covariance matrix estimation, matrix completion, and sensor network localization. The first part of the talk will describe the primal-dual interior-point methods (IPMs) implemented in SDPT3 for solving medium-scale SDPs, followed by inexact IPMs (with linear systems solved by iterative solvers) for large-scale SDPs and discussions of their inherent limitations. The second part will present algorithmic advances for solving large-scale SDPs based on the proximal-point or augmented Lagrangian framework. In particular, we describe the design and implementation of an augmented-Lagrangian-based method (called SDPNAL+) for solving SDP problems with a large number of linear constraints. The last part of the talk will focus on recent advances in using a combination of local search methods and convex lifting to solve low-rank factorization models of SDP problems.
Event date: 11/10/2022
Speaker: Prof. Kim-Chuan Toh (National University of Singapore)
Hosted by: Department of Applied Mathematics
 Subjects:
 Mathematics and Statistics
 Keywords:
 Convex programming; Semidefinite programming
 Resource Type:
 Video
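A basic building block of the proximal-point and augmented Lagrangian methods mentioned in the abstract above is projection onto the positive semidefinite cone; a minimal sketch (illustrative, not from SDPNAL+) via the eigendecomposition:

```python
import numpy as np

# Projection onto the positive semidefinite cone: a workhorse subroutine
# of proximal-point / augmented Lagrangian methods for SDP.  Given a
# symmetric matrix A, zero out the negative part of its spectrum.

def project_psd(A):
    """Nearest PSD matrix to the symmetric matrix A (Frobenius norm)."""
    w, V = np.linalg.eigh(A)            # eigendecomposition A = V diag(w) V^T
    w_clipped = np.clip(w, 0.0, None)   # drop negative eigenvalues
    return (V * w_clipped) @ V.T


A = np.array([[1.0, 2.0],
              [2.0, 1.0]])             # eigenvalues 3 and -1
X = project_psd(A)                     # keeps only the eigenvalue-3 part
```

Each outer iteration of an augmented Lagrangian method for SDP applies this projection (or a variant) to enforce the semidefinite constraint, which is why the cost of the eigendecomposition dominates at large scale.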

Video
We introduce a Dimension-Reduced Second-Order Method (DRSOM) for convex and nonconvex (unconstrained) optimization. Under a trust-region-like framework, our method preserves the convergence of second-order methods while using only Hessian-vector products in two directions. Moreover, the computational overhead remains comparable to first-order methods such as gradient descent. We show that the method has local superlinear convergence and a global convergence rate of O(ε^(-3/2)) for satisfying the first-order and second-order conditions under a commonly used approximate Hessian assumption. We further show that this assumption can be removed if we perform one step of the Krylov subspace method at the end of the algorithm, which makes DRSOM the first first-order-type algorithm to achieve this complexity bound. The applicability and performance of DRSOM are exhibited by various computational experiments in logistic regression, L2-Lp minimization, sensor network localization, neural network training, and policy optimization in reinforcement learning. For neural networks, our preliminary implementation seems to gain computational advantages in terms of training accuracy and iteration complexity over state-of-the-art first-order methods including SGD and Adam. For policy optimization, our experiments show that DRSOM compares favorably with popular policy gradient methods in terms of effectiveness and robustness.
Event date: 19/09/2022
Speaker: Prof. Yinyu Ye (Stanford University)
Hosted by: Department of Applied Mathematics
 Subjects:
 Mathematics and Statistics
 Keywords:
 Mathematical optimization; Convex programming; Nonconvex programming
 Resource Type:
 Video
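The core idea of the abstract above, a second-order model restricted to a two-dimensional subspace, can be sketched as follows (an illustrative toy on a quadratic, not the authors' code): minimize the local quadratic model over span{-gradient, previous step}, which needs only two Hessian-vector products per iteration.

```python
import numpy as np

# Sketch of a dimension-reduced second-order step: minimize the local
# quadratic model over span{-g, d}, where g is the gradient and d the
# previous step.  Only two Hessian-vector products are needed to form
# the 2x2 reduced Hessian.  Demonstrated on f(x) = 0.5 x^T A x - b^T x.

A = np.diag([1.0, 10.0, 100.0])        # ill-conditioned toy Hessian
b = np.array([1.0, 1.0, 1.0])


def grad(x):
    return A @ x - b


def hvp(v):
    return A @ v                       # Hessian-vector product (Hessian = A here)


def drsom_step(x, d):
    g = grad(x)
    U = np.column_stack([-g, d])       # 2D search subspace
    # reduced 2x2 Hessian from two Hessian-vector products
    Q = U.T @ np.column_stack([hvp(U[:, 0]), hvp(U[:, 1])])
    c = U.T @ g
    # minimize the 2D quadratic model: solve Q alpha = -c (lstsq handles
    # the rank-deficient case when the two directions are parallel)
    alpha = np.linalg.lstsq(Q, -c, rcond=None)[0]
    step = U @ alpha
    return x + step, step


x, d = np.zeros(3), np.ones(3)
for _ in range(20):
    x, d = drsom_step(x, d)            # converges to A^{-1} b
```

On a quadratic this exact two-dimensional subspace minimization behaves like a conjugate-gradient-type method, which is what lets a two-direction model retain second-order-style convergence at first-order cost.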

Video
In a lively show, mathemagician Arthur Benjamin races a team of calculators to figure out 3-digit squares, solves another massive mental equation and guesses a few birthdays. How does he do it? He’ll tell you.
 Subjects:
 Mathematics and Statistics
 Keywords:
 Mental arithmetic; Mental calculators
 Resource Type:
 Video

Video
By analyzing raw data on violent incidents in the Iraq war and others, Sean Gourley and his team claim to have found a surprisingly strong mathematical relationship linking the fatality and frequency of attacks.
 Subjects:
 Mathematics and Statistics
 Keywords:
 War -- Mathematical models
 Resource Type:
 Video

Video
Today's math curriculum is teaching students to expect, and excel at, paint-by-numbers classwork, robbing kids of a skill more important than solving problems: formulating them. Dan Meyer shows classroom-tested math exercises that prompt students to stop and think.
 Subjects:
 Mathematics and Statistics
 Keywords:
 Mathematics -- Study and teaching
 Resource Type:
 Video

Video
What can mathematics say about history? According to TED Fellow Jean-Baptiste Michel, quite a lot. From changes in language to the deadliness of wars, he shows how digitized history is just starting to reveal deep underlying patterns.
 Subjects:
 Mathematics and Statistics
 Keywords:
 History -- Mathematical models
 Resource Type:
 Video

Video
Having trouble remembering the order of operations? Let's raise the stakes a little bit. What if the future of your (theoretical) kingdom depended on it? Garth Sundem creates a world in which PEMDAS is the hero but only heroic when in the proper order.
 Subjects:
 Mathematics and Statistics
 Keywords:
 Games in mathematics education; Games -- Mathematics
 Resource Type:
 Video
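The precedence rules the lesson above dramatizes are the same ones programming languages enforce; a quick illustration:

```python
# Operator precedence (PEMDAS) in action: exponents bind tighter than
# multiplication, which binds tighter than addition.
result = 2 + 3 * 4 ** 2                    # 4**2 = 16, 3*16 = 48, 2+48 = 50
wrong_left_to_right = ((2 + 3) * 4) ** 2   # 400 if you ignore precedence
```

Getting the order wrong does not give an error, only a silently different answer, which is exactly why the mnemonic matters.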

Video
What's so special about Leonardo da Vinci's Vitruvian Man? With arms outstretched, the man fills the irreconcilable spaces of a circle and a square, symbolizing the Renaissance-era belief in the mutable nature of humankind. James Earle explains the geometric, religious and philosophical significance of this deceptively simple drawing.
 Subjects:
 History; Mathematics and Statistics
 Keywords:
 Mathematics -- Social aspects; Vitruvian man (Leonardo da Vinci)
 Resource Type:
 Video