
Virtual Seminar Series | ISE Graduate Student Colloquiums


Sobhan Nazari

Advisor: Prof. Farhang Pourboghrat

Committee members: Prof. Yannis Korkolis, Prof. Michael Groeber, Prof. Prasad Mokashi

Aluminum alloy thin-walled tubes are widely used in lightweight applications, so it is important to understand the elasto-plastic behavior of these materials under different loadings when forming them. In this study, AA7075 tubular materials produced by conventional extrusion and by Shear-Assisted Processing and Extrusion (ShAPE) are characterized at both the microstructural and macroscale levels. In addition, phenomenological and microstructure-based material models are used to describe the mechanical properties and anisotropic plasticity of these materials.
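As one concrete illustration (the abstract does not name the specific models used, so this is only an assumed example), a common phenomenological description of anisotropic plasticity in extruded tubes is Hill's 1948 quadratic yield criterion:

$$ F(\sigma_{yy}-\sigma_{zz})^2 + G(\sigma_{zz}-\sigma_{xx})^2 + H(\sigma_{xx}-\sigma_{yy})^2 + 2L\,\sigma_{yz}^2 + 2M\,\sigma_{zx}^2 + 2N\,\sigma_{xy}^2 = 1 $$

where the coefficients F, G, H, L, M, N encode the material's anisotropy and are calibrated from directional yield stresses or r-values, e.g., along the axial and hoop directions of the tube.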
Dewei Zhang

Advisor: Sam Davanloo Tajbakhsh

Committee members: Guzin Bayraksan, Yingbin Liang, Raef Bassily

Convex optimization has been a successful area owing to the existence of scalable algorithms that leverage the rich structure provided by convexity. In recent years, however, there has been growing interest in understanding forms of nonconvexity that still admit computationally efficient algorithms. Interestingly, one type of nonconvexity disappears when one introduces a smooth topological structure and redefines convexity with respect to that structure, i.e., geodesic convexity. This has allowed some nonconvex problems to be reformulated as geodesically convex problems and classes of algorithms to be developed that solve them globally. Even when the resulting problems are not geodesically convex, in some scenarios these algorithms converge to desirable solutions, e.g., second-order stationary points. However, the convergence and convergence-rate behavior of these algorithms is scarcely investigated. This dissertation mainly contributes to manifold optimization algorithms; more specifically, it focuses on large-scale first- and second-order algorithms (those that use up to the gradient and Hessian information, respectively) for smooth objectives over different manifolds. Manifold optimization finds numerous applications in machine learning, statistics, engineering, robotics, and control, in problems such as deep learning, low-rank matrix completion, sparse or nonnegative principal component analysis (PCA), approximation models for integer programming, and large-scale semidefinite programs (SDPs). More specifically, matrix estimation with a low-rank constraint, positive semidefinite covariance estimation in Gaussian mixture models, and the orthogonality constraint in linear eigenvalue problems all require optimization over certain Riemannian manifolds. In the absence of other constraints, such constrained problems in Euclidean space can be regarded as unconstrained optimization problems on Riemannian manifolds, as the sketch below illustrates. We believe this dissertation is a timely investigation that fills some gaps in this domain.
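To make that last point concrete, here is a minimal sketch (not taken from the dissertation; all names and step sizes are illustrative) of Riemannian gradient ascent on the unit sphere: maximizing x^T A x subject to ||x|| = 1 is nonconvex as a constrained Euclidean problem, but treated as an unconstrained problem on the sphere manifold the iteration recovers the leading eigenvector of A.

```python
import numpy as np

def riemannian_gradient_ascent(A, x0, step=0.2, iters=2000):
    """Maximize f(x) = x^T A x over the unit sphere (a Riemannian manifold).

    Illustrative sketch only: the Euclidean gradient is projected onto the
    tangent space at x (giving the Riemannian gradient for the induced
    metric), a step is taken, and the iterate is retracted back onto the
    sphere by renormalization.
    """
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        egrad = 2.0 * A @ x                  # Euclidean gradient of x^T A x
        rgrad = egrad - (x @ egrad) * x      # project onto tangent space at x
        x = x + step * rgrad                 # step along the tangent direction
        x = x / np.linalg.norm(x)            # retraction: back onto the sphere
    return x

# Example: a symmetric matrix with known spectrum; the maximizer on the
# sphere is the leading eigenvector.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((50, 50)))  # random orthogonal basis
A = Q @ np.diag(np.linspace(0.0, 1.0, 50)) @ Q.T    # eigenvalues 0, ..., 1
x = riemannian_gradient_ascent(A, rng.standard_normal(50))
print(abs(x @ Q[:, -1]))  # ~1.0: aligned with the top eigenvector
```

The renormalization step is the simplest retraction on the sphere; the second-order methods discussed in the dissertation would additionally use Hessian information on the manifold.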

While a major part of the dissertation focuses on algorithms for Riemannian optimization, it also includes work on splitting algorithms for statistical learning problems with hierarchical sparsity structures. Convex sparsity-inducing regularization functions play an important role in fields including machine learning, statistics, and signal processing. Some well-known regularizers, e.g., the lasso, are commonly used in different learning frameworks to induce sparsity, which allows simultaneous model fitting and feature selection; a minimal sketch of this base case follows below. In contrast to such unstructured penalties, in some scenarios the sparsity pattern must follow a structure given by a Directed Acyclic Graph (DAG). Such regularization functions exploit variable grouping and overlapping structures that make solving the underlying optimization problems challenging, especially at large scale. Hence, providing computationally efficient optimization algorithms for learning problems with graph-induced structured sparsity regularizers is of great importance.
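For background on the simplest case named above (and only that case; the dissertation's splitting algorithms for DAG-structured, overlapping sparsity are more involved), here is a minimal proximal-gradient (ISTA) sketch for the lasso. All names and parameter values are illustrative:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (the lasso penalty)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(X, y, lam, step=None, iters=500):
    """Proximal gradient (ISTA) for min_b 0.5*||Xb - y||^2 + lam*||b||_1."""
    n, p = X.shape
    if step is None:
        # 1/L, where L = ||X||_2^2 is the Lipschitz constant of the gradient
        step = 1.0 / np.linalg.norm(X, 2) ** 2
    b = np.zeros(p)
    for _ in range(iters):
        grad = X.T @ (X @ b - y)             # gradient of the smooth loss
        b = soft_threshold(b - step * grad, step * lam)
    return b

# Example: sparse recovery from noisy linear measurements.
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 200))
b_true = np.zeros(200)
b_true[:5] = 3.0
y = X @ b_true + 0.1 * rng.standard_normal(100)
b_hat = ista(X, y, lam=1.0)
print((np.abs(b_hat) > 1e-6).sum())  # number of selected features
```

With a DAG-induced regularizer, the closed-form soft-thresholding step is replaced by a prox over overlapping groups, which generally has no closed form; this is what motivates the splitting algorithms studied in the thesis.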
