File name: Machine.Learning.for.Model.Order.Reduction
File size: 4.85MB
File format: PDF
Last updated: 2021-03-27 02:58:09
Machine Learning for Model Order Reduction
This book discusses machine learning for model order reduction, which can be used in modern VLSI design to predict the behavior of an electronic circuit via mathematical models. The author describes techniques that significantly reduce the time required for simulations involving large-scale systems of ordinary differential equations, which can otherwise take days or even weeks. This approach, called model order reduction (MOR), reduces the complexity of the original large system and generates a reduced-order model (ROM) to represent the original one; a minimal sketch of the projection idea behind MOR appears after the table of contents. Readers will gain in-depth knowledge of machine learning and model order reduction concepts, the tradeoffs involved in using various algorithms, and how to apply the techniques presented to circuit simulation and numerical analysis.

Key features:
Introduces machine learning algorithms at the architecture and algorithm levels of abstraction;
Describes new, hybrid solutions for model order reduction;
Presents machine learning algorithms in depth, but simply;
Uses real, industrial applications to verify algorithms.

Table of Contents
Chapter 1: Introduction
Chapter 2: Bio-Inspired Machine Learning Algorithm: Genetic Algorithm
Chapter 3: Thermo-Inspired Machine Learning Algorithm: Simulated Annealing
Chapter 4: Nature-Inspired Machine Learning Algorithm: Particle Swarm Optimization, Artificial Bee Colony
Chapter 5: Control-Inspired Machine Learning Algorithm: Fuzzy Logic Optimization
Chapter 6: Brain-Inspired Machine Learning Algorithm: Neural Network Optimization
Chapter 7: Comparisons, Hybrid Solutions, Hardware Architectures, and New Directions
Chapter 8: Conclusions
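
For readers unfamiliar with the ROM idea mentioned above, the following is a minimal sketch of classical projection-based model order reduction (POD with Galerkin projection) applied to a generic linear ODE system. The system sizes, matrix construction, time step, and NumPy-based setup are illustrative assumptions; this is not the book's machine-learning-driven method, only the general reduce-then-simulate workflow that MOR refers to.

```python
import numpy as np

# Sketch of projection-based MOR: full model dx/dt = A x with x in R^n,
# reduced to size r << n via a POD basis and Galerkin projection.
# All sizes and matrices here are illustrative assumptions, not from the book.

rng = np.random.default_rng(0)
n, r = 200, 5                      # full order n, reduced order r

# Random system matrix, diagonally shifted so the dynamics decay.
A = -2.0 * np.eye(n) + 0.1 * rng.standard_normal((n, n))
x0 = rng.standard_normal(n)

# Collect snapshots of the full model with explicit Euler (coarse, for illustration).
dt, steps = 1e-3, 500
X = np.empty((n, steps))
x = x0.copy()
for k in range(steps):
    x = x + dt * (A @ x)
    X[:, k] = x

# POD basis: leading left singular vectors of the snapshot matrix.
U, _, _ = np.linalg.svd(X, full_matrices=False)
V = U[:, :r]                       # projection basis, n x r

# Galerkin projection yields the reduced-order model: dz/dt = (V^T A V) z.
Ar = V.T @ A @ V
z = V.T @ x0

# Simulate the small ROM and lift back to the full space for comparison.
for k in range(steps):
    z = z + dt * (Ar @ z)
xr = V @ z

print("relative error of ROM state after", steps, "steps:",
      np.linalg.norm(xr - x) / np.linalg.norm(x))
```

The point of the sketch is the cost structure: the reduced system evolves an r-dimensional state instead of an n-dimensional one, which is where the simulation speedup comes from; the book's contribution is using machine learning algorithms (genetic algorithms, simulated annealing, swarm methods, fuzzy logic, neural networks) to guide how such reduced models are constructed.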