Search Results

Journal Article

An RBDO Method for Multiple Failure Region Problems using Probabilistic Reanalysis and Approximate Metamodels

2009-04-20
2009-01-0204
A Reliability-Based Design Optimization (RBDO) method for multiple failure regions is presented. The method uses a Probabilistic Re-Analysis (PRRA) approach in conjunction with an approximate global metamodel with local refinements. The latter serves as an indicator to determine the failure and safe regions. PRRA calculates the system reliability of a design very efficiently by performing a single Monte Carlo (MC) simulation. Although PRRA is based on MC simulation, it calculates “smooth” sensitivity derivatives, therefore allowing the use of a gradient-based optimizer. An “accurate-on-demand” metamodel is used in the PRRA that allows us to handle problems with multiple disjoint failure regions and potentially multiple most-probable points (MPP). The multiple failure regions are identified using a clustering technique. A maximin “space-filling” sampling technique is used to construct the metamodel. A vibration absorber example highlights the potential of the proposed method.
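As a rough illustration of the maximin “space-filling” idea mentioned in the abstract, the sketch below greedily selects points that maximize the minimum distance to the points already chosen. The candidate grid and the greedy rule are illustrative assumptions, not the paper's actual sampling procedure.

```python
import random

def maximin_sample(candidates, k, seed=0):
    """Greedily pick k points so the minimum pairwise distance stays large:
    start from one point, then repeatedly add the candidate farthest
    (in squared Euclidean distance) from the current set."""
    rng = random.Random(seed)
    chosen = [rng.choice(candidates)]
    while len(chosen) < k:
        def min_dist(p):
            return min(sum((a - b) ** 2 for a, b in zip(p, q)) for q in chosen)
        chosen.append(max(candidates, key=min_dist))
    return chosen

# Hypothetical candidate pool: a 5x5 grid on the unit square.
grid = [(x / 4, y / 4) for x in range(5) for y in range(5)]
design = maximin_sample(grid, 5)
```

True maximin designs are found by optimization over all candidate subsets; the greedy pass here is only a cheap approximation often used in practice.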
Technical Paper

Balance between Reliability and Robustness in Engine Cooling System Optimal Design

2007-04-16
2007-01-0594
This paper explores the trade-off between reliability-based design and robustness for an automotive under-hood thermal system using the iSIGHT-FD environment. The interaction between the engine cooling system and the heating, ventilating, and air-conditioning (HVAC) system is described. The engine cooling system performance is modeled using Flowmaster, and a metamodel is developed in iSIGHT. The actual HVAC system performance is characterized using test bench data. A design of experiments procedure determines the dominant factors, and the statistics of the HVAC performance are obtained using Monte Carlo simulation (MCS). The MCS results are used to build an overall system response metamodel in order to reduce the computational effort. A multi-objective optimization in iSIGHT maximizes the system mean performance and simultaneously minimizes its standard deviation subject to probabilistic constraints.
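The MCS step described above, estimating the mean and standard deviation of a system response, can be sketched as follows. The response function and input distributions below are toy stand-ins for the metamodel and test-bench statistics, not the paper's models.

```python
import random
import statistics

def mcs_statistics(response, sample_inputs, n=10_000, seed=1):
    """Estimate mean and standard deviation of a response by Monte Carlo
    simulation: draw random inputs, evaluate the response, summarize."""
    rng = random.Random(seed)
    values = [response(*sample_inputs(rng)) for _ in range(n)]
    return statistics.mean(values), statistics.stdev(values)

# Hypothetical linear surrogate of performance vs. two uncertain inputs.
mean, std = mcs_statistics(
    lambda q, w: 3.0 * q + 0.5 * w,
    lambda rng: (rng.gauss(10, 1), rng.gauss(20, 2)),
)
```

A robust-design optimizer would then treat `mean` and `std` as two competing objectives, as the abstract describes.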
Journal Article

Computational Efficiency Improvements in Topography Optimization Using Reanalysis

2016-04-05
2016-01-1395
To improve fuel economy, there is a trend in the automotive industry to use lightweight, high-strength materials. Automotive body structures are composed of several panels which must be downsized to reduce weight. Because this affects NVH (Noise, Vibration and Harshness) performance, engineers are challenged to recover the panel stiffness lost from down-gaging in order to improve the structure-borne noise transmitted through the lightweight panels in the frequency range of 100-300 Hz, where most of the booming and low-to-mid-frequency noise occurs. The loss in performance can be recovered by optimizing panel geometry using beading or damping treatment. Topography optimization is a special class of shape optimization for changing sheet metal shapes by introducing beads. A large number of design variables can be handled, and the process is easy to set up in commercial codes. However, optimization methods are computationally intensive because of repeated full-order analyses.
Technical Paper

Design Optimization Under Uncertainty Using Evidence Theory

2006-04-03
2006-01-0388
Early in the engineering design cycle, it is difficult to quantify product reliability due to insufficient data or information to model uncertainties. Probability theory cannot, therefore, be used. Design decisions are usually based on fuzzy information which is imprecise and incomplete. Recently, evidence theory has been proposed to handle uncertainty with limited information. In this paper, a computationally efficient design optimization method is proposed based on evidence theory, which can handle a mixture of epistemic and random uncertainties. It quickly identifies the vicinity of the optimal point and the active constraints by moving a hyper-ellipse in the original design space, using a reliability-based design optimization (RBDO) algorithm. Subsequently, a derivative-free optimizer calculates the evidence-based optimum, starting from the close-by RBDO optimum, considering only the identified active constraints.
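Evidence theory bounds an event's likelihood with a belief/plausibility pair computed from a basic probability assignment (BPA) over intervals. The sketch below shows that computation for a one-dimensional exceedance event; the interval masses and threshold are illustrative, not from the paper.

```python
def belief_plausibility(bpa, threshold):
    """Belief and plausibility that x > threshold, given a BPA mapping
    intervals (lo, hi) to evidence masses. Belief counts intervals that
    lie entirely in the event; plausibility counts those that merely
    intersect it, so Bel <= true probability <= Pl."""
    bel = sum(m for (lo, hi), m in bpa.items() if lo > threshold)
    pl = sum(m for (lo, hi), m in bpa.items() if hi > threshold)
    return bel, pl

# Hypothetical evidence about an uncertain response variable.
bpa = {(0.0, 2.0): 0.5, (1.5, 3.0): 0.3, (2.5, 4.0): 0.2}
bel, pl = belief_plausibility(bpa, 2.0)  # bel = 0.2, pl = 0.5
```

The gap between belief and plausibility reflects exactly the "limited information" the abstract refers to; with complete probabilistic data the two bounds would coincide.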
Technical Paper

Design Optimization and Reliability Estimation with Incomplete Uncertainty Information

2006-04-03
2006-01-0962
Existing methods for design optimization under uncertainty assume that a high level of information is available, typically in the form of data. In reality, however, insufficient data prevents correct inference of probability distributions, membership functions, or interval ranges. In this article we use an engine design example to show that optimal design decisions and reliability estimations depend strongly on uncertainty characterization. We contrast the reliability-based optimal designs to the ones obtained using worst-case optimization, and ask the question of how to obtain non-conservative designs with incomplete uncertainty information. We propose an answer to this question through the use of Bayesian statistics. We estimate the truck's engine reliability based only on available samples, and demonstrate that the accuracy of our estimates increases as more samples become available.
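A minimal sketch of the Bayesian idea in the abstract, reliability estimates that sharpen as samples accumulate, is conjugate beta-binomial updating. The prior and the pass/fail counts below are illustrative assumptions, not the engine data from the paper.

```python
def posterior_reliability(successes, trials, a=1.0, b=1.0):
    """Posterior mean reliability under a Beta(a, b) prior with binomial
    test data: (a + successes) / (a + b + trials). With a = b = 1 this
    is a uniform prior, i.e., no initial preference for any reliability."""
    return (a + successes) / (a + b + trials)

# The estimate moves toward the observed rate as more samples arrive.
few = posterior_reliability(9, 10)     # 10/12, about 0.833
many = posterior_reliability(95, 100)  # 96/102, about 0.941
```

This captures the abstract's point: with few samples the prior tempers the estimate, avoiding both over-confidence and worst-case conservatism.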
Technical Paper

Design Under Uncertainty and Assessment of Performance Reliability of a Dual-Use Medium Truck with Hydraulic-Hybrid Powertrain and Fuel Cell Auxiliary Power Unit

2005-04-11
2005-01-1396
Medium trucks constitute a large market segment of the commercial transportation sector, and are also used widely for military tactical operations. Recent technological advances in hybrid powertrains and fuel cell auxiliary power units have enabled design alternatives that can improve fuel economy and reduce emissions dramatically. However, deterministic design optimization of these configurations may yield designs that are optimal with respect to performance but raise concerns regarding the reliability of achieving that performance over lifetime. In this article we identify and quantify uncertainties due to modeling approximations or incomplete information. We then model their propagation using Monte Carlo simulation and perform sensitivity analysis to isolate statistically significant uncertainties. Finally, we formulate and solve a series of reliability-based optimization problems and quantify tradeoffs between optimality and reliability.
Journal Article

Design under Uncertainty using a Combination of Evidence Theory and a Bayesian Approach

2008-04-14
2008-01-0377
Early in the engineering design cycle, it is difficult to quantify product reliability due to insufficient data or information to model uncertainties. Probability theory cannot, therefore, be used. Design decisions are usually based on fuzzy information which is imprecise and incomplete. Various design methods such as Possibility-Based Design Optimization (PBDO) and Evidence-Based Design Optimization (EBDO) have been developed to systematically treat design with non-probabilistic uncertainties. In practical engineering applications, information regarding the uncertain variables and parameters may exist in the form of sample points, and uncertainties with sufficient and insufficient information may exist simultaneously. Most of the existing optimal design methods under uncertainty cannot handle this form of incomplete information. They have to either discard some valuable information or postulate the existence of additional information.
Technical Paper

Dynamic Properties of Styrene-Butadiene Rubber for Automotive Applications

2009-05-19
2009-01-2128
Styrene-Butadiene Rubber (SBR) is a copolymer of butadiene and styrene. It has a wide range of applications in the automotive industry due to its high durability and resistance to abrasion, oils and oxidation. SBR applications vary from tires to vibration isolators and gaskets. SBR is also used in tuned dampers which aim to reduce and control the angular vibrations of crankshafts, acting as an isolator and energy absorber between the tuned damper's hub and the inertia ring. The dynamic properties of this polymer are therefore very important in developing an appropriate analytical model. This paper presents the results of a series of experiments performed to determine the dynamic stiffness and damping properties of SBR. The frequency, temperature and displacement dependent properties are determined in a low frequency range from 0.4 to 150 Hz, and in a mid frequency range from 150 to 550 Hz. The most interesting property of SBR is its frequency dependent behavior.
Journal Article

Efficient Global Surrogate Modeling Based on Multi-Layer Sampling

2018-04-03
2018-01-0616
Global surrogate modeling aims to build a surrogate model with high accuracy over the whole design domain. A major challenge in achieving this objective is reducing the number of function evaluations of the original computer simulation model. To date, the most widely used approach for global surrogate modeling is the adaptive surrogate modeling method. It starts with an initial surrogate model, which is then refined adaptively using the mean square error (MSE) or the maximin distance criterion. It is observed that current methods may not be able to effectively construct a global surrogate model when the underlying black-box function is highly nonlinear only in certain regions. A new surrogate modeling method which can allocate more training points in regions with high nonlinearity is needed to overcome this challenge. This article proposes an efficient global surrogate modeling method based on a multi-layer sampling scheme.
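The notion of allocating training points to highly nonlinear regions can be illustrated with a one-dimensional toy: refine a piecewise-linear surrogate wherever it disagrees most with the simulator. This is a deliberately simplified stand-in, not the paper's multi-layer scheme.

```python
import bisect

def adaptive_refine(f, xs, n_new):
    """Add n_new training points where a piecewise-linear surrogate
    disagrees most with the true function f: evaluate the midpoint of
    each interval, compare against linear interpolation, and insert the
    midpoint with the largest error."""
    xs = sorted(xs)
    ys = [f(x) for x in xs]
    for _ in range(n_new):
        errs = []
        for i in range(len(xs) - 1):
            mid = 0.5 * (xs[i] + xs[i + 1])
            lin = 0.5 * (ys[i] + ys[i + 1])  # surrogate prediction at mid
            errs.append((abs(f(mid) - lin), mid))
        _, x_new = max(errs)
        j = bisect.bisect(xs, x_new)
        xs.insert(j, x_new)
        ys.insert(j, f(x_new))
    return xs

# Points concentrate near the kink of |x| rather than in the flat regions.
pts = adaptive_refine(abs, [-1.0, 0.5, 1.0], 4)
```

Even this crude criterion drives samples toward the nonlinearity at x = 0, which is the qualitative behavior the abstract asks of a good global sampler.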
Journal Article

Efficient Probabilistic Reanalysis and Optimization of a Discrete Event System

2011-04-12
2011-01-1081
This paper presents a methodology to evaluate and optimize discrete event systems, such as an assembly line or a call center. First, the methodology estimates the performance of a system for a single probability distribution of the inputs. Probabilistic Reanalysis (PRRA) uses this information to evaluate the effect of changes in the system configuration on its performance. PRRA is integrated with a program to optimize the system. The proposed methodology is dramatically more efficient than one requiring a new Monte Carlo simulation each time we change the system. We demonstrate the approach on a drilling center and an electronic parts factory.
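The core of probabilistic reanalysis is reusing one set of Monte Carlo samples under a changed input distribution via likelihood-ratio weights. The sketch below re-estimates a mean after shifting a normal input without drawing new samples; the distributions and target quantity are illustrative assumptions, not the paper's discrete event model.

```python
import math
import random

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def reweighted_mean(samples, base, new, h):
    """Estimate E[h(X)] under density `new` using samples drawn once from
    density `base`, with self-normalized likelihood-ratio weights
    w = new(x) / base(x). No new simulation runs are needed."""
    w = [new(x) / base(x) for x in samples]
    return sum(wi * h(x) for wi, x in zip(w, samples)) / sum(w)

rng = random.Random(0)
xs = [rng.gauss(0.0, 1.0) for _ in range(50_000)]  # single MC run, N(0, 1)
# Mean of X under a shifted configuration N(0.5, 1), by reweighting only.
est = reweighted_mean(xs,
                      lambda x: normal_pdf(x, 0.0, 1.0),
                      lambda x: normal_pdf(x, 0.5, 1.0),
                      lambda x: x)
```

This is why reanalysis is "dramatically more efficient": each candidate configuration costs a reweighting pass instead of a fresh Monte Carlo simulation.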
Journal Article

Efficient Re-Analysis Methodology for Probabilistic Vibration of Large-Scale Structures

2008-04-14
2008-01-0216
It is challenging to perform probabilistic analysis and design of large-scale structures because probabilistic analysis requires repeated finite element analyses of large models, and each analysis is expensive. This paper presents a methodology for probabilistic analysis and reliability-based design optimization of large-scale structures that consists of two re-analysis methods: one for estimating the deterministic vibratory response and another for estimating the probability of the response exceeding a certain level. The deterministic re-analysis method can efficiently analyze large-scale finite element models consisting of tens or hundreds of thousands of degrees of freedom and large numbers of design variables that vary in a wide range. The probabilistic re-analysis method calculates the system reliability very efficiently for many probability distributions of the design variables by performing a single Monte Carlo simulation.
Journal Article

Enhancing Decision Topology Assessment in Engineering Design

2014-04-01
2014-01-0719
Implications of decision analysis (DA) on engineering design are important and well-documented. However, widespread adoption has not occurred. To that end, the authors recently proposed decision topologies (DT) as a visual method for representing decision situations and proved that they are entirely consistent with normative decision analysis. This paper addresses the practical issue of assessing the DTs of a designer using their responses. As in classical DA, this step is critical to encoding the decision maker's (DM's) preferences so that further analysis and mathematical optimization can be performed on the correct set of preferences. We show how multi-attribute DTs can be directly assessed from DM responses. Furthermore, we show that preferences under uncertainty can be trivially incorporated and that multi-attribute topologies can be constructed from single-attribute topologies, similarly to multi-linear functions in utility analysis. This incremental construction simplifies the process of topology construction.
Journal Article

Flexible Design and Operation of a Smart Charging Microgrid

2014-04-01
2014-01-0716
The reliability theory of repairable systems is vastly different from that of non-repairable systems. The authors have recently proposed a ‘decision-based’ framework to design and maintain repairable systems for optimal performance and reliability using a set of metrics such as minimum failure free period, number of failures in planning horizon (lifecycle), and cost. The optimal solution includes the initial design, the system maintenance throughout the planning horizon, and the protocol to operate the system. In this work, we extend this idea by incorporating flexibility and demonstrate our approach using a smart charging electric microgrid architecture. The flexibility is realized by allowing the architecture to change with time. Our approach “learns” the working characteristics of the microgrid. We use actual load and supply data over a short time to quantify the load and supply random processes and also establish the correlation between them.
Technical Paper

Imprecise Reliability Assessment When the Type of the Probability Distribution of the Random Variables is Unknown

2009-04-20
2009-01-0199
In reliability design, data for constructing probabilistic models are often scarce. It is particularly challenging to model uncertainty in variables when the type of their probability distribution is unknown. Moreover, it is expensive to estimate the upper and lower bounds of the reliability of a system involving such variables. A method for modeling uncertainty by using Polynomial Chaos Expansion is presented. The method requires specifying bounds for statistical summaries such as the first four moments and credible intervals. A constrained optimization problem, in which the decision variables are the coefficients of the Polynomial Chaos Expansion approximation, is formulated and solved in order to estimate the minimum and maximum values of a system's reliability. This problem is solved efficiently by employing a probabilistic re-analysis approach to approximate the system reliability as a function of the moments of the random variables.
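One reason Polynomial Chaos Expansion works well with moment constraints is that, for a one-dimensional Hermite expansion, the output mean and variance are simple functions of the coefficients. The sketch below states that relation; the coefficient values are illustrative, not from the paper.

```python
import math

def pce_mean_variance(coeffs):
    """Mean and variance of a 1-D Hermite polynomial chaos expansion
    Y = sum_i a_i He_i(xi) with xi ~ N(0, 1). By orthogonality of the
    probabilists' Hermite polynomials: E[Y] = a_0 and
    Var[Y] = sum_{i>=1} a_i^2 * i!."""
    mean = coeffs[0]
    var = sum(a * a * math.factorial(i) for i, a in enumerate(coeffs) if i > 0)
    return mean, var

# Hypothetical coefficients [a_0, a_1, a_2].
m, v = pce_mean_variance([2.0, 0.5, 0.1])  # mean 2.0, variance 0.25 + 0.02
```

Because moments map to coefficients this directly, bounds on the first moments translate into constraints on the PCE coefficients, which is what the optimization in the abstract exploits.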
Technical Paper

Improving Robust Design with Preference Aggregation Methods

2004-03-08
2004-01-1140
Robust design is a methodology for improving the quality of a product or process by minimizing the effect of variations in the inputs without eliminating the causes of those variations. In robust design, the putative best design is obtained by solving a multi-criteria optimization problem, trading off the nominal performance against the minimization of the variation of the performance measure. Because some existing methods combine the two criteria with a weighted sum or another fixed aggregation strategy, which are known to miss Pareto points, they may fail to obtain a desired design. To overcome this inadequacy, a more comprehensive preference aggregation method is implemented here into robust design. Three examples -- one simple mathematical example, one multi-criteria structure design example, and one automotive example -- are presented to illustrate the effectiveness of the proposed method.
Journal Article

Long Life Axial Fatigue Strength Models for Ferrous Powder Metals

2018-04-03
2018-01-1395
Two models are presented for the long-life (10⁷ cycles) axial fatigue strength of four ferrous powder metal (PM) material series: sintered and heat-treated iron-carbon steel, iron-copper and copper steel, iron-nickel and nickel steel, and pre-alloyed steel. The materials are defined at ranges of carbon content and densities using the broad data available in the Metal Powder Industries Federation (MPIF) Standard 35 for PM structural parts. The first model evaluates the 10⁷-cycle axial fatigue strength as a function of ultimate strength, and the second model as a function of hardness. For all 118 studied materials, both models are found to have a good correlation between calculated and 10⁷-cycle axial fatigue strength, with a high Pearson correlation coefficient of 0.97. The article provides details on the model development and the reasoning for selecting the ultimate strength and hardness as the best predictors for the 10⁷-cycle axial fatigue strength.
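The Pearson correlation coefficient quoted in the abstract can be computed as follows; the ultimate-strength and fatigue-strength values below are hypothetical, not the MPIF data used in the paper.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient: covariance of xs and ys divided
    by the product of their standard deviations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical data: ultimate tensile strength vs. axial fatigue strength (MPa).
ultimate = [400.0, 500.0, 600.0, 700.0]
fatigue = [150.0, 190.0, 225.0, 270.0]
r = pearson_r(ultimate, fatigue)
```

A value of r near 1, as the paper reports for its 118 materials, indicates a strong linear relationship between the predictor and the fatigue strength.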
Technical Paper

Managing the Computational Cost in a Monte Carlo Simulation by Considering the Value of Information

2012-04-16
2012-01-0915
Monte Carlo simulation is a popular tool for reliability assessment because of its robustness and ease of implementation. A major concern with this method is its computational cost; standard Monte Carlo simulation requires quadrupling the number of replications to halve the standard deviation of the estimated failure probability. Efforts to increase efficiency focus on intelligent sampling procedures and methods for efficient calculation of the performance function of a system. This paper proposes a new method to manage cost that views design as a decision among alternatives with uncertain reliabilities. Information from a simulation has value only if it enables the designer to make a better choice among the alternative options. Consequently, the value of information from the simulation is equal to the gain from using this information to improve the decision. Using the method, a designer can determine the number of replications that are worth performing.
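The quadrupling rule in the abstract follows from the standard error of a binomial failure-probability estimate, se = sqrt(p(1 - p)/n). A quick check:

```python
def mc_standard_error(p, n):
    """Standard error of a Monte Carlo failure-probability estimate with
    n replications: sqrt(p * (1 - p) / n). Since se scales as 1/sqrt(n),
    halving it requires quadrupling n."""
    return (p * (1.0 - p) / n) ** 0.5

se1 = mc_standard_error(0.01, 1_000)
se2 = mc_standard_error(0.01, 4_000)  # exactly half of se1
```

This 1/sqrt(n) scaling is what makes brute-force replication expensive and motivates the value-of-information stopping rule the paper proposes.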
Journal Article

Managing the Computational Cost of Monte Carlo Simulation with Importance Sampling by Considering the Value of Information

2013-04-08
2013-01-0943
Importance Sampling is a popular method for reliability assessment. Although it is significantly more efficient than standard Monte Carlo simulation if a suitable sampling distribution is used, in many design problems it is too expensive. The authors have previously proposed a method to manage the computational cost in standard Monte Carlo simulation that views design as a choice among alternatives with uncertain reliabilities. Information from simulation has value only if it helps the designer make a better choice among the alternatives. This paper extends their method to Importance Sampling. First, the designer estimates the prior probability density functions of the reliabilities of the alternative designs and calculates the expected utility of the choice of the best design. Subsequently, the designer estimates the likelihood function of the probability of failure by performing an initial simulation with Importance Sampling.
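A textbook illustration of the Importance Sampling estimator discussed above: estimate a small tail probability by sampling from a distribution centered on the failure region and reweighting by the density ratio. The threshold and distributions are illustrative assumptions, not the paper's design problem.

```python
import math
import random

def norm_pdf(x, mu):
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)

def is_failure_prob(threshold, n=20_000, seed=0):
    """P(X > threshold) for X ~ N(0, 1) by importance sampling: draw from
    the shifted proposal N(threshold, 1), which lands half its samples in
    the failure region, and correct each hit by the likelihood ratio
    phi(x; 0) / phi(x; threshold)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(threshold, 1.0)
        if x > threshold:
            total += norm_pdf(x, 0.0) / norm_pdf(x, threshold)
    return total / n

p = is_failure_prob(3.0)  # exact value is about 1.35e-3
```

Standard Monte Carlo would need on the order of a million samples to resolve this probability; the shifted proposal achieves comparable accuracy with far fewer, which is the efficiency gain the abstract builds on.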
Journal Article

Mean-Value Second-Order Saddlepoint Approximation for Reliability Analysis

2017-03-28
2017-01-0207
A new second-order Saddlepoint Approximation (SA) method for structural reliability analysis is introduced. The Mean-value Second-order Saddlepoint Approximation (MVSOSA) is presented as an extension to the Mean-value First-order Saddlepoint Approximation (MVFOSA). The proposed method is based on a second-order Taylor expansion of the limit state function around the mean value of the input random variables. It requires not only the first but also the second-order sensitivity derivatives of the limit state function. If sensitivity analysis must be avoided because of computational cost, a quadrature integration approach, based on sparse grids, is also presented and linked to the saddlepoint approximation (SGSA - Sparse Grid Saddlepoint Approximation). The SGSA method is compared with the first and second-order SA methods in terms of accuracy and efficiency. The proposed MVSOSA and SGSA methods are used in the reliability analysis of two examples.
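The second-order Taylor expansion that MVSOSA starts from yields a simple approximation of the output mean for independent inputs: E[g] ≈ g(μ) + ½ Σᵢ gᵢᵢ σᵢ². The sketch below evaluates that expression with finite-difference curvatures; it illustrates only the expansion step, not the saddlepoint machinery itself, and the example function is an assumption.

```python
def second_order_mean(g, mu, sigma, h=1e-4):
    """Approximate E[g(X)] for independent inputs via a second-order
    Taylor expansion about the means: g(mu) plus half the sum of the
    diagonal curvatures g_ii (by central finite differences) times the
    input variances."""
    base = g(mu)
    total = base
    for i, (m, s) in enumerate(zip(mu, sigma)):
        up = list(mu); up[i] = m + h
        dn = list(mu); dn[i] = m - h
        gii = (g(up) - 2.0 * base + g(dn)) / (h * h)
        total += 0.5 * gii * s * s
    return total

# g(x) = x0^2 + x1 with means (1, 3), std devs (2, 1): exact E[g] = 8.
est = second_order_mean(lambda x: x[0] ** 2 + x[1], [1.0, 3.0], [2.0, 1.0])
```

For a quadratic limit state the second-order expansion is exact, which is why it captures curvature effects that a first-order (MVFOSA-style) expansion misses.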
Technical Paper

Modeling Dependence and Assessing the Effect of Uncertainty in Dependence in Probabilistic Analysis and Decision Under Uncertainty

2010-04-12
2010-01-0697
A complete probabilistic model of uncertainty in probabilistic analysis and design problems is the joint probability distribution of the random variables. Often, it is impractical to estimate this joint probability distribution because the mechanism of the dependence of the variables is not completely understood. This paper proposes modeling dependence by using copulas and demonstrates their representational power. It also compares this representation with a Monte Carlo simulation using dispersive sampling.
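A minimal sketch of copula-based dependence modeling, assuming a bivariate Gaussian copula: correlate two standard normals, then map each through the normal CDF to obtain dependent uniforms that can feed any marginal distributions. The correlation value is illustrative, not from the paper.

```python
import math
import random

def gaussian_copula_pairs(rho, n, seed=0):
    """Draw n dependent uniform pairs from a bivariate Gaussian copula
    with correlation rho: z2 = rho*z1 + sqrt(1 - rho^2)*z, then apply
    the standard normal CDF (via math.erf) to each coordinate."""
    rng = random.Random(seed)
    phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    pairs = []
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
        pairs.append((phi(z1), phi(z2)))
    return pairs

us = gaussian_copula_pairs(0.8, 10_000)
```

The copula carries only the dependence structure; applying inverse CDFs of any chosen marginals to each uniform coordinate then yields correlated physical variables, which is the separation of marginals from dependence that the paper exploits.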