Search Results

Technical Paper

A Comprehensive Method for Piston Secondary Dynamics and Piston-Bore Contact

2007-04-16
2007-01-1249
Achieving low vibration and noise levels in internal combustion engines has become an essential part of the design process. It is well known that the piston assembly can be a major source of engine mechanical friction and cold-start noise if not designed properly. The piston secondary motion and piston-bore contact pattern are critical in piston design because they affect the skirt-to-bore impact force and, therefore, how the piston impact excitation energy is damped, transmitted and eventually radiated from the engine structure as noise. An analytical method is presented in this paper for simulating piston secondary dynamics and piston-bore contact for an asymmetric half piston model. The method includes several important physical attributes such as bore distortion effects due to mechanical and thermal deformation, inertia loading, piston barrelity and ovality, piston flexibility and skirt-to-bore clearance. The method accounts for piston kinematics, rigid-body dynamics and flexibility.
Technical Paper

A Cost-Driven Method for Design Optimization Using Validated Local Domains

2013-04-08
2013-01-1385
Design optimization often relies on computational models, which are subjected to a validation process to ensure their accuracy. Because validation of computer models in the entire design space can be costly, we have previously proposed an approach where design optimization and model validation are concurrently performed using a sequential approach with variable-size local domains. We used test data and statistical bootstrap methods to size each local domain where the prediction model is considered validated and where design optimization is performed. The method proceeds iteratively until the optimum design is obtained. This method, however, requires test data to be available in each local domain along the optimization path. In this paper, we refine our methodology by using polynomial regression to predict the size and shape of a local domain at some steps along the optimization process without using test data.
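A minimal sketch of the regression idea described above, assuming NumPy; the iteration numbers, observed domain sizes, and quadratic trend are illustrative assumptions, not values from the paper:

    import numpy as np

    # Fit a low-order polynomial to local-domain sizes observed at earlier
    # iterations and predict the size of the next domain without new test data.
    iterations = np.array([1, 2, 3, 4, 5])
    domain_size = np.array([2.0, 1.6, 1.3, 1.1, 0.95])   # sizes from earlier bootstrap steps

    coeffs = np.polyfit(iterations, domain_size, deg=2)  # quadratic trend
    predicted_next = np.polyval(coeffs, 6)               # predicted size at iteration 6
    print(predicted_next)
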
Technical Paper

A Design Optimization Method Using Possibility Theory

2005-04-11
2005-01-0343
Early in the engineering design cycle, it is difficult to quantify product reliability or compliance to performance targets due to insufficient data or information for modeling the uncertainties. Design decisions are therefore based on fuzzy information that is vague, imprecise, qualitative, linguistic or incomplete. The uncertain information is usually available as intervals with lower and upper limits. In this work, possibility theory is used to assess design reliability with incomplete information. Possibility theory can be viewed as a variant of fuzzy set theory. A possibility-based design optimization method is proposed where all design constraints are expressed possibilistically. It is shown that the method gives a conservative solution compared with all conventional reliability-based designs obtained with different probability distributions.
Journal Article

A Group-Based Space-Filling Design of Experiments Algorithm

2018-04-03
2018-01-1102
Computer-aided engineering (CAE) is an important tool routinely used to simulate complex engineering systems. Virtual simulations enhance engineering insight into prospective designs and potential design issues and can limit the need for expensive engineering prototypes. For complex engineering systems, however, the effectiveness of virtual simulations is often hindered by excessive computational cost. To minimize the cost of running expensive computer simulations, approximate models of the original model (often called surrogate models or metamodels) can provide sufficient accuracy at a lower computing overhead compared to repeated runs of a full simulation. Metamodel accuracy improves if the metamodel is constructed using space-filling designs of experiments (DOEs). The latter provide a collection of sample points in the design space, preferably covering the entire space.
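As a rough illustration of a space-filling DOE feeding a metamodel, the sketch below uses SciPy's Latin hypercube sampler; the two-variable bounds, sample size, and placeholder "simulation" are assumptions for demonstration, not the paper's group-based algorithm:

    import numpy as np
    from scipy.stats import qmc

    # Space-filling Latin hypercube design in an assumed 2-D design space
    sampler = qmc.LatinHypercube(d=2, seed=0)
    unit_samples = sampler.random(n=20)                      # points in [0, 1]^2
    doe = qmc.scale(unit_samples, [0.0, 0.0], [10.0, 5.0])   # map to design-variable bounds

    # Stand-in for an expensive simulation, evaluated at each DOE point
    def simulation(x):
        return np.sin(x[0]) + 0.5 * x[1] ** 2                # placeholder response

    responses = np.array([simulation(x) for x in doe])
    print(doe.shape, responses.shape)                        # 20 samples ready for metamodel fitting
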
Journal Article

A Methodology for Design Decisions using Block Diagrams

2013-04-08
2013-01-0947
Our recent work has shown that the representation of systems using a reliability block diagram can be used as a decision-making tool. In decision making, we called these block diagrams decision topologies. In this paper, we generalize the results and show that decision topologies can be used to make many engineering decisions and can in fact replace decision analysis for most decisions. We also provide a meta-proof that the proposed method using decision topologies is entirely consistent with decision analysis in the limit. The main advantages of the method are that (1) it provides a visual representation of a decision situation, (2) it can easily model tradeoffs, (3) it can incorporate binary attributes, (4) it can model preferences with limited information, and (5) it can be used in a low-fidelity sense to quickly make a decision.
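A minimal sketch of evaluating a series/parallel reliability block diagram, which is the kind of topology the paper builds decisions on; the structure and the attribute scores below are illustrative assumptions, not the paper's decision procedure:

    # Generic series/parallel evaluation of a reliability block diagram
    def series(*blocks):
        value = 1.0
        for r in blocks:
            value *= r              # every block in series must "succeed"
        return value

    def parallel(*blocks):
        value = 1.0
        for r in blocks:
            value *= (1.0 - r)      # a parallel group fails only if all blocks fail
        return 1.0 - value

    # Two hypothetical design alternatives scored on attributes in [0, 1]
    alt_A = series(0.90, parallel(0.70, 0.60), 0.80)
    alt_B = series(0.80, parallel(0.90, 0.50), 0.85)
    print(alt_A, alt_B)             # prefer the alternative with the higher topology value
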
Journal Article

A Methodology for Fatigue Life Estimation of Linear Vibratory Systems under Non-Gaussian Loads

2017-03-28
2017-01-0197
Fatigue life estimation, reliability and durability are important in acquisition, maintenance and operation of vehicle systems. Fatigue life is random because of the stochastic load, the inherent variability of material properties, and the uncertainty in the definition of the S-N curve. The commonly used fatigue life estimation methods calculate the mean (not the distribution) of fatigue life under Gaussian loads using the potentially restrictive narrow-band assumption. In this paper, a general methodology is presented to calculate the statistics of fatigue life for a linear vibratory system under stationary, non-Gaussian loads considering the effects of skewness and kurtosis. The input loads are first characterized using their first four moments (mean, standard deviation, skewness and kurtosis) and a correlation structure equivalent to a given Power Spectral Density (PSD).
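A brief sketch of the input characterization step, assuming NumPy/SciPy and a synthetic load history; the signal itself is made up, and only the quantities computed (first four moments and a PSD) follow the description above:

    import numpy as np
    from scipy import stats, signal

    # Synthetic stationary, non-Gaussian load sampled at fs Hz (illustrative only)
    fs = 1024.0
    rng = np.random.default_rng(0)
    load = rng.standard_normal(2**16) ** 3

    # First four moments characterizing the non-Gaussian input
    mean = np.mean(load)
    std = np.std(load)
    skew = stats.skew(load)
    kurt = stats.kurtosis(load, fisher=False)        # equals 3 for a Gaussian signal

    # Correlation structure summarized by a Welch estimate of the PSD
    freqs, psd = signal.welch(load, fs=fs, nperseg=4096)
    print(mean, std, skew, kurt, psd.shape)
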
Technical Paper

A Methodology of Design for Fatigue Using an Accelerated Life Testing Approach with Saddlepoint Approximation

2019-04-02
2019-01-0159
We present an Accelerated Life Testing (ALT) methodology along with a design-for-fatigue approach, using Gaussian or non-Gaussian excitations. The accuracy of fatigue life prediction at nominal loading conditions is affected by model and material uncertainty. This uncertainty is reduced by performing tests at a higher loading level, resulting in a reduction in test duration. Based on the data obtained from experiments, we formulate an optimization problem to calculate the Maximum Likelihood Estimator (MLE) values of the uncertain model parameters. In our proposed ALT method, we lift all assumptions on the type of life distribution or the stress-life relationship and use the Saddlepoint Approximation (SPA) method to calculate the fatigue life Probability Density Functions (PDFs).
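The paper deliberately avoids fixing a life distribution; purely to illustrate the MLE step, the sketch below fits an assumed two-parameter Weibull model to synthetic accelerated-test failure times:

    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import weibull_min

    # Synthetic failure times from a high-load test (illustrative data only)
    rng = np.random.default_rng(1)
    failures = weibull_min.rvs(c=2.0, scale=500.0, size=30, random_state=rng)

    # Negative log-likelihood of an assumed two-parameter Weibull life model
    def neg_log_like(theta):
        shape, scale = theta
        if shape <= 0 or scale <= 0:
            return np.inf
        return -np.sum(weibull_min.logpdf(failures, c=shape, scale=scale))

    mle = minimize(neg_log_like, x0=[1.0, float(np.mean(failures))], method="Nelder-Mead")
    print(mle.x)    # MLE values of the uncertain model parameters
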
Journal Article

A New Metamodeling Approach for Time-Dependent Reliability of Dynamic Systems with Random Parameters Excited by Input Random Processes

2014-04-01
2014-01-0717
We propose a new metamodeling method to characterize the output (response) random process of a dynamic system with random parameters, excited by input random processes. The metamodel can then be used to efficiently estimate the time-dependent reliability of a dynamic system using analytical or simulation-based methods. The metamodel is constructed by decomposing the input random processes using principal components or wavelets and then using a few simulations to estimate the distributions of the decomposition coefficients. A similar decomposition is also performed on the output random process. A kriging model is then established between the input and output decomposition coefficients and subsequently used to quantify the output random process corresponding to a realization of the input random parameters and random processes. What distinguishes our approach from others in metamodeling is that the system input is not deterministic but random.
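A condensed sketch of the construction, assuming scikit-learn and substituting principal component analysis of sampled trajectories for the decomposition step; the "simulation" mapping input to output trajectories is a synthetic stand-in:

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.gaussian_process import GaussianProcessRegressor

    # Synthetic stand-in for a dynamic system: each input trajectory produces
    # an output trajectory (here a simple running-sum response).
    rng = np.random.default_rng(2)
    n_runs, nt = 50, 200
    X = rng.standard_normal((n_runs, nt))                       # input process realizations
    Y = np.cumsum(X, axis=1) / np.sqrt(np.arange(1, nt + 1))    # assumed simulation output

    # Decompose input and output processes into a few dominant components
    pca_in = PCA(n_components=5).fit(X)
    pca_out = PCA(n_components=5).fit(Y)
    A = pca_in.transform(X)                   # input decomposition coefficients
    B = pca_out.transform(Y)                  # output decomposition coefficients

    # Kriging (Gaussian process) map between input and output coefficients
    gp = GaussianProcessRegressor().fit(A, B)

    # Quantify the output trajectory for a new input realization without re-simulating
    x_new = rng.standard_normal((1, nt))
    y_pred = pca_out.inverse_transform(gp.predict(pca_in.transform(x_new)))
    print(y_pred.shape)
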
Journal Article

A Nonparametric Bootstrap Approach to Variable-size Local-domain Design Optimization and Computer Model Validation

2012-04-16
2012-01-0226
Design optimization often relies on computational models, which are subjected to a validation process to ensure their accuracy. Because validation of computer models in the entire design space can be costly, a recent approach was proposed where design optimization and model validation were concurrently performed using a sequential approach with both fixed and variable-size local domains. The variable-size approach used parametric distributions such as Gaussian to quantify the variability in test data and model predictions, and a maximum likelihood estimation to calibrate the prediction model. Also, a parametric bootstrap method was used to size each local domain. In this article, we generalize the variable-size approach by not assuming any particular distribution, such as Gaussian. A nonparametric bootstrap methodology is instead used to size the local domains. We expect its generality to be useful in applications where distributional assumptions are difficult to verify or are not met at all.
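A minimal sketch of the nonparametric bootstrap step, assuming NumPy and a small set of made-up test-minus-prediction errors; only the resampling logic reflects the description above:

    import numpy as np

    # Illustrative validation errors (test minus prediction) in the current local domain
    rng = np.random.default_rng(3)
    errors = rng.normal(loc=0.5, scale=2.0, size=12)

    # Nonparametric bootstrap of the mean error: resample with replacement,
    # with no Gaussian (or other) distributional assumption.
    n_boot = 5000
    boot_means = np.array([
        rng.choice(errors, size=errors.size, replace=True).mean()
        for _ in range(n_boot)
    ])

    # Percentile interval on the model bias; its width can drive the local-domain size
    lo, hi = np.percentile(boot_means, [2.5, 97.5])
    print(lo, hi)
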
Journal Article

A Re-Analysis Methodology for System RBDO Using a Trust Region Approach with Local Metamodels

2010-04-12
2010-01-0645
A simulation-based, system reliability-based design optimization (RBDO) method is presented that can handle problems with multiple failure regions and correlated random variables. Copulas are used to represent the correlation. The method uses a Probabilistic Re-Analysis (PRRA) approach in conjunction with a trust-region optimization approach and local metamodels covering each trust region. PRRA calculates very efficiently the system reliability of a design by performing a single Monte Carlo (MC) simulation per trust region. Although PRRA is based on MC simulation, it calculates “smooth” sensitivity derivatives, therefore allowing the use of a gradient-based optimizer. The PRRA method is based on importance sampling. It provides accurate results if the support of the sampling PDF contains the support of the joint PDF of the input random variables. The sequential, trust-region optimization approach satisfies this requirement.
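A toy illustration of the importance-sampling reuse that makes the re-analysis "smooth": one set of samples drawn from a fixed sampling PDF is reweighted to estimate the failure probability at nearby designs. The one-dimensional limit state, distributions, and threshold are all assumptions:

    import numpy as np
    from scipy.stats import norm

    # Draw once from a wide sampling PDF q(x); reuse the samples for every design
    rng = np.random.default_rng(4)
    x = norm.rvs(loc=0.0, scale=1.5, size=200_000, random_state=rng)
    failed = x >= 3.0                         # illustrative failure event: response exceeds 3

    def pf_at_design(design_mean):
        # Reweight by f(x)/q(x), where f is the input PDF implied by this design
        w = norm.pdf(x, loc=design_mean, scale=1.0) / norm.pdf(x, loc=0.0, scale=1.5)
        return np.mean(failed * w)

    # Failure-probability estimates vary smoothly across nearby designs
    print([pf_at_design(m) for m in (0.0, 0.2, 0.4)])
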
Technical Paper

A Reliability-Based Robust Design Methodology

2005-04-11
2005-01-0811
Mathematical optimization plays an important role in engineering design, leading to greatly improved performance. Deterministic optimization, however, can lead to undesired choices because it neglects input and model uncertainty. Reliability-based design optimization (RBDO) and robust design improve optimization by considering uncertainty. A design is called reliable if it meets all performance targets in the presence of variation/uncertainty and robust if it is insensitive to variation/uncertainty. Ultimately, a design should be optimal, reliable, and robust. Usually, some of the deterministic optimality is traded off in order for the design to be reliable and/or robust. This paper describes the state of the art in assessing reliability and robustness in engineering design and proposes a new unifying formulation. The principles of deterministic optimality, reliability and robustness are first defined.
Journal Article

A Simulation and Optimization Methodology for Reliability of Vehicle Fleets

2011-04-12
2011-01-0725
Understanding reliability is critical in design, maintenance and durability analysis of engineering systems. A reliability simulation methodology is presented in this paper for vehicle fleets using limited data. The method can be used to estimate the reliability of non-repairable as well as repairable systems. It can optimally allocate, based on a target system reliability, individual component reliabilities using a multi-objective optimization algorithm. The algorithm establishes a Pareto front that can be used for optimal tradeoff between reliability and the associated cost. The method uses Monte Carlo simulation to estimate the system failure rate and reliability as a function of time. The probability density functions (PDF) of the time between failures for all components of the system are estimated using either limited data or a user-supplied MTBF (mean time between failures) and its coefficient of variation.
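A compact Monte Carlo sketch in the spirit of the methodology, for a three-component series system with assumed Weibull times to failure; the shape/scale values, sample size, and horizon are illustrative, not fleet data:

    import numpy as np

    rng = np.random.default_rng(5)
    components = [(1.5, 800.0), (1.2, 1200.0), (2.0, 600.0)]   # assumed (shape, scale) in hours
    n_sims, horizon = 100_000, 1000.0

    # Sample a time to first failure for every component in every simulated vehicle
    ttf = np.column_stack([scale * rng.weibull(shape, n_sims) for shape, scale in components])
    system_ttf = ttf.min(axis=1)              # series system fails at the first component failure

    # System reliability as a function of time
    t = np.linspace(0.0, horizon, 101)
    reliability = np.array([(system_ttf > ti).mean() for ti in t])
    print(reliability[[0, 50, 100]])          # R(0), R(500 h), R(1000 h)
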
Journal Article

A Subdomain Approach for Uncertainty Quantification of Long Time Horizon Random Processes

2023-04-11
2023-01-0083
This paper addresses the uncertainty quantification of time-dependent problems excited by random processes represented by a Karhunen-Loeve (KL) expansion. The latter expresses a random process as a series of terms involving the dominant eigenvalues and eigenfunctions of the process covariance matrix, weighted by samples of uncorrelated standard normal random variables. For many engineering applications, such as random vibrations, durability or fatigue, a long time horizon is required for meaningful results. In this case, however, a large number of KL terms is needed, resulting in a very high computational effort for uncertainty propagation. This paper presents a new approach to generate time trajectories (sample functions) of a random process using the KL expansion when the time horizon (duration) is much larger than the process correlation length.
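A short sketch of generating KL trajectories on a modest horizon, assuming NumPy and an exponential covariance with correlation length lc; the covariance model and all numbers are illustrative (the paper's contribution concerns horizons much longer than lc):

    import numpy as np

    # Discretized covariance matrix of a zero-mean stationary process
    nt, T, lc, sigma = 200, 10.0, 1.0, 1.0
    t = np.linspace(0.0, T, nt)
    cov = sigma**2 * np.exp(-np.abs(t[:, None] - t[None, :]) / lc)

    # Dominant eigenvalues/eigenvectors of the covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]
    k = 20                                    # number of retained KL terms
    lam, phi = eigvals[order[:k]], eigvecs[:, order[:k]]

    # One trajectory: eigenfunctions weighted by sqrt(eigenvalue) times standard normals
    rng = np.random.default_rng(6)
    trajectory = phi @ (np.sqrt(lam) * rng.standard_normal(k))
    print(trajectory.shape)
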
Technical Paper

A Time-Dependent Reliability Analysis Method using a Niching Genetic Algorithm

2007-04-16
2007-01-0548
A reliability analysis method is presented for time-dependent systems under uncertainty. A level-crossing problem is considered where the system fails if its maximum response exceeds a specified threshold. The proposed method uses a double-loop optimization algorithm. The inner loop calculates the maximum response in time for a given set of random variables, transforming a time-dependent problem into a time-independent one. A time integration method is used to calculate the response at discrete times. For each sample function of the response random process, the maximum response is found using a global-local search method consisting of a genetic algorithm (GA) and a gradient-based optimizer. This dynamic response usually exhibits multiple peaks and crosses the allowable response level to form a set of complex limit states, which lead to multiple most probable points (MPPs).
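A stripped-down sketch of the double loop, with a dense time grid standing in for the GA plus gradient search of the inner loop and a closed-form damped oscillation standing in for the time-integrated response; the distributions and threshold are assumptions:

    import numpy as np

    rng = np.random.default_rng(7)
    t = np.linspace(0.0, 10.0, 2000)
    threshold = 1.8

    def max_response(amplitude, freq, damping):
        response = amplitude * np.exp(-damping * t) * np.sin(2 * np.pi * freq * t)
        return response.max()                 # inner loop: worst response over time

    # Outer loop: sample the random variables and check the level crossing
    n_outer = 5_000
    A = rng.normal(1.5, 0.2, n_outer)         # assumed random amplitude
    f = rng.normal(1.0, 0.1, n_outer)         # assumed random frequency
    c = rng.normal(0.05, 0.01, n_outer)       # assumed random damping

    failures = sum(max_response(a, fi, ci) > threshold for a, fi, ci in zip(A, f, c))
    print(failures / n_outer)                 # estimated probability of exceeding the threshold
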
Journal Article

A Variable-Size Local Domain Approach to Computer Model Validation in Design Optimization

2011-04-12
2011-01-0243
A common approach to the validation of simulation models focuses on validation throughout the entire design space. A more recent methodology validates designs as they are generated during a simulation-based optimization process. The latter method relies on validating the simulation model in a sequence of local domains. To improve its computational efficiency, this paper proposes an iterative process where the size and shape of local domains at the current step are determined from a parametric bootstrap methodology involving maximum likelihood estimators of unknown model parameters from the previous step. Validation is carried out in the local domain at each step. The iterative process continues until the local domain does not change from iteration to iteration during the optimization process, ensuring that a converged design optimum has been obtained.
Technical Paper

An Analytical Investigation of the Crankshaft-Flywheel Bending Vibrations for a V6 Engine

1995-05-01
951276
High vibration levels at the rear bearing cap and oil pump were observed in dyno tests for a particular design of a V6 engine at a rated speed of 4800 r/min. It was found experimentally that the crankshaft-flywheel assembly had a bending resonance at 240 Hz which was excited at around 4800 r/min by 3rd order forces on the crankshaft. A newly developed crankshaft system model (CRANKSYM) was used to analytically verify the above finding and propose possible solutions to the problem. CRANKSYM can perform a coupled analysis among the crankshaft structural dynamics, main bearing hydrodynamics and engine block flexibility. It considers the flywheel dynamics (including the gyroscopic effect), belt loads, crankshaft “bent” and block misboring, and the anisotropy of the block flexibility as seen from a rotating crankshaft. It can also calculate the dynamic stresses on the crankshaft throughout the whole engine cycle. A brief description of CRANKSYM is given in the paper.
Journal Article

An Efficient Method to Calculate the Failure Rate of Dynamic Systems with Random Parameters Using the Total Probability Theorem

2015-04-14
2015-01-0425
Using the total probability theorem, we propose a method to calculate the failure rate of a linear vibratory system with random parameters excited by stationary Gaussian processes. The response of such a system is non-stationary because of the randomness of the input parameters. A space-filling design, such as optimal symmetric Latin hypercube sampling or maximin, is first used to sample the input parameter space. For each design point, the output process is stationary and Gaussian. We present two approaches to calculate the corresponding conditional probability of failure. A Kriging metamodel is then created between the input parameters and the output conditional probabilities allowing us to estimate the conditional probabilities for any set of input parameters. The total probability theorem is finally applied to calculate the time-dependent probability of failure and the failure rate of the dynamic system. The proposed method is demonstrated using a vibratory system.
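A compressed sketch of the three ingredients, with a deliberately simple conditional model: given parameters (m, s), the stationary response is treated as N(m, s) and "failure" is instantaneous exceedance of a fixed threshold rather than a true first-passage event. The parameter ranges and threshold are assumptions:

    import numpy as np
    from scipy.stats import qmc, norm
    from sklearn.gaussian_process import GaussianProcessRegressor

    threshold = 3.0

    # Space-filling design over the random parameters (mean m, standard deviation s)
    sampler = qmc.LatinHypercube(d=2, seed=8)
    pts = qmc.scale(sampler.random(30), [0.0, 0.5], [1.0, 1.5])

    # Conditional probability of failure at each design point (closed form here)
    cond_pf = np.array([norm.sf(threshold, loc=m, scale=s) for m, s in pts])

    # Kriging metamodel between parameters and (log) conditional probabilities
    gp = GaussianProcessRegressor().fit(pts, np.log(cond_pf))

    # Total probability theorem: average the conditional probabilities over the
    # (assumed uniform) distribution of the random parameters
    rng = np.random.default_rng(8)
    params = np.column_stack([rng.uniform(0.0, 1.0, 10_000), rng.uniform(0.5, 1.5, 10_000)])
    print(np.mean(np.exp(gp.predict(params))))
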
Technical Paper

An Efficient Possibility-Based Design Optimization Method for a Combination of Interval and Random Variables

2007-04-16
2007-01-0553
Reliability-based design optimization accounts for variation. However, it assumes that statistical information is available in the form of fully defined probabilistic distributions. This is not true for a variety of engineering problems where uncertainty is usually given in terms of interval ranges. In this case, interval analysis or possibility theory can be used instead of probability theory. This paper shows how possibility theory can be used in design and presents a computationally efficient sequential optimization algorithm. The algorithm handles problems with only uncertain design variables and parameters, or with a combination of random and uncertain ones. It consists of a sequence of cycles composed of a deterministic design optimization followed by a set of worst-case reliability evaluation loops. A crank-slider mechanism example demonstrates the accuracy and efficiency of the proposed sequential algorithm.
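A tiny sketch of one worst-case evaluation over interval variables, the kind of check performed inside each cycle; the interval bounds and the constraint function are illustrative, and the corner search assumes a well-behaved (e.g., monotone) constraint:

    from itertools import product

    # Interval (uncertain) design parameters with lower/upper bounds (assumed values)
    intervals = [(0.9, 1.1), (4.5, 5.5)]

    def constraint(p1, p2):
        return 10.0 - p1 * p2                 # design is feasible when g >= 0

    # Worst case over the interval box, evaluated at its corners
    worst = min(constraint(*corner) for corner in product(*intervals))
    print(worst, worst >= 0)
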
Technical Paper

An Efficient Re-Analysis Methodology for Vibration of Large-Scale Structures

2007-05-15
2007-01-2326
Finite element analysis is a well-established methodology in structural dynamics. However, optimization and/or probabilistic studies can be prohibitively expensive because they require repeated FE analyses of large models. Various reanalysis methods have been proposed in order to efficiently calculate the dynamic response of a structure after a baseline design has been modified, without re-solving the full model for the new design. The parametric reduced-order modeling (PROM) and the combined approximation (CA) methods are two re-analysis methods which can handle large model parameter changes in a relatively efficient manner. Although both methods are promising by themselves, they cannot handle large FE models with many degrees of freedom (e.g., 100,000) and a large number of design parameters (e.g., 50), which are common in practice. In this paper, the advantages and disadvantages of the PROM and CA methods are first discussed in detail.
Journal Article

An Improved Reanalysis Method Using Parametric Reduced Order Modeling for Linear Dynamic Systems

2016-04-05
2016-01-1318
Finite element analysis is a standard tool for deterministic or probabilistic design optimization of dynamic systems. The optimization process requires repeated eigenvalue analyses which can be computationally expensive. Several reanalysis techniques have been proposed to reduce the computational cost, including Parametric Reduced Order Modeling (PROM), Combined Approximations (CA), and the Modified Combined Approximations (MCA) method. Although the cost of reanalysis is substantially reduced, it can still be high for models with a large number of degrees of freedom and a large number of design variables. Reanalysis methods use a basis composed of eigenvectors from both the baseline and the modified designs, which are in general linearly dependent. To eliminate the linear dependency and improve accuracy, Gram-Schmidt orthonormalization is employed, which is itself costly.
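For reference, a plain NumPy sketch of classical Gram-Schmidt orthonormalization applied to such a basis (random columns stand in for the combined baseline and modified-design eigenvectors); the paper's point is that even this step becomes expensive for large models:

    import numpy as np

    rng = np.random.default_rng(9)
    basis = rng.standard_normal((1000, 12))   # columns play the role of the combined eigenvector basis

    def gram_schmidt(V, tol=1e-10):
        Q = []
        for v in V.T:
            for q in Q:
                v = v - (q @ v) * q           # remove components along previous vectors
            norm = np.linalg.norm(v)
            if norm > tol:                    # discard (nearly) linearly dependent vectors
                Q.append(v / norm)
        return np.column_stack(Q)

    Q = gram_schmidt(basis)
    print(np.allclose(Q.T @ Q, np.eye(Q.shape[1])))   # orthonormal reduced basis
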