
Search Results

Journal Article

From Model Validation to Reliability Assessment: Comments on Non-Deterministic Approaches (NDA)

2009-04-20
2009-01-0569
This paper is a discussion of topics presented by the author at the Panel Session “Evaluation of Studies on Non-Deterministic Approaches (NDA) for Complex Systems” at the 2007 SAE World Congress. The emphasis herein is on issues in conducting model-based reliability assessments by combining verification and validation (V&V) of the model with a process compatible with the traditional reliability assessments conducted as part of Reliability Based Design Optimization (RBDO). Formulations are presented and simplified to isolate each of the terms in V&V and to make each term compatible with subsequent RBDO while still accommodating non-deterministic design situations. The paper concludes with overall issues regarding V&V and reliability, and compares this combined method with other methods in use in the community and discussed at a follow-on SAE panel in 2008.
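For context, the reliability quantity that an RBDO process constrains is typically a probability of failure over uncertain inputs. The sketch below is purely illustrative and is not the paper's formulation: a brute-force Monte Carlo estimate of P(g(X) < 0) for a made-up limit state with placeholder input distributions.

```python
# Minimal sketch (not the paper's formulation): a Monte Carlo estimate of the
# probability of failure P(g(X) < 0), the basic reliability quantity that an
# RBDO loop constrains. The limit-state function and input distributions
# below are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(0)

def limit_state(x):
    # g > 0 means "safe"; here: capacity minus demand for two normal inputs
    capacity, demand = x[..., 0], x[..., 1]
    return capacity - demand

n = 100_000
samples = np.column_stack([
    rng.normal(10.0, 1.0, n),   # capacity ~ N(10, 1)
    rng.normal(7.0, 1.5, n),    # demand   ~ N(7, 1.5)
])
p_fail = np.mean(limit_state(samples) < 0.0)
print(f"estimated probability of failure: {p_fail:.4f}")
```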
Technical Paper

Verification & Validation: Process and Levels Leading to Qualitative or Quantitative Validation Statements

2004-03-08
2004-01-1752
The concepts of Verification and Validation (V&V) can be summed up succinctly, at the risk of oversimplification, by saying that “verification is doing things right” and “validation is doing the right thing”. In the world of the Finite Element Method (FEM) and computational analysis, it is sometimes said that “verification means solving the equations right” and “validation means solving the right equations”. In other words, if one intends to give an answer to the expression “2+2=”, then one must run the resulting code to assure that the answer “4” results. However, if the nature of the physics or engineering problem being addressed with this code is multiplicative rather than additive, then even though Verification may succeed (2+2=4, etc.), Validation will fail because the equations coded are not those needed to address the real-world (multiplicative) problem.
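A toy illustration of the distinction in code (an assumed example, not from the paper): verification checks that the coded equation is solved correctly, while validation checks that the coded equation is the right one for the underlying problem. Note that at the abstract's chosen point the additive and multiplicative answers happen to coincide (2+2 = 2×2 = 4), so a single validation comparison can mislead.

```python
# Toy illustration of the abstract's "2+2" distinction (an assumed example,
# not from the paper): verification checks the code solves its equation
# correctly; validation checks that equation matches the real phenomenon.

def model(a, b):
    # The coded equation: addition
    return a + b

# Verification: the code reproduces the intended answer (2 + 2 = 4) -> passes
assert model(2, 2) == 4

# Validation: compare against "reality", which here is multiplicative.
# At a = b = 2 the two agree (4 == 4), so a single validation point can
# mislead; at a = b = 3 the additive model gives 6 while the multiplicative
# reality gives 9, and validation fails.
def reality(a, b):
    return a * b

for a, b in [(2, 2), (3, 3)]:
    print(a, b, "model:", model(a, b), "reality:", reality(a, b))
```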
Technical Paper

Design for Six Sigma with Critical-To-Quality Metrics for Research Investments

2006-04-03
2006-01-0995
Design for Six Sigma (DFSS) has evolved as a worthy predecessor to the application of Six-Sigma principles to production, process control, and quality. At Lawrence Livermore National Laboratory (LLNL), we are exploring the interrelation of our current research, development, and design safety standards as they relate to the principles of DFSS and Six-Sigma. We have had success in prioritizing research and design using a quantitative scalar metric for value, so we further explore the use of scalar metrics to represent the outcome of our use of the DFSS process. We use the design of an automotive component as an example of combining DFSS metrics into a scalar decision quantity. We then extend this concept to a high-priority personnel-safety example representing work that is toward the mature end of DFSS and begins the transition into Six-Sigma for safety assessments in a production process.
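One common way to fold several critical-to-quality (CTQ) metrics into a single scalar decision quantity is a normalized weighted sum; the sketch below is only illustrative, and the metric names, weights, and scores are invented rather than taken from the LLNL work.

```python
# Illustrative sketch only: a normalized, weighted sum is one simple way to
# combine critical-to-quality (CTQ) metrics into a single scalar decision
# quantity. The metric names, weights, and scores below are invented for
# illustration and are not the LLNL metric described in the paper.

ctq_scores = {          # each score normalized to [0, 1], higher is better
    "crash_margin": 0.8,
    "mass_target": 0.6,
    "unit_cost": 0.7,
    "process_capability": 0.9,
}
weights = {             # weights sum to 1.0
    "crash_margin": 0.4,
    "mass_target": 0.2,
    "unit_cost": 0.2,
    "process_capability": 0.2,
}

scalar_value = sum(weights[k] * ctq_scores[k] for k in ctq_scores)
print(f"scalar decision quantity: {scalar_value:.2f}")   # 0.76
```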
Technical Paper

Energy Absorption in Aluminum Extrusions for a Spaceframe Chassis

1995-04-01
951079
This work describes the design, finite-element analysis, and verification performed by LLNL and Kaiser Aluminum for the prototype CALSTART Running Chassis, a purpose-built electric vehicle. Component-level studies, along with our previous experimental and finite-element work, provided the confidence to study the crashworthiness of a complete aluminum spaceframe. Effects of rail geometry, size, and thickness were studied in order to achieve a controlled crush of the front-end structure. These studies included the performance of the spaceframe itself and the additive effects of the powertrain cradle, the powertrain (motor/controller in this case), and the suspension. Various design iterations for frontal impact at moderate and high speed are explored.
Technical Paper

Use of Non-Quadratic Yield Surfaces in Design of Optimal Deep-Draw Blank Geometry

1996-02-01
960597
Planar anisotropy in the deep-drawing of sheet metal can lead to the formation of ears in cylindrical cups and to undesirable metal flow in the blankholder in the general case. For design analysis in non-linear finite-element codes, this anisotropy is characterized by an appropriate yield surface, which is then implemented into codes such as DYNA3D. The quadratic Hill yield surface offers a relatively straightforward implementation and can be formulated to be invariant to the coordinate system. Non-quadratic yield surfaces can provide more realistic strength or strain-increment ratios, but they may not provide invariance and thus demand certain approximations. Forms due to Hosford and Barlat et al. have been shown to address the earing phenomenon more accurately. In this work, these non-quadratic yield surfaces are used to determine the optimal blank shape for cups and other parts drawn from ferrous and other metal blanks with planar anisotropy.
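As a concrete, hedged example of the kind of non-quadratic surface the abstract refers to, the sketch below evaluates the Hosford (1979) plane-stress criterion with normal anisotropy R, |σ1|^a + |σ2|^a + R|σ1 − σ2|^a = (1 + R)Y^a; setting a = 2 recovers the quadratic Hill form with normal anisotropy, while a ≈ 6 (BCC) or a ≈ 8 (FCC) gives the higher-exponent surfaces mentioned above. The parameter values are illustrative, not taken from the paper.

```python
# Hedged sketch of a non-quadratic yield function: the Hosford (1979)
# plane-stress criterion with normal anisotropy R,
#     |s1|^a + |s2|^a + R*|s1 - s2|^a = (1 + R) * Y^a.
# Exponent a = 2 reduces to the quadratic Hill form; a ~ 6 (BCC) or a ~ 8
# (FCC) gives higher-exponent surfaces. Values below are illustrative only.

def hosford_yield(s1, s2, R=1.6, a=8.0, Y=1.0):
    """Return f < 0 inside the yield surface, f = 0 on it, f > 0 outside."""
    lhs = abs(s1) ** a + abs(s2) ** a + R * abs(s1 - s2) ** a
    return lhs - (1.0 + R) * Y ** a

# Uniaxial stress along direction 1 at the yield stress Y lies on the surface:
print(round(hosford_yield(1.0, 0.0), 6))   # ~0.0
```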
Technical Paper

The Relative Sensitivity of Formability to Anisotropy

1997-02-24
970440
This work compares the relative importance of material anisotropy in sheet forming against other material and process variables. The comparison is made quantitative by using normalized dependencies of the depth to failure (the depth at which the forming limit is reached) on various measures of anisotropy, as well as on strain and rate sensitivity, friction, and tooling. Comparisons are made for a variety of forming processes examined previously in the literature, as well as for two complex stampings examined in this work. The examples cover a range from nearly pure draw to nearly pure stretch situations, and show that for materials following a quadratic yield criterion, anisotropy is among the most sensitive parameters influencing formability. For materials following higher-exponent yield criteria, the dependency is milder but is still of the order of most other process parameters.
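A minimal sketch of the “normalized dependency” idea (the response function and parameter values below are placeholders, not the paper's models): a dimensionless, logarithmic sensitivity S = (p/d)(∂d/∂p), estimated by finite differences, lets parameters with different units, such as the R-value, friction coefficient, and hardening exponent, be compared on a common footing.

```python
# Minimal sketch of a "normalized dependency": a dimensionless sensitivity
# S = (p / d) * (dd/dp), estimated with a central finite difference in log
# space. The depth_to_failure function is a made-up placeholder, NOT a real
# forming model, chosen only so each parameter has a known sensitivity.
import math

def depth_to_failure(R=1.0, mu=0.10, n_hardening=0.20):
    # Placeholder response: monotone in each input, arbitrary constants.
    return 30.0 * (R ** 0.35) * (n_hardening ** 0.25) / (1.0 + 4.0 * mu)

def normalized_sensitivity(param, base, rel_step=0.05, **kwargs):
    lo = dict(kwargs, **{param: base * (1 - rel_step)})
    hi = dict(kwargs, **{param: base * (1 + rel_step)})
    d_lo, d_hi = depth_to_failure(**lo), depth_to_failure(**hi)
    # S ~ d(ln d) / d(ln p) over the bracketing interval
    return (math.log(d_hi) - math.log(d_lo)) / math.log((1 + rel_step) / (1 - rel_step))

for p, v in [("R", 1.0), ("mu", 0.10), ("n_hardening", 0.20)]:
    print(p, round(normalized_sensitivity(p, v), 3))
```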
Technical Paper

Solution Verification Linked to Model Validation, Reliability, and Confidence

2005-04-11
2005-01-1774
The implementation of Verification and Validation (V&V) of a computational model of a physical system can be simply described as a 4-step process, and one of those steps is Solution Verification. Solution Verification is the process of assuring that a model approximating a physical reality with a discretized continuum (e.g. finite-element) code converges, in each discretized domain, to an answer for the quantity of validation interest. The modeling reality is that we are often using a discretized code precisely because the problem is neither smooth nor continuous, whether spatially (e.g. contact and impact) or in the relevant physics (e.g. shocks, melting, etc.). The typical result is a non-monotonic convergence plot that can lead to spurious conclusions about the order of convergence and leaves no straightforward means of estimating residual error or uncertainty.
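For reference, the standard grid-convergence bookkeeping behind such statements uses solutions on three systematically refined grids to estimate an observed order of convergence and a Richardson-extrapolated value. The sketch below uses made-up numbers and a constant refinement ratio, and, as the abstract notes, it breaks down exactly when convergence is non-monotonic.

```python
# Hedged sketch of standard grid-convergence bookkeeping: with solutions
# f1 (finest), f2, f3 (coarsest) on grids refined by a constant ratio r,
#     p       = ln((f3 - f2) / (f2 - f1)) / ln(r)          (observed order)
#     f_exact ~ f1 + (f1 - f2) / (r**p - 1)                 (Richardson)
# The numbers below are illustrative; non-smooth physics (contact, shocks)
# often makes (f3 - f2)/(f2 - f1) negative or erratic, and then p is not
# meaningful, which is the non-monotonic behavior the abstract describes.
import math

def observed_order(f1, f2, f3, r):
    """f1 = finest grid, f3 = coarsest; r = constant refinement ratio."""
    ratio = (f3 - f2) / (f2 - f1)
    if ratio <= 0.0:
        return None          # non-monotonic convergence: order undefined
    return math.log(ratio) / math.log(r)

def richardson_extrapolate(f1, f2, r, p):
    return f1 + (f1 - f2) / (r ** p - 1.0)

f1, f2, f3, r = 0.9712, 0.9650, 0.9400, 2.0   # made-up peak-response values
p = observed_order(f1, f2, f3, r)
print("observed order:", round(p, 3))
print("extrapolated value:", round(richardson_extrapolate(f1, f2, r, p), 5))
```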