Search Results

Viewing 1 to 9 of 9
Technical Paper

Effects of Seat and Sitter Dimensions on Pressure Distribution in Automotive Seats

2017-03-28
2017-01-1390
Seat fit is characterized by the spatial relationship between the seat and the vehicle occupant’s body. Seat surface pressure distribution is one of the best available quantitative measures of this relationship. However, the relationships between sitter attributes, pressure, and seat fit have not been well established. The objective of this study is to model seat pressure distribution as a function of the dimensions of the seat and the occupant’s body. A laboratory study was conducted using 12 production driver seats from passenger vehicles and light trucks. Thirty-eight men and women sat in each seat in a driving mockup. Seat surface pressure distribution was measured on the seatback and cushion. Relevant anthropometric dimensions were recorded for each participant and standardized dimensions based on SAE J2732 (2008) were acquired for each test seat.
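The abstract describes modeling pressure metrics as a function of seat and occupant dimensions. A minimal sketch of that kind of regression is shown below; the file and column names are hypothetical, not the study's actual variables.

```python
# Minimal sketch, not the authors' code: regress a pressure metric (e.g. peak
# cushion pressure) on seat and sitter dimensions. All names are hypothetical.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("seat_pressure_study.csv")   # hypothetical combined data file

# Hypothetical predictors: two J2732-style seat dimensions plus two anthropometric measures.
X = sm.add_constant(df[["cushion_width_mm", "cushion_angle_deg",
                        "hip_breadth_mm", "body_mass_kg"]])
y = df["peak_cushion_pressure_kpa"]

model = sm.OLS(y, X).fit()
print(model.summary())                        # coefficients link dimensions to pressure
```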
Technical Paper

Statistical Modeling of Automotive Seat Shapes

2016-04-05
2016-01-1436
Automotive seats are commonly described by one-dimensional measurements, including those documented in SAE J2732. However, 1-D measurements provide minimal information on seat shape. The goal of this work was to develop a statistical framework for analyzing and modeling the surface shapes of seats, using techniques similar to those used to model human body shapes. The 3-D contours of twelve driver seats from a pickup truck and sedans were scanned and aligned, and 408 landmarks were identified using a semi-automatic process. A template mesh of 18,306 vertices was morphed to match each scan at the landmark positions, and the remaining nodes were automatically adjusted to match the scanned surface. A principal component (PC) analysis was performed on the resulting homologous meshes. Each seat was uniquely represented by a set of PC scores; 10 PC scores explained 95% of the total variance. This new shape description has many applications.
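As a rough illustration of the PCA step described above, the sketch below runs a principal component analysis over flattened homologous meshes and reconstructs a surface from PC scores; it is an assumption-based outline, not the paper's implementation.

```python
# Minimal sketch (assumptions, not the paper's code): PCA over homologous seat
# meshes. Each mesh has the same vertices, flattened to one row per seat.
import numpy as np

def fit_seat_shape_pca(meshes, var_target=0.95):
    """meshes: array of shape (n_seats, n_vertices, 3) of aligned, homologous meshes."""
    n_seats = meshes.shape[0]
    X = meshes.reshape(n_seats, -1)          # (n_seats, n_vertices*3)
    mean_shape = X.mean(axis=0)
    Xc = X - mean_shape
    # SVD of the centered data gives principal components and per-seat scores.
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    var = S**2 / (n_seats - 1)
    explained = np.cumsum(var) / var.sum()
    k = int(np.searchsorted(explained, var_target) + 1)  # e.g. ~10 PCs for 95%
    scores = U[:, :k] * S[:k]                # unique PC-score signature per seat
    return mean_shape, Vt[:k], scores

def reconstruct(mean_shape, components, scores):
    """Rebuild one seat surface (n_vertices, 3) from its PC scores."""
    flat = mean_shape + scores @ components
    return flat.reshape(-1, 3)
```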
Technical Paper

Development of an Automatic Seat-Dimension Extraction System

2016-04-05
2016-01-1429
This paper reports on the development and validation of an automated seat-dimension extraction system that can efficiently and reliably measure SAE J2732 (2008) seat dimensions from 3D seat scan data. The automated dimension-extraction process consists of four phases: (1) import 3D seat scan data along with seat reference information such as H-point location and back and cushion angles, (2) calculate the centerline and lateral cross-section lines on the imported 3D seat scan data, (3) identify landmarks on the centerline and cross-section lines based on the SAE J2732 definitions, and (4) measure seat dimensions using the identified landmarks. To validate the automated seat measurements, dimensions measured manually in a computer-aided-design (CAD) environment and those extracted automatically by the current system were compared in terms of mean discrepancy and intra- and inter-observer standard deviations (SD).
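The four-phase process lends itself to a small pipeline sketch. The example below illustrates phases 2 through 4 for a single, simplified dimension; the coordinate conventions (x forward, y lateral, z up) and all function names are assumptions rather than the system's actual API.

```python
# Minimal sketch of the four-phase structure described above; the geometry is
# deliberately simplified and all names are assumptions.
import numpy as np

def slice_near_plane(points, origin, normal, tol=2.0):
    """Return scan points within `tol` mm of a cutting plane (phase 2 helper)."""
    d = (points - origin) @ (normal / np.linalg.norm(normal))
    return points[np.abs(d) < tol]

def extract_cushion_length(scan_points, h_point):
    """Toy example of phases 2-4 for a single dimension (a cushion-length proxy)."""
    # Phase 2: centerline cross-section through the H-point (lateral normal).
    centerline = slice_near_plane(scan_points, h_point, np.array([0.0, 1.0, 0.0]))
    # Phase 3: landmark identification; here simply the forward-most cushion point.
    cushion = centerline[centerline[:, 2] < h_point[2]]   # points below H-point height
    front_edge = cushion[cushion[:, 0].argmax()]
    # Phase 4: measure the dimension from the identified landmarks.
    return float(np.linalg.norm(front_edge[[0, 2]] - h_point[[0, 2]]))
```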
Technical Paper

The Role of Visual and Manual Demand in Movement and Posture Organization

2006-07-04
2006-01-2331
The organization of upper body and gaze movements was quantified in an attempt to identify the types of task descriptors associated with the visual and manual functions of movement control. Nine subjects were asked to either read a word (high visual demand), reach to a target (low visual demand), or simultaneously read a word and reach to an object target placed just below the word (high visual demand). Similarly, the manual demand condition was either low or high, depending on the target distance from the shoulder (80% or 120% of extended arm length, respectively). Torso flexion and gaze-on-target duration showed that movements are influenced by both visual and manual demands in an interactive manner. Both torso posture and gaze movements were changed predominantly by the visual demand. These results suggest that tasks to be simulated should be described in terms of both visual and manual demand.
Technical Paper

Posture and Motion Prediction: Perspectives for Unconstrained Head Movements

2006-07-04
2006-01-2330
The relationship between motion and posture was investigated from the kinematics of unconstrained head movements. Head movements for visual gazing exhibited an initial component whose amplitude did not exceed 20.3° for target eccentricities up to 120°. This component was truncated by subsequent corrective movements whose occurrence generally increased with target eccentricity, although with large variability (R2 ≤ 0.46). The head finally stabilized at 72% of target eccentricity (R2 ≥ 0.92). These results indicate that the final head posture can be achieved through a number of loosely programmed kinematic variations. Based on these results, unconstrained head movements were simulated in the context of posture prediction for estimating the visual field.
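Read naively, the reported statistics suggest a simple way to approximate the movement pattern. The sketch below is only that reading (a gain of 0.72 and an initial component capped near 20.3°), not the paper's simulation model.

```python
# Minimal sketch, an assumption-laden reading of the reported statistics:
# final head yaw as a fixed fraction of target eccentricity, with the initial
# movement component capped near 20 degrees.
def simulate_head_yaw(target_eccentricity_deg, gain=0.72, initial_cap_deg=20.3):
    initial = min(target_eccentricity_deg, initial_cap_deg)   # initial component
    final = gain * target_eccentricity_deg                     # stabilized posture
    corrective = max(final - initial, 0.0)                     # remainder via corrections
    return initial, corrective, final

print(simulate_head_yaw(120.0))   # e.g. (20.3, 66.1, 86.4) degrees
```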
Technical Paper

Modeling the Coordinated Movements of the Head and Hand Using Differential Inverse Kinematics

2004-06-15
2004-01-2178
Hand reach movements for manual work, vehicle operation, and manipulation of controls are planned and guided by visual images actively captured through eye and head movements. It is hypothesized that reach movements are based on the coordination of multiple subsystems that pursue the individual goals of visual gaze and manual reach. In the present study, shared-control coordination was simulated in reach movements modeled using differential inverse kinematics. An 8-DOF model represented the torso-neck-head link (visual subsystem), and a 9-DOF model represented the torso-upper-limb link (manual subsystem). Joint angles were predicted in the velocity domain via a pseudo-inverse Jacobian that weighted each link according to its contribution to the movement. A secondary objective function was introduced to enable both subsystems to achieve their respective movement goals in a coordinated manner by exploiting redundant degrees of freedom.
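A weighted pseudo-inverse with a null-space secondary objective is a standard way to realize the scheme described above. The sketch below illustrates the idea in generic form; the Jacobian, weights, and gains are placeholders, not the paper's 8-DOF and 9-DOF models.

```python
# Minimal sketch of velocity-domain differential inverse kinematics with a
# weighted (damped) pseudo-inverse and a null-space secondary objective.
import numpy as np

def dls_weighted_pinv(J, W, damping=1e-3):
    """Weighted, damped pseudo-inverse of a task Jacobian J for joint-weight matrix W."""
    Winv = np.linalg.inv(W)
    JWJt = J @ Winv @ J.T
    return Winv @ J.T @ np.linalg.inv(JWJt + damping * np.eye(J.shape[0]))

def joint_velocities(J, x_dot, W, q, q_preferred, k_null=0.5):
    """Primary task: track end-effector velocity x_dot.
    Secondary task: stay near a preferred posture, projected into the null space."""
    J_pinv = dls_weighted_pinv(J, W)
    q_dot_primary = J_pinv @ x_dot
    null_proj = np.eye(J.shape[1]) - J_pinv @ J        # null-space projector
    q_dot_secondary = k_null * (q_preferred - q)        # gradient of posture objective
    return q_dot_primary + null_proj @ q_dot_secondary
```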
Technical Paper

Evaluating the Effect of Back Injury on Shoulder Loading and Effort Perception in Hand Transfer Tasks

2004-06-15
2004-01-2137
Occupational populations have become increasingly diverse, requiring novel accommodation technologies for inclusive design. Hence, further attention is required to identify potential differences in work perception between workers with varying physical limitations. The major aim of this study was to identify differences in shoulder loading and perception of effort between a control population (C) and populations affected by chronic back pain (LBP) and spinal cord injury (SCI) in one-handed seated transfer tasks to targets. The effects of the injuries, and associated pain, are likely to produce variations in movement patterns, muscle loading and perceived effort.
Technical Paper

Prediction of Head Orientation based on the Visual Image of a Three Dimensional Space

2001-06-26
2001-01-2092
Head movements contribute to the acquisition of targets in visually guided tasks such as reaching and grasping. It has been found that head orientation is generally related to the spatial location of the visual target. Head movements in three-dimensional space are described by six degrees of freedom: translations along the x-, y-, and z-axes plus rotations about the same axes. While the control of head movement is heavily dependent upon visual perception, head movements in turn change the visual perception of the task space. In the present study we analyzed head movements in a set of driving simulation experiments. A theoretical reconstruction of the perceived task space after head movement was also modeled by statistical regression. This process included the transformation of the task space from a global, earth-fixed reference frame into a perceived space in a head-centered, head-fixed reference frame.
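The global-to-head-centered transformation can be illustrated with a small rigid-body example. The sketch below assumes a simple x-y-z rotation convention and is not the study's regression model.

```python
# Minimal sketch (an assumption, not the study's model): express a target point
# given in the earth-fixed frame in a head-centered frame, using the head's
# 6-DOF pose (translation plus rotations about the x-, y-, and z-axes).
import numpy as np

def rotation_xyz(rx, ry, rz):
    """Rotation matrix from successive rotations about x, y, z (radians)."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def to_head_frame(point_global, head_position, head_angles):
    """Map an earth-fixed point into the head-fixed frame after a head movement."""
    R = rotation_xyz(*head_angles)            # head orientation in the global frame
    return R.T @ (np.asarray(point_global) - np.asarray(head_position))
```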
Technical Paper

Head Orientation in Visually Guided Tasks

2000-06-06
2000-01-2174
Where is my head? Knowing head orientation in space is necessary to estimate the extent of the visual field in tasks requiring visual feedback, such as driving or manual materials handling. Visually guided tasks generally depend on head and eye movements for visual acquisition of the target, and head movements are particularly important when target eccentricity from the neutral reference point is large. The aim of the present work was to investigate head orientation in space during hand-pointing tasks and to model the head response. Standing subjects were required to direct their gaze at one of three targets distributed vertically in the sagittal plane. The task was performed while standing (a) with the arms next to the body, (b) holding a load in a static condition, and (c) aiming at targets with a heavy or light load held in the hands. Movements of the head and body segments were recorded with a motion capture system.