Search Results

Viewing 1 to 10 of 10
Technical Paper

Video Based Simulation of Daytime and Nighttime Rain Affecting Driver Visibility

2021-04-06
2021-01-0854
This paper presents a methodology for generating video-realistic computer-simulated rain and depicting the effect rain has on driver visibility. Rain was considered at three rain rates (light, moderate, and heavy) and in both nighttime and daytime conditions. The techniques presented in this publication rely on previously published methods of video tracking and projection mapping. Neale et al. [2004, 2016] showed how video tracking can convert two-dimensional image data from video into three-dimensional, scaled, computer-generated environments. Further, Neale et al. [2013, 2016] demonstrated that video projection mapping, when combined with video tracking, enables the production of video-realistic simulated environments in which videographic and photographic baseline footage is combined with three-dimensional computer geometry.
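The abstract does not state how the three rain rates are translated into simulated raindrops. One common starting point, assumed here purely for illustration, is the Marshall-Palmer drop-size distribution, which converts a rain rate in mm/h into a drop density that could seed a particle system:

```python
import math

def marshall_palmer_density(rain_rate_mm_h, n0=8000.0):
    """Total raindrops per cubic metre for a given rain rate (mm/h),
    from the Marshall-Palmer size distribution N(D) = N0 * exp(-lam * D).
    Integrating over all drop diameters D gives N0 / lam."""
    lam = 4.1 * rain_rate_mm_h ** -0.21   # slope parameter, mm^-1
    return n0 / lam

# Illustrative drop densities for light, moderate, and heavy rain
for label, rate in [("light", 2.5), ("moderate", 10.0), ("heavy", 50.0)]:
    print(label, round(marshall_palmer_density(rate)), "drops/m^3")
```

Heavier rain yields a flatter distribution slope and therefore more drops per unit volume, which is what a visual simulation would scale its particle count against.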
Technical Paper

Application of 3D Visualization in Modeling Wheel Stud Contact Patterns with Rotating and Stationary Surfaces

2017-03-28
2017-01-1414
When a vehicle with protruding wheel studs contacts another vehicle or object in a sideswipe configuration, its tire sidewall, rim, and wheel studs can deposit distinct geometric damage patterns onto the surfaces they contact. Prior research has demonstrated how the relative speed between the two vehicles or surfaces can be calculated through analysis of these distinct contact patterns. This paper presents a methodology for performing this analysis by visually modeling the interaction between wheel studs and various surfaces, and presents a method for automating the calculation of the relative speed between vehicles. This methodology also augments prior research by demonstrating how the visual modeling and simulation of wheel stud contact can extend to almost any surface interaction, including those with no previously published tests or with test methods that would be difficult to set up in real life.
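The paper's automated calculation is not reproduced in the abstract. A hedged sketch of one plausible kinematic relation: if a rolling wheel's studs strike the opposing surface once per stud per revolution, successive marks are separated by the relative sliding that occurs between strikes. All parameter values below are hypothetical.

```python
def relative_speed_from_stud_marks(mark_spacing_m, n_studs,
                                   tire_circumference_m, wheel_speed_mps):
    """Illustrative geometry, not the paper's published method: a rolling
    wheel's studs pass the contact zone every C / (n * v) seconds, so the
    spacing between successive marks implies delta_v = spacing * n * v / C."""
    dt = tire_circumference_m / (n_studs * wheel_speed_mps)  # time between strikes
    return mark_spacing_m / dt

# Hypothetical example: 15 cm mark spacing, 5-stud wheel, 2 m circumference,
# striking vehicle rolling at 25 m/s
dv = relative_speed_from_stud_marks(0.15, 5, 2.0, 25.0)
```

This is only a sketch of the kind of geometry such an analysis automates; the actual contact patterns in the paper are modeled visually in 3D rather than reduced to a single formula.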
Technical Paper

Nighttime Videographic Projection Mapping to Generate Photo-Realistic Simulation Environments

2016-04-05
2016-01-1415
This paper presents a methodology for generating photo-realistic computer simulation environments of nighttime driving scenarios by combining nighttime photography and videography with video tracking [1] and projection mapping [2] technologies. Nighttime driving environments contain complex lighting conditions, such as vehicle forward and signal lighting systems, street lighting, and retro-reflective markers and signage. The high dynamic range of nighttime lighting conditions makes these systems difficult to render realistically through computer-generated techniques alone. Photography and video, especially when captured with high-dynamic-range imaging, can produce realistic representations of the lighting environment. However, because video is only two-dimensional and lacks the flexibility of a three-dimensional computer-generated environment, the scenarios that can be represented are limited to the specific scenario recorded on video.
Technical Paper

Determining Position and Speed through Pixel Tracking and 2D Coordinate Transformation in a 3D Environment

2016-04-05
2016-01-1478
This paper presents a methodology for determining the position and speed of objects such as vehicles, pedestrians, or cyclists that are visible in video footage captured with only one camera. Objects are tracked in the video footage based on the change in the pixels that represent the moving object. Commercially available programs such as PFTrack™ and Adobe After Effects™ contain automated pixel-tracking features that record the position of a pixel over time, two-dimensionally, using the video's resolution as a Cartesian coordinate system. The pixel's coordinate data over time can then be transformed into three-dimensional data by ray-tracing the pixel coordinates onto three-dimensional geometry of the same scene that is visible in the background of the video footage.
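The 2D-to-3D transformation described above can be sketched as a pinhole-camera ray cast onto scene geometry. This minimal example intersects the pixel ray with a flat ground plane (z = 0); the camera pose, intrinsics, and frame rate are illustrative assumptions, not values from the paper:

```python
import numpy as np

def pixel_to_ground(u, v, fx, fy, cx, cy, R, t):
    """Cast a ray through pixel (u, v) and intersect it with the ground
    plane z = 0.  R rotates camera-frame directions into the world frame;
    t is the camera position in world coordinates."""
    d_cam = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])  # ray in camera frame
    d_world = R @ d_cam                                    # ray in world frame
    s = -t[2] / d_world[2]                                 # solve t_z + s * d_z = 0
    return t + s * d_world

# Illustrative camera: 10 m above the ground, looking straight down
R = np.array([[1, 0, 0], [0, -1, 0], [0, 0, -1]], dtype=float)
t = np.array([0.0, 0.0, 10.0])

# The same tracked pixel in two consecutive frames of 30 fps footage
p1 = pixel_to_ground(960, 540, 1000, 1000, 960, 540, R, t)
p2 = pixel_to_ground(1060, 540, 1000, 1000, 960, 540, R, t)
speed = np.linalg.norm(p2 - p1) / (1.0 / 30.0)   # metres per second
```

In practice the ray is intersected with surveyed 3D geometry of the scene rather than an idealized plane, but the principle is the same: each 2D pixel track becomes a 3D position history, and speed follows from position change over frame time.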
Technical Paper

Video Projection Mapping Photogrammetry through Video Tracking

2013-04-08
2013-01-0788
This paper examines a method for generating a scaled three-dimensional computer model of an accident scene from video footage. This method, which combines the previously published methods of video tracking and camera projection, includes automated mapping of physical evidence through rectification of each frame. Video Tracking is a photogrammetric technique for obtaining three-dimensional data from a scene using video and was described in a 2004 publication titled, “A Video Tracking Photogrammetry Technique to Survey Roadways for Accident Reconstruction” (SAE 2004-01-1221).
Technical Paper

Photogrammetric Measurement Error Associated with Lens Distortion

2011-04-12
2011-01-0286
All camera lenses contain optical aberrations as a result of the design and manufacturing processes. Lens aberrations cause distortion of the resulting image captured on film or a sensor. This distortion is inherent in all lenses because of the shape required to project the image onto film or a sensor, the materials that make up the lens, and the configuration of lenses to achieve varying focal lengths and other photographic effects. The distortion associated with lenses can cause errors to be introduced when photogrammetric techniques are used to analyze photographs of accident scenes to determine position, scale, length, and other characteristics of evidence in a photograph. This paper evaluates how lens distortion can affect images, and how photogrammetrically measuring a distorted image can result in measurement errors.
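Radial lens distortion is commonly modeled with the Brown-Conrady polynomial; the coefficient values below are illustrative, not from the paper, but the sketch shows how a distorted pixel position diverges from its ideal location and how that divergence becomes a photogrammetric measurement error:

```python
def radial_distort(x, y, k1, k2=0.0):
    """Brown-Conrady radial distortion in normalized image coordinates:
    the ideal (undistorted) point (x, y) actually images at (x_d, y_d)."""
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * factor, y * factor

# A point near the frame edge under mild barrel distortion (k1 < 0)
xu, yu = 0.8, 0.6                      # normalized coordinates, radius = 1.0
xd, yd = radial_distort(xu, yu, k1=-0.05)
err = ((xd - xu) ** 2 + (yd - yu) ** 2) ** 0.5
pixel_err = err * 1000.0               # shift at an assumed 1000 px focal length
```

Even this modest coefficient displaces an edge point by tens of pixels, which is why distortion must be removed (or modeled) before measuring evidence positions photogrammetrically.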
Technical Paper

Evaluation of Photometric Data Files for Use in Headlamp Light Distribution

2010-04-12
2010-01-0292
Computer simulation of nighttime lighting in urban environments can be complex due to the myriad of light sources present (e.g., street lamps, building lights, signage, and vehicle headlamps). In these areas, vehicle headlamps can make a significant contribution to the lighting environment [1, 2]. This contribution may need to be incorporated into a lighting simulation to accurately calculate overall light levels and to represent how the light affects the experience and quality of the environment. Within a lighting simulation, photometric files, such as the photometric standard light data file format, are often used to simulate light sources such as street lamps and exterior building lights in nighttime environments. This paper examines the validity of using these same photometric file types for the simulation of vehicle headlamps by comparing the light distribution from actual vehicle headlamps to photometric files of these same headlamps.
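At their core, photometric data files store a grid of luminous intensity (candela) values indexed by vertical and horizontal angle, which the renderer interpolates for arbitrary directions. A minimal sketch, using a toy candela table rather than a real file:

```python
import numpy as np

# Toy candela grid indexed by (vertical, horizontal) angle in degrees,
# standing in for the intensity table stored in a photometric data file.
v_angles = np.array([0.0, 5.0, 10.0])
h_angles = np.array([-10.0, 0.0, 10.0])
candela = np.array([[200.0, 400.0, 200.0],
                    [150.0, 300.0, 150.0],
                    [ 50.0, 100.0,  50.0]])

def intensity(v, h):
    """Bilinearly interpolate luminous intensity (cd) at angles (v, h)."""
    iv = int(np.clip(np.searchsorted(v_angles, v), 1, len(v_angles) - 1))
    ih = int(np.clip(np.searchsorted(h_angles, h), 1, len(h_angles) - 1))
    tv = (v - v_angles[iv - 1]) / (v_angles[iv] - v_angles[iv - 1])
    th = (h - h_angles[ih - 1]) / (h_angles[ih] - h_angles[ih - 1])
    top = candela[iv - 1, ih - 1] * (1 - th) + candela[iv - 1, ih] * th
    bot = candela[iv,     ih - 1] * (1 - th) + candela[iv,     ih] * th
    return top * (1 - tv) + bot * tv
```

The paper's question is whether this tabulated, interpolated representation reproduces a headlamp's sharply structured beam pattern as faithfully as it does the smoother distributions of street and building luminaires.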
Technical Paper

Simulating Headlamp Illumination Using Photometric Light Clusters

2009-04-20
2009-01-0110
Assessing the ability of a driver to see objects, pedestrians, or other vehicles at night is a necessary precursor to determining if that driver could have avoided a nighttime crash. The visibility of an object at night is largely due to the luminance contrast between the object and its background. This difference depends on many factors, one of which is the amount of illumination produced by a vehicle’s headlamps. This paper focuses on a method for digitally modeling a vehicle headlamp, such that the illumination produced by the headlamps can be evaluated. The paper introduces the underlying concepts and a methodology for simulating, in a computer environment, a high-beam headlamp using a computer generated light cluster. In addition, the results of using this methodology are evaluated by comparing light values measured for a real headlamp to a simulated headlamp.
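The illumination produced at a point by a cluster of photometric point sources follows the inverse-square cosine law, E = I cos(θ) / d². A minimal sketch of evaluating that sum for a two-lamp cluster; the lamp positions and the 20,000 cd on-axis intensity are illustrative assumptions, not measured values from the paper:

```python
import math

def illuminance(sources, point, normal):
    """Illuminance (lux) at a surface point from point-like lamp sources.
    Each source is (position, intensity_cd); E = I * cos(theta) / d^2,
    where theta is measured from the surface normal."""
    e = 0.0
    for pos, cd in sources:
        dx = [p - q for p, q in zip(point, pos)]      # source -> point
        d2 = sum(c * c for c in dx)
        d = math.sqrt(d2)
        # angle between the direction back toward the source and the normal
        cos_t = max(0.0, -sum(c * n for c, n in zip(dx, normal)) / d)
        e += cd * cos_t / d2
    return e

# Hypothetical high beams 1.2 m apart, 0.7 m high, lighting a vertical
# surface 25 m ahead that faces back toward the vehicle
headlamps = [((-0.6, 0.0, 0.7), 20000.0), ((0.6, 0.0, 0.7), 20000.0)]
e = illuminance(headlamps, (0.0, 25.0, 1.0), (0.0, -1.0, 0.0))
```

A full headlamp cluster model would draw each source's directional intensity from photometric data rather than a single candela value, but the per-source illuminance calculation is the same.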
Journal Article

Determining Vehicle Steering and Braking from Yaw Mark Striations

2009-04-20
2009-01-0092
This paper presents equations that relate the orientation and spacing of yaw mark striations to the vehicle braking and steering levels present at the time the striations were deposited. These equations, thus, provide a link between physical evidence deposited on a roadway during a crash (the tire mark striations) and actions taken by the driver during that crash (steering and braking inputs). This paper also presents physical yaw tests during which striated yaw marks were deposited. Analysis of these tests is conducted to address the degree to which the presented equations can be used to determine a driver’s actual steering and braking inputs. As a result of this testing and analysis, it was concluded that striated tire marks can offer a meaningful glimpse into the steering and braking behavior of the driver of a yawing vehicle. It was also found that consideration of yaw striations allows for the reconstruction of a vehicle’s post-impact yaw motion from a single tire mark.
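The paper's striation equations are not reproduced in the abstract. As a related, well-established piece of yaw-mark analysis, the classic critical-speed formula estimates the vehicle's speed from the radius of the yaw mark itself; the friction and radius values below are illustrative:

```python
import math

def critical_yaw_speed(radius_m, mu, grade=0.0, g=9.81):
    """Classic critical-speed estimate for a vehicle yawing on a curved
    tire mark of radius R: v = sqrt((mu + grade) * g * R).  The striation
    analysis in the paper refines this by also recovering the braking and
    steering levels that produced the mark."""
    return math.sqrt((mu + grade) * g * radius_m)

v = critical_yaw_speed(60.0, 0.7)   # ~20.3 m/s for a 60 m radius, mu = 0.7
```

Speed from mark curvature answers only part of the reconstruction question; the contribution described above is linking the striations' orientation and spacing to the driver's inputs.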
Journal Article

A Method to Quantify Vehicle Dynamics and Deformation for Vehicle Rollover Tests Using Camera-Matching Video Analysis

2008-04-14
2008-01-0350
This paper examines the use of camera-matching video analysis techniques to quantify the vehicle dynamics and deformation for a dolly rollover test run in accordance with the SAE Recommended Practice J2114. The method presented enables vehicle motion data and deformation measurements to be obtained without the automated target tracking employed by existing motion tracking systems. Since it does not rely on automated target tracking, the method can be used to analyze video from rollover tests that were not set up in accordance with the requirements of those systems. The method also provides a straightforward technique for relating the motion of points on the test vehicle to the motion of the vehicle's center of mass. This paper first describes the specific rollover test that was utilized. Then, the camera-matching method that was used to obtain the vehicle motion data and deformation measurements is described.
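Camera matching rests on adjusting a virtual camera's pose until 3D scene points project onto their observed pixel locations in the footage. A minimal sketch of evaluating one such reprojection residual with a pinhole model; the pose, intrinsics, and tracked pixel are illustrative:

```python
import numpy as np

def project(point_w, R, t, fx, fy, cx, cy):
    """Pinhole projection of a world point into pixel coordinates for a
    camera with rotation R and translation t (world -> camera frame)."""
    p_cam = R @ point_w + t
    u = fx * p_cam[0] / p_cam[2] + cx
    v = fy * p_cam[1] / p_cam[2] + cy
    return np.array([u, v])

# Illustrative pose: camera 5 m from the scene origin, axis-aligned
R = np.eye(3)
t = np.array([0.0, 0.0, 5.0])

observed = np.array([1160.0, 540.0])   # pixel location tracked in the frame
predicted = project(np.array([1.0, 0.0, 0.0]), R, t, 1000, 1000, 960, 540)
residual = np.linalg.norm(predicted - observed)
```

In a full camera match this residual is summed over many identified scene points and minimized over the camera parameters; once the pose is recovered, points on the test vehicle can be located frame by frame without physical tracking targets.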