VR Simulation for Autonomous Vehicles

my role

Designer, Developer, Researcher

year

2023

contribution

Research, Design, Development, Testing

tools

C++, Rhino, Unreal Engine, Blueprint Visual Scripting

team

Jidu Auto Advanced Design Team

Overview

This project developed an immersive VR simulation environment in Unreal Engine to visualize and test future autonomous vehicle transit systems. The simulation serves as a strategic tool for Jidu Auto's autonomous vehicle design and urban planning initiatives, enabling stakeholders to experience and evaluate proposed transportation solutions in a realistic virtual environment.

Note: I'm unable to disclose the actual VR environments developed for this project due to company policy, but the summary reflects my role and contributions.

VR Opening Scene

Problem Statement

Autonomous vehicle adoption raises new questions around how advertising, retail, and passenger experience will evolve inside and around self-driving transit systems. Traditional visualization tools often fail to capture how digital billboards, in-car displays, and city-scale ad placements integrate into future autonomous mobility. There was a need for a VR simulation platform that could:

• Visualize advertisement and media spaces within autonomous vehicle interiors

• Model city-scale ad placements across dynamic, self-driving transit networks

• Help designers and stakeholders assess passenger attention, visibility, and engagement

• Support long-term planning of monetization strategies alongside vehicle and city design

Project Gallery

VR simulation screenshots

Technical Architecture

Core Technologies

• Unreal Engine 5 - Real-time rendering and physics

• C++ - Core simulation logic and performance-critical systems (see the sketch after this list)

• Blueprint Visual Scripting - Rapid prototyping and iteration

• Rhino/Grasshopper - Urban environment modeling

• VR SDK Integration - HTC Vive and Oculus support
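
Since I can't share the production code, the sketch below illustrates the general split between C++ and Blueprint: performance-critical per-frame logic stays in C++, while tuning parameters and scenario hooks are exposed to Blueprint for rapid iteration. The class, property, and function names are placeholders, not the project's actual code.

// TrafficAgent.h - illustrative sketch only; names and values are placeholders
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "TrafficAgent.generated.h"

UCLASS()
class ATrafficAgent : public AActor
{
    GENERATED_BODY()

public:
    ATrafficAgent() { PrimaryActorTick.bCanEverTick = true; }

    // Tunable from the editor and Blueprint without recompiling.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Traffic")
    float CruiseSpeed = 1200.f;        // cm/s

    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Traffic")
    float WaypointTolerance = 200.f;   // cm

    // Callable from Blueprint scenario scripts.
    UFUNCTION(BlueprintCallable, Category = "Traffic")
    void SetRoute(const TArray<FVector>& Waypoints) { Route = Waypoints; RouteIndex = 0; }

protected:
    // Per-frame movement update kept in C++ for performance.
    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        if (Route.IsValidIndex(RouteIndex))
        {
            const FVector Target = Route[RouteIndex];
            const FVector Step = (Target - GetActorLocation()).GetSafeNormal() * CruiseSpeed * DeltaSeconds;
            SetActorLocation(GetActorLocation() + Step);
            if (FVector::Dist(GetActorLocation(), Target) < WaypointTolerance)
            {
                ++RouteIndex;
            }
        }
    }

private:
    TArray<FVector> Route;
    int32 RouteIndex = 0;
};

The idea is that per-frame agent updates stay cheap in C++, while designers can still adjust speeds and routes from Blueprint during review sessions.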

Key Features

• Real-time traffic simulation with AI agents

• Dynamic weather and lighting systems

• Multi-user collaborative testing

• Data logging and analytics

• Modular scenario framework
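
To show what the modular scenario framework means in practice, here is a simplified sketch of how a scenario could be described as an Unreal data asset: weather, time of day, traffic density, the target sub-level, and a logging switch all become editable fields, so new test scenarios can be assembled without code changes. The field names and presets are illustrative assumptions, not the project's actual schema.

// ScenarioDefinition.h - simplified sketch; field names are illustrative
#pragma once

#include "CoreMinimal.h"
#include "Engine/DataAsset.h"
#include "ScenarioDefinition.generated.h"

UENUM(BlueprintType)
enum class EWeatherPreset : uint8
{
    Clear,
    Overcast,
    Rain,
    NightFog
};

UCLASS(BlueprintType)
class UScenarioDefinition : public UDataAsset
{
    GENERATED_BODY()

public:
    // Environmental conditions applied when the scenario loads.
    UPROPERTY(EditAnywhere, BlueprintReadOnly, Category = "Environment")
    EWeatherPreset Weather = EWeatherPreset::Clear;

    UPROPERTY(EditAnywhere, BlueprintReadOnly, Category = "Environment", meta = (ClampMin = "0.0", ClampMax = "24.0"))
    float TimeOfDayHours = 14.0f;

    // Name of the streaming sub-level holding the city district or cabin interior.
    UPROPERTY(EditAnywhere, BlueprintReadOnly, Category = "Environment")
    FName EnvironmentLevel;

    // Density of AI traffic agents spawned along the route network (0-1).
    UPROPERTY(EditAnywhere, BlueprintReadOnly, Category = "Traffic", meta = (ClampMin = "0.0", ClampMax = "1.0"))
    float TrafficDensity = 0.5f;

    // Whether analytics events (gaze, dwell time, position) are recorded.
    UPROPERTY(EditAnywhere, BlueprintReadOnly, Category = "Analytics")
    bool bEnableDataLogging = true;
};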

Development Methodology

The project followed an iterative development approach combining rapid prototyping with systematic validation:

Phase 1: Environment Modeling

Created conceptual vehicle interiors and city-scale layouts in Unreal Engine to serve as canvases for ad-placement testing, with geometry optimized for real-time rendering.

Phase 2: Ad Placement Prototyping

Prototyped in-car and urban ad placements (screens, panels, billboards) using parameterized layouts. Focused on visibility, user attention, and spatial constraints inside autonomous vehicle cabins and across city corridors.
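
To give a flavour of what "parameterized" means here, the standalone sketch below shows the kind of simple visibility heuristic such a layout can drive: a placement counts as visible to a passenger if it falls within their field of view and subtends enough visual angle to be legible. The structure names and thresholds are illustrative, written outside the engine for clarity rather than taken from the project.

// ad_visibility_sketch.cpp - illustrative heuristic, not project code
#include <cmath>
#include <cstdio>

namespace {
constexpr double kPi = 3.14159265358979323846;

struct Vec3 { double x, y, z; };

Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
double len(Vec3 a) { return std::sqrt(dot(a, a)); }

// One parameterized placement: where the panel sits and how tall it is.
struct AdPlacement {
    Vec3 position;          // panel centre in metres, world space
    double panel_height_m;  // physical height of the display surface
};

// A placement "reads" for a viewer if it is inside the field of view and
// subtends at least min_angular_size_deg of visual angle (a rough legibility proxy).
bool is_visible(const AdPlacement& ad, Vec3 eye, Vec3 view_dir,
                double fov_half_angle_deg, double min_angular_size_deg) {
    const Vec3 to_ad = sub(ad.position, eye);
    const double dist = len(to_ad);
    if (dist <= 0.0) return false;

    const double off_axis_deg =
        std::acos(dot(to_ad, view_dir) / (dist * len(view_dir))) * 180.0 / kPi;
    const double angular_size_deg =
        2.0 * std::atan(ad.panel_height_m / (2.0 * dist)) * 180.0 / kPi;

    return off_axis_deg <= fov_half_angle_deg && angular_size_deg >= min_angular_size_deg;
}
}  // namespace

int main() {
    const AdPlacement billboard{{30.0, 5.0, 2.0}, 3.0};  // 3 m tall panel ~30 m ahead
    const Vec3 passenger_eye{0.0, 0.0, 1.2};
    const Vec3 forward{1.0, 0.0, 0.0};

    std::printf("billboard visible: %s\n",
                is_visible(billboard, passenger_eye, forward, 55.0, 1.0) ? "yes" : "no");
    return 0;
}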

Phase 3: VR Integration

Implemented immersive VR experiences in Unreal Engine to allow stakeholders to “sit” inside the vehicle or navigate city spaces, evaluating ad exposure and passenger interaction opportunities.
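
For the seated, in-cabin viewpoint, the essence of the VR setup can be sketched as a simple pawn whose camera is driven by the headset, with the tracking origin configured for a seated experience. This is an illustrative reconstruction of a common Unreal pattern, not the project's actual pawn class.

// VRPassengerPawn.h - illustrative seated-VR pawn; not the project's actual class
// (requires the HeadMountedDisplay module in the project's Build.cs)
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "Camera/CameraComponent.h"
#include "HeadMountedDisplayFunctionLibrary.h"
#include "VRPassengerPawn.generated.h"

UCLASS()
class AVRPassengerPawn : public APawn
{
    GENERATED_BODY()

public:
    AVRPassengerPawn()
    {
        // Root plus camera; the HMD drives the camera transform automatically.
        VRRoot = CreateDefaultSubobject<USceneComponent>(TEXT("VRRoot"));
        SetRootComponent(VRRoot);

        Camera = CreateDefaultSubobject<UCameraComponent>(TEXT("Camera"));
        Camera->SetupAttachment(VRRoot);
    }

protected:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        // Seated experience: track relative to the headset's starting eye position,
        // so the viewer sits exactly where the cabin seat places them.
        UHeadMountedDisplayFunctionLibrary::SetTrackingOrigin(EHMDTrackingOrigin::Eye);
    }

    UPROPERTY(VisibleAnywhere)
    USceneComponent* VRRoot = nullptr;

    UPROPERTY(VisibleAnywhere)
    UCameraComponent* Camera = nullptr;
};

Placing a pawn like this at a cabin seat transform is enough to let a reviewer look around the interior and judge which in-car displays actually fall in their line of sight.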

Phase 4: Stakeholder Testing & Iteration

Delivered VR walkthroughs to design leads and stakeholders, enabling iteration on ad placement strategies and long-term planning for monetization within autonomous transit systems.