Vision: From Language to Simulation
This project reimagines urban simulation through the lens of semantic reasoning and ontological precision. It transforms natural language descriptions, into a high-fidelity, rule-based simulation of a bustling Korean nightlife street. The system embodies the fusion of spatial computation, knowledge representation, and real-time interactivity, forming a new paradigm of city modeling grounded in semantic interpretation.
Framework: Ontology as the Engine
The simulation is structured around a deeply layered ontological model, including over 300 agents, vehicles, buildings, roads, and policing mechanisms. Entities are classified into spatial (buildings, infrastructure), dynamic (agents, vehicles), and regulatory systems (noise, lighting, occupancy). These are interconnected through semantic constraints such as bidirectional traffic, building capacities, and behavior-triggered transitions. All relationships are formalized through a rigorously defined entity-relationship schema that encodes entry/exit logic, noise generation, and emergency responses.
Intelligence: Multi-Level Semantic Interpretation
A custom semantic pipeline translates narrative input into structured rules via a multi-stage loop: Natural Language → Semantic Extraction → Ontological Mapping → Implementation → Validation. Every element in the simulation adheres to a controlled state machine: agents change color-coded states as they explore, enter buildings, evacuate, or exit voluntarily. These transitions are time-bounded and causally linked, ensuring ontological integrity across 4,000+ real-time state changes.
Innovation: Technical Architecture & Tools
The simulation runs on a modular JavaScript system using HTML5 Canvas and P5.js, capable of supporting 300+ concurrent entities at 20 FPS. Core technical innovations include:
A 4-state semantic agent model (seeking, in_building, evacuated, voluntary_exit)
Police and noise subsystems with automated enforcement logic
A DSL-like modular architecture for ontologies, constraints, and validations
Real-time visual semantic debugging and entity tracing tools
Impact: Applications and Research Frontiers
This project contributes to the frontier of semantic city modeling and knowledge-driven simulation. Its core ideas are applicable to:
Smart city infrastructure modeling
AI-driven scenario generation and training environments
Digital twin systems for regulatory compliance and urban analysis
With over 99.97% semantic consistency across millions of operations and full fidelity between visual output and descriptive semantics, the simulation sets a new benchmark for semantic system design in urban computation.