
Simulation of Events

Published on Apr 23, 2021

When enough empirical or experimental rules are known about a process, machines can be made to take on the character of the event and undergo a make-believe happening of that process, a simulation. Given reasonable protocols and maxims, this form of mechanical masquerade is a powerful method for refining an original set of rules or pretesting designs and procedures.

The simulation of events can benefit the architect in two ways. If the designer does not fully understand the behavioral aspects of an event, he can play with rules and regulations, searching for recognizable activity patterns. In other words, from empirical knowledge of a set of actions and reactions for specific environments, a designer could inductively compose postulates or algorithms applicable in other contexts. For example, if he understands from on-premise measurements the vertical circulation patterns for several different environments, he could describe these environments to the machine and hypothesize seemingly appropriate rules. Then, when the machine, using these rules, displays the vertical circulation patterns for the known environments, it reveals the divergencies between the empirical data and the designer’s rules—between what he knows he should see and what he does see—giving him information by which to modify the rules, always observing whether the change has a positive or negative effect. Eventually, using a dynamic on-line system, he will be able to converge on rules of simulation that can be applied to other environments.
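
Read as a procedure, this is a calibration loop: simulate with trial rules, measure the divergence from the observed data, and nudge the rules until the two agree. The sketch below is only an illustration of that loop under invented assumptions; the one-parameter "rule," the toy flow function, and the observed figures are all made up for the example.

```python
# A toy calibration loop, standing in for the designer's back-and-forth with
# an on-line system: hypothesize a rule, simulate, compare with measurements,
# and adjust.  Every number and name here is invented for illustration.

def simulate_flow(rule_strength, floor_area):
    """Trial rule: vertical circulation grows with floor area, scaled by the rule."""
    return rule_strength * floor_area

# Hypothetical on-premise measurements: floor area -> observed circulation count.
observed = {1000: 52, 2500: 128, 4000: 210}

rule = 0.01                                    # the designer's first hypothesis
for iteration in range(100):
    # Divergence between what he should see and what he does see.
    divergence = sum(simulate_flow(rule, area) - flow
                     for area, flow in observed.items())
    if abs(divergence) < 1.0:
        break
    rule -= 0.0001 * divergence                # nudge the rule toward agreement

print(f"rule strength after {iteration + 1} passes: {rule:.4f}")
```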

The second design application, pretesting, assumes the rules are correct. Whether empirical or experimental, simulations are no better than their underlying rules, whether the rules are provided by the man or by the machine. If the simulation model is correct, a designer or a machine can observe the performance of an environment, a specific context. Someday, designers will be able to subject their projects to the simulations of an entire day or week or year of such events as use patterns and fast-time changes in activity allocations. On display devices, designers will be able to see the incidence of traffic jams, the occurrence of sprawl, or sweltering inhabitants searching for shade. For the present discussion, the most easily reproducible event is circulation, a perplexing and important urban situation in itself.

Many sophisticated organizations have spent time and money in programs that simulate circulation, primarily vehicular circulation. Rather than observe these elaborate simulation techniques, let us observe two very simple circulation models that have been devised to simulate pedestrian movement. The models result from two M.I.T. student projects involving architecture students, once again with almost no previous programming experience.

The first simulation model describes three sets of parameters: spaces (function, capacity, desirability), circulation interfaces (direction, capacity, demand), and people (arrivals, departures, frustrations). The model assumes the chosen environment to be a discrete chunk of the real world, with a certain number of pedestrians leaving the system and others arriving at each time interval. At each instant, the circulation activity and the space populations are determined by random numbers controlled by parameters of frustration and desire. Although this work was not implemented on a graphic display device, the authors (with some effort) can observe jammed doorways, vacant commercial spaces, and periodic peaks in major circulation routes. Their physical model can be changed and manipulated in search of less antagonistic circulation patterns, iterating toward a design solution that would display ambulatory ease and facility.
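
A minimal sketch of such an aggregate model might read as follows. The class names, the crowding-damped desirability, and the entrance-and-exit handling are assumptions chosen for illustration, not the students' actual program.

```python
import random

# A toy aggregate-flow model: spaces hold populations, interfaces connect them,
# and at each time interval groups move across an interface with a probability
# set by the destination's desirability and damped by its crowding.
# All names and rules here are assumptions made for the sketch.

class Space:
    def __init__(self, name, capacity, desirability):
        self.name = name
        self.capacity = capacity
        self.desirability = desirability
        self.population = 0

class Interface:
    def __init__(self, source, target, capacity):
        self.source, self.target, self.capacity = source, target, capacity

def step(entry, interfaces, arrivals, departures):
    """Advance the simulation one time interval."""
    entry.population += arrivals                            # people entering the chunk of world
    entry.population -= min(departures, entry.population)   # people leaving it

    for link in interfaces:
        crowding = link.target.population / link.target.capacity
        pull = link.target.desirability * max(0.0, 1.0 - crowding)
        movers = sum(1 for _ in range(min(link.source.population, link.capacity))
                     if random.random() < pull)              # chance-controlled movement
        link.source.population -= movers
        link.target.population += movers

# Example: a lobby feeding a shop, run for a simulated hour of one-minute steps.
lobby = Space("lobby", capacity=80, desirability=0.3)
shop = Space("shop", capacity=40, desirability=0.6)
links = [Interface(lobby, shop, capacity=10), Interface(shop, lobby, capacity=10)]

for t in range(60):
    step(lobby, links, arrivals=9, departures=2)
    if lobby.population > lobby.capacity:
        print(f"minute {t}: jammed entrance, {lobby.population} people waiting")
```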

CARS—computer-automated routing and scheduling. This system is designed to provide door-to-door transportation in low-density suburban areas. The aim of CARS is to provide service approximating that of the taxicab, but at a price approximating mass transit bus systems.

The six illustrations represent a simulation of CARS in operation with twelve vehicles serving an area of nine square miles, with about ninety demands per hour. Two particular criteria are enforced: no one should wait more than fifteen minutes, and no one should travel more than 1.8 times the direct driving distance. (A sketch of this feasibility test follows the illustration list below.)

The illustrations have been photographed from an ARDS tube running off the M.I.T. time-sharing system. The work is being performed at M.I.T.’s newly created Urban Systems Laboratory under the direction of Daniel Roos.

1 All waiting demands at time 45

2 The projected tour of vehicle 11 at time 45

3 A history of vehicle 11 at time 45

4 All waiting demands at time 60

5 The projected tour of vehicle 11 at time 60

6 A history of vehicle 11 at time 60
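
The two service criteria lend themselves to a simple feasibility test when a demand is tentatively assigned to a vehicle's projected tour. The sketch below is an assumed illustration of such a test, not the actual CARS dispatching logic; only the fifteen-minute and 1.8 figures come from the description above.

```python
MAX_WAIT_MINUTES = 15      # no one should wait more than fifteen minutes
MAX_DETOUR_RATIO = 1.8     # no one should ride more than 1.8 times the direct distance

def acceptable_assignment(wait_minutes, tour_miles, direct_miles):
    """Return True if serving this demand on a projected tour meets both criteria.

    wait_minutes -- projected wait before pickup
    tour_miles   -- distance the rider covers on the shared tour
    direct_miles -- door-to-door driving distance for that rider alone
    """
    return (wait_minutes <= MAX_WAIT_MINUTES
            and tour_miles <= MAX_DETOUR_RATIO * direct_miles)

# A rider picked up after 12 minutes and carried 4.0 miles, where the direct
# drive is 2.5 miles (4.0 <= 1.8 * 2.5), is acceptably served.
print(acceptable_assignment(12, 4.0, 2.5))   # True
print(acceptable_assignment(20, 4.0, 2.5))   # False: the wait is too long
```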


In this example, simulation techniques describe agglomerates of people, whole groups moving from space to space in one cycle. The second student model applies variable parameters to each individual pedestrian. Characteristics of desire and destination control the simulated movement of each individual pedestrian in accordance with his local environment. The student can observe frustrations and localized frictions that are not only a function of the physical form but also responses to the individual personalities of the other pedestrians in the same space. The student can observe a dashing blonde unsettle corridors or a precipitate fleet-footed latecomer disrupt a reception area.
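
A per-pedestrian version of the same idea might be sketched as follows; the attributes (haste, tolerance) and the one-cell-per-step movement rule are assumptions chosen for illustration, not the student's actual model.

```python
import random

# A toy individual-pedestrian model: each agent carries its own destination
# and temperament, moves one grid cell per time step, and balks in response
# to the people around it.  Names and rules are assumptions for the sketch.

class Pedestrian:
    def __init__(self, position, destination, haste, tolerance):
        self.position = position        # (x, y) grid cell
        self.destination = destination  # (x, y) goal
        self.haste = haste              # 0..1 chance of pressing forward each step
        self.tolerance = tolerance      # how many close neighbours this person accepts
        self.frustration = 0            # steps spent blocked or balking

def step(crowd, occupied):
    for p in crowd:
        if p.position == p.destination:
            continue
        x, y = p.position
        dx = (p.destination[0] > x) - (p.destination[0] < x)
        dy = (p.destination[1] > y) - (p.destination[1] < y)
        target = (x + dx, y + dy)
        neighbours = sum(1 for q in crowd if q is not p
                         and abs(q.position[0] - x) <= 1
                         and abs(q.position[1] - y) <= 1)
        free = target not in occupied and neighbours <= p.tolerance
        if free and random.random() < p.haste:
            occupied.discard(p.position)
            p.position = target
            occupied.add(target)
        else:
            p.frustration += 1          # localized friction the designer can watch

# Two pedestrians crossing the same corridor toward opposite ends: they meet
# head-on, block one another, and accumulate frustration.
crowd = [Pedestrian((0, 0), (9, 0), haste=0.9, tolerance=3),
         Pedestrian((9, 0), (0, 0), haste=0.5, tolerance=1)]
occupied = {p.position for p in crowd}
for _ in range(20):
    step(crowd, occupied)
print([(p.position, p.frustration) for p in crowd])
```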

Both student projects, even in their infancy, exhibit viable methods for prediction. When simulation techniques improve and are part of architecture machines, physical structures can be tested within environments that acknowledge their presence. In other words, when a change is contemplated for some neighborhood, it can be tested by observing its effect over time, but in fast time, unreal time. Usually, in the nonpretend world, the real world, a neighborhood immediately responds to a change, generates new demands, and the supposedly beneficial event is too often invalidated. Such negations can be avoided. Direct interplay between event and effect, desire and result can be observed and can be enveloped in simulation procedures.

SYMAP. This computer system is primarily concerned with the display of spatially distributed data rather than with its manipulation. Developed by Howard Fisher, it employs an overprinting technique on a high-speed printer, a technique that was adequate in the days before computer graphics but is quite obsolete today. The four maps are based on the 1960 census and are at the scale of the census tracts. From left to right, they display density per acre of total population, whites, blacks, and Puerto Ricans. The work was performed by Peter Rogers and Isao Oishi. (Illustrations courtesy of the Harvard Laboratory for Computer Graphics)
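
The overprinting idea, rendering darker data values with denser character combinations on a line printer, can be suggested with a short sketch. The symbol ramp and the equal-interval binning below are assumptions made for the illustration, not SYMAP's actual symbolism.

```python
# A toy line-printer "grey scale": darker data values are drawn with denser
# characters.  The ramp and binning are assumptions, not SYMAP's own scheme.

SYMBOLS = " .+X#"                        # light to dark

def shade(value, lo, hi):
    """Pick a printer character for a value within the observed range."""
    if hi == lo:
        return SYMBOLS[0]
    index = int((value - lo) / (hi - lo) * (len(SYMBOLS) - 1))
    return SYMBOLS[min(max(index, 0), len(SYMBOLS) - 1)]

def print_map(grid):
    """Render a grid of densities (say, persons per acre) as a character map."""
    flat = [v for row in grid for v in row]
    lo, hi = min(flat), max(flat)
    for row in grid:
        print("".join(shade(v, lo, hi) for v in row))

print_map([[2, 5, 9, 4],
           [1, 7, 12, 6],
           [0, 3, 8, 10]])
```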
