Tuesday, March 11, 2008

How I Spent My Summer Vacation

In the non-linear spirit of TVA, I'm going to jump forward several chapters at the same time I go back 20 years. (Because it's what I was thinking about this morning, that's why.)

So it's summer 1989. I'm 31 years old, a senior at Berkeley (THERE's a long story...), and I've landed a summer internship with the Berkeley Solar Group. I'd already taken most of the building energy courses available at the school, and was good at the math and physics they entailed.

My summer internship was one monster project that took me the entire three months. We had a research project to examine the energy savings that had been brought about by California's residential energy code. My role in the project was two-fold. The first was to develop a field-inventory system that would allow a layperson to capture the gross elements of a house (footprint, orientation, general construction and insulation, windows, heating/cooling/water heating equipment and controls, and a few other things) in a two-hour visit. I tested the protocol myself on half a dozen new houses, and built the paper form for field workers to fill out... it ran to about ten pages or so if I remember correctly. Then I helped train the temp workers, average folks with no building knowledge whatsoever, to do the inventories.

The second element was my great nemesis. I spent the better part of two months building a spreadsheet program that would accept data entry from these field forms, and then convert that data into a CSV (comma-separated values) file that could be read by another program called CalRes. CalRes was the state-approved residential energy-use simulation software, and was pretty remarkable for its era. Here's how it worked. You entered all that home data I mentioned above into the program, and the program then did two things. First, it did an energy-use simulation of your design, on an hour-by-hour basis using 50-year average climate and solar data for the appropriate one of California's 16 climate zones, to come up with an annual energy expenditure. Then it did an equivalent simulation against a house of the same size, orientation and window placement, but with the California-checklist construction features (R-11 walls, R-19 ceiling/roof, middle-of-the-road double-pane windows, average-efficiency equipment, and so on). If your proposed design performed better than the checklist version, you got your building permit; if not, not. So you had a lot of flexibility as a designer to meet the energy budget for a house of a given size; you could put way more windows than you ought to on the southwest side of the house if your overhangs were sized properly, for instance.
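
A rough sketch of that comparison, in modern terms (the function names, the placeholder physics, and the checklist numbers below are all invented for illustration, not pulled from CalRes): simulate the proposed house hour by hour, simulate an otherwise-identical checklist house, and compare the annual totals.

```python
# A toy sketch of the CalRes-style comparison. Nothing here is the real
# CalRes model; the physics and the numbers are placeholders.

def hourly_load(house, hour):
    # Placeholder physics: conductive loss through the envelope minus useful
    # solar gain through the windows, scaled by equipment efficiency (kWh).
    conduction = house["ua_value"] * abs(hour["outdoor_temp"] - 20.0)
    solar = house["window_area"] * hour["solar_gain"] * house["shading_factor"]
    return max(conduction - solar, 0.0) / house["equipment_efficiency"] / 1000.0

def annual_energy(house, climate_hours):
    """Sum hourly loads over a year of long-term average climate records."""
    return sum(hourly_load(house, hour) for hour in climate_hours)

def passes_energy_budget(proposed, climate_hours):
    # Reference house: same size, orientation, and window placement, but
    # checklist construction (R-11 walls, R-19 ceiling, ordinary double
    # glazing, average-efficiency equipment). Values invented.
    reference = dict(proposed)
    reference.update(ua_value=320.0, shading_factor=0.7, equipment_efficiency=0.78)
    return annual_energy(proposed, climate_hours) <= annual_energy(reference, climate_hours)
```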

So I spent hours and hours working on a primitive portable computer that weighed about thirty pounds, had a green-on-black built-in screen about 6" across and a 20-meg hard drive, and was powered by an IBM AT Turbo chip that ran at a mighty 25 MHz. I wrote endless nested if-statements to convert one kind of data into another, trying to make Quattro (a primitive DOS-era spreadsheet) create CalRes files so that the easier spreadsheet data entry would immediately run a CalRes simulation. And by the time I went back to school in September, the damn thing actually worked, and I was wearing glasses. Fortunately, somebody else did the data entry on all 500 sampled houses after I left.
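
Something roughly equivalent, in modern terms, would look like the sketch below -- Python rather than spreadsheet if-statements, with field codes and lookup values invented for illustration rather than taken from the actual forms:

```python
import csv

# Hypothetical lookups standing in for the circled codes on a field form.
WALL_R_VALUES = {"2x4_none": 4, "2x4_R11": 11, "2x6_R19": 19}
GLAZING_U_VALUES = {"single": 1.1, "double": 0.55, "double_lowE": 0.35}

def form_to_row(form):
    """Convert one field-inventory form (a dict of coded answers) into a CSV row."""
    return [
        form["site_id"],
        form["floor_area_sqft"],
        form["orientation_deg"],
        WALL_R_VALUES.get(form["wall_code"], 11),         # assume R-11 if illegible
        GLAZING_U_VALUES.get(form["glazing_code"], 0.55),
        form["furnace_afue"],
    ]

def write_simulation_input(forms, path):
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["site", "area", "orientation", "wall_r", "glazing_u", "afue"])
        for form in forms:
            writer.writerow(form_to_row(form))
```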

Now, I bother telling you this story for an important reason. Twenty years ago -- TWENTY YEARS AGO, when computing power was both crappy and expensive, and home video games still involved using the arrow keys to keep your cursor between the scrolling asterisks that tried to mimic a ski run -- we were able to develop simulation tools that allowed designers to know how their decisions would play out in the energy use of their ultimate buildings. Fast-forward to 2009, and through the use of BIM software that contains endless amounts of building information, we ought to be able to have a series of "speedometers" at the bottom of the screen that keep a running estimate of the design's performance against some standard conditions. Every time we place a window or spec a wall system, the meters ought to move up or down to tell us instantly what that choice means for a building's performance.
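
I have no idea how any particular BIM package would wire this up internally, but the idea isn't complicated: every time the model changes, re-run a handful of cheap estimating functions and show the results against a baseline. A minimal sketch, with a hypothetical MeterPanel class and made-up estimating formulas:

```python
# A toy "speedometer panel": each meter is a cheap estimating function that is
# re-run whenever an element is added or changed, so the designer sees the
# consequence of a choice immediately. Everything here is illustrative.

class MeterPanel:
    def __init__(self):
        self.meters = {}      # name -> estimating function(model) -> value
        self.baselines = {}   # name -> value for the standard/checklist design

    def add_meter(self, name, estimator, baseline):
        self.meters[name] = estimator
        self.baselines[name] = baseline

    def on_model_change(self, model):
        """Called each time a window is placed, a wall system is specced, etc."""
        for name, estimator in self.meters.items():
            value = estimator(model)
            flag = "over budget" if value > self.baselines[name] else "ok"
            print(f"{name:>12}: {value:12,.0f}  (baseline {self.baselines[name]:,.0f})  {flag}")

# Usage sketch: the "model" is whatever the BIM software exposes; here, a dict.
panel = MeterPanel()
panel.add_meter("annual kWh", lambda m: 5.0 * m["floor_area"] + 25 * m["window_area"], 18000)
panel.add_meter("cost $", lambda m: 165 * m["floor_area"] + 60 * m["window_area"], 350000)

model = {"floor_area": 2000, "window_area": 300}
panel.on_model_change(model)     # the starting design
model["window_area"] += 120      # add a big window wall...
panel.on_model_change(model)     # ...and watch the meters move
```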

This could easily be done for energy use, separated out into heating, cooling, water heating, and demand electrical load. But we should also have meters for construction cost (using the Means cost estimation data), life-cycle operational cost, workplace safety issues (using OSHA historical data, we could easily estimate how many work hours would be lost on average for every foot of open-tread stair, for instance), lateral load performance (both wind and seismic), fire performance, payback period on investment, and any number of other quantifiable aspects of building performance. For all of these, the data we need already exists; currently, we just look it up from Means or OSHA or the National Weather Service or whatever, and figure things out once our design is nearly complete. We ought to be able to do it constantly from the very beginning of design work.
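
None of the figures below are real Means or OSHA numbers -- they're placeholders -- but they suggest how little arithmetic some of these meters would actually need once the lookup tables are wired in:

```python
# Placeholder lookup values standing in for published data (RS Means unit
# costs, OSHA injury statistics, utility rates). Each meter is just a table
# lookup plus a little arithmetic.

MEANS_COST_PER_SQFT = {"wood_frame_residential": 165.0}  # $/sf, invented
LOST_HOURS_PER_FT_OPEN_STAIR = 0.3                        # hrs/ft/yr, invented
ELECTRICITY_PRICE = 0.15                                   # $/kWh, invented

def construction_cost(floor_area_sqft, system="wood_frame_residential"):
    return floor_area_sqft * MEANS_COST_PER_SQFT[system]

def expected_lost_work_hours(open_stair_feet):
    return open_stair_feet * LOST_HOURS_PER_FT_OPEN_STAIR

def simple_payback_years(added_first_cost, annual_kwh_saved):
    return added_first_cost / (annual_kwh_saved * ELECTRICITY_PRICE)

# e.g., better glazing at $6,000 extra saving 2,500 kWh/yr -> a 16-year payback
print(simple_payback_years(6000, 2500))
```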

And we also have fifty years or so of information from environment-behavior research that should be able to help us simulate how buildings perform in human, experiential terms as well. Gerald Davis and Francoise Szigeti have computerized serviceability inventories; Frank Duffy has his workplace types and the kinds of work they best support; Irv Altman started thirty years of privacy research; William Whyte codified how social spaces work; the ISO has all kinds of acoustical and lighting performance data. We ought to be able to have meters that tell us about privacy, sociability, productivity, security (for people and for objects), wayfinding, visual and acoustic comfort, and a wide variety of human performance criteria. These won't be as precise as the ones related to building physics and economics, just as the temperature gauge in your car isn't a precision scientific instrument -- but they'll let you know if things are going okay or getting dangerously out of hand.
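
Those softer meters might look less like a number and more like the bands on that gauge -- a rough sketch, with entirely made-up thresholds:

```python
# Map a 0-100 score from some environment-behavior model (privacy, wayfinding,
# acoustic comfort...) onto coarse bands rather than false precision.
# Thresholds are invented for illustration.

def gauge(score):
    if score >= 70:
        return "okay"
    if score >= 40:
        return "watch"
    return "getting out of hand"

for name, score in [("privacy", 82), ("wayfinding", 55), ("acoustic comfort", 31)]:
    print(f"{name}: {gauge(score)}")
```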

I know that one of the reactions to this may be that it shifts the decisionmaking from the designer to the software. But I think there's an opportunity there. It would be almost impossible to imagine a design that put all the meters at an optimal point; the architect would be responsible for orchestrating some desired balance of performance, and for educating the client in the likely outcomes of every decision along the way. The software would help evaluate the ultimate performance of the building in the client's terms, leaving the designer more room for craft -- mainly what Ed Allen refers to as the singular skill of the architect: detailing, the thoughtfulness, care, and precision with which the elements are brought together.

2 comments:

MCS said...

Ironically, I took a break from doing a REScheck/MAScheck calculation to read today's post...

I suppose if you wanted to further your argument that "architecture is not art," this portion of the book would be quite persuasive.

As boring as these calculations may be, and trust me they are, I can see the overall importance and impact that having a much more elaborate set of data might have. Being able to forecast any one of the factors may be a useful tool in future planning.

I laughed at your comment:
"I know that one of the reactions to this may be that it shifts the decisionmaking from the designer to the software. But I think there's an opportunity there."
Yes. There is an opportunity there. I would rather the software do these calculations so that I can use my time more judiciously, having the opportunity to draw up schematic designs, talk with clients about future projects, or read my professor's blog posts.
