I am happy to join in the praise of the structures curriculum model described by Gary Black and Stephen Duff in the September 1994 issue of JAE. I was familiar with Gary's work in the early days of implementing that program at Berkeley, and I have used computer analysis in teaching structures since I began in the Department of Architecture at the University of Virginia in the fall of 1992. I'd like to offer some observations on using computers in teaching structures. There are potential pitfalls that I think Gary addresses in his work, but which deserve further elaboration for those interested in pursuing this approach.
I agree with Gary's observation that a finite element analysis program can be treated as a black box; contrary to the traditions of engineering education, a designer does not need to know how an equation solver works in order to use finite element analysis effectively, any more than someone needs to know the laws of thermodynamics to drive a car. But it is very important that a designer understand the assumptions and limitations of the modelling method; this is particularly important for linear elastic analysis, since it creates stark mathematical abstractions that neglect many important aspects of behavior, even in the simplest structures.
An ordinary desktop provides many demonstrations of the limits of linear analysis. Linear analysis can't handle a stretched rubber band because the material is non-linear; rubber has an S-shaped stress-strain curve with higher stiffness at very low and very high strains. Linear analysis can't model a bent paper clip because of the material yielding and the large deformations. A piece of string hanging in a catenary also evades linear analysis because of the coupling of shape and the load imposed by self-weight: the shape of the string determines the load, and the load determines the shape, a cycle that linear analysis cannot close. If the string is stretched in a straight line, linear analysis can model its tension, but it assumes the string resists compression equally well, while the real string hangs limp. Conversely, linear analysis can analyze a stack of books in compression, but assumes the stack also works in tension, when it actually splits apart.
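The string's tension-only behavior can even be sketched in a few lines of code. The sketch below is a minimal illustration, with made-up stiffness and load values, of the kind of deactivation iteration a nonlinear solver automates: a node is tied to two walls by horizontal string segments and pulled sideways. A plain linear solve assigns compression to one segment; the loop drops that segment and re-solves.

```python
def solve_tension_only(F, k_left=100.0, k_right=100.0):
    """Node tied to two walls by string segments; +F pulls it to the right."""
    active = [True, True]                 # [left segment, right segment]
    for _ in range(5):                    # simple deactivation loop
        k = k_left * active[0] + k_right * active[1]
        u = F / k                         # node displacement, rightward positive
        NL = k_left * u if active[0] else 0.0    # + = tension (left stretches)
        NR = -k_right * u if active[1] else 0.0  # + = tension (right shortens)
        if NL >= 0 and NR >= 0:
            break                         # no segment in compression: done
        active = [active[0] and NL >= 0, active[1] and NR >= 0]
    return u, NL, NR

u_lin = 10.0 / (100.0 + 100.0)            # plain linear solve: u = 0.05
# Linear analysis puts -5 (compression!) in the right-hand segment, a force
# no string can supply.  The loop drops that segment and re-solves: the left
# segment carries the full 10 in tension while the right hangs limp at zero.
u, NL, NR = solve_tension_only(10.0)
```

The one-dimensional springs here stand in for axial string stiffness; a real nonlinear program performs the same remove-and-resolve cycle on full finite element models.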
Not to belabor the structural properties of office supplies, the point is that linear analysis neglects important physical phenomena; this is actually true of sophisticated non-linear analysis as well, but the simplifications of linear analysis are particularly severe. Those simplifications don't mean the results aren't useful (they are invaluable), but they do mean the results are incomplete and approximate; students must understand this, and must be constantly reminded. Gary's metaphor of Galileo's telescope is very fitting. At its best, computer analysis certainly does reveal otherwise unseen worlds, but the picture never comes into completely sharp focus; many features are glossed over. At its worst, computer analysis appears to be a telescope but is really a kaleidoscope, generating entrancing pictures divorced from reality. To use a metaphor suited to today's generation of students, computer analysis can--without proper care--become a kind of video game ("stress invaders") where students manipulate the input until they get nice looking output, ignoring the physical implications of both. I don't think that's happening in Gary's classes, but it is a danger for those who pursue this approach. Of course, the problem of linking models to reality is not limited to computers; many students master traditional manual calculation techniques without understanding their meaning for real structures.
In teaching, it is useful to emphasize the limits of the modelling method by giving examples where the computer gives wrong answers; blatantly wrong answers are best. I use computer projection in a lecture setting where I can have an interactive session with the whole class. One of my first examples is a three-node, two-element model in an inverted V configuration, with supports at the ends and a downward point load in the center: a funicular arch. I analyze the structure using conventional values for sections and materials, and we look at the results: intuitively sensible compression in each strut, easily verified by hand. I then change the elastic modulus E to a very low value and reanalyze, displaying the deflected shape on the screen (with a simple program like RISA-2D, these modifications can be done in seconds, less time than it takes to erase the board). The deflected shape shows the V hanging down. The more intuitive students assume that the computer has correctly analyzed the behavior of the rubber-like structure, which has snapped through and now hangs in tension, as it really would. But then I display the numeric values, which show the members in compression, exactly the same compression as before.
This demonstration is a moment of truth for some students. The computer is not just off by ten or twenty percent; it is completely at odds with common sense, and even at odds with itself: the picture says tension but the numbers say compression. At that point I ask for possible explanations, and the discussion moves to the hypothesis that the computer bases its calculations on the original geometry rather than the deformed geometry; this leads to a discussion of other situations where the computer might give wrong answers, such as cables that change shape under changing load patterns.
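The demonstration itself reduces to a few lines of linear truss analysis. The sketch below (with assumed section, load, and modulus values, not the ones I use in class) solves the inverted V by the direct stiffness method, writing equilibrium on the original geometry; reducing E by a factor of a million leaves the member forces untouched and merely scales the deflection.

```python
import numpy as np

def analyze(E, A=1.0e-3, P=10.0):
    """Small-displacement linear analysis of the inverted-V truss."""
    nodes = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0]])  # supports and apex B
    bars = [(0, 1), (1, 2)]
    K = np.zeros((2, 2))              # stiffness at the only free node, B
    geom = []
    for i, j in bars:
        d = nodes[j] - nodes[i]
        L = float(np.hypot(*d))
        unit = d / L
        K += (E * A / L) * np.outer(unit, unit)
        geom.append((L, unit, 1.0 if j == 1 else -1.0))  # is B the far end?
    u = np.linalg.solve(K, np.array([0.0, -P]))  # point load P downward at B
    # bar force = (EA/L) * elongation; elongation = +/- u projected on the axis
    forces = [E * A / L * s * float(unit @ u) for L, unit, s in geom]
    return forces, u[1]

stiff_forces, stiff_v = analyze(E=200e9)  # a steel-like modulus
soft_forces, soft_v = analyze(E=200e3)    # a million times softer: "rubber"
# The member forces are identical in both runs: equilibrium is written on the
# ORIGINAL geometry, so for this statically determinate truss the forces
# never depend on E.  Only the deflection grows -- by a factor of a million.
```

The hand check matches: by symmetry and vertical equilibrium each strut carries P/√2 in compression, and no value of E can change that in a first-order analysis.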
Another illustrative demonstration is to analyze an indeterminate frame and look at the distribution of moments graphically, then impose a support settlement and look again, noting the significantly different distribution. Follow with the question: "which distribution of moments is correct?" The true answer is "no one knows," since it's impossible to say whether or when the foundation settlement might occur. Proper detailing for ductility to redistribute stresses with settlement and creep is as important as thorough analysis of presumed initial conditions. The initial analysis is not really a simulation of reality; it's just a means to get into the ballpark of structural behavior. Lamenting the overconfidence of engineering academics in elastic theory, Pier Luigi Nervi once wrote:
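The point is easy to verify numerically. The sketch below uses assumed spans, loads, and stiffness (a two-span continuous beam with a point load at each midspan, rather than a full frame) and Euler-Bernoulli beam elements; settling the middle support shifts its moment by 3EIδ/L², here a change more than twice the moment the original analysis predicts.

```python
import numpy as np

def support_moment(EI=1.0e4, span=4.0, P=10.0, settle=0.0):
    """Two equal spans, point load P at each midspan; middle support may settle."""
    a = span / 2.0                        # 4 elements of length a, 5 nodes
    k = (EI / a**3) * np.array([          # Euler-Bernoulli element stiffness
        [ 12.0,   6*a, -12.0,   6*a],
        [  6*a, 4*a*a,  -6*a, 2*a*a],
        [-12.0,  -6*a,  12.0,  -6*a],
        [  6*a, 2*a*a,  -6*a, 4*a*a]])
    K = np.zeros((10, 10))                # DOFs: (w, theta) at each node
    for e in range(4):
        idx = [2*e, 2*e + 1, 2*e + 2, 2*e + 3]
        K[np.ix_(idx, idx)] += k
    F = np.zeros(10)
    F[2] = F[6] = -P                      # downward loads at the two midspans
    fixed = {0: 0.0, 4: -settle, 8: 0.0}  # w at the supports; B settles down
    free = [i for i in range(10) if i not in fixed]
    u = np.zeros(10)
    u[list(fixed)] = list(fixed.values())
    rhs = F[free] - K[np.ix_(free, list(fixed))] @ np.array(list(fixed.values()))
    u[free] = np.linalg.solve(K[np.ix_(free, free)], rhs)
    # end moment of the element just left of B gives the moment at support B
    return (k @ u[2:6])[3]

M_level = support_moment(settle=0.0)      # |M| = 3PL/16 = 7.5 at support B
M_settled = support_moment(settle=0.01)   # 10 mm settlement of support B
# The settlement alone shifts the support moment by 3*EI*d/L^2 = 18.75,
# swamping the 7.5 that the "correct" initial analysis predicted.
```

The closed-form check comes from the three-moment equation: for two equal spans, a settlement δ of the central support changes its moment by 3EIδ/L², independent of the applied load.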
All actual states of equilibrium are the result of a happy tendency, common to all structures, to find the state of equilibrium which best suits their shape and nature, beyond and above our limited knowledge. This best possible solution is obtained because the supports settle, the materials are not perfectly elastic, and the different parts of a single structure have different moduli of elasticity (due to occasional variations in the mixture and curing of the concrete).

Nervi was writing in the pre-computer age, but his observations still hold true. He would undoubtedly be thrilled with the computational and graphic power available today, but would nonetheless interpret the results with a healthy skepticism that we must instill in our students. Regardless of the power of the model, the mapping between model and reality is imperfect and always will be.
Beyond analysis, the computer is emerging in other important arenas besides its traditional engineering role as a calculation engine. The emergence of the World Wide Web and graphic web browsers over the past two years has created new possibilities for the computer as a communicator and information resource. Chris Luebkeman at the University of Oregon has created an exemplary web site for an introductory structures course and related resources (http://darkwing.uoregon.edu/~struct/courseware/461/461_index.html). I plan to create similar resources here at Virginia over the coming year. As more of us create and electronically publish such materials it will be possible for us to learn from one another and share resources and ideas in a way never before possible.
In addition, important resources are coming on line. For example, the library at the Earthquake Engineering Research Center at Berkeley now has an on-line repository of 5,800 digital images of earthquake damage from Karl Steinbrugge's slide collection (http://nisee.ce.berkeley.edu/eqiis.html). FEMA also maintains an on-line photo collection (http://www.fema.gov/fema/photo02.html) including wind and earthquake damage, and there are several sites with images of damage from the Kobe earthquake (e.g. http://www.nando.net/newsroom/jsources.html). As the hype and flash of the web die down and more serious resources like these become available, it will be possible for teachers and students to access, assemble, and synthesize information in completely new ways.
Analysis programs allow students to use the computer to learn about the behavior of mathematical models; the web will allow them to learn about the behavior of real structures. Putting those things together, they can really learn about structural behavior and design, and we can learn with them.