Fall 1993 vol. 7 no. 2
National Center for Supercomputing Applications • University of Illinois at Urbana-Champaign

Cover: A collection of antique engineering and surveying instruments. Loaned by the UIUC Department of Mechanical and Industrial Engineering, Berns Clancey & Associates PC, and Fran Bond. (Art director, Carlton Bruett; research, Fran Bond; photography, Thompson-McClellan Photography)

Table of Contents

departments
2 ncsa contacts
3 editor's note
34 book review
35 center activities

4 Developing tools for nanolithography
8 Creating the numerical engine
10 Engineering with visual analysis
12 NCSA and structural mechanics
15 Allocations in engineering
17 CM-5 enhancements attract friendly users
20 Envisioning the Earth: the Geosphere Project
21 Integrating global models: Conference report

industrial program
24 Caterpillar wins Grand Challenge award
26 Modeling disposable diapers
26 AT&T's XUNET demonstrated for Gore

education
27 ISDN links C-U schools to NCSA
28 Software for math & science
29 1993-94 REU members

new technology
30 HP, Convex team with NCSA . . . for scalable applications research
31 High definition technologies at NCSA
32 Remodeling a career
33 NCSA Mosaic exploits Internet

access (ISSN 1064-9409) is published by the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign (UIUC) with support from the University of Illinois, the State of Illinois, the National Science Foundation, other federal agencies, and NCSA industrial partners. Permission to reprint any item in access is freely given, provided that the author and access are acknowledged.

Editor: Fran Bond, fbond@ncsa.uiuc.edu
Managing Editor: Melissa LaBorg Johnson, melissaj@ncsa.uiuc.edu
Designer: Linda Jackson
Printing by University of Illinois Office of Printing Services, Printing Division
This publication is printed on Evergreen Matte and Evergreen Matte Cover.
ncsa contacts GENERAL INFORMATION AND PROGRAMS Academic Affiliates Program/Client Administration Judy Olson (217) 244-1986 uadmin@ncsa.uiuc.edu Applications Group/Faculty Program Melanie Loots (217) 244-2921 mloots@ncsa.uiuc.edu Central Facilities Sue Lewis (217) 244-0708 slewis@ncsa.uiuc.edu For services/help: (217) 244-0710 Chemistry User Group Balaji Veeraraghavan (217) 333-2754 balajiv@ncsa.uiuc.edu Computing and Communications Charles Catlett (217) 333-1163 catlett@ncsa.uiuc.edu Consulting Office (217) 244-1144 consult@ncsa.uiuc.edu Education Program Lisa Bievenue (217) 244-1993 bievenue@ncsa.uiuc.edu Industrial Program John Stevenson (217) 244-0474 Media Relations Jarrett Cohen (217) 244-3049 jcohen@ncsa.uiuc.edu Media Services Vincent Jurgens (217) 244-1543 vjurgens@ncsa.uiuc.edu media@ncsa.uiuc.edu (services) MetaCenter Allocations Susan Zukosky (217) 244-0635 allocations@ncsa.uiuc.edu NCSA Receptionist (217) 244-0072 FAX (217) 244-1987 NCSA Security Officers Lex Lane (217) 244-0642 lex@ncsa.uiuc.edu Michael Smith (217) 244-7714 msmith@ncsa.uiuc.edu Networking network@ncsa.uiuc.edu Orders for Publications, NCSA Software, and Multimedia Debbie Shirley (217) 244-4130 orders@ncsa.uiuc.edu Publications Group Melissa Johnson (217) 244-0645 melissaj@ncsa.uiuc.edu Software Development Group Joseph Hardin (217) 244-6095 jhardin@ncsa.uiuc.edu Software Group Technical Support Jennie File (217) 244-0638 jfile@ncsa.uiuc.edu Systems Software Curt Canada (217) 333-3480 canada@ncsa.uiuc.edu Training Program Alan Craig (information) (217) 244-1988 acraig@ncsa.uiuc.edu Deanna Walker (registration) (217) 244-1996 dwalker@ncsa.uiuc.edu User Services Jim Bottum (Acting) (217) 244-0633 bottum@ncsa.uiuc.edu Visitors Program Jean Soliday (217) 244-1972 jsoliday@ncsa.uiuc.edu NOTE: All electronic mail addresses are via Internet. See inside back cover for details on anonymous FTP, Gopher, and World Wide Web. Subscriptions Mail to Documentation Orders, NCSA Newsletters, 152 Computing Applications Building, 605 E. Springfield Avenue, Champaign, IL 61820-5518. Enter delete my subscription to access. Change my subscription address. Remove duplicate name. (Enter name to be deleted below.) Name Title Company/Institution Address City State Zip Electronic mail address Telephone number Signature Date If you have questions or comments, please contact NCSA Orders at orders@ncsa.uiuc.edu or (217) 244-4130. editor's note HPCC AND ENGINEERING "Power and Progress in the Mechanical Age" was painted in 1939--at the threshold of World War II. Once again the entire globe is in a state of transition with the ending of the Cold War. As we momentarily poise to meet whatever challenges await, scientists and engineers anticipate new needs. Just as in the Mechanical Age, engineering will be a major player. Some "hot" innovations in the experimental aspects of engineering, such as nanolithography (see page 4) and multimedia in the laboratory (page 10), are just two examples of the current generation's technological advancements. Computational mechanics (see page 12) offers further potential. Today's world is keenly aware of the environment, and high- performance computing is providing solutions. Through its unique capabilities in processing large datasets, HPCC aids in the development of more efficient engines (page 8) and in the understanding of the Earth's complex systems (pages 20 and 21). 
Virtual reality, a 21st century technology, is being employed by award-winning design engineers at Caterpillar, an NCSA industrial partner, to develop better field machines (page 24). Advancements in XUNET, a bridge to implementing the information highway, are described on page 26. NCSA is enabling students to learn real science via networking and special software (page 27). A new section, "new technology" (pages 30-33), begins in this issue. Additions to NCSA's resources, including new hardware and software, will be featured in this department. Other news follows in "center activities." --Fran Bond, Editor "A nation is never finished. . . . It has to be recreated for each new generation." --John Gardner "Power and Progress in the Mechanical Age" painted by Bettie Becker and Frank Wiater, two seniors in Art and Design at UIUC, under the supervision of Professor Warren Doolittle (1939) is a mural on the first landing of the East entrance to Talbot Lab, Engineering campus, UIUC. (Photo by Wilmer Zehr) RESEARCH DEVELOPING TOOLS FOR NANOLITHOGRAPHY by Randall Graham, Science Writer nano-li-thog-ra-phy is a technique used for integrated circuit fabrication done on a dwarfed scale. (The prefix nano represents 10-9, or one-billionth of the unit adjoined.) "When it comes to making faster, smaller computers, the microelectronics industry is reaching a limit," says Joseph Lyding, NCSA principal investigator and UIUC professor of electrical and computer engineering. "Our goal is to use the Scanning Tunneling Electron Microscope (STM) for research-- particularly in the area of circuit miniaturization." STM DEFINED STM is one of two types of 3D atomic resolution microscopes. The other is the Atomic Force Microscope, or AFM, which generates images by measuring the atomic forces of interaction between its scanning tip and the sample's surface atoms. (A number of UIUC faculty are planning to utilize an AFM facility that is being established at the Beckman Institute.) Only STM, however, allows scientists to study both topographical and electrical properties of materials, which are important for understanding the behavior of microelectronic devices. STM is also the only instrument with the ability to rip atoms from the sample surface and relocate or otherwise manipulate them. Lyding's STM produces atomic-scale pictures by positioning its sharp tip a few atomic diameters from the sample and raster- scanning it to produce an image. Just enough voltage is applied to the tip to cause a small quantum mechanical tunneling current to jump across the gap to or from the sample. The quantum mechanical tunneling current (which cannot be explained by classical physics) is exponentially dependent on the distance from the tip to the atomic surface, so that one can convert tunneling current to the tip height above the surface very accurately. As the tip scans, the electrical and topographic properties of the atoms in the sample cause variations in the current. A feedback circuit moves the tip normal to the surface to minimize current variations. The feedback information is then processed into a picture on the atomic scale. Increasing the voltage enables a researcher to move atoms around, pile them up, or trigger chemical reactions. This activity takes place in an infinitesimal world of invisible units. "Current resolution with optical nanolithographic techniques is on the order of a half micron, about 5,000 angstroms," Lyding says. 
"Those methods are expected to hit their limit at around 1,000 to 2,000 angstroms in the next ten years. This is due to the wavelength restriction of the UV [ultraviolet] light used to project the circuit image onto a chip. "STM can easily make devices in the sub-1,000 angstrom resolution range. Recently we have used it to chemically modify silicon surfaces with 20 angstrom resolution. In these experiments, electrons from the STM tip are used to trigger local chemical reactions between the surface and adsorbed gas molecules." JOINING UIUC Lyding built his first STM personally, turning the parts in a UIUC machine shop. He holds a patent on one microscope currently being sold and has designed a nonmechanical method of positioning samples to eliminate vibration problems. "You can actually pick up my microscope while it's scanning, and the sample and tip will not crash into each other," he says. Lyding's STM, and other designs based on it, are widely used. Because of its simplicity, he explains, most users build their own instead of buying commercial products. In 1984 Lyding came to the UIUC to study the dynamics of charge- density wave (CDW) transport in quasi-1D metals, under the tutelage of electrical and computer engineering Professor John Tucker and legendary two-time Nobelist in physics John Bardeen. While using STM to observe CDWs, Lyding became fascinated with the technology's endless experimental possibilities. "John Bardeen encouraged my pursuit of STM," says Lyding. "He was very interested in probing the microscopic aspects of CDW phe- nomena, and he also encouraged the semiconductor STM work that we are now pursuing." METACOMPUTING AT NCSA "I'm always looking for ways to mechanize things," Lyding adds. To him, an alliance with NCSA seemed perfectly natural. "NCSA has given me the ability to control my lab from remote locations while viewing real-time images from the microscope. This means that once my students learn how to run the microscope, they can run it from anywhere." Key to Lyding is the integrationof the STM experiment into NCSA's metacomputer environment [see access, September-December 1991]. Since Lyding's STM facility is located in the Beckman Institute, which is intended to intermingle various scientific disciplines and to encourage collaboration, it was inevitable that he would learn of NCSA and its biological imaging group headed by Clint Potter. "A typical scenario for us is to use AVS [software] in conjunction with NCSA's CONVEX C3880 and CM-5, and our laboratory's HP 700 [Hewlett Packard] and PCs simultaneously. We typically use Rachael's [NCSA research programmer Rachael Brady of the biological imaging group] STM module running on the CONVEX to control the STM, acquire image data, and perform Viewit operations. [View-it is a general-purpose software for manipulating multidimensional arrays. It was developed by Potter and his group in collaboration with the Biomedical Resonance Laboratory.] All of this occurs within a remote module from within AVS, which is running on the HP in our laboratory. AVS on the HP is used to render high-quality light-sourced images of the STM scan area, bringing out details that we cannot see in the raw data. "Furthermore, we frequently process these images by taking 2D fast Fourier transforms [FFT] and looking at the power spectra for symmetry features in the image. The FFT module in our HP-AVS display actually sends the data to the CM-5 for parallel computation of the power spectrum which is then displayed on the HP. 
"All of this interactivity between machines is transparent to the user, thus my students and I can concentrate on the experiment and use the advanced rendering and analysis to steer the course of the experiment." SIGGRAPH DEMO Lyding demonstrated his metacomputing capabilities at SIGGRAPH '92 in Chicago as part of the innovative Showcase. Funded by NSF and ARPA, Showcase was the largest exhibit and featured 40 interactive and collaborative leading-edge applications of HPCC. It was supported by fourteen corporations and six forefront laboratories [see access, May-June 1992]. An SGI workstation in a Showcase display booth was connected to NCSA's CONVEX C3880 in Urbana via the T3 fiber optic link. "People were genuinely impressed when they found out we were controlling a live experiment from Urbana in Chicago and could remotely move the STM scan area around in real time," says Lyding. "The distance had no effect on response time." Providing Lyding with remote capabilities was a challenge for Brady, who explains how she forced a solution: "All of Joe Lyding's microscopes are controlled by IBM PCs running MS/DOS on either 386 or 486 chips. These are single tasking machines, which means you cannot run something in the background while you do something in the foreground. To hook his computers up to another box, those PCs are going to have to control the experiment and at the same time listen on the network for any commands coming from a remote site. "Making his machines do two things at once is a total hack. You tie all the network listening stuff to interrupts, which are built into the MS/DOS operating system. When a request comes from the network, it gets shoved into a buffer on the PC and a flag is set within the PC's hardware. . . . Right now, this is all done synchronously. Next, we would like to do it asynchronously--that is, interrupt Joe's program and say 'here's a new request.'" MORE POWER FOR MICROMANIPULATION The STM's ability to manipulate atoms has Lyding and others considering it for microelectronic manufacturing roles. To advance this effort, Lyding joined six colleagues from UIUC and two from the University of Minnesota in founding the STM-based Nanolithography University Research Initiative. They study the behavior of proposed new electronic devices and design STM techniques for fabricating them. UIUC members of the research group include: Ilesanmi Adesida, Stephen Bishop, K.-Y. (Norman) Cheng, Karl Hess, and John Tucker, Department of Electrical and Computer Engineering, and Munir Nayfeh, Physics. Members at Minnesota are Stephen Campbell and Ted Higman, Department of Electrical Engineering. "We are finding that these tiny new devices will not function like today's larger-scale ones. Quantum mechanical effects begin to play a dominant role, so we are exploring ways to harness those effects for our advantage." Since STMs operate sequentially, a large array of them will be necessary to map circuits with enough speed to be cost effective. Lyding's STM images are currently rendered from 2D datasets, but they will become 3D as soon as he begins to use electron energy as the third dimension. Lyding says he will need teraflops computing speed to bring detailed simulation of the tunneling probe tip and its interaction with the sample online with the experiment. Understanding the probe tip is especially important in the STM-based Nano- lithography University Research Initiative where atomic-scale surface modification schemes are being developed. 
Voltages required to modify a sample could also damage the tip. Other collaborations underway with Karl Hess, director of the NSF- funded National Center for Computational Electronics (NCCE), will also require more compute power. "Karl Hess's group has developed the analysis tools to perform a full quantum mechanical molecular dynamics simulation of the tip and surface atoms in response to the fields and forces that exist in the tunneling junction," says Lyding. "Tremendous compute power is needed to extend this calculation to a volume consisting of, say, 50 atoms that might realistically bound the volume of tip/surface interaction. This effort will dovetail smoothly with the existing compute/experiment protocols developed for us by NCSA. "Hess has proposed a number of intriguing new electronic devices that will only operate in the size regime where quantum effects emerge and dominate," Lyding continues. "Using conventional lithographic technology, these devices will only work at liquid helium temperature (4.2 K) or lower. However, with STM nanolithography, room temperature operation should be possible. Hess is working in close collaboration with Tucker on developing and simulating new device concepts that are amenable to STM nanolithography." LINKING WITH OTHER SCIENTISTS "I think there are a lot of other scientists out there who could benefit by networking their scientific instruments with NCSA's machines," says Brady. "It's just that they do not realize the potential advantage when they have been getting by without us." In September, Brady got an oppor-tunity to outreach to other researchers when she copresented with Lyding at the Workshop on Real-Time Applications of High Performance Computing for Biological Imaging that NCSA cosponsored with the Beckman Institute with NSF funding. Their topic was entitled "The Distributed Scanning Tunneling Microscopy Laboratory: Real-time Control, Visualization, and Modification at the Atomic Level." Potter and Bridget Carragher, director of the Optical Visualization Facility at the Beckman Institute, were co-chairs. "The great thing about working with Joe is that he appreciates computers," says Brady. "He understands what they can do." s NOTE: If you want to order a videotape about Lyding's research, "A Quantum View" from NCSA RealTime #4, contact Orders for Publications, NCSA Software, and Multimedia [see ncsa contacts, page 2]. "I'm always looking for ways to mechanize things."--Joseph Lyding "All of this interactivity between machines is transparent to the user, thus . . . I can concentrate on the experiment," says Joseph Lyding, shown here with his lab setup in the Beckman Institute. Seated before the HP 700 workstation, he calls up an AVS file of an experiment that was run on NCSA's CONVEX C3880. STMs are housed in vacuum chambers in the background. (Photo by Wilmer Zehr) (Far left) GaAs (100)--gallium arsenide--showing as atoms in channels on surface. (Above right) Si (100)--silicon--surface showing dimer rows of Si atoms. (Courtesy of Joseph Lyding) (Far left) Si (111)--silicon--surface showing detailed atomic structure. (Above right) "UI" written with small piles of silicon atoms on a Si (111) surface. Atoms were moved by the STM. AVS running on NCSA's CONVEX C3880 was used to control Joseph Lyding's STM nanolithography experiment in which "NCSA" was written on a silicon surface that is shown above each head on this spread. 
(Courtesy of Joseph Lyding) Creating the numerical engine by Sara Latta, Science Writer Vehicle manufacturers are finding themselves in a quandary: the 1990 amendments to the Clean Air Act require considerable reductions of pollutant-forming emissions by the year 2000, and there is ever-increasing pressure to improve fuel efficiency. At the same time, competitive market pressures demand a faster turnaround time for new designs than ever before. Fortunately, vehicle manufacturers have some new tools to help them keep pace with the demands of the environment and the market. One of them is engine modeling, or the creation of a "numerical engine"--allowing manufacturers to test new engine components on computer before creating expensive prototypes. Dennis Assanis, associate professor in the UIUC Department of Mechanical and Industrial Engineering, has been performing pioneering work modeling turbulent flows in engines on NCSA's high-performance computers. He spent part of his 1991-92 sabbatical at NCSA, working to improve engine modeling techniques, and part of it traveling to major vehicle manufacturers worldwide. Assanis wanted to tailor his research efforts to the problems and challenges faced by the automotive industry. MODELING TURBULENT REACTIVE FLOWS Internal combustion engines are extremely complex energy systems, involving the coupled phenomena of combustion, turbulent fluid flow, turbulent flame propagation, ignition and extinction, pollutant formation, and wall heat transfer, and in diesel and fuel injection engines, spray dynamics. Those phenomena are characterized by a number of different time and length scales. Because of the extreme computational demands, engine combustion modeling has been identified as a Grand Challenge problem. One of the most powerful multi-dimensional engine simulation codes is KIVA and its offshoot, KIVA-II, developed by the Los Alamos National Laboratory (LANL). Although KIVA-II, a vectorized code for Cray supercomputer systems, can deal with many of the phenomena occurring in internal combustion engines, Assanis and his students at UIUC found that it does not adequately predict some aspects of fuel spray dynamics and pollutant formation. Their improvements are being incorporated into a new version of KIVA called KIVA-3. Unlike KIVA-II, KIVA-3 can handle both the intake and exhaust flows, allowing for the simulation of an entire cycle. In automobile engines prior to the mid-1980s, air and fuel were mixed in a carburetor prior to entry into the combustion chamber. In modern gasoline engines, fuel is sprayed into the ports of individual cylinders. In diesel engines, fuel is sprayed direct-ly into the combustion chamber. The formation and dynamics of a fuel spray--poorly understood from a modeling or experimental standpoint--is crucial to the combustion process. The droplets must be small enough to evaporate readily so as to mix with the air-- within flammability limits. Assanis is trying to add to KIVA-3 much of the knowledge about single-drop evaporation under high temperatures and pressures into the reality of drop evaporation in a spray. Unless the combustion chamber is very large (as it is in some diesel engines), some of the fuel spray will impinge on the walls of the chamber. "We do not really know what will happen after the droplets impinge upon the walls," says Assanis. "They can shatter into smaller droplets, they can rebound from the walls, or they can form fuel films on the walls, which may then evaporate. 
Depending on what happens to the droplets after impingement, fuel mixing can be better or worse." KIVA-II assumed that drops reaching the wall stick to the wall and continue to vaporize. Assanis has incorporated other impingement options into KIVA-3, producing a code that is much better at predicting the actual behavior of the fuel droplets.

During the combustion process, fuel vapor can be absorbed in the thin layer of oil present on the walls of the combustion chamber. Later in the cycle, as temperatures drop, the fuel vapor is released back into the chamber--just in time for the exhaust process. According to Assanis, this is one of the major mechanisms leading to unburned hydrocarbon emissions--major culprits in pollutant formation. Assanis is using KIVA-3 to better model this behavior, an area of which he says: "I feel that we are pioneers."

Assanis is currently working on improving KIVA-3 to model direct injection of natural gas into the combustion chamber. In the near future, he hopes to collaborate with experimentalists to validate code predictions. "Lately, people have started to realize that natural gas is a fuel with great promise in terms of reducing pollution and improving fuel economy," says Assanis. "This presents a new challenge. We would like to be able to provide some guidance to industry on how to improve the design of gaseous fuel injectors, how to position and target them, where to place the glow plug (an ignition assist device), and how to improve the mixing and penetration of the gaseous fuel plume with the rest of the air in the combustion chambers."

IMPROVING TOTAL SYSTEM DESIGN

Assanis found--during his sabbatical tour of automobile manufacturers in Europe, Japan, and the U.S.--that although KIVA-II is among the software of choice in the companies' research labs, it is still not used for real engine design. "I don't see KIVA being routinely used for component and system design for at least another ten years," he says.

Modeling all the complex processes inside the combustion chamber means little if engine designers are unable to interpret and couple those processes with other engine components. Yet the results of these computational fluid dynamics (CFD) computations are not readily interpretable by most designers. Assanis is taking the results from KIVA and incorporating them into thermal, structural, and other systems-level codes, in an attempt to bridge the gap between modeling and actual design and manufacturing. Working with NCSA's Software Development Group, Assanis hopes to create visualizations and animations of fluid and spray dynamics and their interactions, an approach that he feels will allow designers to better understand the results of the KIVA code.

Another problem is that a typical simulation of just one cycle takes about 30 hours on the CRAY Y-MP system, a time that Assanis says is simply unacceptable to industry designers. Advances in rapid prototyping have allowed designers to cheaply test some--but not all--components. He says many medium-sized companies simply do not own or have access to Cray computers--they typically do all of their engine development using simplified, zero-dimensional tools on PCs or workstations.

The answer, according to Assanis, is parallelization of KIVA--a task that KIVA's developers at LANL are currently pursuing. NCSA is working with LANL and Thinking Machines Corp. to port KIVA to NCSA's CM-5. Assanis will then incorporate his latest models of engine flow and spray processes into the parallelized version of KIVA.
Not only would parallelization allow the modelers to introduce even more variables into their calculations and achieve finer grid resolution, but a parallelized code would be faster and could be ported to clusters of networked workstations.

SOLVING FLOWS IN TWO-STROKE ENGINES

Assanis and his students have developed a code, ARIS-3D, designed to study 3D flows in complex geometries. Intake and exhaust occur simultaneously in two-stroke engines, which are currently used to power motorcycles, snowmobiles, and lawn mowers. In a single revolution, a fresh fuel mixture must enter the combustion chamber while displacing the exhaust mixture--a process known as scavenging. "Unless you are very careful," says Assanis, "you can have many recirculation zones in the chamber full of unburnt fuel and exhaust products." This results in reduced fuel efficiency and increased hydrocarbon emissions. Two-stroke engines may be the automobile engine design of the future, according to Assanis, if the scavenging process can be improved so that the engines can meet emission standards.

"If a code is ever to be successful," says Assanis, "you need the participation and close collaboration of the modeler; the experimentalists who validate the models; the specialists in visualization, parallelization, and algorithm development; and the designers in industry. We will see a lot more multimachine, multiperson collaborations in the future."

Combustion modeling has been identified as a Grand Challenge problem. (Illustration by Marshall Greenberg)

Evaporation history of a decane droplet, initially at a temperature of 350 K, injected into air at a temperature of 1000 K and a pressure of 10 atmospheres. Computations using the model proposed by Varnavas and Assanis (black lines) are compared with other simplified droplet evaporation models, including one in the original KIVA code (blue lines). (Courtesy of Constantine Varnavas and Dennis Assanis)

Dennis Assanis (photo by Thompson-McClellan Photography)

Finite element analysis of transient heat flow through a low-heat rejection piston-liner design. (Left) Component meshes illustrating design with various insulating materials. (Right) Temperature contours at top dead center predicted using gas-side boundary conditions from KIVA. (Courtesy David Baker and Dennis Assanis)

Engineering with Visual Analysis
by Sara Latta, Science Writer

The 1989 crash of United Airlines Flight 232 in Sioux City, IA, might have been prevented if the computer-aided tomography (CAT) scan data and visual analysis techniques used by Ron Kriz and his collaborators at NASA Langley had been available. The cause of the crash was a crack in a turbine fan blade in the aircraft's engine--a flaw not always detectable by conventional ultrasonic nondestructive testing.

Kriz, NCSA's academic affiliate representative from Virginia Polytechnic Institute and State University (Virginia Tech), came to Champaign-Urbana in the summer of 1992 with a set of data supplied by the manufacturer of QUEST, a CAT-scan system that detects small flaws in advanced material systems. Using NCSA's visualization software (i.e., X DataSlice Isosurface, Viewit, X DataSlice Dicer, and X Image), Kriz transformed the complex dataset into an image of a turbine fan blade--with a serious flaw hidden in the center (see image above left). The results of his visual analysis efforts helped Kriz secure a contract with NASA Langley Laboratory, which had just acquired a QUEST system of their own for materials analysis.
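The flaw-hunting step Kriz describes rests on a simple idea: in a CAT-scan volume, an internal void or crack shows up as voxels whose density falls well below that of the surrounding metal. A minimal sketch of that idea follows; it assumes a hypothetical 3D NumPy array of attenuation values rather than the QUEST data or the NCSA tools named above.

```python
# Minimal sketch: locating a low-density internal flaw in a CT volume by
# thresholding. This stands in for the interactive NCSA visualization tools
# named above; `ct_volume` is a hypothetical 3D array of attenuation values.
import numpy as np

def find_flaw(ct_volume, solid_level, flaw_fraction=0.5):
    """Return the bounding box of voxels far below the solid density, or None."""
    flaw_mask = ct_volume < solid_level * flaw_fraction
    if not flaw_mask.any():
        return None
    idx = np.argwhere(flaw_mask)                 # (N, 3) voxel coordinates
    return idx.min(axis=0), idx.max(axis=0)      # corners of the bounding box

if __name__ == "__main__":
    # Synthetic 64^3 block of uniform material with a small hidden void.
    vol = np.full((64, 64, 64), 1.0)
    vol[30:34, 28:31, 40:45] = 0.05              # the flaw
    vol += 0.02 * np.random.randn(*vol.shape)    # scanner noise
    print("flaw bounding box:", find_flaw(vol, solid_level=1.0))
```

In practice the masked region would be rendered as an isosurface or a set of slices rather than reported as coordinates, which is the kind of picture the visual analysis described here produces.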
BUILDING THE LAB

An associate professor of engineering science and mechanics, Kriz has become something of the "visual analysis guru" in Blacksburg, VA. Beginning in 1990 with just a couple of Sun SPARC 2 workstations and PV Wave software, he and his colleagues soon realized the need for a more comprehensive visual analysis lab. With the financial backing of the High Performance Polymers and Adhesives NSF Science and Technology Center and the Virginia Institute for Material Systems, and space from his department, Kriz formed the Scientific Visual Analysis Laboratory at Virginia Tech. The facility, which is open to any Virginia Tech faculty member, is equipped with two Macintosh IIfx computers with multimedia capabilities, two Sun SPARC 1 workstations, two Sun SPARC 2 workstations, and one Silicon Graphics IRIS-4D/320 VGX system. Most of the lab's software programs are from NCSA's Scientific Visualization Software Suite produced by the Software Development Group (SDG). All these efforts led Digital Equipment Corporation to start a digital visualization reference center in the visual analysis lab.

"After building the laboratory," says Kriz, "I returned to NCSA to learn how to use the visual analysis tools. The support I received from the folks at NCSA, especially those in Joe Hardin's [SDG] group, was really fantastic. There was hardly any learning curve at all, which translated back to my students and colleagues at Virginia Tech. As an academic affiliate representative of NCSA, I can provide the people working in the laboratory a choice of computers and software."

Kriz has brought NCSA's visual analysis tools to the classroom as well. He created a new course in scientific visual analysis during the same busy summer that he put together the Visual Analysis Lab and came to NCSA to learn to use visualization tools. Students in the course, popular among engineering students in a number of areas, develop projects in scientific visual analysis that culminate in multimedia presentations. "This eventually helped to support the creation of a new multimedia laboratory for undergraduate education," says Kriz, "as part of the NSF SUCCEED program." Virginia Tech is one of eight southeastern universities that comprise SUCCEED (Southeastern University and College Coalition for Engineering Education).

"We're anticipating that we will be using NCSA sonification tools [see access, May-June 1991] in our multimedia lab," Kriz continues. "The director of that lab, Gordon G. Miller III, is interested in using NCSA visualization tools, like Image and DataScope, in the lab."

During the summer of 1993, three undergraduate students worked with their professors at Virginia Tech to develop multimedia training modules for undergraduate courses in chemistry, materials science, and engineering science and mechanics. Funding was provided by the NSF Science and Technology Center Summer Research Undergraduate Program. Virginia Tech has an anonymous ftp site to distribute the multimedia modules, which are specifically geared to engineering students. Enter ftp viz2.multimedia.vt.edu to obtain these modules; the username is anonymous, and the password is your e-mail address (see the sketch at the end of this section).

Multimedia applications are deemed so important to the College of Engineering's curriculum that this fall's freshman class is required to have computers that can handle multimedia applications such as animation, graphics, photographs, full-motion video, and sound. For nearly a decade, the college has required students to have their own computers.
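The anonymous ftp retrieval described above can also be scripted. Here is a minimal sketch using Python's standard ftplib; only the host name and the anonymous login convention come from the article, while the directory and file names are hypothetical placeholders.

```python
# Minimal sketch of an anonymous ftp transfer like the one described above.
# The host and login convention are from the article; the directory and file
# names below are hypothetical placeholders.
from ftplib import FTP

HOST = "viz2.multimedia.vt.edu"

def fetch(remote_dir, filename, email="user@example.edu"):
    """Download one file via anonymous ftp, using an e-mail address as the password."""
    with FTP(HOST) as ftp:
        ftp.login(user="anonymous", passwd=email)
        ftp.cwd(remote_dir)
        with open(filename, "wb") as out:
            ftp.retrbinary("RETR " + filename, out.write)

if __name__ == "__main__":
    fetch("pub/modules", "README")   # hypothetical path and file name
```

An interactive session with the ftp command accomplishes the same transfer; the script is just the unattended equivalent.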
"SEEING" STRESS WAVE MEASUREMENTS Having performed some preliminary simulations and visualizations of acoustic microscope measurements on MIT's CRAY-2 system, Kriz now plans to use Thinking Machines' CM-5 system at NCSA to do the same. An acoustic microscope differs from an optical microscope in that it uses stress waves (sound waves traveling through a solid material), rather than light waves. "Stress waves do not just reflect off the surface of a material," says Kriz. "They penetrate and can reflect off structures below the surface as well. "The materials I am studying--fiber-reinforced epoxy polymers--have fibers below an epoxy polymer surface. The area where the fibers bond to the polymer forms an interphase--not a fiber, not a polymer, but something in-between." Kriz plans to use NCSA's CM-5 to simulate the propagation of stress waves below the surface of the poly-mer and the reflection off the fibers below. After running the simulation and visualizing the nature of the interphase, Kriz can better interpret the actual measurements from the acoustic microscope. "This is the first time," Kriz adds, "that anyone has ever experimentally measured these interphase structures by stress wave propagation. The objective is to bring the two--simulation and visualization, and experimental measurement--together. We're getting there." s NOTE: Virginia Tech's Visual Analysis Lab has posted material on its Gopher server. Using NCSA Mosaic, select the NCSA Mosaic Home Page, then World Wide Web; W3 servers; Virginia Tech (under Virginia); Visualization Lab. For directions about obtaining NCSA-developed software, see the inside back cover. "The support I received from the folks at NCSA . . . was really fantastic." --Ron Kriz Visualizations on this spread (counterclockwise from top of page 10) show profile of a titanium turbine, flaws (black spots) in a turbine cross section, and a sequence of wave propagation in an epoxy/graphite composite. The latter is a unipolar-planar pulse that can only be created with a supercomputer. (Courtesy Ron Kriz) NCSA and Structural Mechanics by Jarrett Cohen, Staff Associate, Director's Office "Our goal is doing computational mechanics, especially solid mechanics, on massively parallel machines," says M. Fouad Ahmad, NCSA research scientist in structural mechanics. Ahmad, who holds a Ph.D. in theoretical and applied mechanics, is collaborating with researchers from several engineering disciplines in academia and industry. A large part of Ahmad's time is spent in helping researchers better use NCSA's resources for the structural mechanics community. Some examples follow. ANALYSIS OF SHEAR FLOW LOCALIZATION Tarek Shawki, UIUC associate professor of theoretical and applied mechanics, is pleased with the outcome of his collaboration with Ahmad on shear flow localization. Their results, Shawki says, put "things into a form that is easily used by the engineering community." Beginning with his graduate studies at Brown University in the early 1980s, Shawki has been studying the phenomenon of shear flow localization--"a mechanism through which materials fail to carry the load they are supposed to carry." Localization takes place "by the concentration of plastic deformation into narrow, band-like regions instead of being uniformly spread out," he says. It occurs in such applications as projectile impact, high-speed machining, high-speed cutting, and armor penetration. 
Conditions that lead to shear flow localization are numerous, which is what makes the problem so complex, according to Shawki. Material behavior, rates of deformation, inertia, heat conduction, and boundary conditions are just a few. Shawki's research involves both analysis and computation. On the analytical side, he and his colleagues have developed nonlinear solutions for special cases and a new "energy-based localization theory." "The energy-based theory associates localization with a characteristic evolutionary behavior of the system's total kinetic energy," says Shawki. "In this context, the kinetic energy plays a fundamental role as a single parameter capable of characterizing the complete localization history. "With that discovery," he continues, "we realized that computations that have been performed in the past were not always covering the full localization history. What we saw on the computer was not what we expected through the theory. Armed with the theory, we realized that we could push the computations a lot further." NEW PERFORMANCE LEVELS "For a computation to resolve the evolution of localization until failure takes place," Shawki stresses, "great care should be taken to ensure that the numerical scheme converges to the right answer." Even though the physical event takes place within microseconds or milliseconds, getting to the final stage of localization is computationally demanding, he adds. First attempts by Shawki and NCSA research assistant Harischandra Cherukuri to reach the "right answer" involved running 1000 x 1000 grids for five days straight on a Hewlett-Packard 750 workstation. About two years ago, Ahmad joined their team. His first step was to optimize the code for the CRAY Y-MP system, achieving 130 MFLOPS on a single processor. After further investigation of the motion and energy balance equations that depict localization, Ahmad says "it seemed quite obvious that they could be parallelized." He went on to rework the code for NCSA's CM-2 and then the CM-5 systems (both from Thinking Machines Corp.). On the CM-5's full 512 nodes, the code ran at 11 Gflops for a very fine grid. (See graph on page 13 for Mflops rates on several machines.) "Today, problems which took days to solve before take less than six hours on the CM-5," Ahmad says. This simulation mirrors and agrees with the results of an experiment done at Brown University, Shawki says. "It is a model for a Kolsky bar experiment, which involves applying a torque at a high rate to a thin-walled tube." The NCSA/UIUC team used AVS for their visualization software, with data moved from the CM-5 to the CONVEX C3880 system. Daniel Cancro, an undergraduate intern at NCSA on leave from the University of Notre Dame, wrote the AVS module to interpret the data. The visualization was part of a videotape "A Massive Parallel Computation of Plastic Flow Localization in Dynamic Viscoplasticity." The video shows a solid block, subject to continued shearing, that appears to break into two rigid blocks. A small colored band between the blocks represents the localized zone. Color designates where significant temperature increases occur. "The material inside those localization regions softens as the temperature increases, and it deforms more," Shawki says. "As it deforms, it generates heat, and that heat contributes to further softening, and the cycle continues, leading to further localization. It is catalytic." 
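The feedback loop Shawki describes (deformation heats the band, heat softens the material, and softening concentrates further deformation there) can be illustrated with a toy one-dimensional model. The sketch below is emphatically not the Shawki/Cherukuri/Ahmad code: the constitutive law, parameters, and boundary conditions are invented for demonstration only, and the grid is tiny. It is meant solely to show why the hot band runs away.

```python
# Toy 1D illustration of thermally assisted shear localization.
# NOT the Shawki/Cherukuri/Ahmad model: the material law, parameters, and
# boundary conditions here are invented purely for demonstration.
import numpy as np

N, L = 101, 1.0                          # grid points, layer thickness
dx = L / (N - 1)
rho, heat_cap, k_cond = 1.0, 1.0, 0.01   # density, heat capacity, conductivity
mu0, soften = 1.0, 5.0                   # reference viscosity, softening coefficient
V = 1.0                                  # imposed boundary velocity (shearing)
dt, nsteps = 2.0e-5, 20000

x = np.linspace(0.0, L, N)
v = V * x / L                            # start from uniform shearing
T = 0.1 * np.exp(-((x - 0.5 * L) / 0.05) ** 2)   # small hot spot seeds the band

def viscosity(temp):
    # Made-up thermal-softening law: flow resistance drops as temperature rises.
    return mu0 * np.exp(-soften * temp)

for _ in range(nsteps):
    gdot = np.diff(v) / dx                            # strain rate at half points
    tau = viscosity(0.5 * (T[:-1] + T[1:])) * gdot    # shear stress at half points
    # Momentum: interior nodes accelerate with the stress gradient.
    v[1:-1] += dt / rho * np.diff(tau) / dx
    v[0], v[-1] = 0.0, V                              # driven boundaries
    # Energy: conduction plus plastic work (stress times strain rate) heats the band.
    heating = tau * gdot
    q_node = 0.5 * (heating[:-1] + heating[1:])
    T[1:-1] += dt / (rho * heat_cap) * (k_cond * np.diff(T, 2) / dx**2 + q_node)
    T[0], T[-1] = T[1], T[-2]                         # insulated walls

print("peak temperature:", T.max(), "at x =", x[T.argmax()])
print("strain-rate concentration factor:", (np.diff(v).max() / dx) / (V / L))
```

Run long enough, nearly all of the imposed shearing ends up in the few cells around the hot spot: the band-like localization described in the article. The real computations track the same competition with far finer grids, a viscoplastic constitutive law, and millions of timesteps, which is why they need the CM-5.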
"The final catastrophic behavior has not been reported before; the visualization shows the details of the temperature rise and the deformation," Ahmad adds. A GRAND CHALLENGE This "very careful development of the numerical schemes led to results that really exhibit the same things that the theory anticipated," Shawki says. "With recent advances involving the energy characterizations, it appears that the kinetic energy is a potential candidate for the characterization of localization." Even with an 85-fold increase in speed over the Y-MP on the CM-5, today's machines are not powerful enough to make a definite determination, classifying this as a Grand Challenge project. The current study only considers one spatial dimension. "A higher dimension, even at this speed, would take thousands of hours on a CM-5," Ahmad says. This difficulty is caused by the need for a very fine grid, which Ahmad calls a "Catch-22," as a finer grid requires more timesteps. The current 1D simulation already takes 3 million timesteps. Since teraflop computers are expected to be available by the mid-1990s, such hurdles will not remain insurmountable for long. With such performance in hand, Shawki hopes "this energy development leads us to the identification of a material resistance to localization that can be experimentally measured and tabulated in materials handbooks." Ahmad presented this work at the First Computational Mechanics Working Group Meeting of the National Consortium for High Performance Computing (NCHPC), held May 19-20 in Albuquerque, NM. Engineers from univer-sities, industry, and government attended. One of the workshop's purposes was facilitating collaborations between NCHPC members and industry in HPCC use, a task with which Ahmad is much engaged. Other ongoing academic projects in which Ahmad is involved include the study of viscoelastic material with Harry Hilton, UIUC professor emeritus of aeronautical and astronautical engineering, and the probing of the micromechanics of composite materials with Rami Hajali, UIUC graduate student in civil engineering. HELPING BUSINESSES Besides working with the academic community, Ahmad aids companies in their R&D efforts. Structural mechanics projects with Caterpillar, an NCSA industrial partner, have involved simulations of waste properties in landfill compaction and of the edge-rolling process for equipment design. With Dow, the effect of foam structure on cushioning properties of flexible polyurethane was modeled. In the last year, Ahmad also worked with a smaller company, Bonutti Orthopedic Services, Effingham, IL, whose research was funded by the Illinois Consortium for Technology Transfer [see access, January-February 1991]. Forty-five employees work in the Bonutti complex, which includes a clinic; a sports medicine and physical therapy area; and Apogee Medical Products, the R&D arm. Developing orthopedic devices James Hawkins, R&D director at Apogee, says the com-pany's philosophy is that "all aspects of a person's wellness are incorporated into the group." Founder and president Peter Bonutti is an orthopedic surgeon. "When Dr. Bonutti is performing surgery, he sees the need for new devices to help him," says Hawkins. "Apogee was begun for R&D because he had special applications." Treating Carpal Tunnel Syndrome is one example of a special application. In operations, Bonutti needs to "create a potential space in the wrist [and] . . . to incrementally expand tissue in a defined area," Hawkins says. 
To do this, it is necessary to "know the size and shape of the space that the expander would create." To model expander shapes, the company turned to NCSA for assistance. In the summer of 1992, Hawkins began a collaboration with Ahmad and NCSA research assistant Gerry Pollock, a recent Ph.D. graduate from UIUC in aeronautical and astronautical engineering. NCSA initially trained Hawkins on the center's systems. For the expander project, he used the CRAY-2 system from a Silicon Graphics workstation at NCSA. The expander "pushes flesh out of the way so they can see the area a little better. They insert a camera through the hollow center," Pollock says. Bonutti had been using mechanical devices but wanted to try a balloon instead, he explains. "He wanted to follow the same procedures but in a way that the tissue would be less damaged." Modeling expanders Employing finite element analysis, the NCSA and Apogee collaborators looked at several different shapes. "Initially, we had hoped to effectively predict the shape of an expanded balloon from its deflated geometry and its material properties," Hawkins says. To accomplish this goal, they employed an iterative technique. It takes a large amount of computer time to expand the balloons, since it is done incrementally. "The expander is a nonlinear problem; access to supercomput-ers is important," Pollock says. "On a workstation, it would take a couple of days." After completing the supercomputer models, Hawkins and his team at Apogee tried several experimental designs in the laboratory-- principally to solve material inconsistencies. "By theoretically performing the design iterations on the computer, we save a great deal of time and money." Creating better knee replacements A more recent Apogee project is patella, or kneecap, replacements. Pollock says they are typically made of a laminate, with a polyethylene part in contact with the tissue and a metallic part that meshes into the bone. Hawkins says the patella replacement is applied using a solid layer of cement--which Bonutti wants to avoid. "We looked at using smaller amounts of bone cement," he says. "We are trying to see if it will have the same fixation strength. Our related research indicates that less cement is better." Again taking a finite elements approach, the team modeled several patella designs with "loading" on them and evaluated the stress distribution through the system. Loading is the stress put on the knee by a bent leg, for example, when climbing stairs. This computational analysis looked at new geometries and less cement as opposed to a solid layer of cement. The models were carried out on NCSA's CRAY Y-MP system, with visualizations created using a postprocessor. "After the computer modeling, we created geometries like the model to see if what the computer said actually happens," Hawkins says. The patella replacement work will be useful for studying other cemented components. NEW APPLICATIONS CODE As part of Ahmad's responsibilities to NCSA users in structural mechanics, he enables them to have state-of-the-art resources, such as software. Recently he and NCSA research programmer Danesh Tafti got Centrex Spectrum software application for fluids and solids on the CRAY Y-MP system. Soon a CM-5 version will be in place. According to Ahmad, it is a "new code with great potential." s "Our goal is doing computational mechanics, especially solid mechanics, on massively parallel machines."--M. Fouad Ahmad Mflops rate for different machines. 
As an extension of his collaboration with Tarek Shawki, M. Fouad Ahmad is studying the performance of the shear flow localization code on several platforms. Mflops performance rates are given for a 10,000 - timestep simulation. (Courtesy M. Fouad Ahmad and Tarek Shawki) Treatment for Carpal Tunnel Syndrome with expanders developed using NCSA's computational resources. (Illustration by Marshall Greenberg) Inset: Expansion of bladder cross section with uniform internal pressure. (Courtesy Peter Bonutti) M. Fouad Ahmad (photo by Thompson McClellan Photography) Allocations in engineering Suresh K. Aggarwal, UIC, "Computations of Particle-Laden Turbulent Flows" and "Comprehensive Modeling of Multicomponent Fuel Sprays" Russell L. Alberts, U. of Nebraska, Lincoln, "Finite Element Analysis of Retained and Removed Orthopedic Screws and Finite Element Analysis of the Growth Plate in Bone" Ryoichi S. Amano, U. of Wisconsin, Milwaukee, "Computations of 3D Turbulent Flows in Gas Turbines" Ronald Paul Andres, Purdue U., "Molecular Dynamics Simulations of Cluster Melting, Cluster-Cluster Collisions, and Cluster- Substrate Interactions" Mehdi Anwar, U. of Connecticut, "Transport in Lower Dimensional Structures" Dennis N. Assanis, UIUC, "Development of an Improved Evaporation Model for the KIVA-3 Multidimensional Engine Simulation" Nasser Ashgriz, SUNY, Buffalo, "Numerical Simulation of Binary Drop Collisions" S. Balachandar, UIUC, "Simulation of High Rayleigh Number Thermal Convection with Imposed Mean Shear and System Rotation" Constantine Balanis, Arizona State U., "Finite-Difference Time-Domain Technique for Antenna Radiation" Sanjoy Banerjee, U. of California, Santa Barbara, "Multiphase and Interfacial Phenomena" Romesh C. Batra, U. of Missouri, Rolla, "Studies in Penetration Mechanics" Yildiz Bayazitoglu, Rice U., "A 3D Model for the Enhancement of Heat and Mass Transfer from a Sphere in a Superimposed Acoustic Field" Robert A. Beddini, UIUC, "Simulation of Large-Eddy Turbulence in Compressible Ducted Flows" Theodore B. Belytschko, Northwestern U., "Adaptive Multidomain Methods for High Gradient Problems in Solid Mechanics" Lawrence Alan Bergman, UIUC, "Solution of the Fokker-Planck Equation Arising in Stochastic Dynamical Systems" M. A. Bhatti, U. of Iowa, "Performance of Rigid Concrete Pavements Subjected to Dynamic Vehicle Loads" Amir Boag, UIUC, "Complex Multipole Beam Approach to Electromagnetic Scattering" David E. Boyce, UIC, "Dynamic Route Guidance Systems Design" and "Transportation Network Equilibrium Modeling" M. Quinn Brewster, UIUC, "Nonsteady Burning of Solid Fuels in the Presence of Thermal Radiation and Pressure Transients" and "Solid Rocket Motor Internal Flowfields" Richard O. Buckius, UIUC, "Rigorous Predictions of the Scattering of Thermal Radiation from Very Rough Surfaces" Aditi Chattopadhyay, Arizona State U., "Multidisciplinary Design Optimization Multiobjective Formulation Techniques" Weng Cho Chew, UIUC, "Nonlinear Inverse Scattering of Large Objects using the Local Shape Function (LSF) Method" and "Multiple Scattering Study of Random Inhomogeneous Media using Supercomputers" Yee C. 
Chiew, Rutgers U., "Force Biased Monte Carlo Simulation of Dense Chain Fluids" Shun Lien Chuang, UIUC, "Linear and Nonlinear Optical Properties in Semiconductor Lasers and Optoelectronic Devices" Alexander Chudnovsky, UIC, "Numerical Simulation of Crack- Damage Interaction" Jacob Nan-Chu Chung, Washington State U., "Direct Numerical Simulations of Laminar-Turbulent Transition in Mixed Convection Pipe Flow and Free Convection Boundary Layer Flow," "Direct Numerical Simulation of Flow Transition at Low Reynolds Numbers in a Vertical Pipe Flow," and "Heat Transfer Effects on the Structures of Transition in Blasius and Entrance Channel Flows: A Direct Simulation Study" T. J. Chung, U. of Alabama, Huntsville, "Analysis of Turbulent Shock Wave Boundary Layers in Compressible Reacting Flows with Parallel Processing" and "Development of Parallel Computer Algorithm for Turbulent Compressible Reacting Flows" David Churchill, Clemson U., "Fiber-Fiber Interactions in the Prediction of Elastic Properties of Short-Fiber Composites" M. E. Clark, UIUC, "Cardiovascular System Simulation" Michael P. Cleary, MIT, "Theoretical and Numerical Simulations of Hydraulic Fracturing Processes" Thomas F. Conry, UIUC, "Thermal Analysis of Elastohydro-dynamic Bearing Systems" Bruce A. Conway, UIUC, "Direct Optimization of Dynamical Systems using Nonlinear Programming" Anoop K. Dhingra, U. of Wisconsin, Milwaukee, "Complete Solutions to Polynomial Systems using Probability One Globally Convergent Homotopy Methods" Jeremiah F. Dwyer, Montana State U., "Computational Analysis of Damage in Composites" Fangpu Gao, UIUC, "Dimension Analysis on Finger Tremor in Tardive Dyskinesia Patients" Benjamin Gebhart, U. of Pennsylvania, "Instability Analysis of Natural Convection along a Vertical Surface using Numerical Simulations" Ahmed F. Ghoniem, MIT, "Numerical Simulation of Turbulent Combustion" Peyman Givi, SUNY, Buffalo, "Direct Numerical Simulations and Large Eddy Simulations of Unpremixed Turbulent Reacting Flows" George Gogos, Rutgers U., "Fuel Droplet Evaporation at High Pressure Environment; Natural Convection Effect" Stephen M. Goodnick, Oregon State U., "NCCE: Monte Carlo, Transport in Quantum Wires" Harold L. Grubin, Scientific Research Assoc., "Computational Device Modeling" Robert B. Haber, UIUC, "Supercomputer Applications in Design Optimization and Computational Solid Mechanics" George T. Hahn, Vanderbilt U., "Analysis of Rolling Contact Deformation and Fracture" Thomas J. Hanratty, UIUC, "Computer Experiments on Wall Turbulence" John G. Harris, UIUC, "Scattering from Partially Closed Cracks and Other Imperfect Interfaces" James C. Hill, Iowa State U., "Numerical Simulation of Nonreacting and Reacting Turbulent Flows" Antoine Kahn, Princeton U., "Atomic Structure of Semi-conductor Surfaces" Stephen F. Kawalko, UIC, "Electromagnetic Scattering by Impedance Bodies of Revolution with Fins" Ifiyenia Kececioglu, UIC, "Finite Element Computation of Multiphase Flow and Transport Problems" Jay M. Khodadadi, Auburn U., "Numerical Simulation of Liquid Metal Flow and Heat Transfer in the Mold of Continuous Casters" Kyekyoon Kevin Kim, UIUC, "Exploratory Project" and "Stability of a Uniform Liquid Fuel Layer inside a Spherical-Shell Cryogenic Inertial Confinement Fusion Target" Doyle D. 
Knight, Rutgers U., "A Compressible Navier-Stokes Algorithm using an Unstructured Adaptive Grid" and "3D Asymmetric Crossing Shock Wave-Turbulent Boundary Layer Interaction" Ranga Komanduri, Oklahoma State U., "FEM Analysis of Various Manufacturing Processes" Joel Koplik, CUNY, "Molecular Dynamics of Fluid-Solid Systems" George Kosaly, U. of Washington, "Investigation of the Laminar Flamelet Model of Turbulent Diffusion Flames" Ron D. Kriz, Virginia Polytechnic Inst. & State U., "Simulation and Visualization of Stress Wave Propagation near Fiber-Matrix Interphases: Scanning Acoustic Microscope Simulation for Determining Interphase Structure" Anil K. Kulkarni, Pennsylvania State U., "Flow Characteristics of a Precessing Deflected Jet" Richard T. Lahey, Jr., Rensselaer Polytechnic Inst., "The Development of Multidimensional CFD Capabilities for Multiphase Flows" Douglas A. Lauffenburger, UIUC, "Monte Carlo Simulations of Cell Shape and Adhesion" Shung-Wu Lee, UIUC, "Computation and Visualization of Electromagnetic Scattering" Wilbert Lick, U. of California, Santa Barbara, "Sediment and Contaminant Transport in Aquatic Systems" Gang Lin, Louisiana Tech U., "Design Optimization of Herring Bone Air Bearings" J. Bruce Litchfield, UIUC, "Measurement of Transient Moisture Profiles and Structural Changes Using 3D NMR Microscopy" Wing Kam Liu, Northwestern U., "Computer Modeling of Material Forming Processes with Micro/Macro Crack Damage Accumulization" Eric Loth, UIUC, "Bubble Dispersion and Modulation," "Shock Interaction with Rigid Bodies and Mixing Layers," and "3D Shock Interaction with Rigid Bodies and Mixing Layers" Joseph W. Lyding, UIUC, "The Distributed Scanning Tunneling Microscopy Laboratory" Karthikeyan Mahadevan, UIUC, "Development of New Finite Element Algorithms for the Solution of 2D and 3D Electro- magnetic Scattering Problems" Paul E. Mayes, UIUC, "Electromagnetic Radiation and Scattering from Conducting Surfaces in Free Space" Raj Mittra, UIUC, "Time Domain Electromagnetic Analysis of Interconnects in Digital Computer Systems" Satish Nair, U. of Missouri, Columbia, "Computations for Dynamic Systems with Neural Network and Fuzzy Logic Control Algorithms" Raghu N. Natarajan, Rush-Presbyterian St. Luke's Med. Center, "Mechanical-Biological Interaction with Treatmentof Joint Diseases" Andrew R. Neureuther, U. of California, Berkeley, "TEMPEST" Ahmed K. Noor, U. of Virginia, "Nonlinear Finite Element Dynamic and Postbuckling Analyses on Multiprocessor Computers" Lorraine Olson, U. of Nebraska, Lincoln, "Improved Model Techniques for the Inverse Problem of Electrocardiography" Kook D. Pae, Rutgers U., "Computational Modeling of Hydrostatic Metal Forming Processes" Mangalore A. Pai, UIUC, "High-Speed Simulation of Large-Scale Dynamic Power System Models" Stanley L. Paul, UIUC, "Validation of Structural Analysis Techniques using Field Test Data" Dimos Poulikakos, UIC, "Numerical Simulation of Splat Quench Solidification" Umberto Ravaioli, UIUC, "Study of Transport in Deep Sub-micron MOSFETS using a Full Bandstructure Monte Carlo Approach," "NCCE: Monte Carlo, Mode Matching," "Code Performance Analysis," "Special Projects (CRAY Y-MP)," "Special Projects (CRAY-2)," and "Molecular Dynamic Algorithms for Monte Carlo Simulation of Semiconductor Transport" Chittaranjan Ray, Ill. 
State Water Survey, "Modeling Transport of Agricultural Chemicals in a Dual-Porosity System Resulting from Macropores" Jorge Rodriguez, Rutgers U., "Nonlinear Analysis of WMBAR Motion Segments" Vythialingam Sathiaseelan, Northwestern U., "Computer Simulation Studies for Electromagnetic Hyperthermia" William Raymond Schowalter, UIUC, "Die Swell of Emulsions" Justin Schwartz, UIUC, "Mechanical and Magnetothermal Optimization of Superconducting Magnets for Large-Scale Applications" Huseyin Sehitoglu, UIUC, "Modeling of Microstructure Effects on Fatigue Crack Closure" James A. Sherwood, U. of New Hampshire, "Investigation of the Thermomechanical Response of a Titanium Aluminide/Silicon Carbide Composite using a Uniform State Variable Model and the Finite Element Method" Wei Shyy, U. of Florida, "Numerical Prediction of Transport Characteristics and Macrosegregation of Superalloy Processing" Alfredo Soldati, U. of California, Santa Barbara, "Direct Simulation of the Electrohydrodynamic of Electrostatic Precipitators" Mark Allen Stadtherr, UIUC, "Strategies for Using Advanced Computer Architectures in Designing Complex Chemical Processes" Scott D. Stewart, UIUC, "Numerical Simulations of Detonation Dynamics" Gregory E. Stillman, UIUC, "Quantum Well Detectors" Larry Taber, U. of Rochester, "Theoretical Models for the Embryonic and Mature Heart" Christos George Takoudis, Purdue U., "Thin Film Deposition of Patterned Substrates from Silicon Tetrachloride-Based Systems: Relationships between Material Properties and Processing" Troy Torbeck, UIUC, "Startup-Positive Crank Case Ventilation Studies" George S. Triantafyllou, CUNY, "3D Interaction of a Vorticity Field with the Ocean Surface" Keh C. Tsao, U. of Wisconsin, Milwaukee, "Combustion System Design of High-Output Diesel Engine" Surya Pratap Vanka, UIUC, "Direct and Large-Eddy Simulations on Massively Parallel Computers" and "Higher Level Simulations of Complex Fluid Flows" Frank B. Van Swol, UIUC, "Micelle Formation at the Fluid-Wall Interface" and "Molecular Dynamics Studies of Micellar Solutions" Jonathon C. Veihl, UIUC, "Antenna Analysis in Complex Environments" John Scott Walker, UIUC, "High-Velocity Boundary and Interior Layers in Liquid-Metal Magnetohydrodynamic Flows" William M. Worek, UIC, "Heat Transfer and Fluid Flow in Planar 90 Degree Bifurcations with and without Rounded Corners" Hu Yang, U. of Akron, "Parallel Finite Elements Method on MIMD and SIMD Systems" Yook-Kong Yong, Rutgers U., "Numerical Analysis of Very High Frequency, Piezoelectric Quartz Crystal Resonators"

For details about how to apply for time on NCSA's high-performance computing systems, contact MetaCenter Allocations [see ncsa contacts, page 2]. s

CM-5 enhancements attract friendly users
by Michael Welge, Research Programmer, Computational Mathematics and Computer Science Team

A year ago the first CM-5 hardware shipping crates arrived at NCSA [see access, May-June 1992]. Since that time, the CM-5 friendly user community has experienced several hardware changes that have greatly enhanced the viability of the system for Grand Challenge scientific computations. The CM-5 (Connection Machine, Model 5) is designed by Thinking Machines Corp. (TMC), Cambridge, MA.

HARDWARE ENHANCEMENTS
NCSA's CM-5 initially comprised 512 RISC processing nodes. Each basic processing node consisted of a microprocessor, a memory subsystem, and a CM-5 network interface, all connected to a standard 64-bit bus. The SPARC microprocessor, running at 32 MHz, is capable of 22 MIPS and 5 Mflops.
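The per-node and aggregate performance figures quoted in this article hang together arithmetically. The short sketch below is an illustration only; it uses nothing but the node count, unit counts, and clock rates given in the surrounding paragraphs, including the vector-unit upgrade described next.

```python
# Quick cross-check of the CM-5 performance figures quoted in this article.
# A sketch only: it assumes the node count, unit counts, and clock rates
# given in the surrounding text.

NODES = 512                      # processing nodes in NCSA's CM-5

# Base configuration: one 32 MHz SPARC per node, rated at 5 Mflops.
base_aggregate = NODES * 5       # 2,560 Mflops -- the quoted "2.5 Gflops"

# Vector-unit upgrade: 4 vector units per node, each performing two
# arithmetic operations (multiply and add) per 16 MHz clock cycle.
per_node = 4 * 2 * 16            # 128 Mflops per node, as quoted
upgraded_aggregate = NODES * per_node   # 65,536 Mflops -- the quoted "64 Gflops"

print(base_aggregate, per_node, upgraded_aggregate)
```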
Today, NCSA's CM-5 has been optionally configured with the Datapath floating point system (also referred to as DASH, or vector units). There are four vector units per node, all running under control of the SPARC microprocessor. One memory and two arithmetic operations (mult, add) can be performed per vector unit per clock cycle. The vector unit clock rate is 16 MHz, and peak performance is 128 Mflops per node. This upgrade increased the aggregate theoretical performance from 2.5 to 64 Gflops and the DRAM memory from 8 to 16 Gbytes. The addition of the 100 Gbyte Scalable Disk Array (SDA) in mid-April provides an extremely high-performance, highly expandable disk storage system composed of Disk Storage Nodes. The basic Disk Storage Node--which provides 9.2 Gbytes of storage, a peak bandwidth of over 17 Mbytes per second, and 25 MIPS of processing power--comprises a controller built on a SPARC processor, a network interface, a large disk buffer, four advanced SCSI controllers, and eight 3.5" hard disk drives. These SDA Disk Storage Nodes--analogous to the computational processing nodes--are directly connected to the CM-5 internal network. This direct connection enables each Disk Storage Node to contribute not only to storage capacity but also to I/O performance. The number of Disk Storage Nodes in the SDA can be increased or decreased, thereby achieving an I/O system matched to the performance and capacity needed for the user's application. NCSA's CM-5 SDA consists of 12 Disk Storage Nodes that provide 100 Gbytes of storage at I/O bandwidths of up to 132 Mbytes per second sustained.

CM-HIPPI arrives soon
NCSA's user community will experience increased transfer rates from the CM-5 to other supercomputer systems when the CM-HIPPI device is ready later this year. CM-HIPPI is an integrated system that receives CMFS file system commands from the CM-5 control processor. The CM-HIPPI contains a CPU, disk drives, a VME bus, and HIPPI I/O interface modules. The CM-HIPPI connects directly to the CM-5 network to provide 1 gigabit aggregate bandwidth. In mid-September a TMC CM-5 testbed system with a HIPPI interface was installed at NCSA. Its goal is to support UIC/NCSA's CAVE (Cave Automatic Virtual Environment) demo at Supercomputing '93. It will also allow for a smooth integration of HIPPI into NCSA's CM-5 production system. CM-HIPPI team members are Randy Butler, group leader; Joseph Godsil; Vijay Rangarajan; Von Welch; and Paul Zawada.

APPLICATIONS RESEARCH EFFORTS
Many users took advantage of the friendly user period to exploit the increased performance of the distributed-memory CM-5. Many of those same users continue computing on the CM-5, which entered production in May. CM-5 users and their areas of research are listed on the following page. Watch for reports of some of their research in the next issue of access, which will focus on Grand Challenge research.

Astronomy and gravitation
Richard Crutcher Astronomy UIUC David Wesley Hobill Physics/Astronomy Univ. of Calgary
Computational biology
Roscoe Giles Electrical, Computer, and Systems Boston Univ. Michael Keegan Institute for Computational Science George Mason Univ. Ronald Levy Chemistry Rutgers Univ. James Andrew McCammon Chemistry Univ. of Houston, University Park Klaus J. Schulten Physics UIUC
Earth, environmental, and social sciences
Richard K.
Sato Scientific Computing Division NCAR Fred Sklar Southern Florida Water Survey
Engineering sciences
Weng Cho Chew Electrical/Computer Engineering UIUC Sanjay Mehrotra Industrial Engineering/Management Sciences Northwestern Univ. Dennis Parson Civil Engineering UIUC Otmar S. Schlunk Electrical/Computer Engineering UIUC Kim Simmons Nuclear Engineering/Engineering Physics Univ. of Wisconsin, Madison Surya Pratap Vanka Mechanical/Industrial Engineering UIUC
Mathematics and computer science
Andrew Chien Computer Science UIUC George Cybenko Thayer School of Engineering Dartmouth College Dennis Gannon Computer Science Indiana Univ. Ewing Lusk Mathematics/Computer Science Argonne National Laboratory David Padua CSRD UIUC
Quantum systems
Ronald E. Cohen Geophysical Laboratory and Center for High Pressure Research Carnegie Mellon Univ. Arthur Freeman Physics/Astronomy Northwestern Univ. Rajan Gupta Los Alamos National Laboratory Herbert W. Hamber Physics Univ. of California, Irvine John B. Kogut Physics UIUC Don Secrest Chemistry UIUC Robert Sugar Physics Univ. of California, Santa Barbara
Industrial users
Ramesh K. Agarwal McDonnell Douglas Research Laboratories McDonnell Douglas Corp. Caterpillar FMC Lilly J. P. Morgan United Technologies Research Center
NCSA users
Fouad Ahmad--Engineering sciences David Ceperley--Quantum systems Michael Heath--Mathematics and computer science Eric Jakobsson--Molecular dynamics Michael Norman--Astrophysics Clint Potter--Biomedical imaging Ed Seidel--Gravitation Danesh Tafti--Computational fluid dynamics Robert Wilhelmson--Earth sciences

Contact MetaCenter Allocations for details about how to apply for time on NCSA's high-performance computing systems [see ncsa contacts, page 2]. s

NCSA's CM-5. (Photo by Wilmer Zehr)

ENVISIONING THE EARTH: THE GEOSPHERE PROJECT
by Sara Latta, Science Writer

The late Nobel Prize-winning physicist Richard Feynman called Tom Van Sant "the only truly modern artist that I know," citing Van Sant's appreciation of technology and science and his incorporation of them into his work. Yet, there is something fundamental about Van Sant's vision: "We spent a couple of million years evolving as a species . . . understanding our world through visualization," Van Sant said at an NCSA lecture at the Beckman Institute in March. "A few hundred years ago, modern cultures committed themselves to the written word as a way of storing and disseminating information," he continued. "While this was obviously beneficial . . . it also committed us to linear thinking. We soon realized that we could not handle [the information] unless we started chopping it up into segments and compartmentalizing it into a library. But nature is not compartmentalized: it is a symbiotic continuum." The Geosphere Project, founded in 1989 by Van Sant--also its CEO--is based on the painter-sculptor-architectural designer's dream that complex Earth resource management and global change issues could--and should--be understood as a symbiotic continuum by policy makers, the research community, educators, and the lay public alike. As an artist, he realized that the only way to understand the complex relationships between the many Earth resource management issues was through visualization.

REALITY MODELS OF THE EARTH
The first product of the Geosphere Project was a global map, at 4 kilometer resolution, made by digitally piecing together thousands of cloud-free satellite photos of the Earth, gleaned from National Oceanic and Atmospheric Administration (NOAA) archives.
The map, which took 10 months to complete, depicts both hemispheres of the Earth in the summer. It is color-converted so that oceans appear blue; forests, green; and deserts, brown--as we would see it from space. A 1k resolution image of the Earth and a series of geosphere globes are in the works, onto which a variety of databases can be projected. Plans are underway to use the globe data in the CAVE (Cave Automatic Virtual Environment)--a virtual reality, or VR, laboratory designed to surround the viewer with data at UIC's Electronic Visualization Laboratory (EVL). NCSA will soon have access to the globe data in an interactive environment as well, when EVL completes construction of a CAVE at NCSA's VR Laboratory. With the CAVE technology, a viewer could stand inside the globe, as it were, viewing the globe from the inside out. This vantage point would give the viewer a much larger, distortion-free view of the globe. NCSA is also exploring the possibility of allocating supercomputer time for the production of the 1k geosphere image, according to NCSA Director Larry Smarr. "Now we can look at a reality model," said Van Sant. "That's very important for us. Political boundaries cannot be seen from space. They are an artificial concept that we superimpose, I'm sure for many good reasons, but not necessarily for reasons of Earth resource management."

THE GEOSPHERE INTERACTIVE VISUAL LIBRARY
Van Sant is gathering and creating visualizations for global databases acquired from NOAA, the National Aeronautics and Space Administration, The World Bank, the National Geographic Society, the World Conservation Monitoring Centre, and other government and nongovernment organizations. The databases are digitized, coregistered in geographic information systems, rendered for overlays, and animated for time-lapse sequences and zooms. Integrated databases can provide for layered viewing, special effects, or flight simulations. The product is a laser disk prototype of the Geosphere Interactive Visual Library, which debuted with great success at the Earth Summit in Rio de Janeiro, June 1992. The library's interface is a map of the Earth with various icons representing geographic, human, land, and animal systems. The viewer might select the habitat of the Asian elephant, from the 1970s to the present time, for example. By overlaying this database with agricultural, demographic, and other databases, the viewer can easily understand the relationships between the activities of elephants and man. Or, by selecting an icon representing the South American rainforests, overlaid databases can show that slash-and-burn deforestation is related not only to the reduction of species habitat, but also to the depletion of the already scanty topsoil, the pollution of streams and ocean reefs, the release of CO2 into the atmosphere, and large releases of methane--the result of an increase in the termite population. Van Sant envisions a library on each continent someday--housed in what he calls an "Earth Situation Room," the post-Cold War version of the War Room. The library would be networked so that anyone with a computer terminal could access the information.

KIDS DOING REAL SCIENCE
Van Sant believes that "kids can do real science, particularly if they can interactively overlay databases." For example, Van Sant has a database from the U.S. Naval Postgraduate School in Monterey, CA that shows all of the surface currents off the California coast.
He also has a database of the migration pattern for gray whales from the Bering Strait to Baja California, where they calve and breed. Do the whales swim against or with the current? It is probably of some interest, but no one has bothered to overlay the two databases. "I'm saving these kinds of problems for the kids in the pilot programs," Van Sant said. "We can start empowering kids to not only do the work, but then perhaps even stand up and join in policy, to claim their inheritance: the resources that are being used in a nonsustainable manner."

"WE ARE IN THE SYSTEM"
"I think people are responding more to the Earth, and to Earth resource management, because we just finished exploring the solar system," said Van Sant. "We found out that there is a very narrow hospitality zone for life. "Bucky Fuller said that we are passengers on the spaceship Earth. Well, we are really not passengers on anything. We are more like part of the upholstery, so to speak. It is us, and we are it, and we are in the system." s

". . . Nature is not compartmentalized; it is a symbiotic continuum."--Tom Van Sant

"We spent a couple of million years evolving as a species . . . understanding our world through visualization," says Tom Van Sant, founder and CEO of the Geosphere Project. (Photo by Thompson-McClellan Photography)

INTEGRATING GLOBAL MODELS: CONFERENCE REPORT
by Randall Graham, Science Writer, and Fran Bond, Publications Editor

"Ecological economics is an attempt to integrate social sciences and natural sciences in order to provide the information necessary to formulate policy dealing with global climate change," said Karl-Göran Mäler, director of the Beijer International Institute for Ecological Economics, a research institute of the Royal Academy of Sciences, Stockholm, Sweden. "In order to do that, we have to construct an understanding of the processes which join the social systems to the natural systems." Mäler spoke to about 30 scientists (see list on page 22) who gathered for the Global Ecological Economic Modeling Conference in March at the Beckman Institute. NCSA co-sponsored the conference with the Beijer Institute. Robert Costanza, president of the International Society for Ecological Economics and a research affiliate of the Beijer Institute, coordinated the conference. The conference focused on constructing a new global model of the Earth by joining techniques from social science and natural science. It is widely thought that such a model could clarify humankind's impact on Earth's ecosystems and climate [see access, July-September 1992]. Attendees gathered to form research alliances for the future and to collaborate on a preproposal for funding global modeling R&D. After a day of presentations in which attendees introduced their research and modeling processes, conferees divided into small working groups to discuss and hone issues that were then formulated into a formal document.

ECOLOGY
"Our goal is to build an evolutionary, adaptive modeling environment rather than a specific model," explained Costanza, professor of ecology at the University of Maryland and director of the Maryland International Institute for Ecological Economics. "We want the model to continue to progress." He said that the model could be used in conflict resolution. The conferees were seeking a new modular approach to model building that could evolve over time through adaptations and improvements. According to Costanza, they did not want to "build it once, get the answer, and quit."
This was to be the first step in an ongoing endeavor.

ECOLOGICAL ECONOMICS
NYU ecological economist Faye Duchin said the timing could not have been better for linking the model she uses with those of others. "We have done a lot of theoretical work in the last decade that has not been brought in [to the model], and we were ready to rewrite the model to make it more transparent to others." She and her colleagues use a model developed by the Nobelist Wassily Leontief. Duchin directs the Institute for Economic Analysis founded by Leontief. Duchin felt the conference broke new ground: "At a conceptual level none of us are yet asking the most strategic global questions because each of us is only asking the questions that appear to be globally strategic in our own area."

CLIMATE
Climatologists' models are complex because they render a number of variables in 3D. According to UIUC Professor of Atmospheric Sciences Michael Schlesinger, "The whole spectrum of human endeavors, including agriculture and the ability to maintain water quality, is influenced by climate. We do not have a good understanding of how this works, because we need information on the small scale at which impacts are felt. . . .We are also held back by our understanding of how the processes work in the climate system. The problem, as I see it, is to ask 'Well so what if climate changes?'"

GEOGRAPHY
UIUC Geography Professor Bruce Hannon was NCSA's host and co-coordinator for the conference. Together with an international group of scientists, including Costanza, Hannon visited the Beijer Institute in the summer of 1992 (see photo below right). He returned in September 1993 for the 4th Annual International Complex Systems Conference. NCSA Director Larry Smarr also attended. We are in "the infant stage of the modeling process," Hannon acknowledged. "Both data and the modeling process need to be addressed. . . . Although many of the ecological modelers are using STELLA, there was talk of needing [more complex] software to integrate the federation that goes into producing a large global model. NCSA has a software development initiative to address this." As a participant in this initiative, Hannon is a visiting professor at NCSA, where he has been enhancing modeling techniques and running code on the CM-5 [see access, October-December 1992]. Albert Cheng, NCSA Software Development Group (SDG) programmer, is working with Hannon on this effort, which was demonstrated at the conference. Duchin showed a software program she developed with SDG Programmer Jason Ng.

GOALS OF THE GROUP
"We are interested in how to make connections between various models and also in how that coupling affects our model," said Hannon, summarizing the conference's goals. "For us it means moving from 30 x 60 meter cells to much larger ones for the global model." Costanza joined in: "People at the conference are working on stand-alone parts of the global model. . . .What we want to view is the whole. This is conceived of as a 5-year project," he continued. "The model structure itself will be evolving." The group dealt with datasets containing an astounding number of variables, and their vision is equally ambitious. Since the "endgame" of their efforts could have broad, worthwhile consequences, the group gathered at the conference was dedicated to the long term. According to Duchin, this ingathering of researchers showed that "science can be used to solve real world problems. . .
.The issue here," she said, "is that the way we are living on this planet is causing repercussions that are bound to feed back on us. . . .Our conviction is that [humankind is] probably going to have to make more significant changes in how we live than most people have allowed their imaginations to consider." s

"[We are in] the infant stage of the modeling process."--Bruce Hannon

GLOBAL ECOLOGICAL ECONOMIC MODELING CONFERENCE ATTENDEES
Lars Bergman, Stockholm, Sweden Robert Costanza, U. of Maryland Ralph d'Arge, U. of Wyoming Kieran Donaghy, UIUC Faye Duchin, NYU Charles Falkenburg, U. of Maryland Bruce Hannon, UIUC Robert Herendeen, INHS John Hobbie, Marine Biology Lab, Woods Hole Robert Kaufmann, Boston U. Eric Lambert, CERL Glenmarie Lang, NYU Robert Lempert, RAND Corp. Robert Lozar, CERL Karl-Göran Mäler, Beijer Institute Tom Maxwell, U. of Maryland Gottfried Mayer-Kress, UIUC Helena Mitasova, CERL Matthias Ruth, Boston U. Michael Schlesinger, UIUC Mark Schwartz, INHS Kevin Seel, CERL Steve Seitz, UIUC Jason Shogren, Yale U. Fred Sklar, Florida Water Management Michael Sonis James Westervelt, CERL

(Top) Attendees at the Global Ecological Economic Modeling Conference included (left to right) Karl-Göran Mäler, director of the Beijer Institute; Robert Costanza, director of the International Society for Ecological Economics; Michael Schlesinger, UIUC professor of atmospheric sciences; Larry Smarr, director of NCSA; Faye Duchin, director of the Institute for Economic Analysis; and Bruce Hannon, UIUC professor of geography on sabbatical to NCSA. (Photo by Thompson-McClellan Photography)
(Bottom) Pictured at the Beijer Institute in Stockholm are (left to right) Carl-Olof Jacobson, general secretary of the Royal Swedish Academy; Costanza; Joseph Hardin, NCSA associate director for the Software Development Group; Mäler; and Hannon. (Photo by Lars Falck, courtesy of Bruce Hannon)

industrial program
CATERPILLAR WINS GRAND CHALLENGE AWARD
BY CATERPILLAR INC.

Caterpillar Inc. received NCSA's top corporate honor for its use of virtual reality and HPCC to design some of its best-selling machines. The Grand Challenge Award--which recognizes breakthrough research resulting from an NCSA partnership--was presented to Caterpillar at the Fifth Annual Executive Meeting of NCSA's Industrial Partners in May 1993. "Caterpillar's research work on this project is very significant because of the improvement to their design process," says John Stevenson, NCSA's corporate officer and head of the Industrial Program. "This will help improve the company's ability to compete globally."

DESIGNING WITH VIRTUAL REALITY
Caterpillar received the award for effectively using virtual reality and HPCC technology to improve visibility for drivers in the cabs of its wheel loaders and backhoe loaders. These machines are used to build roads, dig foundations, load trucks, and haul loose materials such as gravel. A cab's visibility affects an operator's productivity, comfort, and safety. Through virtual reality, engineering blueprints become 3D, allowing engineers to see how their designs will work in a seemingly real world. Hundreds of thousands of mathematical computations are almost instantaneously translated into a simulated environment that the operator can "see" by wearing a special helmet (see photo left). This year's award recognizes the efforts of design engineers Dave Stevenson of the Wheel Loader and Excavator Business Unit, Aurora, IL, and John Bettner of the Building Construction Products Division, Clayton, NC.
"This technology allows us to dramatically shorten the amount of time it takes to analyze a new design concept and incorporate it into our production process," says Dave Stevenson. "It also represents a sizable cost savings because we are not having to build prototype machines or make last-minute design changes." Design engineer Stevenson says that when Caterpillar researchers relied on conventional design methods, they spent from six to nine months building full-scale models and evaluating design changes. With virtual reality, multiple design options often can be evaluated in less than a month. A number of design options already have been tested for new models of Caterpillar wheel loaders and backhoe loaders that will be introduced by 1996. The company also plans to eventually allow customers to "field test" new product designs using virtual reality technology. CATERPILLAR/NCSA PARTNERSHIP Caterpillar, headquartered in Peoria, IL, is the world's largest manufacturer of earthmoving and construction equipment and a major manufacturer of diesel and natural gas engines. The corporation joined NCSA in August 1989 through a multiyear, multimillion- dollar agreement. Caterpillar engineers working in Champaign and from Peoria, via remote linkage to NCSA's supercomputers, receive access, training, and extensive experience with the latest computing capabilities. AT&T, Dow Chemical Co., East-man Kodak Co., Eli Lilly & Co., FMC, J. P. Morgan, Motorola Inc., Phillips Petroleum Co., Schlumberger, and United Technologies Corp. are NCSA's other partners. NCSA established the annual Industrial Grand Challenge Award last year [see access, May--June 1992] to recognize corporations who accomplish breakthrough research through their NCSA partnership to improve product development and quality. s EDITOR'S NOTE: Caterpillar's award and use of virtual reality was featured in 45 media outlets, including: Associated Press CNN Chicago Sun-Times Cleveland Plain-Dealer Houston Chronicle HPCWire National Public Radio United Press International The Wall Street Journal Washington Times "With virtual reality," says Caterpillar Design Engineer Dave Stevenson, "we can evaluate dozens of design options." Wearing a Virtual Research Flight Helmet, Stevenson simulates "driving" a Caterpillar machine on Silicon Graphics' Skywriter workstation. (Courtesy Caterpillar Inc.) Dave Stevenson (left) and John Bettner (right) display their Industrial Grand Challenge Awards presented by NCSA Director Larry Smarr (center). Images of the abacus and CM-5 in the background of the plaques symbolize the growth in computing; pictures of Caterpillar vehicles in the foreground represent the researchers' work. (Photo by Wilmer Zehr) Partner executives and NCSA staff met for the Fifth Annual Executive Meeting of NCSA's Industrial Partners to discuss new high-performance computing technologies and strategic business importance. (Photo by Wilmer Zehr) MODELING DISPOSABLE DIAPERS BY LEE M. HUBER, ONSITE REPRESENTATIVE, THE DOW CHEMICAL COMPANY A Dow Chemical Company project, carried out by Dow's Kishore Kar, is producing a diaper design incorporating Drytech* superabsorbent polymer particles distributed in a fluff pad. The design was evaluated using a computer model run on NCSA's CRAY-2 system. NCSA research programmer Michael McNeill visualized the results by using a high-performance Silicon Graphics Inc. workstation. A still from the visualization [left] shows the computational grid that extends over a quarter of the diaper. 
Polymer particles are represented by spheres. The concentration of liquid and its motion, described as a function of time, are represented by the variation in color from violet to brown as well as the isometric surfaces. Numerical results, which were obtained at many timesteps, were converted to images and animated to view the time evolution of the process. The overall model was compared to a magnetic resonance imaging experiment, which provides a three-dimensional image of the water distribution in a diaper, and was shown to give comparable results to the final steady-state values. This new way of visualizing diaper performance with HPCC techniques is facilitating the development process for improved disposable diapers. More information about this project and other HPCC MetaCenter research efforts is available on the NCSA World Wide Web server at the following URL: http://www.ncsa.uiuc.edu/Pubs/MetaCenter/SciHi93/O.Highlights_Contents.html

* Drytech is a trademark of the Dow Chemical Company.

Still image from animation showing simulation of fluid diffusion through a heterogeneous medium--cellulose fluff with embedded SuperAbsorbent Particles (SAP). (Courtesy Lee Huber, Dow Chemical Co., research; Michael McNeill, visualization)

AT&T'S XUNET DEMONSTRATED FOR GORE

Early in May Vice President Albert Gore toured AT&T Bell Laboratories at Murray Hill, NJ and saw a demonstration of the XUNET/BLANCA project with NJ Governor Jim Florio and U.S. Senator Frank Lautenberg. Afterwards, AT&T announced that a major step forward had been taken in making the vision of a national information superhighway a reality. Now operating the nation's fastest wide-area Asynchronous Transfer Mode (ATM) network over fully optical links, the experimental research network will help in the development of commercial ATM networks. AT&T's 500-mile fiber optic network runs at 622 Mbps (megabits per second). The new network is an enhancement to AT&T's existing Experimental University Network (XUNET), an ATM network that connects seven leading research laboratories and universities and three AT&T switching sites nationwide. The new 622 Mbps links will remain connected to the rest of the XUNET, which uses ATM at 45 Mbps, through AT&T's XUNET experimental switch in Chicago. XUNET, sponsored by the Corporation for National Research Initiatives (CNRI) with funding for research provided by NSF and ARPA, forms an integral part of the BLANCA Gigabit Testbed [see access, September-October 1990]. The network currently links the University of Wisconsin at Madison, the University of Illinois at Urbana-Champaign, the University of California at Berkeley, Rutgers University, AT&T Bell Laboratories, Sandia National Laboratory, Lawrence Livermore National Laboratory, and three AT&T network switching sites. With the speed of this network, the entire contents of the Encyclopedia Britannica--approximately 13 million words--could be transmitted in less than 1 second. (The same amount of information sent from a home PC at typical modem speeds would take more than 2 1/2 days.) Networks such as XUNET could revolutionize people's lives. For example, high-resolution images of complex medical data, such as x-rays and scans, could be transmitted nearly instantaneously from a remote hospital to a specialist located across the country or the world to save diagnostic time. Designers would be able to use their desktop workstations to harness the power of remote supercomputers to build 3D prototypes of products.
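The Encyclopedia Britannica comparison above can be checked with some quick arithmetic. The sketch below is illustrative only: the average word length and the home-modem rate are assumptions rather than figures from the article, so the results land near, not exactly on, the quoted times.

```python
# Rough check of the Encyclopedia Britannica example quoted above.
# A sketch only: the 6-characters-per-word average and the 2,400 bps modem
# rate are assumptions for illustration, not figures from the article.

words          = 13_000_000       # "approximately 13 million words"
bytes_per_word = 6                # assumed average, including spaces
bits           = words * bytes_per_word * 8     # roughly 624 million bits

xunet_bps      = 622_000_000      # 622 Mbps optical XUNET links
modem_bps      = 2_400            # a common home modem rate in 1993 (assumed)

print("XUNET: %.1f seconds" % (bits / xunet_bps))            # about 1 second
print("modem: %.1f days"    % (bits / modem_bps / 86_400))   # about 3 days
```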
Interactive multimedia audio, full-motion video, and imaging conferences could bring people separated by great distances together as if they were physically in the same room. The fully optical network from Madison, WI to Champaign-Urbana, IL is connected through the AT&T XUNET testbed ATM switch in Chicago and uses optical amplifiers that carry much higher-speed signals than was previously possible. The network has undergone operational testing and is currently active. Work will continue in the areas of HPCC applications testing, including real-time networking, multimedia, and desktop videoconferencing. s

education
ISDN links C-U schools to NCSA
by Sarah Thomas, Graduate Student, UIUC School of Library and Information Science

NCSA's Education Group is training and encouraging local elementary and high school teachers to use computers (including supercomputers) in their teaching. Through the ISDN (Integrated Services Digital Network) connection project, teachers and students can experiment with HPCC and obtain results in their classrooms in seconds or minutes. Using a microcomputer alone would take hours or days. Eight schools in the Champaign-Urbana area were selected to receive ISDN lines connected to the campus network at UIUC: Leal Elementary, Urbana Middle, and Urbana High (Urbana); Centennial High, Central High, Franklin Middle, Dr. Howard Elementary, and Holy Cross (Champaign). Ameritech donated the equipment necessary to install the ISDN lines, with the agreement that a six-month trial would be run to assess the benefits of the connection and to examine the possibilities for enhanced education through the use of computers by teachers and students.

ENABLING "REAL SCIENCE"
An objective of the Education Group is to encourage teachers and students to do "real science in real classrooms on real computers." Robert Panoff, senior research scientist at NCSA, clarifies this concept by explaining that most science education software only allows students to work with watered-down simulations or animations, which often distort real conditions and obscure the true nature of the objects being studied. By having access to data from realistic simulations--which often require the power of a supercomputer to run--and from direct observations, such as satellites or online telescopes, students gain a more accurate view of what real science is.

PROVIDING SOFTWARE
For example, the link with the campus network enables teachers and students to access data often discussed in science and math classes, such as the UIUC weather station. Satellite images that are updated every hour can be viewed in the classroom. In addition to weather data, the ISDN connection enables teachers and students to experiment with interactive computing on a supercomputer. s

How ISDN circuits work
A generic model of an ISDN line consists of two 64 kb/s circuit-switched channels (called B- [Bearer] channels), and one 16 kb/s packet-switched channel (the D- [Data] channel). The B-channels carry data transmitted by the user, while the D-channel carries control information for the B-channels from the network to the user. The D-channel can also carry a limited amount of packet-switched data, but for the most part, this is the function of the B-channels (see the arithmetic sketch below). Most of the local ISDN upgrade was done at the telephone switching station. Elementary schools were given one B-channel; high schools got two. At the end of the ISDN line to the schools are IMACs. These are linked by Ethernet cable to Gatorboxes. Each school has a separate Gatorbox to reduce traffic.
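To put numbers on the channel description above, the short sketch below simply totals the rates quoted in this sidebar for a generic 2B+D line and for the one- and two-B-channel school connections.

```python
# Arithmetic sketch of the ISDN capacities described in this sidebar.
# Illustrative only; it just totals the channel rates quoted above.

B_CHANNEL = 64    # kb/s, circuit-switched bearer channel
D_CHANNEL = 16    # kb/s, signalling channel (plus limited packet data)

generic_line = 2 * B_CHANNEL + D_CHANNEL   # 144 kb/s for a 2B+D line
elementary   = 1 * B_CHANNEL               # one B-channel per elementary school
high_school  = 2 * B_CHANNEL               # two B-channels per high school

print(generic_line, elementary, high_school)   # 144, 64, 128 kb/s
```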
A TCP/IP network runs over LocalTalk on these lines, but any protocol can be used. AppleTalk cables connect up to 15 personal computers to the Gatorbox. Each computer has an IP address; up to 15 addresses can be administered through the IMAC.

Software for math & science
by Sarah Thomas, Graduate Student, UIUC School of Library and Information Science

NCSA is developing software and experimenting with commercially available software programs for teaching math and science. Teachers are encouraged to participate in programming efforts and software design that address concepts they are teaching.

DEVELOPMENT OF FRACTAL MICROSCOPE SOFTWARE
One of the software programs being used at this time is Fractal Microscope, which was developed by Panoff and Michael South, an REU intern at NCSA [see access, Spring 1993]. Their program plots the Mandelbrot set (named after Benoit Mandelbrot) by means of a calculation that is repeated until the magnitude of the result exceeds 2. Based on the number of times the calculation can be performed until it "blows up" (i.e., produces a value whose magnitude exceeds 2), different colors are assigned to each pixel location on a computer monitor. The calculation used in the Mandelbrot set takes a complex number, represented in the form of a point on a graph--one coordinate is a real number and the other an imaginary one--which corresponds to a pixel location. This number is squared and then added to its original value. The result of this calculation is squared, then added to the original value, and this process is repeated until a value whose magnitude exceeds 2 is produced, or until a preset iteration limit is reached (points inside the set never escape). As many as a billion calculations may be needed to generate a single image. (A minimal code sketch of this loop appears after the photo captions below.)

RUNNING FRACTAL MICROSCOPE
Students can zoom in on portions of the Mandelbrot set and view new images. With only minor promptings by the teacher on how to view the image to find where patterns can be observed, students can explore on their own. The program can be used to explore addition, multiplication, symmetry, and infinity; for more advanced students, more complex patterns can be introduced. Running the Fractal Microscope program with a supercomputer allows results to be viewed in a matter of seconds--or minutes, for extremely magnified images. By comparison, many of the images would take hours just to plot on a microcomputer, making the activity impractical for a classroom setting. The speed enabled by the ISDN link makes it possible for teachers to get students interested in math by rapidly creating attention-grabbing fractal images. Teachers then use the images to illustrate the concepts they want to convey. Fractal Microscope is only one of many software programs NCSA's Education Group uses with schools. Others include Chem Viz [see access, May-June 1992], Interactive Galaxy Simulation, and Biology Explorer. Plans are already underway to create documents using NCSA Mosaic [see access, Spring 1993] for next fall. s

Part of the Mandelbrot set, generated with Fractal Microscope. Created by Michael South, REU student, and Robert Panoff, mentor. (Courtesy of NCSA Education Program)
(Left) Fractal Microscope in use. (Right) Orbits of points in the Mandelbrot set plotted by StarStruck. (Courtesy of NCSA Education Program)
Students at NCSA's SuperQuest Summer Institute with Larry Smarr. Those attending were from Bob Jones High School, Madison, AL; Illinois Mathematics and Science Academy, Aurora, IL; Riverside University High School, Milwaukee, WI; and Montgomery Blair High School, Silver Spring, MD. (Photo by Thompson-McClellan Photography)
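The escape-time loop described in the Fractal Microscope article above can be written in a few lines. The sketch below is a toy illustration only; it is not the Fractal Microscope code, and the grid size, iteration limit, and character palette are arbitrary choices. It performs the same squaring-and-adding calculation for each pixel and picks a "color" by how quickly the value escapes.

```python
# Minimal sketch of the escape-time calculation described in the Fractal
# Microscope article above. Illustrative only: not the NCSA program, and the
# image size and iteration limit are arbitrary choices.

def escape_count(c, limit=100):
    """Iterate z -> z*z + c from the pixel's complex value until |z| > 2."""
    z = c
    for n in range(limit):
        if abs(z) > 2:
            return n          # how quickly the point "blows up" picks its color
        z = z * z + c
    return limit              # never escaped: the point is (likely) in the set

# Map a small grid of pixels onto the complex plane and print a crude picture.
for row in range(24):
    y = 1.2 - row * 0.1
    line = ""
    for col in range(64):
        x = -2.0 + col * 0.05
        line += " .:-=+*#%@"[min(escape_count(complex(x, y)) // 10, 9)]
    print(line)
```

Run on a supercomputer, the same loop is simply applied to millions of pixels at far deeper zooms, which is where the billions of calculations mentioned above come from.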
CLAREMONT HIGH WINS AWARD
The Claremont High School team won second place in the SuperQuest '92 Best Student Paper Competition, which took place after the research projects were completed. Their project was to verify and expand topological knot tables. Team members were Christopher Candy, Kaan Erdener, and Danny Wu; Robs Muir was their teacher-coach. They participated in SuperQuest '92 at NCSA [see access, July-September 1992]. Sponsors of the SuperQuest program are the National Science Foundation, Digital Equipment Corporation, Cornell Theory Center, NCSA, Alabama Supercomputer Network, University of Alabama-Huntsville, Oregon Graduate Institute, Reed College, Sandia National Laboratory, IBM Corporation, Cray Research Inc., Intel Corporation, Cisco Inc., and the Advanced Digital Communications Consortium.

1993-94 REU MEMBERS
Eight students have been chosen to participate in NCSA's Research Experiences for Undergraduates (REU) program [see access, Spring 1993]; they are listed below with their academic affiliations and NCSA mentors. All appointments were approved by NSF. William D. Baker, Jr., Univ. of Tulsa (Vernon Burton and Barbara Mihalas) Amy Biermann, Bryn Mawr College (Michael Heath and Robert Panoff) Stephen G. Gaddy, Penn. State Univ. (Robert Wilhelmson) Tejas R. Katwala, NJ Inst. of Technology (Barbara Mihalas) Stephen E. Lamm, Univ. of California at Irvine (Michael Heath and Robert Wilhelmson) Ed Mlodzik, Creighton Univ. (Robert Wilhelmson) J. Michael Pauza, Univ. of Tennessee at Chattanooga (Ed Seidel) Indira Rao, Univ. of Nevada at Reno (Shankar Subramaniam)

new technology
HP, CONVEX TEAM WITH NCSA . . . FOR SCALABLE APPLICATIONS RESEARCH

On June 1, Hewlett-Packard Company, Convex Computer Corporation, and NCSA announced a collaborative effort built on a cluster of high-performance Hewlett-Packard (HP) workstations and a Convex supercomputer. The HP/Convex cluster will be used by researchers for scalable application development. HP workstations at NCSA's site also will be used in the research and development of distributed-computing environments, distributed and parallel applications, and multimedia applications. HP is providing more than $1 million of equipment to NCSA, including 18 desktop and deskside HP Apollo 9000 Series 700 workstations and multimedia software. The cluster will be tightly integrated with a CONVEX C3880 supercomputer, creating a Meta Series that uses a high-performance interconnect and Convex software to maximize application throughput. NCSA's users will access the cluster to carry out advanced scientific research in engineering, biology, materials and earth sciences, gas dynamics, computational chemistry, and seismic analysis. The HP/Convex cluster is a major component of NCSA's metacomputer, a collection of supercomputers, workstations, data storage, and advanced imaging resources that are integrated via sophisticated software technology to give users an environment that is as straightforward to use as a single computer [see access, September-December 1991]. The new cluster will spur NCSA efforts to build a scalable metacomputer based on leading-edge RISC technologies.
NCSA and the other three NSF centers--Cornell Theory Center, Pittsburgh Supercomputing Center, and San Diego Supercomputer Center--are extending the metacomputer concept to the National MetaCenter, a synthesis of the intellectual and computing resources of the four centers [see access, October-December 1992].

HP, CONVEX CONTINUE ALLIANCE
The announcement strengthens the business and technology alliance between HP and Convex. In May, HP announced it will license to Convex its UNIX-based HP-UX operating system and related software technologies; Convex will license to HP current and future software technologies. Previously, Convex had announced that it will base its future scalable parallel-processing system (SPP) on HP's PA-RISC technology. HP and Convex, in their partnership with NCSA, believe they can lead the market in the development of scalable computers, which provide several advantages over traditional high-end machines. Scalable computers are modular, upgradable, and binary-compatible from desktop to supercomputer, making them more economical and flexible than traditional supercomputers. Because Convex's SPP system, expected to be available in mid-1994, will be binary-compatible with HP's PA-RISC processors, users will have access to more than 4,000 PA-RISC-based applications addressing a broad spectrum of scientific and engineering disciplines. In addition, the SPP system will be source-compatible with existing CONVEX C Series supercomputers so that customers can use the 1,300 third-party applications already running on Convex's current product line. "This joint effort is further testimony to HP's status as the leading supplier of workstations to the scientific and engineering communities," says Gary B. Eichhorn, general manager of HP's Workstation Systems Group. "We're confident that the work the National MetaCenter is doing through NCSA and its other sites will one day fundamentally change the way we live our daily lives--at work and at home. HP is looking forward to this comprehensive, long-term partnership with NCSA, our continuing relationship with Convex, and to playing a key role in the development of the information infrastructure of tomorrow." "The future of high-performance computing depends on the development and availability of software applications that are able to take advantage of the latest hardware advances," said Steven J. Wallach, Convex's senior vice president of technology. "We are pleased to collaborate with NCSA, a recognized leader in development of supercomputer applications, and to advance computer research in existing areas such as the national data highway. This effort further demonstrates the strength of Convex's partnership with HP and our commitment to PA-RISC technology on our current high-performance systems."

CORPORATE PARTICIPATION
As part of a separate effort, the HP/Convex cluster will be used as a distributed-computing testbed for a new corporate "participation" program at NCSA. The program will allow corporations to join NCSA, HP, and Convex in developing distributed-computing environments and services and to port and optimize their application codes for the HP/Convex scalable platforms. "We view the participation program as a strong new component to our Industrial Program, allowing partners to move toward scalable computing in a structured way in collaboration with our staff," said John Stevenson, NCSA corporate officer. The center's industrial partners include AT&T, Caterpillar, Dow Chemical, Eli Lilly, FMC, Kodak, J. P.
Morgan, Motorola, Phillips Petroleum, Schlumberger, and United Technologies.

COLLABORATIVE TECHNOLOGIES
NCSA will use several of the HP workstations in its Collaboratorium research laboratory, which is dedicated to developing new collaborative hypermedia and multimedia-based technologies. NCSA's globally distributed hypermedia software system, NCSA Mosaic, provides a user interface to the Internet and expanding corporate networks [see access, Spring 1993]. This type of next-generation electronic information system, in concert with HP- and NCSA-developed digital conferencing systems, provides the foundation for globally networked information and data solutions and the infrastructure for agile manufacturing efforts by HP and NCSA's clients. "The combination of HP workstations and Convex supercomputer technology is a key move toward realizing NCSA's goal of seamless desktop-to-teraflop capabilities," says NCSA Director Larry Smarr. "The wealth of third-party software programs available on these systems provides us with an excellent environment for real-world science and engineering problems." s

NCSA's deskside HP Apollo 9000 cluster (opposite page) and close-up. (Photos by Wilmer Zehr)

HIGH DEFINITION TECHNOLOGIES AT NCSA
BY VINCENT JURGENS, TECHNICAL SERVICES, SCIENTIFIC COMMUNICATION AND MEDIA SERVICES

Advances in computational science, and in the hardware behind it, bring a general increase in information-handling capability. At the end of the process is a presentation of the data, whether in text, numeric, visual, or audio form. The visual analog to higher-powered computers, networks, and storage devices is High Definition (HD) video technology. NCSA is building an HD facility flexible enough to encompass the high-resolution displays users are already accustomed to on workstations, along with alternative display resolutions and new, recordable HDTV video resolutions. The equipment will be hard-wired into the HIPPI network being established for all of NCSA's high-performance machines.

FIRST PHASE OF HD
The first phase of the HD facility puts a PsiTech HFB24 multiresolution frame buffer on our HIPPI network, feeding a Barco 1200 multisync projector with a 100" rear-projection screen. The CRAY Y-MP, CONVEX C3880, Thinking Machines' CM-5, and SGI Onyx computer systems will eventually all be on the HIPPI network and have PsiTech drivers installed for users to work with. Maximum resolution is 2048 x 2048 pixels at a 60 Hz progressive-scan refresh rate. This is a much higher resolution than any current HDTV standards or proposals. Being multiresolution, however, both the frame buffer and projector can be used at various resolutions, from 1280 x 1024 at 120 Hz on up. Any new standards for American HDTV production can be programmed into the PsiTech in the future.

HD'S FUTURE AT NCSA
The future directions of HD at NCSA involve the development of a true HD video and audio production environment, including recording, manipulation, and editing, building upon our experience with digital NTSC post-production and the creation of an HD animation for SIGGRAPH '92. Other forms of HD workstations, whereby HD displays are used as the main interface to one's workstation, are also being explored. To facilitate integration of our onsite resources, a comprehensive infrastructure is being planned, including video routing, scan conversion (between numerous video standards), recording, and display devices.
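For a sense of scale, the display modes quoted earlier in this article imply the following raw pixel rates. The sketch below is a rough illustration only, and the 24-bit color depth is an assumption rather than a figure from the article.

```python
# Rough figures for the display modes quoted in this article. A sketch only:
# the 24-bit color depth is an assumption, not a number from the article.

def refresh_gbits(width, height, hz, bits_per_pixel=24):
    """Raw pixel bandwidth the frame buffer must sustain at a given mode."""
    return width * height * hz * bits_per_pixel / 1e9

print("2048 x 2048 @  60 Hz: %.1f Gbit/s" % refresh_gbits(2048, 2048, 60))
print("1280 x 1024 @ 120 Hz: %.1f Gbit/s" % refresh_gbits(1280, 1024, 120))
```

These refresh-rate bandwidths are sustained locally by the frame buffer itself; the HIPPI connection only has to deliver image data to the frame buffer as fast as applications produce it.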
Users are invited to visit and explore the ways they might use this High Definition facility. Contact Vincent Jurgens by phone at (217) 244-1543 or by electronic mail at vjurgens@ncsa.uiuc.edu (Internet). s

REMODELING A CAREER
BY PAULETTE SANCKEN, NCSA PUBLIC INFORMATION SPECIALIST

Talking with David Bennett is a great way to jump-start a day. His enthusiasm for his job and the story of his recent life overhaul are enough to make one consider the advantages of a midlife career change. Bennett's career shift began when he discovered the realm of scientific visualization while attending an open house at UIUC's Beckman Institute in 1988. Viewing computer images created by Donna Cox and Ray Idaszak started a mental journey that culminated with Bennett quitting his job as a business systems director at an extended care facility and going back to school--at age 41. (Cox is NCSA co-director of SCMS and UIUC professor of Art; Idaszak, manager of High Performance Technology at the Information Technology Division of MCNC, was once a scientific visualization specialist at NCSA.) Bennett's timing could not have been better. Parkland Community College was initiating a curriculum in visualization computer graphics [see access, May-June 1989]. He became one of the first to enter the program. (Parkland now offers an associate in applied science career program in visualization computer graphics.) After two years of full-time study, part-time work, and an internship with NCSA industrial partner Motorola, Bennett was ready to re-enter the work force.

OFF TO NCSC AND AVS
In August 1990, Bennett started a new job as a scientific visualization specialist I at the North Carolina Supercomputing Center (NCSC). Within a year, he was promoted to a visualization specialist II working with AVS (Application Visualization System) software and was made project leader of the International AVS Center (IAC) located at MCNC. Another promotion propelled him to manager of Strategic Programs; today his official title is manager of Technical Program Development. All accomplished in just short of three years--which is not surprising once you listen to the enthusiasm and innate curiosity that come through in Bennett's voice and choice of words: "I have this sort of hunger. And because of that, I want to find out about new things. I go out, absorb as much as possible, and get a good overview. . . . Finding out what is going on in the world has led to lots of other interesting developments [here at the center]." One such development is the AVS '93 Conference held in May 1993 in Orlando, FL. As coordinator of the conference, Bennett invited NCSA Director Larry Smarr to give the keynote speech. "When I did the introduction," Bennett reflects, "I thanked him. Without his blessing of the [Parkland] program--as well as Donna Cox's--I would probably still be out there selling shoes, or pumping gas, or no telling what. So I was really pleased that NCSA was willing to open their doors to a bunch of students who did not know what the heck they were doing, on some totally innovative program that had never been tried anywhere else. [NCSA and Parkland] had no idea whether it was going to succeed. This is the sort of thing that your kind of center should be doing--promoting those leading-edge innovations. Taking them forward, and then at some point passing the gauntlet on for others to carry forward." Smarr's conference address focused on metacomputing and AVS.
"AVS is typically thought of as a stand-alone workstation visualization tool," says Smarr. "This limits the size of datasets one can fit on a workstation, as well as the turnaround time being limited by the speed of the workstation." He suggests that by using AVS as a client-server, one could utilize the National Meta-center's HPCC machines as nationwide visualization servers. For more information about AVS, see "AVS simplifies visualization" on the opposite page. ROUNDING OUT THE CIRCLE Bennett's enthusiasm about visualization continues. Two years ago a program, similar to Parkland's, rose out of a conversation between Bennett and visitors from North Carolina's Wake Technical Community College. Interested in this new technology, college officials wanted to know what they could do to prepare their students for opportunities in the scientific visualization field. Bennett suggested the creation of a program like the one he graduated from at Parkland. This fall, Wake College will offer its first associate degree program in scientific visualization, which filled up just weeks after the announcement. If David Bennett is an example, graduates can expect a career that will only be limited by their own dreams. s Vision Dome--one of many projects initaiated by David Bennett, IAC. (Courtesy of AVS; visualization by Chris Landreth, MCNC Senior Scientific Animator) AVS SIMPLIFIES VISUALIZATION Distributed by Advanced Visual Systems Inc., Application Visualization System (AVS) is a scientific visualization and application development environment tool that enables users to construct visualizations from generic modules within the software. Built on a data flow paradigm, the easy-to-use graphical interface connects modules that result in the proper process for generating images. Bridging platforms AVS is useful to researchers because many people can use the same techniques and generic AVS modules even though they may be in different fields. AVS allows users to bridge platforms, or run any of its modules on any machine that has AVS installed. It is a distributed, heterogeneous, collaborative software tool. Several popular modules developed by users and distributed by the International AVS Center (IAC) utilize the Data Transfer Mechanism (DTM) protocol as a collaborative tool in the AVS environment. DTM was developed at NCSA by Jeff Terstriep, research programmer in the Computing and Communications Group [see access, September-December 1991]. Customizing modules "There are certain stages to the visualization process like reading in the data, scaling the data to fit in a certain range, associating colors with the numbers, or perhaps making a geometric form from the numbers. With AVS, every separate stage of this process is written as a module," explains William (Bill) Sherman, NCSA Virtual Reality project leader. With AVS, users can write new modules to be used with the modules built into AVS. For example, NCSA's PATHFINDER Program, being developed by NCSA Research Scientist Robert Wilhemson, is a collection of such modules that runs on several SGIs and on the CONVEX C3 using AVS [see access, July-September 1992]. NCSA Principal Investigator Joseph Lyding is running AVS on an HP system in his lab (see page 4). International AVS Center The IAC serves as a catalyst for expanding the AVS user base. It fosters discipline-specific module development and new uses for AVS. 
This worldwide clearinghouse collects, ports, and distributes user-contributed, public domain modules and acts as liaison between users and vendors. Advanced Visual Systems Inc., Convex Computer Corp., Digital Equipment Corp., Hewlett-Packard Co., IBM, Kubota Pacific Corp., and Sun Microsystems Inc. fund the IAC and promote software development. IAC publishes a quarterly journal called AVS Network News, hosts a yearly AVS User Group Conference, maintains an AVS Showroom, maintains a newsgroup (comp.graphics.avs), and presents lectures to user groups in Europe, Japan, and the U.S. For more information, contact the International AVS Center, P. O. Box 12889, 3021 Cornwallis Road, Research Triangle Park, NC 27709-2889 (U.S. mail); avs@ncsc.org (electronic mail); (919) 248-1100 (phone); (919) 248-1101 (facsimile).
NOTE: Randall Graham, science writer, contributed to this sidebar.

NCSA MOSAIC EXPLOITS INTERNET
BY KENNETH CHANG, NCSA RESEARCH PROGRAMMER, PUBLICATIONS GROUP

Click. Click. See an exhibit of dinosaur fossils on your computer screen. Click. Click. Listen to a half-hour speech by F. W. de Klerk to the National Press Club. Click. Click. Watch a short movie showing the path of last year's Hurricane Bob. The age of a global information network arrived some time ago, but few noticed, because there was no easy way to navigate through the nearly 10 terabytes of information--about 10,000 times more data than is found on a typical PC hard disk--that is publicly available on the Internet computer network. NCSA Mosaic, the latest software offering from NCSA's Software Development Group, opens a portal to this cornucopia with a few clicks of the mouse. Click. Click. Read the text of a recent Supreme Court decision. Click. Click. Examine the latest satellite picture from the U.S. Weather Service. Click. Click. Browse through the electronic version of access. Built atop the World Wide Web technology developed by CERN (the high-energy physics laboratory in Switzerland), NCSA Mosaic is, in essence, HyperCard on a global scale. Each phrase highlighted in blue is a link pointing to related information, which can include text, images, sounds, and video sequences. For example, clicking on the words "dinosaur exhibit" could send you zipping through cyberspace to the exhibit on dinosaur fossils put together at Honolulu Community College. "Mosaic is turning hypermedia into a medium that can be practically used," says Marc Andreessen, NCSA programmer of the X Window System version of Mosaic. "It's kind of like a '90s NCSA Telnet. Instead of logging into another computer, all of these computers on the network are able to give you information." NCSA Mosaic for the X Window System was released in April. Since then, 5,000 copies a month have been downloaded from NCSA's anonymous ftp server. By the time you read this, Version 2.0 should be available, as should versions for the Macintosh and Microsoft Windows. NCSA Mosaic offers access to the more than 200 World Wide Web servers, approximately 1,000 Gopher servers, and thousands of ftp servers, as well as WAIS databases and more obscure protocols. One of the greatest advantages of Mosaic, though, is that it combines all of these varied information servers into one unified information space. (That is, one rarely needs to know more than "click on the colored text" to effectively use the software.) In addition, Mosaic's Annotations feature enables one to attach personal notes to a document. Applications for this new technology are still in the discovery stage.
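The hypermedia model Mosaic presents, in which documents contain highlighted phrases that link to other documents, can be sketched in miniature. The toy example below is purely illustrative: the documents and links are made up, and this is not how NCSA Mosaic or the World Wide Web is implemented internally.

```python
# Toy illustration of the hypermedia idea described above: documents contain
# highlighted phrases that link to other documents, and "clicking" simply
# follows the link. The documents and links here are invented for illustration.

documents = {
    "home":    {"text": "Welcome. See the [dinosaur exhibit] or the [weather map].",
                "links": {"dinosaur exhibit": "dinos", "weather map": "weather"}},
    "dinos":   {"text": "Fossil photographs. Related: [weather map].",
                "links": {"weather map": "weather"}},
    "weather": {"text": "Latest satellite picture. Back to [home].",
                "links": {"home": "home"}},
}

def click(page, phrase):
    """Follow the link attached to a highlighted phrase, if there is one."""
    return documents[page]["links"].get(phrase, page)

page = "home"
for phrase in ["dinosaur exhibit", "weather map", "home"]:
    page = click(page, phrase)
    print(phrase, "->", documents[page]["text"])
```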
The National Consortium for High Performance Computing [see access, Spring 1993] plans a Digital Library and Information Systems Testbed, which will include such projects as an online library of medical images for the Human Genome and Human Brain Projects, archives of data from environmental monitoring satellites, and large datasets for social science research. Mosaic could be the software of choice for the testbed. "NCSA Mosaic is driving the next stage of evolution of the global Internet," says Joseph Hardin, head of the NCSA Software Development Group. "What emerges will be a new form of life online."

NCSA Mosaic is available via anonymous ftp from NCSA's ftp server. See the inside back cover for directions on how to access the ftp server from the Internet. This software is free for individual use, but it is copyrighted by the University of Illinois. NCSA and the University of Illinois retain rights to restrict the modification and redistribution of this software.

Mosaic screen shot.

book review
THE WHOLE INTERNET USER'S GUIDE AND CATALOG
BY PAULETTE SANCKEN, NCSA PUBLIC INFORMATION SPECIALIST, PUBLICATIONS GROUP

WHEN ASKED TO REVIEW ED KROL'S THE WHOLE INTERNET USER'S GUIDE AND CATALOG, I BALKED. ME? READ A TECHNICAL BOOK WITH THE NOTION OF UNDERSTANDING IT WELL ENOUGH TO TELL OTHERS? I HAD PLAYED AROUND IN CYBERSPACE ENOUGH TO KNOW THERE'S A LOT OF INFORMATION OUT THERE--WITH SEEMINGLY MANY WAYS TO VIEW IT--AND IT AMOUNTED TO A CONFUSING SYSTEM ONLY THE "TECHNOGEEK" COULD EMBRACE. BUT, OKAY, WHY NOT? I WAS CERTAINLY INTERESTED ENOUGH TO GIVE IT A GO.

My misgivings turned out to be insignificant. The book is a surprisingly good read. It's well written and understandable to the novice; I would expect the experienced user to benefit as well. "This book is intended for anyone who wants access to the Internet's tremendous resources. It's a book for professionals, certainly, but not computer professionals. It's designed for those who want to use the network, but who don't want to become a professional networker in order to use it," states Krol in his preface.

How the book is organized
KROL STARTS OUT WITH SOME HISTORY AND THEORY, PROMISING TO KEEP IT TO A MINIMUM--WHICH HE DOES. MOST OF THE BOOK DISCUSSES HOW TO USE THE TOOLS THAT ALLOW YOUR COMPUTER TO DO THINGS ON THE INTERNET: MOVING FILES WITH FTP, USING ELECTRONIC MAIL, ACCESSING NETWORK NEWS, WORKING WITH ARCHIE, AND FINDING SOMEONE ON THE SYSTEM. SEVERAL CHAPTERS ARE DEVOTED TO THE MOST RECENT, "FRIENDLY" TOOLS LIKE GOPHER, WAIS, AND THE WORLD WIDE WEB. THE FINAL SECTION IS A RESOURCE CATALOG--A LIST OF THINGS KROL AND SOME HELPERS FOUND ON THE INTERNET. ARRANGED BY SUBJECT, IT'S A GREAT PLACE TO START INVESTIGATING WHAT'S AVAILABLE.

Can't afford a magazine subscription? Log in to the electronic version of PC Magazine published by Ziff Davis (see page 293). Having special guests for dinner tonight? There's a recipe archive to browse through (see page 295). Developing carpal tunnel syndrome from too much Internet browsing? Try the FDA Electronic Bulletin Board for information (see page 301). Want to play chess with another user? Find out what others think about a recent movie? It's all there, and more. Krol writes, "You need three things to explore and use the Internet: a desire for information, the ability to use a computer, and access to the Internet. Desire for information is the most important." I might add that you need a fourth: a means to understand the workings of the Internet.
Krol's book certainly provides this vital link. If you are interested in the Internet but are not willing to work your way through a mass of technical jargon and oblique explanations, this book is for you. It's available in most bookstores and is published by O'Reilly & Associates Inc., Sebastopol, CA.

About the author
KROL WAS NCSA'S NETWORK MANAGER IN ITS EARLY DAYS. HIS ORIGINAL TASK WAS TO GET THE UIUC CAMPUS CONNECTED TO ARPANET, WHICH WAS THE BEGINNING OF HIS WORK IN WIDE-AREA NETWORKING. HE ALSO OVERSAW THE UIUC INSTALLATION OF NSFNET, AND HE NOW WORKS AT THE COMPUTING AND COMMUNICATIONS SERVICES OFFICE, DIRECTING NETWORK OPERATIONS AND NETWORK INFORMATION SERVICES FOR THE CAMPUS.

Illustrations from "Preface" (left) and "Moving Files: FTP" (right) in The Whole Internet User's Guide and Catalog. (Courtesy of O'Reilly & Associates Inc.)

center activities

CENTER CACHE

AWARDS
CHARLES CATLETT, ASSOCIATE DIRECTOR FOR COMPUTING AND COMMUNICATIONS, RECEIVED THE IEEE COMMUNICATIONS SOCIETY'S FRED W. ELLERSICK PRIZE PAPER AWARD FOR 1992, WHICH WILL BE PRESENTED AT THE GLOBECOM CONFERENCE, NOVEMBER 29-DECEMBER 2, 1993. THE PAPER, "IN SEARCH OF GIGABIT APPLICATIONS," WAS PUBLISHED IN THE APRIL 1992 ISSUE OF IEEE'S COMMUNICATIONS MAGAZINE.

OUTREACH
MICHAEL HEATH, LEADER OF NCSA'S MATHEMATICS AND COMPUTER SCIENCE TEAM IN THE APPLICATIONS GROUP, WAS NAMED CO-CHAIR FOR ALGORITHMS ON THE PROGRAM COMMITTEE FOR THE 5TH SYMPOSIUM ON FRONTIERS OF MASSIVELY PARALLEL COMPUTATION, TO BE HELD IN MCLEAN, VA, IN JUNE 1994.

Alaina Kanfer, social science research assistant; Michael Welge, research programmer; and Scott Lathrop, team leader, have been working with the Applications and Technical Subcommittees of the Champaign County Chamber of Commerce's Infostructure Task Force. The subcommittees are defining and investigating local needs in education, agribusiness, health care, community services, local government, and small business applications in response to NCSA Director Larry Smarr's challenge for Champaign-Urbana to take advantage of emerging technologies [see access, Spring 1993]. NCSA has been working to support Champaign-Urbana's FreeNet effort, called PrairieNet, which will coordinate with the chamber's activities.

PERSONNEL
BETH RICHARDSON, NCSA SENIOR RESEARCH PROGRAMMER, BEGAN WORKING WITH ALAN CRAIG ON NCSA'S TRAINING PROGRAM IN MID-JULY AND IS NOW A MEMBER OF NCSA'S USER SERVICES. SHE CONTINUES TO PROVIDE INDIVIDUAL ASSISTANCE TO USERS BY SPENDING PART OF HER TIME IN THE CONSULTING OFFICE.

Balaji Veeraraghavan recently joined NCSA's Applications Group as a research programmer in computational chemistry. Jay Alameda, formerly Dow Chemical Co.'s consultant, joined the group as a research programmer in chemical engineering. Prasad Ravi took Alameda's former position as Dow consultant.

PROMOTION
DONNA COX, CO-DIRECTOR OF NCSA'S SCIENTIFIC COMMUNICATIONS AND MEDIA SERVICES, WAS RECENTLY PROMOTED TO PROFESSOR OF ART AND DESIGN BY THE UI BOARD OF TRUSTEES. COMPUTER GRAPHIC ARTIST COX TEACHES IN THE UIUC COLLEGE OF FINE AND APPLIED ARTS.

PUBLICATION
AS ACCESS WAS IN THE FINAL PHASE OF PRODUCTION, THE ANNOUNCEMENT WAS MADE THAT A GROUP OF UIUC CHEMISTS HAD CREATED A 3D MODEL THAT MAY PREDICT WHERE HORMONAL STEROIDS BIND TO PROTEINS. THE RESEARCHERS USED NCSA'S CRAY-2 SYSTEM AND AN SGI CRIMSON WORKSTATION AT THE UIUC DEPARTMENT OF CHEMISTRY TO RUN THEIR MODELING CALCULATIONS.
The group's leader, Peter Wolynes, NCSA principal investigator, UIUC chemistry professor, and member of the Beckman Institute, made the announcement at the annual meeting of the American Chemical Society in Chicago in mid-August. The model will be published in the October Proceedings of the National Academy of Sciences. Other members of the research team include John Katzenellenbogen, UIUC professor of chemistry, and Zan Luthey-Schulten, UIUC chemist and NCSA research scientist.

VISITOR
THOMAS DEFANTI, CO-DIRECTOR OF UIC'S ELECTRONIC VISUALIZATION LABORATORY, IS A VISITING ASSOCIATE DIRECTOR FOR VIRTUAL ENVIRONMENTS AT NCSA DURING THE FALL SEMESTER OF THE 1993-94 ACADEMIC YEAR. WHILE ON SABBATICAL AT NCSA, DEFANTI WILL BE COLLABORATING WITH NCSA'S VISUALIZATION TASK FORCE TO BUILD A LEADING-EDGE VIRTUAL ENVIRONMENT FACILITY.

NICO HABERMANN (1932-1993)
A. NICO HABERMANN, NSF ASSISTANT DIRECTOR FOR COMPUTER AND INFORMATION SCIENCE AND ENGINEERING, SUFFERED A FATAL HEART ATTACK AT HIS HOME IN PITTSBURGH, PA, ON AUGUST 8. HE WAS 62.

Since 1991, Habermann had been on leave to NSF from Carnegie Mellon University, where he was the Alan J. Perlis Professor of Computer Science and founder of the Software Engineering Institute. An internationally known computer scientist, Habermann was recognized for his work in programming languages, operating systems, software engineering, and packages. Born in the Netherlands, he was educated at the Free University of Amsterdam before obtaining his doctorate in applied mathematics from the Technological University at Eindhoven. He is survived by his wife and four children. The directors and staff of NCSA extend their sympathy to the Habermann family. Habermann's passing is a great loss to the NSF centers, which he guided in the development of the National MetaCenter.

HCI PAPERS AVAILABLE
BY COLLEEN BUSHELL, VISUALIZATION AND HUMAN-COMPUTER INTERACTION DEVELOPER, SOFTWARE DEVELOPMENT GROUP

NCSA sponsors a special interest group in Human-Computer Interaction (HCI) [see access, October-December 1992]. This group, chaired by Colleen Bushell, meets twice monthly during the academic year to present current research and development activities. Areas of discussion include technical, sociological, psychological, and aesthetic issues of human-computer interaction; topics range across collaborative technologies, visualization, virtual reality, information systems, and multimedia. The group is open to the campus community and currently consists of approximately 60 people representing 16 UIUC departments. For additional information about the HCI group, contact Colleen Bushell at (217) 244-6830 via phone or cbushell@ncsa.uiuc.edu via Internet. Announcements for HCI meetings/presentations are posted in the UIUC newsgroup ncsa.hci.uiuc (Internet).

NCSA has made available papers written by HCI group members in its technical report series. The 1992-93 academic year offering, called HCI Papers: Compilation for 1992-93, will be collected as a single report. Additional papers will be added to the collection each semester. Reports that may be accessed soon from the NCSA servers are marked with an asterisk (*). (See the inside back cover for the addresses of the servers.) Following is a list of papers that will be available in the near future:

M. Pauline Baker, "Exploring the Relationship between User Task and Display Parameters in Support of Visual Data Analysis" *
Noshir S. Contractor and David R. Seibold, "Theoretical Framework for the Study of Structuring Processes in Group Decision Support Systems: Adaptive Structuration Theory and Self-Organizing Systems Theory" [Originally in Human Communication Research, vol. 19, no. 4 (June 1993)]
Patricia M. Jones, "Dimensions of Human-Computer Cooperative Problem Solving" *
Barbara J. O'Keefe, Noshir S. Contractor, Patricia M. Jones, and Stephen C-Y. Lu, "Studying and Supporting Human Interaction Processes" *
Carla Scaletti and Alan B. Craig, "Using Sound to Extract Meaning from Complex Data" [Originally in "Extracting Meaning from Complex Data: Processing, Display, Interaction II," Edward J. Farrell (Ed.), Proc. SPIE 1459, 207-219 (1991)]
Susan Leigh Star, "The Trojan Door: Organizations, Work, and the 'Open Black Box'" * [Originally in Systems/Practice, vol. 5 (1992)]
Susan Leigh Star, "Cooperation without Consensus in Scientific Problem Solving: Dynamics of Closure in Open Systems" * [Originally in CSCW: Cooperation or Conflict? Steve Easterbrook (Ed.), 93-105. London: Springer-Verlag (1993)]
Susan Leigh Star and Geoffrey Bowker, "Knowledge and Infrastructure in International Information Management: Problems of Classification and Coding" * [To appear in Information Acumen: The Understanding and Use of Knowledge in Modern Business, L. Bud (Ed.). London: Routledge (1993)]
Christopher D. Wickens, "Cognitive Issues in Virtual Reality" * [To appear in Virtual Reality, W. Barfield and T. Furness (Eds.)]
Christopher D. Wickens, "Virtual Reality in Education" * [To appear in Proc. IEEE Internatl. Conf. on Systems, Man, and Cybernetics (1992)]
Christopher D. Wickens and David H. Merwin, "Visualization of Higher Dimensional Databases" * [Originally in Proc. IEEE Visualization Conf. (1992)]
Christopher D. Wickens, David H. Merwin, and Emilie L. Lin, "Human Factors Implications of Graphics Enhancements for the Visualization of Scientific Data: Dimensional Integrality, Stereopsis, Motion, and Mesh" *

These papers may soon be obtained through the NCSA Technical Resource Catalog or by contacting Orders for Publications, NCSA Software, and Multimedia [see ncsa contacts, page 2].

NCSA and SDSC collaborated to hold simultaneous MetaCenter Computational Science Institutes on Parallel Computing in August. Two sessions were held jointly via videoteleconferencing. Those who participated are pictured at the Beckman Institute. (Photo by Tony Baylis, SCMS)

UniTree, NCSA's new archival storage system that replaced the Common File System (CFS), and the Andrew File System (AFS), NCSA's new distributed file system, are overseen by the Distributed File Systems and Mass Storage team of (left to right) Michelle Butler, Jeff Rosendale, Ral Geis, Arlan Finstead, and Nancy Yeager (center). (Photo by Tony Baylis, SCMS)

ACM/SAC '94
Phoenix, AZ, is the location of the 1994 ACM Symposium on Applied Computing (SAC '94), to be held March 6-8 at the Phoenix Civic Plaza. For the past nine years, SAC has been a primary forum for applied computing practitioners and researchers in areas related to genetic algorithms and other optimization techniques. The conference will be held in conjunction with the 1994 ACM Computer Science Conference (CSC '94). For further information, contact the conference coordinator, Edmund Deaton of San Diego State University, at deaton@cs.sdsu.edu (Internet).
NCSA PREPRINTS AND TECHNICAL REPORTS
BY GINNY HUDAK-DAVID, NCSA PUBLICATIONS EDITOR, PUBLICATIONS GROUP

Last year NCSA established a preprint series and a technical report series to showcase the work of NCSA staff. The preprints and reports processed to date are listed below. Reports and preprints that are accessible on the NCSA servers (anonymous FTP and the World Wide Web) are noted with an asterisk (*). See the inside back cover for the addresses of the servers.

PREPRINTS
P001 Andrew Abrahams, David Bernstein, David Hobill, Edward Seidel, and Larry Smarr, "Numerically Generated Black Hole Spacetimes: Interaction with Gravitational Waves" (February 1992)*
P002 James M. Stone and Michael L. Norman, "The Magnetic Collimation of Bipolar Outflows I: Adiabatic Simulations" (April 1992)*
P003 James M. Stone and Michael L. Norman, "ZEUS-2D: A Radiation Magnetohydrodynamics Code for Astrophysical Flows in Two Space Dimensions: I. The Hydrodynamic Algorithms and Tests" (April 1992)*
P004 James M. Stone and Michael L. Norman, "ZEUS-2D: A Radiation Magnetohydrodynamics Code for Astrophysical Flows in Two Space Dimensions: II. The Magnetohydrodynamic Algorithms and Tests" (April 1992)*
P005 James M. Stone, Dimitri Mihalas, and Michael L. Norman, "ZEUS-2D: A Radiation Magnetohydrodynamics Code for Astrophysical Flows in Two Space Dimensions: III. The Radiation Hydrodynamic Algorithms and Tests" (April 1992)*
P006 James M. Stone and Michael L. Norman, "The 3D Interaction of a Supernova Remnant with an Interstellar Cloud" (April 1992)*
P007 James M. Stone and Dimitri Mihalas, "Upwind Monotonic Interpolation Methods for the Solution of the Time Dependent Radiative Transfer Equation" (April 1992)*
P008 James M. Stone, John F. Hawley, Charles R. Evans, and Michael L. Norman, "A Test Suite for Magnetohydrodynamical Simulations" (April 1992)*
P009 Harrell Sellers, "Kinetic Isolation: An Efficient Method for Constrained Geometry Optimization" (April 1992)
P010 Harrell Sellers, "The C2-DIIS Convergence Acceleration Algorithm" (April 1992)
P011 Michael L. Norman and Dinshaw Balsara, "3D Hydrodynamical Simulations of Extragalactic Jets" (April 1992)
P012 Louis J. Wicker and Robert B. Wilhelmson, "Numerical Simulation of Tornadogenesis within a Supercell Thunderstorm" (May 1992)
P013 Mordecai-Mark Mac Low and Michael L. Norman, "Nonlinear Growth of Dynamical Overstabilities in Blast Waves" (July 1992)
P014 Sung Yi, Gerry D. Pollock, M. Fouad Ahmad, and Harry H. Hilton, "Thermo-viscoelastic Finite Element Analysis of Composite Shells" (September 1992)
P015 M. H. Swellam, M. F. Ahmad, R. H. Dodds, and F. V. Lawrence, "The Stress Intensity Factors of Tensile-shear Spot Welds" (September 1992)
P016 Edward Seidel and Wai-Mo Suen, "Towards a Singularity-proof Scheme in Numerical Relativity" (November 1992)
P017 Amaury Fonseca, Jr. and M. Fouad Ahmad, "PARFES--Parallel Finite Element Solvers for Flow-Induced Fracture" (November 1992)
P018 Robert B. Wilhelmson, Steve Koch, M. Arrott, J. Hagedorn, G. Mehrotra, C. Shaw, J. Thingvold, B. Jewett, and L. Wicker, "PATHFINDER: Probing ATmospHeric Flows in an INteractive and Distributed EnviRonment" (December 1992)
P019 Keith R. Searight, David P. Wojtowicz, Kenneth P. Bowman, Robert B. Wilhelmson, and John E. Walsh, "Envision: A Collaborative Analysis and Display System for Large Geophysical Data Sets" (December 1992)
P020 Michael L. Norman, "Magnetic Shaping of SNRs and Their Bubbles" (December 1992)
P021 Jerry M. Straka, Robert B. Wilhelmson, Louis J. Wicker, John R. Anderson, and Kelvin K. Droegemeier, "Numerical Solutions of a Nonlinear Density Current: A Benchmark Solution and Comparisons" (February 1992)
P022 Sung Yi, M. Fouad Ahmad, Harry H. Hilton, and Gerry D. Pollock, "Vibration Responses of Viscoelastically Damped Plates" (April 1993)
P023 Sung Yi, Gerry D. Pollock, M. Fouad Ahmad, and Harry H. Hilton, "Thermo-viscoelastic Analysis of Fiber-matrix Interphase" (April 1993)
P024 Deyang Song, Eric Golin, and Michael Norman, "A Fine-grain Dataflow Model for Scientific Visualization Systems" (April 1993)*
P025 Deyang Song and Michael L. Norman, "Cosmic Explorer: A Virtual Reality Environment for Exploring Cosmic Data" (April 1993)*
P026 Deyang Song and Michael L. Norman, "Visualizing Multiscale Cosmological Data using Virtual Reality" (May 1993)*
P027 Wenbo Y. Anninos and Michael L. Norman, "Nonlinear Hydrodynamics of Cosmological Sheets I. Numerical Techniques and Tests" (July 1993)*
P028 Rami M. HajAli, David A. Pecknold, and M. Fouad Ahmad, "Combined Micromechanical and Structural Finite Element Analysis of Laminated Composites" (July 1993)
P029 Marcus Wagner and David M. Ceperley, "Path Integral Monte Carlo Simulations of H2 Surfaces" (July 1993)*
P030 Marcus Wagner and David M. Ceperley, "Path Integral Monte Carlo Simulations of Thin 4He Films on a H2 Surface" (July 1993)*
P031 Danesh Tafti, "A Study of High-Order Spatial Finite Difference Formulations for the Incompressible Navier-Stokes Equations" (September 1993)*
P032 Danesh Tafti, "Vorticity Dynamics and Scalar Transport in Separated and Reattached Flow on a Blunt Plate" (September 1993)*
P033 Danesh Tafti, "Implementation of a General Purpose Finite-Difference Algorithm on the CM-5 for Direct Numerical Simulations of Turbulence" (September 1993)*

TECHNICAL REPORTS
TR001 Joel Replogle, Charlie Catlett, and Randy Butler, "FDDI Testbed Cray Research/Network Systems Beta Test Results, Revision I" (February 1991)
TR002 Charles E. Catlett, Domenico Ferrari, Roy Campbell, Larry Landweber, Bill Hibbard, and Murray Thompson, "BLANCA Testbed Annual Progress Report 1990-91" (August 1991)
TR003 Deyang Song and Michael L. Norman, "Nonlinear Interactive Motion Control Techniques for Virtual Space Navigation" (June 1992)*
TR004 Greg Bryan, "Adapting CMHOG to Cosmological Studies" (June 1992)*
TR005 Arlan Finestead and Nancy Yeager, "Performance of a Distributed Superscalar Storage Server" (October 1992)
TR006 Patrick J. Moran, "A Soft Module Mechanism for Data Flow Systems" (November 1992)
TR008 Charles E. Catlett, Larry Landweber, et al., "Annual Progress Report 1991-1992 BLANCA Testbed" (December 1992)
TR009 Charles E. Catlett and Lex Lane, "Experiments in Superscalar RISC Clusters" (December 1992)*
TR010 Herbert Edelsbrunner and Ping Fu, "Measuring Space Filling Diagrams" (April 1993)
TR011 Gerry D. Pollock, M. Fouad Ahmad, and Paul Corcoran, "A Finite Element Approach to Landfill Compaction" (April 1993)
TR012 C. D. Gregory, "A VIEWIT Cookbook for NMR Imaging and Spectroscopy" (April 1993)*
TR013 Patrick J. Moran, "Nicer-Slicer-Dicer: An Interactive Volume Visualization Tool" (August 1993)*
TR014 Patrick J. Moran, "Tele-Nicer-Slicer-Dicer: A New Tool for the Visualization of Large Volumetric Data" (August 1993)*

All preprints and technical reports are available free of charge by contacting Orders for Publications, NCSA Software, and Multimedia [see ncsa contacts, page 2].

CRAY-2 stops. NCSA's CRAY-2 system was unplugged after NCSA Director Larry Smarr typed the command for shutdown on August 26. The CRAY-2 was installed in 1988.
(Left to right) Bob Wilhelmson, Mike Norman, Jim Bottum, Charlie Catlett, Jeff Rosendale, and Smarr. (Photo by Tony Baylis, SCMS)

CSS93. NCSA hosted the Social Science Computing Association's 4th annual Computing in the Social Sciences conference (CSS93) in May. Attendees--shown with banquet speaker Cora Marrett (far left), NSF associate director for the Social, Behavioral, and Economic Sciences directorate--included (left to right) Bruce Tonn, incoming president; Douglas White, outgoing president; and William Bainbridge, head of sociology at NSF and chair of microcomputing, ASA (American Sociological Association). (Photo by Thompson-McClellan Photography)

STUDENTS FROM ST. PETERSBURG
Eleven students and a couple of their teachers from St. Petersburg, Russia (formerly Leningrad), visited NCSA while in Champaign-Urbana during the spring semester. They participated in an exchange program with University High School as part of a national program that involves 30 high schools around the U.S. Lex Lane, NCSA system security officer, guided them on a tour of NCSA's Machine Room. Jennifer Czoka, UIUC graduate student in English as a Second Language, acted as their interpreter during the visit. (Top) Curt Canada of the CM-5 team explained the workings of MPP to the group. (Bottom) Danilo Smalechko was especially intrigued with the CRAY-2 system. Lex Lane is at his left. (Photos by Thompson-McClellan Photography)

SIGGRAPH '93
SIGGRAPH '93, the 20th of the series, was held August 1-6 in Anaheim, CA. Its theme was "Be a Part of the Vision." Additions to this year's event included the first ACM Multimedia Conference, running concurrently; "Machine Culture: The Virtual Frontier," a curated, interactive art exhibit; and "Designing Technology," displaying "works that explore the role of design in the development of technology." NCSA's involvement in the conference was as follows:

Panels
Committee chair: Donna Cox, co-director of Scientific Communications & Media Systems (SCMS). Committee members: Mike McNeill, visualization specialist, and Mark Bajuk, visualization consultant in Applications.

Courses
Robin Bargar, research programmer, Software Development Group, and producer, SCMS, was a lecturer in "Applied Virtual Reality," an intermediate course on August 2, and "An Introduction to Data Sonification," a beginning-level course on August 5.

Electronic Theater
"Data Driven: the Story of Franz K." Chris Landreth, North Carolina Supercomputing Center; Robin Bargar, NCSA.

Technical Slide Set
"Gravitational Waves from a Strongly Distorted Black Hole." Scientific research: Peter Anninos, David Bernstein, Ed Seidel, Larry Smarr, NCSA; David Hobill, University of Calgary, Alberta. Scientific visualization: Mark Bajuk, NCSA [see access, May-June 1992].

abbreviations
ARPA   Advanced Research Projects Agency
CRI    Cray Research Inc.
HPCC   High Performance Computing and Communications
MPP    Massively Parallel Processing
NASA   National Aeronautics and Space Administration
NCAR   National Center for Atmospheric Research
NCSA   National Center for Supercomputing Applications
NOAA   National Oceanic and Atmospheric Administration
NRAO   National Radio Astronomy Observatory
NSF    National Science Foundation
TMC    Thinking Machines Corp.
UIC    University of Illinois at Chicago
UIUC   University of Illinois at Urbana-Champaign

documentation orders
Articles in this issue of access may refer to items that are available through the NCSA Technical Resources Catalog.
To receive a copy of the catalog, send your request to Orders for Publications, NCSA Software, and Multimedia [see ncsa contacts, page 2].

accessing NCSA's servers
Many of NCSA's publications (e.g., calendar of events, user guides, access, technical reports) as well as SDG software are available via the Internet on one of three NCSA servers: anonymous FTP, Gopher, or the World Wide Web. If you are connected to the Internet, we encourage you to take advantage of these easy-to-use servers to copy or view files.

Anonymous FTP address: ftp.ncsa.uiuc.edu (141.142.20.50)
Gopher server address: gopher.ncsa.uiuc.edu
World Wide Web address: www.ncsa.uiuc.edu
NCSA WWW home page: http://www.ncsa.uiuc.edu/General/NCSAHome.html

If you have any questions about accessing the servers, contact your local system administrator or network expert. Instructions for accessing the anonymous FTP server are below.

downloading from anonymous FTP server
A number of NCSA publications are installed on the NCSA anonymous FTP server. If you are connected to the Internet, you can download NCSA publications by following the procedure below. If you have any questions regarding the connection or procedure, consult your local system administrator or network expert.
1. Log on to a host at your site that is connected to the Internet and running software supporting the FTP command.
2. Invoke FTP by entering the Internet address of the server: ftp ftp.ncsa.uiuc.edu or ftp 141.142.20.50
3. Log on using anonymous for the name.
4. Enter your local login name and address (e.g., smith@ncsa.uiuc.edu) for the password.
5. Enter get README.FIRST to transfer the instructions file (ASCII) to your local host.
6. Enter quit to exit FTP and return to your local host.
7. The NCSA publications are located in the /ncsapubs directory.

All brand and product names are trademarks or registered trademarks of their respective holders.

access
National Center for Supercomputing Applications
152 Computing Applications Building
605 East Springfield Avenue
Champaign, IL 61820-5518