CumInCAD is a Cumulative Index of publications in Computer Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD Futures


_id 450c
authors Akin, Ömer
year 1990
title Computational Design Instruction: Toward a Pedagogy
source The Electronic Design Studio: Architectural Knowledge and Media in the Computer Era [CAAD Futures ‘89 Conference Proceedings / ISBN 0-262-13254-0] Cambridge (Massachusetts / USA), 1989, pp. 302-316
summary The computer offers enormous potential both in and out of the classroom that is realized only in limited ways through the applications available to us today. In the early days of the computer it was generally argued that it would replace the architect. When this idea became obsolete, the prevailing opinion of proponents and opponents alike shifted to the notion of the computer as merely adding to present design capabilities. This idea is so ingrained in our thinking that we still speak of "aiding" design with computers. It is clear to those who grasp the real potential of this still new technology - as in the case of many other major technological innovations - that it continues to change the way we design, rather than to merely augment or replace human designers. In the classroom the computer has the potential to radically change three fundamental ingredients: student, instruction, and instructor. It is obvious that changes of this kind spell out a commensurate change in design pedagogy. If the computer is going to be more than a passive instrument in the design studio, then design pedagogy will have to be changed fundamentally. While the practice of computing in the studio continues to be a significant aspect of architectural education, articulation of a viable pedagogy for use in the design studio is truly rare. In this paper the question of pedagogy in the CAD studio will be considered first. Then one particular design studio taught during Fall 1988 at Carnegie Mellon University will be presented. Finally, we shall return to issues of change in the student, instruction, and instructor, as highlighted by this particular experience.
series CAAD Futures
email
last changed 2003/11/21 15:15

_id 2f1a
authors Dabney, M.K., Wright, J.C. and Sanders, D.H.
year 1999
title Virtual Reality and the Future of Publishing Archaeological Excavations: the multimedia publication of the prehistoric settlement on Tsoungiza at Ancient Nemea
source New York: The Metropolitan Museum of Art
summary The Nemea Valley Archaeological Project is a study of settlement and land use in a regional valley system in Greece extending from the Upper Paleolithic until the present. Active field research was conducted by four teams between 1981 and 1990. The first component was a regional archaeological survey. Second, and closely related to the first, was a social anthropological study of modern settlement and land use. Next was a team assigned to excavate the succession of prehistoric settlements of Ancient Nemea on Tsoungiza. Last, historical ecologists, a palynologist, and a geologist formed the environmental component of the research. As a result of advances in electronic publishing, plans for the final publication of the Nemea Valley Archaeological Project have evolved. Complete publication of the excavation of the prehistoric settlements of Ancient Nemea on Tsoungiza will appear in an interactive multimedia format on CD/DVD in Fall 2000. This project is planned to be the first electronic publication of the American School of Classical Studies at Athens. We have chosen to publish in electronic format because it will meet the needs and interests of a wider audience, including avocational archaeologists, advanced high school and college students, graduate students, and professional archaeologists. The multimedia format on CD/DVD will permit the inclusion of text, databases, color and black-and-white images, two and three-dimensional graphics, and videos. This publication is being developed in cooperation with Learning Sites, Inc., which specializes in interactive three-dimensional reconstructions of ancient worlds http://www.learningsites.com. The Nemea Valley Archaeological Project is particularly well prepared for the shift towards electronic publishing because the project's field records were designed for and entered in computer databases from the inception of the project. 
Attention to recording precise locational information for all excavated objects enables us to place reconstructions of objects in their reconstructed architectural settings. Three-dimensional images of architectural remains and associated features will appear both as excavated and as reconstructed. Viewers will be able to navigate these images through the use of virtual reality. Viewers will also be able to reference all original drawings, photographs, and descriptions of the reconstructed architecture and objects. In this way a large audience will be able to view architectural remains, artifacts, and information that are otherwise inaccessible.
series other
last changed 2003/04/23 15:14

_id 298e
authors Dave, Bharat and Woodbury, Robert
year 1990
title Computer Modeling: A First Course in Design Computing
source The Electronic Design Studio: Architectural Knowledge and Media in the Computer Era [CAAD Futures ‘89 Conference Proceedings / ISBN 0-262-13254-0] Cambridge (Massachusetts / USA), 1989, pp. 61-76
summary Computation in design has long been a focus in our department. In recent years our faculty has paid particular attention to the use of computation in professional architectural education. The result is a shared vision of computers in the curriculum [Woodbury 1985] and a set of courses, some with considerable history and others just now being initiated. We (Dave and Woodbury) have jointly developed and at various times over the last seven years have taught Computer Modeling, the most introductory of these courses. This is a required course for all the incoming freshmen students in the department. In this paper we describe Computer Modeling: its context, the issues and topics it addresses, the tasks it requires of students, and the questions and opportunities that it raises. Computer Modeling is a course about concepts, about ways of explicitly understanding design and its relation to computation. Procedural skills and algorithmic problem solving techniques are given only secondary emphasis. In essential terms, the course is about models, of design processes, of designed objects, of computation and of computational design. Its lessons are intended to communicate a structure of such models to students and through this structure to demonstrate a relationship between computation and design. It is hoped that this structure can be used as a framework, around which students can continue to develop an understanding of computers in design.
series CAAD Futures
email
last changed 2003/05/16 20:58

_id 4ae8
authors Kokosalakis, Jen, Hohmann, L.M. and Pamplin, I.
year 1999
title Benefits of Data Integration in Building Modelling: 3D Object Oriented Professional Collaboration
source AVOCAAD Second International Conference [AVOCAAD Conference Proceedings / ISBN 90-76101-02-07] Brussels (Belgium) 8-10 April 1999, pp. 103-130
summary This paper will review current progress across the building construction industry in meeting demands for use of data integration with the 3D building model as the coordinating device in building design and development. Decades of national initiatives from NEDO (1990) to Egan (1998) have striven to encourage collaboration, first in the building design team and later targeting in programmes the means to accomplish this. In its 14th year 'The User Group' has intensified efforts to persuade the industry of the benefits of associating all data involved from the first briefing and conception of design needs and ideas, through the development of the design, testing for structures, costs, heating, lighting, urban and rural environmental impact, facilities management, adaptation and even the eventual controlled demolition of the building. Examples in this paper will be reported from 'The User Group' conference, "Profit from Data Integration: An industry update" (NEC, Birmingham, Nov. 1998), to indicate how various organisations are now profiting from data integration in 3D object oriented modelling.
series AVOCAAD
email
last changed 2005/09/09 10:48

_id aea2
authors Laurel, B. (ed.)
year 1990
title The Art of Human-Computer Interface Design
source New York: Addison-Wesley.
summary Human-computer interface design is a new discipline. So new in fact, that Alan Kay of Apple Computer quipped that people "are not sure whether they should order it by the yard or the ton"! Irrespective of the measure, interface design is gradually emerging as a much-needed and timely approach to reducing the awkwardness and inconveniences of human-computer interaction. "Increased cognitive load", "bewildered and tired users" - these are the byproducts of the "plethora of options and the interface conventions" faced by computer users. Originally, computers were "designed by engineers, for engineers". Little or no attention was, or needed to be, paid to the interface. However, the pervasive use of the personal computer and the increasing number and variety of applications and programs has given rise to a need to focus on the "cognitive locus of human-computer interaction", i.e. the interface. What is the interface? Laurel defines the interface as a "contact surface" that "reflects the physical properties of the interactors, the functions to be performed, and the balance of power and control." (p.xiii) Incorporated into her definition are the "cognitive and emotional aspects of the user's experience". In a very basic sense, the interface is "the place where contact between two entities occurs." (p.xii) Doorknobs, steering wheels, spacesuits - these are all interfaces. The greater the difference between the two entities, the greater the need for a well-designed interface. In this case, the two very different entities are computers and humans. Human-computer interface design looks at how we can lessen the effects of these differences. This means, for Laurel, empowering users by providing them with ease of use. "How can we think about it so that the interfaces we design will empower users?" "What does the user want to do?" These are the questions Laurel believes must be asked by designers.
These are the questions addressed directly and indirectly by the approximately 50 contributors to The Art of Human-Computer Interface Design. In spite of the large number of contributors to the book and the wide range of fields with which they are associated, there is a broad consensus on how interfaces can be designed for empowerment and ease of use. User testing, user contexts, user tasks, user needs, user control: these terms appear throughout the book and suggest ways in which design might focus less on the technology and more on the user. With this perspective in mind, contributor D. Norman argues that computer interfaces should be designed so that the user interacts more with the task and less with the machine. Such interfaces "blend with the task", and "make tools invisible" so that "the technology is subservient to that goal". Sellen and Nicol insist on the need for interfaces that are 'simple', 'self-explanatory', 'adaptive' and 'supportive'. Contributors Vertelney and Grudin are interested in interfaces that support the contexts in which many users work. They consider ways in which group-oriented tasks and collaborative efforts can be supported and aided by the particular design of the interface. Mountford equates ease of use with understating the interface: "The art and science of interface design depends largely on making the transaction with the computer as transparent as possible in order to minimize the burden on the user".(p.248) Mountford also believes in "making computers more powerful extensions of our natural capabilities and goals" by offering the user a "richer sensory environment". One way this can be achieved according to Saloman is through creative use of colour. Saloman notes that colour can not only impart information but that it can be a useful mnemonic device to create associations.
A richer sensory environment can also be achieved through use of sound, natural speech recognition, graphics, gesture input devices, animation, video, optical media and through what Blake refers to as "hybrid systems". These systems include additional interface features to control components such as optical disks, videotape, speech digitizers and a range of devices that support "whole user tasks". Rich sensory environments are often characteristic of game interfaces which rely heavily on sound and graphics. Crawford believes we have a lot to learn from the design of games and that they incorporate "sound concepts of user interface design". He argues that "games operate in a more demanding user-interface universe than other applications" since they must be both "fun" and "functional".
series other
last changed 2003/04/23 15:14

_id 46f1
authors Patterson, J.F.
year 1990
title Rendezvous: An Architecture for Synchronous Multi-User Applications
source Proceedings, Conference on Computer-Supported Cooperative Work. New York: ACM, pp. 317-328
summary Rendezvous is an architecture for creating synchronous multi-user applications. It consists of two parts: a run-time architecture for managing the multi-user session and a start-up architecture for managing the network connectivity. The run-time architecture is based on a User Interface Management System called MEL, which is a language extension to Common Lisp providing support for graphics operations, object-oriented programming, and constraints. Constraints are used to manage three dimensions of sharing: sharing of underlying information, sharing of views, and sharing of access. The start-up architecture decouples invoking and joining an application so that not all users need be known when the application is started. At present, the run-time architecture is completed and running test applications. As a first test of the complete Rendezvous architecture, we will implement a multi-user card game by the end of the summer.
series other
last changed 2003/04/23 15:50

_id 22a4
authors Rogers, D.F.
year 1990
title Mathematical elements for computer graphics
source McGraw Hill, USA
summary The second edition of this classic computer graphics book represents a major rewrite. The clear concise discussion, the detailed algorithms, worked examples and numerous illustrations make the book of special interest to students, programmers and computer graphics professionals. The numerous detailed worked examples make it especially suitable for self-study. The first edition of the book, published in 1976, was one of the earliest computer graphics books. That first edition is still a staple on the bookshelves of many of the pioneers in computer graphics. The book thoroughly covers two- and three-dimensional transformations including rotation, scaling, translation, reflection, rotation about arbitrary points and axes, reflection about arbitrary lines and through arbitrary planes and points at infinity. Plane and space curves including efficient methods for representing conic sections, cubic splines, parabolically blended, Bezier and rational and non-rational B-spline (NURBS) curves are discussed. The discussion of surfaces includes surfaces of revolution, sweep surfaces, ruled and developable surfaces, Coons surfaces, Bezier and rational and non-rational B-splines (NURBS) surfaces. As with all the topics in the book, the discussion of both rational and non-rational B-spline curves and surfaces is accompanied by numerous detailed worked examples. The appendices contain over 50 pseudocoded algorithms including over 25 algorithms for Bezier and B-spline curves and surfaces.
series other
last changed 2003/04/23 15:14
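As a concrete illustration of one topic the abstract lists - rotation about an arbitrary point via homogeneous transformation matrices - here is a minimal sketch. It is our own illustration under the standard translate-rotate-translate decomposition, not code or notation from Rogers' book; all function names are hypothetical.

```python
import math

def mat_mul(a, b):
    """Multiply two 3x3 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def translate(tx, ty):
    """Homogeneous 2D translation matrix."""
    return [[1, 0, tx], [0, 1, ty], [0, 0, 1]]

def rotate(theta):
    """Homogeneous 2D rotation about the origin (radians, counterclockwise)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def rotate_about(theta, px, py):
    """Rotation about an arbitrary point (px, py): composed right-to-left as
    move the pivot to the origin, rotate, then move the pivot back."""
    return mat_mul(translate(px, py),
                   mat_mul(rotate(theta), translate(-px, -py)))

def apply(m, x, y):
    """Apply a homogeneous transform to a 2D point."""
    v = [x, y, 1]
    rx, ry, w = (sum(m[i][k] * v[k] for k in range(3)) for i in range(3))
    return rx / w, ry / w

# Rotating (2, 1) by 90 degrees about the pivot (1, 1) moves it to (1, 2).
```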

_id c12b
authors Sakr, Yasser H. and Johnson, Robert E.
year 1991
title Computer-Aided Architectural Design Strategies: One Size Does Not Fit All
doi https://doi.org/10.52842/conf.acadia.1991.015
source Reality and Virtual Reality [ACADIA Conference Proceedings / ISBN 1-880250-00-4] Los Angeles (California - USA) October 1991, pp. 15-31
summary The practice of architecture is in the midst of significant change and an increasingly uncertain future. Socio-economic factors external to the profession are forcing firms to develop new strategies for delivering design services. Overlaying these external changes is the uncertainty resulting from the inevitable introduction of information technology, which is only beginning to have an impact on the profession. Some advocates see the emergence of a new form of design firm - the computerized design firm - as an intelligent organization structured around electronic work groups with powerful computation and communications tools (Catalano 1990). On the other hand, many practitioners still see CADD as an expensive technology whose primary result leads to an increase in overhead costs. But some practitioners and researchers (Coyne 1991) recognize both the potential and problems that computer-aided design presents to the profession. This research presents a framework for understanding how changing information technology might be appropriately integrated into the design firm. It argues that design is an increasingly diverse enterprise, and that this diversity must be understood in order to effectively integrate information technology. The study is divided into three sections. The first section develops an overview of major social, economic, and structural changes within the profession. The second section discusses two alternative approaches that have been utilized to integrate information technology into firms. The third part presents a framework for understanding how information technology may have an impact on strategies for structuring and organizing architectural firms.
series ACADIA
last changed 2022/06/07 07:56

_id 831d
authors Seebohm, Thomas
year 1992
title Discoursing on Urban History Through Structured Typologies
doi https://doi.org/10.52842/conf.acadia.1992.157
source Mission - Method - Madness [ACADIA Conference Proceedings / ISBN 1-880250-01-2] 1992, pp. 157-175
summary How can urban history be studied with the aid of three-dimensional computer modeling? One way is to model known cities at various times in history, using historical records as sources of data. While such studies greatly enhance the understanding of the form and structure of specific cities at specific points in time, it is questionable whether such studies actually provide a true understanding of history. It can be argued that they do not because such studies only show a record of one of many possible courses of action at various moments in time. To gain a true understanding of urban history one has to place oneself back in historical time to consider all of the possible courses of action which were open in the light of the then current situation of the city, to act upon a possible course of action and to view the consequences in the physical form of the city. Only such an understanding of urban history can transcend the memory of the actual and hence the behavior of the possible. Moreover, only such an understanding can overcome the limitations of historical relativism, which contends that historical fact is of value only in historical context, with the realization, due to Benedetto Croce and echoed by Rudolf Bultmann, that the horizon of "deeper understanding" lies in "the actuality of decision" (Seebohm and van Pelt 1990).

One cannot conduct such studies on real cities except, perhaps, as a point of departure at some specific point in time to provide an initial layout for a city knowing that future forms derived by the studies will diverge from that recorded in history. An entirely imaginary city is therefore chosen. Although the components of this city at the level of individual buildings are taken from known cities in history, this choice does not preclude alternative forms of the city. To some degree, building types are invariants and, as argued in the Appendix, so are the urban typologies into which they may be grouped. In this imaginary city students of urban history play the role of citizens or groups of citizens. As they defend their interests and make concessions, while interacting with each other in their respective roles, they determine the nature of the city as it evolves through the major periods of Western urban history in the form of three-dimensional computer models.

My colleague R.J. van Pelt and I presented this approach to the study of urban history previously at ACADIA (Seebohm and van Pelt 1990). Yet we did not pay sufficient attention to the manner in which such urban models should be structured and how the efforts of the participants should be coordinated. In the following sections I therefore review what the requirements are for three-dimensional modeling to support studies in urban history as outlined both from the viewpoint of file structure of the models and other viewpoints which have bearing on this structure. Three alternative software schemes of progressively increasing complexity are then discussed with regard to their ability to satisfy these requirements. This comparative study of software alternatives and their corresponding file structures justifies the present choice of structure in relation to the simpler and better known generic alternatives which do not have the necessary flexibility for structuring the urban model. Such flexibility means, of course, that in the first instance the modeling software is more time-consuming to learn than a simple point-and-click package, in accord with the now established axiom that ease of learning software tools is inversely related to the functional power of the tools (Smith 1987).

series ACADIA
email
last changed 2022/06/07 07:56

_id b565
authors Yessios, Chris I. (Ed.)
year 1989
title New Ideas and Directions for the 1990’s [Conference Proceedings]
doi https://doi.org/10.52842/conf.acadia.1989
source ACADIA Conference Proceedings / Gainesville (Florida - USA) 27-29 October 1989, 262 p.
summary About a year ago, a comment of mine to Bob Johnson that recent Acadia Conferences appeared to be bypassing some of the real issues of CAAD and that the attendants seemed to be missing the opportunity to debate and to argue, landed me a request to be the Technical Chair for this Acadia 89. In spite of an expected heavy load this past year, I could not refuse. I certainly did not realize at the time what it would take to put the technical program of this Conference together: two "calls" for papers, many, many phone calls and the gracious acceptance of three invited speakers and twelve panelists. In response to a recommendation by Pamela Bancroft, last year's Technical Chair, the first call for papers had a deadline about a month earlier than it has been in recent years. This must have found our membership unprepared and generated only thirteen submissions. A second call was issued with the end of July as a deadline. It generated another eleven submissions. Out of that total of twenty-four papers, ten were selected and are presented in this Conference. The selection process was based strictly on averaging the grades given by each of the three referees who blindly reviewed each paper. The names of the reviewers have been listed earlier in this volume and I wish to take this opportunity to wholeheartedly thank them. In most cases the reviewers offered extensive comments which were returned to the authors and helped them improve their papers. Many of the papers have actually been rewritten in response to the reviewers' comments and what are included in these Proceedings are substantially improved versions of the papers originally submitted. This is the way it is supposed to be, but could not be done without the excellent response by the authors. They deserve our sincere thanks. It must be noted that the reviewers were not always in agreement, which should tell us something about the diverse orientations of our members.
In the case of at least three papers, one reviewer gave a 0 or 1 (very low) while another gave a 9 or 10 (very high). In these cases the third reviewer gave the deciding grade. In no case was there a need for me to break a tie. Under normal circumstances, these "controversial" papers should have gone out for another cycle of reviews. Time did not permit us to do so. However, I feel confident that the papers which have been selected deserve to be heard. It may be worth speculating why it took two calls to generate only 24 submissions when last year we had 42. There are a number of factors which must have had an effect. First of all, the early deadline. Secondly, the theme of this year's Conference was more focussed than it has been in the recent past. In addition, it was quite challenging. Even though the calls also encouraged submissions in areas other than the central theme, they discouraged contributions which might be redundant with past presentations. This must have filtered out presentations about "CAD in the studio" which did not have an orientation distinctively different from what everybody else is doing. Last, and possibly the most decisive factor, must have been that this year Acadia was in competition with the Futures Conference. It does not take much to observe that more than half of the presentations at the CAAD Futures Conference were given by active Acadia members. Acadia should by all means be delighted that the bi-annual Futures took place in the States this year, but it certainly made our organizational task harder. As a matter of fact, as a record of CAAD happenings in 1989, I believe the Proceedings of the two Conferences complement each other and should be read as a pair.
series ACADIA
email
more http://www.acadia.org
last changed 2022/06/07 07:49

_id eb5f
authors Al-Sallal, Khaled A. and Degelman, Larry O.
year 1994
title A Hypermedia Model for Supporting Energy Design in Buildings
doi https://doi.org/10.52842/conf.acadia.1994.039
source Reconnecting [ACADIA Conference Proceedings / ISBN 1-880250-03-9] Washington University (Saint Louis / USA) 1994, pp. 39-49
summary Several studies have discussed the limitations of the available CAAD tools and have proposed solutions [Brown and Novitski 1987, Brown 1990, Degelman and Kim 1988, Schuman et al 1988]. The lack of integration between the different tasks that these programs address and the design process is a major problem. Schuman et al [1988] argued that in architectural design many issues must be considered simultaneously before the synthesis of a final product can take place. Studies by Brown and Novitski [1987] and Brown [1990] discussed the difficulties involved with integrating technical considerations in the creative architectural process. One aspect of the problem is the neglect of technical factors during the initial phase of the design that, as the authors argued, results from changing the work environment and the laborious nature of the design process. Many of the current programs require the user to input a great deal of numerical values that are needed for the energy analysis. Although there are some programs that attempt to assist the user by setting default values, these programs distract the user with their extensive arrays of data. The appropriate design tool is the one that helps the user to easily view the principal components of the building design and specify their behaviors and interactions. Data abstraction and information parsimony are the key concepts in developing a successful design tool. Three different approaches for developing an appropriate CAAD tool were found in the literature. Although there are several similarities among them, each is unique in solving certain aspects of the problem. Brown and Novitski [1987] emphasize the learning factor of the tool as well as its highly graphical user interface. Degelman and Kim [1988] emphasize knowledge acquisition and the provision of simulation modules. 
The Windows and Daylighting Group of Lawrence Berkeley Laboratory (LBL) emphasizes the dynamic structuring of information, the intelligent linking of data, the integrity of the different issues of design and the design process, and the extensive use of images [Schuman et al 1988]; these attributes, incidentally, define the word hypermedia. The LBL model, which uses hypermedia, seems to be the more promising direction for this type of research. However, there is still a need to establish a new model that integrates all aspects of the problem. The areas in which the present research departs from the LBL model can be listed as follows: it acknowledges the necessity of regarding the user as the center of the CAAD tool design, it develops a model that is based on one of the high level theories of human-computer interaction, and it develops a prototype tool that conforms to the model.

series ACADIA
email
last changed 2022/06/07 07:54

_id 8869
authors Ataman, Osman
year 2002
title Historical Analysis of Building - (Re)Construction in Olivette Park, USA
source SIGraDi 2002 - [Proceedings of the 6th Iberoamerican Congress of Digital Graphics] Caracas (Venezuela) 27-29 November 2002, pp. 63-66
summary From 1959 to 1990, East St. Louis, Illinois deteriorated from an “All-American City” to a national symbol of urban blight. Located on the Mississippi River, the East St. Louis of today faces severe economic, social, and environmental problems. Nearly one-quarter of the city’s work force is unemployed and about 40 percent of families are living below the poverty level. But East St. Louis was not always a distressed community. With strong ties to St. Louis and the surrounding region, East St. Louis once flourished as the country’s second busiest railroad hub. Powerful economic and socio-political forces, as well as unfortunate historical circumstance, propelled the city into a downward spiral that drastically decreased the quality of life in East St. Louis. This paper presents the digital re-construction of the buildings and the analyses of the historical aspects of the housing construction and types in this area. Furthermore, it reports the survey and assessment of the quality of building stocks based on the revitalization plan that will provide some guidelines and suggestions for improvement, stability, and future needs.
series SIGRADI
email
last changed 2016/03/10 09:47

_id afa0
authors Aziz, N.M., Bata, R. and Sudarshan, B.
year 1990
title Bezier Surface : Surface Intersection
source IEEE Computer Graphics and Applications. January, 1990. vol. 10: pp. 50-58
summary In this article the authors explain the computational requirement and accuracy of two methods for finding the intersection of Bezier surfaces. In both methods, the existence of an intersection curve is confirmed by using the convex hull property of such surfaces. The first method evaluates the intersection by recursive subdivision of two patches with overlapping hulls. The second method detects a point on the intersection curve, then incrementally traces the intersection in the parametric spaces of the two surfaces. With both methods the intersection of a pair of first-order planar patches must be solved analytically. The intersection is approximated by first-order Bezier patches in the first case, and by planar triangles in the second. Overall, the method of incremental tracing gives more accurate results than the method of recursive subdivision.
keywords recursion, curves, convex hull, curved surfaces, intersection, Bezier, triangulation
series CADline
last changed 2003/06/02 14:42
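The recursive-subdivision method summarized above can be sketched for the simpler case of planar Bezier curves. This is an illustrative reduction under our own assumptions - axis-aligned bounding boxes of the control points stand in for the convex hulls, and all names are hypothetical - not code from the paper.

```python
def de_casteljau_split(pts, t=0.5):
    """Split a Bezier curve (list of (x, y) control points) at parameter t,
    returning the control points of the two halves."""
    left, right, work = [pts[0]], [pts[-1]], list(pts)
    while len(work) > 1:
        work = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
                for (x0, y0), (x1, y1) in zip(work, work[1:])]
        left.append(work[0])
        right.append(work[-1])
    return left, right[::-1]

def bbox(pts):
    xs, ys = zip(*pts)
    return min(xs), min(ys), max(xs), max(ys)

def boxes_overlap(a, b):
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def intersect(p, q, eps=1e-4, hits=None):
    """Recursively subdivide two curves while the boxes bounding their
    control points (a stand-in for the convex hulls) overlap; when both
    boxes shrink below eps, record the box center as an intersection."""
    if hits is None:
        hits = []
    pa, qa = bbox(p), bbox(q)
    if not boxes_overlap(pa, qa):
        return hits  # hulls disjoint: no intersection in this branch
    if max(pa[2] - pa[0], pa[3] - pa[1],
           qa[2] - qa[0], qa[3] - qa[1]) < eps:
        hits.append(((pa[0] + pa[2]) / 2, (pa[1] + pa[3]) / 2))
        return hits
    p1, p2 = de_casteljau_split(p)
    q1, q2 = de_casteljau_split(q)
    for a in (p1, p2):
        for b in (q1, q2):
            intersect(a, b, eps, hits)
    return hits
```

The surface version the abstract describes works the same way, with patch subdivision replacing curve subdivision and the results collected into an intersection curve.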

_id 4cf2
authors Barsky, Brian A.
year 1990
title Geometric Continuity of Parametric Curves : Constructions of Geometrically Continuous Splines
source IEEE Computer Graphics and Applications. January, 1990. vol. 10: pp. 60-68 : ill
summary This article is part two of an article published in November 1989. In the first article, theoretical foundations for geometric continuity were presented. In this article, the basic theory is applied to the construction of geometrically continuous spline curves.
keywords Bezier, curves, curved surfaces, splines, continuity, computational geometry, computer graphics
series CADline
last changed 2003/06/02 13:58
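The distinction behind the article's topic can be shown in a few lines. This is my own illustration, not code from the article: at the joint of two cubic Bezier curves, parametric C1 continuity requires equal tangent vectors, while geometric G1 continuity only requires the tangents to point the same way (one a positive multiple of the other).

```python
# Hypothetical helper illustrating G1 continuity at a cubic Bezier joint.

def g1_continuous(a, b, eps=1e-9):
    """a, b: lists of four (x, y) control points; curves must join at a[3] == b[0]."""
    if a[3] != b[0]:
        return False
    # End tangent of a and start tangent of b (each up to the factor 3).
    ta = (a[3][0] - a[2][0], a[3][1] - a[2][1])
    tb = (b[1][0] - b[0][0], b[1][1] - b[0][1])
    cross = ta[0] * tb[1] - ta[1] * tb[0]   # zero iff tangents are parallel
    dot = ta[0] * tb[0] + ta[1] * tb[1]     # positive iff same direction
    return abs(cross) < eps and dot > 0
```

For example, curves with end tangent (1, 0) and start tangent (2, 0) are G1 but not C1 continuous, since the tangent magnitudes differ while the direction is preserved.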

_id abc9
authors Campbell, A.T. and Fussell, D.S.
year 1990
title Adaptive Mesh Generation for Global Diffuse Illumination
source Computer Graphics Proc. SIGGRAPH 90 Vol. 24, No. 4, Aug. 1990, pp. 155-164
summary Rapid developments in the design of algorithms for rendering globally illuminated scenes have taken place in the past five years. Net energy methods such as the hemicube and other radiosity algorithms have become very effective at computing the energy balance for scenes containing diffusely reflecting objects. Such methods first break up a scene description into a relatively large number of elements, or possibly several levels of elements. Energy transfers among these elements are then determined using a variety of means. While much progress has been made in the design of energy transfer algorithms, little or no attention has been paid to the proper generation of the mesh of surface elements. This paper presents a technique for adaptively creating a mesh of surface elements as the energy transfers are computed. The method allows large numbers of small elements to be placed at parts of the scene where the most active energy transfers occur without requiring that other parts of the scene be needlessly subdivided to the same degree. As a result, the computational effort in the energy transfer computations can be concentrated where it has the most effect. CR Categories and Subject Descriptors: I.3.3 [Computer Graphics]: Picture/Image Generation - Display algorithms; I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism. General Terms: Algorithms. Additional Key Words and Phrases: global illumination, radiosity, mesh-generation, diffuse, data structure, incremental.
series journal paper
last changed 2003/04/23 15:50
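The central idea of the abstract, refining the element mesh only where illumination varies rapidly, can be sketched with a quadtree subdivision. This is a minimal stand-in, not the paper's algorithm: a supplied `illum` function plays the role of the computed energy transfers, and an element is split whenever its corner samples disagree by more than a tolerance.

```python
# Hypothetical sketch of adaptive element refinement for radiosity-style
# meshing: subdivide a square element while illumination varies across it.

def refine(x, y, size, illum, max_depth=6, tol=0.05, out=None):
    """Quadtree-refine the element at (x, y) of side `size`; return (x, y, size) leaves."""
    if out is None:
        out = []
    corners = [illum(x, y), illum(x + size, y),
               illum(x, y + size), illum(x + size, y + size)]
    if max_depth == 0 or max(corners) - min(corners) < tol:
        out.append((x, y, size))     # smooth enough (or depth limit): keep element
        return out
    half = size / 2
    for dx in (0, half):             # otherwise split into four children
        for dy in (0, half):
            refine(x + dx, y + dy, half, illum, max_depth - 1, tol, out)
    return out
```

With a sharp shadow edge, e.g. `illum = lambda x, y: 1.0 if x < 0.3 else 0.0`, the small elements cluster along x = 0.3 while the uniformly lit and uniformly dark regions stay coarse, which is exactly the concentration of effort the abstract describes. Sampling only the corners can miss interior variation; a production scheme would refine against the actual transfer computation.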

_id 91c4
authors Checkland, P.
year 1981
title Systems Thinking, Systems Practice
source John Wiley & Sons, Chichester
summary "Whether by design, accident or merely synchronicity, Checkland appears to have developed a habit of writing seminal publications near the start of each decade which establish the basis and framework for systems methodology research for that decade." Hamish Rennie, Journal of the Operational Research Society, 1992. Thirty years ago Peter Checkland set out to test whether the Systems Engineering (SE) approach, highly successful in technical problems, could be used by managers coping with the unfolding complexities of organizational life. The straightforward transfer of SE to the broader situations of management was not possible, but by insisting on a combination of systems thinking strongly linked to real-world practice Checkland and his collaborators developed an alternative approach - Soft Systems Methodology (SSM) - which enables managers of all kinds and at any level to deal with the subtleties and confusions of the situations they face. This work established the now accepted distinction between hard systems thinking, in which parts of the world are taken to be systems which can be engineered, and soft systems thinking in which the focus is on making sure the process of inquiry into real-world complexity is itself a system for learning. Systems Thinking, Systems Practice (1981) and Soft Systems Methodology in Action (1990) together with an earlier paper Towards a Systems-based Methodology for Real-World Problem Solving (1972) have long been recognized as classics in the field. Now Peter Checkland has looked back over the three decades of SSM development, brought the account of it up to date, and reflected on the whole evolutionary process which has produced a mature SSM. SSM: A 30-Year Retrospective, here included with Systems Thinking, Systems Practice, closes a chapter on what is undoubtedly the most significant single research programme on the use of systems ideas in problem solving.
Now retired from full-time university work, Peter Checkland continues his research as a Leverhulme Emeritus Fellow.
series other
last changed 2003/04/23 15:14

_id avocaad_2001_02
id avocaad_2001_02
authors Cheng-Yuan Lin, Yu-Tung Liu
year 2001
title A digital Procedure of Building Construction: A practical project
source AVOCAAD - ADDED VALUE OF COMPUTER AIDED ARCHITECTURAL DESIGN, Nys Koenraad, Provoost Tom, Verbeke Johan, Verleye Johan (Eds.), (2001) Hogeschool voor Wetenschap en Kunst - Departement Architectuur Sint-Lucas, Campus Brussel, ISBN 80-76101-05-1
summary Before computers were well developed, there was already research on representation using conventional media (Gombrich, 1960; Arnheim, 1970). For ancient architects, the design process was described abstractly by text (Hewitt, 1985; Cable, 1983); the process evolved from unselfconscious to conscious ways (Alexander, 1964). With the appearance of 2D drawings, such drawings could only express abstract visual thinking and a visually conceptualized vocabulary (Goldschmidt, 1999). Then, with the massive use of physical models in the Renaissance, the form and space of architecture were given better precision (Millon, 1994). Researchers continued their attempts to identify the nature of different design tools (Eastman and Fereshe, 1994). Simon (1981) observed that humans increasingly rely on other specialists, computational agents, and materials to augment their cognitive abilities. This discourse was verified by recent research on the conception of design and its expression using digital technologies (McCullough, 1996; Perez-Gomez and Pelletier, 1997). While other design tools did not change as much as representation (Panofsky, 1991; Koch, 1997), the involvement of computers in conventional architectural design arouses a new design thinking of digital architecture (Liu, 1996; Krawczyk, 1997; Murray, 1997; Wertheim, 1999). The notion of the link between ideas and media is emphasized throughout various fields, such as architectural education (Radford, 2000), the Internet, and the restoration of historical architecture (Potier et al., 2000). Information technology is also an important tool for civil engineering projects (Choi and Ibbs, 1989). Compared with conventional design media, computers avoid some errors in the process (Zaera, 1997). However, most applications of computers to construction are restricted to simulations of the building process (Halpin, 1990).
It is worth studying how to employ computer technology meaningfully to bring significant changes to the concept stage during the process of building construction (Madrazo, 2000; Dave, 2000) and communication (Haymaker, 2000). In architectural design, concept design was achieved through drawings and models (Mitchell, 1997), while the working drawings and even shop drawings were brewed and communicated through drawings only. However, the most effective method of shaping building elements is to build models by computer (Madrazo, 1999). With the trend of 3D visualization (Johnson and Clayton, 1998) and the difference of designing between the physical environment and virtual environment (Maher et al., 2000), we intend to study the possibilities of using digital models, in addition to drawings, as a critical medium in the conceptual stage of the building construction process in the near future (just as the critical role that physical models played in the early design process in the Renaissance). This research is combined with two practical building projects, following the progress of construction by using digital models and animations to simulate the structural layouts of the projects. We also tried to solve the complicated and even conflicting problems in the detail and piping design process through an easily accessible and precise interface. An attempt was made to delineate the hierarchy of the elements in a single structural and constructional system, and the corresponding relations among the systems. Since building construction is often complicated and even conflicting, the precision needed to complete the projects cannot be based merely on 2D drawings with some imagination.
The purpose of this paper is to describe all the related elements according to precision and correctness, to discuss every possibility of different thinking in the design of electric-mechanical engineering, to receive feedback from the construction projects in the real world, and to compare the digital models with conventional drawings. Through the application of this research, the subtle relations between the conventional drawings and digital models can be used in the area of building construction. Moreover, a theoretical model and standard process is proposed by using conventional drawings, digital models and physical buildings. By introducing the intervention of digital media in the design process of working drawings and shop drawings, there is an opportune chance to use the digital media as a prominent design tool. This study extends the use of digital models and animation from the design process to the construction process. However, the entire construction process involves various details and exceptions, which are not discussed in this paper. These limitations should be explored in future studies.
series AVOCAAD
email
last changed 2005/09/09 10:48

_id 417a
authors Cipriani, R., Lagomarsino, A.D., Stagnaro, A., Valenti, E. and Sambolino, T.
year 1990
title Some Years' Experience Teaching CAAD
source The Electronic Design Studio: Architectural Knowledge and Media in the Computer Era [CAAD Futures ‘89 Conference Proceedings / ISBN 0-262-13254-0] Cambridge (Massachusetts / USA), 1989, pp. 347-361
summary In the conventional way of teaching architecture, it is common to think of design as the final synthesis of an intellectual process (composizione in Italian) integrating different elements from different curriculum subjects: history, structural analysis, technology, regional and urban planning, and so on. These elements, being comprehensive of their specific domains, together build the project. This process is supported by a long tradition that cannot easily be modified; however, we must not consider it to be the only one. Architectural practice should be much more. The Scuole di Architettura have walked a long and difficult road in the last thirty years, with a significant widening of interest in social, political, and economic issues. There have been recurring attempts at epistemological reformulation in some areas. There has been an acknowledgment of a crisis in contemporary town planning and a dimming of several certitudes that had developed with the birth and growth of the modernist school. And there has been a weakening of the promises that had given life to the vigorous discussion about town and regional planning. All of this leads to a reconsideration of the meaning and the deeper assumptions that the project implies, a question mark at the center of the human sciences that architectural practice involves. The old tradition, which assigned composition a central role in the project, is no longer sufficient because it is related to a reductive reading of epistemology that views human sciences as defining segments of physical knowledge of the actual world. Contemporary reflection on the difference between understanding and unfolding, together with the attention given to interpreting a moment as compared to purely describing one, gives to the project the task of inquiry instead of solution.
series CAAD Futures
last changed 1999/04/03 17:58

_id e5e2
authors Coyne, R.D., Rosenman, M.A. and Radford, A.D. (et.al.)
year 1990
title Knowledge Based Design Systems
source 576 p. : ill Reading, Mass.: Addison-Wesley, 1990. includes bibliographies and index.
summary This book describes the bases, approaches, techniques, and implementations of knowledge-based design systems, and advocates and develops new directions in design systems generally. A formal model of design coupled with the notion of prototypes provides a coherent framework for all that follows and is a platform on which a comprehension of knowledge-based design rests. The book is divided into three parts. Part I, Design, examines and describes design and design processes, providing the context for the remainder of the book. Part II, Representation and Reasoning, explores the kinds of knowledge involved in design and the tools and techniques available for representing and controlling this knowledge. It examines the attributes of design that must be described and the ways in which knowledge-based methods are capable of describing and controlling them. Part III, Knowledge-Based Design, presents in detail the fundamentals of the interpretation of design, including the role of expert systems in interpreting existing designs, before describing how to produce designs within a knowledge-based environment. This part includes a detailed examination of design processes from the perspective of how to control these processes. Within each of these processes, the place and role of knowledge is presented and examples of knowledge-based design systems are given. Finally, the authors examine central areas of human design and demonstrate what current knowledge-based design systems are capable of doing now and in the future.
keywords knowledge base, design process, representation, CAD, AI, prototypes, expert systems
series CADline
email
last changed 2003/05/17 10:13

_id e33a
authors De Cola, S., De Cola, B. and Pentasuglia, Francesco
year 1990
title Messina 1908: The Invisible City
source The Electronic Design Studio: Architectural Knowledge and Media in the Computer Era [CAAD Futures ‘89 Conference Proceedings / ISBN 0-262-13254-0] Cambridge (Massachusetts / USA), 1989, pp. 239-246
summary The initial purposes of this work were to build a 3D model of the old city of Messina and to reconstruct a walk through it; to understand the "Ghost city," the parts that form it, and the rules of its plan, which are explicit in some cases but hidden most of the time; to measure its space, appreciate the similarities to and differences from modern city plans, and use the information to improve the plans of tomorrow. It might seem a useless study of a nonexistent city, and yet during the months of detailed work, of patient reconstruction from the surveys and photographs of the city destroyed in 1908, we began to consider how it was still possible to obtain spatial values of and to project behaviors in the lost city, in other words, to practice tests on memory that are very interesting for people working in a context in which memory no longer exists. The work presented here is the first stage of a more complex research project still to be carried out on Messina as it was at the end of the nineteenth century. Here we constructed a 3D model of some parts of the city prior to the earthquake of 1908 and made a five-minute video, using cartoon techniques, of an "impossible" walk through the city. The fragments of the city were reconstructed from available documentary sources, primarily photographic images, which tended to be of the most important places in the city.
series CAAD Futures
last changed 1999/04/03 17:58
