CumInCAD is a Cumulative Index of publications in Computer Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD futures


Hits 1 to 20 of 247

_id 60e7
authors Bailey, Rohan
year 2000
title The Intelligent Sketch: Developing a Conceptual Model for a Digital Design Assistant
source Eternity, Infinity and Virtuality in Architecture [Proceedings of the 22nd Annual Conference of the Association for Computer-Aided Design in Architecture / 1-880250-09-8] Washington D.C. 19-22 October 2000, pp. 137-145
doi https://doi.org/10.52842/conf.acadia.2000.137
summary The computer is a relatively new tool in the practice of architecture. Since its introduction, there has been a desire amongst designers to use this new tool quite early in the design process. Contrary to this desire, however, most architects today use pen and paper to sketch in the very early stages of design. Architects solve problems by thinking visually, and one of the most important tools the architect has at his disposal in the design process is the hand sketch. This iterative way of testing ideas and informing the design process with images fundamentally directs and aids the architect's decision making. It has been said (Schön and Wiggins 1992) that sketching is about the reflective conversation designers have with images and ideas conveyed by the act of drawing, and that it is highly dependent on feedback. This "conversation" is an area worthy of investigation: understanding it is significant to understanding how we might apply the computer to enhance the designer's ability to capture, manipulate and reflect on ideas during conceptual design. This paper discusses sketching and its relation to design thinking. It explores the conversations that designers engage in with the media they use. This is done through the explanation of a protocol analysis method. Protocol analysis, a method used in the field of psychology, has been used extensively by Eastman et al (starting in the early 70s) to elicit information about design thinking. In the pilot experiment described in this paper, two persons are used: one plays the role of the "hand" while the other is the "mind" - the two elements that are involved in the design "conversation". This variation on classical protocol analysis sets out to discover how "intelligent" the hand should be to enhance design by reflection. The paper describes the procedures entailed in the pilot experiment and the resulting data.
The paper then concludes by discussing future intentions for research and the far-reaching possibilities for use of the computer in architectural studio teaching (as a teaching aid) and as a digital design assistant in conceptual design.
keywords CAAD, Sketching, Protocol Analysis, Design Thinking, Design Education
series ACADIA
last changed 2022/06/07 07:54

_id 3ff5
authors Abbo, I.A., La Scalea, L., Otero, E. and Castaneda, L.
year 1992
title Full-Scale Simulations as Tool for Developing Spatial Design Ability
source Proceedings of the 4th European Full-Scale Modelling Conference / Lausanne (Switzerland) 9-12 September 1992, Part C, pp. 7-10
summary Spatial Design Ability has been defined as the capability to anticipate the effects (psychological impressions on potential observers or users) produced by mental manipulation of elements of architectural or urban spaces. This ability, of great importance in choosing the appropriate option during the design process, is not specifically developed in schools of architecture and is partially obtained as a by-product of drawing, designing or architectural criticism. We use our laboratory as a tool to present spaces to people so that they can evaluate them. By means of a series of exercises, students confront their anticipations with the psychological impressions produced in other people. For this occasion, we present an experience in which students had to propose a space for an exhibition hall in which architectural projects (student theses) were to be shown. Following the Spatial Design Ability Development Model which we have been using for several years, students first get acquainted with the use of evaluation instruments for psychological impressions as well as with research methodology. In this case, due to the short period available, we reduced the research to investigating the effects produced by the manipulation of only two independent variables: students first manipulated the form of the roof, walls and interior elements, and secondly the color and texture of those elements. They evaluated the spatial quality, character and other psychological impressions that these manipulations produced in people. They used three-dimensional scale models at 1/10 and 1/1.
keywords Full-scale Modeling, Model Simulation, Real Environments
series other
email
more http://info.tuwien.ac.at/efa
last changed 2003/08/25 10:12

_id ecaadesigradi2019_449
id ecaadesigradi2019_449
authors Becerra Santacruz, Axel
year 2019
title The Architecture of ScarCity Game - The craft and the digital as an alternative design process
source Sousa, JP, Xavier, JP and Castro Henriques, G (eds.), Architecture in the Age of the 4th Industrial Revolution - Proceedings of the 37th eCAADe and 23rd SIGraDi Conference - Volume 3, University of Porto, Porto, Portugal, 11-13 September 2019, pp. 45-52
doi https://doi.org/10.52842/conf.ecaade.2019.3.045
summary The Architecture of ScarCity Game is a board game used as a pedagogical tool that challenges architecture students by involving them in a series of experimental design sessions to understand the design process under scarcity and the actual relation between the craft and the digital. This means "pragmatic delivery processes and material constraints, where the exchange between the artisan of handmade, representing local skills and technology of the digitally conceived is explored" (Huang 2013). The game focuses on understanding the different variables of the crafted design process of traditional communities under conditions of scarcity (Michel and Bevan 1992). This requires first analyzing the spatial environmental model of interaction, the available human and natural resources, and the dynamic relationship of these variables in a digital era. In the first stage (Pre-Agency), the game sets the concept of the craft by limiting students' design exploration to a minimum possible perspective, developing locally available resources and techniques. The key elements of the design process of traditional knowledge communities have to be identified (Preez 1984). In other words, this stage is driven by limited resources + chance + contingency. In the second stage (Post-Agency), students, taking the architect's role within these communities, have to speculate on and explore the interface between the craft (local knowledge and low technological tools) and the digital, represented by computational data, newly available technologies and construction. This means the introduction of strategy + opportunity + chance as part of the design process. In this sense, the game has a life beyond its mechanics. This other life challenges the participants to exploit the possibilities of breaking the actual boundaries of design. The result is a tool to challenge conventional methods of teaching and learning that control a prescribed design process.
It confronts the rules that professionals in this field take for granted. The game simulates a 'fake' reality by exploring surveyed information in different ways. As a result, participants do not have anything 'real' to lose. Instead, they have all the freedom to innovate and be creative.
keywords Global south, scarcity, low tech, digital-craft, design process and innovation by challenge.
series eCAADeSIGraDi
email
last changed 2022/06/07 07:54

_id 6bff
authors Coyne, Richard
year 1992
title The Role of Metaphor in Understanding Computers in Design
source Mission - Method - Madness [ACADIA Conference Proceedings / ISBN 1-880250-01-2] 1992, pp. 3-11
doi https://doi.org/10.52842/conf.acadia.1992.003
summary The study of metaphor provides valuable insights into the workings of thought and understanding. This chapter addresses the important question of what the study of metaphor has to say about technology, the design process and hence the role of computers in design. The conclusion is that design involves the generation of action within a collaborative environment in which there is the free play of metaphor. A recognition of the close relationship between technology and metaphor provides insights into how to evaluate and develop the effective use of computers in design.

series ACADIA
email
last changed 2022/06/07 07:56

_id e412
authors Fargas, Josep and Papazian, Pegor
year 1992
title Modeling Regulations and Intentions for Urban Development: The Role of Computer Simulation in the Urban Design Studio
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 201-212
doi https://doi.org/10.52842/conf.ecaade.1992.201
summary In this paper we present a strategy for modeling urban development in order to study the role of urban regulations and policies in the transformation of cities. We also suggest a methodology for using computer models as experimental tools in the urban design studio in order to make explicit the factors involved in shaping cities, and for the automatic visualization of projected development. The structure of the proposed model is based on different modules which represent, on the one hand, the rules regulating the physical growth of a city and, on the other hand, heuristics corresponding to different interests such as Real Estate Developers, City Hall Planners, Advocacy and Community Groups, and so on. Here we present a case study dealing with the Boston Redevelopment Authority zoning code for the Midtown Cultural District of Boston. We introduce a computer program which develops the district, adopting a particular point of view regarding urban regulation. We then generalize the notion of this type of computer modeling and simulation, and draw some conclusions about its possible uses in the teaching and practice of design.
series eCAADe
email
last changed 2022/06/07 07:55

_id 83f7
authors Fenves, Stephen J., Flemming, Ulrich and Hendrickson, Craig (et al)
year 1992
title Performance Evaluation in an Integrated Software Environment for Building Design and Construction Planning
source New York: John Wiley & Sons, 1992. pp. 159-169 : ill. includes bibliography
summary In this paper the authors describe the role of performance evaluation in the Integrated Software Environment for Building Design and Construction Planning (IBDE), which is a testbed for examining integration issues in the same domain. Various processes in IBDE deal with the spatial configuration, structural design, and construction planning of high-rise office buildings. Performance evaluations occur within these processes based on different representation schemes and control mechanisms for the handling of performance knowledge. Within this multiprocess environment, opportunities also exist for performance evaluation across disciplines through design critics
keywords evaluation, performance, integration, systems, building, design, construction, architecture, planning, structures, representation, control
series CADline
email
last changed 2003/06/02 10:24

_id 7ce5
authors Gal, Shahaf
year 1992
title Computers and Design Activities: Their Mediating Role in Engineering Education
source Sociomedia, ed. Edward Barrett. MIT Press
summary Sociomedia: With all the new words used to describe electronic communication (multimedia, hypertext, cyberspace, etc.), do we need another one? Edward Barrett thinks we do; hence, he coins the term "sociomedia." It is meant to displace a computing economy in which technicity is hypostasized over sociality. Sociomedia, a compilation of twenty-five articles on the theory, design and practice of educational multimedia and hypermedia, attempts to re-value the communicational face of computing. Value, of course, is "ultimately a social construct." As such, it has everything to do with knowledge, power, education and technology. The projects discussed in this book represent the leading edge of electronic knowledge production in academia (not to mention major funding) and are determining the future of educational media. For these reasons, Sociomedia warrants close inspection. Barrett's introduction sets the tone. For him, designing computer media involves hardwiring a mechanism for the social construction of knowledge (1). He links computing to a process of social and communicative interactivity for constructing and disseminating knowledge. Through a mechanistic mapping of the university as hypercontext (a huge network that includes classrooms as well as services and offices), Barrett models intellectual work in such a way as to avoid "limiting definitions of human nature or human development." Education, then, can remain "where it should be--in the human domain (public and private) of sharing ideas and information through the medium of language." By leaving education in a virtual realm (where we can continue to disagree about its meaning and execution), it remains viral, mutating and contaminating in an intellectually healthy way. He concludes that his mechanistic model, by means of its reductionist approach, preserves value (7). This "value" is the social construction of knowledge.
While I support the social orientation of Barrett's argument, discussions of value are related to power. I am not referring to the traditional teacher-student power structure that is supposedly dismantled through cooperative and constructivist learning strategies. The power to be reckoned with in the educational arena is foundational, that which (pre)determines value and the circulation of knowledge. "Since each of you reading this paragraph has a different perspective on the meaning of 'education' or 'learning,' and on the processes involved in 'getting an education,' think of the hybris in trying to capture education in a programmable function, in a displayable object, in a 'teaching machine'" (7). Actually, we must think about that hybris because it is, precisely, what informs teaching machines. Moreover, the basic epistemological premises that give rise to such productions are too often assumed. In the case of instructional design, the episteme of the cognitive sciences is often taken for granted. It is ironic that many of the "postmodernists" who support electronic hypertextuality seem to have missed Jacques Derrida's and Michel Foucault's "deconstructions" of the epistemology underpinning cognitive sciences (if not of epistemology itself). Perhaps it is the glitz of the technology that blinds some users (qua developers) to the belief systems operating beneath the surface. Barrett is not guilty of reactionary thinking or politics; he is, in fact, quite in line with much American deconstructive and postmodern thinking. The problem arises in that he leaves open the definitions of "education," "learning" and "getting an education." One cannot engage in the production of new knowledge without orienting its design, production and dissemination, and without negotiating with others' orientations, especially where large-scale funding is involved. Notions of human nature and development are structural, even infrastructural, whatever the medium of the teaching machine.
Although he addresses some dynamics of power, money and politics when he talks about the recession and its effects on the conference, they are readily visible dynamics of power (3-4). Where does the critical factor of value determination, of power, of who gets what and why, get mapped onto a mechanistic model of learning institutions? Perhaps a mapping of contributors' institutions, of the funding sources for the projects showcased and for participation in the conference, and of the disciplines receiving funding for these sorts of projects would help visualize the configurations of power operative in the rising field of educational multimedia. Questions of power and money notwithstanding, Barrett's introduction sets the social and textual thematics for the collection of essays. His stress on interactivity, on communal knowledge production, on the society of texts, and on media producers and users is carried forward through the other essays, two of which I will discuss. Section I of the book, "Perspectives...," highlights the foundations, uses and possible consequences of multimedia and hypertextuality. The second essay in this section, "Is There a Class in This Text?," plays on the robust exchange surrounding Stanley Fish's book, Is There a Text in This Class?, which presents an attack on authority in reading. The author, John Slatin, has introduced electronic hypertextuality and interaction into his courses. His article maps the transformations in "the content and nature of work, and the workplace itself"-- which, in this case, is not industry but an English poetry class (25). Slatin discovered an increase of productive and cooperative learning in his electronically-mediated classroom. For him, creating knowledge in the electronic classroom involves interaction between students, instructors and course materials through the medium of interactive written discourse.
These interactions lead to a new and persistent understanding of the course materials and of the participants' relation to the materials and to one another. The work of the course is to build relationships that, in my view, constitute not only the meaning of individual poems, but poetry itself. The class carries out its work in the continual and usually interactive production of text (31). While I applaud his strategies which dismantle traditional hierarchical structures in academia, the evidence does not convince me that the students know enough to ask important questions or to form a self-directing, learning community. Stanley Fish has not relinquished professing, though he, too, espouses the indeterminacy of the sign. By the fourth week of his course, Slatin's input is, by his own reckoning, reduced to 4% (39). In the transcript of the "controversial" Week 6 exchange on Gertrude Stein--the most disliked poet they were discussing at the time (40)--we see the blind leading the blind. One student parodies Stein for three lines and sums up his input with "I like it." Another finds Stein's poetry "almost completey [sic] lacking in emotion or any artistic merit" (emphasis added). On what grounds has this student become an arbiter of "artistic merit"? Another student, after admitting being "lost" during the Wallace Stevens discussion, talks of having more "respect for Stevens' work than Stein's" and adds that Stein's poetry lacks "conceptual significance[, s]omething which people of varied opinion can intelligently discuss without feeling like total dimwits...." This student has progressed from admitted incomprehension of Stevens' work to imposing her (groundless) respect for his work over Stein's. Then, she exposes her real dislike for Stein's poetry: that she (the student) missed the "conceptual significance" and hence cannot, being a person "of varied opinion," intelligently discuss it "without feeling like [a] total dimwit."
Slatin's comment is frightening: "...by this point in the semester students have come to feel increasingly free to challenge the instructor" (41). The students that I have cited are neither thinking critically nor are their preconceptions challenged by student-governed interaction. Thanks to the class format, one student feels self-righteous in her ignorance, and empowered to censure. I believe strongly in student empowerment in the classroom, but only once students have accrued enough knowledge to make informed judgments. Admittedly, Slatin's essay presents only partial data (there are six hundred pages of course transcripts!); still, I wonder how much valuable knowledge and metaknowledge was gained by the students. I also question the extent to which authority and professorial dictature were addressed in this course format. The power structures that make it possible for a college to require such a course, and the choice of texts and pedagogy, were not "on the table." The traditional professorial position may have been displaced, but what took its place?--the authority of consensus with its unidentifiable strong arm, and the faceless reign of software design? Despite Slatin's claim that the students learned about the learning process, there is no evidence (in the article) that the students considered where their attitudes came from, how consensus operates in the construction of knowledge, how power is established and what relationship they have to bureaucratic institutions. How do we, as teaching professionals, negotiate a balance between an enlightened despotism in education and student-created knowledge? Slatin, and other authors in this book, bring this fundamental question to the fore. There is no definitive answer because the factors involved are ultimately social, and hence, always shifting and reconfiguring.
Slatin ends his article with the caveat that computerization can bring about greater estrangement between students, faculty and administration through greater regimentation and control. Of course, it can also "distribute authority and power more widely" (50). Power or authority without a specific face, however, is not necessarily good or just. Shahaf Gal's "Computers and Design Activities: Their Mediating Role in Engineering Education" is found in the second half of the volume, and does not allow for a theory/praxis dichotomy. Gal recounts a brief history of engineering education up to the introduction of Growltiger (GT), a computer-assisted learning aid for design. He demonstrates GT's potential to impact the learning of engineering design by tracking its use by four students in a bridge-building contest. What his text demonstrates clearly is that computers are "inscribing and imaging devices" that add another viewpoint to an on-going dialogue between student, teacher, earlier coursework, and other teaching/learning tools. The less proficient students made a serious error by relying too heavily on the technology, or treating it as a "blueprint provider." They "interacted with GT in a way that trusted the data to represent reality. They did not see their interaction with GT as a negotiation between two knowledge systems" (495). Students who were more thoroughly informed in engineering discourses knew to use the technology as one voice among others--they knew enough not simply to accept the input of the computer as authoritative. The less-advanced students learned a valuable lesson from the competition itself: the fact that their designs were not able to hold up under pressure (literally) brought the fact of their insufficient knowledge crashing down on them (and their bridges). They also had, post factum, several other designs to study, especially the winning one. 
Although competition and comparison are not good pedagogical strategies for everyone (in this case the competitors had volunteered), at some point what we think we know has to be challenged within the society of discourses to which it belongs. Students need critique in order to learn to push their learning into auto-critique. This is what is lacking in Slatin's discussion and in the writings of other avatars of constructivist, collaborative and computer-mediated pedagogies. Obviously there are differences between instrumental types of knowledge acquisition and discursive knowledge accumulation. Indeed, I do not promote the teaching of reading, thinking and writing as "skills" per se (then again, Gal's teaching of design is quite discursive, if not dialogic). Nevertheless, the "soft" sciences might benefit from "bridge-building" competitions or the re-institution of some forms of agonia. Not everything agonistic is inhuman agony--the joy of confronting or creating a sound argument supported by defensible evidence, for example. Students need to know that soundbites are not sound arguments despite predictions that electronic writing will be aphoristic rather than periodic. Just because writing and learning can be conceived of hypertextually does not mean that rigor goes the way of the dinosaur. Rigor and hypertextuality are not mutually incompatible. Nor are rigorous thinking and hard intellectual work unpleasurable, although American anti-intellectualism, especially in the mass media, would make them so. At a time when the spurious dogmatics of a Rush Limbaugh and Holocaust revisionist historians circulate "aphoristically" in cyberspace, and at a time when knowledge is becoming increasingly textualized, the role of critical thinking in education will ultimately determine the value(s) of socially constructed knowledge.
This volume affords the reader an opportunity to reconsider knowledge, power, and new communications technologies with respect to social dynamics and power relationships.
series other
last changed 2003/04/23 15:14

_id 067f
authors Gantt, Michelle and Nardi, Bonnie A.
year 1992
title Gardeners and Gurus: Patterns of Cooperation among CAD Users
source Proceedings of ACM CHI'92 Conference on Human Factors in Computing Systems 1992, pp. 107-117
summary We studied CAD system users to find out how they use the sophisticated customization and extension facilities offered by many CAD products. We found that users of varying levels of expertise collaborate to customize their CAD environments and to create programmatic extensions to their applications. Within a group of users, there is at least one local expert who provides support for other users. We call this person a local developer. The local developer is a fellow domain expert, not a professional programmer, outside technical consultant or MIS staff member. We found that in some CAD environments the support role has been formalized so that local developers are given official recognition, and time and resources to pursue local developer activities. In general, this formalization of the local developer role appears successful. We discuss the implications of our findings for work practices and for software design.
keywords Cooperative Work; End User Programming
series other
last changed 2002/07/07 16:01

_id a081
authors Greenberg S., Roseman M. and Webster, D.
year 1992
title Issues and Experiences Designing and Implementing Two Group Drawing Tools
source Readings in Groupware, 609-620
summary Groupware designers are now developing multi-user equivalents of popular paint and draw applications. Their job is not an easy one. First, human factors issues peculiar to group interaction appear that, if ignored, seriously limit the usability of the group tool. Second, implementation is fraught with considerable hurdles. This paper describes the issues and experiences we have met and handled in the design of two systems supporting remote real time group interaction: GroupSketch, a multi-user sketchpad; and GroupDraw, an object-based multi-user draw package. On the human factors side, we summarize empirically-derived design principles that we believe are critical to building useful and usable collaborative drawing tools. On the implementation side, we describe our experiences with replicated versus centralized architectures, schemes for participant registration, multiple cursors, network requirements, and the structure of the drawing primitives.
series other
last changed 2003/04/23 15:50

_id 6cfd
authors Harfmann, Anton C. and Majkowski, Bruce R.
year 1992
title Component-Based Spatial Reasoning
source Mission - Method - Madness [ACADIA Conference Proceedings / ISBN 1-880250-01-2] 1992, pp. 103-111
doi https://doi.org/10.52842/conf.acadia.1992.103
summary The design process and ordering of individual components through which architecture is realized relies on the use of abstract "models" to represent a proposed design. The emergence and use of these abstract "models" for building representation has a long history and tradition in the field of architecture. Models have been made and continue to be made for the patron, occasionally the public, and as a guide for the builders. Models have also been described as a means to reflect on the design and to allow the design to be in dialogue with the creator.

The term "model" in the above paragraph has been used in various ways and in this context is defined as any representation through which design intent is expressed. This includes accurate/rational or abstract drawings (2-dimensional and 3-dimensional), physical models (realistic and abstract) and computer models (solid, void and virtual reality). The various models that fall within the categories above have been derived from the need to "view" the proposed design in various ways in order to support intuitive reasoning about the proposal and for evaluation purposes. For example, a 2-dimensional drawing of a floor plan is well suited to support reasoning about spatial relationships and circulation patterns, while scaled 3-dimensional models facilitate reasoning about overall form, volume, light, massing etc. However, the common denominator of all architectural design projects (if the intent is to construct them in actual scale, physical form) is the discrete building elements from which the design will be constructed. It is proposed that a single computational model representing individual components supports all of the above "models" and facilitates "viewing" the design according to the frame of reference of the viewer.

Furthermore, it is the position of the authors that all reasoning stems from this rudimentary level of modeling individual components.

The concept of component representation has been derived from the fact that a "real" building (made from individual components such as nuts, bolts and bar joists) can be "viewed" differently according to the frame of reference of the viewer. Each individual has the ability to infer and abstract from the assemblies of components a variety of different "models" ranging from a visceral, experiential understanding to a very technical, physical understanding. The component concept has already proven to be a valuable tool for reasoning about assemblies, interferences between components, tracing of load path and numerous other component related applications. In order to validate the component-based modeling concept this effort will focus on the development of spatial understanding from the component-based model. The discussions will, therefore, center about the representation of individual components and the development of spatial models and spatial reasoning from the component model. In order to frame the argument that spatial modeling and reasoning can be derived from the component representation, a review of the component-based modeling concept will precede the discussions of spatial issues.

series ACADIA
email
last changed 2022/06/07 07:49

_id abce
authors Ishii, Hiroshi and Kobayashi, Minoru
year 1992
title ClearBoard: A Seamless Medium for Shared Drawing and Conversation with Eye Contact
source Proceedings of ACM CHI'92 Conference on Human Factors in Computing Systems 1992, pp. 525-532
summary This paper introduces a novel shared drawing medium called ClearBoard. It realizes (1) a seamless shared drawing space and (2) eye contact to support realtime and remote collaboration by two users. We devised the key metaphor: "talking through and drawing on a transparent glass window" to design ClearBoard. A prototype of ClearBoard is implemented based on the "Drafter-Mirror" architecture. This paper first reviews previous work on shared drawing support to clarify the design goals. We then examine three metaphors that fulfill these goals. The design requirements and the two possible system architectures of ClearBoard are described. Finally, some findings gained through the experimental use of the prototype, including the feature of "gaze awareness", are discussed.
series other
last changed 2002/07/07 16:01

_id 49bf
authors Johnson, Robert E.
year 1992
title Design Inquiry and Resource Allocation
source New York: John Wiley & Sons, 1992. pp. 51-65 : ill. tables. includes bibliography
summary This paper proposes that the primary role of resource allocation in design is to assist design decision makers in ordering preferences and exploring trade-offs. Most existing cost evaluation paradigms focus on assessing costs after design decisions are made. This view unnecessarily restricts the active participation of economic knowledge in design decision-making. The approach described in this research suggests that the exploration and definition of values and preferences should be the major focus of economic analysis within the design process. A conceptual framework for this approach is presented along with several examples that illustrate the use of this framework. Computational approaches are suggested which play a central role in clarifying preferences and exploring trade-offs during design.
keywords economics, architecture, building, construction, resource allocation, design, cost, evaluation
series CADline
last changed 2003/06/02 13:58

_id 65aa
authors Madrazo, Leandro
year 1992
title From Sketches to Computer Images: A Strategy for the Application of Computers in Architectural Design
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 331-350
doi https://doi.org/10.52842/conf.ecaade.1992.331
summary The use of computer tools in architectural practice has been steadily increasing in recent years. Many architectural offices are already using computer tools, mostly for production tasks; hardly any design is being done with the computer. With the new computer tools, architects are confronted with the challenge of using computers to express their design ideas right from conception.

This paper describes a project made for a competition which recently took place in Spain. Sketches and computer models were the only tools used in designing this project. A variety of computer tools were used in different stages of this project: two dimensional drawing tools were used in the early stages, then a three-dimensional modeling program for the development of the design and for the production of final drawings, and a rendering program for final presentation images.

series eCAADe
email
last changed 2022/06/07 07:59

_id a582
authors Marshall, Tony B.
year 1992
title The Computer as a Graphic Medium in Conceptual Design
source Mission - Method - Madness [ACADIA Conference Proceedings / ISBN 1-880250-01-2] 1992, pp. 39-47
doi https://doi.org/10.52842/conf.acadia.1992.039
summary The success CAD has experienced in the architectural profession demonstrates that architects have been willing to replace traditional drafting media with computers and electronic plotters for the production of working drawings. Its expanded use in the design development phase for 3D modeling and rendering further justifies CAD's usefulness as a presentation medium. The schematic design phase, however, has hardly been influenced by the evolution of CAD. Most architects simply have not come to view the computer as a viable design medium. One reason for this might be that architectural CAD corresponds strongly to plan-view graphics as used in working drawings, but only weakly to plan-view graphics as used in schematic design. The role of the actual graphic medium during schematic design should not be overlooked in the development of CAD applications.

In order to produce practical CAD applications for schematic design we must explore the computer’s potential as a form of expression and its role as a graphic medium. An examination of the use of traditional graphic media during schematic design will provide some clues regarding what capabilities CAD must provide and how a system should operate in order to be useful during conceptual design.

series ACADIA
last changed 2022/06/07 07:59

_id 2c22
authors O'Neill, Michael J.
year 1992
title Neural Network Simulation as a Computer-Aided Design Tool for Predicting Wayfinding Performance
source New York: John Wiley & Sons, 1992. pp. 347-366 : ill. includes bibliography
summary Complex public facilities such as libraries, hospitals, and governmental buildings often present problems to users who must find their way through them. Research shows that difficulty in wayfinding has costs in terms of time, money, public safety, and stress that results from being lost. While a wide range of architectural research supports the notion that ease of wayfinding should be a criterion for good design, architects have no method for evaluating how well their building designs will support the wayfinding task. People store and retrieve information about the layout of the built environment in a knowledge representation known as the cognitive map. People depend on the information stored in the cognitive map to find their way through buildings. Although there are numerous simulations of the cognitive map, the mechanisms of these models are not constrained by what is known about the neurophysiology of the brain. Rather, these models incorporate search mechanisms that act on semantically encoded information about the environment. In this paper the author describes the evaluation and application of an artificial neural network simulation of the cognitive map as a means of predicting wayfinding behavior in buildings. This simulation is called NAPS-PC (Network Activity Processing Simulator--PC version). This physiologically plausible model represents knowledge about the layout of the environment through a network of inter-connected processing elements. The performance of NAPS-PC was evaluated against actual human wayfinding performance. The study found that the simulation generated behavior that matched the performance of human participants. After the validation, NAPS-PC was modified so that it could read environmental information directly from AutoCAD (a popular micro-computer-based CAD software package) drawing files, and perform 'wayfinding' tasks based on that environmental information. 
This prototype tool, called AutoNet, is conceptualized as a means of allowing designers to predict the wayfinding performance of users in a building before it is actually built.
keywords simulation, cognition, neural networks, evaluation, floor plans, applications, wayfinding, layout, building
series CADline
last changed 2003/06/02 13:58

_id 975e
authors Pearce, M. and Goel, A. (et al.)
year 1992
title Case-Based Design support: A case study in architectural design
source IEEE Expert 7(5): 14-20
summary Archie, a small computer-based library of architectural design cases, is described. Archie helps architects in the high-level task of conceptual design as opposed to low-level tasks such as drawing and drafting, numerical calculations, and constraint propagation. Archie goes beyond supporting architects in design proposal and critiquing. It acts as a shared external memory that supports two kinds of design collaboration. First, by including enough knowledge about the goals, plans, outcomes, and lessons of past cases, it lets the designer access the work of previous architects. Second, by providing access to the perspectives of domain experts via the domain models, Archie helps architects anticipate and accommodate experts' views on evolving designs. The lessons learned about building large case-based systems to support real-world decision making in developing Archie are discussed.
series journal paper
last changed 2003/04/23 15:14

_id ddss9210
id ddss9210
authors Poortman, E.R.
year 1993
title Ratios for cost control
source Timmermans, Harry (Ed.), Design and Decision Support Systems in Architecture (Proceedings of a conference held in Mierlo, the Netherlands in July 1992), ISBN 0-7923-2444-7
summary The design of buildings takes place in phases representing a development from rough to precision planning. Estimates are made in order to test whether the result is still within the budget set by the client or developer. In this way, the decisions taken during the design phase can be quantified and expressed in monetary terms. To prevent blaming the wrong person when an overrun is discovered, the cost control process has to be improved. For that purpose, two new procedures have been developed: (i) a new translation activity; and (ii) ratios by which quantities can be characterized. 'Translation' is the opposite of estimation. A monetary budget is converted - 'translated' - into quantities, reflecting the desired quality of the building materials. The financial constraints of the client are thus converted into quantities - the building components used by the designers. Characteristic quantity figures play an important role in this activity. In working out an estimate, the form factor (i.e., the ratio between two characteristic values of a building component) has to be determined. The unit cost is then tested against that ratio. The introduction of the 'translation' activity and the use of characteristic quantity figures and form factors enhance existing estimation methods. By implementing these procedures, cost control becomes considerably more reliable.
series DDSS
last changed 2003/08/07 16:36

_id a302
authors Saggio, Antonino
year 1992
title A New Tool for Studio Teaching - Object Based Modeling
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 251-264
doi https://doi.org/10.52842/conf.ecaade.1992.251
summary The scope of this paper is to present Computer Aided Architectural Design (and more particularly the dynamic and incremental modeling characteristics of Object Based Modeling) as a tool to reinforce the teaching of architectural design. Utilized within a method based on a cyclical application of "Concept and Testing", OBM has the possibility to work as an amplifier of design ideas and as a meaningful tool for the advancement of architectural design. Three related experiences support this hypothesis: the role played in concrete designs by an Object Based Modeling environment; teaching with CAAD and OBM in the realm of documentation and analysis of architecture; and previous applications of the Concept-Testing methodology in design studios. The central sections of the paper focus on the analysis of these experiences, while the last section provides a 15-week, semester-based studio structure that incorporates OBM in the overall calendar and in key assignments. While the scope of this work coincides with the thesis presented at the ACADIA '92 conference in Charleston (South Carolina), the content, text and illustrations differ in several parts in order to focus the argument more clearly.

series eCAADe
email
last changed 2022/06/07 07:56

_id c93a
authors Saggio, Antonino
year 1992
title Object Based Modeling and Concept-Testing: A Framework for Studio Teaching
source Mission - Method - Madness [ACADIA Conference Proceedings / ISBN 1-880250-01-2] 1992, pp. 49-63
doi https://doi.org/10.52842/conf.acadia.1992.049
summary This chapter concludes with a proposal for a studio structure that incorporates computers as a creative stimulus in the design process. Three related experiences support this hypothesis: the role played in concrete designs by an Object Based Modeling environment, teaching with Computer Aided Architectural Design and OBM in the realm of documentation and analysis of architecture, previous applications of the Concept-Testing methodology in design studios. Examples from these three areas provide the framework for mutual support between OBM and a C-T approach for studio teaching. The central sections of the chapter focus on the analysis of these experiences, while the last section provides a 15 week, semester based, studio structure that incorporates OBM in the overall calendar and in key assignments.

series ACADIA
email
last changed 2022/06/07 07:56

_id 84e6
authors Seebohm, Thomas
year 1995
title A Response to William J. Mitchell's review of Possible Palladian Villas, by George Hersey and Richard Freedman, MIT Press, 1992
source AA Files ( Journal of the Architectural Association School of Architecture), No. 30, Autumn, 1995, pp. 109 - 111
summary A review by William J. Mitchell, entitled "Franchising Architectural Styles", appeared in AA Files no. 26 (Autumn 1993). It reflects a collision between two fundamentally opposing points of view, one held by the reviewer, the other by the reviewed, which determine our expectations of the role of computers in architectural design.

series journal paper
email
last changed 2003/05/15 21:45
