CumInCAD is a Cumulative Index of publications in Computer Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD Futures.


Hits 1 to 20 of 246

_id caadria2014_071
id caadria2014_071
authors Li, Lezhi; Renyuan Hu, Meng Yao, Guangwei Huang and Ziyu Tong
year 2014
title Sculpting the Space: A Circulation Based Approach to Generative Design in a Multi-Agent System
source Rethinking Comprehensive Design: Speculative Counterculture, Proceedings of the 19th International Conference on Computer-Aided Architectural Design Research in Asia (CAADRIA 2014) / Kyoto 14-16 May 2014, pp. 565–574
doi https://doi.org/10.52842/conf.caadria.2014.565
summary This paper discusses an MAS (multi-agent system) based approach to generating architectural spaces that afford better modes of human movement. To achieve this, a pedestrian simulation is carried out to record data on human spatial experience during the walking process. Unlike common practices of performance-oriented generation, where final results are achieved through cycles of simulation and comparison, what we propose here is to let human movement exert direct influence on space. We make this possible by asking "humans" to project simulation data onto their architectural surroundings, thus causing the layout to change so as to afford what we designate as good spatial experiences. A generation experiment for an exhibition space is implemented to explore this approach, in which tentative rules of such spatial manipulation are proposed and tested through space syntax analysis. As the results suggest, by looking at spatial layouts through the lens of human behaviour, this projection-and-generation method provides insight into space qualities that other methods could not have offered.
keywords Performance oriented generative design; projection; multi-agent system; pedestrian simulation; space syntax
series CAADRIA
email
last changed 2022/06/07 07:59
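
The approach summarised above lets recorded agent movement act back on the layout. The toy model below is a hypothetical stand-in, not the authors' implementation: its grid, blocked-cell "pressure" measure and erosion rule are invented purely for illustration.

```python
# Minimal sketch of circulation-driven space adjustment (hypothetical model,
# not the paper's implementation): agents record where movement is obstructed,
# and the most "pressured" solid cells are carved away to widen circulation.
import random

W, H, STEPS, AGENTS = 20, 12, 200, 30
random.seed(0)
entry = (0, H // 2)
solid = {(x, y) for x in range(W) for y in range(H)
         if random.random() < 0.25 and (x, y) != entry}
pressure = {}  # solid cell -> how often agents were blocked next to it

def neighbours(cell):
    x, y = cell
    return [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
            if 0 <= x + dx < W and 0 <= y + dy < H]

for _ in range(AGENTS):
    pos = entry
    for _ in range(STEPS):
        free = [n for n in neighbours(pos) if n not in solid]
        for blocked in (n for n in neighbours(pos) if n in solid):
            pressure[blocked] = pressure.get(blocked, 0) + 1  # record discomfort
        if not free:
            break
        free.sort(key=lambda c: (-c[0], random.random()))  # drift toward the exit side
        pos = free[0]
        if pos[0] == W - 1:
            break

# "Projection" step: erode the most pressured solid cells.
for cell, _ in sorted(pressure.items(), key=lambda kv: -kv[1])[:15]:
    solid.discard(cell)

print(f"{len(pressure)} obstacle cells recorded pressure; "
      f"{len(solid)} solid cells remain after erosion")
```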

_id 6ef4
authors Carrara, Gianfranco and Kalay, Yehuda E.
year 1992
title Multi-Model Representation of Design Knowledge
source Mission - Method - Madness [ACADIA Conference Proceedings / ISBN 1-880250-01-2] 1992, pp. 77-88
doi https://doi.org/10.52842/conf.acadia.1992.077
summary Explicit representation of design knowledge is needed if scientific methods are to be applied in design research, and if computers are to be used in aid of design education and practice. The representation of knowledge in general, and of design knowledge in particular, has been the subject matter of computer science, design methods, and computer-aided design research for quite some time. Several models of design knowledge representation have been developed over the last 30 years, addressing specific aspects of the problem. This paper describes a different approach to design knowledge representation that recognizes the multi-modal nature of design knowledge. It uses a variety of computational tools to encode different kinds of design knowledge, including the descriptive (objects), the prescriptive (goals) and the operational (methods) kinds. The representation is intended to form a parsimonious, communicable and presentable knowledge base that can be used as a tool for design research and education as well as for CAAD.
keywords Design Methods, Design Process, Goals, Knowledge Representation, Semantic Networks
series ACADIA
email
last changed 2022/06/07 07:55

_id 7ce5
authors Gal, Shahaf
year 1992
title Computers and Design Activities: Their Mediating Role in Engineering Education
source Sociomedia, ed. Edward Barrett. MIT Press
summary Sociomedia: With all the new words used to describe electronic communication (multimedia, hypertext, cyberspace, etc.), do we need another one? Edward Barrett thinks we do; hence, he coins the term "sociomedia." It is meant to displace a computing economy in which technicity is hypostasized over sociality. Sociomedia, a compilation of twenty-five articles on the theory, design and practice of educational multimedia and hypermedia, attempts to re-value the communicational face of computing. Value, of course, is "ultimately a social construct." As such, it has everything to do with knowledge, power, education and technology. The projects discussed in this book represent the leading edge of electronic knowledge production in academia (not to mention major funding) and are determining the future of educational media. For these reasons, Sociomedia warrants close inspection. Barrett's introduction sets the tone. For him, designing computer media involves hardwiring a mechanism for the social construction of knowledge (1). He links computing to a process of social and communicative interactivity for constructing and disseminating knowledge. Through a mechanistic mapping of the university as hypercontext (a huge network that includes classrooms as well as services and offices), Barrett models intellectual work in such a way as to avoid "limiting definitions of human nature or human development." Education, then, can remain "where it should be--in the human domain (public and private) of sharing ideas and information through the medium of language." By leaving education in a virtual realm (where we can continue to disagree about its meaning and execution), it remains viral, mutating and contaminating in an intellectually healthy way. He concludes that his mechanistic model, by means of its reductionist approach, preserves value (7). This "value" is the social construction of knowledge. While I support the social orientation of Barrett's argument, discussions of value are related to power. I am not referring to the traditional teacher-student power structure that is supposedly dismantled through cooperative and constructivist learning strategies. The power to be reckoned with in the educational arena is foundational, that which (pre)determines value and the circulation of knowledge. "Since each of you reading this paragraph has a different perspective on the meaning of 'education' or 'learning,' and on the processes involved in 'getting an education,' think of the hybris in trying to capture education in a programmable function, in a displayable object, in a 'teaching machine'" (7). Actually, we must think about that hybris because it is, precisely, what informs teaching machines. Moreover, the basic epistemological premises that give rise to such productions are too often assumed. In the case of instructional design, the episteme of cognitive sciences is often taken for granted. It is ironic that many of the "postmodernists" who support electronic hypertextuality seem to have missed Jacques Derrida's and Michel Foucault's "deconstructions" of the epistemology underpinning cognitive sciences (if not of epistemology itself). Perhaps it is the glitz of the technology that blinds some users (qua developers) to the belief systems operating beneath the surface. Barrett is not guilty of reactionary thinking or politics; he is, in fact, quite in line with much American deconstructive and postmodern thinking.
The problem arises in that he leaves open the definitions of "education," "learning" and "getting an education." One cannot engage in the production of new knowledge without orienting its design, production and dissemination, and without negotiating with others' orientations, especially where large-scale funding is involved. Notions of human nature and development are structural, even infrastructural, whatever the medium of the teaching machine. Although he addresses some dynamics of power, money and politics when he talks about the recession and its effects on the conference, they are readily visible dynamics of power (3-4). Where does the critical factor of value determination, of power, of who gets what and why, get mapped onto a mechanistic model of learning institutions? Perhaps a mapping of contributors' institutions, of the funding sources for the projects showcased and for participation in the conference, and of the disciplines receiving funding for these sorts of projects would help visualize the configurations of power operative in the rising field of educational multimedia. Questions of power and money notwithstanding, Barrett's introduction sets the social and textual thematics for the collection of essays. His stress on interactivity, on communal knowledge production, on the society of texts, and on media producers and users is carried forward through the other essays, two of which I will discuss. Section I of the book, "Perspectives...," highlights the foundations, uses and possible consequences of multimedia and hypertextuality. The second essay in this section, "Is There a Class in This Text?," plays on the robust exchange surrounding Stanley Fish's book, Is There a Text in This Class?, which presents an attack on authority in reading. The author, John Slatin, has introduced electronic hypertextuality and interaction into his courses. His article maps the transformations in "the content and nature of work, and the workplace itself"-- which, in this case, is not industry but an English poetry class (25). Slatin discovered an increase in productive and cooperative learning in his electronically-mediated classroom. For him, creating knowledge in the electronic classroom involves interaction between students, instructors and course materials through the medium of interactive written discourse. These interactions lead to a new and persistent understanding of the course materials and of the participants' relation to the materials and to one another. The work of the course is to build relationships that, in my view, constitute not only the meaning of individual poems, but poetry itself. The class carries out its work in the continual and usually interactive production of text (31). While I applaud his strategies which dismantle traditional hierarchical structures in academia, the evidence does not convince me that the students know enough to ask important questions or to form a self-directing learning community. Stanley Fish has not relinquished professing, though he, too, espouses the indeterminacy of the sign. By the fourth week of his course, Slatin's input is, by his own reckoning, reduced to 4% (39). In the transcript of the "controversial" Week 6 exchange on Gertrude Stein--the most disliked poet they were discussing at the time (40)--we see the blind leading the blind. One student parodies Stein for three lines and sums up his input with "I like it." Another finds Stein's poetry "almost completey [sic] lacking in emotion or any artistic merit" (emphasis added).
On what grounds has this student become an arbiter of "artistic merit"? Another student, after admitting being "lost" during the Wallace Stevens discussion, talks of having more "respect for Stevens' work than Stein's" and adds that Stein's poetry lacks "conceptual significance[, s]omething which people of varied opinion can intelligently discuss without feeling like total dimwits...." This student has progressed from admitted incomprehension of Stevens' work to imposing her (groundless) respect for his work over Stein's. Then, she exposes her real dislike for Stein's poetry: that she (the student) missed the "conceptual significance" and hence cannot, being a person "of varied opinion," intelligently discuss it "without feeling like [a] total dimwit." Slatin's comment is frightening: "...by this point in the semester students have come to feel increasingly free to challenge the instructor" (41). The students that I have cited are neither thinking critically nor are their preconceptions challenged by student-governed interaction. Thanks to the class format, one student feels self-righteous in her ignorance, and empowered to censure. I believe strongly in student empowerment in the classroom, but only once students have accrued enough knowledge to make informed judgments. Admittedly, Slatin's essay presents only partial data (there are six hundred pages of course transcripts!); still, I wonder how much valuable knowledge and metaknowledge was gained by the students. I also question the extent to which authority and professorial dictature were addressed in this course format. The power structures that make it possible for a college to require such a course, and the choice of texts and pedagogy, were not "on the table." The traditional professorial position may have been displaced, but what took its place?--the authority of consensus with its unidentifiable strong arm, and the faceless reign of software design? Despite Slatin's claim that the students learned about the learning process, there is no evidence (in the article) that the students considered where their attitudes came from, how consensus operates in the construction of knowledge, how power is established and what relationship they have to bureaucratic institutions. How do we, as teaching professionals, negotiate a balance between an enlightened despotism in education and student-created knowledge? Slatin, and other authors in this book, bring this fundamental question to the fore. There is no definitive answer because the factors involved are ultimately social, and hence, always shifting and reconfiguring. Slatin ends his article with the caveat that computerization can bring about greater estrangement between students, faculty and administration through greater regimentation and control. Of course, it can also "distribute authority and power more widely" (50). Power or authority without a specific face, however, is not necessarily good or just. Shahaf Gal's "Computers and Design Activities: Their Mediating Role in Engineering Education" is found in the second half of the volume, and does not allow for a theory/praxis dichotomy. Gal recounts a brief history of engineering education up to the introduction of Growltiger (GT), a computer-assisted learning aid for design. He demonstrates GT's potential to impact the learning of engineering design by tracking its use by four students in a bridge-building contest.
What his text demonstrates clearly is that computers are "inscribing and imaging devices" that add another viewpoint to an on-going dialogue between student, teacher, earlier coursework, and other teaching/learning tools. The less proficient students made a serious error by relying too heavily on the technology, or treating it as a "blueprint provider." They "interacted with GT in a way that trusted the data to represent reality. They did not see their interaction with GT as a negotiation between two knowledge systems" (495). Students who were more thoroughly informed in engineering discourses knew to use the technology as one voice among others--they knew enough not simply to accept the input of the computer as authoritative. The less-advanced students learned a valuable lesson from the competition itself: the fact that their designs were not able to hold up under pressure (literally) brought the fact of their insufficient knowledge crashing down on them (and their bridges). They also had, post factum, several other designs to study, especially the winning one. Although competition and comparison are not good pedagogical strategies for everyone (in this case the competitors had volunteered), at some point what we think we know has to be challenged within the society of discourses to which it belongs. Students need critique in order to learn to push their learning into auto-critique. This is what is lacking in Slatin's discussion and in the writings of other avatars of constructivist, collaborative and computer-mediated pedagogies. Obviously there are differences between instrumental types of knowledge acquisition and discursive knowledge accumulation. Indeed, I do not promote the teaching of reading, thinking and writing as "skills" per se (then again, Gal's teaching of design is quite discursive, if not dialogic). Nevertheless, the "soft" sciences might benefit from "bridge-building" competitions or the re-institution of some forms of agonia. Not everything agonistic is inhuman agony--the joy of confronting or creating a sound argument supported by defensible evidence, for example. Students need to know that soundbites are not sound arguments despite predictions that electronic writing will be aphoristic rather than periodic. Just because writing and learning can be conceived of hypertextually does not mean that rigor goes the way of the dinosaur. Rigor and hypertextuality are not mutually incompatible. Nor are rigorous thinking and hard intellectual work unpleasurable, although American anti-intellectualism, especially in the mass media, would make it so. At a time when the spurious dogmatics of a Rush Limbaugh and Holocaust revisionist historians circulate "aphoristically" in cyberspace, and at a time when knowledge is becoming increasingly textualized, the role of critical thinking in education will ultimately determine the value(s) of socially constructed knowledge. This volume affords the reader an opportunity to reconsider knowledge, power, and new communications technologies with respect to social dynamics and power relationships.
series other
last changed 2003/04/23 15:14

_id ecaade2009_138
id ecaade2009_138
authors Kozikoglu, Nilüfer; Erdogan, Meral; Nircan, Ahmet Kutsi; Özsel Akipek, Fulya
year 2009
title Collective Design Network: Systems Thinking (Event-Pattern-Structures) and System Dynamics Modelling as a Design Concept and Strategy
source Computation: The New Realm of Architectural Design [27th eCAADe Conference Proceedings / ISBN 978-0-9541183-8-9] Istanbul (Turkey) 16-19 September 2009, pp. 533-540
doi https://doi.org/10.52842/conf.ecaade.2009.533
wos WOS:000334282200064
summary This paper will relay the initial phase of a collaborative work among partners from the design discipline, systems engineering, and software engineering that deals with the interrelations of “network idea”, “systems thinking”, “collective design”, and “computation”. Vensim – a system dynamics modelling tool developed by Ventana Systems, Inc. in 1992 – has been used in an experimental first-year design studio to engage students in systems thinking in the architectural design environment. It has been observed that this tool enabled most students to develop a multi-layered, complex and more controlled design logic and to amplify their cognitive processes at the beginning of design education. We conclude that, in order to fully realize systems thinking in the design process, new ways of integrating parametric design environments and system dynamics modelling environments need to be investigated.
keywords Design network, system dynamics, dynamic pattern, collectivity, integration
series eCAADe
email
last changed 2022/06/07 07:51
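
Vensim is a stock-and-flow (system dynamics) simulator. The toy model below, with invented variables and rates rather than the studio's actual model, sketches the kind of feedback loop such a tool integrates, here with a simple Euler step.

```python
# Toy system-dynamics model in the spirit of a stock/flow diagram (hypothetical
# variables): visitors accumulate in a space, crowding reduces the arrival
# rate, and a fixed fraction of visitors leaves at every step.
def simulate(hours=12.0, dt=0.25, capacity=300.0):
    visitors = 0.0                     # stock
    history = []
    t = 0.0
    while t <= hours:
        crowding = visitors / capacity              # auxiliary variable
        arrivals = 60.0 * max(0.0, 1.0 - crowding)  # inflow (people/hour)
        departures = 0.3 * visitors                 # outflow (people/hour)
        history.append((round(t, 2), round(visitors, 1)))
        visitors += (arrivals - departures) * dt    # Euler integration
        t += dt
    return history

for t, v in simulate()[::8]:
    print(f"t={t:5.2f} h  visitors={v}")
```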

_id ddss9215
id ddss9215
authors Mortola, E. and Giangrande, A.
year 1993
title A trichotomic segmentation procedure to evaluate projects in architecture
source Timmermans, Harry (Ed.), Design and Decision Support Systems in Architecture (Proceedings of a conference held in Mierlo, the Netherlands in July 1992), ISBN 0-7923-2444-7
summary This paper illustrates a model used to construct the evaluation module for An Interface for Designing (AID), a system to aid architectural design. The model can be used at the end of every cycle of analysis-synthesis-evaluation in the intermediate phases of design development. With the aid of the model it is possible to evaluate the quality of a project in overall terms to establish whether the project is acceptable, whether it should be elaborated ex-novo, or whether it is necessary to begin a new cycle to improve it. In this last case, it is also possible to evaluate the effectiveness of the possible actions and strategies for improvement. The model is based on a procedure of trichotomic segmentation, developed with MCDA (Multi-Criteria Decision Aid), which uses the outranking relation to compare the project with some evaluation profiles taken as projects of reference. An application of the model in the teaching field will also be described.
series DDSS
last changed 2003/08/07 16:36
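
The trichotomic segmentation described above rests on an outranking relation (MCDA). The sketch below is a loose, hypothetical approximation of that idea: a project is compared criterion by criterion against a 'low' and a 'high' reference profile and classed as rework / improve / accept. The criteria, profiles and threshold are invented; this is not the authors' actual procedure.

```python
# Hypothetical sketch of trichotomic segmentation by outranking: the share of
# criteria on which a project does at least as well as a reference profile
# decides whether it "outranks" that profile, and hence its class.
LOW  = {"daylight": 0.3, "circulation": 0.4, "cost": 0.5}   # reference profile A
HIGH = {"daylight": 0.7, "circulation": 0.7, "cost": 0.8}   # reference profile B
CONCORDANCE_THRESHOLD = 2 / 3    # fraction of criteria needed to outrank

def outranks(project, profile):
    wins = sum(1 for c in profile if project[c] >= profile[c])
    return wins / len(profile) >= CONCORDANCE_THRESHOLD

def classify(project):
    if outranks(project, HIGH):
        return "acceptable"
    if outranks(project, LOW):
        return "start a new improvement cycle"
    return "rework ex novo"

design = {"daylight": 0.6, "circulation": 0.8, "cost": 0.55}
print(classify(design))   # -> "start a new improvement cycle"
```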

_id 46c7
id 46c7
authors Ozel, Filiz
year 1992
title Data Modeling Needs of Life Safety Code (LSC) Compliance Applications
source Mission - Method - Madness [ACADIA Conference Proceedings / ISBN 1-880250-01-2] 1992, pp. 177-185
doi https://doi.org/10.52842/conf.acadia.1992.177
summary One of the most complex code compliance issues originates from the conformance of designs to the Life Safety Code (NFPA 101). The development of computer-based code compliance checking programs has attracted the attention of building researchers and practitioners alike. These studies represent a number of approaches ranging from CAD-based procedural approaches to rule-based, non-graphic ones, but they do not address the interaction of the rule base of such systems with graphic databases that define the geometry of architectural objects. Automatic extraction of the attributes and the configuration of building systems requires "architectural object - graphic entity" data models that allow access and retrieval of the necessary data for code compliance checking. This study aims to specifically focus on the development of such a data model through the use of the AutoLISP feature of the AutoCAD (Autodesk Inc.) graphic system. This data model is intended to interact with a Life Safety Code rule base created with the Level5-Object (Focus Inc.) expert system.

Assuming the availability of a more general building data model, one must define life and fire safety features of a building before any automatic checking can be performed. Object oriented data structures are beginning to be applied to design objects, since they allow the type versatility demanded by design applications. As one generates a functional view of the main data model, the software user must provide domain specific information. A functional view is defined as the process of generating domain specific data structures from a more general purpose data model, such as defining egress routes from wall or room object data structure. Typically in the early design phase of a project, these are related to the emergency egress design features of a building. Certain decisions such as where to provide sprinkler protection or the location of protected egress ways must be made early in the process.

series ACADIA
email
last changed 2022/06/07 08:00
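
The data-modelling need described above (linking graphic entities to architectural objects so that a rule base can query them) can be hinted at with a small hypothetical structure. The original work used AutoLISP and a Level5-Object rule base; nothing below reproduces those tools, and the occupant-load rule and numbers are placeholders, not the Life Safety Code.

```python
# Hypothetical illustration of an "architectural object <-> graphic entity"
# mapping used by a toy egress rule; the original system was built with
# AutoLISP and a Level5-Object rule base, not Python.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class GraphicEntity:
    """What the CAD layer knows: a handle, a layer name and raw geometry."""
    handle: str
    layer: str
    points: List[tuple]

@dataclass
class Room:
    """What the rule base needs: attributes plus a link back to the graphics."""
    name: str
    area_m2: float
    exits: List[str] = field(default_factory=list)
    geometry: Optional[GraphicEntity] = None

def check_exit_count(room: Room, load_factor: float = 1.4,
                     two_exit_threshold: int = 50):
    """Toy rule: rooms above an occupant threshold need at least two exits."""
    occupants = room.area_m2 / load_factor
    required = 2 if occupants > two_exit_threshold else 1
    return len(room.exits) >= required, occupants, required

hall = Room("exhibition hall", area_m2=120.0, exits=["D1"],
            geometry=GraphicEntity("2F3A", "A-AREA",
                                   [(0, 0), (12, 0), (12, 10), (0, 10)]))
ok, occupants, required = check_exit_count(hall)
print(f"{hall.name}: ~{occupants:.0f} occupants, needs {required} exit(s) -> "
      f"{'passes' if ok else 'fails'}")
```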

_id ddss9212
id ddss9212
authors Prins, M., Bax, M.F.TH., Carp, J.C. and Tempelmans Plat, H.
year 1993
title A design decision support system for building flexibility and costs
source Timmermans, Harry (Ed.), Design and Decision Support Systems in Architecture (Proceedings of a conference held in Mierlo, the Netherlands in July 1992), ISBN 0-7923-2444-7
summary Because of possible changes in demand, buildings must have some flexibility. In this paper a building model, a financial-economic model and a process model will be presented, which together constitute a design decision support system. This system may be used to decide on flexibility and costs of building variants in all phases of the design process.
series DDSS
last changed 2003/08/07 16:36

_id eaff
authors Shaviv, Edna and Kalay, Yehuda E.
year 1992
title Combined Procedural and Heuristic Method to Energy Conscious Building Design and Evaluation
source New York: John Wiley & Sons, 1992. pp. 305-325 : ill. includes bibliography
summary This paper describes a methodology that combines both procedural and heuristic methods by means of integrating a simulation model with a knowledge-based system (KBS) for supporting all phases of energy-conscious design and evaluation. The methodology is based on partitioning the design process into discrete phases and identifying the informational characteristics of each phase, as far as energy-conscious design is concerned. These informational characteristics are expressed in the form of design variables (parameters) and the relationships between them. The expected energy performance of a design alternative is evaluated by a combination of heuristic and procedural methods, and the context-sensitive application of default values, when necessary. By virtue of combining knowledge-based evaluations with procedural ones, this methodology allows for testing the applicability of heuristic rules in non-standard cases, thereby improving the predictive power of the evaluation.
keywords design process, evaluation, energy, analysis, synthesis, integration, architecture, knowledge base, heuristics, simulation
series CADline
email
last changed 2003/06/02 10:24
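
A minimal sketch of the combination the abstract describes: a procedural calculation that falls back on knowledge-based (heuristic) defaults for design variables the designer has not yet fixed. The rules, default values and heat-loss formula below are simplified placeholders, not Shaviv and Kalay's model.

```python
# Simplified sketch of mixing heuristic defaults with a procedural evaluation
# (hypothetical rules and numbers): missing design variables are filled from
# context-sensitive defaults before a crude steady-state heat-loss estimate.
HEURISTIC_DEFAULTS = {
    # (climate, variable) -> default value chosen by a rule of thumb
    ("cold", "u_value_wall"): 0.35,   # W/m2K
    ("hot",  "u_value_wall"): 0.60,
    ("cold", "window_ratio"): 0.25,   # glazed fraction of the facade
    ("hot",  "window_ratio"): 0.15,
}

def fill_defaults(design, climate):
    completed = dict(design)
    for (ctx, var), value in HEURISTIC_DEFAULTS.items():
        if ctx == climate and var not in completed:
            completed[var] = value        # knowledge-based step
    return completed

def annual_heat_loss(design, degree_days=3500.0):
    """Procedural step: crude envelope heat-loss estimate in kWh/year."""
    wall_area = design["facade_area"] * (1 - design["window_ratio"])
    win_area = design["facade_area"] * design["window_ratio"]
    ua = (wall_area * design["u_value_wall"]
          + win_area * design.get("u_value_window", 2.8))
    return ua * degree_days * 24 / 1000

partial_design = {"facade_area": 400.0}   # early-phase design, mostly undecided
design = fill_defaults(partial_design, climate="cold")
print(f"Estimated heat loss: {annual_heat_loss(design):,.0f} kWh/year")
```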

_id ddss9203
id ddss9203
authors Smeets, J.
year 1993
title Housing tenancy, data management and quality control
source Timmermans, Harry (Ed.), Design and Decision Support Systems in Architecture (Proceedings of a conference held in Mierlo, the Netherlands in July 1992), ISBN 0-7923-2444-7
summary This paper deals with housing tenancy, data management and quality control. The proposed method is focused on quality characteristics of housing estates in view of rentability risks. It entails a cycle of registration, analysis and implementation of measures. The starting point is the behaviour of the housing consumer in a market-oriented context. The model is framed within theories of strategic management and marketing. Systematic registration and evaluation of consumer behaviour, by means of a set of relevant process and product indicators, can yield relevant information in the four phases of the rental process: orientation, intake, dwelling and exit. This information concerns the way in which the dwelling (characterized by product indicators) fits the needs of the consumer. The systematic analysis of the process and product indicators during the phases of the rental process makes a 'strength-weakness analysis' of housing estates possible. The indicators can be presented in aggregated form by way of a 'rentability index'. The 'strength-weakness analysis' steers the intervention in the quality characteristics of housing estates. The possibilities for readjustment, however, are different. The quality control system is not only an early warning system, but also has several other functions: evaluation, planning and communication. The method described here lays a solid foundation for a decision-support system in the area of housing tenancy.
series DDSS
last changed 2003/08/07 16:36
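
The abstract mentions aggregating process and product indicators into a 'rentability index'. The weighted sum below is one minimal, hypothetical reading of that idea; the indicators, weights and estate data are invented.

```python
# Hypothetical aggregation of process/product indicators into a single
# "rentability index" per estate (indicators, weights and data are invented).
WEIGHTS = {"vacancy_rate": -0.4, "turnover_rate": -0.3,
           "refusal_rate": -0.2, "dwelling_quality": 0.1}

def rentability_index(indicators):
    # Higher is better: quality adds, vacancies/turnover/refusals subtract.
    return sum(WEIGHTS[k] * indicators[k] for k in WEIGHTS)

estates = {
    "Estate A": {"vacancy_rate": 0.02, "turnover_rate": 0.08,
                 "refusal_rate": 0.10, "dwelling_quality": 0.7},
    "Estate B": {"vacancy_rate": 0.09, "turnover_rate": 0.15,
                 "refusal_rate": 0.30, "dwelling_quality": 0.5},
}
for name, ind in sorted(estates.items(), key=lambda kv: -rentability_index(kv[1])):
    print(f"{name}: index = {rentability_index(ind):+.3f}")
```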

_id 25b7
authors Smeltzer, G., Roelen, W. and Maver, T.W.
year 1992
title Design Modelling and Design Presentation From a Computer-Generated Image Towards a Multi-user Design System
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 137-144
doi https://doi.org/10.52842/conf.ecaade.1992.137
summary CAD systems regularly offer new techniques for the presentation of design proposals, like computer-generated (stereo-) images, animations, holography and virtual reality. These techniques are mainly used for the presentation of a final design or for the presentation of buildings that have already been constructed. As in the course of time the quality of CAD systems and their users has improved enormously, it is also possible to use these systems for the evaluation of several temporary design proposals during the design process. Since 'beautiful pictures' and 'wonderful animations' have already shown their great value when presenting a design, it is sometimes as if CAD systems are considered suitable for this purpose only. Even new techniques like virtual reality systems seem to be valued only through the 'tinted glasses' of their presentation capabilities. Hardly any attention is paid to the possibilities that these new techniques offer as an instrument to support modelling and evaluation during the design process. This article will outline the results of research and development in the field of virtual reality. Virtual reality systems are based on the combination of a number of already existing presentation techniques, like photo-realistic images, stereo images and real-time animations. The added value of this type of CAD system is determined by the use of a new type of user interface. In effect this interface consists of sensors that register how its user moves and looks around. Through this, and by using a so-called 'eye phone' (comparable to stereo headphones for sound), the user, with some imaginative powers, thinks he is standing in the environment that he modelled, or in front of his building design. After this we will first sketch the outlines of some presentation techniques that can also be found in a virtual reality system. Special attention will be paid to some specific characteristics of these techniques themselves. Next, a more detailed description will be given of virtual reality systems, focusing on the system that is being developed at Calibre itself.

series eCAADe
email
last changed 2022/06/07 07:56

_id 58c5
authors Van Wezel, Ruud
year 1992
title MOCK-UP SYSTEM WAGENINGEN: DEVELOPMENT, LIMITATION AND FUTURE
source Proceedings of the 4th European Full-Scale Modelling Conference / Lausanne (Switzerland) 9-12 September 1992, Part A, pp. 15-18
summary A brief description of the development of the Mock-up System (MUS) in the context of the Wageningen training program. The students are first taught some key words for understanding the building process. They are then trained to express how they want to live (theory) and later on they confront themselves with what they have built in the MUS (practice). Besides being an educational tool, the MUS is used for pre-building evaluation and research questions. The drawbacks or limitations of the system (outdoor reality versus indoor simulation) and future use by different target groups are also discussed in this paper. The power of the MUS is, and will continue to be, the concrete building of communicational results and the generation of communication by doing so.
keywords Full-scale Modeling, Model Simulation, Real Environments
series other
type normal paper
more http://info.tuwien.ac.at/efa
last changed 2004/05/04 15:30

_id avocaad_2001_09
id avocaad_2001_09
authors Yu-Tung Liu, Yung-Ching Yeh, Sheng-Cheng Shih
year 2001
title Digital Architecture in CAD studio and Internet-based competition
source AVOCAAD - ADDED VALUE OF COMPUTER AIDED ARCHITECTURAL DESIGN, Nys Koenraad, Provoost Tom, Verbeke Johan, Verleye Johan (Eds.), (2001) Hogeschool voor Wetenschap en Kunst - Departement Architectuur Sint-Lucas, Campus Brussel, ISBN 80-76101-05-1
summary Architectural design has been changing because of the vast and creative use of computers in different ways. From the viewpoint of designing itself, the computer has been used as a drawing tool in the latter phase of design (Mitchell 1977; Coyne et al. 1990), as a presentation and simulation tool in the middle phase (Liu and Bai 2000), and even as a critical medium which triggers creative thinking in the very early phase (Maher et al. 2000; Liu 1999; Won 1999). All the various roles that the computer can play have been adopted in a number of professional design corporations and so-called computer-aided design (CAD) studios in schools worldwide (Kvan 1997, 2000; Cheng 1998). The processes and outcomes of design have been continuously developing to capture the movement of the computer age. However, from the viewpoint of social-cultural theories of architecture, the evolvement of design cannot be achieved solely by designers or design processes. Any new idea of design can be accepted socially, culturally and historically only under one condition: the design outcomes can be reviewed and appreciated by critics in the field at the time of their production (Csikszentmihalyi 1986, 1988; Schon and Wiggins 1992; Liu 2000). In other words, aspects of design production (by designers in different design processes) are as critical as those of design appreciation (by critics in different review processes) in the observation of the future trends of architecture. Nevertheless, in the field of architectural design with computer and Internet, that is, so-called computer-aided design, computer-mediated design, or internet-based design, most existing studies pay more attention to producing design in design processes as mentioned above. Relatively few studies focus on how critics act and how they interact with designers in the review processes. Therefore, this study intends to investigate some evolving phenomena of the interaction between design production and appreciation in the environment of computer and Internet. This paper takes a CAD studio and an Internet-based competition as examples. The CAD studio includes 7 master's students and 2 critics, all from the same country. The Internet-based competition, held in the year 2000, includes 206 designers from 43 countries and 26 critics from 11 countries. 3 of the students and the 2 critics in the CAD studio are among the competition's participating designers and critics respectively. The methodological steps are as follows: 1. A qualitative analysis: observation and interview of the 3 participants and 2 reviewers who join both the CAD studio and the competition. The 4 analytical criteria are the kinds of presenting media, the kinds of supportive media (such as verbal and gesture/facial data), stages of the review processes, and interaction between the designer and critics. The behavioral data are acquired by recording the design presentation and dialogue within 3 months. 2. A quantitative analysis: statistical analysis of the detailed reviewing data in the CAD studio and the competition. The four analytical factors are the reviewing time, the number of reviews of the same project, the comparison between different projects, and grades/comments. 3.
Both the qualitative and quantitative data are cross-analyzed and discussed, based on the theories of design thinking, design production/appreciation, and the appreciative system (Goodman 1978, 1984). The result of this study indicates that the interaction between design production and appreciation during the review processes could differ significantly. The review processes could be either linear or cyclic due to influences from the kinds of media, the environmental discrepancies between studio and Internet, as well as cognitive thinking/memory capacity. Design production and appreciation seem to be more linear in the CAD studio and more cyclic in the Internet environment. This distinction coincides with the complementary observations of designing as a linear process (Jones 1970; Simon 1981) or a cyclic movement (Schon and Wiggins 1992). Some phenomena during the two processes are also illustrated in detail in this paper. This study is merely a starting point of the research in design production and appreciation in the computer and network age. The future direction of investigation is to establish a theoretical model for the interaction between design production and appreciation based on the current findings. The model is expected to be developed using revised protocol analysis and interviews. Another direction of future research is to explore how design computing creativity emerges from the process of producing and appreciating.
series AVOCAAD
email
last changed 2005/09/09 10:48

_id a3f5
authors Zandi-Nia, Abolfazl
year 1992
title Topgene: An artificial Intelligence Approach to a Design Process
source Delft University of Technology
summary This work deals with two architectural design (AD) problems at the topological level and in the presence of the social norms of community, privacy, circulation cost, and intervening opportunity. The first problem concerns generating a design with respect to a set of the above-mentioned norms, and the second problem requires evaluation of existing designs with respect to the same set of norms. Both problems are based on the structural-behavioral relationship in buildings. This work has tackled the above problems in the following senses: (1) A working system, called TOPGENE (The TOpological Pattern GENErator), has been developed. (2) Both problems may be vague and may lack enough information in their statement. For example, an AD in the presence of the social norms requires the degrees of interaction between the location pairs in the building. This information is not always explicitly available, and must be explicated from the design data. (3) An AD problem at the topological level is intractable, with no fast and efficient algorithm for its solution. To reduce the search effort in the process of design generation, TOPGENE uses a heuristic hill-climbing strategy that takes advantage of domain-specific rules of thumb to choose a path in the search space of a design. (4) TOPGENE uses the Q-analysis method for explication of hidden information, as well as hierarchical clustering of location pairs with respect to their flow-generation potential, as prerequisite information for the heuristic reasoning process. (5) To deal with the design of a building at the topological level, TOPGENE takes advantage of existing graph algorithms such as path-finding and planarity testing during its reasoning process. This work also presents a new efficient algorithm for keeping track of distances in a growing graph. (6) This work also presents a neural net implementation of a special case of the design generation problem. This approach is based on the Hopfield model of neural networks. The result of this approach has been used to test the TOPGENE approach in generating designs. A comparison of these two approaches shows that the neural network provides mathematically more optimal designs, while TOPGENE produces more realistic designs. These two systems may be integrated to create a hybrid system.
series thesis:PhD
last changed 2003/02/12 22:37
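
TOPGENE's generation step is described above as heuristic hill climbing over topological layouts. The sketch below shows plain hill climbing on a toy one-dimensional room ordering scored by interaction-weighted distance; it is a generic illustration of the technique, not the TOPGENE algorithm or its rules of thumb.

```python
# Generic hill-climbing illustration (not TOPGENE itself): order rooms along a
# corridor so that strongly interacting pairs end up close together.
import random

ROOMS = ["entry", "office", "meeting", "archive", "kitchen"]
INTERACTION = {("entry", "office"): 5, ("office", "meeting"): 4,
               ("office", "archive"): 2, ("meeting", "kitchen"): 1,
               ("entry", "kitchen"): 3}

def cost(order):
    pos = {room: i for i, room in enumerate(order)}
    return sum(w * abs(pos[a] - pos[b]) for (a, b), w in INTERACTION.items())

def hill_climb(order, iterations=500):
    best = list(order)
    for _ in range(iterations):
        i, j = random.sample(range(len(best)), 2)
        candidate = list(best)
        candidate[i], candidate[j] = candidate[j], candidate[i]  # swap two rooms
        if cost(candidate) < cost(best):          # accept only improvements
            best = candidate
    return best

random.seed(1)
layout = hill_climb(ROOMS)
print(layout, "cost =", cost(layout))
```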

_id 3ff5
authors Abbo, I.A., La Scalea, L., Otero, E. and Castaneda, L.
year 1992
title Full-Scale Simulations as Tool for Developing Spatial Design Ability
source Proceedings of the 4th European Full-Scale Modelling Conference / Lausanne (Switzerland) 9-12 September 1992, Part C, pp. 7-10
summary Spatial Design Ability has been defined as the capability to anticipate effects (psychological impressions on potential observers or users) produced by mental manipulation of elements of architectural or urban spaces. This ability, of great importance in choosing the appropriate option during the design process, is not specifically developed in schools of architecture and is partially obtained as a by-product of drawing, designing or architectural criticism. We use our Laboratory as a tool to present spaces to people so that they can evaluate them. By means of a series of exercises, students confront their anticipations with the psychological impressions produced in other people. For this occasion, we present an experience in which students had to propose a space for an exhibition hall in which architectural projects (student theses) were to be shown. Following the Spatial Design Ability Development Model which we have been using for several years, students first get acquainted with the use of evaluation instruments for psychological impressions as well as with research methodology. In this case, due to the short period available, we reduced the research to investigating the effects produced by the manipulation of only 2 independent variables: students first manipulated the form of the roof, walls and interior elements, and secondly the color and texture of those elements. They evaluated the spatial quality, character and other psychological impressions that the manipulations produced in people. They used three-dimensional scale models at 1/10 and 1/1.
keywords Full-scale Modeling, Model Simulation, Real Environments
series other
email
more http://info.tuwien.ac.at/efa
last changed 2003/08/25 10:12

_id 4704
authors Amirante, I., Rinaldi, S. and Muzzillo, F.
year 1992
title A Tutorial Experiment Concerning Dampness Diagnosis Supported by an Expert System
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 159-172
doi https://doi.org/10.52842/conf.ecaade.1992.159
summary (A) The teaching of Technology of Building Rehabilitation in Italian Universities - (B) Experimental course of technological rehabilitation with computer tools - (C) Synthesis of technological approach - (D) Dampness diagnostic process using the Expert System - (E) Primary consideration on tutorial experience - (F) Bibliography
series eCAADe
last changed 2022/06/07 07:54

_id 735a
authors Anh, Tran Hoai
year 1992
title FULL-SCALE EXPERIMENT ON KITCHEN FUNCTION IN HANOI
source Proceedings of the 4th European Full-Scale Modelling Conference / Lausanne (Switzerland) 9-12 September 1992, Part A, pp. 19-30
summary This study is a part of a licentiate thesis on "Functional kitchen for the Vietnamese cooking way" at the Department of Architecture and Development Studies, Lund University. The issues it is dealing with are: (1) Inadequacy of kitchen design in the apartment buildings in Hanoi, where the kitchen is often designed as a mere cooking place - other parts of the food-making process are not given any attention. (2) Lack of standard dimensional and planning criteria for functional kitchens which can serve as a basis for kitchen design. // The thesis aims at finding indicators of functional spatial requirements for kitchens, which can serve as guidelines for designing functional kitchens for Hanoi. One of the main propositions in the thesis is that functional kitchens for Hanoi should be organised to permit culinary activities to be done according to the Vietnamese urban culinary practice. This is based on the concept that culinary activity is an expression of culture; thus the practice of preparing meals in the present context of the urban households in Hanoi has an established pattern and method which demand a suitable area and arrangement in the kitchen. This pattern and cooking method should make up the functional requirements for kitchens in Hanoi, and be taken into account if functional kitchen design is to be achieved. In the context of the space-limited apartment buildings of Hanoi, special focus is given to finding indicators of the minimum functional spatial requirements of kitchen work.
keywords Full-scale Modeling, Model Simulation, Real Environments
series other
type normal paper
more http://info.tuwien.ac.at/efa
last changed 2004/05/04 15:29

_id 60e7
authors Bailey, Rohan
year 2000
title The Intelligent Sketch: Developing a Conceptual Model for a Digital Design Assistant
source Eternity, Infinity and Virtuality in Architecture [Proceedings of the 22nd Annual Conference of the Association for Computer-Aided Design in Architecture / 1-880250-09-8] Washington D.C. 19-22 October 2000, pp. 137-145
doi https://doi.org/10.52842/conf.acadia.2000.137
summary The computer is a relatively new tool in the practice of Architecture. Since its introduction, there has been a desire amongst designers to use this new tool quite early in the design process. However, contrary to this desire, most Architects today use pen and paper in the very early stages of design to sketch. Architects solve problems by thinking visually. One of the most important tools that the Architect has at his disposal in the design process is the hand sketch. This iterative way of testing ideas and informing the design process with images fundamentally directs and aids the architect’s decision making. It has been said (Schön and Wiggins 1992) that sketching is about the reflective conversation designers have with images and ideas conveyed by the act of drawing. It is highly dependent on feedback. This “conversation” is an area worthy of investigation. Understanding this “conversation” is significant to understanding how we might apply the computer to enhance the designer’s ability to capture, manipulate and reflect on ideas during conceptual design. This paper discusses sketching and its relation to design thinking. It explores the conversations that designers engage in with the media they use. This is done through the explanation of a protocol analysis method. Protocol analysis used in the field of psychology, has been used extensively by Eastman et al (starting in the early 70s) as a method to elicit information about design thinking. In the pilot experiment described in this paper, two persons are used. One plays the role of the “hand” while the other is the “mind”- the two elements that are involved in the design “conversation”. This variation on classical protocol analysis sets out to discover how “intelligent” the hand should be to enhance design by reflection. The paper describes the procedures entailed in the pilot experiment and the resulting data. The paper then concludes by discussing future intentions for research and the far reaching possibilities for use of the computer in architectural studio teaching (as teaching aids) as well as a digital design assistant in conceptual design.
keywords CAAD, Sketching, Protocol Analysis, Design Thinking, Design Education
series ACADIA
last changed 2022/06/07 07:54

_id aa78
authors Bayazit, Nigan
year 1992
title Requirements of an Expert System for Design Studios
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 187-194
doi https://doi.org/10.52842/conf.ecaade.1992.187
summary The goal of this paper is to study problems of the transition from traditional architectural studio teaching to CAAD studio teaching which requires a CAAD expert system as studio tutor, and to study the behavior of the student in this new environment. The differences between the traditional and computerized studio teaching and the experiences in this field are explained referring to the requirements for designing time in relation to the expertise of the student in the application of a CAD program. Learning styles and the process of design in computerized and non-computerized studio teaching are discussed. Design studio requirements of the students in traditional studio environment while doing design works are clarified depending on the results of an empirical study which explained the relations between the tutor and the student while they were doing studio critiques. Main complaints of the students raised in the empirical study were the lack of data in the specific design problem area, difficulties of realization of ideas and thoughts, not knowing the starting point of design, having no information about the references to be used for the specific design task, having difficulties in the application of presentation techniques. In the concluding parts of the paper are discussed the different styles of teaching and their relation to the CAAD environment, the transformation of the instructional programs for the new design environment, the future expectations from the CAAD programs, properties of the new teaching environment and the roles of the expert systems in design studio education.

keywords CAAD Education, Expert System, Architectural Design Studio, Knowledge Acquisition
series eCAADe
email
last changed 2022/06/07 07:54

_id ecaadesigradi2019_449
id ecaadesigradi2019_449
authors Becerra Santacruz, Axel
year 2019
title The Architecture of ScarCity Game - The craft and the digital as an alternative design process
source Sousa, JP, Xavier, JP and Castro Henriques, G (eds.), Architecture in the Age of the 4th Industrial Revolution - Proceedings of the 37th eCAADe and 23rd SIGraDi Conference - Volume 3, University of Porto, Porto, Portugal, 11-13 September 2019, pp. 45-52
doi https://doi.org/10.52842/conf.ecaade.2019.3.045
summary The Architecture of ScarCity Game is a board game used as a pedagogical tool that challenges architecture students by involving them in a series of experimental design sessions to understand the design process of scarcity and the actual relation between the craft and the digital. This means "pragmatic delivery processes and material constraints, where the exchange between the artisan of handmade, representing local skills and technology of the digitally conceived is explored" (Huang 2013). The game focuses on understanding the different variables of the crafted design process of traditional communities under conditions of scarcity (Michel and Bevan 1992). This requires first analyzing the spatial environmental model of interaction, the available human and natural resources, and the dynamic relationship of these variables in a digital era. In the first stage (Pre-Agency), the game sets up the concept of the craft by limiting students' design exploration to a minimum possible perspective, developing locally available resources and techniques. The key elements of the design process of traditional knowledge communities have to be identified (Preez 1984). In other words, this stage is driven by limited resources + chance + contingency. In the second stage (Post-Agency), students, taking the architects' role within these communities, have to speculate and explore the interface between the craft (local knowledge and low-technological tools) and the digital, represented by computational data, newly available technologies and construction. This means the introduction of strategy + opportunity + chance as part of the design process. In this sense, the game has a life beyond its mechanics. This other life challenges the participants to exploit the possibilities of breaking the actual boundaries of design. The result is a tool to challenge conventional methods of teaching and learning that control a prescribed design process. It confronts the rules that professionals in this field take for granted. The game simulates a 'fake' reality by exploring surveyed information in different ways. As a result, participants do not have anything 'real' to lose. Instead, they have all the freedom to innovate and be creative.
keywords Global south, scarcity, low tech, digital-craft, design process and innovation by challenge.
series eCAADeSIGraDi
email
last changed 2022/06/07 07:54

_id 065b
authors Beitia, S.S., Zulueta, A. and Barrallo, J.
year 1995
title The Virtual Cathedral - An Essay about CAAD, History and Structure
source Multimedia and Architectural Disciplines [Proceedings of the 13th European Conference on Education in Computer Aided Architectural Design in Europe / ISBN 0-9523687-1-4] Palermo (Italy) 16-18 November 1995, pp. 355-360
doi https://doi.org/10.52842/conf.ecaade.1995.355
summary The Old Cathedral of Santa Maria in Vitoria is the most representative building of the Gothic style in the Basque Country. Built during the XIV century, it was closed to worship in 1994 because of the high risk of collapse presented by its structure. This closure originated in the structural analysis entrusted to the University of the Basque Country in 1992. The topographic works developed in the Cathedral to elaborate the planimetry of the temple revealed that many structural elements of great importance, like arches, buttresses and flying buttresses, had been removed, modified or added over the history of Santa Maria. The first structural analysis made in the church suggested that the huge deformations shown in the resistant elements, especially the piers, originated in interventions made in the past. A deep historical investigation allowed us to know how the Cathedral was built and the changes executed up to the present day. With this information, we started the elaboration of a virtual model of the Cathedral of Santa Maria. This model was introduced into a Finite Element Method system to study the deformations suffered by the church during its construction in the XIV century, and the interventions made later in the XV, XVI and XX centuries. The efficiency of the virtual model simulating the geometry of the Cathedral throughout its history allowed us to detect the cause of the structural damage, which was finally traced to a number of unfortunate interventions over time.

series eCAADe
more http://dpce.ing.unipa.it/Webshare/Wwwroot/ecaade95/Pag_43.htm
last changed 2022/06/07 07:54
