CumInCAD is a cumulative index of publications in Computer Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD Futures.

Hits 1 to 20 of 223

_id 6d1d
authors Daru, R. and Daru, M.
year 1992
title Personal Working Styles in the CMD Studio
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 451-472
doi https://doi.org/10.52842/conf.ecaade.1992.451
summary Normative and problem-solving approaches to architectural design ignore the personality aspects of the designing activity. Every architect approaches projects according to her/his own strategies and tactics, which usually do not conform to the prescriptive models of design theoreticians. Computer aided design tools should therefore be adapted to the strategies and tactics of each and every architectural student. We are testing the usefulness of CAAD tools developed by others or ourselves and identifying the need for missing tools. It is already clear that many CAAD tools reflect the programmer's point of view about strategies and tactics of designing and do not take into account the idiosyncrasies of the end user. Forcing such tools on students carries the risk of fostering aversion to ill-adapted tools, and consequently to CMD. Our research group pursues empirical research on the working styles of practising architects within the frame of a personality theory of actions. The results indicate three main directions for designing strategies. If we want architectural education to take into account real-world behaviour in design practice, this implies a threefold diversification of the exercises we offer to students, corresponding to the three directions. To this we add the didactic options of complementation, compensation and support, depending on what we know about the strong or weak points of the students involved. We have started proposing choices for the exercises of our design morphology studio. Students are offered the approaches and tools we consider best adapted to their own working styles.

series eCAADe
last changed 2022/06/07 07:55

_id 4129
authors Fargas, Josep and Papazian, Pegor
year 1992
title Metaphors in Design: An Experiment with a Frame, Two Lines and Two Rectangles
source Mission - Method - Madness [ACADIA Conference Proceedings / ISBN 1-880250-01-2] 1992, pp. 13-22
doi https://doi.org/10.52842/conf.acadia.1992.013
summary The research we will discuss below originated from an attempt to examine the capacity of designers to evaluate an artifact, and to study the feasibility of replicating a designer's moves intended to make an artifact more expressive of a given quality. We will present the results of an interactive computer experiment, first developed at the MIT Design Research Seminar, which is meant to capture the subject’s actions in a simple design task as a series of successive "moves". We will propose that designers use metaphors in their interaction with design artifacts and we will argue that the concept of metaphors can lead to a powerful theory of design activity. Finally, we will show how such a theory can drive the project of building a design system.

When trying to understand how designers work, it is tempting to examine design products in order to come up with the principles or norms behind them. The problem with such an approach is that it may lead to a purely syntactical analysis of design artifacts, failing to capture the knowledge of the designer in an explicit way, and ignoring the interaction between the designer and the evolving design. We will present a theory about design activity based on the observation that knowledge is brought into play during a design task by a process of interpretation of the design document. By treating an evolving design in terms of the meanings and rules proper to a given way of seeing, a designer can reduce the complexity of a task by focusing on certain of its aspects, and can manipulate abstract elements in a meaningful way.

series ACADIA
last changed 2022/06/07 07:55
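
A minimal sketch (not from the paper) of how a design task might be recorded as a series of successive "moves" for later analysis, as the Fargas and Papazian abstract above describes; the Move and DesignSession classes and the example geometry are hypothetical.

# A minimal sketch of recording a design task as a series of successive
# "moves", in the spirit of the Fargas/Papazian abstract above.
# All names (Move, DesignSession, the example geometry) are hypothetical.
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class Move:
    element: str                  # e.g. "line-1", "rectangle-2"
    action: str                   # e.g. "translate", "rotate"
    params: Tuple[float, ...]     # numeric arguments of the action
    note: str = ""                # subject's own description, if any


@dataclass
class DesignSession:
    subject: str
    quality: str                  # the quality the artifact should express
    moves: List[Move] = field(default_factory=list)

    def record(self, move: Move) -> None:
        self.moves.append(move)

    def replay(self) -> None:
        # Print the successive moves for later analysis.
        for i, m in enumerate(self.moves, 1):
            print(f"{i:02d}. {m.action} {m.element} {m.params} {m.note}")


if __name__ == "__main__":
    session = DesignSession(subject="S01", quality="balance")
    session.record(Move("rectangle-1", "translate", (12.0, 0.0)))
    session.record(Move("line-2", "rotate", (15.0,), note="align with frame"))
    session.replay()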

_id ddss9214
id ddss9214
authors Friedman, A.
year 1993
title A decision-making process for choice of a flexible internal partition option in multi-unit housing using decision theory techniques
source Timmermans, Harry (Ed.), Design and Decision Support Systems in Architecture (Proceedings of a conference held in Mierlo, the Netherlands in July 1992), ISBN 0-7923-2444-7
summary Recent demographic changes have increased the heterogeneity of user groups in the North American housing market. Smaller households (e.g. elderly, single parent) have non-traditional spatial requirements that cannot be accommodated within the conventional house layout. This has created renewed interest in Demountable/Flexible internal partition systems. However, the process by which designers decide which project or user groups are most suited for the use of these systems is quite often complex, non-linear, uncertain and dynamic, since the decisions involve natural processes and human values that are apparently random. The anonymity of users when mass housing projects are conceptualized, and the uncertainty as to the alternative to be selected by the user, given his/her constantly changing needs, are some contributing factors to this effect. Decision Theory techniques, not commonly used by architects, can facilitate the decision-making process through a systematic evaluation of alternatives by means of quantitative methods in order to reduce uncertainty in probabilistic events or in cases when data is insufficient. The author used Decision Theory in the selection of flexible partition systems. The study involved a multi-unit, privately initiated housing project in Montreal, Canada, where real site conditions and costs were used. In this paper, the author outlines the fundamentals of Decision Theory and demonstrates the use of Expected Monetary Value and Weighted Objective Analysis methods and their outcomes in the design of a Montreal housing project. The study showed that Decision Theory can be used as an effective tool in housing design once the designer knows how to collect basic data.
series DDSS
last changed 2003/08/07 16:36
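
A minimal sketch of the two Decision Theory techniques named in the Friedman abstract above, Expected Monetary Value and Weighted Objective Analysis; the partition options, probabilities, payoffs, weights and scores are invented for illustration and do not come from the study.

# Expected Monetary Value (EMV) and Weighted Objective Analysis.
# All option names, probabilities, payoffs, weights and scores are hypothetical.

def expected_monetary_value(outcomes):
    # EMV = sum over outcomes of probability * monetary payoff.
    return sum(p * value for p, value in outcomes)

def weighted_objective_score(weights, scores):
    # Weighted sum of per-objective scores (higher is better).
    return sum(weights[k] * scores[k] for k in weights)

# Hypothetical partition options: lists of (probability, net payoff per unit in $).
options_emv = {
    "fixed partition":       [(0.7, -500), (0.3, -2500)],   # cheap now, costly to adapt
    "demountable partition": [(0.7, -900), (0.3, -1200)],   # dearer now, cheap to adapt
}

weights = {"cost": 0.5, "adaptability": 0.3, "acoustics": 0.2}
scores = {
    "fixed partition":       {"cost": 8, "adaptability": 2, "acoustics": 7},
    "demountable partition": {"cost": 5, "adaptability": 9, "acoustics": 5},
}

for name, outcomes in options_emv.items():
    print(name, "EMV:", expected_monetary_value(outcomes),
          "weighted score:", weighted_objective_score(weights, scores[name]))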

_id 7ce5
authors Gal, Shahaf
year 1992
title Computers and Design Activities: Their Mediating Role in Engineering Education
source Sociomedia, ed. Edward Barrett. MIT Press
summary Sociomedia: With all the new words used to describe electronic communication (multimedia, hypertext, cyberspace, etc.), do we need another one? Edward Barrett thinks we do; hence, he coins the term "sociomedia." It is meant to displace a computing economy in which technicity is hypostasized over sociality. Sociomedia, a compilation of twenty-five articles on the theory, design and practice of educational multimedia and hypermedia, attempts to re-value the communicational face of computing. Value, of course, is "ultimately a social construct." As such, it has everything to do with knowledge, power, education and technology. The projects discussed in this book represent the leading edge of electronic knowledge production in academia (not to mention major funding) and are determining the future of educational media. For these reasons, Sociomedia warrants close inspection. Barrett's introduction sets the tone. For him, designing computer media involves hardwiring a mechanism for the social construction of knowledge (1). He links computing to a process of social and communicative interactivity for constructing and disseminating knowledge. Through a mechanistic mapping of the university as hypercontext (a huge network that includes classrooms as well as services and offices), Barrett models intellectual work in such a way as to avoid "limiting definitions of human nature or human development." Education, then, can remain "where it should be--in the human domain (public and private) of sharing ideas and information through the medium of language." By leaving education in a virtual realm (where we can continue to disagree about its meaning and execution), it remains viral, mutating and contaminating in an intellectually healthy way. He concludes that his mechanistic model, by means of its reductionist approach, preserves value (7). This "value" is the social construction of knowledge. While I support the social orientation of Barrett's argument, discussions of value are related to power. I am not referring to the traditional teacher-student power structure that is supposedly dismantled through cooperative and constructivist learning strategies. The power to be reckoned with in the educational arena is foundational, that which (pre)determines value and the circulation of knowledge. "Since each of you reading this paragraph has a different perspective on the meaning of 'education' or 'learning,' and on the processes involved in 'getting an education,' think of the hybris in trying to capture education in a programmable function, in a displayable object, in a 'teaching machine'" (7). Actually, we must think about that hybris because it is, precisely, what informs teaching machines. Moreover, the basic epistemological premises that give rise to such productions are too often assumed. In the case of instructional design, the episteme of cognitive sciences is often taken for granted. It is ironic that many of the "postmodernists" who support electronic hypertextuality seem to have missed Jacques Derrida's and Michel Foucault's "deconstructions" of the epistemology underpinning cognitive sciences (if not of epistemology itself). Perhaps it is the glitz of the technology that blinds some users (qua developers) to the belief systems operating beneath the surface. Barrett is not guilty of reactionary thinking or politics; he is, in fact, quite in line with much American deconstructive and postmodern thinking.
The problem arises in that he leaves open the definitions of "education," "learning" and "getting an education." One cannot engage in the production of new knowledge without orienting its design, production and dissemination, and without negotiating with others' orientations, especially where large-scale funding is involved. Notions of human nature and development are structural, even infrastructural, whatever the medium of the teaching machine. Although he addresses some dynamics of power, money and politics when he talks about the recession and its effects on the conference, they are readily visible dynamics of power (3-4). Where does the critical factor of value determination, of power, of who gets what and why, get mapped onto a mechanistic model of learning institutions? Perhaps a mapping of contributors' institutions, of the funding sources for the projects showcased and for participation in the conference, and of the disciplines receiving funding for these sorts of projects would help visualize the configurations of power operative in the rising field of educational multimedia. Questions of power and money notwithstanding, Barrett's introduction sets the social and textual thematics for the collection of essays. His stress on interactivity, on communal knowledge production, on the society of texts, and on media producers and users is carried forward through the other essays, two of which I will discuss. Section I of the book, "Perspectives...," highlights the foundations, uses and possible consequences of multimedia and hypertextuality. The second essay in this section, "Is There a Class in This Text?," plays on the robust exchange surrounding Stanley Fish's book, Is There a Text in This Class?, which presents an attack on authority in reading. The author, John Slatin, has introduced electronic hypertextuality and interaction into his courses. His article maps the transformations in "the content and nature of work, and the workplace itself"--which, in this case, is not industry but an English poetry class (25). Slatin discovered an increase of productive and cooperative learning in his electronically-mediated classroom. For him, creating knowledge in the electronic classroom involves interaction between students, instructors and course materials through the medium of interactive written discourse. These interactions lead to a new and persistent understanding of the course materials and of the participants' relation to the materials and to one another. The work of the course is to build relationships that, in my view, constitute not only the meaning of individual poems, but poetry itself. The class carries out its work in the continual and usually interactive production of text (31). While I applaud his strategies which dismantle traditional hierarchical structures in academia, the evidence does not convince me that the students know enough to ask important questions or to form a self-directing, learning community. Stanley Fish has not relinquished professing, though he, too, espouses the indeterminacy of the sign. By the fourth week of his course, Slatin's input is, by his own reckoning, reduced to 4% (39). In the transcript of the "controversial" Week 6 exchange on Gertrude Stein--the most disliked poet they were discussing at the time (40)--we see the blind leading the blind. One student parodies Stein for three lines and sums up his input with "I like it." Another finds Stein's poetry "almost completey [sic] lacking in emotion or any artistic merit" (emphasis added).
On what grounds has this student become an arbiter of "artistic merit"? Another student, after admitting being "lost" during the Wallace Stevens discussion, talks of having more "respect for Stevens' work than Stein's" and adds that Stein's poetry lacks "conceptual significance[, s]omething which people of varied opinion can intelligently discuss without feeling like total dimwits...." This student has progressed from admitted incomprehension of Stevens' work to imposing her (groundless) respect for his work over Stein's. Then, she exposes her real dislike for Stein's poetry: that she (the student) missed the "conceptual significance" and hence cannot, being a person "of varied opinion," intelligently discuss it "without feeling like [a] total dimwit." Slatin's comment is frightening: "...by this point in the semester students have come to feel increasingly free to challenge the instructor" (41). The students that I have cited are neither thinking critically nor are their preconceptions challenged by student-governed interaction. Thanks to the class format, one student feels self-righteous in her ignorance, and empowered to censure. I believe strongly in student empowerment in the classroom, but only once students have accrued enough knowledge to make informed judgments. Admittedly, Slatin's essay presents only partial data (there are six hundred pages of course transcripts!); still, I wonder how much valuable knowledge and metaknowledge was gained by the students. I also question the extent to which authority and professorial dictature were addressed in this course format. The power structures that make it possible for a college to require such a course, and the choice of texts and pedagogy, were not "on the table." The traditional professorial position may have been displaced, but what took its place?--the authority of consensus with its unidentifiable strong arm, and the faceless reign of software design? Despite Slatin's claim that the students learned about the learning process, there is no evidence (in the article) that the students considered where their attitudes came from, how consensus operates in the construction of knowledge, how power is established and what relationship they have to bureaucratic institutions. How do we, as teaching professionals, negotiate a balance between an enlightened despotism in education and student-created knowledge? Slatin, and other authors in this book, bring this fundamental question to the fore. There is no definitive answer because the factors involved are ultimately social, and hence, always shifting and reconfiguring. Slatin ends his article with the caveat that computerization can bring about greater estrangement between students, faculty and administration through greater regimentation and control. Of course, it can also "distribute authority and power more widely" (50). Power or authority without a specific face, however, is not necessarily good or just. Shahaf Gal's "Computers and Design Activities: Their Mediating Role in Engineering Education" is found in the second half of the volume, and does not allow for a theory/praxis dichotomy. Gal recounts a brief history of engineering education up to the introduction of Growltiger (GT), a computer-assisted learning aid for design. He demonstrates GT's potential to impact the learning of engineering design by tracking its use by four students in a bridge-building contest.
What his text demonstrates clearly is that computers are "inscribing and imaging devices" that add another viewpoint to an on-going dialogue between student, teacher, earlier coursework, and other teaching/learning tools. The less proficient students made a serious error by relying too heavily on the technology, or treating it as a "blueprint provider." They "interacted with GT in a way that trusted the data to represent reality. They did not see their interaction with GT as a negotiation between two knowledge systems" (495). Students who were more thoroughly informed in engineering discourses knew to use the technology as one voice among others--they knew enough not simply to accept the input of the computer as authoritative. The less-advanced students learned a valuable lesson from the competition itself: the fact that their designs were not able to hold up under pressure (literally) brought the fact of their insufficient knowledge crashing down on them (and their bridges). They also had, post factum, several other designs to study, especially the winning one. Although competition and comparison are not good pedagogical strategies for everyone (in this case the competitors had volunteered), at some point what we think we know has to be challenged within the society of discourses to which it belongs. Students need critique in order to learn to push their learning into auto-critique. This is what is lacking in Slatin's discussion and in the writings of other avatars of constructivist, collaborative and computer-mediated pedagogies. Obviously there are differences between instrumental types of knowledge acquisition and discursive knowledge accumulation. Indeed, I do not promote the teaching of reading, thinking and writing as "skills" per se (then again, Gal's teaching of design is quite discursive, if not dialogic). Nevertheless, the "soft" sciences might benefit from "bridge-building" competitions or the re-institution of some forms of agonia. Not everything agonistic is inhuman agony--the joy of confronting or creating a sound argument supported by defensible evidence, for example. Students need to know that soundbites are not sound arguments despite predictions that electronic writing will be aphoristic rather than periodic. Just because writing and learning can be conceived of hypertextually does not mean that rigor goes the way of the dinosaur. Rigor and hypertextuality are not mutually incompatible. Nor are rigorous thinking and hard intellectual work unpleasurable, although American anti-intellectualism, especially in the mass media, would make it so. At a time when the spurious dogmatics of a Rush Limbaugh and Holocaust revisionist historians circulate "aphoristically" in cyberspace, and at a time when knowledge is becoming increasingly textualized, the role of critical thinking in education will ultimately determine the value(s) of socially constructed knowledge. This volume affords the reader an opportunity to reconsider knowledge, power, and new communications technologies with respect to social dynamics and power relationships.
series other
last changed 2003/04/23 15:14

_id 4b2a
id 4b2a
authors Jabi, Wassim
year 2004
title A FRAMEWORK FOR COMPUTER-SUPPORTED COLLABORATION IN ARCHITECTURAL DESIGN
source University of Michigan
summary The development of appropriate research frameworks and guidelines for the construction of software aids in the area of architectural design can lead to a better understanding of designing and computer support for designing (Gero and Maher 1997). The field of research and development in computer-supported collaborative architectural design reflects that of the early period in the development of the field of computer-supported cooperative work (CSCW). In the early 1990s, the field of CSCW relied on unsystematic attempts to generate software that increases the productivity of people working together (Robinson 1992). Furthermore, a shift is taking place by which researchers in the field of architecture are increasingly becoming consumers rather than innovators of technology (Gero and Maher 1997). In particular, the field of architecture is rapidly becoming dependent on commercial software implementations that are slow to respond to new research or to user demands. Additionally, these commercial systems force a particular view of the domain they serve and as such might hinder rather than help its development. The aim of this dissertation is to provide information to architects and others to help them build their own tools or, at a minimum, be critical of commercial solutions.
series thesis:PhD
type normal paper
last changed 2004/10/24 22:35

_id ddss9208
id ddss9208
authors Lucardie, G.L.
year 1993
title A functional approach to realizing decision support systems in technical regulation management for design and construction
source Timmermans, Harry (Ed.), Design and Decision Support Systems in Architecture (Proceedings of a conference held in Mierlo, the Netherlands in July 1992), ISBN 0-7923-2444-7
summary Technical building standards defining the quality of buildings, building products, building materials and building processes aim to provide acceptable levels of safety, health, usefulness and energy consumption. However, the logical consistency between these goals and the set of regulations produced to achieve them is often hard to identify. Not only the large quantities of highly complex and frequently changing building regulations to be met, but also the variety of user demands and the steadily increasing technical information on (new) materials, products and buildings have produced a very complex set of knowledge and data that should be taken into account when handling technical building regulations. Integrating knowledge technology and database technology is an important step towards managing the complexity of technical regulations. Generally, two strategies can be followed to integrate knowledge and database technology. The main emphasis of the first strategy is on transferring data structures and processing techniques from one field of research to another. The second approach is concerned exclusively with the semantic structure of what is contained in the data-based or knowledge-based system. The aim of this paper is to show that the second or knowledge-level approach, in particular the theory of functional classifications, is more fundamental and more fruitful. It permits a goal-directed rationalized strategy towards analysis, use and application of regulations. Therefore, it enables the reconstruction of (deep) models of regulations, objects and of users accounting for the flexibility and dynamics that are responsible for the complexity of technical regulations. Finally, at the systems level, the theory supports an effective development of a new class of rational Decision Support Systems (DSS), which should reduce the complexity of technical regulations and restore the logical consistency between the goals of technical regulations and the technical regulations themselves.
series DDSS
last changed 2003/08/07 16:36
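
A minimal sketch, in the spirit of the Lucardie abstract above, of a functional (knowledge-level) classification driving a regulation check; the functions, predicates and threshold values are hypothetical and not taken from any actual building code.

# An element is classified by the functions it must fulfil, and the applicable
# rules are selected from that classification. All values are hypothetical.

RULES = {
    # function an element fulfils -> requirement it must satisfy
    "fire separation":  lambda e: e["fire_resistance_min"] >= 60,
    "sound separation": lambda e: e["sound_insulation_dB"] >= 52,
    "load bearing":     lambda e: e["load_capacity_kN_m"] >= 10,
}

def classify(element):
    # Return the functions an element is declared to fulfil.
    return [f for f in element["functions"] if f in RULES]

def check(element):
    # Check the element against every rule its functions make applicable.
    return {f: RULES[f](element) for f in classify(element)}

wall = {
    "name": "party wall, type A",
    "functions": ["fire separation", "sound separation"],
    "fire_resistance_min": 90,
    "sound_insulation_dB": 48,
}

print(check(wall))   # {'fire separation': True, 'sound separation': False}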

_id caadria2020_242
id caadria2020_242
authors Martin Iglesias, Rodrigo, Voto, Cristina and Agra, Rocío
year 2020
title Design in the Age of Dissident Cyborgs - Xenofuturism as caring-curing practices
source D. Holzer, W. Nakapan, A. Globa, I. Koh (eds.), RE: Anthropocene, Design in the Age of Humans - Proceedings of the 25th CAADRIA Conference - Volume 2, Chulalongkorn University, Bangkok, Thailand, 5-6 August 2020, pp. 233-240
doi https://doi.org/10.52842/conf.caadria.2020.2.233
summary This paper synthesizes several years of research in the field of the theory of architecture and design, and its subsequent undergraduate and graduate teaching. Specifically, it is a work that reflects on how architecture and design should face the three most important paradigmatic phenomena of our present and near future. Paradigms as things we think with, rather than as things we think about (Agamben, 2008), or in other words, it matters what ideas we use to think of other ideas (Strathern, 1992). These phenomena refer to environmental, technological and anthropological aspects, and the strategies to cope with them, involving alternate design thinking and practice in which futurabilities and futurizations depart from the displacement generated by post-utopian visions based on dissidence and subalternity.
keywords Chthulucene; Cyborg Design; Dissident Futures; Futurization; Xenofuturism
series CAADRIA
last changed 2022/06/07 07:59

_id 58c5
authors Van Wezel, Ruud
year 1992
title MOCK-UP SYSTEM WAGENINGEN: DEVELOPMENT, LIMITATION AND FUTURE
source Proceedings of the 4th European Full-Scale Modelling Conference / Lausanne (Switzerland) 9-12 September 1992, Part A, pp. 15-18
summary A brief description of the development of the Mock-up System (MUS) in the context of the Wageningen training program. The students are first taught some keywords for understanding the building process. They are then trained to express how they want to live (theory) and later on they confront themselves with what they have built in the MUS (practice). Besides being an educational tool, the MUS is used for pre-building evaluation and research questions. The drawbacks or limitations of the system (outdoor reality versus indoor simulation) and future use by different target groups are also discussed in this paper. The power of the MUS is, and will continue to be, the concrete building of communicational results and the generation of communication by doing so.
keywords Full-scale Modeling, Model Simulation, Real Environments
series other
type normal paper
more http://info.tuwien.ac.at/efa
last changed 2004/05/04 15:30

_id 3ff5
authors Abbo, I.A., La Scalea, L., Otero, E. and Castaneda, L.
year 1992
title Full-Scale Simulations as Tool for Developing Spatial Design Ability
source Proceedings of the 4th European Full-Scale Modelling Conference / Lausanne (Switzerland) 9-12 September 1992, Part C, pp. 7-10
summary Spatial Design Ability has been defined as the capability to anticipate effects (psychological impressions on potential observers or users) produced by mental manipulation of elements of architectural or urban spaces. This ability, of great importance in choosing the appropriate option during the design process, is not specifically developed in schools of architecture and is partially obtained as a by-product of drawing, designing or architectural criticism. We use our Laboratory as a tool to present spaces to people so that they can evaluate them. By means of a series of exercises, students confront their anticipations with the psychological impressions produced in other people. For this occasion, we present an experience in which students had to propose a space for an exhibition hall in which architectural projects (student theses) were to be shown. Following the Spatial Design Ability Development Model which we have been using for several years, students first get acquainted with the use of evaluation instruments for psychological impressions as well as with research methodology. In this case, due to the short period available, we limited the research to investigating the effects produced by the manipulation of only two independent variables: students first manipulated the form of the roof, walls and interior elements, and secondly the color and texture of those elements. They evaluated spatial quality, character and the other psychological impressions that the manipulations produced in people. They used three-dimensional scale models at 1/10 and 1/1.
keywords Full-scale Modeling, Model Simulation, Real Environments
series other
more http://info.tuwien.ac.at/efa
last changed 2003/08/25 10:12

_id 6208
authors Abou-Jaoude, Georges
year 1992
title To Master a Tool
source Proceedings of the 4th European Full-Scale Modelling Conference / Lausanne (Switzerland) 9-12 September 1992, Part B, p. 15
summary The tool here is the computer or, to be precise, a unit that includes the computer, the peripherals and the software needed to fulfill a task. These tools are getting very sophisticated and user interfaces extremely friendly; therefore it is very easy to become the slave of such electronic tools and reach self-satisfaction with straightforward results and attractive images. In order to master and not become slaves of sophisticated tools, a very solid knowledge of related fields or domains of application becomes necessary. In the case of this seminar, full-scale modelling is a way to understand the relation between a mental model and its full-scale model; it is a way of communicating what is in a designer's mind. Computers and design programs can have the same goal; rather than choosing one method or the other, let us try to show how important it is today to complement designing with the computer with other means and media such as full-scale modelling, and what computer modelling and simulation can bring to full-scale modelling or other means.
keywords Full-scale Modeling, Model Simulation, Real Environments
series other
more http://info.tuwien.ac.at/efa
last changed 2003/08/25 10:12

_id acadia06_455
id acadia06_455
authors Ambach, Barbara
year 2006
title Eve’s Four Faces interactive surface configurations
source Synthetic Landscapes [Proceedings of the 25th Annual Conference of the Association for Computer-Aided Design in Architecture] pp. 455-460
doi https://doi.org/10.52842/conf.acadia.2006.455
summary Eve’s Four Faces consists of a series of digitally animated and interactive surfaces. Their content and structure are derived from a collection of sources outside the conventional boundaries of architectural research, namely psychology and the broader spectrum of arts and culture.The investigation stems from a psychological study documenting the attributes and social relationships of four distinct personality prototypes: the Individuated, the Traditional, the Conflicted, and the Assured (York and John 1992). For the purposes of this investigation, all four prototypes are assumed to be inherent, to certain degrees, in each individual. However, the propensity towards one of the prototypes forms the basis for each individual’s “personality structure.” The attributes, social implications and prospects for habitation have been translated into animations and surfaces operating within A House for Eve’s Four Faces. The presentation illustrates the potential for constructed surfaces to be configured and transformed interactively, responding to the needs and qualities associated with each prototype. The intention is to study the effects of each configuration and how each configuration may be therapeutic in supporting, challenging or altering one’s personality as it oscillates and shifts through the four prototypical conditions.
series ACADIA
last changed 2022/06/07 07:54

_id 2006_040
id 2006_040
authors Ambach, Barbara
year 2006
title Eve’s Four Faces-Interactive surface configurations
source Communicating Space(s) [24th eCAADe Conference Proceedings / ISBN 0-9541183-5-9] Volos (Greece) 6-9 September 2006, pp. 40-44
doi https://doi.org/10.52842/conf.ecaade.2006.040
summary Eve’s Four Faces consists of a series of digitally animated and interactive surfaces. Their content and structure are derived from a collection of sources outside the conventional boundaries of architectural research, namely psychology and the broader spectrum of arts and culture. The investigation stems from a psychological study documenting the attributes and social relationships of four distinct personality prototypes; the “Individuated”, the “Traditional”, the “Conflicted” and the “Assured”. (York and John, 1992) For the purposes of this investigation, all four prototypes are assumed to be inherent, to certain degrees, in each individual; however, the propensity towards one of the prototypes forms the basis for each individual’s “personality structure”. The attributes, social implications and prospects for habitation have been translated into animations and surfaces operating within A House for Eve’s Four Faces. The presentation illustrates the potential for constructed surfaces to be configured and transformed interactively, responding to the needs and qualities associated with each prototype. The intention is to study the effects of each configuration and how it may be therapeutic in supporting, challenging or altering one’s personality as it oscillates and shifts through the four prototypical conditions.
keywords interaction; digital; environments; psychology; prototypes
series eCAADe
type normal paper
last changed 2022/06/07 07:54

_id ascaad2022_043
id ascaad2022_043
authors Awan, Abeeha; Prokop, Simon; Vele, Jiri; Dounas, Theodor; Lombardi, Davide; Agkathidis, Asterios; Kurilla, Lukas
year 2022
title Qualitative Knowledge Graph for the Evaluation of Metaverse(s) - Is the Metaverse Hype or a Promising New Field for Architects?
source Hybrid Spaces of the Metaverse - Architecture in the Age of the Metaverse: Opportunities and Potentials [10th ASCAAD Conference Proceedings] Debbieh (Lebanon) [Virtual Conference] 12-13 October 2022, pp. 99-116
summary With the advancement of augmented and virtual reality technologies both in scale as well as accessibility, the Metaverse (Stephenson, 1992, Hughes, 2022) has emerged as a new digital space with potential for the application of architectural creativity and design. With blockchain integration, the concept of the Metaverse shows promise in creating a “decentralised” space for design and creativity with rewards for its participants. As a platform that incorporates these technological components, does the Metaverse have utility for architectural design? Is there something truly novel in what the Metaverse brings to architectural computing, and architectural design? The paper constructs a qualitative knowledge graph that can be used for the evaluation of various kinds of Metaverses in and for architectural design. We use Design Science Research methods to develop the knowledge graph and its evaluative capacity, stemming from our experience with two Metaverses, Decentraland and Cryptovoxels. The paper concludes with a discussion of knowledge and practice gaps that are evident, framing the opportunities that architects might have in the future in terms of developing Metaverse(s).
series ASCAAD
last changed 2024/02/16 13:24
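
A minimal sketch of a qualitative knowledge graph of the kind the Awan et al. abstract above describes, here as a plain list of labelled triples with a toy evaluation query; the criteria and judgements are hypothetical, and only the platform names Decentraland and Cryptovoxels come from the abstract.

# Each triple is (subject, relation, object); relations carry the qualitative label.
triples = [
    ("Decentraland",  "supports", "blockchain ownership"),
    ("Cryptovoxels",  "supports", "blockchain ownership"),
    ("Decentraland",  "offers",   "parametric building tools"),
    ("Cryptovoxels",  "offers",   "voxel building tools"),
    ("blockchain ownership",      "relevant_to", "architectural practice"),
    ("parametric building tools", "relevant_to", "architectural design"),
]

def neighbours(node, relation=None):
    # Follow outgoing edges from a node, optionally filtered by relation.
    return [(r, o) for s, r, o in triples if s == node and (relation is None or r == relation)]

def evaluate(platform):
    # Very simple qualitative evaluation: list what a platform offers/supports
    # and whether each item is linked to an architectural concern.
    report = {}
    for _, feature in neighbours(platform):
        report[feature] = [o for _, o in neighbours(feature, "relevant_to")]
    return report

print(evaluate("Decentraland"))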

_id 065b
authors Beitia, S.S., Zulueta, A. and Barrallo, J.
year 1995
title The Virtual Cathedral - An Essay about CAAD, History and Structure
source Multimedia and Architectural Disciplines [Proceedings of the 13th European Conference on Education in Computer Aided Architectural Design in Europe / ISBN 0-9523687-1-4] Palermo (Italy) 16-18 November 1995, pp. 355-360
doi https://doi.org/10.52842/conf.ecaade.1995.355
summary The Old Cathedral of Santa Maria in Vitoria is the most representative building of the Gothic style in the Basque Country. Built during the XIV century, it was closed to worship in 1994 because of the high risk of collapse presented by its structure. The closure followed the structural analysis entrusted to the University of the Basque Country in 1992. The topographic surveys carried out in the Cathedral to produce the planimetry of the temple revealed that many structural elements of great importance, such as arches, buttresses and flying buttresses, had been removed, modified or added over the history of Santa Maria. The first structural analysis made in the church suggested that the large deformations shown by the resistant elements, especially the piers, originated in interventions made in the past. A thorough historical investigation allowed us to establish how the Cathedral was built and the changes made up to the present day. With this information, we started elaborating a virtual model of the Cathedral of Santa Maria. This model was introduced into a Finite Element Method system to study the deformations suffered by the church during its construction in the XIV century, and the interventions made later in the XV, XVI and XX centuries. The efficiency of the virtual model in simulating the geometry of the Cathedral throughout its history allowed us to detect the cause of the structural damage, which was finally traced to a number of unfortunate interventions over time.

series eCAADe
more http://dpce.ing.unipa.it/Webshare/Wwwroot/ecaade95/Pag_43.htm
last changed 2022/06/07 07:54
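
A minimal one-dimensional finite-element sketch, only to illustrate the kind of stiffness analysis (K u = f) the Beitia et al. abstract above refers to; it is not the authors' cathedral model, and the section properties, lengths and load are invented.

import numpy as np

# Three bar elements in series, nodes 0..3; node 0 is fixed.
E, A = 2.0e10, 0.5            # Young's modulus [Pa] and cross-section [m^2] (assumed)
lengths = [3.0, 3.0, 4.0]     # element lengths [m] (assumed)

n_nodes = len(lengths) + 1
K = np.zeros((n_nodes, n_nodes))
for e, L in enumerate(lengths):
    k = E * A / L
    ke = k * np.array([[1, -1], [-1, 1]])
    K[e:e+2, e:e+2] += ke      # assemble element stiffness into the global matrix

f = np.zeros(n_nodes)
f[-1] = -1.0e5                 # 100 kN load at the free end (assumed)

# Apply the support (node 0 fixed) and solve K u = f for the free nodes.
u = np.zeros(n_nodes)
u[1:] = np.linalg.solve(K[1:, 1:], f[1:])
print("nodal displacements [m]:", u)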

_id e039
authors Bertin, Vito
year 1992
title Structural Transformations (Basic Architectural Unit 6)
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 413-426
doi https://doi.org/10.52842/conf.ecaade.1992.413
summary While the teaching of the phenomenon of form as well as space is normally seen within an environment of free experimentation and personal expression, other directions prove to be worthy of pursuit. The proposed paper represents such an exploration. The generation of controlled complexity and structural transformations was the title of the project which forms the basis of this paper. In it, the student's potential for creative development was explored in such a way that, as in the sciences, a process can be reproduced or an exploration utilized in further experimentation. The cube, a well-proven B.A.U. or basic architectural unit, has again been used in our work. Even a simple object like a cube has many properties. As properties are never pure, but always related to other properties, by looking at a single property as a specific value of a variable it is possible to link a whole field of objects. These links provide a network of paths through which exploration and development is possible. The paper represents a first step in a direction which we think will complement the already established basic design program.

series eCAADe
last changed 2022/06/07 07:52
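
A minimal sketch of the idea in the Bertin abstract above: treating each property as a variable and linking objects that differ in a single property value, so that the field of variants becomes a network of paths; the properties and values listed are hypothetical.

from itertools import product, combinations

# Each "cube variant" is a tuple of property values; all values are hypothetical.
properties = {
    "subdivision": ["none", "halved", "quartered"],
    "opening":     ["closed", "one face open"],
    "orientation": ["orthogonal", "rotated 45"],
}

variants = list(product(*properties.values()))

def differs_in_one(a, b):
    # Two variants are linked if they differ in exactly one property value.
    return sum(x != y for x, y in zip(a, b)) == 1

# Edges of the network: pairs of variants reachable by changing one property.
edges = [(a, b) for a, b in combinations(variants, 2) if differs_in_one(a, b)]

print(len(variants), "variants,", len(edges), "single-step links")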

_id eabb
authors Boeykens, St. Geebelen, B. and Neuckermans, H.
year 2002
title Design phase transitions in object-oriented modeling of architecture
source Connecting the Real and the Virtual - design e-ducation [20th eCAADe Conference Proceedings / ISBN 0-9541183-0-8] Warsaw (Poland) 18-20 September 2002, pp. 310-313
doi https://doi.org/10.52842/conf.ecaade.2002.310
summary The project IDEA+ aims to develop an “Integrated Design Environment for Architecture”. Its goal is to provide a tool for the designer-architect that can be of assistance in the early design phases. It should provide the possibility to perform tests (like heat or cost calculations) and simple simulations in the different (early) design phases, without the need for a fully detailed design or remodeling in a different application. The test for daylighting is already in development (Geebelen, to be published). The conceptual foundation for this design environment has been laid out in a scheme in which different design phases and scales are defined, together with appropriate tests at the different levels (Neuckermans, 1992). It is a translation of the “designerly” way of thinking of the architect (Cross, 1982). This conceptual model has been translated into a “Core Object Model” (Hendricx, 2000), which defines a structured object model to describe the necessary building model. These developments form the theoretical basis for the implementation of IDEA+ (both the data structure and prototype software), which is currently in progress. The research project addresses some issues which are at the forefront of the architect’s interest while designing with CAAD. These are treated from the point of view of a practicing architect.
series eCAADe
last changed 2022/06/07 07:52
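
A minimal sketch, in the spirit of the Boeykens et al. abstract above, of an early-design model that attaches simple tests to a design phase without a fully detailed building model; the class names and tests are hypothetical and are not the actual IDEA+ Core Object Model.

from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class Space:
    name: str
    floor_area_m2: float
    window_area_m2: float


@dataclass
class SketchDesign:
    spaces: List[Space] = field(default_factory=list)
    # tests registered per design phase
    tests: Dict[str, List[Callable[["SketchDesign"], str]]] = field(default_factory=dict)

    def add_test(self, phase: str, test: Callable[["SketchDesign"], str]) -> None:
        self.tests.setdefault(phase, []).append(test)

    def run_phase(self, phase: str) -> List[str]:
        return [t(self) for t in self.tests.get(phase, [])]


def rough_cost(design: SketchDesign) -> str:
    # Hypothetical unit rate, for illustration only.
    area = sum(s.floor_area_m2 for s in design.spaces)
    return f"rough cost: {area * 1500:.0f} EUR (1500 EUR/m2 assumed)"

def daylight_ratio(design: SketchDesign) -> str:
    # Crude early-phase daylight check on the least-glazed space.
    worst = min(s.window_area_m2 / s.floor_area_m2 for s in design.spaces)
    return f"worst glazing/floor ratio: {worst:.2f} (>= 0.10 assumed acceptable)"


design = SketchDesign([Space("studio", 40, 6), Space("kitchen", 12, 1)])
design.add_test("sketch", rough_cost)
design.add_test("sketch", daylight_ratio)
print(design.run_phase("sketch"))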

_id 91c4
authors Checkland, P.
year 1981
title Systems Thinking, Systems Practice
source John Wiley & Sons, Chichester
summary "Whether by design, accident or merely synchronicity, Checkland appears to have developed a habit of writing seminal publications near the start of each decade which establish the basis and framework for systems methodology research for that decade." Hamish Rennie, Journal of the Operational Research Society, 1992. Thirty years ago Peter Checkland set out to test whether the Systems Engineering (SE) approach, highly successful in technical problems, could be used by managers coping with the unfolding complexities of organizational life. The straightforward transfer of SE to the broader situations of management was not possible, but by insisting on a combination of systems thinking strongly linked to real-world practice Checkland and his collaborators developed an alternative approach - Soft Systems Methodology (SSM) - which enables managers of all kinds and at any level to deal with the subtleties and confusions of the situations they face. This work established the now accepted distinction between hard systems thinking, in which parts of the world are taken to be systems which can be engineered, and soft systems thinking in which the focus is on making sure the process of inquiry into real-world complexity is itself a system for learning. Systems Thinking, Systems Practice (1981) and Soft Systems Methodology in Action (1990) together with an earlier paper Towards a Systems-based Methodology for Real-World Problem Solving (1972) have long been recognized as classics in the field. Now Peter Checkland has looked back over the three decades of SSM development, brought the account of it up to date, and reflected on the whole evolutionary process which has produced a mature SSM. SSM: A 30-Year Retrospective, here included with Systems Thinking, Systems Practice, closes a chapter on what is undoubtedly the most significant single research programme on the use of systems ideas in problem solving. Now retired from full-time university work, Peter Checkland continues his research as a Leverhulme Emeritus Fellow.
series other
last changed 2003/04/23 15:14

_id c434
authors Colajanni, B., Pellitteri, G. and Scianna, A.
year 1992
title Two Approaches to Teaching Computers in Architecture: The Experience in the Faculty of Engineering in Palermo, Italy
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 295-306
doi https://doi.org/10.52842/conf.ecaade.1992.295
summary Teaching the use of computers in architecture poses the same kind of problems as teaching mathematics. To both there are two possible approaches. The first presents the discipline as a tool of which only the instrumental aspect is emphasized: teaching is limited to showing the results obtainable with existing programs and how to get them. The second approach, on the contrary, emphasizes the autonomous nature of the discipline, mathematics as much as computing, on the basis of the conviction that the maximum of instrumental usefulness can be obtained through knowledge at the highest degree of generality and, hence, of abstraction. The first approach changes little in the mind of the student. He simply learns that it is possible, and therefore worth doing, a certain number of operations, mainly performance checks (and not only control of appearance, now easy with one of the many existing CAD packages) or searches for technical information in some database. The second approach gives the student an awareness of the manageability of abstract structures of relationships. He then acquires the idea of creating particular structures of relationships by himself and managing them. This can modify the very idea of the design procedure, giving the student the awareness that he can intervene directly in every segment of the design procedure, reshaping it to some extent in a way better suited to the particular problem he is dealing with. Of course this second approach implies learning not only a language but also the capability of coming to terms with languages; and again this is a cultural acquisition that can be very useful when applied to the languages of architecture. Furthermore, the capability of simulating on the computer even a small segment of the design process gives the student a better understanding both of the particular problem he is dealing with and of the very nature of design. The first effect occurs whenever a translation is made from one language to another: one is obliged to get to the core of the matter in order to overcome the difficulties arising from the different biases of the two languages. The second effect comes from the necessity of placing the studied segment in the general flow of the design process. Organising in a linear sequence actions to be accomplished recursively, in an order that varies with every design occasion, is an extremely useful exercise for understanding the signification and the techniques of formalisation of design problems.
series eCAADe
last changed 2022/06/07 07:56

_id a93f
authors Eisenman, P.
year 1992
title Visions unfolding: architecture in the age of electronic media
source Domus, 1/92
summary During the fifty years since the Second World War, a paradigm shift has taken place that should have profoundly affected architecture: this was the shift from the mechanical paradigm to the electronic one. This change can be simply understood by comparing the impact of the role of the human subject on such primary modes of reproduction as the photograph and the fax; the photograph within the mechanical paradigm, the fax within the electronic one. In photographic reproduction the subject still maintains a controlled interaction with the object. A photograph can be developed with more or less contrast, texture or clarity.
series journal paper
last changed 2003/04/23 15:50

_id 68c8
authors Flemming, U., Coyne, R. and Fenves, S. (et al.)
year 1994
title SEED: A Software Environment to Support the Early Phases in Building Design
source Proceedings of IKM '94, Weimar, Germany, pp. 5-10
summary The SEED project intends to develop a software environment that supports the early phases in building design (Flemming et al., 1993). The goal is to provide support, in principle, for the preliminary design of buildings in all aspects that can gain from computer support. This includes using the computer not only for analysis and evaluation, but also more actively for the generation of designs, or more accurately, for the rapid generation of design representations. A major motivation for the development of SEED is to bring the results of two multi-generational research efforts focusing on `generative' design systems closer to practice: 1. LOOS/ABLOOS, a generative system for the synthesis of layouts of rectangles (Flemming et al., 1988; Flemming, 1989; Coyne and Flemming, 1990; Coyne, 1991); 2. GENESIS, a rule-based system that supports the generation of assemblies of 3-dimensional solids (Heisserman, 1991; Heisserman and Woodbury, 1993). The rapid generation of design representations can take advantage of special opportunities when it deals with a recurring building type, that is, a building type dealt with frequently by the users of the system. Design firms - from housing manufacturers to government agencies - accumulate considerable experience with recurring building types. But current CAD systems capture this experience and support its reuse only marginally. SEED intends to provide systematic support for the storing and retrieval of past solutions and their adaptation to similar problem situations. This motivation aligns aspects of SEED closely with current work in Artificial Intelligence that focuses on case-based design (see, for example, Kolodner, 1991; Domeshek and Kolodner, 1992; Hua et al., 1992).
series other
last changed 2003/04/23 15:14
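
A minimal sketch of the case-based idea mentioned at the end of the SEED abstract above: storing past design solutions as cases and retrieving the closest one by weighted feature similarity; the cases, features and weights are hypothetical, and this is not SEED's actual mechanism.

CASES = [
    {"name": "fire station A", "bays": 3, "floors": 1, "site_width_m": 30, "layout": "linear"},
    {"name": "fire station B", "bays": 4, "floors": 2, "site_width_m": 22, "layout": "L-shaped"},
    {"name": "fire station C", "bays": 2, "floors": 1, "site_width_m": 18, "layout": "compact"},
]

WEIGHTS = {"bays": 1.0, "floors": 0.5, "site_width_m": 0.1}

def distance(brief, case):
    # Weighted absolute difference over the shared numeric features.
    return sum(w * abs(brief[k] - case[k]) for k, w in WEIGHTS.items())

def retrieve(brief):
    # Return the stored case closest to the new problem description.
    return min(CASES, key=lambda c: distance(brief, c))

brief = {"bays": 3, "floors": 2, "site_width_m": 24}
best = retrieve(brief)
print("closest past solution:", best["name"], "->", best["layout"])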
