CumInCAD is a Cumulative Index of publications in Computer Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD Futures.


Hits 1 to 20 of 4813

_id ecaadesigradi2019_449
authors Becerra Santacruz, Axel
year 2019
title The Architecture of ScarCity Game - The craft and the digital as an alternative design process
doi https://doi.org/10.52842/conf.ecaade.2019.3.045
source Sousa, JP, Xavier, JP and Castro Henriques, G (eds.), Architecture in the Age of the 4th Industrial Revolution - Proceedings of the 37th eCAADe and 23rd SIGraDi Conference - Volume 3, University of Porto, Porto, Portugal, 11-13 September 2019, pp. 45-52
summary The Architecture of ScarCity Game is a board game used as a pedagogical tool that challenges architecture students by involving them in a series of experimental design sessions to understand the design process under scarcity and the relation between the craft and the digital. This means "pragmatic delivery processes and material constraints, where the exchange between the artisan of handmade, representing local skills and technology of the digitally conceived is explored" (Huang 2013). The game focuses on understanding the different variables of the crafted design process of traditional communities under conditions of scarcity (Michel and Bevan 1992). This requires first analyzing the spatial environmental model of interaction, the available human and natural resources, and the dynamic relationship of these variables in a digital era. In the first stage (Pre-Agency), the game sets up the concept of the craft by limiting students' design exploration to a minimal perspective, developing locally available resources and techniques. The key elements of the design process of traditional knowledge communities have to be identified (Preez 1984). In other words, this stage is driven by limited resources + chance + contingency. In the second stage (Post-Agency), students, taking the architect's role within these communities, have to speculate and explore the interface between the craft (local knowledge and low-technological tools) and the digital, represented by computational data, newly available technologies and construction. This means the introduction of strategy + opportunity + chance as part of the design process. In this sense, the game has a life beyond its mechanics. This other life challenges the participants to exploit the possibilities of breaking the actual boundaries of design. The result is a tool to challenge conventional methods of teaching and learning that control a prescribed design process.
It confronts the rules that professionals in this field take for granted. The game simulates a 'fake' reality by exploring surveyed information in different ways. As a result, participants do not have anything 'real' to lose. Instead, they have all the freedom to innovate and be creative.
keywords Global south, scarcity, low tech, digital-craft, design process and innovation by challenge.
series eCAADeSIGraDi
email
last changed 2022/06/07 07:54

_id aa78
authors Bayazit, Nigan
year 1992
title Requirements of an Expert System for Design Studios
doi https://doi.org/10.52842/conf.ecaade.1992.187
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 187-194
summary The goal of this paper is to study the problems of the transition from traditional architectural studio teaching to CAAD studio teaching, which requires a CAAD expert system as studio tutor, and to study the behavior of the student in this new environment. The differences between traditional and computerized studio teaching, and the experiences in this field, are explained with reference to the requirements for design time in relation to the student's expertise in the application of a CAD program. Learning styles and the process of design in computerized and non-computerized studio teaching are discussed. The design studio requirements of students in the traditional studio environment while doing design work are clarified on the basis of the results of an empirical study which examined the relations between tutor and student during studio critiques. The main complaints of the students raised in the empirical study were the lack of data in the specific design problem area, difficulties in realizing ideas and thoughts, not knowing the starting point of design, having no information about the references to be used for the specific design task, and difficulties in the application of presentation techniques. The concluding parts of the paper discuss the different styles of teaching and their relation to the CAAD environment, the transformation of instructional programs for the new design environment, future expectations from CAAD programs, properties of the new teaching environment, and the roles of expert systems in design studio education.

keywords CAAD Education, Expert System, Architectural Design Studio, Knowledge Acquisition
series eCAADe
email
last changed 2022/06/07 07:54

_id cef3
authors Bridges, Alan H.
year 1992
title Computing and Problem Based Learning at Delft University of Technology Faculty of Architecture
doi https://doi.org/10.52842/conf.ecaade.1992.289
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 289-294
summary Delft University of Technology, founded in 1842, is the oldest and largest technical university in the Netherlands. It provides education for more than 13,000 students in fifteen main subject areas. The Faculty of Architecture, Housing, Urban Design and Planning is one of the largest faculties of the DUT with some 2000 students and over 500 staff members. The course of study takes four academic years: a first year (Propaedeuse) and a further three years (Doctoraal) leading to the "ingenieur" qualification. The basic course material is delivered in the first two years and is taken by all students. The third and fourth years consist of a smaller number of compulsory subjects in each of the department's specialist areas together with a wide range of option choices. The five main subject areas the students may choose from for their specialisation are Architecture, Building and Project Management, Building Technology, Urban Design and Planning, and Housing.

The curriculum of the Faculty has been radically revised over the last two years and is now based on the concept of "Problem-Based Learning". The subject matter taught is divided thematically into specific issues that are taught in six week blocks. The vehicles for these blocks are specially selected and adapted case studies prepared by teams of staff members. These provide a focus for integrating specialist subjects around a studio based design theme. In the case of second year this studio is largely computer-based: many drawings are produced by computer and several specially written computer applications are used in association with the specialist inputs.

This paper describes the "block structure" used in second year, giving examples of the special computer programs used, but also raises a number of broader educational issues. Introduction of the block system arose as a method of curriculum integration in response to difficulties emerging from the independent functioning of strong discipline areas in the traditional work groups. The need for a greater level of self-directed learning was recognised as opposed to the "passive information model" of student learning in which the students are seen as empty vessels to be filled with knowledge - which they are then usually unable to apply in design-related contexts in the studio. Furthermore, the value of electives had been questioned: whilst enabling some diversity of choice, they may also be seen as diverting attention and resources from the real problems of teaching architecture.

series eCAADe
email
last changed 2022/06/07 07:54

_id 4857
authors Escola Tecnica Superior D'arquitectura de Barcelona (Ed.)
year 1992
title CAAD Instruction: The New Teaching of an Architect?
doi https://doi.org/10.52842/conf.ecaade.1992
source eCAADe Conference Proceedings / Barcelona (Spain) 12-14 November 1992, 551 p.
summary The involvement of computer graphic systems in the transmission of knowledge in the areas of urban planning and architectural design will bring a significant change to the didactic programs and methods of those schools which have decided to adopt these new instruments. Workshops of urban planning and architectural design will have to modify their structures, and teaching teams will have to revise their current programs. Some European schools and faculties of architecture have taken steps in this direction. Others are willing to join them.

This process is only delayed by the scarcity of material resources, and by the slowness with which a sufficient number of teachers are adopting these methods.

ECAADE has set out to analyze the state of this issue during its next conference, and it will be discussed from various points of view. From this confrontation of ideas will come, surely, the guidelines for progress in the years to come.

The different sessions will be grouped together following these four themes:

(A.) Multimedia and Course Work / State of the art of the synthesis of graphical and textual information favored by new available multimedia computer programs. Their repercussions on academic programs. (B.) The New Design Studio / Physical characteristics, data concentration and accessibility of a computerized studio can be better approached in a computerized workshop. (C.) How to manage the new education system / Problems and possibilities raised, from the practical and organizational points of view, of architectural education by the introduction of computers in the classrooms. (D.) CAAI. Formal versus informal structure / How will the traditional teaching structure be affected by the incidence of these new systems in which the access to knowledge and information can be obtained in a random way and guided by personal and subjective criteria.

series eCAADe
email
last changed 2022/06/07 07:49

_id 7ce5
authors Gal, Shahaf
year 1992
title Computers and Design Activities: Their Mediating Role in Engineering Education
source Sociomedia, ed. Edward Barret. MIT Press
summary Sociomedia: With all the new words used to describe electronic communication (multimedia, hypertext, cyberspace, etc.), do we need another one? Edward Barrett thinks we do; hence, he coins the term "sociomedia." It is meant to displace a computing economy in which technicity is hypostasized over sociality. Sociomedia, a compilation of twenty-five articles on the theory, design and practice of educational multimedia and hypermedia, attempts to re-value the communicational face of computing. Value, of course, is "ultimately a social construct." As such, it has everything to do with knowledge, power, education and technology. The projects discussed in this book represent the leading edge of electronic knowledge production in academia (not to mention major funding) and are determining the future of educational media. For these reasons, Sociomedia warrants close inspection. Barrett's introduction sets the tone. For him, designing computer media involves hardwiring a mechanism for the social construction of knowledge (1). He links computing to a process of social and communicative interactivity for constructing and disseminating knowledge. Through a mechanistic mapping of the university as hypercontext (a huge network that includes classrooms as well as services and offices), Barrett models intellectual work in such a way as to avoid "limiting definitions of human nature or human development." Education, then, can remain "where it should be--in the human domain (public and private) of sharing ideas and information through the medium of language." By leaving education in a virtual realm (where we can continue to disagree about its meaning and execution), it remains viral, mutating and contaminating in an intellectually healthy way. He concludes that his mechanistic model, by means of its reductionist approach, preserves value (7). This "value" is the social construction of knowledge.
While I support the social orientation of Barrett's argument, discussions of value are related to power. I am not referring to the traditional teacher-student power structure that is supposedly dismantled through cooperative and constructivist learning strategies. The power to be reckoned with in the educational arena is foundational, that which (pre)determines value and the circulation of knowledge. "Since each of you reading this paragraph has a different perspective on the meaning of 'education' or 'learning,' and on the processes involved in 'getting an education,' think of the hybris in trying to capture education in a programmable function, in a displayable object, in a 'teaching machine'" (7). Actually, we must think about that hybris because it is, precisely, what informs teaching machines. Moreover, the basic epistemological premises that give rise to such productions are too often assumed. In the case of instructional design, the episteme of cognitive sciences are often taken for granted. It is ironic that many of the "postmodernists" who support electronic hypertextuality seem to have missed Jacques Derrida's and Michel Foucault's "deconstructions" of the epistemology underpinning cognitive sciences (if not of epistemology itself). Perhaps it is the glitz of the technology that blinds some users (qua developers) to the belief systems operating beneath the surface. Barrett is not guilty of reactionary thinking or politics; he is, in fact, quite in line with much American deconstructive and postmodern thinking. The problem arises in that he leaves open the definitions of "education," "learning" and "getting an education." One cannot engage in the production of new knowledge without orienting its design, production and dissemination, and without negotiating with others' orientations, especially where largescale funding is involved. Notions of human nature and development are structural, even infrastructural, whatever the medium of the teaching machine. 
Although he addresses some dynamics of power, money and politics when he talks about the recession and its effects on the conference, they are readily visible dynamics of power (3-4). Where does the critical factor of value determination, of power, of who gets what and why, get mapped onto a mechanistic model of learning institutions? Perhaps a mapping of contributors' institutions, of the funding sources for the projects showcased and for participation in the conference, and of the disciplines receiving funding for these sorts of projects would help visualize the configurations of power operative in the rising field of educational multimedia. Questions of power and money notwithstanding, Barrett's introduction sets the social and textual thematics for the collection of essays. His stress on interactivity, on communal knowledge production, on the society of texts, and on media producers and users is carried forward through the other essays, two of which I will discuss. Section I of the book, "Perspectives...," highlights the foundations, uses and possible consequences of multimedia and hypertextuality. The second essay in this section, "Is There a Class in This Text?," plays on the robust exchange surrounding Stanley Fish's book, Is There a Text in This Class?, which presents an attack on authority in reading. The author, John Slatin, has introduced electronic hypertextuality and interaction into his courses. His article maps the transformations in "the content and nature of work, and the workplace itself"-- which, in this case, is not industry but an English poetry class (25). Slatin discovered an increase of productive and cooperative learning in his electronically-mediated classroom. For him, creating knowledge in the electronic classroom involves interaction between students, instructors and course materials through the medium of interactive written discourse.
These interactions lead to a new and persistent understanding of the course materials and of the participants' relation to the materials and to one another. The work of the course is to build relationships that, in my view, constitute not only the meaning of individual poems, but poetry itself. The class carries out its work in the continual and usually interactive production of text (31). While I applaud his strategies which dismantle traditional hierarchical structures in academia, the evidence does not convince me that the students know enough to ask important questions or to form a self-directing, learning community. Stanley Fish has not relinquished professing, though he, too, espouses the indeterminacy of the sign. By the fourth week of his course, Slatin's input is, by his own reckoning, reduced to 4% (39). In the transcript of the "controversial" Week 6 exchange on Gertrude Stein--the most disliked poet they were discussing at the time (40)--we see the blind leading the blind. One student parodies Stein for three lines and sums up his input with "I like it." Another finds Stein's poetry "almost completey [sic] lacking in emotion or any artistic merit" (emphasis added). On what grounds has this student become an arbiter of "artistic merit"? Another student, after admitting being "lost" during the Wallace Stevens discussion, talks of having more "respect for Stevens' work than Stein's" and adds that Stein's poetry lacks "conceptual significance[, s]omething which people of varied opinion can intelligently discuss without feeling like total dimwits...." This student has progressed from admitted incomprehension of Stevens' work to imposing her (groundless) respect for his work over Stein's. Then, she exposes her real dislike for Stein's poetry: that she (the student) missed the "conceptual significance" and hence cannot, being a person "of varied opinion," intelligently discuss it "without feeling like [a] total dimwit."
Slatin's comment is frightening: "...by this point in the semester students have come to feel increasingly free to challenge the instructor" (41). The students that I have cited are neither thinking critically nor are their preconceptions challenged by student-governed interaction. Thanks to the class format, one student feels self-righteous in her ignorance, and empowered to censure. I believe strongly in student empowerment in the classroom, but only once students have accrued enough knowledge to make informed judgments. Admittedly, Slatin's essay presents only partial data (there are six hundred pages of course transcripts!); still, I wonder how much valuable knowledge and metaknowledge was gained by the students. I also question the extent to which authority and professorial dictatorship were addressed in this course format. The power structures that make it possible for a college to require such a course, and the choice of texts and pedagogy, were not "on the table." The traditional professorial position may have been displaced, but what took its place?--the authority of consensus with its unidentifiable strong arm, and the faceless reign of software design? Despite Slatin's claim that the students learned about the learning process, there is no evidence (in the article) that the students considered where their attitudes came from, how consensus operates in the construction of knowledge, how power is established and what relationship they have to bureaucratic institutions. How do we, as teaching professionals, negotiate a balance between an enlightened despotism in education and student-created knowledge? Slatin, and other authors in this book, bring this fundamental question to the fore. There is no definitive answer because the factors involved are ultimately social, and hence, always shifting and reconfiguring.
Slatin ends his article with the caveat that computerization can bring about greater estrangement between students, faculty and administration through greater regimentation and control. Of course, it can also "distribute authority and power more widely" (50). Power or authority without a specific face, however, is not necessarily good or just. Shahaf Gal's "Computers and Design Activities: Their Mediating Role in Engineering Education" is found in the second half of the volume, and does not allow for a theory/praxis dichotomy. Gal recounts a brief history of engineering education up to the introduction of Growltiger (GT), a computer-assisted learning aid for design. He demonstrates GT's potential to impact the learning of engineering design by tracking its use by four students in a bridge-building contest. What his text demonstrates clearly is that computers are "inscribing and imaging devices" that add another viewpoint to an on-going dialogue between student, teacher, earlier coursework, and other teaching/learning tools. The less proficient students made a serious error by relying too heavily on the technology, or treating it as a "blueprint provider." They "interacted with GT in a way that trusted the data to represent reality. They did not see their interaction with GT as a negotiation between two knowledge systems" (495). Students who were more thoroughly informed in engineering discourses knew to use the technology as one voice among others--they knew enough not simply to accept the input of the computer as authoritative. The less-advanced students learned a valuable lesson from the competition itself: the fact that their designs were not able to hold up under pressure (literally) brought the fact of their insufficient knowledge crashing down on them (and their bridges). They also had, post factum, several other designs to study, especially the winning one. 
Although competition and comparison are not good pedagogical strategies for everyone (in this case the competitors had volunteered), at some point what we think we know has to be challenged within the society of discourses to which it belongs. Students need critique in order to learn to push their learning into auto-critique. This is what is lacking in Slatin's discussion and in the writings of other avatars of constructivist, collaborative and computer-mediated pedagogies. Obviously there are differences between instrumental types of knowledge acquisition and discursive knowledge accumulation. Indeed, I do not promote the teaching of reading, thinking and writing as "skills" per se (then again, Gal's teaching of design is quite discursive, if not dialogic). Nevertheless, the "soft" sciences might benefit from "bridge-building" competitions or the re-institution of some forms of agonia. Not everything agonistic is inhuman agony--the joy of confronting or creating a sound argument supported by defensible evidence, for example. Students need to know that soundbites are not sound arguments despite predictions that electronic writing will be aphoristic rather than periodic. Just because writing and learning can be conceived of hypertextually does not mean that rigor goes the way of the dinosaur. Rigor and hypertextuality are not mutually incompatible. Nor are rigorous thinking and hard intellectual work unpleasurable, although American anti-intellectualism, especially in the mass media, would make it so. At a time when the spurious dogmatics of a Rush Limbaugh and Holocaust revisionist historians circulate "aphoristically" in cyberspace, and at a time when knowledge is becoming increasingly textualized, the role of critical thinking in education will ultimately determine the value(s) of socially constructed knowledge.
This volume affords the reader an opportunity to reconsider knowledge, power, and new communications technologies with respect to social dynamics and power relationships.
series other
last changed 2003/04/23 15:14

_id e7c8
authors Kalisperis, Loukas N., Steinman, Mitch and Summers, Luis H.
year 1992
title Design Knowledge, Environmental Complexity in Nonorthogonal Space
source New York: John Wiley & Sons, 1992. pp. 273-291 : ill. includes bibliography
summary Mechanization and industrialization of society has resulted in most people spending the greater part of their lives in enclosed environments. Optimal design of indoor artificial climates is therefore of increasing importance. Wherever artificial climates are created for human occupation, the aim is that the environment be designed so that individuals are in thermal comfort. Current design methodologies for radiant panel heating systems do not adequately account for the complexities of human thermal comfort, because they monitor air temperature alone and do not account for thermal neutrality in complex enclosures. Thermal comfort for a person is defined as that condition of mind which expresses satisfaction with the thermal environment. Thermal comfort is dependent on Mean Radiant Temperature and Operative Temperature, among other factors. In designing artificial climates for human occupancy, the interaction of the human with the heated surfaces as well as the surface-to-surface heat exchange must be accounted for. Early work in the area provided an elaborate and difficult method for calculating radiant heat exchange for simple, orthogonal enclosures. A new, improved method developed by the authors for designing radiant panel heating systems based on human thermal comfort and mean radiant temperature is presented. Through automation and elaboration this method overcomes the limitations of the early work. The design procedure accounts for human thermal comfort in nonorthogonal as well as orthogonal spaces based on mean radiant temperature prediction. The limitation of simplistic orthogonal geometries has been overcome with the introduction of the MRT-Correction method and inclined surface-to-person shape factor methodology.
The new design method increases the accuracy of calculation and prediction of human thermal comfort and will allow designers to simulate complex enclosures utilizing the latest design knowledge of radiant heat exchange to increase human thermal comfort.
keywords applications, architecture, building, energy, systems, design, knowledge
series CADline
last changed 2003/06/02 10:24
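The MRT-Correction method described in the preceding record is not reproduced in the abstract, but the quantity it predicts can be illustrated. The sketch below is a minimal, generic computation of mean radiant temperature from surface temperatures and person-to-surface view factors (the standard fourth-power angle-factor weighting), assuming the view factors sum to one; it is not the authors' method.

```python
# A minimal sketch of the mean-radiant-temperature (MRT) quantity discussed
# in the record above. NOT the authors' MRT-Correction method; just the
# standard angle-factor-weighted formulation, with surface temperatures in
# deg C and person-to-surface view factors assumed to sum to 1.

def mean_radiant_temperature(surface_temps_c, view_factors):
    """Angle-factor-weighted MRT (fourth-power law); result in deg C."""
    if abs(sum(view_factors) - 1.0) > 1e-6:
        raise ValueError("person-to-surface view factors must sum to 1")
    # Work in Kelvin: T_mrt^4 = sum_i F_i * T_i^4
    t4 = sum(f * (t + 273.15) ** 4
             for t, f in zip(surface_temps_c, view_factors))
    return t4 ** 0.25 - 273.15

# Example: a radiant panel at 40 C occupying 20% of the person's view,
# remaining surfaces at 18 C; the MRT falls between the two temperatures.
mrt = mean_radiant_temperature([40.0, 18.0], [0.2, 0.8])
```

For nonorthogonal enclosures, the hard part (which the record's shape-factor methodology addresses) is computing the view factors themselves; the weighting step above stays the same.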

_id e8f0
authors Mackey, David L.
year 1992
title Mission Possible: Computer Aided Design for Everyone
doi https://doi.org/10.52842/conf.acadia.1992.065
source Mission - Method - Madness [ACADIA Conference Proceedings / ISBN 1-880250-01-2] 1992, pp. 65-73
summary A pragmatic model for the building of an electronic architectural design curriculum which will offer students and faculty the opportunity to fully integrate information age technologies into the educational experience is becoming increasingly desirable.

The majority of architectural programs teach technology topics through content specific courses which appear as an educational sequence within the curriculum. These technology topics have traditionally included structural design, environmental systems, and construction materials and methods. Likewise, that course model has been broadly applied to the teaching of computer aided design, which is identified as a technology topic. Computer technology has resulted in a proliferation of courses which similarly introduce the student to computer graphic and design systems through a traditional course structure.

Inevitably, competition for priority arises within the curriculum, introducing the potential risk that otherwise valuable courses and/or course content will be replaced by the "newer" technology, and providing fertile ground for faculty and administrative resistance to computerization as traditional courses are pushed aside or seem threatened.

An alternative view is that computer technology is not a "topic", but rather the medium for creating a design (and studio) environment for informed decision making: deciding what it is we should build. Such a viewpoint urges the development of a curricular structure, through which the impact of computer technology may be understood as that medium for design decision making, as the initial step in addressing the current and future needs of architectural education.

One example of such a program currently in place at the College of Architecture and Planning, Ball State University takes an approach which overlays, like a transparent tissue, the computer aided design content (or a computer emphasis) onto the primary curriculum.

With the exception of a general introductory course at the freshman level, computer instruction and content issues may be addressed effectively within existing studio courses. The level of operational and conceptual proficiency achieved by the student, within an electronic design studio, makes the electronic design environment self-sustaining and maintainable across the entire curriculum. The ability to broadly apply computer aided design to the educational experience can be independent of the availability of many specialized computer aided design faculty.

series ACADIA
last changed 2022/06/07 07:59

_id 9d0c
authors McVey, G., McCrobie, D., Evans, D., McIlvaine Parsons, D., Templar, J., Konz, S. and Caldwell, B.
year 1992
title Interactions between Environmental Design and Human Factors Specialists (Panel)
source Proceedings of the Human Factors Society 36th Annual Meeting 1992 v.1 pp. 575-577
summary Most of the interactions between human factors specialists, such as ergonomists, and environmental specialists, such as facility planners and architects, tend to be task specific and do not follow any accepted process. Consequently, the success of such interactions is usually a function of serendipity rather than informed expectation. It is anticipated that by gathering such specialists in an open discussion, relevant issues may be addressed and successful interaction procedures introduced and discussed. Such a forum is desirable for developing an understanding of the differences, educational and operational, between environmental design specialists and human factors specialists, as well as for exploring the ways their communications can be enhanced. It is anticipated that by sharing their experiences with the attendees, the presenters will identify relevant on-going knowledge transfer activities, and also introduce and discuss practical problem-solving and communication methods that can be used with assurance by the attendees themselves when faced with similar problems in the future. This panel will focus on issues that arise out of situations where human factors specialists and environmental design specialists are joined together in project development. The specialties represented include architecture, facility planning, environmental psychology, ergonomic research, industrial design and engineering, and equipment and furniture design and manufacturing.
series other
last changed 2002/07/07 16:01

_id 2c22
authors O'Neill, Michael J.
year 1992
title Neural Network Simulation as a Computer-Aided Design Tool for Predicting Wayfinding Performance
source New York: John Wiley & Sons, 1992. pp. 347-366 : ill. includes bibliography
summary Complex public facilities such as libraries, hospitals, and governmental buildings often present problems to users who must find their way through them. Research shows that difficulty in wayfinding has costs in terms of time, money, public safety, and stress that results from being lost. While a wide range of architectural research supports the notion that ease of wayfinding should be a criterion for good design, architects have no method for evaluating how well their building designs will support the wayfinding task. People store and retrieve information about the layout of the built environment in a knowledge representation known as the cognitive map. People depend on the information stored in the cognitive map to find their way through buildings. Although there are numerous simulations of the cognitive map, the mechanisms of these models are not constrained by what is known about the neurophysiology of the brain. Rather, these models incorporate search mechanisms that act on semantically encoded information about the environment. In this paper the author describes the evaluation and application of an artificial neural network simulation of the cognitive map as a means of predicting wayfinding behavior in buildings. This simulation is called NAPS-PC (Network Activity Processing Simulator--PC version). This physiologically plausible model represents knowledge about the layout of the environment through a network of inter-connected processing elements. The performance of NAPS-PC was evaluated against actual human wayfinding performance. The study found that the simulation generated behavior that matched the performance of human participants. After the validation, NAPS-PC was modified so that it could read environmental information directly from AutoCAD (a popular micro-computer-based CAD software package) drawing files, and perform 'wayfinding' tasks based on that environmental information. 
This prototype tool, called AutoNet, is conceptualized as a means of allowing designers to predict the wayfinding performance of users in a building before it is actually built.
keywords simulation, cognition, neural networks, evaluation, floor plans, applications, wayfinding, layout, building
series CADline
last changed 2003/06/02 13:58

_id 61e0
authors Streich, Bernd
year 1992
title Should We Integrate Programming Knowledge into the Architect's CAAD-Education? Basic Considerations and Experiences from Kaiserslautern
doi https://doi.org/10.52842/conf.ecaade.1992.399
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 399-409
summary At the eCAADe congress 1991 in Munich, the teaching concept of computer-aided architectural design of the faculty of architecture and environmental/urban planning at the University of Kaiserslautern was presented. On that occasion, the question arose whether the curriculum should include programming knowledge. In this paper, the discussion is taken up again with several arguments in favour of computer programming instruction. First, a survey of the current discussion of the subject is given; then follow some reflections on the theoretical relationship between designing and programming; finally, examples from the teaching experience in Kaiserslautern are presented.

series eCAADe
email
last changed 2022/06/07 07:56

_id aab6
authors Bermudez, Julio
year 1995
title Designing Architectural Experiences: Using Computers to Construct Temporal 3D Narratives
doi https://doi.org/10.52842/conf.acadia.1995.139
source Computing in Design - Enabling, Capturing and Sharing Ideas [ACADIA Conference Proceedings / ISBN 1-880250-04-7] University of Washington (Seattle, Washington / USA) October 19-22, 1995, pp. 139-149
summary Computers are launching us into a representational revolution that fundamentally challenges the way we have hitherto conceived and practiced architecture. This paper will explore one of its fronts: the simulation of architectural experiences. Today's off-the-shelf software (e.g. 3D modeling, animation, multimedia) allows us for the first time in history to depict and thus approach architectural design and criticism truly experientially. What is so appealing about this is the possibility of shifting our attention from the object to the experience of the object and, in so doing, reconceptualizing architectural design as the design of architectural experiences. Carrying forward such a phenomenological proposition requires us to know (1) how to work with non-traditional and 'quasi-immersive' (or subject-centered) representational systems, and (2) how to construct temporal assemblages of experiential events that unfold not unlike 'architectural stories'. As our discipline lacks sufficient knowledge in this area, importing models from other fields appears to be an appropriate starting point. In this sense, the narrative arts (especially those involved with the temporal representation of audio-visual narratives) offer us the best insights. For example, principles of cinema and storytelling give us excellent guidance for designing architectural experiences that have a structuring theme (parti), a plot (order), unfolding episodes (rhythm), and special events (details). Approaching architecture as a temporal 3D narrative does transform the design process and, consequently, its results.
For instance, (1) phenomenological issues enter the decision-making process on an equal footing with functional, technological, or compositional considerations; (2) orthographic representations become secondary sources of information, mostly used for later accurate dimensioning or geometrization; (3) multi-sensory qualities beyond sight are seriously considered (particularly sound, texture, and kinesthetics); etc.
series ACADIA
email
last changed 2022/06/07 07:52

_id acadia17_138
id acadia17_138
authors Berry, Jaclyn; Park, Kat
year 2017
title A Passive System for Quantifying Indoor Space Utilization
doi https://doi.org/10.52842/conf.acadia.2017.138
source ACADIA 2017: DISCIPLINES & DISRUPTION [Proceedings of the 37th Annual Conference of the Association for Computer Aided Design in Architecture (ACADIA) ISBN 978-0-692-96506-1] Cambridge, MA, 2-4 November 2017, pp. 138-145
summary This paper presents the development of a prototype for a new sensing device for anonymously evaluating space utilization, which includes usage factors such as occupancy levels, congregation and circulation patterns. This work builds on existing methods and technology for measuring building performance, human comfort and occupant experience in post-occupancy evaluations as well as pre-design strategic planning. The ability to collect data related to utilization and occupant experience has increased significantly due to the greater accessibility of sensor systems in recent years. As a result, designers are exploring new methods to empirically verify spatial properties that have traditionally been considered more qualitative in nature. With this premise, this study challenges current strategies that rely heavily on manual data collection and survey reports. The proposed sensing device is designed to supplement the traditional manual method with a new layer of automated, unbiased data that is capable of capturing environmental and social qualities of a given space. In a controlled experiment, the authors found that the data collected from the sensing device can be extrapolated to show how layout, spatial interventions or other design factors affect circulation, congregation, productivity, and occupancy in an office setting. In the future, this sensing device could provide designers with real-time feedback about how their designs influence occupants’ experiences, and thus allow the designers to base what are currently intuition-based decisions on reliable data and evidence.
keywords design methods; information processing; smart buildings; IoT
series ACADIA
email
last changed 2022/06/07 07:52

_id acadia23_v3_247
id acadia23_v3_247
authors Bulman, Luke
year 2023
title Notes on a Visual Identity
source ACADIA 2023: Habits of the Anthropocene: Scarcity and Abundance in a Post-Material Economy [Volume 3: Proceedings of the 43rd Annual Conference for the Association for Computer Aided Design in Architecture (ACADIA) ISBN 979-8-9891764-1-0]. Denver, 26-28 October 2023. Edited by A. Crawford, N. Diniz, R. Beckett, J. Vanucchi, M. Swackhamer, pp. 24-32.
summary In developing the visual identity for "ACADIA 2023: Habits of the Anthropocene," our focus was on capturing the essence of extreme environmental conditions and their parallels with the challenges of the Anthropocene era. The project drew inspiration from the disorienting whiteout conditions in snowstorms, where the lack of visible shadows and horizon lines creates navigational challenges (Figure 1). This concept serves as a metaphor for the Anthropocene, a period where traditional methods of orientation and understanding are increasingly inadequate, necessitating the development of new approaches and tools.
series ACADIA
email
last changed 2024/04/17 14:00

_id 0014
authors Hsu, W. and Liu, B.
year 2000
title Conceptual design: issues and challenges
source Computer-Aided Design, Vol. 32 (14) (2000) pp. 849-850
summary Decisions made during conceptual design have significant influence on the cost, performance, reliability, safety and environmental impact of a product. It has been estimated that design decisions account for more than 75% of final product costs. It is, therefore, vital that designers have access to the right tools to support such design activities. In the early 1980s, researchers began to realize the impact of design decisions on downstream activities. As a result, different methodologies such as design for assembly, design for manufacturing and concurrent engineering have been proposed. Software tools that implement these methodologies have also been developed. However, most of these tools are only applicable in the detailed design phase. Yet, even the highest standard of detailed design cannot compensate for a poor design concept formulated at the conceptual design phase. In spite of this, few CAD tools have been developed to support conceptual design activities. This is because knowledge of the design requirements and constraints during this early phase of a product's life cycle is usually imprecise and incomplete, making it difficult to utilize computer-based systems or prototypes. However, recent advances in fields such as fuzzy logic, computational geometry, constraint programming and so on have now made it possible for researchers to tackle some of the challenging issues in dealing with conceptual design activities. In this special issue, we have gathered together discussions on various aspects of the conceptual design phase: from capturing the designer's intent, to modelling design constraints and solving them efficiently, to verifying the correctness of the design.
series journal paper
email
last changed 2003/05/15 10:54

_id 802c
authors Kalisperis, Loukas N. and Kolarevic, Branko (Eds.)
year 1995
title Computing in Design - Enabling, Capturing and Sharing Ideas [Conference Proceedings]
doi https://doi.org/10.52842/conf.acadia.1995
source ACADIA Conference Proceedings / ISBN 1-880250-04-7 / University of Washington (Seattle, Washington / USA) October 19-22, 1995, 423 p.
summary The papers collected in this volume reflect not only the conference theme of enabling, capturing, and sharing design ideas, but also ACADIA's fifteen-year-old spirit of sharing new ideas about the application and integration of computing technology in architectural education and practice. In the fifteen years of its existence, ACADIA has not only encouraged new research, but has also motivated classroom use of new approaches that incorporate digital media directly into the design process. This educational mission is particularly important as architectural computing spreads from schools' design studios into architectural offices, as the students whom we train move into the workplace and share their knowledge of the new design technologies. The papers in this volume clearly show that the capturing, enabling, and sharing of ideas are enhanced by the use of computers in design, not just in documentation and production, but more importantly from the very origination of the idea. The long-sought synergy between the "digital" and the "traditional" is slowly, but increasingly, happening in design studios and offices. Thousands of students and architects are exploring design ideas using digital technology, i.e., CAD is more or less in everyone's hands.

series ACADIA
email
last changed 2022/06/07 07:49

_id 1083
authors Wu, Rui
year 2002
title Computer Aided Dimensional Control in Building Construction
source Eindhoven University of Technology
summary Dimensional control in the building industry can be defined as the operational techniques and activities that are necessary, during the construction process of a building, for the assurance of the defined dimensional quality of a building (Hoof, 1986). Efficient and precise dimensional control of buildings under construction is becoming ever more important because of changes in the construction industry: more prefabricated components are used; more regulations appear; newly designed buildings have more complex shapes; and building construction is speeding up. To ensure the predefined dimensional quality, a plan of dimensional control must be designed, on the basis of building drawings and specifications delivered by architects, before the building is constructed. The dimensional control plan must provide site personnel with adequate information on, among other things, setting out and assembling building components, which can often be done by means of Total Stations. The essence of designing a dimensional control plan is to find out which points should be used as positioning points, which points should be set out in advance or controlled afterwards, and, not least, why. In an effort to contribute to the improvement of the dimensional control of on-site construction projects, this research tries to capture the knowledge required to design an adequate dimensional control plan, make that knowledge more generally available, and build a digital connection between CAD systems and Total Stations, focusing on prefabricated concrete building structural elements. The instrument developed in this research for capturing essential dimensional control information and knowledge makes use of Product Data Technology (PDT) and Knowledge Technology (KT). The chosen solution supports the stochastic analysis of optimal positioning points, taking into account various sorts of deviations and their mutual relationships.
The resulting information model has been written in a standardized information modelling language called UML (Unified Modelling Language). The model has been implemented in a Dimensional Control System (DCS) and applied in the “La Tour” construction project in Apeldoorn, the Netherlands. The DCS provides a digital way to bridge the floor plan design with dimensional control, predict dimensional deviation limits and output the data needed for a Total Station. The case study of “La Tour” tests the UML model and prototype of the DCS. The results prove that direct positioning of objects (by putting reflectors on the objects and using a Total Station and by inputting coordinates extracted and calculated from the AutoCAD drawings) provides higher speed, accuracy and reliability. It also shows a way to (pre)position free form objects in 3D where traditional methods cannot. In conclusion: (1) it seems to be justified to expect that the application of the DCS will contribute to increased confidence in dimensional control and the reduction of costs of failure, which potentially could support the increased use of cheaper construction methods, and will also contribute to the improvement of building design and construction process. (2) the scientific contribution of this research is a first step towards providing dimensional quality in a construction process covered by stochastic dimensional uncertainty, even for positioning of free form objects.
keywords Construction Management; Constructional Engineering; Computer Applications
series thesis:PhD
last changed 2003/02/12 22:37

_id ecaade2013_224
id ecaade2013_224
authors Xiong, Lu; Xiong, Wei and Zhang, Hongxia
year 2013
title Gulou Structure Grammar and its Computer Implementation
doi https://doi.org/10.52842/conf.ecaade.2013.2.725
source Stouffs, Rudi and Sariyildiz, Sevil (eds.), Computation and Performance – Proceedings of the 31st eCAADe Conference – Volume 2, Faculty of Architecture, Delft University of Technology, Delft, The Netherlands, 18-20 September 2013, pp. 725-733
summary Gulou is a type of building found in the ethnic Dong people's settlements in southwest China. It plays a significant role in traditional Dong architecture and shows both social and technical values. As an intangible cultural heritage, the technique faces the risk of extinction in the near future because of globalization. The paper argues that the use of formal grammar and computer tools could help the preservation and learning of the design knowledge of the Gulou structure and develop Gulou designs adapted to modern needs. A shape grammar called Gulou Structure Grammar (GSG) and its computer implementation are made to achieve the goals of capturing the design knowledge of the Gulou structure, generating new Gulou designs and promoting the education of Gulou building techniques.
wos WOS:000340643600075
keywords Gulou structure; shape grammar; parametric model; ethnic building technique.
series eCAADe
email
last changed 2022/06/07 07:57

_id ijac202321413
id ijac202321413
authors Ayoub, Mohammed
year 2023
title Estimating the received solar irradiances by traditional vaulted roofs using optimized neural networks and transfer learning
source International Journal of Architectural Computing 2023, Vol. 21 - no. 4, 795-820
summary Traditional vaulted roof-forms have long been utilized in hot-desert climates for better indoor environmental quality. This research investigates, for the first time, the possible contribution of machine learning to estimating the solar irradiances received by those roofs, based on simulation-derived training and testing datasets, where two algorithms were used to reduce their higher dimensionality. Then, four models of ordinary least-squares and artificial neural networks were developed. Their ability to accurately estimate solar irradiances was confirmed, with R2 of 95.599–98.794% and RMSE of 12.437–23.909 Wh/m2. Transfer learning was also applied to pass the stored knowledge of the best-performing model into another one for estimating the performance of new roof-forms. The results demonstrated that transferred models could provide better estimations, with R2 of 87.416–97.889% and RMSE of 13.971–79.300 Wh/m2, compared to un-transferred models. Machine learning shall redefine the practice of building performance, providing architects with the flexibility to rapidly make informed decisions during the early design stages.
keywords Solar irradiance, prediction, simulation, machine learning, transfer learning
series journal
last changed 2024/04/17 14:30

_id ddss9811
id ddss9811
authors Barbanente, A., Conte, E. and Monno, V.
year 1998
title Changing trends and approaches in human and computer modelling for social housing policies
source Timmermans, Harry (Ed.), Fourth Design and Decision Support Systems in Architecture and Urban Planning (Maastricht, the Netherlands), ISBN 90-6814-081-7, July 26-29, 1998
summary The paper discusses conceptual issues, goals and preliminary results of an on-going research project which aims at building a Decision Support System for environmentally oriented maintenance and management of public housing in a city in Southern Italy, Bari. Traditional post-war Italian housing policies are compared with more recent approaches in the field, pointing out the change from quantitative, aggregated, simpler building problems and related approaches to qualitative, differentiated, complex ones integrating social, economic and environmental dimensions with the aim of regenerating deteriorated residential areas. The paper argues for the need to shift, in both the human and computer modelling areas, from traditional quantitative models to new approaches able to also manage qualitative variables, temporal dynamics, emergencies, and intentionality, since these appear to be key aspects of the real world to be modelled. The housing estate of Bari and its maintenance and management needs are examined, eliciting essential related knowledge using the interview technique. The clear orientation towards sustainable policies for urban regeneration, at the local, national, and Community levels, is also considered. The innovative and collaborative nature of such policies and the attention to be paid to the social aspects of the problem require a complex DSS, integrating various kinds of hypertexts, information systems and case-based fuzzy expert systems, whose main aims, functions, software and general organisation are outlined in the paper.
series DDSS
last changed 2003/11/21 15:16

_id ga9924
id ga9924
authors Cardalda, Juan Jesus Romero J.J.
year 1999
title Artificial Music Composer
source International Conference on Generative Art
summary Traditional musical computation systems had to face the differences between computational techniques and the characteristics of musical creation. Characteristics such as a high degree of subjectivity, a great irrational component, and a learning process based on the use of examples and environmental absorption have made music difficult to formalize through algorithmic methods or classical Artificial Intelligence methods such as expert systems. We propose the creation of a cybernetic model of a human composer in a primeval stage of human musical evolution, following a paradigm of creating complex cognitive models based on the use of the human reference, not only from a static point of view but also considering its evolution through time. The proposed system therefore simulates musical creation in one of the first stages of musical evolution, whose main characteristics are the percussive and choral aspects. The system is based on Genetic Algorithms, whose population consists of several tribes. This model carries out the task of musical composition, guided by the user, who expresses his or her musical taste by assigning a score to each tribe. The GA selects the worst tribes as individuals to be eliminated. The tribes to be used as parents are then selected at random, each tribe having a probability proportional to its score. A new tribe is produced by crossing the parent tribes individual by individual; afterwards, mutation takes place in the created individuals. The experiments carried out with this system have proved its functionality in the composition of rhythmic patterns. It is intended to enlarge the experiments' scope by making the system available via the Internet. This would enable its use by users of different musical cultures, given that the system is user-friendly and requires no musical knowledge.
series other
email
more http://www.generativeart.com/
last changed 2003/08/07 17:25
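The tribe-based evolutionary loop described in this abstract — user-scored tribes, parents chosen with probability proportional to their score, crossover individual by individual, then mutation — can be sketched in a few lines. The data layout (a tribe as a list of binary rhythmic patterns) and all function names here are illustrative assumptions, not the authors' implementation:

```python
import random


def select_parent_tribes(tribes, scores, k=2):
    """Roulette-wheel selection: pick k parent tribes, each with a
    probability proportional to its user-assigned score."""
    return random.choices(tribes, weights=scores, k=k)


def crossover(parent_a, parent_b):
    """Produce a child tribe by crossing the parents individual by
    individual: each slot takes a whole rhythmic pattern from one parent."""
    return [random.choice(pair) for pair in zip(parent_a, parent_b)]


def mutate(tribe, rate=0.1, alphabet=(0, 1)):
    """Flip rhythm steps at random with the given per-step mutation rate."""
    return [[random.choice(alphabet) if random.random() < rate else step
             for step in individual]
            for individual in tribe]
```

Under this sketch, a session would repeatedly show the tribes to the user, collect their scores, eliminate the worst-scored tribes, and refill the population with mutated children of roulette-selected parents.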
