CumInCAD is a Cumulative Index of publications in Computer Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD Futures.


Hits 1 to 20 of 240

_id cef3
authors Bridges, Alan H.
year 1992
title Computing and Problem Based Learning at Delft University of Technology Faculty of Architecture
doi https://doi.org/10.52842/conf.ecaade.1992.289
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 289-294
summary Delft University of Technology, founded in 1842, is the oldest and largest technical university in the Netherlands. It provides education for more than 13,000 students in fifteen main subject areas. The Faculty of Architecture, Housing, Urban Design and Planning is one of the largest faculties of the DUT with some 2000 students and over 500 staff members. The course of study takes four academic years: a first year (Propaedeuse) and a further three years (Doctoraal) leading to the "ingenieur" qualification. The basic course material is delivered in the first two years and is taken by all students. The third and fourth years consist of a smaller number of compulsory subjects in each of the department's specialist areas together with a wide range of option choices. The five main subject areas the students may choose from for their specialisation are Architecture, Building and Project Management, Building Technology, Urban Design and Planning, and Housing.

The curriculum of the Faculty has been radically revised over the last two years and is now based on the concept of "Problem-Based Learning". The subject matter taught is divided thematically into specific issues that are taught in six-week blocks. The vehicles for these blocks are specially selected and adapted case studies prepared by teams of staff members. These provide a focus for integrating specialist subjects around a studio-based design theme. In the case of the second year, this studio is largely computer-based: many drawings are produced by computer and several specially written computer applications are used in association with the specialist inputs.

This paper describes the "block structure" used in the second year, giving examples of the special computer programs used, but also raises a number of broader educational issues. Introduction of the block system arose as a method of curriculum integration in response to difficulties emerging from the independent functioning of strong discipline areas in the traditional work groups. The need for a greater level of self-directed learning was recognised, as opposed to the "passive information model" of student learning in which the students are seen as empty vessels to be filled with knowledge - which they are then usually unable to apply in design-related contexts in the studio. Furthermore, the value of electives had been questioned: whilst enabling some diversity of choice, they may also be seen as diverting attention and resources from the real problems of teaching architecture.

series eCAADe
email
last changed 2022/06/07 07:54

_id 7ce5
authors Gal, Shahaf
year 1992
title Computers and Design Activities: Their Mediating Role in Engineering Education
source Sociomedia, ed. Edward Barrett. MIT Press
summary Sociomedia: With all the new words used to describe electronic communication (multimedia, hypertext, cyberspace, etc.), do we need another one? Edward Barrett thinks we do; hence, he coins the term "sociomedia." It is meant to displace a computing economy in which technicity is hypostasized over sociality. Sociomedia, a compilation of twenty-five articles on the theory, design and practice of educational multimedia and hypermedia, attempts to re-value the communicational face of computing. Value, of course, is "ultimately a social construct." As such, it has everything to do with knowledge, power, education and technology. The projects discussed in this book represent the leading edge of electronic knowledge production in academia (not to mention major funding) and are determining the future of educational media. For these reasons, Sociomedia warrants close inspection. Barrett's introduction sets the tone. For him, designing computer media involves hardwiring a mechanism for the social construction of knowledge (1). He links computing to a process of social and communicative interactivity for constructing and desseminating knowledge. Through a mechanistic mapping of the university as hypercontext (a huge network that includes classrooms as well as services and offices), Barrett models intellectual work in such a way as to avoid "limiting definitions of human nature or human development." Education, then, can remain "where it should be--in the human domain (public and private) of sharing ideas and information through the medium of language." By leaving education in a virtual realm (where we can continue to disagree about its meaning and execution), it remains viral, mutating and contaminating in an intellectually healthy way. He concludes that his mechanistic model, by means of its reductionist approach, preserves value (7). This "value" is the social construction of knowledge. While I support the social orientation of Barrett's argument, discussions of value are related to power. I am not referring to the traditional teacher-student power structure that is supposedly dismantled through cooperative and constructivist learning strategies. The power to be reckoned with in the educational arena is foundational, that which (pre)determines value and the circulation of knowledge. "Since each of you reading this paragraph has a different perspective on the meaning of 'education' or 'learning,' and on the processes involved in 'getting an education,' think of the hybris in trying to capture education in a programmable function, in a displayable object, in a 'teaching machine'" (7). Actually, we must think about that hybris because it is, precisely, what informs teaching machines. Moreover, the basic epistemological premises that give rise to such productions are too often assumed. In the case of instructional design, the episteme of cognitive sciences are often taken for granted. It is ironic that many of the "postmodernists" who support electronic hypertextuality seem to have missed Jacques Derrida's and Michel Foucault's "deconstructions" of the epistemology underpinning cognitive sciences (if not of epistemology itself). Perhaps it is the glitz of the technology that blinds some users (qua developers) to the belief systems operating beneath the surface. Barrett is not guilty of reactionary thinking or politics; he is, in fact, quite in line with much American deconstructive and postmodern thinking. 
The problem arises in that he leaves open the definitions of "education," "learning" and "getting an education." One cannot engage in the production of new knowledge without orienting its design, production and dissemination, and without negotiating with others' orientations, especially where large-scale funding is involved. Notions of human nature and development are structural, even infrastructural, whatever the medium of the teaching machine. Although he addresses some dynamics of power, money and politics when he talks about the recession and its effects on the conference, they are readily visible dynamics of power (3-4). Where does the critical factor of value determination, of power, of who gets what and why, get mapped onto a mechanistic model of learning institutions? Perhaps a mapping of contributors' institutions, of the funding sources for the projects showcased and for participation in the conference, and of the disciplines receiving funding for these sorts of projects would help visualize the configurations of power operative in the rising field of educational multimedia. Questions of power and money notwithstanding, Barrett's introduction sets the social and textual thematics for the collection of essays. His stress on interactivity, on communal knowledge production, on the society of texts, and on media producers and users is carried forward through the other essays, two of which I will discuss. Section I of the book, "Perspectives...," highlights the foundations, uses and possible consequences of multimedia and hypertextuality. The second essay in this section, "Is There a Class in This Text?," plays on the robust exchange surrounding Stanley Fish's book, Is There a Text in This Class?, which presents an attack on authority in reading. The author, John Slatin, has introduced electronic hypertextuality and interaction into his courses. His article maps the transformations in "the content and nature of work, and the workplace itself"-- which, in this case, is not industry but an English poetry class (25). Slatin discovered an increase of productive and cooperative learning in his electronically-mediated classroom. For him, creating knowledge in the electronic classroom involves interaction between students, instructors and course materials through the medium of interactive written discourse. These interactions lead to a new and persistent understanding of the course materials and of the participants' relation to the materials and to one another. The work of the course is to build relationships that, in my view, constitute not only the meaning of individual poems, but poetry itself. The class carries out its work in the continual and usually interactive production of text (31). While I applaud his strategies which dismantle traditional hierarchical structures in academia, the evidence does not convince me that the students know enough to ask important questions or to form a self-directing learning community. Stanley Fish has not relinquished professing, though he, too, espouses the indeterminacy of the sign. By the fourth week of his course, Slatin's input is, by his own reckoning, reduced to 4% (39). In the transcript of the "controversial" Week 6 exchange on Gertrude Stein--the most disliked poet they were discussing at the time (40)--we see the blind leading the blind. One student parodies Stein for three lines and sums up his input with "I like it." Another finds Stein's poetry "almost completey [sic] lacking in emotion or any artistic merit" (emphasis added).
On what grounds has this student become an arbiter of "artistic merit"? Another student, after admitting being "lost" during the Wallace Stevens discussion, talks of having more "respect for Stevens' work than Stein's" and adds that Stein's poetry lacks "conceptual significance[, s]omething which people of varied opinion can intelligently discuss without feeling like total dimwits...." This student has progressed from admitted incomprehension of Stevens' work to imposing her (groundless) respect for his work over Stein's. Then, she exposes her real dislike for Stein's poetry: that she (the student) missed the "conceptual significance" and hence cannot, being a person "of varied opinion," intelligently discuss it "without feeling like [a] total dimwit." Slatin's comment is frightening: "...by this point in the semester students have come to feel increasingly free to challenge the instructor" (41). The students that I have cited are neither thinking critically nor are their preconceptions challenged by student-governed interaction. Thanks to the class format, one student feels self-righteous in her ignorance, and empowered to censure. I believe strongly in student empowerment in the classroom, but only once students have accrued enough knowledge to make informed judgments. Admittedly, Slatin's essay presents only partial data (there are six hundred pages of course transcripts!); still, I wonder how much valuable knowledge and metaknowledge was gained by the students. I also question the extent to which authority and professorial dictature were addressed in this course format. The power structures that make it possible for a college to require such a course, and the choice of texts and pedagogy, were not "on the table." The traditional professorial position may have been displaced, but what took its place?--the authority of consensus with its unidentifiable strong arm, and the faceless reign of software design? Despite Slatin's claim that the students learned about the learning process, there is no evidence (in the article) that the students considered where their attitudes came from, how consensus operates in the construction of knowledge, how power is established and what relationship they have to bureaucratic institutions. How do we, as teaching professionals, negotiate a balance between an enlightened despotism in education and student-created knowledge? Slatin, and other authors in this book, bring this fundamental question to the fore. There is no definitive answer because the factors involved are ultimately social, and hence, always shifting and reconfiguring. Slatin ends his article with the caveat that computerization can bring about greater estrangement between students, faculty and administration through greater regimentation and control. Of course, it can also "distribute authority and power more widely" (50). Power or authority without a specific face, however, is not necessarily good or just. Shahaf Gal's "Computers and Design Activities: Their Mediating Role in Engineering Education" is found in the second half of the volume, and does not allow for a theory/praxis dichotomy. Gal recounts a brief history of engineering education up to the introduction of Growltiger (GT), a computer-assisted learning aid for design. He demonstrates GT's potential to impact the learning of engineering design by tracking its use by four students in a bridge-building contest.
What his text demonstrates clearly is that computers are "inscribing and imaging devices" that add another viewpoint to an on-going dialogue between student, teacher, earlier coursework, and other teaching/learning tools. The less proficient students made a serious error by relying too heavily on the technology, or treating it as a "blueprint provider." They "interacted with GT in a way that trusted the data to represent reality. They did not see their interaction with GT as a negotiation between two knowledge systems" (495). Students who were more thoroughly informed in engineering discourses knew to use the technology as one voice among others--they knew enough not simply to accept the input of the computer as authoritative. The less-advanced students learned a valuable lesson from the competition itself: the fact that their designs were not able to hold up under pressure (literally) brought the fact of their insufficient knowledge crashing down on them (and their bridges). They also had, post factum, several other designs to study, especially the winning one. Although competition and comparison are not good pedagogical strategies for everyone (in this case the competitors had volunteered), at some point what we think we know has to be challenged within the society of discourses to which it belongs. Students need critique in order to learn to push their learning into auto-critique. This is what is lacking in Slatin's discussion and in the writings of other avatars of constructivist, collaborative and computer-mediated pedagogies. Obviously there are differences between instrumental types of knowledge acquisition and discoursive knowledge accumulation. Indeed, I do not promote the teaching of reading, thinking and writing as "skills" per se (then again, Gal's teaching of design is quite discursive, if not dialogic). Nevertheless, the "soft" sciences might benefit from "bridge-building" competitions or the re-institution of some forms of agonia. Not everything agonistic is inhuman agony--the joy of confronting or creating a sound argument supported by defensible evidence, for example. Students need to know that soundbites are not sound arguments despite predictions that electronic writing will be aphoristic rather than periodic. Just because writing and learning can be conceived of hypertextually does not mean that rigor goes the way of the dinosaur. Rigor and hypertextuality are not mutually incompatible. Nor is rigorous thinking and hard intellectual work unpleasurable, although American anti-intellectualism, especially in the mass media, would make it so. At a time when the spurious dogmatics of a Rush Limbaugh and Holocaust revisionist historians circulate "aphoristically" in cyberspace, and at a time when knowledge is becoming increasingly textualized, the role of critical thinking in education will ultimately determine the value(s) of socially constructed knowledge. This volume affords the reader an opportunity to reconsider knowledge, power, and new communications technologies with respect to social dynamics and power relationships.
series other
last changed 2003/04/23 15:14

_id ea96
authors Hacfoort, Eek J. and Veldhuisen, Jan K.
year 1992
title A Building Design and Evaluation System
source New York: John Wiley & Sons, 1992. pp. 195-211 : ill. table. includes bibliography
summary Within the field of architectural design there is a growing awareness of an imbalance among the professionalism, the experience, and the creativity of the designers' response to the up-to-date requirements of all parties interested in the design process. The building design and evaluation system COSMOS makes it possible for various participants to work within their own domain, so that separate but coordinated work can be done. This system is meant to organize the initial stage of the design process, where user-defined functions, geometry, type of construction, and building materials are decided. It offers a tool for designing a building, for calculating a number of effects, and for managing the information necessary to evaluate the design decisions. The system is provided with data and sets of parameters describing the conditions, along with their properties, of the main building functions of a selection of well-known building types. The architectural design is conceptualized as a hierarchy of spatial units, ranging from building blocks down to specific rooms or spaces. The concept of zoning is used as a means of calculating and directly evaluating the structure of the design without working out the details. A distinction is made between internal and external calculations and evaluations during the initial design process. During design on screen, estimates can be recorded of building costs, energy costs, acoustics, lighting, construction, and utility. Furthermore, the design can be exported to a design application program, in this case AutoCAD, to make and show drawings in more detail. Through the medium of a database, external calculation and evaluation of building costs, life-cycle costs, energy costs, interior climate, acoustics, lighting, construction, and utility are possible in much more advanced application programs.
keywords evaluation, applications, integration, architecture, design, construction, building, energy, cost, lighting, acoustics, performance
series CADline
last changed 2003/06/02 13:58
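
The COSMOS abstract above describes a hierarchy of spatial units (building blocks down to rooms) whose structure can be evaluated, via zoning, without working out the details. The following is only a minimal illustrative sketch of that idea, not the actual COSMOS implementation; the class, the unit names and the per-square-metre figures are all assumed for illustration.

    # Illustrative sketch: a hierarchy of spatial units with rough,
    # parameter-based estimates, in the spirit of the zoning idea in the
    # abstract above. All names and figures are invented.

    class SpatialUnit:
        def __init__(self, name, area_m2=0.0, cost_per_m2=0.0, kwh_per_m2=0.0):
            self.name = name
            self.area_m2 = area_m2          # floor area of this unit itself
            self.cost_per_m2 = cost_per_m2  # rough building-cost parameter
            self.kwh_per_m2 = kwh_per_m2    # rough annual energy parameter
            self.children = []              # sub-units (zones, rooms, ...)

        def add(self, unit):
            self.children.append(unit)
            return unit

        def total_area(self):
            return self.area_m2 + sum(c.total_area() for c in self.children)

        def estimate(self):
            # Aggregate rough cost and energy figures over the whole hierarchy.
            cost = self.area_m2 * self.cost_per_m2
            energy = self.area_m2 * self.kwh_per_m2
            for c in self.children:
                sub = c.estimate()
                cost += sub["cost"]
                energy += sub["energy_kwh"]
            return {"cost": cost, "energy_kwh": energy}

    # Building block -> zone -> rooms, evaluated without detailed design.
    block = SpatialUnit("office block")
    zone = block.add(SpatialUnit("office zone"))
    zone.add(SpatialUnit("open office", area_m2=400, cost_per_m2=1200, kwh_per_m2=90))
    zone.add(SpatialUnit("meeting room", area_m2=60, cost_per_m2=1500, kwh_per_m2=110))
    print(block.total_area(), block.estimate())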

_id cf73
authors Dosti, P., Martens, B. and Voigt, A.
year 1992
title Spatial Simulation In Architecture, City Development and Regional Planning
doi https://doi.org/10.52842/conf.ecaade.1992.195
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 195-200
summary The appropriate use of spatial simulation techniques tends to increase considerably the depth of evidence and the realistic content of the designs and plans to be described, and moreover may encourage experimentation, trial attempts and planning variants. This also means more frequent use of combinations of different techniques, bearing in mind that they are not equivalent, but making use of the respective advantages each offers. Until now the main attention of the EDP-Lab was directed at achieving quantity; for the time to come it will be the formation of quality. The challenge in the educational system at the Vienna University of Technology is to obtain appropriate results in the framework of low-cost simulation. This aspect also seems meaningful in order to reinforce the final implementation in architectural practice.

series eCAADe
email
more http://info.tuwien.ac.at/ecaade/
last changed 2022/06/07 07:55

_id e8f0
authors Mackey, David L.
year 1992
title Mission Possible: Computer Aided Design for Everyone
doi https://doi.org/10.52842/conf.acadia.1992.065
source Mission - Method - Madness [ACADIA Conference Proceedings / ISBN 1-880250-01-2] 1992, pp. 65-73
summary A pragmatic model for the building of an electronic architectural design curriculum which will offer students and faculty the opportunity to fully integrate information age technologies into the educational experience is becoming increasingly desirable.

The majority of architectural programs teach technology topics through content specific courses which appear as an educational sequence within the curriculum. These technology topics have traditionally included structural design, environmental systems, and construction materials and methods. Likewise, that course model has been broadly applied to the teaching of computer aided design, which is identified as a technology topic. Computer technology has resulted in a proliferation of courses which similarly introduce the student to computer graphic and design systems through a traditional course structure.

Inevitably, competition for priority arises within the curriculum, introducing the potential risk that otherwise valuable courses and/or course content will be replaced by the "newer" technology, and providing fertile ground for faculty and administrative resistance to computerization as traditional courses are pushed aside or seem threatened.

An alternative view is that computer technology is not a "topic", but rather the medium for creating a design (and studio) environment for informed decision making ... deciding what it is we should build. Such a viewpoint urges the development of a curricular structure, through which the impact of computer technology may be understood as that medium for design decision making, as the initial step in addressing the current and future needs of architectural education.

One example of such a program currently in place at the College of Architecture and Planning, Ball State University takes an approach which overlays, like a transparent tissue, the computer aided design content (or a computer emphasis) onto the primary curriculum.

With the exception of a general introductory course at the freshman level, computer instruction and content issues may be addressed effectively within existing studio courses. The level of operational and conceptual proficiency achieved by the student, within an electronic design studio, makes the electronic design environment self-sustaining and maintainable across the entire curriculum. The ability to broadly apply computer aided design to the educational experience can be independent of the availability of many specialized computer aided design faculty.

series ACADIA
last changed 2022/06/07 07:59

_id 9d0c
authors McVey, G., McCrobie, D., Evans, D., McIlvaine Parsons, D., Templar, J. Konz, S. and Caldwell, B.
year 1992
title Interactions between Environmental Design and Human Factors Specialists (Environmental Design: Panel)
source Proceedings of the Human Factors Society 36th Annual Meeting 1992 v.1 pp. 575-577
summary Most of the interactions between human factors specialists, such as ergonomists, and environmental specialists, such as facility planners and architects, tend to be task specific and do not follow any accepted process. Consequently, the success of such interactions is usually a function of serendipity rather than informed expectation. It is anticipated that by gathering such specialists in an open discussion, relevant issues may be addressed and successful interaction procedures introduced and discussed. Such a forum is desirable for developing an understanding of the differences, educational and operational, between environmental design specialists and human factors specialists, as well as for exploring the ways their communications can be enhanced. It is anticipated that by sharing their experiences with the attendees, the presenters will identify relevant on-going knowledge transfer activities, and also introduce and discuss practical problem-solving and communication methods that can be used with assurance by the attendees themselves when faced with similar problems in the future. This panel will focus on issues that arise out of situations where human factors specialists and environmental design specialists are joined together in project development. The specialties represented include architecture, facility planning, environmental psychology, ergonomic research, industrial design and engineering, and equipment and furniture design and manufacturing.
series other
last changed 2002/07/07 16:01

_id 58c5
authors Van Wezel, Ruud
year 1992
title Mock-up System Wageningen: Development, Limitation and Future
source Proceedings of the 4th European Full-Scale Modelling Conference / Lausanne (Switzerland) 9-12 September 1992, Part A, pp. 15-18
summary A brief description of the development of the Mock-up System (MUS) in the context of the Wageningen training program. The students are first taught some key concepts for understanding the building process. They are then trained to express how they want to live (theory) and later on they are confronted with what they have built in the MUS (practice). Besides being an educational tool, the MUS is used for pre-building evaluation and research questions. The drawbacks or limitations of the system (outdoor reality versus indoor simulation) and future use by different target groups are also discussed in this paper. The power of the MUS is, and will continue to be, the concrete building of communicational results and the generation of communication by doing so.
keywords Full-scale Modeling, Model Simulation, Real Environments
series other
type normal paper
more http://info.tuwien.ac.at/efa
last changed 2004/05/04 15:30

_id sigradi2015_11.166
id sigradi2015_11.166
authors Calixto, Victor; Celani, Gabriela
year 2015
title A literature review for space planning optimization using an evolutionary algorithm approach: 1992-2014
source SIGRADI 2015 [Proceedings of the 19th Conference of the Iberoamerican Society of Digital Graphics - vol. 2 - ISBN: 978-85-8039-133-6] Florianópolis, SC, Brasil 23-27 November 2015, pp. 662-671.
summary Space planning in architecture is a field of research in which the process of arranging a set of space elements is the main concern. This paper presents a survey of 31 papers, comprising applications and reviews of space planning methods using evolutionary algorithms. The objective of this work was to organize, classify and discuss twenty-two years of space planning research based on an evolutionary approach, in order to orient future research in the field.
keywords Space Planning, Evolutionary algorithms, Generative System
series SIGRADI
email
last changed 2016/03/10 09:47
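
The abstract above surveys evolutionary approaches to space planning. For orientation only, here is a deliberately small, generic sketch of the kind of genetic algorithm such papers apply: a genome encodes room positions on a grid, and the fitness function penalizes overlaps and rewards desired adjacencies. It is not reproduced from any surveyed paper; the room list, sizes, adjacency pairs and algorithm parameters are all assumptions made for illustration.

    # Generic sketch of an evolutionary space-planning loop; illustrative only.
    import random

    ROOMS = ["living", "kitchen", "bed", "bath"]        # rooms to place
    SIZES = {"living": (4, 5), "kitchen": (3, 3), "bed": (4, 4), "bath": (2, 3)}
    ADJACENT = [("living", "kitchen"), ("bed", "bath")]  # desired adjacencies
    GRID = 12                                            # site is GRID x GRID cells

    def random_layout():
        # Genome: one (x, y) origin per room.
        return {r: (random.randrange(GRID), random.randrange(GRID)) for r in ROOMS}

    def cells(room, layout):
        x0, y0 = layout[room]
        w, h = SIZES[room]
        return {(x, y) for x in range(x0, x0 + w) for y in range(y0, y0 + h)}

    def fitness(layout):
        score = 0.0
        for i, a in enumerate(ROOMS):
            for b in ROOMS[i + 1:]:
                score -= 10 * len(cells(a, layout) & cells(b, layout))  # overlap penalty
        for a, b in ADJACENT:
            (ax, ay), (bx, by) = layout[a], layout[b]
            score -= abs(ax - bx) + abs(ay - by)          # reward desired closeness
        return score

    def mutate(layout):
        child = dict(layout)
        room = random.choice(ROOMS)
        child[room] = (random.randrange(GRID), random.randrange(GRID))
        return child

    population = [random_layout() for _ in range(30)]
    for generation in range(200):
        population.sort(key=fitness, reverse=True)
        parents = population[:10]                         # simple truncation selection
        population = parents + [mutate(random.choice(parents)) for _ in range(20)]
    best = max(population, key=fitness)
    print(fitness(best), best)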

_id cf5c
authors Carpenter, B.
year 1992
title The logic of typed feature structures with applications to unification grammars, logic programs and constraint resolution
source Cambridge Tracts in Theoretical Computer Science, Cambridge University Press
summary This book develops the theory of typed feature structures, a new form of data structure that generalizes both the first-order terms of logic programs and feature-structures of unification-based grammars to include inheritance, typing, inequality, cycles and intensionality. It presents a synthesis of many existing ideas into a uniform framework, which serves as a logical foundation for grammars, logic programming and constraint-based reasoning systems. Throughout the text, a logical perspective is adopted that employs an attribute-value description language along with complete equational axiomatizations of the various systems of feature structures. Efficiency concerns are discussed and complexity and representability results are provided. The application of feature structures to phrase structure grammars is described and completeness results are shown for standard evaluation strategies. Definite clause logic programs are treated as a special case of phrase structure grammars. Constraint systems are introduced and an enumeration technique is given for solving arbitrary attribute-value logic constraints. This book with its innovative approach to data structures will be essential reading for researchers in computational linguistics, logic programming and knowledge representation. Its self-contained presentation makes it flexible enough to serve as both a research tool and a textbook.
series other
last changed 2003/04/23 15:14
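
The abstract above describes typed feature structures as a generalization of first-order terms and of the feature structures used in unification grammars. As a toy sketch of the basic operation involved, the following shows plain attribute-value unification over nested dictionaries; it is untyped and acyclic, whereas the book's formalism adds typing, inheritance, inequality and cycles. The example structures are invented.

    # Minimal sketch of attribute-value (feature structure) unification.
    # Untyped and acyclic; not the book's full formalism.

    def unify(fs1, fs2):
        """Return the most specific structure compatible with both, or None."""
        if isinstance(fs1, dict) and isinstance(fs2, dict):
            result = dict(fs1)
            for feature, value in fs2.items():
                if feature in result:
                    merged = unify(result[feature], value)
                    if merged is None:
                        return None          # conflicting values: unification fails
                    result[feature] = merged
                else:
                    result[feature] = value   # feature only in fs2: just add it
            return result
        return fs1 if fs1 == fs2 else None    # atomic values must match exactly

    # Agreement example in the style of unification grammars.
    np = {"cat": "NP", "agr": {"num": "sg"}}
    vp = {"cat": "VP", "agr": {"num": "sg", "per": "3"}}
    print(unify(np["agr"], vp["agr"]))           # {'num': 'sg', 'per': '3'}
    print(unify({"num": "sg"}, {"num": "pl"}))   # None (failure)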

_id ddss9209
id ddss9209
authors De Gelder, J.T. and Lucardie, G.L.
year 1993
title Knowledge and data modelling in CAD/CAM applications
source Timmermans, Harry (Ed.), Design and Decision Support Systems in Architecture (Proceedings of a conference held in Mierlo, the Netherlands in July 1992), ISBN 0-7923-2444-7
summary Modelling knowledge and data in CAD/CAM applications is complex because different goals and contexts have to be taken into account. This complexity makes particular demands upon representation formalisms. Today many modelling tools are based on record structures. By analyzing the requirements for a product model of a portal structure in steel, this paper shows that in many situations record structures are not well suited as a representation formalism for storing knowledge and data in CAD/CAM applications. This is illustrated by performing a knowledge-level analysis of the knowledge and data generated in the design and manufacturing process of a portal structure in steel.
series DDSS
last changed 2003/08/07 16:36

_id ddss9205
id ddss9205
authors De Scheemaker, A.
year 1993
title Towards an integrated facility management system for management and use of government buildings
source Timmermans, Harry (Ed.), Design and Decision Support Systems in Architecture (Proceedings of a conference held in Mierlo, the Netherlands in July 1992), ISBN 0-7923-2444-7
summary The Government Building Agency in the Netherlands is developing an integrated facility management system for two of its departments. Applications are already developed to support a number of day-to-day facility management activities on an operational level. Research is now being carried out to develop a management control system to better plan and control housing and material resources.
series DDSS
last changed 2003/08/07 16:36

_id 067f
authors Gantt, Michelle and Nardi, Bonnie A.
year 1992
title Gardeners and Gurus: Patterns of Cooperation among CAD Users (Perspectives on the Design of Collaborative Systems)
source Proceedings of ACM CHI'92 Conference on Human Factors in Computing Systems 1992 pp. 107-117
summary We studied CAD system users to find out how they use the sophisticated customization and extension facilities offered by many CAD products. We found that users of varying levels of expertise collaborate to customize their CAD environments and to create programmatic extensions to their applications. Within a group of users, there is at least one local expert who provides support for other users. We call this person a local developer. The local developer is a fellow domain expert, not a professional programmer, outside technical consultant or MIS staff member. We found that in some CAD environments the support role has been formalized so that local developers are given official recognition, and time and resources to pursue local developer activities. In general, this formalization of the local developer role appears successful. We discuss the implications of our findings for work practices and for software design.
keywords Cooperative Work; End User Programming
series other
last changed 2002/07/07 16:01

_id a081
authors Greenberg S., Roseman M. and Webster, D.
year 1992
title Issues and Experiences Designing and Implementing Two Group Drawing Tools
source Readings in Groupware, 609-620
summary Groupware designers are now developing multi-user equivalents of popular paint and draw applications. Their job is not an easy one. First, human factors issues peculiar to group interaction appear that, if ignored, seriously limit the usability of the group tool. Second, implementation is fraught with considerable hurdles. This paper describes the issues and experiences we have met and handled in the design of two systems supporting remote real time group interaction: GroupSketch, a multi-user sketchpad; and GroupDraw, an object-based multi-user draw package. On the human factors side, we summarize empirically-derived design principles that we believe are critical to building useful and usable collaborative drawing tools. On the implementation side, we describe our experiences with replicated versus centralized architectures, schemes for participant registration, multiple cursors, network requirements, and the structure of the drawing primitives.
series other
last changed 2003/04/23 15:50

_id 6df3
authors Gross, Mark D. and Zimring, Craig
year 1992
title Predicting Wayfinding Behavior in Buildings : A Schema-Based Approach
source New York: John Wiley & Sons, 1992. pp. 367-377 : ill. includes bibliography
summary Postoccupancy evaluations of large buildings often reveal significant wayfinding problems caused by poor floor-plan layout. Predicting wayfinding problems early in the design process could avoid costly remodeling and make better buildings. However, we lack formal, predictive models of human wayfinding behavior. Computational models of wayfinding in buildings have addressed constructing topological and geometric representations of the plan layout incrementally during exploration. The authors propose to combine this with a schema model of building memory. It is argued that people orient themselves and wayfind in new buildings using schemas, or generic expectations about building layout. In this paper the authors give their preliminary thoughts toward developing a computational model of wayfinding based on this approach.
keywords wayfinding, evaluation, applications, architecture, floor plans, layout, building, prediction
series CADline
email
last changed 2002/09/05 15:02
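
The abstract above mentions computational wayfinding models that build a topological representation of the plan layout incrementally during exploration. The sketch below illustrates only that ingredient, with a made-up floor plan and a simple breadth-first explorer; the schema-based expectations the authors actually propose are not modeled here.

    # Sketch: incrementally building a topological model of a plan while
    # exploring it. Plan data are invented; schemas are not modeled.
    from collections import deque

    PLAN = {
        "entrance": ["lobby"],
        "lobby": ["entrance", "corridor", "stairs"],
        "corridor": ["lobby", "office A", "office B"],
        "stairs": ["lobby"],
        "office A": ["corridor"],
        "office B": ["corridor"],
    }

    def explore(plan, start):
        """Breadth-first exploration that records spaces only as they are reached."""
        known = {}                      # incrementally built topological model
        frontier = deque([start])
        visited = {start}
        while frontier:
            here = frontier.popleft()
            known[here] = list(plan[here])        # doorways visible from this space
            for nxt in plan[here]:
                if nxt not in visited:
                    visited.add(nxt)
                    frontier.append(nxt)
        return known

    print(explore(PLAN, "entrance"))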

_id 6cfd
authors Harfmann, Anton C. and Majkowski, Bruce R.
year 1992
title Component-Based Spatial Reasoning
doi https://doi.org/10.52842/conf.acadia.1992.103
source Mission - Method - Madness [ACADIA Conference Proceedings / ISBN 1-880250-01-2] 1992, pp. 103-111
summary The design process and ordering of individual components through which architecture is realized relies on the use of abstract "models" to represent a proposed design. The emergence and use of these abstract "models" for building representation has a long history and tradition in the field of architecture. Models have been made and continue to be made for the patron, occasionally the public, and as a guide for the builders. Models have also been described as a means to reflect on the design and to allow the design to be in dialogue with the creator.

The term "model" in the above paragraph has been used in various ways and in this context is defined as any representation through which design intent is expressed. This includes accurate/ rational or abstract drawings (2- dimensional and 3-dimensional), physical models (realistic and abstract) and computer models (solid, void and virtual reality). The various models that fall within the categories above have been derived from the need to "view" the proposed design in various ways in order to support intuitive reasoning about the proposal and for evaluation purposes. For example, a 2-dimensional drawing of a floor plan is well suited to support reasoning about spatial relationships and circulation patterns while scaled 3-dimensional models facilitate reasoning about overall form, volume, light, massing etc. However, the common denominator of all architectural design projects (if the intent is to construct them in actual scale, physical form) are the discrete building elements from which the design will be constructed. It is proposed that a single computational model representing individual components supports all of the above "models" and facilitates "viewing"' the design according to the frame of reference of the viewer.

Furthermore, it is the position of the authors that all reasoning stems from this rudimentary level of modeling individual components.

The concept of component representation has been derived from the fact that a "real" building (made from individual components such as nuts, bolts and bar joists) can be "viewed" differently according to the frame of reference of the viewer. Each individual has the ability to infer and abstract from the assemblies of components a variety of different "models" ranging from a visceral, experiential understanding to a very technical, physical understanding. The component concept has already proven to be a valuable tool for reasoning about assemblies, interferences between components, tracing of load path and numerous other component related applications. In order to validate the component-based modeling concept this effort will focus on the development of spatial understanding from the component-based model. The discussions will, therefore, center about the representation of individual components and the development of spatial models and spatial reasoning from the component model. In order to frame the argument that spatial modeling and reasoning can be derived from the component representation, a review of the component-based modeling concept will precede the discussions of spatial issues.

series ACADIA
email
last changed 2022/06/07 07:49
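
The abstract above argues that a single model of discrete building components can support multiple derived "views" according to the frame of reference of the viewer. The toy sketch below illustrates that idea only in outline: one list of components from which a structural view (load-bearing elements, relevant to load-path reasoning) and a very rough spatial-extent view are derived. The component names, attributes and views are assumptions for illustration, not the authors' implementation.

    # Toy sketch of the component-based idea: one list of discrete components,
    # from which different "views" are derived on demand. Data are invented.

    components = [
        # (name, kind, load_bearing, (x, y, z) position in metres)
        ("column C1", "column", True,  (0.0, 0.0, 0.0)),
        ("column C2", "column", True,  (6.0, 0.0, 0.0)),
        ("bar joist J1", "joist", True, (3.0, 0.0, 3.0)),
        ("window W1", "window", False, (2.0, 0.0, 1.5)),
        ("partition P1", "wall", False, (3.0, 4.0, 0.0)),
    ]

    def structural_view(comps):
        # View for reasoning about load paths: only load-bearing components.
        return [name for name, kind, bearing, pos in comps if bearing]

    def extent_view(comps):
        # Very rough spatial view: bounding box of all component positions.
        xs = [pos[0] for *_rest, pos in comps]
        ys = [pos[1] for *_rest, pos in comps]
        zs = [pos[2] for *_rest, pos in comps]
        return (min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs))

    print(structural_view(components))
    print(extent_view(components))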

_id 578d
authors Helpenstein, H. (Ed.)
year 1993
title CAD geometry data exchange using STEP
source Berlin: Springer-Verlag
summary With increasing demand for data exchange in computer integrated manufacturing, a neutral connection between dissimilar systems is needed. After a few national and European attempts, a worldwide standardization of product data has been developed. Standard ISO 10303 (STEP - STandard for Exchange of Product data) produced in its first version those parts that are relevant for CAD geometrical data. A European consortium of 14 CAD vendors and users was supported by the ESPRIT programme to influence the emerging standard and implement early applications for it. Over the years 1989-1992, project CADEX (CAD geometry data EXchange) worked out application protocols as a contribution to STEP; developed a software toolkit that reads, writes, and manipulates STEP data; and, based on this toolkit, implemented data exchange processors for ten different CAD and FEA systems. This book reports the work done in project CADEX and describes all its results in detail.
series other
last changed 2003/04/23 15:14

_id 32eb
authors Henry, Daniel
year 1992
title Spatial Perception in Virtual Environments : Evaluating an Architectural Application
source University of Washington
summary Over the last several years, professionals from many different fields have come to the Human Interface Technology Laboratory (H.I.T.L) to discover and learn about virtual environments. In general, they are impressed by their experiences and express the tremendous potential the tool has in their respective fields. But the potentials are always projected far in the future, and the tool remains just a concept. This is justifiable because the quality of the visual experience is so much less than what people are used to seeing; high definition television, breathtaking special cinematographic effects and photorealistic computer renderings. Instead, the models in virtual environments are very simple looking; they are made of small spaces, filled with simple or abstract looking objects of little color distinctions as seen through displays of noticeably low resolution and at an update rate which leaves much to be desired. Clearly, for most applications, the requirements of precision have not been met yet with virtual interfaces as they exist today. However, there are a few domains where the relatively low level of the technology could be perfectly appropriate. In general, these are applications which require that the information be presented in symbolic or representational form. Having studied architecture, I knew that there are moments during the early part of the design process when conceptual decisions are made which require precisely the simple and representative nature available in existing virtual environments. This was a marvelous discovery for me because I had found a viable use for virtual environments which could be immediately beneficial to architecture, my shared area of interest. It would be further beneficial to architecture in that the virtual interface equipment I would be evaluating at the H.I.T.L. happens to be relatively less expensive and more practical than other configurations such as the "Walkthrough" at the University of North Carolina. The set-up at the H.I.T.L. could be easily introduced into architectural firms because it takes up very little physical room (150 square feet) and it does not require expensive and space taking hardware devices (such as the treadmill device for simulating walking). Now that the potential for using virtual environments in this architectural application is clear, it becomes important to verify that this tool succeeds in accurately representing space as intended. The purpose of this study is to verify that the perception of spaces is the same, in both simulated and real environment. It is hoped that the findings of this study will guide and accelerate the process by which the technology makes its way into the field of architecture.
keywords Space Perception; Space (Architecture); Computer Simulation
series thesis:MSc
last changed 2003/02/12 22:37

_id ab4d
authors Huang, Tao-Kuang, Degelman, Larry O., and Larsen, Terry R.
year 1992
title A Visualization Model for Computerized Energy Evaluation During the Conceptual Design Stage (ENERGRAPH)
doi https://doi.org/10.52842/conf.acadia.1992.195
source Mission - Method - Madness [ACADIA Conference Proceedings / ISBN 1-880250-01-2] 1992, pp. 195-206
summary Energy performance is a crucial step toward responsible design. Currently there are many tools that can be applied to reach this goal with reasonable accuracy. Oftentimes, however, major flaws are not discovered until the final stage of design, when it is too late to change. Not only are existing simulation models complicated to apply at the conceptual design stage, but energy principles and their applications are also abstract and hard to visualize. Because of the lack of suitable tools to visualize energy analysis output, energy conservation concepts fail to be integrated into the building design. For these reasons, designers tend not to apply energy conservation concepts at the early design stage. However, since computer graphics is a new phase of visual communication in the design process, the above problems might be solved properly through a computerized graphical interface in the conceptual design stage.

The research described in this paper is the result of exploring the concept of using computer graphics to support energy efficient building designs. It focuses on the visualization of building energy through a highly interactive graphical interface in the early design stage.

series ACADIA
email
last changed 2022/06/07 07:50

_id 56e9
authors Huang, Tao-Kuang
year 1992
title A Graphical Feedback Model for Computerized Energy Analysis during the Conceptual Design Stage
source Texas A&M University
summary During the last two decades, considerable effort has been placed on the development of building design analysis tools. Architects and designers have begun to take advantage of computers to generate and examine design alternatives. However, because it has been difficult to adapt computer technologies to the visual orientation of the building designer, the majority of computer applications have been limited to numerical analysis and office automation tasks. Only recently, because of advances in hardware and software techniques, have computers entered into a new phase in the development of architectural design. Computers are now able to interactively display graphic solutions to architecture-related problems, which is fundamental to the design process. The majority of research programs in energy efficient design have sharpened people's understanding of energy principles and the application of those principles. Energy conservation concepts, however, have not been widely used. A major problem in the implementation of these principles is that energy principles and their applications are abstract, hard to visualize and separated from the architectural design process. Furthermore, one aspect of energy analysis may contain thousands of pieces of numerical information, which often leads to confusion on the part of designers. If these difficulties can be overcome, it would bring a great benefit to the advancement of energy conservation concepts. This research explores the concept of an integrated computer graphics program to support energy efficient design. It focuses on (1) the integration of energy efficiency and architectural design, and (2) the visualization of building energy use through graphical interfaces during the conceptual design stage. It involves (1) the discussion of frameworks of computer-aided architectural design and computer-aided energy efficient building design, and (2) the development of an integrated computer prototype program with a graphical interface that helps the designer create building layouts, analyze building energy interactively and receive visual feedback dynamically. The goal is to apply computer graphics as an aid to visualize the effects of energy-related decisions and therefore permit the designer to visualize and understand energy conservation concepts in the conceptual phase of architectural design.
series thesis:PhD
last changed 2003/02/12 22:37

_id ed78
authors Jog, Bharati
year 1993
title Integration of Computer Applications in the Practice of Architecture
doi https://doi.org/10.52842/conf.acadia.1993.089
source Education and Practice: The Critical Interface [ACADIA Conference Proceedings / ISBN 1-880250-02-0] Texas (Texas / USA) 1993, pp. 89-97
summary Computer Applications in Architecture is emerging as an important aspect of our profession. The field, which is often referred to as Computer-Aided Architectural Design (CAAD), has had a notable impact on the profession and academia in recent years. A few professionals have predicted that, as slide rules were replaced by calculators, in the coming years drafting boards and parallel bars will be replaced by computers. On the other hand, many architects do not anticipate such a drastic change in the coming decade, as present CAD systems support only a few integral aspects of architectural design. However, all agree that architecture curricula should be modified to integrate CAAD education.

In 1992-93, in the Department of Architecture of the 'School of Architecture and Interior Design' at the University of Cincinnati, a curriculum committee was formed to review and modify the entire architecture curriculum. Since our profession and academia relate directly to each other, the author felt that while revising the curriculum, the committee should have factual information about CAD usage in the industry. Three ways to obtain such information were thought of, namely (1) conducting person-to-person or telephone interviews with the practitioners, (2) requesting firms to give open-ended feedback, and (3) surveying firms by sending a questionnaire. Of these three, the most effective, efficient and suitable method to obtain such information was an organized survey through a questionnaire. In mid December 1992, a survey was organized which was sponsored by the School of Architecture and Interior Design, the Center for the Study of the Practice of Architecture (CSPA) and the University Division of Professional Practice, all from the University of Cincinnati.

This chapter focuses on the results of this survey. A brief description of the survey design is also given. In the next section a few surveys organized in recent years are listed. In the third section the design of this survey is presented. The survey questions and their responses are given in the fourth section. The last section presents the conclusions and brief recommendations regarding computer curriculum in architecture.

series ACADIA
last changed 2022/06/07 07:52
