CumInCAD is a Cumulative Index of publications in Computer Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD futures.

Hits 1 to 20 of 742

_id acadia06_455
id acadia06_455
authors Ambach, Barbara
year 2006
title Eve’s Four Faces: interactive surface configurations
doi https://doi.org/10.52842/conf.acadia.2006.455
source Synthetic Landscapes [Proceedings of the 25th Annual Conference of the Association for Computer-Aided Design in Architecture] pp. 455-460
summary Eve’s Four Faces consists of a series of digitally animated and interactive surfaces. Their content and structure are derived from a collection of sources outside the conventional boundaries of architectural research, namely psychology and the broader spectrum of arts and culture. The investigation stems from a psychological study documenting the attributes and social relationships of four distinct personality prototypes: the Individuated, the Traditional, the Conflicted, and the Assured (York and John 1992). For the purposes of this investigation, all four prototypes are assumed to be inherent, to certain degrees, in each individual. However, the propensity towards one of the prototypes forms the basis for each individual’s “personality structure.” The attributes, social implications and prospects for habitation have been translated into animations and surfaces operating within A House for Eve’s Four Faces. The presentation illustrates the potential for constructed surfaces to be configured and transformed interactively, responding to the needs and qualities associated with each prototype. The intention is to study the effects of each configuration and how each configuration may be therapeutic in supporting, challenging or altering one’s personality as it oscillates and shifts through the four prototypical conditions.
series ACADIA
email
last changed 2022/06/07 07:54

_id 2006_040
id 2006_040
authors Ambach, Barbara
year 2006
title Eve’s Four Faces-Interactive surface configurations
doi https://doi.org/10.52842/conf.ecaade.2006.040
source Communicating Space(s) [24th eCAADe Conference Proceedings / ISBN 0-9541183-5-9] Volos (Greece) 6-9 September 2006, pp. 40-44
summary Eve’s Four Faces consists of a series of digitally animated and interactive surfaces. Their content and structure are derived from a collection of sources outside the conventional boundaries of architectural research, namely psychology and the broader spectrum of arts and culture. The investigation stems from a psychological study documenting the attributes and social relationships of four distinct personality prototypes: the “Individuated”, the “Traditional”, the “Conflicted” and the “Assured” (York and John, 1992). For the purposes of this investigation, all four prototypes are assumed to be inherent, to certain degrees, in each individual; however, the propensity towards one of the prototypes forms the basis for each individual’s “personality structure”. The attributes, social implications and prospects for habitation have been translated into animations and surfaces operating within A House for Eve’s Four Faces. The presentation illustrates the potential for constructed surfaces to be configured and transformed interactively, responding to the needs and qualities associated with each prototype. The intention is to study the effects of each configuration and how it may be therapeutic in supporting, challenging or altering one’s personality as it oscillates and shifts through the four prototypical conditions.
keywords interaction; digital; environments; psychology; prototypes
series eCAADe
type normal paper
last changed 2022/06/07 07:54

_id 4bd2
authors Carrara, G., Kalay, Y.E. and Novembri, G.
year 1992
title A Computational Framework for Supporting Creative Architectural Design
source New York: John Wiley & Sons, 1992. pp. 17-34 : ill. includes bibliography
summary Design can be considered a process leading to the definition of a physical form that achieves a certain predefined set of performance criteria. The process comprises three distinct operations: (1) Definition of the desired set of performance criteria (design goals); (2) generation of alternative design solutions; (3) evaluation of the expected performances of alternative design solutions, and comparing them to the predefined criteria. Difficulties arise in performing each one of the three operations, and in combining them into a purposeful unified process. Computational techniques were developed to assist each of the three operations. A comprehensive and successful computational design assistant will have to recognize the limitations of current computational techniques, and incorporate a symbiosis between the machine and the human designer. This symbiosis comprises allocating design tasks between the designer and the computer in a manner that is most appropriate for the task at hand. The task allocation must, therefore, be done dynamically, responding to the changing circumstances of the design process. This report proposes a framework for such a symbiotic partnership, which comprises four major components: (1) User interface and design process control; (2) design goals; (3) evaluators; (4) database
keywords architecture, knowledge base, systems, design process, control
series CADline
email
last changed 2003/06/02 14:41
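The three-operation view of designing in the abstract above (define performance goals, generate alternative solutions, evaluate them against the criteria) can be sketched as a simple generate-and-test loop. This is an illustrative reading only, not the authors' framework; the attribute names and thresholds are invented.

```python
import random

def define_goals():
    # (1) Desired performance criteria: minimum score per attribute
    # (attribute names are hypothetical).
    return {"daylight": 0.6, "cost_efficiency": 0.5}

def generate_alternative(rng):
    # (2) Propose a candidate design, here reduced to a set of
    # performance estimates drawn at random.
    return {"daylight": rng.random(), "cost_efficiency": rng.random()}

def evaluate(candidate, goals):
    # (3) Compare the candidate's expected performance to every goal.
    return all(candidate[k] >= v for k, v in goals.items())

def design(seed=0, max_tries=1000):
    # The three operations combined into one purposeful loop.
    rng = random.Random(seed)
    goals = define_goals()
    for _ in range(max_tries):
        candidate = generate_alternative(rng)
        if evaluate(candidate, goals):
            return candidate
    return None

solution = design()
```

A real assistant would, as the abstract argues, allocate these steps dynamically between designer and machine rather than automate them all.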

_id 91c4
authors Checkland, P.
year 1981
title Systems Thinking, Systems Practice
source John Wiley & Sons, Chichester
summary "Whether by design, accident or merely synchronicity, Checkland appears to have developed a habit of writing seminal publications near the start of each decade which establish the basis and framework for systems methodology research for that decade." (Hamish Rennie, Journal of the Operational Research Society, 1992). Thirty years ago Peter Checkland set out to test whether the Systems Engineering (SE) approach, highly successful in technical problems, could be used by managers coping with the unfolding complexities of organizational life. The straightforward transfer of SE to the broader situations of management was not possible, but by insisting on a combination of systems thinking strongly linked to real-world practice Checkland and his collaborators developed an alternative approach - Soft Systems Methodology (SSM) - which enables managers of all kinds and at any level to deal with the subtleties and confusions of the situations they face. This work established the now accepted distinction between hard systems thinking, in which parts of the world are taken to be systems which can be engineered, and soft systems thinking in which the focus is on making sure the process of inquiry into real-world complexity is itself a system for learning. Systems Thinking, Systems Practice (1981) and Soft Systems Methodology in Action (1990) together with an earlier paper Towards a Systems-based Methodology for Real-World Problem Solving (1972) have long been recognized as classics in the field. Now Peter Checkland has looked back over the three decades of SSM development, brought the account of it up to date, and reflected on the whole evolutionary process which has produced a mature SSM. SSM: A 30-Year Retrospective, here included with Systems Thinking, Systems Practice closes a chapter on what is undoubtedly the most significant single research programme on the use of systems ideas in problem solving.
Now retired from full-time university work, Peter Checkland continues his research as a Leverhulme Emeritus Fellow.
series other
last changed 2003/04/23 15:14

_id 2325
authors Chilton, John C.
year 1992
title Computer Aided Structural Design in Architectural Instruction
doi https://doi.org/10.52842/conf.ecaade.1992.443
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 443-450
summary In schools of architecture there is a tendency to associate the use of computers solely with the production of graphic images as part of the architectural design process. However, if the architecture is to work as a building it is also essential that technical aspects of the design are adequately investigated. One of the problem areas for most architectural students is structural design and they are often reluctant to use hand calculations to determine sizes of structural elements within their projects. In recent years, much of the drudgery of hand calculation has been removed from the engineer by the use of computers, and this has, hopefully, allowed a more thorough investigation of conceptual ideas and alternatives. The same benefit is now becoming available to architectural students. This is in the form of structural analysis and design programs that can be used, even by those having a limited knowledge of structural engineering, to assess the stability of designs and obtain approximate sizes for individual structural elements. The paper discusses how the use of such programs is taught, within the School of Architecture at Nottingham. Examples will be given of how they can assist students in the architectural design process. In particular, the application of GLULAM, a program for estimating sizes of laminated timber elements and SAND, a structural analysis and design package, will be described.
series eCAADe
last changed 2022/06/07 07:55

_id e51d
authors Fazio, P., Bedard, C. and Gowri, K.
year 1992
title Constraints for Generating Building Envelope Design Alternatives
source New York: John Wiley & Sons, 1992. pp. 145-155 : charts. includes bibliography
summary The building envelope design process involves selecting materials and constructional types for envelope components. Many different materials need to be combined for wall and roof assemblies to meet the various performance requirements such as thermal efficiency, cost, acoustic and fire resistances. The number of performance attributes to be considered in the design process is large. Lack of information, time limitations and the large number of feasible design alternatives generally force the designer to rely on past experience and practical judgement to make rapid design decisions. Current work at the Centre for Building Studies focuses on the development of knowledge-based synthesis and evaluation techniques for reducing the problems of information handling and decision making in building envelope design. The generation of design alternatives is viewed as a search process that identifies feasible combinations of building envelope components satisfying a set of performance requirements, material compatibility, practicality of design, etc. This paper discusses knowledge acquisition and representation issues involved in the definition of constraints to guide the generation of feasible combinations of envelope components
keywords envelope, knowledge base, knowledge acquisition, representation, performance, design, structures, architecture, evaluation
series CADline
last changed 2003/06/02 14:41
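The abstract's view of generation as a constraint-filtered search over component combinations can be sketched roughly as follows. The materials, attribute values and constraint thresholds are invented for illustration and do not come from the paper.

```python
from itertools import product

# Hypothetical component catalogues with per-square-metre cost and
# thermal resistance (all numbers invented).
CLADDINGS = {"brick": {"cost": 90, "r_value": 0.8},
             "metal": {"cost": 60, "r_value": 0.4}}
INSULATIONS = {"mineral_wool": {"cost": 25, "r_value": 3.0},
               "polystyrene": {"cost": 20, "r_value": 2.5}}

def feasible(cladding, insulation, max_cost=120, min_r=3.0):
    # A combination survives only if every performance constraint holds.
    total_cost = CLADDINGS[cladding]["cost"] + INSULATIONS[insulation]["cost"]
    total_r = CLADDINGS[cladding]["r_value"] + INSULATIONS[insulation]["r_value"]
    return total_cost <= max_cost and total_r >= min_r

def generate_alternatives():
    # Search the cross product of components, pruning with constraints.
    return [(c, i) for c, i in product(CLADDINGS, INSULATIONS)
            if feasible(c, i)]

alternatives = generate_alternatives()
```

With a real catalogue the constraints would also encode material compatibility and practicality, as the abstract notes.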

_id 83f7
authors Fenves, Stephen J., Flemming, Ulrich and Hendrickson, Craig (et al)
year 1992
title Performance Evaluation in an Integrated Software Environment for Building Design and Construction Planning
source New York: John Wiley & Sons, 1992. pp. 159-169 : ill. includes bibliography
summary In this paper the authors describe the role of performance evaluation in the Integrated Software Environment for Building Design and Construction Planning (IBDE), which is a testbed for examining integration issues in the same domain. Various processes in IBDE deal with the spatial configuration, structural design, and construction planning of high-rise office buildings. Performance evaluations occur within these processes based on different representation schemes and control mechanisms for the handling of performance knowledge. Within this multiprocess environment, opportunities also exist for performance evaluation across disciplines through design critics
keywords evaluation, performance, integration, systems, building, design, construction, architecture, planning, structures, representation, control
series CADline
email
last changed 2003/06/02 10:24

_id 7ce5
authors Gal, Shahaf
year 1992
title Computers and Design Activities: Their Mediating Role in Engineering Education
source Sociomedia, ed. Edward Barrett. MIT Press
summary Sociomedia: With all the new words used to describe electronic communication (multimedia, hypertext, cyberspace, etc.), do we need another one? Edward Barrett thinks we do; hence, he coins the term "sociomedia." It is meant to displace a computing economy in which technicity is hypostasized over sociality. Sociomedia, a compilation of twenty-five articles on the theory, design and practice of educational multimedia and hypermedia, attempts to re-value the communicational face of computing. Value, of course, is "ultimately a social construct." As such, it has everything to do with knowledge, power, education and technology. The projects discussed in this book represent the leading edge of electronic knowledge production in academia (not to mention major funding) and are determining the future of educational media. For these reasons, Sociomedia warrants close inspection. Barrett's introduction sets the tone. For him, designing computer media involves hardwiring a mechanism for the social construction of knowledge (1). He links computing to a process of social and communicative interactivity for constructing and disseminating knowledge. Through a mechanistic mapping of the university as hypercontext (a huge network that includes classrooms as well as services and offices), Barrett models intellectual work in such a way as to avoid "limiting definitions of human nature or human development." Education, then, can remain "where it should be--in the human domain (public and private) of sharing ideas and information through the medium of language." By leaving education in a virtual realm (where we can continue to disagree about its meaning and execution), it remains viral, mutating and contaminating in an intellectually healthy way. He concludes that his mechanistic model, by means of its reductionist approach, preserves value (7). This "value" is the social construction of knowledge.
While I support the social orientation of Barrett's argument, discussions of value are related to power. I am not referring to the traditional teacher-student power structure that is supposedly dismantled through cooperative and constructivist learning strategies. The power to be reckoned with in the educational arena is foundational, that which (pre)determines value and the circulation of knowledge. "Since each of you reading this paragraph has a different perspective on the meaning of 'education' or 'learning,' and on the processes involved in 'getting an education,' think of the hybris in trying to capture education in a programmable function, in a displayable object, in a 'teaching machine'" (7). Actually, we must think about that hybris because it is, precisely, what informs teaching machines. Moreover, the basic epistemological premises that give rise to such productions are too often assumed. In the case of instructional design, the episteme of cognitive sciences is often taken for granted. It is ironic that many of the "postmodernists" who support electronic hypertextuality seem to have missed Jacques Derrida's and Michel Foucault's "deconstructions" of the epistemology underpinning cognitive sciences (if not of epistemology itself). Perhaps it is the glitz of the technology that blinds some users (qua developers) to the belief systems operating beneath the surface. Barrett is not guilty of reactionary thinking or politics; he is, in fact, quite in line with much American deconstructive and postmodern thinking. The problem arises in that he leaves open the definitions of "education," "learning" and "getting an education." One cannot engage in the production of new knowledge without orienting its design, production and dissemination, and without negotiating with others' orientations, especially where large-scale funding is involved. Notions of human nature and development are structural, even infrastructural, whatever the medium of the teaching machine.
Although he addresses some dynamics of power, money and politics when he talks about the recession and its effects on the conference, they are readily visible dynamics of power (3-4). Where does the critical factor of value determination, of power, of who gets what and why, get mapped onto a mechanistic model of learning institutions? Perhaps a mapping of contributors' institutions, of the funding sources for the projects showcased and for participation in the conference, and of the disciplines receiving funding for these sorts of projects would help visualize the configurations of power operative in the rising field of educational multimedia. Questions of power and money notwithstanding, Barrett's introduction sets the social and textual thematics for the collection of essays. His stress on interactivity, on communal knowledge production, on the society of texts, and on media producers and users is carried forward through the other essays, two of which I will discuss. Section I of the book, "Perspectives...," highlights the foundations, uses and possible consequences of multimedia and hypertextuality. The second essay in this section, "Is There a Class in This Text?," plays on the robust exchange surrounding Stanley Fish's book, Is There a Text in This Class?, which presents an attack on authority in reading. The author, John Slatin, has introduced electronic hypertextuality and interaction into his courses. His article maps the transformations in "the content and nature of work, and the workplace itself"-- which, in this case, is not industry but an English poetry class (25). Slatin discovered an increase of productive and cooperative learning in his electronically-mediated classroom. For him, creating knowledge in the electronic classroom involves interaction between students, instructors and course materials through the medium of interactive written discourse.
These interactions lead to a new and persistent understanding of the course materials and of the participants' relation to the materials and to one another. The work of the course is to build relationships that, in my view, constitute not only the meaning of individual poems, but poetry itself. The class carries out its work in the continual and usually interactive production of text (31). While I applaud his strategies which dismantle traditional hierarchical structures in academia, the evidence does not convince me that the students know enough to ask important questions or to form a self-directing, learning community. Stanley Fish has not relinquished professing, though he, too, espouses the indeterminacy of the sign. By the fourth week of his course, Slatin's input is, by his own reckoning, reduced to 4% (39). In the transcript of the "controversial" Week 6 exchange on Gertrude Stein--the most disliked poet they were discussing at the time (40)--we see the blind leading the blind. One student parodies Stein for three lines and sums up his input with "I like it." Another finds Stein's poetry "almost completey [sic] lacking in emotion or any artistic merit" (emphasis added). On what grounds has this student become an arbiter of "artistic merit"? Another student, after admitting being "lost" during the Wallace Stevens discussion, talks of having more "respect for Stevens' work than Stein's" and adds that Stein's poetry lacks "conceptual significance[, s]omething which people of varied opinion can intelligently discuss without feeling like total dimwits...." This student has progressed from admitted incomprehension of Stevens' work to imposing her (groundless) respect for his work over Stein's. Then, she exposes her real dislike for Stein's poetry: that she (the student) missed the "conceptual significance" and hence cannot, being a person "of varied opinion," intelligently discuss it "without feeling like [a] total dimwit."
Slatin's comment is frightening: "...by this point in the semester students have come to feel increasingly free to challenge the instructor" (41). The students that I have cited are neither thinking critically nor are their preconceptions challenged by student-governed interaction. Thanks to the class format, one student feels self-righteous in her ignorance, and empowered to censure. I believe strongly in student empowerment in the classroom, but only once students have accrued enough knowledge to make informed judgments. Admittedly, Slatin's essay presents only partial data (there are six hundred pages of course transcripts!); still, I wonder how much valuable knowledge and metaknowledge was gained by the students. I also question the extent to which authority and professorial dictature were addressed in this course format. The power structures that make it possible for a college to require such a course, and the choice of texts and pedagogy, were not "on the table." The traditional professorial position may have been displaced, but what took its place?--the authority of consensus with its unidentifiable strong arm, and the faceless reign of software design? Despite Slatin's claim that the students learned about the learning process, there is no evidence (in the article) that the students considered where their attitudes came from, how consensus operates in the construction of knowledge, how power is established and what relationship they have to bureaucratic institutions. How do we, as teaching professionals, negotiate a balance between an enlightened despotism in education and student-created knowledge? Slatin, and other authors in this book, bring this fundamental question to the fore. There is no definitive answer because the factors involved are ultimately social, and hence, always shifting and reconfiguring.
Slatin ends his article with the caveat that computerization can bring about greater estrangement between students, faculty and administration through greater regimentation and control. Of course, it can also "distribute authority and power more widely" (50). Power or authority without a specific face, however, is not necessarily good or just. Shahaf Gal's "Computers and Design Activities: Their Mediating Role in Engineering Education" is found in the second half of the volume, and does not allow for a theory/praxis dichotomy. Gal recounts a brief history of engineering education up to the introduction of Growltiger (GT), a computer-assisted learning aid for design. He demonstrates GT's potential to impact the learning of engineering design by tracking its use by four students in a bridge-building contest. What his text demonstrates clearly is that computers are "inscribing and imaging devices" that add another viewpoint to an on-going dialogue between student, teacher, earlier coursework, and other teaching/learning tools. The less proficient students made a serious error by relying too heavily on the technology, or treating it as a "blueprint provider." They "interacted with GT in a way that trusted the data to represent reality. They did not see their interaction with GT as a negotiation between two knowledge systems" (495). Students who were more thoroughly informed in engineering discourses knew to use the technology as one voice among others--they knew enough not simply to accept the input of the computer as authoritative. The less-advanced students learned a valuable lesson from the competition itself: the fact that their designs were not able to hold up under pressure (literally) brought the fact of their insufficient knowledge crashing down on them (and their bridges). They also had, post factum, several other designs to study, especially the winning one. 
Although competition and comparison are not good pedagogical strategies for everyone (in this case the competitors had volunteered), at some point what we think we know has to be challenged within the society of discourses to which it belongs. Students need critique in order to learn to push their learning into auto-critique. This is what is lacking in Slatin's discussion and in the writings of other avatars of constructivist, collaborative and computer-mediated pedagogies. Obviously there are differences between instrumental types of knowledge acquisition and discursive knowledge accumulation. Indeed, I do not promote the teaching of reading, thinking and writing as "skills" per se (then again, Gal's teaching of design is quite discursive, if not dialogic). Nevertheless, the "soft" sciences might benefit from "bridge-building" competitions or the re-institution of some forms of agonia. Not everything agonistic is inhuman agony--the joy of confronting or creating a sound argument supported by defensible evidence, for example. Students need to know that soundbites are not sound arguments despite predictions that electronic writing will be aphoristic rather than periodic. Just because writing and learning can be conceived of hypertextually does not mean that rigor goes the way of the dinosaur. Rigor and hypertextuality are not mutually incompatible. Nor are rigorous thinking and hard intellectual work unpleasurable, although American anti-intellectualism, especially in the mass media, would make it so. At a time when the spurious dogmatics of a Rush Limbaugh and Holocaust revisionist historians circulate "aphoristically" in cyberspace, and at a time when knowledge is becoming increasingly textualized, the role of critical thinking in education will ultimately determine the value(s) of socially constructed knowledge.
This volume affords the reader an opportunity to reconsider knowledge, power, and new communications technologies with respect to social dynamics and power relationships.
series other
last changed 2003/04/23 15:14

_id 1076
authors Gero, John S. and Saunders, Robert
year 2000
title Constructed Representations and Their Functions in Computational Models of Designing
doi https://doi.org/10.52842/conf.caadria.2000.215
source CAADRIA 2000 [Proceedings of the Fifth Conference on Computer Aided Architectural Design Research in Asia / ISBN 981-04-2491-4] Singapore 18-19 May 2000, pp. 215-224
summary This paper re-examines the conclusions made by Schön and Wiggins in 1992 that computers were unable to reproduce processes crucial to designing. We propose that recent developments in artificial intelligence and design computing put us in a position where we can begin to computationally model designing as conceived by Schön and Wiggins. We present a computational model of designing using situated processes that construct representations. We show how constructed representations support computational processes that model the different kinds of seeing reported in designing. We also present recently developed computational processes that can identify unexpected consequences of design actions using adaptive novelty detection.
series CAADRIA
email
last changed 2022/06/07 07:51
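The adaptive novelty detection mentioned in the abstract can be illustrated with a toy detector: a design is flagged as unexpected when it lies far from everything seen so far, and the memory then adapts by absorbing the new observation. The distance measure, threshold and feature vectors below are assumptions for the sketch, not the authors' model.

```python
def distance(a, b):
    # Euclidean distance between two feature vectors.
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

class NoveltyDetector:
    def __init__(self, threshold=1.0):
        self.memory = []          # feature vectors of designs seen so far
        self.threshold = threshold

    def observe(self, features):
        # Novel if no remembered design lies within the threshold distance.
        novel = all(distance(features, m) > self.threshold
                    for m in self.memory)
        self.memory.append(features)  # adapt: expectations now include it
        return novel

detector = NoveltyDetector(threshold=1.0)
first = detector.observe((0.0, 0.0))    # empty memory: everything is novel
repeat = detector.observe((0.1, 0.0))   # close to the first: not novel
outlier = detector.observe((5.0, 5.0))  # far from both: novel again
```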

_id eda3
authors Goldschmidt, Gabriela
year 1992
title Criteria for Design Evaluation : A Process-Oriented Paradigm
source New York: John Wiley & Sons, 1992. pp. 67-79. includes bibliography
summary Architectural research of the last two or three decades has been largely devoted to design methodology. Systematic evaluations of design products and prescription of their desired qualities led to specifications for better designs and possible routines to achieve them. Computers have facilitated this task. The human designer, however, has largely resisted the use of innovative methods. In this paper the author claims that the reason for that lies in insufficient regard for innate cognitive aptitudes which are activated in the process of designing. A view of these aptitudes, based on patterns of links among design moves, is presented. It is proposed that process research is mandatory for further advancements in design research utility
keywords cognition, design process, research, protocol analysis, architecture
series CADline
last changed 1999/02/12 15:08

_id 6df3
authors Gross, Mark D. and Zimring, Craig
year 1992
title Predicting Wayfinding Behavior in Buildings : A Schema-Based Approach
source New York: John Wiley & Sons, 1992. pp. 367-377 : ill. includes bibliography
summary Postoccupancy evaluations of large buildings often reveal significant wayfinding problems caused by poor floor-plan layout. Predicting wayfinding problems early in the design process could avoid costly remodeling and make better buildings. However, we lack formal, predictive models of human wayfinding behavior. Computational models of wayfinding in buildings have addressed constructing topological and geometric representations of the plan layout incrementally during exploration. The authors propose to combine this with a schema model of building memory. It is argued that people orient themselves and wayfind in new buildings using schemas, or generic expectations about building layout. In this paper the authors give their preliminary thoughts toward developing a computational model of wayfinding based on this approach
keywords wayfinding, evaluation, applications, architecture, floor plans, layout, building, prediction
series CADline
email
last changed 2002/09/05 15:02
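One minimal reading of the schema idea in this abstract: encode a schema as generic adjacency expectations about a building type, and predict wayfinding trouble when a floor plan violates too many of them. The schema, layouts and tolerance below are invented for illustration and are not the authors' model.

```python
# Hypothetical schema: adjacencies people generically expect in an
# office building (space names invented).
OFFICE_SCHEMA = {
    ("entrance", "lobby"),
    ("lobby", "elevator"),
    ("elevator", "corridor"),
    ("corridor", "office"),
}

def expectation_violations(layout_adjacencies):
    # Count schema expectations the actual floor plan fails to satisfy.
    return len(OFFICE_SCHEMA - set(layout_adjacencies))

def predict_wayfinding_problems(layout_adjacencies, tolerance=1):
    # Predict trouble when violations exceed a (hypothetical) tolerance.
    return expectation_violations(layout_adjacencies) > tolerance

conventional = [("entrance", "lobby"), ("lobby", "elevator"),
                ("elevator", "corridor"), ("corridor", "office")]
unconventional = [("entrance", "corridor"), ("corridor", "elevator")]
```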

_id ea96
authors Hacfoort, Eek J. and Veldhuisen, Jan K.
year 1992
title A Building Design and Evaluation System
source New York: John Wiley & Sons, 1992. pp. 195-211 : ill. table. includes bibliography
summary Within the field of architectural design there is a growing awareness of imbalance among the professionalism, the experience, and the creativity of the designers' response to the up-to-date requirements of all parties interested in the design process. The building design and evaluation system COSMOS makes it possible for various participants to work within their own domain, so that separated but coordinated work can be done. This system is meant to organize the initial stage of the design process, where user-defined functions, geometry, type of construction, and building materials are decided. It offers a tool to design a building, to calculate a number of effects, and to manage the information necessary to evaluate the design decisions. The system is provided with data and sets of parameters for describing the conditions, along with their properties, of the main building functions of a selection of well-known building types. The architectural design is conceptualized as being a hierarchy of spatial units, ranging from building blocks down to specific rooms or spaces. The concept of zoning is used as a means of calculating and directly evaluating the structure of the design without working out the details. A distinction is made between internal and external calculations and evaluations during the initial design process. During design on screen, an estimation can be recorded of building costs, energy costs, acoustics, lighting, construction, and utility. Furthermore, the design can be exported to a design application program, in this case AutoCAD, to make and show drawings in more detail. Through the medium of a database, external calculation and evaluation of building costs, life-cycle costs, energy costs, interior climate, acoustics, lighting, construction, and utility are possible in much more advanced application programs
keywords evaluation, applications, integration, architecture, design, construction, building, energy, cost, lighting, acoustics, performance
series CADline
last changed 2003/06/02 13:58
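The hierarchy of spatial units and zoning-level estimation described in the abstract can be sketched as a cost roll-up from rooms through zones to the building, evaluated without working out details. The hierarchy and unit rates below are invented and bear no relation to COSMOS's actual data.

```python
# Hypothetical unit costs per square metre by room type.
UNIT_COST = {"office": 900, "meeting": 1100, "circulation": 600}

# A building as a hierarchy: zones containing rooms with floor areas (m2).
building = {
    "zone_a": {"office": 120.0, "meeting": 40.0},
    "zone_b": {"office": 80.0, "circulation": 30.0},
}

def zone_cost(rooms):
    # Evaluate a zone directly from its rooms, without detailed drawings.
    return sum(UNIT_COST[kind] * area for kind, area in rooms.items())

def building_cost(zones):
    # Roll the zone estimates up to a whole-building figure.
    return sum(zone_cost(rooms) for rooms in zones.values())

estimate = building_cost(building)
```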

_id 7e68
authors Holland, J.
year 1992
title Genetic Algorithms
source Scientific American, July 1992
summary Living organisms are consummate problem solvers. They exhibit a versatility that puts the best computer programs to shame. This observation is especially galling for computer scientists, who may spend months or years of intellectual effort on an algorithm, whereas organisms come by their abilities through the apparently undirected mechanism of evolution and natural selection. Pragmatic researchers see evolution's remarkable power as something to be emulated rather than envied. Natural selection eliminates one of the greatest hurdles in software design: specifying in advance all the features of a problem and the actions a program should take to deal with them. By harnessing the mechanisms of evolution, researchers may be able to "breed" programs that solve problems even when no person can fully understand their structure. Indeed, these so-called genetic algorithms have already demonstrated the ability to make breakthroughs in the design of such complex systems as jet engines. Genetic algorithms make it possible to explore a far greater range of potential solutions to a problem than do conventional programs. Furthermore, as researchers probe the natural selection of programs under controlled and well-understood conditions, the practical results they achieve may yield some insight into the details of how life and intelligence evolve in the natural world.
series journal paper
last changed 2003/04/23 15:50
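The select-crossover-mutate loop the abstract describes can be sketched in a few lines. The toy fitness function (maximize the number of 1-bits in a bit string) and all parameter values are illustrative assumptions, not taken from Holland's article.

```python
import random

def fitness(genome):
    # Toy objective: count of 1-bits. Real applications plug in a
    # domain-specific evaluation (e.g. a simulated engine's performance).
    return sum(genome)

def evolve(pop_size=30, genome_len=20, generations=60,
           mutation_rate=0.01, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            # Tournament selection: fitter genomes breed more often.
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, genome_len)       # one-point crossover
            child = p1[:cut] + p2[cut:]
            for i in range(genome_len):              # point mutation
                if rng.random() < mutation_rate:
                    child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve()
```

Note that nothing in the loop encodes how to solve the problem; the population simply drifts toward higher fitness, which is the "breeding programs" point the abstract makes.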

_id 2467
authors Jockusch, Peter R.A.
year 1992
title How Can We Achieve a Good Building?
source New York: John Wiley & Sons, 1992. pp. 51-65 : ill. includes bibliography
summary This paper is concerned with the reasons and purposes for which we evaluate and predict building performance. An attempt is made to discover for whom and in what respect a building can be considered a 'good building,' by asking the following questions: What can prediction and evaluation of building performance achieve? How well can we assess the performance and value of an existing building within its socio-technical context? For what purposes and with what degree of confidence can the eventual performance of a designed and specified building be predicted? How do these evaluations compare to actual post-occupancy performance? To what extent do the roles and motivations of assessors, evaluators, and decision makers affect the value-stating process? The discussion is based on the author's experience, gained through the preparation and evaluation of more than 50 major architectural competitions
keywords prediction, evaluation, performance, building, life cycle, design, architecture
series CADline
last changed 2003/06/02 13:58

_id cc2f
authors Jog, Bharati
year 1992
title Evaluation of Designs for Energy Performance Using A Knowledge-Based System
source New York: John Wiley & Sons, 1992. pp. 293-304 : ill. includes a bibliography
summary Principles of knowledge-based (or expert) systems have been applied in different knowledge-rich domains such as geology, medicine, and very large scale integrated circuits (VLSI). There have been some efforts to develop expert systems for evaluation and prediction of architectural designs in this decade. This paper presents a prototype system, Energy Expert, which quickly computes the approximate yearly energy performance of a building design, analyzes the energy performance, and gives advice on possible ways of improving the design. These modifications are intended to make the building more energy efficient and help cut down on heating and cooling costs. The system is designed for the schematic design phase of an architectural project. Also discussed briefly is the reasoning behind developing such a system for the schematic design rather than the final design phase
keywords expert systems, energy, evaluation, performance, knowledge base, architecture, reasoning, programming, prediction
series CADline
last changed 1999/02/12 15:08
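The shape of a schematic-phase system like the one the abstract describes, a quick approximate energy figure followed by if-then advice, can be sketched as follows. All coefficients, thresholds, and advice strings are invented for illustration and are not taken from Jog's Energy Expert.

```python
def annual_heating_kwh(area_m2, u_value, degree_hours=80_000):
    # Steady-state approximation: Q = U * A * degree-hours, in kWh.
    # 80,000 K·h/yr is a placeholder climate figure.
    return u_value * area_m2 * degree_hours / 1000.0

def advise(envelope):
    """envelope: list of (name, area_m2, u_value) tuples; returns advice strings."""
    advice = []
    for name, area, u in envelope:
        load = annual_heating_kwh(area, u)
        if u > 1.5:       # rule: poorly insulating component
            advice.append(f"{name}: U={u} is high; consider better insulation")
        if load > 10_000:  # rule: component dominates annual losses
            advice.append(f"{name}: dominates losses (~{load:.0f} kWh/yr)")
    return advice

envelope = [("roof", 120.0, 0.3), ("glazing", 40.0, 2.8), ("walls", 200.0, 0.5)]
tips = advise(envelope)
```

The appeal at the schematic stage is exactly this coarseness: the rules fire on a handful of envelope numbers that exist before any detailed design does.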

_id 49bf
authors Johnson, Robert E.
year 1992
title Design Inquiry and Resource Allocation
source New York: John Wiley & Sons, 1992. pp. 51-65 : ill. tables. includes bibliography
summary This paper proposes that the primary role of resource allocation in design is to assist design decision makers in ordering preferences and exploring trade-offs. Most existing cost evaluation paradigms focus on assessing costs after design decisions are made. This view unnecessarily restricts the active participation of economic knowledge in design decision-making. The approach described in this research suggests that the exploration and definition of values and preferences should be the major focus of economic analysis within the design process. A conceptual framework for this approach is presented along with several examples that illustrate the use of this framework. Computational approaches are suggested which play a central role in clarifying preferences and exploring trade-offs during design
keywords economics, architecture, building, construction, resource allocation, design, cost, evaluation
series CADline
last changed 2003/06/02 13:58

_id acaa
authors Kalay, Yehuda E.
year 1992
title Evaluating and Predicting Design Performance
source New York: John Wiley & Sons, 1992. pp. 399-404
summary This article is the conclusion chapter of the book by the same title. Evaluation can be defined as measuring the fit between achieved or expected performances and stated criteria. Prediction is the process whereby expected performance characteristics are simulated, or otherwise made tangible, when evaluation is applied to hypothetical design solutions. The multifaceted nature of design solutions precludes optimization of any one performance characteristic. Rather, a good design solution will strike a balance in the degree to which any performance criterion is achieved, such that overall performance will be maximized. This paper discusses the nature of evaluation and prediction, their multilevel and multifaceted dimensions, and some of the approaches that have been proposed to perform quantitative and qualitative evaluations
keywords evaluation, performance, prediction, multicriteria, architecture, design process
series CADline
email
last changed 2003/06/02 13:58
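One common way to make the "balance across criteria" idea operational is a weighted-sum score: normalize each predicted performance measure to a 0-1 satisfaction level and combine them with stakeholder weights. The criteria, scores, and weights below are hypothetical; Kalay's chapter surveys this family of approaches (including qualitative ones) rather than prescribing a single formula.

```python
def overall_score(scores, weights):
    """scores, weights: dicts keyed by criterion; weights need not sum to 1."""
    total_w = sum(weights.values())
    return sum(weights[c] * scores[c] for c in scores) / total_w

# Hypothetical design: each criterion already normalized to [0, 1].
design  = {"cost": 0.6, "daylight": 0.9, "acoustics": 0.7}
weights = {"cost": 3.0, "daylight": 1.0, "acoustics": 1.0}

score = overall_score(design, weights)   # weighted mean, stays in [0, 1]
```

The weights are where the multifaceted trade-off lives: a cost-driven client and a comfort-driven occupant produce different overall scores from the same predictions.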

_id e7c8
authors Kalisperis, Loukas N., Steinman, Mitch and Summers, Luis H.
year 1992
title Design Knowledge, Environmental Complexity in Nonorthogonal Space
source New York: John Wiley & Sons, 1992. pp. 273-291 : ill. includes bibliography
summary Mechanization and industrialization of society has resulted in most people spending the greater part of their lives in enclosed environments. Optimal design of indoor artificial climates is therefore of increasing importance. Wherever artificial climates are created for human occupation, the aim is that the environment be designed so that individuals are in thermal comfort. Current design methodologies for radiant panel heating systems do not adequately account for the complexities of human thermal comfort, because they monitor air temperature alone and do not account for thermal neutrality in complex enclosures. Thermal comfort for a person is defined as that condition of mind which expresses satisfaction with the thermal environment. Thermal comfort is dependent on Mean Radiant Temperature and Operative Temperature among other factors. In designing artificial climates for human occupancy, the interaction of the human with the heated surfaces as well as the surface-to-surface heat exchange must be accounted for. Early work in the area provided an elaborate and difficult method for calculating radiant heat exchange for simplistic and orthogonal enclosures. A new improved method developed by the authors for designing radiant panel heating systems based on human thermal comfort and mean radiant temperature is presented. Through automation and elaboration this method overcomes the limitations of the early work. The design procedure accounts for human thermal comfort in nonorthogonal as well as orthogonal spaces based on mean radiant temperature prediction. The limitation of simplistic orthogonal geometries has been overcome with the introduction of the MRT-Correction method and inclined surface-to-person shape factor methodology. 
The new design method increases the accuracy of calculation and prediction of human thermal comfort and will allow designers to simulate complex enclosures utilizing the latest design knowledge of radiant heat exchange to increase human thermal comfort
keywords applications, architecture, building, energy, systems, design, knowledge
series CADline
last changed 2003/06/02 10:24
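The mean-radiant-temperature idea behind the abstract can be sketched directly: the MRT seen by an occupant is the view-factor-weighted combination of surrounding surface temperatures, MRT^4 = sum(F_i * T_i^4) with temperatures in kelvin and view factors summing to 1. The surface data below are hypothetical, and the paper's MRT-Correction and inclined surface-to-person shape factor methods are not reproduced here.

```python
def mean_radiant_temp(surfaces):
    """surfaces: list of (view_factor, temp_celsius) as seen from the occupant."""
    total_f = sum(f for f, _ in surfaces)
    assert abs(total_f - 1.0) < 1e-9, "view factors must sum to 1"
    # Fourth-power (Stefan-Boltzmann) weighting, done in kelvin.
    fourth = sum(f * (t + 273.15) ** 4 for f, t in surfaces)
    return fourth ** 0.25 - 273.15

# Hypothetical room: warm radiant ceiling panel, cool window, neutral walls.
room = [(0.2, 35.0), (0.1, 10.0), (0.7, 20.0)]
mrt = mean_radiant_temp(room)
```

This is why air temperature alone is insufficient, as the abstract argues: two rooms with identical air temperature but different surface temperatures and view factors yield different MRTs, and hence different comfort.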

_id a2e6
authors Liggett, R.S., Mitchell, W.J. and Tan, M.
year 1992
title Multi-Level Analysis and Optimization of Design
source New York: John Wiley & Sons, 1992. pp. 251-269 : ill. includes bibliography
summary This paper discusses a knowledge-based computer-aided design system that provides multi-level analysis capabilities and automatically propagates constraints on design variables from level to level. It also supports formulation and solution of optimization problems at different levels, so that a solution can be approached by solving a sequence of appropriately constrained sub-optimization problems. Theory and implementation are discussed, and a detailed case study of application to the design of small house plans is provided
keywords constraints, design, methods, knowledge base, CAD, systems, analysis, optimization, automation, user interface, shape grammars
series CADline
email
last changed 2003/06/02 14:41

_id 8488
authors Liggett, Robin S.
year 1992
title A Designer-Automated Algorithm Partnership : An Interactive Graphic Approach to Facility Layout
source New York: John Wiley & Sons, 1992. pp. 101-123 : ill. includes bibliography
summary Automated solution techniques for spatial allocation problems have long been of interest to researchers in computer-aided design. This paper describes research focusing on the use of an interactive graphic interface for the solution of facility layout problems that have quantifiable but sometimes competing criteria. The ideas presented in the paper have been implemented in a personal computer system
keywords algorithms, user interface, layout, synthesis, floor plans, architecture, facilities planning, automation, space allocation, optimization
series CADline
email
last changed 2003/06/02 13:58
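The kind of automated improvement step a designer-algorithm partnership can offer is illustrated below: assign activities to candidate locations, then greedily accept pairwise swaps that reduce total flow-weighted distance (a classic quadratic-assignment heuristic). The flows and distances are made-up illustrations; Liggett's actual algorithms and interface are not reproduced.

```python
import itertools

def cost(assign, flow, dist):
    """Total interaction cost: flow between activities times distance
    between their assigned locations, summed over all ordered pairs."""
    n = len(assign)
    return sum(flow[i][j] * dist[assign[i]][assign[j]]
               for i in range(n) for j in range(n))

def improve(assign, flow, dist):
    """Greedy pairwise-swap descent; stops at a local optimum the
    designer can then inspect and override interactively."""
    assign = list(assign)
    improved = True
    while improved:
        improved = False
        for i, j in itertools.combinations(range(len(assign)), 2):
            before = cost(assign, flow, dist)
            assign[i], assign[j] = assign[j], assign[i]
            if cost(assign, flow, dist) < before:
                improved = True                              # keep the swap
            else:
                assign[i], assign[j] = assign[j], assign[i]  # undo it
    return assign

# Three activities, three locations on a line at x = 0, 1, 2.
flow = [[0, 5, 1], [5, 0, 1], [1, 1, 0]]                 # activity interaction
dist = [[abs(a - b) for b in range(3)] for a in range(3)]
layout = improve([0, 2, 1], flow, dist)                  # activity -> location
```

The division of labor matches the paper's theme: the algorithm handles the quantifiable criterion, while competing or unquantified criteria stay with the designer, who can fix assignments and re-run.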
