CumInCAD is a cumulative index of publications in Computer Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD futures.


Hits 1 to 20 of 219

_id 84e6
authors Seebohm, Thomas
year 1995
title A Response to William J. Mitchell's review of Possible Palladian Villas, by George Hersey and Richard Freedman, MIT Press, 1992
source AA Files (Journal of the Architectural Association School of Architecture), No. 30, Autumn 1995, pp. 109-111
summary A review by William J. Mitchell, entitled "Franchising Architectural Styles", appeared in AA Files no. 26 (Autumn 1993). It reflects on a collision between two fundamentally opposing points of view, one held by the reviewer, the other by the reviewed. These determine our expectations of the role of computers in architectural design.

series journal paper
email tseebohm@fes.uwaterloo.ca
last changed 2003/05/15 19:45

_id 2cb4
authors Bille, Pia
year 1992
title CAD at the AAA
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 279-288
summary Teaching computer science at the Aarhus School of Architecture goes back to the beginning of the 1980s, when a few teachers and students were curious about the new medium, seeing its great development potential and its possible use in the design of architecture. The curiosity and excitement about technology continued, although the results were modest and usefulness was not a dominant concern in this early period. In the mid-1980s the School of Architecture was given the opportunity, by means of state funding, to buy its first 10 IBM PCs to run AutoCAD among other programmes. Besides this, a bigger CAD system, the Gable 4D Series, was introduced, running on MicroVAX workstations. The software was dedicated to drafting buildings in 2 and 3 dimensions - an important task within the profession of architects.

series eCAADe
email pia.bille@a-aarhus.dk
last changed 2003/11/21 14:16

_id e039
authors Bertin, Vito
year 1992
title Structural Transformations (Basic Architectural Unit 6)
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 413-426
summary While the teaching of the phenomenon of form as well as space is normally set within an environment of free experimentation and personal expression, other directions prove worthy of pursuit. The proposed paper represents such an exploration. The generation of controlled complexity and structural transformations was the title of the project which forms the basis of this paper. In it, the potential for creative development of the student was explored in such a way that, as in the sciences, a process can be reproduced or an exploration utilized in further experimentation. The cube, as a well proven B.A.U. or basic architectural unit, has again been used in our work. Even a simple object like a cube has many properties. As properties are never pure, but always related to other properties, and looking at a single property as a specific value of a variable, it is possible to link a whole field of objects. These links provide a network of paths through which exploration and development are possible. The paper represents a first step in a direction which we think will complement the already established basic design program.

series eCAADe
email vito@osk.threewebnet.or.jp
last changed 2003/11/21 14:16

_id 91c4
authors Checkland, P.
year 1981
title Systems Thinking, Systems Practice
source John Wiley & Sons, Chichester
summary "Whether by design, accident or merely synchronicity, Checkland appears to have developed a habit of writing seminal publications near the start of each decade which establish the basis and framework for systems methodology research for that decade." Hamish Rennie, Journal of the Operational Research Society, 1992. Thirty years ago Peter Checkland set out to test whether the Systems Engineering (SE) approach, highly successful in technical problems, could be used by managers coping with the unfolding complexities of organizational life. The straightforward transfer of SE to the broader situations of management was not possible, but by insisting on a combination of systems thinking strongly linked to real-world practice Checkland and his collaborators developed an alternative approach - Soft Systems Methodology (SSM) - which enables managers of all kinds and at any level to deal with the subtleties and confusions of the situations they face. This work established the now accepted distinction between hard systems thinking, in which parts of the world are taken to be systems which can be engineered, and soft systems thinking, in which the focus is on making sure the process of inquiry into real-world complexity is itself a system for learning. Systems Thinking, Systems Practice (1981) and Soft Systems Methodology in Action (1990), together with an earlier paper, Towards a Systems-based Methodology for Real-World Problem Solving (1972), have long been recognized as classics in the field. Now Peter Checkland has looked back over the three decades of SSM development, brought the account of it up to date, and reflected on the whole evolutionary process which has produced a mature SSM. SSM: A 30-Year Retrospective, here included with Systems Thinking, Systems Practice, closes a chapter on what is undoubtedly the most significant single research programme on the use of systems ideas in problem solving.
Now retired from full-time university work, Peter Checkland continues his research as a Leverhulme Emeritus Fellow.
series other
last changed 2003/04/23 13:14

_id c434
authors Colajanni, B., Pellitteri, G. and Scianna, A.
year 1992
title Two Approaches to Teaching Computers in Architecture: The Experience in the Faculty of Engineering in Palermo, Italy
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 295-306
summary Teaching the use of computers in architecture poses the same kind of problems as teaching mathematics. To both there are two possible approaches. The first presents the discipline as a tool whose merely instrumental aspect is emphasized. Teaching is limited to showing the results obtainable with existing programs and how to get them. The second approach, on the contrary, emphasizes the autonomous nature of the discipline, mathematics as much as computing, on the basis of the conviction that the maximum of instrumental usefulness can be obtained through knowledge at the highest degree of generality and, hence, of abstraction. The first approach changes little in the mind of the student. He simply learns that a certain number of operations are possible, and thus worth doing - mainly checks of performance (and not only control of appearance, now easy with one of the many existing CAD programs) or searches for technical information in some database. The second approach gives the student an awareness of the manageability of abstract structures of relationships. He then acquires the idea of creating particular structures of relationships by himself and managing them. This can modify the very idea of the design procedure, giving the student the awareness that he can intervene directly in every segment of the design procedure, reshaping it to some extent in a way better suited to the particular problem he is dealing with. Of course this second approach implies learning not only a language but also the capability of coming to terms with languages. And again it is a cultural acquisition that can be very useful when applied to the languages of architecture. Furthermore, the capability of simulating even a small segment of the design process on the computer gives the student a better understanding both of the particular problem he is dealing with and of the very nature of design.
As for the first effect, it happens whenever a translation is made from one language to another. One is obliged to get to the core of the matter in order to overcome the difficulties arising from the different biases of the two languages. The second effect comes from the necessity of placing the studied segment in the general flow of the design process. The organisation into a linear sequence of actions, to be accomplished recursively in an order that varies with every design occasion, is an extremely useful exercise for understanding the significance and the techniques of formalisation of design problems.
series eCAADe
email bcolajan@mbox.unipa.it
last changed 1998/08/18 14:26

_id e412
authors Fargas, Josep and Papazian, Pegor
year 1992
title Modeling Regulations and Intentions for Urban Development: The Role of Computer Simulation in the Urban Design Studio
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 201-212
summary In this paper we present a strategy for modeling urban development in order to study the role of urban regulations and policies in the transformation of cities. We also suggest a methodology for using computer models as experimental tools in the urban design studio in order to make explicit the factors involved in shaping cities, and for the automatic visualization of projected development. The structure of the proposed model is based on different modules which represent, on the one hand, the rules regulating the physical growth of a city and, on the other hand, heuristics corresponding to different interests such as Real Estate Developers, City Hall Planners, Advocacy and Community Groups, and so on. Here we present a case study dealing with the Boston Redevelopment Authority zoning code for the Midtown Cultural District of Boston. We introduce a computer program which develops the district, adopting a particular point of view regarding urban regulation. We then generalize the notion of this type of computer modeling and simulation, and draw some conclusions about its possible uses in the teaching and practice of design.
series eCAADe
email fargas@dtec.es
last changed 2003/05/16 19:27

_id 7ce5
authors Gal, Shahaf
year 1992
title Computers and Design Activities: Their Mediating Role in Engineering Education
source Sociomedia, ed. Edward Barrett. MIT Press
summary Sociomedia: With all the new words used to describe electronic communication (multimedia, hypertext, cyberspace, etc.), do we need another one? Edward Barrett thinks we do; hence, he coins the term "sociomedia." It is meant to displace a computing economy in which technicity is hypostasized over sociality. Sociomedia, a compilation of twenty-five articles on the theory, design and practice of educational multimedia and hypermedia, attempts to re-value the communicational face of computing. Value, of course, is "ultimately a social construct." As such, it has everything to do with knowledge, power, education and technology. The projects discussed in this book represent the leading edge of electronic knowledge production in academia (not to mention major funding) and are determining the future of educational media. For these reasons, Sociomedia warrants close inspection. Barrett's introduction sets the tone. For him, designing computer media involves hardwiring a mechanism for the social construction of knowledge (1). He links computing to a process of social and communicative interactivity for constructing and disseminating knowledge. Through a mechanistic mapping of the university as hypercontext (a huge network that includes classrooms as well as services and offices), Barrett models intellectual work in such a way as to avoid "limiting definitions of human nature or human development." Education, then, can remain "where it should be--in the human domain (public and private) of sharing ideas and information through the medium of language." By leaving education in a virtual realm (where we can continue to disagree about its meaning and execution), it remains viral, mutating and contaminating in an intellectually healthy way. He concludes that his mechanistic model, by means of its reductionist approach, preserves value (7). This "value" is the social construction of knowledge.
While I support the social orientation of Barrett's argument, discussions of value are related to power. I am not referring to the traditional teacher-student power structure that is supposedly dismantled through cooperative and constructivist learning strategies. The power to be reckoned with in the educational arena is foundational, that which (pre)determines value and the circulation of knowledge. "Since each of you reading this paragraph has a different perspective on the meaning of 'education' or 'learning,' and on the processes involved in 'getting an education,' think of the hybris in trying to capture education in a programmable function, in a displayable object, in a 'teaching machine'" (7). Actually, we must think about that hybris because it is, precisely, what informs teaching machines. Moreover, the basic epistemological premises that give rise to such productions are too often assumed. In the case of instructional design, the episteme of cognitive sciences is often taken for granted. It is ironic that many of the "postmodernists" who support electronic hypertextuality seem to have missed Jacques Derrida's and Michel Foucault's "deconstructions" of the epistemology underpinning cognitive sciences (if not of epistemology itself). Perhaps it is the glitz of the technology that blinds some users (qua developers) to the belief systems operating beneath the surface. Barrett is not guilty of reactionary thinking or politics; he is, in fact, quite in line with much American deconstructive and postmodern thinking. The problem arises in that he leaves open the definitions of "education," "learning" and "getting an education." One cannot engage in the production of new knowledge without orienting its design, production and dissemination, and without negotiating with others' orientations, especially where large-scale funding is involved. Notions of human nature and development are structural, even infrastructural, whatever the medium of the teaching machine.
Although he addresses some dynamics of power, money and politics when he talks about the recession and its effects on the conference, they are readily visible dynamics of power (3-4). Where does the critical factor of value determination, of power, of who gets what and why, get mapped onto a mechanistic model of learning institutions? Perhaps a mapping of contributors' institutions, of the funding sources for the projects showcased and for participation in the conference, and of the disciplines receiving funding for these sorts of projects would help visualize the configurations of power operative in the rising field of educational multimedia. Questions of power and money notwithstanding, Barrett's introduction sets the social and textual thematics for the collection of essays. His stress on interactivity, on communal knowledge production, on the society of texts, and on media producers and users is carried forward through the other essays, two of which I will discuss. Section I of the book, "Perspectives...," highlights the foundations, uses and possible consequences of multimedia and hypertextuality. The second essay in this section, "Is There a Class in This Text?," plays on the robust exchange surrounding Stanley Fish's book, Is There a Text in This Class?, which presents an attack on authority in reading. The author, John Slatin, has introduced electronic hypertextuality and interaction into his courses. His article maps the transformations in "the content and nature of work, and the workplace itself"--which, in this case, is not industry but an English poetry class (25). Slatin discovered an increase of productive and cooperative learning in his electronically-mediated classroom. For him, creating knowledge in the electronic classroom involves interaction between students, instructors and course materials through the medium of interactive written discourse.
These interactions lead to a new and persistent understanding of the course materials and of the participants' relation to the materials and to one another. The work of the course is to build relationships that, in my view, constitute not only the meaning of individual poems, but poetry itself. The class carries out its work in the continual and usually interactive production of text (31). While I applaud his strategies which dismantle traditional hierarchical structures in academia, the evidence does not convince me that the students know enough to ask important questions or to form a self-directing, learning community. Stanley Fish has not relinquished professing, though he, too, espouses the indeterminacy of the sign. By the fourth week of his course, Slatin's input is, by his own reckoning, reduced to 4% (39). In the transcript of the "controversial" Week 6 exchange on Gertrude Stein--the most disliked poet they were discussing at the time (40)--we see the blind leading the blind. One student parodies Stein for three lines and sums up his input with "I like it." Another finds Stein's poetry "almost completey [sic] lacking in emotion or any artistic merit" (emphasis added). On what grounds has this student become an arbiter of "artistic merit"? Another student, after admitting being "lost" during the Wallace Stevens discussion, talks of having more "respect for Stevens' work than Stein's" and adds that Stein's poetry lacks "conceptual significance[, s]omething which people of varied opinion can intelligently discuss without feeling like total dimwits...." This student has progressed from admitted incomprehension of Stevens' work to imposing her (groundless) respect for his work over Stein's. Then, she exposes her real dislike for Stein's poetry: that she (the student) missed the "conceptual significance" and hence cannot, being a person "of varied opinion," intelligently discuss it "without feeling like [a] total dimwit."
Slatin's comment is frightening: "...by this point in the semester students have come to feel increasingly free to challenge the instructor" (41). The students that I have cited are neither thinking critically nor are their preconceptions challenged by student-governed interaction. Thanks to the class format, one student feels self-righteous in her ignorance, and empowered to censure. I believe strongly in student empowerment in the classroom, but only once students have accrued enough knowledge to make informed judgments. Admittedly, Slatin's essay presents only partial data (there are six hundred pages of course transcripts!); still, I wonder how much valuable knowledge and metaknowledge was gained by the students. I also question the extent to which authority and professorial dictatorship were addressed in this course format. The power structures that make it possible for a college to require such a course, and the choice of texts and pedagogy, were not "on the table." The traditional professorial position may have been displaced, but what took its place?--the authority of consensus with its unidentifiable strong arm, and the faceless reign of software design? Despite Slatin's claim that the students learned about the learning process, there is no evidence (in the article) that the students considered where their attitudes came from, how consensus operates in the construction of knowledge, how power is established and what relationship they have to bureaucratic institutions. How do we, as teaching professionals, negotiate a balance between an enlightened despotism in education and student-created knowledge? Slatin, and other authors in this book, bring this fundamental question to the fore. There is no definitive answer because the factors involved are ultimately social, and hence, always shifting and reconfiguring.
Slatin ends his article with the caveat that computerization can bring about greater estrangement between students, faculty and administration through greater regimentation and control. Of course, it can also "distribute authority and power more widely" (50). Power or authority without a specific face, however, is not necessarily good or just. Shahaf Gal's "Computers and Design Activities: Their Mediating Role in Engineering Education" is found in the second half of the volume, and does not allow for a theory/praxis dichotomy. Gal recounts a brief history of engineering education up to the introduction of Growltiger (GT), a computer-assisted learning aid for design. He demonstrates GT's potential to impact the learning of engineering design by tracking its use by four students in a bridge-building contest. What his text demonstrates clearly is that computers are "inscribing and imaging devices" that add another viewpoint to an on-going dialogue between student, teacher, earlier coursework, and other teaching/learning tools. The less proficient students made a serious error by relying too heavily on the technology, or treating it as a "blueprint provider." They "interacted with GT in a way that trusted the data to represent reality. They did not see their interaction with GT as a negotiation between two knowledge systems" (495). Students who were more thoroughly informed in engineering discourses knew to use the technology as one voice among others--they knew enough not simply to accept the input of the computer as authoritative. The less-advanced students learned a valuable lesson from the competition itself: the fact that their designs were not able to hold up under pressure (literally) brought the fact of their insufficient knowledge crashing down on them (and their bridges). They also had, post factum, several other designs to study, especially the winning one. 
Although competition and comparison are not good pedagogical strategies for everyone (in this case the competitors had volunteered), at some point what we think we know has to be challenged within the society of discourses to which it belongs. Students need critique in order to learn to push their learning into auto-critique. This is what is lacking in Slatin's discussion and in the writings of other avatars of constructivist, collaborative and computer-mediated pedagogies. Obviously there are differences between instrumental types of knowledge acquisition and discursive knowledge accumulation. Indeed, I do not promote the teaching of reading, thinking and writing as "skills" per se (then again, Gal's teaching of design is quite discursive, if not dialogic). Nevertheless, the "soft" sciences might benefit from "bridge-building" competitions or the re-institution of some forms of agonia. Not everything agonistic is inhuman agony--the joy of confronting or creating a sound argument supported by defensible evidence, for example. Students need to know that soundbites are not sound arguments despite predictions that electronic writing will be aphoristic rather than periodic. Just because writing and learning can be conceived of hypertextually does not mean that rigor goes the way of the dinosaur. Rigor and hypertextuality are not mutually incompatible. Nor are rigorous thinking and hard intellectual work unpleasurable, although American anti-intellectualism, especially in the mass media, would make them so. At a time when the spurious dogmatics of a Rush Limbaugh and Holocaust revisionist historians circulate "aphoristically" in cyberspace, and at a time when knowledge is becoming increasingly textualized, the role of critical thinking in education will ultimately determine the value(s) of socially constructed knowledge.
This volume affords the reader an opportunity to reconsider knowledge, power, and new communications technologies with respect to social dynamics and power relationships.
series other
last changed 2003/04/23 13:14

_id ddss9207
authors Gauchel, J., Hovestadt, L., van Wyk, S. and Bhat, R.R.
year 1993
title Modular building models
source Timmermans, Harry (Ed.), Design and Decision Support Systems in Architecture (Proceedings of a conference held in Mierlo, the Netherlands in July 1992), ISBN 0-7923-2444-7
summary The development and implementation of a modular building model appropriate for computer aided design is described. The limitations of a unified building model with regard to concurrence and complexity in design are discussed. Current research suggests that to model real-world complexity, one must trade centralized control for autonomy. In this paper we develop a modular approach to building modelling that is based on object-oriented autonomy and makes it possible to define these models in a distributed, concurrent manner. Such a modular and autonomous implementation brings inherent uncertainty and conflict which cannot be determined a priori.
series DDSS
last changed 2003/08/07 14:36

_id eda3
authors Goldschmidt, Gabriela
year 1992
title Criteria for Design Evaluation : A Process-Oriented Paradigm
source New York: John Wiley & Sons, 1992. pp. 67-79. includes bibliography
summary Architectural research of the last two or three decades has been largely devoted to design methodology. Systematic evaluations of design products and prescription of their desired qualities led to specifications for better designs and possible routines to achieve them. Computers have facilitated this task. The human designer, however, has largely resisted the use of innovative methods. In this paper the author claims that the reason for that lies in insufficient regard for innate cognitive aptitudes which are activated in the process of designing. A view of these aptitudes, based on patterns of links among design moves, is presented. It is proposed that process research is mandatory for further advancements in design research utility
keywords cognition, design process, research, protocol analysis, architecture
series CADline
last changed 1999/02/12 14:08

_id ea96
authors Hacfoort, Eek J. and Veldhuisen, Jan K.
year 1992
title A Building Design and Evaluation System
source New York: John Wiley & Sons, 1992. pp. 195-211 : ill. table. includes bibliography
summary Within the field of architectural design there is a growing awareness of an imbalance among the professionalism, the experience, and the creativity of designers' responses to the up-to-date requirements of all parties interested in the design process. The building design and evaluation system COSMOS makes it possible for various participants to work within their own domain, so that separated but coordinated work can be done. This system is meant to organize the initial stage of the design process, where user-defined functions, geometry, type of construction, and building materials are decided. It offers a tool to design a building, to calculate a number of effects, and to manage the information necessary to evaluate the design decisions. The system is provided with data and sets of parameters describing the conditions, along with their properties, of the main building functions of a selection of well-known building types. The architectural design is conceptualized as a hierarchy of spatial units, ranging from building blocks down to specific rooms or spaces. The concept of zoning is used as a means of calculating and directly evaluating the structure of the design without working out the details. A distinction is made between internal and external calculations and evaluations during the initial design process. During design on screen, an estimate can be recorded of building costs, energy costs, acoustics, lighting, construction, and utility. Furthermore, the design can be exported to a design application program, in this case AutoCAD, to make and show drawings in more detail. Through the medium of a database, external calculation and evaluation of building costs, life-cycle costs, energy costs, interior climate, acoustics, lighting, construction, and utility are possible in much more advanced application programs
keywords evaluation, applications, integration, architecture, design, construction, building, energy, cost, lighting, acoustics, performance
series CADline
last changed 2003/06/02 11:58

_id d919
authors Heckbert, P.S.
year 1992
title Discontinuity Meshing for Radiosity
source Eurographics Workshop on Rendering. May 1992, pp. 203-216
summary The radiosity method is the most popular algorithm for simulating interreflection of light between diffuse surfaces. Most existing radiosity algorithms employ simple meshes and piecewise constant approximations, thereby constraining the radiosity function to be constant across each polygonal element. Much more accurate simulations are possible if linear, quadratic, or higher degree approximations are used. In order to realize the potential accuracy of higher-degree approximations, however, it is necessary for the radiosity mesh to resolve discontinuities such as shadow edges in the radiosity function. A discontinuity meshing algorithm is presented that places mesh boundaries directly along discontinuities. Such algorithms offer the potential of faster, more accurate simulations. Results are shown for three-dimensional scenes.
series other
last changed 2003/04/23 13:14
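As background to the Heckbert record above: the classical piecewise-constant radiosity solve it builds on can be sketched in a few lines. The reflectances, emission values and form factors below are invented illustrative numbers (a real system derives F from scene geometry), and the sketch makes no attempt at the discontinuity meshing the paper contributes.

```python
# Classical radiosity with piecewise-constant elements: solve
#   B_i = E_i + rho_i * sum_j F_ij * B_j
# by fixed-point (Jacobi) iteration. All numbers are made up for illustration.

E = [1.0, 0.0, 0.0]        # emitted radiosity per element (element 0 is a light)
rho = [0.0, 0.5, 0.8]      # diffuse reflectances
F = [[0.0, 0.4, 0.4],      # hypothetical form factors F[i][j]
     [0.4, 0.0, 0.3],
     [0.4, 0.3, 0.0]]

B = E[:]                   # initial guess: emission only
for _ in range(50):        # iterate; converges because rho_i * sum_j F_ij < 1
    B = [E[i] + rho[i] * sum(F[i][j] * B[j] for j in range(3))
         for i in range(3)]
```

Higher-degree (linear, quadratic) approximations replace the constant per-element B_i with polynomial basis functions; the paper's point is that their accuracy is only realized when mesh boundaries are placed along radiosity discontinuities such as shadow edges.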

_id 7e68
authors Holland, J.
year 1992
title Genetic Algorithms
source Scientific American, July 1992
summary Living organisms are consummate problem solvers. They exhibit a versatility that puts the best computer programs to shame. This observation is especially galling for computer scientists, who may spend months or years of intellectual effort on an algorithm, whereas organisms come by their abilities through the apparently undirected mechanism of evolution and natural selection. Pragmatic researchers see evolution's remarkable power as something to be emulated rather than envied. Natural selection eliminates one of the greatest hurdles in software design: specifying in advance all the features of a problem and the actions a program should take to deal with them. By harnessing the mechanisms of evolution, researchers may be able to "breed" programs that solve problems even when no person can fully understand their structure. Indeed, these so-called genetic algorithms have already demonstrated the ability to make breakthroughs in the design of such complex systems as jet engines. Genetic algorithms make it possible to explore a far greater range of potential solutions to a problem than do conventional programs. Furthermore, as researchers probe the natural selection of programs under controlled and well-understood conditions, the practical results they achieve may yield some insight into the details of how life and intelligence evolve in the natural world.
series journal paper
last changed 2003/04/23 13:50
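As an aside to the Holland record above, the selection/crossover/mutation loop that defines a genetic algorithm can be sketched compactly. The toy objective (maximize the count of 1-bits), the parameters and the operator choices below are all invented for illustration, not taken from the article.

```python
import random

random.seed(1)       # fixed seed so the sketch is reproducible

GENOME_LEN = 20      # bits per candidate solution
POP_SIZE = 30
GENERATIONS = 60
MUTATION_RATE = 0.01

def fitness(genome):
    # toy objective ("one-max"): count of 1-bits
    return sum(genome)

def tournament(pop):
    # binary tournament selection: the fitter of two random individuals
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    # one-point crossover
    cut = random.randrange(1, GENOME_LEN)
    return p1[:cut] + p2[cut:]

def mutate(genome):
    # flip each bit with small probability
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit
            for bit in genome]

pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
       for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    pop = [mutate(crossover(tournament(pop), tournament(pop)))
           for _ in range(POP_SIZE)]
best = max(pop, key=fitness)
```

In practice the fitness function encodes the real design objective (e.g. jet-engine performance in the article's example), and choices such as tournament size and mutation rate strongly affect how the population converges.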

_id abce
authors Ishii, Hiroshi and Kobayashi, Minoru
year 1992
title ClearBoard: A Seamless Medium for Shared Drawing and Conversation with Eye Contact Systems for Media-Supported Collaboration
source Proceedings of ACM CHI'92 Conference on Human Factors in Computing Systems 1992 pp. 525-532
summary This paper introduces a novel shared drawing medium called ClearBoard. It realizes (1) a seamless shared drawing space and (2) eye contact to support realtime and remote collaboration by two users. We devised the key metaphor: "talking through and drawing on a transparent glass window" to design ClearBoard. A prototype of ClearBoard is implemented based on the "Drafter-Mirror" architecture. This paper first reviews previous work on shared drawing support to clarify the design goals. We then examine three metaphors that fulfill these goals. The design requirements and the two possible system architectures of ClearBoard are described. Finally, some findings gained through the experimental use of the prototype, including the feature of "gaze awareness", are discussed.
series other
last changed 2002/07/07 14:01

_id cc2f
authors Jog, Bharati
year 1992
title Evaluation of Designs for Energy Performance Using A Knowledge-Based System
source New York: John Wiley & Sons, 1992. pp. 293-304 : ill. includes a bibliography
summary Principles of knowledge-based (or expert) systems have been applied in different knowledge-rich domains such as geology, medicine, and very large scale integrated circuits (VLSI). There have been some efforts to develop expert systems for evaluation and prediction of architectural designs in this decade. This paper presents a prototype system, Energy Expert, which quickly computes the approximate yearly energy performance of a building design, analyzes the energy performance, and gives advice on possible ways of improving the design. These modifications are intended to make the building more energy efficient and help cut down on heating and cooling costs. The system is designed for the schematic design phase of an architectural project. Also discussed briefly is the reasoning behind developing such a system for the schematic design rather than the final design phase
keywords expert systems, energy, evaluation, performance, knowledge base, architecture, reasoning, programming, prediction
series CADline
last changed 1999/02/12 14:08

_id 11b6
authors Kalmychkov, Vitaly A. and Smolyaninov, Alexander V.
year 1992
title Design of Object-Oriented Data Visualization System
source East-West International Conference on Human-Computer Interaction: Proceedings of the EWHCI'92 1992 pp. 463-470
summary The report describes the design and implementation of a data visualization system that provides the means for constructing images of the user's numeric information on a personal computer. It considers the problems of designing, structuring and operating a visualization system that gives the user convenient means for building a numeric-information image of the required type. An image is constructed by placing fields of the required sizes and filling them with the necessary content (coordinate systems, graphs, inscriptions). The user's interface with the tool system is object-oriented: after choosing an object (a field or its content), the user can manipulate it, executing only those operations that are defined for an object of its designated function. Ergonomic and comfortable construction is ensured by a carefully coordinated system of the actions possible at each stage of image construction, supported by icon and textual menus.
series other
last changed 2002/07/07 14:01

_id 0b53
authors Lawrence, Roderick J.
year 1992
title CHARACTERISTICS OF ARCHITECTURAL DESIGN-TOOLS
source Proceedings of the 4th European Full-Scale Modelling Conference / Lausanne (Switzerland) 9-12 September 1992, Part B, pp. 7-14
summary The professional roles and functions of architects are linked to the societal context in which they practice. Furthermore, this context, which is not static, has a relationship to the ways in which institutions, groups and individuals are involved in processes for the design and construction of the built environment. This presentation illustrates how the roles and functions of architects, other professionals, their clients and the general public have a bearing on the tools and methods used by the architectural profession to simulate design projects. Traditionally, sketches, renderings and pattern books were used. Then, they were supplemented by axonometric and perspective drawings, written and diagrammatic specifications, photographs and small-scale models. In recent decades mathematical models of diverse kinds, simulation techniques - including small- and full-scale modelling kits - as well as computer aided design and drafting systems have been used. This paper briefly presents these kinds of tools and then presents a typology of them. In conclusion, possible applications for the future are discussed.
keywords Full-scale Modeling, Model Simulation, Real Environments
series other
type normal paper
email lawrence@uni2a.unige.ch
more http://info.tuwien.ac.at/efa
last changed 2004/05/04 13:39

_id caadria2014_071
id caadria2014_071
authors Li, Lezhi; Renyuan Hu, Meng Yao, Guangwei Huang and Ziyu Tong
year 2014
title Sculpting the Space: A Circulation Based Approach to Generative Design in a Multi-Agent System
source Rethinking Comprehensive Design: Speculative Counterculture, Proceedings of the 19th International Conference on Computer-Aided Architectural Design Research in Asia (CAADRIA 2014) / Kyoto 14-16 May 2014, pp. 565-574
summary This paper discusses an MAS (multi-agent system) based approach to generating architectural spaces that afford better modes of human movement. To achieve this, a pedestrian simulation is carried out to record data on human spatial experience during the walking process. Unlike common practices of performance-oriented generation, where final results are achieved through cycles of simulation and comparison, what we propose here is to let human movement exert direct influence on space. We made this possible by asking "humans" to project simulation data onto their architectural surroundings, and thus cause the layout to change for the purpose of affording what we designate as good spatial experiences. A generation experiment on an exhibition space is implemented to explore this approach, in which tentative rules of such spatial manipulation are proposed and tested through space syntax analyses. As the results suggest, by looking at spatial layouts through a lens of human behaviour, this projection-and-generation method provides some insight into space qualities that other methods could not have offered.
keywords Performance oriented generative design; projection; multi-agent system; pedestrian simulation; space syntax
series CAADRIA
email caroline.li.1992@gmail.com
last changed 2014/04/22 08:23

_id e8f0
authors Mackey, David L.
year 1992
title Mission Possible: Computer Aided Design for Everyone
source Mission - Method - Madness [ACADIA Conference Proceedings / ISBN 1-880250-01-2] 1992, pp. 65-73
summary A pragmatic model for the building of an electronic architectural design curriculum which will offer students and faculty the opportunity to fully integrate information age technologies into the educational experience is becoming increasingly desirable.

The majority of architectural programs teach technology topics through content specific courses which appear as an educational sequence within the curriculum. These technology topics have traditionally included structural design, environmental systems, and construction materials and methods. Likewise, that course model has been broadly applied to the teaching of computer aided design, which is identified as a technology topic. Computer technology has resulted in a proliferation of courses which similarly introduce the student to computer graphic and design systems through a traditional course structure.

Inevitably, competition for priority arises within the curriculum, introducing the potential risk that otherwise valuable courses and/or course content will be replaced by the "newer" technology, and providing fertile ground for faculty and administrative resistance to computerization as traditional courses are pushed aside or seem threatened.

An alternative view is that computer technology is not a "topic", but rather the medium for creating a design (and studio) environment for informed decision making.... deciding what it is we should build. Such a viewpoint urges the development of a curricular structure, through which the impact of computer technology may be understood as that medium for design decision making, as the initial step in addressing the current and future needs of architectural education.

One example of such a program currently in place at the College of Architecture and Planning, Ball State University takes an approach which overlays, like a transparent tissue, the computer aided design content (or a computer emphasis) onto the primary curriculum.

With the exception of a general introductory course at the freshman level, computer instruction and content issues may be addressed effectively within existing studio courses. The level of operational and conceptual proficiency achieved by the student, within an electronic design studio, makes the electronic design environment self-sustaining and maintainable across the entire curriculum. The ability to broadly apply computer aided design to the educational experience can be independent of the availability of many specialized computer aided design faculty.

series ACADIA
last changed 1999/03/29 13:58

_id ddss9215
id ddss9215
authors Mortola, E. and Giangrande, A.
year 1993
title A trichotomic segmentation procedure to evaluate projects in architecture
source Timmermans, Harry (Ed.), Design and Decision Support Systems in Architecture (Proceedings of a conference held in Mierlo, the Netherlands in July 1992), ISBN 0-7923-2444-7
summary This paper illustrates a model used to construct the evaluation module for An Interface for Designing (AID), a system to aid architectural design. The model can be used at the end of every cycle of analysis-synthesis-evaluation in the intermediate phases of design development. With the aid of the model it is possible to evaluate the quality of a project in overall terms to establish whether the project is acceptable, whether it should be elaborated ex-novo, or whether it is necessary to begin a new cycle to improve it. In this last case, it is also possible to evaluate the effectiveness of the possible actions and strategies for improvement. The model is based on a procedure of trichotomic segmentation, developed with MCDA (Multi-Criteria Decision Aid), which uses the outranking relation to compare the project with some evaluation profiles taken as projects of reference. An application of the model in the teaching field will also be described.
series DDSS
last changed 2003/08/07 14:36

_id ddss9212
id ddss9212
authors Prins, M., Bax, M.F.TH., Carp, J.C. and Tempelmans Plat, H.
year 1993
title A design decision support system for building flexibility and costs
source Timmermans, Harry (Ed.), Design and Decision Support Systems in Architecture (Proceedings of a conference held in Mierlo, the Netherlands in July 1992), ISBN 0-7923-2444-7
summary Because of possible changes in demand, buildings must have some flexibility. In this paper a building model, a financial-economic model and a process model will be presented, which together constitute a design decision support system. This system may be used to decide on flexibility and costs of building variants in all phases of the design process.
series DDSS
last changed 2003/08/07 14:36
