CumInCAD is a Cumulative Index of publications in Computer Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD Futures

_id 9f8a
authors Davidow, William H.
year 1992
title The Virtual Corporation: Structuring and Revitalizing the Corporation for the 21st Century
source New York: Harper Collins Publishers
summary The great value of this timely, important book is that it provides an integrated picture of the customer-driven company of the future. We have begun to learn about lean production technology, stripped-down management, worker empowerment, flexible customized manufacturing, and other modern strategies, but Davidow and Malone show for the first time how these ideas are fitting together to create a new kind of corporation and a worldwide business revolution. Their research is fascinating. The authors provide illuminating case studies of American, Japanese, and European companies that have discovered the keys to improved competitiveness, redesigned their businesses and their business relationships, and made extraordinary gains. They also write bluntly and critically about a number of American corporations that are losing market share by clinging to outmoded thinking. Business success in the global marketplace of the future is going to depend upon corporations producing "virtual" products high in added value, rich in variety, and available instantly in response to customer needs. At the heart of this revolution will be fast new information technologies; increased emphasis on quality; accelerated product development; changing management practices, including new alignments between management and labor; and new linkages between company, supplier, and consumer, and between industry and government. The Virtual Corporation is an important cutting-edge book that offers a creative synthesis of the most influential ideas in modern business theory. It has already fired excitement and debate in industry, academia, and government, and it is essential reading for anyone involved in the leadership of America's business and the shaping of America's economic future.
series other
last changed 2003/04/23 15:14

_id ddss9208
id ddss9208
authors Lucardie, G.L.
year 1993
title A functional approach to realizing decision support systems in technical regulation management for design and construction
source Timmermans, Harry (Ed.), Design and Decision Support Systems in Architecture (Proceedings of a conference held in Mierlo, the Netherlands in July 1992), ISBN 0-7923-2444-7
summary Technical building standards defining the quality of buildings, building products, building materials and building processes aim to provide acceptable levels of safety, health, usefulness and energy consumption. However, the logical consistency between these goals and the set of regulations produced to achieve them is often hard to identify. Not only the large quantities of highly complex and frequently changing building regulations to be met, but also the variety of user demands and the steadily increasing technical information on (new) materials, products and buildings have produced a very complex set of knowledge and data that should be taken into account when handling technical building regulations. Integrating knowledge technology and database technology is an important step towards managing the complexity of technical regulations. Generally, two strategies can be followed to integrate knowledge and database technology. The main emphasis of the first strategy is on transferring data structures and processing techniques from one field of research to another. The second approach is concerned exclusively with the semantic structure of what is contained in the data-based or knowledge-based system. The aim of this paper is to show that the second or knowledge-level approach, in particular the theory of functional classifications, is more fundamental and more fruitful. It permits a goal-directed rationalized strategy towards analysis, use and application of regulations. Therefore, it enables the reconstruction of (deep) models of regulations, objects and of users accounting for the flexibility and dynamics that are responsible for the complexity of technical regulations. 
Finally, at the systems level, the theory supports an effective development of a new class of rational Decision Support Systems (DSS), which should reduce the complexity of technical regulations and restore the logical consistency between the goals of technical regulations and the technical regulations themselves.
series DDSS
last changed 2003/08/07 16:36

_id ddss9203
id ddss9203
authors Smeets, J.
year 1993
title Housing tenancy, data management and quality control
source Timmermans, Harry (Ed.), Design and Decision Support Systems in Architecture (Proceedings of a conference held in Mierlo, the Netherlands in July 1992), ISBN 0-7923-2444-7
summary This paper deals with housing tenancy, data management and quality control. The proposed method is focused on quality characteristics of housing estates in view of rentability risks. It entails a cycle of registration, analysis and implementation of measures. The starting point is the behaviour of the housing consumer in a market-oriented context. The model is framed within theories of strategic management and marketing. Systematic registration and evaluation of consumer behaviour, by means of a set of relevant process and product indicators, can yield relevant information in the four phases of the rental process: orientation, intake, dwelling and exit. This information concerns the way in which the dwelling (characterized by product indicators) fits the needs of the consumer. The systematic analysis of the process and product indicators during the phases of the rental process makes a 'strength-weakness analysis' of housing estates possible. The indicators can be presented in aggregated form by way of a 'rentability index'. The 'strength-weakness analysis' steers the intervention in the quality characteristics of housing estates. The possibilities for readjustment, however, are different. The quality control system is not only an early warning system, but also has several other functions: evaluation, planning and communication. The method described here lays a solid foundation for a decision-support system in the area of housing tenancy.
series DDSS
last changed 2003/08/07 16:36

_id 7ce5
authors Gal, Shahaf
year 1992
title Computers and Design Activities: Their Mediating Role in Engineering Education
source Sociomedia, ed. Edward Barrett. MIT Press
summary Sociomedia: With all the new words used to describe electronic communication (multimedia, hypertext, cyberspace, etc.), do we need another one? Edward Barrett thinks we do; hence, he coins the term "sociomedia." It is meant to displace a computing economy in which technicity is hypostasized over sociality. Sociomedia, a compilation of twenty-five articles on the theory, design and practice of educational multimedia and hypermedia, attempts to re-value the communicational face of computing. Value, of course, is "ultimately a social construct." As such, it has everything to do with knowledge, power, education and technology. The projects discussed in this book represent the leading edge of electronic knowledge production in academia (not to mention major funding) and are determining the future of educational media. For these reasons, Sociomedia warrants close inspection. Barrett's introduction sets the tone. For him, designing computer media involves hardwiring a mechanism for the social construction of knowledge (1). He links computing to a process of social and communicative interactivity for constructing and disseminating knowledge. Through a mechanistic mapping of the university as hypercontext (a huge network that includes classrooms as well as services and offices), Barrett models intellectual work in such a way as to avoid "limiting definitions of human nature or human development." Education, then, can remain "where it should be--in the human domain (public and private) of sharing ideas and information through the medium of language." By leaving education in a virtual realm (where we can continue to disagree about its meaning and execution), it remains viral, mutating and contaminating in an intellectually healthy way. He concludes that his mechanistic model, by means of its reductionist approach, preserves value (7). This "value" is the social construction of knowledge. 
While I support the social orientation of Barrett's argument, discussions of value are related to power. I am not referring to the traditional teacher-student power structure that is supposedly dismantled through cooperative and constructivist learning strategies. The power to be reckoned with in the educational arena is foundational, that which (pre)determines value and the circulation of knowledge. "Since each of you reading this paragraph has a different perspective on the meaning of 'education' or 'learning,' and on the processes involved in 'getting an education,' think of the hybris in trying to capture education in a programmable function, in a displayable object, in a 'teaching machine'" (7). Actually, we must think about that hybris because it is, precisely, what informs teaching machines. Moreover, the basic epistemological premises that give rise to such productions are too often assumed. In the case of instructional design, the episteme of cognitive sciences are often taken for granted. It is ironic that many of the "postmodernists" who support electronic hypertextuality seem to have missed Jacques Derrida's and Michel Foucault's "deconstructions" of the epistemology underpinning cognitive sciences (if not of epistemology itself). Perhaps it is the glitz of the technology that blinds some users (qua developers) to the belief systems operating beneath the surface. Barrett is not guilty of reactionary thinking or politics; he is, in fact, quite in line with much American deconstructive and postmodern thinking. The problem arises in that he leaves open the definitions of "education," "learning" and "getting an education." One cannot engage in the production of new knowledge without orienting its design, production and dissemination, and without negotiating with others' orientations, especially where large-scale funding is involved. Notions of human nature and development are structural, even infrastructural, whatever the medium of the teaching machine. 
Although he addresses some dynamics of power, money and politics when he talks about the recession and its effects on the conference, they are readily visible dynamics of power (3-4). Where does the critical factor of value determination, of power, of who gets what and why, get mapped onto a mechanistic model of learning institutions? Perhaps a mapping of contributors' institutions, of the funding sources for the projects showcased and for participation in the conference, and of the disciplines receiving funding for these sorts of projects would help visualize the configurations of power operative in the rising field of educational multimedia. Questions of power and money notwithstanding, Barrett's introduction sets the social and textual thematics for the collection of essays. His stress on interactivity, on communal knowledge production, on the society of texts, and on media producers and users is carried forward through the other essays, two of which I will discuss. Section I of the book, "Perspectives...," highlights the foundations, uses and possible consequences of multimedia and hypertextuality. The second essay in this section, "Is There a Class in This Text?," plays on the robust exchange surrounding Stanley Fish's book, Is There a Text in This Class?, which presents an attack on authority in reading. The author, John Slatin, has introduced electronic hypertextuality and interaction into his courses. His article maps the transformations in "the content and nature of work, and the workplace itself"-- which, in this case, is not industry but an English poetry class (25). Slatin discovered an increase of productive and cooperative learning in his electronically- mediated classroom. For him, creating knowledge in the electronic classroom involves interaction between students, instructors and course materials through the medium of interactive written discourse. 
These interactions lead to a new and persistent understanding of the course materials and of the participants' relation to the materials and to one another. The work of the course is to build relationships that, in my view, constitute not only the meaning of individual poems, but poetry itself. The class carries out its work in the continual and usually interactive production of text (31). While I applaud his strategies which dismantle traditional hierarchical structures in academia, the evidence does not convince me that the students know enough to ask important questions or to form a self-directing, learning community. Stanley Fish has not relinquished professing, though he, too, espouses the indeterminacy of the sign. By the fourth week of his course, Slatin's input is, by his own reckoning, reduced to 4% (39). In the transcript of the "controversial" Week 6 exchange on Gertrude Stein--the most disliked poet they were discussing at the time (40)--we see the blind leading the blind. One student parodies Stein for three lines and sums up his input with "I like it." Another finds Stein's poetry "almost completey [sic] lacking in emotion or any artistic merit" (emphasis added). On what grounds has this student become an arbiter of "artistic merit"? Another student, after admitting being "lost" during the Wallace Stevens discussion, talks of having more "respect for Stevens' work than Stein's" and adds that Stein's poetry lacks "conceptual significance[, s]omething which people of varied opinion can intelligently discuss without feeling like total dimwits...." This student has progressed from admitted incomprehension of Stevens' work to imposing her (groundless) respect for his work over Stein's. Then, she exposes her real dislike for Stein's poetry: that she (the student) missed the "conceptual significance" and hence cannot, being a person "of varied opinion," intelligently discuss it "without feeling like [a] total dimwit." 
Slatin's comment is frightening: "...by this point in the semester students have come to feel increasingly free to challenge the instructor" (41). The students that I have cited are neither thinking critically nor are their preconceptions challenged by student-governed interaction. Thanks to the class format, one student feels self-righteous in her ignorance, and empowered to censure. I believe strongly in student empowerment in the classroom, but only once students have accrued enough knowledge to make informed judgments. Admittedly, Slatin's essay presents only partial data (there are six hundred pages of course transcripts!); still, I wonder how much valuable knowledge and metaknowledge was gained by the students. I also question the extent to which authority and professorial dictature were addressed in this course format. The power structures that make it possible for a college to require such a course, and the choice of texts and pedagogy, were not "on the table." The traditional professorial position may have been displaced, but what took its place?--the authority of consensus with its unidentifiable strong arm, and the faceless reign of software design? Despite Slatin's claim that the students learned about the learning process, there is no evidence (in the article) that the students considered where their attitudes came from, how consensus operates in the construction of knowledge, how power is established and what relationship they have to bureaucratic institutions. How do we, as teaching professionals, negotiate a balance between an enlightened despotism in education and student-created knowledge? Slatin, and other authors in this book, bring this fundamental question to the fore. There is no definitive answer because the factors involved are ultimately social, and hence, always shifting and reconfiguring. 
Slatin ends his article with the caveat that computerization can bring about greater estrangement between students, faculty and administration through greater regimentation and control. Of course, it can also "distribute authority and power more widely" (50). Power or authority without a specific face, however, is not necessarily good or just. Shahaf Gal's "Computers and Design Activities: Their Mediating Role in Engineering Education" is found in the second half of the volume, and does not allow for a theory/praxis dichotomy. Gal recounts a brief history of engineering education up to the introduction of Growltiger (GT), a computer-assisted learning aid for design. He demonstrates GT's potential to impact the learning of engineering design by tracking its use by four students in a bridge-building contest. What his text demonstrates clearly is that computers are "inscribing and imaging devices" that add another viewpoint to an on-going dialogue between student, teacher, earlier coursework, and other teaching/learning tools. The less proficient students made a serious error by relying too heavily on the technology, or treating it as a "blueprint provider." They "interacted with GT in a way that trusted the data to represent reality. They did not see their interaction with GT as a negotiation between two knowledge systems" (495). Students who were more thoroughly informed in engineering discourses knew to use the technology as one voice among others--they knew enough not simply to accept the input of the computer as authoritative. The less-advanced students learned a valuable lesson from the competition itself: the fact that their designs were not able to hold up under pressure (literally) brought the fact of their insufficient knowledge crashing down on them (and their bridges). They also had, post factum, several other designs to study, especially the winning one. 
Although competition and comparison are not good pedagogical strategies for everyone (in this case the competitors had volunteered), at some point what we think we know has to be challenged within the society of discourses to which it belongs. Students need critique in order to learn to push their learning into auto-critique. This is what is lacking in Slatin's discussion and in the writings of other avatars of constructivist, collaborative and computer-mediated pedagogies. Obviously there are differences between instrumental types of knowledge acquisition and discursive knowledge accumulation. Indeed, I do not promote the teaching of reading, thinking and writing as "skills" per se (then again, Gal's teaching of design is quite discursive, if not dialogic). Nevertheless, the "soft" sciences might benefit from "bridge-building" competitions or the re-institution of some forms of agonia. Not everything agonistic is inhuman agony--the joy of confronting or creating a sound argument supported by defensible evidence, for example. Students need to know that soundbites are not sound arguments despite predictions that electronic writing will be aphoristic rather than periodic. Just because writing and learning can be conceived of hypertextually does not mean that rigor goes the way of the dinosaur. Rigor and hypertextuality are not mutually incompatible. Nor are rigorous thinking and hard intellectual work unpleasurable, although American anti-intellectualism, especially in the mass media, would make them so. At a time when the spurious dogmatics of a Rush Limbaugh and Holocaust revisionist historians circulate "aphoristically" in cyberspace, and at a time when knowledge is becoming increasingly textualized, the role of critical thinking in education will ultimately determine the value(s) of socially constructed knowledge. 
This volume affords the reader an opportunity to reconsider knowledge, power, and new communications technologies with respect to social dynamics and power relationships.
series other
last changed 2003/04/23 15:14

_id 8b12
authors Manning, Peter and Mattar, Samir
year 1992
title A Preliminary to Development of Expert Systems for Total Design of Entire Buildings
source New York: John Wiley & Sons, 1992. pp. 215-237 : tables. includes bibliography
summary This paper has two primary objectives. The first is to represent the practicability of making the design of entire buildings a conscious, craftsman-like, activity conducted in the clear, without the mystery that tends, because of designers' usual 'black box' methods, to surround it. To this end, a design strategy and some tactics for resolving decisions at critical stages in the design process, which the authors have described elsewhere, are recapitulated to show how total design of buildings can be pursued in a generic manner. This done, the way is opened for the second objective: to make the large and important field of work that is building design amenable to computerization. The form that pursuit of this second objective is taking is being influenced greatly by growing interest in expert systems, which for everyday professional building design appears a more useful development than previous CAD emphases on drafting and graphics. Application of the authors' design methods to a series of expert systems for the total design of entire buildings is therefore indicated. For such a vast project--the formulation of bases for design assistance and expert systems that can be integrated and used as a generic method for the total design of entire buildings, so that the results are more certain and successful than the outcome of the generality of present-day building design--the most that can be attempted within the limits of a single paper is a set of examples of some of the stages in the process. Nevertheless, since the design method described begins at the 'large end' of the process, where the most consequential decisions are made, it is hoped that the major thrusts and the essential CAD activities will be evident. All design is substantially iterative, and provided that the major iterations are intelligible, there should be no need for this demonstration to labor over the lesser ones.
keywords evaluation, integration, architecture, building, expert systems, design methods, design process
series CADline
last changed 2003/06/02 13:58

_id 3ff5
authors Abbo, I.A., La Scalea, L., Otero, E. and Castaneda, L.
year 1992
title Full-Scale Simulations as Tool for Developing Spatial Design Ability
source Proceedings of the 4th European Full-Scale Modelling Conference / Lausanne (Switzerland) 9-12 September 1992, Part C, pp. 7-10
summary Spatial Design Ability has been defined as the capability to anticipate effects (psychological impressions on potential observers or users) produced by mental manipulation of elements of architectural or urban spaces. This ability, of great importance in choosing the appropriate option during the design process, is not specifically developed in schools of architecture and is partially obtained as a by-product of drawing, designing or architectural criticism. We use our Laboratory as a tool to present spaces to people so that they can evaluate them. By means of a series of exercises, students confront their anticipations with the psychological impressions produced in other people. For this occasion, we present an experience in which students had to propose a space for an exhibition hall in which architectural projects (student theses) were to be shown. Following the Spatial Design Ability Development Model which we have been using for several years, students first get acquainted with the use of evaluation instruments for psychological impressions as well as with research methodology. In this case, due to the short period available, we reduced the research to investigating the effects produced by the manipulation of only two independent variables: students first manipulated the form of the roof, walls and interior elements, and secondly the color and texture of those elements. They evaluated spatial quality, character and the other psychological impressions that the manipulations produced in people. They used three-dimensional scale models at 1/10 and 1/1.
keywords Full-scale Modeling, Model Simulation, Real Environments
series other
email
more http://info.tuwien.ac.at/efa
last changed 2003/08/25 10:12

_id cf73
authors Dosti, P., Martens, B. and Voigt, A.
year 1992
title Spatial Simulation In Architecture, City Development and Regional Planning
doi https://doi.org/10.52842/conf.ecaade.1992.195
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 195-200
summary The appropriate use of spatial simulation techniques tends to increase considerably the depth of evidence and the realism of the designs and plans described, and moreover may encourage experimentation, trial attempts and planning variants. This also implies more frequent use of combinations of different techniques, bearing in mind that they are not equivalent, but exploiting the respective advantages each offers. Until now the main attention of the EDP-Lab was directed at achieving quantity; for the time to come it will be the formation of quality. The challenge in the educational system at the Vienna University of Technology is to obtain appropriate results within the framework of low-cost simulation. This aspect also seems meaningful in order to promote final implementation in architectural practice.

series eCAADe
email
more http://info.tuwien.ac.at/ecaade/
last changed 2022/06/07 07:55

_id 4129
authors Fargas, Josep and Papazian, Pegor
year 1992
title Metaphors in Design: An Experiment with a Frame, Two Lines and Two Rectangles
doi https://doi.org/10.52842/conf.acadia.1992.013
source Mission - Method - Madness [ACADIA Conference Proceedings / ISBN 1-880250-01-2] 1992, pp. 13-22
summary The research we will discuss below originated from an attempt to examine the capacity of designers to evaluate an artifact, and to study the feasibility of replicating a designer's moves intended to make an artifact more expressive of a given quality. We will present the results of an interactive computer experiment, first developed at the MIT Design Research Seminar, which is meant to capture the subject's actions in a simple design task as a series of successive "moves". We will propose that designers use metaphors in their interaction with design artifacts and we will argue that the concept of metaphors can lead to a powerful theory of design activity. Finally, we will show how such a theory can drive the project of building a design system.

When trying to understand how designers work, it is tempting to examine design products in order to come up with the principles or norms behind them. The problem with such an approach is that it may lead to a purely syntactical analysis of design artifacts, failing to capture the knowledge of the designer in an explicit way, and ignoring the interaction between the designer and the evolving design. We will present a theory about design activity based on the observation that knowledge is brought into play during a design task by a process of interpretation of the design document. By treating an evolving design in terms of the meanings and rules proper to a given way of seeing, a designer can reduce the complexity of a task by focusing on certain of its aspects, and can manipulate abstract elements in a meaningful way.

series ACADIA
email
last changed 2022/06/07 07:55

_id cc68
authors García, Agustín Pérez
year 1992
title Learning Structural Design - Computers and Virtual Laboratories
doi https://doi.org/10.52842/conf.ecaade.1992.525
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 525-534
summary This paper shows how the spreading use of computers can improve the quality of education, especially in the field of architecture. An Innovative Teaching Project oriented to the discipline Structural Design of Buildings has been implemented at the School of Architecture of Valencia. The main objective of this project is the transformation of the computer room into a virtual laboratory for simulating the behaviour of structural typologies using mathematical models of them. An environment, especially oriented to Structural Design, has been integrated in a Computer Aided Design platform to teach how to design the structure of buildings.
series eCAADe
last changed 2022/06/07 07:51

_id ddss9211
id ddss9211
authors Gilleard, J. and Olatidoye, O.
year 1993
title Graphical interfacing to a conceptual model for estimating the cost of residential construction
source Timmermans, Harry (Ed.), Design and Decision Support Systems in Architecture (Proceedings of a conference held in Mierlo, the Netherlands in July 1992), ISBN 0-7923-2444-7
summary This paper presents a method for determining elemental square foot costs and cost significance for residential construction. Using AutoCAD's icon menu and dialogue box facilities, a non-expert may graphically select (i) residential configuration; (ii) construction quality level; (iii) geographical location; (iv) square foot area; and finally, (v) add-ons, e.g. porches and decks, basement, heating and cooling equipment, garages and carports etc. in order to determine on-site builder's costs. Subsequent AutoLisp routines facilitate data transfer to a Lotus 1-2-3 spreadsheet where an elemental cost breakdown for the project may be determined. Finally, using Lotus 1-2-3 macros, computed data is transferred back to AutoCAD, where all cost significant items are graphically highlighted.
series DDSS
last changed 2003/08/07 16:36

_id 32eb
authors Henry, Daniel
year 1992
title Spatial Perception in Virtual Environments : Evaluating an Architectural Application
source University of Washington
summary Over the last several years, professionals from many different fields have come to the Human Interface Technology Laboratory (H.I.T.L.) to discover and learn about virtual environments. In general, they are impressed by their experiences and express the tremendous potential the tool has in their respective fields. But the potentials are always projected far in the future, and the tool remains just a concept. This is justifiable because the quality of the visual experience is so much less than what people are used to seeing: high-definition television, breathtaking special cinematographic effects and photorealistic computer renderings. Instead, the models in virtual environments are very simple looking; they are made of small spaces, filled with simple or abstract looking objects with little color distinction as seen through displays of noticeably low resolution and at an update rate which leaves much to be desired. Clearly, for most applications, the requirements of precision have not been met yet with virtual interfaces as they exist today. However, there are a few domains where the relatively low level of the technology could be perfectly appropriate. In general, these are applications which require that the information be presented in symbolic or representational form. Having studied architecture, I knew that there are moments during the early part of the design process when conceptual decisions are made which require precisely the simple and representative nature available in existing virtual environments. This was a marvelous discovery for me because I had found a viable use for virtual environments which could be immediately beneficial to architecture, my shared area of interest. It would be further beneficial to architecture in that the virtual interface equipment I would be evaluating at the H.I.T.L. happens to be relatively less expensive and more practical than other configurations such as the "Walkthrough" at the University of North Carolina. 
The set-up at the H.I.T.L. could be easily introduced into architectural firms because it takes up very little physical room (150 square feet) and it does not require expensive and space taking hardware devices (such as the treadmill device for simulating walking). Now that the potential for using virtual environments in this architectural application is clear, it becomes important to verify that this tool succeeds in accurately representing space as intended. The purpose of this study is to verify that the perception of spaces is the same, in both simulated and real environment. It is hoped that the findings of this study will guide and accelerate the process by which the technology makes its way into the field of architecture.
keywords Space Perception; Space (Architecture); Computer Simulation
series thesis:MSc
last changed 2003/02/12 22:37

_id ddss9215
id ddss9215
authors Mortola, E. and Giangrande, A.
year 1993
title A trichotomic segmentation procedure to evaluate projects in architecture
source Timmermans, Harry (Ed.), Design and Decision Support Systems in Architecture (Proceedings of a conference held in Mierlo, the Netherlands in July 1992), ISBN 0-7923-2444-7
summary This paper illustrates a model used to construct the evaluation module for An Interface for Designing (AID), a system to aid architectural design. The model can be used at the end of every cycle of analysis-synthesis-evaluation in the intermediate phases of design development. With the aid of the model it is possible to evaluate the quality of a project in overall terms to establish whether the project is acceptable, whether it should be elaborated ex-novo, or whether it is necessary to begin a new cycle to improve it. In this last case, it is also possible to evaluate the effectiveness of the possible actions and strategies for improvement. The model is based on a procedure of trichotomic segmentation, developed with MCDA (Multi-Criteria Decision Aid), which uses the outranking relation to compare the project with some evaluation profiles taken as projects of reference. An application of the model in the teaching field will also be described.
series DDSS
last changed 2003/08/07 16:36
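The trichotomic segmentation procedure summarized above can be sketched in a few lines of Python. This is a minimal illustration in the spirit of outranking-based MCDA, not the authors' implementation: the criteria, weights, 0.6 concordance cut-off, and the two reference profiles are invented for the example. A project outranks a reference profile when the weighted share of criteria on which it scores at least as well reaches the cut-off; comparing the project against an "accept" profile and a "reject" profile then sorts it into one of the three classes named in the abstract.

```python
# Sketch of trichotomic segmentation via an outranking relation.
# All criteria, weights, and thresholds are illustrative.

def outranks(project, profile, weights, cutoff=0.6):
    """True if the weighted share of criteria on which `project`
    scores at least as well as `profile` reaches `cutoff`."""
    concordance = sum(w for p, r, w in zip(project, profile, weights) if p >= r)
    return concordance / sum(weights) >= cutoff

def segment(project, reject_profile, accept_profile, weights):
    """Classify a project against two reference profiles."""
    if outranks(project, accept_profile, weights):
        return "acceptable"
    if outranks(project, reject_profile, weights):
        return "improvable"        # begin a new analysis-synthesis-evaluation cycle
    return "redesign ex novo"

# Four criteria scored 0-10 (e.g. function, cost, form, context fit).
weights = [3, 2, 2, 1]
reject, accept = [3, 3, 3, 3], [7, 7, 7, 7]

print(segment([8, 7, 9, 7], reject, accept, weights))  # acceptable
print(segment([5, 6, 4, 5], reject, accept, weights))  # improvable
print(segment([2, 3, 1, 2], reject, accept, weights))  # redesign ex novo
```

The same comparison run against candidate improvement actions (re-scoring the project as if the action had been applied) gives the effectiveness check mentioned in the abstract.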

_id ddss9210
id ddss9210
authors Poortman, E.R.
year 1993
title Ratios for cost control
source Timmermans, Harry (Ed.), Design and Decision Support Systems in Architecture (Proceedings of a conference held in Mierlo, the Netherlands in July 1992), ISBN 0-7923-2444-7
summary The design of buildings takes place in phases representing a development from rough to precision planning. Estimates are made in order to test whether the result is still within the budget set by the client or developer. In this way, the decisions taken during the design phase can be quantified and expressed in monetary terms. To prevent blaming the wrong person when an overrun is discovered, the cost control process has to be improved. For that purpose, two new procedures have been developed: (i) a new 'translation' activity; and (ii) ratios by which quantities can be characterized. 'Translation' is the opposite of estimation: a monetary budget is converted - 'translated' - into quantities, reflecting the desired quality of the building materials. The financial constraints of the client are thus converted into quantities - the building components used by the designers. Characteristic quantity figures play an important role in this activity. In working out an estimate, the form factor (i.e., the ratio between two characteristic values of a building component) has to be determined. The unit cost is then tested against that ratio. The introduction of the 'translation' activity and the use of characteristic quantity figures and form factors enhance existing estimation methods. By implementing these procedures, cost control becomes considerably more reliable.
series DDSS
last changed 2003/08/07 16:36
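The two procedures in the abstract above reduce to simple arithmetic, sketched here under invented numbers: the currency amounts, unit cost, and gross-floor-area figure are all hypothetical, and the facade example is only one possible building component. 'Translation' turns a monetary budget into an affordable quantity at an assumed unit cost; the form factor relates that quantity to a second characteristic value of the building.

```python
# Sketch of the 'translation' activity and the form factor check.
# All quantities and costs are invented for illustration.

def translate(budget, unit_cost):
    """The opposite of estimation: convert a monetary budget into
    the quantity (e.g. m2 of facade) affordable at `unit_cost`."""
    return budget / unit_cost

def form_factor(component_qty, reference_qty):
    """Ratio between two characteristic values of a building
    component, e.g. m2 of facade per m2 of gross floor area."""
    return component_qty / reference_qty

# Client budgets 400,000 for the facade; assumed unit cost 500 per m2.
affordable_m2 = translate(400_000, 500)        # 800.0 m2 of facade

# 800 m2 of facade on 2,000 m2 of gross floor area gives a form
# factor of 0.4; a design exceeding this ratio needs a cheaper
# facade or a more compact shape to stay within the budget.
ff = form_factor(affordable_m2, 2_000)
print(affordable_m2, ff)   # 800.0 0.4
```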

_id 4075
authors Rahman, O. M. A.
year 1992
title Visual quality and response assessment: an experimental technique
source Environment and Planning B: Planning and Design 19, pp. 689-708
summary Contributed by Susan Pietsch (spietsch@arch.adelaide.edu.au)
keywords 3D City Modeling, Development Control, Design Control
series other
last changed 2001/06/04 20:41

_id 2db4
authors Schmitt, Gerhard
year 1992
title Design for Performance
source New York: John Wiley & Sons, 1992. pp. 83-100 : ill. includes bibliography
summary Design for performance describes a generative approach toward fulfilling qualitative and quantitative design requirements based on specification and existing cases. The term design applies to the architectural domain; the term performance includes the aesthetic, quantitative, and qualitative behavior of an artifact. In achieving architectural quality while adhering to measurable criteria, design for performance has representational, computational, and practical advantages over traditional methods, in particular over post-facto single- and multi-criteria analysis and evaluation. In this paper a proposal for a working model and a partial implementation of this model are described.
keywords architecture, evaluation, performance, synthesis, design, representation, prediction, integration

authors Schneekloth, Lynda H., Jain, Rajendra K. and Day, Gary E.
year 1989
title Wind Study of Pedestrian Environments
source February 1989. 30, [2] p. : ill. includes bibliography and index
summary This report summarizes Part 1 of the research on wind conditions affecting pedestrian environments for the State University of New York at Buffalo. Part 1 reports on existing conditions in the main part of the North Campus in Amherst. Procedures and methods are outlined, the profile of the current situation is reported, and a special study on the proposed Natural Science and Math Building is included.
keywords architecture, research, evaluation, analysis, simulation, hardware
series CADline
last changed 1999/02/12 15:09

_id 0719
authors Shiffer, M.J.
year 1992
title Towards a collaborative planning system
source Environment and Planning B, Volume 19, 1992, pp. 709-722
summary This article begins by exploring the problem of combining the elements of group cognition, access to media, and access to tools into a holistic planning process. It then discusses a way in which technology can be used to help combine these activities by incorporating graphical interfaces, associative information structuring, and computer-supported collaborative work into a microcomputer-based Collaborative Planning System (CPS). Methods for the development of a CPS are proposed and two systems are explored as examples. It is concluded that increased access to relevant information, aided by the implementation of a CPS, can ultimately lead to greater communication amongst participants in a group planning situation. This will ultimately have a positive effect on the quality of plans and decisions.
series journal paper
last changed 2003/04/23 15:50

_id 25b7
authors Smeltzer, G., Roelen, W. and Maver, T.W.
year 1992
title Design Modelling and Design Presentation From a Computer-Generated Image Towards a Multi-user Design System
doi https://doi.org/10.52842/conf.ecaade.1992.137
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 137-144
summary CAD systems regularly offer new techniques for the presentation of design proposals, such as computer-generated (stereo) images, animations, holography and virtual reality. These techniques are mainly used for the presentation of a final design or of buildings that have already been constructed. As the quality of CAD systems and the skill of their users have improved enormously over time, it is also possible to use these systems for the evaluation of several provisional design proposals during the design process. Since 'beautiful pictures' and 'wonderful animations' have already shown their great value when presenting a design, it sometimes seems as if CAD systems are considered suitable for this purpose only. Even new techniques like virtual reality systems seem to be valued only through the 'tinted glasses' of their presentation capabilities. Hardly any attention is paid to the possibilities that these new techniques offer as instruments to support modelling and evaluation during the design process. This article will outline the results of research and development in the field of virtual reality. Virtual reality systems are based on the combination of a number of existing presentation techniques, such as photo-realistic images, stereo images and real-time animations. The added value of this type of CAD system is determined by the use of a new type of user interface. In effect this interface consists of sensors that register how the user moves and looks around. Through this, and by using a so-called 'eye phone' (comparable to stereo headphones for sound), the user, with some imaginative power, has the impression of standing in the environment that he modelled, or in front of his building design. After this we will first sketch the outlines of some presentation techniques that can also be found in a virtual reality system, paying special attention to some of their specific characteristics.
Next, a more detailed description will be given of virtual reality systems, focusing on the system being developed at Calibre itself.

series eCAADe
email
last changed 2022/06/07 07:56

_id 1b31
authors Stöckli, Tobi
year 1992
title THE MEASURABLE AND THE UNMEASURABLE OR - FROM FORM TO DESIGN TO EXISTANCE
source Proceedings of the 4th European Full-Scale Modelling Conference / Lausanne (Switzerland) 9-12 September 1992, Part B, pp. 55-62
summary This article discusses the architectural design process from two sides of a spectrum: the formal exercises of experts and the participatory process involving users. The "place" of the full-scale modelling laboratory at the Federal Institute of Technology in Lausanne is then assessed with respect to this spectrum. It may seem that activities in a full-scale laboratory are closer to the participatory process than to formal exercises. However, the activities of the full-scale laboratory in Lausanne may best be situated around the middle of the design process. It is clearly within the realm of the measurable (since each construction can easily be measured). Yet it does not quite correspond to the real building; it remains an abstraction, a model. And in this quality of abstraction lies the potential to give form to the unmeasurable. It is a tool which allows a transformation of the unmeasurable aspects of an idea into the unmeasurable of existence.
keywords Full-scale Modeling, Model Simulation, Real Environments
series other
type normal paper
email
more http://info.tuwien.ac.at/efa
last changed 2004/05/04 15:40

_id a89d
authors Wiederhold, G.
year 1992
title Mediators in the Architecture of Future Information Systems
source IEEE Computer 25, no. 3: 38-48
summary The installation of high-speed networks using optical fiber and high-bandwidth message-forwarding gateways is changing the physical capabilities of information systems. These capabilities must be complemented with corresponding advances in software systems to obtain a real benefit. Without smart software we will gain access to more data, but not improve access to the type and quality of information needed for decision making. To develop the concepts needed for future information systems, we model information processing as an interaction of data and knowledge. This model provides criteria for a high-level functional partitioning. These partitions are mapped into information processing modules. The modules are assigned to nodes of the distributed information systems. A central role is assigned to modules that mediate between the users' workstations and data resources. Mediators contain the administrative and technical knowledge to create the information needed for decision making.
series journal paper
last changed 2003/04/23 15:14
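Wiederhold's mediator concept from the abstract above can be illustrated with a small sketch: a module between the user's application and a raw data resource that applies domain knowledge to turn records into decision-relevant information. The class names, the sales domain, and the sample data are all invented for this example; they are not from the paper.

```python
# Minimal sketch of the mediator idea: the workstation asks the
# mediator a decision-level question and never touches raw records.
# All names and data are hypothetical.

class SalesResource:
    """Stands in for a raw data resource (e.g. a database wrapper)."""
    def rows(self):
        return [("widget", 120), ("gadget", 80), ("widget", 60)]

class SalesMediator:
    """Encodes the domain knowledge (aggregate, then rank) so the
    workstation receives information, not data."""
    def __init__(self, resource):
        self.resource = resource

    def top_products(self, n=1):
        totals = {}
        for product, qty in self.resource.rows():
            totals[product] = totals.get(product, 0) + qty
        return sorted(totals.items(), key=lambda kv: -kv[1])[:n]

mediator = SalesMediator(SalesResource())
print(mediator.top_products())   # [('widget', 180)]
```

Because the mediator owns the aggregation knowledge, the underlying resource can be replaced (or distributed across network nodes, as the paper envisions) without changing the workstation-facing interface.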

_id 8aab
authors Wiezel, Avi and Becker, Rachel
year 1992
title Integration of Performance Evaluation in Computer Aided Design
source New York: John Wiley & Sons, 1992. pp. 171-181 : ill. includes bibliography
summary An integrated computerized system for evaluation of the overall performance of a building was developed. The system exemplifies the capability of appropriate CAD techniques to upgrade the decision making process and the quality of the design. This paper describes the specific problems arising from the integration of the performance evaluation within the existing CAAD process
keywords CAD, systems, evaluation, civil engineering, integration, performance, building
series CADline
last changed 2003/06/02 13:58
