CumInCAD is a cumulative index of publications in Computer Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD Futures

Hits 1 to 20 of 247

_id 9f8a
authors Davidow, William H. and Malone, Michael S.
year 1992
title The Virtual Corporation: Structuring and Revitalizing the Corporation for the 21st Century
source New York: Harper Collins Publishers
summary The great value of this timely, important book is that it provides an integrated picture of the customer-driven company of the future. We have begun to learn about lean production technology, stripped-down management, worker empowerment, flexible customized manufacturing, and other modern strategies, but Davidow and Malone show for the first time how these ideas are fitting together to create a new kind of corporation and a worldwide business revolution. Their research is fascinating. The authors provide illuminating case studies of American, Japanese, and European companies that have discovered the keys to improved competitiveness, redesigned their businesses and their business relationships, and made extraordinary gains. They also write bluntly and critically about a number of American corporations that are losing market share by clinging to outmoded thinking. Business success in the global marketplace of the future is going to depend upon corporations producing "virtual" products high in added value, rich in variety, and available instantly in response to customer needs. At the heart of this revolution will be fast new information technologies; increased emphasis on quality; accelerated product development; changing management practices, including new alignments between management and labor; and new linkages between company, supplier, and consumer, and between industry and government. The Virtual Corporation is an important cutting-edge book that offers a creative synthesis of the most influential ideas in modern business theory. It has already fired excitement and debate in industry, academia, and government, and it is essential reading for anyone involved in the leadership of America's business and the shaping of America's economic future.
series other
last changed 2003/04/23 15:14

_id 3ff5
authors Abbo, I.A., La Scalea, L., Otero, E. and Castaneda, L.
year 1992
title Full-Scale Simulations as Tool for Developing Spatial Design Ability
source Proceedings of the 4th European Full-Scale Modelling Conference / Lausanne (Switzerland) 9-12 September 1992, Part C, pp. 7-10
summary Spatial Design Ability has been defined as the capability to anticipate the effects (psychological impressions on potential observers or users) produced by mental manipulation of elements of architectural or urban spaces. This ability, of great importance in choosing the appropriate option during the design process, is not specifically developed in schools of architecture and is only partially obtained as a by-product of drawing, designing or architectural criticism. We use our Laboratory as a tool to present spaces to people so that they can evaluate them. By means of a series of exercises, students confront their anticipations with the psychological impressions produced in other people. Here we present an experiment in which students had to propose a space for an exhibition hall in which architectural projects (student theses) were to be shown. Following the Spatial Design Ability Development Model, which we have been using for several years, students first become acquainted with the use of evaluation instruments for psychological impressions as well as with research methodology. In this case, due to the short period available, we reduced the research to investigating the effects produced by the manipulation of only two independent variables: students manipulated first the form of the roof, walls and interior elements, and secondly the colour and texture of those elements. They evaluated the spatial quality, character and other psychological impressions that the manipulations produced in people. They used three-dimensional scale models at 1:10 and at full scale (1:1).
keywords Full-scale Modeling, Model Simulation, Real Environments
series other
more http://info.tuwien.ac.at/efa
last changed 2003/08/25 10:12

_id 7291
authors Arvesen, Liv
year 1992
title Measures and the Unmeasurable
source Proceedings of the 4th European Full-Scale Modelling Conference / Lausanne (Switzerland) 9-12 September 1992, Part C, pp. 11-16
summary Nowhere do we find an environment quite like that of the tea ceremony. We may learn from the tea masters as we may learn from our masters of architecture. Directly and indirectly we are influenced by our surroundings, as has been proved by research and as we experience in our daily lives. Full-scale experiments have been carried out on this subject. Addressing the nervous mind, the experiments concentrated on forms expressing safety and peace.
keywords Full-scale Modeling, Model Simulation, Real Environments
series other
more http://info.tuwien.ac.at/efa
last changed 2003/08/25 10:12

_id 6270
authors Atac, Ibrahim
year 1992
title CAAD Education and Post-Graduate Opportunities (At Mimar Sinan University)
doi https://doi.org/10.52842/conf.ecaade.1992.273
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 273-278
summary This paper addresses new design teaching strategies at an important and traditional university in Istanbul, founded as the Academy of Fine Arts 110 years ago. It includes a short review of design education before the Academy became a university, and a description of the present situation with regard to computers. Nearly two years ago, CAAD education was introduced as an elective subject. The students show great interest in CAD; most Turkish architects now work with computers and CAAD graphics, although automated architecture has not yet become firmly established. The aim of the CAD studio is also to establish an institute which will allow university staff to develop their own programs and to pursue scientific research in this field. Rising demand from researchers and students calls for rapid but sound development to keep pace with new technologies. With greater specialization in CAD as its future target, MSU is attempting to broaden its horizons by incorporating the design methodologies of recent decades.

series eCAADe
last changed 2022/06/07 07:54

_id ecaadesigradi2019_449
authors Becerra Santacruz, Axel
year 2019
title The Architecture of ScarCity Game - The craft and the digital as an alternative design process
doi https://doi.org/10.52842/conf.ecaade.2019.3.045
source Sousa, JP, Xavier, JP and Castro Henriques, G (eds.), Architecture in the Age of the 4th Industrial Revolution - Proceedings of the 37th eCAADe and 23rd SIGraDi Conference - Volume 3, University of Porto, Porto, Portugal, 11-13 September 2019, pp. 45-52
summary The Architecture of ScarCity Game is a board game used as a pedagogical tool that challenges architecture students by involving them in a series of experimental design sessions to understand the design process under scarcity and the actual relation between the craft and the digital. This means "pragmatic delivery processes and material constraints, where the exchange between the artisan of handmade, representing local skills and technology of the digitally conceived is explored" (Huang 2013). The game focuses on understanding the different variables of the crafted design process of traditional communities under conditions of scarcity (Michel and Bevan 1992). This requires first analyzing the spatial environmental model of interaction, the available human and natural resources, and the dynamic relationship of these variables in a digital era. In the first stage (Pre-Agency), the game sets out the concept of the craft by limiting students' design exploration to a minimal starting point, developing locally available resources and techniques. The key elements of the design process of traditional knowledge communities have to be identified (Preez 1984). In other words, this stage is driven by limited resources + chance + contingency. In the second stage (Post-Agency), students, taking the architect's role within these communities, have to speculate on and explore the interface between the craft (local knowledge and low-technology tools) and the digital, represented by computational data, newly available technologies and construction. This means the introduction of strategy + opportunity + chance as part of the design process. In this sense, the game has a life beyond its mechanics. This other life challenges the participants to exploit the possibilities of breaking the actual boundaries of design. The result is a tool that challenges conventional methods of teaching and learning that control a prescribed design process. It confronts the rules that professionals in this field take for granted. The game simulates a 'fake' reality by exploring surveyed information in different ways. As a result, participants do not have anything 'real' to lose. Instead, they have all the freedom to innovate and be creative.
keywords Global south, scarcity, low tech, digital-craft, design process and innovation by challenge.
series eCAADeSIGraDi
last changed 2022/06/07 07:54

_id 065b
authors Beitia, S.S., Zulueta, A. and Barrallo, J.
year 1995
title The Virtual Cathedral - An Essay about CAAD, History and Structure
doi https://doi.org/10.52842/conf.ecaade.1995.355
source Multimedia and Architectural Disciplines [Proceedings of the 13th European Conference on Education in Computer Aided Architectural Design in Europe / ISBN 0-9523687-1-4] Palermo (Italy) 16-18 November 1995, pp. 355-360
summary The Old Cathedral of Santa Maria in Vitoria is the most representative building of the Gothic style in the Basque Country. Built during the 14th century, it was closed to worship in 1994 because of the high risk of collapse presented by its structure. The closure was prompted by the structural analysis entrusted to the University of the Basque Country in 1992. The topographic work carried out in the Cathedral to produce the planimetry of the temple revealed that many structural elements of great importance, such as arches, buttresses and flying buttresses, had been removed, modified or added over the history of Santa Maria. The first structural analysis made in the church suggested that the large deformations shown by the resistant elements, especially the piers, originated in interventions made in the past. A thorough historical investigation allowed us to learn how the Cathedral was built and what changes were made up to the present day. With this information, we started building a virtual model of the Cathedral of Santa Maria. This model was introduced into a Finite Element Method system to study the deformations suffered by the church during its construction in the 14th century and during the interventions made later in the 15th, 16th and 20th centuries. The efficiency of the virtual model in simulating the geometry of the Cathedral throughout its history allowed us to detect the cause of the structural damage, which was finally traced to a number of unfortunate interventions over time.

series eCAADe
more http://dpce.ing.unipa.it/Webshare/Wwwroot/ecaade95/Pag_43.htm
last changed 2022/06/07 07:54

_id 2cb4
authors Bille, Pia
year 1992
title CAD at the AAA
doi https://doi.org/10.52842/conf.ecaade.1992.279
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 279-288
summary Teaching computer science at the Aarhus School of Architecture goes back to the beginning of the 1980s, when a few teachers and students were curious about the new medium, seeing its great development prospects and its possible use in the design of architecture. The curiosity and excitement about the technology continued, although the results were modest and usefulness was not a dominant aspect in this early period. In the mid-1980s the School of Architecture was given the opportunity, by means of state funding, to buy its first 10 IBM PCs to run AutoCAD among other programmes. Besides this, a larger CAD system, Gable 4D Series, was introduced, running on MicroVAX workstations. The software was dedicated to drafting buildings in 2 and 3 dimensions - an important task within the profession of architects.

series eCAADe
last changed 2022/06/07 07:52

_id 0ad8
authors Candy, E., Maver, T.W. and Petric, J.
year 1992
title A Multi-Media Celebration of Robert Adam's Glasgow Architecture
doi https://doi.org/10.52842/conf.ecaade.1992.043
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 43-54
summary This paper is a summary of work done in preparation for an exhibition titled "A European Vision: Robert Adam's Glasgow", which marks the bicentenary of Robert Adam's death. The main contributors to this project, orchestrated over academic session 1991/92, were the undergraduate and post-graduate students of the Department of Architecture, University of Strathclyde, Glasgow.
series eCAADe
last changed 2022/06/07 07:54

_id 91c4
authors Checkland, P.
year 1981
title Systems Thinking, Systems Practice
source John Wiley & Sons, Chichester
summary "Whether by design, accident or merely synchronicity, Checkland appears to have developed a habit of writing seminal publications near the start of each decade which establish the basis and framework for systems methodology research for that decade." (Hamish Rennie, Journal of the Operational Research Society, 1992.) Thirty years ago Peter Checkland set out to test whether the Systems Engineering (SE) approach, highly successful in technical problems, could be used by managers coping with the unfolding complexities of organizational life. The straightforward transfer of SE to the broader situations of management was not possible, but by insisting on a combination of systems thinking strongly linked to real-world practice Checkland and his collaborators developed an alternative approach - Soft Systems Methodology (SSM) - which enables managers of all kinds and at any level to deal with the subtleties and confusions of the situations they face. This work established the now accepted distinction between hard systems thinking, in which parts of the world are taken to be systems which can be engineered, and soft systems thinking, in which the focus is on making sure the process of inquiry into real-world complexity is itself a system for learning. Systems Thinking, Systems Practice (1981) and Soft Systems Methodology in Action (1990), together with an earlier paper, Towards a Systems-based Methodology for Real-World Problem Solving (1972), have long been recognized as classics in the field. Now Peter Checkland has looked back over the three decades of SSM development, brought the account of it up to date, and reflected on the whole evolutionary process which has produced a mature SSM. SSM: A 30-Year Retrospective, here included with Systems Thinking, Systems Practice, closes a chapter on what is undoubtedly the most significant single research programme on the use of systems ideas in problem solving. Now retired from full-time university work, Peter Checkland continues his research as a Leverhulme Emeritus Fellow.
series other
last changed 2003/04/23 15:14

_id e412
authors Fargas, Josep and Papazian, Pegor
year 1992
title Modeling Regulations and Intentions for Urban Development: The Role of Computer Simulation in the Urban Design Studio
doi https://doi.org/10.52842/conf.ecaade.1992.201
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 201-212
summary In this paper we present a strategy for modeling urban development in order to study the role of urban regulations and policies in the transformation of cities. We also suggest a methodology for using computer models as experimental tools in the urban design studio in order to make explicit the factors involved in shaping cities, and for the automatic visualization of projected development. The structure of the proposed model is based on different modules which represent, on the one hand, the rules regulating the physical growth of a city and, on the other hand, heuristics corresponding to different interests such as Real Estate Developers, City Hall Planners, Advocacy and Community Groups, and so on. Here we present a case study dealing with the Boston Redevelopment Authority zoning code for the Midtown Cultural District of Boston. We introduce a computer program which develops the district, adopting a particular point of view regarding urban regulation. We then generalize the notion of this type of computer modeling and simulation, and draw some conclusions about its possible uses in the teaching and practice of design.
series eCAADe
last changed 2022/06/07 07:55
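
The Fargas and Papazian abstract above describes a model built from a regulation module plus heuristic modules for actors such as developers and planners. As a rough illustration of that structure only - the grid, the height rule, and the developer heuristic below are all invented, not taken from the paper or from the Boston Midtown Cultural District code - here is a minimal sketch in Python:

```python
import random

# Invented toy model: a regulation module (zoning_allows) constrains
# the growth that an actor heuristic (developer_proposal) keeps proposing.
SIZE, MAX_HEIGHT = 8, 6
heights = [[0] * SIZE for _ in range(SIZE)]        # storeys per lot

def zoning_allows(x, y, h):
    """Regulation module: invented height cap that steps down at the edges."""
    edge_distance = min(x, y, SIZE - 1 - x, SIZE - 1 - y)
    return h <= min(MAX_HEIGHT, 2 + edge_distance)

def developer_proposal():
    """Developer heuristic: add one storey to a randomly chosen lot."""
    x, y = random.randrange(SIZE), random.randrange(SIZE)
    return x, y, heights[x][y] + 1

for _ in range(200):                               # simulation loop
    x, y, h = developer_proposal()
    if zoning_allows(x, y, h):                     # planner checks the rule
        heights[x][y] = h

print("\n".join(" ".join(str(h) for h in row) for row in heights))
```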

_id ddss9214
authors Friedman, A.
year 1993
title A decision-making process for choice of a flexible internal partition option in multi-unit housing using decision theory techniques
source Timmermans, Harry (Ed.), Design and Decision Support Systems in Architecture (Proceedings of a conference held in Mierlo, the Netherlands in July 1992), ISBN 0-7923-2444-7
summary Recent demographic changes have increased the heterogeneity of user groups in the North American housing market. Smaller households (e.g. elderly, single parent) have non-traditional spatial requirements that cannot be accommodated within the conventional house layout. This has created renewed interest in Demountable/Flexible internal partition systems. However, the process by which designers decide which project or user groups are most suited for the use of these systems is quite often complex, non-linear, uncertain and dynamic, since the decisions involve natural processes and human values that are apparently random. The anonymity of users when mass housing projects are conceptualized, and the uncertainty as to the alternative to be selected by the user, given his/her constantly changing needs, are some contributing factors to this effect. Decision Theory techniques, not commonly used by architects, can facilitate the decision-making process through a systematic evaluation of alternatives by means of quantitative methods in order to reduce uncertainty in probabilistic events or in cases when data is insufficient. The author used Decision Theory in the selection of flexible partition systems. The study involved a multi-unit, privately initiated housing project in Montreal, Canada, where real site conditions and costs were used. In this paper, the author outlines the fundamentals of Decision Theory and demonstrates the use of Expected Monetary Value and Weighted Objective Analysis methods and their outcomes in the design of a Montreal housing project. The study showed that Decision Theory can be used as an effective tool in housing design once the designer knows how to collect basic data.
series DDSS
last changed 2003/08/07 16:36
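
Friedman's abstract above demonstrates Expected Monetary Value (EMV) as a decision criterion. A minimal sketch of the EMV arithmetic, with invented probabilities and payoffs standing in for the real Montreal cost data used in the study:

```python
# EMV of an alternative = sum over outcomes of probability * payoff.
# All figures below are invented placeholders, not the paper's data.

def emv(outcomes):
    """outcomes: list of (probability, payoff) pairs; probabilities sum to 1."""
    return sum(p * payoff for p, payoff in outcomes)

# Hypothetical partition options, with payoffs depending on whether
# occupants later reconfigure their units (p = 0.4) or not (p = 0.6).
alternatives = {
    "conventional fixed partitions": [(0.6, 12000), (0.4, -3000)],
    "demountable/flexible system":   [(0.6,  9000), (0.4,  8000)],
}

for name, outcomes in alternatives.items():
    print(f"{name}: EMV = {emv(outcomes):,.0f}")
print("preferred:", max(alternatives, key=lambda k: emv(alternatives[k])))
```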

_id 7ce5
authors Gal, Shahaf
year 1992
title Computers and Design Activities: Their Mediating Role in Engineering Education
source Sociomedia, ed. Edward Barrett. MIT Press
summary Sociomedia: With all the new words used to describe electronic communication (multimedia, hypertext, cyberspace, etc.), do we need another one? Edward Barrett thinks we do; hence, he coins the term "sociomedia." It is meant to displace a computing economy in which technicity is hypostasized over sociality. Sociomedia, a compilation of twenty-five articles on the theory, design and practice of educational multimedia and hypermedia, attempts to re-value the communicational face of computing. Value, of course, is "ultimately a social construct." As such, it has everything to do with knowledge, power, education and technology. The projects discussed in this book represent the leading edge of electronic knowledge production in academia (not to mention major funding) and are determining the future of educational media. For these reasons, Sociomedia warrants close inspection. Barrett's introduction sets the tone. For him, designing computer media involves hardwiring a mechanism for the social construction of knowledge (1). He links computing to a process of social and communicative interactivity for constructing and disseminating knowledge. Through a mechanistic mapping of the university as hypercontext (a huge network that includes classrooms as well as services and offices), Barrett models intellectual work in such a way as to avoid "limiting definitions of human nature or human development." Education, then, can remain "where it should be--in the human domain (public and private) of sharing ideas and information through the medium of language." By leaving education in a virtual realm (where we can continue to disagree about its meaning and execution), it remains viral, mutating and contaminating in an intellectually healthy way. He concludes that his mechanistic model, by means of its reductionist approach, preserves value (7). This "value" is the social construction of knowledge. While I support the social orientation of Barrett's argument, discussions of value are related to power. I am not referring to the traditional teacher-student power structure that is supposedly dismantled through cooperative and constructivist learning strategies. The power to be reckoned with in the educational arena is foundational, that which (pre)determines value and the circulation of knowledge. "Since each of you reading this paragraph has a different perspective on the meaning of 'education' or 'learning,' and on the processes involved in 'getting an education,' think of the hybris in trying to capture education in a programmable function, in a displayable object, in a 'teaching machine'" (7). Actually, we must think about that hybris because it is, precisely, what informs teaching machines. Moreover, the basic epistemological premises that give rise to such productions are too often assumed. In the case of instructional design, the episteme of the cognitive sciences is often taken for granted. It is ironic that many of the "postmodernists" who support electronic hypertextuality seem to have missed Jacques Derrida's and Michel Foucault's "deconstructions" of the epistemology underpinning cognitive sciences (if not of epistemology itself). Perhaps it is the glitz of the technology that blinds some users (qua developers) to the belief systems operating beneath the surface. Barrett is not guilty of reactionary thinking or politics; he is, in fact, quite in line with much American deconstructive and postmodern thinking.
The problem arises in that he leaves open the definitions of "education," "learning" and "getting an education." One cannot engage in the production of new knowledge without orienting its design, production and dissemination, and without negotiating with others' orientations, especially where large-scale funding is involved. Notions of human nature and development are structural, even infrastructural, whatever the medium of the teaching machine. Although he addresses some dynamics of power, money and politics when he talks about the recession and its effects on the conference, they are readily visible dynamics of power (3-4). Where does the critical factor of value determination, of power, of who gets what and why, get mapped onto a mechanistic model of learning institutions? Perhaps a mapping of contributors' institutions, of the funding sources for the projects showcased and for participation in the conference, and of the disciplines receiving funding for these sorts of projects would help visualize the configurations of power operative in the rising field of educational multimedia. Questions of power and money notwithstanding, Barrett's introduction sets the social and textual thematics for the collection of essays. His stress on interactivity, on communal knowledge production, on the society of texts, and on media producers and users is carried forward through the other essays, two of which I will discuss. Section I of the book, "Perspectives...," highlights the foundations, uses and possible consequences of multimedia and hypertextuality. The second essay in this section, "Is There a Class in This Text?," plays on the robust exchange surrounding Stanley Fish's book, Is There a Text in This Class?, which presents an attack on authority in reading. The author, John Slatin, has introduced electronic hypertextuality and interaction into his courses. His article maps the transformations in "the content and nature of work, and the workplace itself"--which, in this case, is not industry but an English poetry class (25). Slatin discovered an increase of productive and cooperative learning in his electronically-mediated classroom. For him, creating knowledge in the electronic classroom involves interaction between students, instructors and course materials through the medium of interactive written discourse. These interactions lead to a new and persistent understanding of the course materials and of the participants' relation to the materials and to one another. The work of the course is to build relationships that, in my view, constitute not only the meaning of individual poems, but poetry itself. The class carries out its work in the continual and usually interactive production of text (31). While I applaud his strategies which dismantle traditional hierarchical structures in academia, the evidence does not convince me that the students know enough to ask important questions or to form a self-directing, learning community. Stanley Fish has not relinquished professing, though he, too, espouses the indeterminacy of the sign. By the fourth week of his course, Slatin's input is, by his own reckoning, reduced to 4% (39). In the transcript of the "controversial" Week 6 exchange on Gertrude Stein--the most disliked poet they were discussing at the time (40)--we see the blind leading the blind. One student parodies Stein for three lines and sums up his input with "I like it." Another finds Stein's poetry "almost completey [sic] lacking in emotion or any artistic merit" (emphasis added).
On what grounds has this student become an arbiter of "artistic merit"? Another student, after admitting being "lost" during the Wallace Stevens discussion, talks of having more "respect for Stevens' work than Stein's" and adds that Stein's poetry lacks "conceptual significance[, s]omething which people of varied opinion can intelligently discuss without feeling like total dimwits...." This student has progressed from admitted incomprehension of Stevens' work to imposing her (groundless) respect for his work over Stein's. Then, she exposes her real dislike for Stein's poetry: that she (the student) missed the "conceptual significance" and hence cannot, being a person "of varied opinion," intelligently discuss it "without feeling like [a] total dimwit." Slatin's comment is frightening: "...by this point in the semester students have come to feel increasingly free to challenge the instructor" (41). The students that I have cited are neither thinking critically nor are their preconceptions challenged by student-governed interaction. Thanks to the class format, one student feels self-righteous in her ignorance, and empowered to censure. I believe strongly in student empowerment in the classroom, but only once students have accrued enough knowledge to make informed judgments. Admittedly, Slatin's essay presents only partial data (there are six hundred pages of course transcripts!); still, I wonder how much valuable knowledge and metaknowledge was gained by the students. I also question the extent to which authority and professorial dictature were addressed in this course format. The power structures that make it possible for a college to require such a course, and the choice of texts and pedagogy, were not "on the table." The traditional professorial position may have been displaced, but what took its place?--the authority of consensus with its unidentifiable strong arm, and the faceless reign of software design? Despite Slatin's claim that the students learned about the learning process, there is no evidence (in the article) that the students considered where their attitudes came from, how consensus operates in the construction of knowledge, how power is established and what relationship they have to bureaucratic institutions. How do we, as teaching professionals, negotiate a balance between an enlightened despotism in education and student-created knowledge? Slatin, and other authors in this book, bring this fundamental question to the fore. There is no definitive answer because the factors involved are ultimately social, and hence, always shifting and reconfiguring. Slatin ends his article with the caveat that computerization can bring about greater estrangement between students, faculty and administration through greater regimentation and control. Of course, it can also "distribute authority and power more widely" (50). Power or authority without a specific face, however, is not necessarily good or just. Shahaf Gal's "Computers and Design Activities: Their Mediating Role in Engineering Education" is found in the second half of the volume, and does not allow for a theory/praxis dichotomy. Gal recounts a brief history of engineering education up to the introduction of Growltiger (GT), a computer-assisted learning aid for design. He demonstrates GT's potential to impact the learning of engineering design by tracking its use by four students in a bridge-building contest.
What his text demonstrates clearly is that computers are "inscribing and imaging devices" that add another viewpoint to an on-going dialogue between student, teacher, earlier coursework, and other teaching/learning tools. The less proficient students made a serious error by relying too heavily on the technology, or treating it as a "blueprint provider." They "interacted with GT in a way that trusted the data to represent reality. They did not see their interaction with GT as a negotiation between two knowledge systems" (495). Students who were more thoroughly informed in engineering discourses knew to use the technology as one voice among others--they knew enough not simply to accept the input of the computer as authoritative. The less-advanced students learned a valuable lesson from the competition itself: the fact that their designs were not able to hold up under pressure (literally) brought the fact of their insufficient knowledge crashing down on them (and their bridges). They also had, post factum, several other designs to study, especially the winning one. Although competition and comparison are not good pedagogical strategies for everyone (in this case the competitors had volunteered), at some point what we think we know has to be challenged within the society of discourses to which it belongs. Students need critique in order to learn to push their learning into auto-critique. This is what is lacking in Slatin's discussion and in the writings of other avatars of constructivist, collaborative and computer-mediated pedagogies. Obviously there are differences between instrumental types of knowledge acquisition and discursive knowledge accumulation. Indeed, I do not promote the teaching of reading, thinking and writing as "skills" per se (then again, Gal's teaching of design is quite discursive, if not dialogic). Nevertheless, the "soft" sciences might benefit from "bridge-building" competitions or the re-institution of some forms of agonia. Not everything agonistic is inhuman agony--the joy of confronting or creating a sound argument supported by defensible evidence, for example. Students need to know that soundbites are not sound arguments despite predictions that electronic writing will be aphoristic rather than periodic. Just because writing and learning can be conceived of hypertextually does not mean that rigor goes the way of the dinosaur. Rigor and hypertextuality are not mutually incompatible. Nor are rigorous thinking and hard intellectual work unpleasurable, although American anti-intellectualism, especially in the mass media, would make them so. At a time when the spurious dogmatics of a Rush Limbaugh and Holocaust revisionist historians circulate "aphoristically" in cyberspace, and at a time when knowledge is becoming increasingly textualized, the role of critical thinking in education will ultimately determine the value(s) of socially constructed knowledge. This volume affords the reader an opportunity to reconsider knowledge, power, and new communications technologies with respect to social dynamics and power relationships.
series other
last changed 2003/04/23 15:14

_id ea96
authors Hacfoort, Eek J. and Veldhuisen, Jan K.
year 1992
title A Building Design and Evaluation System
source New York: John Wiley & Sons, 1992. pp. 195-211 : ill. table. includes bibliography
summary Within the field of architectural design there is a growing awareness of an imbalance among the professionalism, the experience, and the creativity of designers' responses to the up-to-date requirements of all parties interested in the design process. The building design and evaluation system COSMOS makes it possible for various participants to work within their own domain, so that separate but coordinated work can be done. The system is meant to organize the initial stage of the design process, where user-defined functions, geometry, type of construction, and building materials are decided. It offers a tool to design a building, to calculate a number of effects, and to manage the information necessary to evaluate the design decisions. The system is provided with data and sets of parameters describing the conditions, along with their properties, of the main building functions of a selection of well-known building types. The architectural design is conceptualized as a hierarchy of spatial units, ranging from building blocks down to specific rooms or spaces. The concept of zoning is used as a means of calculating and directly evaluating the structure of the design without working out the details. A distinction is made between internal and external calculations and evaluations during the initial design process. During design on screen, estimates can be recorded of building costs, energy costs, acoustics, lighting, construction, and utility. Furthermore, the design can be exported to a design application program, in this case AutoCAD, to make and show drawings in more detail. Through the medium of a database, external calculation and evaluation of building costs, life-cycle costs, energy costs, interior climate, acoustics, lighting, construction, and utility are possible in much more advanced application programs.
keywords evaluation, applications, integration, architecture, design, construction, building, energy, cost, lighting, acoustics, performance
series CADline
last changed 2003/06/02 13:58
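
The COSMOS description above hinges on a hierarchy of spatial units whose effects (costs, energy, and so on) can be evaluated without working out details. A minimal sketch of such a hierarchy with one aggregated effect; the unit rates and zone names are invented, not COSMOS data:

```python
from dataclasses import dataclass, field

@dataclass
class SpatialUnit:
    """A node in the building hierarchy: block -> zones -> rooms."""
    name: str
    area_m2: float = 0.0
    cost_rate: float = 0.0          # currency per m2, invented placeholder
    children: list["SpatialUnit"] = field(default_factory=list)

    def cost(self) -> float:
        """Evaluate one 'effect' by aggregating over the hierarchy."""
        own = self.area_m2 * self.cost_rate
        return own + sum(child.cost() for child in self.children)

building = SpatialUnit("building block", children=[
    SpatialUnit("office zone", area_m2=400, cost_rate=900),
    SpatialUnit("service zone", area_m2=120, cost_rate=1200),
])
print(f"estimated cost: {building.cost():,.0f}")
```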

_id 7e68
authors Holland, J.
year 1992
title Genetic Algorithms
source Scientific American, July 1992
summary Living organisms are consummate problem solvers. They exhibit a versatility that puts the best computer programs to shame. This observation is especially galling for computer scientists, who may spend months or years of intellectual effort on an algorithm, whereas organisms come by their abilities through the apparently undirected mechanism of evolution and natural selection. Pragmatic researchers see evolution's remarkable power as something to be emulated rather than envied. Natural selection eliminates one of the greatest hurdles in software design: specifying in advance all the features of a problem and the actions a program should take to deal with them. By harnessing the mechanisms of evolution, researchers may be able to "breed" programs that solve problems even when no person can fully understand their structure. Indeed, these so-called genetic algorithms have already demonstrated the ability to make breakthroughs in the design of such complex systems as jet engines. Genetic algorithms make it possible to explore a far greater range of potential solutions to a problem than do conventional programs. Furthermore, as researchers probe the natural selection of programs under controlled and well-understood conditions, the practical results they achieve may yield some insight into the details of how life and intelligence evolve in the natural world.
series journal paper
last changed 2003/04/23 15:50
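
Holland's abstract describes the mechanism of genetic algorithms only in outline. As a concrete illustration (not an example from the article), here is a minimal genetic algorithm on the toy OneMax problem, where fitness is simply the number of 1-bits in a string; selection, crossover, and mutation appear in their simplest forms:

```python
import random

GENES, POP_SIZE, GENERATIONS, MUTATION_RATE = 32, 40, 60, 0.02

def fitness(individual):
    return sum(individual)                    # OneMax: count the 1-bits

def select(population):
    """Tournament selection: the fitter of two random individuals."""
    return max(random.sample(population, 2), key=fitness)

def crossover(a, b):
    cut = random.randrange(1, GENES)          # single-point crossover
    return a[:cut] + b[cut:]

def mutate(individual):
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit
            for bit in individual]

population = [[random.randint(0, 1) for _ in range(GENES)]
              for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    population = [mutate(crossover(select(population), select(population)))
                  for _ in range(POP_SIZE)]

print("best fitness:", fitness(max(population, key=fitness)), "of", GENES)
```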

_id 56e9
authors Huang, Tao-Kuang
year 1992
title A Graphical Feedback Model for Computerized Energy Analysis during the Conceptual Design Stage
source Texas A&M University
summary During the last two decades, considerable effort has been devoted to the development of building design analysis tools. Architects and designers have begun to take advantage of computers to generate and examine design alternatives. However, because it has been difficult to adapt computer technologies to the visual orientation of the building designer, the majority of computer applications have been limited to numerical analysis and office automation tasks. Only recently, because of advances in hardware and software techniques, have computers entered a new phase in the development of architectural design. Computers are now able to interactively display graphic solutions to architecture-related problems, which is fundamental to the design process. The majority of research programs in energy-efficient design have sharpened people's understanding of energy principles and their application of those principles. Energy conservation concepts, however, have not been widely used. A major problem in the implementation of these principles is that energy principles and their applications are abstract, hard to visualize and separated from the architectural design process. Furthermore, one aspect of energy analysis may contain thousands of pieces of numerical information, which often leads to confusion on the part of designers. If these difficulties can be overcome, it would bring great benefit to the advancement of energy conservation concepts. This research explores the concept of an integrated computer graphics program to support energy-efficient design. It focuses on (1) the integration of energy efficiency and architectural design, and (2) the visualization of building energy use through graphical interfaces during the conceptual design stage. It involves (1) the discussion of frameworks for computer-aided architectural design and computer-aided energy-efficient building design, and (2) the development of an integrated computer prototype program with a graphical interface that helps the designer create building layouts, analyze building energy use interactively and receive visual feedback dynamically. The goal is to apply computer graphics as an aid to visualizing the effects of energy-related decisions and therefore permit the designer to visualize and understand energy conservation concepts in the conceptual phase of architectural design.
series thesis:PhD
last changed 2003/02/12 22:37

_id 2467
authors Jockusch, Peter R.A.
year 1992
title How Can We Achieve a Good Building?
source New York: John Wiley & Sons, 1992. pp. 51-65 : ill. includes bibliography
summary This paper is concerned with the reasons and purposes for which we evaluate and predict building performance. The discussion is based on the author's experience, gained through the preparation and evaluation of more than 50 major architectural competitions. An attempt is made to discover for whom and in what respect a building can be considered a 'good building,' by asking the following questions: What can prediction and evaluation of building performance achieve? How well can we assess the performance and value of an existing building within its socio-technical context? For what purposes and with what degree of confidence can the eventual performance of a designed and specified building be predicted? How do these evaluations compare to actual post-occupancy performance? To what extent do the roles and motivations of assessors, evaluators, and decision makers affect the value-stating process?
keywords prediction, evaluation, performance, building, life cycle, design, architecture
series CADline
last changed 2003/06/02 13:58

_id c5d7
authors Kuffer, Monika
year 2003
title Monitoring the Dynamics of Informal Settlements in Dar es Salaam by Remote Sensing: Exploring the Use of SPOT, ERS and Small Format Aerial Photography
source CORP 2003, Vienna University of Technology, 25.2.-28.2.2003 [Proceedings on CD-Rom]
summary Dar es Salaam is exemplary of cities in the developing world facing enormous population growth. In recent decades, unplanned settlements have expanded tremendously, with the result that around 70 percent of urban dwellers now live in these areas. Tools for monitoring such tremendous growth are relatively weak in developing countries, so an effective satellite-based monitoring system could provide a useful instrument for tracking the dynamics of urban development. An investigation to assess the ability to extract reliable information on the expansion and consolidation levels (density) of urban development of the city of Dar es Salaam from SPOT-HRV and ERS-SAR images is described. The use of SPOT and ERS should provide data that is complementary to data derived from the most recent aerial photography and from digital topographic maps. In a series of experiments, various classification and fusion techniques are applied to the SPOT-HRV and ERS-SAR data to extract information on building density that is comparable to that obtained from the 1992 data. Ultimately, building density is estimated by linear and non-linear regression models on the basis of a one-hectare kernel, and further aggregation is made to the level of informal settlements for the final analysis. In order to assess reliability, use is made of several sample areas that are relatively stable over the study period, as well as of data derived from small-format aerial photography. The experiments show a high correlation between the density data derived from the satellite images and the test areas.
series other
last changed 2003/03/11 20:39
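
The density-estimation step in the Kuffer abstract - regression of building density over a one-hectare kernel - can be illustrated with synthetic data. The sketch below assumes a binary built-up classification on a 10 m grid and invents the reference densities, since the actual SPOT/ERS-derived layers are not reproducible here:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic binary classification: 1 = built-up pixel on a 10 m grid.
classified = (rng.random((200, 200)) < 0.3).astype(float)

K = 10                                     # 10 x 10 pixels = one hectare
# Built-up fraction per non-overlapping one-hectare kernel.
fraction = classified.reshape(20, K, 20, K).mean(axis=(1, 3)).ravel()

# Invented "ground truth" densities, as if sampled from stable test
# areas and small-format aerial photography.
reference = 120 * fraction + rng.normal(0, 4, fraction.size)

slope, intercept = np.polyfit(fraction, reference, 1)   # linear model
print(f"density ~ {slope:.1f} * built-up fraction + {intercept:.1f}")
```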

_id caadria2024_365
authors Lahtinen, Aaro, Gardner, Nicole, Ramos Jaime, Cristina and Yu, Kuai
year 2024
title Visualising Sydney's Urban Green: A Web Interface for Monitoring Vegetation Coverage between 1992 and 2022 using Google Earth Engine
doi https://doi.org/10.52842/conf.caadria.2024.2.515
source Nicole Gardner, Christiane M. Herr, Likai Wang, Hirano Toshiki, Sumbul Ahmad Khan (eds.), ACCELERATED DESIGN - Proceedings of the 29th CAADRIA Conference, Singapore, 20-26 April 2024, Volume 2, pp. 515–524
summary With continued population growth and urban expansion, the severity of environmental concerns within cities is likely to increase without proper urban ecosystem monitoring and management. Despite this, limited efforts have been made to effectively communicate the ecological value of urban vegetation to Architecture, Engineering and Construction (AEC) professionals concerned with mitigating these effects and improving urban liveability. In response, this research project proposes a novel framework for identifying and conveying historical changes to vegetation coverage within the Greater Sydney area between 1992 and 2022. The cloud-based geo-spatial analysis platform, Google Earth Engine (GEE), was used to construct an accurate land cover classification of Landsat imagery, allowing the magnitude, spatial configuration, and period of vegetation loss to be promptly identified. The outcomes of this analysis are represented through an intuitive web platform that facilitates a thorough understanding of the complex relationships between anthropogenic activities and vegetation coverage. A key finding indicated that recent developments in the Blacktown area had directly contributed to heightened land surface temperature, suggesting a reformed approach to urban planning is required to address climatic concerns appropriately. The developed web interface provides a unique method for AEC professionals to assess the effectiveness of past planning strategies, encouraging a multi-disciplinary approach to urban ecosystem management.
keywords Urban Vegetation, Web Interface, Landsat Imagery, Land Cover Classification, Google Earth Engine
series CAADRIA
last changed 2024/11/17 22:05
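
The CAADRIA paper above classifies Landsat imagery in Google Earth Engine; its actual classifier and band choices are not given in the abstract. A minimal sketch of the general GEE pattern, using the Python API with an illustrative NDVI threshold on a 1992 Landsat 5 surface-reflectance composite (the point-buffer geometry and the 0.3 threshold are assumptions, not the paper's method):

```python
import ee

ee.Initialize()                      # requires an authenticated GEE account

sydney = ee.Geometry.Point([151.21, -33.87]).buffer(30_000)

composite_1992 = (
    ee.ImageCollection("LANDSAT/LT05/C02/T1_L2")   # Landsat 5, Collection 2 L2
    .filterBounds(sydney)
    .filterDate("1992-01-01", "1992-12-31")
    .median()
)

# Landsat 5 surface reflectance: SR_B4 = near-infrared, SR_B3 = red.
ndvi = composite_1992.normalizedDifference(["SR_B4", "SR_B3"])
vegetation = ndvi.gt(0.3)            # illustrative threshold only

# Sum pixel areas flagged as vegetation (result in square metres).
vegetated_m2 = (
    vegetation.multiply(ee.Image.pixelArea())
    .reduceRegion(reducer=ee.Reducer.sum(), geometry=sydney, scale=30)
    .getInfo()
)
print(vegetated_m2)
```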

_id 46c7
authors Ozel, Filiz
year 1992
title Data Modeling Needs of Life Safety Code (LSC) Compliance Applications
doi https://doi.org/10.52842/conf.acadia.1992.177
source Mission - Method - Madness [ACADIA Conference Proceedings / ISBN 1-880250-01-2] 1992, pp. 177-185
summary One of the most complex code compliance issues originates from the conformance of designs to the Life Safety Code (NFPA 101). The development of computer-based code compliance checking programs has attracted the attention of building researchers and practitioners alike. These studies represent a number of approaches, ranging from CAD-based procedural approaches to rule-based, non-graphic ones, but they do not address the interaction of the rule base of such systems with the graphic databases that define the geometry of architectural objects. Automatic extraction of the attributes and the configuration of building systems requires "architectural object - graphic entity" data models that allow access and retrieval of the necessary data for code compliance checking. This study aims specifically to focus on the development of such a data model through the use of the AutoLISP feature of the AutoCAD (Autodesk Inc.) graphic system. This data model is intended to interact with a Life Safety Code rule base created with the Level5-Object (Focus Inc.) expert system.

Assuming the availability of a more general building data model, one must define the life and fire safety features of a building before any automatic checking can be performed. Object-oriented data structures are beginning to be applied to design objects, since they allow the type versatility demanded by design applications. As one generates a functional view of the main data model, the software user must provide domain-specific information. A functional view is defined as the process of generating domain-specific data structures from a more general-purpose data model, such as defining egress routes from wall or room object data structures. Typically, in the early design phase of a project, these are related to the emergency egress design features of a building. Certain decisions, such as where to provide sprinkler protection or where to locate protected egress ways, must be made early in the process.

series ACADIA
last changed 2022/06/07 08:00
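
The Ozel record above pairs a graphic data model with a rule base (AutoLISP feeding Level5-Object); neither tool is reproduced here. As a language-neutral sketch of the same idea - architectural objects carrying the attributes a code-checking rule needs - here is a toy egress check in Python, with invented fields and an invented threshold rather than actual NFPA 101 values:

```python
from dataclasses import dataclass

@dataclass
class Room:
    """Architectural object exposing attributes the rule base queries."""
    name: str
    area_m2: float
    exits: int
    sprinklered: bool

def check_egress(room: Room, max_area_per_exit: float = 75.0) -> list:
    """One illustrative rule: enough exits for the floor area served."""
    violations = []
    if room.exits < 1:
        violations.append(f"{room.name}: no egress route")
    elif room.area_m2 / room.exits > max_area_per_exit:
        violations.append(f"{room.name}: too few exits for its area")
    return violations

rooms = [Room("open office", 160.0, 2, True),
         Room("storage", 90.0, 1, False)]
for room in rooms:
    for violation in check_egress(room):
        print(violation)
```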

_id 427b
authors Ozel, Filiz
year 1993
title A Computerized Fire Safety Evaluation System for Business Occupancies
source CAAD Futures ‘93 [Conference Proceedings / ISBN 0-444-89922-7] (Pittsburgh / USA), 1993, pp. 241-251
summary The development of computer-based code compliance checking programs has been the focus of many studies. While some of these investigated the procedural aspects of building codes, others focused more on their rule base. On the other hand, due to the complexity of the codes, the process of identifying which sections apply to a given problem, and in which order to access them requires a meta-knowledge structuring system. National Fire Protection Association (NFPA) 101M, Alternative Approaches to Life Safety (1992) provides a framework through which code sections can be systematically accessed by means of a set of checklists. The study presented here primarily focuses on the development of a computer based fire safety code checking system called ARCHCode/Business for business occupancies following the guidelines and the methodology described in Chapter 7 of NFPA 101M.
keywords Fire Safety Expert System, Business Occupancies, CAD Interface
series CAAD Futures
last changed 1999/04/07 12:03
