CumInCAD is a cumulative index of publications in Computer Aided Architectural Design, supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD Futures.

Hits 1 to 20 of 247

_id cef3
authors Bridges, Alan H.
year 1992
title Computing and Problem Based Learning at Delft University of Technology Faculty of Architecture
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 289-294
doi https://doi.org/10.52842/conf.ecaade.1992.289
summary Delft University of Technology, founded in 1842, is the oldest and largest technical university in the Netherlands. It provides education for more than 13,000 students in fifteen main subject areas. The Faculty of Architecture, Housing, Urban Design and Planning is one of the largest faculties of the DUT with some 2000 students and over 500 staff members. The course of study takes four academic years: a first year (Propaedeuse) and a further three years (Doctoraal) leading to the "ingenieur" qualification. The basic course material is delivered in the first two years and is taken by all students. The third and fourth years consist of a smaller number of compulsory subjects in each of the department's specialist areas together with a wide range of option choices. The five main subject areas the students may choose from for their specialisation are Architecture, Building and Project Management, Building Technology, Urban Design and Planning, and Housing.

The curriculum of the Faculty has been radically revised over the last two years and is now based on the concept of "Problem-Based Learning". The subject matter taught is divided thematically into specific issues that are taught in six-week blocks. The vehicles for these blocks are specially selected and adapted case studies prepared by teams of staff members. These provide a focus for integrating specialist subjects around a studio-based design theme. In the second year this studio is largely computer-based: many drawings are produced by computer and several specially written computer applications are used in association with the specialist inputs.

This paper describes the "block structure" used in second year, giving examples of the special computer programs used, but also raises a number of broader educational issues. Introduction of the block system arose as a method of curriculum integration in response to difficulties emerging from the independent functioning of strong discipline areas in the traditional work groups. The need for a greater level of self-directed learning was recognised, as opposed to the "passive information model" of student learning in which students are seen as empty vessels to be filled with knowledge - which they are then usually unable to apply in design-related contexts in the studio. Furthermore, the value of electives had been questioned: whilst enabling some diversity of choice, they may also be seen as diverting attention and resources from the real problems of teaching architecture.

series eCAADe
email
last changed 2022/06/07 07:54

_id 0ac0
authors Coyne, Richard and Newton, Sidney
year 1992
title Metaphors, Computers and Architectural Education
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 307-318
doi https://doi.org/10.52842/conf.ecaade.1992.307
summary In this paper we present the case for employing metaphor to explain the impact of technology. This contrasts with the empirical-theoretical method of inquiry. We also contrast two widely held metaphors of architectural education (the EPISTEMOLOGICAL and the COMMUNITY metaphors) and of the role of the computer (the MAINFRAME and the UBIQUITOUS COMPUTING metaphors). We show how in each case both metaphors result in different kinds of decision making in relation to resourcing an architecture school.
series eCAADe
email
last changed 2022/06/07 07:56

_id 9f8a
authors Davidow, William H.
year 1992
title The Virtual Corporation: Structuring and Revitalizing the Corporation for the 21st Century
source New York: Harper Collins Publishers
summary The great value of this timely, important book is that it provides an integrated picture of the customer-driven company of the future. We have begun to learn about lean production technology, stripped-down management, worker empowerment, flexible customized manufacturing, and other modern strategies, but Davidow and Malone show for the first time how these ideas are fitting together to create a new kind of corporation and a worldwide business revolution. Their research is fascinating. The authors provide illuminating case studies of American, Japanese, and European companies that have discovered the keys to improved competitiveness, redesigned their businesses and their business relationships, and made extraordinary gains. They also write bluntly and critically about a number of American corporations that are losing market share by clinging to outmoded thinking. Business success in the global marketplace of the future is going to depend upon corporations producing "virtual" products high in added value, rich in variety, and available instantly in response to customer needs. At the heart of this revolution will be fast new information technologies; increased emphasis on quality; accelerated product development; changing management practices, including new alignments between management and labor; and new linkages between company, supplier, and consumer, and between industry and government. The Virtual Corporation is an important cutting-edge book that offers a creative synthesis of the most influential ideas in modern business theory. It has already fired excitement and debate in industry, academia, and government, and it is essential reading for anyone involved in the leadership of America's business and the shaping of America's economic future.
series other
last changed 2003/04/23 15:14

_id 7ce5
authors Gal, Shahaf
year 1992
title Computers and Design Activities: Their Mediating Role in Engineering Education
source Sociomedia, ed. Edward Barrett. MIT Press
summary Sociomedia: With all the new words used to describe electronic communication (multimedia, hypertext, cyberspace, etc.), do we need another one? Edward Barrett thinks we do; hence, he coins the term "sociomedia." It is meant to displace a computing economy in which technicity is hypostasized over sociality. Sociomedia, a compilation of twenty-five articles on the theory, design and practice of educational multimedia and hypermedia, attempts to re-value the communicational face of computing. Value, of course, is "ultimately a social construct." As such, it has everything to do with knowledge, power, education and technology. The projects discussed in this book represent the leading edge of electronic knowledge production in academia (not to mention major funding) and are determining the future of educational media. For these reasons, Sociomedia warrants close inspection. Barrett's introduction sets the tone. For him, designing computer media involves hardwiring a mechanism for the social construction of knowledge (1). He links computing to a process of social and communicative interactivity for constructing and disseminating knowledge. Through a mechanistic mapping of the university as hypercontext (a huge network that includes classrooms as well as services and offices), Barrett models intellectual work in such a way as to avoid "limiting definitions of human nature or human development." Education, then, can remain "where it should be--in the human domain (public and private) of sharing ideas and information through the medium of language." By leaving education in a virtual realm (where we can continue to disagree about its meaning and execution), it remains viral, mutating and contaminating in an intellectually healthy way. He concludes that his mechanistic model, by means of its reductionist approach, preserves value (7). This "value" is the social construction of knowledge. While I support the social orientation of Barrett's argument, discussions of value are related to power. I am not referring to the traditional teacher-student power structure that is supposedly dismantled through cooperative and constructivist learning strategies. The power to be reckoned with in the educational arena is foundational, that which (pre)determines value and the circulation of knowledge. "Since each of you reading this paragraph has a different perspective on the meaning of 'education' or 'learning,' and on the processes involved in 'getting an education,' think of the hybris in trying to capture education in a programmable function, in a displayable object, in a 'teaching machine'" (7). Actually, we must think about that hybris because it is, precisely, what informs teaching machines. Moreover, the basic epistemological premises that give rise to such productions are too often assumed. In the case of instructional design, the episteme of the cognitive sciences is often taken for granted. It is ironic that many of the "postmodernists" who support electronic hypertextuality seem to have missed Jacques Derrida's and Michel Foucault's "deconstructions" of the epistemology underpinning the cognitive sciences (if not of epistemology itself). Perhaps it is the glitz of the technology that blinds some users (qua developers) to the belief systems operating beneath the surface. Barrett is not guilty of reactionary thinking or politics; he is, in fact, quite in line with much American deconstructive and postmodern thinking.
The problem arises in that he leaves open the definitions of "education," "learning" and "getting an education." One cannot engage in the production of new knowledge without orienting its design, production and dissemination, and without negotiating with others' orientations, especially where large-scale funding is involved. Notions of human nature and development are structural, even infrastructural, whatever the medium of the teaching machine. Although he addresses some dynamics of power, money and politics when he talks about the recession and its effects on the conference, they are readily visible dynamics of power (3-4). Where does the critical factor of value determination, of power, of who gets what and why, get mapped onto a mechanistic model of learning institutions? Perhaps a mapping of contributors' institutions, of the funding sources for the projects showcased and for participation in the conference, and of the disciplines receiving funding for these sorts of projects would help visualize the configurations of power operative in the rising field of educational multimedia. Questions of power and money notwithstanding, Barrett's introduction sets the social and textual thematics for the collection of essays. His stress on interactivity, on communal knowledge production, on the society of texts, and on media producers and users is carried forward through the other essays, two of which I will discuss. Section I of the book, "Perspectives...," highlights the foundations, uses and possible consequences of multimedia and hypertextuality. The second essay in this section, "Is There a Class in This Text?," plays on the robust exchange surrounding Stanley Fish's book, Is There a Text in This Class?, which presents an attack on authority in reading. The author, John Slatin, has introduced electronic hypertextuality and interaction into his courses. His article maps the transformations in "the content and nature of work, and the workplace itself"-- which, in this case, is not industry but an English poetry class (25). Slatin discovered an increase of productive and cooperative learning in his electronically-mediated classroom. For him, creating knowledge in the electronic classroom involves interaction between students, instructors and course materials through the medium of interactive written discourse. These interactions lead to a new and persistent understanding of the course materials and of the participants' relation to the materials and to one another. The work of the course is to build relationships that, in my view, constitute not only the meaning of individual poems, but poetry itself. The class carries out its work in the continual and usually interactive production of text (31). While I applaud his strategies which dismantle traditional hierarchical structures in academia, the evidence does not convince me that the students know enough to ask important questions or to form a self-directing learning community. Stanley Fish has not relinquished professing, though he, too, espouses the indeterminacy of the sign. By the fourth week of his course, Slatin's input is, by his own reckoning, reduced to 4% (39). In the transcript of the "controversial" Week 6 exchange on Gertrude Stein--the most disliked poet they were discussing at the time (40)--we see the blind leading the blind. One student parodies Stein for three lines and sums up his input with "I like it." Another finds Stein's poetry "almost completey [sic] lacking in emotion or any artistic merit" (emphasis added).
On what grounds has this student become an arbiter of "artistic merit"? Another student, after admitting being "lost" during the Wallace Stevens discussion, talks of having more "respect for Stevens' work than Stein's" and adds that Stein's poetry lacks "conceptual significance[, s]omething which people of varied opinion can intelligently discuss without feeling like total dimwits...." This student has progressed from admitted incomprehension of Stevens' work to imposing her (groundless) respect for his work over Stein's. Then, she exposes her real dislike for Stein's poetry: that she (the student) missed the "conceptual significance" and hence cannot, being a person "of varied opinion," intelligently discuss it "without feeling like [a] total dimwit." Slatin's comment is frightening: "...by this point in the semester students have come to feel increasingly free to challenge the instructor" (41). The students that I have cited are neither thinking critically nor are their preconceptions challenged by student-governed interaction. Thanks to the class format, one student feels self-righteous in her ignorance, and empowered to censure. I believe strongly in student empowerment in the classroom, but only once students have accrued enough knowledge to make informed judgments. Admittedly, Slatin's essay presents only partial data (there are six hundred pages of course transcripts!); still, I wonder how much valuable knowledge and metaknowledge was gained by the students. I also question the extent to which authority and professorial dictature were addressed in this course format. The power structures that make it possible for a college to require such a course, and the choice of texts and pedagogy, were not "on the table." The traditional professorial position may have been displaced, but what took its place?--the authority of consensus with its unidentifiable strong arm, and the faceless reign of software design? Despite Slatin's claim that the students learned about the learning process, there is no evidence (in the article) that the students considered where their attitudes came from, how consensus operates in the construction of knowledge, how power is established and what relationship they have to bureaucratic institutions. How do we, as teaching professionals, negotiate a balance between an enlightened despotism in education and student-created knowledge? Slatin, and other authors in this book, bring this fundamental question to the fore. There is no definitive answer because the factors involved are ultimately social, and hence, always shifting and reconfiguring. Slatin ends his article with the caveat that computerization can bring about greater estrangement between students, faculty and administration through greater regimentation and control. Of course, it can also "distribute authority and power more widely" (50). Power or authority without a specific face, however, is not necessarily good or just. Shahaf Gal's "Computers and Design Activities: Their Mediating Role in Engineering Education" is found in the second half of the volume, and does not allow for a theory/praxis dichotomy. Gal recounts a brief history of engineering education up to the introduction of Growltiger (GT), a computer-assisted learning aid for design. He demonstrates GT's potential to impact the learning of engineering design by tracking its use by four students in a bridge-building contest.
What his text demonstrates clearly is that computers are "inscribing and imaging devices" that add another viewpoint to an ongoing dialogue between student, teacher, earlier coursework, and other teaching/learning tools. The less proficient students made a serious error by relying too heavily on the technology, or treating it as a "blueprint provider." They "interacted with GT in a way that trusted the data to represent reality. They did not see their interaction with GT as a negotiation between two knowledge systems" (495). Students who were more thoroughly informed in engineering discourses knew to use the technology as one voice among others--they knew enough not simply to accept the input of the computer as authoritative. The less-advanced students learned a valuable lesson from the competition itself: the fact that their designs were not able to hold up under pressure (literally) brought the fact of their insufficient knowledge crashing down on them (and their bridges). They also had, post factum, several other designs to study, especially the winning one. Although competition and comparison are not good pedagogical strategies for everyone (in this case the competitors had volunteered), at some point what we think we know has to be challenged within the society of discourses to which it belongs. Students need critique in order to learn to push their learning into auto-critique. This is what is lacking in Slatin's discussion and in the writings of other avatars of constructivist, collaborative and computer-mediated pedagogies. Obviously there are differences between instrumental types of knowledge acquisition and discursive knowledge accumulation. Indeed, I do not promote the teaching of reading, thinking and writing as "skills" per se (then again, Gal's teaching of design is quite discursive, if not dialogic). Nevertheless, the "soft" sciences might benefit from "bridge-building" competitions or the re-institution of some forms of agonia. Not everything agonistic is inhuman agony--the joy of confronting or creating a sound argument supported by defensible evidence, for example. Students need to know that soundbites are not sound arguments despite predictions that electronic writing will be aphoristic rather than periodic. Just because writing and learning can be conceived of hypertextually does not mean that rigor goes the way of the dinosaur. Rigor and hypertextuality are not mutually incompatible. Nor are rigorous thinking and hard intellectual work unpleasurable, although American anti-intellectualism, especially in the mass media, would make them so. At a time when the spurious dogmatics of a Rush Limbaugh and Holocaust revisionist historians circulate "aphoristically" in cyberspace, and at a time when knowledge is becoming increasingly textualized, the role of critical thinking in education will ultimately determine the value(s) of socially constructed knowledge. This volume affords the reader an opportunity to reconsider knowledge, power, and new communications technologies with respect to social dynamics and power relationships.
series other
last changed 2003/04/23 15:14

_id a3f5
authors Zandi-Nia, Abolfazl
year 1992
title Topgene: An artificial Intelligence Approach to a Design Process
source Delft University of Technology
summary This work deals with two architectural design (AD) problems at the topological level, in the presence of the social norms of community, privacy, circulation-cost, and intervening opportunity. The first problem concerns generating a design with respect to the set of above-mentioned norms; the second requires evaluation of existing designs with respect to the same set of norms. Both problems are based on the structural-behavioral relationship in buildings. This work addresses the above problems in the following senses: (1) A working system, called TOPGENE (The TOpological Pattern GENErator), has been developed. (2) Both problems may be vague and may lack enough information in their statement. For example, an AD in the presence of the social norms requires the degrees of interaction between the location pairs in the building. This information is not always explicitly available, and must be explicated from the design data. (3) An AD problem at the topological level is intractable, with no fast and efficient algorithm for its solution. To reduce the search effort in the process of design generation, TOPGENE uses a heuristic hill-climbing strategy that takes advantage of domain-specific rules of thumb to choose a path in the search space of a design. (4) TOPGENE uses the Q-analysis method for explication of hidden information, and hierarchical clustering of location pairs with respect to their flow-generation potential, as prerequisite information for the heuristic reasoning process. (5) To deal with the design of a building at the topological level, TOPGENE takes advantage of existing graph algorithms, such as path-finding and planarity testing, during its reasoning process. This work also presents a new efficient algorithm for keeping track of distances in a growing graph. (6) This work also presents a neural-net implementation of a special case of the design generation problem, based on the Hopfield model of neural networks. The result of this approach has been used to test TOPGENE's approach to generating designs. A comparison of the two approaches shows that the neural network provides mathematically more optimal designs, while TOPGENE produces more realistic designs. The two systems may be integrated to create a hybrid system.
series thesis:PhD
last changed 2003/02/12 22:37
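
The thesis above describes TOPGENE's heuristic hill climbing over room-adjacency topologies. TOPGENE itself is not reproduced here; the following is a minimal Python sketch of that general technique only. The rooms, interaction weights and scoring function are invented for illustration, and the planarity and path-finding checks the abstract mentions are omitted.

    # Hypothetical sketch of heuristic hill climbing over room-adjacency
    # topologies, in the spirit of the TOPGENE abstract above. Rooms,
    # interaction weights and scoring are assumptions, not the real system.

    ROOMS = ["entry", "living", "kitchen", "bedroom", "bath"]
    INTERACTION = {  # desired interaction strength between location pairs
        ("entry", "living"): 3, ("living", "kitchen"): 2,
        ("bedroom", "bath"): 3, ("living", "bedroom"): 1,
    }

    def score(adjacency):
        # Reward satisfied interactions, penalize superfluous adjacencies.
        return sum(INTERACTION.get(p, 0) for p in adjacency) - 0.5 * len(adjacency)

    def neighbors(adjacency):
        # Designs reachable by toggling a single adjacency on or off.
        pairs = [(a, b) for i, a in enumerate(ROOMS) for b in ROOMS[i + 1:]]
        for pair in pairs:
            step = set(adjacency)
            if pair in step:
                step.remove(pair)
            else:
                step.add(pair)
            yield frozenset(step)

    def hill_climb(start, steps=100):
        current = frozenset(start)
        for _ in range(steps):
            best = max(neighbors(current), key=score)
            if score(best) <= score(current):
                break  # local optimum reached
            current = best
        return current

    layout = hill_climb({("entry", "living")})
    print(sorted(layout), score(layout))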

_id 4704
authors Amirante, I., Rinaldi, S. and Muzzillo, F.
year 1992
title A Tutorial Experiment Concerning Dampness Diagnosis Supported by an Expert System
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 159-172
doi https://doi.org/10.52842/conf.ecaade.1992.159
summary (A) The teaching of Technology of Building Rehabilitation in Italian Universities - (B) Experimental course of technological rehabilitation with computer tools - (C) Synthesis of technological approach - (D) Dampness diagnostic process using the Expert System - (E) Primary consideration on tutorial experience - (F) Bibliography
series eCAADe
last changed 2022/06/07 07:54
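
The entry above lists only section headings, but its core is a rule-based dampness diagnosis. As a purely illustrative sketch (the actual expert system's rules are not published here; the symptoms and causes below are invented), a minimal forward-matching rule base in Python might look like this:

    # Toy rule base for dampness diagnosis, in the spirit of the expert
    # system described above. Symptoms, causes and rules are assumptions.

    RULES = [
        ({"stains_near_ground", "salt_deposits"}, "rising_damp"),
        ({"stains_below_roof", "worse_after_rain"}, "penetrating_damp"),
        ({"stains_on_cold_walls", "poor_ventilation"}, "condensation"),
    ]

    def diagnose(observed):
        # Return every cause whose symptom set is fully observed.
        return [cause for symptoms, cause in RULES if symptoms <= observed]

    print(diagnose({"stains_near_ground", "salt_deposits", "poor_ventilation"}))
    # -> ['rising_damp']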

_id 10b7
authors Aukstakalnis, Steve and Blatner, David
year 1992
title Silicon Mirage: The Art and Science of Virtual Reality
source Peachpit Press
summary An introduction to virtual reality covering every aspect of the revolutionary new technology and its many possible applications, from computer games to air traffic control.
series other
last changed 2003/04/23 15:14

_id ecaadesigradi2019_449
id ecaadesigradi2019_449
authors Becerra Santacruz, Axel
year 2019
title The Architecture of ScarCity Game - The craft and the digital as an alternative design process
source Sousa, JP, Xavier, JP and Castro Henriques, G (eds.), Architecture in the Age of the 4th Industrial Revolution - Proceedings of the 37th eCAADe and 23rd SIGraDi Conference - Volume 3, University of Porto, Porto, Portugal, 11-13 September 2019, pp. 45-52
doi https://doi.org/10.52842/conf.ecaade.2019.3.045
summary The Architecture of ScarCity Game is a board game used as a pedagogical tool that challenges architecture students by involving them in a series of experimental design sessions to understand the design process under scarcity and the actual relation between the craft and the digital. This means "pragmatic delivery processes and material constraints, where the exchange between the artisan of handmade, representing local skills and technology of the digitally conceived is explored" (Huang 2013). The game focuses on understanding the different variables of the crafted design process of traditional communities under conditions of scarcity (Michel and Bevan 1992). This requires first analyzing the spatial environmental model of interaction, the available human and natural resources, and the dynamic relationship of these variables in a digital era. In the first stage (Pre-Agency), the game sets the concept of the craft by limiting students' design exploration to a minimum possible perspective, developing locally available resources and techniques. The key elements of the design process of traditional knowledge communities have to be identified (Preez 1984). In other words, this stage is driven by limited resources + chance + contingency. In the second stage (Post-Agency), students, taking the architect's role within these communities, have to speculate on and explore the interface between the craft (local knowledge and low-technology tools) and the digital, represented by computational data, newly available technologies and construction. This means the introduction of strategy + opportunity + chance as part of the design process. In this sense, the game has a life beyond its mechanics. This other life challenges the participants to exploit the possibilities of breaking the actual boundaries of design. The result is a tool to challenge conventional methods of teaching and learning that control a prescribed design process. It confronts the rules that professionals in this field take for granted. The game simulates a 'fake' reality by exploring surveyed information in different ways. As a result, participants do not have anything 'real' to lose. Instead, they have all the freedom to innovate and be creative.
keywords Global south, scarcity, low tech, digital-craft, design process and innovation by challenge.
series eCAADeSIGraDi
email
last changed 2022/06/07 07:54

_id 2cb4
authors Bille, Pia
year 1992
title CAD at the AAA
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 279-288
doi https://doi.org/10.52842/conf.ecaade.1992.279
summary Teaching computer science at the Aarhus School of Architecture goes back to the beginning of the 80s, when a few teachers and students were curious about the new medium, seeing its great development potential and its possible use in the design of architecture. The curiosity and excitement about technology continued, although the results were modest and the usefulness not a dominant aspect in this early period. In the middle of the 80s, state funding gave the School of Architecture the opportunity to buy its first 10 IBM PCs to run AutoCAD among other programmes. Besides this, a bigger CAD system, Gable 4D Series, was introduced, running on MicroVAX workstations. The software was dedicated to drafting buildings in 2 and 3 dimensions - an important task within the profession of architects.

series eCAADe
email
last changed 2022/06/07 07:52

_id 6bff
authors Coyne, Richard
year 1992
title The Role of Metaphor in Understanding Computers in Design
source Mission - Method - Madness [ACADIA Conference Proceedings / ISBN 1-880250-01-2] 1992, pp. 3-11
doi https://doi.org/10.52842/conf.acadia.1992.003
summary The study of metaphor provides valuable insights into the workings of thought and understanding. This chapter addresses the important question of what the study of metaphor has to say about technology, the design process and hence the role of computers in design. The conclusion is that design involves the generation of action within a collaborative environment in which there is the free play of metaphor. A recognition of the close relationship between technology and metaphor provides insights into how to evaluate and develop the effective use of computers in design.

series ACADIA
email
last changed 2022/06/07 07:56

_id sigradi2017_068
id sigradi2017_068
authors da Motta Gaspar, João Alberto; Regina Coeli Ruschel
year 2017
title A evolução do significado atribuído ao acrônimo BIM: Uma perspectiva no tempo [The evolution of the meaning ascribed to the acronym BIM: A perspective in time]
source SIGraDi 2017 [Proceedings of the 21th Conference of the Iberoamerican Society of Digital Graphics - ISBN: 978-956-227-439-5] Chile, Concepción 22 - 24 November 2017, pp.461-469
summary The term Building Information Model emerged in 1992. Its meaning has evolved over time and is currently associated with an object-oriented modeling technology and an associated set of processes to produce, communicate and analyze building models. Its origin is related to several other, older terms. This paper registers the evolution of BIM and related definitions over time by means of a systematic literature review. We present a list of BIM-related terms and their meanings, organized by date of emergence, and charts showing which ones are most used over time, contributing to a better understanding of the meaning of BIM.
keywords BIM; History of BIM; Building Information Model.
series SIGRADI
email
last changed 2021/03/28 19:58

_id fbd7
authors Datta, S.
year 1992
title Geometric delineation in Indian temple architecture: A study of the temple of Ranakdevi at Wadhwan
source Centre for Environmental Planning and Technology, School of Architecture, Ahmedabad, India
series thesis:MSc
email
last changed 2004/06/02 19:12

_id cf73
authors Dosti, P., Martens, B. and Voigt, A.
year 1992
title Spatial Simulation In Architecture, City Development and Regional Planning
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 195-200
doi https://doi.org/10.52842/conf.ecaade.1992.195
summary The appropriate use of spatial simulation techniques tends to increase considerably the depth of evidence and the realism of the designs and plans described, and moreover may encourage experimentation, trial attempts and planning variants. This also means more frequent use of combinations of different techniques, bearing in mind that they are not equivalent, but making use of the respective advantages each offers. Until now the main attention of the EDP-Lab was directed at achieving quantity; for the time to come it will be the formation of quality. The challenge in the educational system at the Vienna University of Technology is to obtain appropriate results in the framework of low-cost simulation. This aspect also seems meaningful in order to support the final implementation in architectural practice.

series eCAADe
email
more http://info.tuwien.ac.at/ecaade/
last changed 2022/06/07 07:55

_id 6ea4
authors Eastman, C.M.
year 1992
title A Data Model Analysis of Modularity and Extensibility in Building Databases
source Building and Environment, Vol 27, No: 2, pp. 135-148
summary This paper uses data modeling techniques to define how database schemas for an intelligent integrated architectural CAD system can be made extensible. It reviews the product data modeling language EDM, then applies it to define a part of an architectural data model. Extensions are then investigated, regarding how users could integrate various design-specific packages into a uniquely configured system. Both extension by substituting one technology for another and extension by adding a new evaluation application are considered. Data modeling allows specification of a CAD database and identification of the kind of modularization that will work and what problems may arise.
series journal paper
email
last changed 2003/04/23 15:14
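
To make the kind of extensibility the paper analyses concrete, here is a minimal Python sketch (not the EDM language itself; the entity, attributes and evaluation names are assumptions): a building entity whose evaluation applications can be substituted or added at configuration time without changing the core schema, matching the two extension cases the abstract considers.

    # Sketch of an extensible building entity: evaluations are registered
    # by name, so one can be swapped for another (substitution) or a new
    # one added, without altering the core attributes. Illustrative only.

    from dataclasses import dataclass, field
    from typing import Callable, Dict

    @dataclass
    class Wall:
        length_m: float
        height_m: float
        u_value: float  # thermal transmittance, W/(m^2 K)
        evaluations: Dict[str, Callable[["Wall"], float]] = field(default_factory=dict)

        def evaluate(self, name):
            return self.evaluations[name](self)

    wall = Wall(length_m=5.0, height_m=2.7, u_value=0.3)
    # Substitute or add an evaluation technology at configuration time:
    wall.evaluations["heat_loss_per_K"] = lambda w: w.u_value * w.length_m * w.height_m
    print(wall.evaluate("heat_loss_per_K"))  # -> 4.05 (W per kelvin)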

_id 32eb
authors Henry, Daniel
year 1992
title Spatial Perception in Virtual Environments : Evaluating an Architectural Application
source University of Washington
summary Over the last several years, professionals from many different fields have come to the Human Interface Technology Laboratory (H.I.T.L.) to discover and learn about virtual environments. In general, they are impressed by their experiences and express the tremendous potential the tool has in their respective fields. But the potentials are always projected far in the future, and the tool remains just a concept. This is justifiable because the quality of the visual experience is so much less than what people are used to seeing: high-definition television, breathtaking cinematographic special effects and photorealistic computer renderings. Instead, the models in virtual environments are very simple looking; they are made of small spaces, filled with simple or abstract-looking objects with little color distinction, as seen through displays of noticeably low resolution and at an update rate which leaves much to be desired. Clearly, for most applications, the requirements of precision have not yet been met by virtual interfaces as they exist today. However, there are a few domains where the relatively low level of the technology could be perfectly appropriate. In general, these are applications which require that the information be presented in symbolic or representational form. Having studied architecture, I knew that there are moments during the early part of the design process when conceptual decisions are made which require precisely the simple and representative nature available in existing virtual environments. This was a marvelous discovery for me because I had found a viable use for virtual environments which could be immediately beneficial to architecture, my shared area of interest. It would be further beneficial to architecture in that the virtual interface equipment I would be evaluating at the H.I.T.L. happens to be relatively less expensive and more practical than other configurations such as the "Walkthrough" at the University of North Carolina. The set-up at the H.I.T.L. could be easily introduced into architectural firms because it takes up very little physical room (150 square feet) and does not require expensive and space-taking hardware devices (such as the treadmill device for simulating walking). Now that the potential for using virtual environments in this architectural application is clear, it becomes important to verify that this tool succeeds in accurately representing space as intended. The purpose of this study is to verify that the perception of spaces is the same in both simulated and real environments. It is hoped that the findings of this study will guide and accelerate the process by which the technology makes its way into the field of architecture.
keywords Space Perception; Space (Architecture); Computer Simulation
series thesis:MSc
last changed 2003/02/12 22:37

_id 130d
authors Hoinkes, R. and Mitchell, R.
year 1994
title Playing with Time - Continuous Temporal Mapping Strategies for Interactive Environments
source 6th Canadian GIS Conference (Ottawa: Natural Resources Canada), pp. 318-329
summary The growing acceptance of GIS technology has had far-reaching effects on many fields of research. The recent developments in the area of dynamic and temporal GIS open new possibilities within the realm of historical research where temporal relationship analysis is as important as spatial relationship analysis. While topological structures have had wide use in spatial GIS and have been the subject of most temporal GIS endeavours, the different demands of many of these temporally-oriented analytic processes question the choice of the topological direction. In the fall of 1992 the Montreal Research Group (MRG) of the Canadian Centre for Architecture mounted an exhibition dealing with the development of the built environment in 18th-century Montreal. To aid in presenting the interpretive messages of their data, the MRG worked with the Centre for Landscape Research (CLR) to incorporate the interactive capabilities of the CLR's PolyTRIM research software with the MRG's database to produce a research tool as well as a public-access interactive display. The interactive capabilities stemming from a real-time object-oriented structure provided an excellent environment for both researchers and the public to investigate the nature of temporal changes in such aspects as land use, ethnicity, and fortifications of the 18th-century city. This paper describes the need for interactive real-time GIS in such temporal analysis projects and the underlying need for object-oriented vs. topologically structured data access strategies to support them.
series other
last changed 2003/04/23 15:14
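
A rough sketch of the object-oriented temporal access strategy the paper argues for: each attribute of a map object carries a time-stamped history, so its state at any year can be queried directly rather than rebuilt from topological snapshots. The class name and the sample land-use history below are illustrative assumptions, not the PolyTRIM implementation.

    # Time-stamped attribute history with direct "value at year" queries.
    import bisect

    class TemporalAttribute:
        def __init__(self):
            self.years, self.values = [], []

        def set(self, year, value):
            i = bisect.bisect_left(self.years, year)
            self.years.insert(i, year)
            self.values.insert(i, value)

        def at(self, year):
            # Value in force at the given year (most recent earlier entry).
            i = bisect.bisect_right(self.years, year) - 1
            return self.values[i] if i >= 0 else None

    landuse = TemporalAttribute()
    landuse.set(1700, "garden")
    landuse.set(1730, "residential")
    landuse.set(1760, "fortification")
    print(landuse.at(1745))  # -> 'residential'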

_id 4b2a
id 4b2a
authors Jabi, Wassim
year 2004
title A FRAMEWORK FOR COMPUTER-SUPPORTED COLLABORATION IN ARCHITECTURAL DESIGN
source University of Michigan
summary The development of appropriate research frameworks and guidelines for the construction of software aids in the area of architectural design can lead to a better understanding of designing and computer support for designing (Gero and Maher 1997). The field of research and development in computer-supported collaborative architectural design reflects that of the early period in the development of the field of computer-supported cooperative work (CSCW). In the early 1990s, the field of CSCW relied on unsystematic attempts to generate software that increases the productivity of people working together (Robinson 1992). Furthermore, a shift is taking place by which researchers in the field of architecture are increasingly becoming consumers of rather than innovators of technology (Gero and Maher). In particular, the field of architecture is rapidly becoming dependent on commercial software implementations that are slow to respond to new research or to user demands. Additionally, these commercial systems force a particular view of the domain they serve and as such might hinder rather than help its development. The aim of this dissertation is to provide information to architects and others to help them build their own tools or, at a minimum, be critical of commercial solutions.
series thesis:PhD
type normal paper
email
last changed 2004/10/24 22:35

_id c5d7
authors Kuffer, Monika
year 2003
title Monitoring the Dynamics of Informal Settlements in Dar Es Salaam by Remote Sensing: Exploring the Use of Spot, Ers and Small Format Aerial Photography
source CORP 2003, Vienna University of Technology, 25.2.-28.2.2003 [Proceedings on CD-Rom]
summary Dar es Salaam is exemplary of cities in the developing world facing enormous population growth. In the last decades, unplanned settlements have expanded tremendously, with the result that around 70 percent of the urban dwellers now live in these areas. Tools for monitoring such tremendous growth are relatively weak in developing countries, thus an effective satellite-based monitoring system can provide a useful instrument for monitoring the dynamics of urban development. An investigation to assess the ability to extract reliable information on the expansion and consolidation levels (density) of urban development of the city of Dar es Salaam from SPOT-HRV and ERS-SAR images is described. The use of SPOT and ERS should provide data that is complementary to data derived from the most recent aerial photography and from digital topographic maps. In a series of experiments, various classification and fusion techniques are applied to the SPOT-HRV and ERS-SAR data to extract information on building density that is comparable to that obtained from the 1992 data. Ultimately, building density is estimated by linear and non-linear regression models on the basis of a one-ha kernel, and further aggregation is made to the level of informal settlements for a final analysis. In order to assess the reliability, use is made of several sample areas that are relatively stable over the study period, as well as of data derived from small-format aerial photography. The experiments show a high correlation between the density data derived from the satellite images and the test areas.
series other
email
last changed 2003/03/11 20:39
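
The final estimation step the abstract describes is a regression from per-kernel image features to building density. A minimal sketch of ordinary least squares in that role follows; the feature columns and reference densities are fabricated stand-ins, not the paper's actual SPOT/ERS-derived variables.

    # Linear regression of building density on per-kernel image features.
    import numpy as np

    # Rows: 1-ha kernels; columns: e.g. built-up fraction, SAR texture.
    X = np.array([[0.10, 0.3], [0.35, 0.5], [0.60, 0.7], [0.80, 0.9]])
    y = np.array([8.0, 30.0, 55.0, 78.0])  # reference buildings per hectare

    # Ordinary least squares with an intercept term.
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    def predict(features):
        return float(coef[0] + coef[1:] @ np.asarray(features))

    print(predict([0.5, 0.6]))  # estimated density for an unseen kernel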

_id c926
authors Laerdal, Arnbjørn O.
year 1992
title Architecture on Cards
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 75-84
doi https://doi.org/10.52842/conf.ecaade.1992.075
summary The ArchiCards project (tentative name) is a prototype of a learning tool for architectural theory and history. It applies this novel technology to give a completely new approach to the acquisition of this kind of information. The aim is to give the user a summary, along with an understanding of some of the relations in the world of architecture. A further aim has been to unveil some of the possibilities this technology offers in the teaching of architecture.
series eCAADe
last changed 2022/06/07 07:52

_id ddss9208
id ddss9208
authors Lucardie, G.L.
year 1993
title A functional approach to realizing decision support systems in technical regulation management for design and construction
source Timmermans, Harry (Ed.), Design and Decision Support Systems in Architecture (Proceedings of a conference held in Mierlo, the Netherlands in July 1992), ISBN 0-7923-2444-7
summary Technical building standards defining the quality of buildings, building products, building materials and building processes aim to provide acceptable levels of safety, health, usefulness and energy consumption. However, the logical consistency between these goals and the set of regulations produced to achieve them is often hard to identify. Not only the large quantities of highly complex and frequently changing building regulations to be met, but also the variety of user demands and the steadily increasing technical information on (new) materials, products and buildings have produced a very complex set of knowledge and data that should be taken into account when handling technical building regulations. Integrating knowledge technology and database technology is an important step towards managing the complexity of technical regulations. Generally, two strategies can be followed to integrate knowledge and database technology. The main emphasis of the first strategy is on transferring data structures and processing techniques from one field of research to another. The second approach is concerned exclusively with the semantic structure of what is contained in the data-based or knowledge-based system. The aim of this paper is to show that the second or knowledge-level approach, in particular the theory of functional classifications, is more fundamental and more fruitful. It permits a goal-directed rationalized strategy towards analysis, use and application of regulations. Therefore, it enables the reconstruction of (deep) models of regulations, objects and of users accounting for the flexibility and dynamics that are responsible for the complexity of technical regulations. Finally, at the systems level, the theory supports an effective development of a new class of rational Decision Support Systems (DSS), which should reduce the complexity of technical regulations and restore the logical consistency between the goals of technical regulations and the technical regulations themselves.
series DDSS
last changed 2003/08/07 16:36
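
As a toy illustration of the knowledge-level, functional-classification idea described above (the regulations, predicates and goals here are invented, not drawn from any actual building code), regulations can be indexed by the functional goal they serve and selected by applicability:

    # Regulations represented by functional goal + applicability test,
    # so rules are selected by the goal they serve rather than by text.
    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class Regulation:
        goal: str                       # functional goal, e.g. "safety"
        applies: Callable[[dict], bool]
        requirement: str

    REGULATIONS = [
        Regulation("safety", lambda b: b["storeys"] > 2,
                   "stair enclosure must be fire-rated"),
        Regulation("health", lambda b: b["habitable"],
                   "habitable rooms need minimum daylight area"),
        Regulation("energy", lambda b: b["heated"],
                   "envelope must meet maximum U-value"),
    ]

    def requirements_for(building, goal=None):
        # All requirements that apply, optionally filtered by goal.
        return [r.requirement for r in REGULATIONS
                if r.applies(building) and (goal is None or r.goal == goal)]

    building = {"storeys": 4, "habitable": True, "heated": True}
    print(requirements_for(building, goal="safety"))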
