CumInCAD is a cumulative index of publications in Computer Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD, and CAAD Futures


Hits 1 to 20 of 67

_id 7e68
authors Holland, J.
year 1992
title Genetic Algorithms
source Scientific American, July 1992
summary Living organisms are consummate problem solvers. They exhibit a versatility that puts the best computer programs to shame. This observation is especially galling for computer scientists, who may spend months or years of intellectual effort on an algorithm, whereas organisms come by their abilities through the apparently undirected mechanism of evolution and natural selection. Pragmatic researchers see evolution's remarkable power as something to be emulated rather than envied. Natural selection eliminates one of the greatest hurdles in software design: specifying in advance all the features of a problem and the actions a program should take to deal with them. By harnessing the mechanisms of evolution, researchers may be able to "breed" programs that solve problems even when no person can fully understand their structure. Indeed, these so-called genetic algorithms have already demonstrated the ability to make breakthroughs in the design of such complex systems as jet engines. Genetic algorithms make it possible to explore a far greater range of potential solutions to a problem than do conventional programs. Furthermore, as researchers probe the natural selection of programs under controlled and well-understood conditions, the practical results they achieve may yield some insight into the details of how life and intelligence evolve in the natural world.
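The mechanism the summary describes (selection, crossover, and mutation over a population of candidate solutions) can be illustrated with a minimal sketch; the bit-string genome and toy "one-max" fitness function below are illustrative assumptions, not taken from Holland's paper.

```python
import random

def fitness(genome):
    # Toy "one-max" objective: count of 1-bits; stands in for any design score.
    return sum(genome)

def evolve(pop_size=30, genome_len=20, generations=50, mutation_rate=0.01):
    pop = [[random.randint(0, 1) for _ in range(genome_len)] for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: fitter individuals are more likely to become parents.
        weights = [fitness(g) + 1 for g in pop]
        parents = random.choices(pop, weights=weights, k=pop_size)
        pop = []
        for a, b in zip(parents[::2], parents[1::2]):
            cut = random.randrange(1, genome_len)          # single-point crossover
            for child in (a[:cut] + b[cut:], b[:cut] + a[cut:]):
                pop.append([1 - bit if random.random() < mutation_rate else bit
                            for bit in child])             # point mutation
    return max(pop, key=fitness)

print(fitness(evolve()))  # approaches genome_len as solutions are "bred"
```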
series journal paper
last changed 2003/04/23 15:50

_id cf5c
authors Carpenter, B.
year 1992
title The logic of typed feature structures with applications to unification grammars, logic programs and constraint resolution
source Cambridge Tracts in Theoretical Computer Science, Cambridge University Press
summary This book develops the theory of typed feature structures, a new form of data structure that generalizes both the first-order terms of logic programs and the feature structures of unification-based grammars to include inheritance, typing, inequality, cycles and intensionality. It presents a synthesis of many existing ideas into a uniform framework, which serves as a logical foundation for grammars, logic programming and constraint-based reasoning systems. Throughout the text, a logical perspective is adopted that employs an attribute-value description language along with complete equational axiomatizations of the various systems of feature structures. Efficiency concerns are discussed and complexity and representability results are provided. The application of feature structures to phrase structure grammars is described and completeness results are shown for standard evaluation strategies. Definite clause logic programs are treated as a special case of phrase structure grammars. Constraint systems are introduced and an enumeration technique is given for solving arbitrary attribute-value logic constraints. This book, with its innovative approach to data structures, will be essential reading for researchers in computational linguistics, logic programming and knowledge representation. Its self-contained presentation makes it flexible enough to serve as both a research tool and a textbook.
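A minimal sketch of the unification operation at the core of this framework, using plain nested dicts; Carpenter's typed feature structures add inheritance, typing, inequality and cycles on top of this untyped case, which the sketch omits.

```python
def unify(f, g):
    """Unify two untyped feature structures (nested dicts, atoms at leaves).
    Returns the merged structure, or None if the two structures clash."""
    if isinstance(f, dict) and isinstance(g, dict):
        merged = dict(f)
        for feature, value in g.items():
            if feature in merged:
                sub = unify(merged[feature], value)
                if sub is None:
                    return None            # incompatible values for a feature
                merged[feature] = sub
            else:
                merged[feature] = value
        return merged
    return f if f == g else None           # atoms unify only if equal

subject = {"cat": "NP", "agr": {"num": "sg"}}
verb = {"agr": {"num": "sg", "per": 3}}
print(unify(subject, verb))  # {'cat': 'NP', 'agr': {'num': 'sg', 'per': 3}}
```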
series other
last changed 2003/04/23 15:14

_id 4857
authors Escola Tecnica Superior D'arquitectura de Barcelona (Ed.)
year 1992
title CAAD Instruction: The New Teaching of an Architect?
source eCAADe Conference Proceedings / Barcelona (Spain) 12-14 November 1992, 551 p.
doi https://doi.org/10.52842/conf.ecaade.1992
summary The involvement of computer graphic systems in the transmission of knowledge in the areas of urban planning and architectural design will bring a significant change to the didactic programs and methods of those schools which have decided to adopt these new instruments. Workshops of urban planning and architectural design will have to modify their structures, and teaching teams will have to revise their current programs. Some European schools and faculties of architecture have taken steps in this direction. Others are willing to join them.

This process is only delayed by the scarcity of material resources, and by the slowness with which a sufficient number of teachers are adopting these methods.

eCAADe has set out to analyze the state of this issue at its next conference, where it will be discussed from various points of view. From this confrontation of ideas will surely come the guidelines for progress in the years to come.

The different sessions will be grouped together following these four themes:

(A.) Multimedia and Course Work / The state of the art of the synthesis of graphical and textual information favored by newly available multimedia computer programs, and its repercussions on academic programs.
(B.) The New Design Studio / The physical characteristics, data concentration and accessibility of a computerized design studio, which can be better approached as a computerized workshop.
(C.) How to manage the new education system / The problems and possibilities raised, from the practical and organizational points of view, by the introduction of computers into the classrooms of architectural education.
(D.) CAAI: Formal versus informal structure / How will the traditional teaching structure be affected by these new systems, in which access to knowledge and information can be obtained in a random way, guided by personal and subjective criteria?

series eCAADe
last changed 2022/06/07 07:49

_id 46c7
authors Ozel, Filiz
year 1992
title Data Modeling Needs of Life Safety Code (LSC) Compliance Applications
source Mission - Method - Madness [ACADIA Conference Proceedings / ISBN 1-880250-01-2] 1992, pp. 177-185
doi https://doi.org/10.52842/conf.acadia.1992.177
summary One of the most complex code compliance issues originates from the conformance of designs to the Life Safety Code (NFPA 101). The development of computer-based code compliance checking programs has attracted the attention of building researchers and practitioners alike. These studies represent a number of approaches ranging from CAD-based procedural approaches to rule-based, non-graphic ones, but they do not address the interaction of the rule base of such systems with graphic databases that define the geometry of architectural objects. Automatic extraction of the attributes and the configuration of building systems requires "architectural object - graphic entity" data models that allow access and retrieval of the necessary data for code compliance checking. This study aims to specifically focus on the development of such a data model through the use of the AutoLISP feature of the AutoCAD (Autodesk Inc.) graphic system. This data model is intended to interact with a Life Safety Code rule base created through the Level5-Object (Focus Inc.) expert system.

Assuming the availability of a more general building data model, one must define the life and fire safety features of a building before any automatic checking can be performed. Object-oriented data structures are beginning to be applied to design objects, since they allow the type versatility demanded by design applications. As one generates a functional view of the main data model, the software user must provide domain-specific information. A functional view is defined as the process of generating domain-specific data structures from a more general-purpose data model, such as defining egress routes from wall or room object data structures. Typically, in the early design phase of a project, these views relate to the emergency egress design features of a building. Certain decisions, such as where to provide sprinkler protection or the location of protected egress ways, must be made early in the process.
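As a rough illustration of the kind of "architectural object - graphic entity" data model and rule interaction described here (sketched in Python rather than the paper's AutoLISP and Level5-Object setup), graphic entities carry attributes from which a rule base draws facts; the door-width rule and its threshold are invented for the example, not quoted from NFPA 101.

```python
from dataclasses import dataclass

@dataclass
class Door:
    """A graphic entity annotated with the attributes a rule base needs."""
    entity_id: str
    width_mm: int
    on_egress_route: bool

def failing_egress_doors(doors, min_width_mm=810):
    # Illustrative rule only: egress doors must meet a minimum clear width.
    return [d.entity_id for d in doors
            if d.on_egress_route and d.width_mm < min_width_mm]

doors = [Door("D1", 900, True), Door("D2", 750, True), Door("D3", 700, False)]
print(failing_egress_doors(doors))  # ['D2']
```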

series ACADIA
last changed 2022/06/07 08:00

_id a3f5
authors Zandi-Nia, Abolfazl
year 1992
title TOPGENE: An Artificial Intelligence Approach to a Design Process
source Delft University of Technology
summary This work deals with two architectural design (AD) problems at the topological level, in the presence of the social norms of community, privacy, circulation-cost, and intervening opportunity. The first problem concerns generating a design with respect to the set of above-mentioned norms, and the second problem requires evaluation of existing designs with respect to the same set of norms. Both problems are based on the structural-behavioral relationship in buildings. This work has approached the above problems in the following senses: (1) A working system, called TOPGENE (The TOpological Pattern GENErator), has been developed. (2) Both problems may be vague and may lack enough information in their statement. For example, an AD in the presence of the social norms requires the degrees of interaction between the location pairs in the building. This information is not always explicitly available, and must be explicated from the design data. (3) An AD problem at the topological level is intractable, with no fast and efficient algorithm for its solution. To reduce the search effort in the process of design generation, TOPGENE uses a heuristic hill climbing strategy that takes advantage of domain-specific rules of thumb to choose a path in the search space of a design. (4) TOPGENE uses the Q-analysis method for explication of hidden information, as well as hierarchical clustering of location pairs with respect to their flow generation potential, as prerequisite information for the heuristic reasoning process. (5) To deal with the design of a building at the topological level, TOPGENE takes advantage of existing graph algorithms such as path-finding and planarity testing during its reasoning process. This work also presents a new efficient algorithm for keeping track of distances in a growing graph. (6) This work also presents a neural net implementation of a special case of the design generation problem, based on the Hopfield model of neural networks. The results of this approach have been used to test TOPGENE's approach to generating designs. A comparison of the two approaches shows that the neural network provides mathematically more optimal designs, while TOPGENE produces more realistic designs. The two systems may be integrated to create a hybrid system.
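The heuristic hill climbing the summary attributes to TOPGENE can be sketched generically: swap activity-to-location assignments while a flow-times-distance circulation cost keeps improving. The matrices are invented, and TOPGENE's domain rules of thumb, Q-analysis, and graph tests are not modeled.

```python
import random

interaction = [[0, 5, 2], [5, 0, 1], [2, 1, 0]]   # flow between activities
distance = [[0, 1, 2], [1, 0, 1], [2, 1, 0]]      # distance between locations

def circulation_cost(assign):
    n = len(assign)
    return sum(interaction[i][j] * distance[assign[i]][assign[j]]
               for i in range(n) for j in range(i + 1, n))

def hill_climb(n=3):
    assign = list(range(n))          # assign[i] = location of activity i
    random.shuffle(assign)
    best = circulation_cost(assign)
    improved = True
    while improved:
        improved = False
        for i in range(n):
            for j in range(i + 1, n):
                assign[i], assign[j] = assign[j], assign[i]      # try a swap
                cost = circulation_cost(assign)
                if cost < best:
                    best, improved = cost, True                  # keep the swap
                else:
                    assign[i], assign[j] = assign[j], assign[i]  # undo it
    return assign, best

print(hill_climb())
```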
series thesis:PhD
last changed 2003/02/12 22:37

_id aa78
authors Bayazit, Nigan
year 1992
title Requirements of an Expert System for Design Studios
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 187-194
doi https://doi.org/10.52842/conf.ecaade.1992.187
summary The goal of this paper is to study the problems of the transition from traditional architectural studio teaching to CAAD studio teaching, which requires a CAAD expert system as studio tutor, and to study the behavior of the student in this new environment. The differences between traditional and computerized studio teaching and the experiences in this field are explained with reference to the design time required in relation to the student's expertise in the application of a CAD program. Learning styles and the process of design in computerized and non-computerized studio teaching are discussed. The design studio requirements of students in the traditional studio environment while doing design work are clarified on the basis of an empirical study that examined the relations between tutor and student during studio critiques. The main complaints raised by the students in the empirical study were the lack of data in the specific design problem area, difficulties in realizing ideas and thoughts, not knowing the starting point of design, having no information about the references to be used for the specific design task, and difficulties in applying presentation techniques. The concluding parts of the paper discuss the different styles of teaching and their relation to the CAAD environment, the transformation of instructional programs for the new design environment, future expectations of CAAD programs, the properties of the new teaching environment, and the roles of expert systems in design studio education.

keywords CAAD Education, Expert System, Architectural Design Studio, Knowledge Acquisition
series eCAADe
last changed 2022/06/07 07:54

_id ddss9219
authors Bourdakis, V. and Fellows, R.F.
year 1993
title A model appraising the performance of structural systems used in sports hall and swimming pool buildings in Greece
source Timmermans, Harry (Ed.), Design and Decision Support Systems in Architecture (Proceedings of a conference held in Mierlo, the Netherlands in July 1992), ISBN 0-7923-2444-7
summary The selection of the best performing structural system (among steel, laminated timber, concrete, and fabric tents) for medium-span (30-50m) sports halls and swimming pools in Greece formed the impetus for this research. Decision-making concerning selection of the structural system is difficult in this sector of construction, as was explained at the "Long Span Structures" conference (November 1990, Athens, Greece). From the literature it has been found that most building appraisals end at the level of data analysis and draw conclusions on the individual aspects they investigate. These approaches usually focus on a fraction of the problem, examining it very deeply and theoretically. Their drawback is a loss of comprehensiveness and of the ability to draw conclusions at an overall level that remain applicable to existing conditions. Research at an inclusive level is sparse. In this particular research project, an inclusive appraisal approach was adopted, leading to the identification of three main variables: resources, human-user satisfaction, and technical. Consequently, this led to a combination of purely quantitative and qualitative data. Case studies were conducted on existing buildings in order to assess the actual performance of the various alternative structural systems. This paper presents the procedure followed for the identification of the research variables and focuses on the development of the model of quantification. The latter is of vital importance if the problem of incompatibility of data is to be solved, findings are to be related at an overall level, and holistic conclusions are to be drawn.
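The data-incompatibility problem named at the end (combining quantitative and qualitative measurements on different scales) is conventionally handled by normalising each criterion to a common scale before weighting; a minimal sketch, with all weights and figures invented for illustration:

```python
# Each criterion: (measured value, worst bound, best bound, weight).
# Normalising to [0, 1] lets costs, 1-5 ratings and metres be combined.
criteria = {
    "resources (cost)":        (1.8e6, 3.0e6, 1.0e6, 0.40),  # lower is better
    "human-user satisfaction": (3.9,   1.0,   5.0,   0.35),
    "technical (span)":        (42.0,  30.0,  50.0,  0.25),
}

def appraise(criteria):
    # Weighted sum of normalised scores: 0 = worst bound, 1 = best bound.
    return sum(w * (value - worst) / (best - worst)
               for value, worst, best, w in criteria.values())

print(round(appraise(criteria), 3))  # one comparable score per structural system
```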
series DDSS
last changed 2003/11/21 15:16

_id ddss9209
authors De Gelder, J.T. and Lucardie, G.L.
year 1993
title Knowledge and data modelling in CAD/CAM applications
source Timmermans, Harry (Ed.), Design and Decision Support Systems in Architecture (Proceedings of a conference held in Mierlo, the Netherlands in July 1992), ISBN 0-7923-2444-7
summary Modelling knowledge and data in CAD/CAM applications is complex because different goals and contexts have to be taken into account. This complexity makes particular demands upon representation formalisms. Today many modelling tools are based on record structures. By analyzing the requirements for a product model of a portal structure in steel, this paper shows that in many situations record structures are not well suited as a representation formalism for storing knowledge and data in CAD/CAM applications. This is illustrated by performing a knowledge-level analysis of the knowledge and data generated in the design and manufacturing process of a portal structure in steel.
series DDSS
last changed 2003/08/07 16:36

_id 7ce5
authors Gal, Shahaf
year 1992
title Computers and Design Activities: Their Mediating Role in Engineering Education
source Sociomedia, ed. Edward Barrett. MIT Press
summary Sociomedia: With all the new words used to describe electronic communication (multimedia, hypertext, cyberspace, etc.), do we need another one? Edward Barrett thinks we do; hence, he coins the term "sociomedia." It is meant to displace a computing economy in which technicity is hypostasized over sociality. Sociomedia, a compilation of twenty-five articles on the theory, design and practice of educational multimedia and hypermedia, attempts to re-value the communicational face of computing. Value, of course, is "ultimately a social construct." As such, it has everything to do with knowledge, power, education and technology. The projects discussed in this book represent the leading edge of electronic knowledge production in academia (not to mention major funding) and are determining the future of educational media. For these reasons, Sociomedia warrants close inspection. Barrett's introduction sets the tone. For him, designing computer media involves hardwiring a mechanism for the social construction of knowledge (1). He links computing to a process of social and communicative interactivity for constructing and disseminating knowledge. Through a mechanistic mapping of the university as hypercontext (a huge network that includes classrooms as well as services and offices), Barrett models intellectual work in such a way as to avoid "limiting definitions of human nature or human development." Education, then, can remain "where it should be--in the human domain (public and private) of sharing ideas and information through the medium of language." By leaving education in a virtual realm (where we can continue to disagree about its meaning and execution), it remains viral, mutating and contaminating in an intellectually healthy way. He concludes that his mechanistic model, by means of its reductionist approach, preserves value (7). This "value" is the social construction of knowledge. While I support the social orientation of Barrett's argument, discussions of value are related to power. I am not referring to the traditional teacher-student power structure that is supposedly dismantled through cooperative and constructivist learning strategies. The power to be reckoned with in the educational arena is foundational, that which (pre)determines value and the circulation of knowledge. "Since each of you reading this paragraph has a different perspective on the meaning of 'education' or 'learning,' and on the processes involved in 'getting an education,' think of the hybris in trying to capture education in a programmable function, in a displayable object, in a 'teaching machine'" (7). Actually, we must think about that hybris because it is, precisely, what informs teaching machines. Moreover, the basic epistemological premises that give rise to such productions are too often assumed. In the case of instructional design, the episteme of the cognitive sciences is often taken for granted. It is ironic that many of the "postmodernists" who support electronic hypertextuality seem to have missed Jacques Derrida's and Michel Foucault's "deconstructions" of the epistemology underpinning cognitive sciences (if not of epistemology itself). Perhaps it is the glitz of the technology that blinds some users (qua developers) to the belief systems operating beneath the surface. Barrett is not guilty of reactionary thinking or politics; he is, in fact, quite in line with much American deconstructive and postmodern thinking.
The problem arises in that he leaves open the definitions of "education," "learning" and "getting an education." One cannot engage in the production of new knowledge without orienting its design, production and dissemination, and without negotiating with others' orientations, especially where large-scale funding is involved. Notions of human nature and development are structural, even infrastructural, whatever the medium of the teaching machine. Although he addresses some dynamics of power, money and politics when he talks about the recession and its effects on the conference, they are readily visible dynamics of power (3-4). Where does the critical factor of value determination, of power, of who gets what and why, get mapped onto a mechanistic model of learning institutions? Perhaps a mapping of contributors' institutions, of the funding sources for the projects showcased and for participation in the conference, and of the disciplines receiving funding for these sorts of projects would help visualize the configurations of power operative in the rising field of educational multimedia. Questions of power and money notwithstanding, Barrett's introduction sets the social and textual thematics for the collection of essays. His stress on interactivity, on communal knowledge production, on the society of texts, and on media producers and users is carried forward through the other essays, two of which I will discuss. Section I of the book, "Perspectives...," highlights the foundations, uses and possible consequences of multimedia and hypertextuality. The second essay in this section, "Is There a Class in This Text?," plays on the robust exchange surrounding Stanley Fish's book, Is There a Text in This Class?, which presents an attack on authority in reading. The author, John Slatin, has introduced electronic hypertextuality and interaction into his courses. His article maps the transformations in "the content and nature of work, and the workplace itself"-- which, in this case, is not industry but an English poetry class (25). Slatin discovered an increase of productive and cooperative learning in his electronically-mediated classroom. For him, creating knowledge in the electronic classroom involves interaction between students, instructors and course materials through the medium of interactive written discourse. These interactions lead to a new and persistent understanding of the course materials and of the participants' relation to the materials and to one another. The work of the course is to build relationships that, in my view, constitute not only the meaning of individual poems, but poetry itself. The class carries out its work in the continual and usually interactive production of text (31). While I applaud his strategies which dismantle traditional hierarchical structures in academia, the evidence does not convince me that the students know enough to ask important questions or to form a self-directing, learning community. Stanley Fish has not relinquished professing, though he, too, espouses the indeterminacy of the sign. By the fourth week of his course, Slatin's input is, by his own reckoning, reduced to 4% (39). In the transcript of the "controversial" Week 6 exchange on Gertrude Stein--the most disliked poet they were discussing at the time (40)--we see the blind leading the blind. One student parodies Stein for three lines and sums up his input with "I like it." Another finds Stein's poetry "almost completey [sic] lacking in emotion or any artistic merit" (emphasis added).
On what grounds has this student become an arbiter of "artistic merit"? Another student, after admitting being "lost" during the Wallace Stevens discussion, talks of having more "respect for Stevens' work than Stein's" and adds that Stein's poetry lacks "conceptual significance[, s]omething which people of varied opinion can intelligently discuss without feeling like total dimwits...." This student has progressed from admitted incomprehension of Stevens' work to imposing her (groundless) respect for his work over Stein's. Then, she exposes her real dislike for Stein's poetry: that she (the student) missed the "conceptual significance" and hence cannot, being a person "of varied opinion," intelligently discuss it "without feeling like [a] total dimwit." Slatin's comment is frightening: "...by this point in the semester students have come to feel increasingly free to challenge the instructor" (41). The students that I have cited are neither thinking critically nor having their preconceptions challenged by student-governed interaction. Thanks to the class format, one student feels self-righteous in her ignorance, and empowered to censure. I believe strongly in student empowerment in the classroom, but only once students have accrued enough knowledge to make informed judgments. Admittedly, Slatin's essay presents only partial data (there are six hundred pages of course transcripts!); still, I wonder how much valuable knowledge and metaknowledge was gained by the students. I also question the extent to which authority and professorial dictature were addressed in this course format. The power structures that make it possible for a college to require such a course, and the choice of texts and pedagogy, were not "on the table." The traditional professorial position may have been displaced, but what took its place?--the authority of consensus with its unidentifiable strong arm, and the faceless reign of software design? Despite Slatin's claim that the students learned about the learning process, there is no evidence (in the article) that the students considered where their attitudes came from, how consensus operates in the construction of knowledge, how power is established and what relationship they have to bureaucratic institutions. How do we, as teaching professionals, negotiate a balance between an enlightened despotism in education and student-created knowledge? Slatin, and other authors in this book, bring this fundamental question to the fore. There is no definitive answer because the factors involved are ultimately social, and hence, always shifting and reconfiguring. Slatin ends his article with the caveat that computerization can bring about greater estrangement between students, faculty and administration through greater regimentation and control. Of course, it can also "distribute authority and power more widely" (50). Power or authority without a specific face, however, is not necessarily good or just. Shahaf Gal's "Computers and Design Activities: Their Mediating Role in Engineering Education" is found in the second half of the volume, and does not allow for a theory/praxis dichotomy. Gal recounts a brief history of engineering education up to the introduction of Growltiger (GT), a computer-assisted learning aid for design. He demonstrates GT's potential to impact the learning of engineering design by tracking its use by four students in a bridge-building contest.
What his text demonstrates clearly is that computers are "inscribing and imaging devices" that add another viewpoint to an on-going dialogue between student, teacher, earlier coursework, and other teaching/learning tools. The less proficient students made a serious error by relying too heavily on the technology, or treating it as a "blueprint provider." They "interacted with GT in a way that trusted the data to represent reality. They did not see their interaction with GT as a negotiation between two knowledge systems" (495). Students who were more thoroughly informed in engineering discourses knew to use the technology as one voice among others--they knew enough not simply to accept the input of the computer as authoritative. The less-advanced students learned a valuable lesson from the competition itself: the fact that their designs were not able to hold up under pressure (literally) brought the fact of their insufficient knowledge crashing down on them (and their bridges). They also had, post factum, several other designs to study, especially the winning one. Although competition and comparison are not good pedagogical strategies for everyone (in this case the competitors had volunteered), at some point what we think we know has to be challenged within the society of discourses to which it belongs. Students need critique in order to learn to push their learning into auto-critique. This is what is lacking in Slatin's discussion and in the writings of other avatars of constructivist, collaborative and computer-mediated pedagogies. Obviously there are differences between instrumental types of knowledge acquisition and discursive knowledge accumulation. Indeed, I do not promote the teaching of reading, thinking and writing as "skills" per se (then again, Gal's teaching of design is quite discursive, if not dialogic). Nevertheless, the "soft" sciences might benefit from "bridge-building" competitions or the re-institution of some forms of agonia. Not everything agonistic is inhuman agony--the joy of confronting or creating a sound argument supported by defensible evidence, for example. Students need to know that soundbites are not sound arguments despite predictions that electronic writing will be aphoristic rather than periodic. Just because writing and learning can be conceived of hypertextually does not mean that rigor goes the way of the dinosaur. Rigor and hypertextuality are not mutually incompatible. Nor are rigorous thinking and hard intellectual work unpleasurable, although American anti-intellectualism, especially in the mass media, would make them so. At a time when the spurious dogmatics of a Rush Limbaugh and Holocaust revisionist historians circulate "aphoristically" in cyberspace, and at a time when knowledge is becoming increasingly textualized, the role of critical thinking in education will ultimately determine the value(s) of socially constructed knowledge. This volume affords the reader an opportunity to reconsider knowledge, power, and new communications technologies with respect to social dynamics and power relationships.
series other
last changed 2003/04/23 15:14

_id ea96
authors Hacfoort, Eek J. and Veldhuisen, Jan K.
year 1992
title A Building Design and Evaluation System
source New York: John Wiley & Sons, 1992. pp. 195-211 : ill. table. includes bibliography
summary Within the field of architectural design there is a growing awareness of an imbalance between the professionalism, experience, and creativity of designers' responses and the up-to-date requirements of all parties interested in the design process. The building design and evaluation system COSMOS makes it possible for various participants to work within their own domain, so that separate but coordinated work can be done. The system is meant to organize the initial stage of the design process, where user-defined functions, geometry, type of construction, and building materials are decided. It offers a tool to design a building, to calculate a number of effects, and to manage the information necessary to evaluate the design decisions. The system is provided with data and sets of parameters describing the conditions, along with their properties, of the main building functions of a selection of well-known building types. The architectural design is conceptualized as a hierarchy of spatial units, ranging from building blocks down to specific rooms or spaces. The concept of zoning is used as a means of calculating and directly evaluating the structure of the design without working out the details. A distinction is made between internal and external calculations and evaluations during the initial design process. During design on screen, an estimate can be recorded of building costs, energy costs, acoustics, lighting, construction, and utility. Furthermore, the design can be exported to a design application program, in this case AutoCAD, to make and show drawings in more detail. Through the medium of a database, external calculation and evaluation of building costs, life-cycle costs, energy costs, interior climate, acoustics, lighting, construction, and utility are possible in much more advanced application programs
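The "hierarchy of spatial units" can be pictured as a tree over which effects are rolled up without detailed drawings; the sketch below estimates building cost from per-function unit rates, with all names and rates invented rather than taken from COSMOS.

```python
unit_cost = {"office": 1200.0, "circulation": 800.0}   # cost per m2 by function

class SpatialUnit:
    """A node in the hierarchy: building block, floor, zone or room."""
    def __init__(self, name, function=None, area_m2=0.0, children=()):
        self.name, self.function = name, function
        self.area_m2, self.children = area_m2, list(children)

    def cost(self):
        # Own cost plus the roll-up of all child units.
        own = unit_cost.get(self.function, 0.0) * self.area_m2
        return own + sum(child.cost() for child in self.children)

block = SpatialUnit("block A", children=[
    SpatialUnit("floor 1", children=[
        SpatialUnit("room 101", "office", 24.0),
        SpatialUnit("corridor", "circulation", 12.0),
    ]),
])
print(block.cost())  # 24*1200 + 12*800 = 38400.0
```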
keywords evaluation, applications, integration, architecture, design, construction, building, energy, cost, lighting, acoustics, performance
series CADline
last changed 2003/06/02 13:58

_id 130d
authors Hoinkes, R. and Mitchell, R.
year 1994
title Playing with Time - Continuous Temporal Mapping Strategies for Interactive Environments
source 6th Canadian GIS Conference (Ottawa: Natural Resources Canada), pp. 318-329
summary The growing acceptance of GIS technology has had far-reaching effects on many fields of research. The recent developments in the area of dynamic and temporal GIS open new possibilities within the realm of historical research, where temporal relationship analysis is as important as spatial relationship analysis. While topological structures have had wide use in spatial GIS and have been the subject of most temporal GIS endeavours, the different demands of many of these temporally-oriented analytic processes call the choice of the topological direction into question. In the fall of 1992 the Montreal Research Group (MRG) of the Canadian Centre for Architecture mounted an exhibition dealing with the development of the built environment in 18th-century Montreal. To aid in presenting the interpretive messages of their data, the MRG worked with the Centre for Landscape Research (CLR) to incorporate the interactive capabilities of the CLR's PolyTRIM research software with the MRG's database to produce a research tool as well as a public-access interactive display. The interactive capabilities stemming from a real-time object-oriented structure provided an excellent environment for both researchers and the public to investigate the nature of temporal changes in such aspects as land use, ethnicity, and fortifications of the 18th-century city. This paper describes the need for interactive real-time GIS in such temporal analysis projects and the underlying need for object-oriented vs. topologically structured data access strategies to support them.
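A sketch of the object-oriented temporal access the paper argues for: each object keeps time-stamped attribute states, and a query reconstructs the state holding in any given year. The data and API are invented for illustration, not drawn from PolyTRIM or the MRG model.

```python
import bisect

class TemporalObject:
    """An object whose attributes change at known years."""
    def __init__(self):
        self.years = []    # sorted years at which the state changed
        self.states = []   # attribute dict in force from that year on

    def set_state(self, year, **attrs):
        i = bisect.bisect_left(self.years, year)
        self.years.insert(i, year)
        self.states.insert(i, attrs)

    def state_at(self, year):
        # Most recent change at or before the queried year, if any.
        i = bisect.bisect_right(self.years, year) - 1
        return self.states[i] if i >= 0 else None

lot = TemporalObject()
lot.set_state(1700, landuse="garden")
lot.set_state(1731, landuse="dwelling", ethnicity="French")
lot.set_state(1765, landuse="dwelling", ethnicity="British")
print(lot.state_at(1750))  # {'landuse': 'dwelling', 'ethnicity': 'French'}
```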
series other
last changed 2003/04/23 15:14

_id ddss9208
id ddss9208
authors Lucardie, G.L.
year 1993
title A functional approach to realizing decision support systems in technical regulation management for design and construction
source Timmermans, Harry (Ed.), Design and Decision Support Systems in Architecture (Proceedings of a conference held in Mierlo, the Netherlands in July 1992), ISBN 0-7923-2444-7
summary Technical building standards defining the quality of buildings, building products, building materials and building processes aim to provide acceptable levels of safety, health, usefulness and energy consumption. However, the logical consistency between these goals and the set of regulations produced to achieve them is often hard to identify. Not only the large quantities of highly complex and frequently changing building regulations to be met, but also the variety of user demands and the steadily increasing technical information on (new) materials, products and buildings have produced a very complex set of knowledge and data that should be taken into account when handling technical building regulations. Integrating knowledge technology and database technology is an important step towards managing the complexity of technical regulations. Generally, two strategies can be followed to integrate knowledge and database technology. The main emphasis of the first strategy is on transferring data structures and processing techniques from one field of research to another. The second approach is concerned exclusively with the semantic structure of what is contained in the data-based or knowledge-based system. The aim of this paper is to show that the second or knowledge-level approach, in particular the theory of functional classifications, is more fundamental and more fruitful. It permits a goal-directed rationalized strategy towards analysis, use and application of regulations. Therefore, it enables the reconstruction of (deep) models of regulations, objects and of users accounting for the flexibility and dynamics that are responsible for the complexity of technical regulations. Finally, at the systems level, the theory supports an effective development of a new class of rational Decision Support Systems (DSS), which should reduce the complexity of technical regulations and restore the logical consistency between the goals of technical regulations and the technical regulations themselves.
series DDSS
last changed 2003/08/07 16:36

_id 244d
authors Monedero, J., Casaus, A. and Coll, J.
year 1992
title From Barcelona. Chronicle and Provisional Evaluation of a New Course on Architectural Solid Modelling by Computerized Means
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 351-362
doi https://doi.org/10.52842/conf.ecaade.1992.351
summary The first step taken at the ETSAB in the computer field goes back to 1965, when professors Margarit and Buxade acquired an IBM computer, an electromechanical machine which used perforated cards and which was used to produce an innovative method of structural calculation. This method was incorporated into the academic courses and, at that time, the repeated question "should students learn programming?" was readily answered: the exercises required some knowledge of Fortran, and every student needed this knowledge to do the exercises. This method, well known in Europe at that time, also provided a service for professional practice and marked the beginning of what is now the CC (Centro de Calculo) of our school. In 1980 the School bought a PDP-11/34, a computer which had 256 Kb of RAM, two disks of 5 Mb and one of 10 Mb, and an 8-line multiplexor. Some time later the general policy of the UPC changed course, and this was related to the purchase of a VAX which is still the basis of the CC and carries most of the administrative burden of the school. 1985 was probably the first year in which we can speak of a general policy of the school directed towards computers. A report was produced that year, which includes a survey addressed to the six Departments of the School (Graphic Expression, Projects, Structures, Construction, Composition and Urbanism) and contains interesting data. According to the report, four departments used computers in their current courses, while the two others (Projects and Composition) did not use them at all. The main user was the Department of Structures, while the incidence of the remaining three was rather sporadic. The kinds of problems detected in this report are very typical: lack of resources for hardware and software and for maintenance of the few computers the school had at that moment, and a demand (posed by the students) greatly exceeding the supply (computers and teachers). The main problem appeared to be the lack of computer graphic devices and proper software.

series eCAADe
last changed 2022/06/07 07:58

_id aa6d
authors Nichols, Foster Jr., Canete, Isabel J. and Tuladhar, Sagun
year 1992
title Designing for Pedestrians : A CAD-Network Analysis Approach
source New York: John Wiley & Sons, 1992. pp. 379-398 : ill. includes a short bibliography
summary Microcomputer techniques have been developed that combine CAD drawings with transportation network analysis software that uses spreadsheets and stand-alone programs activated from the DOS operating system. The CAD feature simplifies and improves the methods used to design pedestrian circulation facilities and evaluate the impact of new development on existing pedestrian flows. Through the use of customized software, the need for manual data entry is reduced, and the graphical display of analysis results at most intermediate steps in the process is automated. Three hypothetical case studies are presented, concentrating on proposed pedestrian circulation improvements at Penn Station, New York
keywords evaluation, networks, management, CAD, analysis, applications, planning, transportation, prediction, simulation
series CADline
last changed 2003/06/02 13:58

_id 831d
authors Seebohm, Thomas
year 1992
title Discoursing on Urban History Through Structured Typologies
source Mission - Method - Madness [ACADIA Conference Proceedings / ISBN 1-880250-01-2] 1992, pp. 157-175
doi https://doi.org/10.52842/conf.acadia.1992.157
summary How can urban history be studied with the aid of three-dimensional computer modeling? One way is to model known cities at various times in history, using historical records as sources of data. While such studies greatly enhance the understanding of the form and structure of specific cities at specific points in time, it is questionable whether such studies actually provide a true understanding of history. It can be argued that they do not because such studies only show a record of one of many possible courses of action at various moments in time. To gain a true understanding of urban history one has to place oneself back in historical time to consider all of the possible courses of action which were open in the light of the then current situation of the city, to act upon a possible course of action and to view the consequences in the physical form of the city. Only such an understanding of urban history can transcend the memory of the actual and hence the behavior of the possible. Moreover, only such an understanding can overcome the limitations of historical relativism, which contends that historical fact is of value only in historical context, with the realization, due to Benedetto Croce and echoed by Rudolf Bultmann, that the horizon of "deeper understanding" lies in "the actuality of decision" (Seebohm and van Pelt 1990).

One cannot conduct such studies on real cities except, perhaps, as a point of departure at some specific point in time to provide an initial layout for a city, knowing that future forms derived by the studies will diverge from that recorded in history. An entirely imaginary city is therefore chosen. Although the components of this city at the level of individual buildings are taken from known cities in history, this choice does not preclude alternative forms of the city. To some degree, building types are invariants and, as argued in the Appendix, so are the urban typologies into which they may be grouped. In this imaginary city students of urban history play the role of citizens or groups of citizens. As they defend their interests and make concessions, while interacting with each other in their respective roles, they determine the nature of the city as it evolves through the major periods of Western urban history in the form of three-dimensional computer models.

My colleague R.J. van Pelt and I presented this approach to the study of urban history previously at ACADIA (Seebohm and van Pelt 1990). Yet we did not pay sufficient attention to the manner in which such urban models should be structured and how the efforts of the participants should be coordinated. In the following sections I therefore review the requirements for three-dimensional modeling to support studies in urban history, both from the viewpoint of the file structure of the models and from other viewpoints which have a bearing on this structure. Three alternative software schemes of progressively increasing complexity are then discussed with regard to their ability to satisfy these requirements. This comparative study of software alternatives and their corresponding file structures justifies the present choice of structure in relation to the simpler and better known generic alternatives, which do not have the necessary flexibility for structuring the urban model. Such flexibility means, of course, that in the first instance the modeling software is more time-consuming to learn than a simple point-and-click package, in accord with the now established axiom that the ease of learning software tools is inversely related to the functional power of the tools (Smith 1987).

series ACADIA
last changed 2022/06/07 07:56

_id c434
authors Colajanni, B., Pellitteri, G. and Scianna, A.
year 1992
title Two Approaches to Teaching Computers in Architecture: The Experience in the Faculty of Engineering in Palermo, Italy
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 295-306
doi https://doi.org/10.52842/conf.ecaade.1992.295
summary Teaching the use of computers in architecture poses the same kind of problems as teaching mathematics. To both there are two possible approaches. The first presents the discipline as a tool whose merely instrumental aspect is emphasized. Teaching is limited to showing the results obtainable with existing programs and how to get them. The second approach, on the contrary, emphasizes the autonomous nature of the discipline, mathematics as much as computing, on the basis of the conviction that the maximum instrumental usefulness can be obtained through knowledge at the highest degree of generality and, thus, of abstraction. The first approach changes little in the mind of the student. He simply learns that a certain number of operations are possible, and therefore worth doing: mainly checks of performance (and not only control of appearance, now easy with one of the many existing CAD packages) or searches for technical information in some database. The second approach gives the student an awareness of the manageability of abstract structures of relationships. He then acquires the idea of creating particular structures of relationships by himself and managing them. This can modify the very idea of the design procedure, giving the student the awareness that he can intervene directly in every segment of the design procedure, reshaping it to some extent in a way better suited to the particular problem he is dealing with. Of course this second approach implies learning not only a language but also the capability of coming to terms with languages. And again it is a cultural acquisition that can be very useful when referred to the languages of architecture. Furthermore, the capability of simulating even a small segment of the design process on the computer gives the student a better understanding both of the particular problem he is dealing with and of the very nature of design. The first effect happens whenever a translation is made from one language to another: one is obliged to get to the core of the matter in order to overcome the difficulties arising from the different biases of the two languages. The second effect comes from the necessity of placing the studied segment in the general flow of the design process. Organizing into a linear sequence actions to be accomplished recursively, in an order that varies with every design occasion, is an extremely useful exercise for understanding the signification and the techniques of formalization of design problems.
series eCAADe
email
last changed 2022/06/07 07:56

_id sigradi2015_11.166
authors Calixto, Victor; Celani, Gabriela
year 2015
title A literature review for space planning optimization using an evolutionary algorithm approach: 1992-2014
source SIGRADI 2015 [Proceedings of the 19th Conference of the Iberoamerican Society of Digital Graphics - vol. 2 - ISBN: 978-85-8039-133-6] Florianópolis, SC, Brasil 23-27 November 2015, pp. 662-671.
summary Space planning in architecture is a field of research in which the process of arranging a set of space elements is the main concern. This paper presents a survey of 31 papers, comprising applications and reviews of space planning methods using evolutionary algorithms. The objective of this work was to organize, classify, and discuss twenty-two years of SP research based on an evolutionary approach, in order to orient future research in the field.
keywords Space Planning, Evolutionary algorithms, Generative System
series SIGRADI
last changed 2016/03/10 09:47

_id d919
authors Heckbert, P.S.
year 1992
title Discontinuity Meshing for Radiosity
source Eurographics Workshop on Rendering. May 1992, pp. 203-216
summary The radiosity method is the most popular algorithm for simulating interreflection of light between diffuse surfaces. Most existing radiosity algorithms employ simple meshes and piecewise constant approximations, thereby constraining the radiosity function to be constant across each polygonal element. Much more accurate simulations are possible if linear, quadratic, or higher-degree approximations are used. In order to realize the potential accuracy of higher-degree approximations, however, it is necessary for the radiosity mesh to resolve discontinuities such as shadow edges in the radiosity function. A discontinuity meshing algorithm is presented that places mesh boundaries directly along discontinuities. Such algorithms offer the potential of faster, more accurate simulations. Results are shown for three-dimensional scenes.
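The geometric core of discontinuity meshing can be sketched for the simplest case, a point light: projecting an occluder edge from the light onto a receiver plane yields the shadow-edge segment a mesher would insert as an element boundary. The scene values are illustrative, and penumbra discontinuities from area lights are not handled.

```python
def project_to_floor(light, p):
    """Intersect the ray from the light through p with the plane z = 0.
    Assumes the occluder point lies between the light and the floor."""
    lx, ly, lz = light
    px, py, pz = p
    t = lz / (lz - pz)                 # ray parameter where z reaches 0
    return (lx + t * (px - lx), ly + t * (py - ly), 0.0)

light = (0.0, 0.0, 4.0)
edge = ((1.0, -1.0, 2.0), (1.0, 1.0, 2.0))   # one edge of an occluding polygon
shadow_edge = [project_to_floor(light, v) for v in edge]
print(shadow_edge)  # [(2.0, -2.0, 0.0), (2.0, 2.0, 0.0)] -> mesh boundary here
```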
series other
last changed 2003/04/23 15:14

_id 8488
authors Liggett, Robin S.
year 1992
title A Designer-Automated Algorithm Partnership : An Interactive Graphic Approach to Facility Layout
source New York: John Wiley & Sons, 1992. pp. 101-123 : ill. includes bibliography
summary Automated solution techniques for spatial allocation problems have long been of interest to researchers in computer-aided design. This paper describes research focusing on the use of an interactive graphic interface for the solution of facility layout problems which have quantifiable but sometimes competing criteria. The ideas presented in the paper have been implemented in a personal computer system
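One plausible automated half of such a designer-algorithm partnership is a constructive placement heuristic: assign each department to the free location that adds the least interaction-weighted travel cost, leaving the designer to adjust the result interactively. This sketch is generic, with invented matrices; it is not presented as Liggett's published algorithm.

```python
flow = {("lab", "office"): 8, ("lab", "store"): 2, ("office", "store"): 4}
dist = [[0, 1, 2], [1, 0, 1], [2, 1, 0]]     # distances between three locations

def added_cost(dept, loc, placed):
    # Extra travel cost of putting dept at loc, given departments placed so far.
    return sum(flow.get(tuple(sorted((dept, other))), 0) * dist[loc][at]
               for other, at in placed.items())

def greedy_layout(departments):
    placed, free = {}, list(range(len(dist)))
    for dept in departments:
        best = min(free, key=lambda loc: added_cost(dept, loc, placed))
        placed[dept] = best
        free.remove(best)
    return placed

print(greedy_layout(["lab", "office", "store"]))  # a starting layout to refine
```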
keywords algorithms, user interface, layout, synthesis, floor plans, architecture, facilities planning, automation, space allocation, optimization
series CADline
last changed 2003/06/02 13:58

_id f1f2
authors Watt, Alan
year 1992
title Advanced animation and rendering techniques
source New York: ACM Press
summary This book is an exposition of state-of-the-art techniques in rendering and animation. It provides a unique synthesis of techniques and theory. Four sections describe: Basics, Theoretical Foundations, Advanced Rendering Techniques, and Advanced Animation Techniques. Each technique is illustrated with a series of full color frames showing the development of the example. Many code examples and some complete implementations are given in C for interesting and advanced algorithms such as soft shadows and marching cubes.
series other
last changed 2003/04/23 15:14
