CumInCAD is a Cumulative Index of publications in Computer-Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD Futures


Hits 1 to 17 of 17

_id ddssar0206
id ddssar0206
authors Bax, M.F.Th. and Trum, H.M.G.J.
year 2002
title Faculties of Architecture
source Timmermans, Harry (Ed.), Sixth Design and Decision Support Systems in Architecture and Urban Planning - Part one: Architecture Proceedings (Avegoor, the Netherlands), 2002
summary In order to be inscribed in the European Architect's register, the study program leading to the diploma 'Architect' has to meet the criteria of the EC Architect's Directive (1985). The criteria are enumerated in the 11 principles of Article 3 of the Directive. The Advisory Committee, established by the European Council, was given the task of examining such diplomas whenever doubts are raised by other Member States. To carry out this task a matrix was designed as an independent interpreting framework that mediates between the principles of Article 3 and the actual study program of a faculty. Such a tool was needed because of inconsistencies in the list of principles, differences between the linguistic versions of the Directive, and problems in quantifying the time devoted to the principles in the study programs. The core of the matrix, its headings, is a categorisation of the principles on a higher level of abstraction in the form of a taxonomy of domains and corresponding concepts. Filling in the matrix means that each study element of the study program is analysed according to its content in terms of domains; the summation of study time devoted to the various domains results in a so-called 'profile of a faculty'. That profile is then judged by a committee of peers. The domains of the taxonomy are intrinsically the same as the concepts and categories needed for the description of an architectural design object: the faculties of architecture. This correspondence relates the taxonomy to the field of design theory and philosophy. The taxonomy is an application of Domain theory. This theory, developed by the authors since 1977, takes the view that the architectural object can only be described fully as an integration of all types of domains. The theory supports the idea of a participatory and interdisciplinary approach to design, which has proved rewarding from both a scientific and a social point of view. All types of domains have in common that they are measured in three dimensions: form, function and process, connecting the material aspects of the object with its social and procedural aspects. In the taxonomy the function dimension is emphasised. It is argued in the paper that the taxonomy is a categorisation following the pragmatist philosophy of Charles Sanders Peirce. It is also demonstrated that the taxonomy is easy to handle, with examples of its application in various countries over the last five years. The taxonomy proved to be an adequate tool for the judgement of study programs and their subsequent improvement, as constituted by the faculties of a Faculty of Architecture. The matrix is described as the result of theoretical reflection on, and practical application of, a matrix already in use since 1995. The major improvement of the matrix is its direct connection with Peirce's universal categories and the self-explanatory character of its structure. The connection with Peirce's categories gives the matrix a more universal character, which enables application in other fields where the term 'architecture' is used as a metaphor for artefacts. (A toy sketch of the profile aggregation follows this record.)
series DDSS
last changed 2003/11/21 15:16
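
A minimal sketch of the aggregation described in the abstract above: study elements are classified by domain and their study time is summed into a "profile of a faculty". The domain names, weights and hours below are invented for illustration and are not taken from the paper.

from collections import defaultdict

# Hypothetical study elements, each tagged with the domains it addresses.
study_elements = [
    {"name": "Design studio 1", "hours": 240, "domains": {"design": 0.7, "technology": 0.3}},
    {"name": "Building physics", "hours": 60, "domains": {"technology": 1.0}},
    {"name": "History of architecture", "hours": 90, "domains": {"culture": 1.0}},
]

def faculty_profile(elements):
    """Sum study time per domain; weights split an element over several domains."""
    profile = defaultdict(float)
    for element in elements:
        for domain, weight in element["domains"].items():
            profile[domain] += element["hours"] * weight
    return dict(profile)

print(faculty_profile(study_elements))
# e.g. {'design': 168.0, 'technology': 132.0, 'culture': 90.0}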

_id ddss9408
id ddss9408
authors Bax, Thijs and Trum, Henk
year 1994
title A Taxonomy of Architecture: Core of a Theory of Design
source Second Design and Decision Support Systems in Architecture & Urban Planning (Vaals, the Netherlands), August 15-19, 1994
summary The authors have developed a taxonomy of concepts in architectural design. It was accepted by the Advisory Committee for education in the field of architecture, a committee advising the European Commission and Member States, as a reference for its task of harmonizing architectural education in Europe. The taxonomy is based on Domain theory, a theory developed by the authors on the basis of General Systems Theory and the notion of structure according to French Structuralism; it takes a participatory viewpoint for the integration of knowledge and interests by the parties in the architectural design process. The paper discusses recent developments of the taxonomy, firstly as a result of a confrontation with similar endeavours to structure the field of architectural design, secondly as a result of applications in education and architectural design practice, and thirdly as a result of the application of some views derived from the philosophical work of Charles Sanders Peirce. Developments concern the structural form of the taxonomy, comprising basic concepts and level-bound scale concepts, and the specification of the content of the fields which these concepts represent. The confrontation with similar endeavours concerns mainly the work of an ARCUK working party, chaired by Tom Marcus, based on the European Directive of 1985. The application concerns experiences with a taxonomy-based enquiry intended to represent the profiles of educational programmes of schools and faculties of architecture in Europe in qualitative and quantitative terms. This enquiry was carried out in order to achieve a basis for comparison and judgement, and a basis for future guidelines including quantitative aspects. Peirce's views, more specifically his views on triarchy as a way of ordering and structuring processes of thinking, provide keys for a re-definition of concepts as building blocks of the taxonomy in terms of the form-function-process triad, which strengthens the coherence of the taxonomy and allows a more regular representation in the form of a hierarchically ordered matrix.
series DDSS
last changed 2003/08/07 16:36

_id cf2015_005
id cf2015_005
authors Celani, Gabriela; Sperling, David M. and Franco, Juarez M. S. (eds.)
year 2015
title Preface
source The next city - New technologies and the future of the built environment [16th International Conference CAAD Futures 2015. Sao Paulo, July 8-10, 2015. Electronic Proceedings/ ISBN 978-85-85783-53-2] Sao Paulo, Brazil, July 8-10, 2015, pp. 5-13.
summary Since 1985 the Computer-Aided Architectural Design Futures Foundation has fostered high-level discussions about the search for excellence in the built environment through the use of new technologies, with an exploratory and critical perspective. In 2015 the 16th CAAD Futures Conference was held, for the first time, in South America, in the lively megalopolis of Sao Paulo, Brazil. In order to establish a connection to local issues, the theme of the conference was "The next city". The city of Sao Paulo was torn down and almost completely rebuilt twice between the mid-1800s and the mid-1900s, evolving from a city built in rammed earth to a city built in bricks, and then from a city built in bricks to a city built in concrete. In the 21st century, with the widespread use of digital technologies both in the design and in the production of buildings, cities are changing even faster, in terms of layout, materials, shapes, textures, production methods and, above all, in terms of the information that is now embedded in built systems. Among the 200 abstracts received in the first phase, 64 were selected for presentation at the conference and publication in the Electronic Proceedings, either as long or short papers, after three tough evaluation stages. Each paper was reviewed by at least three different experts from an international committee of more than 80 highly experienced researchers. The authors come from 23 different countries. Among all papers, 10 come from Latin-American institutions, which have usually been under-represented in CAAD Futures. The 33 highest-rated long papers are also being published in a printed book by Springer; for this reason, only their abstracts are included in these Electronic Proceedings, at the end of each chapter. The papers in this book have been organized under the following topics: (1) modeling, analyzing and simulating the city, (2) sustainability and performance of the built environment, (3) automated and parametric design, (4) building information modeling (BIM), (5) fabrication and materiality, and (6) shape studies. The first topic includes papers describing different uses of computation applied to the study of the urban environment. The second represents one of the most important current issues in the study and design of the built environment. The third topic, automated and parametric design, is an established field of research that is finally becoming more available to practitioners. Fabrication has been a hot topic in CAAD conferences and is becoming ever more popular; this new way of making designs and buildings will soon start affecting the way cities look. Finally, shape studies are an established and respected field in design computing that is traditionally discussed in CAAD conferences.
series CAAD Futures
email
last changed 2015/06/29 07:55

_id c898
authors Gero, John S.
year 1986
title An Overview of Knowledge Engineering and its Relevance to CAAD
source Computer-Aided Architectural Design Futures [CAAD Futures Conference Proceedings / ISBN 0-408-05300-3] Delft (The Netherlands), 18-19 September 1985, pp. 107-119
summary Computer-aided architectural design (CAAD) has come to mean a number of often disparate activities. These can be placed into one of two categories: using the computer as a drafting and, to a lesser extent, modelling system; and using it as a design medium. The distinction between the two categories is often blurred. Using the computer as a drafting and modelling tool relies on computing notions concerned with representing objects and structures numerically, and on ideas of computer programs as procedural algorithms. Similar notions underlie the use of computers as a design medium; we shall return to these later. Clearly, all computer programs contain knowledge, whether methodological knowledge about processes or knowledge about structural relationships in models or databases. However, this knowledge is so intertwined with the procedural representation within the program that it can no longer be seen or found. Architecture is concerned with much more than numerical descriptions of buildings. It is concerned with concepts, ideas, judgement and experience. All these appear to be outside the realm of traditional computing. Yet architects in their discourse use models of buildings largely unrelated to either numerical descriptions or procedural representations. They make use of knowledge - about objects, events and processes - and make nonprocedural (declarative) statements that can only be described symbolically. The limits of traditional computing are the limits of traditional computer-aided design systems, namely that they are unable to directly represent and manipulate declarative, nonalgorithmic knowledge or to perform symbolic reasoning. Developments in artificial intelligence have opened up ways of increasing the applicability of computers by acquiring and representing knowledge in computable forms. These approaches supplement rather than supplant existing uses of computers. They begin to allow the explicit representation of human knowledge. The remainder of this chapter provides a brief introduction to this field and describes, through applications, its relevance to computer-aided architectural design. (A toy sketch of a declarative, rule-based representation follows this record.)
series CAAD Futures
email
last changed 2003/05/16 20:58
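
A toy sketch of the declarative, nonprocedural representation contrasted with conventional CAD in the chapter abstract above: facts and IF-THEN rules stated symbolically, with a generic forward-chaining step deriving new facts. The predicates and the rule are invented for illustration and are not taken from Gero's chapter.

# Toy declarative knowledge base: facts plus IF-THEN rules, queried by
# forward chaining rather than by a fixed procedural algorithm.
facts = {("room", "office_1"), ("faces", "office_1", "north")}

# Each rule: (list of condition patterns, conclusion pattern); "?x" is a variable.
rules = [
    ([("room", "?x"), ("faces", "?x", "north")], ("needs_artificial_light", "?x")),
]

def bind(pattern, fact, env):
    """Try to match a pattern against a fact, extending the variable bindings."""
    if len(pattern) != len(fact):
        return None
    env = dict(env)
    for p, f in zip(pattern, fact):
        if p.startswith("?"):
            if env.get(p, f) != f:
                return None
            env[p] = f
        elif p != f:
            return None
    return env

def match_all(conditions, facts, env):
    """Yield every binding environment that satisfies all conditions."""
    if not conditions:
        yield env
        return
    first, rest = conditions[0], conditions[1:]
    for fact in list(facts):
        extended = bind(first, fact, env)
        if extended is not None:
            yield from match_all(rest, facts, extended)

def forward_chain(facts, rules):
    """Repeatedly apply rules until no new facts can be derived."""
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            for env in match_all(conditions, facts, {}):
                new_fact = tuple(env.get(term, term) for term in conclusion)
                if new_fact not in facts:
                    facts.add(new_fact)
                    changed = True
    return facts

print(forward_chain(set(facts), rules))
# derives ('needs_artificial_light', 'office_1') from the two declarative facts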

_id 68aa
authors Greenberg, Donald P.
year 1986
title Computer Graphics and Visualization
source Computer-Aided Architectural Design Futures [CAAD Futures Conference Proceedings / ISBN 0-408-05300-3] Delft (The Netherlands), 18-19 September 1985, pp. 63-67
summary The field of computer graphics has made enormous progress during the past decade. It is rapidly approaching the time when we will be able to create images of such realism that it will be possible to 'walk through' nonexistent spaces and to evaluate their aesthetic quality based on the simulations. In this chapter we wish to document the historical development of computer graphics image creation and describe some techniques which are currently being developed. We will try to explain some pilot projects that we are just beginning to undertake at the Program of Computer Graphics and the Center for Theory and Simulation in Science and Engineering at Cornell University.
series CAAD Futures
last changed 1999/04/03 17:58

_id 4f6f
authors Kalay, Yehuda E.
year 1985
title Knowledge-Based Computer-Aided Design to Assist Designers of Physical Artifacts
source 1985. [15] p. : ill. includes bibliography
summary The objectives of this project are to increase the productivity of physical designers and to improve the quality of designed artifacts and environments. The means for achieving these objectives include the development, implementation and verification of a broad-based methodology to be used for building context-sensitive computer-aided design systems that facilitate the design and fabrication of physical artifacts. Such systems will extend computer aids for design over the earliest phases of the design process and thus facilitate design capture in addition to the common design-communication utilities they currently provide. They will thus constitute intelligent design assistants that relieve the designer of the need to deal with some design details, as well as the need to explicitly manage the consistency of the design database. The project employs principles developed by Artificial Intelligence methods used in non-deterministic problem-solving processes that represent data and knowledge in distributed networks. Principles such as object-centered data factorization and message-based change-propagation techniques are implemented in an existing architectural computer-aided design system and field-tested in a practicing architectural/engineering office. (A generic change-propagation sketch follows this record.)
keywords CAD, knowledge base, design methods, design process, architecture
series CADline
email
last changed 2003/06/02 13:58
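
The abstract above mentions object-centered data factorization and message-based change propagation. The sketch below illustrates that general idea with a plain observer pattern; the class names and the derived quantity are hypothetical and do not reproduce Kalay's implementation.

# Generic illustration of message-based change propagation between design
# objects: when a wall's length changes, dependent objects are notified so
# derived quantities stay consistent without explicit bookkeeping by the user.
class DesignObject:
    def __init__(self, name):
        self.name = name
        self._observers = []

    def subscribe(self, observer):
        self._observers.append(observer)

    def notify(self, **message):
        for observer in self._observers:
            observer.receive(self, **message)

class Wall(DesignObject):
    def __init__(self, name, length, height):
        super().__init__(name)
        self.length, self.height = length, height

    def set_length(self, length):
        self.length = length
        self.notify(changed="length", value=length)   # propagate the change

class AreaTracker:
    """Keeps a derived quantity (total wall area) consistent with the model."""
    def __init__(self, walls):
        self.walls = walls
        for wall in walls:
            wall.subscribe(self)
        self.total_area = sum(w.length * w.height for w in walls)

    def receive(self, sender, **message):
        self.total_area = sum(w.length * w.height for w in self.walls)
        print(f"{sender.name} changed; total wall area is now {self.total_area} m2")

walls = [Wall("W1", 5.0, 2.7), Wall("W2", 4.0, 2.7)]
tracker = AreaTracker(walls)
walls[0].set_length(6.0)   # -> total area recomputed automatically (27.0 m2)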

_id a920
authors Kulcke, Richard
year 1989
title CAAD in the Architectural Education of the Fachhochschulen in the Federal Republic of Germany
doi https://doi.org/10.52842/conf.ecaade.1989.x.w7a
source CAAD: Education - Research and Practice [eCAADe Conference Proceedings / ISBN 87-982875-2-4] Aarhus (Denmark) 21-23 September 1989, pp. 4.3.1
summary For over 10 years the author has been a teacher in the field of "computer application in architecture" at the Fachhochschule. Since 1985 he has regularly taken part in the conferences of A.I.I.D.A. (Arbeitskreis INFORMATIK IN DER ARCHITEKTENAUSBILDUNG). All the faculties of architecture at the Fachhochschulen (about 10) can send their CAAD representatives to the conferences. A.I.I.D.A. has held two conferences a year since 1985. At the last conference, in Wiesbaden, a paper with A.I.I.D.A.'s statements on further education in CAAD was completed; the author presents and explains this paper. He also presents the current CAAD education program of his own faculty. Education in CAAD there started in 1972 with basic information without practical elements; now practical work at the workstation takes most of the time. The computer application is available for subjects like Building Economics, Building and Structure Design and others. With his assistant, the author developed programs in the field of Building Economics. In 1986 he started to introduce CAD with AutoCAD into the education program. Now other colleagues are also starting to integrate CAAD into their subjects.

series eCAADe
last changed 2022/06/07 07:50

_id 244d
authors Monedero, J., Casaus, A. and Coll, J.
year 1992
title From Barcelona. Chronicle and Provisional Evaluation of a New Course on Architectural Solid Modelling by Computerized Means
doi https://doi.org/10.52842/conf.ecaade.1992.351
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 351-362
summary The first step at the ETSAB in the computer field goes back to 1965, when professors Margarit and Buxade acquired an IBM computer, an electromechanical machine which used punched cards and which was employed to produce an innovative method of structural calculation. This method was incorporated into the academic courses and, at that time, the recurring question "should students learn programming?" was readily answered: the exercises required some knowledge of Fortran, and every student needed this knowledge to do them. This method, well known in Europe at that time, also provided a service for professional practice and marked the beginning of what is now the CC (Centro de Calculo) of our school. In 1980 the school bought a PDP-11/34, a computer which had 256 Kb of RAM, two disks of 5 Mb and one of 10 Mb, and a multiplexer of 8 lines. Some time later the general policy of the UPC changed course, and this was related to the purchase of a VAX which is still the basis of the CC and carries most of the administrative burden of the school. 1985 was probably the first year in which we can talk of a general policy of the school directed towards computers. A report was made that year, which includes an enquiry addressed to the six departments of the school (Graphic Expression, Projects, Structures, Construction, Composition and Urbanism) and contains interesting data. According to the report, four departments used computers in their current courses, while the other two (Projects and Composition) did not use them at all. The main user was the Department of Structures, while the involvement of the remaining three was rather sporadic. The kinds of problems detected in this report are very typical: lack of resources for hardware, software and maintenance of the few computers the school had at that moment, and a demand (posed by the students) greatly exceeding the supply (computers and teachers). The main problem appeared to be the lack of computer graphic devices and proper software.

series eCAADe
email
last changed 2022/06/07 07:58

_id 6c66
authors Perlin, Ken
year 1985
title An Image Synthesizer
source SIGGRAPH '85 Conference Proceedings. July 1985. vol. 19, no. 3: pp. 287-296 : ill. includes bibliography
summary The author introduces the concept of a Pixel Stream Editor. This forms the basis for an interactive synthesizer for designing highly realistic computer-generated imagery. The designer works in an interactive, very-high-level programming environment which provides a very fast concept/implement/view iteration cycle. Naturalistic visual complexity is built up by composition of non-linear functions, as opposed to the more conventional texture-mapping or growth-model algorithms. Powerful primitives are included for creating controlled stochastic effects. The concept of 'solid texture' is introduced to the field of CGI. The author has used this system to create very convincing representations of clouds, fire, water, stars, marble, wood, rock, soap films and crystals. The algorithms created with this paradigm are generally extremely fast, highly realistic, and asynchronously parallelizable at the pixel level. (A crude illustration of building a texture by composing non-linear noise functions follows this record.)
keywords computer graphics, programming, algorithms, synthesis, realism
series CADline
last changed 1999/02/12 15:09
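
The sketch below illustrates the general idea named in the abstract above: building naturalistic complexity by composing non-linear functions of a noise primitive over 3D points (a solid texture). The hash-based value noise, the constants and the "marble" composition are crude stand-ins, not Perlin's actual noise or shading functions.

import math

def hash_noise(ix, iy, iz):
    """Deterministic pseudo-random value in [0, 1] for an integer lattice point."""
    n = ix * 374761393 + iy * 668265263 + iz * 2147483647
    n = (n ^ (n >> 13)) * 1274126177 & 0xFFFFFFFF
    return n / 0xFFFFFFFF

def value_noise(x, y, z):
    """Trilinear interpolation of lattice values -> a smooth-ish scalar field."""
    ix, iy, iz = math.floor(x), math.floor(y), math.floor(z)
    fx, fy, fz = x - ix, y - iy, z - iz
    def lerp(a, b, t):
        return a + (b - a) * t
    corners = [[[hash_noise(ix + dx, iy + dy, iz + dz) for dz in (0, 1)]
                for dy in (0, 1)] for dx in (0, 1)]
    return lerp(
        lerp(lerp(corners[0][0][0], corners[0][0][1], fz),
             lerp(corners[0][1][0], corners[0][1][1], fz), fy),
        lerp(lerp(corners[1][0][0], corners[1][0][1], fz),
             lerp(corners[1][1][0], corners[1][1][1], fz), fy),
        fx)

def turbulence(x, y, z, octaves=4):
    """Sum of noise at increasing frequencies and decreasing amplitudes."""
    return sum(value_noise(x * 2**o, y * 2**o, z * 2**o) / 2**o for o in range(octaves))

def marble(x, y, z):
    """A marble-like solid texture: a sine warped by turbulence."""
    return 0.5 * (1.0 + math.sin(8.0 * x + 4.0 * turbulence(x, y, z)))

print(marble(0.3, 0.7, 0.1))   # intensity in [0, 1] at a 3D point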

_id c5a8
authors Schmitt, Gerhard N. (Ed.)
year 1991
title CAAD Futures '91 [Conference Proceedings]
source International Conference on Computer-Aided Architectural Design 1989/ ISBN 3-528-08821-4 / Zürich (Switzerland), July 1991, 594 p.
summary Computer Aided Architectural Design (CAAD) is the art of design and computation. Since the establishment of the CAAD Futures organization in 1985, experts have met every two years to explore the state of the art and to postulate on future developments in Computer Aided Architectural Design. The fourth international CAAD Futures conference took place in July 1991 in Zürich at the Swiss Federal Institute of Technology (ETH Zürich), organized by the Chair for CAAD. More than 220 participants from 25 countries attended the conference. Presentation topics were education, research and application. The mission of CAAD Futures '91 was to provide an international forum for the dissemination and discussion of future-oriented developments and new experiences in the field of Computer Aided Architectural Design. This book is one result of the conference and is divided into three sections: Education, Research and Application. This international overview of the 1991 state of the art in Computer Aided Architectural Design will serve as a reference for design teachers, researchers, and application developers interested in CAAD.
series CAAD Futures
email
last changed 2003/05/16 20:58

_id 6686
authors Straub, K.
year 1986
title Problems in CAD Practice
source Computer-Aided Architectural Design Futures [CAAD Futures Conference Proceedings / ISBN 0-408-05300-3] Delft (The Netherlands), 18-19 September 1985, pp. 232-234
summary CAD's greatest promise is as a creative, interactive tool, and planning and construction will be more complex as the need to expand information grows. Our tools not only shape our products, they shape our lives. Technology can influence everyday life and also affect the structure of our society. Architecture is an information-intensive profession, and throughout the world information-intensive activities are being changed by technology. The use of computer-aided information processing in planning and construction brings about a period of dramatic change, and the dimensions of technological change will be breathtaking. In the years to come, CAD will be an expanding field in the architectural office, but how long will it be before architecture is routinely produced on a CAD system? There appear to be three issues: (1) cost; (2) time; (3) quality.
series CAAD Futures
last changed 1999/04/03 17:58

_id 0ecb
authors Waerum, Jens and Rüdiger Kristiansen, Bjarne
year 1989
title CAAD Education at the School of Architecture Copenhagen
doi https://doi.org/10.52842/conf.ecaade.1989.x.q8k
source CAAD: Education - Research and Practice [eCAADe Conference Proceedings / ISBN 87-982875-2-4] Aarhus (Denmark) 21-23 September 1989, pp. 4.5.1-4.5.9
summary The establishment of Datacentret (the Data Centre) in summer 1985 was preceded by 15 years of slow-moving, arduous work, from the early experiments in what was then the computing laboratory under the supervision of architect Per Jacobi, author of the Danish 3D drawing system MONSTER, until 1984, when a special committee was commissioned to draw up proposals for the introduction of teaching in computing at the Architects School. In spring 1985 the school administrators decided that a central computer workshop should be set up and, in cooperation with the school's institutes, placed jointly in charge of instructing teachers and students, carrying out research and development within the field of architecture, and taking steps to work out a curriculum of supplementary training for practising architects. With the aid of a special grant, 12 PCs were successfully acquired in the two years that followed, as well as a screen projector and other peripherals.
series eCAADe
last changed 2022/06/07 07:50

_id ca88
authors Buzbee, B.L. and Sharp, D.H.
year 1985
title Perspectives on Supercomputing
source Science. February, 1985. vol. 227: pp. 591-597 : ill. includes bibliography
summary This article provides a brief look at the current status of supercomputers and supercomputing in the United States. It addresses a variety of applications of supercomputers and the characteristics of a large modern supercomputing facility, the radical changes in the design of supercomputers that are impending, and the conditions that are necessary for a conducive climate for the further development and application of supercomputers
keywords parallel processing, hardware, business
series CADline
last changed 2003/06/02 13:58

_id 78ca
authors Friedland, P. (Ed.)
year 1985
title Special Section on Architectures for Knowledge-Based Systems
source CACM 28(9), September 1985
summary A fundamental shift in the preferred approach to building applied artificial intelligence (AI) systems has taken place since the late 1960s. Previous work focused on the construction of general-purpose intelligent systems; the emphasis was on powerful inference methods that could function efficiently even when the available domain-specific knowledge was relatively meager. Today the emphasis is on the role of specific and detailed knowledge, rather than on reasoning methods. The first successful application of this method, which goes by the name of knowledge-based or expert-system research, was the DENDRAL program at Stanford, a long-term collaboration between chemists and computer scientists for automating the determination of molecular structure from empirical formulas and mass spectral data. The key idea is that knowledge is power, for experts, be they human or machine, are often those who know more facts and heuristics about a domain than lesser problem solvers. The task of building an expert system, therefore, is predominantly one of "teaching" a system enough of these facts and heuristics to enable it to perform competently in a particular problem-solving context. Such a collection of facts and heuristics is commonly called a knowledge base. Knowledge-based systems are still dependent on inference methods that perform reasoning on the knowledge base, but experience has shown that simple inference methods like generate-and-test, backward chaining, and forward chaining are very effective in a wide variety of problem domains when they are coupled with powerful knowledge bases. If this methodology remains preeminent, then the task of constructing knowledge bases becomes the rate-limiting factor in expert-system development. Indeed, a major portion of the applied AI research in the last decade has been directed at developing techniques and tools for knowledge representation. We are now in the third generation of such efforts. The first generation was marked by the development of enhanced AI languages like Interlisp and PROLOG. The second generation saw the development of knowledge representation tools at AI research institutions; Stanford, for instance, produced EMYCIN, The Unit System, and MRS. The third generation is now producing fully supported commercial tools like KEE and S.1. Each generation has seen a substantial decrease in the amount of time needed to build significant expert systems. Ten years ago prototype systems commonly took on the order of two years to show proof of concept; today such systems are routinely built in a few months. Three basic methodologies - frames, rules, and logic - have emerged to support the complex task of storing human knowledge in an expert system. Each of the articles in this Special Section describes and illustrates one of these methodologies. "The Role of Frame-Based Representation in Reasoning," by Richard Fikes and Tom Kehler, describes an object-centered view of knowledge representation, whereby all knowledge is partitioned into discrete structures (frames) having individual properties (slots). Frames can be used to represent broad concepts, classes of objects, or individual instances or components of objects. They are joined together in an inheritance hierarchy that provides for the transmission of common properties among the frames without multiple specification of those properties. The authors use the KEE knowledge representation and manipulation tool to illustrate the characteristics of frame-based representation for a variety of domain examples. They also show how frame-based systems can be used to incorporate a range of inference methods common to both logic and rule-based systems. "Rule-Based Systems," by Frederick Hayes-Roth, chronicles the history and describes the implementation of production rules as a framework for knowledge representation. In essence, production rules use IF conditions THEN conclusions and IF conditions THEN actions structures to construct a knowledge base. The author catalogs a wide range of applications for which this methodology has proved natural and (at least partially) successful for replicating intelligent behavior. The article also surveys some already-available computational tools for facilitating the construction of rule-based knowledge bases and discusses the inference methods (particularly backward and forward chaining) that are provided as part of these tools. The article concludes with a consideration of the future improvement and expansion of such tools. The third article, "Logic Programming," by Michael Genesereth and Matthew Ginsberg, provides a tutorial introduction to the formal method of programming by description in the predicate calculus. Unlike traditional programming, which emphasizes how computations are to be performed, logic programming focuses on the what of objects and their behavior. The article illustrates the ease with which incremental additions can be made to a logic-oriented knowledge base, as well as the automatic facilities for inference (through theorem proving) and explanation that result from such formal descriptions. A practical example of diagnosis of digital device malfunctions is used to show how significant and complex problems can be represented in the formalism. A note to the reader who may infer that the AI community is being split into competing camps by these three methodologies: although each provides advantages in certain specific domains (logic where the domain can be readily axiomatized and where complete causal models are available, rules where most of the knowledge can be conveniently expressed as experiential heuristics, and frames where complex structural descriptions are necessary to adequately describe the domain), the current view is one of synthesis rather than exclusivity. Both logic and rule-based systems commonly incorporate frame-like structures to facilitate the representation of large amounts of factual information, and frame-based systems like KEE allow both production rules and predicate calculus statements to be stored within and activated from frames to do inference. The next generation of knowledge representation tools may even help users to select appropriate methodologies for each particular class of knowledge, and then automatically integrate the various methodologies so selected into a consistent framework for knowledge. (A minimal frame-with-inheritance sketch follows this record.)
series journal paper
last changed 2003/04/23 15:14
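
A minimal frame sketch, loosely in the spirit of the frame-based systems the abstract above surveys (it is not KEE, and the frame and slot names are invented): a frame holds named slots and a parent, and slot lookup falls back along the inheritance chain so common properties are stated once.

# Minimal frame-based representation with slot inheritance.
class Frame:
    def __init__(self, name, parent=None, **slots):
        self.name, self.parent, self.slots = name, parent, dict(slots)

    def get(self, slot):
        """Return the slot value, searching up the inheritance hierarchy."""
        frame = self
        while frame is not None:
            if slot in frame.slots:
                return frame.slots[slot]
            frame = frame.parent
        raise KeyError(f"{self.name} has no slot '{slot}'")

# A tiny hierarchy: broad concept -> class of objects -> individual instance.
building_element = Frame("building-element", load_bearing=False)
wall = Frame("wall", parent=building_element, material="masonry")
wall_w17 = Frame("wall-W17", parent=wall, load_bearing=True, length_m=4.2)

print(wall_w17.get("material"))       # 'masonry'  (inherited from wall)
print(wall_w17.get("load_bearing"))   # True       (overrides the default)
print(wall.get("load_bearing"))       # False      (inherited from building-element)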

_id 0397
authors Nadler, Edmond
year 1985
title Piecewise Linear Approximation on Triangulations of a Planar Region
source Reports in Pattern Analysis. [2], V, 76 p. :ill. May, 1985. No. 140. includes bibliography
summary For any triangulation of a given polygonal region, consider the piecewise linear least-squares approximation of a given smooth function u. The problem is to characterize triangulations for which the global error of approximation is minimized for a given number of triangles. The analogous problem in one dimension has been thoroughly analyzed, but in higher dimensions one must also consider the shapes of the subregions, not only their relative sizes. After establishing the existence of such an optimal triangulation, the local problem of best triangle shape is considered. Using an expression for the error of approximation involving the matrix H of second derivatives, the best-shaped triangle is seen to be an equilateral triangle transformed by a matrix related to H. This triangle is long in the direction of minimum curvature and narrow in the direction of maximum curvature, as one would expect. For the global problem, a series of two lower bounds on the approximation error is obtained, which suggests an asymptotic error estimate for optimal triangulation. The error estimate is shown to hold, and the conditions for attaining the lower bounds characterize the sizes and shapes of the triangles in the optimal triangulation. The shapes are seen to approach the optimal shapes described in the local analysis, and the errors on the triangles are seen to be asymptotically balanced. (A standard-notation error expression is sketched after this record.)
keywords triangulation, landscape, topology, computational geometry, computer graphics
series CADline
last changed 1999/02/12 15:09
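
The error expression the abstract alludes to can be illustrated with a standard identity for linear interpolation of a quadratic function (generic notation and constants; this is not the report's own derivation). For a quadratic u with constant Hessian H, the linear interpolant \Pi_T u on a triangle T with vertices x_1, x_2, x_3 and barycentric coordinates \lambda_i(x) satisfies

\[
  (\Pi_T u - u)(x)
  \;=\; \tfrac{1}{2}\sum_{1 \le i < j \le 3}
        \lambda_i(x)\,\lambda_j(x)\,(x_i - x_j)^{\top} H \,(x_i - x_j),
\]
\[
  \text{hence}\qquad
  \| u - \Pi_T u \|_{L^\infty(T)}
  \;\le\; \tfrac{1}{6}\,\max_{\text{edges } e \subset T}\bigl| e^{\top} H\, e \bigr|.
\]

Read this way, the error is governed by e^T H e over the triangle's edges, so the best triangle keeps its edges long where the curvature encoded in H is small and short where it is large: an equilateral triangle transformed by a matrix related to H, as the abstract states.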

_id ee4b
id ee4b
authors Ozel, Filiz
year 1985
title Using CAD in Fire Safety Research
doi https://doi.org/10.52842/conf.acadia.1985.142
source ACADIA Workshop ‘85 [ACADIA Conference Proceedings] Tempe (Arizona / USA) 2-3 November 1985, pp. 142-154
summary While architecture offices are increasingly using CADD systems for drafting purposes, architectural schools are pursuing projects that use the CAD database for new applications in the analysis and evaluation of buildings. This paper summarizes two studies done at the University of Michigan Architecture Research Laboratory, where the CAD system was used to develop a fire safety code evaluation program and an emergency egress behavior simulation.

The former takes the National Fire Protection Association (NFPA) Life Safety Code 101 as a basis and generates the code-compliance requirements of a given project. The other study treats people as information-processing beings and simulates their wayfinding behavior under emergency conditions. Both studies utilize the graphic capabilities of the CAD system, producing color displays on the CRT screen and outputting information in tabular form which refers to the display on the screen. Both also have plotting options. (A hypothetical compliance-check sketch follows this record.)

series ACADIA
email
last changed 2022/06/07 08:00
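
A hypothetical sketch of a compliance check of the kind the abstract above describes: rooms extracted from a CAD model are tested against egress rules. The limit values and the room data below are placeholders, not figures from NFPA Life Safety Code 101 or from the paper.

from dataclasses import dataclass

@dataclass
class Room:
    name: str
    occupant_load: int
    travel_distance_m: float   # longest path from the room to an exit
    exits: int

def check_egress(rooms, max_travel_m=45.0, two_exit_threshold=50):
    """Yield (room, message) pairs for every rule violation found."""
    for room in rooms:
        if room.travel_distance_m > max_travel_m:
            yield room, (f"travel distance {room.travel_distance_m} m exceeds "
                         f"the {max_travel_m} m limit")
        if room.occupant_load > two_exit_threshold and room.exits < 2:
            yield room, "occupant load requires at least two exits"

rooms = [Room("Studio A", 60, 52.0, 1), Room("Office 210", 4, 18.0, 1)]
for room, message in check_egress(rooms):
    print(f"{room.name}: {message}")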

_id 8f9d
authors Wolchko, Matthew J.
year 1985
title Strategies Toward Architectural Knowledge Engineering
doi https://doi.org/10.52842/conf.acadia.1985.069
source ACADIA Workshop ‘85 [ACADIA Conference Proceedings] Tempe (Arizona / USA) 2-3 November 1985, pp. 69-82
summary Conventional CAD drafting systems become more powerful modeling tools with the addition of a linked attribute-spreadsheet module. This affords the designer the ability to make design decisions not only in the graphic environment, but also as a consequence of quantitative design constraints made apparent in the spreadsheet. While the spreadsheet interface is easily understood by the user, it suffers from two limitations: it lacks a variety of functional capabilities that would enable it to solve more complex design tasks, and it can only report on existing conditions in the graphic environment. A proposal is made for enhancing the spreadsheet's programming power, creating an interface for the selection of program modules that can solve various architectural design tasks. Owing to the complexity and graphic nature of architectural design, it is suggested that both procedural and propositional programming methods be used in concert within such a system. In the following, a suitable design task (artificial illumination: reflected ceiling layout) is selected and then decomposed into two parts: the quantitative analysis (via the application of a procedural programming algorithm), and the generation of a logical model using shape grammar rules in a propositional framework. (A generic lumen-method sketch follows this record.)
series ACADIA
last changed 2022/06/07 07:57
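
The quantitative half of the task named in the abstract above (sizing an artificial-lighting, reflected-ceiling layout) can be illustrated with a generic lumen-method calculation. The coefficient-of-utilization and loss-factor values below are placeholders, and this is not the program module proposed in the paper.

import math

def luminaires_required(target_lux, room_area_m2, lumens_per_luminaire,
                        coefficient_of_utilization=0.6, light_loss_factor=0.8):
    """Number of luminaires needed to reach target_lux on the work plane."""
    delivered_per_luminaire = (lumens_per_luminaire
                               * coefficient_of_utilization
                               * light_loss_factor)
    required_lumens = target_lux * room_area_m2
    return math.ceil(required_lumens / delivered_per_luminaire)

def grid_layout(count, room_w, room_l):
    """Arrange the luminaires on a simple rows-by-columns ceiling grid."""
    columns = max(1, round(math.sqrt(count * room_w / room_l)))
    rows = math.ceil(count / columns)
    return rows, columns

n = luminaires_required(target_lux=500, room_area_m2=6.0 * 9.0,
                        lumens_per_luminaire=3600)
print(n, grid_layout(n, 6.0, 9.0))   # e.g. 16 luminaires on a 6 x 3 grid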

No more hits.
