CumInCAD is a cumulative index of publications in Computer Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD Futures.


Hits 1 to 20 of 145

_id 78ca
authors Friedland, P. (Ed.)
year 1985
title Special Section on Architectures for Knowledge-Based Systems
source CACM (28), 9, September
summary A fundamental shift in the preferred approach to building applied artificial intelligence (AI) systems has taken place since the late 1960s. Previous work focused on the construction of general-purpose intelligent systems; the emphasis was on powerful inference methods that could function efficiently even when the available domain-specific knowledge was relatively meager. Today the emphasis is on the role of specific and detailed knowledge, rather than on reasoning methods. The first successful application of this method, which goes by the name of knowledge-based or expert-system research, was the DENDRAL program at Stanford, a long-term collaboration between chemists and computer scientists for automating the determination of molecular structure from empirical formulas and mass spectral data. The key idea is that knowledge is power, for experts, be they human or machine, are often those who know more facts and heuristics about a domain than lesser problem solvers. The task of building an expert system, therefore, is predominantly one of "teaching" a system enough of these facts and heuristics to enable it to perform competently in a particular problem-solving context. Such a collection of facts and heuristics is commonly called a knowledge base. Knowledge-based systems are still dependent on inference methods that perform reasoning on the knowledge base, but experience has shown that simple inference methods like generate and test, backward-chaining, and forward-chaining are very effective in a wide variety of problem domains when they are coupled with powerful knowledge bases. If this methodology remains preeminent, then the task of constructing knowledge bases becomes the rate-limiting factor in expert-system development. Indeed, a major portion of the applied AI research in the last decade has been directed at developing techniques and tools for knowledge representation. We are now in the third generation of such efforts. The first generation was marked by the development of enhanced AI languages like Interlisp and PROLOG. The second generation saw the development of knowledge representation tools at AI research institutions; Stanford, for instance, produced EMYCIN, The Unit System, and MRS. The third generation is now producing fully supported commercial tools like KEE and S.1. Each generation has seen a substantial decrease in the amount of time needed to build significant expert systems. Ten years ago prototype systems commonly took on the order of two years to show proof of concept; today such systems are routinely built in a few months. Three basic methodologies (frames, rules, and logic) have emerged to support the complex task of storing human knowledge in an expert system. Each of the articles in this Special Section describes and illustrates one of these methodologies. "The Role of Frame-Based Representation in Reasoning," by Richard Fikes and Tom Kehler, describes an object-centered view of knowledge representation, whereby all knowledge is partitioned into discrete structures (frames) having individual properties (slots). Frames can be used to represent broad concepts, classes of objects, or individual instances or components of objects. They are joined together in an inheritance hierarchy that provides for the transmission of common properties among the frames without multiple specification of those properties.
The authors use the KEE knowledge representation and manipulation tool to illustrate the characteristics of frame-based representation for a variety of domain examples. They also show how frame-based systems can be used to incorporate a range of inference methods common to both logic and rule-based systems. "Rule-Based Systems," by Frederick Hayes-Roth, chronicles the history and describes the implementation of production rules as a framework for knowledge representation. In essence, production rules use IF conditions THEN conclusions and IF conditions THEN actions structures to construct a knowledge base. The author catalogs a wide range of applications for which this methodology has proved natural and (at least partially) successful for replicating intelligent behavior. The article also surveys some already-available computational tools for facilitating the construction of rule-based knowledge bases and discusses the inference methods (particularly backward- and forward-chaining) that are provided as part of these tools. The article concludes with a consideration of the future improvement and expansion of such tools. The third article, "Logic Programming," by Michael Genesereth and Matthew Ginsberg, provides a tutorial introduction to the formal method of programming by description in the predicate calculus. Unlike traditional programming, which emphasizes how computations are to be performed, logic programming focuses on the what of objects and their behavior. The article illustrates the ease with which incremental additions can be made to a logic-oriented knowledge base, as well as the automatic facilities for inference (through theorem proving) and explanation that result from such formal descriptions. A practical example of diagnosis of digital device malfunctions is used to show how significant and complex problems can be represented in the formalism. A note to the reader who may infer that the AI community is being split into competing camps by these three methodologies: Although each provides advantages in certain specific domains (logic where the domain can be readily axiomatized and where complete causal models are available, rules where most of the knowledge can be conveniently expressed as experiential heuristics, and frames where complex structural descriptions are necessary to adequately describe the domain), the current view is one of synthesis rather than exclusivity. Both logic and rule-based systems commonly incorporate frame-like structures to facilitate the representation of large amounts of factual information, and frame-based systems like KEE allow both production rules and predicate calculus statements to be stored within and activated from frames to do inference. The next generation of knowledge representation tools may even help users to select appropriate methodologies for each particular class of knowledge, and then automatically integrate the various methodologies so selected into a consistent framework for knowledge. (A minimal illustrative sketch of these representations follows this entry.)
series journal paper
last changed 2003/04/23 15:14
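
To make the frame/rule contrast in the abstract above concrete, the following is a minimal illustrative sketch in Python. It is not the KEE, EMYCIN or S.1 API (those tools are only named above); the frame, slot and rule names are hypothetical. It shows a frame with slots and inheritance, plus one forward-chained production rule.

# Illustrative sketch only: frames (slots + inheritance) and a production rule.
class Frame:
    """A frame: named slots, with inheritance from a parent frame."""
    def __init__(self, name, parent=None, **slots):
        self.name, self.parent, self.slots = name, parent, slots

    def get(self, slot):
        # Look the slot up locally, otherwise inherit it along the parent chain.
        if slot in self.slots:
            return self.slots[slot]
        return self.parent.get(slot) if self.parent else None

# A broad concept, a class of objects, and an individual instance, joined by inheritance.
building = Frame("building", storeys=1, fire_rated=False)
hospital = Frame("hospital", parent=building, fire_rated=True)
ward = Frame("nursing_ward", parent=hospital, storeys=3)

# A production rule: IF conditions THEN conclusion, applied by forward chaining.
facts = {"occupancy": "hospital", "storeys": ward.get("storeys")}
rules = [
    (lambda f: f["occupancy"] == "hospital" and f["storeys"] > 1,
     ("needs_sprinklers", True)),
]
for condition, (key, value) in rules:
    if condition(facts):
        facts[key] = value  # assert the conclusion as a new fact

print(ward.get("fire_rated"), facts["needs_sprinklers"])  # -> True True

A logic-programming version of the same rule would state it declaratively, as a clause over predicates, and leave the chaining to a theorem prover, as the third article describes.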

_id e234
authors Kalay, Yehuda E. and Harfmann, Anton C.
year 1985
title An Integrative Approach to Computer-Aided Design Education in Architecture
source February, 1985. [17] p. : [8] p. of ill
summary With the advent of CAD, schools of architecture are now obliged to prepare their graduates for using the emerging new design tools and methods in architectural practices of the future. In addition to this educational obligation, schools of architecture (possibly in partnership with practicing firms) are also the most appropriate agents for pursuing research in CAD that will lead to the development of better CAD software for use by the profession as a whole. To meet these two rather different obligations, two kinds of CAD education curricula are required: one which prepares tool- users, and another that prepares tool-builders. The first educates students about the use of CAD tools for the design of buildings, whereas the second educates them about the design of CAD tools themselves. The School of Architecture and Planning in SUNY at Buffalo has recognized these two obligations, and in Fall 1982 began to meet them by planning and implementing an integrated CAD environment. This environment now consists of 3 components: a tool-building sequence of courses, an advanced research program, and a general tool-users architectural curriculum. Students in the tool-building course sequence learn the principles of CAD and may, upon graduation, become researchers and the managers of CAD systems in practicing offices. While in school they form a pool of research assistants who may be employed in the research component of the CAD environment, thereby facilitating the design and development of advanced CAD tools. The research component, through its various projects, develops and provides state of the art tools to be used by practitioners as well as by students in the school, in such courses as architectural studio, environmental controls, performance programming, and basic design courses. Students in these courses who use the tools developed by the research group constitute the tool-users component of the CAD environment. While they are being educated in the methods they will be using throughout their professional careers, they also act as a 'real-world' laboratory for testing the software and thereby provide feedback to the research component. The School of Architecture and Planning in SUNY at Buffalo has been the first school to incorporate such a comprehensive CAD environment in its curriculum, thereby successfully fulfilling its obligation to train students in the innovative methods of design that will be used in architectural practices of the future, and at the same time making a significant contribution to the profession of architecture as a whole. This paper describes the methodology and illustrates the history of the CAD environment's implementation in the School
keywords CAD, architecture, education
series CADline
email
last changed 2003/06/02 13:58

_id ddssar0206
id ddssar0206
authors Bax, M.F.Th. and Trum, H.M.G.J.
year 2002
title Faculties of Architecture
source Timmermans, Harry (Ed.), Sixth Design and Decision Support Systems in Architecture and Urban Planning - Part one: Architecture Proceedings (Avegoor, the Netherlands), 2002
summary In order to be inscribed in the European Architect’s register the study program leading to the diploma ‘Architect’ has to meet the criteria of the EC Architect’s Directive (1985). The criteria are enumerated in 11 principles of Article 3 of the Directive. The Advisory Committee, established by the European Council, was given the task of examining such diplomas in cases where doubts are raised by other Member States. To carry out this task a matrix was designed, as an independent interpreting framework that mediates between the principles of Article 3 and the actual study program of a faculty. Such a tool was needed because of inconsistencies in the list of principles, differences between linguistic versions of the Directive, and quantification problems with time devoted to the principles in the study programs. The core of the matrix, its headings, is a categorisation of the principles on a higher level of abstraction in the form of a taxonomy of domains and corresponding concepts. Filling in the matrix means that each study element of the study programs is analysed according to its content in terms of domains; the summation of study time devoted to the various domains results in a so-called ‘profile of a faculty’. Judgement of that profile takes place by a committee of peers. The domains of the taxonomy are intrinsically the same as the concepts and categories needed for the description of an architectural design object: the faculties of architecture. This correspondence relates the taxonomy to the field of design theory and philosophy. The taxonomy is an application of Domain theory. This theory, developed by the authors since 1977, takes as a view that the architectural object can only be described fully as an integration of all types of domains. The theory supports the idea of a participatory and interdisciplinary approach to design, which proved to be rewarding both from a scientific and a social point of view. All types of domains have in common that they are measured in three dimensions: form, function and process, connecting the material aspects of the object with its social and procedural aspects. In the taxonomy the function dimension is emphasised. It will be argued in the paper that the taxonomy is a categorisation following the pragmatistic philosophy of Charles Sanders Peirce. It will be demonstrated as well that the taxonomy is easy to handle by giving examples of its application in various countries in the last 5 years. The taxonomy proved to be an adequate tool for judgement of study programs and their subsequent improvement, as constituted by the faculties of a Faculty of Architecture. The matrix is described as the result of theoretical reflection and practical application of a matrix already in use since 1995. The major improvement of the matrix is its direct connection with Peirce’s universal categories and the self-explanatory character of its structure. The connection with Peirce’s categories gave the matrix a more universal character, which enables application in other fields where the term ‘architecture’ is used as a metaphor for artefacts. (A small sketch of the profile computation follows this entry.)
series DDSS
last changed 2003/11/21 15:16
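
As an illustrative aside to the abstract above: the ‘profile of a faculty’ is essentially a summation of study time over domains. The short Python sketch below uses hypothetical domain names and study elements, not the Directive's own categories or the committee's actual matrix, to show that computation.

# Hedged sketch: summing study time per domain into a 'profile of a faculty'.
# Domain names, study elements and hours are illustrative only.
domains = ["design", "history_theory", "technology", "social_context"]

# Each study element: total hours and the share of those hours devoted to each domain.
study_elements = [
    {"name": "Studio 1",         "hours": 240, "shares": {"design": 0.8, "technology": 0.2}},
    {"name": "Building Physics", "hours": 60,  "shares": {"technology": 1.0}},
    {"name": "History Survey",   "hours": 90,  "shares": {"history_theory": 1.0}},
]

def faculty_profile(elements, domains):
    """Sum the hours each study element devotes to each domain over the whole programme."""
    profile = {d: 0.0 for d in domains}
    for element in elements:
        for domain, share in element["shares"].items():
            profile[domain] += element["hours"] * share
    return profile

print(faculty_profile(study_elements, domains))
# {'design': 192.0, 'history_theory': 90.0, 'technology': 108.0, 'social_context': 0.0}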

_id ddss9408
id ddss9408
authors Bax, Thijs and Trum, Henk
year 1994
title A Taxonomy of Architecture: Core of a Theory of Design
source Second Design and Decision Support Systems in Architecture & Urban Planning (Vaals, the Netherlands), August 15-19, 1994
summary The authors developed a taxonomy of concepts in architectural design. It was accepted by the Advisory Committee for education in the field of architecture, a committee advising the European Commission and Member States, as a reference for their task to harmonize architectural education in Europe. The taxonomy is based on Domain theory, a theory developed by the authors which, based on General Systems Theory and the notion of structure according to French Structuralism, takes a participatory viewpoint for the integration of knowledge and interests by parties in the architectural design process. The paper discusses recent developments of the taxonomy, firstly as a result of a confrontation with similar endeavours to structure the field of architectural design, secondly as a result of applications in education and architectural design practice, and thirdly as a result of the application of some views derived from the philosophical work of Charles Sanders Peirce. Developments concern the structural form of the taxonomy, comprising basic concepts and level-bound scale concepts, and the specification of the content of the fields which these concepts represent. The confrontation with similar endeavours concerns mainly the work of an ARCUK working party, chaired by Tom Marcus, based on the European Directive from 1985. The application concerns experiences with a taxonomy-based enquiry in order to represent the profile of educational programmes of schools and faculties of architecture in Europe in qualitative and quantitative terms. This enquiry was carried out in order to achieve a basis for comparison and judgement, and a basis for future guidelines including quantitative aspects. Views of Peirce, more specifically his views on triarchy as a way of ordering and structuring processes of thinking, provide keys for a re-definition of concepts as building stones of the taxonomy in terms of the form-function-process triad, which strengthens the coherence of the taxonomy, allowing for a more regular representation in the form of a hierarchically ordered matrix.
series DDSS
last changed 2003/08/07 16:36

_id a217
authors Bhatt, Rajesh V., Fisher, Edward L. and Rasdorf, William J.
year 1985
title Information Retrieval Architectures For Expert System/DBMS Communication
source Industrial Engineering Fall Conference Proceedings. December, 1985. pp. 315-320. CADLINE has abstract only
summary The development of expert systems (ES) for manufacturing problems indicates a need to interact with potentially large amounts of data, much of which resides elsewhere in the ES user's organization. A large amount of information required for planning, design, and control operations can be made available through an existing database management system (DBMS). The need for an ES to access that data is critical. This paper presents two approaches to the development of ES- DBMS interfaces, both query-language based. One approach uses a procedural attachment to the ES language to obtain the required data via the DBMS query language, while the other one uses a separate interface program between the ES and the query language of the DBMS. The procedural attachment is able to acquire data from a DBMS at a faster rate than the interface program; however, the procedural attachment lacks knowledge of the DBMS schema. On the other hand, the interface program sacrifices speed but promotes flexibility, as it has the capability of selecting which DBMS to extract the required data from and allowing augmentation of schema knowledge outside of the ES. A disadvantage of the interface approach is the amount of time involved in data retrieval. The process of writing information to disk files is I/O intensive. This can be quite slow, particularly in PROLOG, the language used to implement the ES. Thus the use of such an interface is only suitable in applications such as design, where extremely fast I/O is not required
keywords design, engineering, expert systems, information, database, DBMS
series CADline
last changed 2003/06/02 10:24
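
As an illustrative aside to the abstract above: of the two approaches described, the second one (a separate interface program between the ES and the DBMS query language) can be sketched as follows. This is a hedged Python sketch that uses SQLite as a stand-in DBMS; the original work used PROLOG for the ES, and the table and column names here are hypothetical.

# Hedged sketch of an ES-DBMS interface program: the expert system asks for data,
# the interface translates the request into the DBMS query language (SQL here)
# and hands the rows back as facts. Table and column names are made up.
import sqlite3

def setup_demo_dbms():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE parts (part_id TEXT, material TEXT, stock INTEGER)")
    conn.executemany("INSERT INTO parts VALUES (?, ?, ?)",
                     [("P1", "steel", 40), ("P2", "aluminium", 0)])
    return conn

def interface_query(conn, table, conditions):
    """Translate an ES data request into a parameterised SQL query and return the rows."""
    where = " AND ".join(f"{column} = ?" for column in conditions)
    sql = f"SELECT * FROM {table} WHERE {where}"
    return conn.execute(sql, list(conditions.values())).fetchall()

# The expert system calls the interface instead of touching the DBMS directly:
conn = setup_demo_dbms()
facts = interface_query(conn, "parts", {"material": "steel"})
print(facts)  # [('P1', 'steel', 40)]

A procedural attachment (the first approach) would instead embed such a query call directly in the ES language, gaining speed but, as the abstract notes, losing the interface's separate knowledge of the DBMS schema.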

_id 644f
authors Bijl, Aart
year 1986
title Designing with Words and Pictures in a Logic Modelling Environment
source Computer-Aided Architectural Design Futures [CAAD Futures Conference Proceedings / ISBN 0-408-05300-3] Delft (The Netherlands), 18-19 September 1985, pp. 128-145
summary At EdCAAD we are interested in design as something people do. Designed artefacts, the products of designing, are interesting only in so far as they tell us something about design. An extreme expression of this position is to say that the world of design is the thoughts in the heads of designers, plus the skills of designers in externalizing their thoughts; design artifacts, once perceived and accepted in the worlds of other people, are no longer part of the world of design. We can describe design, briefly, as a process of synthesis. Design has to achieve a fusion between parts to create new parts, so that the products are recognized, as having a right and proper place in the world of people. Parts should be understood as referring to anything - physical objects, abstract ideas, aspirations. These parts occur in some design environment from which parts are extracted, designed upon and results replaced; in the example of buildings, the environment is people and results have to be judged by reference to that environment. It is characteristic of design that both the process and the product are not subject to explicit and complete criteria. This view of design differs sharply from the more orthodox understanding of scientific and technological endeavours which rely predominantly on a process of analysis. In the latter case, the approach is to decompose a problem into parts until individual parts are recognized as being amenable to known operations and results are reassembled into a solution. This process has a peripheral role in design when evaluating selected aspects of tentative design proposals, but the absence of well-defined and widely recognized criteria for design excludes it from the main stream of analytical developments.
series CAAD Futures
last changed 2003/11/21 15:16

_id a6f1
authors Bridges, A.H.
year 1986
title Any Progress in Systematic Design?
source Computer-Aided Architectural Design Futures [CAAD Futures Conference Proceedings / ISBN 0-408-05300-3] Delft (The Netherlands), 18-19 September 1985, pp. 5-15
summary In order to discuss this question it is necessary to reflect awhile on design methods in general. The usual categorization discusses 'generations' of design methods, but Levy (1981) proposes an alternative approach. He identifies five paradigm shifts during the course of the twentieth century which have influenced the design methods debate. The first paradigm shift was achieved by 1920, when concern with industrial arts could be seen to have replaced concern with craftsmanship. The second shift, occurring in the early 1930s, resulted in the conception of a design profession. The third happened in the 1950s, when the design methods debate emerged; the fourth took place around 1970 and saw the establishment of 'design research'. Now, in the 1980s, we are going through the fifth paradigm shift, associated with the adoption of a holistic approach to design theory and with the emergence of the concept of design ideology. A major point in Levy's paper was the observation that most of these paradigm shifts were associated with radical social reforms or political upheavals. For instance, we may associate concern about public participation with the 1970s shift and the possible use (or misuse) of knowledge, information and power with the 1980s shift. What has emerged, however, from the work of colleagues engaged since the 1970s in attempting to underpin the practice of design with a coherent body of design theory is increasing evidence of the fundamental nature of a person's engagement with the design activity. This includes evidence of the existence of two distinctive modes of thought, one of which can be described as cognitive modelling and the other which can be described as rational thinking. Cognitive modelling is imagining, seeing in the mind's eye. Rational thinking is linguistic thinking, engaging in a form of internal debate. Cognitive modelling is externalized through action, and through the construction of external representations, especially drawings. Rational thinking is externalized through verbal language and, more formally, through mathematical and scientific notations. Cognitive modelling is analogic, presentational, holistic, integrative and based upon pattern recognition and pattern manipulation. Rational thinking is digital, sequential, analytical, explicatory and based upon categorization and logical inference. There is some relationship between the evidence for two distinctive modes of thought and the evidence of specialization in cerebral hemispheres (Cross, 1984). Design methods have tended to focus upon the rational aspects of design and have, therefore, neglected the cognitive aspects. By recognizing that there are peculiar 'designerly' ways of thinking, which combine both types of thought process to perceive, construct and comprehend design representations mentally and then transform them into an external manifestation, current work in design theory is promising at last to have some relevance to design practice.
series CAAD Futures
email
last changed 2003/11/21 15:16

_id avocaad_2001_02
id avocaad_2001_02
authors Cheng-Yuan Lin, Yu-Tung Liu
year 2001
title A digital Procedure of Building Construction: A practical project
source AVOCAAD - ADDED VALUE OF COMPUTER AIDED ARCHITECTURAL DESIGN, Nys Koenraad, Provoost Tom, Verbeke Johan, Verleye Johan (Eds.), (2001) Hogeschool voor Wetenschap en Kunst - Departement Architectuur Sint-Lucas, Campus Brussel, ISBN 80-76101-05-1
summary In earlier times, when computers were not yet well developed, there was research regarding representation using conventional media (Gombrich, 1960; Arnheim, 1970). For ancient architects, the design process was described abstractly by text (Hewitt, 1985; Cable, 1983); the process evolved from unselfconscious to conscious ways (Alexander, 1964). Until the appearance of 2D drawings, these drawings could only express abstract visual thinking and visually conceptualized vocabulary (Goldschmidt, 1999). Then with the massive use of physical models in the Renaissance, the form and space of architecture was given better precision (Millon, 1994). Researchers continued their attempts to identify the nature of different design tools (Eastman and Fereshe, 1994). Simon (1981) figured out that humans increasingly rely on other specialists, computational agents, and materials to augment their cognitive abilities. This discourse was verified by recent research on conception of design and the expression using digital technologies (McCullough, 1996; Perez-Gomez and Pelletier, 1997). While other design tools did not change as much as representation (Panofsky, 1991; Koch, 1997), the involvement of computers in conventional architecture design arouses a new design thinking of digital architecture (Liu, 1996; Krawczyk, 1997; Murray, 1997; Wertheim, 1999). The notion of the link between ideas and media is emphasized throughout various fields, such as architectural education (Radford, 2000), Internet, and restoration of historical architecture (Potier et al., 2000). Information technology is also an important tool for civil engineering projects (Choi and Ibbs, 1989). Compared with conventional design media, computers avoid some errors in the process (Zaera, 1997). However, most of the application of computers to construction is restricted to simulations in the building process (Halpin, 1990). It is worth studying how to employ computer technology meaningfully to bring significant changes to the concept stage during the process of building construction (Madrazo, 2000; Dave, 2000) and communication (Haymaker, 2000). In architectural design, concept design was achieved through drawings and models (Mitchell, 1997), while the working drawings and even shop drawings were brewed and communicated through drawings only. However, the most effective method of shaping building elements is to build models by computer (Madrazo, 1999). With the trend of 3D visualization (Johnson and Clayton, 1998) and the difference of designing between the physical environment and virtual environment (Maher et al. 2000), we intend to study the possibilities of using digital models, in addition to drawings, as a critical medium in the conceptual stage of the building construction process in the near future (just as the critical role that physical models played in the early design process in the Renaissance). This research is combined with two practical building projects, following the progress of construction by using digital models and animations to simulate the structural layouts of the projects. We also tried to solve the complicated and even conflicting problems in the detail and piping design process through an easily accessible and precise interface. An attempt was made to delineate the hierarchy of the elements in a single structural and constructional system, and the corresponding relations among the systems.
Since building construction is often complicated and even conflicting, the precision needed to complete the projects cannot be based merely on 2D drawings and some imagination. The purpose of this paper is to describe all the related elements according to precision and correctness, to discuss every possibility of different thinking in the design of electric-mechanical engineering, to receive feedback from the construction projects in the real world, and to compare the digital models with conventional drawings. Through the application of this research, the subtle relations between the conventional drawings and digital models can be used in the area of building construction. Moreover, a theoretical model and standard process is proposed by using conventional drawings, digital models and physical buildings. By introducing the intervention of digital media in the design process of working drawings and shop drawings, there is an opportune chance to use the digital media as a prominent design tool. This study extends the use of digital models and animation from the design process to the construction process. However, the entire construction process involves various details and exceptions, which are not discussed in this paper. These limitations should be explored in future studies.
series AVOCAAD
email
last changed 2005/09/09 10:48

_id 298e
authors Dave, Bharat and Woodbury, Robert
year 1990
title Computer Modeling: A First Course in Design Computing
source The Electronic Design Studio: Architectural Knowledge and Media in the Computer Era [CAAD Futures ‘89 Conference Proceedings / ISBN 0-262-13254-0] Cambridge (Massachusetts / USA), 1989, pp. 61-76
summary Computation in design has long been a focus in our department. In recent years our faculty has paid particular attention to the use of computation in professional architectural education. The result is a shared vision of computers in the curriculum [Woodbury 1985] and a set of courses, some with considerable history and others just now being initiated. We (Dave and Woodbury) have jointly developed and at various times over the last seven years have taught Computer Modeling, the most introductory of these courses. This is a required course for all incoming freshman students in the department. In this paper we describe Computer Modeling: its context, the issues and topics it addresses, the tasks it requires of students, and the questions and opportunities that it raises. Computer Modeling is a course about concepts, about ways of explicitly understanding design and its relation to computation. Procedural skills and algorithmic problem solving techniques are given only secondary emphasis. In essential terms, the course is about models: of design processes, of designed objects, of computation and of computational design. Its lessons are intended to communicate a structure of such models to students and through this structure to demonstrate a relationship between computation and design. It is hoped that this structure can be used as a framework, around which students can continue to develop an understanding of computers in design.
series CAAD Futures
email
last changed 2003/05/16 20:58

_id 23bc
authors Demko, Stephen, Hodges, Laurie and Naylor, Bruce F.
year 1985
title Construction of Fractal Objects with Iterated Function Systems
source SIGGRAPH '85 Conference Proceedings. July, 1985. vol. 19 ; no. 3: pp. 271-278 : ill. col. includes bibliography
summary In computer graphics, geometric modeling of complex objects is a difficult process. An important class of complex objects arise from natural phenomena: trees, plants, clouds, mountains, etc. Researchers are investigating a variety of techniques for extending modeling capabilities to include these as well as other classes. One mathematical concept that appears to have significant potential for this is fractals. Much interest currently exists in the general scientific community in using fractals as a model of complex natural phenomena. However, only a few methods for generating fractal sets are known. We have been involved in the development of a new approach to computing fractals. Any set of linear maps (affine transformations) and an associated set of probabilities determines an Iterated Function System (IFS). Each IFS has a unique 'attractor' which is typically a fractal set (object). Specification of only a few maps can produce very complicated objects. Design of fractal objects is made relatively simple and intuitive by the discovery of an important mathematical property relating the fractal sets to the IFS. The method also provides the possibility of solving the inverse problem, given the geometry of an object, determine an IFS that will (approximately) generate that geometry. This paper presents the application of the theory of IFS to geometric modeling
keywords computer graphics, geometric modeling, fractals, visualization
series CADline
last changed 2003/06/02 13:58
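
As an illustrative aside to the abstract above: an IFS is a set of affine maps with associated probabilities, and its attractor can be rendered by random iteration (the "chaos game"). The Python sketch below uses standard textbook maps for the Sierpinski triangle; the maps and probabilities are not taken from the paper.

# Hedged sketch: random iteration over an Iterated Function System (IFS).
# Three affine maps, each chosen with probability 1/3, whose attractor is the
# Sierpinski triangle. Each map is stored as (a, b, c, d, e, f) meaning
#   x' = a*x + b*y + e,   y' = c*x + d*y + f.
import random

maps = [
    (0.5, 0.0, 0.0, 0.5, 0.0,  0.0),
    (0.5, 0.0, 0.0, 0.5, 0.5,  0.0),
    (0.5, 0.0, 0.0, 0.5, 0.25, 0.5),
]
probabilities = [1 / 3, 1 / 3, 1 / 3]

def ifs_attractor(n_points=50_000, seed=0):
    """Iterate a point through randomly chosen maps; the orbit fills out the attractor."""
    rng = random.Random(seed)
    x, y = 0.0, 0.0
    points = []
    for i in range(n_points):
        a, b, c, d, e, f = rng.choices(maps, weights=probabilities)[0]
        x, y = a * x + b * y + e, c * x + d * y + f
        if i > 20:  # discard the first few transient points
            points.append((x, y))
    return points

points = ifs_attractor()
print(len(points), points[0])

Plotting the returned points (for instance with matplotlib) shows the familiar Sierpinski gasket; changing the six coefficients of each map changes the attractor, which is what makes the inverse problem mentioned above interesting.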

_id a48a
authors Kalay, Yehuda E. and Shibley, Robert G.
year 1985
title Computer-Aided Design Research and Technology Transfer : Report of the SUNY-AB Symposium
source Buffalo: November, 1985. pp. 1-16
summary To explore modes of creative relationship between the university, government, industry and professional practice for the purpose of computer-aided design (CAD) research, development, and education in the disciplines that relate to the design, construction and management of buildings, the School of Architecture and Planning of the State University of New York at Buffalo, in cooperation with the Maedl Group of Buffalo, New York, has assembled a panel of experts to deliberate and to explore how the transfer of CAD technology from research laboratories to architectural and engineering practices can best be accomplished. Institutionally, the panel consisted of representatives of university researchers and educators, private research and development corporations, a governmental agency that supports basic research and technology transfer, and the professional community who will ultimately use the product
keywords architecture, technology transfer, CAD, research, practice, education
series CADline
email
last changed 2003/06/02 13:58

_id 0e0a
authors Kalay, Yehuda E., Harfmann, Anton C. and Swerdloff, Lucien M.
year 1985
title An Expert System Approach to Computer-Aided Participatory Architectural Design
source February, 1985. 16 p. : ill. includes bibliography
summary Increased satisfaction of the built environment can be achieved by more effective communication between the people who use that environment and the designers who form it. Participatory design is a method which educates and involves the users in the actual design process so that such a communication becomes possible. Methods that have so far been developed for participatory design have proven to be too limited, due mainly to the large time demands they place on architects. An effective participatory design method can be achieved by the use of a knowledge-based expert system which is capable of providing an educational design experience to the user. The development and implementation of such a system, specifically for the design of single family homes, is the focus of this paper
keywords expert systems, CAD, architecture, design process
series CADline
email
last changed 2003/06/02 13:58

_id a920
authors Kulcke, Richard
year 1989
title CAAD in the Architectural Education of the Fachhochschulen in the Federal Republic of Germany
doi https://doi.org/10.52842/conf.ecaade.1989.x.w7a
source CAAD: Education - Research and Practice [eCAADe Conference Proceedings / ISBN 87-982875-2-4] Aarhus (Denmark) 21-23 September 1989, pp. 4.3.1
summary For over 10 years the author has been a teacher in the field of "computer application in architecture" at the Fachhochschule. Since 1985 he has regularly taken part in the conferences of A.I.I.D.A. (Arbeitskreis INFORMATIK IN DER ARCHITEKTENAUSBILDUNG). All the faculties of architecture at the Fachhochschulen (about 10) can send their representatives for CAAD to the conferences. A.I.I.D.A. has held two conferences a year since 1985. At the last conference in Wiesbaden a paper with statements of A.I.I.D.A. on further education in CAAD was completed. The author presents and explains this paper. He also presents the current CAAD education program of his own faculty. The education in CAAD started in 1972 with basic information without practical elements. Now the practical work with the workstation is taking most of the time. The computer application is available for subjects like Building Economics, Building and Structure Design and others. With his assistant the author developed programs in the field of Building Economics. In 1986 he started to introduce CAD with AutoCAD into the education program. Now other colleagues are also starting to integrate CAAD into their subjects.

series eCAADe
last changed 2022/06/07 07:50

_id 0711
authors Kunnath, S.K., Reinhorn, A.M. and Abel, J.F.
year 1990
title A Computational Tool for Evaluation of Seismic Performance of RC Buildings
source February, 1990. [1] 15 p. : ill. graphs, tables. includes bibliography: p. 10-11
summary Recent events have demonstrated the damaging power of earthquakes on structural assemblages resulting in immense loss of life and property (Mexico City, 1985; Armenia, 1988; San Francisco, 1989). While the present state-of-the-art in inelastic seismic response analysis of structures is capable of estimating response quantities in terms of deformations, stresses, etc., it has not established a physical qualification of these end-results into measures of damage sustained by the structure wherein system vulnerability is ascertained in terms of serviceability, repairability, and/or collapse. An enhanced computational tool is presented in this paper for evaluation of reinforced concrete structures (such as buildings and bridges) subjected to seismic loading. The program performs a series of tasks to enable a complete evaluation of the structural system: (a) elastic collapse- mode analysis to determine the base shear capacity of the system; (b) step-by-step time history analysis using a macromodel approach in which the inelastic behavior of RC structural components is incorporated; (c) reduction of the response quantities to damage indices so that a physical interpretation of the response is possible. The program is built around two graphical interfaces: one for preprocessing of structural and loading data; and the other for visualization of structural damage following the seismic analysis. This program can serve as an invaluable tool in estimating the seismic performance of existing RC buildings and for designing new structures within acceptable levels of damage
keywords seismic, structures, applications, evaluation, civil engineering, CAD
series CADline
last changed 2003/06/02 14:41

_id 0397
authors Nadler, Edmond
year 1985
title Piecewise Linear Approximation on Triangulations of a Planar Region
source Reports in Pattern Analysis. [2], V, 76 p. :ill. May, 1985. No. 140. includes bibliography
summary For any triangulation of a given polygonal region, consider the piecewise linear least squares approximation of a given smooth function u. The problem is to characterize triangulations for which the global error of approximation is minimized for the number of triangles. The analogous problem in one dimension has been thoroughly analyzed, but in higher dimensions one has also to consider the shapes of the subregions, and not only their relative size. After establishing the existence of such an optimal triangulation, the local problem of best triangle shape is considered. Using an expression for the error of approximation involving the matrix H of second derivatives, the best shaped triangle is seen to be an equilateral transformed by a matrix related to H. This triangle is long in the direction of minimum curvature and narrow in the direction of maximum curvature, as one would expect. For the global problem, a series of two lower bounds on the approximation error are obtained, which suggest an asymptotic error estimate for optimal triangulation. The error estimate is shown to hold, and the conditions for attaining the lower bounds characterize the sizes and shapes of the triangles in the optimal triangulation. The shapes are seen to approach the optimal shapes described in the local analysis, and the errors on the triangles are seen to be asymptotically balanced
keywords triangulation, landscape, topology, computational geometry, computer graphics
series CADline
last changed 1999/02/12 15:09
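
As an orientation to the abstract above, the local error behaviour it describes can be written in one common Taylor-expansion form (not necessarily the report's exact notation or constants). For a smooth u and a linear approximant \ell on a small triangle T containing a point x_0:

u(x) - \ell(x) = \tfrac{1}{2}\,(x - x_0)^{\mathsf T} H\,(x - x_0) + o(\lVert x - x_0 \rVert^{2}),
\qquad H = \nabla^{2} u(x_0),

so, up to constants, the error on T scales like \max_{\text{edges } e \subset T} \lvert e^{\mathsf T} H\, e \rvert. For a fixed number of triangles this is asymptotically smallest when each T is an equilateral triangle transformed by a matrix related to H (in effect, equilateral in the metric induced by \lvert H \rvert), i.e. long in the direction of minimum curvature and narrow in the direction of maximum curvature, as stated above.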

_id 4c92
authors Norman, Richard B.
year 1985
title Electronic Color in the Architectural Studio - An Alternative Strategy for Introducing the Computer as a Creative Tool in the Studio Environment
doi https://doi.org/10.52842/conf.acadia.1985.035
source ACADIA Workshop ‘85 [ACADIA Conference Proceedings] Tempe (Arizona / USA) 2-3 November 1985, pp. 35-42
summary An alternative strategy is proposed for introducing the computer as a creative tool in the studio environment. It is suggested that computer graphic capabilities, focusing on color as an element of design, be incorporated into basic design studios. Techniques of color drawing on the computer are discussed, and computer modeling of color systems is recommended as a vehicle through which to introduce color theory. The effect of color on the perception of buildings is explored, illustrating how color selection can affect a building's line, form and spatial quality. These techniques enable students to develop an appreciation of the use of color in buildings, reinforcing their knowledge of basic design, and introducing them to graphic computing in a visually provocative manner. The proposal recognizes the importance of both color theory and graphic computers to an evolving architectural curriculum.

series ACADIA
email
last changed 2022/06/07 07:58

_id 8307
authors Rehak, Daniel R. and Howard, Craig H.
year 1985
title Interfacing Expert Systems with Design Databases in Integrated CAD Systems
source Computer Aided Design. November, 1985. vol. 17: pp. 443-454 : ill. includes bibliography
summary A model of a distributed network DBMS, using knowledge-base programming techniques, for interfacing KBS-to-DBMS is presented. In this model, the description of the data model of each KBS and DBMS component of the CAD system is represented as knowledge describing the components, making the components independent of each other. KADBASE, a prototype of such a flexible interface is demonstrating an approach to developing an integrated, distributed CAD system containing a variety of heterogeneous expert systems and design databases
keywords expert systems, design, database, user interface, integration, CAD
series CADline
last changed 2003/06/02 13:58

_id 206caadria2004
id 206caadria2004
authors Ricardo Sosa and John S. Gero
year 2004
title Diffusion of Design Ideas: Gatekeeping Effects
doi https://doi.org/10.52842/conf.caadria.2004.287
source CAADRIA 2004 [Proceedings of the 9th International Conference on Computer Aided Architectural Design Research in Asia / ISBN 89-7141-648-3] Seoul Korea 28-30 April 2004, pp. 287-302
summary Designers and design managers are interested in gaining a deeper understanding of the complexities of creativity and innovation (Langdon and Rothwell 1985). These two phenomena can be seen as complementary dimensions of a differentiation cycle where design plays a key value-adding role that gradually reduces through commoditisation. However, there is a lack of relevant evidence to explain the link between creativity and innovation. Creativity is increasingly considered as occurring in the interaction between the individual generator of an idea and a group of evaluators (Sawyer et al 2003). However, most studies have regarded the generation of a solution -and not its social impact- as the outcome of the creative process (Runco and Pritzker 1999). Accordingly, computational modelling of creativity has been mainly conducted in a social void (Boden 1999).
series CAADRIA
email
last changed 2022/06/07 07:56

_id 020d
authors Shaviv, Edna
year 1986
title Layout Design Problems: Systematic Approaches
source Computer-Aided Architectural Design Futures [CAAD Futures Conference Proceedings / ISBN 0-408-05300-3] Delft (The Netherlands), 18-19 September 1985, pp. 28-52
summary The complexity of the layout design problems known as the 'spatial allocation problems' gave rise to several approaches, which can be generally classified into two main streams. The first attempts to use the computer to generate solutions of the building layout, while in the second, computers are used only to evaluate manually generated solutions. In both classes the generation or evaluation of the layout is performed systematically. Computer algorithms for 'spatial allocation problems' first appeared more than twenty-five years ago (Koopmans, 1957). From 1957 to 1970 over thirty different programs were developed for generating the floor plan layout automatically, as is summarized in CAP-Computer Architecture Program, Vol. 2 (Stewart et al., 1970). It seems that any architect who entered the area of CAAD felt that it was his responsibility to find a solution to this prime architectural problem. Most of the programs were developed for batch processing, and were run on a mainframe without any sophisticated input/output devices. It is interesting to mention that, because of the lack of these sophisticated input/output devices, early researchers used the approach of automatic generation of optimal or quasi-optimal layout solutions under given constraints. Gradually, we find a recession and slowdown in the development of computer programs for the generation of layout solutions. With the improvement of interactive input/output devices and user interfaces, the inclination today is to develop integrated systems in which the architectural solution is obtained manually by the architect and is introduced to the computer for the appraisal of the designer's layout solution (Maver, 1977). The man-machine integrative systems could work well, but it seems that in most of the integrated systems today, and in the commercial ones in particular, there is no route to any appraisal technique of the layout problem. Without any evaluation techniques in commercial integrated systems it seems that the geometrical database exists just to create working drawings and sometimes also perspectives.
series CAAD Futures
email
last changed 2003/05/16 20:58

_id e8ec
authors Weber, Benz
year 1991
title LEARNING FROM THE FULL-SCALE LABORATORY
source Proceedings of the 3rd European Full-Scale Modelling Conference / ISBN 91-7740044-5 / Lund (Sweden) 13-16 September 1990, pp. 12-19
summary The team from the LEA at Lausanne was not actually involved in the construction of the laboratory itself. During the past five years we have been discovering the qualities and limitations of the lab step by step through the experiments we performed. The way in which we use it is quite different from that of its creators. Since 1985 the external service has been limited to clients coming to the laboratory on their own. We help them only with basic instructions for the use of the equipment. Most of these experiments are motivated by the excellent possibilities to discuss the design of a new hospital or home for the elderly with the people directly affected by it, such as patients, nurses, doctors and specialists for the technical equipment. The main issues discussed in these meetings are the dimensions and functional organisation of the spaces. The entire process for a normal room, including construction, discussions and dismantling of the full-scale model, takes between three and five days. Today these types of experiments occupy the lab only about twenty days a year.
keywords Full-scale Modeling, Model Simulation, Real Environments
series other
type normal paper
more http://info.tuwien.ac.at/efa
last changed 2004/05/04 15:23
