CumInCAD is a cumulative index of publications in Computer Aided Architectural Design
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD futures


Hits 1 to 20 of 215

_id eaca
authors Davis, L. (ed.)
year 1991
title Handbook of genetic algorithms
source Van Nostrand Reinhold, New York
summary This book sets out to explain what genetic algorithms are and how they can be used to solve real-world problems. The first objective is tackled by the editor, Lawrence Davis. The remainder of the book is turned over to a series of short review articles by a collection of authors, each explaining how genetic algorithms have been applied to problems in their own specific area of interest. The first part of the book introduces the fundamental genetic algorithm (GA), explains how it has traditionally been designed and implemented and shows how the basic technique may be applied to a very simple numerical optimisation problem. The basic technique is then altered and refined in a number of ways, with the effects of each change being measured by comparison against the performance of the original. In this way, the reader is provided with an uncluttered introduction to the technique and learns to appreciate why certain variants of GA have become more popular than others in the scientific community. Davis stresses that the choice of a suitable representation for the problem in hand is a key step in applying the GA, as is the selection of suitable techniques for generating new solutions from old. He is refreshingly open in admitting that much of the business of adapting the GA to specific problems owes more to art than to science. It is nice to see the terminology associated with this subject explained, with the author stressing that much of the field is still an active area of research. Few assumptions are made about the reader's mathematical background. The second part of the book contains thirteen cameo descriptions of how genetic algorithmic techniques have been, or are being, applied to a diverse range of problems. Thus, one group of authors explains how the technique has been used for modelling arms races between neighbouring countries (a non-linear, dynamical system), while another group describes its use in deciding design trade-offs for military aircraft.
My own favourite is a rather charming account of how the GA was applied to a series of scheduling problems. Having attempted something of this sort with Simulated Annealing, I found it refreshing to see the authors highlighting some of the problems that they had encountered, rather than sweeping them under the carpet as is so often done in the scientific literature. The editor points out that there are standard GA tools available for either play or serious development work. Two of these (GENESIS and OOGA) are described in a short, third part of the book. As is so often the case nowadays, it is possible to obtain a diskette containing both systems by sending your Visa card details (or $60) to an address in the USA.
series other
last changed 2003/04/23 15:14
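
The "very simple numerical optimisation problem" the summary mentions can be sketched as a minimal generational GA. The objective function, population size, and operator choices below are illustrative assumptions, not taken from the book:

```python
import random

# Minimal generational GA for a toy numeric problem: maximize
# f(x) = -(x - 3)^2 over x in [0, 10]. All parameters are illustrative.

def fitness(x):
    return -(x - 3.0) ** 2

def evolve(pop_size=30, generations=60, mutation_rate=0.2):
    pop = [random.uniform(0.0, 10.0) for _ in range(pop_size)]
    for _ in range(generations):
        def select():
            # Tournament selection: keep the fitter of two random individuals.
            a, b = random.sample(pop, 2)
            return a if fitness(a) > fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = select(), select()
            child = 0.5 * (p1 + p2)              # blend crossover
            if random.random() < mutation_rate:
                child += random.gauss(0.0, 0.5)  # Gaussian mutation
            nxt.append(min(10.0, max(0.0, child)))  # clamp to the domain
        pop = nxt
    return max(pop, key=fitness)

best = evolve()
```

Swapping the representation, crossover, or mutation operator in this loop is exactly the kind of variation the book's first part measures against the baseline.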

_id ga0010
id ga0010
authors Moroni, A., Zuben, F. Von and Manzolli, J.
year 2000
title ArTbitrariness in Music
source International Conference on Generative Art
summary Evolution is now considered not only powerful enough to bring about biological entities as complex as humans and consciousness, but also useful in simulation to create algorithms and structures of higher levels of complexity than could easily be built by design. In the context of artistic domains, the process of human-machine interaction is analyzed as a good framework to explore creativity and to produce results that could not be obtained without this interaction. When evolutionary computation and other computational intelligence methodologies are involved, we denote every attempt to improve aesthetic judgement as ArTbitrariness, interpreted as an interactive iterative optimization process. ArTbitrariness is also suggested as an effective way to produce art through an efficient manipulation of information and a proper use of computational creativity to increase the complexity of the results without neglecting the aesthetic aspects [Moroni et al., 2000]. Our emphasis will be on an approach to interactive music composition. The problem of computer generation of musical material has received extensive attention, and a subclass of the field of algorithmic composition includes those applications which use the computer as something in between an instrument, which a user "plays" through the application's interface, and a compositional aid, which a user experiments with in order to generate stimulating and varying musical material. This approach was adopted in Vox Populi, a hybrid made up of an instrument and a compositional environment. Unlike other systems found in genetic algorithms or evolutionary computation, in which people have to listen to and judge the musical items, Vox Populi uses the computer and the mouse as real-time music controllers, acting as a new interactive computer-based musical instrument. The interface is designed to be flexible for the user to modify the music being generated.
It explores evolutionary computation in the context of algorithmic composition and provides a graphical interface that allows the user to modify the tonal center and the voice range, changing the evolution of the music by using the mouse [Moroni et al., 1999]. A piece of music consists of several sets of musical material manipulated and exposed to the listener, for example pitches, harmonies, rhythms, timbres, etc. They are composed of a finite number of elements and, basically, the aim of a composer is to organize those elements in an aesthetic way. Modeling a piece as a dynamic system implies a view in which the composer draws trajectories or orbits using the elements of each set [Manzolli, 1991]. Nonlinear iterative mappings are associated with interface controls. In the next page two examples of nonlinear iterative mappings with their resulting musical pieces are shown. The mappings may give rise to attractors, defined as geometric figures that represent the set of stationary states of a non-linear dynamic system, or simply trajectories to which the system is attracted. The relevance of this approach goes beyond music applications per se. Computer music systems that are built on the basis of a solid theory can be coherently embedded into multimedia environments. The richness and specialty of the music domain are likely to initiate new thinking and ideas, which will have an impact on areas such as knowledge representation and planning, and on the design of visual formalisms and human-computer interfaces in general. Above and below, the Vox Populi interface is depicted, showing two nonlinear iterative mappings with their resulting musical pieces. References [Manzolli, 1991] J. Manzolli. Harmonic Strange Attractors, CEM BULLETIN, Vol. 2, No. 2, 4 -- 7, 1991. [Moroni et al., 1999] A. Moroni, J. Manzolli, F. Von Zuben, R. Gudwin. Evolutionary Computation applied to Algorithmic Composition, Proceedings of CEC99 - IEEE International Conference on Evolutionary Computation, Washington D.C., p. 807 -- 811, 1999. [Moroni et al., 2000] Moroni, A., Von Zuben, F. and Manzolli, J. ArTbitration, Las Vegas, USA: Proceedings of the 2000 Genetic and Evolutionary Computation Conference Workshop Program – GECCO, 143 -- 145, 2000.
series other
email artemis@ia.cti.br
more http://www.generativeart.com/
last changed 2003/08/07 17:25
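
A toy illustration of a nonlinear iterative mapping driving pitch choice, in the spirit of the summary (the actual Vox Populi mappings are not reproduced here): the logistic map x(n+1) = r·x(n)·(1 - x(n)) is iterated and each state is quantized onto a MIDI pitch range governed by a tonal center and voice range, the two controls the interface exposes.

```python
# Iterate the logistic map and quantize its orbit to MIDI pitches.
# The map, tonal-center convention, and parameter values are assumptions
# for illustration, not Vox Populi's actual mappings.

def logistic_orbit(x0, r, n):
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)  # chaotic for r near 3.9
        xs.append(x)
    return xs

def to_pitches(orbit, center=60, voice_range=12):
    # Map [0, 1] onto [center - range/2, center + range/2] in MIDI numbers.
    lo = center - voice_range // 2
    return [lo + int(round(x * voice_range)) for x in orbit]

pitches = to_pitches(logistic_orbit(0.4, 3.9, 16))
```

Moving the tonal center or narrowing the voice range reshapes the same orbit into a different melodic contour, which is the kind of real-time control the summary describes.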

_id a9d7
authors Mitchell, J.W. and McCullough, M.
year 1991
title Digital Design Media: A Handbook for Architects and Design Professionals
source Van Nostrand Reinhold, New York, pp. 421-422
summary In Digital Design Media architects and related design professionals will find a complete conceptual guide to the multidimensional world of computer-aided design. In contrast to the many books that describe how to use particular programs (and which therefore go out of date very quickly), Digital Design Media constructs a lasting theoretical framework, which will make it easier to understand a great number of programs, existing and future, as a whole. Clear structure, numerous historical references, and hundreds of illustrations make this framework both accessible to the nontechnical professional and broadening for the experienced computer-aided designer. The book will be especially valuable to anyone who is ready to expand their work in CAD beyond production drafting systems. The new second edition adds chapters on emerging technologies, such as the Internet, but the book's original content is as valid as ever. Thousands of design students and practitioners have made this book a standard.
series other
email mmmc@umich.edu
last changed 2003/04/23 15:14

_id 8b1e
authors Blinn, James F.
year 1991
title A Trip Down the Graphics Pipeline: Line Clipping
source IEEE Computer Graphics and Applications January, 1991. vol. 11: pp. 98-105 : ill. includes bibliography.
summary The classic computer graphics pipeline is an assembly-line-like process that geometric objects must experience on their journey to becoming pixels on the screen. This is the first of a series of columns on the graphics pipeline. In this column the author concentrates on the algorithmic aspects of the line-clipping part of the pipeline.
keywords clipping, algorithms, computer graphics
series CADline
last changed 2003/06/02 13:58
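
Blinn's own pipeline formulation is not reproduced here, but the line-clipping stage the column discusses can be sketched with the classic Cohen-Sutherland outcode algorithm against an axis-aligned window:

```python
# Classic Cohen-Sutherland outcode clipping (a reference sketch, not
# Blinn's formulation). Each endpoint gets a 4-bit code saying which
# side(s) of the window it lies on; the line is trivially accepted,
# trivially rejected, or shortened at a window edge and retested.

INSIDE, LEFT, RIGHT, BOTTOM, TOP = 0, 1, 2, 4, 8

def outcode(x, y, xmin, ymin, xmax, ymax):
    code = INSIDE
    if x < xmin: code |= LEFT
    elif x > xmax: code |= RIGHT
    if y < ymin: code |= BOTTOM
    elif y > ymax: code |= TOP
    return code

def clip(x0, y0, x1, y1, xmin, ymin, xmax, ymax):
    c0 = outcode(x0, y0, xmin, ymin, xmax, ymax)
    c1 = outcode(x1, y1, xmin, ymin, xmax, ymax)
    while True:
        if not (c0 | c1):        # both endpoints inside: accept
            return (x0, y0, x1, y1)
        if c0 & c1:              # both outside on the same side: reject
            return None
        c = c0 or c1             # pick an endpoint that is outside
        if c & TOP:
            x = x0 + (x1 - x0) * (ymax - y0) / (y1 - y0); y = ymax
        elif c & BOTTOM:
            x = x0 + (x1 - x0) * (ymin - y0) / (y1 - y0); y = ymin
        elif c & RIGHT:
            y = y0 + (y1 - y0) * (xmax - x0) / (x1 - x0); x = xmax
        else:  # LEFT
            y = y0 + (y1 - y0) * (xmin - x0) / (x1 - x0); x = xmin
        if c == c0:
            x0, y0 = x, y
            c0 = outcode(x0, y0, xmin, ymin, xmax, ymax)
        else:
            x1, y1 = x, y
            c1 = outcode(x1, y1, xmin, ymin, xmax, ymax)
```

For example, `clip(-5, 0, 5, 0, -1, -1, 1, 1)` shortens the segment at both window edges, while a segment entirely above-right of the window is rejected without any intersection arithmetic.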

_id ga9921
id ga9921
authors Coates, P.S. and Hazarika, L.
year 1999
title The use of genetic programming for applications in the field of spatial composition
source International Conference on Generative Art
summary Architectural design teaching using computers has been a preoccupation of CECA since 1991. All design tutors provide their students with a set of models and ways to form, and we have explored a set of approaches including cellular automata, genetic programming, agent based modelling and shape grammars as additional tools with which to explore architectural (and architectonic) ideas. This paper discusses the use of genetic programming (G.P.) for applications in the field of spatial composition. CECA has been developing the use of genetic programming for some time (see references) and has covered the evolution of L-system production rules (Coates 1997, 1999b) and the evolution of generative grammars of form (Coates 1998, 1999a). The G.P. was used to generate three-dimensional spatial forms from a set of geometrical structures. The approach uses genetic programming with a Genetic Library (G.Lib). G.P. provides a way to genetically breed a computer program to solve a problem; G.Lib enables genetic programming to define potentially useful subroutines dynamically during a run. The work covers: exploring a shape grammar consisting of simple solid primitives and transformations; applying a simple fitness function to the solid breeding G.P.; exploring a shape grammar of composite surface objects; developing grammars for existing buildings, and creating hybrids; and exploring the shape grammar of a building within a G.P. We will report on new work using a range of different morphologies (boolean operations, surface operations and grammars of style) and describe the use of objective functions (natural selection) and the "eyeball test" (artificial selection) as ways of controlling and exploring the design spaces thus defined.
series other
more http://www.generativeart.com/
last changed 2003/08/07 17:25
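
The L-system production rules whose evolution the summary cites operate by parallel string rewriting; a tiny rewriter makes the mechanism concrete. The rules below are Lindenmayer's textbook algae system, not ones evolved in the paper:

```python
# Minimal parallel-rewriting L-system: every symbol in the string is
# replaced by its production (or itself) simultaneously at each step.
# In the CECA work, it is such rule sets that genetic programming breeds.

def rewrite(axiom, rules, steps):
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Lindenmayer's algae rules: A -> AB, B -> A (a standard example).
algae = rewrite("A", {"A": "AB", "B": "A"}, 5)
```

Interpreting the resulting string as turtle-graphics or solid-modelling commands is what turns such grammars into the three-dimensional spatial forms the paper explores.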

_id fd70
authors Goldman, Glenn and Zdepski, Michael Stephen (Eds.)
year 1991
title Reality and Virtual Reality [Conference Proceedings]
doi https://doi.org/10.52842/conf.acadia.1991
source ACADIA Conference Proceedings / ISBN 1-880250-00-4 / Los Angeles (California - USA) October 1991, 236 p.
summary During the past ten years computers in architecture have evolved from machines used for analytic and numeric calculation, to machines used for generating dynamic images, permitting the creation of photorealistic renderings, and now, in a preliminary way, permitting the simulation of virtual environments. Digital systems have evolved from increasing the speed of human operations, to providing entirely new means for creating, viewing and analyzing data. The following essays illustrate the growing spectrum of computer applications in architecture. They discuss developments in the simulation of future environments on the luminous screen and in virtual space. They investigate new methods and theories for the generation of architectural color, texture, and form. Authors address the complex technical issues of "intelligent" models and their associated analytic contents. There are attempts to categorize and make accessible architects' perceptions of various models of "reality". Much of what is presented foreshadows changes that are taking place in the areas of design theory, building sciences, architectural graphics, and computer research. The work presented is both developmental, evolving from the work done before or in other fields, and unique, exploring new themes and concepts. The application of computer technology to the practice of architecture has had a cross disciplinary effect, as computer algorithms used to generate the "unreal" environments and actors of the motion picture industry are applied to the prediction of buildings and urban landscapes not yet in existence. Buildings and places from history are archeologically "re-constructed" providing digital simulations that enable designers to study that which has previously (or never) existed. 
Applications of concepts from scientific visualization suggest new methods for understanding the highly interrelated aspects of the architectural sciences: structural systems, environmental control systems, building economics, etc. Simulation systems from the aerospace industry and computer media fields propose new non-physical three-dimensional worlds. Video compositing technology from the television industry and the practice of medicine are now applied to the compositing of existing environments with proposed buildings. Whether based in architectural research or practice, many authors continue to question the development of contemporary computer systems. They seek new interfaces between human and machine, new methods for simulating architectural information digitally, and new ways of conceptualizing the process of architectural design. While the practice of architecture has, of necessity, been primarily concerned with increasing productivity - and automation for improved efficiency, it is clear that university based studies and research continue to go beyond the electronic replication of manual tasks and study issues that can change the processes of architectural design - and ultimately perhaps, the products.
series ACADIA
email Goldman@ADM.NJIT.EDU
more http://www.acadia.org
last changed 2022/06/07 07:49

_id 58cd
authors Schnoedt, Heinrich
year 1991
title Cultural Parametrics
doi https://doi.org/10.52842/conf.acadia.1991.223
source Reality and Virtual Reality [ACADIA Conference Proceedings / ISBN 1-880250-00-4] Los Angeles (California - USA) October 1991, pp. 223-234
summary The human desire for automation of repetitive processes offers opportunities for the employment of binary computing for these procedures. Architecture and the design of buildings is no exception. With an increase in industrial prefabrication of moderately variable building components, the focus of the practising architect shifts from the individual design process toward a selection process of parts or components with a defined parametric extent. While this concept of parameterized parts has been used by architects since the first repetitive part was available, the advent of modern CAAD systems, with a growing number of parametric components and parts already integrated, is likely to greatly amplify the impact of predefined parts on buildings. Both industry and research institutions continue to make a great effort to utilize building codes and organizational structures as the basis to develop sophisticated algorithms of rule-based design. The purpose of the parameterization of parts or concepts is twofold: to reduce the time frame of human labor on the design of pieces and concepts which are considered repetitive, and to install a control mechanism to eliminate mistakes which lie outside of the parametric framework. The implementation of these algorithms in architectural practice and in the educational environment suggests consequences on many levels. In the following, an attempt is made to cast some light on the history of parametrics with respect to computing and the problems associated with a predominantly numerically encoded parametric approach.
series ACADIA
email schnoedt@vt.edu
last changed 2022/06/07 07:56

_id b5be
authors Stok, Leon
year 1991
title Architectural synthesis and optimization of digital systems
source Eindhoven University of Technology
summary High level synthesis means going from a functional specification of a digital system at the algorithmic level to a register transfer level structure. Different applications will ask for different design styles. Despite this diversity in design styles many tasks in the synthesis will be similar. There is no need to write a new synthesis system for each design style. The best way forward seems to be a decomposition of the high level synthesis problem into several well defined subproblems. How the problem is decomposed depends heavily on a) the type of network architecture chosen, b) the constraints applied to the design and c) the functional description itself. From this architecture style, the constraints and the functional description a synthesis scheme can be derived. Once this scheme is fixed, algorithms can be chosen which fit into this scheme and solve the subproblems in a fast and, when possible, optimal way. To support such a synthesis philosophy, a framework is needed in which all design information can be stored in a unique way during the various phases of the design process. This calls for a design database capable of handling all design information with a formally defined interface to all design tools. This thesis gives a formal way to describe the functional representation, the register transfer level structure and the controller, and the relations between all three of them. Special attention has been paid to the efficient representation of mutually exclusive operations and array accesses. The scheduling and allocation problems are defined as mappings between these formal representations. Both the existing synthesis algorithms and the new algorithms described in this thesis fit into this framework.
Three new allocation algorithms are presented in this thesis: an algorithm for optimal register allocation in cyclic data flow graphs, an exact polynomial algorithm to do the module allocation, and a new scheme to minimize the number of interconnections during all stages of the data path allocation. Cyclic data flow graphs result from high level behavioral descriptions that contain loops. Algorithms for register allocation in high level synthesis published up till now only considered loop-free data flow graphs. When these algorithms are applied to data flow graphs with loops, unnecessary register transfer operations are introduced. A new algorithm is presented that performs a minimal register allocation and eliminates all superfluous register transfer operations. The problem is reformulated as a multicommodity network flow problem for which very efficient solutions exist. Experiments on a benchmark set have shown that in all test cases all register transfers could be eliminated at no increase in register cost. Only heuristic algorithms to solve the module allocation problem have appeared in the literature. The module allocation problem is usually defined as a clique cover problem on a so-called module allocation graph. It is shown that, under certain conditions, the module allocation graph belongs to the special class of comparability graphs. A polynomial time algorithm can optimally find a clique cover of such a graph. Even when interconnect weights are taken into account, this can be solved exactly: the problem can be transformed into a maximal cost network flow problem, which can be solved exactly in polynomial time. An algorithm is described which solves the module allocation problem with interconnect weights exactly, with a complexity O(kn²), where n is the number of operations. In previous research, interconnection was optimized only after the module allocation for the operations and the register allocation for the variables had already been done.
However, the amount of multiplexing and interconnect are crucial factors in both the delay and the area of a circuit. A new scheme is presented to minimize the number of interconnections during the data path allocation. This scheme first groups all values based on their read and write times. Values belonging to the same group can share a register file. This minimizes the number of data transfers with different sources and destinations. Secondly, registers are allocated for each group separately. Finally the interconnect allocation is done. During the interconnect allocation, the module allocation is determined. The value grouping is based on edge coloring algorithms providing a sharp upper bound on the number of colors needed. Two techniques, splitting the read and write phases of values and introducing serial (re-)write operations for the same value, make it possible to use even more efficient exact edge coloring algorithms. It is shown that when variables are grouped into register files and operations are assigned to modules during the interconnection minimization, significant savings (20%) can be obtained in the number of local interconnections and the amount of global interconnect, at the expense of only slightly more register area.
keywords Digital Systems
series thesis:PhD
email p.d.v.v.d.stok@tue.nl
last changed 2003/02/12 22:37
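
For the loop-free case the thesis contrasts itself with, minimal register allocation reduces to coloring an interval graph of value lifetimes, which the greedy left-edge algorithm solves optimally. The sketch below uses made-up lifetimes and shows only this simplified baseline, not the thesis's multicommodity network flow formulation for cyclic graphs:

```python
# Left-edge register allocation for loop-free lifetimes: sort values by
# start time and reuse the first register whose last occupant has expired.
# Lifetimes are half-open intervals [start, end); the data is invented.

def left_edge(lifetimes):
    regs = []        # regs[i] = time at which register i becomes free
    assignment = {}  # value name -> register index
    for name, (start, end) in sorted(lifetimes.items(), key=lambda kv: kv[1]):
        for i, free_at in enumerate(regs):
            if start >= free_at:   # register i is free again: reuse it
                regs[i] = end
                assignment[name] = i
                break
        else:                      # no free register: open a new one
            regs.append(end)
            assignment[name] = len(regs) - 1
    return assignment, len(regs)

lifetimes = {"a": (0, 3), "b": (1, 4), "c": (3, 6), "d": (5, 8)}
alloc, nregs = left_edge(lifetimes)
```

With loops, lifetimes wrap around the cycle boundary and this greedy scheme inserts the superfluous register transfers the thesis eliminates, which is what motivates its network flow reformulation.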

_id e336
authors Achten, H., Roelen, W., Boekholt, J.-Th., Turksma, A. and Jessurun, J.
year 1999
title Virtual Reality in the Design Studio: The Eindhoven Perspective
doi https://doi.org/10.52842/conf.ecaade.1999.169
source Architectural Computing from Turing to 2000 [eCAADe Conference Proceedings / ISBN 0-9523687-5-7] Liverpool (UK) 15-17 September 1999, pp. 169-177
summary Since 1991 Virtual Reality has been used in student projects in the Building Information Technology group. It started as an experimental tool to assess the impact of VR technology in design, using the environment of the associated Calibre Institute. The technology was further developed in Calibre to become an important presentation tool for assessing design variants and final design solutions. However, it was only sporadically used in student projects. A major shift occurred in 1997 with a number of student projects in which various computer technologies including VR were used in the whole of the design process. In 1998, the new Design Systems group started a design studio with the explicit aim to integrate VR in the whole design process. The teaching effort was combined with the research program that investigates VR as a design support environment. This has led to an increasing number of innovative student projects. The paper describes the context and history of VR in Eindhoven and presents the current set-up of the studio. It discusses the impact of the technology on the design process and outlines pedagogical issues in the studio work.
keywords Virtual Reality, Design Studio, Student Projects
series eCAADe
email h.h.achten@bwk.tue.nl
last changed 2022/06/07 07:54

_id ecaade2010_040
id ecaade2010_040
authors Akdag, Suzan Girginkaya; Cagdas, Gulen; Guney, Caner
year 2010
title Analyzing the Changes of Bosphorus Silhouette
doi https://doi.org/10.52842/conf.ecaade.2010.815
source FUTURE CITIES [28th eCAADe Conference Proceedings / ISBN 978-0-9541183-9-6] ETH Zurich (Switzerland) 15-18 September 2010, pp.815-823
summary Due to improving technology and global competition, today the sky is the only limit for the high towers of metropolitan areas. The increase in the number of high-rise buildings has been ruining the silhouettes of cities all over the world, including Istanbul, whose identity and image have also been damaged by skyscrapers dominating the seven slopes on which it was once built. Urbanization in Istanbul has somehow become homogeneous and destructive of the topography. Despite recurring debates on this critical issue, no analytical approach has ever been introduced. The research therefore aims to analyze the change of the Bosphorus silhouette caused by the emergence of high-rise blocks along the Zincirlikuyu-Maslak route since it was defined as a Central Business District and a high-rise development area by the Bosphorus Conservation Law in 1991. ArcGIS Desktop software and its analyst extensions are used for mapping, analyzing and evaluating the urban development over the years. The application is considered to be the initial step for a decision support system which will assist in assigning ground for high-rise buildings in Istanbul.
wos WOS:000340629400087
keywords GIS; Bosphorus; Silhouette analysis; High rise buildings
series eCAADe
email sgirginkaya@gmail.com
last changed 2022/06/07 07:54

_id 2560
authors Alkhoven, Patricia
year 1991
title The Reconstruction of the Past: The Application of New Techniques for Visualization and Research in Architectural History
source Computer Aided Architectural Design Futures: Education, Research, Applications [CAAD Futures ‘91 Conference Proceedings / ISBN 3-528-08821-4] Zürich (Switzerland), July 1991, pp. 549-566
summary This paper focuses on the visualization of historical architecture. The application of new Computer-Aided Architectural Design techniques for visualization on microcomputers provides a technique for reconstructing and analyzing architectural objects from the past. The pilot project describes a case study in which the historical transformation of a town will be analyzed by using three-dimensional CAD models in combination with bitmap textures. The transformation of the historic town will be visualized in a space-time computer model in which bitmap textures enable us to display complex and relatively large architectural objects in detail. This three-dimensional descriptive model allows us to survey and analyze the history of architecture in its reconstructed context. It also provides a medium for researching the dynamics of urban management, since new combinations and arrangements with the individual architectural objects can be created. In this way, a new synthesis of the graphic material can reveal typologies and the architectural ordering system of a town.
keywords 3D City modeling
series CAAD Futures
last changed 2003/11/21 15:15

_id 849b
authors Amiel, Maurice
year 1991
title NOTES ON IN-SITU – FULL-SCALE EXPERIMENTATION AND THE DESIGN PROFESSIONS
source Proceedings of the 3rd European Full-Scale Modelling Conference / ISBN 91-7740044-5 / Lund (Sweden) 13-16 September 1990, pp. 40-43
summary In the North American academic context a workshop is different from a paper session in that it is simply an opportunity to exchange ideas and to raise questions among colleagues who can bring to bear in their discussion various points of view and experiences otherwise unavailable.
keywords Full-scale Modeling, Model Simulation, Real Environments
series other
type normal paper
more http://info.tuwien.ac.at/efa
last changed 2004/05/04 15:18

_id 0ab2
authors Amor, R., Hosking, J., Groves, L. and Donn, M.
year 1993
title Design Tool Integration: Model Flexibility for the Building Profession
source Proceedings of Building Systems Automation - Integration, University of Wisconsin-Madison
summary The development of ICAtect, as discussed in the Building Systems Automation and Integration Symposium of 1991, provides a way of integrating simulation tools through a common building model. However, ICAtect is only a small step towards the ultimate goal of total integration and automation of the building design process. In this paper we investigate the next steps on the path toward integration. We examine how models structured to capture the physical attributes of the building, as required by simulation tools, can be used to converse with knowledge-based systems. We consider the types of mappings that occur in the often different views of a building held by these two classes of design tools. This leads us to examine the need for multiple views of a common building model. We then extend our analysis from the views required by simulation and knowledge-based systems, to those required by different segments of the building profession (e.g. architects, engineers, developers, etc.) to converse with such an integrated system. This indicates a need to provide a flexible method of accessing data in the common building model to facilitate use by different building professionals with varying specialities and levels of expertise.
series journal paper
email john@cs.auckland.ac.nz
last changed 2003/05/15 21:22

_id f9bd
authors Amor, R.W.
year 1991
title ICAtect: Integrating Design Tools for Preliminary Architectural Design
source Wellington, New Zealand: Computer Science Department, Victoria University
summary ICAtect is a knowledge based system that provides an interface between expert systems, simulation packages and CAD systems used for preliminary architectural design. This thesis describes its structure and development. The principal work discussed in this thesis involves the formulation of a method for representing a building. This is developed through an examination of a number of design tools used in architectural design, and the ways in which each of these describe a building. Methods of enabling data to be transferred between design tools are explored. A Common Building Model (CBM), forming the core of the ICAtect system, is developed to represent the design tools' knowledge of a building. This model covers the range of knowledge required by a large set of disparate design tools used by architects at the initial design stage. Standard methods of integrating information from the tools were examined, but required augmentation to encompass the unusual constraints found in some of the design tools. The integration of the design tools and the CBM is discussed in detail, with example methods developed for each type of design tool. These example methods provide a successful way of moving information between the different representations. Some problems with mapping data between very different representations were encountered in this process, and the solutions or ideas for remedies are detailed. A model for control and use of ICAtect is developed in the thesis, and the extensions to enable a graphical user interface are discussed. The methods developed in this thesis demonstrate the feasibility of an integrated system of this nature, while the discussion of future work indicates the scope and potential power of ICAtect.
series other
last changed 2003/04/23 15:14

_id a620
authors Asanowicz, Alexander
year 1991
title Unde et Quo
doi https://doi.org/10.52842/conf.ecaade.1991.x.t1s
source Experiences with CAAD in Education and Practice [eCAADe Conference Proceedings] Munich (Germany) 17-19 October 1991
summary To begin with, I would like to say a few words about the problem of alienation of modern technologies which we also inevitably faced while starting to teach CAD at our department. Quite often nowadays a technology becomes a fetish as a result of a lack of clear goals in the human mind. There are multiple technologies without a sense of purpose which have turned into pure experiments. There is always the danger of losing purposefulness and drifting toward alienation. The cause of the danger lies in forgetting about original goals while mastering and developing the technology. Eventually the original idea is ignored and a great gap appears between technical factors and creativity. We had the danger of alienation in mind when preparing the CAAD curriculum. Trying to avoid the tension between technical and creative elements we agreed not to introduce CAD sooner than the fourth year of studies and to continue it for two semesters. One thing was clear - we should not teach the technique of CAD but how to design using a computer as a medium. Then we specified projects. The first was called "The bathroom I dream of" and was meant to be a 2D drawing. The four introductory meetings were in fact teaching the foundations of DOS; then a specific design followed with the help of the AutoCAD program. In the IX semester, for example, it was "A family house" (plans, facades, perspective). "I have to follow them - I am their leader" said L.J. Peter in "The Peter's Prescription". This quotation reflects exactly the situation we find ourselves in teaching CAAD at our department. It means that ever-growing student interest in CAAD made us introduce changes in the curriculum. According to the popular saying, "The more one gets the more one wants", so did we and the students feel after the first semester of teaching CAD. From autumn 1991 CAAD classes will be taught from the third year of studies for two consecutive years.
But before further planning one major steep had to be done - we decided to reverse the typical of the seventies approach to the problem when teaching programming languages preceded practical goals hence discouraging many learners.

series eCAADe
email asan@cksr.ac.bialystok.pl
last changed 2022/06/07 07:50

_id 9964
authors Augenbroe, G. and Winkelmann, F.
year 1991
title Integration of Simulation into the Building Design Process
source J.A. Clarke, J.W. Mitchell, and R.C. Van de Perre (eds.), Proceedings, Building Simulation '91 IBPSA Conference, pp. 367-374
summary We describe the need for a joint effort between design researchers and simulation tool developers in formulating procedures and standards for integrating simulation into the building design process. We review and discuss current efforts in the US and Europe in the development of next-generation simulation tools and design integration techniques. In particular, we describe initiatives in object-oriented simulation environments (including the US Energy Kernel System, the Swedish Ida system, the UK Energy Kernel System, and the French ZOOM program) and consider the relationship of these environments to recent R&D initiatives in design integration (the COMBINE project in Europe and the AEDOT project in the US).
series other
last changed 2003/11/21 15:16

_id ca50
authors Ayrle, Hartmut
year 1991
title XNET2 - Methodical Design of Local Area Networks in Buildings - An Application of the A4 Intelligent Design Tool
source Computer Aided Architectural Design Futures: Education, Research, Applications [CAAD Futures ‘91 Conference Proceedings / ISBN 3-528-08821-4] Zürich (Switzerland), July 1991, pp. 443-450
summary XNET2 is a prototype program that helps network planners design Ethernet-conformant data networks for sites and buildings. It is implemented as an example application of the ARMILLA4 Intelligent Design Tool under Knowledge Craft. It is based on a knowledge acquisition phase with experts from DECsite, the network branch of DEC. The ARMILLA Design Tool is developed on the basis of Fritz Haller's ARMILLA, a set of geometrical and operational rules for the integration of technical ductwork into a building's construction.
series CAAD Futures
last changed 2003/11/21 15:16

_id 27d2
authors Ayrle, Hartmut
year 1991
title Computers for Architects - Only a Tool?
doi https://doi.org/10.52842/conf.ecaade.1991.x.i9j
source Experiences with CAAD in Education and Practice [eCAADe Conference Proceedings] Munich (Germany) 17-19 October 1991
summary The paper states that, as a result of the schism between architecture as art and engineering as rationalism, the architectural community underestimates the computer as a tool with the potential to substantially enlarge the possibilities of building design. It is claimed that the computer could serve as a coordination tool for the ruptured design process, as a virtual workbench where all design disciplines sit together and develop their designs with an enhanced awareness of what the whole design demands. The paper then concludes that, to develop such software tools, architects must participate in the development of software and may no longer be restricted to the role of mere users, especially during their university instruction. The corresponding research and training facilities at the University of Karlsruhe, Faculty of Architecture, are described.

series eCAADe
last changed 2022/06/07 07:50

_id 22d6
authors Ballheim, F. and Leppert, J.
year 1991
title Architecture with Machines, Principles and Examples of CAAD-Education at the Technische Universität München
doi https://doi.org/10.52842/conf.ecaade.1991.x.h3w
source Experiences with CAAD in Education and Practice [eCAADe Conference Proceedings] Munich (Germany) 17-19 October 1991
summary "Design tools affect the results of the design process" - this is the starting point of our considerations about the efficient use of CAAD within architecture. To give you a short overview of what we mean by this thesis, let us take a short - and surely incomplete - trip through the fourth dimension, back into the early days of civil engineering. As CAD in our faculty is integrated in the "Lehrstuhl für Hochbaustatik und Tragwerksplanung" (in English, approximately, the "institute of structural design"), we chose an example we are very familiar with because of its mathematical background - the conic sections: circle, ellipse, parabola and hyperbola. If we start our trip two thousand years ago, we find only the circle - or in very few cases the ellipse - used for the ground plan of Greek or Roman theaters - think of Greek amphitheaters or the Colosseum in Rome - or for the design of the cross section of a building - for example the Pantheon, Roman aqueducts or bridges. With the rediscovery of perspective during the Renaissance, the handling of the ellipse was brought to perfection. Maybe the most famous example is the Capitol in Rome designed by Michelangelo Buonarroti, with its elliptical ground plan that looks like a circle as the visitor comes up the famous stairway. During the following centuries - driven by the further development of the natural sciences and the use of new construction materials, e.g. cast iron, steel or concrete - new design ideas could be realized. With the growing influence of mathematics on the design of buildings came the division into two professions: civil engineering and architecture. To the regret of the architects, the most innovative constructions were designed by civil engineers, e.g. the early iron bridges in Britain or the famous bridges of Robert Maillart. Nowadays we are in the situation that we try to reintegrate the divided professions.
We will return to that point later when discussing possible solutions to this problem. But let us continue our historical survey by demonstrating the state of the art we have today. As the logical consequence of parabolic and hyperbolic arcs, hyperbolic paraboloid shells were developed using traditional design techniques such as models and orthogonal sections. Now we reach the point where the question arises whether complex structures can be completely described using traditional methods - a question that must be answered with "no" if we take the final step to the completely irregular geometry of cable-net constructions or deconstructivist designs. What we see - and what seems to support our thesis of the connection between design tools and the results of the design process - is that, on the one hand, new tools enabled the designer to realize new ideas, and on the other hand, new ideas drove the development of new tools to realize them.

series eCAADe
more http://www.mediatecture.at/ecaade/91/ballheim_leppert.pdf
last changed 2022/06/07 07:50

_id 4eed
authors Benedickt, Michael (ed.)
year 1991
title Cyberspace: First Steps
source The MIT Press, Cambridge, MA and London, UK
summary Cyberspace has been defined as "an infinite artificial world where humans navigate in information-based space" and as "the ultimate computer-human interface." These original contributions take up the philosophical basis for cyberspace in virtual realities, basic communications principles, ramifications of cyberspace for future workplaces, and more.
series other
last changed 2003/04/23 15:14
