CumInCAD is a cumulative index of publications in Computer Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD Futures.


Hits 1 to 20 of 219

_id ga0010
authors Moroni, A., Von Zuben, F. and Manzolli, J.
year 2000
title ArTbitrariness in Music
source International Conference on Generative Art
summary Evolution is now considered not only powerful enough to bring about biological entities as complex as humans and consciousness, but also useful in simulation to create algorithms and structures of higher levels of complexity than could easily be built by design. In the context of artistic domains, the process of human-machine interaction is analyzed as a good framework to explore creativity and to produce results that could not be obtained without this interaction. When evolutionary computation and other computational intelligence methodologies are involved, we denote every attempt to improve aesthetic judgement as ArTbitrariness, interpreted as an interactive, iterative optimization process. ArTbitrariness is also suggested as an effective way to produce art through an efficient manipulation of information and a proper use of computational creativity to increase the complexity of the results without neglecting the aesthetic aspects [Moroni et al., 2000]. Our emphasis is on an approach to interactive music composition. The problem of computer generation of musical material has received extensive attention, and a subclass of the field of algorithmic composition includes those applications which use the computer as something in between an instrument, which a user "plays" through the application's interface, and a compositional aid, which a user experiments with in order to generate stimulating and varied musical material. This approach was adopted in Vox Populi, a hybrid made up of an instrument and a compositional environment. Unlike other genetic-algorithm or evolutionary-computation systems, in which people have to listen to and judge the musical items, Vox Populi uses the computer and the mouse as real-time music controllers, acting as a new interactive computer-based musical instrument. The interface is designed to be flexible, letting the user modify the music being generated. It explores evolutionary computation in the context of algorithmic composition and provides a graphical interface that allows the user to modify the tonal center and the voice range, changing the evolution of the music by using the mouse [Moroni et al., 1999]. A piece of music consists of several sets of musical material manipulated and exposed to the listener, for example pitches, harmonies, rhythms, timbres, etc. They are composed of a finite number of elements and, basically, the aim of a composer is to organize those elements in an aesthetic way. Modeling a piece as a dynamic system implies a view in which the composer draws trajectories or orbits using the elements of each set [Manzolli, 1991]. Nonlinear iterative mappings are associated with interface controls. In the next page two examples of nonlinear iterative mappings with their resulting musical pieces are shown. The mappings may give rise to attractors, defined as geometric figures that represent the set of stationary states of a non-linear dynamic system, or simply trajectories to which the system is attracted. The relevance of this approach goes beyond music applications per se. Computer music systems that are built on the basis of a solid theory can be coherently embedded into multimedia environments. The richness and specialty of the music domain are likely to initiate new thinking and ideas, which will have an impact on areas such as knowledge representation and planning, and on the design of visual formalisms and human-computer interfaces in general.
Above and below, the Vox Populi interface is depicted, showing two nonlinear iterative mappings with their resulting musical pieces. References: [Manzolli, 1991] J. Manzolli, Harmonic Strange Attractors, CEM BULLETIN, Vol. 2, No. 2, pp. 4-7, 1991. [Moroni et al., 1999] A. Moroni, J. Manzolli, F. Von Zuben and R. Gudwin, Evolutionary Computation Applied to Algorithmic Composition, Proceedings of CEC99 - IEEE International Conference on Evolutionary Computation, Washington D.C., pp. 807-811, 1999. [Moroni et al., 2000] A. Moroni, F. Von Zuben and J. Manzolli, ArTbitration, Proceedings of the 2000 Genetic and Evolutionary Computation Conference Workshop Program (GECCO), Las Vegas, USA, pp. 143-145, 2000.
series other
email
more http://www.generativeart.com/
last changed 2003/08/07 17:25
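The abstract above ties musical output to nonlinear iterative mappings whose orbits settle onto attractors. The following is a minimal sketch of that idea, assuming a logistic map and MIDI-style pitch quantization; neither is taken from the paper, and the real Vox Populi mapping and interface controls are far richer.

```python
# A minimal sketch, assuming a logistic map and MIDI-style pitch quantization
# (neither is taken from the paper): a nonlinear iterative mapping whose
# orbit is read off as pitches inside a user-controlled voice range.
def logistic_orbit(x0, r, steps):
    """Iterate the logistic map x -> r * x * (1 - x) and yield the orbit."""
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)
        yield x

def orbit_to_pitches(orbit, low=48, high=72):
    """Quantize orbit values in [0, 1] onto MIDI pitches in [low, high]."""
    span = high - low
    return [low + int(round(v * span)) for v in orbit]

if __name__ == "__main__":
    # r near 3.9 gives a chaotic orbit (a wandering melody); r < 3.0 converges
    # to a fixed point, i.e. the orbit is attracted to a single repeated pitch.
    print(orbit_to_pitches(logistic_orbit(0.31, 3.9, 16)))
```

Changing r between the convergent and chaotic regimes illustrates the attractor behaviour the abstract alludes to: a fixed point collapses the melody to one repeated pitch, while a chaotic orbit keeps generating varied material.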

_id fd70
authors Goldman, Glenn and Zdepski, Michael Stephen (Eds.)
year 1991
title Reality and Virtual Reality [Conference Proceedings]
doi https://doi.org/10.52842/conf.acadia.1991
source ACADIA Conference Proceedings / ISBN 1-880250-00-4 / Los Angeles (California - USA) October 1991, 236 p.
summary During the past ten years computers in architecture have evolved from machines used for analytic and numeric calculation, to machines used for generating dynamic images, permitting the creation of photorealistic renderings, and now, in a preliminary way, permitting the simulation of virtual environments. Digital systems have evolved from increasing the speed of human operations, to providing entirely new means for creating, viewing and analyzing data. The following essays illustrate the growing spectrum of computer applications in architecture. They discuss developments in the simulation of future environments on the luminous screen and in virtual space. They investigate new methods and theories for the generation of architectural color, texture, and form. Authors address the complex technical issues of "intelligent" models and their associated analytic contents. There are attempts to categorize and make accessible architects' perceptions of various models of "reality". Much of what is presented foreshadows changes that are taking place in the areas of design theory, building sciences, architectural graphics, and computer research. The work presented is both developmental, evolving from work done before or in other fields, and unique, exploring new themes and concepts. The application of computer technology to the practice of architecture has had a cross-disciplinary effect, as computer algorithms used to generate the "unreal" environments and actors of the motion picture industry are applied to the prediction of buildings and urban landscapes not yet in existence. Buildings and places from history are archeologically "re-constructed", providing digital simulations that enable designers to study that which has previously (or never) existed. Applications of concepts from scientific visualization suggest new methods for understanding the highly interrelated aspects of the architectural sciences: structural systems, environmental control systems, building economics, etc. Simulation systems from the aerospace industry and computer media fields propose new non-physical three-dimensional worlds. Video compositing technology from the television industry and the practice of medicine is now applied to the compositing of existing environments with proposed buildings. Whether based in architectural research or practice, many authors continue to question the development of contemporary computer systems. They seek new interfaces between human and machine, new methods for simulating architectural information digitally, and new ways of conceptualizing the process of architectural design. While the practice of architecture has, of necessity, been primarily concerned with increasing productivity - and automation for improved efficiency - it is clear that university-based studies and research continue to go beyond the electronic replication of manual tasks and study issues that can change the processes of architectural design - and ultimately perhaps, the products.
series ACADIA
email
more http://www.acadia.org
last changed 2022/06/07 07:49

_id eaca
authors Davis, L. (ed.)
year 1991
title Handbook of genetic algorithms
source Van Nostrand Reinhold, New York
summary This book sets out to explain what genetic algorithms are and how they can be used to solve real-world problems. The first objective is tackled by the editor, Lawrence Davis. The remainder of the book is turned over to a series of short review articles by a collection of authors, each explaining how genetic algorithms have been applied to problems in their own specific area of interest. The first part of the book introduces the fundamental genetic algorithm (GA), explains how it has traditionally been designed and implemented and shows how the basic technique may be applied to a very simple numerical optimisation problem. The basic technique is then altered and refined in a number of ways, with the effects of each change being measured by comparison against the performance of the original. In this way, the reader is provided with an uncluttered introduction to the technique and learns to appreciate why certain variants of GA have become more popular than others in the scientific community. Davis stresses that the choice of a suitable representation for the problem in hand is a key step in applying the GA, as is the selection of suitable techniques for generating new solutions from old. He is refreshingly open in admitting that much of the business of adapting the GA to specific problems owes more to art than to science. It is nice to see the terminology associated with this subject explained, with the author stressing that much of the field is still an active area of research. Few assumptions are made about the reader's mathematical background. The second part of the book contains thirteen cameo descriptions of how genetic algorithmic techniques have been, or are being, applied to a diverse range of problems. Thus, one group of authors explains how the technique has been used for modelling arms races between neighbouring countries (a non-linear, dynamical system), while another group describes its use in deciding design trade-offs for military aircraft. My own favourite is a rather charming account of how the GA was applied to a series of scheduling problems. Having attempted something of this sort with Simulated Annealing, I found it refreshing to see the authors highlighting some of the problems that they had encountered, rather than sweeping them under the carpet as is so often done in the scientific literature. The editor points out that there are standard GA tools available for either play or serious development work. Two of these (GENESIS and OOGA) are described in a short, third part of the book. As is so often the case nowadays, it is possible to obtain a diskette containing both systems by sending your Visa card details (or $60) to an address in the USA.
series other
last changed 2003/04/23 15:14
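The review above describes the canonical generational GA: choose a representation, generate new solutions from old ones, and compare variants on a very simple numerical optimisation problem. Below is a minimal sketch of that basic loop on a toy bitstring problem; the parameter values and the one-max objective are illustrative choices, not examples from the book.

```python
import random

# Minimal generational GA sketch on a toy problem: maximize the number of
# 1-bits in a bitstring. Parameters are illustrative only.
POP, LEN, GENS, P_MUT = 30, 20, 40, 0.02

def fitness(bits):
    return sum(bits)

def crossover(a, b):
    cut = random.randint(1, LEN - 1)          # one-point crossover
    return a[:cut] + b[cut:]

def mutate(bits):
    return [1 - g if random.random() < P_MUT else g for g in bits]

def select(pop):
    a, b = random.sample(pop, 2)              # binary tournament selection
    return a if fitness(a) >= fitness(b) else b

pop = [[random.randint(0, 1) for _ in range(LEN)] for _ in range(POP)]
for _ in range(GENS):
    pop = [mutate(crossover(select(pop), select(pop))) for _ in range(POP)]

print(max(fitness(ind) for ind in pop))       # best fitness found
```

Swapping the fitness function, representation, or operators is exactly the kind of refinement the book's first part measures against this baseline.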

_id 58cd
authors Schnoedt, Heinrich
year 1991
title Cultural Parametrics
doi https://doi.org/10.52842/conf.acadia.1991.223
source Reality and Virtual Reality [ACADIA Conference Proceedings / ISBN 1-880250-00-4] Los Angeles (California - USA) October 1991, pp. 223-234
summary The human desire for automation of repetitive processes offers opportunities for the employment of binary computing for these procedures. Architecture and the design of buildings is no exception. With an increase in the industrial prefabrication of moderately variable building components, the focus of the practising architect shifts from the individual design process toward a selection process of parts or components with a defined parametric extent. While this concept of parameterized parts has been used by architects since the first repetitive part was available, the advent of modern CAAD systems, with a growing number of parametric components and parts already integrated, is likely to greatly amplify the impact of predefined parts on buildings. Both industry and research institutions continue to make a great effort to utilize building codes and organizational structures as the basis for developing sophisticated algorithms of rule-based design. The purpose of the parameterization of parts or concepts is twofold: to reduce the time frame of human labor on the design of pieces and concepts which are considered repetitive, and to install a control mechanism to eliminate mistakes which lie outside of the parametric framework. The implementation of these algorithms in architectural practice and in the educational environment suggests consequences on many levels. In the following, an attempt is made to cast some light on the history of parametrics with respect to computing and the problems associated with a predominantly numerically encoded parametric approach.
series ACADIA
email
last changed 2022/06/07 07:56

_id b5be
authors Stok, Leon
year 1991
title Architectural synthesis and optimization of digital systems
source Eindhoven University of Technology
summary High-level synthesis means going from a functional specification of a digital system at the algorithmic level to a register-transfer-level structure. Different applications will ask for different design styles. Despite this diversity in design styles, many tasks in the synthesis will be similar. There is no need to write a new synthesis system for each design style. The best way to go seems to be a decomposition of the high-level synthesis problem into several well-defined subproblems. How the problem is decomposed depends heavily on a) the type of network architecture chosen, b) the constraints applied to the design and c) the functional description itself. From this architecture style, the constraints and the functional description a synthesis scheme can be derived. Once this scheme is fixed, algorithms can be chosen which fit into this scheme and solve the subproblems in a fast and, when possible, optimal way. To support such a synthesis philosophy, a framework is needed in which all design information can be stored in a unique way during the various phases of the design process. This asks for a design database capable of handling all design information, with a formally defined interface to all design tools. This thesis gives a formal way to describe the functional representation, the register-transfer-level structure and the controller, and the relations between all three of them. Special attention has been paid to the efficient representation of mutually exclusive operations and array accesses. The scheduling and allocation problems are defined as mappings between these formal representations. Both the existing synthesis algorithms and the new algorithms described in this thesis fit into this framework. Three new allocation algorithms are presented in this thesis: an algorithm for optimal register allocation in cyclic data flow graphs, an exact polynomial algorithm for module allocation, and a new scheme to minimize the number of interconnections during all stages of the data path allocation. Cyclic data flow graphs result from high-level behavioral descriptions that contain loops. Algorithms for register allocation in high-level synthesis published up till now only considered loop-free data flow graphs. When these algorithms are applied to data flow graphs with loops, unnecessary register transfer operations are introduced. A new algorithm is presented that performs a minimal register allocation and eliminates all superfluous register transfer operations. The problem is reformulated as a multicommodity network flow problem for which very efficient solutions exist. Experiments on a benchmark set have shown that in all test cases all register transfers could be eliminated at no increase in register cost. Only heuristic algorithms have appeared in the literature to solve the module allocation problem. The module allocation problem is usually defined as a clique cover problem on a so-called module allocation graph. It is shown that, under certain conditions, the module allocation graph belongs to the special class of comparability graphs. A polynomial-time algorithm can optimally find a clique cover of such a graph. Even when interconnect weights are taken into account, this can be solved exactly: the problem can be transformed into a maximal-cost network flow problem, which can be solved exactly in polynomial time.
An algorithm is described which solves the module allocation problem with interconnect weights exactly, with a complexity of O(kn²), where n is the number of operations. In previous research, interconnection was optimized after the module allocation for the operations and the register allocation for the variables had already been done. However, the amount of multiplexing and interconnect is a crucial factor for both the delay and the area of a circuit. A new scheme is presented to minimize the number of interconnections during the data path allocation. This scheme first groups all values based on their read and write times. Values belonging to the same group can share a register file. This minimizes the number of data transfers with different sources and destinations. Secondly, registers are allocated for each group separately. Finally the interconnect allocation is done. During the interconnect allocation, the module allocation is determined. The value grouping is based on edge-coloring algorithms, providing a sharp upper bound on the number of colors needed. Two techniques, splitting the read and write phases of values and introducing serial (re-)write operations for the same value, make it possible to use even more efficient exact edge-coloring algorithms. It is shown that when variables are grouped into register files and operations are assigned to modules during the interconnection minimization, significant savings (20%) can be obtained in the number of local interconnections and the amount of global interconnect, at the expense of only slightly more register area.
keywords Digital Systems
series thesis:PhD
email
last changed 2003/02/12 22:37
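The abstract above contrasts the loop-free register allocation case with the thesis's network-flow formulation for cyclic data flow graphs. As background only, here is a minimal sketch of a classic left-edge style allocation for loop-free value lifetimes (values share a register when their lifetimes do not overlap); it is a simplified baseline under stated assumptions, not the multicommodity-flow algorithm of the thesis.

```python
# Minimal left-edge style register allocation for loop-free lifetimes:
# values with non-overlapping (birth, death) intervals share a register.
# This is the baseline the abstract contrasts against, not Stok's method.
def left_edge(lifetimes):
    """lifetimes: list of (birth, death) intervals; returns register index per value."""
    order = sorted(range(len(lifetimes)), key=lambda i: lifetimes[i][0])
    reg_of = [None] * len(lifetimes)
    reg_free_at = []                      # time at which each register becomes free
    for i in order:
        birth, death = lifetimes[i]
        for r, free_at in enumerate(reg_free_at):
            if free_at <= birth:          # register r is free again: reuse it
                reg_of[i] = r
                reg_free_at[r] = death
                break
        else:                             # no free register: allocate a new one
            reg_of[i] = len(reg_free_at)
            reg_free_at.append(death)
    return reg_of

# Example: four values, pairs of which can share a register.
print(left_edge([(0, 3), (1, 4), (3, 6), (4, 7)]))   # -> [0, 1, 0, 1]
```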

_id 22d6
authors Ballheim, F. and Leppert, J.
year 1991
title Architecture with Machines, Principles and Examples of CAAD-Education at the Technische Universität München
doi https://doi.org/10.52842/conf.ecaade.1991.x.h3w
source Experiences with CAAD in Education and Practice [eCAADe Conference Proceedings] Munich (Germany) 17-19 October 1991
summary "Design tools affect the results of the design process" - this is the starting point of our considerations about the efficient use of CAAD within architecture. To give you a short overview about what we want to say with this thesis lets have a short - an surely incomplete - trip through the fourth dimension back into the early time of civil engineering. As CAD in our faculty is integrated in the "Lehrstuhl für Hochbaustatik und Tragwerksplanung" (if we try to say it in English it would approximately be "institute of structural design"), we chose an example we are very familiar with because of its mathematical background - the cone sections: Circle, ellipse, parabola and hyperbola. If we start our trip two thousand years ago we only find the circle - or in very few cases the ellipse - in their use for the ground plan of greek or roman theaters - if you think of Greek amphitheaters or the Colosseum in Rome - or for the design of the cross section of a building - for example the Pantheon, roman aqueducts or bridges. With the rediscovery of the perspective during the Renaissance the handling of the ellipse was brought to perfection. May be the most famous example is the Capitol in Rome designed by Michelangelo Buonarotti with its elliptical ground plan that looks like a circle if the visitor comes up the famous stairway. During the following centuries - caused by the further development of the natural sciences and the use of new construction materials, i.e. cast-iron, steel or concrete - new design ideas could be realized. With the growing influence of mathematics on the design of buildings we got the division into two professions: Civil engineering and architecture. To the regret of the architects the most innovative constructions were designed by civil engineers, e.g. the early iron bridges in Britain or the famous bridges of Robert Maillard. Nowadays we are in the situation that we try to reintegrate the divided professions. We will return to that point later discussing possible solutions of this problem. But let us continue our 'historical survey demonstrating the state of the art we have today. As the logical consequence of the parabolic and hyperbolic arcs the hyperbolic parabolic shells were developed using traditional design techniques like models and orthogonal sections. Now we reach the point where the question comes up whether complex structures can be completely described by using traditional methods. A question that can be answered by "no" if we take the final step to the completely irregular geometry of cable- net-constructions or deconstructivistic designs. What we see - and what seems to support our thesis of the connection between design tools and the results of the design process - is, that on the one hand new tools enabled the designer to realize new ideas and on the other hand new ideas affected the development of new tools to realize them.

series eCAADe
more http://www.mediatecture.at/ecaade/91/ballheim_leppert.pdf
last changed 2022/06/07 07:50

_id 0b1c
authors Bridges, Alan
year 1991
title Computer Exercises in Architectural Design Theory
doi https://doi.org/10.52842/conf.ecaade.1991.x.f9w
source Experiences with CAAD in Education and Practice [eCAADe Conference Proceedings] Munich (Germany) 17-19 October 1991
summary This paper discusses how architectural theory may be taught using computer-based exercises to explore the practical application of those theories. The particular view of architecture developed is, necessarily, a restricted one, but the objectives behind the exercises are slightly different from those that a pure architectural theorist or historian might have. The formal teaching of architectural theory and composition has not been very fashionable in Schools of Architecture for several years now: indeed there is a considerable inbuilt resistance in students to the application of any form of rules or procedures. There is however a general interest in computing, and this can be utilised to advantage. In concentrating on computer applications in design, eclectic use has been made of a number of architectural examples ranging from Greek temples to the work of modern deconstructionists. Architectural theory since Vitruvius is littered with attempts to define universal theories of design, and this paper certainly does not presume to anything so grand: I have merely looked at buildings, compared them and noted what they have in common and how that might relate to computer-aided design. I have ignored completely any sociological, philosophical or phenomenological questions but would readily agree with the criticism that Cartesian rationality is not, on its own, a sufficient base upon which to build a theory of design. However, I believe there is merit in articulating design by separating it from other concerns and making it a subject of study in its own right. Work in design research will provide the models and intellectual structures to facilitate discourse about design and might be expected to benefit the development of design skills by providing material that could be formally taught and debated in a way that is removed from the ephemeral "fashionable designer" debate. Of course, some of the ideas discussed here may prove to be equally ephemeral, but that does not entirely negate their value.

series eCAADe
email
last changed 2022/06/07 07:50

_id ga9921
authors Coates, P.S. and Hazarika, L.
year 1999
title The use of genetic programming for applications in the field of spatial composition
source International Conference on Generative Art
summary Architectural design teaching using computers has been a preoccupation of CECA since 1991. All design tutors provide their students with a set of models and ways to form, and we have explored a set of approaches including cellular automata, genetic programming, agent-based modelling and shape grammars as additional tools with which to explore architectural (and architectonic) ideas. This paper discusses the use of genetic programming (G.P.) for applications in the field of spatial composition. CECA has been developing the use of genetic programming for some time (see references) and has covered the evolution of L-system production rules (Coates 1997, 1999b) and the evolution of generative grammars of form (Coates 1998, 1999a). The G.P. was used to generate three-dimensional spatial forms from a set of geometrical structures. The approach uses genetic programming with a Genetic Library (G.Lib). G.P. provides a way to genetically breed a computer program to solve a problem; G.Lib enables genetic programming to define potentially useful subroutines dynamically during a run. The work includes: * exploring a shape grammar consisting of simple solid primitives and transformations; * applying a simple fitness function to the solid-breeding G.P.; * exploring a shape grammar of composite surface objects; * developing grammars for existing buildings, and creating hybrids; * exploring the shape grammar of a building within a G.P. We will report on new work using a range of different morphologies (boolean operations, surface operations and grammars of style) and describe the use of objective functions (natural selection) and the "eyeball test" (artificial selection) as ways of controlling and exploring the design spaces thus defined.
series other
more http://www.generativeart.com/
last changed 2003/08/07 17:25
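To make the genetic-programming idea in the abstract above concrete, here is a deliberately toy sketch: individuals are expression trees over an assumed three-symbol shape vocabulary, and a simple volume-based objective function stands in for the paper's fitness functions and "eyeball test". None of the symbols or parameters are taken from the paper.

```python
import random

# Toy GP sketch under broad assumptions (not CECA's grammar or G.Lib):
#   ('union', left, right)  combine two sub-forms
#   ('scale', s, sub)       scale a sub-form by factor s
#   ('cube',)               unit cube terminal
def random_tree(depth=3):
    if depth == 0 or random.random() < 0.3:
        return ('cube',)
    if random.random() < 0.5:
        return ('union', random_tree(depth - 1), random_tree(depth - 1))
    return ('scale', random.uniform(0.5, 2.0), random_tree(depth - 1))

def volume(tree):
    if tree[0] == 'cube':
        return 1.0
    if tree[0] == 'scale':
        return (tree[1] ** 3) * volume(tree[2])
    return volume(tree[1]) + volume(tree[2])        # 'union'

def mutate(tree):
    if random.random() < 0.2:
        return random_tree(2)                       # replace a subtree
    if tree[0] == 'union':
        return ('union', mutate(tree[1]), mutate(tree[2]))
    if tree[0] == 'scale':
        return ('scale', tree[1], mutate(tree[2]))
    return tree

TARGET = 10.0
def fitness(tree):
    return -abs(volume(tree) - TARGET)              # objective function

pop = [random_tree() for _ in range(40)]
for _ in range(30):
    pop.sort(key=fitness, reverse=True)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(30)]

print(round(volume(max(pop, key=fitness)), 2))      # close to TARGET
```

Replacing the numeric objective with a human choice at each generation would turn this into the artificial-selection "eyeball test" the paper describes.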

_id 098a
authors Perron, Richard and Miller, Deron
year 1991
title Landscape of the Mind
doi https://doi.org/10.52842/conf.acadia.1991.071
source Reality and Virtual Reality [ACADIA Conference Proceedings / ISBN 1-880250-00-4] Los Angeles (California - USA) October 1991, pp. 71-86
summary The focus of this article is the exploration of landscape and the question of representation, more specifically how landscape principles can be represented through computation. It is a quest for essential qualities, through an application of philosophical questioning, and a response to a human perception of reality. Reality, as an invention of the human mind, is often thought of as a set of accepted conventions and constructs. Such a reality has an inherent dependency upon cognition, where spatial and temporal principles may be defined within the natural and built environment, and further embraced within a cultural context. However, there also exist rules or relations that are neither invented nor formulated by the participant's understanding. In effect these relations may not have been effectively articulated, a result perhaps of unfamiliar cues. Therefore, to the participant, these relations reside in the realm of the unknown or even the mystic. The aesthetic often resides in the realm of the mystic. The discovery of the aesthetic is often an experience that comes from encountering physical and essential beauty where it has been produced through unconscious relations, perceived, yet transcending human understanding. The aspects of space and time, spatial and temporal properties and relations of things and events, are generally accepted conventions. Yet the existence of a time order is often not perceived. An understanding of spatial-temporal properties may involve a temporal detachment from convention, allowing the release of previously unknown patterns and relations. Virtual realities are well-constructed simulations of our environments, yet they may lack the embedded essential qualities of place. Virtual reality should transcend human perception and traditional modes of understanding, and most importantly our limited notions of the temporal nature of our environment. A desire to reach beyond the limits of perceived time order may take us beyond existing sets of cultural values, and lead to the realization of new spatial/temporal conventions with the assistance of the computer.
series ACADIA
last changed 2022/06/07 08:00

_id f14c
authors Sariyildiz, Sevil
year 1991
title Conceptual Design by Means of Islamic-Geometric-Patterns within a CAAD-Environment
source Delft University of Technology
summary The starting point of this research was to develop a 3D grammar theory on top of existing 2D Islamic geometric patterns, trying to rescue their fundamental geometric content to be applied in contemporary architecture without compromising any architectural style. As is self-evident, the architectural design process consists of clearly distinct stages, namely conceptual design, materialisation and further completion. At the conceptual stage, the innovative item of the research deals with pattern grammars on complex 3D geometrical patterns, considering them as polyhedra and polytopes, for their use as an underlayer to a concept design, much as architects conventionally use 2D rectangular and triangular grids. Handling these complex 3D patterns requires a special environment, which is possible with CAAD. Within the CAAD environment, the handling of these complex patterns is easily done by means of 3D tools, because the 3D tools permit the user to make any possible manipulations and geometrical transformations more easily in space. Geometrical patterns have received some attention from scholars during the last 50 years. The most complex geometrical patterns are highly developed in Islamic architecture, because the Muslim religion forbids the use of portraits or sculptures of human beings in religious buildings. All these approaches to complex patterns have been analysed and studied as 2D elements. The question was how we could consider them in three dimensions and use them, instead of 2D underlayers, as 3D underlayers in the conceptual phase of CAAD design. Pattern grammar is a generally employable aid (underlying pattern) for conceptual and material designs. On the basis of rules of symmetry and substitution, ordering principles have been worked out which can be used for formal design methods as well as detailing systems (e.g. modular coordination). Through the realization of a pattern grammar a wider range of underlying patterns can be offered, and a choice from these can be made in a more fundamental manner. At a subsequent stage the collection of "empty boxes" can be filled with (architectural) elements in such a way that another option is created between filling the boxes completely, filling them partly, or filling them in such a way that they overflow. It is self-evident that underlying patterns can also be used for details and decoration in a design. Concerning the materialisation of the concept design within the 3D CAAD environment, substitution methods are partially developed. Further theoretical developments concerning the materialisation phase, constantly backed up through feedback with specialist matters (e.g. by means of expert systems and decision-support systems), must be worked out. As feedback from the research, the possibilities of designing with 3D patterns have been tested and the procedures are explained. (*) Working with 3D patterns gives a designer more inspiration to develop new ideas and new concepts and provides the opportunity to handle complexity. (*) The formal, structural and symmetrical qualities of geometrical patterns have a positive influence on the industrialisation of building components. (*) Working with 3D tools which are able to handle complex geometry means, because of the accuracy of the information, that hardly any mistakes are made during the preparation and assembly of the building components. This also has positive results concerning the financial aspects of the building process.
series thesis:PhD
email
last changed 2003/02/12 22:37
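As a loose illustration of the symmetry rules such an underlying pattern grammar encodes, here is a minimal 2D sketch that replicates a motif under n-fold rotational symmetry; it is a simplified example of my own and does not reproduce the thesis's 3D polyhedra and polytopes.

```python
import math

# Minimal sketch (an assumption, not the thesis's 3D pattern grammar):
# replicate a 2D motif under n-fold rotational symmetry about the origin.
def rotate(point, angle):
    x, y = point
    return (x * math.cos(angle) - y * math.sin(angle),
            x * math.sin(angle) + y * math.cos(angle))

def rosette(motif, n):
    """Apply n-fold rotational symmetry to every point of the motif."""
    pattern = []
    for k in range(n):
        angle = 2.0 * math.pi * k / n
        pattern.extend(rotate(p, angle) for p in motif)
    return pattern

# An 8-fold rosette, as in many Islamic star patterns, from a 2-point motif.
points = rosette([(1.0, 0.0), (1.3, 0.2)], 8)
print(len(points))      # 16 generated points
```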

_id 86c1
authors Shih, Shen-Guan
year 1991
title Case-based Representation and Adaptation in Design
source Computer Aided Architectural Design Futures: Education, Research, Applications [CAAD Futures ‘91 Conference Proceedings / ISBN 3-528-08821-4] Zürich (Switzerland), July 1991, pp. 301-312
summary By attempting to model the raw memory of experts, case-based reasoning is distinguished from traditional expert systems, which compile experts' knowledge into rules before new problems are given. A case-based reasoning system processes new problems with the most similar prior experiences available, and adapts the prior solutions to solve the new problems. Case-based representation of design knowledge utilizes the desirable features of the selected case as syntax rules to adapt the case to a new context. As the central issue of the paper, three types of adaptation aimed at topological modifications are described. The first type - case-based search - can be viewed as a localized search process. It follows the syntactical structure of the case to search for variations which provide the required functionality. Regarding the complexity of computation, it is recognized that when a context-sensitive grammar is used to describe the desirable features, the search process becomes intractable. The second type of adaptation can be viewed as a process of self-organization, in which context-sensitive grammars play an essential role. Evaluations have to be simulated by local interaction among design primitives. The third type is called direct transduction. A case is translated directly to another structure according to its syntax by some translation functions. A direct transduction is not necessarily a composition of design operators and thus a cross-contextual mapping is possible. As a prospective use of these adaptation methods, a CAD system which provides designers with the ability to modify the syntactical structure of a group of design elements, according to some concerned semantics, would support designers better than current CAD systems.
series CAAD Futures
last changed 1999/04/07 12:03
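The retrieve-then-adapt cycle described at the start of the abstract above can be sketched very simply. The feature dictionaries and the overwrite-style adaptation below are illustrative assumptions only; the paper's own adaptation types are grammar-based and considerably richer.

```python
# Minimal retrieve-and-adapt sketch under simplifying assumptions:
# cases are feature dictionaries, similarity counts matching features,
# and "adaptation" overwrites features that the new problem specifies.
def similarity(case_problem, new_problem):
    shared = set(case_problem) & set(new_problem)
    return sum(case_problem[k] == new_problem[k] for k in shared)

def retrieve(case_base, new_problem):
    return max(case_base, key=lambda c: similarity(c["problem"], new_problem))

def adapt(case, new_problem):
    solution = dict(case["solution"])
    solution.update({k: v for k, v in new_problem.items() if k in solution})
    return solution

case_base = [
    {"problem": {"rooms": 3, "site": "narrow"},
     "solution": {"rooms": 3, "layout": "linear"}},
    {"problem": {"rooms": 5, "site": "corner"},
     "solution": {"rooms": 5, "layout": "L-shaped"}},
]
new_problem = {"rooms": 4, "site": "narrow"}
print(adapt(retrieve(case_base, new_problem), new_problem))
```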

_id c00e
authors Tolman, F. P. and Kuiper, P.
year 1991
title Some Integration Requirements for Computer Integrated Building
source The Computer Integrated Future, CIB W78 Seminar. september, 1991. Unnumbered : ill. includes a short bibliography
summary The introduction of computer technology in the building and construction industries follows a bottom-up approach. Bottom-up approaches always lead to (1) communication problems on higher levels -- in this case recognized as 'islands of automation' -- followed, more recently, by (2) a plea for integration. Although the word 'integration' quickly became in vogue, it is not clear what it really means and what it is that we are supposed to integrate. Another interesting and pressing question is: 'How to integrate the different integration efforts?' The paper discusses five hierarchical technical levels of integration. Each level is elaborated in some detail. The relations between the levels are also brought into perspective. Non-technical integration requirements (e.g. social, organizational, or legal) are not discussed
keywords integration, systems, CAD, building, construction
series CADline
last changed 2003/06/02 10:24

_id 8b1e
authors Blinn, James F.
year 1991
title A Trip Down the Graphics Pipeline: Line Clipping
source IEEE Computer Graphics and Applications January, 1991. vol. 11: pp. 98-105 : ill. includes bibliography.
summary The classic computer graphics pipeline is an assembly-line-like process that geometric objects must go through on their journey to becoming pixels on the screen. This is the first of a series of columns on the graphics pipeline. In this column the author concentrates on the algorithmic aspects of the line-clipping part of the pipeline
keywords clipping, algorithms, computer graphics
series CADline
last changed 2003/06/02 13:58
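As background to the column's topic, here is a minimal sketch of parametric line clipping against an axis-aligned window, in a Liang-Barsky-style formulation chosen for brevity; it is not claimed to be the derivation Blinn develops in the column.

```python
# Minimal parametric line-clipping sketch against an axis-aligned window
# (a Liang-Barsky-style formulation, not necessarily the column's own).
def clip_line(p0, p1, xmin, ymin, xmax, ymax):
    """Return the clipped segment (q0, q1), or None if fully outside."""
    (x0, y0), (x1, y1) = p0, p1
    dx, dy = x1 - x0, y1 - y0
    t_in, t_out = 0.0, 1.0
    # Each (p, q) pair tests the segment against one window edge.
    for p, q in ((-dx, x0 - xmin), (dx, xmax - x0),
                 (-dy, y0 - ymin), (dy, ymax - y0)):
        if p == 0:
            if q < 0:
                return None                 # parallel to the edge and outside
        else:
            t = q / p
            if p < 0:
                t_in = max(t_in, t)         # entering the window
            else:
                t_out = min(t_out, t)       # leaving the window
    if t_in > t_out:
        return None                         # crossing times out of order
    return ((x0 + t_in * dx, y0 + t_in * dy),
            (x0 + t_out * dx, y0 + t_out * dy))

print(clip_line((-2, 1), (6, 3), 0, 0, 4, 4))   # -> ((0.0, 1.5), (4.0, 2.5))
```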

_id e336
authors Achten, H., Roelen, W., Boekholt, J.-Th., Turksma, A. and Jessurun, J.
year 1999
title Virtual Reality in the Design Studio: The Eindhoven Perspective
doi https://doi.org/10.52842/conf.ecaade.1999.169
source Architectural Computing from Turing to 2000 [eCAADe Conference Proceedings / ISBN 0-9523687-5-7] Liverpool (UK) 15-17 September 1999, pp. 169-177
summary Since 1991 Virtual Reality has been used in student projects in the Building Information Technology group. It started as an experimental tool to assess the impact of VR technology on design, using the environment of the associated Calibre Institute. The technology was further developed in Calibre to become an important presentation tool for assessing design variants and final design solutions. However, it was only sporadically used in student projects. A major shift occurred in 1997 with a number of student projects in which various computer technologies, including VR, were used throughout the design process. In 1998, the new Design Systems group started a design studio with the explicit aim of integrating VR in the whole design process. The teaching effort was combined with the research program that investigates VR as a design support environment. This has led to an increasing number of innovative student projects. The paper describes the context and history of VR in Eindhoven and presents the current set-up of the studio. It discusses the impact of the technology on the design process and outlines pedagogical issues in the studio work.
keywords Virtual Reality, Design Studio, Student Projects
series eCAADe
email
last changed 2022/06/07 07:54

_id ecaade2010_040
authors Akdag, Suzan Girginkaya; Cagdas, Gulen; Guney, Caner
year 2010
title Analyzing the Changes of Bosphorus Silhouette
doi https://doi.org/10.52842/conf.ecaade.2010.815
source FUTURE CITIES [28th eCAADe Conference Proceedings / ISBN 978-0-9541183-9-6] ETH Zurich (Switzerland) 15-18 September 2010, pp.815-823
summary Due to improving technology and global competition, today the sky is the only limit for the high towers of metropolitan areas. The increase in the number of high-rise buildings has been ruining the silhouettes of cities all over the world, including Istanbul, whose identity and image have also been destroyed by skyscrapers dominating the seven slopes on which it was once built. Urbanization in Istanbul has become homogenous and destructive of the topography. Despite recurring debates on this critical issue, no analytical approach has ever been introduced. The research therefore aims to analyze the change in the Bosphorus silhouette caused by the emergence of high-rise blocks along the Zincirlikuyu-Maslak route since it was defined as a Central Business District and a high-rise development area by the Bosphorus Conservation Law in 1991. ArcGIS Desktop software and its analyst extensions are used for mapping, analyzing and evaluating the urban development over the years. The application is considered to be the initial step toward a decision support system which will assist in assigning ground for high-rise buildings in Istanbul.
wos WOS:000340629400087
keywords GIS; Bosphorus; Silhouette analysis; High rise buildings
series eCAADe
email
last changed 2022/06/07 07:54

_id 2560
authors Alkhoven, Patricia
year 1991
title The Reconstruction of the Past: The Application of New Techniques for Visualization and Research in Architectural History
source Computer Aided Architectural Design Futures: Education, Research, Applications [CAAD Futures ‘91 Conference Proceedings / ISBN 3-528-08821-4] Zürich (Switzerland), July 1991, pp. 549-566
summary This paper focuses on the visualization of historical architecture. The application of new Computer-Aided-Architectural-Design techniques for visualization on micro computers provides a technique for reconstructing and analyzing architectural objects from the past. The pilot project describes a case study in which the historical transformation of a town will be analyzed by using three-dimensional CAD models in combination with bitmap textures. The transformation of the historic town will be visualized in a space-time computer model in which bitmap textures enable us to display complex and relatively large architectural objects in detail. This three-dimensional descriptive model allows us to survey and analyze the history of architecture in its reconstructed context. It also provides a medium for researching the dynamics of urban management, since new combinations and arrangements with the individual architectural objects can be created. In this way, a new synthesis of the graphic material can reveal typologies and the architectural ordering system of a town.
keywords 3D City modeling
series CAAD Futures
last changed 2003/11/21 15:15

_id 0ab2
authors Amor, R., Hosking, J., Groves, L. and Donn, M.
year 1993
title Design Tool Integration: Model Flexibility for the Building Profession
source Proceedings of Building Systems Automation - Integration, University of Wisconsin-Madison
summary The development of ICAtect, as discussed in the Building Systems Automation and Integration Symposium of 1991, provides a way of integrating simulation tools through a common building model. However, ICAtect is only a small step towards the ultimate goal of total integration and automation of the building design process. In this paper we investigate the next steps on the path toward integration. We examine how models structured to capture the physical attributes of the building, as required by simulation tools, can be used to converse with knowledge-based systems. We consider the types of mappings that occur in the often different views of a building held by these two classes of design tools. This leads us to examine the need for multiple views of a common building model. We then extend our analysis from the views required by simulation and knowledge-based systems, to those required by different segments of the building profession (e.g. architects, engineers, developers, etc.) to converse with such an integrated system. This indicates a need to provide a flexible method of accessing data in the common building model to facilitate use by different building professionals with varying specialities and levels of expertise.
series journal paper
email
last changed 2003/05/15 21:22

_id f9bd
authors Amor, R.W.
year 1991
title ICAtect: Integrating Design Tools for Preliminary Architectural Design
source Wellington, New Zealand: Computer Science Department, Victoria University
summary ICAtect is a knowledge-based system that provides an interface between expert systems, simulation packages and CAD systems used for preliminary architectural design. This thesis describes its structure and development. The principal work discussed in this thesis involves the formulation of a method for representing a building. This is developed through an examination of a number of design tools used in architectural design, and the ways in which each of these describes a building. Methods of enabling data to be transferred between design tools are explored. A Common Building Model (CBM), forming the core of the ICAtect system, is developed to represent the design tools' knowledge of a building. This model covers the range of knowledge required by a large set of disparate design tools used by architects at the initial design stage. Standard methods of integrating information from the tools were examined, but required augmentation to encompass the unusual constraints found in some of the design tools. The integration of the design tools and the CBM is discussed in detail, with example methods developed for each type of design tool. These example methods provide a successful way of moving information between the different representations. Some problems with mapping data between very different representations were encountered in this process, and solutions or ideas for remedies are detailed. A model for the control and use of ICAtect is developed in the thesis, and the extensions needed to enable a graphical user interface are discussed. The methods developed in this thesis demonstrate the feasibility of an integrated system of this nature, while the discussion of future work indicates the scope and potential power of ICAtect.
series other
last changed 2003/04/23 15:14
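The Common Building Model idea in the abstract above (one shared representation with per-tool mappings) can be sketched schematically. The field names and the two tool views below are invented for illustration and are not ICAtect's actual schema.

```python
# Schematic sketch of a common building model with two per-tool views.
# All names and values are hypothetical, for illustration only.
common_model = {
    "zones": [
        {"name": "office", "floor_area_m2": 24.0, "window_area_m2": 6.0,
         "orientation": "north"},
    ],
    "construction": {"wall_u_value": 0.35, "glazing_u_value": 1.8},
}

def thermal_view(model):
    """Map the common model to what a thermal simulation tool might expect."""
    return [
        {"zone": z["name"],
         "floor_area": z["floor_area_m2"],
         "glazing_ratio": z["window_area_m2"] / z["floor_area_m2"],
         "wall_u": model["construction"]["wall_u_value"]}
        for z in model["zones"]
    ]

def daylight_view(model):
    """Map the same model to a daylighting tool's simpler view."""
    return [
        {"zone": z["name"],
         "window_area": z["window_area_m2"],
         "orientation": z["orientation"]}
        for z in model["zones"]
    ]

print(thermal_view(common_model))
print(daylight_view(common_model))
```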

_id a620
authors Asanowicz, Alexander
year 1991
title Unde et Quo
doi https://doi.org/10.52842/conf.ecaade.1991.x.t1s
source Experiences with CAAD in Education and Practice [eCAADe Conference Proceedings] Munich (Germany) 17-19 October 1991
summary To begin with, I would like to say a few words about the problem of the alienation of modern technologies, which we also inevitably faced when starting to teach CAD at our department. Quite often nowadays a technology becomes a fetish as a result of a lack of clear goals in the human mind. There are multiple technologies without a sense of purpose which have turned into pure experiments. There is always the danger of losing purposefulness and drifting toward alienation. The cause of the danger lies in forgetting the original goals while mastering and developing the technology. Eventually the original idea is ignored and a great gap appears between technical factors and creativity. We had the danger of alienation in mind when preparing the CAAD curriculum. Trying to avoid the tension between technical and creative elements, we agreed not to introduce CAD earlier than the fourth year of studies and to continue it for two semesters. One thing was clear - we should not teach the technique of CAD but how to design using a computer as a medium. Then we specified projects. The first was called "The bathroom I dream of" and was meant to be a 2D drawing. The four introductory meetings in fact taught the foundations of DOS; then a specific design followed with the help of the AutoCAD program. In the IX semester, for example, it was "A family house" (plans, facades, perspective). "I have to follow them - I am their leader," said L.J. Peter in "The Peter's Prescription". This quotation reflects exactly the situation we find ourselves in teaching CAAD at our department. It means that ever-growing student interest in CAAD made us introduce changes in the curriculum. According to the popular saying, "The more one gets the more one wants", and so we and the students felt after the first semester of teaching CAD. From autumn 1991 CAAD classes will be carried on from the third year of studies for two consecutive years. But before further planning one major step had to be taken - we decided to reverse the approach typical of the seventies, in which teaching programming languages preceded practical goals, thus discouraging many learners.

series eCAADe
email
last changed 2022/06/07 07:50

_id 9964
authors Augenbroe, G. and Winkelmann, F.
year 1991
title Integration of Simulation into the Building Design Process
source J.A. Clarke, J.W. Mitchell, and R.C. Van de Perre (eds.), Proceedings, Building Simulation '91 IBPSA Conference, pp. 367-374
summary We describe the need for a joint effort between design researchers and simulation tool developers in formulating procedures and standards for integrating simulation into the building design process. We review and discuss current efforts in the US and Europe in the development of next-generation simulation tools and design integration techniques. In particular, we describe initiatives in object-oriented simulation environments (including the US Energy Kernel System, the Swedish Ida system, the UK Energy Kernel System, and the French ZOOM program) and consider the relationship of these environments to recent R&D initiatives in design integration (the COMBINE project in Europe and the AEDOT project in the US).
series other
last changed 2003/11/21 15:16
