CumInCAD is a cumulative index of publications in Computer Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD Futures


Hits 1 to 20 of 70

_id ecaade2010_022
id ecaade2010_022
authors Al-kazzaz, Dhuha; Bridges, Alan; Chase, Scott
year 2010
title Shape Grammars for Innovative Hybrid Typological Design
source FUTURE CITIES [28th eCAADe Conference Proceedings / ISBN 978-0-9541183-9-6] ETH Zurich (Switzerland) 15-18 September 2010, pp.187-195
wos WOS:000340629400020
summary This paper describes a new methodology of deriving innovative hybrid designs using shape grammars of heterogeneous designs. The method is detailed within three phases of shape grammars: analysis, synthesis and evaluation. In the analysis phase, the research suggests that original rules of each design component are grouped in subclass rule sets to facilitate rule choices. Additionally, adding new hybrid rules to original rules expands the options available to the grammar user. In the synthesis phase, the research adopts state labels and markers to drive the design generation. The former is implemented with a user guide grammar to ensure hybridity in the generated design, while the latter aims to ensure feasible designs. Lastly, evaluation criteria are added to measure the degree of innovation of the hybrid designs. This paper describes the derivation of hybrid minaret designs from a corpus of heterogeneous traditional minaret designs.
keywords Shape grammar; Parallel grammar; Hybrid design; Typology
series eCAADe
email dhuha.abdul-aziz@strath.ac.uk
last changed 2016/05/16 09:08
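The synthesis loop the abstract outlines — choosing one rule per component from subclass rule sets drawn from heterogeneous source designs, and counting the result as hybrid only when more than one source contributed — can be sketched minimally. The component names, style labels and rule corpus below are invented placeholders, not the paper's actual minaret grammar:

```python
import random

# Hypothetical rule corpus: for each component, a subclass rule set with
# candidate rules taken from different source designs (styles). A "rule"
# is reduced here to a plain label substitution.
rule_sets = {
    "base":   {"style_a": "square_base",    "style_b": "octagonal_base"},
    "shaft":  {"style_a": "fluted_shaft",   "style_b": "cylindrical_shaft"},
    "finial": {"style_a": "conical_finial", "style_b": "bulb_finial"},
}

def generate_hybrid(seed=None):
    """Pick one rule per component; report whether the result is a hybrid."""
    rng = random.Random(seed)
    design, sources = [], set()
    for component, candidates in rule_sets.items():
        style = rng.choice(sorted(candidates))
        sources.add(style)
        design.append(candidates[style])
    # Plays the role of the paper's state labels: the design counts as
    # hybrid only if rules from more than one source design were applied.
    return design, len(sources) > 1

design, is_hybrid = generate_hybrid(seed=1)
```

A fuller implementation would also attach markers to the evolving shape so that only feasible rule applications are offered at each step, as the abstract's markers are meant to do.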

_id 5cba
authors Anders, Peter
year 1999
title Beyond Y2K: A Look at Acadia's Present and Future
source ACADIA Quarterly, vol. 18, no. 1, p. 10
summary The sky may not be falling, but it sure is getting closer. Where will you be when the last three zeros of our millennial odometer click into place? Computer scientists tell us that Y2K will bring the world’s computer infrastructure to its knees. Maybe, maybe not. But it is interesting that Y2K is an issue at all. Speculating on the future is simultaneously a magnifying glass for examining our technologies and a looking glass for what we become through them. "The future" is nothing new. Orwell's vision of totalitarian mass media did come true, if only as Madison Avenue rather than Big Brother. Future boosters of the '50s were convinced that each garage would house a private airplane by the year 2000. But world citizens of the 60's and 70's feared a nuclear catastrophe that would replace the earth with a smoking crater. Others - perhaps more optimistically - predicted that computers were going to drive all our activities by the year 2000. And, in fact, they may not be far off... The year 2000 is a symbolic marker, a point of reflection and assessment. And - as this date is approaching rapidly - this may be a good time to come to grips with who we are and where we want to be.
series ACADIA
email ptr@mindspace.com
last changed 2003/11/21 14:15

_id ecaade2017_234
id ecaade2017_234
authors Benetti, Alberto, Favargiotti, Sara and Ricci, Mosè
year 2017
title RE.S.U.ME. - REsilient and Smart Urban MEtabolism
source Fioravanti, A, Cursi, S, Elahmar, S, Gargaro, S, Loffreda, G, Novembri, G, Trento, A (eds.), ShoCK! - Sharing Computational Knowledge! - Proceedings of the 35th eCAADe Conference - Volume 1, Sapienza University of Rome, Rome, Italy, 20-22 September 2017, pp. 1113-1120
summary New technologies and uncontrolled open-data policies lead the public to a new way of approaching the built environment. To enlarge the competences of the professionals who work within cities, we believe that providing deep and dynamic knowledge of the heritage and urban built environment is the most effective way to support their needs. By providing a boosted geographical database with detailed information about the status of each building, we aim to support the professional with a clear view of the vacant buildings available citywide. We think this knowledge is an important asset in covering every kind of public request: from a flat to rent, to an abandoned building to restore, to guiding investors better. The city of Trento will be the pilot project to test these statements. We studied the phenomenon of pushing new construction rather than investing in the reuse of abandoned buildings, with the consequence of unsustainable land use. To address the work we adopted a comprehensive approach across the fields of urbanism, ICT engineering and social sciences. We believe that sharing knowledge and know-how with municipalities, agencies, and citizens is the way to support better market strategies as well as urban transformation policies.
keywords Information Technology; Urban Metabolism; Re-cycle; Urban Reserves; Policy Decision-Making; Data-driven Analysis
series eCAADe
email sara.favargiotti@unitn.it
last changed 2017/09/13 13:13

_id ascaad2004_paper9
id ascaad2004_paper9
authors Bennadji, A.; H. Ahriz, and P. Alastair
year 2004
title Computer Aided Sustainable Design
source eDesign in Architecture: ASCAAD's First International Conference on Computer Aided Architectural Design, 7-9 December 2004, KFUPM, Saudi Arabia
summary One of the most important aspects architects need to consider fairly early on is that of energy saving, cost, thermal comfort and the effect on the environment in terms of CO2 emissions. At present, during the early design stage of a building, different options are assessed using simple tools (tables, graphs and software) that contain a large number of assumptions, the very nature of which can bias choice or possibly lead to an inappropriate solution. It can be argued that the only way to provide a rational assessment of options is to use calculation methods that represent in detail the physical processes involved; this usually involves the use of dynamic thermal models. Furthermore, if this tool is also used during detailed design it would introduce a consistency that is normally absent from the analytical design process. Many designers are of the opinion that, because not all details are known, such tools are not suitable for application at early stages in the design. This view can be challenged because, even at the concept stage, a great deal is known about a building. This paper aims to show that a general description of a building can be used to generate sufficient data to drive a valid analysis using a detailed thermal model at the early sketch stage of the design process. The paper describes the philosophy, methodology and the interface developed to achieve this aim. The interface guides the user through the input process using a series of screens giving options for keywords used to describe the building; comprehensive default data built into the software are then attached to these keywords. The resulting data file is a building description that is the best possible interpretation of the design intent. This can then be used to assess options and guide towards a final design.
series ASCAAD
email a.bennadji@rgu.ac.uk
last changed 2007/04/08 17:47
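The keyword-to-default mechanism this abstract describes can be sketched as a simple lookup that expands a coarse building description into input data for a thermal model. The keywords and default values below are illustrative assumptions, not the tool's actual data:

```python
# Hypothetical keyword -> default-data mapping in the spirit of the
# interface described: a few descriptive keywords expand into a fuller
# data set suitable for driving a dynamic thermal model.
DEFAULTS = {
    "office":        {"occupancy_w_m2": 12.0, "ach": 1.0},
    "brick_cavity":  {"wall_u_value": 1.5},
    "double_glazed": {"window_u_value": 2.8},
}

def describe(*keywords):
    """Attach the comprehensive defaults bound to each keyword."""
    data = {}
    for kw in keywords:
        data.update(DEFAULTS[kw])
    return data

model_input = describe("office", "brick_cavity", "double_glazed")
```

In use, later design stages would override individual defaults as real details become known, keeping one consistent model from sketch to detailed design.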

_id cf2011_p098
id cf2011_p098
authors Bernal, Marcelo; Eastman Charles
year 2011
title Top-down Approach for Interaction of Knowledge-Based Parametric Objects and Preliminary Massing Studies for Decision Making in Early Design Stages
source Computer Aided Architectural Design Futures 2011 [Proceedings of the 14th International Conference on Computer Aided Architectural Design Futures / ISBN 9782874561429] Liege (Belgium) 4-8 July 2011, pp. 149-164.
summary Design activities vary from high-degree of freedom in early concept design stages to highly constrained solution spaces in late ones. Such late developments entail a large amount of expertise from technical domains. Multiple parallel models handle different aspects of a project, from geometric master models to specific building components. This variety of models must keep consistency with the design intent while they are dealing with specific domains of knowledge such as architectural design, structure, HVAC, MEP, or plumbing systems. Most of the expertise embedded within the above domains can be translated into parametric objects by capturing design and engineering knowledge through parameters, constraints, or conditionals. The aim of this research is capturing such expertise into knowledge-based parametric objects (KPO) for re-usability along the design process. The proposed case study – provided by SOM New York – is the interaction between a massing study of a high-rise and its building service core, which at the same time handles elevators, restrooms, emergency stairs, and space for technical systems. This project is focused on capturing design expertise, involved in the definition of a building service core, from a high-rise senior designer, and re-using this object for interaction in real-time with a preliminary massing study model of a building, which will drive the adaptation process of the service core. This interaction attempts to provide an integrated design environment for feedback from technical domains to early design stages for decision-making, and generate a well-defined first building draft. The challenges addressed to drive the instantiation of the service core according to the shifting characteristics of the high-rise are automatic instantiation and adaptation of objects based on decision rules, and updating in real-time shared parameters and information derived from the high-rise massing study.
The interaction between both models facilitates the process from the designer's perspective of reusing previous design solutions in new projects. The massing study model is the component that handles information from the perspective of the outer shape design intent. Variations at this massing study model level drive the behavior of the service core model, which must adapt its configuration to the shifting geometry of the building during design exploration in early concept design stages. These variations depend on a list of inputs derived from multiple sources such as variable lot sizes, building type, variable square footage of the building, considerations about modularity, number of stories, floor-to-floor height, total building height, or total building square footage. The shifting combination of this set of parameters determines the final aspect of the building and, consequently, the final configuration of the service core. The service core is the second component involved in the automatic generation of a building draft. In the context of BIM, it is an assembly of objects, which contains other objects representing elevators, restrooms, emergency stairs, and space for several technical systems. This assembly is driven by different layouts depending on the building type, a drop-off sequence, which is the process of continuous reduction of elevators along the building, and how this reduction affects the re-arrangement of the service core layout. Results from this research involve a methodology for capturing design knowledge, a methodology for defining the architecture of smart parametric objects, and a method for real-time feedback for decision making in early design stages. The project also aims to demonstrate the feasibility of continuous growth on top of existing parametric objects, allowing the creation of libraries of smart re-usable objects for automation in design.
keywords design automation, parametric modeling, design rules, knowledge-based design
series CAAD Futures
email marcelo.bernal@gatech.edu
last changed 2012/02/11 18:21
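The adaptation mechanism described — a knowledge-based parametric object re-deriving its configuration from shared parameters of the massing model — can be sketched in miniature. The classes, parameters and decision rule below are illustrative assumptions, not SOM's actual rules:

```python
from dataclasses import dataclass

# Hypothetical KPO-style objects: the service core reads shared
# parameters from the massing study and re-derives its own
# configuration whenever those parameters change.
@dataclass
class MassingStudy:
    stories: int
    floor_to_floor: float  # metres
    gross_area: float      # m^2

@dataclass
class ServiceCore:
    elevators: int = 0
    height: float = 0.0

    def adapt(self, m: MassingStudy) -> "ServiceCore":
        # Illustrative decision rule: one elevator per 4000 m^2 served.
        self.elevators = max(1, round(m.gross_area / 4000))
        self.height = m.stories * m.floor_to_floor
        return self

core = ServiceCore().adapt(MassingStudy(stories=40, floor_to_floor=4.0,
                                        gross_area=60000))
```

A real system would add the drop-off sequence and layout variants the abstract mentions; the point here is only the direction of dependency, massing parameters driving core configuration.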

_id a841
authors Brady, Darlene A.
year 1998
title Premise & Process: The Pedagogical Implications of Computing in Design
source Computers in Design Studio Teaching [EAAE/eCAADe International Workshop Proceedings / ISBN 09523687-7-3] Leuven (Belgium) 13-14 November 1998, pp. 31-39
summary Form is capable of communicating a profound idea only when it is linked to a more essential metaphorical intention. The design studio is a forum for addressing this relationship of idea and the means of expression. Computing offers the potential to enhance the design enquiry, but issues of how and when to integrate computer applications in the studio have significant pedagogical implications. It not only has an impact on the size, complexity and number of design projects, but also on whether architectural ideas or computer technology is the content of the studio. It is important to distinguish between the computer image and the process used to achieve the final result. Many computer-based studios focus on the final product, which encourages technology to drive design. This paper addresses how design issues can determine the use of technology so that design ideas and computing can reinforce each other, rather than be competing issues. It examines how the unique strengths of computer modeling and animation are used to explore the relationship between visual expression and intention via the issues of metaphor, tectonic color, context and kinetics in several of my graduate and upper-level undergraduate computer-based design studios in the School of Architecture at the University of Illinois at Urbana-Champaign (UI-UC). The studio topics are diverse in nature and include Normative Studio: Prototype as Formgiver; Urban Issues: Context, Color & Kinetics; and Virtual Metaphors: Literature as Formgiver.

series eCAADe
email architexture@earthlink.net
more http://www.eaae.be/
last changed 2000/11/21 08:10

_id ijac201210405
id ijac201210405
authors Braumann, Johannes; Brell-Cokcan, Sigrid
year 2012
title Digital and Physical Tools for Industrial Robots in Architecture: Robotic Interaction and Interfaces
source International Journal of Architectural Computing vol. 10 - no. 4, 541-554
summary The development of digital and physical tools is highly dependent on interfaces, which define the terms of interaction both between humans and machines, as well as between machines and other machines. This research explores how new, advanced human-machine interfaces, built upon concepts established by entertainment electronics, can enhance the interaction between users and complex, kinematic machines. Similarly, physical computing greatly innovates machine-machine interaction, as it allows designers to easily customize microcontroller boards and to embed them into complex systems, where they drive actuators and interact with other machines such as industrial robots. These approaches are especially relevant in the creative industry, where customized soft- and hardware is now enabling innovative and highly effective fabrication strategies that have the potential to compete with high-tech industry applications.
series journal
last changed 2019/05/24 07:55

_id cabb
authors Broughton, T., Tan, A. and Coates, P.S.
year 1997
title The Use of Genetic Programming In Exploring 3D Design Worlds - A Report of Two Projects by Msc Students at CECA UEL
source CAAD Futures 1997 [Conference Proceedings / ISBN 0-7923-4726-9] München (Germany), 4-6 August 1997, pp. 885-915
summary Genetic algorithms are used to evolve rule systems for a generative process, in one case a shape grammar, which uses the "Dawkins Biomorph" paradigm of user-driven choices to perform artificial selection, in the other a CA/Lindenmayer system using the Hausdorff dimension of the resultant configuration to drive natural selection. (1) Using genetic programming in an interactive 3D shape grammar: a report of a generative system combining genetic programming (GP) and 3D shape grammars. The reasoning that backs up the basis for this work depends on the interpretation of design as search. In this system, a 3D form is a computer program made up of functions (transformations) and terminals (building blocks). Each program evaluates into a structure; hence, in this instance a program is synonymous with form. Building blocks of form are platonic solids (box, cylinder, etc.). A variety of combinations of the simple affine transformations of translation, scaling and rotation, together with the Boolean operations of union, subtraction and intersection performed on the building blocks, generate different configurations of 3D forms. Following the methodology of genetic programming, an initial population of such programs is randomly generated and subjected to a test for fitness (the eyeball test). Individual programs that have passed the test are selected to be parents for reproducing the next generation of programs via the process of recombination. (2) Using a GA to evolve rule sets to achieve a goal configuration: the aim of these experiments was to build a framework in which a structure's form could be defined by a set of instructions encoded into its genetic make-up. This was achieved by combining a generative rule system commonly used to model biological growth with a genetic algorithm simulating the evolutionary process of selection to evolve an adaptive rule system capable of replicating any preselected 3D shape.
The generative modelling technique used is a string-rewriting Lindenmayer system: the genes of the emergent structures are the production rules of the L-system, and the spatial representation of the structures uses the geometry of iso-spatial dense-packed spheres.
series CAAD Futures
email p.s.coates@btinternet.com
last changed 2003/11/21 14:16
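The second project's mechanism — L-system production rules acting as the genes of a structure, mutated and selected toward a goal — can be sketched in miniature. The axiom, rules, mutation operator and fitness below are illustrative stand-ins; in particular, the project scored candidates by the Hausdorff dimension of the generated configuration, which is replaced here by a simple target string length:

```python
import random

def rewrite(axiom, rules, generations):
    # Apply string-rewriting rules in parallel, as in an L-system.
    s = axiom
    for _ in range(generations):
        s = "".join(rules.get(c, c) for c in s)
    return s

# A rule set is the "genome"; fitness rewards growth toward a target
# size (a stand-in for the Hausdorff-dimension measure).
def fitness(genome, target_len=50):
    return -abs(len(rewrite("F", genome, 4)) - target_len)

def mutate(genome):
    g = dict(genome)
    key = random.choice(sorted(g))
    g[key] = g[key] + random.choice("F+-")
    return g

random.seed(0)
population = [{"F": "F+F"} for _ in range(8)]
for _ in range(20):
    # Select the fittest half, refill with their mutants.
    population = sorted(population, key=fitness, reverse=True)
    population = population[:4] + [mutate(g) for g in population[:4]]
best = max(population, key=fitness)
```

The real system interprets the rewritten string geometrically (here, as dense-packed spheres) before measuring fitness; the selection loop itself has this shape.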

_id d60a
authors Casti, J.C.
year 1997
title Would-Be Worlds: How Simulation is Changing the Frontiers of Science
source John Wiley & Sons, Inc., New York.
summary Five Golden Rules is caviar for the inquiring reader. Anyone who enjoyed solving math problems in high school will be able to follow the author's explanations, even if high school was a long time ago. There is joy here in watching the unfolding of these intricate and beautiful techniques. Casti's gift is to be able to let the nonmathematical reader share in his understanding of the beauty of a good theory.-Christian Science Monitor "[Five Golden Rules] ranges into exotic fields such as game theory (which played a role in the Cuban Missile Crisis) and topology (which explains how to turn a doughnut into a coffee cup, or vice versa). If you'd like to have fun while giving your brain a first-class workout, then check this book out."-San Francisco Examiner "Unlike many popularizations, [this book] is more than a tour d'horizon: it has the power to change the way you think. Merely knowing about the existence of some of these golden rules may spark new, interesting-maybe even revolutionary-ideas in your mind. And what more could you ask from a book?"-New Scientist "This book has meat! It is solid fare, food for thought . . . makes math less forbidding, and much more interesting."-Ben Bova, The Hartford Courant "This book turns math into beauty."-Colorado Daily "John Casti is one of the great science writers of the 1990s."-San Francisco Examiner In the ever-changing world of science, new instruments often lead to momentous discoveries that dramatically transform our understanding. Today, with the aid of a bold new instrument, scientists are embarking on a scientific revolution as profound as that inspired by Galileo's telescope. Out of the bits and bytes of computer memory, researchers are fashioning silicon surrogates of the real world-elaborate "artificial worlds"-that allow them to perform experiments that are too impractical, too costly, or, in some cases, too dangerous to do "in the flesh." 
From simulated tests of new drugs to models of the birth of planetary systems and galaxies to computerized petri dishes growing digital life forms, these laboratories of the future are the essential tools of a controversial new scientific method. This new method is founded not on direct observation and experiment but on the mapping of the universe from real space into cyberspace. There is a whole new science happening here-the science of simulation. The most exciting territory being mapped by artificial worlds is the exotic new frontier of "complex, adaptive systems." These systems involve living "agents" that continuously change their behavior in ways that make prediction and measurement by the old rules of science impossible-from environmental ecosystems to the system of a marketplace economy. Their exploration represents the horizon for discovery in the twenty-first century, and simulated worlds are charting the course. In Would-Be Worlds, acclaimed author John Casti takes readers on a fascinating excursion through a number of remarkable silicon microworlds and shows us how they are being used to formulate important new theories and to solve a host of practical problems. We visit Tierra, a "computerized terrarium" in which artificial life forms known as biomorphs grow and mutate, revealing new insights into natural selection and evolution. We play a game of Balance of Power, a simulation of the complex forces shaping geopolitics. And we take a drive through TRANSIMS, a model of the city of Albuquerque, New Mexico, to discover the root causes of events like traffic jams and accidents. Along the way, Casti probes the answers to a host of profound questions these "would-be worlds" raise about the new science of simulation. If we can create worlds inside our computers at will, how real can we say they are? Will they unlock the most intractable secrets of our universe? Or will they reveal instead only the laws of an alternate reality? 
How "real" do these models need to be? And how real can they be? The answers to these questions are likely to change the face of scientific research forever.
series other
last changed 2003/04/23 13:14

_id caadria2009_000
id caadria2009_000
authors Chang, Teng-Wen; Eric Champion, Sheng-Fen Chien and Shang-Chia Chiou (eds.)
year 2009
title CAADRIA 2009 - Between Man and Machine
source Proceedings of the 14th International Conference on Computer Aided Architectural Design Research in Asia / Yunlin (Taiwan) 22-25 April 2009, 795p.
summary Digital designing takes place through processes of interaction between human designers and computers. As such, its location is the in-between, a shared realm of conversation where the capabilities of both man and machine are amplified. CAADRIA 2009 addresses this conversation in terms of three perspectives that drive both research and practice in the computer-aided architectural design field: digital design as integrating, intuitive and intelligent. CAADRIA 2009 aimed to provide a forum in which ideas pertaining to these notions can be explored, discussed and developed. Digital design is integrative: with the diverse and fast speed of the global economy in the 21st century, the barrier between different disciplines is being overcome digitally, and the integration of multiple disciplines is crucial for facing the next wave of global challenges. Digital design is intuitive: with advanced computational technology, how humans will cooperate with machines after the computing era will surely become the next challenge for all computational design-related researchers; intuitive interaction or computing design is the second theme addressed in CAADRIA 2009. Digital design is intelligent: with artificial intelligence, design intelligence is the third theme we would like to address this year. We wish to challenge global researchers to provide a smart and responsive environment for improving our lives and stimulating our economy in innovative ways.
series CAADRIA
type normal paper
email tengwen@yuntech.edu.tw
last changed 2009/05/17 09:10

_id f9e5
authors Cherneff, Jonathan Martin
year 1990
title Knowledge Based Interpretation of Architectural Drawings
source Massachusetts Institute of Technology, Department of Civil Engineering, Cambridge, MA
summary Architectural schematic drawings have been used to communicate building designs for centuries. The symbolic language used in these drawings efficiently represents much of the intricacy of the building process (e.g. implied business relationships, common building practice, and properties of construction materials). The drawing language is an accepted standard representation for building design, something that modern data languages have failed to achieve. In fact, the lack of an accepted standard electronic representation has hampered efforts at computer integration and perhaps worsened industry fragmentation. In general, drawings must be interpreted, by a professional, and then reentered in order to transfer them from one CAD system to another. This work develops a method for machine interpretation of architectural (or other) schematic drawings. The central problem is to build an efficient drawing parser (i.e. a program that identifies the semantic entities, characteristics, and relationships that are represented in the drawing). The parser is built from specifications of the drawing grammar and an underlying spatial model. The grammar describes what to look for, and the spatial model enables the parser to find it quickly. Coupled with existing optical recognition technology, this technique enables the use of drawings directly as: (1) a database to drive various AEC applications, (2) a communication protocol to integrate CAD systems, (3) a traditional user interface.
series thesis:PhD
last changed 2003/02/12 21:37
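The parser architecture described — grammar rules saying what to look for, a spatial model letting the parser find it quickly — can be sketched roughly. The primitive representation, the single "wall" rule and the grid index below are invented for illustration, not the thesis's actual formalism:

```python
from collections import defaultdict

# Primitives are horizontal line segments: (y, x0, x1).
segments = [(0.0, 0.0, 5.0), (0.3, 0.0, 5.0), (4.0, 1.0, 2.0)]

def build_index(segs, cell=1.0):
    # Coarse spatial model: bucket segments by y-band so the parser
    # only examines nearby candidates instead of all pairs.
    index = defaultdict(list)
    for s in segs:
        index[round(s[0] / cell)].append(s)
    return index

def parse_walls(segs, max_gap=0.5):
    """Grammar rule: two close parallel segments with matching extent -> wall."""
    index = build_index(segs)
    walls = []
    for s in segs:
        for key in (round(s[0]) - 1, round(s[0]), round(s[0]) + 1):
            for t in index[key]:
                if t is not s and 0 < t[0] - s[0] <= max_gap \
                        and abs(t[1] - s[1]) < 1e-9 and abs(t[2] - s[2]) < 1e-9:
                    walls.append((s, t))
    return walls

walls = parse_walls(segments)
```

A full parser would carry many such productions (doors, fixtures, dimension lines) and emit typed entities with relationships, but the grammar-plus-spatial-index division of labour is the one the abstract names.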

_id ga0123
id ga0123
authors Coates P., Appels, T. Simon, C. and Derix, C.
year 2001
title Current work at CECA
source International Conference on Generative Art
summary The centre for environment computing and architecture continues to experiment with new ways to form, and this paper presents three recent projects from the MSc programme. The three projects all share underlying assumptions about the use of generative algorithms to construct form, using fractal decomposition, Lindenmayer systems and the marching cubes algorithm respectively to construct three-dimensional "architectural" objects. The data needed to drive the morphology however ranges from formal proportional systems and Genetic L-systems programming through swarming systems to perceptive self-organising neural nets. In all cases, the projects pose the question: what is architectural form? While after Stanford Anderson (Anderson 66) we know it is simplistic to say that it is an automatic outcome of a proper definition of the brief, it is also difficult to accept that the form of a building is an entirely abstract geometrical object existing without recourse to social or contextual justification. In an attempt to resolve these issues we have turned to the study of systems and general system theory as a way of understanding the mechanics of emergence and morphogenesis generally, and the
series other
email P.S.Coates@uel.ac.uk
more http://www.generativeart.com/
last changed 2003/08/07 15:25

_id 389b
authors Do, Ellen Yi-Luen
year 2000
title Sketch that Scene for Me: Creating Virtual Worlds by Freehand Drawing
source Promise and Reality: State of the Art versus State of Practice in Computing for the Design and Planning Process [18th eCAADe Conference Proceedings / ISBN 0-9523687-6-5] Weimar (Germany) 22-24 June 2000, pp. 265-268
summary With the Web people can now view virtual three-dimensional worlds and explore virtual space. Increasingly, novice users are interested in creating 3D Web sites. Virtual Reality Modeling Language gained ISO status in 1997, although it is being supplanted by the compatible Java3D API, and alternative 3D Web technologies compete. Viewing VRML scenes is relatively straightforward on most hardware platforms and browsers, but currently there are only two ways to create 3D virtual scenes: One is to code the scene directly using VRML. The other is to use existing CAD and modeling software, and save the world in VRML format or convert to VRML from some other format. Both methods are time consuming, cumbersome, and have steep learning curves. Pen-based user interfaces, on the other hand, are for many an easy and intuitive method for graphics input. Not only are people familiar with the look and feel of paper and pencil, novice users also find it less intimidating to draw what they want, where they want it, instead of using a complicated tool palette and pull-down menus. Architects and designers use sketches as a primary tool to generate design ideas and to explore alternatives, and numerous computer-based interfaces have played on the concept of "sketch". However, we restrict the notion of sketch to freehand drawing, which we believe helps people to think, to envision, and to recognize properties of the objects with which they are working. SKETCH employs a pen interface to create three-dimensional models, but it uses a simple language of gestures to control a three-dimensional modeler; it does not attempt to interpret freehand drawings. In contrast, our support of 3D world creation using freehand drawing depends on users’ traditional understanding of a floor plan representation. Igarashi et al. used a pen interface to drive browsing in a 3D world, by projecting the user’s marks on the ground plane in the virtual world. 
Our Sketch-3D project extends this approach, investigating an interface that allows direct interpretation of the drawing marks (what you draw is what you get) and serves as a rapid prototyping tool for creating 3D virtual scenes.
keywords Freehand Sketching, Pen-Based User Interface, Interaction, VRML, Navigation
series eCAADe
email ellendo@cmu.edu
more http://www.uni-weimar.de/ecaade/
last changed 2004/10/04 05:49

_id 01bb
authors Er, M.C.
year 1981
title A Representation Approach to the Tower of Hanoi Problem
source 22 p. : ill. Wollongong: Department of Computing Science, University of Wollongong, August, 1981. includes bibliography
summary By making the moving direction of each disc explicit in the representation, a bit-string so constructed can be used to drive the Tower of Hanoi algorithm. The behavior of disc moves is further analyzed based on the bit-string representation. It has been shown that the bit-string for moving n discs can be used to generate successively the Gray codes of n bits
keywords representation, programming, combinatorics, algorithms, recursion
series CADline
last changed 2003/06/02 11:58
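The Gray-code property stated in this summary is the classical correspondence between the optimal Tower of Hanoi solution and the binary reflected Gray code: the disc moved at step k is exactly the bit that flips between g(k-1) and g(k), where g(k) = k xor (k >> 1). A minimal check (function names are illustrative):

```python
def hanoi_moves(n, src=0, aux=1, dst=2):
    # Standard recursive solution: list of (disc, from_peg, to_peg),
    # discs numbered 0 (smallest) to n-1; 2**n - 1 moves in total.
    if n == 0:
        return []
    return (hanoi_moves(n - 1, src, dst, aux)
            + [(n - 1, src, dst)]
            + hanoi_moves(n - 1, aux, src, dst))

def gray(k):
    # Binary reflected Gray code of k.
    return k ^ (k >> 1)

n = 4
moves = hanoi_moves(n)
for k, (disc, _, _) in enumerate(moves, start=1):
    # The bit that changes between consecutive Gray codes names the disc.
    changed_bit = (gray(k) ^ gray(k - 1)).bit_length() - 1
    assert disc == changed_bit
```

This is the sense in which the move bit-string "generates successively the Gray codes of n bits": reading off which disc moves at each step reproduces the sequence of changed bits of the Gray code.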

_id 4129
authors Fargas, Josep and Papazian, Pegor
year 1992
title Metaphors in Design: An Experiment with a Frame, Two Lines and Two Rectangles
source Mission - Method - Madness [ACADIA Conference Proceedings / ISBN 1-880250-01-2] 1992, pp. 13-22
summary The research we will discuss below originated from an attempt to examine the capacity of designers to evaluate an artifact, and to study the feasibility of replicating a designer's moves intended to make an artifact more expressive of a given quality. We will present the results of an interactive computer experiment, first developed at the MIT Design Research Seminar, which is meant to capture the subject’s actions in a simple design task as a series of successive "moves". We will propose that designers use metaphors in their interaction with design artifacts and we will argue that the concept of metaphors can lead to a powerful theory of design activity. Finally, we will show how such a theory can drive the project of building a design system.

When trying to understand how designers work, it is tempting to examine design products in order to come up with the principles or norms behind them. The problem with such an approach is that it may lead to a purely syntactical analysis of design artifacts, failing to capture the knowledge of the designer in an explicit way, and ignoring the interaction between the designer and the evolving design. We will present a theory about design activity based on the observation that knowledge is brought into play during a design task by a process of interpretation of the design document. By treating an evolving design in terms of the meanings and rules proper to a given way of seeing, a designer can reduce the complexity of a task by focusing on certain of its aspects, and can manipulate abstract elements in a meaningful way.

series ACADIA
email fargas@dtec.es
last changed 2003/05/14 20:02

_id 2068
authors Frazer, John
year 1995
title AN EVOLUTIONARY ARCHITECTURE
source London: Architectural Association
summary In "An Evolutionary Architecture", John Frazer presents an overview of his work over the past 30 years, attempting to develop a theoretical basis for architecture using analogies with nature's processes of evolution and morphogenesis. Frazer's vision of the future of architecture is to construct organic buildings: thermodynamically open systems which are more environmentally aware and sustainable physically, sociologically and economically. The range of topics which Frazer discusses is a good illustration of the breadth and depth of the evolutionary design problem. Environmental Modelling One of the first topics dealt with is the importance of environmental modelling within the design process. Frazer shows how environmental modelling is often misused or misinterpreted by architects, with particular reference to solar modelling. From the discussion given it would seem that simplification of the environmental models is the prime culprit, resulting in misinterpretation and misuse. The simplifications are understandable given the amount of information needed for accurate modelling. By simplifying the model of the environmental conditions the architect is able to make informed judgments within reasonable amounts of time and effort. Unfortunately the simplifications result in errors which compound and cause the resulting structures to fall short of their anticipated performance. Frazer evidently believes that the computer can be a great aid in harnessing environmental modelling data, provided that the same simplifying assumptions are not made and that better models and interfaces are possible. Physical Modelling Physical modelling has played an important role in Frazer's research, leading to the construction of several novel machine-readable interactive models, ranging from lego-like building blocks to beermat cellular automata and wall partitioning systems. Ultimately this line of research has led to the Universal Constructor and the Universal Interactor.
The Universal Constructor The Universal Constructor features on the cover of the book. It consists of a base plug-board, called the "landscape", on top of which "smart" blocks, or cells, can be stacked vertically. The cells are individually identified and can communicate with neighbours above and below. Cells communicate with users through a bank of LEDs displaying the current state of the cell. The whole structure is machine readable and so can be interpreted by a computer. The computer can interpret the states of the cells as either colour or geometrical transformations, allowing a wide range of possible interpretations. The user interacts with the computer display through direct manipulation of the cells. The computer can communicate with, and even direct, the actions of the user through feedback with the cells to display various states. The direct manipulation of the cells encourages experimentation by the user and demonstrates basic concepts of the system. The Universal Interactor The Universal Interactor is a whole series of experimental projects investigating novel input and output devices. All of the devices speak a common binary language and so can communicate through a mediating central hub. The result is that input, from say a body-suit, can be used to drive the output of a sound system, or vice versa. The Universal Interactor opens up many possibilities for expression when using a CAD system that may at first seem very strange. However, some of these feedback systems may prove superior, in the hands of skilled technicians, to more standard devices. Imagine how a musician might be able to devise structures by playing melodies which express their character. Of course the interpretation of input in this form poses a difficult problem which will take a great deal of research to solve. The Universal Interactor has been used to provide environmental feedback to affect the development of evolving genetic codes.
The feedback given by the Universal Interactor has been used to guide selection of individuals from a population. Adaptive Computing Frazer completes his introduction to the range of tools used in his research by giving a brief tour of adaptive computing techniques, covering topics including cellular automata, genetic algorithms, classifier systems and artificial evolution. Cellular Automata As previously mentioned, Frazer has done some work using cellular automata in both physical and simulated environments. Frazer discusses how surprisingly complex behaviour can result from the simple local rules executed by cellular automata. Cellular automata are also capable of computation, in fact able to perform any computation possible by a finite state machine. Note that this does not mean that cellular automata are capable of general computation, as that would require the construction of a Turing machine, which is beyond the capabilities of a finite state machine. Genetic Algorithms Genetic algorithms were first presented by Holland and have since become an important tool for many researchers in various areas. They were originally developed for problem-solving and optimization problems with clearly stated criteria and goals. Frazer fails to mention one of the most important differences between genetic algorithms and other adaptive problem-solving techniques, such as neural networks: genetic algorithms have the advantage that criteria can be clearly stated and controlled within the fitness function. The learning by example upon which neural networks rely does not afford this level of control over what is to be learned. Classifier Systems Holland went on to develop genetic algorithms into classifier systems. Classifier systems are focussed more upon the problem of learning appropriate responses to stimuli than upon searching for solutions to problems. Classifier systems receive information from the environment and respond according to rules, or classifiers.
Successful classifiers are rewarded, creating a reinforcement learning environment. Obviously, the mapping between classifier systems and the cybernetic view of organisms sensing, processing and responding to environmental stimuli is strong. It would seem that a central process similar to a classifier system would be appropriate at the core of an organic building, learning appropriate responses to environmental conditions over time. Artificial Evolution Artificial evolution traces its roots back to the Biomorph program described by Dawkins in his book "The Blind Watchmaker". Essentially, artificial evolution requires that a user supplements the standard fitness function in genetic algorithms to guide evolution. The user may provide selection pressures which are unquantifiable in a stated problem and thus provide a means for dealing with ill-defined criteria. Frazer notes that solving problems with ill-defined criteria using artificial evolution seriously limits the scope of problems that can be tackled: the reliance upon user interaction reduces the practical size of populations and the duration of evolutionary runs. Coding Schemes Frazer goes on to discuss the encoding of architectural designs and their subsequent evolution, introducing two major systems, the Reptile system and the Universal State Space Modeller. Blueprint vs. Recipe Frazer points out the inadequacies of using standard "blueprint" design techniques in developing organic structures. Using a "recipe" to describe the process of constructing a building is presented as an alternative. Recipes for construction are discussed with reference to the analogous process description given by DNA to construct an organism. The Reptile System The Reptile System is an ingenious construction set capable of producing a wide range of structures using just two simple components.
Frazer saw the advantages of this system for rule-based and evolutionary systems in the compactness of its structure descriptions. Compactness was essential for the early computational work, when computer memory and storage space were scarce. However, compact representations such as those described form very rugged fitness landscapes which are not well suited to evolutionary search techniques. Structures are created from an initial "seed" or minimal construction, for example a compact spherical structure. The seed is then manipulated using a series of processes or transformations, for example stretching, shearing or bending. The structure grows according to the transformations applied to it. Obviously, the transformations could be a predetermined sequence of actions which would always yield the same final structure given the same initial seed. Alternatively, the series of transformations applied could be environmentally sensitive, resulting in forms which are also sensitive to their location. The idea of taking a geometrical form as a seed and transforming it using a series of processes to create complex structures is similar in many ways to the early work of Latham creating large morphological charts. Latham went on to develop his ideas into the "Mutator" system, which he used to create organic artworks. Generalising the Reptile System Frazer has proposed a generalised version of the Reptile System to tackle more realistic building problems, generating the seed or minimal configuration automatically from design requirements. From this starting point (or set of starting points) solutions could be evolved using artificial evolution. Quantifiable and specific aspects of the design brief define the formal criteria which are used as a standard fitness function. Non-quantifiable criteria, including aesthetic judgments, are evaluated by the user. The proposed system would be able to learn successful strategies for satisfying both formal and user criteria.
In doing so the system would become a personalised tool of the designer: a personal assistant able to anticipate aesthetic judgements and other criteria by employing previously successful strategies. Ultimately, this is a similar concept to Negroponte's "Architecture Machine", which he proposed would be a computer system so personalised as to be almost unusable by other people. The Universal State Space Modeller The Universal State Space Modeller is the basis of Frazer's current work. It is a system which can be used to model any structure, hence the universal claim in its title. The datastructure underlying the modeller is a state space of scaleless logical points, called motes. Motes are arranged in a close-packing sphere arrangement, which makes each one equidistant from its twelve neighbours. Any point can be broken down into a self-similar tetrahedral structure of logical points, giving the state space a fractal nature which allows modelling at many different levels at once. Each mote can be thought of as analogous to a cell in a biological organism. Every mote carries a copy of the architectural genetic code in the same way that each cell within an organism carries a copy of its DNA. The genetic code of a mote is stored as a sequence of binary "morons" which are grouped together into spatial configurations which are interpreted as the state of the mote. The developmental process begins with a seed. The seed develops through cellular duplication according to the rules of the genetic code. In the beginning the seed develops mainly in response to the internal genetic code, but as the development progresses the environment plays a greater role. Cells communicate by passing messages to their immediate twelve neighbours. However, a cell can also send messages directed at remote cells, without knowledge of their spatial relationship. During development, cells take on specialised functions, including environmental sensors or producers of raw materials.
The resulting system is process driven, without presupposing the existence of a construction set to use. The datastructure can be interpreted in many ways to derive various phenotypes. The resulting structure is a by-product of the cellular activity during development and in response to the environment. As such the resulting structures have much in common with living organisms, which are also the emergent result, or by-product, of local cellular activity. Primordial Architectural Soups To conclude, Frazer presents some of his most recent work: evolving fundamental structures using limited raw materials, an initial seed and massive feedback. Frazer proposes to go further and do away with the need for an initial seed, starting instead with a primordial soup of basic architectural concepts. The research is attempting to evolve the starting conditions and evolutionary processes without any preconditions. Is there enough time to evolve a complex system from the basic building blocks which Frazer proposes? The computational complexity of the task being embarked upon is not discussed. There is an implicit assumption that the "superb tactics" of natural selection are enough to cut through the complexity of the task. However, Kauffman has shown how self-organisation plays a major role in the early development of replicating systems which we may call alive. Natural selection requires a solid basis upon which it can act. Is the primordial soup which Frazer proposes of the correct constitution to support self-organisation? Kauffman suggests that one of the most important attributes of a primordial soup capable of self-organisation is a complex network of catalysts, together with the controlling mechanisms to stop the reactions from going supracritical. Can such a network be built from primitive architectural concepts? What does it mean to have a catalyst in this domain?
Conclusion Frazer shows some interesting work in both the areas of evolutionary design and self-organising systems. It is obvious from his work that he sympathizes with the opinions put forward by Kauffman, that the order found in living organisms comes from both external evolutionary pressure and internal self-organisation. His final remarks underline this by paraphrasing Kauffman: that life is always to be found on the edge of chaos. By the "edge of chaos" Kauffman is referring to the area within the ordered regime of a system close to the "phase transition" to chaotic behaviour. Unfortunately, Frazer does not demonstrate that the systems he has presented have the necessary qualities to derive useful order at the edge of chaos. He does not demonstrate, as Kauffman does repeatedly, that there exists a "phase transition" between the ordered and chaotic regimes of his systems. Nor does he study the relationship of the useful forms generated by his work to the phase transition regions of his systems, should they exist. If we are to find an organic architecture, in more than name alone, it is surely to reside close to the phase transition of the construction system of which it is built. Only there, if we are to believe Kauffman, are we to find useful order together with environmentally sensitive and thermodynamically open systems which can approach the utility of living organisms.
series other
type normal paper
last changed 2004/05/22 12:12
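The review's central contrast — that a genetic algorithm states its goal explicitly in a fitness function, where neural networks learn from examples — can be made concrete with a minimal sketch. This is a generic one-max GA, not Frazer's system; every name, parameter and operator choice here is illustrative:

```python
import random

def evolve(fitness, length=16, pop_size=30, generations=60,
           mutation_rate=0.02, seed=0):
    """Minimal generational GA over fixed-length bit-strings.
    The design goal lives entirely in `fitness` — the point of
    control the review contrasts with learning by example."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        for _ in range(pop_size):
            # tournament selection of two parents
            a, b = (max(rng.sample(pop, 3), key=fitness) for _ in range(2))
            cut = rng.randrange(1, length)          # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (rng.random() < mutation_rate) for bit in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve(sum)   # "one-max": fitness is simply the number of 1 bits
```

Swapping `sum` for any other scoring function redirects the whole search without touching the algorithm — the controllability the review credits to GAs.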

_id ecaade2015_247
id ecaade2015_247
authors Garcia, Manuel Jimenez and Retsin, Gilles
year 2015
title Design Methods for Large Scale Printing
source Martens, B, Wurzer, G, Grasl T, Lorenz, WE and Schaffranek, R (eds.), Real Time - Proceedings of the 33rd eCAADe Conference - Volume 2, Vienna University of Technology, Vienna, Austria, 16-18 September 2015, pp. 331-339
wos WOS:000372316000039
summary With an exponential increase in the possibilities of computation and computer-controlled fabrication, high density information is becoming a reality in digital design and architecture. However, construction methods and industrial fabrication processes have not yet been reshaped to accommodate the recent changes in those disciplines. Although it is possible to build up complex simulations with millions of particles, the simulation is often disconnected from the actual fabrication process. Our research proposes a bridge between both stages, where one drives the other, producing a smooth transition from design to production. A particle in the digital domain becomes a drop of material in the construction method. The architect's medium of expression has become much more than a representational tool in the last century, and more recently it has evolved even beyond a series of rules driving from design to production. The design system is the instruction itself: it embeds structure, material and tectonics, and is delivered to the very end of the construction chain, where it is materialised. The research showcased in this paper investigates tectonic systems associated with large scale 3D printing and additive manufacturing methods, inheriting both material properties and fabrication constraints at all stages from design to production. Computational models and custom design software packages are designed and developed as strategies to organise material in space in response to specific structural and logistical input. Although the research has developed a wide spectrum of 3D printing methods, this paper focuses only on two of the most recent projects, where different material and computational logics were investigated. The first, titled Filamentrics, intends to develop free-form space frames, overcoming their homogeneity by introducing robotic plastic extrusion.
Through the use of custom-made extruders, a vast range of high resolution prototypes were developed, evolving the design process towards the fabrication of precise structures that can be materialised using additive manufacturing, but without the use of a layered 3D printing method. Instead, material limitations were studied and embedded in custom algorithms that allow depositing material in the air for internal connectivity. The final result is a 3x2x2.5m structure that demonstrates the viability of this construction method for implementation in more industrial scenarios. While Filamentrics is reshaping the way we could design and build lightweight structures, the second project, Microstrata, aims to establish new construction methods for compression-based materials. A layered 3D printing method combines both the deposition of the binder and the distribution of an interconnected network of capillaries. These capillaries are organised following structural principles, configuring a series of channels which are left empty within the mass. In a second stage, aluminium is cast into this hollow space to build a continuous tension reinforcement.
series eCAADe
type normal paper
email manuel.j@madmdesign.com
more https://mh-engage.ltcc.tuwien.ac.at/engage/ui/watch.html?id=07a6d8e0-6fe7-11e5-9994-cb14cd908012
last changed 2016/05/16 09:08

_id cf2017_601
id cf2017_601
authors Gerber, David Jason; Pantazis, Evangelos; Wang, Alan
year 2017
title Interactive Design of Shell Structures Using Multi Agent Systems: Design Exploration of Reciprocal Frames Based on Environmental and Structural Performance
source Gülen Çagdas, Mine Özkar, Leman F. Gül and Ethem Gürer (Eds.) Future Trajectories of Computation in Design [17th International Conference, CAAD Futures 2017, Proceedings / ISBN 978-975-561-482-3] Istanbul, Turkey, July 12-14, 2017, pp. 601-616.
summary This paper presents a continuation of research on the prototyping of multi-agent systems for architectural design, with a focus on generative design as a means to improve design exploration in the context of multiple objectives and complexity. The interactive design framework focuses on coupling force, environmental constraints and fabrication parameters as design drivers for the form finding of shell structures. The objective of the research is to enable designers to intuitively generate free-form shell structures that are conditioned by multiple objectives for architectural exploration in early stages of design. The generated geometries are explored through reciprocal frames, and are evaluated in an automated fashion, on both local and global levels, in terms of their structural and environmental performance and constructability. The analytical results, along with fabrication constraints, are fed back into the generative design process in order to explore designs across complexly coupled objectives more rapidly and expansively. The paper describes the framework and presents the application of this methodology for the design of fabrication-aware shell structures, in which environmental and structural trade-offs drive the final set of design options.
keywords Generative Design, Parametric Design, Multi-Agent Systems, Digital Fabrication, Form Finding, Reciprocal Frames
series CAAD Futures
email {dgerber, epanatazi, alanwang}@usc.edu
last changed 2017/12/01 13:38

_id e3a4
authors Girard, Michael
year 1987
title Interactive Design of 3D Computer-Animated Legged Animal Motion
source IEEE Computer Graphics and Applications. June, 1987. vol. 7: pp. 39-51 : ill. includes bibliography
summary A visually interactive approach to the design of 3D computer- animated legged animal motion in the context of the PODA computer animation system is presented. The design process entails the interactive specification of parameters that drive a computational model for animal movement. The animator incrementally modifies a framework for establishing desired limb and body motion as well as the constraints imposed by physical dynamics (Newtonian mechanical properties) and temporal restrictions. PODA uses the desired motion and constraints specified by the animator to produce motion through an idealized model of the animal's adaptive dynamic control strategies
keywords computer graphics, animation
series CADline
last changed 1999/02/12 14:08

_id 6be9
authors Guo, Haoxu
year 1999
title The Realization of Intelligent Aid to CAD of Architectural Design with the Object-Oriented Method
source CAADRIA '99 [Proceedings of The Fourth Conference on Computer Aided Architectural Design Research in Asia / ISBN 7-5439-1233-3] Shanghai (China) 5-7 May 1999, pp. 443-454
summary The object-oriented analysis and design has been the principal technology of software development since the 90s and intellectualization has been the direction of development for CAD software in the architectural design. An investigation is made on the application of the object-oriented technology to the realization of the intellectualization of the CAD for architectural design.
keywords Object-oriented; CAD for Architectural Design, Intelligent Technology, Design Expert System, Object, Visual-Computing Integration, Parameter Drive, Polymorphism, Inherit, Correlated Operation
series CAADRIA
last changed 2002/09/05 07:20
