CumInCAD is a cumulative index of publications in Computer Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD Futures.


Hits 1 to 20 of 738

_id sigradi2006_e183a
authors Costa Couceiro, Mauro
year 2006
title La Arquitectura como Extensión Fenotípica Humana - Un Acercamiento Basado en Análisis Computacionales [Architecture as human phenotypic extension – An approach based on computational explorations]
source SIGraDi 2006 - [Proceedings of the 10th Iberoamerican Congress of Digital Graphics] Santiago de Chile - Chile 21-23 November 2006, pp. 56-60
summary The study describes some of the aspects tackled within a current Ph.D. research project in which architectural applications of constructive, structural and organizational processes found in biological systems are considered. The information processing capacity of present-day computers and specific software developments have made it possible to build a bridge between two holistic disciplines: architecture and biology. The crossover between these disciplines entails a methodological paradigm shift towards one based on the dynamical aspects of forms and compositions. Recent studies on artificial-natural intelligence (Hawkins, 2004) and developmental-evolutionary biology (Maturana, 2004) have added fundamental knowledge about the role of analogy in the creative process and the relationship between forms and functions. The dimensions and restrictions of Evo-Devo concepts are analyzed, developed and tested with software that combines parametric geometries, L-systems (Lindenmayer, 1990), shape grammars (Stiny and Gips, 1971) and evolutionary algorithms (Holland, 1975) as a way of testing new architectural solutions within computable environments. The theoretical approaches to evolution of Lamarck (1744-1829) and Weismann (1834-1914), which hold significantly opposing views, are considered. Lamarck's theory assumes that an individual's effort towards a specific evolutionary goal can cause change in its descendants. Weismann, on the other hand, held that the germ cells are not affected by anything the body learns or any ability it acquires during its life, and so cannot pass this information on to the next generation; this is called the Weismann barrier. Lamarck's widely rejected theory has recently found a new place in artificial and natural intelligence research as a valid explanation of some aspects of the evolution of human knowledge, that is, the deliberate change of paradigms in the intentional search for solutions. Just as the analogy between genetics and architecture (Estévez and Shu, 2000) is useful for understanding and programming emergent complexity phenomena (Hopfield, 1982) in architectural solutions, so considering architecture as a product of an extended human phenotype can help us better understand its cultural dimension.
keywords evolutionary computation; genetic architectures; artificial/natural intelligence
series SIGRADI
email
last changed 2016/03/10 09:49
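
The abstract above names three generative formalisms: L-systems, shape grammars and evolutionary algorithms. For reference, a minimal sketch of deterministic L-system rewriting (Lindenmayer) in Python follows; the axiom and the branching rule are textbook examples, assumptions for illustration rather than the rules used in the thesis.

```python
# Minimal deterministic L-system rewriting (Lindenmayer).
# Axiom and production rule are illustrative, not the thesis's own.

def rewrite(axiom: str, rules: dict[str, str], generations: int) -> str:
    """Apply all production rules in parallel for a number of generations."""
    s = axiom
    for _ in range(generations):
        s = "".join(rules.get(symbol, symbol) for symbol in s)
    return s

if __name__ == "__main__":
    # Classic branching rule often used for plant-like forms; [ and ]
    # push and pop turtle state, + and - turn when the string is drawn.
    print(rewrite("F", {"F": "F[+F]F[-F]F"}, 2))
```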

_id ga0010
authors Moroni, A., Zuben, F. Von and Manzolli, J.
year 2000
title ArTbitrariness in Music
source International Conference on Generative Art
summary Evolution is now considered not only powerful enough to bring about biological entities as complex as humans and consciousness, but also useful in simulation for creating algorithms and structures of higher levels of complexity than could easily be built by design. In artistic domains, the process of human-machine interaction is analyzed as a good framework for exploring creativity and producing results that could not be obtained without this interaction. When evolutionary computation and other computational intelligence methodologies are involved, we denote every attempt to improve aesthetic judgement as ArTbitrariness, interpreted as an interactive, iterative optimization process. ArTbitrariness is also suggested as an effective way to produce art through an efficient manipulation of information and a proper use of computational creativity to increase the complexity of the results without neglecting their aesthetic aspects [Moroni et al., 2000]. Our emphasis is on an approach to interactive music composition. The problem of computer generation of musical material has received extensive attention, and a subclass of the field of algorithmic composition includes those applications which use the computer as something between an instrument, which a user "plays" through the application's interface, and a compositional aid, which a user experiments with in order to generate stimulating and varied musical material. This approach was adopted in Vox Populi, a hybrid of an instrument and a compositional environment. Unlike other genetic algorithm or evolutionary computation systems, in which people have to listen to and judge the musical items, Vox Populi uses the computer and the mouse as real-time music controllers, acting as a new interactive computer-based musical instrument. The interface is designed to be flexible, letting the user modify the music being generated. It explores evolutionary computation in the context of algorithmic composition and provides a graphical interface that allows the tonal center and the voice range to be modified, changing the evolution of the music with the mouse [Moroni et al., 1999]. A piece of music consists of several sets of musical material that are manipulated and exposed to the listener, for example pitches, harmonies, rhythms, timbres, etc. These sets are composed of a finite number of elements and, basically, the aim of a composer is to organize those elements in an aesthetic way. Modeling a piece as a dynamic system implies a view in which the composer draws trajectories or orbits using the elements of each set [Manzolli, 1991]. Nonlinear iterative mappings are associated with interface controls. The mappings may give rise to attractors, defined as geometric figures that represent the set of stationary states of a nonlinear dynamic system, or simply trajectories to which the system is attracted. The relevance of this approach goes beyond music applications per se. Computer music systems that are built on the basis of a solid theory can be coherently embedded into multimedia environments. The richness and specialty of the music domain are likely to initiate new thinking and ideas, which will have an impact on areas such as knowledge representation and planning, and on the design of visual formalisms and human-computer interfaces in general.
The Vox Populi interface is depicted in the original paper, showing two nonlinear iterative mappings with their resulting musical pieces. References: [Manzolli, 1991] J. Manzolli, Harmonic Strange Attractors, CEM Bulletin, Vol. 2, No. 2, pp. 4-7, 1991. [Moroni et al., 1999] A. Moroni, J. Manzolli, F. Von Zuben and R. Gudwin, Evolutionary Computation Applied to Algorithmic Composition, Proceedings of CEC99 - IEEE International Conference on Evolutionary Computation, Washington D.C., pp. 807-811, 1999. [Moroni et al., 2000] A. Moroni, F. Von Zuben and J. Manzolli, ArTbitration, Proceedings of the 2000 Genetic and Evolutionary Computation Conference Workshop Program - GECCO, Las Vegas, USA, pp. 143-145, 2000.
series other
email
more http://www.generativeart.com/
last changed 2003/08/07 17:25
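
The record above associates nonlinear iterative mappings with interface controls. As a hedged illustration of that idea, the sketch below iterates the logistic map and quantises its orbit to MIDI pitches within a voice range; the choice of map and the pitch quantisation are assumptions for illustration, not Vox Populi's actual formulas.

```python
# Illustrative only: a nonlinear iterative mapping used as a melody source.
# The logistic map and the MIDI quantisation are assumed, not Vox Populi's.

def logistic_orbit(x0: float, r: float, n: int) -> list[float]:
    """Iterate x_{k+1} = r * x_k * (1 - x_k) and return the orbit."""
    orbit, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        orbit.append(x)
    return orbit

def to_pitches(orbit, low=48, high=72):
    """Quantise orbit values in (0, 1) to MIDI notes in [low, high]."""
    return [low + round(x * (high - low)) for x in orbit]

if __name__ == "__main__":
    # r = 3.9 puts the map in its chaotic regime: a deterministic but
    # non-repeating orbit, hence an endlessly varying melodic line.
    print(to_pitches(logistic_orbit(0.41, 3.9, 16)))
```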

_id 8e02
authors Brown, A.G.P. and Coenen, F.P.
year 2000
title Spatial reasoning: improving computational efficiency
source Automation in Construction 9 (4) (2000) pp. 361-367
summary When spatial data is analysed, the processing is often very computationally intensive: even by the standards of contemporary technologies, the machine power needed is great and the processing times are significant. This is particularly so in 3-D and 4-D scenarios. What we describe here is a technique that tackles this and associated problems. The technique is founded on the idea of quad-tesseral addressing, a technique originally applied to the analysis of atomic structures. It is based on ideas concerning hierarchical clustering developed in the 1960s and 1970s to improve data access time [G.M. Morton, A computer oriented geodetic database and a new technique in file sequencing, IBM Canada, 1966.], and on atomic isohedral (same shape) tiling strategies developed in the 1970s and 1980s concerned with group theory [B. Grunbaum, G.C. Shephard, Tilings and Patterns, Freeman, New York, 1987.]. The technique was first suggested as a suitable representation for GIS in the early 1980s, when the two strands were brought together and a tesseral arithmetic applied [F.C. Holroyd, The Geometry of Tiling Hierarchies, Ars Combinatoria 16B (1983) 211–244.; S.B.M. Bell, B.M. Diaz, F.C. Holroyd, M.J.J. Jackson, Spatially referenced methods of processing raster and vector data, Image and Vision Computing 1 (4) (1983) 211–220.; B.M. Diaz, S.B.M. Bell, Spatial Data Processing Using Tesseral Methods, Natural Environment Research Council, Swindon, 1986.]. Here, we describe how that technique can equally be applied to the analysis of environmental interaction with built forms. The technique deals with the problems described, first, by linearising the three-dimensional (3-D) space being investigated; the reasoning applied to that space is then carried out within the same environment as the definition of the problem data. We show, with an illustrative example, how the technique can be applied. The problem then remains of how to visualise the results of the analysis so undertaken. We show how this has been accomplished so that the 3-D space and the results are represented in a way which facilitates rapid interpretation of the analysis that has been carried out.
series journal paper
more http://www.elsevier.com/locate/autcon
last changed 2003/05/15 21:22
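
The linearisation step described above can be illustrated with Morton (Z-order) addressing, the file-sequencing idea credited to Morton in the abstract: the bits of the x, y and z cell coordinates are interleaved into a single integer key, so 3-D cells can be stored, sorted and range-queried as a 1-D sequence. A minimal sketch; the 10-bit coordinate width is an arbitrary assumption, and this is the general idea rather than the paper's exact scheme.

```python
# Morton (Z-order) encoding: interleave the bits of x, y, z so each 3-D
# grid cell gets one linear address.

def morton3(x: int, y: int, z: int, bits: int = 10) -> int:
    """Interleave `bits` bits of each coordinate into one integer key."""
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (3 * i)        # x bits at positions 0, 3, 6...
        key |= ((y >> i) & 1) << (3 * i + 1)    # y bits at positions 1, 4, 7...
        key |= ((z >> i) & 1) << (3 * i + 2)    # z bits at positions 2, 5, 8...
    return key

if __name__ == "__main__":
    # Nearby cells tend to share high-order key bits, which is what makes
    # neighbourhood queries on the linearised space cheap.
    print(morton3(3, 5, 1))   # -> 143
```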

_id ga0019
authors Ceccato, Cristiano
year 2000
title On the Translation of Design Data into Design Form in Evolutionary Design
source International Conference on Generative Art
summary The marriage of advanced computational methods and new manufacturing technologies gives rise to new paradigms in design process and execution. Specifically, this research concerns itself with the application of Generative and Evolutionary computation to the production of mass-customized products and building components. The work is based on the premise that CAD-CAM should evolve into a dynamic, intelligent, multi-user environment that encourages creativity and actively supports the evolution of individual, mass-customized designs that exhibit common features. The concept of Parametric Design is well established, and chiefly concerns itself with generating design sets that exist within the boundaries of pre-set parametric values. Evolutionary Design extends the notion of parametric control by using rule-based generative algorithms to evolve common families of individual design solutions. These can be optimized according to particular criteria, or can form a wide variety of hierarchically related design solutions, while supporting design intuition. The integration of Evolutionary Design with CAD-CAM, in particular the areas of flexible manufacturing and mass-customization, creates a unique scenario which exploits the full power of both approaches to create a new design-process paradigm that can generate limitless possibilities in a non-deterministic manner within a variable search-space of possible solutions. This paper concerns itself with the technical and philosophical aspects of the codification, generation and translation of data within the evolutionary-parametric design process. The efficiency and relevance of different methods for treating design data form the most fundamental aspect of the realm of CAD/CAM and are crucial to the successful implementation of Evolutionary Design mechanisms. This begins at the level of seeding and progresses through the entire evolutionary sequence, including the codification of evaluation criteria. Furthermore, the integration of digital design mechanisms with CAM and CNC technologies requires further translation of data into manufacturable formats. This paper examines different methods available to system designers and discusses their effect on new paradigms of digital design methods.
keywords Evolutionary, Parametric, Generative, Data, Format, Objects, Codification
series other
more http://www.generativeart.com/
last changed 2003/08/07 17:25

_id 0014
authors Hsu, W. and Liu, B.
year 2000
title Conceptual design: issues and challenges
source Computer-Aided Design, Vol. 32 (14) (2000) pp. 849-850
summary Decisions made during conceptual design have significant influence on the cost, performance, reliability, safety and environmental impact of a product. It has been estimated that design decisions account for more than 75% of final product costs. It is, therefore, vital that designers have access to the right tools to support such design activities. In the early 1980s, researchers began to realize the impact of design decisions on downstream activities. As a result, different methodologies such as design for assembly, design for manufacturing and concurrent engineering have been proposed. Software tools that implement these methodologies have also been developed. However, most of these tools are only applicable in the detailed design phase. Yet, even the highest standard of detailed design cannot compensate for a poor design concept formulated at the conceptual design phase. In spite of this, few CAD tools have been developed to support conceptual design activities. This is because knowledge of the design requirements and constraints during this early phase of a product's life cycle is usually imprecise and incomplete, making it difficult to utilize computer-based systems or prototypes. However, recent advances in fields such as fuzzy logic, computational geometry, constraint programming and so on have now made it possible for researchers to tackle some of the challenging issues in dealing with conceptual design activities. In this special issue, we have gathered together discussions on various aspects of the conceptual design phase: from the capturing of the designer's intent, to modelling design constraints and solving them in an efficient manner, to verifying the correctness of the design.
series journal paper
email
last changed 2003/05/15 10:54

_id 4077
authors Kolarevic, Branko
year 2000
title Digital Morphogenesis and Computational Architectures
source SIGraDi’2000 - Construindo (n)o espacio digital (constructing the digital Space) [4th SIGRADI Conference Proceedings / ISBN 85-88027-02-X] Rio de Janeiro (Brazil) 25-28 september 2000, pp. 98-103
summary This paper examines methods in which digital media are employed not as a representational tool for visualization but as a generative tool for the derivation of form and its transformation - digital morphogenesis. It explores the possibilities for the "finding of form" which the emergence of various digitally based generative techniques seems to bring about. It surveys the digital generative processes - the computational architectures - based on concepts such as topological space, isomorphic surfaces, kinematics and dynamics, keyshape animation, parametric design, and genetic algorithms.
series SIGRADI
email
last changed 2016/03/10 09:54

_id 4fa1
authors Lee, E., Ida, Y., Woo, S. and Sasada, T.
year 1999
title Environmental Design Using Fractals in Computer Graphics
source Architectural Computing from Turing to 2000 [eCAADe Conference Proceedings / ISBN 0-9523687-5-7] Liverpool (UK) 15-17 September 1999, pp. 533-538
doi https://doi.org/10.52842/conf.ecaade.1999.533
summary Computer graphics has developed efficient techniques for visualising the real world. Many of the algorithms have a physical basis, such as computational models of light and shadow, models of real objects (buildings, mountains, roads and so on) and the simulation of natural phenomena. Computer graphics techniques now provide the virtual world with a perception of three dimensions. The concept of the virtual world and its technology have been expanding and intensifying in recent years, and almost everything in the real world has been simulated in the virtual world. Building a terrain model has traditionally demanded considerable labour and time, but it is now possible to simulate realistic terrain using fractals in computer graphics with a very small program and a small data set. This study aims to show how to build a real-world impression in the virtual world. In this paper the authors suggest a landscape design method and show the results of its application.
keywords Fractals, Polygon-Reduction, Computer Graphics, Virtual World, Collaboration
series eCAADe
last changed 2022/06/07 07:51
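
As an aside on the abstract's claim that realistic terrain can be simulated "with a very small program and a small data set", the sketch below shows one standard fractal technique, one-dimensional midpoint displacement; the roughness value and seed are illustrative assumptions, not the authors' method.

```python
# Midpoint displacement: a tiny fractal terrain-profile generator.
# Roughness, iteration count and seed are illustrative assumptions.

import random

def midpoint_displacement(iterations: int, roughness: float = 0.5, seed: int = 1):
    """Return heights of a 1-D profile with 2**iterations segments."""
    rng = random.Random(seed)      # seeded, so the terrain is reproducible
    heights = [0.0, 0.0]           # start with a single flat segment
    scale = 1.0
    for _ in range(iterations):
        out = []
        for a, b in zip(heights, heights[1:]):
            out.extend([a, (a + b) / 2.0 + rng.uniform(-scale, scale)])
        out.append(heights[-1])
        heights = out
        scale *= roughness         # each level perturbs less: self-similarity
    return heights

if __name__ == "__main__":
    profile = midpoint_displacement(4)
    print(len(profile), [round(h, 2) for h in profile[:8]])
```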

_id 1b04
authors Leu, S.-S., Yang, C.-H. and Huang, J.-C.
year 2000
title Resource leveling in construction by genetic algorithm-based optimization and its decision support system application
source Automation in Construction 10 (1) (2000) pp. 27-41
summary Traditional analytical and heuristic approaches are inefficient and inflexible when solving construction resource leveling problems. A computational optimization technique, genetic algorithms (GAs), was employed in this study to overcome the drawbacks of traditional construction resource leveling algorithms. The proposed algorithm can effectively provide the optimal or near-optimal combination of multiple construction resources, as well as starting and finishing dates of activities, subject to the objective of resource leveling. Furthermore, a prototype of a decision support system (DSS) for construction resource leveling was also developed. Construction planners can interact with the system to carry out ad hoc analysis through "what-if" queries.
series journal paper
more http://www.elsevier.com/locate/autcon
last changed 2003/05/15 21:22
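
A hedged sketch of the GA idea the abstract describes: candidates are vectors of activity start delays, and fitness is the sum of squared daily resource usage, a common leveling objective where lower means flatter. The activity data, float limits and GA settings below are invented for illustration, and precedence constraints are omitted.

```python
# Minimal evolutionary loop for resource leveling, in the spirit of the
# GA approach in the abstract. All data and settings are invented.

import random

# (duration_days, daily_resource_units, max_start_delay) per activity
ACTIVITIES = [(3, 4, 2), (2, 6, 3), (4, 3, 1), (2, 5, 4)]
HORIZON = 12

def usage_profile(starts):
    """Daily resource usage given each activity's start day."""
    daily = [0] * HORIZON
    for (dur, res, _), s in zip(ACTIVITIES, starts):
        for d in range(s, min(s + dur, HORIZON)):
            daily[d] += res
    return daily

def fitness(starts):
    """Sum of squared daily usage: lower means a flatter, leveled profile."""
    return sum(u * u for u in usage_profile(starts))

def evolve(pop_size=30, generations=200, seed=7):
    rng = random.Random(seed)
    pop = [[rng.randint(0, f) for (_, _, f) in ACTIVITIES] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)                    # best (flattest) first
        survivors = pop[: pop_size // 2]
        children = []
        for _ in range(pop_size - len(survivors)):
            child = list(rng.choice(survivors))
            i = rng.randrange(len(child))
            child[i] = rng.randint(0, ACTIVITIES[i][2])  # mutate one start
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print("starts:", best, "profile:", usage_profile(best))
```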

_id 00d5
authors Liou, ShuennRen and Chyn, TaRen
year 2000
title Constructing Geometric Regularity underlying Building Facades
source Promise and Reality: State of the Art versus State of Practice in Computing for the Design and Planning Process [18th eCAADe Conference Proceedings / ISBN 0-9523687-6-5] Weimar (Germany) 22-24 June 2000, pp. 313-315
doi https://doi.org/10.52842/conf.ecaade.2000.313
summary Geometric regularity constitutes a basis for designers to initiate the formulation of building shapes and urban forms. For example, Le Corbusier considers the regulating line "an inevitable element of architecture" and uses it as a "means" for understanding and creating good designs. Thomas Beeby argues that the acquisition of knowledge of geometric construction plays a crucial role in architectural design education. This paper illustrates a computational approach to constructing the regularity of architectural geometry. The formal structures underlying a single façade and continuous façades are examined.
keywords Geometric Regularity, Building Facades, Cluster Analysis, CAAD
series eCAADe
email
more http://www.uni-weimar.de/ecaade/
last changed 2022/06/07 07:59

_id d8df
authors Naticchia, Berardo
year 1999
title Physical Knowledge in Patterns: Bayesian Network Models for Preliminary Design
source Architectural Computing from Turing to 2000 [eCAADe Conference Proceedings / ISBN 0-9523687-5-7] Liverpool (UK) 15-17 September 1999, pp. 611-619
doi https://doi.org/10.52842/conf.ecaade.1999.611
summary Computer applications in design have pursued two main development directions: analytical modelling and information technology. The former has produced a large number of tools for simulating reality (i.e. finite element models); the latter is producing an equally large number of advances in conceptual design support (i.e. artificial intelligence tools). Nevertheless, we can trace only rare interactions between the computational models of these different approaches. This lack of integration is the main reason for the difficulty of applying CAAD to the preliminary stage of design, where logical and quantitative reasoning are closely related in a process that we often call 'qualitative evaluation'. This paper briefly surveys the current development of qualitative physical models applied in design and proposes a general approach for modelling physical behaviour by means of Bayesian networks, which we are employing to develop a tutoring and coaching system for the preliminary design of natural ventilation in halls, called VENTPad. This tool explores the possibility of modelling the causal mechanisms that operate in real systems in order to allow a number of integrated logical and quantitative inferences about the fluid-dynamic behaviour of a hall. The application could be an interesting tool for connecting logical and analytical procedures in preliminary design, able to help students or unskilled architects both by guiding them through the analysis of numerical data (i.e. obtained with sophisticated Computational Fluid Dynamics software) or experimental data (i.e. obtained with laboratory test models) and by suggesting improvements to the design.
keywords Qualitative Physical Modelling, Preliminary Design, Bayesian Networks
series eCAADe
email
last changed 2022/06/07 07:59
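
To make the Bayesian-network idea concrete, here is a toy two-node example of the kind of combined logical and quantitative inference the abstract mentions: the probability of good ventilation given an opening size, with a hidden wind node marginalised out by enumeration. All nodes and probabilities are invented placeholders, not VENTPad's model.

```python
# Toy discrete Bayesian network: infer P(good ventilation | opening size)
# by enumerating the hidden Wind node. All numbers are invented.

P_WIND = {"strong": 0.3, "weak": 0.7}                 # P(Wind)
P_VENT = {                                            # P(good | Wind, Opening)
    ("strong", "large"): 0.9, ("strong", "small"): 0.6,
    ("weak", "large"): 0.5,   ("weak", "small"): 0.1,
}

def p_good_ventilation(opening: str) -> float:
    """Marginalise the hidden Wind node: sum_w P(w) * P(good | w, opening)."""
    return sum(P_WIND[w] * P_VENT[(w, opening)] for w in P_WIND)

if __name__ == "__main__":
    print(round(p_good_ventilation("large"), 3))   # -> 0.62
```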

_id 2345
authors Park, Hyong-June and Vakalo, Emmanuel-George
year 2000
title An Enchanted Toy Based on Froebel’s Gifts: A computational tool used to teach architectural knowledge to students
source Promise and Reality: State of the Art versus State of Practice in Computing for the Design and Planning Process [18th eCAADe Conference Proceedings / ISBN 0-9523687-6-5] Weimar (Germany) 22-24 June 2000, pp. 35-39
doi https://doi.org/10.52842/conf.ecaade.2000.035
summary Assuming that students can acquire architectural knowledge through the direct manipulation of formal objects, this paper introduces a computational toy as a means for teaching knowledge about composition and geometry to students of architecture. A bottom-up approach is employed in the manipulation of the toy. The toy aims at recovering and nourishing the students' creative spirit and enriching their vocabulary of forms and spaces.
keywords bottom-up approach, formalization, data abstraction, communication, basic transformation functions, syntactic interventions, isolated island of automation, Feedback and error-elimination
series eCAADe
email
more http://www.uni-weimar.de/ecaade/
last changed 2022/06/07 08:00

_id f727
authors Stouffs, Rudi and Krishnamurti, Ramesh
year 2000
title Alternative Computational Design Representations
source SIGraDi’2000 - Construindo (n)o espacio digital (constructing the digital Space) [4th SIGRADI Conference Proceedings / ISBN 85-88027-02-X] Rio de Janeiro (Brazil) 25-28 september 2000, pp. 200-202
summary Supporting data sharing among different disciplines, applications, and users in the building industry is a complex and difficult task. Standardization efforts and research into product models have long attempted to facilitate data exchange among building partners, with little result so far. Different technologies have resulted in different approaches; in particular, an object-oriented approach has led to the specification of IFCs as a basis for information sharing, while other initiatives adopt XML as a flexible language for marking up and describing project information. We propose a concept for representational flexibility, named sorts, that combines many of the advantages of both approaches. Based on an extensible vocabulary of representational classes and compositional relationships, and grounded in an object-oriented framework that has each representational class specify its own operational behavior, it will enable a designer to define, develop, and adopt alternative design representations that suit a specific purpose or the task at hand.
series SIGRADI
email
last changed 2016/03/10 10:01

_id 83cb
authors Telea, Alexandru C.
year 2000
title Visualisation and simulation with object-oriented networks
source Eindhoven University of Technology
summary Among existing systems, visual programming environments best address these issues. However, producing interactive simulations and visualisations is still a difficult task. This defines the main research objective of this thesis: the development and implementation of concepts and techniques to combine visualisation, simulation, and application construction in an interactive, easy-to-use, generic environment. The aim is to produce an environment in which the above-mentioned activities can be learnt and carried out easily by a researcher. Working with such an environment should decrease the amount of time usually spent redesigning existing software elements such as graphics interfaces, existing computational modules, and general infrastructure code. Writing new computational components or importing existing ones should be simple and automatic enough to make using the envisaged system an attractive option for a non-programmer expert. Besides this, all proven successful elements of an interactive simulation and visualisation environment should be provided, such as visual programming, graphical user interfaces, direct manipulation, and so on. Finally, a large palette of existing scientific computation, data processing, and visualisation components should be integrated in the proposed system. On the one hand, this should prove our claims of openness and easy code integration. On the other hand, this should provide the concrete set of tools needed for building a range of scientific applications and visualisations. This thesis is structured as follows. Chapter 2 defines the context of our work. The scientific research environment is presented and partitioned into the three roles of end user, application designer, and component developer. The interactions between these roles and their specific requirements are described and lead to a more precise formulation of our problem statement. Chapter 3 presents the most used architectures for simulation and visualisation systems: the monolithic system, the application library, and the framework. The advantages and disadvantages of these architectural models are then discussed in relation to our problem statement's requirements. The main conclusion drawn is that no single existing architectural model suffices, and that what is needed is a combination of the features present in all three models. Chapter 4 introduces the new architectural model we propose, based on the combination of object orientation in the form of the C++ language and dataflow modelling in the new MC++ language. Chapter 5 presents VISSION, an interactive simulation and visualisation environment constructed on the new architectural model, and shows how the usual tasks of application construction, steering, and visualisation are addressed. In chapter 6, the implementation of VISSION's architectural model is described in terms of its component parts. Chapter 7 presents the applications of VISSION to numerical simulation, while chapter 8 focuses on its visualisation and graphics applications. Finally, chapter 9 concludes the thesis and outlines possible directions for future research.
keywords Computer Visualisation
series thesis:PhD
email
last changed 2003/02/12 22:37

_id d24b
authors Van Dam, A., Forsberg, A., Laidlaw, D., La Viola, J. and Simpson, R.
year 2000
title Immersive VR for scientific visualization
source IEEE Computer Graphics and Applications, 20(6), pp. 26-52
summary Immersive virtual reality (IVR) has the potential to be a powerful tool for the visualization of burgeoning scientific datasets and models. While IVR has been available for well over a decade, its use in scientific visualization is relatively new and many challenges remain before IVR can become a standard tool for the working scientist. In this presentation we provide a progress report and sketch a research agenda for the technology underlying IVR for scientific visualization. Among the interesting problem areas are how to do computational steering for exploration, how to use art-inspired visualization techniques for multi-valued data, and how to construct interaction techniques and metaphors for pleasant and efficient control of the environment. To illustrate our approaches to some of these issues, we will present specific examples of work from our lab, including immersive visualizations of arterial blood flow and of medical imaging.
series journal paper
last changed 2003/04/23 15:50

_id f91f
authors Elezkurtaj, Tomor and Franck, Georg
year 2000
title Geometry and Topology. A User-Interface to Artificial Evolution in Architectural Design
source Promise and Reality: State of the Art versus State of Practice in Computing for the Design and Planning Process [18th eCAADe Conference Proceedings / ISBN 0-9523687-6-5] Weimar (Germany) 22-24 June 2000, pp. 309-312
doi https://doi.org/10.52842/conf.ecaade.2000.309
summary The paper presents a system that supports architectural floor plan design interactively. The method of problem solving implemented is a combination of an evolutionary strategy (ES) and a genetic algorithm (GA). The problem to be solved consists of fitting a number of rooms (n) into an outline while observing functional requirements. The rooms themselves are specified with respect to size, function and preferred proportion. The functional requirements entering the fitness functions are expressed in terms of the proportions of the rooms and the neighbourhood relations between them. The system is designed to deal with one of the core problems of computer-supported creativity in architecture. In architecture, not only form but also function is relevant. Without specifying the function that a piece of architecture is supposed to fulfil, it is hard to support its design by computerised methods of problem solving and optimisation. In architecture, however, function relates to comfort, ease of use, and aesthetics as well. Since it is extraordinarily hard, if not impossible, to operationalise aesthetics, computer-aided support of creative architectural design is still in its infancy.
keywords New AI, Genetic Algorithms, Artificial Evolution, creative Architectural Design, Interactive Design, Topology
series eCAADe
email
more http://www.uni-weimar.de/ecaade/
last changed 2022/06/07 07:55
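
A hedged sketch of a fitness function in the spirit described above, scoring a candidate plan on room proportions and neighbourhood (adjacency) relations; the rectangle encoding, adjacency test and penalty weight are assumptions, not the authors' formulation.

```python
# Sketch: rooms are axis-aligned rectangles (x, y, w, h), scored on
# preferred proportion plus a penalty per missing required adjacency.

def proportion_error(w: float, h: float, preferred: float) -> float:
    """Deviation of a room's aspect ratio from its preferred proportion."""
    ratio = max(w, h) / min(w, h)
    return abs(ratio - preferred)

def touching(a, b, tol=1e-6) -> bool:
    """True if two rectangles share an edge segment (are neighbours)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    share_x = ax < bx + bw - tol and bx < ax + aw - tol   # x-intervals overlap
    share_y = ay < by + bh - tol and by < ay + ah - tol   # y-intervals overlap
    meet_v = abs(ax + aw - bx) < tol or abs(bx + bw - ax) < tol
    meet_h = abs(ay + ah - by) < tol or abs(by + bh - ay) < tol
    return (meet_v and share_y) or (meet_h and share_x)

def fitness(rooms, preferred, required_pairs):
    """Lower is better: proportion error plus a penalty per missing adjacency."""
    err = sum(proportion_error(w, h, p) for (_, _, w, h), p in zip(rooms, preferred))
    missing = sum(1 for i, j in required_pairs if not touching(rooms[i], rooms[j]))
    return err + 10.0 * missing   # the weight 10.0 is an arbitrary choice

if __name__ == "__main__":
    rooms = [(0, 0, 4, 3), (4, 0, 3, 3)]   # two rooms sharing an edge
    print(fitness(rooms, preferred=[1.33, 1.0], required_pairs=[(0, 1)]))
```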

_id 625d
authors Liapi, Katherine A.
year 2001
title Geometric Configuration and Graphical Representation of Spherical Tensegrity Networks
source Reinventing the Discourse - How Digital Tools Help Bridge and Transform Research, Education and Practice in Architecture [Proceedings of the Twenty First Annual Conference of the Association for Computer-Aided Design in Architecture / ISBN 1-880250-10-1] Buffalo (New York) 11-14 October 2001, pp. 258-267
doi https://doi.org/10.52842/conf.acadia.2001.258
summary The term "Tensegrity", which describes mainly a structural concept, is used in building design to address a class of structures with very promising applications in architecture. Tensegrity structures are characterized by almost no separation between structural configuration and formal or architectural expression (Liapi 2000). In the last two decades, structural and mechanical aspects of the design of these structures have been successfully addressed, while their intriguing morphology has inspired several artists and architects. Yet very few real-world applications of the tensegrity concept in architecture have been encountered. The geometric and topological complexity of tensegrity structures, which is inherent to their structural and mechanical basis, may account for significant difficulties in the study of their form and their limited application in building design. In this paper an efficient method for generating the geometry of spherical tensegrity networks is presented. The method is based on the integration of CAD tools with Descriptive Geometry procedures and allows designers to resolve and visualize the complex geometry of such structures.
keywords Tensegrity Networks, Visualization, Geometric Configuration
series ACADIA
email
last changed 2022/06/07 07:59

_id 53c6
authors Mardaljevic, John
year 2000
title Daylight Simulation: Validation, Sky Models and Daylight Coefficients
source De Montfort University, Leicester, UK
summary The application of lighting simulation techniques to daylight illuminance modelling in architectural spaces is described in this thesis. The prediction tool used for all the work described here is the Radiance lighting simulation system. An overview of the features and capabilities of the Radiance system is presented. Daylight simulation using the Radiance system is described in some detail. The relation between physical quantities and the lighting simulation parameters is made clear in a series of progressively more complex examples. Effective use of the inter-reflection calculation is described. The illuminance calculation is validated under real sky conditions for a full-size office space. The simulation model used sky luminance patterns that were based directly on measurements. Internal illuminance predictions are compared with measurements for 754 skies that cover a wide range of naturally occurring conditions. The processing of the sky luminance measurements for the lighting simulation is described. The accuracy of the illuminance predictions is shown to be, in the main, comparable with the accuracy of the model input data. There were a number of predictions with low accuracy. Evidence is presented to show that these result from imprecision in the model specification - such as uncertainty in the circumsolar luminance - rather than from the prediction algorithms themselves. Procedures to visualise and reduce illuminance and lighting-related data are presented. The ability of sky models to reproduce measured sky luminance patterns for the purpose of predicting internal illuminance is investigated. Four sky models and two sky model blends are assessed. Predictions of internal illuminance using sky models/blends are compared against those using measured sky luminance patterns. The sky model blends and the Perez All-weather model are shown to perform comparably well. Illuminance predictions using measured skies, however, were invariably better than those using sky models/blends. Several formulations of the daylight coefficient approach for predicting time-varying illuminances are presented. Radiance is used to predict the daylight coefficients from which internal illuminances are derived. The form and magnitude of the daylight coefficients are related to the scene geometry and the discretisation scheme. Internal illuminances are derived for four daylight coefficient formulations based on the measured luminance patterns for the 754 skies. For the best of the formulations, the accuracy of the daylight coefficient derived illuminances is shown to be comparable to that of the standard Radiance calculation method. The use of the daylight coefficient approach to both accurately and efficiently predict hourly internal daylight illuminance levels for an entire year is described. Daylight coefficients are invariant to building orientation for a fixed building configuration. This property is exploited to yield hourly internal illuminances for a full year as a function of building orientation. Visual data analysis techniques are used to display and process the massive number of derived illuminances.
series thesis:PhD
email
more http://www.iesd.dmu.ac.uk/~jm/thesis/
last changed 2003/02/12 22:37
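
Per sensor point, the daylight coefficient approach described above reduces to a linear combination over sky patches: E = sum over patches of D * L * dS, where the coefficients D are computed once (here, with Radiance) and reused for any measured or modelled sky. A minimal sketch with invented numbers:

```python
# Daylight coefficients in one line of algebra: once D (one coefficient
# per sky patch) is precomputed, the internal illuminance for any sky is
# E = sum_a D_a * L_a * dS_a. All values below are invented placeholders.

def illuminance(coeffs, luminances, solid_angles):
    """Combine per-patch daylight coefficients with one sky's luminances."""
    return sum(d * l * s for d, l, s in zip(coeffs, luminances, solid_angles))

if __name__ == "__main__":
    D  = [0.012, 0.020, 0.008]   # daylight coefficients (precomputed once)
    L  = [4500., 6200., 3100.]   # sky-patch luminances for this sky (cd/m2)
    dS = [0.13, 0.13, 0.13]      # patch solid angles (sr)
    print(round(illuminance(D, L, dS), 1), "lux")
    # Reusing D for another sky needs only a new L vector, which is why
    # the method scales to hourly predictions for an entire year.
```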

_id ga0014
authors McGuire, Kevin
year 2000
title Controlling Chaos: a Simple Deterministic System for Creating Complex Organic Shapes
source International Conference on Generative Art
summary It is difficult and frustrating to create complex organic shapes using the current set of computer graphics programs. One reason is that the geometry of nature is different from that of our tools. Its self-similarity and fine detail derive from growth processes that are very different from the working process imposed by drawing programs. This mismatch makes it difficult to create natural-looking artifacts. Drawing programs provide a palette of shapes that may be manipulated in a variety of ways, but the palette is limited and based on a cold Euclidean geometry. Clouds, rivers, and rocks are not lines or circles. Paint programs provide interesting filters and effects, but require great skill and effort. Always, the details must be arduously managed by the artist. This limits the artist's expressive power. Fractals have stunning visual richness, but the artist's techniques are limited to choosing colours and searching the fractal space. Genetic algorithms provide a powerful means for exploring a space of variations, but the artist is limited by the great difficulty of arriving at the correct fitness function. It is hard to get the picture you wanted. Ideally, the artist should have macroscopic control over the creation while leaving the computer to manage the microscopic details. For the result to feel organic, the details should be rich, consistent and varied, cohesive but not repetitious. For the results to be reproducible, the system should be deterministic. For it to be expressive, there should be a cause-effect relationship between the actions in the program and change in the resulting picture. Finally, it would be interesting if the way we drew were more closely related to the way things grow. We present a simple drawing program which provides this mixture of macroscopic control with free microscopic detail. Through the use of an accretion growth model, the artist controls large-scale structure while varied details emerge naturally from sensitive dependence in the system. Its algorithms are simple and deterministic, so its results are predictable and reproducible. The overall resulting structure can be anticipated, but it can also surprise. Despite its simplicity, the program has been used to generate a surprisingly rich assortment of complex, organic-looking pictures.
series other
more http://www.generativeart.com/
last changed 2003/08/07 17:25
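
The abstract does not spell out the program's algorithm, so the following is only a guess at the flavour of a deterministic accretion rule: a cluster grows one cell at a time, each attachment site picked by a fixed hash, so the same call always yields the same organic-looking cluster. The scoring rule is an invented assumption.

```python
# Deterministic accretion-growth sketch: no randomness, so results are
# reproducible, yet the hash yields irregular, organic-looking growth.
# The scoring rule is invented, not the paper's algorithm.

def score(x: int, y: int) -> int:
    """Deterministic pseudo-random score for a candidate cell."""
    return (x * 73856093) ^ (y * 19349663)   # classic hash-mix constants

def grow(steps: int):
    cluster = {(0, 0)}
    for _ in range(steps):
        frontier = {
            (x + dx, y + dy)
            for (x, y) in cluster
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
        } - cluster
        cluster.add(min(frontier, key=lambda c: score(*c)))  # same pick every run
    return cluster

if __name__ == "__main__":
    cells = grow(200)
    print(len(cells), "cells")   # -> 201 cells, identical on every run
```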

_id 38ff
authors Van den Heuvel, F.A.
year 2000
title Trends in CAD-based photogrammetric measurement
source International Archives of Photogrammetry and Remote Sensing, Vol. 33, Part 5/2, pp. 852-863
summary In the past few decades, Computer Aided Design (CAD) systems have evolved from 2D tools that assist in construction design to the basis of software systems for a variety of applications, such as (re)design, manufacturing, quality control, and facility management. The basic functions of a modern CAD system are storage and retrieval of 3D data, their construction, manipulation, and visualisation. All these functions are needed in a photogrammetric measurement system. Therefore, photogrammetry benefits from integration with CAD, and thereby from developments in this field. There are two main interpretations of the term CAD-based photogrammetry. The first interpretation is on a system level: there is a trend towards integration of photogrammetric tools in existing CAD systems. The second interpretation is on an algorithmic level: developments in the field of CAD regarding object modelling techniques are being implemented in photogrammetric systems. In practice, the two interpretations overlap to a varying extent. The integrated photogrammetric processing of geometry and topology is defined as a minimum requirement for CAD-based photogrammetry. The paper discusses the relation between CAD and photogrammetry with an emphasis on close-range photogrammetry. Several approaches for the integration of CAD and photogrammetry are briefly reviewed, and trends in CAD-based photogrammetry are outlined. First of all, the trend towards CAD-based photogrammetry is observed. The integration of photogrammetry and CAD increases the efficiency of photogrammetric modelling. One of the reasons for this is the improvement of the user-interface, which allows better interaction with the data. A more fundamental improvement is the use of advanced object modelling techniques such as Constructive Solid Geometry, and the incorporation of geometric object constraints. Furthermore, research emphasis is on CAD-based matching techniques for automatic precise measurement of CAD-models. An overall conclusion remains: the integration of photogrammetry and CAD has great potential for widening the acceptance of photogrammetry, especially in industry. This is firstly because of the improvement in efficiency, and secondly because of the established and well-known concept of CAD.
series journal paper
last changed 2003/04/23 15:50

_id ga0007
authors Coates, Paul and Miranda, Pablo
year 2000
title Swarm modelling. The use of Swarm Intelligence to generate architectural form
source International Conference on Generative Art
summary '...neither the human purposes nor the architect's method are fully known in advance. Consequently, if this interpretation of the architectural problem situation is accepted, any problem-solving technique that relies on explicit problem definition, on distinct goal orientation, on data collection, or even on non-adaptive algorithms will distort the design process and the human purposes involved.' Stanford Anderson, "Problem-Solving and Problem-Worrying". The work concentrates on the use of the computer as a perceptive device, a sort of virtual hand or "sense" capable of probing an environment. From a set of data that makes up the environment (in this case the geometrical representation of the form of the site), this perceptive device is capable of differentiating and generating distinct patterns in its behavior, patterns that an observer has to interpret as meaningful information. As Nicholas Negroponte explains, referring to the project GROPE in his Architecture Machine: 'In contrast to describing criteria and asking the machine to generate physical form, this exercise focuses on generating criteria from physical form.' 'The onlooking human or architecture machine observes what is "interesting" by observing GROPE's behavior rather than by receiving the testimony that this or that is "interesting".' The swarm as a learning device. In this case the work implements a swarm as a perceptive device. Swarms constitute a paradigm of parallel systems: a multitude of simple individuals aggregate in colonies or groups, giving rise to collaborative behaviors. The individual sensors cannot learn, but the swarm as a system can evolve into more stable states. These states generate distinct patterns, a result of the inner mechanics of the swarm and of the particularities of the environment. The dynamics of the system allow it to learn and adapt to the environment; information is stored in the speed of the sensors (the more collisions, the slower), which acts as a memory. Speed increases in the absence of collisions, providing the system with the ability to forget, indispensable for the differentiation of information and the emergence of patterns. The swarm is both a perceptive and a spatial phenomenon. To be able to interact with an environment, an observer requires some sort of embodiment. In the case of the swarm, its algorithms for moving, collision detection and swarm mechanics make up its perceptive body. The way this body interacts with its environment in the process of learning and differentiating spatial patterns is also a spatial phenomenon. The enactive space of the swarm. Enaction, a concept developed by Maturana and Varela for the description of perception in biological terms, is the understanding of perception as the result of the structural coupling of an environment and an observer. Enaction does not address cognition in the currently conventional sense as an internal manipulation of extrinsic 'information' or 'signals', but as the relation between environment and observer and the blurring of their identities. Thus, the space generated by the swarm is an enactive space, a space without explicit description, an invention of the swarm-environment structural coupling. If we consider a gestalt as 'some property - such as roundness - common to a set of sense data and appreciated by organisms or artefacts' (Gordon Pask), the swarm is also able to differentiate spatial 'gestalts', or spaces with particular characteristics such as 'narrowness' or 'fluidity'.
Implicit surfaces and the wrapping algorithm. One of the many ways of describing this space is through the use of implicit surfaces. An implicit surface may be imagined as an infinitesimally thin band of some measurable quantity such as color, density, temperature, pressure, etc.; thus, an implicit surface consists of those points in three-space that satisfy some particular requirement. This allows us to wrap the regions of space where a difference of quantity has been produced, enclosing the spaces in which particular events in the history of the swarm have occurred. The wrapping method allows complex topologies, such as manifoldness in one continuous surface. It is possible to transform the information generated by the swarm into a landscape that is the result of the swarm's particular reading of the site. Working in real time. Because of the complex nature of the machine, the only possible way to evaluate the resulting behavior is in real time. For this purpose, specific applications had to be developed using OpenGL for the Windows programming environment. The package consisted of translators from DXF format to a specific format used by these applications and vice versa, the Swarm "engine" (a simulated parallel environment), and the wrapping programs to generate the implicit surfaces. Different versions of each have been produced at different stages of development of the work.
series other
email
more http://www.generativeart.com/
last changed 2003/08/07 17:25
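
The speed-as-memory mechanism is described concretely above: collisions slow a sensor down (memory), and free flight lets speed recover (forgetting). Below is a minimal sketch under assumed geometry and rates, not the authors' implementation.

```python
# Minimal sketch of speed-as-memory: collisions halve an agent's speed
# (remembering congestion), free flight lets it recover (forgetting).
# World size, obstacle shape, and rates are assumptions for illustration.

import math
import random

class Sensor:
    def __init__(self, rng):
        self.x, self.y = rng.uniform(0, 10), rng.uniform(0, 10)
        self.heading = rng.uniform(0, 2 * math.pi)
        self.speed = 1.0

    def step(self, obstacles, rng):
        nx = self.x + self.speed * math.cos(self.heading)
        ny = self.y + self.speed * math.sin(self.heading)
        if any(math.hypot(nx - ox, ny - oy) < 1.0 for ox, oy in obstacles):
            self.speed *= 0.5                           # collision: remember
            self.heading = rng.uniform(0, 2 * math.pi)  # deflect elsewhere
        else:
            self.x, self.y = nx, ny
            self.speed = min(1.0, self.speed * 1.05)    # recovery: forget

if __name__ == "__main__":
    rng = random.Random(0)
    obstacles = [(5.0, 5.0)]            # a single obstacle point for the demo
    swarm = [Sensor(rng) for _ in range(50)]
    for _ in range(100):
        for s in swarm:
            s.step(obstacles, rng)
    # Slow agents mark congested regions: the pattern an observer reads.
    print(sorted(round(s.speed, 2) for s in swarm)[:5])
```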
