CumInCAD is a cumulative index of publications in Computer Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD Futures


Hits 1 to 20 of 219

_id 9964
authors Augenbroe, G. and Winkelmann, F.
year 1991
title Integration of Simulation into the Building Design Process
source J.A. Clarke, J.W. Mitchell, and R.C. Van de Perre (eds.), Proceedings, Building Simulation '91 IBPSA Conference, pp. 367-374
summary We describe the need for a joint effort between design researchers and simulation tool developers in formulating procedures and standards for integrating simulation into the building design process. We review and discuss current efforts in the US and Europe in the development of next-generation simulation tools and design integration techniques. In particular, we describe initiatives in object-oriented simulation environments (including the US Energy Kernel System, the Swedish Ida system, the UK Energy Kernel System, and the French ZOOM program) and consider the relationship of these environments to recent R&D initiatives in design integration (the COMBINE project in Europe and the AEDOT project in the US).
series other
last changed 2003/11/21 15:16

_id 22d6
authors Ballheim, F. and Leppert, J.
year 1991
title Architecture with Machines, Principles and Examples of CAAD-Education at the Technische Universität München
source Experiences with CAAD in Education and Practice [eCAADe Conference Proceedings] Munich (Germany) 17-19 October 1991
doi https://doi.org/10.52842/conf.ecaade.1991.x.h3w
summary "Design tools affect the results of the design process" - this is the starting point of our considerations about the efficient use of CAAD within architecture. To give a short overview of what we mean by this thesis, let us take a short - and surely incomplete - trip through the fourth dimension, back into the early days of civil engineering. As CAD in our faculty is integrated in the "Lehrstuhl für Hochbaustatik und Tragwerksplanung" (in English approximately "institute of structural design"), we chose an example we are very familiar with because of its mathematical background - the conic sections: circle, ellipse, parabola and hyperbola. If we start our trip two thousand years ago, we find only the circle - or in very few cases the ellipse - used for the ground plan of Greek or Roman theaters - think of Greek amphitheaters or the Colosseum in Rome - or for the cross section of a building - for example the Pantheon, Roman aqueducts or bridges. With the rediscovery of perspective during the Renaissance, the handling of the ellipse was brought to perfection. Maybe the most famous example is the Capitol in Rome designed by Michelangelo Buonarroti, with its elliptical ground plan that looks like a circle as the visitor comes up the famous stairway. During the following centuries - driven by the further development of the natural sciences and the use of new construction materials, i.e. cast iron, steel or concrete - new design ideas could be realized. With the growing influence of mathematics on the design of buildings came the division into two professions: civil engineering and architecture. To the regret of the architects, the most innovative constructions were designed by civil engineers, e.g. the early iron bridges in Britain or the famous bridges of Robert Maillart. Nowadays we are in the situation of trying to reintegrate the divided professions.
We will return to that point later when discussing possible solutions to this problem. But let us continue our historical survey, demonstrating the state of the art we have today. As the logical consequence of parabolic and hyperbolic arcs, hyperbolic paraboloid shells were developed using traditional design techniques such as models and orthogonal sections. Now we reach the point where the question arises whether complex structures can be completely described using traditional methods. A question that must be answered with "no" if we take the final step to the completely irregular geometry of cable-net constructions or deconstructivist designs. What we see - and what seems to support our thesis of the connection between design tools and the results of the design process - is that, on the one hand, new tools enabled the designer to realize new ideas and, on the other hand, new ideas drove the development of new tools to realize them.

series eCAADe
more http://www.mediatecture.at/ecaade/91/ballheim_leppert.pdf
last changed 2022/06/07 07:50

_id ga0010
id ga0010
authors Moroni, A., Zuben, F. Von and Manzolli, J.
year 2000
title ArTbitrariness in Music
source International Conference on Generative Art
summary Evolution is now considered not only powerful enough to bring about biological entities as complex as humans and consciousness, but also useful in simulation to create algorithms and structures of higher levels of complexity than could easily be built by design. In the context of artistic domains, the process of human-machine interaction is analyzed as a good framework in which to explore creativity and to produce results that could not be obtained without this interaction. When evolutionary computation and other computational intelligence methodologies are involved, we denote every attempt to improve aesthetic judgement as ArTbitrariness, interpreted as an interactive, iterative optimization process. ArTbitrariness is also suggested as an effective way to produce art through an efficient manipulation of information and a proper use of computational creativity to increase the complexity of the results without neglecting the aesthetic aspects [Moroni et al., 2000]. Our emphasis is on an approach to interactive music composition. The problem of computer generation of musical material has received extensive attention, and a subclass of the field of algorithmic composition includes those applications which use the computer as something in between an instrument, which a user "plays" through the application's interface, and a compositional aid, which a user experiments with in order to generate stimulating and varied musical material. This approach was adopted in Vox Populi, a hybrid of an instrument and a compositional environment. Unlike other systems based on genetic algorithms or evolutionary computation, in which people have to listen to and judge the musical items, Vox Populi uses the computer and the mouse as real-time music controllers, acting as a new interactive computer-based musical instrument. The interface is designed to be flexible for the user to modify the music being generated.
It explores evolutionary computation in the context of algorithmic composition and provides a graphical interface that allows the user to modify the tonal center and the voice range, changing the evolution of the music with the mouse [Moroni et al., 1999]. A piece of music consists of several sets of musical material that are manipulated and exposed to the listener, for example pitches, harmonies, rhythms, timbres, etc. These sets are composed of a finite number of elements, and the aim of a composer is, basically, to organize those elements in an aesthetic way. Modeling a piece as a dynamic system implies a view in which the composer draws trajectories or orbits using the elements of each set [Manzolli, 1991]. Nonlinear iterative mappings are associated with interface controls. Two examples of nonlinear iterative mappings with their resulting musical pieces are shown with the Vox Populi interface. The mappings may give rise to attractors, defined as geometric figures that represent the set of stationary states of a nonlinear dynamic system, or simply trajectories to which the system is attracted. The relevance of this approach goes beyond music applications per se. Computer music systems that are built on the basis of a solid theory can be coherently embedded into multimedia environments. The richness and specialty of the music domain are likely to initiate new thinking and ideas, which will have an impact on areas such as knowledge representation and planning, and on the design of visual formalisms and human-computer interfaces in general. References: [Manzolli, 1991] J. Manzolli, Harmonic Strange Attractors, CEM BULLETIN, Vol. 2, No. 2, 4-7, 1991. [Moroni et al., 1999] A. Moroni, J. Manzolli, F. Von Zuben, R. Gudwin, Evolutionary Computation Applied to Algorithmic Composition, Proceedings of CEC99 - IEEE International Conference on Evolutionary Computation, Washington D.C., p. 807-811, 1999. [Moroni et al., 2000] Moroni, A., Von Zuben, F. and Manzolli, J., ArTbitration, Proceedings of the 2000 Genetic and Evolutionary Computation Conference Workshop Program - GECCO, Las Vegas, USA, 143-145, 2000.
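A minimal sketch of a nonlinear iterative mapping driving pitch selection, in the spirit of the orbits described above; the logistic map and the MIDI note range are illustrative assumptions, not the actual Vox Populi mappings:

```python
# Sketch: a nonlinear iterative mapping (logistic map) whose orbit is
# mapped onto pitches. The mapping and pitch range are illustrative
# assumptions, not the Vox Populi implementation.

def logistic_orbit(x0, r, steps):
    """Iterate x -> r*x*(1-x) and return the orbit."""
    orbit = []
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)
        orbit.append(x)
    return orbit

def orbit_to_pitches(orbit, low=48, high=72):
    """Map orbit values in [0, 1] onto a MIDI pitch range."""
    span = high - low
    return [low + round(v * span) for v in orbit]

if __name__ == "__main__":
    # r = 3.9 puts the logistic map in its chaotic regime, so the
    # pitch sequence wanders without settling into a short cycle.
    print(orbit_to_pitches(logistic_orbit(0.5, 3.9, 16)))
```

Lower values of r (below about 3.57) would instead converge to a fixed point or a short cycle, i.e. an attractor in the sense used in the abstract.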
series other
email
more http://www.generativeart.com/
last changed 2003/08/07 17:25

_id c81f
authors Chandansing, R.A. and Vos, C.J.
year 1991
title IT - Use in Reinforced Concrete Detailing : The Current State, a Forecasting-Model, and a Future-Concept
source The Computer Integrated Future, CIB W78 Seminar. september, 1991. Unnumbered : ill., tables. includes bibliography
summary This paper describes the current state in the Netherlands concerning the levels of CAD systems used, their diffusion in practice, and the constraints and effects of their use for reinforced concrete detailing. An initial forecasting model for the further development of IT in the concrete construction industry and a future-concept for IT use in reinforced concrete detailing are presented as well.
keywords CAD, structures, engineering, building, practice, systems, detailing, construction
series CADline
last changed 2003/06/02 13:58

_id avocaad_2001_02
id avocaad_2001_02
authors Cheng-Yuan Lin, Yu-Tung Liu
year 2001
title A digital Procedure of Building Construction: A practical project
source AVOCAAD - ADDED VALUE OF COMPUTER AIDED ARCHITECTURAL DESIGN, Nys Koenraad, Provoost Tom, Verbeke Johan, Verleye Johan (Eds.), (2001) Hogeschool voor Wetenschap en Kunst - Departement Architectuur Sint-Lucas, Campus Brussel, ISBN 80-76101-05-1
summary In earlier times, when computers had not yet been well developed, there was research on representation using conventional media (Gombrich, 1960; Arnheim, 1970). For ancient architects, the design process was described abstractly by text (Hewitt, 1985; Cable, 1983); the process evolved from unselfconscious to conscious ways (Alexander, 1964). Until the appearance of 2D drawings, drawings could only express abstract visual thinking and a visually conceptualized vocabulary (Goldschmidt, 1999). Then, with the massive use of physical models in the Renaissance, the form and space of architecture were given better precision (Millon, 1994). Researchers continued their attempts to identify the nature of different design tools (Eastman and Fereshe, 1994). Simon (1981) pointed out that humans increasingly rely on other specialists, computational agents, and materials to augment their cognitive abilities. This discourse was verified by recent research on the conception of design and its expression using digital technologies (McCullough, 1996; Perez-Gomez and Pelletier, 1997). While other design tools did not change as much as representation (Panofsky, 1991; Koch, 1997), the involvement of computers in conventional architectural design arouses a new design thinking of digital architecture (Liu, 1996; Krawczyk, 1997; Murray, 1997; Wertheim, 1999). The notion of the link between ideas and media is emphasized throughout various fields, such as architectural education (Radford, 2000), the Internet, and the restoration of historical architecture (Potier et al., 2000). Information technology is also an important tool for civil engineering projects (Choi and Ibbs, 1989). Compared with conventional design media, computers avoid some errors in the process (Zaera, 1997). However, most applications of computers to construction are restricted to simulations of the building process (Halpin, 1990).
It is worth studying how to employ computer technology meaningfully to bring significant changes to the concept stage of the building construction process (Madrazo, 2000; Dave, 2000) and to communication (Haymaker, 2000). In architectural design, concept design is achieved through drawings and models (Mitchell, 1997), while the working drawings and even shop drawings are developed and communicated through drawings only. However, the most effective method of shaping building elements is to build models by computer (Madrazo, 1999). With the trend of 3D visualization (Johnson and Clayton, 1998) and the difference between designing in the physical environment and the virtual environment (Maher et al., 2000), we intend to study the possibilities of using digital models, in addition to drawings, as a critical medium in the conceptual stage of the building construction process in the near future (just as physical models played a critical role in the early design process of the Renaissance). This research is combined with two practical building projects, following the progress of construction and using digital models and animations to simulate the structural layouts of the projects. We also tried to solve complicated and even conflicting problems in the detail and piping design process through an easily accessible and precise interface. An attempt was made to delineate the hierarchy of the elements in a single structural and constructional system, and the corresponding relations among the systems. Since building construction is often complicated and even conflicting, the precision needed to complete the projects cannot be based merely on 2D drawings and imagination.
The purpose of this paper is to describe all the related elements with precision and correctness, to discuss different ways of thinking in the design of electrical-mechanical engineering, to receive feedback from construction projects in the real world, and to compare digital models with conventional drawings. Through the application of this research, the subtle relations between conventional drawings and digital models can be used in the area of building construction. Moreover, a theoretical model and standard process is proposed using conventional drawings, digital models and physical buildings. By introducing digital media into the design process of working drawings and shop drawings, there is an opportunity to use digital media as a prominent design tool. This study extends the use of digital models and animation from the design process to the construction process. However, the entire construction process involves various details and exceptions which are not discussed in this paper. These limitations should be explored in future studies.
series AVOCAAD
email
last changed 2005/09/09 10:48

_id ga0024
id ga0024
authors Ferrara, Paolo and Foglia, Gabriele
year 2000
title TEAnO or the computer assisted generation of manufactured aesthetic goods seen as a constrained flux of technological unconsciousness
source International Conference on Generative Art
summary TEAnO (Telematica, Elettronica, Analisi nell'Opificio) was born in Florence in 1991, at the age of 8, as the direct consequence of years of attempts by a group of computer science professionals to use digital computer technology to find a sustainable match among creation, generation (or re-creation) and recreation, the three basic keywords underlying the concept of "Littérature potentielle" deployed by Oulipo in France and Oplepo in Italy (see "La Littérature potentielle (Créations Re-créations Récréations)", published in France by Gallimard in 1973). During the last decade, TEAnO has been involved in the generation of "artistic goods" in aesthetic domains such as literature, music, theatre and painting. In all those artefacts the computer plays a twofold role: it is often a tool to generate the good (e.g. an editor to compose palindrome sonnets or to generate antonymic music) and sometimes it is the medium that makes the fruition of the good possible (e.g. the generator of passages of definition literature). In that sense such artefacts can actually be considered "manufactured" goods. A great part of such creation and re-creation work has been based upon a rather small number of generation constraints borrowed from Oulipo, deeply stressed by the massive combinatory power of the digital computer: S+n, edge extraction, phonetic manipulation, re-writing of well-known masterpieces, random generation of plots, etc.
Despite these apparently simple underlying generation mechanisms, the systematic use of computer-based tools, as well as the analysis of the produced results, has highlighted two findings which can significantly affect the practice of computer-based generation of aesthetic goods: first, the deep structure of an aesthetic work persists even through the most "destructive" manipulations (such as the antonymic transformation of the melody and lyrics of a musical work) and becomes evident as a sort of profound, earliest and distinctive constraint; second, the intensive flux of computer-generated "raw" material seems to confirm and bring to our attention the existence of what Walter Benjamin indicated as the different way in which nature talks to a camera and to our eye, and what Franco Vaccari called "technological unconsciousness". Essential references: R. Campagnoli, Y. Hersant, "Oulipo La letteratura potenziale (Creazioni Ri-creazioni Ricreazioni)", 1985; R. Campagnoli, "Oupiliana", 1995; TEAnO, "Quaderno n. 2 Antologia di letteratura potenziale", 1996; W. Benjamin, "Das Kunstwerk im Zeitalter seiner technischen Reproduzierbarkeit", 1936; F. Vaccari, "Fotografia e inconscio tecnologico", 1994.
series other
more http://www.generativeart.com/
last changed 2003/08/07 17:25

_id 673a
authors Fukuda, T., Nagahama, R. and Sasada, T.
year 1997
title Networked Interactive 3-D design System for Collaboration
source CAADRIA ‘97 [Proceedings of the Second Conference on Computer Aided Architectural Design Research in Asia / ISBN 957-575-057-8] Taiwan 17-19 April 1997, pp. 429-437
doi https://doi.org/10.52842/conf.caadria.1997.429
summary The concept of ODE (Open Design Environment) and a corresponding system were presented in 1991. The new concept of NODE, a networked version of ODE, was then developed in 1994 to enable wide-area collaboration. The aim of our research is to facilitate collaboration among the various people involved in the design process of an urban or architectural project. This includes various designers and engineers, the client, and the citizens who may be affected by such a project. With the new technologies of hypermedia, networks, and component architecture, we have developed the NODE system and applied it in practical collaboration among these various people. This study emphasizes the interactive 3-D design tool of NODE, which makes realistic, real-time presentation possible through an interactive interface. In recent years, the ProjectFolder of the NODE system, a case containing the documents, plans, and tools needed to carry out a project, has been created on the World Wide Web (WWW) and makes hyperlinks between a 3-D object and text, images, and other digital data.
series CAADRIA
email
last changed 2022/06/07 07:50

_id b04c
authors Goerger, S., Darken, R., Boyd, M., Gagnon, T., Liles, S., Sullivan, J. and Lawson, J.
year 1996
title Spatial Knowledge Acquisition from Maps and Virtual Environments in Complex Architectural Space
source Proc. 16th Applied Behavioral Sciences Symposium, 22-23 April 1996, U.S. Air Force Academy, Colorado Springs, CO, pp. 6-10
summary It has often been suggested that due to its inherent spatial nature, a virtual environment (VE) might be a powerful tool for spatial knowledge acquisition of a real environment, as opposed to the use of maps or some other two-dimensional, symbolic medium. While interesting from a psychological point of view, a study of the use of a VE in lieu of a map seems nonsensical from a practical point of view. Why would the use of a VE preclude the use of a map? The more interesting investigation would be of the value added of the VE when used with a map. If the VE could be shown to substantially improve navigation performance, then there might be a case for its use as a training tool. If not, then we have to assume that maps continue to be the best spatial knowledge acquisition tool available. An experiment was conducted at the Naval Postgraduate School to determine if the use of an interactive, three-dimensional virtual environment would enhance spatial knowledge acquisition of a complex architectural space when used in conjunction with floor plan diagrams. There has been significant interest in this research area of late. Witmer, Bailey, and Knerr (1995) showed that a VE was useful in acquiring route knowledge of a complex building. Route knowledge is defined as the procedural knowledge required to successfully traverse paths between distant locations (Golledge, 1991). Configurational (or survey) knowledge is the highest level of spatial knowledge and represents a map-like internal encoding of the environment (Thorndyke, 1980). The Witmer study could not confirm if configurational knowledge was being acquired. Also, no comparison was made to a map-only condition, which we felt is the most obvious alternative. Comparisons were made only to a real world condition and a symbolic condition where the route is presented verbally.
series other
last changed 2003/04/23 15:50

_id fd70
authors Goldman, Glenn and Zdepski, Michael Stephen (Eds.)
year 1991
title Reality and Virtual Reality [Conference Proceedings]
source ACADIA Conference Proceedings / ISBN 1-880250-00-4 / Los Angeles (California - USA) October 1991, 236 p.
doi https://doi.org/10.52842/conf.acadia.1991
summary During the past ten years computers in architecture have evolved from machines used for analytic and numeric calculation, to machines used for generating dynamic images, permitting the creation of photorealistic renderings, and now, in a preliminary way, permitting the simulation of virtual environments. Digital systems have evolved from increasing the speed of human operations, to providing entirely new means for creating, viewing and analyzing data. The following essays illustrate the growing spectrum of computer applications in architecture. They discuss developments in the simulation of future environments on the luminous screen and in virtual space. They investigate new methods and theories for the generation of architectural color, texture, and form. Authors address the complex technical issues of "intelligent" models and their associated analytic contents. There are attempts to categorize and make accessible architects' perceptions of various models of "reality". Much of what is presented foreshadows changes that are taking place in the areas of design theory, building sciences, architectural graphics, and computer research. The work presented is both developmental, evolving from the work done before or in other fields, and unique, exploring new themes and concepts. The application of computer technology to the practice of architecture has had a cross disciplinary effect, as computer algorithms used to generate the "unreal" environments and actors of the motion picture industry are applied to the prediction of buildings and urban landscapes not yet in existence. Buildings and places from history are archeologically "re-constructed" providing digital simulations that enable designers to study that which has previously (or never) existed. 
Applications of concepts from scientific visualization suggest new methods for understanding the highly interrelated aspects of the architectural sciences: structural systems, environmental control systems, building economics, etc. Simulation systems from the aerospace industry and computer media fields propose new non-physical three-dimensional worlds. Video compositing technology from the television industry and the practice of medicine are now applied to the compositing of existing environments with proposed buildings. Whether based in architectural research or practice, many authors continue to question the development of contemporary computer systems. They seek new interfaces between human and machine, new methods for simulating architectural information digitally, and new ways of conceptualizing the process of architectural design. While the practice of architecture has, of necessity, been primarily concerned with increasing productivity - and automation for improved efficiency, it is clear that university based studies and research continue to go beyond the electronic replication of manual tasks and study issues that can change the processes of architectural design - and ultimately perhaps, the products.
series ACADIA
email
more http://www.acadia.org
last changed 2022/06/07 07:49

_id a395
authors Mitchell, W.J., Liggett, R.S., Pollalis, S.N. and Tan, M.
year 1991
title Integrating Shape Grammars and Design Analysis
source Computer Aided Architectural Design Futures: Education, Research, Applications [CAAD Futures ‘91 Conference Proceedings / ISBN 3-528-08821-4] Zürich (Switzerland), July 1991, pp. 17-32
summary This paper demonstrates how design problems can be solved by combining a shape grammar to generate alternatives with standard engineering analysis procedures to test them. It provides a detailed worked example, and discusses practical applications of the idea in design teaching.
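The generate-and-test scheme this abstract describes can be sketched minimally; the toy grammar rules and the pass/fail analysis below are illustrative assumptions, not the paper's shape grammar or engineering procedures:

```python
# Sketch: generate-and-test - grammar rules enumerate alternatives,
# an analysis procedure filters them. Rules and analysis here are
# toy placeholders (designs are just bay counts).

def apply_rules(design, rules):
    """Produce all designs reachable by one rule application."""
    return [rule(design) for rule in rules]

def generate_and_test(seed, rules, analyse, depth):
    """Breadth-first generation, keeping only designs that pass analysis."""
    frontier, accepted = [seed], []
    for _ in range(depth):
        next_frontier = []
        for design in frontier:
            for candidate in apply_rules(design, rules):
                if analyse(candidate):
                    accepted.append(candidate)
                    next_frontier.append(candidate)
        frontier = next_frontier
    return accepted

if __name__ == "__main__":
    # Rules add one or two bays; the "analysis" rejects over 6 bays.
    rules = [lambda d: d + 1, lambda d: d + 2]
    print(generate_and_test(1, rules, lambda d: d <= 6, depth=2))
```

In a real pairing, `analyse` would be a standard engineering calculation (e.g. a structural check) and the designs would be shapes rather than integers.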
series CAAD Futures
email
last changed 2003/05/16 20:58

_id 7508
authors Montgomery, D.C.
year 1991
title Design and Analysis of Experiments
source John Wiley, Chichester
summary Learn How to Achieve Optimal Industrial Experimentation. Through four editions, Douglas Montgomery has provided statisticians, engineers, scientists, and managers with the most effective approach for learning how to design, conduct, and analyze experiments that optimize performance in products and processes. Now, in this fully revised and enhanced Fifth Edition, Montgomery has improved his best-selling text by focusing even more sharply on factorial and fractional factorial design and presenting new analysis techniques (including the generalized linear model). There is also expanded coverage of experiments with random factors, response surface methods, experiments with mixtures, and methods for process robustness studies. The book also illustrates two of today's most powerful software tools for experimental design: Design-Expert(r) and Minitab(r). Throughout the text, you'll find output from these two programs, along with detailed discussion on how computers are currently used in the analysis and design of experiments. You'll also learn how to use statistically designed experiments to: obtain information for characterization and optimization of systems; improve manufacturing processes; design and develop new processes and products; evaluate material alternatives in product design; improve the field performance, reliability, and manufacturing aspects of products; and conduct experiments effectively and efficiently. Other important textbook features: a student version of Design-Expert(r) software is available; a Web site (www.wiley.com/college/montgomery) offers supplemental text material for each chapter, a sample syllabus, and sample student projects from the author's Design of Experiments course at Arizona State University.
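As a small illustration of the two-level factorial designs the text centres on, a sketch of generating a full factorial run matrix; the factor names are hypothetical placeholders:

```python
# Sketch: the run matrix of a 2^k full factorial design, in which
# every combination of two coded factor levels (-1 low, +1 high)
# is run once. Factor names below are illustrative only.
from itertools import product

def full_factorial(factors):
    """Return all runs of a two-level full factorial design."""
    levels = [(-1, +1)] * len(factors)
    return [dict(zip(factors, run)) for run in product(*levels)]

if __name__ == "__main__":
    runs = full_factorial(["temperature", "pressure", "catalyst"])
    print(len(runs))  # 2^3 = 8 runs
    for run in runs:
        print(run)
```

A fractional factorial design would instead select a subset of these runs chosen so that the effects of interest remain estimable.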
series other
last changed 2003/04/23 15:14

_id 3105
authors Novak, T.P., Hoffman, D.L., and Yung, Y.-F.
year 1996
title Modeling the structure of the flow experience
source INFORMS Marketing Science and the Internet Mini-Conference, MIT
summary The flow construct (Csikszentmihalyi 1977) has recently been proposed by Hoffman and Novak (1996) as essential to understanding consumer navigation behavior in online environments such as the World Wide Web. Previous researchers (e.g. Csikszentmihalyi 1990; Ghani, Supnick and Rooney 1991; Trevino and Webster 1992; Webster, Trevino and Ryan 1993) have noted that flow is a useful construct for describing more general human-computer interactions. Hoffman and Novak define flow as "the state occurring during network navigation which is: 1) characterized by a seamless sequence of responses facilitated by machine interactivity, 2) intrinsically enjoyable, 3) accompanied by a loss of self-consciousness, and 4) self-reinforcing." To experience flow while engaged in an activity, consumers must perceive a balance between their skills and the challenges of the activity, and both their skills and challenges must be above a critical threshold. Hoffman and Novak (1996) propose that flow has a number of positive consequences from a marketing perspective, including increased consumer learning, exploratory behavior, and positive affect.
series other
last changed 2003/04/23 15:50

_id 2abf
id 2abf
authors Rafi, A
year 2001
title Design creativity in emerging technologies
source In Von, H., Stocker, G. and Schopf, C. (Eds.), Takeover: Who’s doing art of tomorrow (pp. 41-54), New York: SpringerWein.
summary Human creativity works best when there are constraints - pressures to react to, to shape, to suggest. People are generally not very good at making it all up from scratch (Laurel, 1991). Emerging technologies, particularly virtual reality (VR), multimedia and the Internet, are yet to be fully explored, as they allow unprecedented creative talent, ability, skill sets, creative thinking, representation, exploration, observation and reference. In an effort to deliver interactive content, designers tend to borrow freely from different fields such as advertising, medicine, games, fine art, commerce, entertainment, edutainment, film-making and architecture (Rafi, Kamarulzaman, Fauzan and Karboulonis, 2000). As a result, content becomes a base to which developers transfer the techniques of conventional design media to the computer. What developers (e.g. artists and technologists) often miss is developing emerging-technology content based on the nature of the medium. In this context, the user is the best judge of the effectiveness of the content.

The paper will introduce the Global Information Infrastructure (GII) currently being developed in the Asian region and discuss its impact on Information Age society. It will further highlight the 'natural' value and characteristics of the emerging technologies, in particular Virtual Reality (VR), multimedia and the Internet, as guidance for designing effective, rich and innovative content. This paper also argues that content designers of the future must not only be both artist and technologist, but artist and technologist aware of the re-convergence of art and science and of the context in which content is being developed. Some of our explorations at the Faculty of Creative Multimedia, Multimedia University, will also be demonstrated. It is hoped that this will serve as evidence to guide future 'techno-creative designers'.

keywords design, creativity, content, emerging technologies
series book
type normal paper
email
last changed 2007/09/13 03:46

_id 80ce
authors Turner, R., Balaguer, F., Gobbetti, E. and Thalmann, D.
year 1991
title Interactive Scene Walkthrough Using a Physically-Based Virtual Camera
source Computer Aided Architectural Design Futures: Education, Research, Applications [CAAD Futures ‘91 Conference Proceedings / ISBN 3-528-08821-4] Zürich (Switzerland), July 1991, pp. 511-520
summary One of the most powerful results of recent advances in graphics hardware is the ability of a computer user to interactively explore a virtual building or landscape. The newest three-dimensional input devices, together with high-speed 3D graphics workstations, make it possible to view and move through a 3D scene by interactively controlling the motion of a virtual camera. In this paper, we describe how natural and intuitive control of a building walkthrough can be achieved by using a physically-based model of the virtual camera's behavior. Using the laws of classical mechanics to create an abstract physical model of the camera, we simulate the virtual camera's motion in real time in response to force data from various 3D input devices (e.g. the Spaceball and the Polhemus 3Space Digitizer). The resulting interactive behavior of the model is determined by several physical parameters, such as mass, moment of inertia, and various friction coefficients, which can all be varied interactively, and by constraints on the camera's degrees of freedom. This allows us to explore a continuous range of physically-based metaphors for controlling camera motion. We present the results of experiments using several of these metaphors for virtual camera motion and describe the effects of the various physical parameters.
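A minimal sketch of the physically-based camera idea: device force drives a mass with viscous friction, integrated one step at a time. The explicit-Euler scheme and the parameter values are illustrative assumptions, not those of the paper:

```python
# Sketch: one explicit-Euler step of a physically-based virtual camera.
# Input-device force accelerates a mass; viscous friction damps it, so
# the camera coasts to a stop when the force is released.
# Parameter values are illustrative only.

def camera_step(pos, vel, force, mass=1.0, friction=0.5, dt=1/60):
    """Advance camera position/velocity by one time step, per axis."""
    new_pos, new_vel = [], []
    for p, v, f in zip(pos, vel, force):
        a = (f - friction * v) / mass   # net acceleration on this axis
        v = v + a * dt
        p = p + v * dt
        new_pos.append(p)
        new_vel.append(v)
    return new_pos, new_vel

if __name__ == "__main__":
    pos, vel = [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]
    # A constant forward push from the input device for one second:
    for _ in range(60):
        pos, vel = camera_step(pos, vel, [0.0, 0.0, 1.0])
    print(pos, vel)
```

Raising `friction` makes the camera feel more viscous and stop sooner; raising `mass` makes it feel heavier and slower to respond, which is the continuum of control metaphors the abstract refers to.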
series CAAD Futures
last changed 1999/04/07 12:03
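
The camera behavior summarized in the record above can be sketched in a few lines; this is a minimal one-degree-of-freedom illustration under the stated model (force input, mass, viscous friction, Euler integration), not the authors' implementation, and all names and parameter values are invented for the example.

```python
# Minimal sketch (not the authors' implementation) of a physically-based
# camera in the spirit of Turner et al.: input-device forces drive a point
# mass with viscous friction, integrated with explicit Euler steps.
from dataclasses import dataclass

@dataclass
class PhysicalCamera:
    mass: float = 1.0        # larger mass -> more sluggish response
    friction: float = 0.5    # viscous friction coefficient
    position: float = 0.0    # one translational degree of freedom, for brevity
    velocity: float = 0.0

    def step(self, force: float, dt: float) -> None:
        # Newton's second law with a viscous damping term: m*a = F - c*v
        accel = (force - self.friction * self.velocity) / self.mass
        self.velocity += accel * dt
        self.position += self.velocity * dt

cam = PhysicalCamera()
for _ in range(100):         # constant push from the input device
    cam.step(force=1.0, dt=0.01)
for _ in range(100):         # release: friction decays the velocity
    cam.step(force=0.0, dt=0.01)
```

Varying `mass` and `friction` interactively is what gives the continuous range of control "metaphors" the paper describes, from heavy, smooth motion to light, responsive motion.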

_id 241f
authors Van Wyk, C.S.G., Bhat, R., Gauchel, J. and Hartkopf, V.
year 1991
title A Knowledge-based Approach to Building Design and Performance Evaluation
source Reality and Virtual Reality [ACADIA Conference Proceedings / ISBN 1-880250-00-4] Los Angeles (California - USA) October 1991, pp. 1-14
doi https://doi.org/10.52842/conf.acadia.1991.001
summary The introduction of physically-based description and simulation methods to issues of building performance (i.e., acoustic, visual, and air quality; thermal comfort, cost, and long-term system integrity) began in the early 1960s as one of the first examples of computer-aided design in architecture. Since that time, the development of commercially-available computer-aided design systems has largely been oriented towards the visualization and representation of the geometry of buildings, while the development of building performance applications has been concerned with approaches to mathematical and physics-based modeling for predictive purposes.
series ACADIA
email
last changed 2022/06/07 07:58

_id ecaade2020_139
id ecaade2020_139
authors Zwierzycki, Mateusz
year 2020
title On AI Adoption Issues in Architectural Design - Identifying the issues based on an extensive literature review.
source Werner, L and Koering, D (eds.), Anthropologic: Architecture and Fabrication in the cognitive age - Proceedings of the 38th eCAADe Conference - Volume 1, TU Berlin, Berlin, Germany, 16-18 September 2020, pp. 515-524
doi https://doi.org/10.52842/conf.ecaade.2020.1.515
summary This paper presents an analysis of the AI-in-design literature, compiled from almost 200 publications from the 1980s onwards. The majority of the sources are proceedings from various conferences. This work is inspired by the Ten Problems for AI in Design (Gero 1991) workshop report, which listed the problems to be tackled in design with AI. Almost 30 years after that publication, it seems most of the Ten Problems cannot be considered solved, or even addressed. One of this paper's goals is to identify, categorize and examine the bottlenecks in the adoption of AI in design. The collected papers were analysed to obtain the following data: Problem, Tool, Solution, Stage and Future work. The conclusions drawn from the analysis are used to define a range of existing problems with AI adoption, further illustrated with an update to the Ten Problems. Ideally, this paper will spark a discussion on the quality of research, methodology and continuity in research.
keywords artificial intelligence; review; design automation; knowledge representation; machine learning; expert system
series eCAADe
email
last changed 2022/06/07 07:57

_id 86c1
authors Shih, Shen-Guan
year 1991
title Case-based Representation and Adaptation in Design
source Computer Aided Architectural Design Futures: Education, Research, Applications [CAAD Futures ‘91 Conference Proceedings / ISBN 3-528-08821-4] Zürich (Switzerland), July 1991, pp. 301-312
summary By attempting to model the raw memory of experts, case-based reasoning is distinguished from traditional expert systems, which compile experts' knowledge into rules before new problems are given. A case-based reasoning system processes new problems with the most similar prior experiences available, and adapts the prior solutions to solve new problems. Case-based representation of design knowledge utilizes the desirable features of the selected case as syntax rules to adapt the case to a new context. As the central issue of the paper, three types of adaptation aimed at topological modifications are described. The first type - case-based search - can be viewed as a localized search process. It follows the syntactical structure of the case to search for variations which provide the required functionality. Regarding the complexity of computation, it is recognized that when a context-sensitive grammar is used to describe the desirable features, the search process becomes intractable. The second type of adaptation can be viewed as a process of self-organization, in which context-sensitive grammars play an essential role. Evaluations have to be simulated by local interaction among design primitives. The third type is called direct transduction. A case is translated directly to another structure according to its syntax by some translation functions. A direct transduction is not necessarily a composition of design operators and thus, a cross-contextual mapping is possible. As a prospective use of these adaptation methods, a CAD system which allows designers to modify the syntactical structure of a group of design elements, according to some concerned semantics, would support designers better than current CAD systems.
series CAAD Futures
last changed 1999/04/07 12:03
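
The retrieve-then-adapt cycle at the heart of case-based reasoning, as summarized in the record above, can be sketched generically; this is an illustrative toy, not Shih's system, and the case features and the trivial adaptation rule are invented for the example.

```python
# Illustrative sketch of case-based reasoning (not Shih's system): retrieve
# the stored case most similar to a new problem, then adapt its solution.
# The feature names and cases below are hypothetical.

def similarity(a: dict, b: dict) -> int:
    # Count the features on which the two problem descriptions agree.
    return sum(1 for k in a if b.get(k) == a[k])

def retrieve(case_base: list, problem: dict) -> dict:
    # Pick the prior case whose problem description best matches the new one.
    return max(case_base, key=lambda c: similarity(c["problem"], problem))

case_base = [
    {"problem": {"rooms": 3, "site": "corner"}, "solution": "L-plan"},
    {"problem": {"rooms": 3, "site": "infill"}, "solution": "linear plan"},
]

new_problem = {"rooms": 3, "site": "corner", "orientation": "south"}
best = retrieve(case_base, new_problem)
# Trivial adaptation step: reuse the retrieved solution in the new context.
# Shih's paper is precisely about richer, syntax-driven versions of this step.
adapted = {"problem": new_problem, "solution": best["solution"]}
```

The contrast with rule-based expert systems is visible even at this scale: the knowledge lives in the stored cases themselves, and only the similarity and adaptation machinery is generic.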

_id 0ab2
authors Amor, R., Hosking, J., Groves, L. and Donn, M.
year 1993
title Design Tool Integration: Model Flexibility for the Building Profession
source Proceedings of Building Systems Automation - Integration, University of Wisconsin-Madison
summary The development of ICAtect, as discussed in the Building Systems Automation and Integration Symposium of 1991, provides a way of integrating simulation tools through a common building model. However, ICAtect is only a small step towards the ultimate goal of total integration and automation of the building design process. In this paper we investigate the next steps on the path toward integration. We examine how models structured to capture the physical attributes of the building, as required by simulation tools, can be used to converse with knowledge-based systems. We consider the types of mappings that occur in the often different views of a building held by these two classes of design tools. This leads us to examine the need for multiple views of a common building model. We then extend our analysis from the views required by simulation and knowledge-based systems, to those required by different segments of the building profession (e.g. architects, engineers, developers, etc.) to converse with such an integrated system. This indicates a need to provide a flexible method of accessing data in the common building model to facilitate use by different building professionals with varying specialities and levels of expertise.
series journal paper
email
last changed 2003/05/15 21:22

_id f9bd
authors Amor, R.W.
year 1991
title ICAtect: Integrating Design Tools for Preliminary Architectural Design
source Wellington, New Zealand: Computer Science Department, Victoria University
summary ICAtect is a knowledge based system that provides an interface between expert systems, simulation packages and CAD systems used for preliminary architectural design. This thesis describes its structure and development. The principal work discussed in this thesis involves the formulation of a method for representing a building. This is developed through an examination of a number of design tools used in architectural design, and the ways in which each of these describe a building. Methods of enabling data to be transferred between design tools are explored. A Common Building Model (CBM), forming the core of the ICAtect system, is developed to represent the design tools' knowledge of a building. This model covers the range of knowledge required by a large set of disparate design tools used by architects at the initial design stage. Standard methods of integrating information from the tools were examined, but required augmentation to encompass the unusual constraints found in some of the design tools. The integration of the design tools and the CBM is discussed in detail, with example methods developed for each type of design tool. These example methods provide a successful way of moving information between the different representations. Some problems with mapping data between very different representations were encountered in this process, and the solutions or ideas for remedies are detailed. A model for control and use of ICAtect is developed in the thesis, and the extensions to enable a graphical user interface are discussed. The methods developed in this thesis demonstrate the feasibility of an integrated system of this nature, while the discussion of future work indicates the scope and potential power of ICAtect.
series other
last changed 2003/04/23 15:14

_id 8b1e
authors Blinn, James F.
year 1991
title A Trip Down the Graphics Pipeline: Line Clipping
source IEEE Computer Graphics and Applications January, 1991. vol. 11: pp. 98-105 : ill. includes bibliography.
summary The classic computer graphics pipeline is an assembly-line-like process that geometric objects must pass through on their journey to becoming pixels on the screen. This is the first in a series of columns on the graphics pipeline. In this column the author concentrates on the algorithmic aspects of the line-clipping part of the pipeline.
keywords clipping, algorithms, computer graphics
series CADline
last changed 2003/06/02 13:58
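
The pipeline stage Blinn's column covers can be illustrated with the classic outcode-based clipper; this is the standard textbook Cohen-Sutherland algorithm, shown as a representative example, not necessarily Blinn's own formulation.

```python
# Cohen-Sutherland line clipping against an axis-aligned window: a standard
# example of the line-clipping pipeline stage (not Blinn's formulation).
INSIDE, LEFT, RIGHT, BOTTOM, TOP = 0, 1, 2, 4, 8

def outcode(x, y, xmin, ymin, xmax, ymax):
    # Classify a point by which window edges it lies outside of.
    code = INSIDE
    if x < xmin: code |= LEFT
    elif x > xmax: code |= RIGHT
    if y < ymin: code |= BOTTOM
    elif y > ymax: code |= TOP
    return code

def clip_line(x0, y0, x1, y1, xmin=0.0, ymin=0.0, xmax=1.0, ymax=1.0):
    """Return the clipped segment, or None if it lies entirely outside."""
    c0 = outcode(x0, y0, xmin, ymin, xmax, ymax)
    c1 = outcode(x1, y1, xmin, ymin, xmax, ymax)
    while True:
        if not (c0 | c1):      # both endpoints inside: trivially accept
            return (x0, y0, x1, y1)
        if c0 & c1:            # both outside the same edge: trivially reject
            return None
        c = c0 or c1           # pick an endpoint that is outside
        # Intersect the segment with the violated window edge.
        if c & TOP:
            x = x0 + (x1 - x0) * (ymax - y0) / (y1 - y0); y = ymax
        elif c & BOTTOM:
            x = x0 + (x1 - x0) * (ymin - y0) / (y1 - y0); y = ymin
        elif c & RIGHT:
            y = y0 + (y1 - y0) * (xmax - x0) / (x1 - x0); x = xmax
        else:  # LEFT
            y = y0 + (y1 - y0) * (xmin - x0) / (x1 - x0); x = xmin
        if c == c0:
            x0, y0 = x, y
            c0 = outcode(x0, y0, xmin, ymin, xmax, ymax)
        else:
            x1, y1 = x, y
            c1 = outcode(x1, y1, xmin, ymin, xmax, ymax)
```

The trivial-accept and trivial-reject tests on the outcodes are what make the method fast in practice: most segments never reach the intersection arithmetic.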
