CumInCAD is a Cumulative Index of publications in Computer Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD futures


Hits 1 to 20 of 5841

_id avocaad_2001_02
id avocaad_2001_02
authors Cheng-Yuan Lin, Yu-Tung Liu
year 2001
title A digital Procedure of Building Construction: A practical project
source AVOCAAD - ADDED VALUE OF COMPUTER AIDED ARCHITECTURAL DESIGN, Nys Koenraad, Provoost Tom, Verbeke Johan, Verleye Johan (Eds.), (2001) Hogeschool voor Wetenschap en Kunst - Departement Architectuur Sint-Lucas, Campus Brussel, ISBN 80-76101-05-1
summary In earlier times, when computers were not yet well developed, there was research on representation using conventional media (Gombrich, 1960; Arnheim, 1970). For ancient architects, the design process was described abstractly in text (Hewitt, 1985; Cable, 1983); the process evolved from unselfconscious to conscious ways (Alexander, 1964). Until the appearance of 2D drawings, such drawings could only express abstract visual thinking and a visually conceptualized vocabulary (Goldschmidt, 1999). Then, with the massive use of physical models in the Renaissance, the form and space of architecture were given better precision (Millon, 1994). Researchers continued their attempts to identify the nature of different design tools (Eastman and Fereshe, 1994). Simon (1981) observed that humans increasingly rely on other specialists, computational agents, and reference materials to augment their cognitive abilities. This discourse was supported by recent research on the conception of design and its expression using digital technologies (McCullough, 1996; Perez-Gomez and Pelletier, 1997). While other design tools did not change as much as representation (Panofsky, 1991; Koch, 1997), the involvement of computers in conventional architectural design gives rise to a new design thinking of digital architecture (Liu, 1996; Krawczyk, 1997; Murray, 1997; Wertheim, 1999). The link between ideas and media is emphasized across various fields, such as architectural education (Radford, 2000), the Internet, and the restoration of historical architecture (Potier et al., 2000). Information technology is also an important tool for civil engineering projects (Choi and Ibbs, 1989). Compared with conventional design media, computers avoid some errors in the process (Zaera, 1997). However, most applications of computers to construction are restricted to simulations of the building process (Halpin, 1990). It is worth studying how to employ computer technology meaningfully to bring significant changes to the concept stage of the building construction process (Madrazo, 2000; Dave, 2000) and to communication (Haymaker, 2000). In architectural design, concept design was achieved through drawings and models (Mitchell, 1997), while working drawings and even shop drawings were developed and communicated through drawings only. However, the most effective method of shaping building elements is to build models by computer (Madrazo, 1999). With the trend of 3D visualization (Johnson and Clayton, 1998) and the differences between designing in the physical environment and in the virtual environment (Maher et al., 2000), we intend to study the possibilities of using digital models, in addition to drawings, as a critical medium in the conceptual stage of the building construction process in the near future (just as physical models played a critical role in the early design process in the Renaissance). This research is combined with two practical building projects, following the progress of construction and using digital models and animations to simulate the structural layouts of the projects. We also tried to solve the complicated and even conflicting problems of the detail and piping design process through an easily accessible and precise interface. An attempt was made to delineate the hierarchy of the elements within a single structural and constructional system, and the corresponding relations among the systems.
Since building construction is often complicated and even conflicting, the precision needed to complete such projects cannot be based merely on 2D drawings supplemented by imagination. The purpose of this paper is to describe all the related elements with precision and correctness, to discuss the possibilities of different thinking in the design of electric-mechanical engineering, to receive feedback from construction projects in the real world, and to compare the digital models with conventional drawings. Through the application of this research, the subtle relations between conventional drawings and digital models can be used in the area of building construction. Moreover, a theoretical model and a standard process are proposed using conventional drawings, digital models and physical buildings. By introducing digital media into the design process of working drawings and shop drawings, there is an opportunity to use digital media as a prominent design tool. This study extends the use of digital models and animation from the design process to the construction process. However, the entire construction process involves various details and exceptions that are not discussed in this paper; these limitations should be explored in future studies.
series AVOCAAD
email
last changed 2005/09/09 10:48

_id 0e58
authors Campbell, D.A. and Wells, M.
year 1994
title A Critique of Virtual Reality in the Architectural Design Process, R-94-3
source Human Interface Technology Laboratory, University of Washington, Seattle, USA, http://www.hitl.washington.edu/publications/r-94-3/: 23 May 2001
summary An addition to a building was designed using virtual reality (VR). The project was part of a design studio for graduate students of architecture. During the design process a detailed journal of activities was kept. In addition, the design implemented with VR was compared to designs implemented with more traditional methods. Both immersive and non-immersive VR simulations were attempted. Part of the rationale for exploring the use of VR in this manner was to develop insight into how VR techniques can be incorporated into the architectural design process, and to provide guidance for the implementers of future VR systems. This paper describes the role of VR in schematic design, through design development to presentation and evaluation. In addition, there are some comments on the effects of VR on detailed design. VR proved to be advantageous in several phases of the design. However, several shortcomings in both hardware and software became apparent. These are described, and a number of recommendations are provided.
series other
email
last changed 2003/04/23 15:50

_id 3fd1
authors Cybis Pereira, A.T., Tissiani, G. and Bocianoski, I.
year 2000
title Design de Interfaces para Ambientes Virtuais: como Obter Usabilidade em 3D (Interface Design for Virtual Environments: How to obtain use of 3-D space.)
source SIGraDi’2000 - Construindo (n)o espacio digital (constructing the digital Space) [4th SIGRADI Conference Proceedings / ISBN 85-88027-02-X] Rio de Janeiro (Brazil) 25-28 september 2000, pp. 313-315
summary The paper presents part of a research project developed at the LRV, the Laboratory of Virtual Reality of the graduate program in Production Engineering at UFSC. The research addresses approaches to the design of human-computer interfaces (HCI) for virtual media. If VR is considered the most advanced computer interface technology, at least from the point of view of interactivity, how can its usability be guaranteed while also producing graphic interfaces of aesthetic and functional value? Moreover, in virtual spaces with or without immersion, how can the design of the interface stimulate the user's interactivity with the VR system? These and other questions are essential for those who work with interface design for computer systems and who need to deliver media that use virtual reality technology. This article presents a study of the design techniques, the tools used, the recommendations and the requirements of visual communication for HCI in virtual spaces.
series SIGRADI
email
last changed 2016/03/10 09:49

_id 2979
authors Henry, D. and Furness, T.A.
year 1993
title Spatial Perception in Virtual Environments: Evaluating an Architectural Application
source IEEE Virtual Reality Annual International Symposium, 1993, Seattle
summary Over the last several years, professionals from many different fields have come to the Human Interface Technology Laboratory (H.I.T.L.) to discover and learn about virtual environments. In general, they are impressed by their experiences and express the tremendous potential the tool has in their respective fields. But the potential is always projected far into the future, and the tool remains just a concept. This is justifiable because the quality of the visual experience is so much less than what people are used to seeing: high-definition television, breathtaking cinematographic special effects and photorealistic computer renderings. Instead, the models in virtual environments look very simple; they are made of small spaces, filled with simple or abstract-looking objects with little color distinction, seen through displays of noticeably low resolution and at an update rate which leaves much to be desired. Clearly, for most applications, the requirements of precision have not yet been met by virtual interfaces as they exist today. However, there are a few domains where the relatively low level of the technology could be perfectly appropriate. In general, these are applications which require that the information be presented in symbolic or representational form. Having studied architecture, I knew that there are moments during the early part of the design process when conceptual decisions are made which require precisely the simple and representative nature available in existing virtual environments.
series journal paper
last changed 2003/04/23 15:14

_id 32eb
authors Henry, Daniel
year 1992
title Spatial Perception in Virtual Environments : Evaluating an Architectural Application
source University of Washington
summary Over the last several years, professionals from many different fields have come to the Human Interface Technology Laboratory (H.I.T.L.) to discover and learn about virtual environments. In general, they are impressed by their experiences and express the tremendous potential the tool has in their respective fields. But the potential is always projected far into the future, and the tool remains just a concept. This is justifiable because the quality of the visual experience is so much less than what people are used to seeing: high-definition television, breathtaking cinematographic special effects and photorealistic computer renderings. Instead, the models in virtual environments look very simple; they are made of small spaces, filled with simple or abstract-looking objects with little color distinction, seen through displays of noticeably low resolution and at an update rate which leaves much to be desired. Clearly, for most applications, the requirements of precision have not yet been met by virtual interfaces as they exist today. However, there are a few domains where the relatively low level of the technology could be perfectly appropriate. In general, these are applications which require that the information be presented in symbolic or representational form. Having studied architecture, I knew that there are moments during the early part of the design process when conceptual decisions are made which require precisely the simple and representative nature available in existing virtual environments. This was a marvelous discovery for me because I had found a viable use for virtual environments which could be immediately beneficial to architecture, my shared area of interest. It would be further beneficial to architecture in that the virtual interface equipment I would be evaluating at the H.I.T.L. happens to be relatively less expensive and more practical than other configurations such as the "Walkthrough" at the University of North Carolina. The set-up at the H.I.T.L. could be easily introduced into architectural firms because it takes up very little physical room (150 square feet) and does not require expensive and space-consuming hardware devices (such as the treadmill device for simulating walking). Now that the potential for using virtual environments in this architectural application is clear, it becomes important to verify that this tool succeeds in accurately representing space as intended. The purpose of this study is to verify that the perception of spaces is the same in both simulated and real environments. It is hoped that the findings of this study will guide and accelerate the process by which the technology makes its way into the field of architecture.
keywords Space Perception; Space (Architecture); Computer Simulation
series thesis:MSc
last changed 2003/02/12 22:37

_id a620
authors Asanowicz, Alexander
year 1991
title Unde et Quo
doi https://doi.org/10.52842/conf.ecaade.1991.x.t1s
source Experiences with CAAD in Education and Practice [eCAADe Conference Proceedings] Munich (Germany) 17-19 October 1991
summary To begin with, I would like to say a few words about the problem of the alienation of modern technologies, which we also inevitably faced when we started teaching CAD at our department. Quite often nowadays a technology becomes a fetish as a result of a lack of clear goals in the human mind. There are many technologies without a sense of purpose which have turned into pure experiments. There is always the danger of losing purposefulness and drifting toward alienation. The cause of the danger lies in forgetting the original goals while mastering and developing the technology. Eventually the original idea is ignored and a great gap appears between technical factors and creativity. We had the danger of alienation in mind when preparing the CAAD curriculum. Trying to avoid the tension between technical and creative elements, we agreed not to introduce CAD sooner than the fourth year of studies and to continue it for two semesters. One thing was clear: we should not teach the technique of CAD but how to design using a computer as a medium. Then we specified projects. The first was called "The bathroom I dream of" and was meant to be a 2D drawing. The four introductory meetings in fact taught the foundations of DOS; then a specific design followed with the help of the AutoCAD program. In the IX semester, for example, it was "A family house" (plans, facades, perspective). "I have to follow them - I am their leader," said L.J. Peter in "The Peter's Prescription". This quotation reflects exactly the situation we find ourselves in when teaching CAAD at our department. Ever-growing student interest in CAAD made us introduce changes in the curriculum. According to the popular saying, "The more one gets the more one wants", and so did we and the students feel after the first semester of teaching CAD. From autumn 1991 CAAD classes will run from the third year of studies for two consecutive years. But before further planning one major step had to be taken: we decided to reverse the approach, typical of the seventies, in which the teaching of programming languages preceded practical goals, thereby discouraging many learners.

series eCAADe
email
last changed 2022/06/07 07:50

_id 4eed
authors Benedickt, Michael (ed.)
year 1991
title Cyberspace: First Steps
source The MIT Press, Cambridge, MA and London, UK
summary Cyberspace has been defined as "an infinite artificial world where humans navigate in information-based space" and as "the ultimate computer-human interface." These original contributions take up the philosophical basis for cyberspace in virtual realities, basic communications principles, ramifications of cyberspace for future workplaces, and more.
series other
last changed 2003/04/23 15:14

_id 00bc
authors Chen, Chen-Cheng
year 1991
title Analogical and inductive reasoning in architectural design computation
source Swiss Federal Institute of Technology, ETH Zurich
summary Computer-aided architectural design technology is now a crucial tool of modern architecture, from the viewpoint of higher productivity and better products. As technologies advance, the amount of information and knowledge that designers can apply to a project is constantly increasing. This requires the development of more advanced knowledge acquisition technology to achieve higher functionality, flexibility, and more efficient performance of knowledge-based design systems in architecture. Human designers do not solve design problems from scratch; they utilize previous problem-solving episodes for similar design problems as a basis for developmental decision making. This observation leads to the starting point of this research. First, we can utilize past experience to solve a new problem by detecting the similarities between the past problem and the new problem. Second, we can identify constraints and general rules implied by those similarities and by the similar parts of similar situations. That is, by applying analogical and inductive reasoning we can advance the problem-solving process. The main objective of this research is to establish the theory that (1) the design process can be viewed as a learning process, (2) design innovation involves analogical and inductive reasoning, and (3) learning from a designer's previous design cases is necessary for the development of the next generation of knowledge-based design systems. This thesis draws upon results from several disciplines, including knowledge representation and machine learning in artificial intelligence, and knowledge acquisition in knowledge engineering, to investigate a potential design environment for future developments in computer-aided architectural design. The thesis contains three parts which correspond to the different steps of this research. Part I discusses three different ways - problem solving, learning and creativity - of generating new thoughts based on old ones. In Part II, the problem statement of the thesis is made and a conceptual model of analogical and inductive reasoning in design is proposed. In Part III, three different methods of building design systems for solving an architectural design problem are compared: rule-based, example-based, and case-based. Finally, conclusions are drawn based on the current implementation of the work, and possible future extensions of this research are described. The work reveals new approaches for knowledge acquisition, machine learning, and knowledge-based design systems in architecture.
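As an illustration of the case-based step this abstract describes (reusing the most similar past design episode as the basis for a new problem), a minimal Python sketch follows; the feature names, the cases and the similarity measure are illustrative assumptions, not the thesis system.

    # Hedged sketch: retrieve the most similar past design case by comparing
    # feature vectors, then reuse its stored solution as an analogical basis.
    cases = [  # (features of a past design problem, solution adopted there)
        ({"site_width": 12, "floors": 2, "orientation": "south"}, "courtyard scheme"),
        ({"site_width": 30, "floors": 8, "orientation": "north"}, "slab block scheme"),
        ({"site_width": 15, "floors": 3, "orientation": "south"}, "L-shaped scheme"),
    ]

    def similarity(a, b):
        """Match categorical features; penalize numeric differences."""
        score = 0.0
        for key in a:
            if isinstance(a[key], (int, float)):
                score -= abs(a[key] - b[key])   # closer numbers, higher score
            elif a[key] == b[key]:
                score += 1.0
        return score

    def retrieve(new_problem):
        """Return the stored solution of the most similar past case."""
        return max(cases, key=lambda c: similarity(new_problem, c[0]))[1]

    print(retrieve({"site_width": 14, "floors": 3, "orientation": "south"}))
    # -> 'L-shaped scheme': the closest past case is reused as a starting point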
series thesis:PhD
email
last changed 2003/05/10 05:42

_id c967
authors Fantacone, Enrico
year 1994
title Exporting CAD Teaching into Developing Countries
doi https://doi.org/10.52842/conf.ecaade.1994.x.t3s
source The Virtual Studio [Proceedings of the 12th European Conference on Education in Computer Aided Architectural Design / ISBN 0-9523687-0-6] Glasgow (Scotland) 7-10 September 1994, p. 222
summary In 1986 the Faculty of Architecture was established in Maputo. It is financed by the Italian Ministry of Foreign Affairs and managed by a Scientific Council of the Faculty of Architecture of "Università La Sapienza" of Rome. The need to create human technical resources able to work professionally as soon as they finish their studies made lab exercises and design the basis of the teaching. The new architects (the first six students graduated in 1991) need to design and make very important decisions without any control by more experienced local technical institutions. The creation of a CAAD laboratory, and the teaching of information technologies and methodologies in architectural design, aims to achieve a double goal: (-) to make the new architects able to manage, on their own, large quantities of data and difficult design problems, given the lack of qualified human resources; (-) to make the University, the most important scientific center in the country, an information exchange center between developed countries and Moçambique.
series eCAADe
last changed 2022/06/07 07:50

_id fd70
authors Goldman, Glenn and Zdepski, Michael Stephen (Eds.)
year 1991
title Reality and Virtual Reality [Conference Proceedings]
doi https://doi.org/10.52842/conf.acadia.1991
source ACADIA Conference Proceedings / ISBN 1-880250-00-4 / Los Angeles (California - USA) October 1991, 236 p.
summary During the past ten years computers in architecture have evolved from machines used for analytic and numeric calculation, to machines used for generating dynamic images, permitting the creation of photorealistic renderings, and now, in a preliminary way, permitting the simulation of virtual environments. Digital systems have evolved from increasing the speed of human operations, to providing entirely new means for creating, viewing and analyzing data. The following essays illustrate the growing spectrum of computer applications in architecture. They discuss developments in the simulation of future environments on the luminous screen and in virtual space. They investigate new methods and theories for the generation of architectural color, texture, and form. Authors address the complex technical issues of "intelligent" models and their associated analytic contents. There are attempts to categorize and make accessible architects' perceptions of various models of "reality". Much of what is presented foreshadows changes that are taking place in the areas of design theory, building sciences, architectural graphics, and computer research. The work presented is both developmental, evolving from the work done before or in other fields, and unique, exploring new themes and concepts. The application of computer technology to the practice of architecture has had a cross disciplinary effect, as computer algorithms used to generate the "unreal" environments and actors of the motion picture industry are applied to the prediction of buildings and urban landscapes not yet in existence. Buildings and places from history are archeologically "re-constructed" providing digital simulations that enable designers to study that which has previously (or never) existed. Applications of concepts from scientific visualization suggest new methods for understanding the highly interrelated aspects of the architectural sciences: structural systems, environmental control systems, building economics, etc. Simulation systems from the aerospace industry and computer media fields propose new non-physical three-dimensional worlds. Video compositing technology from the television industry and the practice of medicine are now applied to the compositing of existing environments with proposed buildings. Whether based in architectural research or practice, many authors continue to question the development of contemporary computer systems. They seek new interfaces between human and machine, new methods for simulating architectural information digitally, and new ways of conceptualizing the process of architectural design. While the practice of architecture has, of necessity, been primarily concerned with increasing productivity - and automation for improved efficiency, it is clear that university based studies and research continue to go beyond the electronic replication of manual tasks and study issues that can change the processes of architectural design - and ultimately perhaps, the products.
series ACADIA
email
more http://www.acadia.org
last changed 2022/06/07 07:49

_id 40aa
authors Heinecke, Andreas M.
year 1991
title Developing Recommendations for CAD User Interfaces Congress II: Design and Implementation of Interactive Systems: Standardization; Development of Standards
source Proceedings of the Fourth InternationalConference on Human-Computer Interaction 1991 v.1 pp. 543-547
summary The reference model for CAD systems developed by the Gesellschaft fur Informatik (GI -- the German membership organization of IFIP) is a frame for classifying the functionality of CAD systems. Whereas the reference model regards the user interface as one of several modules of the CAD system, the user interface appears to the user as being the whole system. This is the reason why an additional task working group on CAD user interfaces has been established by the GI in order to develop recommendations for the design of CAD user interfaces. Proceeding and preliminary results of the working group are described.
keywords Design of User Interfaces; Guidelines; Standardization
series other
last changed 2002/07/07 16:01

_id diss_hensen
id diss_hensen
authors Hensen, J.L.M.
year 1991
title On the Thermal Interaction of Building Structure and Heating and Ventilating System
source Eindhoven University of Technology
summary In this dissertation, developments in the field of building performance evaluation tools are described. The subject of these tools is the thermal interaction of building structure and heating and ventilating system. The employed technique is computer simulation of the integrated, dynamic system comprising the occupants, the building and its heating and ventilating system. With respect to buildings and the heating and ventilating systems which service them, the practical objective is ensuring thermal comfort while using an optimum amount of fuel. While defining the optimum had to be left to other workers, the issue of thermal comfort is addressed here. The conventional theory of thermal comfort in conditions characteristic of dwellings and offices assumes steady-state conditions. Yet thermal conditions in buildings are seldom steady, due to the thermal interaction between building structure, climate, occupancy, and auxiliary systems. A literature review is presented regarding work on thermal comfort specifically undertaken to examine what fluctuations in indoor climate may be acceptable. From the results, assessment criteria are defined. Although its potential reaches beyond the area of Computer Aided Building Design, a description is given of building and plant energy simulation within the context of the CABD field of technology. Following an account of the present state of the art, the choice to start from an existing energy simulation environment (ESPR) is justified. The main development areas of this software platform - within the present context - are identified as: fluid flow simulation, plant simulation, and their integration with the building side of the overall problem domain. In the field of fluid flow simulation, a fluid flow network simulation module is described. The module is based on the mass balance approach, and may be operated either in stand-alone mode or from within the integrated building and plant energy simulation system. The program is capable of predicting pressures and mass flows in a user-defined building / plant network comprising nodes (ie building zones, plant components, etc) and connections (ie air leakages, fans, pipes, ducts, etc), when subjected to flow control (eg thermostatic valves) and / or to transient boundary conditions (eg due to wind). The modelling and simulation techniques employed to predict the dynamic behaviour of the heating and ventilating system are elaborated. The simultaneous approach to the plant and its associated control is described. The present work involved extensions to the ESPR energy simulation environment with respect to robustness of the program, and with respect to additional plant simulation features, supported plant component models and control features. The coupling of fluid flow, plant-side energy and mass, and building-side energy simulation into one integrated program is described. It is this "modular-simultaneous" technique for the simulation of combined heat and fluid flow in a building / plant context which enables an integral approach to the thermal interaction of building structure and heating and ventilating system.

A multi-stage verification and validation methodology is described, and its applicability to the present work is demonstrated by a number of examples addressing each successive step of the methodology. A number of imaginary and real-world case studies are described to demonstrate application of the present work both in a modelling-oriented context and in a building engineering context. The general conclusions of the present work are then summarized. Finally, recommendations are made for possible future work in the areas of theory, user interface, software structure, application, and technology transfer.
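As an illustration of the mass balance approach described above (unknown node pressures adjusted until the net mass flow into every internal node is zero), a minimal Python sketch follows; it is not the ESPR code, and the node names, flow coefficients and exponents are illustrative assumptions.

    # Hedged sketch of a nodal mass-balance flow network solver.
    import numpy as np
    from scipy.optimize import fsolve

    # connection list: (from_node, to_node, coefficient C, exponent n)
    # power-law flow model: m = C * sign(dP) * |dP|**n
    connections = [("windward", "zone_a", 0.02, 0.65),   # facade crack
                   ("zone_a",  "zone_b", 0.01, 0.50),    # internal door gap
                   ("zone_b",  "leeward", 0.02, 0.65)]   # facade crack

    fixed = {"windward": 5.0, "leeward": 0.0}   # wind-induced boundary pressures [Pa]
    unknown = ["zone_a", "zone_b"]              # internal nodes to solve for

    def net_inflow(x):
        """Residual: net mass flow into each unknown node (zero at the solution)."""
        p = dict(fixed, **dict(zip(unknown, x)))
        r = dict.fromkeys(unknown, 0.0)
        for a, b, C, n in connections:
            dp = p[a] - p[b]
            m = C * np.sign(dp) * abs(dp) ** n
            if a in r: r[a] -= m      # flow leaves node a
            if b in r: r[b] += m      # flow enters node b
        return [r[k] for k in unknown]

    pressures = fsolve(net_inflow, x0=[4.0, 1.0])
    print(dict(zip(unknown, pressures)))   # roughly {'zone_a': 4.1, 'zone_b': 0.9}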

series thesis:PhD
last changed 2003/12/15 14:43

_id 8951
authors Hix, Deborah and Schulman, Robert S.
year 1991
title Human-Computer Interface Development Tools : A Methodology for Their Evaluation
source Communications of the ACM. March, 1991. vol. 34: pp. 75-87 : tables. includes bibliography
summary A comprehensive checklist-based methodology produces quantifiable criteria for evaluating and comparing human-computer interface development tools along two dimensions: functionality and usability. An empirical evaluation shows that the methodology, which is in use in several corporate interface development environments, produces reliable (consistent) results. This research provides a communication mechanism for tool researchers, tool practitioners, and tool users for making coherent critiques of their own and other tools. The authors' goal was to provide a rigorous, trusted methodology for evaluating human-computer interface development tools.
keywords tools, methodology, user interface, evaluation
series CADline
last changed 2003/06/02 10:24

_id a6d4
authors Krueger, Myron W.
year 1991
title Artificial Reality II
source Reading, Massachusetts: Addison-Wesley Publishing. 2nd.ed.
summary The focus of Myron Krueger in Artificial Reality II is the interaction between humans and machines, both in the immediate interface and the associated cultural relationships. He uses the concept of artificial reality as a medium of experience and as a tool to examine the relationships between people and machines. When he first coined the term in the mid-1970s, his 'goal was full-body participation in computer events that were so compelling that they would be accepted as real experience.' He wanted to create an artificial reality that would perceive human actions in a simulated world of sight, sounds, and other sensations and would make the experience of this illusion convincing. His focus was to create unencumbered, artificial realities where the humans could participate with their entire body without wearing any special instruments (be they sensors or displays) in an experience created by the computer. The environment could be controlled by preexisting programs, or could have operators intervene and use the computer to amplify their ability to interact with people. The intention was not to reproduce conventional reality but to create synthetic realities.
series other
last changed 2003/04/23 15:14

_id e7fb
authors Leclercq, Pierre
year 1991
title Students in Efficient Energy Management
doi https://doi.org/10.52842/conf.ecaade.1991.x.e7o
source Experiences with CAAD in Education and Practice [eCAADe Conference Proceedings] Munich (Germany) 17-19 October 1991
summary The LEMA presents Strategy II, the new version of its CAL software for the thermal design of buildings. Based on its latest experience with the first prototypes, the present programme provides a complete human interface and interesting tools for decision making. A first educational experience with this software is described. Strategy II was studied in 1990 by two twin teams: one is the LEMA (Laboratoire d'Etudes Méthodologiques Architecturales) and the other is the CTE (Centre des Technologies de l'Education), both part of the University of Liège, Belgium.

series eCAADe
last changed 2022/06/07 07:50

_id ga0010
id ga0010
authors Moroni, A., Zuben, F. Von and Manzolli, J.
year 2000
title ArTbitrariness in Music
source International Conference on Generative Art
summary Evolution is now considered not only powerful enough to bring about biological entities as complex as humans and consciousness, but also useful in simulation to create algorithms and structures of higher levels of complexity than could easily be built by design. In the context of artistic domains, the process of human-machine interaction is analyzed as a good framework in which to explore creativity and to produce results that could not be obtained without this interaction. When evolutionary computation and other computational intelligence methodologies are involved, we denote every attempt to improve aesthetic judgement as ArTbitrariness, interpreted as an interactive, iterative optimization process. ArTbitrariness is also suggested as an effective way to produce art through an efficient manipulation of information and a proper use of computational creativity to increase the complexity of the results without neglecting the aesthetic aspects [Moroni et al., 2000]. Our emphasis is on an approach to interactive music composition. The problem of computer generation of musical material has received extensive attention, and a subclass of the field of algorithmic composition includes those applications which use the computer as something in between an instrument, which a user "plays" through the application's interface, and a compositional aid, which a user experiments with in order to generate stimulating and varied musical material. This approach was adopted in Vox Populi, a hybrid made up of an instrument and a compositional environment. Differently from other systems in genetic algorithms or evolutionary computation, in which people have to listen to and judge the musical items, Vox Populi uses the computer and the mouse as real-time music controllers, acting as a new interactive computer-based musical instrument. The interface is designed to be flexible, letting the user modify the music being generated. It explores evolutionary computation in the context of algorithmic composition and provides a graphical interface that allows the tonal center and the voice range to be modified, changing the evolution of the music by using the mouse [Moroni et al., 1999]. A piece of music consists of several sets of musical material manipulated and exposed to the listener, for example pitches, harmonies, rhythms, timbres, etc. They are composed of a finite number of elements and, basically, the aim of a composer is to organize those elements in an aesthetic way. Modeling a piece as a dynamic system implies a view in which the composer draws trajectories or orbits using the elements of each set [Manzolli, 1991]. Nonlinear iterative mappings are associated with interface controls. On the next page, two examples of nonlinear iterative mappings with their resulting musical pieces are shown. The mappings may give rise to attractors, defined as geometric figures that represent the set of stationary states of a non-linear dynamic system, or simply trajectories to which the system is attracted. The relevance of this approach goes beyond music applications per se. Computer music systems that are built on the basis of a solid theory can be coherently embedded into multimedia environments. The richness and specialty of the music domain are likely to initiate new thinking and ideas, which will have an impact on areas such as knowledge representation and planning, and on the design of visual formalisms and human-computer interfaces in general.
Above and below, the Vox Populi interface is depicted, showing two nonlinear iterative mappings with their resulting musical pieces. References: [Manzolli, 1991] J. Manzolli. Harmonic Strange Attractors, CEM BULLETIN, Vol. 2, No. 2, 4-7, 1991. [Moroni et al., 1999] Moroni, J. Manzolli, F. Von Zuben, R. Gudwin. Evolutionary Computation applied to Algorithmic Composition, Proceedings of CEC99 - IEEE International Conference on Evolutionary Computation, Washington D.C., p. 807-811, 1999. [Moroni et al., 2000] Moroni, A., Von Zuben, F. and Manzolli, J. ArTbitration, Las Vegas, USA: Proceedings of the 2000 Genetic and Evolutionary Computation Conference Workshop Program - GECCO, 143-145, 2000.
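As an illustration of the idea of driving musical material with a nonlinear iterative mapping, a minimal Python sketch follows; it is an assumption for illustration, not the Vox Populi code, and the parameter names (r, tonal_center, voice_range) are invented for the example.

    # Hedged sketch: a logistic map produces a trajectory that is quantized
    # onto a pitch range around a tonal center, giving a short melodic orbit.
    def logistic_trajectory(x0=0.31, r=3.83, steps=32):
        """Iterate x -> r*x*(1-x); r near 3.83 lies in a period-3 window."""
        x, out = x0, []
        for _ in range(steps):
            x = r * x * (1 - x)
            out.append(x)
        return out

    def to_pitches(trajectory, tonal_center=60, voice_range=12):
        """Map values in (0,1) to MIDI note numbers around the tonal center."""
        return [int(round(tonal_center - voice_range / 2 + v * voice_range))
                for v in trajectory]

    melody = to_pitches(logistic_trajectory())
    print(melody)   # after a short transient, the orbit settles into a 3-note cycle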
series other
email
more http://www.generativeart.com/
last changed 2003/08/07 17:25

_id 2abf
id 2abf
authors Rafi, A
year 2001
title Design creativity in emerging technologies
source In Von, H., Stocker, G. and Schopf, C. (Eds.), Takeover: Who’s doing art of tomorrow (pp. 41-54), New York: SpringerWein.
summary Human creativity works best when there are constraints – pressures to react to, to shape, to suggest. People are generally not very good at making it all up from scratch (Laurel, 1991). Emerging technology, particularly virtual reality (VR), multimedia and the Internet, is yet to be fully explored, as it allows unprecedented creative talent, ability, skill sets, creative thinking, representation, exploration, observation and reference. In an effort to deliver interactive content, designers tend to borrow freely from different fields such as advertising, medicine, games, fine art, commerce, entertainment, edutainment, film-making and architecture (Rafi, Kamarulzaman, Fauzan and Karboulonis, 2000). As a result, content becomes a base onto which developers transfer the techniques of conventional design media to the computer. What developers (e.g. artists and technologists) often miss is the need to develop emerging-technology content based on the nature of the medium. In this context, the user will be the best judge of the effectiveness of the content.

The paper will introduce the Global Information Infrastructure (GII) that is currently being developed in the Asian region and discuss its impact on Information Age society. It will further highlight the ‘natural’ value and characteristics of the emerging technologies, in particular Virtual Reality (VR), multimedia and the Internet, as guidance for designing effective, rich and innovative content. This paper also argues that content designers of the future must not only be both artist and technologist, but artists and technologists who are aware of the re-convergence of art and science and of the context in which content is being developed. Some of our explorations at the Faculty of Creative Multimedia, Multimedia University will also be demonstrated. It is hoped that this will serve as evidence to guide future ‘techno-creative designers’.

keywords design, creativity, content, emerging technologies
series book
type normal paper
email
last changed 2007/09/13 03:46

_id 8847
authors Richens, P.
year 1991
title 3D Interface
source Human Interfaces for Design and Visualisation, British Computer Society, London
series other
email
more http://www.arct.cam.ac.uk/research/pubs/
last changed 2000/03/13 20:32

_id f14c
authors Sariyildiz, Sevil
year 1991
title Conceptual Design by Means of Islamic-Geometric-Patterns within a CAAD-Environment
source Delft University of Technology
summary The starting point of this research was to develop a 3D grammar theory on top of existing 2D Islamic geometric patterns, trying to recover their fundamental geometric content so that it can be applied in contemporary architecture without compromising any architectural style. As is self-evident, the architectural design process consists of clearly distinct stages, namely conceptual design, materialisation and further completion. At the conceptual stage, the innovative item of the research deals with pattern grammars for 3D complex geometrical patterns, considering them as polyhedra and polytopes, for use as an underlayer to a concept design, just as architects conventionally use 2D rectangular and triangular grids. Handling these complex 3D patterns requires a special environment, which is possible with CAAD. Within the CAAD environment, the handling of these complex patterns is easily done by means of 3D tools, because the 3D tools permit the user to make any possible manipulation and geometrical transformation more easily in space. Geometrical patterns have received some attention from scholars over the last 50 years. The most complex geometrical patterns are highly developed in Islamic architecture, because the Muslim religion forbids the use of portraits or sculptures of human beings in religious buildings. All these approaches to complex patterns have been analysed and studied as 2D elements. The question was how to consider them in three dimensions and use them, instead of 2D underlayers, as 3D underlayers in the conceptual phase of CAAD design. A pattern grammar is a generally employable aid (underlying pattern) for conceptual and material designs. On the basis of rules of symmetry and substitution, ordering principles have been worked out which can be used for formal design methods as well as detailing systems (e.g. modular coordination). Through the realization of a pattern grammar, a wider range of underlying patterns can be offered and a choice among these can be made in a more fundamental manner. At a subsequent stage the collection of "empty boxes" can be filled with (architectural) elements in such a way that another option is created between filling the boxes completely, filling them partly, or filling them so that they overflow. It is self-evident that underlying patterns can also be used for details and decoration in a design. Concerning the materialisation of the concept design within the 3D CAAD environment, substitution methods have been partially developed. Further theoretical developments concerning the materialisation phase, constantly backed up through feedback with specialist matters (e.g. by means of expert systems and decision-support systems), must be worked out. As feedback on the research, the possibilities of designing with 3D patterns have been tested and the procedures are explained. (*) Working with 3D patterns gives a designer more inspiration to develop new ideas and new concepts, and gives the opportunity to handle complexity. (*) The formal, structural and symmetrical qualities of geometrical patterns have a positive influence on the industrialisation of building components. (*) Working with 3D tools able to handle complex geometry means that, because of the accuracy of the information, hardly any mistakes were made during the preparation and assembly of the building components. This also has positive results concerning the financial aspects of the building process.
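As an illustration of generating an underlying pattern from rules of symmetry, a minimal 2D Python sketch follows; it is an assumption for illustration, not the grammar developed in the thesis, and the motif points and cell size are invented.

    # Hedged sketch: replicate a small motif under the eight symmetries of a
    # square cell, then repeat the cell by translation to obtain a pattern.
    import numpy as np

    motif = np.array([[0.10, 0.25], [0.30, 0.40], [0.45, 0.15]])  # arbitrary seed points

    def square_symmetries(points):
        """Apply the dihedral group of the unit square (4 rotations x mirror)."""
        out, p = [], points - 0.5              # transform about the cell centre
        for k in range(4):                     # 0, 90, 180, 270 degree rotations
            c, s = np.cos(k * np.pi / 2), np.sin(k * np.pi / 2)
            rot = p @ np.array([[c, -s], [s, c]]).T
            out.append(rot)
            out.append(rot * [1, -1])          # mirror across the horizontal axis
        return np.vstack(out) + 0.5

    def tile(cell_points, nx=3, ny=3):
        """Repeat the symmetric cell by translation to form the pattern."""
        return np.vstack([cell_points + [i, j] for i in range(nx) for j in range(ny)])

    pattern = tile(square_symmetries(motif))
    print(pattern.shape)   # (216, 2): 3 motif points x 8 symmetries x 9 cells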
series thesis:PhD
email
last changed 2003/02/12 22:37

_id b5be
authors Stok, Leon
year 1991
title Architectural synthesis and optimization of digital systems
source Eindhoven University of Technology
summary High-level synthesis means going from a functional specification of a digital system at the algorithmic level to a register-transfer-level structure. Different applications will ask for different design styles. Despite this diversity in design styles, many tasks in the synthesis will be similar. There is no need to write a new synthesis system for each design style. The best way forward seems to be a decomposition of the high-level synthesis problem into several well-defined subproblems. How the problem is decomposed depends heavily on a) the type of network architecture chosen, b) the constraints applied to the design and c) the functional description itself. From this architecture style, the constraints and the functional description, a synthesis scheme can be derived. Once this scheme is fixed, algorithms can be chosen which fit into this scheme and solve the subproblems in a fast and, where possible, optimal way. To support such a synthesis philosophy, a framework is needed in which all design information can be stored in a unique way during the various phases of the design process. This asks for a design database capable of handling all design information, with a formally defined interface to all design tools. This thesis gives a formal way to describe the functional representation, the register-transfer-level structure, the controller, and the relations between all three of them. Special attention has been paid to the efficient representation of mutually exclusive operations and array accesses. The scheduling and allocation problems are defined as mappings between these formal representations. Both the existing synthesis algorithms and the new algorithms described in this thesis fit into this framework. Three new allocation algorithms are presented in this thesis: an algorithm for optimal register allocation in cyclic data flow graphs, an exact polynomial algorithm for module allocation, and a new scheme to minimize the number of interconnections during all stages of the data path allocation. Cyclic data flow graphs result from high-level behavioral descriptions that contain loops. Algorithms for register allocation in high-level synthesis published up till now only considered loop-free data flow graphs. When these algorithms are applied to data flow graphs with loops, unnecessary register transfer operations are introduced. A new algorithm is presented that performs a minimal register allocation and eliminates all superfluous register transfer operations. The problem is reformulated as a multicommodity network flow problem, for which very efficient solutions exist. Experiments on a benchmark set have shown that in all test cases all register transfers could be eliminated at no increase in register cost. Only heuristic algorithms have appeared in the literature for the module allocation problem. The module allocation problem is usually defined as a clique cover problem on a so-called module allocation graph. It is shown that, under certain conditions, the module allocation graph belongs to the special class of comparability graphs. A polynomial-time algorithm can optimally find a clique cover of such a graph. Even when interconnect weights are taken into account, this can be solved exactly. The problem can be transformed into a maximal-cost network flow problem, which can be solved exactly in polynomial time.
An algorithm is described which solves the module allocation problem with interconnect weights exactly, with a complexity of O(kn^2), where n is the number of operations. In previous research, interconnection was optimized after the module allocation for the operations and the register allocation for the variables had already been done. However, the amount of multiplexing and interconnect are crucial factors for both the delay and the area of a circuit. A new scheme is presented to minimize the number of interconnections during the data path allocation. This scheme first groups all values based on their read and write times. Values belonging to the same group can share a register file. This minimizes the number of data transfers with different sources and destinations. Secondly, registers are allocated for each group separately. Finally, the interconnect allocation is done. During the interconnect allocation, the module allocation is determined. The value grouping is based on edge coloring algorithms, providing a sharp upper bound on the number of colors needed. Two techniques, splitting the read and write phases of values and introducing serial (re-)write operations for the same value, make it possible to use even more efficient exact edge coloring algorithms. It is shown that when variables are grouped into register files and operations are assigned to modules during the interconnection minimization, significant savings (20%) can be obtained in the number of local interconnections and the amount of global interconnect, at the expense of only slightly more register area.
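As background to the register allocation problem discussed in this abstract, a minimal Python sketch of the simpler loop-free setting follows (a greedy, left-edge-style assignment of values to registers); it is an illustrative assumption, not one of the exact cyclic-graph, multicommodity-flow or clique-cover algorithms of the thesis, and the lifetimes are invented.

    # Hedged sketch: values with overlapping lifetimes must occupy different
    # registers; sorting by start time and reusing freed registers is optimal
    # for loop-free (interval) lifetimes.
    lifetimes = [("a", 0, 3), ("b", 1, 2), ("c", 2, 5), ("d", 4, 6), ("e", 3, 4)]

    def allocate_registers(lifetimes):
        """Map each value name to a register index (lifetimes are inclusive)."""
        busy_until = []                   # busy_until[i]: last step register i is occupied
        assignment = {}
        for name, start, end in sorted(lifetimes, key=lambda t: t[1]):
            for i, free_after in enumerate(busy_until):
                if free_after < start:    # register i is free again: reuse it
                    busy_until[i] = end
                    assignment[name] = i
                    break
            else:                         # no free register: allocate a new one
                busy_until.append(end)
                assignment[name] = len(busy_until) - 1
        return assignment

    print(allocate_registers(lifetimes))
    # -> {'a': 0, 'b': 1, 'c': 2, 'e': 1, 'd': 0}: three registers suffice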
keywords Digital Systems; Digital Systems
series thesis:PhD
email
last changed 2003/02/12 22:37
