CumInCAD is a cumulative index of publications in Computer Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD Futures

Hits 1 to 20 of 175

_id ecaade2023_281
id ecaade2023_281
authors Prokop, Šimon, Kubalík, Jiří and Kurilla, Lukáš
year 2023
title Neural Networks for Estimating Wind Pressure on Complex Double-Curved Facades
source Dokonal, W, Hirschberg, U and Wurzer, G (eds.), Digital Design Reconsidered - Proceedings of the 41st Conference on Education and Research in Computer Aided Architectural Design in Europe (eCAADe 2023) - Volume 2, Graz, 20-22 September 2023, pp. 639–647
doi https://doi.org/10.52842/conf.ecaade.2023.2.639
summary Due to their complex geometry, it is challenging to assess wind effects on freeform, double-curved building facades. The traditional building code EN 1991-1-4 (730035) only accounts for basic shapes such as cubes, spheres, and cylinders. Moreover, even though wind tunnel measurements are considered to be more precise than other methods, they are still limited by the number of measurement points that can be taken. This limitation, combined with the time and resources required for the analysis, can limit the ability to fully capture detailed wind effects on the whole complex freeform shape of the building. In this study, we propose the use of neural network models trained to predict wind pressure on complex double-curved facades. The neural network is a powerful data-driven machine learning technique that can, in theory, learn an approximation of any function from data, making it well suited for this application. Our approach was empirically evaluated using a set of 31 points measured in the wind tunnel on a 1:300-scale 3D-printed model of the real architectural design of a concert hall in Ostrava. The results of this evaluation demonstrate the effectiveness of our neural network method in estimating wind pressures on complex freeform facades.
keywords wind pressure, double-curved façade, neural network
series eCAADe
email
last changed 2023/12/10 10:49
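
A minimal, illustrative sketch of the kind of model the record above describes: a small feed-forward regressor fitted to scattered pressure measurements and then queried at arbitrary facade points. The feature set (point coordinates plus wind direction), the synthetic data and the scikit-learn MLPRegressor are assumptions added here for illustration; the paper's actual inputs, architecture and the 31 wind-tunnel samples are its own.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for pressure-tap measurements on a double-curved facade:
# features = (x, y, z, wind_direction_deg), target = pressure coefficient c_p.
X = rng.uniform([-1, -1, 0, 0], [1, 1, 1, 360], size=(310, 4))
y = 0.8 * np.cos(np.radians(X[:, 3])) - 0.5 * X[:, 2] + 0.1 * rng.normal(size=310)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Small multilayer perceptron regressor; hyperparameters are illustrative only.
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=5000, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out points:", round(model.score(X_test, y_test), 3))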

_id 00bc
authors Chen, Chen-Cheng
year 1991
title Analogical and inductive reasoning in architectural design computation
source Swiss Federal Institute of Technology, ETH Zurich
summary Computer-aided architectural design technology is now a crucial tool of modern architecture, from the viewpoint of higher productivity and better products. As technologies advance, the amount of information and knowledge that designers can apply to a project is constantly increasing. This requires the development of more advanced knowledge acquisition technology to achieve higher functionality, flexibility, and more efficient performance of knowledge-based design systems in architecture. Human designers do not solve design problems from scratch; they utilize previous problem-solving episodes for similar design problems as a basis for developmental decision making. This observation leads to the starting point of this research: First, we can utilize past experience to solve a new problem by detecting the similarities between the past problem and the new problem. Second, we can identify constraints and general rules implied by those similarities and the similar parts of similar situations. That is, by applying analogical and inductive reasoning we can advance the problem-solving process. The main objective of this research is to establish the theory that (1) the design process can be viewed as a learning process, (2) design innovation involves analogical and inductive reasoning, and (3) learning from a designer's previous design cases is necessary for the development of the next generation of knowledge-based design systems. This thesis draws upon results from several disciplines, including knowledge representation and machine learning in artificial intelligence, and knowledge acquisition in knowledge engineering, to investigate a potential design environment for future developments in computer-aided architectural design. The thesis contains three parts, which correspond to the different steps of this research. Part I discusses three different ways - problem solving, learning and creativity - of generating new thoughts based on old ones. In Part II, the problem statement of the thesis is made and a conceptual model of analogical and inductive reasoning in design is proposed. In Part III, three different methods of building design systems for solving an architectural design problem are compared: rule-based, example-based, and case-based. Finally, conclusions are drawn based on the current implementation of the work, and possible future extensions of this research are described. The work reveals new approaches for knowledge acquisition, machine learning, and knowledge-based design systems in architecture.
series thesis:PhD
email
last changed 2003/05/10 05:42
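
A hedged sketch of the retrieval step that underlies the analogical reasoning argued for in the thesis above: given a new design problem, find the most similar previously solved case. The attribute vocabulary, the similarity measure and the sample cases are invented for illustration; they are not Chen's encoding.

from dataclasses import dataclass

@dataclass
class DesignCase:
    name: str
    attributes: dict   # e.g. {"use": ..., "floors": ..., "climate": ...}
    solution: str = ""

def similarity(a: dict, b: dict) -> float:
    """Fraction of shared attribute keys whose values match."""
    shared = set(a) & set(b)
    return sum(a[k] == b[k] for k in shared) / len(shared) if shared else 0.0

def retrieve(problem: dict, case_base: list) -> DesignCase:
    """Return the past case most similar to the new problem."""
    return max(case_base, key=lambda c: similarity(problem, c.attributes))

case_base = [
    DesignCase("library-A", {"use": "library", "floors": 3, "climate": "temperate"},
               "atrium with perimeter stacks"),
    DesignCase("school-B", {"use": "school", "floors": 2, "climate": "temperate"},
               "courtyard plan"),
]
new_problem = {"use": "library", "floors": 4, "climate": "temperate"}
print("Most analogous case:", retrieve(new_problem, case_base).name)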

_id ca50
authors Ayrle, Hartmut
year 1991
title XNET2 - Methodical Design of Local Area Networks in Buildings - An Application of the A4 Intelligent Design Tool
source Computer Aided Architectural Design Futures: Education, Research, Applications [CAAD Futures ‘91 Conference Proceedings / ISBN 3-528-08821-4] Zürich (Switzerland), July 1991, pp. 443-450
summary XNET2 is a prototype program that helps network planners design Ethernet-conformant data networks for sites and buildings. It is implemented as an example application of the ARMILLA4 Intelligent Design Tool under Knowledge Craft. It is based on a knowledge acquisition phase with experts from DECsite, the network branch of DEC. The ARMILLA Design Tool is developed on the basis of Fritz Haller's ARMILLA, a set of geometrical and operational rules for the integration of technical ductwork into a building's construction.
series CAAD Futures
last changed 2003/11/21 15:16

_id 22d6
authors Ballheim, F. and Leppert, J.
year 1991
title Architecture with Machines, Principles and Examples of CAAD-Education at the Technische Universität München
source Experiences with CAAD in Education and Practice [eCAADe Conference Proceedings] Munich (Germany) 17-19 October 1991
doi https://doi.org/10.52842/conf.ecaade.1991.x.h3w
summary "Design tools affect the results of the design process" - this is the starting point of our considerations about the efficient use of CAAD within architecture. To give you a short overview about what we want to say with this thesis lets have a short - an surely incomplete - trip through the fourth dimension back into the early time of civil engineering. As CAD in our faculty is integrated in the "Lehrstuhl für Hochbaustatik und Tragwerksplanung" (if we try to say it in English it would approximately be "institute of structural design"), we chose an example we are very familiar with because of its mathematical background - the cone sections: Circle, ellipse, parabola and hyperbola. If we start our trip two thousand years ago we only find the circle - or in very few cases the ellipse - in their use for the ground plan of greek or roman theaters - if you think of Greek amphitheaters or the Colosseum in Rome - or for the design of the cross section of a building - for example the Pantheon, roman aqueducts or bridges. With the rediscovery of the perspective during the Renaissance the handling of the ellipse was brought to perfection. May be the most famous example is the Capitol in Rome designed by Michelangelo Buonarotti with its elliptical ground plan that looks like a circle if the visitor comes up the famous stairway. During the following centuries - caused by the further development of the natural sciences and the use of new construction materials, i.e. cast-iron, steel or concrete - new design ideas could be realized. With the growing influence of mathematics on the design of buildings we got the division into two professions: Civil engineering and architecture. To the regret of the architects the most innovative constructions were designed by civil engineers, e.g. the early iron bridges in Britain or the famous bridges of Robert Maillard. Nowadays we are in the situation that we try to reintegrate the divided professions. We will return to that point later discussing possible solutions of this problem. But let us continue our 'historical survey demonstrating the state of the art we have today. As the logical consequence of the parabolic and hyperbolic arcs the hyperbolic parabolic shells were developed using traditional design techniques like models and orthogonal sections. Now we reach the point where the question comes up whether complex structures can be completely described by using traditional methods. A question that can be answered by "no" if we take the final step to the completely irregular geometry of cable- net-constructions or deconstructivistic designs. What we see - and what seems to support our thesis of the connection between design tools and the results of the design process - is, that on the one hand new tools enabled the designer to realize new ideas and on the other hand new ideas affected the development of new tools to realize them.

series eCAADe
more http://www.mediatecture.at/ecaade/91/ballheim_leppert.pdf
last changed 2022/06/07 07:50

_id c1ca
authors Daru, Roel
year 1991
title Sketch as Sketch Can - Design Sketching with Imperfect Aids and Sketchpads of the Future
source Experiences with CAAD in Education and Practice [eCAADe Conference Proceedings] Munich (Germany) 17-19 October 1991
doi https://doi.org/10.52842/conf.ecaade.1991.x.k1t
summary Sketching plays a manifold role in design and design education now as much as it did in the computerless days. Design sketching is indispensable during the early phases of the architectural design process. But if we ask architects and design educators alike what they are doing with computers, idea sketching is the least mentioned answer, if not left out entirely. It is not because they are computer illiterates, as the computer industry would tend to imply, but because their computers do not offer an adequate environment for design sketching. In education this means that those trying to create computer-aided design sketching courses are confronted with the choice of either working with imperfect tools or waiting for better tools. But by exploring the possibilities of available surrogates we will build the necessary experience for specifying what is really useful for idea sketching. Without such exercises, we will never go beyond the electronic metaphor of the sketchbook with pencil or marker.

series eCAADe
email
last changed 2022/06/07 07:50

_id a113
authors Milne, Murray
year 1991
title Design Tools: Future Design Environments for Visualizing Building Performance
source Computer Aided Architectural Design Futures: Education, Research, Applications [CAAD Futures ‘91 Conference Proceedings / ISBN 3-528-08821-4] Zürich (Switzerland), July 1991, pp. 485-496
summary In the future of Computer Aided Architectural Design (CAAD), architects clearly need more than just computer aided design and drafting systems (CAD). Unquestionably CAD systems continue to become increasingly powerful, but there is more to designing a good building than its three-dimensional existence, especially in the eyes of all the non-architects of the world: users, owners, contractors, regulators, environmentalists. The ultimate measure of a building's quality has something to do with how well it behaves over time. Predictions about its performance have many different dimensions: how much it costs to build, to operate, and to demolish; how comfortable it is; how effectively people can perform their functions in it; how much energy it uses or wastes. Every year dozens of building performance simulation programs are being written that can predict performance over time along any of these dimensions. That is why the need for both CAD systems and performance predictors can be taken for granted, and why instead it may be more interesting to speculate about the need for 'design tools'. A design tool can be defined as a piece of software that is easy and natural for architects to use, that easily accommodates three-dimensional representations of the building, and that predicts something useful about a building's performance. There are at least five different components of design tools that will be needed for the design environment of the future.
series CAAD Futures
email
last changed 2003/05/16 20:58

_id 2a0e
authors Jacobs, Stephen Paul
year 1991
title The CAD Design Studio: 3D modeling as a fundamental design skill
source McGraw-Hill, New York
summary Until now, books on CAD aimed at architects have addressed the use of computer-aided design and drafting as a recording tool, a faster means of producing and storing finished working drawings, and not as an adjunct creative tool for the design process. Without being software-specific, this book guides the professional and student architect and graphic designer in how to use the computer as an electronic modelling tool, exploring graphic and geometric forms and systems with the freedom and speed of the computer. The reader is led through a progression of design exercises and design problems, learning how to come up with multiple solutions to a given program. Beautifully illustrated throughout, including 10 four-color CAD drawings.
series other
last changed 2003/04/23 15:14

_id diss_kuo
id diss_kuo
authors Kuo, C.J.
year 1999
title Unsupervised Dynamic Concurrent Computer-Aided Design Assistant
source Los Angeles: UCLA
summary The increasing capability of computer-aided architectural design systems has strengthened the role that the computer plays in the workplace. Due to the complexity of developing new techniques and research, these systems are undertaken mostly by scientists and engineers without significant architectural input (Willey, 1991). The design concept of these systems may be based on a well-defined and well-understood process, which is not yet realized in architectural design (Galle, 1994). The output of such research may not be easily adapted into the design process. Most of the techniques assume a complete understanding of the design space (Gero and Maher, 1987) (Willey, 1991). The description or construction of the design space is always time- and space-consuming, and the result can never be complete due to the ever-changing nature of architectural design. This research intends to initiate a solution for the above problems. The proposed system is an unsupervised-dynamic-concurrent-computer-aided-design assistant. “Unsupervised” means the learning process is not supervised by the user, because it is against the designer's nature to “think aloud” in the design studio and it also increases the workload. It is dynamic because the size of the knowledge base is constantly changing. Concurrent means that there are multiple procedures active simultaneously. This research focuses on learning operational knowledge from an individual designer and reapplying it in future designs. A computer system for this experiment is constructed. It is capable of The preliminary result shows positive feedback from test subjects. The purpose of this research is to suggest a potent computational frame within which future developments may flourish.
series thesis:PhD
last changed 2003/11/28 07:37

_id 7508
authors Montgomery, D.C.
year 1991
title Design and Analysis of Experiments
source John Wiley, Chichester
summary Learn how to achieve optimal industrial experimentation. Through four editions, Douglas Montgomery has provided statisticians, engineers, scientists, and managers with the most effective approach for learning how to design, conduct, and analyze experiments that optimize performance in products and processes. Now, in this fully revised and enhanced Fifth Edition, Montgomery has improved his best-selling text by focusing even more sharply on factorial and fractional factorial design and presenting new analysis techniques (including the generalized linear model). There is also expanded coverage of experiments with random factors, response surface methods, experiments with mixtures, and methods for process robustness studies. The book also illustrates two of today's most powerful software tools for experimental design: Design-Expert and Minitab. Throughout the text, you'll find output from these two programs, along with detailed discussion on how computers are currently used in the analysis and design of experiments. You'll also learn how to use statistically designed experiments to: obtain information for characterization and optimization of systems; improve manufacturing processes; design and develop new processes and products; evaluate material alternatives in product design; improve the field performance, reliability, and manufacturing aspects of products; and conduct experiments effectively and efficiently. Other important textbook features: a student version of the Design-Expert software is available, and a web site (www.wiley.com/college/montgomery) offers supplemental text material for each chapter, a sample syllabus, and sample student projects from the author's Design of Experiments course at Arizona State University.
series other
last changed 2003/04/23 15:14
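
A brief sketch of the most elementary technique from the book above: a 2^k full factorial design in coded units and the estimation of main effects from the responses. The three factor names and the simulated response are invented for illustration.

from itertools import product
import numpy as np

factors = ["temperature", "pressure", "catalyst"]
design = np.array(list(product([-1, 1], repeat=len(factors))))  # 8 runs in coded units

rng = np.random.default_rng(1)
# Simulated response: temperature and catalyst matter, pressure does not.
response = 10 + 3 * design[:, 0] + 1.5 * design[:, 2] + rng.normal(scale=0.2, size=len(design))

# Main effect = mean response at the +1 level minus mean response at the -1 level.
for i, name in enumerate(factors):
    effect = response[design[:, i] == 1].mean() - response[design[:, i] == -1].mean()
    print(f"{name:12s} main effect = {effect:+.2f}")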

_id 3105
authors Novak, T.P., Hoffman, D.L., and Yung, Y.-F.
year 1996
title Modeling the structure of the flow experience
source INFORMS Marketing Science and the Internet Mini-Conference, MIT
summary The flow construct (Csikszentmihalyi 1977) has recently been proposed by Hoffman and Novak (1996) as essential to understanding consumer navigation behavior in online environments such as the World Wide Web. Previous researchers (e.g. Csikszentmihalyi 1990; Ghani, Supnick and Rooney 1991; Trevino and Webster 1992; Webster, Trevino and Ryan 1993) have noted that flow is a useful construct for describing more general human-computer interactions. Hoffman and Novak define flow as "the state occurring during network navigation which is: 1) characterized by a seamless sequence of responses facilitated by machine interactivity, 2) intrinsically enjoyable, 3) accompanied by a loss of self-consciousness, and 4) self-reinforcing." To experience flow while engaged in an activity, consumers must perceive a balance between their skills and the challenges of the activity, and both their skills and challenges must be above a critical threshold. Hoffman and Novak (1996) propose that flow has a number of positive consequences from a marketing perspective, including increased consumer learning, exploratory behavior, and positive affect.
series other
last changed 2003/04/23 15:50

_id ascaad2007_026
id ascaad2007_026
authors Sarji, E.A.; A. Rafi and R. Mat Rani
year 2007
title Preparing a multimedia-based gallery for institute of higher learning: A case study of Malaysian experience
source Em‘body’ing Virtual Architecture: The Third International Conference of the Arab Society for Computer Aided Architectural Design (ASCAAD 2007), 28-30 November 2007, Alexandria, Egypt, pp. 305-316
summary While the majority of medium- and small-sized institutions still rely on their physical or traditional content, a predisposition has been observed, usually among major, recently founded or contemporary art institutions, to display net-based projects (Buiani, 2001), to some extent established as permanent displays. This change in exhibitions has penetrated many Asian galleries, and as a result many schools are trying to re-position and present their galleries in such a way that they can be easily changed and adapted to host multimedia, Internet, interactive and computer-based content. This funded research project investigates the functions of galleries in Institutes of Higher Learning (IHL) in Malaysia. A triangulated study was conducted to understand the potentials and issues faced by galleries in public and private universities, focusing on design schools that include art and design, and architecture. The research starts with an understanding of gallery design theories. It is then followed by a qualitative survey of all galleries in the IHL. The research continues with an in-depth study and a survey of the Electronic Gallery (e-Gallery), Faculty of Creative Multimedia (FCM), Multimedia University (MMU), to relate the theories to design ideas. A set of questionnaires was developed based on Mathews' (1991) and Stewart's (2002) principles and guidelines on research methods and distributed to visitors over a period of time, consisting of open-ended, close-ended, Likert summated rating scale and multiple-choice questions. This involved a controlled group of visitors comprising students and staff of the faculty. The results of these studies will be used as a reference for a wider study of galleries worldwide, towards designing a multimedia-based gallery framework for Institutes of Higher Learning.
series ASCAAD
email
last changed 2008/01/21 22:00

_id e8ec
authors Weber, Benz
year 1991
title LEARNING FROM THE FULL-SCALE LABORATORY
source Proceedings of the 3rd European Full-Scale Modelling Conference / ISBN 91-7740044-5 / Lund (Sweden) 13-16 September 1990, pp. 12-19
summary The team from the LEA at Lausanne was not actually involved in the construction of the laboratory itself. During the past five years we have been discovering the qualities and limitations of the lab step by step through the experiments we performed. The way in which we use it is quite different from that of its creators. Since 1985 the external service has been limited to clients coming to the laboratory on their own; we help them only with basic instructions for the use of the equipment. Most of these experiments are motivated by the excellent possibility of discussing the design of a new hospital or home for the elderly with the people directly affected by it, such as patients, nurses, doctors and specialists for the technical equipment. The main issues discussed in these meetings are the dimensions and functional organisation of the spaces. The entire process for a normal room, including construction, discussions and dismantling of the full-scale model, takes between three and five days. Today these types of experiments occupy the lab only about twenty days a year.
keywords Full-scale Modeling, Model Simulation, Real Environments
series other
type normal paper
more http://info.tuwien.ac.at/efa
last changed 2004/05/04 15:23

_id ecaade2020_139
id ecaade2020_139
authors Zwierzycki, Mateusz
year 2020
title On AI Adoption Issues in Architectural Design - Identifying the issues based on an extensive literature review.
source Werner, L and Koering, D (eds.), Anthropologic: Architecture and Fabrication in the cognitive age - Proceedings of the 38th eCAADe Conference - Volume 1, TU Berlin, Berlin, Germany, 16-18 September 2020, pp. 515-524
doi https://doi.org/10.52842/conf.ecaade.2020.1.515
summary This paper presents an analysis of the AI-in-design literature, compiled from almost 200 publications from the 1980s onwards. The majority of the sources are proceedings from various conferences. The work is inspired by the Ten Problems for AI in Design (Gero 1991) workshop report, which listed the problems to be tackled in design with AI. Almost 30 years after that publication, it seems most of the Ten Problems cannot be considered solved or even addressed. One of this paper's goals is to identify, categorize and examine the bottlenecks in the adoption of AI in design. The collected papers were analysed to obtain the following data: Problem, Tool, Solution, Stage and Future work. The conclusions drawn from the analysis are used to define a range of existing problems with AI adoption, further illustrated with an update to the Ten Problems. Ideally this paper will spark a discussion on the quality, methodology and continuity of research.
keywords artificial intelligence; review; design automation; knowledge representation; machine learning; expert system
series eCAADe
email
last changed 2022/06/07 07:57
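
A minimal sketch of the per-paper coding scheme the review above describes, in which each publication is reduced to Problem, Tool, Solution, Stage and Future work. The field types and the sample record are assumptions added here for illustration.

from dataclasses import dataclass

@dataclass
class ReviewedPaper:
    citation: str
    problem: str      # AI-adoption issue the paper addresses
    tool: str         # technique or system employed
    solution: str     # what the paper actually delivered
    stage: str        # design stage concerned
    future_work: str  # what the authors left open

sample = ReviewedPaper(
    citation="Gero 1991",
    problem="open problems for AI in design",
    tool="workshop report",
    solution="agenda of ten problems",
    stage="n/a",
    future_work="most of the ten problems remain open",
)
print(sample.problem)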

_id eaca
authors Davis, L. (ed.)
year 1991
title Handbook of genetic algorithms
source Van Nostrand Reinhold, New York
summary This book sets out to explain what genetic algorithms are and how they can be used to solve real-world problems. The first objective is tackled by the editor, Lawrence Davis. The remainder of the book is turned over to a series of short review articles by a collection of authors, each explaining how genetic algorithms have been applied to problems in their own specific area of interest. The first part of the book introduces the fundamental genetic algorithm (GA), explains how it has traditionally been designed and implemented and shows how the basic technique may be applied to a very simple numerical optimisation problem. The basic technique is then altered and refined in a number of ways, with the effects of each change being measured by comparison against the performance of the original. In this way, the reader is provided with an uncluttered introduction to the technique and learns to appreciate why certain variants of GA have become more popular than others in the scientific community. Davis stresses that the choice of a suitable representation for the problem in hand is a key step in applying the GA, as is the selection of suitable techniques for generating new solutions from old. He is refreshingly open in admitting that much of the business of adapting the GA to specific problems owes more to art than to science. It is nice to see the terminology associated with this subject explained, with the author stressing that much of the field is still an active area of research. Few assumptions are made about the reader's mathematical background. The second part of the book contains thirteen cameo descriptions of how genetic algorithmic techniques have been, or are being, applied to a diverse range of problems. Thus, one group of authors explains how the technique has been used for modelling arms races between neighbouring countries (a non-linear, dynamical system), while another group describes its use in deciding design trade-offs for military aircraft. My own favourite is a rather charming account of how the GA was applied to a series of scheduling problems. Having attempted something of this sort with Simulated Annealing, I found it refreshing to see the authors highlighting some of the problems that they had encountered, rather than sweeping them under the carpet as is so often done in the scientific literature. The editor points out that there are standard GA tools available for either play or serious development work. Two of these (GENESIS and OOGA) are described in a short, third part of the book. As is so often the case nowadays, it is possible to obtain a diskette containing both systems by sending your Visa card details (or $60) to an address in the USA.
series other
last changed 2003/04/23 15:14
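
A compact sketch of the fundamental genetic algorithm the handbook above opens with, applied to a toy numerical optimisation (maximising f(x) = x·sin(x) on [0, 10]) with a binary-string representation. Population size, mutation rate and the truncation selection are illustrative choices, not the book's.

import math
import random

random.seed(0)
BITS, POP, GENS = 16, 30, 60

def decode(bits):            # binary string -> real number in [0, 10]
    return int("".join(map(str, bits)), 2) / (2**BITS - 1) * 10

def fitness(bits):
    x = decode(bits)
    return math.sin(x) * x

def crossover(a, b):
    cut = random.randrange(1, BITS)
    return a[:cut] + b[cut:]

def mutate(bits, rate=0.02):
    return [1 - b if random.random() < rate else b for b in bits]

pop = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[: POP // 2]                      # truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]
    pop = parents + children

best = max(pop, key=fitness)
print(f"best x = {decode(best):.3f}, f(x) = {fitness(best):.3f}")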

_id 9ad2
authors Owen, J.C.
year 1991
title Algebraic Solution for Geometry from Dimensional Constraints
source ACM Symp. Found. of Solid Modeling, Austin TX, pp. 397-407
summary We investigate general configurations of distance and angle dimensions between points, lines and circles on a plane. A simple graphical representation is described for the system of coupled quadratic equations which results from treating the geometries as variables and the dimensions as defining equations. For many configurations of practical interest we show that these equations are poorly suited to numerical solution. We describe an algorithm for computing the solution to a subset of all possible configurations of geometry and dimensions using purely algebraic methods (in fact the usual arithmetic operations plus square roots). We show that this algorithm solves for all configurations of a practically useful type and that it solves for any configuration which can in principle be solved using these algebraic operations. Specifically, we use the Galois theory of equations to show that the following statements are equivalent. 1. The geometry can be constructed in principle on a drawing board using a ruler and compasses. 2. The coordinates of the geometries can be computed algebraically using only arithmetic operations plus square roots. 3. The coordinates of the geometries lie in a normal field extension over the dimension values of degree 2^n for some n. 4. For general (i.e. algebraically independent) dimension values the algorithm described will compute the geometries. We also describe a working implementation of the algorithm and describe some extensions to the basic ideas which are necessary to make it a practically useful way to specify geometry by means of dimensional constraints.
series other
last changed 2003/04/23 15:50
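
A hedged worked example of the kind of construction the paper above characterises: fixing a point by two distance dimensions to known points requires only arithmetic and a single square root (a circle-circle intersection). The numbers are invented; the paper's algorithm handles general coupled configurations rather than this single step.

import math

def point_from_two_distances(A, B, rA, rB):
    """Return one intersection of circles (A, rA) and (B, rB)."""
    ax, ay = A
    bx, by = B
    d = math.hypot(bx - ax, by - ay)
    # Distance from A along AB to the radical line, then the half-chord height.
    a = (rA**2 - rB**2 + d**2) / (2 * d)
    h = math.sqrt(rA**2 - a**2)          # the single square root needed
    mx, my = ax + a * (bx - ax) / d, ay + a * (by - ay) / d
    return (mx - h * (by - ay) / d, my + h * (bx - ax) / d)

A, B = (0.0, 0.0), (4.0, 0.0)
C = point_from_two_distances(A, B, rA=3.0, rB=2.0)
print("C =", tuple(round(v, 4) for v in C),
      "| check:", round(math.dist(C, A), 4), round(math.dist(C, B), 4))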

_id b5be
authors Stok, Leon
year 1991
title Architectural synthesis and optimization of digital systems
source Eindhoven University of Technology
summary High level synthesis means going from a functional specification of a digital system at the algorithmic level to a register transfer level structure. Different applications will ask for different design styles. Despite this diversity in design styles, many tasks in the synthesis will be similar. There is no need to write a new synthesis system for each design style. The best way to go seems to be a decomposition of the high level synthesis problem into several well-defined subproblems. How the problem is decomposed depends heavily on a) the type of network architecture chosen, b) the constraints applied to the design and c) the functional description itself. From this architecture style, the constraints and the functional description a synthesis scheme can be derived. Once this scheme is fixed, algorithms can be chosen which fit into this scheme and solve the subproblems in a fast and, when possible, optimal way. To support such a synthesis philosophy, a framework is needed in which all design information can be stored in a unique way during the various phases of the design process. This asks for a design database capable of handling all design information, with a formally defined interface to all design tools. This thesis gives a formal way to describe the functional representation, the register transfer level structure, the controller and the relations between all three of them. Special attention has been paid to the efficient representation of mutually exclusive operations and array accesses. The scheduling and allocation problems are defined as mappings between these formal representations. Both the existing synthesis algorithms and the new algorithms described in this thesis fit into this framework. Three new allocation algorithms are presented in this thesis: an algorithm for optimal register allocation in cyclic data flow graphs, an exact polynomial algorithm for module allocation, and a new scheme to minimize the number of interconnections during all stages of data path allocation. Cyclic data flow graphs result from high level behavioral descriptions that contain loops. Algorithms for register allocation in high level synthesis published up till now only considered loop-free data flow graphs. When these algorithms are applied to data flow graphs with loops, unnecessary register transfer operations are introduced. A new algorithm is presented that performs a minimal register allocation and eliminates all superfluous register transfer operations. The problem is reformulated as a multicommodity network flow problem, for which very efficient solutions exist. Experiments on a benchmark set have shown that in all test cases all register transfers could be eliminated at no increase in register cost. Only heuristic algorithms have appeared in the literature to solve the module allocation problem. The module allocation problem is usually defined as a clique cover problem on a so-called module allocation graph. It is shown that, under certain conditions, the module allocation graph belongs to the special class of comparability graphs. A polynomial time algorithm can optimally find a clique cover of such a graph. Even when interconnect weights are taken into account, this can be solved exactly: the problem can be transformed into a maximal cost network flow problem, which can be solved exactly in polynomial time.
An algorithm is described which solves the module allocation problem with interconnect weights exactly, with complexity O(kn²), where n is the number of operations. In previous research, interconnection was optimized after the module allocation for the operations and the register allocation for the variables had already been done. However, the amount of multiplexing and interconnect are crucial factors for both the delay and the area of a circuit. A new scheme is presented to minimize the number of interconnections during data path allocation. This scheme first groups all values based on their read and write times. Values belonging to the same group can share a register file. This minimizes the number of data transfers with different sources and destinations. Secondly, registers are allocated for each group separately. Finally the interconnect allocation is done. During the interconnect allocation, the module allocation is determined. The value grouping is based on edge coloring algorithms, providing a sharp upper bound on the number of colors needed. Two techniques - splitting the read and write phases of values and introducing serial (re-)write operations for the same value - make it possible to use even more efficient exact edge coloring algorithms. It is shown that when variables are grouped into register files and operations are assigned to modules during the interconnection minimization, significant savings (20%) can be obtained in the number of local interconnections and the amount of global interconnect, at the expense of only slightly more register area.
keywords Digital Systems
series thesis:PhD
email
last changed 2003/02/12 22:37
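
A simplified, hedged stand-in for the register-allocation step discussed in the thesis above: build an interference graph from value lifetimes and colour it greedily so that overlapping values receive different registers. The thesis itself formulates the cyclic-graph case as a multicommodity network flow problem; the lifetimes below are invented and the greedy colouring is only a rough illustration of the idea.

lifetimes = {          # value name -> (first use step, last use step)
    "v1": (0, 3), "v2": (1, 4), "v3": (2, 5), "v4": (4, 7), "v5": (6, 8),
}

def overlaps(a, b):
    return a[0] <= b[1] and b[0] <= a[1]

# Interference graph: an edge between two values whose lifetimes overlap.
interferes = {v: {w for w in lifetimes if w != v and overlaps(lifetimes[v], lifetimes[w])}
              for v in lifetimes}

# Greedy colouring in order of increasing start time (left-edge style).
assignment = {}
for v in sorted(lifetimes, key=lambda v: lifetimes[v][0]):
    used = {assignment[w] for w in interferes[v] if w in assignment}
    assignment[v] = next(r for r in range(len(lifetimes)) if r not in used)

print(assignment)                                   # value -> register index
print("registers needed:", len(set(assignment.values())))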

_id 4eed
authors Benedikt, Michael (ed.)
year 1991
title Cyberspace: First Steps
source The MIT Press, Cambridge, MA and London, UK
summary Cyberspace has been defined as "an infinite artificial world where humans navigate in information-based space" and as "the ultimate computer-human interface." These original contributions take up the philosophical basis for cyberspace in virtual realities, basic communications principles, ramifications of cyberspace for future workplaces, and more.
series other
last changed 2003/04/23 15:14

_id 6266
authors Carini, Alessandra
year 1991
title REVIEW OF MOST RECENT ACTIVITIES OF THE "LABORATORIO TIPOLOGICO NAZIONALE"
source Proceedings of the 3rd European Full-Scale Modelling Conference / ISBN 91-7740044-5 / Lund (Sweden) 13-16 September 1990, pp. 20-22
summary The Laboratory's activities did not start immediately after its opening, since the following year was mainly given over to the definition of criteria and procedures for the management of the Laboratory itself by OIKOS. Actual research started in 1990 on the basis of a programme drawn up with the collaboration of the Public Housing Committee ("Comitato per l'Edilizia Residenziale").
keywords Full-scale Modeling, Model Simulation, Real Environments
series other
type normal paper
more http://info.tuwien.ac.at/efa
last changed 2004/05/04 15:24

_id ga9921
id ga9921
authors Coates, P.S. and Hazarika, L.
year 1999
title The use of genetic programming for applications in the field of spatial composition
source International Conference on Generative Art
summary Architectural design teaching using computers has been a preoccupation of CECA since 1991. All design tutors provide their students with a set of models and ways to form, and we have explored a set of approaches including cellular automata, genetic programming, agent-based modelling and shape grammars as additional tools with which to explore architectural (and architectonic) ideas. This paper discusses the use of genetic programming (G.P.) for applications in the field of spatial composition. CECA has been developing the use of genetic programming for some time (see references) and has covered the evolution of L-system production rules (Coates 1997, 1999b) and the evolution of generative grammars of form (Coates 1998, 1999a). The G.P. was used to generate three-dimensional spatial forms from a set of geometrical structures. The approach uses genetic programming with a genetic library (G.Lib). G.P. provides a way to genetically breed a computer program to solve a problem; G.Lib enables genetic programming to define potentially useful subroutines dynamically during a run. The work covered: exploring a shape grammar consisting of simple solid primitives and transformations; applying a simple fitness function to the solid-breeding G.P.; exploring a shape grammar of composite surface objects; developing grammars for existing buildings, and creating hybrids; and exploring the shape grammar of a building within a G.P. We will report on new work using a range of different morphologies (boolean operations, surface operations and grammars of style) and describe the use of objective functions (natural selection) and the "eyeball test" (artificial selection) as ways of controlling and exploring the design spaces thus defined.
series other
more http://www.generativeart.com/
last changed 2003/08/07 17:25
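
A toy sketch in the spirit of the work above: an L-system rewriter (a simple generative grammar) whose single production rule is evolved against a crude objective function. The alphabet, mutation operator and fitness are invented assumptions; the paper evolves much richer grammars of three-dimensional form with genetic programming and a genetic library.

import random

random.seed(2)
ALPHABET = "F+-[]"

def derive(axiom, rule, steps=3):
    """Rewrite every 'F' with the production rule for a number of steps."""
    s = axiom
    for _ in range(steps):
        s = "".join(rule if c == "F" else c for c in s)
    return s

def mutate(rule):
    i = random.randrange(len(rule))
    return rule[:i] + random.choice(ALPHABET) + rule[i + 1:]

def fitness(rule):
    """Toy objective: prefer balanced branching and a target derivation length."""
    s = derive("F", rule)
    branching = s.count("[") if s.count("[") == s.count("]") else 0
    return branching - abs(len(s) - 200) / 200

population = ["F[+F]F[-F]F"] + [mutate("F[+F]F[-F]F") for _ in range(19)]
for _ in range(30):
    population.sort(key=fitness, reverse=True)
    population = population[:10] + [mutate(random.choice(population[:10])) for _ in range(10)]

best = max(population, key=fitness)
print("best rule:", best, "| fitness:", round(fitness(best), 3))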

_id 2e56
authors Coyne, Robert Francis
year 1991
title ABLOOS : an evolving hierarchical design framework
source Carnegie Mellon University, Department of Architecture
summary The research reported in this thesis develops an approach toward a more effective use of hierarchical decomposition in computational design systems. The approach is based on providing designers a convenient interactive means to specify and experiment with the decompositional structure of design problems, rather than having decompositions pre-specified and encoded in the design system. Following this approach, a flexible decomposition capability is combined with an underlying design method to form the basis for an extensible and evolving framework for cooperative (human/computer) design. As a testbed for this approach, the ABLOOS framework for layout design is designed and constructed as a hierarchical extension of LOOS. The framework enables a layout task to be hierarchically decomposed, and the LOOS methodology to be applied recursively to layout subtasks at appropriate levels of abstraction within the hierarchy; layout solutions for the subtasks are then recomposed to achieve an overall solution. Research results thus far are promising: ABLOOS has produced high quality solutions for a class of industrial layout design tasks (an analog power board layout with 60 components that have multiple complex constraints on their placement); the adaptability of the framework across domains and disciplines has been demonstrated; and further development of ABLOOS is underway, including its extension to layouts in 2 1/2D space and truly 3D arrangements. The contribution of this work is in demonstrating an effective, flexible and extensible capability for hierarchical decomposition in design. It has also produced a more comprehensive layout system that can serve as a foundation for the further investigation of hierarchical decomposition in a variety of design domains.
series thesis:PhD
last changed 2003/02/12 22:37
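
A hedged sketch of the decompose / solve / recompose pattern that ABLOOS (record above) builds on: a layout task is split into a hierarchy of subtasks, each subtask is laid out within its allotted region, and the partial solutions are merged back together. The naive equal-width slicing stands in for LOOS's actual layout method, and the component hierarchy is invented.

def layout(task, region):
    """task: component name (str) or dict of subtasks; region: (x, y, w, h)."""
    x, y, w, h = region
    if isinstance(task, str):                 # leaf: place the component
        return {task: region}
    placements = {}
    slice_w = w / len(task)                   # decompose region among subtasks
    for i, (name, subtask) in enumerate(task.items()):
        placements.update(layout(subtask, (x + i * slice_w, y, slice_w, h)))
    return placements                         # recompose subtask solutions

board = {
    "power":  {"rectifier": "R1", "regulator": "U1"},
    "logic":  {"cpu": "U2", "memory": {"ram": "U3", "rom": "U4"}},
}
for comp, (x, y, w, h) in layout(board, (0, 0, 100, 60)).items():
    print(f"{comp}: x={x:.1f} y={y:.1f} w={w:.1f} h={h:.1f}")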
