CumInCAD is a cumulative index of publications in Computer Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD Futures

PDF papers
References

Hits 1 to 20 of 188

_id 6f3e
authors Eastman, Charles M. and Lang, Jurg
year 1991
title Experiments in Architectural Design Development Using CAD
source Computer Aided Architectural Design Futures: Education, Research, Applications [CAAD Futures ‘91 Conference Proceedings / ISBN 3-528-08821-4] Zürich (Switzerland), July 1991, pp. 49-64
summary The need to explore development techniques in computer-based design is reviewed. Some premises are given for design development using computers, including integrating multiple representations, the use of object-based modeling and the importance of visual analysis and 3-D modeling. We then present techniques used in a UCLA design studio that explored methods of computer-based design development based on these premises. The two main methods used were hierarchical object structures and multi-representational coordination. They were applied using conventional CAD systems. Some lessons learned from this class are reviewed.
series CAAD Futures
email
last changed 2003/05/16 20:58

_id ddss9483
id ddss9483
authors Shyi, Gary C.-W. and Huang, Tina S.-T.
year 1994
title Constructing Three-Dimensional Mental Models from Two-Dimensional Displays
source Second Design and Decision Support Systems in Architecture & Urban Planning (Vaals, the Netherlands), August 15-19, 1994
summary In the present study we adopted the tasks and experimental procedures used in a recent series of studies by Cooper (1990, 1991) to examine how two-dimensional information in a line drawing of a visual object is used to construct the corresponding three-dimensional mental structure represented by the 2-D display. We expected that the stimulus materials we used avoided some of the problems of Cooper's stimuli, and with them we examined the effect of complexity on the process of constructing 3-D models from 2-D displays. Such a manipulation helps to elucidate the difficulties of solving problems that require spatial abilities. We also investigated whether providing information representing an object viewed from different standpoints would affect the construction of the object's 3-D model. Some researchers have argued that 3-D models, once constructed, should be viewer-independent or viewpoint-invariant, while others have suggested that 3-D models are affected by the viewpoint of observation. Data pertinent to this issue are presented and discussed.
series DDSS
last changed 2003/08/07 16:36

_id ga0010
id ga0010
authors Moroni, A., Zuben, F. Von and Manzolli, J.
year 2000
title ArTbitrariness in Music
source International Conference on Generative Art
summary Evolution is now considered not only powerful enough to bring about biological entities as complex as humans and consciousness, but also useful in simulation to create algorithms and structures of higher levels of complexity than could easily be built by design. In the context of artistic domains, the process of human-machine interaction is analyzed as a good framework to explore creativity and to produce results that could not be obtained without this interaction. When evolutionary computation and other computational intelligence methodologies are involved, we denote every attempt to improve aesthetic judgement as ArTbitrariness, interpreted as an interactive, iterative optimization process. ArTbitrariness is also suggested as an effective way to produce art through an efficient manipulation of information and a proper use of computational creativity to increase the complexity of the results without neglecting the aesthetic aspects [Moroni et al., 2000]. Our emphasis is on an approach to interactive music composition. The problem of computer generation of musical material has received extensive attention, and a subclass of the field of algorithmic composition includes those applications which use the computer as something in between an instrument, which a user "plays" through the application's interface, and a compositional aid, which a user experiments with in order to generate stimulating and varied musical material. This approach was adopted in Vox Populi, a hybrid made up of an instrument and a compositional environment. Unlike other systems based on genetic algorithms or evolutionary computation, in which people have to listen to and judge the musical items, Vox Populi uses the computer and the mouse as real-time music controllers, acting as a new interactive computer-based musical instrument. The interface is designed to be flexible enough for the user to modify the music being generated.
It explores evolutionary computation in the context of algorithmic composition and provides a graphical interface that allows the user to modify the tonal center and the voice range, changing the evolution of the music with the mouse [Moroni et al., 1999]. A piece of music consists of several sets of musical material that are manipulated and exposed to the listener, for example pitches, harmonies, rhythms and timbres. These sets are composed of a finite number of elements and, basically, the aim of a composer is to organize those elements in an aesthetic way. Modeling a piece as a dynamic system implies a view in which the composer draws trajectories or orbits using the elements of each set [Manzolli, 1991]. Nonlinear iterative mappings are associated with interface controls. The mappings may give rise to attractors, defined as geometric figures that represent the set of stationary states of a nonlinear dynamic system, or simply trajectories to which the system is attracted. The relevance of this approach goes beyond music applications per se. Computer music systems that are built on the basis of a solid theory can be coherently embedded into multimedia environments. The richness and specialty of the music domain are likely to initiate new thinking and ideas, which will have an impact on areas such as knowledge representation and planning, and on the design of visual formalisms and human-computer interfaces in general. Two examples of nonlinear iterative mappings and their resulting musical pieces are depicted alongside the Vox Populi interface. References: [Manzolli, 1991] J. Manzolli. Harmonic Strange Attractors, CEM BULLETIN, Vol. 2, No. 2, pp. 4-7, 1991. [Moroni et al., 1999] Moroni, A., Manzolli, J., Von Zuben, F. and Gudwin, R. Evolutionary Computation Applied to Algorithmic Composition, Proceedings of CEC99 - IEEE International Conference on Evolutionary Computation, Washington D.C., pp. 807-811, 1999. [Moroni et al., 2000] Moroni, A., Von Zuben, F. and Manzolli, J. ArTbitration, Proceedings of the 2000 Genetic and Evolutionary Computation Conference Workshop Program - GECCO, Las Vegas, USA, pp. 143-145, 2000.
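The summary describes nonlinear iterative mappings whose orbits drive the generated music. A minimal illustrative sketch (not the actual Vox Populi code; the mapping, pitch range and function names here are assumptions) uses the classic logistic map and quantizes its orbit into a MIDI pitch range:

```python
# Illustrative sketch of a nonlinear iterative mapping used musically:
# iterate x -> r*x*(1-x) and quantize the orbit into MIDI pitches.
def logistic_orbit(x0, r, n):
    """Return n iterates of the logistic map starting from x0."""
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        xs.append(x)
    return xs

def orbit_to_pitches(orbit, low=48, high=72):
    """Map orbit values in (0, 1) onto MIDI pitches in [low, high]."""
    span = high - low
    return [low + round(x * span) for x in orbit]

# r = 3.9 lies in the chaotic regime, so the melody wanders aperiodically
melody = orbit_to_pitches(logistic_orbit(0.3, 3.9, 16))
```

Changing `r` with an interface control moves the orbit between fixed points, cycles and chaos, which is the kind of continuous musical control the summary attributes to the mouse.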
series other
email
more http://www.generativeart.com/
last changed 2003/08/07 17:25

_id 2c7b
authors Stenvert, Ronald
year 1993
title The Vector-drawing as a Means to Unravel Architectural Communication in the Past
doi https://doi.org/10.52842/conf.ecaade.1993.x.q9a
source [eCAADe Conference Proceedings] Eindhoven (The Netherlands) 11-13 November 1993
summary Unlike in painting, in architecture no single person controls the whole process between the conception and realization of a building. Ideas of what the building will eventually look like have to be conveyed from patron to the actual builders by way of drawings. Generally the architect is the key figure in this process of communicating visual ideas. Nowadays many architects design their buildings using computers and Computer-Aided (Architectural) Design programs like AutoCAD and VersaCAD. Just like traditional drawings, all these computer drawings are in fact vector drawings: collections of geometrical primitives such as lines and circle segments, identified by the coordinates of their end points. Vector-based computer programs can be used not only to design the future, but also as a means to unravel architectural communication in the past. However, using the computer as an analyzing tool for a better comprehension of the past is not as simple as it seems. Historical data are governed by unique features of date and place. The complexity of the past combined with the straightforwardness of the computer requires a pragmatic and basic approach in which the computer acts as a catalytic agent, enabling the scholar to arrive manually at his own, computer-assisted, conclusions. It turns out that only a limited number of projects of a morphological kind are suited to contribute new knowledge, acquired by close reading of the information gained by way of meaningful abstraction. An important problem in this respect is how to obtain the right kind of architectural information. All four major elements of the building process - architect, design, drawing and realization - have their own different and gradually shifting interpretations in the past. This goes especially for the run-of-the-mill architecture which makes up the larger part of the historical urban environment.
Starting with the architect, one has to realize that only a very limited part of mainstream architecture was designed by architects. In almost all other cases the role of the patron and the actual builder exceeded that of the architect, even to the extent that they designed buildings themselves. The position of design and drawing as means of communication also changed over time. Until the middle of the nineteenth century drawings were not the chief means of communication between architects and builders, who got the gist of the design from a model or, on encountering problems, simply asked the architect or supervisor. From the nineteenth century onwards the use of drawings became more common, but they almost never represented the building entirely "as built". In 1991 I published my Ph.D. thesis: Constructing the Past: computer-assisted architectural-historical research: the application of image-processing using the computer and Computer-Aided Design for the study of the urban environment, illustrated by the use of treatises in seventeenth-century architecture (Utrecht 1991). Here, a reconstruction of this historical communication process is presented on the basis of a project studying the use of the Classical orders as prescribed in various architectural treatises, compared to the use of the orders in a specific group of still-existing buildings in The Netherlands dating from the late sixteenth and entire seventeenth century. Comparisons were made using vector drawings: both the illustrations in the treatises and the actual buildings were "translated" into computer drawings and then analyzed.

series eCAADe
last changed 2022/06/07 07:50

_id 8db6
authors Bijl, Aart
year 1991
title On Knowing - Feeling and Expression
source Computer Aided Architectural Design Futures: Education, Research, Applications [CAAD Futures ‘91 Conference Proceedings / ISBN 3-528-08821-4] Zürich (Switzerland), July 1991, pp. 157-176
summary The basic assumptions for CAD, and for any use of computers, are re-examined. They refer to how we know things, how we think of knowledge being represented, and the impact of representation techniques on evolution of knowledge. Japan offers stimulating clues on how we might regard the usefulness of computers, and these are explained. Evocative illustrations are presented, to show a direction for future developments.
series CAAD Futures
last changed 2003/11/21 15:16

_id 00bc
authors Chen, Chen-Cheng
year 1991
title Analogical and inductive reasoning in architectural design computation
source Swiss Federal Institute of Technology, ETH Zurich
summary Computer-aided architectural design technology is now a crucial tool of modern architecture, from the viewpoint of higher productivity and better products. As technologies advance, the amount of information and knowledge that designers can apply to a project is constantly increasing. This requires the development of more advanced knowledge acquisition technology to achieve higher functionality, flexibility and efficient performance in knowledge-based design systems in architecture. Human designers do not solve design problems from scratch; they use previous problem-solving episodes for similar design problems as a basis for developmental decision making. This observation leads to the starting point of this research. First, we can use past experience to solve a new problem by detecting the similarities between the past problem and the new problem. Second, we can identify constraints and general rules implied by those similarities and by the similar parts of similar situations. That is, by applying analogical and inductive reasoning we can advance the problem-solving process. The main objective of this research is to establish the theory that (1) the design process can be viewed as a learning process, (2) design innovation involves analogical and inductive reasoning, and (3) learning from a designer's previous design cases is necessary for the development of the next generation of knowledge-based design systems. This thesis draws upon results from several disciplines, including knowledge representation and machine learning in artificial intelligence, and knowledge acquisition in knowledge engineering, to investigate a potential design environment for future developments in computer-aided architectural design. The thesis contains three parts, corresponding to the different steps of this research. Part I discusses three different ways of generating new thoughts from old ones: problem solving, learning and creativity.
In Part II, the problem statement of the thesis is made and a conceptual model of analogical and inductive reasoning in design is proposed. In Part III, three different methods of building design systems for solving an architectural design problem are compared: rule-based, example-based and case-based. Finally, conclusions are drawn from the current implementation of the work, and possible future extensions of this research are described. The work reveals new approaches for knowledge acquisition, machine learning and knowledge-based design systems in architecture.
series thesis:PhD
email
last changed 2003/05/10 05:42

_id e573
authors McLaughlin, Sally
year 1991
title Reading Architectural Plans: A Computable Model
source Computer Aided Architectural Design Futures: Education, Research, Applications [CAAD Futures ‘91 Conference Proceedings / ISBN 3-528-08821-4] Zürich (Switzerland), July 1991, pp. 347-364
summary A fundamental aspect of the expertise of the architectural designer is the ability to assess the quality of existing and developing proposals from schematic representations such as plans, elevations and sections. In this paper I present a computable model of those aspects of the evaluation of architectural floor plans that I believe to be amenable to rule-like formulation. The objective behind the development of this model is twofold: 1) to articulate a belief about the role of simple symbolic representations in the task of evaluation, a task which relies primarily on uniquely human capabilities; and 2) to explore the possible uses of such representations in the development of design expertise. // Input to the model consists of a specification of a design proposal in terms of walls, doors, windows, openings and spaces together with a specification of the context in which the proposal has been developed. Information about context is used to retrieve the goal packages relevant to the evaluation of the proposal. The goal packages encode information about requirements such as circulation, visual privacy and thermal performance. Generic associations between aspects of a plan and individual goals are used to establish if and how each of the goals have been achieved in the given proposal. These associations formalize relationships between aspects of the topology of the artefact, such as the existence of a door between two rooms, and a goal, in this case the goal of achieving circulation between those two rooms. Output from the model consists of both a graphic representation of the way in which goals are achieved and a list of those goals that have not been achieved. The list of goals not achieved provides a means of accessing appropriate design recommendations. 
What the model provides is essentially a computational tool for exploring the value judgements made in a particular proposal given a set of predefined requirements such as those to be found in design recommendation literature.
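The summary describes goal packages that link requirements such as circulation to topological features of the plan (e.g. a door between two rooms). A hypothetical sketch of that association, with invented room names and a single goal type (the real model's representations are richer):

```python
# Hypothetical sketch of the goal-checking idea: a goal package ties a
# requirement to a topological test on the plan's walls/doors/spaces.
plan = {
    "rooms": ["kitchen", "dining", "hall"],
    "doors": [("kitchen", "dining"), ("hall", "dining")],
}

def circulation(plan, a, b):
    """Goal test: circulation between rooms a and b requires a connecting door."""
    return (a, b) in plan["doors"] or (b, a) in plan["doors"]

goals = [("circulation", "kitchen", "dining"),
         ("circulation", "kitchen", "hall")]

# goals that fail the topological test are reported back to the designer
not_achieved = [g for g in goals if not circulation(plan, g[1], g[2])]
```

As in the model described above, the unmet-goal list is what indexes into design-recommendation literature; the achieved goals could instead be rendered graphically.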
series CAAD Futures
last changed 1999/04/07 12:03

_id a113
authors Milne, Murray
year 1991
title Design Tools: Future Design Environments for Visualizing Building Performance
source Computer Aided Architectural Design Futures: Education, Research, Applications [CAAD Futures ‘91 Conference Proceedings / ISBN 3-528-08821-4] Zürich (Switzerland), July 1991, pp. 485-496
summary In the future of Computer Aided Architectural Design (CAAD), architects clearly need more than just computer-aided design and drafting systems (CAD). Unquestionably CAD systems continue to become increasingly powerful, but there is more to designing a good building than its three-dimensional existence, especially in the eyes of all the non-architects of the world: users, owners, contractors, regulators, environmentalists. The ultimate measure of a building's quality has something to do with how well it behaves over time. Predictions about its performance have many different dimensions: how much it costs to build, to operate and to demolish; how comfortable it is; how effectively people can perform their functions in it; how much energy it uses or wastes. Every year dozens of building performance simulation programs are written that can predict performance over time along any of these dimensions. That is why the need for both CAD systems and performance predictors can be taken for granted, and why it may be more interesting to speculate about the need for 'design tools'. A design tool can be defined as a piece of software that is easy and natural for architects to use, that easily accommodates three-dimensional representations of the building, and that predicts something useful about a building's performance. There are at least five different components of design tools that will be needed for the design environment of the future.
series CAAD Futures
email
last changed 2003/05/16 20:58

_id 6028
authors Sachs, E., Roberts, A. and Stoops, D.
year 1991
title 3-draw: A tool for designing 3D shapes
source IEEE Computer Graphics & Applications, pp. 18-25
summary A fundamentally new type of CAD system for designing shape that is intuitive, easy to use, and powerful is presented. It is based on a paradigm that can be described as designing directly in 3-D. By virtue of two hand-held sensors, designers using 3-Draw to sketch their ideas in the air feel as if they're actually holding and working on objects. Current design practice and related work are reviewed, and current work on 3-Draw is summarized. To capture the flavor of 3-Draw, construction of a sample model of a 12-m yacht is described. 3-Draw's features and data structures are discussed.
series journal paper
last changed 2003/04/23 15:14

_id bbdc
authors Tang, J.C.
year 1991
title Findings from observational studies of collaborative work
source International Journal of Man-Machine Studies, 34(2), 143-160
summary The work activity of small groups of three to four people was videotaped and analyzed in order to understand collaborative work and to guide the development of tools to support it. The software tools we currently have are often based on a single-user model. Even those that are based on multiple-user models define structures of interaction that restrict fluid collaboration. We need to observe how people collaborate and then build software that facilitates collaboration based on those observations, giving users the tools that are "naturally" defined in face-to-face interaction. In this experiment small groups of people were observed in a collaborative design task using a shared drawing space. Specific features of collaborative work activity were identified that raise design implications for collaborative technology: (1) collaborators use hand gestures to communicate significant information uniquely; (2) the process of creating and using drawings conveys much information not contained in the resulting drawings; (3) the drawing space is an important resource for the group in mediating their collaboration; (4) there is a fluent mix of activity in the drawing space; and (5) the spatial orientation among the collaborators and the drawing space has a role in structuring their activity.
series other
last changed 2003/04/23 15:14

_id 80ce
authors Turner, R., Balaguer, F., Gobbetti, E. and Thalmann, D.
year 1991
title Interactive Scene Walkthrough Using a Physically-Based Virtual Camera
source Computer Aided Architectural Design Futures: Education, Research, Applications [CAAD Futures ‘91 Conference Proceedings / ISBN 3-528-08821-4] Zürich (Switzerland), July 1991, pp. 511-520
summary One of the most powerful results of recent advances in graphics hardware is the ability of a computer user to interactively explore a virtual building or landscape. The newest three-dimensional input devices, together with high-speed 3D graphics workstations, make it possible to view and move through a 3D scene by interactively controlling the motion of a virtual camera. In this paper, we describe how natural and intuitive control of building walkthrough can be achieved by using a physically-based model of the virtual camera's behavior. Using the laws of classical mechanics to create an abstract physical model of the camera, we then simulate the virtual camera motion in real time in response to force data from the various 3D input devices (e.g. the Spaceball and Polhemus 3Space Digitizer). The resulting interactive behavior of the model is determined by several physical parameters such as mass, moment of inertia and various friction coefficients, which can all be varied interactively, and by constraints on the camera's degrees of freedom. This allows us to explore a continuous range of physically-based metaphors for controlling the camera motion. We present the results of experiments using several of these metaphors for virtual camera motion and describe the effects of the various physical parameters.
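The physically-based camera idea above can be sketched in one dimension: device force drives a mass with viscous friction, integrated step by step. This is a minimal illustration under assumed parameter values, not the paper's implementation (which also handles rotation, inertia tensors and constraints):

```python
# Minimal 1-D sketch of a physically-based camera: an input force drives a
# mass with viscous friction, integrated with explicit Euler steps.
def step(pos, vel, force, mass=1.0, friction=2.0, dt=0.02):
    """One step: a = (F - c*v)/m, then integrate velocity and position."""
    acc = (force - friction * vel) / mass
    vel += acc * dt
    pos += vel * dt
    return pos, vel

pos, vel = 0.0, 0.0
for _ in range(100):                 # a steady push from the input device
    pos, vel = step(pos, vel, force=1.0)
# velocity settles toward the terminal value F/c = 0.5, so the camera
# glides rather than jumping: friction gives the motion its "feel"
```

Raising `friction` makes the camera feel heavier and more damped; raising `mass` makes it respond more sluggishly, which is exactly the kind of interactively tunable metaphor the paper explores.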
series CAAD Futures
last changed 1999/04/07 12:03

_id 2344
authors White, Richard
year 1991
title Recognizing Structures: Some Problems in Reasoning with Drawings
source Computer Aided Architectural Design Futures: Education, Research, Applications [CAAD Futures ‘91 Conference Proceedings / ISBN 3-528-08821-4] Zürich (Switzerland), July 1991, pp. 381-394
summary This paper describes work on our current project aimed at developing a generalized system for performing automated reasoning tasks in various domains, using information extracted from drawings. It briefly describes the MOLE representation system, a frame-like formalism which can be used to build both description and inheritance hierarchies. The use of MOLE for representing graphical objects as well as the objects they represent is also described. The paper goes on to discuss some of the problems faced in the development of systems which can perform reasoning tasks on such representations. In particular, problems arising from the need to map the structures required by the application domain to the drawing description are outlined, and a model which adapts existing Artificial Intelligence (AI) techniques to solve these problems is proposed.
series CAAD Futures
last changed 1999/04/07 12:03

_id 7508
authors Montgomery, D.C.
year 1991
title Design and Analysis of Experiments
source John Wiley, Chichester
summary Through four editions, Douglas Montgomery has provided statisticians, engineers, scientists and managers with the most effective approach for learning how to design, conduct and analyze experiments that optimize performance in products and processes. In this fully revised and enhanced fifth edition, Montgomery focuses even more sharply on factorial and fractional factorial design and presents new analysis techniques (including the generalized linear model). There is also expanded coverage of experiments with random factors, response surface methods, experiments with mixtures, and methods for process robustness studies. The book also illustrates two of today's most powerful software tools for experimental design: Design-Expert(r) and Minitab(r). Throughout the text, output from these two programs accompanies detailed discussion of how computers are currently used in the analysis and design of experiments. Readers learn how to use statistically designed experiments to: obtain information for characterization and optimization of systems; improve manufacturing processes; design and develop new processes and products; evaluate material alternatives in product design; improve the field performance, reliability and manufacturing aspects of products; and conduct experiments effectively and efficiently. Other features: a student version of Design-Expert(r) software is available, and a web site (www.wiley.com/college/montgomery) offers supplemental text material for each chapter, a sample syllabus, and sample student projects from the author's Design of Experiments course at Arizona State University.
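The factorial designs at the heart of this book are easy to sketch in code. The following is an illustrative example (not from the book): a 2^3 full factorial design with levels coded -1/+1, a synthetic linear response, and main effects estimated as the difference in mean response between the high and low level of each factor.

```python
from itertools import product

# Build a 2^k full factorial design with coded levels -1/+1.
def full_factorial(k):
    return list(product([-1, 1], repeat=k))

# Main effect of factor j: mean response at +1 minus mean response at -1.
def main_effects(design, responses):
    k = len(design[0])
    effects = []
    for j in range(k):
        hi = [y for run, y in zip(design, responses) if run[j] == 1]
        lo = [y for run, y in zip(design, responses) if run[j] == -1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

design = full_factorial(3)                      # 8 runs for 3 factors
# synthetic response: strong effect of factor A, none of B, small of C
y = [10 + 4 * a + 0 * b + 1 * c for a, b, c in design]
effects = main_effects(design, y)               # recovers 2x each coefficient
```

Because the design is orthogonal, each main effect is estimated independently of the others, which is precisely why factorial designs are so efficient compared with one-factor-at-a-time experimentation.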
series other
last changed 2003/04/23 15:14

_id b5be
authors Stok, Leon
year 1991
title Architectural synthesis and optimization of digital systems
source Eindhoven University of Technology
summary High-level synthesis means going from a functional specification of a digital system at the algorithmic level to a register transfer level structure. Different applications will ask for different design styles. Despite this diversity in design styles, many tasks in the synthesis will be similar. There is no need to write a new synthesis system for each design style. The best way forward seems to be a decomposition of the high-level synthesis problem into several well-defined subproblems. How the problem is decomposed depends heavily on (a) the type of network architecture chosen, (b) the constraints applied to the design, and (c) the functional description itself. From this architecture style, the constraints and the functional description, a synthesis scheme can be derived. Once this scheme is fixed, algorithms can be chosen which fit into this scheme and solve the subproblems in a fast and, when possible, optimal way. To support such a synthesis philosophy, a framework is needed in which all design information can be stored in a unique way during the various phases of the design process. This asks for a design database capable of handling all design information, with a formally defined interface to all design tools. This thesis gives a formal way to describe the functional representation, the register transfer level structure and the controller, and the relations among all three of them. Special attention has been paid to the efficient representation of mutually exclusive operations and array accesses. The scheduling and allocation problems are defined as mappings between these formal representations. Both the existing synthesis algorithms and the new algorithms described in this thesis fit into this framework.
Three new allocation algorithms are presented in this thesis: an algorithm for optimal register allocation in cyclic data flow graphs, an exact polynomial algorithm for module allocation, and a new scheme to minimize the number of interconnections during all stages of the data path allocation. Cyclic data flow graphs result from high-level behavioral descriptions that contain loops. Algorithms for register allocation in high-level synthesis published up to now considered only loop-free data flow graphs. When these algorithms are applied to data flow graphs with loops, unnecessary register transfer operations are introduced. A new algorithm is presented that performs a minimal register allocation and eliminates all superfluous register transfer operations. The problem is reformulated as a multicommodity network flow problem, for which very efficient solutions exist. Experiments on a benchmark set have shown that in all test cases all register transfers could be eliminated at no increase in register cost. Only heuristic algorithms for the module allocation problem have appeared in the literature. The module allocation problem is usually defined as a clique cover problem on a so-called module allocation graph. It is shown that, under certain conditions, the module allocation graph belongs to the special class of comparability graphs. A polynomial-time algorithm can optimally find a clique cover of such a graph. Even when interconnect weights are taken into account, this can be solved exactly: the problem can be transformed into a maximal cost network flow problem, which can be solved exactly in polynomial time. An algorithm is described which solves the module allocation problem with interconnect weights exactly, with complexity O(kn^2), where n is the number of operations. In previous research, interconnection was optimized only after the module allocation for the operations and the register allocation for the variables had been done.
However, the amount of multiplexing and interconnect are crucial factors in both the delay and the area of a circuit. A new scheme is presented to minimize the number of interconnections during the data path allocation. This scheme first groups all values based on their read and write times. Values belonging to the same group can share a register file. This minimizes the number of data transfers with different sources and destinations. Secondly, registers are allocated for each group separately. Finally the interconnect allocation is done. During the interconnect allocation, the module allocation is determined. The value grouping is based on edge coloring algorithms, providing a sharp upper bound on the number of colors needed. Two techniques, splitting the read and write phases of values and introducing serial (re-)write operations for the same value, make it possible to use even more efficient exact edge coloring algorithms. It is shown that when variables are grouped into register files and operations are assigned to modules during the interconnection minimization, significant savings (20%) can be obtained in the number of local interconnections and the amount of global interconnect, at the expense of only slightly more register area.
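The grouping of values by read/write times described above rests on a simple observation: two values whose lifetimes do not overlap can share a register. A much-simplified sketch for the loop-free case (the thesis's cyclic-graph and network-flow formulations are considerably more involved) is the classic left-edge interval allocation:

```python
# Simplified lifetime-based register allocation for a loop-free schedule:
# values whose [write, last-read] intervals do not overlap share a register.
def allocate_registers(lifetimes):
    """lifetimes: {value: (write_time, last_read_time)} -> {value: register}."""
    registers = []                 # per register: time at which it becomes free
    assignment = {}
    for v, (start, end) in sorted(lifetimes.items(), key=lambda kv: kv[1][0]):
        for r, free_at in enumerate(registers):
            if free_at <= start:   # reuse the first register already free
                registers[r] = end
                assignment[v] = r
                break
        else:                      # otherwise open a new register
            assignment[v] = len(registers)
            registers.append(end)
    return assignment

# "a" dies at time 2, so "c" (written at 3) and "d" can reuse its register
alloc = allocate_registers({"a": (0, 2), "b": (1, 4), "c": (3, 5), "d": (5, 6)})
```

For interval lifetimes this greedy scan is optimal in the number of registers; it is the cyclic lifetimes arising from loops that force the multicommodity-flow reformulation the thesis contributes.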
keywords Digital Systems
series thesis:PhD
email
last changed 2003/02/12 22:37

_id ecaade2023_281
id ecaade2023_281
authors Prokop, Šimon, Kubalík, Jiří and Kurilla, Lukáš
year 2023
title Neural Networks for Estimating Wind Pressure on Complex Double-Curved Facades
doi https://doi.org/10.52842/conf.ecaade.2023.2.639
source Dokonal, W, Hirschberg, U and Wurzer, G (eds.), Digital Design Reconsidered - Proceedings of the 41st Conference on Education and Research in Computer Aided Architectural Design in Europe (eCAADe 2023) - Volume 2, Graz, 20-22 September 2023, pp. 639–647
summary Due to their complex geometry, it is challenging to assess wind effects on freeform, double-curved building facades. The traditional building code EN 1991-1-4 (730035) only accounts for basic shapes such as cubes, spheres and cylinders. Moreover, even though wind tunnel measurements are considered more precise than other methods, they are still limited by the number of measurement points that can be taken. This limitation, combined with the time and resources required for the analysis, can limit the ability to fully capture detailed wind effects on the whole complex freeform shape of the building. In this study, we propose the use of neural network models trained to predict wind pressure on complex double-curved facades. The neural network is a powerful data-driven machine learning technique that can, in theory, learn an approximation of any function from data, making it well suited to this application. Our approach was empirically evaluated using a set of 31 points measured in the wind tunnel on a 3D-printed model at 1:300 scale of the real architectural design of a concert hall in Ostrava. The results of this evaluation demonstrate the effectiveness of our neural network method in estimating wind pressures on complex freeform facades.
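The regression setup the summary describes (facade-point features in, scalar pressure out) can be illustrated with a tiny one-hidden-layer network trained by gradient descent. This is an assumed, illustrative sketch on synthetic data, not the authors' model or dataset:

```python
import numpy as np

# Illustrative sketch: a one-hidden-layer MLP regressor mapping synthetic
# facade-point features (3 values per point) to a scalar "pressure".
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 3))           # synthetic point features
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] * X[:, 2]   # synthetic pressure field

W1 = rng.normal(0, 0.5, (3, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)                    # hidden activations
    return h, (h @ W2 + b2).ravel()             # scalar prediction per point

_, pred0 = forward(X)
loss0 = np.mean((pred0 - y) ** 2)               # loss before training
lr = 0.05
for _ in range(500):                            # full-batch gradient descent
    h, pred = forward(X)
    err = (pred - y)[:, None] / len(X)          # grad of (1/2)*MSE w.r.t. pred
    gW2 = h.T @ err; gb2 = err.sum(0)
    dh = err @ W2.T * (1 - h ** 2)              # backprop through tanh
    gW1 = X.T @ dh; gb1 = dh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred = forward(X)
loss = np.mean((pred - y) ** 2)                 # falls well below loss0
```

With only 31 measured points, as in the study, a network this small (or smaller, with regularization) is the realistic regime; the sketch just shows the mechanics of fitting a smooth pressure surrogate.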
keywords wind pressure, double-curved façade, neural network
series eCAADe
email
last changed 2023/12/10 10:49
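As an illustration of the kind of surrogate model the abstract describes, the sketch below trains a small one-hidden-layer network by gradient descent on synthetic data standing in for the 31 wind-tunnel points. The architecture, learning rate, and the synthetic pressure function are assumptions made for the example, not the authors' actual model or measurements.

```python
# Illustrative sketch (not the paper's model): a tiny one-hidden-layer
# network mapping a facade point's coordinates to a synthetic stand-in
# for a measured pressure coefficient.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(31, 3))           # 31 measurement points (x, y, z)
y = np.sin(X[:, 0]) * X[:, 1] - 0.5 * X[:, 2]  # stand-in for measured pressure

W1 = rng.normal(0, 0.5, (3, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)

lr = 0.05
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)          # hidden layer activations
    pred = (h @ W2 + b2).ravel()      # predicted pressures
    err = pred - y
    # backpropagation of the mean-squared-error gradient
    gW2 = h.T @ err[:, None] / len(X); gb2 = err.mean(keepdims=True)
    dh = (err[:, None] @ W2.T) * (1 - h**2)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

mse = np.mean(err**2)
print(f"final training MSE: {mse:.4f}")
```

After training, the network should fit the 31 points noticeably better than the mean predictor, mirroring how a surrogate trained on sparse wind-tunnel data can interpolate pressures over the rest of the facade.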

_id 0b1c
authors Bridges, Alan
year 1991
title Computer Exercises in Architectural Design Theory
doi https://doi.org/10.52842/conf.ecaade.1991.x.f9w
source Experiences with CAAD in Education and Practice [eCAADe Conference Proceedings] Munich (Germany) 17-19 October 1991
summary This paper discusses how architectural theory may be taught using computer-based exercises to explore the practical application of those theories. The particular view of architecture developed is, necessarily, a restricted one, but the objectives behind the exercises are slightly different from those that a pure architectural theorist or historian might have. The formal teaching of architectural theory and composition has not been very fashionable in Schools of Architecture for several years now: indeed, there is a considerable inbuilt resistance in students to the application of any form of rules or procedures. There is, however, a general interest in computing, and this can be utilised to advantage. In concentrating on computer applications in design, eclectic use has been made of a number of architectural examples ranging from Greek temples to the work of modern deconstructionists. Architectural theory since Vitruvius is littered with attempts to define universal theories of design, and this paper certainly does not presume to anything so grand: I have merely looked at buildings, compared them and noted what they have in common and how that might relate to computer-aided design. I have ignored completely any sociological, philosophical or phenomenological questions but would readily agree with the criticism that Cartesian rationality is not, on its own, a sufficient base upon which to build a theory of design. However, I believe there is merit in articulating design by separating it from other concerns and making it a subject of study in its own right. Work in design research will provide the models and intellectual structures to facilitate discourse about design and might be expected to benefit the development of design skills by providing material that could be formally taught and debated in a way that is removed from the ephemeral "fashionable designer" debate.
Of course, some of the ideas discussed here may prove to be equally ephemeral but that does not entirely negate their value.

series eCAADe
email
last changed 2022/06/07 07:50

_id cdb1
authors Cornick, T., Noble, B. and Hallahan, C.
year 1991
title The Limitations of Current Working Practices on the Development of Computer Integrating Modelling in Construction
source computer Integrated Future, CIB W78 Seminar. Calibre, The Netherlands: Eindhoven University of Technology, september, 1991. Unnumbered. includes bibliography
summary For the Construction Industry to improve its processes through the application of computer-based systems, traditional working practices must first change to support the integrated control of design and construction. Current manual methods of practice accept the limitations of man to process a wide range of building performance and production information simultaneously. However, when these limitations are removed through the application of computer systems, the constraints of manual methods need no longer apply. The first generation of computer applications in the Construction Industry merely modelled the divided and sequential processes of manual methods, i.e. drafting, specification writing, engineering and quantity calculations, estimating, billing, material-ordering databases and activity planning. Use of these systems raised expectations that connections within the computer between the processes modelled could actually be made, and faster and more integrated information processing achieved. 'Linking' software was then developed. The end result of this approach was that users were able to produce information faster and present it in an impressive manner but, in reality, no perceived improvement in actual building performance, production economy or efficiency was realized. A current government-sponsored Teaching Company Programme with a UK design and build company is addressing the problem of how real economic benefit can be realized through improvement in, amongst other things, their existing computer applications. This work is being carried out both by considering an academic conceptual model of how 'designing for production' can be achieved in computer applications and by modelling what is immediately realizable in practice through the integration of a limited number of knowledge domains to which computers are already being applied, i.e. billing from design, estimating and buying. This paper describes each area of work and how they are impacting on each other.
keywords construction, building process, integration
series CADline
last changed 2003/06/02 13:58

_id eaca
authors Davis, L. (ed.)
year 1991
title Handbook of genetic algorithms
source Van Nostrand Reinhold, New York
summary This book sets out to explain what genetic algorithms are and how they can be used to solve real-world problems. The first objective is tackled by the editor, Lawrence Davis. The remainder of the book is turned over to a series of short review articles by a collection of authors, each explaining how genetic algorithms have been applied to problems in their own specific area of interest. The first part of the book introduces the fundamental genetic algorithm (GA), explains how it has traditionally been designed and implemented and shows how the basic technique may be applied to a very simple numerical optimisation problem. The basic technique is then altered and refined in a number of ways, with the effects of each change being measured by comparison against the performance of the original. In this way, the reader is provided with an uncluttered introduction to the technique and learns to appreciate why certain variants of GA have become more popular than others in the scientific community. Davis stresses that the choice of a suitable representation for the problem in hand is a key step in applying the GA, as is the selection of suitable techniques for generating new solutions from old. He is refreshingly open in admitting that much of the business of adapting the GA to specific problems owes more to art than to science. It is nice to see the terminology associated with this subject explained, with the author stressing that much of the field is still an active area of research. Few assumptions are made about the reader's mathematical background. The second part of the book contains thirteen cameo descriptions of how genetic algorithmic techniques have been, or are being, applied to a diverse range of problems. Thus, one group of authors explains how the technique has been used for modelling arms races between neighbouring countries (a non- linear, dynamical system), while another group describes its use in deciding design trade-offs for military aircraft. 
My own favourite is a rather charming account of how the GA was applied to a series of scheduling problems. Having attempted something of this sort with Simulated Annealing, I found it refreshing to see the authors highlighting some of the problems that they had encountered, rather than sweeping them under the carpet as is so often done in the scientific literature. The editor points out that there are standard GA tools available for either play or serious development work. Two of these (GENESIS and OOGA) are described in a short, third part of the book. As is so often the case nowadays, it is possible to obtain a diskette containing both systems by sending your Visa card details (or $60) to an address in the USA.
series other
last changed 2003/04/23 15:14
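The fundamental GA that the book's first part walks through, a population of bit-strings evolved by selection, crossover and mutation against a simple numerical objective, can be sketched roughly as below. Tournament selection and all parameter values here are illustrative choices for the example, not taken from the handbook.

```python
# Minimal genetic-algorithm sketch: maximise f(x) = x**2 over 5-bit
# integers, the classic textbook toy problem.
import random

random.seed(1)
BITS, POP, GENS = 5, 20, 40

def fitness(ind):                 # decode bit-string to an int, score x**2
    return int("".join(map(str, ind)), 2) ** 2

def tournament(pop):              # pick the fittest of 3 random individuals
    return max(random.sample(pop, 3), key=fitness)

def crossover(a, b):              # one-point crossover
    p = random.randrange(1, BITS)
    return a[:p] + b[p:]

def mutate(ind, rate=0.05):       # independent bit-flip mutation
    return [bit ^ (random.random() < rate) for bit in ind]

pop = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(POP)]
for _ in range(GENS):
    pop = [mutate(crossover(tournament(pop), tournament(pop)))
           for _ in range(POP)]

best = max(pop, key=fitness)
print(best, fitness(best))        # tends toward 11111, i.e. x = 31
```

As the review notes, the choice of representation (here a bit-string) and the operators for generating new solutions from old are the key design decisions when adapting the GA to a specific problem.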

_id c886
authors Graham, Ian
year 1991
title Object oriented methods
source Addison Wesley
summary This is another book aimed at helping those making decisions to arrive at better informed ones. This is a second (and substantially updated) edition of a book that was deservedly well reviewed when it was originally published. Those who have to give advice on the choice of any aspect of OO technology from design to programming and testing will know that they are faced with attempting to make decisions based on ill-informed and often biased sources of information. Ian Graham attempts to survey the whole field, laying out your choices for you rather than making them for you. In each aspect of the subject the result of reading Object-Oriented Methods will be to allow you to reach decisions based on an understanding of the problems and the current range of tools aimed at helping you solve them. If you have a serious decision to make this would be a good place to start before proceeding to a more detailed investigation of what seem the potentially best choices for you and your needs. The other group of people who will benefit from reading this book are those that want or need a general overview of the OO arena. This is a good text that should be read by students of Computing, those who recognise that good advice is based on a comprehensive knowledge of the field and those who have to make a practical commercial decision about which OO route to take.
series other
last changed 2003/04/23 15:14

_id 098a
authors Perron, Richard and Miller, Deron
year 1991
title Landscape of the Mind
doi https://doi.org/10.52842/conf.acadia.1991.071
source Reality and Virtual Reality [ACADIA Conference Proceedings / ISBN 1-880250-00-4] Los Angeles (California - USA) October 1991, pp. 71-86
summary The focus of this article is the exploration of landscape and the question of representation, more specifically how landscape principles can be represented through computation. It is a quest for essential qualities, through an application of philosophical questioning, and a response to a human perception of reality. Reality, as an invention of the human mind, is often thought of as a set of accepted conventions and constructs. Such a reality has an inherent dependency upon cognition, where spatial and temporal principles may be defined within the natural and built environment, and further embraced within a cultural context. However, there also exist rules or relations that are neither invented nor formulated by the participant's understanding. In effect, these relations may not have been effectively articulated, a result perhaps of unfamiliar cues. Therefore, to the participant, these relations reside in the realm of the unknown or even the mystic. The aesthetic often resides in the realm of the mystic. The discovery of the aesthetic is often an experience that comes from encountering physical and essential beauty where it has been produced through unconscious relations, perceived, yet transcending human understanding. The aspects of space and time, spatial and temporal properties and relations of things and events, are generally accepted conventions. Yet the existence of a time order is often not perceived. An understanding of spatial-temporal properties may involve a temporal detachment from convention, allowing the release of previously unknown patterns and relations. Virtual realities are well-constructed simulations of our environments, yet they may lack the embedded essential qualities of place. Virtual reality should transcend human perception and traditional modes of understanding, and most importantly our limited notions of the temporal nature of our environment.
A desire to reach beyond the limits of perceived time order, may take us beyond existing sets of cultural values, and lead to the realization of new spatial/temporal conventions with the assistance of the computer.
series ACADIA
last changed 2022/06/07 08:00
