CumInCAD is a cumulative index of publications in Computer Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD Futures

Hits 1 to 20 of 221

_id 84a7
authors Kalay, Yehuda E.
year 1991
title Multi-Faceted, Dynamic Representation of Design Knowledge
source ARCC Conference on Reflections on Representations. September, 1991. [16] p. : ill. includes bibliography
summary Explicit representation of design knowledge is needed if scientific methods are to be applied in design research, and if computers are to be used to aid design education and practice. The representation of knowledge in general, and of design knowledge in particular, has been the subject matter of computer science, design methods, and computer-aided design research for quite some time. Several models of design knowledge representation have been developed over the last 30 years, addressing specific aspects of the problem. This paper describes an approach that recognizes the multiplicity of design knowledge representation modalities and the dynamic nature of the represented knowledge. It uses a variety of computational tools to encode different aspects of design knowledge, including the realities, perceptions and intentions it comprises. The representation is intended to form a parsimonious, communicable and presentable knowledge base that can be used as a tool for design research and education
keywords design, knowledge, representation, architecture, integration
series CADline
last changed 2003/06/02 10:24

_id e573
authors McLaughlin, Sally
year 1991
title Reading Architectural Plans: A Computable Model
source Computer Aided Architectural Design Futures: Education, Research, Applications [CAAD Futures ‘91 Conference Proceedings / ISBN 3-528-08821-4] Zürich (Switzerland), July 1991, pp. 347-364
summary A fundamental aspect of the expertise of the architectural designer is the ability to assess the quality of existing and developing proposals from schematic representations such as plans, elevations and sections. In this paper I present a computable model of those aspects of the evaluation of architectural floor plans that I believe to be amenable to rule-like formulation. The objective behind the development of this model is twofold: 1) to articulate a belief about the role of simple symbolic representations in the task of evaluation, a task which relies primarily on uniquely human capabilities; and 2) to explore the possible uses of such representations in the development of design expertise. Input to the model consists of a specification of a design proposal in terms of walls, doors, windows, openings and spaces, together with a specification of the context in which the proposal has been developed. Information about context is used to retrieve the goal packages relevant to the evaluation of the proposal. The goal packages encode information about requirements such as circulation, visual privacy and thermal performance. Generic associations between aspects of a plan and individual goals are used to establish if and how each of the goals has been achieved in the given proposal. These associations formalize relationships between aspects of the topology of the artefact, such as the existence of a door between two rooms, and a goal, in this case the goal of achieving circulation between those two rooms. Output from the model consists of both a graphic representation of the way in which goals are achieved and a list of those goals that have not been achieved. The list of goals not achieved provides a means of accessing appropriate design recommendations. What the model provides is essentially a computational tool for exploring the value judgements made in a particular proposal given a set of predefined requirements such as those to be found in design recommendation literature.
series CAAD Futures
last changed 1999/04/07 12:03
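
The goal-checking step in the abstract above lends itself to a small illustration. The following is a minimal Python sketch, assuming an invented room layout and goal list: circulation between two spaces is taken to be achieved if a door connects them. It shows only the shape of the generic association between plan topology and goals, not the original model.

    # Toy plan topology: door connections between named spaces (assumed data).
    doors = {("kitchen", "dining"), ("dining", "living"), ("living", "hall")}

    def connected(a, b):
        """True if a door joins the two spaces, in either orientation."""
        return (a, b) in doors or (b, a) in doors

    # Goal packages retrieved for the design context: each goal pairs a
    # requirement with the topological test that establishes it.
    goals = [
        ("circulation kitchen-dining", lambda: connected("kitchen", "dining")),
        ("circulation kitchen-hall",   lambda: connected("kitchen", "hall")),
    ]

    achieved = [name for name, test in goals if test()]
    failed   = [name for name, test in goals if not test()]
    print("achieved:", achieved)
    print("not achieved:", failed)   # entry point to design recommendations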

_id ga0010
authors Moroni, A., Zuben, F. Von and Manzolli, J.
year 2000
title ArTbitrariness in Music
source International Conference on Generative Art
summary Evolution is now considered not only powerful enough to bring about biological entities as complex as humans and consciousness, but also useful in simulation to create algorithms and structures of higher levels of complexity than could easily be built by design. In the context of artistic domains, the process of human-machine interaction is analyzed as a good framework to explore creativity and to produce results that could not be obtained without this interaction. When evolutionary computation and other computational intelligence methodologies are involved, we denote every attempt to improve aesthetic judgement as ArTbitrariness, interpreted as an interactive iterative optimization process. ArTbitrariness is also suggested as an effective way to produce art through an efficient manipulation of information and a proper use of computational creativity to increase the complexity of the results without neglecting the aesthetic aspects [Moroni et al., 2000]. Our emphasis will be on an approach to interactive music composition. The problem of computer generation of musical material has received extensive attention, and a subclass of the field of algorithmic composition includes those applications which use the computer as something in between an instrument, which a user "plays" through the application's interface, and a compositional aid, which a user experiments with in order to generate stimulating and varied musical material. This approach was adopted in Vox Populi, a hybrid made up of an instrument and a compositional environment. Differently from other systems based on genetic algorithms or evolutionary computation, in which people have to listen to and judge the musical items, Vox Populi uses the computer and the mouse as real-time music controllers, acting as a new interactive computer-based musical instrument. The interface is designed to be flexible for the user to modify the music being generated. It explores evolutionary computation in the context of algorithmic composition and provides a graphical interface that allows the user to modify the tonal center and the voice range, changing the evolution of the music by using the mouse [Moroni et al., 1999]. A piece of music consists of several sets of musical material manipulated and exposed to the listener, for example pitches, harmonies, rhythms, timbres, etc. They are composed of a finite number of elements and, basically, the aim of a composer is to organize those elements in an aesthetic way. Modeling a piece as a dynamic system implies a view in which the composer draws trajectories or orbits using the elements of each set [Manzolli, 1991]. Nonlinear iterative mappings are associated with interface controls. The mappings may give rise to attractors, defined as geometric figures that represent the set of stationary states of a nonlinear dynamic system, or simply trajectories to which the system is attracted. The relevance of this approach goes beyond music applications per se. Computer music systems that are built on the basis of a solid theory can be coherently embedded into multimedia environments. The richness and specialty of the music domain are likely to initiate new thinking and ideas, which will have an impact on areas such as knowledge representation and planning, and on the design of visual formalisms and human-computer interfaces in general.
[Figures: the Vox Populi interface, showing two nonlinear iterative mappings with their resulting musical pieces.] References: [Manzolli, 1991] J. Manzolli, Harmonic Strange Attractors, CEM Bulletin, Vol. 2, No. 2, 4-7, 1991. [Moroni et al., 1999] A. Moroni, J. Manzolli, F. Von Zuben, R. Gudwin, Evolutionary Computation Applied to Algorithmic Composition, Proceedings of CEC99, IEEE International Conference on Evolutionary Computation, Washington D.C., 807-811, 1999. [Moroni et al., 2000] A. Moroni, F. Von Zuben, J. Manzolli, ArTbitration, Proceedings of the 2000 Genetic and Evolutionary Computation Conference Workshop Program (GECCO), Las Vegas, 143-145, 2000.
series other
more http://www.generativeart.com/
last changed 2003/08/07 17:25
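
A toy analogue of the nonlinear iterative mappings described above can be written in a few lines: iterate a map and project its orbit onto a pitch range. The parameter names (tonal centre, voice range) echo the interface controls mentioned in the abstract, but the logistic map and the projection are illustrative assumptions, not Vox Populi's actual mappings.

    def orbit_to_pitches(r=3.8, x0=0.4, centre=60, voice_range=24, n=16):
        """Map n iterates of x -> r*x*(1-x) onto MIDI pitches around `centre`."""
        x, pitches = x0, []
        for _ in range(n):
            x = r * x * (1 - x)          # nonlinear iterative mapping
            pitches.append(int(centre - voice_range / 2 + x * voice_range))
        return pitches

    print(orbit_to_pitches())            # chaotic regime: wandering melody
    print(orbit_to_pitches(r=3.2))       # settles toward a period-2 attractor

Changing r moves the orbit between fixed points, cycles and chaos, which is the sense in which interface controls can steer the character of the evolving material.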

_id b6b3
authors Brown, J.S. and Duguid, P.
year 1991
title Organizational Learning and Communities-of-Practice: Toward a Unified View of Working, Learning, and Innovation
source Organization Science, 2(1), 40-57
summary Recent ethnographic studies of workplace practices indicate that the ways people actually work usually differ fundamentally from the ways organizations describe that work in manuals, training programs, organizational charts, and job descriptions. Organizations tend to rely on the latter in their attempts to understand and improve work practice. We relate the conclusions of one study of work practices to compatible investigations of learning and innovation to argue that conventional descriptions of jobs mask not only the ways people work, but also the learning and innovation generated in the informal communities-of-practice in which they work. By reassessing the apparently conflicting triad of work, learning, and innovation in the context of actual communities and actual practices, we suggest that the synergistic connections between these three become apparent. With a unified view of working, learning, and innovating, it should be possible to reconceive of and redesign organizations to improve all three.
series journal paper
last changed 2003/04/23 15:14

_id ga0024
authors Ferrara, Paolo and Foglia, Gabriele
year 2000
title TEAnO or the computer assisted generation of manufactured aesthetic goods seen as a constrained flux of technological unconsciousness
source International Conference on Generative Art
summary TEAnO (Telematica, Elettronica, Analisi nell'Opificio) was born in Florence in 1991, at the age of 8, being the direct consequence of years of attempts by a group of computer science professionals to use digital computer technology to find a sustainable match among creation, generation (or re-creation) and recreation, the three basic keywords underlying the concept of "Littérature potentielle" deployed by Oulipo in France and Oplepo in Italy (see "La Littérature potentielle (Créations Re-créations Récréations)", published in France by Gallimard in 1973). During the last decade, TEAnO has been involved in the generation of "artistic goods" in aesthetic domains such as literature, music, theatre and painting. In all those artefacts the computer plays a twofold role: it is often a tool to generate the good (e.g. an editor to compose palindrome sonnets or to generate antonymic music) and sometimes it is the medium that makes the fruition of the good possible (e.g. the generator of passages of definition literature). In that sense such artefacts can actually be considered as "manufactured" goods. A great part of such creation and re-creation work has been based upon a rather small number of generation constraints borrowed from Oulipo, deeply stressed by the massive combinatory power of the digital computer: S+n, edge extraction, phonetic manipulation, re-writing of well-known masterpieces, random generation of plots, etc. Despite these apparently simple underlying generation mechanisms, the systematic use of computer-based tools, as well as the analysis of the produced results, has been the way to highlight two findings which can significantly affect the practice of computer-based generation of aesthetic goods: (1) the deep structure of an aesthetic work persists even through the most "destructive" manipulations (such as the antonymic transformation of the melody and lyrics of a music work) and becomes evident as a sort of profound, earliest and distinctive constraint; (2) the intensive flux of computer-generated "raw" material seems to confirm and to bring to our attention the existence of what Walter Benjamin indicated as the different way in which nature talks to a camera and to our eye, and what Franco Vaccari called "technological unconsciousness". Essential references: R. Campagnoli, Y. Hersant, "Oulipo La letteratura potenziale (Creazioni Ri-creazioni Ricreazioni)", 1985; R. Campagnoli, "Oupiliana", 1995; TEAnO, "Quaderno n. 2 Antologia di letteratura potenziale", 1996; W. Benjamin, "Das Kunstwerk im Zeitalter seiner technischen Reproduzierbarkeit", 1936; F. Vaccari, "Fotografia e inconscio tecnologico", 1994
series other
more http://www.generativeart.com/
last changed 2003/08/07 17:25
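
Of the Oulipian constraints listed above, S+n is concrete enough for a small sketch: each noun is replaced by the noun found n entries later in a lexicon. The word list and the crude "in the lexicon means it is a noun" test below are stand-ins for a full dictionary and part-of-speech tagging.

    LEXICON = ["cloud", "computer", "dream", "engine", "flower",
               "garden", "house", "island", "journey", "kingdom"]

    def s_plus_n(text, n=3):
        """Oulipian S+n: shift every recognized noun n places in the lexicon."""
        index = {w: i for i, w in enumerate(LEXICON)}
        out = []
        for word in text.split():
            if word in index:            # treat lexicon hits as nouns
                out.append(LEXICON[(index[word] + n) % len(LEXICON)])
            else:
                out.append(word)
        return " ".join(out)

    print(s_plus_n("the computer in the garden"))   # -> the flower in the journey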

_id 3105
authors Novak, T.P., Hoffman, D.L., and Yung, Y.-F.
year 1996
title Modeling the structure of the flow experience
source INFORMS Marketing Science and the Internet Mini-Conference, MIT
summary The flow construct (Csikszentmihalyi 1977) has recently been proposed by Hoffman and Novak (1996) as essential to understanding consumer navigation behavior in online environments such as the World Wide Web. Previous researchers (e.g. Csikszentmihalyi 1990; Ghani, Supnick and Rooney 1991; Trevino and Webster 1992; Webster, Trevino and Ryan 1993) have noted that flow is a useful construct for describing more general human-computer interactions. Hoffman and Novak define flow as "the state occurring during network navigation which is: 1) characterized by a seamless sequence of responses facilitated by machine interactivity, 2) intrinsically enjoyable, 3) accompanied by a loss of self-consciousness, and 4) self-reinforcing." To experience flow while engaged in an activity, consumers must perceive a balance between their skills and the challenges of the activity, and both their skills and challenges must be above a critical threshold. Hoffman and Novak (1996) propose that flow has a number of positive consequences from a marketing perspective, including increased consumer learning, exploratory behavior, and positive affect.
series other
last changed 2003/04/23 15:50
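
The skills/challenges condition in the definition above can be rendered as a small worked example. The threshold, tolerance and four-state labelling follow the usual flow-channel reading of Csikszentmihalyi and are assumptions for illustration, not the authors' operationalization.

    def state(skill, challenge, threshold=5.0, tolerance=1.0):
        """Classify a skill/challenge pair per the balance-plus-threshold rule."""
        if abs(skill - challenge) <= tolerance:
            if skill >= threshold and challenge >= threshold:
                return "flow"      # balanced and both above the critical threshold
            return "apathy"        # balanced but below threshold: no flow
        return "anxiety" if challenge > skill else "boredom"

    for s, c in [(7, 7), (3, 8), (8, 3), (2, 2)]:
        print((s, c), "->", state(s, c))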

_id b5be
authors Stok, Leon
year 1991
title Architectural synthesis and optimization of digital systems
source Eindhoven University of Technology
summary High-level synthesis means going from a functional specification of a digital system at the algorithmic level to a register transfer level structure. Different applications will ask for different design styles. Despite this diversity in design styles, many tasks in the synthesis will be similar. There is no need to write a new synthesis system for each design style. The best way forward seems to be a decomposition of the high-level synthesis problem into several well-defined subproblems. How the problem is decomposed depends heavily on (a) the type of network architecture chosen, (b) the constraints applied to the design and (c) the functional description itself. From this architecture style, the constraints and the functional description a synthesis scheme can be derived. Once this scheme is fixed, algorithms can be chosen which fit into this scheme and solve the subproblems in a fast and, when possible, optimal way. To support such a synthesis philosophy, a framework is needed in which all design information can be stored in a unique way during the various phases of the design process. This asks for a design database capable of handling all design information, with a formally defined interface to all design tools. This thesis gives a formal way to describe the functional representation, the register transfer level structure and the controller, and the relations between all three of them. Special attention has been paid to the efficient representation of mutually exclusive operations and array accesses. The scheduling and allocation problems are defined as mappings between these formal representations. Both the existing synthesis algorithms and the new algorithms described in this thesis fit into this framework. Three new allocation algorithms are presented in this thesis: an algorithm for optimal register allocation in cyclic data flow graphs, an exact polynomial algorithm to do the module allocation and a new scheme to minimize the number of interconnections during all stages of the data path allocation. Cyclic data flow graphs result from high-level behavioral descriptions that contain loops. Algorithms for register allocation in high-level synthesis published up till now only considered loop-free data flow graphs. When these algorithms are applied to data flow graphs with loops, unnecessary register transfer operations are introduced. A new algorithm is presented that performs a minimal register allocation and eliminates all superfluous register transfer operations. The problem is reformulated as a multicommodity network flow problem for which very efficient solutions exist. Experiments on a benchmark set have shown that in all test cases all register transfers could be eliminated at no increase in register cost. Only heuristic algorithms have appeared in the literature to solve the module allocation problem. The module allocation problem is usually defined as a clique cover problem on a so-called module allocation graph. It is shown that, under certain conditions, the module allocation graph belongs to the special class of comparability graphs. A polynomial-time algorithm can optimally find a clique cover of such a graph. Even when interconnect weights are taken into account, this can be solved exactly. The problem can be transformed into a maximal cost network flow problem, which can be solved exactly in polynomial time.
An algorithm is described which solves the module allocation problem with interconnect weights exactly, with a complexity O(kn²), where n is the number of operations. In previous research, interconnection was optimized after the module allocation for the operations and the register allocation for the variables had already been done. However, the amount of multiplexing and interconnect are crucial factors for both the delay and the area of a circuit. A new scheme is presented to minimize the number of interconnections during the data path allocation. This scheme first groups all values based on their read and write times. Values belonging to the same group can share a register file. This minimizes the number of data transfers with different sources and destinations. Secondly, registers are allocated for each group separately. Finally the interconnect allocation is done. During the interconnect allocation, the module allocation is determined. The value grouping is based on edge coloring algorithms, providing a sharp upper bound on the number of colors needed. Two techniques, splitting the read and write phases of values and introducing serial (re-)write operations for the same value, make it possible to use even more efficient exact edge coloring algorithms. It is shown that when variables are grouped into register files and operations are assigned to modules during the interconnection minimization, significant savings (20%) can be obtained in the number of local interconnections and the amount of global interconnect, at the expense of only slightly more register area.
keywords Digital Systems
series thesis:PhD
last changed 2003/02/12 22:37
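
The register-sharing idea above can be illustrated for the loop-free case with the classic left-edge scheme: values whose lifetimes do not overlap may share a register. This is only a toy; the thesis's contribution is precisely the cyclic case, formulated as a multicommodity network flow problem, which this sketch does not attempt. Lifetimes as (write time, last read time) pairs are assumed data.

    def left_edge(lifetimes):
        """Assign each (start, end) lifetime to the lowest-numbered free register."""
        registers = []                 # registers[i] = end time of its last value
        assignment = {}
        for name, (start, end) in sorted(lifetimes.items(),
                                         key=lambda kv: kv[1][0]):
            for i, busy_until in enumerate(registers):
                if busy_until <= start:    # register free again: reuse it
                    registers[i] = end
                    assignment[name] = i
                    break
            else:
                registers.append(end)      # no free register: allocate a new one
                assignment[name] = len(registers) - 1
        return assignment

    values = {"a": (0, 3), "b": (1, 4), "c": (3, 6), "d": (4, 7)}
    print(left_edge(values))   # a and c share, b and d share: 2 registers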

_id ga9921
authors Coates, P.S. and Hazarika, L.
year 1999
title The use of genetic programming for applications in the field of spatial composition
source International Conference on Generative Art
summary Architectural design teaching using computers has been a preoccupation of CECA since 1991. All design tutors provide their students with a set of models and ways to form, and we have explored a set of approaches including cellular automata, genetic programming, agent-based modelling and shape grammars as additional tools with which to explore architectural (and architectonic) ideas. This paper discusses the use of genetic programming (G.P.) for applications in the field of spatial composition. CECA has been developing the use of genetic programming for some time (see references) and has covered the evolution of L-system production rules (Coates 1997, 1999b) and the evolution of generative grammars of form (Coates 1998, 1999a). The G.P. was used to generate three-dimensional spatial forms from a set of geometrical structures. The approach uses genetic programming with a genetic library (G.Lib). G.P. provides a way to genetically breed a computer program to solve a problem; G.Lib enables genetic programming to define potentially useful subroutines dynamically during a run. The work covers: exploring a shape grammar consisting of simple solid primitives and transformations; applying a simple fitness function to the solid-breeding G.P.; exploring a shape grammar of composite surface objects; developing grammars for existing buildings and creating hybrids; and exploring the shape grammar of a building within a G.P. We will report on new work using a range of different morphologies (boolean operations, surface operations and grammars of style) and describe the use of objective functions (natural selection) and the "eyeball test" (artificial selection) as ways of controlling and exploring the design spaces thus defined.
series other
more http://www.generativeart.com/
last changed 2003/08/07 17:25
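
A minimal, self-contained illustration of breeding forms against an objective function follows. It is closer to a plain genetic algorithm over a fixed-length move string than to the tree-based G.P. with genetic library described above, the grid encoding and fitness are invented for the example, and the "eyeball test" (artificial selection) is necessarily absent since it needs a human in the loop.

    import random

    MOVES = {"N": (0, 1), "S": (0, -1), "E": (1, 0), "W": (-1, 0)}
    TARGET = {(x, y) for x in range(4) for y in range(2)}   # a 4x2 slab

    def develop(genome):
        """Interpret the genome as a growth process from the origin."""
        pos, cells = (0, 0), {(0, 0)}
        for move in genome:
            dx, dy = MOVES[move]
            pos = (pos[0] + dx, pos[1] + dy)
            cells.add(pos)
        return cells

    def fitness(genome):
        cells = develop(genome)
        return len(cells & TARGET) - len(cells - TARGET)  # reward fit, punish spill

    def mutate(genome):
        i = random.randrange(len(genome))
        return genome[:i] + [random.choice("NSEW")] + genome[i + 1:]

    random.seed(1)
    population = [[random.choice("NSEW") for _ in range(10)] for _ in range(30)]
    for generation in range(40):
        population.sort(key=fitness, reverse=True)
        survivors = population[:10]                       # natural selection
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(20)]

    best = max(population, key=fitness)
    print("".join(best), fitness(best))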

_id a6be
authors Doyle, J.
year 1991
title Static and Dynamic Analysis of Structures, with an emphasis on mechanics and computer methods
source Kluwer Academic Pub., Dordrecht
summary This book is concerned with the static and dynamic analysis of structures. Specifically, it uses the stiffness formulated matrix methods for use on computers to tackle some of the fundamental problems facing engineers in structural mechanics. This is done by covering the Mechanics of Structures, its rephrasing in terms of the Matrix Methods and then their Computational implementation, all within a cohesive setting. Although the book is designed primarily as a text for use at the upper-graduate and beginning graduate level, many practising structural engineers will find it useful as a reference and self-study guide. Each chapter is supplemented with a collection of pertinent problems that indicate extensions of the theory and the applications. These problems, combined with selected references to relevant literature, can form the basis for further study. The book sets as its goal the treatment of structural dynamics starting with the basic mechanics and going all the way to their implementation on digital computers. Rather than discuss particular commercial packages, Professor Doyle has developed STADYN: a complete (but lean) program to perform each of the standard procedures used in commercial programs. Each module in the program is reasonably complete in itself, and all were written with the sole aim of clarity plus a modicum of efficiency, compactness and elegance.
series other
last changed 2003/04/23 15:14
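
The stiffness-formulated matrix method at the core of the book reduces, in its simplest form, to assembling the global stiffness matrix K and solving K u = f. A minimal sketch with two springs in series follows; the stiffnesses, load and use of numpy are illustrative assumptions, not material from STADYN.

    import numpy as np

    k1, k2 = 100.0, 200.0                   # spring stiffnesses (N/mm), assumed
    K = np.zeros((3, 3))
    for (a, b), k in [((0, 1), k1), ((1, 2), k2)]:
        K[a, a] += k; K[b, b] += k          # element matrices scattered
        K[a, b] -= k; K[b, a] -= k          # into the global matrix

    f = np.array([0.0, 0.0, 10.0])          # 10 N pulling on the free end
    free = [1, 2]                           # node 0 is fixed: u[0] = 0
    u = np.zeros(3)
    u[free] = np.linalg.solve(K[np.ix_(free, free)], f[free])
    print(u)                                # u1 = 0.1 mm, u2 = 0.15 mm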

_id 467d
authors Eastman, Charles M.
year 1991
title A Data Model Analysis of Modularity and Extensibility in Building Databases
source February, 1991. Report No. 16: This paper uses data modeling techniques to define how database schemas for an intelligent integrated architectural CAD system can be made extensible. It reviews the product data modeling language EDM, then applies it to define a part of an architectural data model. Extensions are then investigated, regarding how users could integrate various design-specific packages into a uniquely configured system
summary Both extension by substituting one technology for another and extension by adding a new evaluation application are considered. Data modeling allows specification of a CAD database and identification of the kind of modularization that will work and what problems may arise
keywords database, building, modeling, CAD, integration, systems, architecture, design
series CADline
last changed 2003/05/17 10:15

_id a6d4
authors Krueger, Myron W.
year 1991
title Artificial Reality II
source Reading, Massachusetts: Addison-Wesley Publishing. 2nd.ed.
summary The focus of Myron Krueger in Artificial Reality II is the interaction between humans and machines, both in the immediate interface and the associated cultural relationships. He uses the concept of artificial reality as a medium of experience and as a tool to examine the relationships between people and machines. When he first coined the term in the mid-1970s, his 'goal was full-body participation in computer events that were so compelling that they would be accepted as real experience.' He wanted to create an artificial reality that would perceive human actions in a simulated world of sight, sounds, and other sensations and would make the experience of this illusion convincing. His focus was to create unencumbered, artificial realities where the humans could participate with their entire body without wearing any special instruments (be they sensors or displays) in an experience created by the computer. The environment could be controlled by preexisting programs, or could have operators intervene and use the computer to amplify their ability to interact with people. The intention was not to reproduce conventional reality but to create synthetic realities.
series other
last changed 2003/04/23 15:14

_id a113
authors Milne, Murray
year 1991
title Design Tools: Future Design Environments for Visualizing Building Performance
source Computer Aided Architectural Design Futures: Education, Research, Applications [CAAD Futures ‘91 Conference Proceedings / ISBN 3-528-08821-4] Zürich (Switzerland), July 1991, pp. 485-496
summary In the future of Computer Aided Architectural Design (CAAD), architects clearly need more than just computer aided design and drafting systems (CAD). Unquestionably CAD systems continue to become increasingly powerful, but there is more to designing a good building than its three-dimensional existence, especially in the eyes of all the non-architects of the world: users, owners, contractors, regulators, environmentalists. The ultimate measure of a building's quality has something to do with how well it behaves over time. Predictions about its performance have many different dimensions: how much it costs to build, to operate, and to demolish; how comfortable it is; how effectively people can perform their functions in it; how much energy it uses or wastes. Every year dozens of building performance simulation programs are being written that can predict performance over time along any of these dimensions. That is why the need for both CAD systems and performance predictors can be taken for granted, and why instead it may be more interesting to speculate about the need for 'design tools'. A design tool can be defined as a piece of software that is easy and natural for architects to use, that easily accommodates three-dimensional representations of the building, and that predicts something useful about a building's performance. There are at least five different components of design tools that will be needed for the design environment of the future.
series CAAD Futures
last changed 2003/05/16 20:58

_id 2914
authors Mortola, Elena and Giangrande, Alessandro
year 1991
title An Evaluation Module for "An Interface for Designing" (AID)- A Procedure based on Trichotomic Segmentation
source Computer Aided Architectural Design Futures: Education, Research, Applications [CAAD Futures ‘91 Conference Proceedings / ISBN 3-528-08821-4] Zürich (Switzerland), July 1991, pp. 139-154
summary The paper illustrates a model used to construct the evaluation module for "An Interface for Designing" (AID), a system to aid architectural design. The model can be used at the end of every cycle of analysis-synthesis-evaluation in the intermediate phases of design development. With the aid of this model it is possible to evaluate the quality of a project in overall terms, to establish whether the project is acceptable, whether it should be elaborated ex novo or whether it is necessary to begin a new cycle to improve it. In this last case it is also possible to evaluate the effectiveness of the possible actions and strategies for improvement. The model is based on a procedure of trichotomic segmentation, developed within MCDA (Multi-Criteria Decision Aid), which uses the outranking relation to compare the project with some evaluation profiles taken as reference projects. In the paper an example of the application of the model in the teaching field is also described.
series CAAD Futures
last changed 1999/04/07 12:03
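
Trichotomic segmentation as described above can be sketched compactly: score the project on each criterion, test whether it outranks two reference profiles under a weighted-majority (concordance) rule, and sort it into one of three categories. The weights, profiles and the 0.6 concordance cut below are illustrative assumptions in the spirit of ELECTRE TRI, not the AID module itself.

    WEIGHTS = {"daylight": 0.3, "circulation": 0.4, "cost": 0.3}
    GOOD       = {"daylight": 7, "circulation": 7, "cost": 6}   # reference profiles
    ACCEPTABLE = {"daylight": 5, "circulation": 5, "cost": 4}

    def outranks(project, profile, cut=0.6):
        """Concordance test: weighted share of criteria where project >= profile."""
        concordance = sum(w for c, w in WEIGHTS.items()
                          if project[c] >= profile[c])
        return concordance >= cut

    def segment(project):
        if outranks(project, GOOD):
            return "acceptable as is"
        if outranks(project, ACCEPTABLE):
            return "start a new improvement cycle"
        return "elaborate ex novo"

    print(segment({"daylight": 8, "circulation": 7, "cost": 6}))
    print(segment({"daylight": 6, "circulation": 5, "cost": 3}))
    print(segment({"daylight": 2, "circulation": 3, "cost": 2}))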

_id f2f6
authors Papper, M., Danahy, J. and Baecker, R.
year 1991
title Predictable Modelling Interaction Using High-Level Constraints: Making Objects Behave As They Would In Our Environment
doi https://doi.org/10.52842/conf.acadia.1991.211
source Reality and Virtual Reality [ACADIA Conference Proceedings / ISBN 1-880250-00-4] Los Angeles (California - USA) October 1991, pp. 211-222
summary An approach for a graphical CAD system that is capable of representing physical and geometric aspects of a design using high-level constraints (HLCs) is presented. A prototype spatial planning system incorporating constraints is used in an interactive manner to refine designs by following an iterative approach which uses visual information to evaluate the design at each stage of iteration. High-level constraints aid this iterative approach by influencing (or constraining) the behavior of objects as they are interactively manipulated during the design stage of problem solving. High-level constraints also define the scaling properties of objects which are useful during the construction stage of problem solving. This system and its implications for the design of CAD systems incorporating HLCs are discussed.
series ACADIA
last changed 2022/06/07 08:00
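
A high-level constraint acting during direct manipulation, the central idea above, can be suggested in a few lines: each candidate position produced by a drag is projected back onto the constraint surface before display, so the object behaves as it would in our environment. The scene, constraint and function names are invented; the paper's HLC machinery is richer than this.

    DESK = {"x": (0, 100), "y": (0, 60), "top_z": 75}   # assumed scene data

    def constrain_to_desk(x, y, z):
        """Clamp x,y to the desk extents and pin z to the desk top."""
        cx = min(max(x, DESK["x"][0]), DESK["x"][1])
        cy = min(max(y, DESK["y"][0]), DESK["y"][1])
        return cx, cy, DESK["top_z"]

    # A drag gesture: raw cursor positions in, constrained positions out.
    for raw in [(10, 10, 80), (120, 30, 90), (50, -5, 40)]:
        print(raw, "->", constrain_to_desk(*raw))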

_id a9bc
authors Ronchi, Alfredo
year 1991
title CAAD Technical Information Management by Hypertext
doi https://doi.org/10.52842/conf.ecaade.1991.x.j4d
source Experiences with CAAD in Education and Practice [eCAADe Conference Proceedings] Munich (Germany) 17-19 October 1991
summary Research into applications concerning the design, sizing and building of computer models has, in recent years, undoubtedly been of great importance and interest. Analyzing in detail the graphic packages for drafting and solid modelling, we can say that these are nowadays an integral part of our daily work. In the near future we can expect from those applications new studies and research, mainly concerning an easier start-up and the standardization of the graphic interface; if we analyze, for example, the well-known package AutoCAD, we can expect a true database and the redesign of the interface on a graphic basis (graphic selection of drawings and blocks, icons for commands, better text editing, pattern editing and stretching, the ability to load and display several drawings in graphic windows, full compatibility with MS Windows, etc.). As mentioned above, these studies aim at updating well-known existing applications and consolidating their use; one specific area of design not yet supported by computer applications is the management of technical and non-technical information, nowadays still written and stored on paper.
series eCAADe
last changed 2022/06/07 07:50

_id 26ef
authors Seebohm, Thomas
year 1991
title A Possible Palladian Villa
doi https://doi.org/10.52842/conf.acadia.1991.135
source Reality and Virtual Reality [ACADIA Conference Proceedings / ISBN 1-880250-00-4] Los Angeles (California - USA) October 1991, pp. 135-166
summary Ever since Wittkower published Architectural Principles in the Age of Humanism in 1949, in which he showed that Palladio's villa plans are based on a tartan grid, it seemed that Palladio's design principles had been encapsulated. When subsequently, in 1978, Mitchell and Stiny enunciated all the topological possibilities for Palladian villa plans, it appeared that the case was closed. Freedman and Hersey have since shown that it is precisely in the application of specific building dimensions and proportions that additional design rules come into play, however. The present study builds on the work of Freedman and Hersey. It uses and extends their method which involves incorporation of the known design principles for Palladian villas, as given implicitly in Palladio's Four Books of Architecture and in his built works, into a computer program capable of generating schematic plans and elevations based on those principles and visually comparing the generated plans and elevations with the known works of Palladio. In cases of disagreement, the reasons for the disagreement help formulate further design rules.
series ACADIA
last changed 2022/06/07 07:56
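
The tartan grid that Wittkower identified, mentioned above, is easy to sketch: wide room bands alternate with narrow wall/passage bands, and rooms sit at the intersections of wide bands. The dimensions and naming below are assumptions for illustration; the paper's further proportional rules are not modelled.

    WIDE, NARROW = 12, 3                  # band widths in feet (assumed)

    def tartan(n_bands):
        """Alternating wide/narrow band widths, wide bands first and last."""
        return [WIDE if i % 2 == 0 else NARROW for i in range(n_bands)]

    def room_cells(x_bands, y_bands):
        """Rooms occupy intersections of wide bands only."""
        rooms = []
        for i, w in enumerate(x_bands):
            for j, h in enumerate(y_bands):
                if i % 2 == 0 and j % 2 == 0:
                    rooms.append(((i, j), (w, h)))
        return rooms

    xs, ys = tartan(5), tartan(5)         # 3 wide x 3 wide bands: 9 rooms
    for (i, j), (w, h) in room_cells(xs, ys):
        print(f"room at band ({i},{j}): {w} x {h} ft")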

_id 870b
authors Sivaloganathan, Sangarappillai
year 1991
title Sketching input for computer aided engineering
source City University, Department of Mechanical Engineering and Aeronautics
summary The design process often begins with a graphical description of the proposed device or system, and sketching is the physical expression of the design engineer's thinking process. Computer Aided Design is a technique in which man and machine are blended into a problem-solving team, intimately coupling the best characteristics of each. Solid modelling was developed to act as the common medium between man and the computer. At present it is achieved mainly by designing with volumes and hence does not leave much room for sketching input, the traditional physical expression of the thinking process of the design engineer. This thesis describes a method of accepting isometric freehand sketching as the input to a solid model. The design engineer is allowed to make a sketch on top of a digitizer indicating (i) visible lines, (ii) hidden lines, (iii) construction lines, (iv) centre lines, (v) erased lines and (vi) redundant lines as the input. The computer then processes this sketch by identifying the line segments, fitting the best possible lines, removing the erased lines, ignoring the redundant lines and finally merging the hidden lines and visible lines to form the lines in the solid, in an interactive manner. The program then uses these lines and the information about the three-dimensional origin of the object to produce three-dimensional information such as the faces, loops, holes, rings, edges and vertices which are sufficient to build a solid model. This is achieved in the following manner. The points in the sketch are first written into a file. The computer then reads this file, breaks the group of points into sub-groups belonging to individual line segments, fits the best lines and identifies the vertices in two dimensions. These improved lines in two dimensions are then merged to form the lines and vertices of the solid. These lines are then used together with the three-dimensional origin (or any other point) to produce the wireframe model in three dimensions. The loops in the wireframe model are then identified and surface equations are fitted to these loops. Finally all the necessary inputs to build a B-rep solid model are produced.
series thesis:PhD
last changed 2003/02/12 22:37
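
One step of the pipeline above, fitting the best possible line to the digitized points of a stroke, can be shown directly as an ordinary least-squares fit. This is a sketch under assumed data, not the thesis's method; a vertical stroke would need the orthogonal-distance form, which this toy ignores.

    def fit_line(points):
        """Least-squares y = m*x + c through a stroke's digitized points."""
        n = len(points)
        sx  = sum(x for x, _ in points)
        sy  = sum(y for _, y in points)
        sxx = sum(x * x for x, _ in points)
        sxy = sum(x * y for x, y in points)
        m = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # fails for vertical strokes
        c = (sy - m * sx) / n
        return m, c

    stroke = [(0, 0.1), (1, 0.9), (2, 2.1), (3, 2.9)]   # a slightly shaky stroke
    m, c = fit_line(stroke)
    print(f"best line: y = {m:.2f}x + {c:.2f}")          # roughly y = x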

_id f9bd
authors Amor, R.W.
year 1991
title ICAtect: Integrating Design Tools for Preliminary Architectural Design
source Wellington, New Zealand: Computer Science Department, Victoria University
summary ICAtect is a knowledge based system that provides an interface between expert systems, simulation packages and CAD systems used for preliminary architectural design. This thesis describes its structure and development. The principal work discussed in this thesis involves the formulation of a method for representing a building. This is developed through an examination of a number of design tools used in architectural design, and the ways in which each of these describes a building. Methods of enabling data to be transferred between design tools are explored. A Common Building Model (CBM), forming the core of the ICAtect system, is developed to represent the design tools' knowledge of a building. This model covers the range of knowledge required by a large set of disparate design tools used by architects at the initial design stage. Standard methods of integrating information from the tools were examined, but required augmentation to encompass the unusual constraints found in some of the design tools. The integration of the design tools and the CBM is discussed in detail, with example methods developed for each type of design tool. These example methods provide a successful way of moving information between the different representations. Some problems with mapping data between very different representations were encountered in this process, and the solutions or ideas for remedies are detailed. A model for control and use of ICAtect is developed in the thesis, and the extensions to enable a graphical user interface are discussed. The methods developed in this thesis demonstrate the feasibility of an integrated system of this nature, while the discussion of future work indicates the scope and potential power of ICAtect.
series other
last changed 2003/04/23 15:14
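
The common-model idea above, each tool mapping into and out of one shared representation instead of translating pairwise, can be suggested with two toy mappings. The field names, unit conventions and both mapping functions are invented for the example and do not reflect the actual CBM schema.

    # A shared building model: one place both tools read from and write to.
    CBM = {"wall_area_m2": 42.0, "glazing_ratio": 0.3}

    def from_cad(entities, cbm):
        """CAD delivers wall polygons; this mapping aggregates them into the CBM."""
        cbm["wall_area_m2"] = sum(e["area"] for e in entities
                                  if e["type"] == "wall")
        return cbm

    def to_thermal_tool(cbm):
        """A thermal simulator wants glazed and opaque areas separately."""
        glazed = cbm["wall_area_m2"] * cbm["glazing_ratio"]
        return {"opaque_area": cbm["wall_area_m2"] - glazed,
                "glazed_area": glazed}

    from_cad([{"type": "wall", "area": 20.0},
              {"type": "wall", "area": 25.0}], CBM)
    print(to_thermal_tool(CBM))   # {'opaque_area': 31.5, 'glazed_area': 13.5}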

_id 792a
authors Blaschke, Thomas and Tiede, Dirk
year 2003
title Bridging GIS-based landscape analysis/modelling and 3D-simulation.Is this already 4D?
source CORP 2003, Vienna University of Technology, 25.2.-28.2.2003 [Proceedings on CD-Rom]
summary Several studies have used remote sensing to map patterns of e.g. deforestation or to analyse the rates of land use change. These studies have proven useful for interpreting the causes of urbanization, deforestation etc. and the impact of such changes on the region. Monitoring of change (e.g. deforestation or reforestation) is frequently perceived as one of the most important contributions of remote sensing technology to the study of global ecological and environmental change (Roughgarden et al. 1991). Many researchers believe that the integration of remote sensing techniques within analysis of environmental change is essential if ecologists are to meet the challenges of the future, specifically issues relating to global change; however, in practice, this integration has so far been limited (Griffiths & Mather 2000). Considerable difficulties are encountered in linking, on the one hand, the biologies of organisms and the ecologies of populations to the fluxes of material and energy quantifiable at the level of ecosystems. In this paper, we concentrate on the methodological aspects of the delineation of landscape objects and touch on the ecological application only superficially, but we briefly elucidate the potential of the proposed methodology for several ecological applications.
series other
last changed 2003/11/21 15:16

_id 227a
authors Bourdeau, L., Dubois, A.-M. and Poyet, P.
year 1991
title A Common Data Model for Computer Integrated Building
source Computer Integrated Future, CIB W78 Seminar. September, 1991. Unnumbered : some ill. includes bibliography
summary The connection of various building performance evaluation tools in a collaborative way is an essential requirement for developing true CAD systems. It is a basic requirement for the future of integrated information systems for building projects, where data concerning multiple aspects of the project can be exchanged during the different design steps. This paper deals with ongoing research concerning the generation of a common data model in the framework of a European collaborative action, the COMBINE Project, which is supported by the CEC, Directorate General XII for Research, Science and Development, within the JOULE programme. The first step of the research concerns the progressive construction of a conceptual model, and the paper focuses on the development of this Integrated Data Model (IDM). The paper reports on the definition of the architecture of the IDM. The main issues and the methodology of the IDM development are presented. The IDM development methodology is based on successive steps dealing with the identification of the data and context which are considered by the Design Tool Prototypes (DTP) to be connected through the IDM, the conceptual integration of this knowledge, and the implementation of the model in an appropriate software environment
keywords standards, integration, communication, building, evaluation, modeling
series CADline
last changed 2003/06/02 14:41
