CumInCAD is a Cumulative Index of publications in Computer Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD Futures.


Hits 1 to 20 of 99

_id 812d
authors Peng, Q. S.
year 1984
title An Algorithm for Finding the Intersection Lines Between Two B-Spline Surfaces
source Computer Aided Design July, 1984. vol. 16: pp. 191-196 : ill. includes bibliography.
summary A divide-and-conquer algorithm is presented for finding all the intersection lines between two B-spline surfaces. Each surface is organized as an n-branch tree. For each intersection line, an initial point is detected after a depth-first search along one tree, i.e. the host tree. Extrapolation methods are then used to trace the entire length of the line, so the line appears naturally in continuous form. Efficiency is achieved by employing an adaptive division strategy and by careful choice of the representation basis of the patches on both surfaces.
keywords logic, algorithms, B-splines, techniques, divide-and-conquer, intersection, curves, curved surfaces, representation
series CADline
last changed 2003/06/02 10:24

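The divide-and-conquer scheme Peng describes can be sketched in miniature: subdivide one surface's parameter domain depth-first, prune subpatches whose bounding boxes cannot meet the other surface, and keep the survivors as candidate intersection points. This is an illustrative sketch with made-up surfaces, not the paper's B-spline tree implementation.

```python
def bbox(f, u0, u1, v0, v1, n=4):
    """Axis-aligned bounding box of a patch, estimated by sampling."""
    pts = [f(u0 + (u1 - u0) * i / n, v0 + (v1 - v0) * j / n)
           for i in range(n + 1) for j in range(n + 1)]
    lo = tuple(min(p[k] for p in pts) for k in range(3))
    hi = tuple(max(p[k] for p in pts) for k in range(3))
    return lo, hi

def overlap(a, b, pad=0.05):
    """Do two (lo, hi) boxes overlap, with a safety margin?"""
    return all(a[0][k] - pad <= b[1][k] and b[0][k] - pad <= a[1][k]
               for k in range(3))

def intersect(f, g, fu, fv, gu, gv, depth=0, out=None):
    """Depth-first subdivision of f, pruning against g's bounding box."""
    if out is None:
        out = []
    if not overlap(bbox(f, *fu, *fv), bbox(g, *gu, *gv)):
        return out                        # prune: patches cannot meet
    if depth >= 5:                        # small enough: record a point
        out.append(((fu[0] + fu[1]) / 2, (fv[0] + fv[1]) / 2))
        return out
    mu, mv = sum(fu) / 2, sum(fv) / 2
    for su in ((fu[0], mu), (mu, fu[1])):
        for sv in ((fv[0], mv), (mv, fv[1])):
            intersect(f, g, su, sv, gu, gv, depth + 1, out)
    return out

# Two toy surfaces: a paraboloid and the horizontal plane z = 0.5.
f = lambda u, v: (u, v, u * u + v * v)
g = lambda u, v: (2 * u - 1, 2 * v - 1, 0.5)
pts = intersect(f, g, (-1, 1), (-1, 1), (0, 1), (0, 1))
```

The surviving cell centres cluster along the intersection circle u² + v² = 0.5; the paper's extrapolation step would then chain such points into continuous curves.
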
_id d5c8
authors Angelo, C.V., Bueno, A.P., Ludvig, C., Reis, A.F. and Trezub, D.
year 1999
title Image and Shape: Two Distinct Approaches
source III Congreso Iberoamericano de Grafico Digital [SIGRADI Conference Proceedings] Montevideo (Uruguay) September 29th - October 1st 1999, pp. 410-415
summary This paper is the result of two research projects carried out in the district of Campeche, Florianópolis, by the Grupo PET/ARQ/UFSC/CAPES. Different aspects and conceptual approaches were used to study the spatial attributes of this district, located in the southern part of Santa Catarina Island. The readings and analyses of the two projects were based on graphic pictures built with Corel 7.0 and AutoCAD R14. The first project - "Urban Development in the Island of Santa Catarina: Public Space Study" - examined the urban structures of Campeche based on the Space Syntax Theory developed by Hillier and Hanson (1984), which relates form and the social appropriation of public spaces. The second - "Topoceptive Characterisation of Campeche: The Image of a Locality in Expansion in the Island of Santa Catarina" - based on the methodology developed by Kohlsdorf (1996) and on the visual analysis proposed by Lynch (1960), identified characteristics of the locality with the specific goal of selecting attributes that contributed to the image its population held of the place. The paper consists of an initial exercise of linking the two methods in order to test the complementarity of their analytical tools. Exemplifying the analytical procedures of the two approaches, the readings done, both global (of the locality as a whole) and partial (of parts of the settlement), are presented and compared.
series SIGRADI
email
last changed 2016/03/10 09:47

_id ecaadesigradi2019_449
authors Becerra Santacruz, Axel
year 2019
title The Architecture of ScarCity Game - The craft and the digital as an alternative design process
doi https://doi.org/10.52842/conf.ecaade.2019.3.045
source Sousa, JP, Xavier, JP and Castro Henriques, G (eds.), Architecture in the Age of the 4th Industrial Revolution - Proceedings of the 37th eCAADe and 23rd SIGraDi Conference - Volume 3, University of Porto, Porto, Portugal, 11-13 September 2019, pp. 45-52
summary The Architecture of ScarCity Game is a board game used as a pedagogical tool that challenges architecture students by involving them in a series of experimental design sessions to understand the design process under scarcity and the actual relation between the craft and the digital. This means "pragmatic delivery processes and material constraints, where the exchange between the artisan of handmade, representing local skills and technology of the digitally conceived is explored" (Huang 2013). The game focuses on understanding the different variables of the crafted design process of traditional communities under conditions of scarcity (Michel and Bevan 1992). This requires first analyzing the spatial environmental model of interaction, the available human and natural resources, and the dynamic relationship of these variables in a digital era. In the first stage (Pre-Agency), the game sets the concept of the craft by limiting students' design exploration to locally available resources and techniques. The key elements of the design process of traditional knowledge communities have to be identified (Preez 1984). In other words, this stage is driven by limited resources + chance + contingency. In the second stage (Post-Agency), students, taking the architect's role within these communities, have to speculate on and explore the interface between the craft (local knowledge and low technological tools) and the digital, represented by computational data, newly available technologies and construction. This means the introduction of strategy + opportunity + chance as part of the design process. In this sense, the game has a life beyond its mechanics. This other life challenges the participants to exploit the possibilities of breaking the actual boundaries of design. The result is a tool that challenges conventional methods of teaching and learning by controlling a prescribed design process.
It confronts the rules that professionals in this field take for granted. The game simulates a 'fake' reality by exploring surveyed information in different ways. As a result, participants do not have anything 'real' to lose. Instead, they have all the freedom to innovate and be creative.
keywords Global south, scarcity, low tech, digital-craft, design process and innovation by challenge.
series eCAADeSIGraDi
email
last changed 2022/06/07 07:54

_id 6118
authors Bentley, Jon L.
year 1984
title Code Tuning -- Programming Pearls
source Communications of the ACM. February, 1984. vol. 27: pp. 91-96
summary Efficiency is one of many problems in programming, and there are many ways to achieve it. This column is about a low-level approach: 'code tuning' locates the expensive parts of an existing program and then modifies that code to improve its performance.
keywords programming, search, algorithms, techniques
series CADline
last changed 2003/06/02 13:58

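Bentley's transformations are mechanical once the hot spot is known; one of the simplest is hoisting a loop-invariant expression out of a loop. A hypothetical Python illustration (the column itself is not tied to any one language, and these function names are made up):

```python
import math

def scaled_sum_slow(xs):
    # the invariant math.sqrt(len(xs)) is recomputed on every iteration
    return sum(x * x / math.sqrt(len(xs)) for x in xs)

def scaled_sum_fast(xs):
    inv = 1.0 / math.sqrt(len(xs))     # hoisted out of the loop
    return sum(x * x for x in xs) * inv

data = [0.5 * k for k in range(1, 1000)]
```

Both functions return the same value, but the fast one calls `math.sqrt` once instead of once per element. Profiling should come first: tuning code that is not a hot spot buys nothing, a point the column stresses.
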
_id a6f1
authors Bridges, A.H.
year 1986
title Any Progress in Systematic Design?
source Computer-Aided Architectural Design Futures [CAAD Futures Conference Proceedings / ISBN 0-408-05300-3] Delft (The Netherlands), 18-19 September 1985, pp. 5-15
summary In order to discuss this question it is necessary to reflect awhile on design methods in general. The usual categorization discusses 'generations' of design methods, but Levy (1981) proposes an alternative approach. He identifies five paradigm shifts during the course of the twentieth century which have influenced design methods debate. The first paradigm shift was achieved by 1920, when concern with industrial arts could be seen to have replaced concern with craftsmanship. The second shift, occurring in the early 1930s, resulted in the conception of a design profession. The third happened in the 1950s, when the design methods debate emerged; the fourth took place around 1970 and saw the establishment of 'design research'. Now, in the 1980s, we are going through the fifth paradigm shift, associated with the adoption of a holistic approach to design theory and with the emergence of the concept of design ideology. A major point in Levy's paper was the observation that most of these paradigm shifts were associated with radical social reforms or political upheavals. For instance, we may associate concern about public participation with the 1970s shift and the possible use (or misuse) of knowledge, information and power with the 1980s shift. What has emerged, however, from the work of colleagues engaged since the 1970s in attempting to underpin the practice of design with a coherent body of design theory is increasing evidence of the fundamental nature of a person's engagement with the design activity. This includes evidence of the existence of two distinctive modes of thought, one of which can be described as cognitive modelling and the other which can be described as rational thinking. Cognitive modelling is imagining, seeing in the mind's eye. Rational thinking is linguistic thinking, engaging in a form of internal debate. Cognitive modelling is externalized through action, and through the construction of external representations, especially drawings. 
Rational thinking is externalized through verbal language and, more formally, through mathematical and scientific notations. Cognitive modelling is analogic, presentational, holistic, integrative and based upon pattern recognition and pattern manipulation. Rational thinking is digital, sequential, analytical, explicatory and based upon categorization and logical inference. There is some relationship between the evidence for two distinctive modes of thought and the evidence of specialization in the cerebral hemispheres (Cross, 1984). Design methods have tended to focus upon the rational aspects of design and have therefore neglected the cognitive aspects. By recognizing that there are peculiarly 'designerly' ways of thinking, combining both types of thought process to perceive, construct and comprehend design representations mentally and then transform them into an external manifestation, current work in design theory promises at last to have some relevance to design practice.
series CAAD Futures
email
last changed 2003/11/21 15:16

_id ea4c
authors Chang, Hsi and Iyengar, S. Sitharama
year 1984
title Efficient Algorithms to Globally Balance a Binary Search Tree
source Communications of the ACM. July, 1984. vol. 27: pp. 695-702. includes bibliography
summary A binary search tree can be globally balanced by readjustment of pointers or with a sorting process in O(n) time, n being the total number of nodes. This paper presents three global balancing algorithms, one of which uses folding with the other two adopting parallel procedures. These algorithms show improvement in time efficiency over some sequential algorithms when applied to large binary search trees. A comparison of various algorithms is presented
keywords techniques, parallel processing, algorithms, search, sorting
series CADline
last changed 2003/06/02 13:58

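The O(n) global rebalance that Chang and Iyengar's algorithms build on can be sketched directly: flatten the tree to a sorted key list by in-order traversal, then rebuild by repeatedly taking the middle key (the "folding" idea). Illustrative sequential Python, not the paper's parallel procedures:

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def inorder(t, out):
    """Append the keys of tree t to out in sorted order."""
    if t:
        inorder(t.left, out)
        out.append(t.key)
        inorder(t.right, out)
    return out

def build(keys):
    """Rebuild a minimum-height BST by folding about the middle key."""
    if not keys:
        return None
    mid = len(keys) // 2
    return Node(keys[mid], build(keys[:mid]), build(keys[mid + 1:]))

def rebalance(t):
    return build(inorder(t, []))

def height(t):
    return 0 if t is None else 1 + max(height(t.left), height(t.right))

# A degenerate (linked-list) tree of 15 keys: height 15 before,
# height 4 after global rebalancing.
skewed = None
for k in range(15, 0, -1):
    skewed = Node(k, None, skewed)
```

The slicing in `build` is O(n log n) as written; passing index bounds instead recovers the O(n) total the summary refers to.
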
_id architectural_intelligence2023_10
authors Cheng Bi Duan, Su Yi Shen, Ding Wen Bao & Xin Yan
year 2023
title Innovative design solutions for contemporary Tou-Kung based on topological optimisation
doi https://doi.org/10.1007/s44223-023-00028-x
source Architectural Intelligence Journal
summary Tou-Kung, pronounced in Chinese and known as Bracket Set (Liang & Fairbank, A pictorial history of Chinese architecture, 1984), is a vital support component in traditional Chinese wooden tectonic systems. It is located between the column and the beam and connects the eave and pillar, allowing the heavy roof to extend further beyond the eaves. The development of Tou-Kung is a microcosm of the development of ancient Chinese architecture; the aesthetic structure and Asian artistic temperament behind Tou-Kung have gradually made it a cultural and spiritual symbol of traditional Chinese architecture. In the contemporary era, inheriting and developing Tou-Kung has become an essential issue. Several architects have attempted to employ new materials and techniques to integrate the traditional Tou-Kung into modern architectural systems, such as the China Pavilion at the 2010 World Expo and the Yusuhara Wooden Bridge Museum. This paper introduces the topological optimisation method bi-directional evolutionary structural optimisation (BESO) for form-finding. The BESO method is one of the most popular topology optimisation methods, widely employed in civil engineering and architecture. Through analysing the development trend of Tou-Kung and its mechanical structure, the authors integrate 2D and 3D optimisation methods and apply the hybrid approach to form-finding. Meanwhile, the mortise and tenon joints used to create stable connections between components of Tou-Kung are retained. This research aims to design a new Tou-Kung corresponding to “structural performance-based aesthetics”. The workflow proposed in this paper is valuable for the Architrave and other traditional building components.
series Architectural Intelligence
email
last changed 2025/01/09 15:00

_id b8b9
authors Gibson, W.
year 1984
title Neuromancer
source Victor Gollancz
summary Here is the novel that started it all, launching the cyberpunk generation, and the first novel to win the holy trinity of science fiction: the Hugo Award, the Nebula Award and the Philip K. Dick Award. With Neuromancer, William Gibson introduced the world to cyberspace--and science fiction has never been the same. Case was the hottest computer cowboy cruising the information superhighway--jacking his consciousness into cyberspace, soaring through tactile lattices of data and logic, rustling encoded secrets for anyone with the money to buy his skills. Then he double-crossed the wrong people, who caught up with him in a big way--and burned the talent out of his brain, micron by micron. Banished from cyberspace, trapped in the meat of his physical body, Case courted death in the high-tech underworld. Until a shadowy conspiracy offered him a second chance--and a cure--for a price....
series other
last changed 2003/04/23 15:14

_id 61be
authors Goldberg, A.J.
year 1984
title Smalltalk-80: The Interactive Programming Environment
source Reading, MA: Addison-Wesley
summary This book describes the process by which Smalltalk was introduced to people outside Xerox PARC, where it was developed. This book first describes the incredibly exciting history of how Smalltalk was built from scratch. It then goes on to show the way in which Smalltalk was made public. At first, this was an engineering process. Large companies were contacted and offered to participate by porting the Smalltalk VM to their machines, and then running an image provided on tape. Each of these teams then wrote a paper on their experience, and these original papers are included in this book. Xerox PARC also wrote its own paper. These papers are an invaluable source of information for any Smalltalker. They range from overall design issues down to statistics on the work of the VM and image contents.
series other
last changed 2003/04/23 15:14

_id 653f
authors Hedelman, Harold
year 1984
title A Data Flow Approach to Procedural Modeling
source IEEE Computer Graphics and Applications January, 1984. vol. 4: pp. 16-26 : ill. (some col.). includes bibliography.
summary Computer graphics tasks generally involve either modeling or viewing. Modeling combines primitive building blocks (polygons, patches, etc.) into data structures that represent entire objects and scenes. To visualize a modeled object, its data structure is input to appropriate viewing routines. While a great deal has been done on modeling and viewing with geometric primitives, little has been published on the use of procedural primitives. A procedural model is a step-by-step guide for constructing a representation of an object or process, i.e., a program. It is also a function, a 'black box' with a set of inputs and outputs. Two questions are especially pertinent to the work presented in this article: first, what are the advantages of both data flow methods and procedural modeling? Second, how can such models be used in composition?
keywords computer graphics, modeling, information, management
series CADline
last changed 1999/02/12 15:08

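Hedelman's procedural primitive, a "black box" with inputs and outputs, maps naturally onto a small data-flow graph in which nodes pull values from their inputs on demand. A hypothetical minimal sketch, not the article's system:

```python
import math

class Node:
    """A procedural primitive: a black box computing fn(*inputs)."""
    def __init__(self, fn, *inputs):
        self.fn, self.inputs = fn, inputs
    def value(self):
        # demand-driven evaluation: pull each input, then apply fn
        return self.fn(*(n.value() for n in self.inputs))

def const(v):
    return Node(lambda: v)

def polygon(n, r):
    """Generate the n vertices of a regular polygon of radius r."""
    return [(r * math.cos(2 * math.pi * k / n),
             r * math.sin(2 * math.pi * k / n)) for k in range(n)]

# A modeling graph: side count and radius flow into a vertex generator.
square = Node(polygon, const(4), const(1.0))
verts = square.value()
```

Because every node is itself a function, graphs compose: the output of `square` could feed an extrusion or transformation node without either side knowing the other's internals.
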
_id 40d6
authors Johnson, Robert E.
year 1984
title The Integration of Economic Analysis and Computer-based Building Models
source CIB W-65 Symposium. July, 1984. [19] p. : ill. includes bibliography
summary Most current methods used to evaluate the economics of building designs are inadequate in that they focus on the evaluation of completed designs and do not assist in the development and creation of designs. They are used after most major design decisions have been made. This paper describes the first year of a two-year research project (funded by the U.S. National Science Foundation) which seeks to integrate economic analysis techniques into design decision-making within the context of an interactive computer-aided architectural and engineering design system. Issues reviewed include the current state of computer software, existing economic analysis models and existing economic analysis software. A conclusion is reached that most economic analysis systems fall into the category of single-purpose software and are not adaptable to the wide range of idiosyncratic evaluation models used in real estate, architecture, engineering, construction and building management. Objectives are proposed for a general-purpose, interactive cost modeling system that is integrated with a geometric computer-based building model. Initial experiments with a prototype of this system at various stages of the design-construction-use process are discussed. Further development of this system as a research tool for exploring alternative economic modeling procedures is presented.
keywords analysis, evaluation, CAD, architecture, design, methods, economics, integration
series CADline
last changed 2003/06/02 13:58

_id ebcc
authors Kolb, David A.
year 1984
title Experiential Learning
source Prentice Hall
summary In his book Experiential Learning: Experience as the Source of Learning and Development (1984), David Kolb introduces his experiential learning theory and provides a model for its application in schools, organizations, and virtually anywhere people are gathered together. Kolb's comprehensive and practical theory builds on the rich foundations of experience-based learning provided by John Dewey, Kurt Lewin, and Jean Piaget. We first consider the roots of his theory following which we offer a summary of it in practice.
series other
last changed 2003/04/23 15:14

_id 4b27
authors Lansdown, John
year 1984
title Knowledge for Designers
source Architect's Journal. England: February, 1984. vol. 179: pp. 55-58
summary The first of two articles discussing expert systems. Both design and construction are carried out within a framework of empirical rules and regulations designed more for ease of implementation and checking than for scientific validity. On completion of a building, little follow-up research is done on the way it is used or on how the assumptions made in its design are borne out in practice. This presents two problems: how to make information from disparate sources easily available to designers and constructors, and how to make them aware that they need this information. This paper describes how a special type of computer programming might assist in solving these problems.
keywords design, construction, building, expert systems, knowledge base, systems, programming, life cycle
series CADline
last changed 1999/02/12 15:09

_id 4af9
authors Levy, Henry
year 1984
title VAXstation : A General-Purpose Raster Graphics Architecture
source ACM Transactions on Graphics. January, 1984. vol. 3: pp. 70-83 : ill. includes bibliography
summary A raster graphics architecture and a raster graphics device are described. The graphics architecture is an extension of the RasterOp model and supports operations for rectangle movement, text writing, curve drawing, flood, and fill. The architecture is intended for implementation by both closely and loosely coupled display subsystems. The first implementation of the architecture is a remote raster display connected by fiber optics to a VAX minicomputer. The device contains a separate microprocessor, frame buffer, and additional local memory; it is capable of executing raster commands on operands in local memory or VAX host memory.
keywords hardware, computer graphics, technology
series CADline
last changed 1999/02/12 15:09

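The RasterOp model that the VAXstation architecture extends combines a source rectangle with a destination rectangle through a pixel-wise boolean operation. A minimal sketch over nested lists used as a 1-bit frame buffer (illustrative only, not the device's instruction set):

```python
def raster_op(dst, src, dx, dy, sx, sy, w, h, op):
    """Combine a w x h rectangle of src into dst via op(dest, src)."""
    for j in range(h):
        for i in range(w):
            dst[dy + j][dx + i] = op(dst[dy + j][dx + i],
                                     src[sy + j][sx + i]) & 1

# XOR-blit a 2x2 block of ones into the middle of a zeroed 4x4 bitmap.
dst = [[0] * 4 for _ in range(4)]
src = [[1] * 2 for _ in range(2)]
raster_op(dst, src, 1, 1, 0, 0, 2, 2, lambda d, s: d ^ s)
```

Varying `op` gives copy, AND, OR and XOR variants; text writing and rectangle movement in the architecture reduce to repeated applications of this single primitive.
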
_id ac8b
authors Mitchell, W.
year 1984
title CAD Technology, Its Effects on Practice and the Response of Education - an Overview
doi https://doi.org/10.52842/conf.ecaade.1984.x.b3c
source The Third European Conference on CAD in the Education of Architecture [eCAADe Conference Proceedings] Helsinki (Finland) 20-22 September 1984.
summary Related to the evolution of hardware there is also an evolution of CAD techniques. The very first CAD/CAM packages were developed on mainframes. They moved into practice when 16-bit minicomputers became available; these packages were mainly production drafting applications. The 32-bit super minicomputers give wider possibilities, but at the same time some software problems arise, namely the complexity of CAD databases and the development and maintenance cost of large programs. With VLSI the distribution of intelligence becomes possible and enthusiasm for CAD increases, but the gap between available hardware and high-quality software remains wide. Concerning CAD teaching there are severe problems. First, there are not enough really good designers who know CAD well enough to teach it. Second, there is a shortage of equipment and a financial problem. Third, there is the question of what students need to know about CAD, which is not clear at the moment. At the University of California, Los Angeles, the following five subjects are taught: Computer Support, Computer Literacy, Professional Practice Implications, Exploration of CAD as a Design Medium, and Theoretical Foundations of CAD. To use computers as a medium it is necessary to understand architecture: its objects, its operators and its evaluation criteria. The last topic is considered at research level.
series eCAADe
email
more www.ecaade.org
last changed 2022/06/07 07:50

_id 452c
authors Vanier, D. J. and Worling, Jamie
year 1986
title Three-dimensional Visualization: A Case Study
source Computer-Aided Architectural Design Futures [CAAD Futures Conference Proceedings / ISBN 0-408-05300-3] Delft (The Netherlands), 18-19 September 1985, pp. 92-102
summary Three-dimensional computer visualization has intrigued both building designers and computer scientists for decades. Research and conference papers present an extensive list of existing and potential uses of three-dimensional geometric data for the building industry (Baer et al., 1979). Early studies on visualization include urban planning (Rogers, 1980), tree-shading simulation (Schiler and Greenberg, 1980), sun studies (Anon, 1984), finite element analysis (Proulx, 1983), and facade texture rendering (Nizzolese, 1980). With the advent of better interfaces, faster computer processing speeds and better application packages, there has been interest on the part of both researchers and practitioners in three-dimensional models for energy analysis (Pittman and Greenberg, 1980), modelling with transparencies (Hebert, 1982), super-realistic rendering (Greenberg, 1984), visual impact (Bridges, 1983), interference clash checking (Trickett, 1980), and complex object visualization (Haward, 1984). The Division of Building Research is currently investigating the application of geometric modelling in the building delivery process using sophisticated software (Evans, 1985). The first stage of the project (Vanier, 1985), a feasibility study, deals with the aesthetics of the model. It identifies two significant requirements for geometric modelling systems: the need for a comprehensive data structure and the requirement for realistic accuracies and tolerances. This chapter presents the results of the second phase of this geometric modelling project, which is the construction of 'working' and 'presentation' models for a building.
series CAAD Futures
email
last changed 2003/05/16 20:58

_id 09e8
authors Wallace, Mark
year 1984
title Communicating with Databases in Natural Languages
source 170 p. West, Sussex, England: Ellis Horwood limited, 1984. includes bibliography: p.[163]-166 and index. -- (Ellis Horwood Series on Artificial Intelligence)
summary The first chapters give a full description of natural languages and of interfaces to relational databases. Natural language processing and the use of PROLOG are discussed. Also included is a practical discussion of parsing natural language, with accompanying programs in PROLOG.
keywords natural languages, PROLOG, relational database, user interface
series CADline
last changed 1999/02/12 15:10

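PROLOG natural-language programs of the kind the book presents are typically written in definite-clause-grammar style; the same idea can be sketched in Python, with each grammar rule a function that consumes tokens and returns the remainder (or None on failure). Toy grammar, purely illustrative:

```python
def word(w):
    """Rule matching one literal token."""
    def rule(toks):
        return toks[1:] if toks and toks[0] == w else None
    return rule

def seq(*rules):
    """Rules in sequence, like a PROLOG clause body."""
    def rule(toks):
        for r in rules:
            toks = r(toks)
            if toks is None:
                return None
        return toks
    return rule

def alt(*rules):
    """Alternative clauses; first that succeeds wins."""
    def rule(toks):
        for r in rules:
            rest = r(toks)
            if rest is not None:
                return rest
        return None
    return rule

det = word("the")
noun = alt(word("database"), word("query"))
verb = alt(word("lists"), word("matches"))
np = seq(det, noun)
sentence = seq(np, verb, np)

# A parse succeeds when the whole token list is consumed.
ok = sentence("the database matches the query".split()) == []
```

PROLOG gets the backtracking and the parse-tree construction for free through unification; this sketch only shows the grammar-as-composable-rules idea.
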
_id af76
authors Wong, Waycal C.H. and Will, Barry F.
year 1996
title An Analysis of Using a Digital 3D Sundial as a Design and Decision Support Tool
doi https://doi.org/10.52842/conf.caadria.1996.131
source CAADRIA ‘96 [Proceedings of The First Conference on Computer Aided Architectural Design Research in Asia / ISBN 9627-75-703-9] Hong Kong (Hong Kong) 25-27 April 1996, pp. 131-141
summary The rapid speed of computer development brings new technologies, and these advances require innovative investigations to apply them optimally in the field of architecture. Burkett (1984) demonstrated that computer graphics can 'provide an excellent opportunity for exploring solar issues in building redesign'. With one of the latest computer technologies, the “hyper-model” environment, this research investigates how the environment can become an aid in the design and decision support area. The research first reviews the communication between the architect and the client as described by Salisbury (1990). The review indicates that an interactive 3D hypermedia paradigm, with quick response, fast data manipulation and 3D visualization, offers a better communication medium between the architect and the client. This research applies the “hyper-model” environment to design and develop a new methodology for collecting, analyzing, and presenting solar data. It also endeavors to show the possibilities of using the environment in the design process.
series CAADRIA
last changed 2022/06/07 07:57

_id f9f4
authors Cook, R.L., Porter, Th. and Carpenter, L.
year 1984
title Distributed Ray Tracing
source Computer Graphics, vol. 18, no. 3, pp. 137-145, July 1984. SIGGRAPH '84 Proceedings
summary Ray tracing is one of the most elegant techniques in computer graphics. Many phenomena that are difficult or impossible with other techniques are simple with ray tracing, including shadows, reflections, and refracted light. Ray directions, however, have been determined precisely, and this has limited the capabilities of ray tracing. By distributing the directions of the rays according to the analytic function they sample, ray tracing can incorporate fuzzy phenomena. This provides correct and easy solutions to some previously unsolved or partially solved problems, including motion blur, depth of field, penumbras, translucency, and fuzzy reflections. Motion blur and depth of field calculations can be integrated with the visible surface calculations, avoiding the problems found in previous methods.
series journal paper
last changed 2003/04/23 15:14

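The paper's central move, distributing ray directions according to the function they sample, can be sketched with soft shadows: jittered shadow rays spread over an area light produce a penumbra where a single ray would give a hard edge. Toy scene with made-up values, not the paper's renderer:

```python
import random

def occluded(p, l, c, r):
    """Does the segment from p to l hit the sphere (centre c, radius r)?"""
    d = [l[i] - p[i] for i in range(3)]
    oc = [p[i] - c[i] for i in range(3)]
    a = sum(x * x for x in d)
    b = 2 * sum(oc[i] * d[i] for i in range(3))
    cc = sum(x * x for x in oc) - r * r
    disc = b * b - 4 * a * cc
    if disc < 0:
        return False
    t = (-b - disc ** 0.5) / (2 * a)
    return 1e-6 < t < 1.0          # hit must lie between p and the light

def visibility(p, n=16):
    """Fraction of stratified, jittered shadow rays reaching the light."""
    random.seed(0)
    hits = 0
    for i in range(n):
        for j in range(n):
            # jittered sample on a 4x4 square light at z = 10
            lx = -2 + 4 * (i + random.random()) / n
            ly = -2 + 4 * (j + random.random()) / n
            if not occluded(p, (lx, ly, 10.0), (0.9, 0.0, 5.0), 1.0):
                hits += 1
    return hits / (n * n)

v = visibility((0.0, 0.0, 0.0))    # point in the penumbra of the sphere
```

In the penumbra the visibility lands strictly between 0 and 1, which a single precisely aimed shadow ray can never express; the same distribution trick yields motion blur (jitter in time) and depth of field (jitter over the lens).
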
_id 0589
authors Weghorst, H., Hooper, G., and Greenberg, D.
year 1984
title Improved Computational Methods for Ray Tracing
source ACM Trans. on Graphics, vol. 3, no. 1, pp. 52-69, Jan. 1984
summary This paper describes algorithmic procedures that have been implemented to reduce the computational expense of producing ray-traced images. The selection of bounding volumes is examined to reduce the computational cost of the ray-intersection test. The use of object coherence, which relies on a hierarchical description of the environment, is then presented. Finally, since the building of the ray-intersection trees is such a large portion of the computation, a method using image coherence is described. This visible-surface preprocessing method, which is dependent upon the creation of an "item buffer," takes advantage of a priori image information. Examples that indicate the efficiency of these techniques for a variety of representative environments are presented.
series other
last changed 2003/04/23 15:50

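The bounding-volume idea the paper examines can be sketched with a single bounding sphere around a cluster of small spheres: one cheap test rejects the whole cluster for most rays. Toy geometry with made-up numbers, not the paper's hierarchy or item buffer:

```python
import random

tests = 0  # counts every sphere-intersection test performed

def hit_sphere(o, d, c, r):
    """Does the line through o with unit direction d meet sphere (c, r)?"""
    global tests
    tests += 1
    oc = [o[i] - c[i] for i in range(3)]
    b = sum(oc[i] * d[i] for i in range(3))
    return b * b >= sum(x * x for x in oc) - r * r

# 100 small spheres clustered near (0, 0, 10), all inside one
# generous bounding sphere.
random.seed(1)
cluster = [((random.uniform(-1, 1), random.uniform(-1, 1),
             10 + random.uniform(-1, 1)), 0.05) for _ in range(100)]
bound = ((0.0, 0.0, 10.0), 3.0)

def trace(o, d, use_bound):
    if use_bound and not hit_sphere(o, d, *bound):
        return False        # whole cluster rejected with one cheap test
    return any(hit_sphere(o, d, c, r) for (c, r) in cluster)

def count_tests(use_bound, nrays=200):
    """Fire the same ray bundle and report how many tests were run."""
    global tests
    tests = 0
    random.seed(2)
    for _ in range(nrays):
        x, y = random.uniform(-5, 5), random.uniform(-5, 5)
        n = (x * x + y * y + 100.0) ** 0.5
        trace((0.0, 0.0, 0.0), (x / n, y / n, 10.0 / n), use_bound)
    return tests
```

Comparing `count_tests(False)` with `count_tests(True)` shows the bound sharply cutting the number of intersection tests, since every ray that misses the bounding sphere skips all 100 exact tests; the paper goes further by choosing bounding-volume shapes per object and exploiting a hierarchy.
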