CumInCAD is a cumulative index of publications in Computer-Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD Futures

Hits 1 to 20 of 98

_id a6f1
authors Bridges, A.H.
year 1986
title Any Progress in Systematic Design?
source Computer-Aided Architectural Design Futures [CAAD Futures Conference Proceedings / ISBN 0-408-05300-3] Delft (The Netherlands), 18-19 September 1985, pp. 5-15
summary In order to discuss this question it is necessary to reflect awhile on design methods in general. The usual categorization discusses 'generations' of design methods, but Levy (1981) proposes an alternative approach. He identifies five paradigm shifts during the course of the twentieth century which have influenced the design methods debate. The first paradigm shift was achieved by 1920, when concern with industrial arts could be seen to have replaced concern with craftsmanship. The second shift, occurring in the early 1930s, resulted in the conception of a design profession. The third happened in the 1950s, when the design methods debate emerged; the fourth took place around 1970 and saw the establishment of 'design research'. Now, in the 1980s, we are going through the fifth paradigm shift, associated with the adoption of a holistic approach to design theory and with the emergence of the concept of design ideology. A major point in Levy's paper was the observation that most of these paradigm shifts were associated with radical social reforms or political upheavals. For instance, we may associate concern about public participation with the 1970s shift and the possible use (or misuse) of knowledge, information and power with the 1980s shift. What has emerged, however, from the work of colleagues engaged since the 1970s in attempting to underpin the practice of design with a coherent body of design theory is increasing evidence of the fundamental nature of a person's engagement with the design activity. This includes evidence of the existence of two distinctive modes of thought, one of which can be described as cognitive modelling and the other as rational thinking. Cognitive modelling is imagining, seeing in the mind's eye. Rational thinking is linguistic thinking, engaging in a form of internal debate. Cognitive modelling is externalized through action, and through the construction of external representations, especially drawings. Rational thinking is externalized through verbal language and, more formally, through mathematical and scientific notations. Cognitive modelling is analogic, presentational, holistic, integrative and based upon pattern recognition and pattern manipulation. Rational thinking is digital, sequential, analytical, explicatory and based upon categorization and logical inference. There is some relationship between the evidence for two distinctive modes of thought and the evidence of specialization in cerebral hemispheres (Cross, 1984). Design methods have tended to focus upon the rational aspects of design and have, therefore, neglected the cognitive aspects. By recognizing that there are peculiar 'designerly' ways of thinking, which combine both types of thought process to perceive, construct and comprehend design representations mentally and then transform them into an external manifestation, current work in design theory promises at last to have some relevance to design practice.
series CAAD Futures
email
last changed 2003/11/21 15:16

_id 20ff
id 20ff
authors Derix, Christian
year 2004
title Building a Synthetic Cognizer
source Design Computation Cognition conference 2004, MIT
summary Understanding ‘space’ as a structured and dynamic system can provide us with insight into the central concept in the architectural discourse that so far has proven to withstand theoretical framing (McLuhan 1964). The basis for this theoretical assumption is that space is not a void left by solid matter but instead an emergent quality of action and interaction between individuals and groups with a physical environment (Hillier 1996). In this way it can be described as a parallel distributed system, a self-organising entity. Extrapolating from Luhmann’s theory of social systems (Luhmann 1984), a spatial system is autonomous from its progenitors, people, but remains intangible to a human observer due to its abstract nature and therefore has to be analysed by computed entities, synthetic cognisers, with the capacity to perceive. This poster shows an attempt to use another complex system, a distributed connected algorithm based on Kohonen’s self-organising feature maps – SOM (Kohonen 1997), as a “perceptual aid” for creating geometric mappings of these spatial systems that will shed light on our understanding of space by not representing space through our usual mechanics but by constructing artificial spatial cognisers with abilities to make spatial representations of their own. This allows us to be shown novel representations that can help us to see new differences and similarities in spatial configurations.
keywords architectural design, neural networks, cognition, representation
series other
type poster
email
more http://www.springer.com/computer/ai/book/978-1-4020-2392-7
last changed 2012/09/17 21:13
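
A minimal illustration of the self-organising feature map (SOM) technique cited in the record above, sketched in Python/NumPy. This is not Derix's synthetic cogniser; the grid size, learning-rate decay and neighbourhood schedule are assumptions chosen for brevity.

```python
# Minimal Kohonen SOM sketch; parameters are illustrative assumptions only.
import numpy as np

def train_som(data, grid=(10, 10), epochs=50, lr0=0.5, sigma0=3.0, seed=0):
    """Fit a 2-D self-organising map to `data` (n_samples x n_features)."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    weights = rng.random((rows, cols, data.shape[1]))
    # Grid coordinates of each node, used for neighbourhood distances.
    coords = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)
    for t in range(epochs):
        lr = lr0 * np.exp(-t / epochs)        # decaying learning rate
        sigma = sigma0 * np.exp(-t / epochs)  # shrinking neighbourhood radius
        for x in rng.permutation(data):
            # Best-matching unit: the node whose weight vector is closest to x.
            bmu = np.unravel_index(np.argmin(np.linalg.norm(weights - x, axis=-1)), (rows, cols))
            # Gaussian neighbourhood around the BMU, applied on the 2-D grid.
            g = np.exp(-np.sum((coords - np.array(bmu)) ** 2, axis=-1) / (2 * sigma ** 2))
            weights += lr * g[..., None] * (x - weights)
    return weights

# Example: map 200 random 3-D samples onto a 10 x 10 grid.
som = train_som(np.random.default_rng(1).random((200, 3)))
```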

_id 409c
authors Akin, Omer, Flemming, Ulrich and Woodbury, Robert F.
year 1984
title Development of Computer Systems for Use in Architectural Education
source 1984. ii, 47 p. includes bibliography
summary Computers have not been used in education in a way that fosters intellectual development of alternate approaches to design. Sufficient theory exists to use computing devices to support other potentially fruitful approaches to design. A proposal is made for the development of a computer system for architectural education which is built upon a particular model for design, that of rational decision making. Within the framework provided by the model, a series of courseware development projects are proposed which together with hardware acquisitions constitute a comprehensive computer system for architectural education
keywords architecture, education, design, decision making
series CADline
email
last changed 2003/06/02 13:58

_id sigradi2015_9.347
id sigradi2015_9.347
authors Andrade, Eduardo; Orellana, Nicolas; Mesa, Javiera; Felmer, Patricio
year 2015
title Spatial Configuration and Society. Comparison between the street market Tristan Matta and Tirso de Molina Market
source SIGRADI 2015 [Proceedings of the 19th Conference of the Iberoamerican Society of Digital Graphics - vol. 2 - ISBN: 978-85-8039-133-6] Florianópolis, SC, Brasil 23-27 November 2015, pp. 481-485.
summary This research aims to clarify how certain visual and accessibility patterns, in buildings and urban environments, are related to social activities that take place in them. The study, based on the theory of space syntax (Hillier & Hanson 1984; Hillier, 1996), seeks to recognize patterns of behavior, both individual and aggregate. The case studies are Tirso de Molina Market and the free street market Tristan Matta, both in Santiago de Chile.
keywords Space Syntax, Visibility, Accessibility, Connectivity, Behavior
series SIGRADI
email
last changed 2016/03/10 09:47

_id d5c8
authors Angelo, C.V., Bueno, A.P., Ludvig, C., Reis, A.F. and Trezub, D.
year 1999
title Image and Shape: Two Distinct Approaches
source III Congreso Iberoamericano de Grafico Digital [SIGRADI Conference Proceedings] Montevideo (Uruguay) September 29th - October 1st 1999, pp. 410-415
summary This paper is the result of two research projects carried out in the district of Campeche, Florianópolis, by the Grupo PET/ARQ/UFSC/CAPES. Different aspects and conceptual approaches were used to study the spatial attributes of this district located in the southern part of Santa Catarina Island. The readings and analyses of the two projects were based on graphic pictures built with Corel 7.0 and AutoCAD R14. The first project, "Urban Development in the Island of Santa Catarina: Public Space Study", examined the urban structures of Campeche based on the Space Syntax theory developed by Hillier and Hanson (1984), which relates form and the social appropriation of public spaces. The second project, "Topoceptive Characterisation of Campeche: The Image of a Locality in Expansion in the Island of Santa Catarina", based on the methodology developed by Kohlsdorf (1996) and also on the visual analysis proposed by Lynch (1960), identified characteristics of this locality with the specific goal of selecting attributes that contributed to the ideas its population held of the place. The paper consists of an initial exercise of linking these two methods in order to test the complementarity of their analytical tools. Exemplifying the analytical procedures undertaken in the two approaches, the readings carried out, both global (of the locality as a whole) and partial (of parts of the settlement), are presented and compared.
series SIGRADI
email
last changed 2016/03/10 09:47

_id 36
authors González, Carlos Guillermo
year 1998
title Una Tecnología Digital Para el Diseño: El Tde-Ac (A Digital Technology for Design: The Tde-Ac)
source II Seminario Iberoamericano de Grafico Digital [SIGRADI Conference Proceedings / ISBN 978-97190-0-X] Mar del Plata (Argentina) 9-11 september 1998, pp. 274-279
summary TDE is a graphic language capable of notating pure design operations, which offers an alternative to Monge and perspective drawing. The language, perfected and developed by Claudio Guerri in the late 1980s, originated in the Theory of Spatial Delimitation of César Janello (1974-1984). From 1995 onwards, within the framework of the UBACyT AR025 Project (1995-1997), software to apply the TDE through computer technology began to be developed. This work is carried out within the framework of the research program SPATIAL SEMIOTICS-DESIGN THEORY of the FADU-UBA, directed by Claudio Guerri, and is continued in the UBACyT AR014 Project (1998-2000) "TDE-AC. Graphic language. TDE computer assisted". The computer tool TDE-AC adds to this graphic language the power of processing speed and a certain autonomy in the interpretation and execution of design operations, which makes it possible to visualize results remarkably quickly in comparison with manual or intellectual work at the drawing table. Through amplified projection of the program on screen, the stage of development and effectiveness of TDE-AC will be demonstrated.
series SIGRADI
email
last changed 2016/03/10 09:52

_id ddss2008-02
id ddss2008-02
authors Gonçalves Barros, Ana Paula Borba; Valério Augusto Soares de Medeiros, Paulo Cesar Marques da Silva and Frederico de Holanda
year 2008
title Road hierarchy and speed limits in Brasília/Brazil
source H.J.P. Timmermans, B. de Vries (eds.) 2008, Design & Decision Support Systems in Architecture and Urban Planning, ISBN 978-90-6814-173-3, University of Technology Eindhoven, published on CD
summary This paper aims at exploring the theory of the Social Logic of Space, or Space Syntax, as a strategy to define parameters of road hierarchy and, if this use is found possible, to establish maximum speeds allowed in the transportation system of Brasília, the capital city of Brazil. Space Syntax – a theory developed by Hillier and Hanson (1984) – incorporates the topological relationships of space, considering the city's shape and its influence on the distribution of movements within the space. The theory's axiality method – used in this study – analyses accessibility relationships in the street network by means of the system's integration, one of its explicative variables in terms of copresence, or potential co-existence between the through-passing movements of people and vehicles (Hillier, 1996). One of the most used concepts of Space Syntax is integration, which represents the potential flow generation in the road axes and is the focus of this paper. It is believed there is a strong correlation between urban space-form configuration and the way flows and movements are distributed in the city, considering node articulations and the topological location of segments and streets in the grid (Holanda, 2002; Medeiros, 2006). For urban transportation studies, traffic-related problems are often investigated and simulated by assignment models, which are well established in traffic studies. Space Syntax, on the other hand, is a tool with few applications in transport (Barros, 2006; Barros et al, 2007), an area where configurational models are considered to present inconsistencies (cf. Cybis et al, 1996). Although this is true in some cases, it should not be generalized. Therefore, in order to simulate and evaluate Space Syntax for the traffic approach, the city of Brasília was used as a case study. The reason for the choice was the fact that the capital of Brazil is a masterpiece of modern urban design and presents a unique urban layout based on an axial grid system comprising several long express and arterial roads, each with 3 to 6 lanes.
keywords Space syntax, road hierarchy
series DDSS
last changed 2008/09/01 17:06
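
A simplified sketch of the integration idea discussed above: axial lines become graph nodes, intersecting lines are connected, and each line's mean topological depth from all others is computed (lower mean depth reading as higher integration). The exact normalisation used in Space Syntax software is not reproduced, and the axial map below is hypothetical.

```python
# Mean-depth sketch over a hypothetical axial-map graph (plain breadth-first search).
from collections import deque

def mean_depths(adjacency):
    """adjacency: dict mapping each axial line to the lines it intersects."""
    depths = {}
    for start in adjacency:
        dist = {start: 0}
        queue = deque([start])
        while queue:                      # breadth-first search from `start`
            node = queue.popleft()
            for neighbour in adjacency[node]:
                if neighbour not in dist:
                    dist[neighbour] = dist[node] + 1
                    queue.append(neighbour)
        others = [d for line, d in dist.items() if line != start]
        depths[start] = sum(others) / len(others) if others else 0.0
    return depths  # lower mean depth ~ more "integrated" line

# Hypothetical five-line axial map: A crosses B and C, B crosses D, C crosses E.
axial_map = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A", "E"], "D": ["B"], "E": ["C"]}
print(mean_depths(axial_map))  # A has the lowest mean depth, i.e. is most integrated
```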

_id c1ae
authors Gullichsen, Eric and Chang, Ernest
year 1984
title An Expert System for Generative Architectural Design
source December, 1984. pp. 253-267. includes bibliography
summary The mathematician-architect Christopher Alexander has devised a scientific theory of architectural design. He believes that all existing architectural entities can be described as interacting patterns, all possible relationships of which are governed by generative rules. These form a pattern language capable of generating design forms appropriate to a given environmental context. The complexity of interaction among these rules leads to difficulties in their representation by conventional methods. This paper presents a computer-based expert system which implements Alexander's design methodology
keywords synthesis, expert systems, CAD, patterns, design, methods, architecture, theory
series CADline
last changed 2003/06/02 10:24
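
A toy forward-chaining sketch of the idea described above, in which patterns act as generative rules that fire when their context is present. The pattern names, preconditions and additions are invented for illustration and are not taken from the authors' expert system.

```python
# Minimal forward-chaining rule sketch in the spirit of a pattern language.
rules = [
    # (pattern name, preconditions that must already hold, features the pattern adds)
    ("south-facing outdoor room", {"garden"}, {"outdoor room"}),
    ("window place",              {"outdoor room"}, {"window seat"}),
    ("light on two sides",        {"window seat"}, {"corner windows"}),
]

def generate(context):
    """Apply every rule whose preconditions are met until nothing new is added."""
    design = set(context)
    applied = []
    changed = True
    while changed:
        changed = False
        for name, pre, add in rules:
            if pre <= design and not add <= design:
                design |= add
                applied.append(name)
                changed = True
    return design, applied

design, trace = generate({"garden"})
print(trace)   # order in which patterns fired
print(design)  # resulting set of design features
```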

_id 63a9
authors Hellgardt, Michael
year 1993
title Architectural Theory and Design Grammars
doi https://doi.org/10.52842/conf.ecaade.1993.x.i6u
source [eCAADe Conference Proceedings] Eindhoven (The Netherlands) 11-13 November 1993
summary The idea of artificial brains and artificial intelligence (AI) has been subject to criticism. The objection of J. Searle, for instance, published in 1984 and addressed in part directly to one of the centres of AI, Carnegie Mellon University in Pittsburgh, is based mainly on two points: (1) interactions between physiological and mental functions, and (2) the intentionality and context-relatedness of meaning. With an emphasis on architectural design, this paper is about the second point, because the problem of meaning is a critical point in the discussion of "artificial intelligence in design" (AID). Technical parameters are incompatible with mechanisms of meaning in any field of artistic, cultural or non-technical expression. This point, that is, the relation between acts of meaning and acts of technical problem-solving and, connectedly, the relation between technological and architectural design, has been widely ignored in the discussion of AID. The development seems to be dominated by the tacit assumption that architecture can be articulated and generated purely in technical and formal terms of information processing, beyond the field of architecture itself. Design and shape grammars have become a well-established field in the discussion of AID, also with respect to architecture. But questions of architectural history and theory are touched on only incidentally and insufficiently in this discussion. The problem is, in other words, not simply to include more or less unrelated cases of architecture or architectural concepts (even famous ones, such as Laugier's original hut), but to establish structural relations between arguments of architectural theory and arguments of AID.
series eCAADe
email
last changed 2022/06/07 07:50

_id ebcc
authors Kolb, David A.
year 1984
title Experiential Learning
source Prentice Hall
summary In his book Experiential Learning: Experience as the Source of Learning and Development (1984), David Kolb introduces his experiential learning theory and provides a model for its application in schools, organizations, and virtually anywhere people are gathered together. Kolb's comprehensive and practical theory builds on the rich foundations of experience-based learning provided by John Dewey, Kurt Lewin, and Jean Piaget. We first consider the roots of his theory, after which we offer a summary of it in practice.
series other
last changed 2003/04/23 15:14

_id 20a8
authors Ruffle, Simon
year 1986
title How Can CAD Provide for the Changing Role of the Architect?
source Computer-Aided Architectural Design Futures [CAAD Futures Conference Proceedings / ISBN 0-408-05300-3] Delft (The Netherlands), 18-19 September 1985, pp. 197-199
summary At the RIBA Conference of 1981, entitled 'New Opportunities', and more recently at the 1984 ACA Annual Conference on 'Architects in Competition', there has been talk of marketing, new areas of practice, recapturing areas of practice lost to other professions, more accountability to client and public, and 'the decline of the mystique of the professional'. It is these issues, rather than technical advances in software and hardware, that will be the prime movers in getting computers into widespread practice in the future. In this chapter we will examine how changing attitudes in the profession might affect three practical issues in computing with which the author has been preoccupied in the past year. We will conclude by considering how, in future, early design stage computing may need to be linked to architectural theory, and, as this is a conference where we are encouraged to be outspoken, we will raise the issue of a computer-based theory of architecture.
series CAAD Futures
email
last changed 2003/05/16 20:58

_id c8c3
authors Schmucker, Kurt J.
year 1984
title Fuzzy Sets, Natural Language Computations, and Risk Analysis
source xv, 192 p. : ill. Rockville, Maryland: Computer science press, inc., 1984. includes bibliography: p. 155-185 and index
summary A new approach to analyzing the risks a computer system may be subject to. A non-numeric method that allows natural language expression is presented. A tutorial for implementing the ideas of fuzzy set theory in general, and of the linguistic approach to risk analysis in particular, is also provided
keywords natural languages, fuzzy logic, analysis, programming
series CADline
last changed 2003/06/02 13:58
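
A small sketch of the linguistic, fuzzy-set style of risk description referred to above: risk terms are fuzzy sets over a numeric scale, and a numeric estimate is reported as degrees of membership in those terms. The membership functions and the 0-10 scale are assumptions for illustration, not Schmucker's method.

```python
# Fuzzy linguistic risk terms over a 0-10 scale; membership functions are hypothetical.
def triangular(a, b, c):
    """Return a triangular membership function peaking at b."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

risk_terms = {
    "low":    triangular(-1, 0, 5),
    "medium": triangular(2, 5, 8),
    "high":   triangular(5, 10, 11),
}

def describe(x):
    """Degree of membership of risk value x in each linguistic term."""
    return {term: round(mu(x), 2) for term, mu in risk_terms.items()}

print(describe(6.5))  # partly "medium", partly "high"
```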

_id 676a
authors Valiant, L.G.
year 1984
title A Theory of the Learnable
source Communications of the ACM. November, 1984. vol. 27: pp. 1134-1142. includes bibliography
summary In this paper the author regards learning as the phenomenon of knowledge acquisition in the absence of explicit programming. The author gives a precise methodology for studying this phenomenon from a computational viewpoint. It consists of choosing an appropriate information gathering mechanism, the learning protocol, and exploring the class of concepts that can be learned using it in a reasonable (polynomial) number of steps. Although inherent algorithmic complexity appears to set serious limits on the range of concepts that can be learned, the author shows that there are some important nontrivial classes of propositional concepts that can be learned in a realistic sense
keywords AI, learning, natural languages, research, techniques, design, knowledge acquisition, theory
series CADline
last changed 2003/06/02 13:58
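
One concrete instance of learning in the sense described above is the classic algorithm for learning a conjunction of Boolean literals from positive examples, which succeeds in a polynomial number of steps. The sketch below is a standard textbook rendering of that idea, not code from the paper; the example data are made up.

```python
# Learn a conjunction of literals from positive examples: start with all literals
# allowed and drop any literal that a positive example contradicts.
def learn_conjunction(n_vars, positive_examples):
    """positive_examples: iterable of dicts var_index -> bool, all labelled True."""
    # For each variable, the truth values still admissible as a required literal.
    literals = {i: {True, False} for i in range(n_vars)}
    for example in positive_examples:
        for i, value in example.items():
            literals[i].discard(not value)  # a contradicted literal is removed
    # Keep only variables with exactly one surviving required value.
    return {i: vals.pop() for i, vals in literals.items() if len(vals) == 1}

# Target concept: x0 AND NOT x2 (x1 is irrelevant).
examples = [
    {0: True, 1: True,  2: False},
    {0: True, 1: False, 2: False},
]
print(learn_conjunction(3, examples))  # {0: True, 2: False}
```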

_id 44b1
authors Balas, Egon
year 1984
title On the Facial Structure of Scheduling Polyhedra
source 49 p., 6 p. of appendix : ill. Pittsburgh, PA: Design Research Center, Carnegie Mellon Univ., December, 1984. includes bibliography
summary A well-known job shop scheduling problem can be formulated as follows. Given a graph G with node set N and with directed and undirected arcs, find an orientation of the undirected arcs that minimizes the length of a longest path in G. The author treats the problem as a disjunctive program, without recourse to integer variables, and gives a partial characterization of the scheduling polyhedron P(N), i.e., the convex hull of feasible schedules. In particular, he derives all the facet-inducing inequalities for the scheduling polyhedron P(K) defined on some clique with node set K, and gives a sufficient condition for such inequalities to also induce facets of P(N). One of the results is that any inequality that induces a facet of P(H) for some H ⊂ K also induces a facet of P(K). Another is a recursive formula for deriving a facet-inducing inequality with p positive coefficients from one with p-1 positive coefficients. The author also addresses the constraint identification problem, and gives a procedure for finding an inequality that cuts off a given solution to a subset of the constraints
keywords polyhedra, graphs, optimization, convex hull
series CADline
last changed 1999/02/12 15:07
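
The problem statement above can be illustrated with a small brute-force sketch: try both orientations of every undirected arc, discard cyclic choices, and keep the orientation whose longest node-weighted path is shortest. This only demonstrates the combinatorial problem; Balas's polyhedral analysis is not reproduced, and the example graph and weights are made up.

```python
# Brute-force orientation of a tiny disjunctive graph (illustrative only).
from itertools import product

def longest_path(nodes, arcs, weight):
    """Longest node-weighted path in a DAG; returns None if `arcs` contain a cycle."""
    memo = {}

    def visit(n, on_stack):
        if n in on_stack:
            return None                  # cycle found: this orientation is infeasible
        if n in memo:
            return memo[n]
        length = weight.get(n, 0)
        for u, v in arcs:
            if u == n:
                tail = visit(v, on_stack | {n})
                if tail is None:
                    return None
                length = max(length, weight.get(n, 0) + tail)
        memo[n] = length
        return length

    results = [visit(n, frozenset()) for n in nodes]
    return None if None in results else max(results)

def best_orientation(nodes, directed, undirected, weight):
    """Try both directions for each undirected arc; keep the acyclic choice
    whose longest path (the makespan) is smallest."""
    best_value, best_arcs = None, None
    for choice in product((0, 1), repeat=len(undirected)):
        arcs = list(directed) + [(u, v) if c == 0 else (v, u)
                                 for (u, v), c in zip(undirected, choice)]
        value = longest_path(nodes, arcs, weight)
        if value is not None and (best_value is None or value < best_value):
            best_value, best_arcs = value, arcs
    return best_value, best_arcs

# Two jobs, two machines; node weights are (made-up) processing times.
nodes = ["a1", "a2", "b1", "b2"]
weight = {"a1": 3, "a2": 2, "b1": 2, "b2": 4}
directed = [("a1", "a2"), ("b1", "b2")]        # precedence within each job
undirected = [("a1", "b1"), ("a2", "b2")]      # machine conflicts to be oriented
print(best_orientation(nodes, directed, undirected, weight))
```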

_id 4685
authors Barsky, Brian A.
year 1984
title A Description and Evaluation of Various 3-D Models
source IEEE Computer Graphics and Applications. January, 1984. vol. 4: pp. 38-52 : ill. Includes bibliography
summary The use of parametric curves and surfaces for object modeling in computer graphics is becoming increasingly popular. There is sometimes, however, a reluctance to use them because it seems that the added power they give is more than offset by the complexity of their formulations and their computations. The purpose of this article is to clarify their meanings and uses and show how much they have in common behind the diversity of their formulations. The author discusses the properties and benefits of using the parametric Hermite, Coons, Bezier, B-spline, and Beta-spline curve and surface formulations
keywords Hermite, Coons, curved surfaces, Bezier, curves, B-splines, computational geometry, computer graphics
series CADline
last changed 2003/06/02 10:24
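
Of the formulations surveyed above, the cubic Bézier curve is the simplest to illustrate: a point on the curve is a Bernstein-weighted blend of four control points. A minimal sketch with made-up control points:

```python
# Evaluate a cubic Bezier curve from its four control points (Bernstein basis).
def bezier_point(p0, p1, p2, p3, t):
    """Point on the cubic Bezier curve at parameter t in [0, 1]."""
    b0 = (1 - t) ** 3
    b1 = 3 * t * (1 - t) ** 2
    b2 = 3 * t ** 2 * (1 - t)
    b3 = t ** 3
    return tuple(b0 * a + b1 * b + b2 * c + b3 * d
                 for a, b, c, d in zip(p0, p1, p2, p3))

# Hypothetical control polygon; sample the curve at a few parameter values.
controls = [(0, 0), (1, 2), (3, 3), (4, 0)]
curve = [bezier_point(*controls, t / 10) for t in range(11)]
print(curve[:3])
```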

_id c9c1
authors Basili, Victor R. and Perricone, Barry T.
year 1984
title Software Errors and Complexity : An Empirical Investigation
source Communications of the ACM. January, 1984. vol. 27: pp. 42-52 : ill. includes bibliography
summary The relationships between the frequency and distribution of errors during software development, the maintenance of the developed software, and the influence of a variety of environmental factors on software development were analyzed. These factors include the complexity of the software, the developer's experience with the application, and the reuse of existing design and code. Such relationships can not only provide an insight into the characteristics of computer software development and the effects that the environment can have on the product, but also improve its reliability and quality. The study is based on data derived from a medium-scale software development project
keywords software, engineering, programming, reliability
series CADline
last changed 2003/06/02 13:58

_id ecaadesigradi2019_449
id ecaadesigradi2019_449
authors Becerra Santacruz, Axel
year 2019
title The Architecture of ScarCity Game - The craft and the digital as an alternative design process
doi https://doi.org/10.52842/conf.ecaade.2019.3.045
source Sousa, JP, Xavier, JP and Castro Henriques, G (eds.), Architecture in the Age of the 4th Industrial Revolution - Proceedings of the 37th eCAADe and 23rd SIGraDi Conference - Volume 3, University of Porto, Porto, Portugal, 11-13 September 2019, pp. 45-52
summary The Architecture of ScarCity Game is a board game used as a pedagogical tool that challenges architecture students by involving them in a series of experimental design sessions to understand the design process under scarcity and the actual relation between the craft and the digital. This means "pragmatic delivery processes and material constraints, where the exchange between the artisan of handmade, representing local skills and technology of the digitally conceived is explored" (Huang 2013). The game focuses on understanding the different variables of the crafted design process of traditional communities under conditions of scarcity (Michel and Bevan 1992). This requires first analyzing the spatial environmental model of interaction, the available human and natural resources, and the dynamic relationship of these variables in a digital era. In the first stage (Pre-Agency), the game sets up the concept of the craft by limiting students' design exploration to a minimum possible perspective, developing locally available resources and techniques. The key elements of the design process of traditional knowledge communities have to be identified (Preez 1984). In other words, this stage is driven by limited resources + chance + contingency. In the second stage (Post-Agency), students, taking the architect's role within these communities, have to speculate and explore the interface between the craft (local knowledge and low technological tools) and the digital, represented by computational data, newly available technologies and construction. This means the introduction of strategy + opportunity + chance as part of the design process. In this sense, the game has a life beyond its mechanics. This other life challenges the participants to exploit the possibilities of breaking the actual boundaries of design. The result is a tool to challenge conventional methods of teaching and learning that prescribe a fixed design process. It confronts the rules that professionals in this field take for granted. The game simulates a 'fake' reality by exploring surveyed information in different ways. As a result, participants do not have anything 'real' to lose. Instead, they have all the freedom to innovate and be creative.
keywords Global south, scarcity, low tech, digital-craft, design process and innovation by challenge.
series eCAADeSIGraDi
email
last changed 2022/06/07 07:54

_id 6050
authors Bentley, Jon L.
year 1984
title Algorithm Design Techniques -- Programming Pearls
source Communications of the ACM. September, 1984. vol. 27: pp. 865-871 : ill
summary The problem arose in one-dimensional pattern recognition: The input is a vector X of N real numbers; the output is the maximum sum found in any contiguous subvector of the input. The problem becomes nontrivial when some of the numbers are negative. This column is built around that problem with an emphasis on the algorithms that solve it and the techniques used to design them
keywords techniques, programming, algorithms, pattern recognition
series CADline
last changed 2003/06/02 13:58
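
The maximum contiguous-subvector problem described above has a linear-time scanning solution, one of the algorithms developed in the column. A minimal sketch:

```python
# Linear-time scan (Kadane-style) for the maximum contiguous-subvector sum.
def max_subvector_sum(xs):
    """Largest sum over any contiguous (possibly empty) subvector of xs."""
    best = 0            # the empty subvector is allowed, so the answer is at least 0
    ending_here = 0     # best sum of a subvector ending at the current position
    for x in xs:
        ending_here = max(0, ending_here + x)
        best = max(best, ending_here)
    return best

print(max_subvector_sum([31, -41, 59, 26, -53, 58, 97, -93, -23, 84]))  # 187
```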

_id 6118
authors Bentley, Jon L.
year 1984
title Code Tuning -- Programming Pearls
source Communications of the ACM. February, 1984. vol. 27: pp. 91-96
summary Efficiency is one of many problems in programming, and there are many ways to achieve it. This column is about a low-level approach: 'code tuning' locates the expensive parts of an existing program and then modifies that code to improve its performance
keywords programming, search, algorithms, techniques
series CADline
last changed 2003/06/02 13:58
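
A made-up illustration of the kind of local transformation the column describes: once measurement shows that a computation inside a hot loop is expensive, hoist it into a table computed once. The function names and workload are hypothetical, not examples from the column.

```python
# Before/after code-tuning sketch: replace repeated trigonometry with a lookup table.
import math

def spiral_points_slow(n):
    """Before tuning: trigonometry is recomputed for every point, although only
    360 distinct integer angles ever occur."""
    return [(math.cos(math.radians(i % 360)), math.sin(math.radians(i % 360)))
            for i in range(n)]

# After tuning: compute the 360 distinct values once and index into the table.
_SIN_COS = [(math.cos(math.radians(a)), math.sin(math.radians(a))) for a in range(360)]

def spiral_points_fast(n):
    return [_SIN_COS[i % 360] for i in range(n)]

assert spiral_points_slow(1000) == spiral_points_fast(1000)  # same results, less work
```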

_id c159
authors Bentley, Jon L.
year 1984
title A Case Study in Applied Algorithm Design
source IEEE Computer. February, 1984. vol. 17: pp. 75-88 : ill. tables. includes bibliography
summary In this article the author describes how algorithm design techniques were used in the development of a small routine in a software system
keywords algorithms, programming, techniques
series CADline
last changed 2003/06/02 13:58
