CumInCAD is a cumulative index of publications in Computer Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD, and CAAD Futures

Hits 1 to 20 of 79

_id b190
authors Goldberg, Adele and Robson, David
year 1983
title Smalltalk-80: The language and its implementation
source New York, NY: Addison-Wesley
summary Smalltalk-80 is the classic standard Smalltalk language as described in Smalltalk-80: The Language and Its Implementation by Goldberg and Robson, commonly called "the Blue Book". Squeak implements the dialect of Smalltalk described in this book, but with a different implementation. Overview of the Smalltalk language: Smalltalk is a general-purpose, high-level programming language. It was the first original "pure" object-oriented language, though not the first to use the object-oriented concept, which is credited to Simula 67. The explosive growth of object-oriented programming (OOP) technologies began in the early 1980s with Smalltalk's introduction. Behind it was the idea that the individual human user should be the most important component of any computing system, and that programming should be a natural extension of thinking, as well as a dynamic and evolutionary process consistent with the model of human learning activity. In Smalltalk, these ideas are embodied in a framework for human-computer communication. In a sense, Smalltalk is yet another language like C and Pascal, and programs can be written in Smalltalk that have the look and feel of such conventional languages. The difference lies in the reduced amount of code, the less cryptic syntax, and code that is easier to handle for application maintenance and enhancement. But Smalltalk's most powerful feature is easy code reuse: it makes reuse of programs, routines, and subroutines (methods) far easier. Though procedural languages allow reuse too, it is harder to do and much easier to cheat. It is no surprise that Smalltalk is relatively easy to learn, mainly due to its simple syntax and semantics and its small number of concepts. Objects, classes, messages, and methods form the basis of programming in Smalltalk (a toy illustration follows this record). The notion of a human-computer interface also results in Smalltalk promoting the development of safer systems: errors in Smalltalk may be viewed as objects telling users that confusion exists as to how to perform a desired function.
series other
last changed 2003/04/23 15:14
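The abstract above names objects, classes, messages, and methods as the basis of Smalltalk programming. As a rough illustration only, here is the same message-sending idea rendered in Python; the class and its methods are invented for this sketch, and real Smalltalk code (e.g. "account deposit: 50") looks quite different:

    # Toy Python analogue of the object/class/message/method concepts
    # described above; not Smalltalk syntax, and not from the book.
    class Account:
        """An object holds state; computation happens by sending it messages."""
        def __init__(self, balance=0):
            self.balance = balance

        def deposit(self, amount):
            # Responding to the 'deposit:' message updates internal state.
            self.balance += amount
            return self

        def withdraw(self, amount):
            if amount > self.balance:
                raise ValueError("insufficient funds")
            self.balance -= amount
            return self

    # Chained message sends, loosely echoing Smalltalk cascades.
    acct = Account(100).deposit(50).withdraw(30)
    print(acct.balance)  # 120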

_id ga9809
id ga9809
authors Kälviäinen, Mirja
year 1998
title The ideological basis of generative expression in design
source International Conference on Generative Art
summary This paper will discuss issues concerning the design ideology supporting the use and development of generative design. This design ideology is based on the unique qualities of craft production and on forms or ideas from nature or the natural characteristics of materials. The main ideology presented here is the ideology of 1980s art craft production in Finland. It is connected with the general Finnish design ideology and with the design ideology of other western countries. The ideology for these professions is based on the common background of design principles stated in 19th century England. The early principles developed through the Arts and Crafts tradition, which had a great impact on design thinking in Europe and in the United States. The strong continuity of this design ideology from 19th century England to the present computerized age can be detected. The application of these design principles through different eras shows the difference in the interpretations and in the permission of natural decorative forms. The ideology of 1980s art craft in Finland supports the ideas and fulfilment of generative design in many ways. The reasons often given as the basis for making generative design with computers are in very many respects the same as the ideology for art craft. In Finland there is a strong connection between art craft and design ideology. The characteristics of craft have often been seen as the basis for industrial design skills. The main themes in the ideology of 1980s art craft in Finland can be compared to the ideas of generative design. The main issues in which the generative approach reflects a distinctive ideological thinking are: Way of life: The work is the communication of the maker's inner ideas. The concrete relationship with the environment, personality, uniqueness, communication, visionary qualities, and the development and growth of the maker are important. The experiments serve as a medium for learning. Taste and aesthetic education: The real love affair is created by the non-living object with the help of memories and thought. At their best, objects create the basis, in their stability and communication, for durable human relationships. People have warm relationships especially with handmade products in which they can detect unique qualities and the feeling that the product has been made solely for them. Counter-culture: The aim of the work is to produce alternatives to technobureaucracy and mechanical production and to bring subjective and unique experiences into the customer's monotonous life. This ideology rejects the usual standardized mass production of our times. Mythical character: There is a metamorphosis in the birth of the product. In many ways the design process is about birth and growth. The creative process is a development story of the maker. The complexity of communication is the expression of the moments that have been lived. If you can sense the process of making in the product, it becomes more real and nearer to life. Each piece of wood has its own beauty. Before you can work with it you must find the deep soul of its quality. The distinctive traits of the material, technique and object are an essential part of the metamorphosis which brings the product to life. The form is not only for form's sake but for other purposes, too. You cannot find loose forms in nature. Products have their beginnings in the material and are a part of nature.
This art craft ideology that supports the ideas of generative design can be applied either to handmade craft production or to production exploiting new technology. The unique characteristics of craft and the expression of material-based development are a way to broaden the expression and forms of industrial products. However, for a craftsperson it is not meaningful to fill the world with objects; in generative, computer-based production this is possible. But perhaps the production of unique pieces is still slower, which in that sense makes such industrial production more ecological. People will be more attached to personal and unique objects, and thus the life cycle of the objects produced will be longer.
series other
email
more http://www.generativeart.com/
last changed 2003/08/07 17:25

_id eea9
authors Weiler, Kevin
year 1980
title Polygon Comparison Using a Graph Representation
source SIGGRAPH '80 Conference Proceedings. July, 1980. vol. 14 ; no. 3: pp. 10-18 : ill. includes bibliography.
summary All of the information necessary to perform the polygon set operations (union, intersection, and difference), and therefore polygon clipping, can be generated by a single application of a process called polygon comparison. This process accepts two or more input polygons and generates one or more polygons as output. These output polygons contain unique homogeneous areas, each falling within the domain of one or more input polygons. Each output polygon is classified by the list of input polygons in which its area may be found. The union contour of all input is also generated, completing all of the information necessary to perform the polygon set operations. This paper introduces a polygon comparison algorithm which features reduced complexity due to its use of a graph data representation. The paper briefly introduces some of the possible approaches to the general problem of polygon comparison, including the polygon set and clipping problems. The new algorithm is then introduced and explained in detail. The algorithm is sufficiently general to compare sets of concave polygons with holes. More than two polygons can be compared at one time; all information for future comparisons of subsets of the original input polygon sets is available from the results of the initial application of the process. The algorithm represents polygons using a graph of the boundaries of the polygons. These graphs are embedded in a two-dimensional geometric space. The use of the graph representation simplifies the comparison process considerably by eliminating many special cases from explicit consideration. Polygon operations like the ones described above are useful in a variety of application areas, especially those which deal with problems involving two-dimensional or projected two-dimensional geometric areas. Examples include VLSI circuit design, cartographic and demographic applications, and polygon clipping for graphic applications such as viewport clipping, hidden surface and line removal, detailing, and shadowing. (A brief demonstration of these set operations follows this record.)
keywords boolean operations, clipping, graphs, polygons, computational geometry, algorithms
series CADline
last changed 1999/02/12 15:10
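Weiler's graph-based comparison algorithm is not reproduced here, but the polygon set operations it supports can be demonstrated with the shapely library. A minimal sketch of the operations themselves, assuming shapely is installed; the coordinates are invented:

    # Demonstrates the polygon set operations the paper's comparison
    # process supports -- not Weiler's graph algorithm itself.
    from shapely.geometry import Polygon

    a = Polygon([(0, 0), (4, 0), (4, 4), (0, 4)])
    b = Polygon([(2, 2), (6, 2), (6, 6), (2, 6)])

    union = a.union(b)                # area inside a or b
    intersection = a.intersection(b)  # area inside both ("a and b")
    difference = a.difference(b)      # area inside a only

    # Each output region is homogeneous: all its points lie in the same
    # subset of input polygons, as in the paper's output classification.
    print(union.area, intersection.area, difference.area)  # 28.0 4.0 12.0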

_id cf2011_p170
id cf2011_p170
authors Barros, Mário; Duarte, José; Chaparro, Bruno
year 2011
title Thonet Chairs Design Grammar: a Step Towards the Mass Customization of Furniture
source Computer Aided Architectural Design Futures 2011 [Proceedings of the 14th International Conference on Computer Aided Architectural Design Futures / ISBN 9782874561429] Liege (Belgium) 4-8 July 2011, pp. 181-200.
summary The paper presents the first phase of research currently under development that is focused on encoding the Thonet design style into a generative design system using a shape grammar. The ultimate goal of the work is the design and production of customizable chairs using computer-assisted tools, establishing a feasible practical model of the paradigm of mass customization (Davis, 1987). The current research phase encompasses three steps: (1) codification of the rules describing the Thonet design style into a shape grammar; (2) implementation of the grammar in a computer tool as a parametric design; and (3) rapid prototyping of customized chair designs within the style. Future phases will address the transformation of the Thonet grammar to create a new style and the production of real chair designs in this style using computer-aided manufacturing. Beginning in the 1830s, the Austrian furniture designer Michael Thonet began experimenting with steam-bending beech in order to produce lighter furniture using fewer components than was standard at the time. Using the same construction principles and standardized elements, Thonet produced different chair designs with a strong formal resemblance, creating his own design language. The kit-assembly principle, the reduced number of elements, industrial efficiency, and the modular approach to furniture design as a system of interchangeable elements that may be used to assemble different objects enabled him to become a pioneer of mass production (Noblet, 1993). The most paradigmatic example of this vision of furniture design is chair No. 14, produced in 1858 and composed of six structural elements. Due to its simplicity, lightness, and ability to be stored in flat or cubic packaging for individual or collective transportation, respectively, No. 14 became one of the best-selling chairs worldwide, and it is still in production today. Iconic examples of mass production are formally studied to provide insights for mass customization studies. The study of the shape grammar for the generation of Thonet chairs aimed to ensure rules that would make possible the reproduction of the selected corpus, as well as allow for the generation of new chairs within the developed grammar. Due to the wide variety of Thonet chairs, six chairs were randomly chosen to infer the grammar, which was then fine-tuned by checking whether it could account for the generation of other designs not in the original corpus. Shape grammars (Stiny and Gips, 1972) have been used with success both in the analysis and in the synthesis of designs at different scales, from product design to building and urban design. In particular, the use of shape grammars has been efficient in the characterization of objects' styles and in the generation of new designs within the analyzed style, and it makes design rules amenable to computer implementation (Duarte, 2005). The literature includes one other example of a grammar for chair design, by Knight (1980). In the second step of the current research phase, the outlined shape grammar was implemented in a computer program to assist the designer in conceiving and producing customized chairs using a digital design process. This implementation was developed in Catia by converting the grammar into an equivalent parametric design model. In the third phase, physical models of existing and new chair designs were produced using rapid prototyping.
The paper describes the grammar, its computer implementation as a parametric model, and the rapid prototyping of physical models. The generative potential of the proposed digital process is discussed in the context of enabling the mass customization of furniture. The role of the furniture designer in the new paradigm and ideas for further work are also discussed. (A toy rule-application sketch follows this record.)
keywords Thonet; furniture design; chair; digital design process; parametric design; shape grammar
series CAAD Futures
email
last changed 2012/02/11 19:21
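As a toy sketch of the rule-application idea behind shape grammars, the recursive rewriting below uses invented symbols and rules; the authors' actual grammar rewrites labelled shapes with geometric parameters, not strings:

    # Minimal rewriting sketch of grammar-style rule application.
    # Symbols and rules are hypothetical, not the published Thonet grammar.
    rules = {
        "chair": ["frame", "seat", "back"],
        "frame": ["bent-rod", "bent-rod", "leg-ring"],
        "back":  ["bent-rod"],
    }

    def derive(symbol):
        """Apply rules recursively until only terminal elements remain."""
        if symbol not in rules:
            return [symbol]  # terminal: a concrete element
        parts = []
        for s in rules[symbol]:
            parts.extend(derive(s))
        return parts

    print(derive("chair"))
    # ['bent-rod', 'bent-rod', 'leg-ring', 'seat', 'bent-rod']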

_id e825
authors Baybars, Ilker and Eastman, Charles M.
year 1980
title Enumerating Architectural Arrangements by Generating Their Underlying Graphs
source Environment and Planning B. 1980. vol. 7: pp. 289-310 : ill. includes bibliography. -- See also 'Enumerating Architectural Arrangements: Comment on a Recent Paper by Baybars and Eastman' by C.F. Earl
summary One mathematical correspondence to the partitioning of the plane is a Weighted Plane Graph (WPG). This paper first focuses on the systematic generation of WPGs, in a fashion similar to crystal growth. During this process, the WPGs are represented by adjacency matrices. The authors thus present a method for embedding a WPG in the plane, given its adjacency matrix. These graphs can then be mapped into floor plans. The common practice here is the use of the `geometric dual' of a WPG. The authors propose, instead, the use of the `pseudogeometric dual' of a WPG directly, to translate (part of) a design brief into alternative spatial layouts. Also discussed is the ability to create courtyards and/or circulation spaces given a specific WPG, without increasing the size of the problem. (A planarity-check sketch follows this record.)
keywords enumeration, architecture, floor plans, graphs, design process, automation, algorithms, space allocation, CAD
series CADline
email
last changed 2003/05/17 10:15
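One step in the pipeline above is deciding whether a graph given by an adjacency matrix can be embedded in the plane. A sketch of that check using numpy and networkx; the matrix is an invented room-adjacency example, and the authors' generation and dual-mapping procedures are not shown:

    # Planarity (plane-embeddability) check for a graph given by an
    # adjacency matrix; a sketch of one step, not the authors' method.
    import numpy as np
    import networkx as nx

    A = np.array([  # hypothetical room-adjacency matrix
        [0, 1, 1, 1],
        [1, 0, 1, 0],
        [1, 1, 0, 1],
        [1, 0, 1, 0],
    ])

    G = nx.from_numpy_array(A)
    is_planar, embedding = nx.check_planarity(G)
    print(is_planar)  # True: this graph admits a plane embedding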

_id 0189
authors Brodlie, K.W. (editor)
year 1980
title Mathematical Methods in Computer Graphics and Design
source xi, 147 p. : ill. New York: Academic Press, 1980. includes subject index
summary Based on the proceedings of the Conference on Mathematical Methods in Computer Graphics and Design, organized by the Institute of Mathematics and Its Applications and held at the University of Leicester on September 28th, 1978.
keywords algorithms, geometric modeling, techniques, computer graphics, mathematics
series CADline
last changed 2003/06/02 13:58

_id ddss2004_ra-33
id ddss2004_ra-33
authors Diappi, L., P. Bolchi, and M. Buscema
year 2004
title Improved Understanding of Urban Sprawl Using Neural Networks
source Van Leeuwen, J.P. and H.J.P. Timmermans (eds.) Recent Advances in Design & Decision Support Systems in Architecture and Urban Planning, Dordrecht: Kluwer Academic Publishers, ISBN: 1-4020-2408-8, p. 33-49
summary It is widely accepted that the spatial pattern of settlements is a crucial factor affecting quality of life and environmental sustainability, but few recent studies have attempted to examine the phenomenon of sprawl by modelling the process rather than adopting a descriptive approach. The issue was partly addressed by models of land use and transportation which were mainly developed in the UK and US in the 1970s and 1980s, but the major advances were made in the area of modelling transportation, while very little was achieved in the area of spatial and temporal land use. Models of land use and transportation are well-established tools, based on explicit, exogenously formulated rules within a theoretical framework. The new approaches of artificial intelligence, and in particular systems involving parallel processing (Neural Networks, Cellular Automata and Multi-Agent Systems), defined by the expression “Neurocomputing”, allow problems to be approached in the reverse, bottom-up direction by discovering rules, relationships and scenarios from a database. In this article we examine the hypothesis that territorial micro-transformations occur according to a local logic, i.e. according to use, accessibility, the presence of services and conditions of centrality, periphericity or isolation of each territorial “cell” relative to its surroundings. The prediction capabilities of different supervised Neural Network architectures are applied to the southern metropolitan area of Milan at two different temporal thresholds and discussed. Starting from data on land use in 1980 and 1994 and by subdividing the area into square cells on an orthogonal grid, the model produces a spatial and functional map of urbanisation in 2008. Applying SOM (Self-Organizing Map) processing to the database allows the typologies of transformation to be identified, i.e. the classes of area which are transformed in the same way and which give rise to territorial morphologies; this is an interesting by-product of the approach. (A toy supervised-classification sketch follows this record.)
keywords Neural Networks, Self-Organizing Maps, Land-Use Dynamics, Supervised Networks
series DDSS
last changed 2004/07/03 22:13
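A toy version of the supervised, cell-based setup the abstract describes, using scikit-learn; the features, data, and network size are invented stand-ins for the authors' models and dataset:

    # Toy supervised grid-cell prediction: features of a cell and its
    # surroundings at time t predict its land-use class at time t+1.
    # Random stand-in data; not the authors' networks or dataset.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    n_cells = 500
    # Hypothetical per-cell features: current use, accessibility,
    # nearby services, share of urbanised neighbours.
    X = rng.random((n_cells, 4))
    y = (0.5 * X[:, 0] + X[:, 3] > 0.8).astype(int)  # 1 = cell urbanises

    model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000,
                          random_state=0)
    model.fit(X[:400], y[:400])
    print(model.score(X[400:], y[400:]))  # held-out accuracy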

_id acadia07_164
id acadia07_164
authors Diniz, Nancy; Turner, Alasdair
year 2007
title Towards a Living Architecture
doi https://doi.org/10.52842/conf.acadia.2007.164
source Expanding Bodies: Art • Cities• Environment [Proceedings of the 27th Annual Conference of the Association for Computer Aided Design in Architecture / ISBN 978-0-9780978-6-8] Halifax (Nova Scotia) 1-7 October 2007, 164-173
summary Interaction is the latest currency in architecture, as responsive components are now reacting to the inhabitant of the space. These components are designed and installed by the architect with a view to the phenomenology of space, where the experience of the environment is previewed and pre-constructed before it is translated into the conception of the space. However, this traditional approach to new technology leaves no scope for the architecture to be alive in and of itself, and thus the installed piece quickly becomes just that: an installation, isolated and uncontained by its environment. In this paper, we argue that a way to approach a responsive architecture is to design for a piece that is truly living, and in order to propose a living architecture we first need to understand what the architecture of a living system is. This paper suggests a conceptual framework based on the theory of Autopoiesis (Maturana and Varela 1980) in order to create a “self-producing” system through an experiment entitled “The Life of a Wall”. The wall has a responsive membrane controlled by a genetic algorithm that reconfigures its behaviour and learns to adapt itself continually to the evolutionary properties of the environment, thus becoming a situated, living piece.
series ACADIA
email
last changed 2022/06/07 07:55

_id 6a59
authors Franklin, Randolph
year 1980
title A Linear Time Exact Hidden Surface Algorithm
source SIGGRAPH '80 Conference Proceedings. July, 1980. vol. 14 ; no. 3: pp. 117-133 : ill. includes bibliography
summary This paper presents a new hidden surface algorithm. Its output is the set of the visible pieces of edges and faces, and is as accurate as the arithmetic precision of the computer. Thus calculating the hidden surfaces for a higher resolution device takes no more time. If the faces are independently and identically distributed, then the execution time is linear in the number of faces. In particular, the execution time does not increase with the depth complexity. This algorithm overlays a grid on the screen whose fineness depends on the number and size of the faces. Edges and faces are sorted into grid cells. Only objects in the same cell can intersect or hide each other. Also, if a face completely covers a cell, then nothing behind it in the cell is relevant. Three programs have tested this algorithm. The first verified the variable grid concept on 50,000 intersecting edges. The second verified the linear time, fast speed, and irrelevance of depth complexity for hidden lines on 10,000 spheres. This also tested depth complexities up to 30, and showed that perspective scenes with the farther objects smaller are even faster to calculate. The third verified this for hidden surfaces on 3,000 squares. (A grid-bucketing sketch follows this record.)
keywords hidden surfaces, algorithms, hidden lines, variables, grids, computer graphics, programming
series CADline
last changed 2003/06/02 13:58
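The abstract's central device, filing faces into grid cells so that only co-located objects need be tested, can be sketched as follows. This is a simplification: Franklin derives the grid fineness from the number and size of the faces and handles full 3D geometry, whereas this sketch uses 2D bounding boxes:

    # Uniform-grid bucketing sketch: each face (here a 2D bounding box)
    # is filed into every cell it overlaps, so intersection/occlusion
    # tests only consider faces sharing a cell.
    from collections import defaultdict

    CELL = 1.0  # grid spacing; Franklin adapts this to face statistics

    def cells_for_box(xmin, ymin, xmax, ymax):
        for i in range(int(xmin // CELL), int(xmax // CELL) + 1):
            for j in range(int(ymin // CELL), int(ymax // CELL) + 1):
                yield (i, j)

    faces = {"A": (0.2, 0.2, 1.5, 0.8),
             "B": (1.4, 0.4, 2.6, 1.2),
             "C": (3.0, 3.0, 3.5, 3.5)}
    grid = defaultdict(list)
    for name, box in faces.items():
        for cell in cells_for_box(*box):
            grid[cell].append(name)

    # Only faces sharing a cell are candidates: A-B, but never A-C.
    for cell, members in grid.items():
        if len(members) > 1:
            print(cell, members)  # (1, 0) ['A', 'B']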

_id b04c
authors Goerger, S., Darken, R., Boyd, M., Gagnon, T., Liles, S., Sullivan, J. and Lawson, J.
year 1996
title Spatial Knowledge Acquisition from Maps and Virtual Environments in Complex Architectural Space
source Proc. 16th Applied Behavioral Sciences Symposium, 22-23 April, U.S. Air Force Academy, Colorado Springs, CO, 1996, 6-10
summary It has often been suggested that due to its inherent spatial nature, a virtual environment (VE) might be a powerful tool for spatial knowledge acquisition of a real environment, as opposed to the use of maps or some other two-dimensional, symbolic medium. While interesting from a psychological point of view, a study of the use of a VE in lieu of a map seems nonsensical from a practical point of view. Why would the use of a VE preclude the use of a map? The more interesting investigation would be of the value added of the VE when used with a map. If the VE could be shown to substantially improve navigation performance, then there might be a case for its use as a training tool. If not, then we have to assume that maps continue to be the best spatial knowledge acquisition tool available. An experiment was conducted at the Naval Postgraduate School to determine if the use of an interactive, three-dimensional virtual environment would enhance spatial knowledge acquisition of a complex architectural space when used in conjunction with floor plan diagrams. There has been significant interest in this research area of late. Witmer, Bailey, and Knerr (1995) showed that a VE was useful in acquiring route knowledge of a complex building. Route knowledge is defined as the procedural knowledge required to successfully traverse paths between distant locations (Golledge, 1991). Configurational (or survey) knowledge is the highest level of spatial knowledge and represents a map-like internal encoding of the environment (Thorndyke, 1980). The Witmer study could not confirm if configurational knowledge was being acquired. Also, no comparison was made to a map-only condition, which we felt is the most obvious alternative. Comparisons were made only to a real world condition and a symbolic condition where the route is presented verbally.
series other
last changed 2003/04/23 15:50

_id 66eb
authors Grayer, Alan R.
year 1980
title Alternative Approaches in Geometric Modelling
source Computer Aided Design. July, 1980. vol. 12: pp. 189-192 : ill. includes bibliography
summary As systems for computer-aided design and production of mechanical parts have developed, there has arisen a need for techniques for the comprehensive description of the desired part, including its three-dimensional shape. The creation and manipulation of shapes is generally known as Geometric Modelling, but some misconceptions have arisen as to the true meaning and import of the term. The paper argues for a broad, flexible approach to the subject, allowing the use of many techniques suited to particular applications and unifying them through common data structures
keywords data structures, geometric modeling, solids
series CADline
last changed 2003/06/02 10:24

_id 76ce
authors Grimson, W.
year 1985
title Computational Experiments with a Feature Based Stereo Algorithm
source IEEE Trans. Pattern Anal. Machine Intell., Vol. PAMI-7, No. 1
summary Computational models of the human stereo system can provide insight into general information processing constraints that apply to any stereo system, either artificial or biological. In 1977, Marr and Poggio proposed one such computational model, which was characterized as matching certain feature points in difference-of-Gaussian filtered images, and using the information obtained by matching coarser resolution representations to restrict the search space for matching finer resolution representations. An implementation of the algorithm and its testing on a range of images was reported in 1980. Since then a number of psychophysical experiments have suggested possible refinements to the model and modifications to the algorithm. As well, recent computational experiments applying the algorithm to a variety of natural images, especially aerial photographs, have led to a number of modifications. In this article, we present a version of the Marr-Poggio-Grimson algorithm that embodies these modifications and illustrate its performance on a series of natural images. (A difference-of-Gaussian filtering sketch follows this record.)
series journal paper
last changed 2003/04/23 15:14
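A sketch of the difference-of-Gaussian filtering that produces the feature representation this family of algorithms matches, using scipy; the image and sigmas are invented, and the coarse-to-fine matching stage itself is not shown:

    # Difference-of-Gaussian (DoG) band-pass filtering at two scales,
    # the feature representation the matcher operates on. Sigmas are
    # illustrative; the 1.6 ratio is the classic Marr-Hildreth choice.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    rng = np.random.default_rng(1)
    image = rng.random((64, 64))  # stand-in for a natural image

    def dog(img, sigma):
        return gaussian_filter(img, sigma) - gaussian_filter(img, 1.6 * sigma)

    coarse = dog(image, 4.0)  # coarse channel restricts the search...
    fine = dog(image, 1.0)    # ...for matches in the fine channel
    print(coarse.shape, fine.shape)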

_id ddss2006-hb-187
id DDSS2006-HB-187
authors Lidia Diappi and Paola Bolchi
year 2006
title Gentrification Waves in the Inner-City of Milan - A multi agent / cellular automata model based on Smith's Rent Gap theory
source Van Leeuwen, J.P. and H.J.P. Timmermans (eds.) 2006, Innovations in Design & Decision Support Systems in Architecture and Urban Planning, Dordrecht: Springer, ISBN-10: 1-4020-5059-3, ISBN-13: 978-1-4020-5059-6, p. 187-201
summary The aim of this paper is to investigate the gentrification process by applying an urban spatial model of gentrification based on Smith's (1979; 1987; 1996) Rent Gap theory. The rich sociological literature on the topic mainly assumes gentrification to be a cultural phenomenon, namely the result of a demand pressure of the suburban middle and upper class willing to return to the city (Ley, 1980; Lipton, 1977; May, 1996). Little attempt has been made to investigate and build a sound economic explanation of the causes of the process. The Rent Gap theory (RGT) of Neil Smith still represents an important contribution in this direction. At the heart of Smith's argument is the assumption that gentrification takes place because capital returns to the inner city, creating opportunities for residential relocation and profit. This paper illustrates a dynamic model of Smith's theory through a multi-agent / cellular automata system approach (Batty, 2005) developed on a Netlogo platform. A set of behavioural rules for each agent involved (homeowner, landlord, tenant and developer, and the passive 'dwelling' agent with its rent and level of decay) is formalised. The simulations show the surge of neighbourhood degradation or renovation and population turnover, starting from different initial states of decay and estate rent values. Consistent with a Self-Organized Criticality approach, the model shows that non-linear interactions at the local level may produce different configurations of the system at the macro level. This paper represents a further development of a previous version of the model (Diappi, Bolchi, 2005). The model proposed here includes some more realistic factors inspired by the features of housing market dynamics in the city of Milan. It includes the shape of the potential rent according to city form and functions, the subdivision into areal submarkets according to the current rents, and their maintenance levels. The model has a more realistic visualisation of the city and its form, and is able to show the different dynamics of the emergent neighbourhoods in the last ten years in Milan. (A minimal rent-gap CA sketch follows this record.)
keywords Multi agent systems, Housing market, Gentrification, Emergent systems
series DDSS
last changed 2006/08/29 12:55
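A minimal cellular-automata reading of the rent-gap mechanism, with invented thresholds and update rules; the authors' Netlogo model has several agent types (homeowner, landlord, tenant, developer) and much richer dynamics:

    # Toy rent-gap CA: where the gap between potential and capitalised
    # rent exceeds a threshold, capital "returns" and the cell is
    # renovated; slow decay reopens gaps. All parameters are invented.
    import numpy as np

    rng = np.random.default_rng(2)
    size, steps, threshold = 20, 10, 0.4
    potential = rng.random((size, size))                 # potential rent
    capitalised = potential * rng.random((size, size))   # rent as used now

    for _ in range(steps):
        gap = potential - capitalised
        renovate = gap > threshold
        capitalised[renovate] = potential[renovate]  # reinvestment
        capitalised *= 0.98                          # decay reopens gaps

    print(f"{(potential - capitalised > threshold).mean():.2%} of cells gapped")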

_id 0565
authors Oxman, Robert and Oxman, Rivka
year 1990
title The Computability of Architectural Knowledge
source The Electronic Design Studio: Architectural Knowledge and Media in the Computer Era [CAAD Futures ‘89 Conference Proceedings / ISBN 0-262-13254-0] Cambridge (Massachusetts / USA), 1989, pp. 171-185
summary In an important contribution to the theoretical foundation of design computing, Mitchell noted "an increasingly urgent need to establish a demonstrably sound, comprehensive, rigorously formalized theoretical foundation upon which to base practical software development efforts" (Mitchell, 1986). In this paper we propose such a theoretical framework. A basic assumption of this work is that the advancement of design computing is dependent upon the emergence of a rigorous formulation of knowledge in design. We present a model of knowledge in architectural design which suggests a promising conceptual basis for dealing with knowledge in computer-aided design systems. We require models which can represent the formal knowledge and manipulative operations of the designer in all of their complexity; that is, formal models rather than just geometric models. Shape Grammars (Stiny, 1980) represent an example of such models, and constitute a relatively high level of design knowledge as compared to, for example, the use of symmetry operations to generate simple formal configurations. Building upon an understanding of the classes of design knowledge as the conceptual basis for formal modeling systems may contribute a new realization of the potential of the medium for design. This will require a comprehensive approach to the definition of architectural and design knowledge. We consider here the implications of a well-defined body of architectural and design knowledge for design education, and the potential mutual interaction, in a knowledge-rich environment, of design learning and CAAD learning. The computational factors connected with the representation of design knowledge and its integration in design systems are among the key problems of CAAD. Mitchell's model of knowledge in design incorporates formal knowledge in a comprehensive, multi-level, hierarchical structure in which types of knowledge are correlated with computational concepts. In the main focus of this paper we present a structured, multi-level model of design knowledge which we discuss with respect to current architectural theoretical considerations. Finally, we analyze the computational and educational relevance of such models.
series CAAD Futures
email
last changed 2003/05/16 20:58

_id c5c4
authors Samet, Hanan
year 1980
title Region Representation : Quadtrees from Boundary Codes
source Communications of the ACM. March, 1980. vol. 23: pp. 163-170 : some ill. includes bibliography
summary An algorithm is presented for constructing a quadtree for a region given its boundary in the form of a chain code. Analysis of the algorithm reveals that its execution time is proportional to the product of the perimeter and the log of the diameter of the region. (A quadtree construction sketch follows this record.)
keywords representation, data structures, quadtree, image processing
series CADline
last changed 1999/02/12 15:09
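Samet's algorithm builds the quadtree directly from the boundary chain code; the sketch below only illustrates the target representation, building a region quadtree from a binary raster instead:

    # Region quadtree from a binary raster (square, power-of-two size).
    # Illustrates the representation, not Samet's chain-code algorithm.
    import numpy as np

    def quadtree(img):
        """Return 'full'/'empty' for uniform blocks, else the four
        subtrees in NW, NE, SW, SE order."""
        if img.all():
            return "full"
        if not img.any():
            return "empty"
        h, w = img.shape[0] // 2, img.shape[1] // 2
        return [quadtree(img[:h, :w]), quadtree(img[:h, w:]),
                quadtree(img[h:, :w]), quadtree(img[h:, w:])]

    region = np.zeros((4, 4), dtype=bool)
    region[2:, :2] = True            # a block in the south-west quadrant
    print(quadtree(region))          # ['empty', 'empty', 'full', 'empty']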

_id 89c9
authors Sata, Toshio and Warman, Ernest (editor)
year 1980
title Man-Machine Communication in CAD/CAM
source IFIP WG5.2-5.3 Working Conference Proceedings on Man-Machine Communication in CAD/CAM. ix, 274 p. : ill. Amsterdam: North-Holland Pub. Co., 1980. Each paper has its own bibliography
summary Explores man-machine interaction in CAD/CAM and environmental influences upon the design and manufacturing process, asking what tools are best for communicating with a system
keywords CAD, CAM, user interface
series CADline
last changed 2003/06/02 13:58

_id a5c3
authors Er, M.C.
year 1981
title The Relations of the Computation of Fibonacci Numbers with the Polyphase Sort
source 8 p. Wollongong: Department of Computing Science, University of Wollongong, September, 1981. includes bibliography
summary The theory of the polyphase sort has simplified the mathematical derivation of Wilson and Shortt's (1980) algorithm, and offered an intuitive explanation of why the algorithms of Gries and Levin (1980) and Urbanek (1980) work. The computation of order-k Fibonacci numbers is equivalent to moving a window of a matrix upwards in a series of ideal distributions. (A sliding-window sketch follows this record.)
keywords Fibonacci, sorting, mathematics, algorithms
series CADline
last changed 2003/06/02 13:58
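The order-k Fibonacci numbers mentioned in the summary can be computed directly with a sliding window, each term being the sum of the previous k terms; a generic sketch, not Er's matrix formulation:

    # Order-k Fibonacci numbers via a moving window over the last k
    # terms; the quantity the paper relates to polyphase-sort
    # distributions, computed generically rather than by Er's method.
    from collections import deque

    def fibonacci_order_k(k, n):
        """First n order-k Fibonacci numbers: k-1 zeros, then a one,
        then each term the sum of the preceding k terms."""
        window = deque([0] * (k - 1) + [1], maxlen=k)
        out = list(window)[:n]
        while len(out) < n:
            nxt = sum(window)   # the window then moves up one position
            window.append(nxt)
            out.append(nxt)
        return out

    print(fibonacci_order_k(2, 10))  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
    print(fibonacci_order_k(3, 10))  # [0, 0, 1, 1, 2, 4, 7, 13, 24, 44]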

_id acadia21_76
id acadia21_76
authors Smith, Rebecca
year 2021
title Passive Listening and Evidence Collection
doi https://doi.org/10.52842/conf.acadia.2021.076
source ACADIA 2021: Realignments: Toward Critical Computation [Proceedings of the 41st Annual Conference of the Association of Computer Aided Design in Architecture (ACADIA) ISBN 979-8-986-08056-7]. Online and Global. 3-6 November 2021. edited by B. Bogosian, K. Dörfler, B. Farahi, J. Garcia del Castillo y López, J. Grant, V. Noel, S. Parascho, and J. Scott. 76-81.
summary In this paper, I present the commercial, urban-scale gunshot detection system ShotSpotter in contrast with a range of ecological sensing examples which monitor animal vocalizations. Gunshot detection sensors are used to alert law enforcement that a gunshot has occurred and to collect evidence. They are intertwined with processes of criminalization, in which the individual, rather than the collective, is targeted for punishment. Ecological sensors are used as a “passive” practice of information gathering which seeks to understand the health of a given ecosystem through monitoring population demographics, and to document the collective harms of anthropogenic change (Stowell and Sueur 2020). In both examples, the ability of sensing infrastructures to “join up and speed up” (Gabrys 2019, 1) is increasing with the use of machine learning to identify patterns and objects: a new form of expertise through which the differential agendas of these systems are implemented and made visible. I trace the differential agendas of these systems as they manifest through varied components: the spatial distribution of hardware in the existing urban environment and / or landscape; the software and other informational processes that organize and translate the data; the visualization of acoustical sensing data; the commercial factors surrounding the production of material components; and the apps, platforms, and other forms of media through which information is made available to different stakeholders. I take an interpretive and qualitative approach to the analysis of these systems as cultural artifacts (Winner 1980), to demonstrate how the political and social stakes of the technology are embedded throughout them.
series ACADIA
type paper
email
last changed 2023/10/22 12:06

_id 9fcb
authors Steele, Guy Lewis
year 1980
title The Definition and Implementation of a Computer Programming Language Based on Constraints
source MIT - AITR-595
summary The constraint paradigm is a model of computation in which values are deduced whenever possible, under the limitation that deductions be local in a certain sense. One may visualize a constraint 'program' as a network of devices connected by wires. Data values may flow along the wires, and computation is performed by the devices. A device computes using only locally available information (with a few exceptions), and places newly derived values on other, locally attached wires. In this way computed values are propagated. An advantage of the constraint paradigm (not unique to it) is that a single relationship can be used in more than one direction. The connections to a device are not labelled as inputs and outputs; a device will compute with whatever values are available, and produce as many new values as it can. General theorem provers are capable of such behavior, but tend to suffer from combinatorial explosion; it is not usually useful to derive all the possible consequences of a set of hypotheses. The constraint paradigm places a certain kind of limitation on the deduction process. The limitations imposed by the constraint paradigm are not the only ones possible. It is argued, however, that they are restrictive enough to forestall combinatorial explosion in many interesting computational situations, yet permissive enough to allow useful computations in practical situations. Moreover, the paradigm is intuitive: it is easy to visualize the computational effects of these particular limitations, and the paradigm is a natural way of expressing programs for certain applications, in particular relationships arising in computer-aided design. A number of implementations of constraint-based programming languages are presented. A progression of ever more powerful languages is described, complete implementations are presented, and design difficulties and alternatives are discussed. The goal approached, though not quite reached, is a complete programming system which will implicitly support the constraint paradigm to the same extent that LISP, say, supports automatic storage management. (A toy bidirectional constraint device follows this record.)
series thesis:PhD
email
more ftp://publications.ai.mit.edu/ai-publications/pdf/AITR-595.pdf
last changed 2003/02/12 22:37
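A miniature of the "device on wires" idea described above: a single adder constraint that computes with whatever values are available and deduces the missing one, so the same relation runs in any direction. A toy sketch, not Steele's language:

    # Toy constraint device: the relation a + b = c used in any
    # direction. Whichever wire is unknown (None) is deduced from the
    # other two; a miniature of propagation, not Steele's system.
    def adder(a, b, c):
        if a is None and None not in (b, c):
            a = c - b
        elif b is None and None not in (a, c):
            b = c - a
        elif c is None and None not in (a, b):
            c = a + b
        if None not in (a, b, c) and a + b != c:
            raise ValueError("contradiction: constraint unsatisfiable")
        return a, b, c

    print(adder(2, 3, None))  # (2, 3, 5): run as a forward adder
    print(adder(None, 3, 5))  # (2, 3, 5): the same relation, backwards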

_id a2d4
authors Timmer, H.G. and Stern, J.M.
year 1980
title Computation of Global Geometric Properties of Solid Objects
source Computer Aided Design. November, 1980. vol. 12: pp. 301-304 : ill. includes bibliography.
summary A computational scheme for determining global geometric properties of solid object models is presented. The method operates directly on the boundary representation of the model. The scheme is tested on a number of models produced by an experimental modeling system. Primitive objects combined for the tests are all represented in terms of parametric bicubic patches. (A boundary-integral volume sketch follows this record.)
keywords objects, solid modeling, computation, B-rep, curved surfaces
series CADline
last changed 2003/06/02 13:58
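For a boundary given as triangles rather than parametric patches, the same kind of global property can be computed with the divergence theorem by summing signed tetrahedron volumes; a sketch under that assumption, not the paper's bicubic-patch scheme:

    # Volume of a closed, consistently outward-oriented triangle mesh
    # via the divergence theorem (sum of signed tetrahedra from the
    # origin). The paper instead integrates over bicubic patches.
    import numpy as np

    def mesh_volume(vertices, triangles):
        v = np.asarray(vertices, dtype=float)
        total = 0.0
        for i, j, k in triangles:
            # Signed volume of the tetrahedron (origin, vi, vj, vk).
            total += np.dot(v[i], np.cross(v[j], v[k])) / 6.0
        return abs(total)

    # Unit cube: 8 vertices, 12 outward-oriented triangles.
    V = [(0,0,0),(1,0,0),(1,1,0),(0,1,0),(0,0,1),(1,0,1),(1,1,1),(0,1,1)]
    T = [(0,2,1),(0,3,2),(4,5,6),(4,6,7),(0,1,5),(0,5,4),
         (1,2,6),(1,6,5),(2,3,7),(2,7,6),(3,0,4),(3,4,7)]
    print(mesh_volume(V, T))  # 1.0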
