CumInCAD is a Cumulative Index of publications in Computer Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD Futures.


Hits 1 to 20 of 75

_id cf2011_p170
id cf2011_p170
authors Barros, Mário; Duarte, José; Chaparro, Bruno
year 2011
title Thonet Chairs Design Grammar: a Step Towards the Mass Customization of Furniture
source Computer Aided Architectural Design Futures 2011 [Proceedings of the 14th International Conference on Computer Aided Architectural Design Futures / ISBN 9782874561429] Liege (Belgium) 4-8 July 2011, pp. 181-200.
summary The paper presents the first phase of research, currently under development, that is focused on encoding the Thonet design style into a generative design system using a shape grammar. The ultimate goal of the work is the design and production of customizable chairs using computer-assisted tools, establishing a feasible practical model of the paradigm of mass customization (Davis, 1987). The current research phase encompasses three steps: (1) codification of the rules describing the Thonet design style into a shape grammar; (2) implementation of the grammar in a computer tool as a parametric design; and (3) rapid prototyping of customized chair designs within the style. Future phases will address the transformation of the Thonet grammar to create a new style and the production of real chair designs in this style using computer-aided manufacturing. Beginning in the 1830s, the Austrian furniture designer Michael Thonet began experimenting with steam-bending beech in order to produce lighter furniture using fewer components than was standard at the time. Using the same construction principles and standardized elements, Thonet produced different chair designs with a strong formal resemblance, creating his own design language. The kit-assembly principle, the reduced number of elements, industrial efficiency, and the modular approach to furniture design as a system of interchangeable elements that may be used to assemble different objects enabled him to become a pioneer of mass production (Noblet, 1993). The most paradigmatic example of the described vision of furniture design is chair No. 14, produced in 1858 and composed of six structural elements. Due to its simplicity, lightness, and ability to be stored in flat and cubic packaging for individual or collective transportation, respectively, No. 14 became one of the best-selling chairs worldwide, and it is still in production today. Iconic examples of mass production are formally studied to provide insights for mass customization studies. The study of the shape grammar for the generation of Thonet chairs aimed to define rules that would make it possible to reproduce the selected corpus, as well as allow for the generation of new chairs within the developed grammar. Due to the wide variety of Thonet chairs, six chairs were randomly chosen to infer the grammar, which was then fine-tuned by checking whether it could account for the generation of other designs not in the original corpus. Shape grammars (Stiny and Gips, 1972) have been used with success both in the analysis and in the synthesis of designs at different scales, from product design to building and urban design. In particular, the use of shape grammars has been efficient in the characterization of objects' styles and in the generation of new designs within the analyzed style, and it makes design rules amenable to computer implementation (Duarte, 2005). The literature includes one other example of a grammar for chair design, by Knight (1980). In the second step of the current research phase, the outlined shape grammar was implemented in a computer program to assist the designer in conceiving and producing customized chairs using a digital design process. This implementation was developed in CATIA by converting the grammar into an equivalent parametric design model. In the third step, physical models of existing and new chair designs were produced using rapid prototyping.
The paper describes the grammar, its computer implementation as a parametric model, and the rapid prototyping of physical models. The generative potential of the proposed digital process is discussed in the context of enabling the mass customization of furniture. The role of the furniture designer in the new paradigm and ideas for further work are also discussed. (A generic rule-rewriting sketch follows this entry.)
keywords Thonet; furniture design; chair; digital design process; parametric design; shape grammar
series CAAD Futures
email
last changed 2012/02/11 19:21
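The entry above describes encoding a design style as rules that are applied to derive new designs. As a rough illustration only, here is a minimal set-grammar style rewriting loop in Python; the Part labels, dimensions, and rules are invented for illustration and are not the Thonet grammar or the CATIA parametric model described in the paper.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Part:
        label: str            # e.g. "seat", "leg", "backrest"
        params: tuple = ()    # illustrative numeric parameters (mm)

    def rule_add_legs(design):
        """If the design has a seat but no legs yet, add four legs of equal height."""
        labels = {p.label for p in design}
        if "seat" in labels and "leg" not in labels:
            return design | {Part("leg", (450, i)) for i in range(4)}  # index keeps parts distinct
        return None

    def rule_add_backrest(design):
        """Once seat and legs exist, add a bent-wood backrest hoop."""
        labels = {p.label for p in design}
        if {"seat", "leg"} <= labels and "backrest" not in labels:
            return design | {Part("backrest", (820,))}
        return None

    def derive(axiom, rules):
        """Apply rules until none of them matches; return the derived design."""
        design = set(axiom)
        changed = True
        while changed:
            changed = False
            for rule in rules:
                result = rule(design)
                if result is not None and result != design:
                    design, changed = result, True
        return design

    chair = derive({Part("seat", (430,))}, [rule_add_legs, rule_add_backrest])
    print(sorted(p.label for p in chair))   # ['backrest', 'leg', 'leg', 'leg', 'leg', 'seat']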

_id 241c
authors Boehm, Wolfgang
year 1980
title Inserting New Knots into B-spline Curves
source IPC Business Press. July, 1980. vol. 12: pp. 199-201 : ill. includes bibliography
summary For some applications, further subdivision of a segment of a B-spline curve or B-spline surface is desirable. This paper provides an algorithm for doing so. Its structure is similar to de Boor's algorithm for the calculation of a point on a curve. An application of the subdivision is illustrated. (A generic knot-insertion sketch follows this entry.)
keywords algorithms, B-splines, curves, curved surfaces
series CADline
last changed 1999/02/12 15:07
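The entry above only names the algorithm. As background, here is a minimal sketch of single knot insertion for a degree-p B-spline in the form usually attributed to Boehm; the variable names and the cubic example are mine, not taken from the paper.

    import numpy as np

    def insert_knot(knots, ctrl, p, t):
        """Insert the parameter value t once into a degree-p B-spline.

        knots: non-decreasing knot vector of length len(ctrl) + p + 1
        ctrl:  (n+1, d) array of control points
        Returns the refined (knots, control points); the curve itself is unchanged.
        """
        knots = np.asarray(knots, dtype=float)
        ctrl = np.asarray(ctrl, dtype=float)
        k = np.searchsorted(knots, t, side="right") - 1   # knot span containing t
        new_ctrl = []
        for i in range(len(ctrl) + 1):
            if i <= k - p:
                new_ctrl.append(ctrl[i])                  # unaffected control points
            elif i >= k + 1:
                new_ctrl.append(ctrl[i - 1])              # shifted control points
            else:                                         # affine combination of two old points
                a = (t - knots[i]) / (knots[i + p] - knots[i])
                new_ctrl.append((1.0 - a) * ctrl[i - 1] + a * ctrl[i])
        return np.insert(knots, k + 1, t), np.array(new_ctrl)

    # cubic example: clamped knot vector, zig-zag control polygon, insert t = 1.5
    knots = [0, 0, 0, 0, 1, 2, 3, 3, 3, 3]
    ctrl = [[0, 0], [1, 2], [2, -1], [3, 2], [4, 0], [5, 1]]
    print(insert_knot(knots, ctrl, 3, 1.5)[1])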

_id ddss2004_ra-33
id ddss2004_ra-33
authors Diappi, L., P. Bolchim, and M. Buscema
year 2004
title Improved Understanding of Urban Sprawl Using Neural Networks
source Van Leeuwen, J.P. and H.J.P. Timmermans (eds.) Recent Advances in Design & Decision Support Systems in Architecture and Urban Planning, Dordrecht: Kluwer Academic Publishers, ISBN: 14020-2408-8, p. 33-49
summary It is widely accepted that the spatial pattern of settlements is a crucial factor affecting quality of life and environmental sustainability, but few recent studies have attempted to examine the phenomenon of sprawl by modelling the process rather than adopting a descriptive approach. The issue was partly addressed by models of land use and transportation, which were mainly developed in the UK and US in the 1970s and 1980s, but the major advances were made in modelling transportation, while very little was achieved in modelling spatial and temporal land use. Models of land use and transportation are well-established tools, based on explicit, exogenously formulated rules within a theoretical framework. The new approaches of artificial intelligence, and in particular systems involving parallel processing (Neural Networks, Cellular Automata and Multi-Agent Systems), defined by the expression “Neurocomputing”, allow problems to be approached in the reverse, bottom-up direction by discovering rules, relationships and scenarios from a database. In this article we examine the hypothesis that territorial micro-transformations occur according to a local logic, i.e. according to use, accessibility, the presence of services, and conditions of centrality, peripherality or isolation of each territorial “cell” relative to its surroundings. The prediction capabilities of different architectures of supervised Neural Networks are applied to the southern metropolitan area of Milan at two different temporal thresholds and discussed. Starting from data on land use in 1980 and 1994, and by subdividing the area into square cells on an orthogonal grid, the model produces a spatial and functional map of urbanisation in 2008. Applying SOM (Self-Organizing Map) processing to the database allows the typologies of transformation to be identified, i.e. the classes of area which are transformed in the same way and which give rise to territorial morphologies; this is an interesting by-product of the approach. (A generic SOM training sketch follows this entry.)
keywords Neural Networks, Self-Organizing Maps, Land-Use Dynamics, Supervised Networks
series DDSS
last changed 2004/07/03 22:13
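The abstract above mentions Self-Organizing Map processing of cell-level data to identify transformation typologies. The following is a generic minimal SOM training loop in NumPy; the grid size, learning schedule, and toy data are illustrative assumptions, not the authors' configuration.

    import numpy as np

    def train_som(data, grid=(8, 8), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
        """Train a small Self-Organizing Map; data is (n_samples, n_features)."""
        rng = np.random.default_rng(seed)
        rows, cols = grid
        w = rng.random((rows, cols, data.shape[1]))        # codebook vectors
        yy, xx = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
        coords = np.stack([yy, xx], axis=-1).astype(float) # map coordinates of each unit

        n_steps, step = epochs * len(data), 0
        for _ in range(epochs):
            for x in rng.permutation(data):
                frac = step / n_steps
                lr = lr0 * (1.0 - frac)                    # linearly decaying learning rate
                sigma = sigma0 * (1.0 - frac) + 1e-3       # shrinking neighbourhood radius
                dists = np.linalg.norm(w - x, axis=-1)     # distance of every unit to the sample
                bmu = np.unravel_index(np.argmin(dists), dists.shape)
                # Gaussian neighbourhood around the best-matching unit on the map grid
                g = np.exp(-np.sum((coords - np.array(bmu, float)) ** 2, axis=-1)
                           / (2 * sigma ** 2))
                w += lr * g[..., None] * (x - w)           # pull neighbouring units towards x
                step += 1
        return w

    # toy usage: 500 grid cells described by 6 land-use features
    cells = np.random.default_rng(1).random((500, 6))
    print(train_som(cells).shape)   # (8, 8, 6); each unit is one transformation prototype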

_id 076e
authors Ennis, G. and Lindsay, M.
year 1999
title VRML Possibilities: The evolution of the Glasgow Model
source Proceedings of International Conference on Virtual Systems and MultiMedia. University of Abertay. Dundee
summary During the 1980s, ABACUS, a research unit at the University of Strathclyde, developed an interest in the ability to model and manipulate large geometrical databases of urban topography. Initially, this interest lay solely in the ability to source, capture and store the relevant data. However, once constructed, these models proved genuinely useful to a wide range of users, and there was soon a demand for more functionality relating to the manipulation not just of the graphics but also of the range of urban attributes. Although a number of improvements were implemented, there were drawbacks to the wide adoption of the software produced. The problems were almost all due to deficiencies in the hardware and software systems then available to the professions, and although this strand of research continued to be pursued, most of the development had to be focused on research applications and deployment. However, the recent advent of the Virtual Reality Modelling Language (VRML) standards has rekindled interest in this field, since this language enables many of the issues that have proved problematic in the past to be addressed and solved. The potential now exists to provide wide access to large-scale urban models. This paper focuses on the application of VRML as applied to the 'Glasgow Model'.
series other
email
last changed 2003/04/23 15:50

_id b190
authors Goldberg, Adele and Robson, David
year 1983
title Smalltalk-80: The language and its implementation
source New York, NY: Addison Wesley Co
summary Smalltalk-80 is the classic standard Smalltalk language as described in Smalltalk-80: The Language and Its Implementation by Goldberg and Robson. This book is commonly called "the Blue Book". Squeak implements the dialect of Smalltalk described in this book, but has a different implementation. Smalltalk is a general-purpose, high-level programming language. It was the first original "pure" object-oriented language, but not the first to use the object-oriented concept, which is credited to Simula 67. The explosive growth of Object-Oriented Programming (OOP) technologies began in the early 1980s, with Smalltalk's introduction. Behind it was the idea that the individual human user should be the most important component of any computing system, and that programming should be a natural extension of thinking, as well as a dynamic and evolutionary process consistent with the model of human learning activity. In Smalltalk, these ideas are embodied in a framework for human-computer communication. In a sense, Smalltalk is yet another language like C and Pascal, and programs can be written in Smalltalk that have the look and feel of such conventional languages. The difference lies in the amount of code that can be reduced, the less cryptic syntax, and code that is easier to handle for application maintenance and enhancement. But Smalltalk's most powerful feature is easy code reuse. Smalltalk makes reuse of programs, routines, and subroutines (methods) far easier. Though procedural languages allow reuse too, it is harder to do, and much easier to cheat. It is no surprise that Smalltalk is relatively easy to learn, mainly due to its simple syntax and semantics and its few concepts. Objects, classes, messages, and methods form the basis of programming in Smalltalk. The notion of the human-computer interface also results in Smalltalk promoting the development of safer systems. Errors in Smalltalk may be viewed as objects telling users that confusion exists as to how to perform a desired function.
series other
last changed 2003/04/23 15:14

_id 76ce
authors Grimson, W.
year 1985
title Computational Experiments with a Feature Based Stereo Algorithm
source IEEE Trans. Pattern Anal. Machine Intell., Vol. PAMI-7, No. 1
summary Computational models of the human stereo system can provide insight into general information-processing constraints that apply to any stereo system, either artificial or biological. In 1977, Marr and Poggio proposed one such computational model, characterized as matching certain feature points in difference-of-Gaussian filtered images, and using the information obtained by matching coarser-resolution representations to restrict the search space for matching finer-resolution representations. An implementation of the algorithm and its testing on a range of images was reported in 1980. Since then a number of psychophysical experiments have suggested possible refinements to the model and modifications to the algorithm. As well, recent computational experiments applying the algorithm to a variety of natural images, especially aerial photographs, have led to a number of modifications. In this article, we present a version of the Marr-Poggio-Grimson algorithm that embodies these modifications and illustrate its performance on a series of natural images. (A single-scale feature-matching sketch follows this entry.)
series journal paper
last changed 2003/04/23 15:14
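The entry above describes matching feature points in difference-of-Gaussian filtered images, with coarse scales constraining the search at fine scales. The sketch below shows only the single-scale core of that idea (DoG filtering, zero-crossing features, nearest-feature matching along a row); the coarse-to-fine control, calibration, and the refinements discussed in the article are omitted, and the filter sizes and disparity window are arbitrary.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def dog_zero_crossings(img, s1=1.0, s2=1.6):
        """Features = sign changes of the difference-of-Gaussians along each row."""
        dog = gaussian_filter(img, s1) - gaussian_filter(img, s2)
        return np.sign(dog[:, :-1]) * np.sign(dog[:, 1:]) < 0   # boolean feature map

    def match_row(features_left, features_right, max_disp=8):
        """For each left feature, pick the nearest right feature within +/- max_disp columns."""
        disparities = {}
        right_cols = np.flatnonzero(features_right)
        if right_cols.size == 0:
            return disparities
        for x in np.flatnonzero(features_left):
            j = right_cols[np.argmin(np.abs(right_cols - x))]
            if abs(int(j) - int(x)) <= max_disp:
                disparities[int(x)] = int(x) - int(j)
        return disparities

    # toy usage: a smoothed random image and a horizontally shifted copy of it
    rng = np.random.default_rng(0)
    left = gaussian_filter(rng.random((64, 64)), 2.0)
    right = np.roll(left, 3, axis=1)                 # shift stands in for a disparity of 3
    fl, fr = dog_zero_crossings(left), dog_zero_crossings(right)
    print(sorted(set(match_row(fl[32], fr[32]).values())))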

_id 0a4c
authors Holt, R.C. and Hume, J.N.P.
year 1980
title Programming Standard PASCAL
source x, 381 p. Reston, Virginia: Reston Publishing Company, Inc., 1980. includes index
summary A comprehensive look at data structures, records, files, pointers and more, for effective programming using PASCAL. A practical guide book from an introduction level through advanced coverage of numerical methods, assembly language programming and compiler construction
keywords PASCAL, programming, languages, education
series CADline
last changed 2003/06/02 13:58

_id c4b8
authors Lane, Jeffrey M. and Riesenfeld, Richard F.
year 1980
title A Theoretical Development for the Computer Generation and Display of Piecewise Polynomial Surfaces
source IEEE Transactions on Pattern Analysis and Machine Intelligence. January, 1980. Vol. PAMI-2, No. 1: pp. 35-46 : ill. includes a short bibliography
summary Two algorithms for parametric piecewise polynomial evaluation and generation are described. The mathematical development of these algorithms is shown to generalize to new algorithms for obtaining curve and surface intersections and for the computer display of parametric curves and surfaces. (A generic subdivision-for-display sketch follows this entry.)
keywords display, algorithms, intersection, CAD, computer graphics, B-splines, curved surfaces
series CADline
last changed 2003/06/02 13:58
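The entry above describes subdivision-based algorithms for evaluating and displaying piecewise polynomial curves and surfaces. As related background only, here is a standard de Casteljau-style subdivision of a single Bézier segment, recursively flattening it into a polyline for display; this is a textbook member of the same family of techniques, not the paper's exact algorithms.

    import numpy as np

    def split_bezier(ctrl, t=0.5):
        """de Casteljau subdivision: split one Bezier segment into two at parameter t."""
        levels = [np.asarray(ctrl, dtype=float)]
        while len(levels[-1]) > 1:
            prev = levels[-1]
            levels.append((1 - t) * prev[:-1] + t * prev[1:])
        left = np.array([lv[0] for lv in levels])           # control points of the first half
        right = np.array([lv[-1] for lv in levels])[::-1]   # control points of the second half
        return left, right

    def flatten(ctrl, tol=1e-3):
        """Recursively subdivide until the control polygon is flat, then emit endpoints."""
        ctrl = np.asarray(ctrl, dtype=float)
        chord = ctrl[-1] - ctrl[0]
        norm = np.linalg.norm(chord)
        if norm > tol:
            # maximum distance of interior control points from the chord (2-D cross product)
            dev = max(abs(chord[0] * (p - ctrl[0])[1] - chord[1] * (p - ctrl[0])[0]) / norm
                      for p in ctrl[1:-1])
            if dev > tol:
                left, right = split_bezier(ctrl)
                return flatten(left, tol)[:-1] + flatten(right, tol)
        return [ctrl[0], ctrl[-1]]

    polyline = flatten([[0, 0], [1, 3], [3, -2], [4, 1]])   # one cubic segment -> display polyline
    print(len(polyline))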

_id 244d
authors Monedero, J., Casaus, A. and Coll, J.
year 1992
title From Barcelona. Chronicle and Provisional Evaluation of a New Course on Architectural Solid Modelling by Computerized Means
doi https://doi.org/10.52842/conf.ecaade.1992.351
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 351-362
summary The first step made at the ETSAB in the computer field goes back to 1965, when professors Margarit and Buxade acquired an IBM computer, an electromechanical machine which used punched cards and which was used to produce an innovative method of structural calculation. This method was incorporated into the academic courses and, at that time, the recurring question "should students learn programming?" was readily answered: the exercises required some knowledge of Fortran, and every student needed this knowledge to do the exercises. This method, well known in Europe at that time, also provided a service for professional practice and marked the beginning of what is now the CC (Centro de Calculo) of our school. In 1980 the School bought a PDP-11/34, a computer which had 256 Kb of RAM, two disks of 5 Mb and one of 10 Mb, and an 8-line multiplexor. Some time later the general policy of the UPC changed course, and this was related to the purchase of a VAX which is still the base of the CC and carries most of the administrative burden of the school. 1985 was probably the first year in which we can talk of a general policy of the school directed towards computers. A report was produced that year, which includes a survey addressed to the six Departments of the School (Graphic Expression, Projects, Structures, Construction, Composition and Urbanism) and contains interesting data. According to the report, there were four departments which used computers in their current courses, while the other two (Projects and Composition) did not use them at all. The main user was the Department of Structures, while the use by the remaining three was rather sporadic. The kinds of problems detected in this report are very typical: lack of resources for hardware and software and for maintenance of the few computers that the school had at that moment, and a demand (posed by the students) greatly exceeding the supply (computers and teachers). The main problem appeared to be the lack of computer graphic devices and proper software.

series eCAADe
email
last changed 2022/06/07 07:58

_id sigradi2014_172
id sigradi2014_172
authors Santos, Fábio Lopes Souza; Rafael Goffinet de Almeida
year 2014
title Dan Graham e a cidade contemporânea: dispositivos espaciais, comportamentos e relações de poder [Dan Graham and the contemporary city: spatial devices, behaviors and power relations]
source SiGraDi 2014 [Proceedings of the 18th Conference of the Iberoamerican Society of Digital Graphics - ISBN: 978-9974-99-655-7] Uruguay - Montevideo 12 - 14 November 2014, pp. 505-508
summary Dan Graham became an important reference in contemporary art, developing since the 1960s a series of works that maintain a profound relation with urban cultural phenomena. This article proposes an analysis of his works produced over the 1970s and 1980s, which present the use of technical supports such as video, exhibition and surveillance systems, and which steered his earlier aesthetic research – related to “institutional critique” – toward investigations of the power relations between objects, the public and the space where they are placed.
keywords Dan Graham; Contemporary Art; Contemporary Architecture; Contemporary City; Contemporary Spatiality
series SIGRADI
email
last changed 2016/03/10 09:59

_id 9fcb
authors Steele, Guy Lewis
year 1980
title The Definition and Implementation of a Computer Programming Language Based on Constraints
source MIT - AITR-595
summary The constraint paradigm is a model of computation in which values are deduced whenever possible, under the limitation that deductions be local in a certain sense. One may visualize a constraint 'program' as a network of devices connected by wires. Data values may flow along the wires, and computation is performed by the devices. A device computes using only locally available information (with a few exceptions), and places newly derived values on other, locally attached wires. In this way computed values are propagated. An advantage of the constraint paradigm (not unique to it) is that a single relationship can be used in more than one direction. The connections to a device are not labelled as inputs and outputs; a device will compute with whatever values are available, and produce as many new values as it can. General theorem provers are capable of such behavior, but tend to suffer from combinatorial explosion; it is not usually useful to derive all the possible consequences of a set of hypotheses. The constraint paradigm places a certain kind of limitation on the deduction process. The limitations imposed by the constraint paradigm are not the only ones possible. It is argued, however, that they are restrictive enough to forestall combinatorial explosion in many interesting computational situations, yet permissive enough to allow useful computations in practical situations. Moreover, the paradigm is intuitive: it is easy to visualize the computational effects of these particular limitations, and the paradigm is a natural way of expressing programs for certain applications, in particular relationships arising in computer-aided design. A number of implementations of constraint-based programming languages are presented. A progression of ever more powerful languages is described, complete implementations are presented, and design difficulties and alternatives are discussed. The goal approached, though not quite reached, is a complete programming system which will implicitly support the constraint paradigm to the same extent that LISP, say, supports automatic storage management. (A minimal device-and-wire propagation sketch follows this entry.)
series thesis:PhD
email
more ftp://publications.ai.mit.edu/ai-publications/pdf/AITR-595.pdf
last changed 2003/02/12 22:37
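The abstract above pictures a constraint program as devices connected by wires, with values propagated locally and the same relation usable in any direction. The following is a tiny Python analogue of that picture (not the thesis's languages or implementations): a Connector holds at most one value and an Adder device deduces whichever of its three terminals is still unknown.

    class Connector:
        """A 'wire' that holds at most one value and notifies attached constraints."""
        def __init__(self, name):
            self.name, self.value, self.constraints = name, None, []

        def set(self, value):
            if self.value is None:
                self.value = value
                for c in self.constraints:        # local propagation: wake the neighbours
                    c.propagate()
            elif self.value != value:
                raise ValueError(f"contradiction on {self.name}: {self.value} vs {value}")

    class Adder:
        """Device enforcing a + b = s; deduces whichever terminal is still unknown."""
        def __init__(self, a, b, s):
            self.a, self.b, self.s = a, b, s
            for conn in (a, b, s):
                conn.constraints.append(self)

        def propagate(self):
            a, b, s = self.a.value, self.b.value, self.s.value
            if a is not None and b is not None and s is None:
                self.s.set(a + b)
            elif a is not None and s is not None and b is None:
                self.b.set(s - a)
            elif b is not None and s is not None and a is None:
                self.a.set(s - b)

    # the same relationship used in more than one direction:
    x, y, z = Connector("x"), Connector("y"), Connector("z")
    Adder(x, y, z)
    x.set(3); z.set(10)      # y is deduced by local propagation
    print(y.value)           # -> 7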

_id 40ad
authors Yessios, Chris I.
year 1980
title Generation and Visualization of Architectural Forms with Tekton
source 1980? pp. 68-79 : ill. includes bibliography
summary Tekton is an interactive computer-aided architectural design software system. It incorporates graphic input and 3-D modeling capabilities, a potent notational system based on an algebra-like linguistic model for the representation of transformations and spatial compositions, hidden-face elimination, shadowing, and texture rendering. The latter features have been specifically designed for the visualization of architectural forms and materials, through renderings of a freehand-drawing quality. They are derived by generative semi-random models included in the system. The Tekton language allows for unlimited interactive editing and modification of previously generated compositions
keywords CAD, architecture, modeling, computer graphics, rendering
series CADline
last changed 2003/06/02 13:58

_id 0830
authors Ball, A. A.
year 1980
title How to Make the Bicubic Patch Work Using Reparametrisation
source 1980 ? 11 p. includes bibliography
summary This paper comprises a series of examples in numerical surface definition, loosely strung together, to show the practical limitations of the bicubic patch and how they can be overcome by reparametrisation. The concept of reparametrisation is more general than that used in computer-aided geometric design insofar as the reparametrisation is modeled in addition to the basic parametric equation
keywords CAD, computational geometry, curved surfaces, parametrization
series CADline
last changed 2003/06/02 13:58

_id 2fdd
authors Barsky, Brian A. and Thomas, Spencer W.
year 1980
title Transpline Curve Representation System
source April, 1980. 19 p. : ill. includes bibliography
summary An interactive curve representation system has been developed based on the concept of transforming among several parametric spline curve formulations. The available formulations are the interpolatory spline, uniform B-spline, spline under tension, and NU-spline. The system implementation is described in the context of a sample design session
keywords computational geometry, curves, representation, splines
series CADline
last changed 2003/06/02 13:58

_id 8629
authors Barzilay, Amos
year 1980
title Human Problem Solving on Master Mind
source Carnegie Mellon University
summary The purpose of this work is to analyze the task of playing Master Mind and to examine subjects' behavior in solving that task. The methods and ideas used in the work are the same as those found in the references for other tasks. The author wants to show that those ideas and methods can be used for this specific task as well; in other words, that in such a domain subjects behave as an information processing system. [includes bibliography] (A sketch of the game's feedback rule follows this entry.)
keywords Psychology, Problem Solving
series CADline
last changed 1999/02/15 15:10
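The entry above concerns human problem solving on the Master Mind task; the task itself is fixed by its feedback rule, which the short sketch below pins down in Python using the commonly stated definition (exact-position pegs and right-colour-wrong-position pegs). It is background on the task, not material from the thesis.

    from collections import Counter

    def score(guess, secret):
        """Master Mind feedback: (exact-position pegs, right-colour-wrong-position pegs)."""
        exact = sum(g == s for g, s in zip(guess, secret))
        # colour overlap regardless of position, then subtract the exact matches
        common = sum((Counter(guess) & Counter(secret)).values())
        return exact, common - exact

    print(score("RGBY", "RYGB"))   # -> (1, 3)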

_id e825
authors Baybars, Ilker and Eastman, Charles M.
year 1980
title Enumerating Architectural Arrangements by Generating Their Underlying Graphs
source Environment and Planning B. 1980. vol. 7: pp. 289- 310 : ill. includes bibliography. -- See also 'Enumerating Architectural Arrangements: Comment on a Recent Paper by Baybars and Eastman' by C.F. Earl
summary One mathematical correspondence to the partitioning of the plane is a Weighted Plane Graph (WPG). This paper first focuses on the systematic generation of WPGs, in a fashion similar to crystal growth. During this process, the WPGs are represented by adjacency matrices. The authors thus present a method for embedding a WPG in the plane, given its adjacency matrix. These graphs can then be mapped into floor plans. The common practice here is the use of the `geometric dual' of a WPG. The authors propose, instead, the use of the `pseudo-geometric dual' of a WPG directly, to translate (part of) a design brief into alternative spatial layouts. Also discussed is the ability to create courtyards and/or circulation spaces given a specific WPG, without increasing the size of the problem. (A small adjacency-matrix sketch follows this entry.)
keywords enumeration, architecture, floor plans, graphs, design process, automation, algorithms, space allocation, CAD
series CADline
email
last changed 2003/05/17 10:15
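The entry above represents candidate floor-plan arrangements as weighted plane graphs held in adjacency matrices. The sketch below only illustrates that representation: a toy weighted adjacency matrix for five rooms and a planarity check (a necessary condition for embedding the graph in the plane as a partition into rooms). Room names and weights are invented, networkx stands in for the authors' own generation and embedding procedures, and the pseudo-geometric dual is not reproduced.

    import numpy as np
    import networkx as nx

    rooms = ["hall", "living", "kitchen", "bed", "bath"]
    # entry [i, j] > 0 means rooms i and j must share a boundary; the weight is illustrative
    A = np.array([
        [0, 2, 1, 1, 1],
        [2, 0, 1, 0, 0],
        [1, 1, 0, 0, 0],
        [1, 0, 0, 0, 1],
        [1, 0, 0, 1, 0],
    ])

    G = nx.relabel_nodes(nx.from_numpy_array(A), dict(enumerate(rooms)))

    # a graph can be read as a partition of the plane (via a dual) only if it is planar
    is_planar, embedding = nx.check_planarity(G)
    print(is_planar)
    print(sorted(G.edges(data="weight")))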

_id cf2011_p127
id cf2011_p127
authors Benros, Deborah; Granadeiro, Vasco; Duarte, Jose; Knight, Terry
year 2011
title Integrated Design and Building System for the Provision of Customized Housing: the Case of Post-Earthquake Haiti
source Computer Aided Architectural Design Futures 2011 [Proceedings of the 14th International Conference on Computer Aided Architectural Design Futures / ISBN 9782874561429] Liege (Belgium) 4-8 July 2011, pp. 247-264.
summary The paper proposes integrated design and building systems for the provision of sustainable customized housing. It advances previous work by applying a methodology to generate these systems from vernacular precedents. The methodology is based on the use of shape grammars to derive and encode a contemporary system from the precedents. The combined set of rules can be applied to generate housing solutions tailored to specific user and site contexts. The provision of housing to shelter the population affected by the 2010 Haiti earthquake illustrates the application of the methodology. A computer implementation is currently under development in C# using the BIM platform provided by Revit. The world is experiencing a sharp increase in population and a strong urbanization process. These phenomena call for the development of effective means to solve the resulting housing deficit. The response of the informal sector to the problem, which relies mainly on handcrafted processes, has resulted in an increase of urban slums in many big cities, which lack sanitary and spatial conditions. The formal sector has produced monotonous environments based on the mass-production idea that one size fits all, which fails to meet individual and cultural needs. We propose an alternative approach in which mass customization is used to produce planned environments that possess qualities found in historical settlements. Mass customization, a new paradigm emerging due to the technological developments of the last decades, combines the economy of scale of mass production with the aesthetic and functional qualities of customization. Mass customization of housing is defined as the provision of houses that respond to the context in which they are built. The conceptual model used for the mass customization of housing departs from the idea of a housing type, which is the combined result of three systems (Habraken, 1988) -- spatial, building system, and stylistic -- and it includes a design system, a production system, and a computer system (Duarte, 2001). In previous work, this conceptual model was tested by developing a computer system for existing design and building systems (Benros and Duarte, 2009). The current work advances it by developing new and original design, building, and computer systems for a particular context. The urgent need to build fast in the aftermath of catastrophes quite often overrides any cultural concerns. As a result, the shelters provided in such circumstances are indistinct and impersonal. However, taking individual and cultural aspects into account might lead to a better identification of the population with their new environment, thereby minimizing the rupture caused in their lives. As the methodology to develop new housing systems is based on the idea of architectural precedents, choosing existing vernacular housing as a precedent permits the incorporation of cultural aspects and facilitates an identification of people with the new housing. In the Haiti case study, we chose as a precedent a housetype called “gingerbread houses”, which includes a wide range of houses from wealthy to very humble ones. Although the proposed design system was inspired by these houses, it was decided to adopt a contemporary take. The methodology used to devise the new type was based on two ideas: precedents and transformations in design. In architecture, the use of precedents provides designers with typical solutions for particular problems and constitutes a point of departure for a new design.
In our case, the precedent is an existing housetype. It has been shown (Duarte, 2001) that a particular housetype can be encoded by a shape grammar (Stiny, 1980), forming a design system. Studies in shape grammars have shown that the evolution of one style into another can be described as the transformation of one shape grammar into another (Knight, 1994). The methodology used builds on these ideas and comprises the following steps (Duarte, 2008): (1) selection of precedents; (2) derivation of an archetype; (3) listing of rules; (4) derivation of designs; (5) cataloguing of solutions; (6) derivation of tailored solutions.
keywords Mass customization, Housing, Building system, Sustainable construction, Life cycle energy consumption, Shape grammar
series CAAD Futures
email
last changed 2012/02/11 19:21

_id 8a27
authors Bentley, Jon L. and Carruthers, Wendy
year 1980
title Algorithms for Testing the Inclusion of Points in Polygons
source Allerton Conference on Communication, Control and Computing (18th : 1980). (10) p. includes bibliography
summary Determining whether a given point lies inside or outside a simple polygon is an important problem in many applications, including computer vision systems and computer-assisted political redistricting systems. In this paper the authors give algorithms for inclusion problems that are efficient for polygons that are 'close to convex' in a certain precise sense. An empirical study of polygons that arise in several applications shows that typical polygons are indeed 'close to convex', and a program implementing the algorithm shows that it is extremely efficient on point sets of practical sizes. (A baseline point-in-polygon sketch follows this entry.)
keywords point inclusion, polygons, algorithms, computational geometry
series CADline
last changed 2003/06/02 13:58
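The entry above concerns efficient point-in-polygon tests for polygons that are 'close to convex'. The paper's own algorithms are not reproduced here; as a baseline for comparison, the sketch below is the standard crossing-number (ray casting) test for an arbitrary simple polygon.

    def point_in_polygon(pt, poly):
        """Crossing-number test: pt is (x, y), poly is a list of (x, y) vertices in order.
        Points lying exactly on an edge may be classified either way."""
        x, y = pt
        inside = False
        n = len(poly)
        for i in range(n):
            (x1, y1), (x2, y2) = poly[i], poly[(i + 1) % n]
            if (y1 > y) != (y2 > y):                 # edge straddles the horizontal through pt
                x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
                if x < x_cross:                      # crossing lies to the right of pt
                    inside = not inside
        return inside

    square = [(0, 0), (4, 0), (4, 4), (0, 4)]
    print(point_in_polygon((2, 2), square), point_in_polygon((5, 1), square))   # True False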

_id 4580
authors Borgerson, B. R. and Johnson, Robert H.
year 1980
title Beyond CAD to Computer Aided Engineering
source (8) p. : ill. Manufacturing Data Systems Incorporated, 1980? includes bibliography
summary Current CAD systems significantly aid the drafting function and many provide some aid to selected design activities. For the development of mechanical systems, much more can be done. Future systems will aid the interactive engineering process of design, analysis, control, documentation, and manufacturing engineering. Computer based systems which address this broader spectrum of engineering activities are referred to as `Computer Aided Engineering,' or `CAE,' systems. CAE systems will use volumetric techniques to create and evaluate the individual components of a machine design in conjunction with data base management schemas to support the interrelationships of the components of machines. This paper focuses on computer assistance to the engineering of mechanical systems
keywords mechanical engineering, CAE, solid modeling, objects
series CADline
last changed 2003/06/02 13:58

_id 0105
authors Bossan, Mario and Ronchi, Alfredo M.
year 1989
title Presentazione Esperienza Didattica del Dipartimento di Ingegneria dei Sistemi Edilizi e Territoriali - Politecnico di Milano [Presentation of the Teaching Experience of the Department of Building and Territorial Systems Engineering - Politecnico di Milano]
doi https://doi.org/10.52842/conf.ecaade.1989.x.x4i
source CAAD: Education - Research and Practice [eCAADe Conference Proceedings / ISBN 87-982875-2-4] Aarhus (Denmark) 21-23 September 1989, pp. 9.8.1-9.8.19
summary This paper presents the didactic and research experience developed at the "Dipartimento di Ingegneria dei Sistemi Edilizi e Territoriali del Politecnico di Milano" in the field of Computer Aided Architectural Design (CAAD). From the early 1980s, initially using the resources available at the departmental computing centre at an experimental level, various applications of CAD techniques in the building sector were carried out at DISET (Dipartimento di Ingegneria del Politecnico di Milano). During 1983, after a three-year period of experimenting with these systems, it was decided to organise and activate a small computer-aided design centre within the department, the use of which was reserved for dissertation and research students.

series eCAADe
email
last changed 2022/06/07 07:50
