CumInCAD is a Cumulative Index of publications in Computer Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD Futures


Hits 1 to 20 of 81

_id 0105
authors Bossan, Mario and Ronchi, Alfredo M.
year 1989
title Presentazione Esperienza Didattica del Dipartimento di Ingegneria dei Sistemi Edilizi e Territoriali - Politecnico di Milano
source CAAD: Education - Research and Practice [eCAADe Conference Proceedings / ISBN 87-982875-2-4] Aarhus (Denmark) 21-23 September 1989, pp. 9.8.1-9.8.19
doi https://doi.org/10.52842/conf.ecaade.1989.x.x4i
summary Reports on the didactic and research experience in Computer Aided Architectural Design (CAAD) developed at the "Dipartimento di Ingegneria dei Sistemi Edilizi e Territoriali" of the Politecnico di Milano. From the early 1980s, initially at an experimental level and using the resources available at the departmental computing centre, various applications of CAD techniques in the building sector were carried out at DISET (Dipartimento di Ingegneria dei Sistemi Edilizi e Territoriali del Politecnico di Milano). In 1983, after a three-year period of experimenting with these systems, it was decided to organise and activate a small computer aided design centre within the department, the use of which was reserved for dissertation and research students.

series eCAADe
email
last changed 2022/06/07 07:50

_id e952
authors Carrara, Gianfranco and Paoluzzi, Alberto
year 1980
title A Systems Approach to Building Program Planning
source computer Aided Building Design Laboratory Research Report. 80 p. : ill. Rome, Italy: December, 1980. CABD LAB RR. 80-02. includes bibliography
summary In this paper, problems of design performance and of building program planning are considered from the viewpoint of general systems theory. After formalizing the concepts of requirement, performance and performance specification, it is shown that a set of building objects (spaces and constructive elements) foreseeable within a program is a semilattice, and that the ordering of constructive elements and spaces therefore corresponds to an ordering of relations among feasible 'behaviors.' A set of feasible behaviors is then presented as an abstract system, and some assumptions on which to base an input-state-output representation of it are discussed
keywords theory, methods, problem solving, architecture, design, knowledge
series CADline
last changed 2003/06/02 13:58

_id a587
authors Cohen, Elaine, Lyche, Tom and Riesenfeld, Richard F.
year 1980
title Discrete B-Splines and Subdivision Techniques in Computer-Aided Geometric Design and Computer Graphics
source computer Graphics and Image Processing. October, 1980. vol. 14: pp. 87- 111 : ill. includes bibliography
summary The relevant theory of discrete B-splines with associated new algorithms is extended to provide a framework for understanding and implementing general subdivision schemes for nonuniform B-splines. The new derived polygon corresponding to an arbitrary refinement of the knot vector for an existing B-spline curve, including multiplicities, is shown to be formed by successive evaluations of the discrete B-spline defined by the original vertices, the original knot vector, and the refined knot vector. Existing subdivision algorithms can be seen as proper special cases. General subdivision has widespread applications in computer-aided geometric design, computer graphics, and numerical analysis. The new algorithms resulting from the new theory lead to a unification of the display model, the analysis model, and other needed models into a single geometric model from which other necessary models are easily derived. New sample algorithms for interference calculation, contouring, surface rendering, and other important calculations are presented
keywords computational geometry, theory, algorithms, computer graphics, B-splines, curved surfaces
series CADline
last changed 2003/06/02 13:58
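
The refinement scheme this abstract describes can be sketched concretely. The following Python sketch (ours, not the authors' code) computes the new control polygon for an arbitrary knot refinement via the discrete B-spline recurrence of Cohen, Lyche and Riesenfeld, treating 0/0 terms as 0; `tau` is the original knot vector, `t` the refined one, and `k` the spline order (degree + 1):

```python
def oslo_alpha(tau, t, k, i, j):
    """Discrete B-spline alpha_{i,k}(j): weight of old vertex i in new vertex j."""
    if k == 1:
        return 1.0 if tau[i] <= t[j] < tau[i + 1] else 0.0
    d1 = tau[i + k - 1] - tau[i]
    d2 = tau[i + k] - tau[i + 1]
    # B-spline-like recurrence evaluated on the refined knots; 0/0 taken as 0
    a = (t[j + k - 1] - tau[i]) / d1 * oslo_alpha(tau, t, k - 1, i, j) if d1 > 0 else 0.0
    b = (tau[i + k] - t[j + k - 1]) / d2 * oslo_alpha(tau, t, k - 1, i + 1, j) if d2 > 0 else 0.0
    return a + b

def refine(ctrl, tau, t, k):
    """New control polygon for the refined knot vector t (t must contain tau)."""
    return [[sum(oslo_alpha(tau, t, k, i, j) * ctrl[i][d] for i in range(len(ctrl)))
             for d in range(len(ctrl[0]))]
            for j in range(len(t) - k)]
```

Refining the knot vector [0, 0, 0, 1, 1, 1] of a quadratic arc to include 0.5 reproduces the de Casteljau subdivision polygon, consistent with the abstract's remark that existing subdivision algorithms appear as special cases.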

_id b04c
authors Goerger, S., Darken, R., Boyd, M., Gagnon, T., Liles, S., Sullivan, J. and Lawson, J.
year 1996
title Spatial Knowledge Acquisition from Maps and Virtual Environments in Complex Architectural Space
source Proc. 16th Applied Behavioral Sciences Symposium, 22-23 April 1996, U.S. Air Force Academy, Colorado Springs, CO, 6-10
summary It has often been suggested that due to its inherent spatial nature, a virtual environment (VE) might be a powerful tool for spatial knowledge acquisition of a real environment, as opposed to the use of maps or some other two-dimensional, symbolic medium. While interesting from a psychological point of view, a study of the use of a VE in lieu of a map seems nonsensical from a practical point of view. Why would the use of a VE preclude the use of a map? The more interesting investigation would be of the added value of the VE when used with a map. If the VE could be shown to substantially improve navigation performance, then there might be a case for its use as a training tool. If not, then we have to assume that maps continue to be the best spatial knowledge acquisition tool available. An experiment was conducted at the Naval Postgraduate School to determine if the use of an interactive, three-dimensional virtual environment would enhance spatial knowledge acquisition of a complex architectural space when used in conjunction with floor plan diagrams. There has been significant interest in this research area of late. Witmer, Bailey, and Knerr (1995) showed that a VE was useful in acquiring route knowledge of a complex building. Route knowledge is defined as the procedural knowledge required to successfully traverse paths between distant locations (Golledge, 1991). Configurational (or survey) knowledge is the highest level of spatial knowledge and represents a map-like internal encoding of the environment (Thorndyke, 1980). The Witmer study could not confirm whether configurational knowledge was being acquired. Also, no comparison was made to a map-only condition, which we felt was the most obvious alternative. Comparisons were made only to a real-world condition and a symbolic condition where the route is presented verbally.
series other
last changed 2003/04/23 15:50

_id 244d
authors Monedero, J., Casaus, A. and Coll, J.
year 1992
title From Barcelona. Chronicle and Provisional Evaluation of a New Course on Architectural Solid Modelling by Computerized Means
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 351-362
doi https://doi.org/10.52842/conf.ecaade.1992.351
summary The first step made at the ETSAB in the computer field goes back to 1965, when professors Margarit and Buxade acquired an IBM computer, an electromechanical machine which used punched cards and was employed to produce an innovative method of structural calculation. This method was incorporated into the academic courses and, at that time, the repeated question "should students learn programming?" was readily answered: the exercises required some knowledge of Fortran, and every student needed this knowledge to do them. This method, well known in Europe at that time, also provided a service for professional practice and marked the beginning of what is now the CC (Centro de Calculo) of our school. In 1980 the school bought a PDP 11/34, a computer which had 256 Kb of RAM, two 5 Mb disks and one 10 Mb disk, and an 8-line multiplexer. Some time later the general policy of the UPC changed course, and this was related to the purchase of a VAX which is still the basis of the CC and carries most of the administrative burden of the school. 1985 was probably the first year in which we can talk of a general policy of the school directed towards computers. A report was made that year, which includes a survey addressed to the six departments of the school (Graphic Expression, Projects, Structures, Construction, Composition and Urbanism) and contains interesting data. According to the report, four departments used computers in their current courses, while the other two (Projects and Composition) did not use them at all. The main user was the Department of Structures, while the involvement of the remaining three was rather sporadic. The kinds of problems detected in this report are very typical: lack of resources for hardware, software and maintenance of the few computers that the school had at that moment, and a demand (posed by the students) greatly exceeding the supply (computers and teachers). The main problem appeared to be the lack of computer graphic devices and proper software.

series eCAADe
email
last changed 2022/06/07 07:58

_id ddss9860
id ddss9860
authors Vakalo, E-G. and Fahmy, A.
year 1998
title A Theoretical Framework for the Analysis and Derivation of Orthogonal Building Plans and Sections
source Timmermans, Harry (Ed.), Fourth Design and Decision Support Systems in Architecture and Urban Planning Maastricht, the Netherlands), ISBN 90-6814-081-7, July 26-29, 1998
summary Architects are generally perceived as "Formgivers with an extraordinary gift" (Ackerman, 1980:12). Implicit in this statement is the belief that the operations architects employ to compose their designs are the product of a creative faculty that is beyond the reach of rational discourse and thereby cannot be subjected to logical investigation. This view is detrimental to the advancement of knowledge about architectural composition and adversely affects both practice and education in architecture. More specifically, it prevents the architectural community from acquiring a more refined conception of how architects derive their designs. In contrast to this view, this study demonstrates that architectural form-making is amenable to logical analysis. Specifically, this is done through a theoretical and computational framework that describes and explains the tasks involved in the making of orthogonal building plans and sections. In addition to illustrating the susceptibility of architectural form-making to logical analysis, the frameworks proposed in this study overcome the limitations of previously established theories that deal with architectural form-making. These can be divided into two categories: normative and positive theories. Normative theories include architectural treatises and manifestos. A major limitation of normative theories is that they have limited explanatory power. Their concern is with promoting a specific aesthetic ideology and prescribing rules that can be used to derive compositions that conform to it. Therefore, they cannot be used to explain form-making in general. Positive frameworks, such as shape grammar, rely on rules to describe derivation and analysis processes. Nevertheless, they do not provide a comprehensive description of the tasks involved in architectural form-making. This causes the relation between the rules and compositional tasks to be ambiguous. It also adversely affects the ability of these frameworks to provide architects with a complete understanding of the role of compositional rules in derivation or analysis processes.
series DDSS
type normal paper
last changed 2010/05/16 09:11

_id cf2011_p170
id cf2011_p170
authors Barros, Mário; Duarte José, Chaparro Bruno
year 2011
title Thonet Chairs Design Grammar: a Step Towards the Mass Customization of Furniture
source Computer Aided Architectural Design Futures 2011 [Proceedings of the 14th International Conference on Computer Aided Architectural Design Futures / ISBN 9782874561429] Liege (Belgium) 4-8 July 2011, pp. 181-200.
summary The paper presents the first phase of research currently under development that is focused on encoding the Thonet design style into a generative design system using a shape grammar. The ultimate goal of the work is the design and production of customizable chairs using computer assisted tools, establishing a feasible practical model of the paradigm of mass customization (Davis, 1987). The current research phase encompasses three steps: (1) codification of the rules describing the Thonet design style into a shape grammar; (2) implementation of the grammar in a computer tool as a parametric design; and (3) rapid prototyping of customized chair designs within the style. Future phases will address the transformation of the Thonet grammar to create a new style and the production of real chair designs in this style using computer aided manufacturing. Beginning in the 1830's, Austrian furniture designer Michael Thonet began experimenting with steam-bending beech wood in order to produce lighter furniture using fewer components, compared with the standards of the time. Using the same construction principles and standardized elements, Thonet produced different chair designs with a strong formal resemblance, creating his own design language. The kit-assembly principle, the reduced number of elements, industrial efficiency, and the modular approach to furniture design as a system of interchangeable elements that may be used to assemble different objects enabled him to become a pioneer of mass production (Noblet, 1993). The most paradigmatic example of the described vision of furniture design is chair No. 14, produced in 1858 and composed of six structural elements. Due to its simplicity, lightness, and ability to be stored in flat and cubic packaging for individual and collective transportation, respectively, No. 14 became one of the most sold chairs worldwide, and it is still in production nowadays.
Iconic examples of mass production are formally studied to provide insights for mass customization studies. The study of the shape grammar for the generation of Thonet chairs aimed to ensure rules that would make possible the reproduction of the selected corpus, as well as allow the generation of new chairs within the developed grammar. Due to the wide variety of Thonet chairs, six chairs were randomly chosen to infer the grammar, which was then fine-tuned by checking whether it could account for the generation of other designs not in the original corpus. Shape grammars (Stiny and Gips, 1972) have been used with success both in the analysis and in the synthesis of designs at different scales, from product design to building and urban design. In particular, the use of shape grammars has been efficient in the characterization of objects' styles and in the generation of new designs within the analyzed style, and it makes design rules amenable to computer implementation (Duarte, 2005). The literature includes one other example of a grammar for chair design, by Knight (1980). In the second step of the current research phase, the outlined shape grammar was implemented in a computer program to assist the designer in conceiving and producing customized chairs using a digital design process. This implementation was developed in Catia by converting the grammar into an equivalent parametric design model. In the third step, physical models of existing and new chair designs were produced using rapid prototyping. The paper describes the grammar, its computer implementation as a parametric model, and the rapid prototyping of physical models. The generative potential of the proposed digital process is discussed in the context of enabling the mass customization of furniture. The role of the furniture designer in the new paradigm and ideas for further work are also discussed.
keywords Thonet; furniture design; chair; digital design process; parametric design; shape grammar
series CAAD Futures
email
last changed 2012/02/11 19:21
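
The derivation process the abstract describes can be caricatured in a few lines of code. This is only a symbolic sketch: the part names and rules below are invented, and a real shape grammar (here implemented as a parametric model in Catia) rewrites geometry, not strings:

```python
# Invented, purely illustrative rule set: each rule rewrites a labelled part
# into sub-parts, loosely in the spirit of a grammar-based derivation.
RULES = {
    "chair": ["frame", "seat"],
    "frame": ["back-hoop", "front-legs"],
    "seat":  ["seat-ring", "cane-panel"],
}

def derive(symbol, rules):
    """Apply rules recursively until only terminal parts remain."""
    if symbol not in rules:
        return [symbol]
    parts = []
    for s in rules[symbol]:
        parts.extend(derive(s, rules))
    return parts

print(derive("chair", RULES))
# -> ['back-hoop', 'front-legs', 'seat-ring', 'cane-panel']
```

In a parametric implementation each terminal would additionally carry parameters (bend radius, seat diameter, and so on) whose admissible ranges encode the style.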

_id cf2011_p127
id cf2011_p127
authors Benros, Deborah; Granadeiro Vasco, Duarte Jose, Knight Terry
year 2011
title Integrated Design and Building System for the Provision of Customized Housing: the Case of Post-Earthquake Haiti
source Computer Aided Architectural Design Futures 2011 [Proceedings of the 14th International Conference on Computer Aided Architectural Design Futures / ISBN 9782874561429] Liege (Belgium) 4-8 July 2011, pp. 247-264.
summary The paper proposes integrated design and building systems for the provision of sustainable customized housing. It advances previous work by applying a methodology to generate these systems from vernacular precedents. The methodology is based on the use of shape grammars to derive and encode a contemporary system from the precedents. The combined set of rules can be applied to generate housing solutions tailored to specific user and site contexts. The provision of housing to shelter the population affected by the 2010 Haiti earthquake illustrates the application of the methodology. A computer implementation is currently under development in C# using the BIM platform provided by Revit. The world is experiencing a sharp increase in population and a strong urbanization process. These phenomena call for the development of effective means to solve the resulting housing deficit. The response of the informal sector to the problem, which relies mainly on handcrafted processes, has resulted in an increase of urban slums in many of the big cities, which lack adequate sanitary and spatial conditions. The formal sector has produced monotonous environments based on the mass-production idea that one size fits all, which fails to meet individual and cultural needs. We propose an alternative approach in which mass customization is used to produce planned environments that possess qualities found in historical settlements. Mass customization, a new paradigm emerging from the technological developments of the last decades, combines the economy of scale of mass production with the aesthetic and functional qualities of customization. Mass customization of housing is defined as the provision of houses that respond to the context in which they are built.
The conceptual model for the mass customization of housing used here departs from the idea of a housing type, which is the combined result of three systems (Habraken, 1988) -- spatial, building, and stylistic -- and it includes a design system, a production system, and a computer system (Duarte, 2001). In previous work, this conceptual model was tested by developing a computer system for existing design and building systems (Benrós and Duarte, 2009). The current work advances it by developing new and original design, building, and computer systems for a particular context. The urgent need to build fast in the aftermath of catastrophes quite often overrides any cultural concerns. As a result, the shelters provided in such circumstances are indistinct and impersonal. However, taking individual and cultural aspects into account might lead to a better identification of the population with their new environment, thereby minimizing the rupture caused in their lives. As the methodology to develop new housing systems is based on the idea of architectural precedents, choosing existing vernacular housing as a precedent permits the incorporation of cultural aspects and facilitates an identification of people with the new housing. In the Haiti case study, we chose as a precedent a house type called "gingerbread houses", which includes a wide range of houses, from wealthy to very humble ones. Although the proposed design system was inspired by these houses, it was decided to adopt a contemporary take. The methodology to devise the new type was based on two ideas: precedents and transformations in design. In architecture, the use of precedents provides designers with typical solutions for particular problems and constitutes a departure point for a new design. In our case, the precedent is an existing house type. It has been shown (Duarte, 2001) that a particular house type can be encoded by a shape grammar (Stiny, 1980), forming a design system.
Studies in shape grammars have shown that the evolution of one style into another can be described as the transformation of one shape grammar into another (Knight, 1994). The methodology used takes off from these ideas and comprises the following steps (Duarte, 2008): (1) selection of precedents; (2) derivation of an archetype; (3) listing of rules; (4) derivation of designs; (5) cataloguing of solutions; (6) derivation of tailored solutions.
keywords Mass customization, Housing, Building system, Sustainable construction, Life cycle energy consumption, Shape grammar
series CAAD Futures
email
last changed 2012/02/11 19:21

_id 241c
authors Boehm, Wolfgang
year 1980
title Inserting New Knots into B-spline Curves
source IPC Business Press. July, 1980. vol. 12: pp. 199-201 : ill. includes bibliography
summary For some applications, further subdivision of a segment of a B-spline curve or B-spline surface is desirable. This paper provides an algorithm for this. The structure is similar to de Boor's algorithm for the calculation of a point on a curve. An application of the subdivision is illustrated
keywords algorithms, B-splines, curves, curved surfaces
series CADline
last changed 1999/02/12 15:07
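
Boehm's single-knot-insertion step admits a very short implementation. This is a minimal sketch of our own (for a clamped curve with `u` strictly inside the knot range), not the paper's published algorithm:

```python
def insert_knot(ctrl, knots, degree, u):
    """Insert knot u once; returns the new control points and knot vector."""
    # knot span l with knots[l] <= u < knots[l+1]
    l = max(i for i in range(len(knots) - 1) if knots[i] <= u < knots[i + 1])
    new_ctrl = []
    for i in range(len(ctrl) + 1):
        if i <= l - degree:                      # unaffected leading points
            new_ctrl.append(ctrl[i])
        elif i <= l:                             # blended points
            a = (u - knots[i]) / (knots[i + degree] - knots[i])
            new_ctrl.append([(1 - a) * p + a * q
                             for p, q in zip(ctrl[i - 1], ctrl[i])])
        else:                                    # unaffected trailing points
            new_ctrl.append(ctrl[i - 1])
    return new_ctrl, knots[:l + 1] + [u] + knots[l + 1:]
```

Inserting u = 0.5 into a clamped quadratic reproduces de Casteljau subdivision, reflecting the structural similarity to de Boor's evaluation algorithm noted in the summary.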

_id 2ddb
authors Davies, R.S.
year 1980
title A Review of Computer Techniques for Representation of Geometry
source 1980? pp. 213-225 : ill. includes bibliography
summary The primary role of the computer in the design process is to provide a means of recording the design, and subsequently of extracting information from the record. The choice of technique for recording geometry depends on the characteristics of the component and the nature of the information subsequently required about it. This paper reviews the principal techniques currently in use with particular emphasis on these two aspects
keywords CAD, curves, representation, geometric modeling, techniques
series CADline
last changed 2003/06/02 13:58

_id ddss2004_ra-33
id ddss2004_ra-33
authors Diappi, L., P. Bolchim, and M. Buscema
year 2004
title Improved Understanding of Urban Sprawl Using Neural Networks
source Van Leeuwen, J.P. and H.J.P. Timmermans (eds.) Recent Advances in Design & Decision Support Systems in Architecture and Urban Planning, Dordrecht: Kluwer Academic Publishers, ISBN: 14020-2408-8, p. 33-49
summary It is widely accepted that the spatial pattern of settlements is a crucial factor affecting quality of life and environmental sustainability, but few recent studies have attempted to examine the phenomenon of sprawl by modelling the process rather than adopting a descriptive approach. The issue was partly addressed by models of land use and transportation which were mainly developed in the UK and US in the 1970s and 1980s, but the major advances were made in the area of modelling transportation, while very little was achieved in the area of spatial and temporal land use. Models of land use and transportation are well-established tools, based on explicit, exogenously formulated rules within a theoretical framework. The new approaches of artificial intelligence, and in particular systems involving parallel processing (Neural Networks, Cellular Automata and Multi-Agent Systems), defined by the expression "Neurocomputing", allow problems to be approached in the reverse, bottom-up direction by discovering rules, relationships and scenarios from a database. In this article we examine the hypothesis that territorial micro-transformations occur according to a local logic, i.e. according to use, accessibility, the presence of services, and the conditions of centrality, periphery or isolation of each territorial "cell" relative to its surroundings. The prediction capabilities of different architectures of supervised neural networks, applied to the south metropolitan area of Milan at two different temporal thresholds, are implemented and discussed. Starting from data on land use in 1980 and 1994, and by subdividing the area into square cells on an orthogonal grid, the model produces a spatial and functional map of urbanisation in 2008. An application of SOM (Self-Organizing Map) processing to the database allows the typologies of transformation to be identified, i.e. the classes of area which are transformed in the same way and which give rise to territorial morphologies; this is an interesting by-product of the approach.
keywords Neural Networks, Self-Organizing Maps, Land-Use Dynamics, Supervised Networks
series DDSS
last changed 2004/07/03 22:13
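
The SOM step mentioned at the end of the summary can be illustrated generically. The sketch below is a plain one-dimensional self-organizing map in Python, not the authors' model; in their setting each data item would be a feature vector of land-use shares for one grid cell, and all names here are ours:

```python
import math
import random

def train_som(data, n_nodes, epochs=200, lr0=0.5, seed=0):
    """Train a 1-D SOM; returns the node codebook (one prototype per typology)."""
    rng = random.Random(seed)
    dim = len(data[0])
    nodes = [[rng.random() for _ in range(dim)] for _ in range(n_nodes)]
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)                        # decaying learning rate
        radius = max(1.0, n_nodes / 2 * (1 - t / epochs))  # shrinking neighbourhood
        x = data[rng.randrange(len(data))]
        # best-matching unit: the node closest to the sample
        bmu = min(range(n_nodes),
                  key=lambda i: sum((a - b) ** 2 for a, b in zip(nodes[i], x)))
        for i in range(n_nodes):                           # pull BMU and neighbours
            h = math.exp(-((i - bmu) ** 2) / (2 * radius ** 2))
            for d in range(dim):
                nodes[i][d] += lr * h * (x[d] - nodes[i][d])
    return nodes
```

Each input cell is then assigned to its nearest node, and the nodes are read off as transformation typologies.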

_id acadia07_164
id acadia07_164
authors Diniz, Nancy; Turner, Alasdair
year 2007
title Towards a Living Architecture
source Expanding Bodies: Art • Cities• Environment [Proceedings of the 27th Annual Conference of the Association for Computer Aided Design in Architecture / ISBN 978-0-9780978-6-8] Halifax (Nova Scotia) 1-7 October 2007, 164-173
doi https://doi.org/10.52842/conf.acadia.2007.164
summary Interaction is the latest currency in architecture, as responsive components now react to the inhabitants of a space. These components are designed and installed by the architect with a view to the phenomenology of space, where the experience of the environment is previewed and pre-constructed before it is translated into the conception of the space. However, this traditional approach to new technology leaves no scope for the architecture to be alive in and of itself, and thus the installed piece quickly becomes just that—an installation: isolated and uncontained by its environment. In this paper, we argue that one way to approach a responsive architecture is to design a piece that is truly living, and in order to propose a living architecture we first need to understand what the architecture of a living system is. This paper suggests a conceptual framework based on the theory of autopoiesis (Maturana and Varela 1980) in order to create a "self-producing" system through an experiment entitled "The Life of a Wall". The wall has a responsive membrane controlled by a genetic algorithm that reconfigures its behaviour and learns to adapt itself continually to the evolutionary properties of the environment, thus becoming a situated, living piece.
series ACADIA
email
last changed 2022/06/07 07:55
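
The abstract does not give the genetic algorithm itself, but the basic loop such a controller rests on is easy to sketch. This is a generic bitstring GA with elitist truncation selection, one-point crossover and point mutation; all parameters and the fitness function are stand-ins, not the installation's code:

```python
import random

def evolve(fitness, genome_len=8, pop=20, gens=40, seed=1):
    """Evolve a bitstring genome toward higher fitness; returns the best found."""
    rng = random.Random(seed)
    population = [[rng.randint(0, 1) for _ in range(genome_len)]
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop // 2]         # elitist truncation selection
        children = []
        while len(survivors) + len(children) < pop:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, genome_len)     # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(genome_len)] ^= 1  # point mutation
            children.append(child)
        population = survivors + children
    return max(population, key=fitness)
```

In a responsive-membrane setting the genome would encode actuator states and the fitness would score the sensed environmental response.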

_id d22c
authors Eastman, C.M.
year 1980
title System Facilities for CAD Databases
source 17th Design Automation Conference Proceedings
summary In this paper, an attempt is made to lay out the special needs of design databases, as compared to the facilities provided in conventional database systems now commercially available. The paper starts from a point of commonality and focuses on the limitations and shortcomings commonly found in current database systems. It is impossible and unwise to make universal statements about DBMS capabilities. Instead, the goal is to identify those special features that, by their capability, provide distinctions beyond the general notions of speed and ratio of logical size to physical size.
series journal paper
email
last changed 2003/05/15 21:22

_id 76ce
authors Grimson, W.
year 1985
title Computational Experiments with a Feature Based Stereo Algorithm
source IEEE Trans. Pattern Anal. Machine Intell., Vol. PAMI-7, No. 1
summary Computational models of the human stereo system can provide insight into general information processing constraints that apply to any stereo system, either artificial or biological. In 1977, Marr and Poggio proposed one such computational model, characterized as matching certain feature points in difference-of-Gaussian filtered images, and using the information obtained by matching coarser resolution representations to restrict the search space for matching finer resolution representations. An implementation of the algorithm and its testing on a range of images was reported in 1980. Since then a number of psychophysical experiments have suggested possible refinements to the model and modifications to the algorithm. As well, recent computational experiments applying the algorithm to a variety of natural images, especially aerial photographs, have led to a number of modifications. In this article, we present a version of the Marr-Poggio-Grimson algorithm that embodies these modifications and illustrate its performance on a series of natural images.
series journal paper
last changed 2003/04/23 15:14
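
The coarse-to-fine constraint described here (matches at coarse resolution restrict the search space at fine resolution) can be shown on 1-D signals. The toy sketch below uses sum-of-absolute-differences matching instead of the zero-crossing features of the actual Marr-Poggio-Grimson algorithm, and all names are ours:

```python
def best_shift(a, b, candidates):
    """Shift s minimizing the mean absolute difference between a[i] and b[i+s]."""
    def cost(s):
        pairs = [(a[i], b[i + s]) for i in range(len(a)) if 0 <= i + s < len(b)]
        return sum(abs(x - y) for x, y in pairs) / len(pairs)
    return min(candidates, key=cost)

def coarse_to_fine_disparity(left, right, max_d=8):
    # coarse pass: halved resolution, full +/- max_d/2 search range
    coarse = best_shift(left[::2], right[::2],
                        range(-max_d // 2, max_d // 2 + 1))
    # fine pass: full resolution, search only near the upsampled coarse estimate
    return best_shift(left, right, range(2 * coarse - 1, 2 * coarse + 2))
```

The fine pass examines only three candidate disparities instead of the full range, which is the computational point of the coarse-to-fine strategy.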

_id f773
id f773
authors Johnson, Brian R.
year 1990
title Inside Out
source From Research to Practice [ACADIA Conference Proceedings] Big Sky (Montana - USA) 4-6 October 1990, pp. 219-231
doi https://doi.org/10.52842/conf.acadia.1990.219
summary In an effort to generate discussion, this paper suggests that between 1980 and 1990 a significant and undesirable change occurred in academic architectural CAD. We have moved from being developers of ideas and technology on the inside of the development loop to being consumers of products developed in the commercial marketplace, outside the loop. Certain negative consequences are discussed. Finally, some suggestions are made for turning ourselves "right side out" again.
series ACADIA
type normal paper
email
last changed 2022/06/07 07:52

_id c3f4
authors Joy, William
year 1980
title An Introduction to Display Editing with VI
source September, 1980. 30 p
summary VI (Visual) is a display-oriented interactive text editor. When using VI, the screen of the terminal acts as a window into the file being edited; changes made to the file are reflected in what is seen. Using VI the user can insert new text anywhere in the file quite easily. Most of the commands to VI move the cursor around in the file. There are commands to move the cursor forward and backward in units of characters, words, sentences and paragraphs. A small set of operators, like d for delete and c for change, are combined with the motion commands to form operations such as delete word or change paragraph, in a simple and natural way. This regularity and the mnemonic assignment of commands to keys make the editor command set easy to remember and to use. VI works on a large number of display terminals, and new terminals are easily driven after editing a terminal description file. While it is advantageous to have an intelligent terminal which can locally insert and delete lines and characters from the display, the editor will function quite well on dumb terminals over slow phone lines. The editor makes allowances for the low bandwidth in these situations and uses smaller window sizes and different display updating algorithms to make best use of the limited speed available. It is also possible to use the command set of VI on hardcopy terminals, storage tubes and 'glass ttys' using a one-line editing window; thus VI's command set is available on all terminals. The full command set of the more traditional, line-oriented editor ED is available within VI; it is quite simple to switch between the two modes of editing
keywords UNIX, display, word processing, software
series CADline
last changed 1999/02/12 15:08

_id ga9809
id ga9809
authors Kälviäinen, Mirja
year 1998
title The ideological basis of generative expression in design
source International Conference on Generative Art
summary This paper discusses the design ideology supporting the use and development of generative design. This ideology is based on the unique qualities of craft production and on forms and ideas drawn from nature or from the natural characteristics of materials. The main ideology presented here is that of 1980's art craft production in Finland. It is connected with general Finnish design ideology and with the design ideology of other western countries. The ideology of these professions rests on a common background of design principles stated in 19th-century England. These early principles developed through the Arts and Crafts tradition, which had a great impact on design thinking in Europe and in the United States. A strong continuity of this design ideology from 19th-century England to the present computerized age can be detected. The application of these design principles through different eras shows differences in interpretation and in the acceptance of natural decorative forms. The ideology of 1980's art craft in Finland supports the ideas and fulfilment of generative design in many ways. The reasons often given for making generative design with computers are in very many respects the same as the ideology of art craft. In Finland there is a strong connection between art craft and design ideology, and the characteristics of craft have often been seen as the basis for industrial design skills. The main themes in the ideology of 1980's art craft in Finland can be compared to the ideas of generative design. The main issues in which the generative approach reflects a distinctive ideological thinking are: Way of Life: The work is the communication of the maker's inner ideas. The concrete relationship with the environment, personality, uniqueness, communication, and the visionary qualities, development and growth of the maker are important. Experiments serve as a medium for learning. 
Taste and Aesthetic Education: A real love affair is created with the non-living object with the help of memories and thought. At their best, objects create, through their stability and communication, the basis for durable human relationships. People have warm relationships especially with handmade products in which they can detect unique qualities and the feeling that the product has been made solely for them. Counter-culture: The aim of the work is to produce alternatives to technobureaucracy and mechanical production and to bring subjective and unique experiences into the customer's monotonous life. This ideology rejects the usual standardized mass production of our times. Mythical character: There is a metamorphosis in the birth of the product; in many ways the design process is about birth and growth. The creative process is a development story of the maker. The complexity of communication is the expression of the moments that have been lived. If the process of making can be sensed in the product, the product becomes more real and nearer to life. Each piece of wood has its own beauty; before working with it, one must find the deep soul of its quality. The distinctive traits of the material, the technique and the object are an essential part of the metamorphosis which brings the product to life. The form is not only for form's sake but for other purposes, too: loose forms are not found in nature. Products have their beginnings in the material and are a part of nature. This art craft ideology that supports the ideas of generative design can be applied either to handmade craft production or to production exploiting new technology. The unique characteristics of craft and material-based development are a way to broaden the expression and forms of industrial products. However, for a craftsperson it is not meaningful to fill the world with objects; in generative, computer-based production this is possible. 
Yet the production of unique pieces may still be slower, which in that sense makes industrial production more ecological. People become more attached to personal and unique objects, and thus the life cycle of the objects produced will be longer.
series other
email
more http://www.generativeart.com/
last changed 2003/08/07 17:25

_id ddss2006-hb-187
id DDSS2006-HB-187
authors Lidia Diappi and Paola Bolchi
year 2006
title Gentrification Waves in the Inner-City of Milan - A multi agent / cellular automata model based on Smith's Rent Gap theory
source Van Leeuwen, J.P. and H.J.P. Timmermans (eds.) 2006, Innovations in Design & Decision Support Systems in Architecture and Urban Planning, Dordrecht: Springer, ISBN-10: 1-4020-5059-3, ISBN-13: 978-1-4020-5059-6, p. 187-201
summary The aim of this paper is to investigate the gentrification process by applying an urban spatial model of gentrification based on Smith's (1979; 1987; 1996) Rent Gap theory. The rich sociological literature on the topic mainly assumes gentrification to be a cultural phenomenon, namely the result of demand pressure from the suburban middle and upper classes willing to return to the city (Ley, 1980; Lipton, 1977; May, 1996). Little attempt has been made to investigate and build a sound economic explanation of the causes of the process. The Rent Gap theory (RGT) of Neil Smith still represents an important contribution in this direction. At the heart of Smith's argument is the assumption that gentrification takes place because capital returns to the inner city, creating opportunities for residential relocation and profit. This paper illustrates a dynamic model of Smith's theory through a multi-agent/cellular automata approach (Batty, 2005) developed on a NetLogo platform. A set of behavioural rules is formalised for each agent involved (homeowner, landlord, tenant and developer, and the passive 'dwelling' agent with its rent and level of decay). The simulations show the surge of neighbourhood degradation or renovation and population turnover, starting from different initial states of decay and estate rent values. Consistent with a Self-Organized Criticality approach, the model shows that non-linear interactions at the local level may produce different configurations of the system at the macro level. This paper represents a further development of a previous version of the model (Diappi, Bolchi, 2005). The model proposed here includes more realistic factors inspired by the features of housing market dynamics in the city of Milan: the shape of the potential rent according to the city's form and functions, and the subdivision into areal submarkets according to current rents and maintenance levels. 
The model has a more realistic visualisation of the city and its form, and is able to show the different dynamics of the emergent neighbourhoods in Milan over the last ten years.
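The rent-gap mechanism the abstract describes can be sketched as a minimal cellular automaton: each cell carries a capitalized rent that erodes with decay and neighbourhood condition, and a developer reinvests (resetting the rent to its potential) once the gap between potential and capitalized rent exceeds a threshold. All names, parameters and rules below are illustrative assumptions, not the authors' NetLogo implementation.

```python
import random

# Illustrative rent-gap cellular automaton (hypothetical parameters,
# not the Diappi/Bolchi model).
SIZE = 20
STEPS = 50
GAP_THRESHOLD = 0.4   # reinvestment triggers when potential - capitalized rent exceeds this
DECAY_RATE = 0.02     # capitalized rent erodes as buildings age

def neighbours(grid, x, y):
    """Moore-neighbourhood values, wrapping at the edges."""
    return [grid[(x + dx) % SIZE][(y + dy) % SIZE]
            for dx in (-1, 0, 1) for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)]

def step(cap_rent, pot_rent):
    """One tick: decay pulls capitalized rent down, the neighbourhood
    condition pulls it toward the local mean, and a large rent gap
    triggers reinvestment (a gentrification event)."""
    new = [row[:] for row in cap_rent]
    for x in range(SIZE):
        for y in range(SIZE):
            local = sum(neighbours(cap_rent, x, y)) / 8.0
            new[x][y] = max(0.0, 0.9 * cap_rent[x][y] + 0.1 * local - DECAY_RATE)
            if pot_rent[x][y] - new[x][y] > GAP_THRESHOLD:
                new[x][y] = pot_rent[x][y]   # developer closes the gap
    return new

random.seed(1)
pot = [[1.0] * SIZE for _ in range(SIZE)]                       # uniform potential rent
cap = [[random.uniform(0.3, 1.0) for _ in range(SIZE)] for _ in range(SIZE)]
for _ in range(STEPS):
    cap = step(cap, pot)
mean_rent = sum(sum(row) for row in cap) / SIZE ** 2
print(round(mean_rent, 2))
```

Even this toy version shows the qualitative behaviour the abstract points to: purely local update rules produce patches of decay and waves of renovation at the grid level.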
keywords Multi agent systems, Housing market, Gentrification, Emergent systems
series DDSS
last changed 2006/08/29 12:55

_id eb7b
authors Liggett, Robin S.
year 1980
title The Quadratic Assignment problem: an Analysis of Applications and Solution Strategies
source Environment and Planning B. 1980. vol. 7: pp. 141-162 : tables. includes bibliography
summary A wide variety of practical problems in design, planning and management can be formulated as quadratic assignment problems, and this paper discusses this class of problems. Since algorithms for producing optimal solutions to such problems are computationally infeasible for all but small instances, heuristic techniques must usually be employed to solve real practical problems. This paper explores and compares a variety of solution techniques found in the literature, considering the trade-offs between computational efficiency and quality of the solutions generated. Recommendations are made about the key factors to be considered in developing and applying heuristic solution procedures.
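As a concrete illustration of the heuristic strategies the abstract refers to, the sketch below implements a simple pairwise-exchange (2-opt) improvement heuristic for the quadratic assignment problem: start from an arbitrary assignment of activities to locations and keep swapping pairs while the total flow-times-distance cost decreases. The instance data and function names are invented for illustration; this is not Liggett's procedure.

```python
from itertools import combinations

def qap_cost(perm, flow, dist):
    """Total cost: flow(i, j) * dist(location of i, location of j) over all pairs."""
    n = len(perm)
    return sum(flow[i][j] * dist[perm[i]][perm[j]]
               for i in range(n) for j in range(n))

def pairwise_exchange(flow, dist):
    """Greedy 2-opt heuristic: swap two activities whenever it lowers the cost."""
    n = len(flow)
    perm = list(range(n))                  # start from the identity assignment
    best = qap_cost(perm, flow, dist)
    improved = True
    while improved:
        improved = False
        for i, j in combinations(range(n), 2):
            perm[i], perm[j] = perm[j], perm[i]
            cost = qap_cost(perm, flow, dist)
            if cost < best:
                best = cost
                improved = True
            else:
                perm[i], perm[j] = perm[j], perm[i]   # undo the swap
    return perm, best

# Tiny illustrative instance: 4 activities, 4 locations on a line.
flow = [[0, 3, 0, 2],
        [3, 0, 0, 1],
        [0, 0, 0, 4],
        [2, 1, 4, 0]]
dist = [[0, 1, 2, 3],
        [1, 0, 1, 2],
        [2, 1, 0, 1],
        [3, 2, 1, 0]]
perm, cost = pairwise_exchange(flow, dist)
print(perm, cost)
```

Exact methods are practical only for very small n, which is why the paper's comparison of heuristics like this one matters; 2-opt guarantees only a local optimum, never worse than its starting assignment.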
keywords design process, algorithms, graphs, quadratic assignment, operations research, optimization, automation, synthesis, heuristics, space allocation, floor plans, management, planning
series CADline
email
last changed 2003/06/02 13:58

_id caadria2014_102
id caadria2014_102
authors Lopes, João V.; Alexandra C. Paio and José P. Sousa
year 2014
title Parametric Urban Models Based on Frei Otto’s Generative Form-Finding Processes
source Rethinking Comprehensive Design: Speculative Counterculture, Proceedings of the 19th International Conference on Computer-Aided Architectural Design Research in Asia (CAADRIA 2014) / Kyoto 14-16 May 2014, pp. 595–604
doi https://doi.org/10.52842/conf.caadria.2014.595
summary There is presently a progressive tendency to incorporate parametric design strategies into urban planning and design. Although the computational technologies that allow this are recent, the fundamental theories and thinking processes behind it can be traced back to the work conducted at the Institute for Lightweight Structures (IL) in Stuttgart between the 1960s and 1980s. This paper describes an experimental urban research project based on Frei Otto and Eda Schaur's thoughts on unplanned settlements, and on the form-finding experiments carried out at the IL. By exploring the digital development of parametric and algorithmic interactive models, two urban design proposals were developed for a site in the city of Porto. Out of this experience, the paper suggests that the act of design can today benefit from a deeper understanding of the natural processes of occupation and connection.
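The "occupation" processes the abstract mentions can be approximated computationally. A minimal sketch, under my own illustrative assumptions rather than the authors' models: place settlement points one by one, accepting a candidate only if it keeps a minimum distance from every existing point, in the spirit of Otto's distancing-occupation experiments.

```python
import math
import random

def distancing_occupation(n_candidates, min_dist, seed=0):
    """Sequentially place points in the unit square, rejecting any candidate
    closer than min_dist to an already-placed point (cf. Otto's
    distancing occupations; the procedure itself is an illustrative sketch)."""
    rng = random.Random(seed)
    points = []
    for _ in range(n_candidates):
        p = (rng.random(), rng.random())
        if all(math.dist(p, q) >= min_dist for q in points):
            points.append(p)
    return points

pts = distancing_occupation(1000, 0.1)
print(len(pts))
```

The accepted points form an evenly spread but unplanned-looking settlement pattern; a connection stage (e.g. a simplified path network between the points) would be the natural next step in such an experiment.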
keywords Parametric urbanism; generative design; form-finding; Frei Otto
series CAADRIA
email
last changed 2022/06/07 07:59
