CumInCAD is a Cumulative Index about publications in Computer Aided Architectural Design
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD futures

Hits 1 to 20 of 172

_id ca50
authors Ayrle, Hartmut
year 1991
title XNET2 - Methodical Design of Local Area Networks in Buildings - An Application of the A4 Intelligent Design Tool
source Computer Aided Architectural Design Futures: Education, Research, Applications [CAAD Futures ‘91 Conference Proceedings / ISBN 3-528-08821-4] Zürich (Switzerland), July 1991, pp. 443-450
summary XNET2 is a prototype program that helps network planners design Ethernet-conformant data networks for sites and buildings. It is implemented as an example application of the ARMILLA4 Intelligent Design Tool under Knowledge Craft. It is based on a knowledge acquisition phase with experts from DECsite, the network branch of DEC. The ARMILLA Design Tool is developed on the basis of Fritz Haller's ARMILLA, a set of geometrical and operational rules for the integration of technical ductwork into a building's construction.
series CAAD Futures
last changed 2003/11/21 15:16

_id ecaade2023_281
id ecaade2023_281
authors Prokop, Šimon, Kubalík, Jiří and Kurilla, Lukáš
year 2023
title Neural Networks for Estimating Wind Pressure on Complex Double-Curved Facades
source Dokonal, W, Hirschberg, U and Wurzer, G (eds.), Digital Design Reconsidered - Proceedings of the 41st Conference on Education and Research in Computer Aided Architectural Design in Europe (eCAADe 2023) - Volume 2, Graz, 20-22 September 2023, pp. 639–647
doi https://doi.org/10.52842/conf.ecaade.2023.2.639
summary Due to their complex geometry, it is challenging to assess wind effects on the freeform, double-curved building facades. The traditional building code EN 1991-1-4 (730035) only accounts for basic shapes such as cubes, spheres, and cylinders. Moreover, even though wind tunnel measurements are considered to be more precise than other methods, they are still limited by the number of measurement points that can be taken. This limitation, combined with the time and resources required for the analysis, can limit the ability to fully capture detailed wind effects on the whole complex freeform shape of the building. In this study, we propose the use of neural network models trained to predict wind pressure on complex double-curved facades. The neural network is a powerful data-driven machine learning technique that can, in theory, learn an approximation of any function from data, making it well-suited for this application. Our approach was empirically evaluated using a set of 31 points measured in the wind tunnel on a 3D printed model in 1:300 scale of the real architectural design of a concert hall in Ostrava. The results of this evaluation demonstrate the effectiveness of our neural network method in estimating wind pressures on complex freeform facades.
keywords wind pressure, double-curved façade, neural network
series eCAADe
email
last changed 2023/12/10 10:49
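The regression task the abstract describes, mapping facade geometry to measured pressure coefficients, can be sketched with a small feed-forward network. Everything below (layer sizes, learning rate, and the synthetic pressure field standing in for wind-tunnel data) is an illustrative assumption, not the authors' setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for measurement points: (x, y, z) on a facade -> Cp value.
X = rng.uniform(-1, 1, size=(200, 3))
y = np.sin(X[:, 0]) * np.cos(X[:, 1]) + 0.1 * X[:, 2]   # smooth pressure-like field

# One hidden layer with tanh activation, trained by plain gradient descent on MSE.
W1 = rng.normal(0, 0.5, (3, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)

losses = []
lr = 0.05
for _ in range(500):
    h = np.tanh(X @ W1 + b1)            # hidden activations, shape (n, 16)
    pred = (h @ W2 + b2).ravel()
    err = pred - y
    losses.append(float(np.mean(err ** 2)))
    # Backpropagate through both layers.
    g2 = h.T @ err[:, None] / len(X)
    gb2 = err.mean()
    dh = (err[:, None] @ W2.T) * (1 - h ** 2)
    g1 = X.T @ dh / len(X)
    gb1 = dh.mean(axis=0)
    W2 -= lr * g2; b2 -= lr * gb2
    W1 -= lr * g1; b1 -= lr * gb1

print(losses[0], losses[-1])            # training error should shrink
```

In the paper's setting the 31 wind-tunnel points would replace the synthetic samples, which is far less data than this sketch uses.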

_id 22d6
authors Ballheim, F. and Leppert, J.
year 1991
title Architecture with Machines, Principles and Examples of CAAD-Education at the Technische Universität München
source Experiences with CAAD in Education and Practice [eCAADe Conference Proceedings] Munich (Germany) 17-19 October 1991
doi https://doi.org/10.52842/conf.ecaade.1991.x.h3w
summary "Design tools affect the results of the design process" - this is the starting point of our considerations about the efficient use of CAAD within architecture. To give a short overview of what we want to say with this thesis, let us take a short - and surely incomplete - trip through the fourth dimension back into the early days of civil engineering. As CAD in our faculty is integrated into the "Lehrstuhl für Hochbaustatik und Tragwerksplanung" (in English, approximately the "institute of structural design"), we chose an example we are very familiar with because of its mathematical background - the conic sections: circle, ellipse, parabola and hyperbola. If we start our trip two thousand years ago, we find only the circle - or in very few cases the ellipse - used for the ground plans of Greek or Roman theaters - think of Greek amphitheaters or the Colosseum in Rome - or for the design of a building's cross section - for example the Pantheon, Roman aqueducts or bridges. With the rediscovery of perspective during the Renaissance, the handling of the ellipse was brought to perfection. Perhaps the most famous example is the Capitol in Rome, designed by Michelangelo Buonarroti, with its elliptical ground plan that looks like a circle as the visitor comes up the famous stairway. During the following centuries - driven by the further development of the natural sciences and the use of new construction materials such as cast iron, steel and concrete - new design ideas could be realized. With the growing influence of mathematics on the design of buildings came the division into two professions: civil engineering and architecture. To the regret of the architects, the most innovative constructions were designed by civil engineers, e.g. the early iron bridges in Britain or the famous bridges of Robert Maillart. Nowadays we are in the situation that we try to reintegrate the divided professions.
We will return to that point later when discussing possible solutions to this problem. But let us continue our historical survey, demonstrating the state of the art we have today. As the logical consequence of parabolic and hyperbolic arcs, hyperbolic paraboloid shells were developed using traditional design techniques such as models and orthogonal sections. Now we reach the point where the question arises whether complex structures can be completely described using traditional methods - a question that must be answered with "no" if we take the final step to the completely irregular geometry of cable-net constructions or deconstructivist designs. What we see - and what seems to support our thesis of the connection between design tools and the results of the design process - is that, on the one hand, new tools enabled the designer to realize new ideas, and on the other hand, new ideas drove the development of new tools to realize them.

series eCAADe
more http://www.mediatecture.at/ecaade/91/ballheim_leppert.pdf
last changed 2022/06/07 07:50

_id c1ca
authors Daru, Roel
year 1991
title Sketch as Sketch Can - Design Sketching with Imperfect Aids and Sketchpads of the Future
source Experiences with CAAD in Education and Practice [eCAADe Conference Proceedings] Munich (Germany) 17-19 October 1991
doi https://doi.org/10.52842/conf.ecaade.1991.x.k1t
summary Sketching plays a manifold role in design and design education now as much as it did in the computerless days. Design sketching is indispensable during the early phases of the architectural design process. But if we ask architects and design educators alike what they are doing with computers, idea sketching is the least mentioned answer, if not left out entirely. This is not because they are computer-illiterate, as the computer industry would tend to imply, but because their computers are not offering an adequate environment for design sketching. In education this means that those trying to create computer-aided design sketching courses are confronted with the choice of either working with imperfect tools or waiting for better ones. But by exploring the possibilities of available surrogates, we will build the experience necessary to specify what is really useful for idea sketching. Without such exercises, we will never go beyond the electronic metaphor of the sketchbook with pencil or marker.

series eCAADe
email
last changed 2022/06/07 07:50

_id a113
authors Milne, Murray
year 1991
title Design Tools: Future Design Environments for Visualizing Building Performance
source Computer Aided Architectural Design Futures: Education, Research, Applications [CAAD Futures ‘91 Conference Proceedings / ISBN 3-528-08821-4] Zürich (Switzerland), July 1991, pp. 485-496
summary In the future of Computer Aided Architectural Design (CAAD), architects clearly need more than just computer aided design and drafting systems (CAD). Unquestionably, CAD systems continue to become increasingly powerful, but there is more to designing a good building than its three-dimensional existence, especially in the eyes of all the non-architects of the world: users, owners, contractors, regulators, environmentalists. The ultimate measure of a building's quality has something to do with how well it behaves over time. Predictions about its performance have many different dimensions: how much it costs to build, to operate, and to demolish; how comfortable it is; how effectively people can perform their functions in it; how much energy it uses or wastes. Every year dozens of building performance simulation programs are written that can predict performance over time along any of these dimensions. That is why the need for both CAD systems and performance predictors can be taken for granted, and why it may be more interesting to speculate instead about the need for 'design tools'. A design tool can be defined as a piece of software that is easy and natural for architects to use, that easily accommodates three-dimensional representations of the building, and that predicts something useful about a building's performance. There are at least five different components of design tools that will be needed for the design environment of the future.
series CAAD Futures
email
last changed 2003/05/16 20:58

_id eaca
authors Davis, L. (ed.)
year 1991
title Handbook of genetic algorithms
source Van Nostrand Reinhold, New York
summary This book sets out to explain what genetic algorithms are and how they can be used to solve real-world problems. The first objective is tackled by the editor, Lawrence Davis. The remainder of the book is turned over to a series of short review articles by a collection of authors, each explaining how genetic algorithms have been applied to problems in their own specific area of interest. The first part of the book introduces the fundamental genetic algorithm (GA), explains how it has traditionally been designed and implemented and shows how the basic technique may be applied to a very simple numerical optimisation problem. The basic technique is then altered and refined in a number of ways, with the effects of each change being measured by comparison against the performance of the original. In this way, the reader is provided with an uncluttered introduction to the technique and learns to appreciate why certain variants of GA have become more popular than others in the scientific community. Davis stresses that the choice of a suitable representation for the problem in hand is a key step in applying the GA, as is the selection of suitable techniques for generating new solutions from old. He is refreshingly open in admitting that much of the business of adapting the GA to specific problems owes more to art than to science. It is nice to see the terminology associated with this subject explained, with the author stressing that much of the field is still an active area of research. Few assumptions are made about the reader's mathematical background. The second part of the book contains thirteen cameo descriptions of how genetic algorithmic techniques have been, or are being, applied to a diverse range of problems. Thus, one group of authors explains how the technique has been used for modelling arms races between neighbouring countries (a non-linear dynamical system), while another group describes its use in deciding design trade-offs for military aircraft.
My own favourite is a rather charming account of how the GA was applied to a series of scheduling problems. Having attempted something of this sort with Simulated Annealing, I found it refreshing to see the authors highlighting some of the problems that they had encountered, rather than sweeping them under the carpet as is so often done in the scientific literature. The editor points out that there are standard GA tools available for either play or serious development work. Two of these (GENESIS and OOGA) are described in a short, third part of the book. As is so often the case nowadays, it is possible to obtain a diskette containing both systems by sending your Visa card details (or $60) to an address in the USA.
series other
last changed 2003/04/23 15:14
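The fundamental GA that the first part of the handbook introduces - a population of bitstrings improved by selection, crossover and mutation - can be sketched in a few lines. The OneMax toy fitness and all parameter values below are illustrative assumptions, not taken from the book:

```python
import random

random.seed(1)
L, POP, GENS = 30, 40, 60          # genome length, population size, generations

def fitness(ind):                  # OneMax: count the 1-bits
    return sum(ind)

def tournament(pop):               # binary tournament selection
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):             # one-point crossover
    cut = random.randrange(1, L)
    return p1[:cut] + p2[cut:]

def mutate(ind, rate=1.0 / L):     # independent bit flips
    return [b ^ 1 if random.random() < rate else b for b in ind]

pop = [[random.randint(0, 1) for _ in range(L)] for _ in range(POP)]
best = max(pop, key=fitness)
for _ in range(GENS):
    pop = [mutate(crossover(tournament(pop), tournament(pop))) for _ in range(POP)]
    cand = max(pop, key=fitness)
    if fitness(cand) > fitness(best):
        best = cand

print(fitness(best))               # should approach the optimum of 30
```

As the review notes, the representation and the operators for generating new solutions are exactly the design choices that dominate real applications.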

_id 9ad2
authors Owen, J.C.
year 1991
title Algebraic Solution for Geometry from Dimensional Constraints
source ACM Symp. Found. of Solid Modeling, Austin TX, pp. 397-407
summary We investigate general configurations of distance and angle dimensions between points, lines and circles on a plane. A simple graphical representation is described for the system of coupled quadratic equations which results from treating the geometries as variables and the dimensions as defining equations. For many configurations of practical interest we show that these equations are poorly suited to numerical solution. We describe an algorithm for computing the solution to a subset of all possible configurations of geometry and dimensions using purely algebraic methods (in fact the usual arithmetic operations plus square roots). We show that this algorithm solves for all configurations of a practically useful type and that it solves for any configuration which can in principle be solved using these algebraic operations. Specifically, we use the Galois theory of equations to show that the following statements are equivalent: 1. The geometry can be constructed in principle on a drawing board using a ruler and compasses. 2. The coordinates of the geometries can be computed algebraically using only arithmetic operations plus square roots. 3. The coordinates of the geometries lie in a normal field extension over the dimension values of degree 2^n for some n. 4. For general (i.e. algebraically independent) dimension values the algorithm described will compute the geometries. We also describe a working implementation of the algorithm and describe some extensions to the basic ideas which are necessary to make it a practically useful way to specify geometry by means of dimensional constraints.
series other
last changed 2003/04/23 15:50
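The paper's central equivalence - solvable by ruler and compasses iff computable with arithmetic plus square roots - is visible in the basic construction step: fixing a point by two distance dimensions from known points. A sketch of that single step (the function name and the example dimensions are invented for illustration):

```python
from math import sqrt, hypot

def place_point(A, B, dA, dB):
    """Return one intersection of circles (A, dA) and (B, dB).

    Uses only arithmetic and one square root - the algebraic counterpart
    of a single ruler-and-compass construction step.
    """
    ax, ay = A; bx, by = B
    dx, dy = bx - ax, by - ay
    d = hypot(dx, dy)
    # Distance along AB to the chord midpoint, then the perpendicular offset.
    a = (dA**2 - dB**2 + d**2) / (2 * d)
    h = sqrt(dA**2 - a**2)          # raises ValueError iff the circles do not meet
    mx, my = ax + a * dx / d, ay + a * dy / d
    return (mx - h * dy / d, my + h * dx / d)

P = place_point((0.0, 0.0), (4.0, 0.0), 2.5, 2.5)
print(P)                            # (2.0, 1.5)
```

A configuration solvable this way is resolved one such step at a time; configurations needing higher-degree field extensions are exactly the ones the algorithm cannot handle.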

_id b5be
authors Stok, Leon
year 1991
title Architectural synthesis and optimization of digital systems
source Eindhoven University of Technology
summary High level synthesis means going from a functional specification of a digital system at the algorithmic level to a register transfer level structure. Different applications will ask for different design styles. Despite this diversity in design styles many tasks in the synthesis will be similar. There is no need to write a new synthesis system for each design style. The best way to go seems to be a decomposition of the high level synthesis problems into several well defined subproblems. How the problem is decomposed depends heavily on a) the type of network architecture chosen, b) the constraints applied to the design and c) the functional description itself. From this architecture style, the constraints and the functional description a synthesis scheme can be derived. Once this scheme is fixed, algorithms can be chosen which fit into this scheme and solve the subproblems in a fast and, when possible, optimal way. To support such a synthesis philosophy, a framework is needed in which all design information can be stored in a unique way during the various phases of the design process. This asks for a design data base capable of handling all design information with a formally defined interface to all design tools. This thesis gives a formal way to describe the functional representation, the register transfer level structure, the controller and the relations between all three of them. Special attention has been paid to the efficient representation of mutually exclusive operations and array accesses. The scheduling and allocation problems are defined as mappings between these formal representations. Both the existing synthesis algorithms and the new algorithms described in this thesis fit into this framework.
Three new allocation algorithms are presented in this thesis: an algorithm for optimal register allocation in cyclic data flow graphs, an exact polynomial algorithm to do the module allocation and a new scheme to minimize the number of interconnections during all stages of the data path allocation. Cyclic data flow graphs result from high level behavioral descriptions that contain loops. Algorithms for register allocation in high level synthesis published up till now only considered loop-free data flow graphs. When these algorithms are applied to data flow graphs with loops, unnecessary register transfer operations are introduced. A new algorithm is presented that performs a minimal register allocation and eliminates all superfluous register transfer operations. The problem is reformulated as a multicommodity network flow problem for which very efficient solutions exist. Experiments on a benchmark set have shown that in all test cases all register transfers could be eliminated at no increase in register cost. Only heuristic algorithms have appeared in the literature to solve the module allocation problem. The module allocation problem is usually defined as a clique cover problem on a so-called module allocation graph. It is shown that, under certain conditions, the module allocation graph belongs to the special class of comparability graphs. A polynomial time algorithm can optimally find a clique cover of such a graph. Even when interconnect weights are taken into account, this can be solved exactly: the problem can be transformed into a maximal cost network flow problem, which can be solved exactly in polynomial time. An algorithm is described which solves the module allocation problem with interconnect weights exactly, with a complexity O(kn^2), where n is the number of operations. In previous research, interconnection was optimized only after the module allocation for the operations and the register allocation for the variables had already been done.
However, the amount of multiplexing and interconnect are crucial factors in both the delay and the area of a circuit. A new scheme is presented to minimize the number of interconnections during the data path allocation. This scheme first groups all values based on their read and write times. Values belonging to the same group can share a register file. This minimizes the number of data transfers with different sources and destinations. Secondly, registers are allocated for each group separately. Finally the interconnect allocation is done. During the interconnect allocation, the module allocation is determined. The value grouping is based on edge coloring algorithms, providing a sharp upper bound on the number of colors needed. Two techniques, splitting the read and write phases of values and introducing serial (re-)write operations for the same value, make it possible to use even more efficient exact edge coloring algorithms. It is shown that when variables are grouped into register files and operations are assigned to modules during the interconnection minimization, significant savings (20%) can be obtained in the number of local interconnections and the amount of global interconnect, at the expense of only slightly more register area.
keywords Digital Systems
series thesis:PhD
email
last changed 2003/02/12 22:37
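The thesis solves register allocation exactly via network flow on cyclic data flow graphs; as a much simpler loop-free stand-in, the classic left-edge scan below assigns value lifetimes to a minimum number of registers. The algorithm choice and the example lifetimes are illustrative, not the thesis's method:

```python
import heapq

def left_edge(lifetimes):
    """lifetimes: list of (birth, death) per value; returns value -> register."""
    order = sorted(range(len(lifetimes)), key=lambda i: lifetimes[i][0])
    free = []                        # heap of (available_at, register)
    regs = {}
    next_reg = 0
    for i in order:
        birth, death = lifetimes[i]
        if free and free[0][0] <= birth:    # reuse a register whose value died
            _, r = heapq.heappop(free)
        else:                               # otherwise open a new register
            r, next_reg = next_reg, next_reg + 1
        regs[i] = r
        heapq.heappush(free, (death, r))
    return regs

alloc = left_edge([(0, 4), (1, 2), (3, 7), (5, 8)])
print(alloc, max(alloc.values()) + 1)       # 4 values fit in 2 registers
```

The thesis goes further: with loops, lifetimes wrap around the schedule, which is exactly why it reformulates the problem as multicommodity network flow instead of a linear scan.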

_id 4eed
authors Benedickt, Michael (ed.)
year 1991
title Cyberspace: First Steps
source The MIT Press, Cambridge, MA and London, UK
summary Cyberspace has been defined as "an infinite artificial world where humans navigate in information-based space" and as "the ultimate computer-human interface." These original contributions take up the philosophical basis for cyberspace in virtual realities, basic communications principles, ramifications of cyberspace for future workplaces, and more.
series other
last changed 2003/04/23 15:14

_id 6266
authors Carini, Alessandra
year 1991
title REVIEW OF MOST RECENT ACTIVITIES OF THE "LABORATORIO TIPOLOGICO NAZIONALE"
source Proceedings of the 3rd European Full-Scale Modelling Conference / ISBN 91-7740044-5 / Lund (Sweden) 13-16 September 1990, pp. 20-22
summary The Laboratory's activities did not start immediately after its opening, since the following year was mainly given over to the definition of criteria and procedures for the management of the Laboratory itself by OIKOS. Actual research started in 1990 on the basis of a programme drawn up with the collaboration of the Public Housing Committee ("Comitato per l'Edilizia Residenziale").
keywords Full-scale Modeling, Model Simulation, Real Environments
series other
type normal paper
more http://info.tuwien.ac.at/efa
last changed 2004/05/04 15:24

_id 00bc
authors Chen, Chen-Cheng
year 1991
title Analogical and inductive reasoning in architectural design computation
source Swiss Federal Institute of Technology, ETH Zurich
summary Computer-aided architectural design technology is now a crucial tool of modern architecture, from the viewpoint of higher productivity and better products. As technologies advance, the amount of information and knowledge that designers can apply to a project is constantly increasing. This requires the development of more advanced knowledge acquisition technology to achieve higher functionality, flexibility, and efficient performance of knowledge-based design systems in architecture. Human designers do not solve design problems from scratch; they utilize previous problem-solving episodes for similar design problems as a basis for developmental decision making. This observation leads to the starting point of this research: First, we can utilize past experience to solve a new problem by detecting the similarities between the past problem and the new problem. Second, we can identify constraints and general rules implied by those similarities and the similar parts of similar situations. That is, by applying analogical and inductive reasoning we can advance the problem solving process. The main objective of this research is to establish the theory that (1) the design process can be viewed as a learning process, (2) design innovation involves analogical and inductive reasoning, and (3) learning from a designer's previous design cases is necessary for the development of the next generation of knowledge-based design systems. This thesis draws upon results from several disciplines, including knowledge representation and machine learning in artificial intelligence, and knowledge acquisition in knowledge engineering, to investigate a potential design environment for future developments in computer-aided architectural design. This thesis contains three parts which correspond to the different steps of this research. Part I discusses three different ways - problem solving, learning and creativity - of generating new thoughts based on old ones.
In Part II, the problem statement of the thesis is made and a conceptual model of analogical and inductive reasoning in design is proposed. In Part III, three different methods of building design systems for solving an architectural design problem are compared: rule-based, example-based, and case-based. Finally, conclusions are drawn based on the current implementation of the work, and possible future extensions of this research are described. The thesis reveals new approaches for knowledge acquisition, machine learning, and knowledge-based design systems in architecture.
series thesis:PhD
email
last changed 2003/05/10 05:42

_id ga9921
id ga9921
authors Coates, P.S. and Hazarika, L.
year 1999
title The use of genetic programming for applications in the field of spatial composition
source International Conference on Generative Art
summary Architectural design teaching using computers has been a preoccupation of CECA since 1991. All design tutors provide their students with a set of models and ways to form, and we have explored a set of approaches including cellular automata, genetic programming, agent-based modelling and shape grammars as additional tools with which to explore architectural (and architectonic) ideas. This paper discusses the use of genetic programming (G.P.) for applications in the field of spatial composition. CECA has been developing the use of Genetic Programming for some time (see references) and has covered the evolution of L-Systems production rules (Coates 1997, 1999b), and the evolution of generative grammars of form (Coates 1998, 1999a). The G.P. was used to generate three-dimensional spatial forms from a set of geometrical structures. The approach uses genetic programming with a Genetic Library (G.Lib). G.P. provides a way to genetically breed a computer program to solve a problem. G.Lib enables genetic programming to define potentially useful subroutines dynamically during a run. * Exploring a shape grammar consisting of simple solid primitives and transformations. * Applying a simple fitness function to the solid breeding G.P. * Exploring a shape grammar of composite surface objects. * Developing grammars for existing buildings, and creating hybrids. * Exploring the shape grammar of a building within a G.P. We will report on new work using a range of different morphologies (boolean operations, surface operations and grammars of style) and describe the use of objective functions (natural selection) and the "eyeball test" (artificial selection) as ways of controlling and exploring the design spaces thus defined.
series other
more http://www.generativeart.com/
last changed 2003/08/07 17:25
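The paper evolves L-system production rules with genetic programming; as background, here is the plain deterministic rewriter whose rules such a GP would breed. The Koch-curve rule is a textbook example, not one of the evolved grammars:

```python
def rewrite(axiom, rules, steps):
    """Repeatedly replace each symbol by its production (identity if none)."""
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(c, c) for c in s)
    return s

# Koch-curve grammar: F draws forward, +/- turn; only F is rewritten.
koch = rewrite("F", {"F": "F+F-F-F+F"}, 2)
print(len(koch))   # 49: each of the 5 F's in step 1 expands to 9 symbols
```

In the GP setting, the right-hand side of each production is the evolving genotype, and a fitness function (or the "eyeball test") scores the geometry the expanded string generates.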

_id 2e56
authors Coyne, Robert Francis
year 1991
title ABLOOS : an evolving hierarchical design framework
source Carnegie Mellon University, Department of Architecture
summary The research reported in this thesis develops an approach toward a more effective use of hierarchical decomposition in computational design systems. The approach is based on providing designers a convenient interactive means to specify and experiment with the decompositional structure of design problems, rather than having decompositions pre-specified and encoded in the design system. Following this approach, a flexible decomposition capability is combined with an underlying design method to form the basis for an extensible and evolving framework for cooperative (human/computer) design. As a testbed for this approach, the ABLOOS framework for layout design is designed and constructed as a hierarchical extension of LOOS. The framework enables a layout task to be hierarchically decomposed, and the LOOS methodology to be applied recursively to layout subtasks at appropriate levels of abstraction within the hierarchy; layout solutions for the subtasks are then recomposed to achieve an overall solution. Research results thus far are promising: ABLOOS has produced high quality solutions for a class of industrial layout design tasks (an analog power board layout with 60 components that have multiple complex constraints on their placement); the adaptability of the framework across domains and disciplines has been demonstrated; and further development of ABLOOS is underway, including its extension to layouts in 2 1/2D space and truly 3D arrangements. The contribution of this work is in demonstrating an effective, flexible and extensible capability for hierarchical decomposition in design. It has also produced a more comprehensive layout system that can serve as a foundation for the further investigation of hierarchical decomposition in a variety of design domains.
series thesis:PhD
last changed 2003/02/12 22:37

_id 2e03
authors Diederiks, H.J. and van Staveren, R.J.
year 1991
title Dynamic Information System for Modelling of Design Processes
source Computer Integrated Future, CIB W78 Seminar. september, 1991
summary DINAMO is a Dynamic Information System for Modelling of Design Processes. It is intended for use along with product models, data management systems and existing applications. In DINAMO a programming user can define processes. These processes are represented by graphs, characterized by nodes and relations between nodes. Each node in a graph represents a task, and each relation can be restricted by conditions, so the way in which a process is actually performed - that is, the actual path evaluated through the graph - can depend on certain conditions. Processes and functions (software modules) are available to the user as tasks. A consuming user can activate tasks; the DINAMO system regulates the dispatch of the tasks, conforming to the process and function definitions. Tasks are collected on sheets; sheets are collected in a task box. A task box can be regarded as a certain environment, determined by the programming user. A consuming user can choose between the environments available at that moment. With the DINAMO system, software and process definitions can be re-used in a simple way.
keywords design process, modeling, graphs, information, relations, software
series CADline
last changed 2003/06/02 13:58
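The process graphs described above - task nodes linked by relations whose conditions select the actual path - can be sketched as a tiny executor. The node names, condition predicates and state keys below are invented for illustration; DINAMO's own definitions are richer:

```python
def run_process(graph, start, state):
    """graph: node -> list of (condition, next_node); returns the visited path."""
    path, node = [], start
    while node is not None:
        path.append(node)
        state = dict(state, **{"done_" + node: True})  # record the task ran
        nxt = None
        for cond, succ in graph.get(node, []):
            if cond(state):              # follow the first relation that holds
                nxt = succ
                break
        node = nxt
    return path

graph = {
    "sketch": [(lambda s: s["budget"] > 100, "detail"),
               (lambda s: True, "revise")],
    "revise": [(lambda s: True, "detail")],
    "detail": [],                        # terminal task
}
print(run_process(graph, "sketch", {"budget": 50}))   # ['sketch', 'revise', 'detail']
```

With a larger budget the condition on the first relation holds and the "revise" task is skipped, illustrating how the evaluated path depends on state.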

_id caadria2020_231
id caadria2020_231
authors Doe, Robert
year 2020
title sensMOD - Computational Design through the lens of Henri Lefebvre's Spatial Theory
source D. Holzer, W. Nakapan, A. Globa, I. Koh (eds.), RE: Anthropocene, Design in the Age of Humans - Proceedings of the 25th CAADRIA Conference - Volume 1, Chulalongkorn University, Bangkok, Thailand, 5-6 August 2020, pp. 701-710
doi https://doi.org/10.52842/conf.caadria.2020.1.701
summary Spatial productivity is the first of the elements comprising sensMOD, a student elective that implemented a methodology addressing the exigent need of our time for transformation in the architecture, engineering and construction (AEC) sector. The second and third elements of sensMOD are parts and interaction which focus attention on the nature of complexity and connectivity in our networked world. The paper proposes a methodology that was used to guide the teaching of an elective for third year architecture students at a UK university. Its wider purpose is to contribute to discussion concerning the dysfunctional state of an AEC sector that needs to consider its productivity as projections of wider networks of resource and energy relationships. Henri Lefebvre's spatial theory (1991) guides the narrative and formulation of sensMOD.
keywords computational design; spatial productivity; modularity; interaction design
series CAADRIA
email
last changed 2022/06/07 07:55

_id a6be
authors Doyle, J.
year 1991
title Static and Dynamic Analysis of Structures, with an emphasis on mechanics and computer methods
source Kluwer Academic Pub., Dordrecht
summary This book is concerned with the static and dynamic analysis of structures. Specifically, it uses the stiffness formulated matrix methods for use on computers to tackle some of the fundamental problems facing engineers in structural mechanics. This is done by covering the Mechanics of Structures, its rephrasing in terms of the Matrix Methods and then their Computational implementation, all within a cohesive setting. Although the book is designed primarily as a text for use at the upper-graduate and beginning graduate level, many practising structural engineers will find it useful as a reference and self-study guide. Each chapter is supplemented with a collection of pertinent problems that indicate extensions of the theory and the applications. These problems, combined with selected references to relevant literature, can form the basis for further study. The book sets as its goal the treatment of structural dynamics starting with the basic mechanics and going all the way to their implementation on digital computers. Rather than discuss particular commercial packages, Professor Doyle has developed STADYN: a complete (but lean) program to perform each of the standard procedures used in commercial programs. Each module in the program is reasonably complete in itself, and all were written with the sole aim of clarity plus a modicum of efficiency, compactness and elegance.
series other
last changed 2003/04/23 15:14

_id sigradi2016_710
id sigradi2016_710
authors Duarte, Rovenir Bertola; Lepri, Louisa Savignon; Sanches, Malu Magalhães
year 2016
title Objectile e o projeto paramétrico [Objectile and parametric design]
source SIGraDi 2016 [Proceedings of the 20th Conference of the Iberoamerican Society of Digital Graphics - ISBN: 978-956-7051-86-1] Argentina, Buenos Aires 9 - 11 November 2016, pp.149-156
summary The objectile is a concept developed by Deleuze and Cache in the 1980s. It treats the object as a variable and anticipates the society of obsolescence, an inquiry into the contemporary life of the object (marketing, function, representation, modeling, production and consumption). The concept addresses the object where "... fluctuation of the norm replaces the permanence of a law; where the object assumes a place in a continuum by variation" (Deleuze, 1991, p. 38). This paper proposes to think of the objectile as the object of architectural design, through three approximations between design and objectile: (a) the objectile as a variable of the design, (b) the objectile as a design variable, and (c) the objectile as architecture (variable architecture). The second approximation (b) makes it possible to discuss the conception of a continuous design with the power to cross other projects - a meta-design. The main aspect of this meta-design is variability, another form of control based on concepts of patterns and modulations; however, the objectile can also mean freeing the mind for new types of thought and new kinds of design based on a "continuum by variation": meta-design.
keywords Objectile; parametric design; Gilles Deleuze; Modulado; Digital design
series SIGRADI
email
last changed 2021/03/28 19:58

_id 0faa
authors Duelund Mortensen, Peder
year 1991
title THE FULL-SCALE MODEL WORKSHOP
source Proceedings of the 3rd European Full-Scale Modelling Conference / ISBN 91-7740044-5 / Lund (Sweden) 13-16 September 1990, pp. 10-11
summary The workshop is an institution available for use by the public, established at the Laboratory of Housing in the Art Academy's School of Architecture for a 3-year trial period beginning April 1985. This résumé contains brief descriptions of a variety of representative model projects and an overview of all projects carried out so far, including the pilot projects from 1983 and projects planned up to and including January 1987. The Full-Scale Model Workshop builds full-size models of buildings, rooms and parts of buildings. The purpose of the Full-Scale Model Workshop is to promote communication among a building's users. The workshop is a tool in an attempt to build bridges between theory and practice in research, experimentation and the communication of research results. New ideas and experiments of various sorts can be tried out cheaply, quickly and efficiently through the building of full-scale models. Changes can be made on the spot as a planned part of the project and on the basis of ideas and experiments arising from the model work itself. Buildings and their spaces can thus be communicated directly to all involved persons, regardless of technical background or training in the evaluation of building projects.
keywords Full-scale Modeling, Model Simulation, Real Environments
series other
type normal paper
more http://info.tuwien.ac.at/efa
last changed 2004/05/04 15:23

_id c967
authors Fantacone, Enrico
year 1994
title Exporting CAD Teaching into Developing Countries
source The Virtual Studio [Proceedings of the 12th European Conference on Education in Computer Aided Architectural Design / ISBN 0-9523687-0-6] Glasgow (Scotland) 7-10 September 1994, p. 222
doi https://doi.org/10.52842/conf.ecaade.1994.x.t3s
summary In 1986 the Faculty of Architecture was established in Maputo. It is financed by the Italian Ministry of Foreign Affairs and managed by a Scientific Council of the Faculty of Architecture of "Università La Sapienza" of Rome. The need to create human technical resources able to work professionally as soon as they finish their studies made teaching the basis for lab exercises and design. The new architects (the first six students graduated in 1991) need to design and make very important decisions without any oversight by more experienced local technical institutions. The creation of a CAAD laboratory, and the teaching of information technologies and methodologies in architectural design, aims to achieve a double goal: (-) to make the new architects able to manage, on their own, large quantities of data and difficult design problems, given the lack of qualified human resources; (-) to make the University, the most important scientific centre in the country, a centre of information exchange between developed countries and Moçambique.
series eCAADe
last changed 2022/06/07 07:50

_id ecaade2007_073
id ecaade2007_073
authors Francis, Sabu
year 2007
title Web Based Collaborative Architectural Practice Using a Fractal System
source Predicting the Future [25th eCAADe Conference Proceedings / ISBN 978-0-9541183-6-5] Frankfurt am Main (Germany) 26-29 September 2007, pp. 727-734
doi https://doi.org/10.52842/conf.ecaade.2007.727
summary I have been working on an architecture representation system in India since 1991 that markedly deviates from the need for traditional drawings as we know them. Over three million square feet of work has been completed using this system as it was being developed. The system has now matured sufficiently to be put into practice as a comprehensive system of architectural practice. It takes advantage of the creation of just-in-time dynamic multi-organizations that can be formed (and dismantled) over the Internet on a project-to-project basis. The raison d'être of the representation system is that it exposes the "source code" (metaphorically) of any work of architecture to stakeholders, much the same way an open-source software project exposes its internal representation to fellow developers. I believe the design of architecture must go through an "open source" process in order to produce socially responsible designs; this stance is explained in the paper. The paper also describes the system in detail and its mathematical basis, and justifies the need for such an approach. It further explores how a collaborative practice can be established using the system in the context of Internet technologies.
keywords Collaborative practice, fractals, representation system
series eCAADe
email
last changed 2022/06/07 07:50
