CumInCAD is a Cumulative Index of publications in Computer Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD Futures


Hits 1 to 20 of 221

_id eaca
authors Davis, L. (ed.)
year 1991
title Handbook of genetic algorithms
source Van Nostrand Reinhold, New York
summary This book sets out to explain what genetic algorithms are and how they can be used to solve real-world problems. The first objective is tackled by the editor, Lawrence Davis. The remainder of the book is turned over to a series of short review articles by a collection of authors, each explaining how genetic algorithms have been applied to problems in their own specific area of interest. The first part of the book introduces the fundamental genetic algorithm (GA), explains how it has traditionally been designed and implemented, and shows how the basic technique may be applied to a very simple numerical optimisation problem. The basic technique is then altered and refined in a number of ways, with the effects of each change being measured by comparison against the performance of the original. In this way, the reader is provided with an uncluttered introduction to the technique and learns to appreciate why certain variants of GA have become more popular than others in the scientific community. Davis stresses that the choice of a suitable representation for the problem in hand is a key step in applying the GA, as is the selection of suitable techniques for generating new solutions from old. He is refreshingly open in admitting that much of the business of adapting the GA to specific problems owes more to art than to science. It is nice to see the terminology associated with this subject explained, with the author stressing that much of the field is still an active area of research. Few assumptions are made about the reader's mathematical background. The second part of the book contains thirteen cameo descriptions of how genetic algorithmic techniques have been, or are being, applied to a diverse range of problems. Thus, one group of authors explains how the technique has been used for modelling arms races between neighbouring countries (a non-linear dynamical system), while another group describes its use in deciding design trade-offs for military aircraft.
My own favourite is a rather charming account of how the GA was applied to a series of scheduling problems. Having attempted something of this sort with Simulated Annealing, I found it refreshing to see the authors highlighting some of the problems that they had encountered, rather than sweeping them under the carpet as is so often done in the scientific literature. The editor points out that there are standard GA tools available for either play or serious development work. Two of these (GENESIS and OOGA) are described in a short, third part of the book. As is so often the case nowadays, it is possible to obtain a diskette containing both systems by sending your Visa card details (or $60) to an address in the USA.
series other
last changed 2003/04/23 15:14
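The canonical GA loop the book's first part describes (selection, crossover, mutation, repeat) can be illustrated with a minimal sketch. This is a reading aid only, not code from the handbook; all names and parameters are invented:

```python
import random

def genetic_algorithm(fitness, n_genes=10, pop_size=30, generations=60,
                      mutation_rate=0.05, seed=0):
    """Minimal bit-string GA: tournament selection, one-point
    crossover, per-bit mutation, keeping the best individual found."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_genes)]
           for _ in range(pop_size)]
    best = max(pop, key=fitness)

    def tournament():
        a, b = rng.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            p1, p2 = tournament(), tournament()
            cut = rng.randrange(1, n_genes)      # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [g ^ 1 if rng.random() < mutation_rate else g
                     for g in child]             # bit-flip mutation
            new_pop.append(child)
        pop = new_pop
        best = max(pop + [best], key=fitness)
    return best

# "OneMax" toy problem: maximise the number of 1-bits in the string.
best = genetic_algorithm(sum)
print(sum(best))
```

The representation choice Davis emphasises shows up here as the bit-string encoding and the `fitness` callable; swapping either adapts the same loop to a different problem.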

_id 22d6
authors Ballheim, F. and Leppert, J.
year 1991
title Architecture with Machines, Principles and Examples of CAAD-Education at the Technische Universität München
source Experiences with CAAD in Education and Practice [eCAADe Conference Proceedings] Munich (Germany) 17-19 October 1991
doi https://doi.org/10.52842/conf.ecaade.1991.x.h3w
summary "Design tools affect the results of the design process" - this is the starting point of our considerations about the efficient use of CAAD within architecture. To give a short overview of what we mean by this thesis, let us take a short - and surely incomplete - trip through the fourth dimension back to the early days of civil engineering. As CAD in our faculty is integrated in the "Lehrstuhl für Hochbaustatik und Tragwerksplanung" (in English it would approximately be "institute of structural design"), we chose an example we are very familiar with because of its mathematical background - the conic sections: circle, ellipse, parabola and hyperbola. If we start our trip two thousand years ago, we only find the circle - or in very few cases the ellipse - used for the ground plan of Greek or Roman theatres - think of Greek amphitheatres or the Colosseum in Rome - or for the design of the cross section of a building - for example the Pantheon, Roman aqueducts or bridges. With the rediscovery of perspective during the Renaissance, the handling of the ellipse was brought to perfection. Maybe the most famous example is the Capitol in Rome designed by Michelangelo Buonarroti, with its elliptical ground plan that looks like a circle as the visitor comes up the famous stairway. During the following centuries - driven by the further development of the natural sciences and the use of new construction materials, e.g. cast iron, steel or concrete - new design ideas could be realized. With the growing influence of mathematics on the design of buildings we got the division into two professions: civil engineering and architecture. To the regret of the architects, the most innovative constructions were designed by civil engineers, e.g. the early iron bridges in Britain or the famous bridges of Robert Maillart. Nowadays we are in the situation that we try to reintegrate the divided professions.
We will return to that point later when discussing possible solutions to this problem. But let us continue our historical survey, demonstrating the state of the art we have today. As the logical consequence of parabolic and hyperbolic arcs, hyperbolic paraboloid shells were developed using traditional design techniques like models and orthogonal sections. Now we reach the point where the question comes up whether complex structures can be completely described using traditional methods - a question that must be answered with "no" once we take the final step to the completely irregular geometry of cable-net constructions or deconstructivist designs. What we see - and what seems to support our thesis of the connection between design tools and the results of the design process - is that, on the one hand, new tools enabled the designer to realize new ideas and, on the other hand, new ideas drove the development of new tools to realize them.

series eCAADe
more http://www.mediatecture.at/ecaade/91/ballheim_leppert.pdf
last changed 2022/06/07 07:50

_id ga9921
id ga9921
authors Coates, P.S. and Hazarika, L.
year 1999
title The use of genetic programming for applications in the field of spatial composition
source International Conference on Generative Art
summary Architectural design teaching using computers has been a preoccupation of CECA since 1991. All design tutors provide their students with a set of models and ways to form, and we have explored a set of approaches including cellular automata, genetic programming, agent-based modelling and shape grammars as additional tools with which to explore architectural (and architectonic) ideas. This paper discusses the use of genetic programming (G.P.) for applications in the field of spatial composition. CECA has been developing the use of genetic programming for some time (see references) and has covered the evolution of L-system production rules (Coates 1997, 1999b) and the evolution of generative grammars of form (Coates 1998, 1999a). The G.P. was used to generate three-dimensional spatial forms from a set of geometrical structures. The approach uses genetic programming with a Genetic Library (G.Lib). G.P. provides a way to genetically breed a computer program to solve a problem; G.Lib enables genetic programming to define potentially useful subroutines dynamically during a run. The work covers: exploring a shape grammar consisting of simple solid primitives and transformations; applying a simple fitness function to the solid breeding G.P.; exploring a shape grammar of composite surface objects; developing grammars for existing buildings, and creating hybrids; and exploring the shape grammar of a building within a G.P. We will report on new work using a range of different morphologies (boolean operations, surface operations and grammars of style) and describe the use of objective functions (natural selection) and the "eyeball test" (artificial selection) as ways of controlling and exploring the design spaces thus defined.
series other
more http://www.generativeart.com/
last changed 2003/08/07 17:25
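The abstract above describes breeding programs (expression trees) against a fitness function. A minimal mutation-driven sketch of the idea follows - illustrative only, not the CECA system; the function set, target and parameters are all invented - evolving an arithmetic expression toward the target x*x + x:

```python
import random

# Function set: binary operators available at internal tree nodes.
OPS = {'add': lambda a, b: a + b,
       'sub': lambda a, b: a - b,
       'mul': lambda a, b: a * b}

def rand_tree(rng, depth=3):
    """Grow a random expression tree over the variable 'x', small
    integer constants, and the operators in OPS."""
    if depth == 0 or rng.random() < 0.3:
        return 'x' if rng.random() < 0.5 else rng.randint(-2, 2)
    op = rng.choice(sorted(OPS))
    return (op, rand_tree(rng, depth - 1), rand_tree(rng, depth - 1))

def evaluate(tree, x):
    """Recursively evaluate a tree at a given value of x."""
    if tree == 'x':
        return x
    if isinstance(tree, int):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

def mutate(tree, rng, depth=2):
    """Replace the whole tree, or one randomly chosen branch,
    with a freshly grown random subtree."""
    if not isinstance(tree, tuple) or rng.random() < 0.3:
        return rand_tree(rng, depth)
    op, left, right = tree
    if rng.random() < 0.5:
        return (op, mutate(left, rng, depth), right)
    return (op, left, mutate(right, rng, depth))

def error(tree, target, xs):
    """Sum of squared differences from the target over sample points."""
    return sum((evaluate(tree, x) - target(x)) ** 2 for x in xs)

def evolve(target, pop_size=40, generations=300, seed=1):
    """Hill-climbing variant of GP: keep the best tree found so far
    and accept mutants that do not increase the error."""
    rng = random.Random(seed)
    xs = range(-5, 6)
    pop = [rand_tree(rng) for _ in range(pop_size)]
    best = min(pop, key=lambda t: error(t, target, xs))
    for _ in range(generations):
        mutant = mutate(best, rng)
        if error(mutant, target, xs) <= error(best, target, xs):
            best = mutant
    return best

target = lambda x: x * x + x
best = evolve(target)
```

A full GP would use crossover between trees and (as in the paper) a genetic library of evolved subroutines; this sketch keeps only the tree representation, random growth and subtree mutation.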

_id ga0024
id ga0024
authors Ferrara, Paolo and Foglia, Gabriele
year 2000
title TEAnO or the computer assisted generation of manufactured aesthetic goods seen as a constrained flux of technological unconsciousness
source International Conference on Generative Art
summary TEAnO (Telematica, Elettronica, Analisi nell'Opificio) was born in Florence, in 1991, at the age of 8, being the direct consequence of years of attempts by a group of computer science professionals to use digital computer technology to find a sustainable match among creation, generation (or re-creation) and recreation, the three basic keywords underlying the concept of “Littérature potentielle” deployed by Oulipo in France and Oplepo in Italy (see “La Littérature potentielle (Créations Re-créations Récréations)”, published in France by Gallimard in 1973). During the last decade, TEAnO has been involved in the generation of “artistic goods” in aesthetic domains such as literature, music, theatre and painting. In all those artefacts the computer plays a twofold role: it is often a tool to generate the good (e.g. an editor to compose palindrome sonnets or to generate antonymic music) and sometimes it is the medium that makes the fruition of the good possible (e.g. the generator of passages of definition literature). In that sense such artefacts can actually be considered “manufactured” goods. A great part of such creation and re-creation work has been based upon a rather small number of generation constraints borrowed from Oulipo, deeply stressed by the use of the digital computer's massive combinatory power: S+n, edge extraction, phonetic manipulation, re-writing of well-known masterpieces, random generation of plots, etc. Regardless of these apparently simple underlying generation mechanisms, the systematic use of computer-based tools, as well as the analysis of the produced results, has been the way to highlight two findings which can significantly affect the practice of computer-based generation of aesthetic goods: (1) 
the deep structure of an aesthetic work persists even through the most “destructive” manipulations (such as the antonymic transformation of the melody and lyrics of a music work) and becomes evident as a sort of profound, earliest and distinctive constraint; (2) the intensive flux of computer-generated “raw” material seems to confirm, and to bring to our attention, the existence of what Walter Benjamin indicated as the different way in which nature talks to a camera and to our eye, and what Franco Vaccari called “technological unconsciousness”. Essential references: R. Campagnoli, Y. Hersant, “Oulipo La letteratura potenziale (Creazioni Ri-creazioni Ricreazioni)”, 1985; R. Campagnoli, “Oupiliana”, 1995; TEAnO, “Quaderno n. 2 Antologia di letteratura potenziale”, 1996; W. Benjamin, “Das Kunstwerk im Zeitalter seiner technischen Reproduzierbarkeit”, 1936; F. Vaccari, “Fotografia e inconscio tecnologico”, 1994
series other
more http://www.generativeart.com/
last changed 2003/08/07 17:25

_id ga0010
id ga0010
authors Moroni, A., Zuben, F. Von and Manzolli, J.
year 2000
title ArTbitrariness in Music
source International Conference on Generative Art
summary Evolution is now considered not only powerful enough to bring about biological entities as complex as humans and consciousness, but also useful in simulation to create algorithms and structures of higher levels of complexity than could easily be built by design. In the context of artistic domains, the process of human-machine interaction is analyzed as a good framework to explore creativity and to produce results that could not be obtained without this interaction. When evolutionary computation and other computational intelligence methodologies are involved, we denote every attempt to improve aesthetic judgement as ArTbitrariness, interpreted as an interactive, iterative optimization process. ArTbitrariness is also suggested as an effective way to produce art through an efficient manipulation of information and a proper use of computational creativity to increase the complexity of the results without neglecting the aesthetic aspects [Moroni et al., 2000]. Our emphasis will be on an approach to interactive music composition. The problem of computer generation of musical material has received extensive attention, and a subclass of the field of algorithmic composition includes those applications which use the computer as something in between an instrument, which a user "plays" through the application's interface, and a compositional aid, which a user experiments with in order to generate stimulating and varying musical material. This approach was adopted in Vox Populi, a hybrid made up of an instrument and a compositional environment. Unlike other systems based on genetic algorithms or evolutionary computation, in which people have to listen to and judge the musical items, Vox Populi uses the computer and the mouse as real-time music controllers, acting as a new interactive computer-based musical instrument. The interface is designed to be flexible for the user to modify the music being generated.
It explores evolutionary computation in the context of algorithmic composition and provides a graphical interface that allows the user to modify the tonal center and the voice range, changing the evolution of the music by using the mouse [Moroni et al., 1999]. A piece of music consists of several sets of musical material manipulated and exposed to the listener, for example pitches, harmonies, rhythms, timbres, etc. They are composed of a finite number of elements and, basically, the aim of a composer is to organize those elements in an aesthetic way. Modeling a piece as a dynamic system implies a view in which the composer draws trajectories or orbits using the elements of each set [Manzolli, 1991]. Nonlinear iterative mappings are associated with interface controls. In the next page two examples of nonlinear iterative mappings with their resulting musical pieces are shown. The mappings may give rise to attractors, defined as geometric figures that represent the set of stationary states of a non-linear dynamic system, or simply trajectories to which the system is attracted. The relevance of this approach goes beyond music applications per se. Computer music systems that are built on the basis of a solid theory can be coherently embedded into multimedia environments. The richness and specialty of the music domain are likely to initiate new thinking and ideas, which will have an impact on areas such as knowledge representation and planning, and on the design of visual formalisms and human-computer interfaces in general. Above and below, the Vox Populi interface is depicted, showing two nonlinear iterative mappings with their resulting musical pieces. References: [Manzolli, 1991] J. Manzolli, Harmonic Strange Attractors, CEM Bulletin, Vol. 2, No. 2, 4-7, 1991. [Moroni et al., 1999] Moroni, J. Manzolli, F. Von Zuben, R. Gudwin, Evolutionary Computation Applied to Algorithmic Composition, Proceedings of CEC99 - IEEE International Conference on Evolutionary Computation, Washington D.C., p. 807-811, 1999. [Moroni et al., 2000] Moroni, A., Von Zuben, F. and Manzolli, J., ArTbitration, Proceedings of the 2000 Genetic and Evolutionary Computation Conference Workshop Program - GECCO, Las Vegas, USA, 143-145, 2000.
series other
email
more http://www.generativeart.com/
last changed 2003/08/07 17:25
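The abstract describes nonlinear iterative mappings whose orbits (possibly attractors) supply musical material. A minimal sketch of that idea - illustrative only, not the Vox Populi implementation; the map choice, names and pitch range are invented - using the logistic map to generate a pitch sequence:

```python
def logistic_orbit(r=3.9, x0=0.5, n=32):
    """Iterate the logistic map x -> r*x*(1-x); for r near 4 the
    orbit is chaotic, for smaller r it settles onto an attractor."""
    orbit, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        orbit.append(x)
    return orbit

def orbit_to_pitches(orbit, low=48, high=84):
    """Map orbit points in (0, 1) linearly onto a MIDI pitch range."""
    return [low + int(x * (high - low)) for x in orbit]

pitches = orbit_to_pitches(logistic_orbit())
```

Tying an interface control to the map parameter (here `r`) changes the shape of the attractor and hence the character of the generated material, which is the basic mechanism the abstract attributes to the system's real-time controllers.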

_id 2abf
id 2abf
authors Rafi, A
year 2001
title Design creativity in emerging technologies
source In Von, H., Stocker, G. and Schopf, C. (Eds.), Takeover: Who’s doing art of tomorrow (pp. 41-54), New York: Springer Wien.
summary Human creativity works best when there are constraints – pressures to react to, to shape, to suggest. People are generally not very good at making it all up from scratch (Laurel, 1991). Emerging technologies, particularly virtual reality (VR), multimedia and the Internet, are yet to be fully explored, as they allow unprecedented creative talent, ability, skill sets, creative thinking, representation, exploration, observation and reference. In an effort to deliver interactive content, designers tend to borrow freely from different fields such as advertising, medicine, games, fine art, commerce, entertainment, edutainment, film-making and architecture (Rafi, Kamarulzaman, Fauzan and Karboulonis, 2000). As a result, content becomes a base onto which developers transfer the techniques of conventional design media to the computer. What developers (e.g. artists and technologists) often miss is the need to develop emerging-technology content based on the nature of the medium. In this context, the user will be the best judge of the effectiveness of the content.

The paper will introduce the Global Information Infrastructure (GII) that is currently being developed in the Asian region and discuss its impact on the Information Age society. It will further highlight the ‘natural’ value and characteristics of the emerging technologies, in particular Virtual Reality (VR), Multimedia and the Internet, as guidance for effective, rich and innovative content development. This paper also argues that content designers of the future must not only be both artist and technologist, but artists and technologists who are aware of the re-convergence of art and science and of the context in which content is being developed. Some of our exploration at the Faculty of Creative Multimedia, Multimedia University will also be demonstrated. It is hoped that this will provide evidence to guide future ‘techno-creative designers’.

keywords design, creativity, content, emerging technologies
series book
type normal paper
email
last changed 2007/09/13 03:46

_id a1dc
authors Budd, T.
year 1991
title An introduction to Object Oriented programming
source Addison-Wesley
summary In An Introduction to Object-Oriented Programming, Timothy Budd provides a language-independent presentation of object-oriented principles, such as objects, methods, inheritance (including multiple inheritance) and polymorphism. Examples are drawn from several different languages, including (among others) C++, C#, Java, CLOS, Delphi, Eiffel, Objective-C and Smalltalk. By examining many languages, the reader is better able to appreciate the general principles that lie beyond the syntax of the individual languages.
series other
last changed 2003/04/23 15:14

_id c12b
authors Sakr, Yasser H. and Johnson, Robert E.
year 1991
title Computer-Aided Architectural Design Strategies: One Size Does Not Fit All
source Reality and Virtual Reality [ACADIA Conference Proceedings / ISBN 1-880250-00-4] Los Angeles (California - USA) October 1991, pp. 15-31
doi https://doi.org/10.52842/conf.acadia.1991.015
summary The practice of architecture is in the midst of significant change and an increasingly uncertain future. Socio-economic factors external to the profession are forcing firms to develop new strategies for delivering design services. Overlaying these external changes is the uncertainty resulting from the inevitable introduction of information technology, which is only beginning to have an impact on the profession. Some advocates see the emergence of a new form of design firm - the computerized design firm - as an intelligent organization structured around electronic work groups with powerful computation and communications tools (Catalano 1990). On the other hand, many practitioners still see CADD as an expensive technology whose primary result leads to an increase in overhead costs. But some practitioners and researchers (Coyne, 1991) recognize both the potential and the problems that computer-aided design presents to the profession. This research presents a framework for understanding how changing information technology might be appropriately integrated into the design firm. It argues that design is an increasingly diverse enterprise, and that this diversity must be understood in order to effectively integrate information technology. The study is divided into three sections. The first section develops an overview of major social, economic, and structural changes within the profession. The second section discusses two alternative approaches that have been utilized to integrate information technology into firms. The third part presents a framework for understanding how information technology may have an impact on strategies for structuring and organizing architectural firms.
series ACADIA
last changed 2022/06/07 07:56

_id c00e
authors Tolman, F. P. and Kuiper, P.
year 1991
title Some Integration Requirements for Computer Integrated Building
source The Computer Integrated Future, CIB W78 Seminar. September 1991. Unnumbered : ill. includes a short bibliography
summary Introduction of computer technology in the Building and Construction industries follows a bottom-up approach. Bottom-up approaches always lead to (1) communication problems on higher levels -- in this case recognized as 'islands of automation' -- followed more recently by (2) a plea for integration. Although the word 'integration' quickly became in vogue, it is not clear what it really means and what it is that we are supposed to integrate. Another interesting and pressing question is: 'How to integrate the different integration efforts?' The paper discusses five hierarchical technical levels of integration. Each level is elaborated in some detail. Also the relations between the levels are brought into perspective. Non-technical integration requirements (e.g. social, organizational, or legal) are not discussed.
keywords integration, systems, CAD, building, construction
series CADline
last changed 2003/06/02 10:24

_id 241f
authors Van Wyk, C.S.G., Bhat, R., Gauchel, J. and Hartkopf, V.
year 1991
title A Knowledge-based Approach to Building Design and Performance Evaluation
source Reality and Virtual Reality [ACADIA Conference Proceedings / ISBN 1-880250-00-4] Los Angeles (California - USA) October 1991, pp. 1-14
doi https://doi.org/10.52842/conf.acadia.1991.001
summary The introduction of physically-based description and simulation methods to issues of building performance (i.e., acoustic, visual, and air quality; thermal comfort, cost, and long-term system integrity) began in the early 1960s as one of the first examples of computer-aided design in architecture. Since that time, the development of commercially-available computer-aided design systems has largely been oriented towards the visualization and representation of the geometry of buildings, while the development of building performance applications has been concerned with approaches to mathematical and physics-based modeling for predictive purposes.
series ACADIA
email
last changed 2022/06/07 07:58

_id f9bd
authors Amor, R.W.
year 1991
title ICAtect: Integrating Design Tools for Preliminary Architectural Design
source Wellington, New Zealand: Computer Science Department, Victoria University
summary ICAtect is a knowledge-based system that provides an interface between expert systems, simulation packages and CAD systems used for preliminary architectural design. This thesis describes its structure and development. The principal work discussed in this thesis involves the formulation of a method for representing a building. This is developed through an examination of a number of design tools used in architectural design, and the ways in which each of these describes a building. Methods of enabling data to be transferred between design tools are explored. A Common Building Model (CBM), forming the core of the ICAtect system, is developed to represent the design tools' knowledge of a building. This model covers the range of knowledge required by a large set of disparate design tools used by architects at the initial design stage. Standard methods of integrating information from the tools were examined, but required augmentation to encompass the unusual constraints found in some of the design tools. The integration of the design tools and the CBM is discussed in detail, with example methods developed for each type of design tool. These example methods provide a successful way of moving information between the different representations. Some problems with mapping data between very different representations were encountered in this process, and the solutions or ideas for remedies are detailed. A model for control and use of ICAtect is developed in the thesis, and the extensions to enable a graphical user interface are discussed. The methods developed in this thesis demonstrate the feasibility of an integrated system of this nature, while the discussion of future work indicates the scope and potential power of ICAtect.
series other
last changed 2003/04/23 15:14

_id 27d2
authors Ayrle, Hartmut
year 1991
title Computers for Architects - Only a Tool?
source Experiences with CAAD in Education and Practice [eCAADe Conference Proceedings] Munich (Germany) 17-19 October 1991
doi https://doi.org/10.52842/conf.ecaade.1991.x.i9j
summary The paper states that, as a result of the schism between architecture as art and engineering as rationalism, the architectural community underestimates the computer as a tool with the potential to substantially enlarge the possibilities of building design. It is claimed that the computer could serve as a coordination tool for the ruptured design process, as a virtual workbench where all design disciplines sit together and develop their designs with an enhanced awareness of what the whole design demands. The paper then concludes that, to develop such software tools, architects must participate in the development of software and may no longer be restricted to the role of mere users, especially during their university education. The corresponding research and training facilities at the University of Karlsruhe, Faculty of Architecture are described.

series eCAADe
last changed 2022/06/07 07:50

_id 227a
authors Bourdeau, L., Dubois, A.-M. and Poyet, P.
year 1991
title A Common Data Model for Computer Integrated Building
source The Computer Integrated Future, CIB W78 Seminar. September 1991. Unnumbered : some ill. includes bibliography
summary The connection of various building performance evaluation tools in a collaborative way is an essential requirement for developing true CAD systems. It is a basic requirement for the future of integrated information systems for building projects, where data concerning multiple aspects of the project can be exchanged during the different design steps. This paper deals with the on-going research concerning the generation of a common data model in the framework of a European collaborative action, the COMBINE Project, which is supported by the CEC, Directorate-General XII for Science, Research and Development, within the JOULE programme. The first step of the research concerns the progressive construction of a conceptual model, and the paper focuses on the development of this Integrated Data Model (IDM). The paper reports on the definition of the architecture of the IDM. The main issues and the methodology of the IDM development are presented. The IDM development methodology is based on successive steps dealing with the identification of the data and context considered by the Design Tool Prototypes (DTP) to be connected through the IDM, the conceptual integration of this knowledge, and the implementation of the model in an appropriate software environment.
keywords standards, integration, communication, building, evaluation, modeling
series CADline
last changed 2003/06/02 14:41

_id 0b1c
authors Bridges, Alan
year 1991
title Computer Exercises in Architectural Design Theory
source Experiences with CAAD in Education and Practice [eCAADe Conference Proceedings] Munich (Germany) 17-19 October 1991
doi https://doi.org/10.52842/conf.ecaade.1991.x.f9w
summary This paper discusses how architectural theory may be taught using computer-based exercises to explore the practical application of those theories. The particular view of architecture developed is, necessarily, a restricted one, but the objectives behind the exercises are slightly different to those that a pure architectural theorist or historian might have. The formal teaching of architectural theory and composition has not been very fashionable in Schools of Architecture for several years now: indeed there is a considerable inbuilt resistance in students to the application of any form of rules or procedures. There is however a general interest in computing, and this can be utilised to advantage. In concentrating on computer applications in design, eclectic use has been made of a number of architectural examples ranging from Greek temples to the work of modern deconstructionists. Architectural theory since Vitruvius is littered with attempts to define universal theories of design, and this paper certainly does not presume to anything so grand: I have merely looked at buildings, compared them and noted what they have in common and how that might relate to computer-aided design. I have ignored completely any sociological, philosophical or phenomenological questions but would readily agree with the criticism that Cartesian rationality is not, on its own, a sufficient base upon which to build a theory of design. However I believe there is merit in articulating design by separating it from other concerns and making it a subject of study in its own right. Work in design research will provide the models and intellectual structures to facilitate discourse about design and might be expected to benefit the development of design skills by providing material that could be formally taught and debated in a way that is removed from the ephemeral "fashionable designer" debate.
Of course, some of the ideas discussed here may prove to be equally ephemeral but that does not entirely negate their value.

series eCAADe
email
last changed 2022/06/07 07:50

_id 85f9
authors Brisson, E., Debras, P. and Poyet, Patrice
year 1991
title A First Step Towards an Intelligent Integrated Design System in the Building Field
source The Computer Integrated Future, CIB W78 Seminar. September 1991. Unnumbered pages : ill. includes bibliography
summary This article presents the work the Knowledge Base Group is carrying out towards the integration of Artificial Intelligence based facilities into the building design process. After an overview of the current state of the integrated design process, the context and the technical guidelines for realizing computer-integrated software in the building design field are described. Then some tools are presented to model the knowledge (the HBDS method) and to implement such a model in our home-made Mips knowledge modeling software platform (including object-oriented database management facilities, expert system reasoning facilities, hypertext edition facilities, 3D-design and 3D-view modules...). Finally the authors describe the Quakes application, devoted to assessing detached-house anti-seismic capabilities during the design process. A deep conceptual model considers all the semantic entities (columns, resistant panels, openings, ...) involved in the anti-seismic expertise. Using both this conceptual model description of a detached house and the 3D design tool, they input the project. The seismic expertise is then driven in a divide-and-conquer approach, and each recognized configuration is automatically linked to the corresponding section of the building regulations.
keywords AI, design, knowledge, software, integration, building, CAD, structures
series CADline
last changed 2003/06/02 13:58

_id b6b3
authors Brown, J.S. and Duguid, P.
year 1991
title Organizational Learning and Communities-of-Practice: Toward a Unified View of Working, Learning, and Innovation
source Organization Science, 2(1), 40-57
summary Recent ethnographic studies of workplace practices indicate that the ways people actually work usually differ fundamentally from the ways organizations describe that work in manuals, training programs, organizational charts, and job descriptions. Organizations tend to rely on the latter in their attempts to understand and improve work practice. We relate the conclusions of one study of work practices to compatible investigations of learning and innovation to argue that conventional descriptions of jobs mask not only the ways people work, but also the learning and innovation generated in the informal communities-of-practice in which they work. By reassessing the apparently conflicting triad of work, learning, and innovation in the context of actual communities and actual practices, we suggest that the synergistic connections between these three become apparent. With a unified view of working, learning, and innovating, it should be possible to reconceive of and redesign organizations to improve all three.
series journal paper
last changed 2003/04/23 15:14

_id 403c
authors Coyne, Richard
year 1991
title The Impact of Computer Use on Design Practice
source Computer Aided Architectural Design Futures: Education, Research, Applications [CAAD Futures ‘91 Conference Proceedings / ISBN 3-528-08821-4] Zürich (Switzerland), July 1991, pp. 413-424
summary This paper presents a critical review of the impact of computing on design practice. It presents an overview of the impact as it relates to the intrusion of a highly technical resource into organizations. This involves a discussion of the changing nature of computing, the implications for training and management, and perceptions about the compatibility of computers and design. This leads to a consideration of less direct implications in terms of power structures, how computers influence the way we carry out intellectual tasks, the emphasis instilled by computers on form in design, and the influence of computers on attitudes of self-worth.
series CAAD Futures
email
last changed 2003/05/16 20:58

_id eb51
authors Coyne, Richard
year 1996
title CAAD, Curriculum and Controversy
source Education for Practice [14th eCAADe Conference Proceedings / ISBN 0-9523687-2-2] Lund (Sweden) 12-14 September 1996, pp. 121-130
doi https://doi.org/10.52842/conf.ecaade.1996.121
summary This paper brings some of the debate within educational theory to bear on CAAD teaching, outlining the contributions of conservatism, critical theory, radical hermeneutics and pragmatism. The paper concludes by recommending that CAAD teaching move away from conservative concepts of teaching, design and technology, and that CAAD be integrated into the studio. In a highly illuminating book on education theory, Shaun Gallagher (1991) outlines four current views on education that correspond to four major positions in contemporary social theory and philosophy. I will extend these categories to a consideration of attitudes to information technology, and the teaching of computing in architecture. These four positions are conservatism, critical theory, radical hermeneutics, and pragmatism. I will show how certain issues cluster around them, how each position provides the focus of various discursive practices, or intellectual conversations in contemporary thinking, and how information technology is caught up in those conversations. These four positions are not "cognitive styles," but vigorously argued domains of debate involving writers such as Gadamer, Habermas and Derrida about the theory of interpretation. The field of interpretation is known as hermeneutics, which is concerned less with epistemology and knowledge than with understanding. Interpretation theory applies to reading texts, interpreting the law, and appreciating art, but also to the performance of any practical task, such as making art, drawing, defining and solving problems, and design (Coyne and Snodgrass, 1995). Hermeneutics provides a coherent focus for considering many contemporary issues and many domains of practice. I outline what these positions in education mean in terms of CAAD (computer-aided architectural design) in the curriculum.

series eCAADe
email
more http://www.caad.ac.uk/~richard
last changed 2022/06/07 07:56

_id a6be
authors Doyle, J.
year 1991
title Static and Dynamic Analysis of Structures, with an emphasis on mechanics and computer methods
source Kluwer Academic Pub., Dordrecht
summary This book is concerned with the static and dynamic analysis of structures. Specifically, it uses the stiffness-formulated matrix methods for use on computers to tackle some of the fundamental problems facing engineers in structural mechanics. This is done by covering the Mechanics of Structures, its rephrasing in terms of the Matrix Methods and then their Computational implementation, all within a cohesive setting. Although the book is designed primarily as a text for use at the upper-undergraduate and beginning graduate level, many practising structural engineers will find it useful as a reference and self-study guide. Each chapter is supplemented with a collection of pertinent problems that indicate extensions of the theory and the applications. These problems, combined with selected references to relevant literature, can form the basis for further study. The book sets as its goal the treatment of structural dynamics starting with the basic mechanics and going all the way to its implementation on digital computers. Rather than discuss particular commercial packages, Professor Doyle has developed STADYN: a complete (but lean) program to perform each of the standard procedures used in commercial programs. Each module in the program is reasonably complete in itself, and all were written with the sole aim of clarity plus a modicum of efficiency, compactness and elegance.
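The stiffness-formulated matrix method mentioned in this abstract can be illustrated with a minimal sketch (not taken from the book or from STADYN): each element contributes a small stiffness matrix that is scattered into a global matrix K, boundary conditions are applied, and the static system K u = f is solved for nodal displacements. The spring constants and load below are hypothetical values chosen for the example.

```python
import numpy as np

def assemble_global_stiffness(spring_constants):
    """Assemble the global stiffness matrix for axial springs in series."""
    n_nodes = len(spring_constants) + 1
    K = np.zeros((n_nodes, n_nodes))
    for e, k in enumerate(spring_constants):
        # Scatter the 2x2 element stiffness k*[[1,-1],[-1,1]] into
        # global degrees of freedom e and e+1.
        K[e:e+2, e:e+2] += k * np.array([[1.0, -1.0], [-1.0, 1.0]])
    return K

springs = [100.0, 200.0]      # spring stiffnesses (N/mm), hypothetical
K = assemble_global_stiffness(springs)

# Fix node 0 (u0 = 0) by deleting its row and column, then solve K u = f
# for a 50 N load applied at the free end.
f = np.array([0.0, 50.0])
u = np.linalg.solve(K[1:, 1:], f)
print(u)  # displacements of nodes 1 and 2
```

Commercial codes and STADYN handle beams, frames and dynamics rather than springs, but the assemble/constrain/solve pattern above is the same core procedure.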
series other
last changed 2003/04/23 15:14

_id 0faa
authors Duelund Mortensen, Peder
year 1991
title THE FULL-SCALE MODEL WORKSHOP
source Proceedings of the 3rd European Full-Scale Modelling Conference / ISBN 91-7740044-5 / Lund (Sweden) 13-16 September 1990, pp. 10-11
summary The workshop is an institution, available for use by the public and established at the Laboratory of Housing in the Art Academy's School of Architecture for a 3-year trial period beginning April 1985. This resumé contains brief descriptions of a variety of representative model projects and an overview of all projects carried out so far, including the pilot projects from 1983 and planned projects up to and including January 1987. The Full Scale Model Workshop builds full-size models of buildings, rooms and parts of buildings. The purpose of the Full Scale Model Workshop is to promote communication among a building's users. The workshop is a tool in an attempt to build bridges between theory and practice in research, experimentation and communication of research results. New ideas and experiments of various sorts can be tried out cheaply, quickly and efficiently through the building of full-scale models. Changes can be made on the spot as a planned part of the project and on the basis of ideas and experiments arising from the model work itself. Buildings and their spaces can thus be communicated directly to all involved persons, regardless of technical background or training in the evaluation of building projects.
keywords Full-scale Modeling, Model Simulation, Real Environments
series other
type normal paper
more http://info.tuwien.ac.at/efa
last changed 2004/05/04 15:23
