CumInCAD is a cumulative index of publications in Computer Aided Architectural Design, supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD Futures.


Hits 1 to 20 of 84

_id cf5c
authors Carpenter, B.
year 1992
title The logic of typed feature structures with applications to unification grammars, logic programs and constraint resolution
source Cambridge Tracts in Theoretical Computer Science, Cambridge University Press
summary This book develops the theory of typed feature structures, a new form of data structure that generalizes both the first-order terms of logic programs and feature-structures of unification-based grammars to include inheritance, typing, inequality, cycles and intensionality. It presents a synthesis of many existing ideas into a uniform framework, which serves as a logical foundation for grammars, logic programming and constraint-based reasoning systems. Throughout the text, a logical perspective is adopted that employs an attribute-value description language along with complete equational axiomatizations of the various systems of feature structures. Efficiency concerns are discussed and complexity and representability results are provided. The application of feature structures to phrase structure grammars is described and completeness results are shown for standard evaluation strategies. Definite clause logic programs are treated as a special case of phrase structure grammars. Constraint systems are introduced and an enumeration technique is given for solving arbitrary attribute-value logic constraints. This book with its innovative approach to data structures will be essential reading for researchers in computational linguistics, logic programming and knowledge representation. Its self-contained presentation makes it flexible enough to serve as both a research tool and a textbook.
series other
last changed 2003/04/23 15:14
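
The unification operation at the heart of this formalism can be pictured with a minimal sketch (Python; untyped, without the book's inheritance, inequations, cycles or structure sharing, and with invented feature names used purely for illustration):

def unify(a, b):
    """Unify two untyped feature structures (nested dicts).
    Returns the least structure subsumed by both, or None on an
    atomic clash. Carpenter's formalism adds types, inheritance,
    inequations and cyclic, reentrant structures on top of this."""
    if not isinstance(a, dict) or not isinstance(b, dict):
        return a if a == b else None   # atoms unify only with themselves
    result = dict(a)                   # start from a copy of the first structure
    for feature, value in b.items():
        if feature in result:
            sub = unify(result[feature], value)
            if sub is None:
                return None            # failure propagates upward
            result[feature] = sub
        else:
            result[feature] = value
    return result

# Agreement features of a pronoun unified with a verb's subject slot:
she = {"agr": {"person": 3, "number": "sg"}, "case": "nom"}
subj = {"agr": {"number": "sg"}}
print(unify(she, subj))             # {'agr': {'person': 3, 'number': 'sg'}, 'case': 'nom'}
print(unify(she, {"case": "acc"}))  # None: nominative vs. accusative clash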

_id a3f5
authors Zandi-Nia, Abolfazl
year 1992
title Topgene: An Artificial Intelligence Approach to a Design Process
source Delft University of Technology
summary This work deals with two architectural design (AD) problems at the topological level, in the presence of the social norms of community, privacy, circulation cost, and intervening opportunity. The first problem concerns generating a design with respect to the set of norms mentioned above; the second requires evaluating existing designs with respect to the same set of norms. Both problems are based on the structural-behavioral relationship in buildings. This work addresses these problems in the following respects: (1) A working system, called TOPGENE (The TOpological Pattern GENErator), has been developed. (2) Both problems may be vague and may lack sufficient information in their statement. For example, an AD problem involving these social norms requires the degrees of interaction between the location pairs in the building. This information is not always explicitly available and must be explicated from the design data. (3) An AD problem at the topological level is intractable, with no fast and efficient algorithm for its solution. To reduce the search effort in the process of design generation, TOPGENE uses a heuristic hill-climbing strategy that takes advantage of domain-specific rules of thumb to choose a path in the search space of a design. (4) TOPGENE uses the Q-analysis method for explication of hidden information, as well as hierarchical clustering of location pairs with respect to their flow-generation potential, as prerequisite information for the heuristic reasoning process. (5) To deal with the design of a building at the topological level, TOPGENE takes advantage of existing graph algorithms, such as path finding and planarity testing, during its reasoning process. This work also presents a new efficient algorithm for keeping track of distances in a growing graph. (6) Finally, this work presents a neural-net implementation of a special case of the design generation problem, based on the Hopfield model of neural networks. The results of this approach have been used to test TOPGENE's approach to generating designs. A comparison of the two approaches shows that the neural network provides mathematically more optimal designs, while TOPGENE produces more realistic designs. The two systems may be integrated to create a hybrid system.
series thesis:PhD
last changed 2003/02/12 22:37
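
The heuristic hill-climbing strategy mentioned in point (3) can be pictured with a deliberately small sketch (Python; the grid, room names, flow weights and scoring rule are invented stand-ins, not TOPGENE's actual domain rules or Q-analysis data):

import random

def score(layout, flows):
    """Sum of flow-weighted Manhattan distances, negated so that
    placing strongly interacting rooms close together scores higher."""
    return -sum(w * (abs(layout[a][0] - layout[b][0]) +
                     abs(layout[a][1] - layout[b][1]))
                for (a, b), w in flows.items())

def hill_climb(layout, flows, cells, steps=2000):
    """Greedy local search: try moving one room to another cell and
    keep the move whenever it improves the score."""
    best = dict(layout)
    for _ in range(steps):
        candidate = dict(best)
        candidate[random.choice(list(best))] = random.choice(cells)
        if (len(set(candidate.values())) == len(candidate)   # no two rooms share a cell
                and score(candidate, flows) > score(best, flows)):
            best = candidate
    return best

cells = [(x, y) for x in range(3) for y in range(3)]
start = {"entry": (0, 0), "office": (2, 2), "archive": (0, 2)}
flows = {("entry", "office"): 3.0, ("office", "archive"): 1.0}
print(hill_climb(start, flows, cells))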

_id cef3
authors Bridges, Alan H.
year 1992
title Computing and Problem Based Learning at Delft University of Technology Faculty of Architecture
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 289-294
doi https://doi.org/10.52842/conf.ecaade.1992.289
summary Delft University of Technology, founded in 1842, is the oldest and largest technical university in the Netherlands. It provides education for more than 13,000 students in fifteen main subject areas. The Faculty of Architecture, Housing, Urban Design and Planning is one of the largest faculties of the DUT with some 2000 students and over 500 staff members. The course of study takes four academic years: a first year (Propaedeuse) and a further three years (Doctoraal) leading to the "ingenieur" qualification. The basic course material is delivered in the first two years and is taken by all students. The third and fourth years consist of a smaller number of compulsory subjects in each of the department's specialist areas together with a wide range of option choices. The five main subject areas the students may choose from for their specialisation are Architecture, Building and Project Management, Building Technology, Urban Design and Planning, and Housing.

The curriculum of the Faculty has been radically revised over the last two years and is now based on the concept of "Problem-Based Learning". The subject matter taught is divided thematically into specific issues that are taught in six week blocks. The vehicles for these blocks are specially selected and adapted case studies prepared by teams of staff members. These provide a focus for integrating specialist subjects around a studio based design theme. In the case of second year this studio is largely computer-based: many drawings are produced by computer and several specially written computer applications are used in association with the specialist inputs.

This paper describes the "block structure" used in second year, giving examples of the special computer programs used, but also raises a number of broader educational issues. Introduction of the block system arose as a method of curriculum integration in response to difficulties emerging from the independent functioning of strong discipline areas in the traditional work groups. The need for a greater level of self-directed learning was recognised, as opposed to the "passive information model" of student learning in which the students are seen as empty vessels to be filled with knowledge - which they are then usually unable to apply in design-related contexts in the studio. Furthermore, the value of electives had been questioned: whilst enabling some diversity of choice, they may also be seen as diverting attention and resources from the real problems of teaching architecture.

series eCAADe
last changed 2022/06/07 07:54

_id e412
authors Fargas, Josep and Papazian, Pegor
year 1992
title Modeling Regulations and Intentions for Urban Development: The Role of Computer Simulation in the Urban Design Studio
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 201-212
doi https://doi.org/10.52842/conf.ecaade.1992.201
summary In this paper we present a strategy for modeling urban development in order to study the role of urban regulations and policies in the transformation of cities. We also suggest a methodology for using computer models as experimental tools in the urban design studio in order to make explicit the factors involved in shaping cities, and for the automatic visualization of projected development. The structure of the proposed model is based on different modules which represent, on the one hand, the rules regulating the physical growth of a city and, on the other hand, heuristics corresponding to different interests such as Real Estate Developers, City Hall Planners, Advocacy and Community Groups, and so on. Here we present a case study dealing with the Boston Redevelopment Authority zoning code for the Midtown Cultural District of Boston. We introduce a computer program which develops the district, adopting a particular point of view regarding urban regulation. We then generalize the notion of this type of computer modeling and simulation, and draw some conclusions about its possible uses in the teaching and practice of design.
series eCAADe
last changed 2022/06/07 07:55

_id 68c8
authors Flemming, U., Coyne, R. and Fenves, S. (et al.)
year 1994
title SEED: A Software Environment to Support the Early Phases in Building Design
source Proceedings of IKM '94, Weimar, Germany, pp. 5-10
summary The SEED project intends to develop a software environment that supports the early phases in building design (Flemming et al., 1993). The goal is to provide support, in principle, for the preliminary design of buildings in all aspects that can gain from computer support. This includes using the computer not only for analysis and evaluation, but also more actively for the generation of designs, or more accurately, for the rapid generation of design representations. A major motivation for the development of SEED is to bring the results of two multi-generational research efforts focusing on 'generative' design systems closer to practice: 1. LOOS/ABLOOS, a generative system for the synthesis of layouts of rectangles (Flemming et al., 1988; Flemming, 1989; Coyne and Flemming, 1990; Coyne, 1991); 2. GENESIS, a rule-based system that supports the generation of assemblies of 3-dimensional solids (Heisserman, 1991; Heisserman and Woodbury, 1993). The rapid generation of design representations can take advantage of special opportunities when it deals with a recurring building type, that is, a building type dealt with frequently by the users of the system. Design firms - from housing manufacturers to government agencies - accumulate considerable experience with recurring building types. But current CAD systems capture this experience and support its reuse only marginally. SEED intends to provide systematic support for the storage and retrieval of past solutions and their adaptation to similar problem situations. This motivation aligns aspects of SEED closely with current work in Artificial Intelligence that focuses on case-based design (see, for example, Kolodner, 1991; Domeshek and Kolodner, 1992; Hua et al., 1992).
series other
last changed 2003/04/23 15:14

_id a2e6
authors Liggett, R.S., Mitchell, W.J. and Tan, M.
year 1992
title Multi-Level Analysis and Optimization of Design
source New York: John Wiley & Sons, 1992. pp. 251-269 : ill. includes bibliography
summary This paper discusses a knowledge-based computer-aided design system that provides multi-level analysis capabilities and automatically propagates constraints on design variables from level to level. It also supports formulation and solution of optimization problems at different levels, so that a solution can be approached by solving a sequence of appropriately constrained sub-optimization problems. Theory and implementation are discussed, and a detailed case study of application to the design of small house plans is provided
keywords constraints, design, methods, knowledge base, CAD, systems, analysis, optimization, automation, user interface, shape grammars
series CADline
last changed 2003/06/02 14:41
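
The level-to-level propagation of constraints on design variables can be illustrated with a minimal interval-arithmetic sketch (Python; the variables and bounds are invented for illustration and are not taken from the paper):

def propagate_area(total_min, total_max, fixed_rooms):
    """Derive admissible area bounds for one remaining room from a
    building-level area constraint and the bounds already committed
    at the room level. Simple interval arithmetic, nothing more."""
    committed_min = sum(lo for lo, hi in fixed_rooms.values())
    committed_max = sum(hi for lo, hi in fixed_rooms.values())
    return (max(0, total_min - committed_max),
            total_max - committed_min)

# Building level: total floor area must lie in [40, 55] m^2.
fixed = {"living": (20, 30), "kitchen": (8, 12)}
print(propagate_area(40, 55, fixed))   # the remaining room may range over (0, 27)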

_id ddss9215
authors Mortola, E. and Giangrande, A.
year 1993
title A trichotomic segmentation procedure to evaluate projects in architecture
source Timmermans, Harry (Ed.), Design and Decision Support Systems in Architecture (Proceedings of a conference held in Mierlo, the Netherlands in July 1992), ISBN 0-7923-2444-7
summary This paper illustrates a model used to construct the evaluation module for An Interface for Designing (AID), a system to aid architectural design. The model can be used at the end of every cycle of analysis-synthesis-evaluation in the intermediate phases of design development. With the aid of the model it is possible to evaluate the quality of a project in overall terms to establish whether the project is acceptable, whether it should be elaborated ex-novo, or whether it is necessary to begin a new cycle to improve it. In this last case, it is also possible to evaluate the effectiveness of the possible actions and strategies for improvement. The model is based on a procedure of trichotomic segmentation, developed with MCDA (Multi-Criteria Decision Aid), which uses the outranking relation to compare the project with some evaluation profiles taken as projects of reference. An application of the model in the teaching field will also be described.
series DDSS
last changed 2003/08/07 16:36
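
A schematic reading of the trichotomic procedure (Python; the criteria, weights, reference profiles and the plain concordance test are illustrative assumptions, simplified from the MCDA outranking machinery the paper uses):

def outranks(project, profile, weights, threshold=0.7):
    """Concordance test: the project outranks a reference profile if
    it scores at least as well on a weighted majority of criteria."""
    concordance = sum(w for c, w in weights.items()
                      if project[c] >= profile[c])
    return concordance / sum(weights.values()) >= threshold

def classify(project, good, poor, weights):
    """Trichotomic segmentation into the three outcomes the paper
    describes: accept, improve in a new cycle, or redo ex novo."""
    if outranks(project, good, weights):
        return "acceptable"
    if outranks(project, poor, weights):
        return "improve in a new design cycle"
    return "elaborate ex novo"

weights = {"function": 0.4, "form": 0.3, "cost": 0.3}
good = {"function": 8, "form": 7, "cost": 7}
poor = {"function": 4, "form": 4, "cost": 4}
print(classify({"function": 8, "form": 6, "cost": 7}, good, poor, weights))  # acceptable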

_id 975e
authors Pearce, M. and Goel, A. (et al.)
year 1992
title Case-Based Design Support: A Case Study in Architectural Design
source IEEE Expert 7(5): 14-20
summary Archie, a small computer-based library of architectural design cases, is described. Archie helps architects in the high-level task of conceptual design as opposed to low-level tasks such as drawing and drafting, numerical calculations, and constraint propagation. Archie goes beyond supporting architects in design proposal and critiquing. It acts as a shared external memory that supports two kinds of design collaboration. First, by including enough knowledge about the goals, plans, outcomes, and lessons of past cases, it lets the designer access the work of previous architects. Second, by providing access to the perspectives of domain experts via the domain models, Archie helps architects anticipate and accommodate experts' views on evolving designs. The lessons learned in developing Archie about building large case-based systems to support real-world decision making are discussed.
series journal paper
last changed 2003/04/23 15:14
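
Case libraries like Archie are typically searched by similarity-weighted feature matching; a minimal sketch (Python; the features, weights and cases are invented, not Archie's actual indexing scheme):

def similarity(query, case, weights):
    """Weighted fraction of indexed features on which query and case agree."""
    matched = sum(w for f, w in weights.items()
                  if f in case and case[f] == query.get(f))
    return matched / sum(weights.values())

def retrieve(query, library, weights, k=2):
    """Return the k most similar stored cases, best match first."""
    return sorted(library, reverse=True,
                  key=lambda case: similarity(query, case, weights))[:k]

weights = {"building_type": 0.5, "climate": 0.3, "site": 0.2}
library = [
    {"name": "courthouse A", "building_type": "civic", "climate": "cold"},
    {"name": "library B", "building_type": "civic", "climate": "hot"},
]
query = {"building_type": "civic", "climate": "cold"}
print([case["name"] for case in retrieve(query, library, weights)])  # best match first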

_id 89ab
authors Villegas, A.F. and Esparta, J.B.
year 1992
title Didactic Interactive Tools in Architectural Education: A Case Study
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 145-155
doi https://doi.org/10.52842/conf.ecaade.1992.145
summary This paper presents a proposal based on the use of new didactic interactive tools, mainly multimedia and hypertext, the combination of which is sometimes known as hypermedia.
series eCAADe
last changed 2022/06/07 07:58

_id 6cfd
authors Harfmann, Anton C. and Majkowski, Bruce R.
year 1992
title Component-Based Spatial Reasoning
source Mission - Method - Madness [ACADIA Conference Proceedings / ISBN 1-880250-01-2] 1992, pp. 103-111
doi https://doi.org/10.52842/conf.acadia.1992.103
summary The design process and ordering of individual components through which architecture is realized relies on the use of abstract "models" to represent a proposed design. The emergence and use of these abstract "models" for building representation has a long history and tradition in the field of architecture. Models have been made and continue to be made for the patron, occasionally the public, and as a guide for the builders. Models have also been described as a means to reflect on the design and to allow the design to be in dialogue with the creator.

The term "model" in the above paragraph has been used in various ways and in this context is defined as any representation through which design intent is expressed. This includes accurate/ rational or abstract drawings (2- dimensional and 3-dimensional), physical models (realistic and abstract) and computer models (solid, void and virtual reality). The various models that fall within the categories above have been derived from the need to "view" the proposed design in various ways in order to support intuitive reasoning about the proposal and for evaluation purposes. For example, a 2-dimensional drawing of a floor plan is well suited to support reasoning about spatial relationships and circulation patterns while scaled 3-dimensional models facilitate reasoning about overall form, volume, light, massing etc. However, the common denominator of all architectural design projects (if the intent is to construct them in actual scale, physical form) are the discrete building elements from which the design will be constructed. It is proposed that a single computational model representing individual components supports all of the above "models" and facilitates "viewing"' the design according to the frame of reference of the viewer.

Furthermore, it is the position of the authors that all reasoning stems from this rudimentary level of modeling individual components.

The concept of component representation has been derived from the fact that a "real" building (made from individual components such as nuts, bolts and bar joists) can be "viewed" differently according to the frame of reference of the viewer. Each individual has the ability to infer and abstract from the assemblies of components a variety of different "models" ranging from a visceral, experiential understanding to a very technical, physical understanding. The component concept has already proven to be a valuable tool for reasoning about assemblies, interferences between components, tracing of load paths and numerous other component-related applications. In order to validate the component-based modeling concept, this effort will focus on the development of spatial understanding from the component-based model. The discussions will, therefore, center on the representation of individual components and the development of spatial models and spatial reasoning from the component model. In order to frame the argument that spatial modeling and reasoning can be derived from the component representation, a review of the component-based modeling concept will precede the discussions of spatial issues.

series ACADIA
last changed 2022/06/07 07:49
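
One way to picture how spatial relationships could be derived from a component-level model (Python; the components, the spaces they bound, and the passability flag are invented for illustration and are not the authors' representation):

from itertools import combinations

components = {
    "wall_1": {"bounds": {"hall", "office"}},
    "wall_2": {"bounds": {"office", "meeting"}},
    "door_1": {"bounds": {"hall", "office"}, "passable": True},
}

def space_adjacency(components):
    """Two spaces are adjacent if at least one component bounds both."""
    pairs = set()
    for comp in components.values():
        pairs.update(combinations(sorted(comp["bounds"]), 2))
    return pairs

def circulation(components):
    """Movement is possible between spaces bounded by a passable component."""
    return {tuple(sorted(comp["bounds"]))
            for comp in components.values() if comp.get("passable")}

print(space_adjacency(components))  # {('hall', 'office'), ('meeting', 'office')}
print(circulation(components))      # {('hall', 'office')}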

_id cc2f
authors Jog, Bharati
year 1992
title Evaluation of Designs for Energy Performance Using A Knowledge-Based System
source New York: John Wiley & Sons, 1992. pp. 293-304 : ill. includes a bibliography
summary Principles of knowledge-based (or expert) systems have been applied in different knowledge-rich domains such as geology, medicine, and very large scale integrated circuits (VLSI). There have been some efforts to develop expert systems for evaluation and prediction of architectural designs in this decade. This paper presents a prototype system, Energy Expert, which quickly computes the approximate yearly energy performance of a building design, analyzes the energy performance, and gives advice on possible ways of improving the design. These modifications are intended to make the building more energy efficient and help cut down on heating and cooling costs. The system is designed for the schematic design phase of an architectural project. Also discussed briefly is the reasoning behind developing such a system for the schematic design rather than the final design phase
keywords expert systems, energy, evaluation, performance, knowledge base, architecture, reasoning, programming, prediction
series CADline
last changed 1999/02/12 15:08

_id a07c
authors Mitchell, William J.
year 1992
title The Uses of Inconsistency in Design
source New York: John Wiley & Sons, 1992. pp. 1-13 : ill. includes bibliography. This article is the introductory chapter of the book
summary In this paper two of the central dogmas underlying most current theories of design evaluation are challenged: that the representations used by a designer must be well formed, and that a designer must have a consistent belief framework within which to make judgements about design proposals. The crucial roles in design of ambiguous and inconsistent representations and provisional beliefs are examined, and a model of design exploration based on nonmonotonic modes of reasoning is sketched
keywords reasoning, evaluation, prediction, design process, architecture
series CADline
last changed 2003/06/02 10:24

_id cb5a
authors Oxman, Rivka E.
year 1992
title Multiple Operative and Interactive Modes in Knowledge-Based Design Systems
source New York: John Wiley & Sons, 1992. pp. 125-143 : ill. includes bibliography
summary A conceptual basis for the development of an expert system which is capable of integrating various modes of generation and evaluation in design is presented. This approach is based upon two sets of reasoning processes in the design system. The first enables a mapping between design requirements and solution descriptions in a generative mode of design; and the second enables a mapping between solution descriptions and performance evaluation in an evaluative and predictive mode. This concept supports a formal framework necessary for a knowledge-based design system to operate in a design partnership relation with the designer. Another fundamental concept in expert systems for design, dual direction interpretation between graphic and textual modes, is presented and elaborated. This encoding of knowledge behind the geometrical representation can be achieved in knowledge-based design systems by the development of a 'semantic interpreter' which supports a dual direction mapping process employing geometrical knowledge, typological knowledge and evaluative knowledge. An implemented expert system for design, PREDIKT, demonstrates these concepts in the domain of kitchen design. It provides the user with a choice of alternative modes of interaction, such as: a 'design critic' for the evaluation of a design, a 'design generator' for the generation of a design, or a 'design critic-generator' for the completion of partial solutions
keywords architecture, knowledge base, design, systems, expert systems
series CADline
last changed 2003/06/02 10:24

_id 3ff5
authors Abbo, I.A., La Scalea, L., Otero, E. and Castaneda, L.
year 1992
title Full-Scale Simulations as Tool for Developing Spatial Design Ability
source Proceedings of the 4th European Full-Scale Modelling Conference / Lausanne (Switzerland) 9-12 September 1992, Part C, pp. 7-10
summary Spatial Design Ability has been defined as the capability to anticipate the effects (psychological impressions on potential observers or users) produced by mental manipulation of elements of architectural or urban spaces. This ability, of great importance in choosing the appropriate option during the design process, is not specifically developed in schools of architecture and is only partially obtained as a by-product of drawing, designing or architectural criticism. We use our laboratory as a tool to present spaces to people so that they can evaluate them. By means of a series of exercises, students confront their anticipations with the psychological impressions produced in other people. For this occasion, we present an experience in which students had to propose a space for an exhibition hall in which architectural projects (student theses) were to be shown. Following the Spatial Design Ability Development Model, which we have been using for several years, students first get acquainted with the use of evaluation instruments for psychological impressions as well as with research methodology. In this case, due to the short period available, we reduced the research to investigating the effects produced by the manipulation of only two independent variables: students manipulated first the form of the roof, walls and interior elements, and secondly the color and texture of those elements. They evaluated the spatial quality, character and the other psychological impressions that the manipulations produced in people. They used three-dimensional scale models at 1/10 and 1/1.
keywords Full-scale Modeling, Model Simulation, Real Environments
series other
more http://info.tuwien.ac.at/efa
last changed 2003/08/25 10:12

_id 6208
authors Abou-Jaoude, Georges
year 1992
title To Master a Tool
source Proceedings of the 4th European Full-Scale Modelling Conference / Lausanne (Switzerland) 9-12 September 1992, Part B, p. 15
summary The tool here is the computer or, to be precise, a unit that includes the computer, the peripherals and the software needed to fulfill a task. These tools are getting very sophisticated and user interfaces extremely friendly; it is therefore very easy to become the slave of such electronic tools and to reach self-satisfaction with straightforward results and attractive images. In order to master such sophisticated tools, and not become their slaves, a very solid knowledge of the related fields or domains of application becomes necessary. In the case of this seminar, full-scale modelling is a way to understand the relation between a mental model and its full-scale model; it is a way of communicating what is in a designer's mind. Computers and design programs can have the same goal. Rather than choosing one method or the other, let us try to say how important it is today to complement designing with the computer with other means and media, such as full-scale modelling, and what computer modelling and simulation can bring to full-scale modelling or other means.
keywords Full-scale Modeling, Model Simulation, Real Environments
series other
more http://info.tuwien.ac.at/efa
last changed 2003/08/25 10:12

_id ddss9219
authors Bourdakis, V. and Fellows, R.F.
year 1993
title A model appraising the performance of structural systems used in sports hall and swimming pool buildings in Greece
source Timmermans, Harry (Ed.), Design and Decision Support Systems in Architecture (Proceedings of a conference held in Mierlo, the Netherlands in July 1992), ISBN 0-7923-2444-7
summary The selection of the best performing structural system (among steel, laminated timber, concrete, and fabric tents) for medium-span (30-50m) sports halls and swimming pools in Greece formed the impetus for this research. Decision-making concerning selection of the structural system is difficult in this sector of construction, as was explained in the "Long Span Structures" conference (November 1990, Athens, Greece). From the literature it has been found that most building appraisals end up at the level of data analysis and draw conclusions on the individual aspects they investigate. These approaches usually focus on a fraction of the problem, examining it very deeply and theoretically. Their drawback is a loss of comprehensiveness and of the ability to draw conclusions at an overall level that are consequently applicable to existing conditions. Research on an inclusive level is sparse. In this particular research project, an inclusive appraisal approach was adopted, leading to the identification of three main variables: resources, human-user satisfaction, and technical. Consequently, this led to a combination of purely quantitative and qualitative data. Case studies were conducted on existing buildings in order to assess the actual performance of the various alternative structural systems. This paper presents the procedure followed for the identification of the research variables and focuses on the development of the model of quantification. The latter is of vital importance if the problem of data incompatibility is to be solved, findings are to be related at an overall level, and holistic conclusions are to be drawn.
series DDSS
last changed 2003/11/21 15:16

_id 0ac0
authors Coyne, Richard and Newton, Sidney
year 1992
title Metaphors, Computers and Architectural Education
source CAAD Instruction: The New Teaching of an Architect? [eCAADe Conference Proceedings] Barcelona (Spain) 12-14 November 1992, pp. 307-318
doi https://doi.org/10.52842/conf.ecaade.1992.307
summary In this paper we present the case for employing metaphor to explain the impact of technology. This contrasts with the empirical-theoretical method of inquiry. We also contrast two widely held metaphors of architectural education (the EPISTEMOLOGICAL and the COMMUNITY metaphors) and of the role of the computer (the MAINFRAME and the UBIQUITOUS COMPUTING metaphors). We show how in each case both metaphors result in different kinds of decision making in relation to resourcing an architecture school.
series eCAADe
last changed 2022/06/07 07:56

_id 9f8a
authors Davidow, William H.
year 1992
title The Virtual Corporation: Structuring and Revitalizing the Corporation for the 21st Century
source New York: Harper Collins Publishers
summary The great value of this timely, important book is that it provides an integrated picture of the customer-driven company of the future. We have begun to learn about lean production technology, stripped-down management, worker empowerment, flexible customized manufacturing, and other modern strategies, but Davidow and Malone show for the first time how these ideas are fitting together to create a new kind of corporation and a worldwide business revolution. Their research is fascinating. The authors provide illuminating case studies of American, Japanese, and European companies that have discovered the keys to improved competitiveness, redesigned their businesses and their business relationships, and made extraordinary gains. They also write bluntly and critically about a number of American corporations that are losing market share by clinging to outmoded thinking. Business success in the global marketplace of the future is going to depend upon corporations producing "virtual" products high in added value, rich in variety, and available instantly in response to customer needs. At the heart of this revolution will be fast new information technologies; increased emphasis on quality; accelerated product development; changing management practices, including new alignments between management and labor; and new linkages between company, supplier, and consumer, and between industry and government. The Virtual Corporation is an important cutting-edge book that offers a creative synthesis of the most influential ideas in modern business theory. It has already fired excitement and debate in industry, academia, and government, and it is essential reading for anyone involved in the leadership of America's business and the shaping of America's economic future.
series other
last changed 2003/04/23 15:14

_id 7ce5
authors Gal, Shahaf
year 1992
title Computers and Design Activities: Their Mediating Role in Engineering Education
source Sociomedia, ed. Edward Barrett. MIT Press
summary Sociomedia: With all the new words used to describe electronic communication (multimedia, hypertext, cyberspace, etc.), do we need another one? Edward Barrett thinks we do; hence, he coins the term "sociomedia." It is meant to displace a computing economy in which technicity is hypostasized over sociality. Sociomedia, a compilation of twenty-five articles on the theory, design and practice of educational multimedia and hypermedia, attempts to re-value the communicational face of computing. Value, of course, is "ultimately a social construct." As such, it has everything to do with knowledge, power, education and technology. The projects discussed in this book represent the leading edge of electronic knowledge production in academia (not to mention major funding) and are determining the future of educational media. For these reasons, Sociomedia warrants close inspection. Barrett's introduction sets the tone. For him, designing computer media involves hardwiring a mechanism for the social construction of knowledge (1). He links computing to a process of social and communicative interactivity for constructing and disseminating knowledge. Through a mechanistic mapping of the university as hypercontext (a huge network that includes classrooms as well as services and offices), Barrett models intellectual work in such a way as to avoid "limiting definitions of human nature or human development." Education, then, can remain "where it should be--in the human domain (public and private) of sharing ideas and information through the medium of language." By leaving education in a virtual realm (where we can continue to disagree about its meaning and execution), it remains viral, mutating and contaminating in an intellectually healthy way. He concludes that his mechanistic model, by means of its reductionist approach, preserves value (7). This "value" is the social construction of knowledge. While I support the social orientation of Barrett's argument, discussions of value are related to power. I am not referring to the traditional teacher-student power structure that is supposedly dismantled through cooperative and constructivist learning strategies. The power to be reckoned with in the educational arena is foundational, that which (pre)determines value and the circulation of knowledge. "Since each of you reading this paragraph has a different perspective on the meaning of 'education' or 'learning,' and on the processes involved in 'getting an education,' think of the hybris in trying to capture education in a programmable function, in a displayable object, in a 'teaching machine'" (7). Actually, we must think about that hybris because it is, precisely, what informs teaching machines. Moreover, the basic epistemological premises that give rise to such productions are too often assumed. In the case of instructional design, the episteme of cognitive sciences is often taken for granted. It is ironic that many of the "postmodernists" who support electronic hypertextuality seem to have missed Jacques Derrida's and Michel Foucault's "deconstructions" of the epistemology underpinning cognitive sciences (if not of epistemology itself). Perhaps it is the glitz of the technology that blinds some users (qua developers) to the belief systems operating beneath the surface. Barrett is not guilty of reactionary thinking or politics; he is, in fact, quite in line with much American deconstructive and postmodern thinking.
The problem arises in that he leaves open the definitions of "education," "learning" and "getting an education." One cannot engage in the production of new knowledge without orienting its design, production and dissemination, and without negotiating with others' orientations, especially where large-scale funding is involved. Notions of human nature and development are structural, even infrastructural, whatever the medium of the teaching machine. Although he addresses some dynamics of power, money and politics when he talks about the recession and its effects on the conference, they are readily visible dynamics of power (3-4). Where does the critical factor of value determination, of power, of who gets what and why, get mapped onto a mechanistic model of learning institutions? Perhaps a mapping of contributors' institutions, of the funding sources for the projects showcased and for participation in the conference, and of the disciplines receiving funding for these sorts of projects would help visualize the configurations of power operative in the rising field of educational multimedia. Questions of power and money notwithstanding, Barrett's introduction sets the social and textual thematics for the collection of essays. His stress on interactivity, on communal knowledge production, on the society of texts, and on media producers and users is carried forward through the other essays, two of which I will discuss. Section I of the book, "Perspectives...," highlights the foundations, uses and possible consequences of multimedia and hypertextuality. The second essay in this section, "Is There a Class in This Text?," plays on the robust exchange surrounding Stanley Fish's book, Is There a Text in This Class?, which presents an attack on authority in reading. The author, John Slatin, has introduced electronic hypertextuality and interaction into his courses. His article maps the transformations in "the content and nature of work, and the workplace itself"--which, in this case, is not industry but an English poetry class (25). Slatin discovered an increase of productive and cooperative learning in his electronically mediated classroom. For him, creating knowledge in the electronic classroom involves interaction between students, instructors and course materials through the medium of interactive written discourse. These interactions lead to a new and persistent understanding of the course materials and of the participants' relation to the materials and to one another. The work of the course is to build relationships that, in my view, constitute not only the meaning of individual poems, but poetry itself. The class carries out its work in the continual and usually interactive production of text (31). While I applaud his strategies which dismantle traditional hierarchical structures in academia, the evidence does not convince me that the students know enough to ask important questions or to form a self-directing learning community. Stanley Fish has not relinquished professing, though he, too, espouses the indeterminacy of the sign. By the fourth week of his course, Slatin's input is, by his own reckoning, reduced to 4% (39). In the transcript of the "controversial" Week 6 exchange on Gertrude Stein--the most disliked poet they were discussing at the time (40)--we see the blind leading the blind. One student parodies Stein for three lines and sums up his input with "I like it." Another finds Stein's poetry "almost completey [sic] lacking in emotion or any artistic merit" (emphasis added).
On what grounds has this student become an arbiter of "artistic merit"? Another student, after admitting being "lost" during the Wallace Stevens discussion, talks of having more "respect for Stevens' work than Stein's" and adds that Stein's poetry lacks "conceptual significance[, s]omething which people of varied opinion can intelligently discuss without feeling like total dimwits...." This student has progressed from admitted incomprehension of Stevens' work to imposing her (groundless) respect for his work over Stein's. Then, she exposes her real dislike for Stein's poetry: that she (the student) missed the "conceptual significance" and hence cannot, being a person "of varied opinion," intelligently discuss it "without feeling like [a] total dimwit." Slatin's comment is frightening: "...by this point in the semester students have come to feel increasingly free to challenge the instructor" (41). The students that I have cited are neither thinking critically nor are their preconceptions challenged by student-governed interaction. Thanks to the class format, one student feels self-righteous in her ignorance, and empowered to censure. I believe strongly in student empowerment in the classroom, but only once students have accrued enough knowledge to make informed judgments. Admittedly, Slatin's essay presents only partial data (there are six hundred pages of course transcripts!); still, I wonder how much valuable knowledge and metaknowledge was gained by the students. I also question the extent to which authority and professorial dictature were addressed in this course format. The power structures that make it possible for a college to require such a course, and the choice of texts and pedagogy, were not "on the table." The traditional professorial position may have been displaced, but what took its place?--the authority of consensus with its unidentifiable strong arm, and the faceless reign of software design? Despite Slatin's claim that the students learned about the learning process, there is no evidence (in the article) that the students considered where their attitudes came from, how consensus operates in the construction of knowledge, how power is established and what relationship they have to bureaucratic institutions. How do we, as teaching professionals, negotiate a balance between an enlightened despotism in education and student-created knowledge? Slatin, and other authors in this book, bring this fundamental question to the fore. There is no definitive answer because the factors involved are ultimately social, and hence, always shifting and reconfiguring. Slatin ends his article with the caveat that computerization can bring about greater estrangement between students, faculty and administration through greater regimentation and control. Of course, it can also "distribute authority and power more widely" (50). Power or authority without a specific face, however, is not necessarily good or just. Shahaf Gal's "Computers and Design Activities: Their Mediating Role in Engineering Education" is found in the second half of the volume, and does not allow for a theory/praxis dichotomy. Gal recounts a brief history of engineering education up to the introduction of Growltiger (GT), a computer-assisted learning aid for design. He demonstrates GT's potential to impact the learning of engineering design by tracking its use by four students in a bridge-building contest.
What his text demonstrates clearly is that computers are "inscribing and imaging devices" that add another viewpoint to an on-going dialogue between student, teacher, earlier coursework, and other teaching/learning tools. The less proficient students made a serious error by relying too heavily on the technology, or treating it as a "blueprint provider." They "interacted with GT in a way that trusted the data to represent reality. They did not see their interaction with GT as a negotiation between two knowledge systems" (495). Students who were more thoroughly informed in engineering discourses knew to use the technology as one voice among others--they knew enough not simply to accept the input of the computer as authoritative. The less-advanced students learned a valuable lesson from the competition itself: the fact that their designs were not able to hold up under pressure (literally) brought the fact of their insufficient knowledge crashing down on them (and their bridges). They also had, post factum, several other designs to study, especially the winning one. Although competition and comparison are not good pedagogical strategies for everyone (in this case the competitors had volunteered), at some point what we think we know has to be challenged within the society of discourses to which it belongs. Students need critique in order to learn to push their learning into auto-critique. This is what is lacking in Slatin's discussion and in the writings of other avatars of constructivist, collaborative and computer-mediated pedagogies. Obviously there are differences between instrumental types of knowledge acquisition and discursive knowledge accumulation. Indeed, I do not promote the teaching of reading, thinking and writing as "skills" per se (then again, Gal's teaching of design is quite discursive, if not dialogic). Nevertheless, the "soft" sciences might benefit from "bridge-building" competitions or the re-institution of some forms of agonia. Not everything agonistic is inhuman agony--the joy of confronting or creating a sound argument supported by defensible evidence, for example. Students need to know that soundbites are not sound arguments despite predictions that electronic writing will be aphoristic rather than periodic. Just because writing and learning can be conceived of hypertextually does not mean that rigor goes the way of the dinosaur. Rigor and hypertextuality are not mutually incompatible. Nor are rigorous thinking and hard intellectual work unpleasurable, although American anti-intellectualism, especially in the mass media, would make it so. At a time when the spurious dogmatics of a Rush Limbaugh and Holocaust revisionist historians circulate "aphoristically" in cyberspace, and at a time when knowledge is becoming increasingly textualized, the role of critical thinking in education will ultimately determine the value(s) of socially constructed knowledge. This volume affords the reader an opportunity to reconsider knowledge, power, and new communications technologies with respect to social dynamics and power relationships.
series other
last changed 2003/04/23 15:14

_id acadia03_036
authors Gerzso, J. Michael
year 2003
title On the Limitations of Shape Grammars: Comments on Aaron Fleisher’s Article “Grammatical Architecture?”
source Connecting >> Crossroads of Digital Discourse [Proceedings of the 2003 Annual Conference of the Association for Computer Aided Design In Architecture / ISBN 1-880250-12-8] Indianapolis (Indiana) 24-27 October 2003, pp. 279-287
doi https://doi.org/10.52842/conf.acadia.2003.279
summary Shape grammars were introduced by Gips and Stiny in 1972. Since then, there have been many articles and books written by them and their associates. In 1992, Aaron Fleisher, a professor at the School of Planning, MIT, wrote a critique of their work in an article titled "Grammatical Architecture?" published in the journal Environment and Planning B. According to him, Gips, Stiny and later Mitchell propose a hypothesis that states that shape grammars are presumed to represent knowledge of architectural form, that grammars are "formable," and that there is a visual correspondence to verbal grammar. The strong version of "the hypothesis requires that an architectural form be equivalent to a grammar." Fleisher considers these hypotheses unsustainable, and argues his case by analyzing the differences between language and architecture, and by dealing with the concepts of lexicons, syntax and semantics. He concludes by stating that architectural design is negotiated in two modalities: the verbal and the visual, and that equivalences are not at issue; they do not exist. If there is such a thing as a language for design, it would provide the means to maintain a discussion of the consequences in one mode, of the state and conditions of the other. Fleisher's observations serve as the basis of this paper, a tribute to him, and also an opportunity to present an outline of an alternate approach or hypothesis to shape grammars, which is "nonlinguistic" but "generative," in the sense that it uses production rules. A basic aspect of this hypothesis is that the only similarity between syntactic rules in language and some rules in architecture is that they are recursive.
series ACADIA
last changed 2022/06/07 07:51
