CumInCAD is a Cumulative Index about publications in Computer Aided Architectural Design
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD futures


Hits 1 to 20 of 4756

_id aa28
authors Lawrence, Peter
year 2002
title Designing Business
doi https://doi.org/10.52842/conf.caadria.2002.009
source CAADRIA 2002 [Proceedings of the 7th International Conference on Computer Aided Architectural Design Research in Asia / ISBN 983-2473-42-X] Cyberjaya (Malaysia) 18–20 April 2002, pp. 009-17
summary On a number of occasions, after I have told people about Corporate Design Foundation and what we do, their reaction has been: "well, business now understands the importance of design, right?" The answer is yes and no. There is, as they say, good news and bad news. A growing number of senior business executives do understand the possibilities of design, but many still do not, and unfortunately the majority of mid-level managers do not. While an increasing amount has been written about design in the business press, an equal amount in the general press has been misleading or simply wrong. There is a great deal more to do.
series CAADRIA
email
more http://www.cdf.org
last changed 2022/06/07 07:52

_id c207
authors Branzell, Arne
year 1993
title The Studio CTH-A and the Searching Picture
source Endoscopy as a Tool in Architecture [Proceedings of the 1st European Architectural Endoscopy Association Conference / ISBN 951-722-069-3] Tampere (Finland), 25-28 August 1993, pp. 129-140
summary What happens during an architect’s search for the best solution? How does he (or she) begin, which tools are chosen, what happens when he comes to a standstill? What about the activities – sketching, discussions with other people, making models, taking walks to think, visits to the library, etc.? What is an ordinary procedure and what is more specific? Do the tools have an impact on the final solution chosen? What happens during periods of no activity? Are they important? In which fields of activities are signs of the searching process to be found? In other words — what is the process of creative thinking for architects? Mikael Hedin and I, at Design Methods, Chalmers University of Technology, have started research into architects’ problem-solving. We have finished a pilot study on a very experienced architect working traditionally, without CAD (”The Bo Cederlöf Case”). We have started preliminary discussions with our second ”Case”, an architect in another situation, who has been working for many years with CAD equipment (Gert Wingårdh). For our next case, we will study a third situation – two or more architects who share the responsibility for the solution and where the searching is a consequence of a dialogue between equal partners. At present, we are preparing a report on theories in and methods for Searching and Creativity. I will give you some results of our work up to now, in the form of ten hypotheses on the searching process. Finally, I would like to present those fields of activity where we have so far found signs of searching. Our approach, in comparison with earlier investigations into searching (the most respected being Arnheim’s study on Picasso’s completion of the Guernica), is to collect and observe signs of searching during the process, not afterwards. We are, to use a metaphor, following in the footsteps of the hunter, recording the path he chooses, what marks he makes, what tools, implements and equipment he uses.
The anticipated gains are, for practising architects, a better understanding of what is going on and encouragement to try new ways of searching, and for architectural students, better preparation and training for problem solving. It all began when we compared the different objects in our collection of sketches at the Chalmers STUDIO for Visualisation and Communication. (For some years, we have been gathering sketches by Alvar Aalto, Jorn Utzon, Ralph Erskine, Erik and Tore Ahlsén, Lewerenz, Nyrén, Lindroos, Wingårdh and others in a permanent exhibition.) We observed similarities in these sketches which allowed us to frame ten hypotheses about the searching process.

keywords Architectural Endoscopy
series EAEA
email
more http://info.tuwien.ac.at/eaea/
last changed 2005/09/09 10:43

_id ddssar9604
id ddssar9604
authors Demir, Yueksel
year 1996
title CAD Systems for early design phases or CAD systems for designers' early phases
source Timmermans, Harry (Ed.), Third Design and Decision Support Systems in Architecture and Urban Planning - Part one: Architecture Proceedings (Spa, Belgium), August 18-21, 1996
summary Most of the problems related to the use of CAD systems result from certain general principles: the philosophy on which those systems are based. This paper therefore mainly discusses the relation between these principles and the early-design-phase performance of CAD systems and of designers. The circumstances of novice CAD-user architects in Turkey are considered first. The research draws on knowledge gained from personal experience with real cases from the university (education, research) and from practice (design, consulting), together with the results of a survey comprising a series of interviews reflecting the opinions of architects. Vendors of commonly used CAD systems were also interviewed. To answer the main question about the relation between CAD and the early design phase, the following questions and facts were investigated: What does CAD mean to architects? What are the main purposes of using CAD? Are CAD systems sufficient for use in early design phases in terms of hardware and/or software, or should we say thinkware? The advantages and disadvantages of using CAD. The target-user question and its consequences (the difference between general-purpose systems and sophisticated architectural systems). Should we adapt to a computerized way of thinking? Is 3D a basic feature? What are the education-related problems of CAD? Is the software-integration problem solved? The modularity concept for CAD systems. What are the minimum time and budget required for a start? The problem of illegal software use. The complaints, demands, needs and thanks of architects. Simply, what do architects expect from CAD during the design process, particularly in early phases (of both design and designer), and do CAD systems meet these expectations?
keywords CAD, Information Technology, Office Automation
series DDSS
last changed 2003/08/07 16:36

_id ijac201412305
id ijac201412305
authors Davis, Daniel
year 2014
title Quantitatively Analysing Parametric Models
source International Journal of Architectural Computing vol. 12 - no. 3, 307-320
summary Architectural practices regularly work with parametric models, yet almost nothing is known about the general properties of these models. We do not know how large a typical model is, or how complicated, or even what the typical parametric model does. These knowledge gaps are the focus of this article, which documents the first large-scale quantitative parametric model survey. In this paper three key quantitative metrics - dimensionality, size, and cyclomatic complexity - are applied to a collection of 2002 parametric models created by 575 designers. The results show that parametric models generally exhibit a number of strong correlations, which reveal a practice of parametric modelling that has as much to do with the management of data as it does with the modelling of geometry. These findings demonstrate the utility of software engineering metrics in the description and analysis of parametric models.
series journal
last changed 2019/05/24 09:55
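Cyclomatic complexity, one of the three metrics the Davis abstract names, is a standard software-engineering measure: for a graph with E edges, N nodes and P connected components, M = E - N + 2P. As a rough illustration of how it transfers to a parametric model viewed as a dependency graph, here is a minimal sketch; the graph representation and the node names are illustrative assumptions, not drawn from the paper's dataset of 2002 models.

```python
def cyclomatic_complexity(nodes, edges):
    """M = E - N + 2P for a dependency graph.

    nodes: list of node identifiers (the model's components).
    edges: list of (src, dst) pairs (data flowing between components).
    Edges are treated as undirected when counting connected components P.
    """
    parent = {n: n for n in nodes}

    def find(x):
        # Union-find with path halving.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for a, b in edges:
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    components = len({find(n) for n in nodes})
    return len(edges) - len(nodes) + 2 * components
```

A simple chain of components (slider feeding a curve feeding a loft) has complexity 1; adding an extra dependency path raises it, which is what makes the metric a proxy for how hard a model is to understand.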

_id 54b0
authors Duarte, J.P., Heitor, M. and Mitchell, W.J.
year 2002
title The Glass Chair - Competence Building for Innovation
doi https://doi.org/10.52842/conf.ecaade.2002.180
source Connecting the Real and the Virtual - design e-ducation [20th eCAADe Conference Proceedings / ISBN 0-9541183-0-8] Warsaw (Poland) 18-20 September 2002, pp. 180-185
summary This paper tells the strange tale of a glass chair. Creating a glass chair might seem a perverse – maybe impossible – enterprise. After all, chairs are normally held together by moment connections, such as those joining the legs to the seat. Glass is a notoriously bad material for forming moment connections; it is brittle, and quickly snaps if you subject it to bending. But there are advantages to such startling formulations of design problems. They force you to challenge conventional wisdom, to ignore standard prototypes, and to ask interesting new questions. How might you design a chair without moment connections? How might you do so without making the result impossibly heavy? How would you build it? And what interesting qualities might such a chair have? These were the questions investigated in the design project pursued jointly by students at an American and a Portuguese school, in collaboration with glass and molding fabricators. The students explored many possibilities, and in doing so learned a great deal about chairs and about the properties and potentials of glass. The final project is a particularly elegant outcome of their investigations. It is created from just two curved pieces of glass, which are held together by metal tie-rods. In the end, the finished glass chair looked just like the initial computer visualizations.
series eCAADe
email
last changed 2022/06/07 07:55

_id 349e
authors Durmisevic, Sanja
year 2002
title Perception Aspects in Underground Spaces using Intelligent Knowledge Modeling
source Delft University of Technology
summary The intensification, combination and transformation are main strategies for future spatial development of the Netherlands, which are stated in the Fifth Bill regarding Spatial Planning. These strategies indicate that in the future, space should be utilized in a more compact and more efficient way requiring, at the same time, re-evaluation of the existing built environment and finding ways to improve it. In this context, the concept of multiple space usage is accentuated, which would focus on intensive 4-dimensional spatial exploration. The underground space is acknowledged as an important part of multiple space usage. In the document 'Spatial Exploration 2000', the underground space is recognized by policy makers as an important new 'frontier' that could provide a significant contribution to future spatial requirements. In a relatively short period, the underground space became an important research area. Although among specialists there is appreciation of what underground space could provide for densely populated urban areas, there are still reserved feelings by the public, which mostly relate to the poor quality of these spaces. Many realized underground projects, namely subways, resulted in poor user satisfaction. Today, there is still a significant knowledge gap related to perception of underground space. There is also a lack of detailed documentation on actual applications of the theories, followed by research results and applied techniques. This is the case in different areas of architectural design, but for underground spaces it is perhaps most evident due to their infancy in general architectural practice. In order to create better designs, diverse aspects, which are very often of qualitative nature, should be considered in perspective with the final goal to improve quality and image of underground space.
In the architectural design process, one has to establish certain relations among design information in advance, to make the design backed by sound rationale. The main difficulty at this point is that such relationships may not be determined due to various reasons. One example may be the vagueness of the architectural design data due to linguistic qualities in them. Another may be vaguely defined design qualities. In this work, the problem was not only the initial fuzziness of the information but also the desired relevancy determination among all pieces of information given. Presently, to determine the existence of such relevancy is more or less a matter of architectural subjective judgement rather than systematic, non-subjective decision-making based on an existing design. This implies that the invocation of certain tools dealing with fuzzy information is essential for enhanced design decisions. Efficient methods and tools to deal with qualitative, soft data are scarce, especially in the architectural domain. Traditionally well established methods, such as statistical analysis, have been used mainly for data analysis focused on similar types to the present research. These methods mainly fall into a category of pattern recognition. Statistical regression methods are the most common approaches towards this goal. One essential drawback of this method is the inability of dealing efficiently with non-linear data. With statistical analysis, the linear relationships are established by regression analysis where dealing with non-linearity is mostly evaded. Concerning the presence of multi-dimensional data sets, it is evident that the assumption of linear relationships among all pieces of information would be a gross approximation, which one has no basis to assume. A starting point in this research was that there may be both linearity and non-linearity present in the data and therefore the appropriate methods should be used in order to deal with that non-linearity.
Therefore, some other commensurate methods were adopted for knowledge modeling. In that respect, soft computing techniques proved to match the quality of the multi-dimensional data-set subject to analysis, which is deemed to be 'soft'. There is yet another reason why soft-computing techniques were applied, which is related to the automation of knowledge modeling. In this respect, traditional models such as Decision Support Systems and Expert Systems have drawbacks. One important drawback is that the development of these systems is a time-consuming process. The programming part, in which various deliberations are required to form a consistent if-then rule knowledge based system, is also a time-consuming activity. For these reasons, the methods and tools from other disciplines, which also deal with soft data, should be integrated into architectural design. With fuzzy logic, the imprecision of data can be dealt with in a similar way to how humans do it. Artificial neural networks are deemed to some extent to model the human brain, and simulate its functions in the form of parallel information processing. They are considered important components of Artificial Intelligence (AI). With neural networks, it is possible to learn from examples, or more precisely to learn from input-output data samples. The combination of the neural and fuzzy approach proved to be a powerful combination for dealing with qualitative data. The problem of automated knowledge modeling is efficiently solved by employment of machine learning techniques. Here, the expertise of prof. dr. Ozer Ciftcioglu in the field of soft computing was crucial for tool development. By combining knowledge from two different disciplines a unique tool could be developed that would enable intelligent modeling of soft data needed for support of the building design process. In this respect, this research is a starting point in that direction.
It is multidisciplinary and on the cutting edge between the field of Architecture and the field of Artificial Intelligence. From the architectural viewpoint, the perception of space is considered through the relationship between a human being and a built environment. Techniques from the field of Artificial Intelligence are employed to model that relationship. Such an efficient combination of two disciplines makes it possible to extend our knowledge boundaries in the field of architecture and improve design quality. With additional techniques, meta-knowledge, or in other words "knowledge about knowledge", can be created. Such techniques involve sensitivity analysis, which determines the amount of dependency of the output of a model (comfort and public safety) on the information fed into the model (input). Another technique is functional relationship modeling between aspects, which is derivation of dependency of a design parameter as a function of user's perceptions. With this technique, it is possible to determine functional relationships between dependent and independent variables. This thesis is a contribution to a better understanding of users' perception of underground space, through the prism of public safety and comfort, which was achieved by means of intelligent knowledge modeling. In this respect, this thesis demonstrated an application of ICT (Information and Communication Technology) as a partner in the building design process by employing advanced modeling techniques. The method explained throughout this work is very generic and can be applied not only to different areas of architectural design, but also to other domains that involve qualitative data.
keywords Underground Space; Perception; Soft Computing
series thesis:PhD
email
last changed 2003/02/12 22:37
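The "meta-knowledge" technique the summary describes, sensitivity analysis, measures how much a model output (comfort, public safety) depends on each input. As a rough, hypothetical illustration only (a one-line toy model standing in for the thesis's actual neuro-fuzzy system, with names of my own choosing), a finite-difference version can be sketched as:

```python
def sensitivity(model, inputs, eps=1e-4):
    """Finite-difference estimate of how strongly the model's output
    depends on each input: |f(x + eps*e_i) - f(x)| / eps for each i."""
    base = model(inputs)
    sens = []
    for i in range(len(inputs)):
        bumped = list(inputs)   # perturb one input at a time
        bumped[i] += eps
        sens.append(abs(model(bumped) - base) / eps)
    return sens
```

For a trained neural network the same perturb-and-measure loop ranks which perceptual inputs drive the predicted comfort or safety score most strongly.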

_id ga0215
id ga0215
authors Kabala, Joanna
year 2002
title The Side Effect of a Generative Experiment
source International Conference on Generative Art
summary This paper discusses the claim in the call for the Generative Art 2002 conference that "GA is identifiable as one of the most advanced approaches in creative and design world." The value of Generative Art for the art, science and design worlds is described with reference to a generative experiment. The experiment was conducted in an industrial environment with the aim of defining possibilities for natural interaction between humans and machines; specifically, it examined an option for visual adaptation in accordance with user feedback. In the context of the experiment's outcome, the recognizability of Generative Art's values is discussed. Generative Art can be identified, but it is not widely recognized as "one of the most advanced approaches in creative and design world". What makes it difficult for designers to switch to generative thinking and immediately accept Generative Art as a possible way of advancing traditional design methods? And what makes it promising to keep searching for ways of applying Generative Art in contemporary design? Some possible answers, proposed in this paper, aim at contributing to the discussion about the changing role of artists and designers in contemporary society.
series other
email
more http://www.generativeart.com/
last changed 2003/08/07 17:25

_id c7e9
authors Maver, T.W.
year 2002
title Predicting the Past, Remembering the Future
source SIGraDi 2002 - [Proceedings of the 6th Iberoamerican Congress of Digital Graphics] Caracas (Venezuela) 27-29 november 2002, pp. 2-3
summary Charlas Magistrales 2
There never has been such an exciting moment in time in the extraordinary 30 year history of our subject area as NOW, when the philosophical, theoretical and practical issues of virtuality are taking centre stage.
The Past
There have, of course, been other defining moments during these exciting 30 years:
• the first algorithms for generating building layouts (circa 1965);
• the first use of computer graphics for building appraisal (circa 1966);
• the first integrated package for building performance appraisal (circa 1972);
• the first computer generated perspective drawings (circa 1973);
• the first robust drafting systems (circa 1975);
• the first dynamic energy models (circa 1982);
• the first photorealistic colour imaging (circa 1986);
• the first animations (circa 1988);
• the first multimedia systems (circa 1995); and
• the first convincing demonstrations of virtual reality (circa 1996).
Whereas the CAAD community has been hugely inventive in the development of ICT applications to building design, it has been woefully remiss in its attempts to evaluate the contribution of those developments to the quality of the built environment or to the efficiency of the design process. In the absence of any real evidence, one can only conjecture regarding the real benefits, which fall, it is suggested, under the following headings:
• Verisimilitude: The extraordinary quality of still and animated images of the formal qualities of the interiors and exteriors of individual buildings and of whole neighborhoods must surely give great comfort to practitioners and their clients that what is intended, formally, is what will be delivered, i.e. WYSIWYG - what you see is what you get.
• Sustainability: The power of «first-principle» models of the dynamic energetic behaviour of buildings in response to changing diurnal and seasonal conditions has the potential to save millions of dollars and dramatically to reduce the damaging environmental pollution created by badly designed and managed buildings.
• Productivity: CAD is now a multi-billion dollar business which offers design decision support systems which operate, effectively, across continents, time-zones, professions and companies.
• Communication: Multi-media technology - cheap to deliver but high in value - is changing the way in which we can explain and understand the past and envisage and anticipate the future; virtual past and virtual future!
Macromyopia
The late John Lansdown offered the view, in his wonderfully prophetic way, that ”...the future will be just like the past, only more so...” So what can we expect the extraordinary trajectory of our subject area to be? To have any chance of being accurate we have to have an understanding of the phenomenon of macromyopia: the phenomenon exhibited by society of greatly exaggerating the immediate short-term impact of new technologies (particularly the information technologies) but, more importantly, seriously underestimating their sustained long-term impacts - socially, economically and intellectually.
Examples of flawed predictions regarding the future application of information technologies include:
• The British Government in 1880 declined to support the idea of a national telephonic system, backed by the argument that there were sufficient small boys in the countryside to run with messages.
• Alexander Bell was modest enough to say: «I am not boasting or exaggerating but I believe, one day, there will be a telephone in every American city».
• Tom Watson, in 1943, said: «I think there is a world market for about 5 computers».
• In 1977, Ken Olsen of Digital said: «There is no reason for any individuals to have a computer in their home».
The Future
Just as the ascent of woman/man-kind can be attributed to her/his capacity to discover amplifiers of the modest human capability, so we shall discover how best to exploit our most important amplifier - that of the intellect. The more we know the more we can figure; the more we can figure the more we understand; the more we understand the more we can appraise; the more we can appraise the more we can decide; the more we can decide the more we can act; the more we can act the more we can shape; and the more we can shape, the better the chance that we can leave for future generations a truly sustainable built environment which is fit-for-purpose, cost-beneficial, environmentally friendly and culturally significant. Central to this aspiration will be our understanding of the relationship between real and virtual worlds and how to move effortlessly between them. We need to be able to design, from within the virtual world, environments which may be real or may remain virtual or, perhaps, be part real and part virtual. What is certain is that the next 30 years will be every bit as exciting and challenging as the first 30 years.
series SIGRADI
email
last changed 2016/03/10 09:55

_id 0f18
authors Bailey, Rohan
year 2001
title A Digital Design Coach for Young Designers
doi https://doi.org/10.52842/conf.acadia.2001.330
source Reinventing the Discourse - How Digital Tools Help Bridge and Transform Research, Education and Practice in Architecture [Proceedings of the Twenty First Annual Conference of the Association for Computer-Aided Design in Architecture / ISBN 1-880250-10-1] Buffalo (New York) 11-14 October 2001, pp. 330-335
summary The present use of digital media in architectural practice and education is primarily focused on representation, communication of ideas and production. Designers, however, still use pencil and paper to assist the early conception of ideas. Recently, research into providing digital tools for designers to use in conceptual designing has focused on enhancing or assisting the designer. Rarely has the computer been regarded as a potential teaching tool for design skills. Based on previous work by the author about visual thinking and the justification for a digital design assistant, the intention of this paper is to illustrate to the reader the feasibility of a digital design coach. Reference is made to recent advances in research about design computability. In particular, research by Mark Gross and Ellen Do with respect to their Electronic Cocktail Napkin project is used as a basis on which to determine what such a digital coach may look and feel like.
keywords Design Education, Protocol Analysis, CADD, Sketching
series ACADIA
email
last changed 2022/06/07 07:54

_id 536e
authors Bouman, Ole
year 1997
title RealSpace in QuickTimes: architecture and digitization
source Rotterdam: Nai Publishers
summary Time and space, drastically compressed by the computer, have become interchangeable. Time is compressed in that once everything has been reduced to 'bits' of information, it becomes simultaneously accessible. Space is compressed in that once everything has been reduced to 'bits' of information, it can be conveyed from A to B with the speed of light. As a result of digitization, everything is in the here and now. Before very long, the whole world will be on disk. Salvation is but a modem away. The digitization process is often seen in terms of (information) technology. That is to say, one hears a lot of talk about the digital media, about computer hardware, about the modem, mobile phone, dictaphone, remote control, buzzer, data glove and the cable or satellite links in between. Besides, our heads are spinning from the progress made in the field of software, in which multimedia applications, with their integration of text, image and sound, especially attract our attention. But digitization is not just a question of technology, it also involves a cultural reorganization. The question is not just what the cultural implications of digitization will be, but also why our culture should give rise to digitization in the first place. Culture is not simply a function of technology; the reverse is surely also true. Anyone who thinks about cultural implications, is interested in the effects of the computer. And indeed, those effects are overwhelming, providing enough material for endless speculation. The digital paradigm will entail a new image of humankind and a further dilution of the notion of social perfectibility; it will create new notions of time and space, a new concept of cause and effect and of hierarchy, a different sort of public sphere, a new view of matter, and so on. In the process it will indubitably alter our environment. 
Offices, shopping centres, dockyards, schools, hospitals, prisons, cultural institutions, even the private domain of the home: all the familiar design types will be up for review. Fascinated, we watch how the new wave accelerates the process of social change. The most popular sport nowadays is 'surfing' - because everyone is keen to display their grasp of dirty realism. But there is another way of looking at it: under what sort of circumstances is the process of digitization actually taking place? What conditions do we provide that enable technology to exert the influence it does? This is a perspective that leaves room for individual and collective responsibility. Technology is not some inevitable process sweeping history along in a dynamics of its own. Rather, it is the result of choices we ourselves make, and these choices can be debated in a way that is rarely done at present: digitization thanks to or in spite of human culture, that is the question. In addition to the distinction between culture as the cause or the effect of digitization, there are a number of other distinctions that are accentuated by the computer. The best known and most widely reported is the generation gap. It is certainly stretching things a bit to write off everybody over the age of 35, as sometimes happens, but there is no getting around the fact that for a large group of people digitization simply does not exist. Anyone who has been in the bit business for a few years can't help noticing that mum and dad are living in a different place altogether. (But they, at least, still have a sense of place!) In addition to this, it is gradually becoming clear that the age-old distinction between market and individual interests is still relevant in the digital era. On the one hand, the advance of cybernetics is determined by the laws of the marketplace which this capital-intensive industry must satisfy. Increased efficiency, labour productivity and cost-effectiveness play a leading role.
The consumer market is chiefly interested in what is 'marketable': info- and edutainment. On the other hand, an increasing number of people are not prepared to wait for what the market has to offer them. They set to work on their own, appropriate networks and software programs, create their own domains in cyberspace, domains that are free from the principle whereby the computer simply reproduces the old world, only faster and better. Here it is possible to create a different world, one that has never existed before. One, in which the Other finds a place. The computer works out a new paradigm for these creative spirits. In all these distinctions, architecture plays a key role. Owing to its many-sidedness, it excludes nothing and no one in advance. It is faced with the prospect of historic changes yet it has also created the preconditions for a digital culture. It is geared to the future, but has had plenty of experience with eternity. Owing to its status as the most expensive of arts, it is bound hand and foot to the laws of the marketplace. Yet it retains its capacity to provide scope for creativity and innovation, a margin of action that is free from standardization and regulation. The aim of RealSpace in QuickTimes is to show that the discipline of designing buildings, cities and landscapes is not only an exemplary illustration of the digital era but that it also provides scope for both collective and individual activity. It is not just architecture's charter that has been changed by the computer, but also its mandate. RealSpace in QuickTimes consists of an exhibition and an essay.
series other
email
last changed 2003/04/23 15:14

_id 48a7
authors Brooks
year 1999
title What's Real About Virtual Reality
source IEEE Computer Graphics and Applications, Vol. 19, no. 6, Nov/Dec, 27
summary As is usual with infant technologies, the realization of the early dreams for VR and harnessing it to real work has taken longer than the wild hype predicted, but it is now happening. I assess the current state of the art, addressing the perennial questions of technology and applications. By 1994, one could honestly say that VR "almost works." Many workers at many centers could do quite exciting demos. Nevertheless, the enabling technologies had limitations that seriously impeded building VR systems for any real work except entertainment and vehicle simulators. Some of the worst problems were end-to-end system latencies, low-resolution head-mounted displays, limited tracker range and accuracy, and costs. The technologies have made great strides. Today one can get satisfying VR experiences with commercial off-the-shelf equipment. Moreover, technical advances have been accompanied by dropping costs, so it is both technically and economically feasible to do significant applications. VR really works. That is not to say that all the technological problems and limitations have been solved. VR technology today "barely works." Nevertheless, coming over the mountain pass from "almost works" to "barely works" is a major transition for the discipline. I have sought out applications that are now in daily productive use, in order to find out exactly what is real. Separating these from prototype systems and feasibility demos is not always easy. People doing daily production applications have been forthcoming about lessons learned and surprises encountered. As one would expect, the initial production applications are those offering high value over alternate approaches. These applications fall into a few classes. I estimate that there are about a hundred installations in daily productive use worldwide.
series journal paper
email
last changed 2003/04/23 15:14

_id d60a
authors Casti, J.C.
year 1997
title Would-Be Worlds: How Simulation is Changing the Frontiers of Science
source John Wiley & Sons, Inc., New York.
summary In the ever-changing world of science, new instruments often lead to momentous discoveries that dramatically transform our understanding. Today, with the aid of a bold new instrument, scientists are embarking on a scientific revolution as profound as that inspired by Galileo's telescope. Out of the bits and bytes of computer memory, researchers are fashioning silicon surrogates of the real world – elaborate "artificial worlds" – that allow them to perform experiments that are too impractical, too costly, or, in some cases, too dangerous to do "in the flesh." 
From simulated tests of new drugs to models of the birth of planetary systems and galaxies to computerized petri dishes growing digital life forms, these laboratories of the future are the essential tools of a controversial new scientific method. This new method is founded not on direct observation and experiment but on the mapping of the universe from real space into cyberspace. There is a whole new science happening here – the science of simulation. The most exciting territory being mapped by artificial worlds is the exotic new frontier of "complex, adaptive systems." These systems involve living "agents" that continuously change their behavior in ways that make prediction and measurement by the old rules of science impossible – from environmental ecosystems to the system of a marketplace economy. Their exploration represents the horizon for discovery in the twenty-first century, and simulated worlds are charting the course. In Would-Be Worlds, acclaimed author John Casti takes readers on a fascinating excursion through a number of remarkable silicon microworlds and shows us how they are being used to formulate important new theories and to solve a host of practical problems. We visit Tierra, a "computerized terrarium" in which artificial life forms known as biomorphs grow and mutate, revealing new insights into natural selection and evolution. We play a game of Balance of Power, a simulation of the complex forces shaping geopolitics. And we take a drive through TRANSIMS, a model of the city of Albuquerque, New Mexico, to discover the root causes of events like traffic jams and accidents. Along the way, Casti probes the answers to a host of profound questions these "would-be worlds" raise about the new science of simulation. If we can create worlds inside our computers at will, how real can we say they are? Will they unlock the most intractable secrets of our universe? Or will they reveal instead only the laws of an alternate reality? 
How "real" do these models need to be? And how real can they be? The answers to these questions are likely to change the face of scientific research forever.
series other
last changed 2003/04/23 15:14

_id 9e26
authors Do, Ellen Yi-Luen
year 1999
title The right tool at the right time: investigation of freehand drawing as an interface to knowledge based design tools
source College of Architecture, Georgia Institute of Technology
summary Designers use different symbols and configurations in their drawings to explore alternatives and to communicate with each other. For example, when thinking about spatial arrangements, they draw bubble diagrams; when thinking about natural lighting, they draw a sun symbol and light rays. Given the connection between drawings and thinking, one should be able to infer design intentions from a drawing and ultimately use such inferences to program a computer to understand our drawings. This dissertation reports findings from empirical studies on drawings and explores the possibility of using the computer to automatically infer a designer's concerns from the drawings a designer makes. This dissertation consists of three parts: 1) a literature review of design studies, cognitive studies of drawing and computational sketch systems, and a set of pilot projects; 2) empirical studies of diagramming design intentions and a design drawing experiment; and 3) the implementation of a prototype system called Right-Tool-Right-Time. The main goal is to find out what is in design drawings that a computer program should be able to recognize and support. Experiments were conducted to study the relation between drawing conventions and the design tasks with which they are associated. It was found from the experiments that designers use certain symbols and configurations when thinking about certain design concerns. When thinking about allocating objects or spaces with required dimensions, designers wrote down numbers beside the drawing to reason about size and to calculate dimensions. When thinking about visual analysis, designers drew sight lines from a view point on a floor plan. Based on the recognition that it is possible to associate symbols and spatial arrangements in a drawing with a designer's intention, or task context, the second goal is to find out whether a computer can be programmed to recognize these drawing conventions. 
Given an inferred intention and context, a program should be able to activate appropriate design tools automatically. For example, concerns about visual analysis can activate a visual simulation program, and number calculations can activate a calculator. The Right-Tool-Right-Time prototype program demonstrates how a freehand sketching system that infers intentions would support the automatic activation of different design tools based on a designer's drawing acts.
series thesis:PhD
email
more http://www.arch.gatech.edu/~ellen/thesis.html
last changed 2004/10/04 07:49

_id ecaade2023_000
id ecaade2023_000
authors Dokonal, Wolfgang, Hirschberg, Urs and Wurzer, Gabriel
year 2023
title eCAADe 2023 Digital Design Reconsidered - Volume 1
doi https://doi.org/10.52842/conf.ecaade.2023.1.001
source Dokonal, W, Hirschberg, U and Wurzer, G (eds.), Digital Design Reconsidered - Proceedings of the 41st Conference on Education and Research in Computer Aided Architectural Design in Europe (eCAADe 2023) - Volume 1, Graz, 20-22 September 2023, 905 p.
summary The conference logo is a bird’s eye view of spiral stairs that join and separate – an homage to the famous double spiral staircase in Graz, a tourist attraction of this city and a must-see for any architecturally minded visitor. Carved out of limestone, the medieval construction of the original is a daring feat of masonry as well as a symbolic gesture. The design speaks of separation and reconciliation: The paths of two people that climb the double spiral stairs separate and then meet again at each platform. The relationship between architectural design and the growing digital repertoire of tools and possibilities seems to undergo similar cycles of attraction and rejection: enthusiasm about digital innovations – whether in Virtual Reality, Augmented Reality, Energy Design, Robotic Fabrication, the many Dimensions of BIM or, as right now, in AI and Machine Learning – is typically followed by a certain disillusionment and a realization that the promises were somewhat overblown. But a turn away from these digital innovations can only be temporary. In our call for papers we refer to the first and second ‘digital turns’, a term Mario Carpo coined. Yes, it’s a bit of a pun, but you could indeed see these digital turns in our logo as well. Carpo would probably agree that design and the digital have become inseparably intertwined. While they may be circling in different directions, an innovative rejoinder is always just around the corner. The theme of the conference asked participants to re-consider the relationship between Design and the Digital. The notion of a cycle is already present in the syllable “re”. Indeed, 20 years earlier, in 2003, we held an ECAADE conference in Graz simply under the title “Digital Design” and our re-using – or is it re-cycling? – the theme can be seen as the completion of one of those cycles described above: One level up, we meet again, we’ve come full circle. 
The question of the relationship between Design and the Digital is still in flux, still worthy of renewed consideration. There is a historical notion implicit in the theme. To reconsider something, one needs to take a step back, to look into the past as well as into the future. Indeed, at this conference we wanted to take a longer view, something not done often enough in the fast-paced world of digital technology. Carefully considering one’s past can be a source of inspiration. In fact, the double spiral stair that inspired our conference logo also inspired many architects through the ages. Konrad Wachsmann, for example, is said to have come up with his famous Grapevine assembly system based on this double spiral stair and its intricate joinery. More recently, Rem Koolhaas deemed the double spiral staircase in Graz important enough to include a detailed model of it in his “elements of architecture” exhibition at the Venice Biennale in 2014. Our interpretation of the stair is a typically digital one, you might say. First of all: it’s a rendering of a virtual model; it only exists inside a computer. Secondly, this virtual model isn’t true to the original. Instead, it does what the digital has made so easy to do: it exaggerates. Where the original has just two spiral stairs that separate and join, our model consists of countless stairs that are joined in this way. We see only a part of the model, but the stairs appear to continue in all directions. The implication is of an endless field of spiral stairs. As the 3D model was generated with a parametric script, it would be very easy to change all parameters of it – including the number of stairs that make it up. Everyone at this conference is familiar with the concept of parametric design: it makes generating models of seemingly endless amounts of connected spiral stairs really easy. 
Although, of course, if we’re too literal about the term ‘endless’, generating our stair model will eventually crash even the most advanced computers. We know that, too. – That's another truth about the Digital: it makes a promise of infinity, which, in the end, it can’t keep. And even if it could: what’s the point of just adding more of the same: more variations, more options, more possible ways to get lost? Doesn’t the original double spiral staircase contain all those derivatives already? Don’t we know that ‘more’ isn’t necessarily better? In the original double spiral stair the happy end is guaranteed: the lovers’ paths meet at the top as well as when they exit the building. Therefore, the stair is also colloquially known as the Busserlstiege (the kissing stair) or the Versöhnungsstiege (reconciliation stair). In our digitally enhanced version, this outcome is no longer clear: we can choose between multiple directions at each level and we risk losing sight of the one we were with. This is also emblematic of our field of research. eCAADe was founded to promote “good practice and sharing information in relation to the use of computers in research and education in architecture and related professions” (see ecaade.org). That may have seemed a straightforward proposition forty years ago, when the association was founded. A look at the breadth and depth of research topics presented and discussed at this conference (and as a consequence in this book, for which you’re reading the editorial) shows how the field has developed over these forty years. There are sessions on Digital Design Education, on Digital Fabrication, on Virtual Reality, on Virtual Heritage, on Generative Design and Machine Learning, on Digital Cities, on Simulation and Digital Twins, on BIM, on Sustainability, on Circular Design, on Design Theory and on Digital Design Experimentations. 
We hope you will find what you’re looking for in this book and at the conference – and maybe even more than that: surprising turns and happy encounters between Design and the Digital.
series eCAADe
email
last changed 2023/12/10 10:49

_id ecaade2023_001
id ecaade2023_001
authors Dokonal, Wolfgang, Hirschberg, Urs and Wurzer, Gabriel
year 2023
title eCAADe 2023 Digital Design Reconsidered - Volume 2
doi https://doi.org/10.52842/conf.ecaade.2023.2.001
source Dokonal, W, Hirschberg, U and Wurzer, G (eds.), Digital Design Reconsidered - Proceedings of the 41st Conference on Education and Research in Computer Aided Architectural Design in Europe (eCAADe 2023) - Volume 2, Graz, 20-22 September 2023, 899 p.
summary The conference logo is a bird’s eye view of spiral stairs that join and separate – an homage to the famous double spiral staircase in Graz, a tourist attraction of this city and a must-see for any architecturally minded visitor. Carved out of limestone, the medieval construction of the original is a daring feat of masonry as well as a symbolic gesture. The design speaks of separation and reconciliation: The paths of two people that climb the double spiral stairs separate and then meet again at each platform. The relationship between architectural design and the growing digital repertoire of tools and possibilities seems to undergo similar cycles of attraction and rejection: enthusiasm about digital innovations – whether in Virtual Reality, Augmented Reality, Energy Design, Robotic Fabrication, the many Dimensions of BIM or, as right now, in AI and Machine Learning – is typically followed by a certain disillusionment and a realization that the promises were somewhat overblown. But a turn away from these digital innovations can only be temporary. In our call for papers we refer to the first and second ‘digital turns’, a term Mario Carpo coined. Yes, it’s a bit of a pun, but you could indeed see these digital turns in our logo as well. Carpo would probably agree that design and the digital have become inseparably intertwined. While they may be circling in different directions, an innovative rejoinder is always just around the corner. The theme of the conference asked participants to re-consider the relationship between Design and the Digital. The notion of a cycle is already present in the syllable “re”. Indeed, 20 years earlier, in 2003, we held an ECAADE conference in Graz simply under the title “Digital Design” and our re-using – or is it re-cycling? – the theme can be seen as the completion of one of those cycles described above: One level up, we meet again, we’ve come full circle. 
The question of the relationship between Design and the Digital is still in flux, still worthy of renewed consideration. There is a historical notion implicit in the theme. To reconsider something, one needs to take a step back, to look into the past as well as into the future. Indeed, at this conference we wanted to take a longer view, something not done often enough in the fast-paced world of digital technology. Carefully considering one’s past can be a source of inspiration. In fact, the double spiral stair that inspired our conference logo also inspired many architects through the ages. Konrad Wachsmann, for example, is said to have come up with his famous Grapevine assembly system based on this double spiral stair and its intricate joinery. More recently, Rem Koolhaas deemed the double spiral staircase in Graz important enough to include a detailed model of it in his “elements of architecture” exhibition at the Venice Biennale in 2014. Our interpretation of the stair is a typically digital one, you might say. First of all: it’s a rendering of a virtual model; it only exists inside a computer. Secondly, this virtual model isn’t true to the original. Instead, it does what the digital has made so easy to do: it exaggerates. Where the original has just two spiral stairs that separate and join, our model consists of countless stairs that are joined in this way. We see only a part of the model, but the stairs appear to continue in all directions. The implication is of an endless field of spiral stairs. As the 3D model was generated with a parametric script, it would be very easy to change all parameters of it – including the number of stairs that make it up. Everyone at this conference is familiar with the concept of parametric design: it makes generating models of seemingly endless amounts of connected spiral stairs really easy. 
Although, of course, if we’re too literal about the term ‘endless’, generating our stair model will eventually crash even the most advanced computers. We know that, too. – That's another truth about the Digital: it makes a promise of infinity, which, in the end, it can’t keep. And even if it could: what’s the point of just adding more of the same: more variations, more options, more possible ways to get lost? Doesn’t the original double spiral staircase contain all those derivatives already? Don’t we know that ‘more’ isn’t necessarily better? In the original double spiral stair the happy end is guaranteed: the lovers’ paths meet at the top as well as when they exit the building. Therefore, the stair is also colloquially known as the Busserlstiege (the kissing stair) or the Versöhnungsstiege (reconciliation stair). In our digitally enhanced version, this outcome is no longer clear: we can choose between multiple directions at each level and we risk losing sight of the one we were with. This is also emblematic of our field of research. eCAADe was founded to promote “good practice and sharing information in relation to the use of computers in research and education in architecture and related professions” (see ecaade.org). That may have seemed a straightforward proposition forty years ago, when the association was founded. A look at the breadth and depth of research topics presented and discussed at this conference (and as a consequence in this book, for which you’re reading the editorial) shows how the field has developed over these forty years. There are sessions on Digital Design Education, on Digital Fabrication, on Virtual Reality, on Virtual Heritage, on Generative Design and Machine Learning, on Digital Cities, on Simulation and Digital Twins, on BIM, on Sustainability, on Circular Design, on Design Theory and on Digital Design Experimentations. 
We hope you will find what you’re looking for in this book and at the conference – and maybe even more than that: surprising turns and happy encounters between Design and the Digital.
series eCAADe
type normal paper
email
last changed 2024/08/29 08:36

_id 3905
authors Duffy, T.M. and Cunningham, D.J.
year 1996
title Constructivism: Implications for the design and delivery of instruction
source D.H. Jonassen (Ed.), Handbook of research for educational communications and technology, New York: Macmillan Library Reference USA
summary This will be a seminar that examines Constructivist theory as it applies to our thinking about instruction. Many folks think of constructivism as a method of instruction -- it is not. It is a framework for thinking about learning or what it means to come to know. As such, it is a framework for understanding (interpreting) any learning environment as well as a framework for designing instruction. The seminar will be organized around weekly readings. We will examine the alternative constructivist theories, e.g., socio-cultural constructivism and cognitive constructivism, and the pragmatism of Richard Rorty. However, rather than focusing on the differences between these frameworks, our emphasis will be on the implications of the broader, common framework for the design of instruction. Hence we will spend most of the semester discussing strategies for designing and delivering instruction, e.g., the work of Bransford, Collins, Pea, Jonassen, Spiro, Fosnot, Senge, and Schank. We will consider both business and schooling environments for learning -- there is significant work in both domains. There will be particular emphasis on the use of technology in instruction. We will look at the communication, information, and context-providing roles of technology as contrasted to the traditional approach of using technology to deliver instruction (to teach). We will also pay particular attention to problem-based learning as one instructional model. In PBL there is particular emphasis on the role of the facilitator as a learning coach (process orientation) as opposed to a content provider. There is also a particular emphasis on supporting the development of abductive reasoning skills so that the learner develops the ability to be an effective problem solver in the content domain. The major paper/project for the course will be the design of instruction to train individuals to be learning coaches in a problem-based learning or goal-based scenario learning environment. 
That is, how do you support teachers in adopting the role of learning coach (which, of course, requires us to understand what it means to be a learning coach)? Design teams will be formed, with all teams working on this same design problem. A comprehensive prototype of the learning environment is required, as well as a paper providing the theoretical framework and rationale for the design strategy. While not required, I would expect that computer technology will play a significant role in the design of your learning environment. With that in mind, let me note that it is not required that the prototype be delivered on the computer, i.e., I am not requiring programming skills but rather design skills, and so "storyboards" are all that is required.
series other
last changed 2003/04/23 15:14

_id acadia09_18
id acadia09_18
authors d’Estrée Sterk, Tristan
year 2009
title Introduction: Thoughts for Gen X-Speculating about the Rise of Continuous Measurement in Architecture
doi https://doi.org/10.52842/conf.acadia.2009.018
source ACADIA 09: reForm( ) - Building a Better Tomorrow [Proceedings of the 29th Annual Conference of the Association for Computer Aided Design in Architecture (ACADIA) ISBN 978-0-9842705-0-7] Chicago (Illinois) 22-25 October 2009, pp. 18-22
summary We are here, in Chicago, not to talk about what we know, but what we do not know. We are here to share ideas and to speculate about what the world might look like if it were challenged, rethought, and rebuilt. We are here to uncover, piece by piece, a sense of our own ambitions for an architecture influenced by today but motivated by tomorrow. We are all speculators and dreamers. We find places for dreaming in our work, our models, our essays, our lectures, our research, and our teaching. Through these activities we speculate on the architecture of tomorrow. Sometimes these speculations hold great promise, while at other times they do not – certainly much of what we do can be improved, refined, qualified, quantified, and genuinely benefit from being computed. This could be horrifying; it could set the scene for an engineered architecture if we do not adapt. But architecture is changing and responding to very fresh and different ways of thinking. As a movement, young architects are questioning their inheritance and establishing new values, new methods, and new forms of practice. We might best think of these young architects as the Generation X of architecture – a generation who shapes discourse through technological, social, and environmental lenses. From its smallest technical process to its highest level of thought, this conference represents the spirit of this movement.
keywords Introduction, Measurement, dynamic design
series ACADIA
type normal paper
email
last changed 2022/06/07 07:55

_id caadria2020_272
id caadria2020_272
authors Erhan, Halil, Abuzuraiq, Ahmed M., Zarei, Maryam, AlSalman, Osama, Woodbury, Robert and Dill, John
year 2020
title What do Design Data say About Your Model? - A Case Study on Reliability and Validity
doi https://doi.org/10.52842/conf.caadria.2020.1.557
source D. Holzer, W. Nakapan, A. Globa, I. Koh (eds.), RE: Anthropocene, Design in the Age of Humans - Proceedings of the 25th CAADRIA Conference - Volume 1, Chulalongkorn University, Bangkok, Thailand, 5-6 August 2020, pp. 557-567
summary Parametric modeling systems are widely used in architectural design. Their use for designing complex built environments raises important practical challenges when models are composed by multiple people with diverse interests, using mostly unverified computational modules. Through a case study, we investigate possible concerns identifiable from a real-world collaborative design setting and how such concerns can be revealed through interactive data visualizations of parametric models. We then present our approach for resolving these concerns using a design analytics workflow for examining their reliability and validity. We summarize the lessons learnt from the case study, such as the importance of an abundance of test cases, reproducible design instances, accessing and interacting with data during all phases of design, and seeking high cohesion and decoupling between design geometry and evaluation components. We suggest a systematic integration of design modeling and analytics for enhancing reliable design decision-making.
keywords Model Reliability; Model Validity; Parametric Modeling; Design Analytics; Design Visualization
series CAADRIA
email
last changed 2022/06/07 07:55

_id 7ce5
authors Gal, Shahaf
year 1992
title Computers and Design Activities: Their Mediating Role in Engineering Education
source Sociomedia, ed. Edward Barrett. MIT Press
summary Sociomedia: With all the new words used to describe electronic communication (multimedia, hypertext, cyberspace, etc.), do we need another one? Edward Barrett thinks we do; hence, he coins the term "sociomedia." It is meant to displace a computing economy in which technicity is hypostasized over sociality. Sociomedia, a compilation of twenty-five articles on the theory, design and practice of educational multimedia and hypermedia, attempts to re-value the communicational face of computing. Value, of course, is "ultimately a social construct." As such, it has everything to do with knowledge, power, education and technology. The projects discussed in this book represent the leading edge of electronic knowledge production in academia (not to mention major funding) and are determining the future of educational media. For these reasons, Sociomedia warrants close inspection. Barrett's introduction sets the tone. For him, designing computer media involves hardwiring a mechanism for the social construction of knowledge (1). He links computing to a process of social and communicative interactivity for constructing and disseminating knowledge. Through a mechanistic mapping of the university as hypercontext (a huge network that includes classrooms as well as services and offices), Barrett models intellectual work in such a way as to avoid "limiting definitions of human nature or human development." Education, then, can remain "where it should be--in the human domain (public and private) of sharing ideas and information through the medium of language." By leaving education in a virtual realm (where we can continue to disagree about its meaning and execution), it remains viral, mutating and contaminating in an intellectually healthy way. He concludes that his mechanistic model, by means of its reductionist approach, preserves value (7). This "value" is the social construction of knowledge. 
While I support the social orientation of Barrett's argument, discussions of value are related to power. I am not referring to the traditional teacher-student power structure that is supposedly dismantled through cooperative and constructivist learning strategies. The power to be reckoned with in the educational arena is foundational, that which (pre)determines value and the circulation of knowledge. "Since each of you reading this paragraph has a different perspective on the meaning of 'education' or 'learning,' and on the processes involved in 'getting an education,' think of the hybris in trying to capture education in a programmable function, in a displayable object, in a 'teaching machine'" (7). Actually, we must think about that hybris because it is, precisely, what informs teaching machines. Moreover, the basic epistemological premises that give rise to such productions are too often assumed. In the case of instructional design, the episteme of the cognitive sciences is often taken for granted. It is ironic that many of the "postmodernists" who support electronic hypertextuality seem to have missed Jacques Derrida's and Michel Foucault's "deconstructions" of the epistemology underpinning the cognitive sciences (if not of epistemology itself). Perhaps it is the glitz of the technology that blinds some users (qua developers) to the belief systems operating beneath the surface. Barrett is not guilty of reactionary thinking or politics; he is, in fact, quite in line with much American deconstructive and postmodern thinking. The problem arises in that he leaves open the definitions of "education," "learning" and "getting an education." One cannot engage in the production of new knowledge without orienting its design, production and dissemination, and without negotiating with others' orientations, especially where large-scale funding is involved. Notions of human nature and development are structural, even infrastructural, whatever the medium of the teaching machine. 
Although he addresses some dynamics of power, money and politics when he talks about the recession and its effects on the conference, they are readily visible dynamics of power (3-4). Where does the critical factor of value determination, of power, of who gets what and why, get mapped onto a mechanistic model of learning institutions? Perhaps a mapping of contributors' institutions, of the funding sources for the projects showcased and for participation in the conference, and of the disciplines receiving funding for these sorts of projects would help visualize the configurations of power operative in the rising field of educational multimedia. Questions of power and money notwithstanding, Barrett's introduction sets the social and textual thematics for the collection of essays. His stress on interactivity, on communal knowledge production, on the society of texts, and on media producers and users is carried forward through the other essays, two of which I will discuss. Section I of the book, "Perspectives...," highlights the foundations, uses and possible consequences of multimedia and hypertextuality. The second essay in this section, "Is There a Class in This Text?," plays on the robust exchange surrounding Stanley Fish's book, Is There a Text in This Class?, which presents an attack on authority in reading. The author, John Slatin, has introduced electronic hypertextuality and interaction into his courses. His article maps the transformations in "the content and nature of work, and the workplace itself"--which, in this case, is not industry but an English poetry class (25). Slatin discovered an increase of productive and cooperative learning in his electronically-mediated classroom. For him, creating knowledge in the electronic classroom involves interaction between students, instructors and course materials through the medium of interactive written discourse. 
These interactions lead to a new and persistent understanding of the course materials and of the participants' relation to the materials and to one another. The work of the course is to build relationships that, in my view, constitute not only the meaning of individual poems, but poetry itself. The class carries out its work in the continual and usually interactive production of text (31). While I applaud his strategies which dismantle traditional hierarchical structures in academia, the evidence does not convince me that the students know enough to ask important questions or to form a self-directing, learning community. Stanley Fish has not relinquished professing, though he, too, espouses the indeterminacy of the sign. By the fourth week of his course, Slatin's input is, by his own reckoning, reduced to 4% (39). In the transcript of the "controversial" Week 6 exchange on Gertrude Stein--the most disliked poet they were discussing at the time (40)--we see the blind leading the blind. One student parodies Stein for three lines and sums up his input with "I like it." Another finds Stein's poetry "almost completey [sic] lacking in emotion or any artistic merit" (emphasis added). On what grounds has this student become an arbiter of "artistic merit"? Another student, after admitting being "lost" during the Wallace Stevens discussion, talks of having more "respect for Stevens' work than Stein's" and adds that Stein's poetry lacks "conceptual significance[, s]omething which people of varied opinion can intelligently discuss without feeling like total dimwits...." This student has progressed from admitted incomprehension of Stevens' work to imposing her (groundless) respect for his work over Stein's. Then, she exposes her real dislike for Stein's poetry: that she (the student) missed the "conceptual significance" and hence cannot, being a person "of varied opinion," intelligently discuss it "without feeling like [a] total dimwit." 
Slatin's comment is frightening: "...by this point in the semester students have come to feel increasingly free to challenge the instructor" (41). The students that I have cited are neither thinking critically nor are their preconceptions challenged by student-governed interaction. Thanks to the class format, one student feels self-righteous in her ignorance, and empowered to censure. I believe strongly in student empowerment in the classroom, but only once students have accrued enough knowledge to make informed judgments. Admittedly, Slatin's essay presents only partial data (there are six hundred pages of course transcripts!); still, I wonder how much valuable knowledge and metaknowledge was gained by the students. I also question the extent to which authority and professorial dictatorship were addressed in this course format. The power structures that make it possible for a college to require such a course, and the choice of texts and pedagogy, were not "on the table." The traditional professorial position may have been displaced, but what took its place?--the authority of consensus with its unidentifiable strong arm, and the faceless reign of software design? Despite Slatin's claim that the students learned about the learning process, there is no evidence (in the article) that the students considered where their attitudes came from, how consensus operates in the construction of knowledge, how power is established and what relationship they have to bureaucratic institutions. How do we, as teaching professionals, negotiate a balance between an enlightened despotism in education and student-created knowledge? Slatin, and other authors in this book, bring this fundamental question to the fore. There is no definitive answer because the factors involved are ultimately social, and hence, always shifting and reconfiguring. 
Slatin ends his article with the caveat that computerization can bring about greater estrangement between students, faculty and administration through greater regimentation and control. Of course, it can also "distribute authority and power more widely" (50). Power or authority without a specific face, however, is not necessarily good or just. Shahaf Gal's "Computers and Design Activities: Their Mediating Role in Engineering Education" is found in the second half of the volume, and does not allow for a theory/praxis dichotomy. Gal recounts a brief history of engineering education up to the introduction of Growltiger (GT), a computer-assisted learning aid for design. He demonstrates GT's potential to impact the learning of engineering design by tracking its use by four students in a bridge-building contest. What his text demonstrates clearly is that computers are "inscribing and imaging devices" that add another viewpoint to an on-going dialogue between student, teacher, earlier coursework, and other teaching/learning tools. The less proficient students made a serious error by relying too heavily on the technology, or treating it as a "blueprint provider." They "interacted with GT in a way that trusted the data to represent reality. They did not see their interaction with GT as a negotiation between two knowledge systems" (495). Students who were more thoroughly informed in engineering discourses knew to use the technology as one voice among others--they knew enough not simply to accept the input of the computer as authoritative. The less-advanced students learned a valuable lesson from the competition itself: the fact that their designs were not able to hold up under pressure (literally) brought the fact of their insufficient knowledge crashing down on them (and their bridges). They also had, post factum, several other designs to study, especially the winning one. 
Although competition and comparison are not good pedagogical strategies for everyone (in this case the competitors had volunteered), at some point what we think we know has to be challenged within the society of discourses to which it belongs. Students need critique in order to learn to push their learning into auto-critique. This is what is lacking in Slatin's discussion and in the writings of other avatars of constructivist, collaborative and computer-mediated pedagogies. Obviously there are differences between instrumental types of knowledge acquisition and discursive knowledge accumulation. Indeed, I do not promote the teaching of reading, thinking and writing as "skills" per se (then again, Gal's teaching of design is quite discursive, if not dialogic). Nevertheless, the "soft" sciences might benefit from "bridge-building" competitions or the re-institution of some forms of agonia. Not everything agonistic is inhuman agony--consider the joy of confronting or creating a sound argument supported by defensible evidence. Students need to know that soundbites are not sound arguments, despite predictions that electronic writing will be aphoristic rather than periodic. Just because writing and learning can be conceived of hypertextually does not mean that rigor goes the way of the dinosaur. Rigor and hypertextuality are not incompatible. Nor are rigorous thinking and hard intellectual work unpleasurable, although American anti-intellectualism, especially in the mass media, would make them so. At a time when the spurious dogmatics of a Rush Limbaugh and Holocaust revisionist historians circulate "aphoristically" in cyberspace, and at a time when knowledge is becoming increasingly textualized, the role of critical thinking in education will ultimately determine the value(s) of socially constructed knowledge. 
This volume affords the reader an opportunity to reconsider knowledge, power, and new communications technologies with respect to social dynamics and power relationships.
series other
last changed 2003/04/23 15:14

_id c4a6
authors Haapasalo, Harri
year 1997
title The Role of CAD In Creative Architectural Sketching
doi https://doi.org/10.52842/conf.ecaade.1997.x.o2b
source Challenges of the Future [15th eCAADe Conference Proceedings / ISBN 0-9523687-3-0] Vienna (Austria) 17-20 September 1997
summary The history of computers in architectural design is very short, only a few decades, when compared to the development of methods in practical design (Gero 1983). However, the development of user interfaces has been very fast. According to practical observations of over one hundred architects, user interfaces are at present inflexible in sketching, although computers can make drafts and the creation of alternatives quicker and more effective in the final stages of designing (Haapasalo 1997). Based on our research in the field of practical design, we wish to stimulate a wider debate about the theory of design. More profound perusal compels us to examine human modes, pre-eminently different levels of thinking and manners of inference. What is the meaning of subconscious and conscious thinking in design? What is the role of intuition in practical design? Do computer aided design programs apply to creative architectural sketching? To answer such questions, distinct, profound and broad understanding from different disciplines is required. Even then, in spite of such specialist knowledge, we cannot hope to answer such questions unambiguously and definitively.
keywords Creativity, Design Process, Architectural Design, Sketching, Computer Aided Design
series eCAADe
email
more http://info.tuwien.ac.at/ecaade/proc/haapas/haapas.htm
last changed 2022/06/07 07:50
