Search Results

Hits 1 to 20 of 33

_id 9c41
authors Ahmad Rafi, M.E., Chee, W.K., Mai, N., Ken, T.-K.N. and Sharifah Nur, A.S.A. (Eds.)
year 2002
title CAADRIA 2002 [Conference Proceedings]
source Proceedings of the 7th International Conference on Computer Aided Architectural Design Research in Asia / ISBN 983-2473-42-X / Cyberjaya (Malaysia) 18–20 April 2002, 370 p.
summary Evolution of trends in the realm of computer aided architectural design (CAAD) has seen the convergence of technologies – complementing traditional tools with emerging sciences like Information Technology (IT) and multimedia applications. This appliqué of technologies has not only expanded the scope and enhanced the realm of CAAD research and practice but is also breaking new frontiers. This creative nexus will be realised at the 7th International Conference on Computer Aided Architectural Design Research In Asia (CAADRIA 2002), to be held at the Faculty of Creative Multimedia, Multimedia University, Malaysia, between 18 and 20 April 2002. CAADRIA 2002's theme, "Redefining Content", seeks to recognise and infuse these emerging components in the field of architecture and design with a holistic approach towards online, digital and interactive systems. The 41 papers compiled were selected through a blind review process conducted by an international review panel. To reflect the multi-disciplinary nature of this year's conference, the chapters are arranged topically to facilitate the in-depth study of key components. The component sessions include: // Web Design, Database and Networks // CAD, Modelling and Tools // Collaborative Design, Creative Design and Case Reasoning // Simulation and Prototyping // Virtual Environment and Knowledge Management // Design Education, Teaching and Learning /// We believe that this specialised approach will provide a deeper and more illuminating feel for the various components and their critical convergence in the field of architecture and design.
series CAADRIA
email ahmadrafi.eshaq@mmu.edu.my
more www.caadria.org
last changed 2002/04/25 17:26

_id avocaad_2001_05
id avocaad_2001_05
authors Alexander Koutamanis
year 2001
title Analysis and the descriptive approach
source AVOCAAD - ADDED VALUE OF COMPUTER AIDED ARCHITECTURAL DESIGN, Nys Koenraad, Provoost Tom, Verbeke Johan, Verleye Johan (Eds.), (2001) Hogeschool voor Wetenschap en Kunst - Departement Architectuur Sint-Lucas, Campus Brussel, ISBN 80-76101-05-1
summary The rise of consciousness concerning the quality of working and living conditions has been a permanent though frequently underplayed theme in architecture and building since the reconstruction period. It has led to an explosive growth of programmatic requirements on building behaviour and performance, thus also stimulating the development of design analysis. The first stage of development was characterized by the evolution of prescriptive systems. These reversed the structure of pre-existing proscriptive systems into sequences of known steps that should be taken in order to achieve adequate results. Prescriptive systems complemented rather than replaced proscriptive ones, thereby creating an uncertain mixture of orthodoxy and orthopraxy that failed to provide design guidance for improving design performance and quality. The second stage in the development of design analysis focuses on descriptive methods and techniques for analyzing and supporting evaluation. Technologies such as simulation and scientific visualization are employed so as to produce detailed, accurate and reliable projections of building behaviour and performance. These projections can be correlated into a comprehensive and coherent description of a building using representations of form as information carriers. In these representations feedback and interaction assume a visual character that fits both design attitudes and lay perception of the built environment, but on the basis of a quantitative background that justifies, verifies and refines design actions. Descriptive analysis is currently the most promising direction for confronting and resolving design complexity. It provides the designer with useful insights into the causes and effects of various design problems but frequently comes short of providing clear design guidance, for two main reasons: (1) it adds substantial amounts of information to the already unmanageable loads the designer must handle, and (2) it may provide incoherent cues for the further development of a design. Consequently the descriptive approach to analysis is always in danger of being supplanted by abstract decision making. One way of providing the desired design guidance is to complement the connection of descriptive analyses to representations of form (and from there to synthesis) with two interface components. The first is a memory component, implemented as case-bases of precedent designs. These designs encapsulate integrated design information that can be matched to the design at hand in terms of form, function and performance. Comparison between precedents with a known performance and a new design facilitates identification of design aspects that need to be improved, as well as of wider formal and functional consequences. (A minimal sketch of such precedent matching follows this record.) The second component is an adaptive generative system capable of guiding exploration of these aspects, both in the precedents and the new design. The aim of this system is to provide feedback from analysis to synthesis. By exploring the scope of the analysis and the applicability of the conclusions to more designs, the designer generates a coherent and consistent collection of partial solutions that explore a relevant solution space. Development of the first component, the design case-bases, is no trivial task. Transformability in the representation of cases and flexible classification in a database are critical to the identification and treatment of a design aspect.
Nevertheless, the state of the art in case-based reasoning and the extensive corpus of analysed designs provide the essential building blocks. The second component, the adaptive generative system, poses more questions. Existing generative techniques do not possess the necessary richness or multidimensionality. Moreover, it is imperative that the designer play a more active role in the control of the process than merely tweaking local variables. At the same time, the system should prevent redesigning from degenerating into a blind trial-and-error enumeration of possibilities. Guided empirical design research arguably provides the means for the evolutionary development of the second component.
series AVOCAAD
email a.koutamanis@bk.tudelft.nl
last changed 2005/09/09 08:48
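Editor's sketch of the precedent-matching idea flagged in the summary above; all feature names and values are hypothetical assumptions, not taken from the paper. Precedent cases pair a feature vector describing form and function with a known performance, and the cases closest to the design at hand are retrieved for comparison.

    from math import sqrt

    # Hypothetical precedent case-base: feature vectors with known performance.
    # In practice features would be normalised so no single one dominates.
    CASES = [
        ({"floor_area": 120.0, "daylight_factor": 2.1, "circulation_ratio": 0.18}, "adequate"),
        ({"floor_area": 340.0, "daylight_factor": 4.3, "circulation_ratio": 0.25}, "good"),
        ({"floor_area": 95.0, "daylight_factor": 1.2, "circulation_ratio": 0.31}, "poor"),
    ]

    def distance(a, b):
        # Euclidean distance over the shared feature keys.
        return sqrt(sum((a[k] - b[k]) ** 2 for k in a))

    def closest_precedents(design, cases, k=2):
        # Rank precedents by similarity to the design at hand; the k nearest
        # supply known performance against which the new design is compared.
        return sorted(cases, key=lambda case: distance(design, case[0]))[:k]

    new_design = {"floor_area": 110.0, "daylight_factor": 1.5, "circulation_ratio": 0.22}
    for features, performance in closest_precedents(new_design, CASES):
        print(features, "->", performance)

This retrieval step is only the memory component; in the architecture the summary describes, the matched precedents would then feed the adaptive generative component.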

_id 2ec8
authors Arditi, Aries and Gillman, Arthur E.
year 1986
title Computing for the Blind User
source BYTE Publication Inc. March, 1986. pp. 199-208. includes some reference notes
summary In this article the authors present some of the human-factors issues specific to non-visual personal computing. Their concern is with the accuracy, speed, and generality of the blind-user interface, so as to make computers more accessible and efficient for blind and visually impaired persons.
keywords user interface, disabilities
series CADline
last changed 2003/06/02 11:58

_id ecaade2015_27
id ecaade2015_27
authors Asanowicz, Aleksander
year 2015
title Museum 2.0 - Implementation of 3D Digital Tools
source Martens, B, Wurzer, G, Grasl, T, Lorenz, WE and Schaffranek, R (eds.), Real Time - Proceedings of the 33rd eCAADe Conference - Volume 1, Vienna University of Technology, Vienna, Austria, 16-18 September 2015, pp. 709-715
summary The aim of this work is to set out how new technologies can influence the perception of a museum exposition. The problem analysed is how to adapt an exhibition to the needs of visually impaired people. It is considered on the basis of case studies which were part of an agreement between the Army Museum in Bialystok and our Faculty. In traditional museums the main principle is the prohibition of touching exhibits. The project goal was to help blind people understand the features of the environment around them through the sense of touch. The novelty of this work is the study of how new digital technologies may improve perception for the visually impaired. In the paper the methods of 3D scanning, modelling and 3D printing will be presented. In conclusion the encountered problems and plans for further action will be discussed.
wos WOS:000372317300077
series eCAADe
email asanowicz@gmail.com
more https://mh-engage.ltcc.tuwien.ac.at/engage/ui/watch.html?id=8e079058-702a-11e5-8ac3-d3d5c9e6f5fe
last changed 2016/05/16 09:08

_id 4805
authors Bentley, P.
year 1999
title Evolutionary Design by Computers
source Morgan Kaufmann, San Francisco, CA
summary Computers can only do what we tell them to do. They are our blind, unconscious digital slaves, bound to us by the unbreakable chains of our programs. These programs instruct computers what to do, when to do it, and how it should be done. But what happens when we loosen these chains? What happens when we tell a computer to use a process that we do not fully understand, in order to achieve something we do not fully understand? What happens when we tell a computer to evolve designs? As this book will show, what happens is that the computer gains almost human-like qualities of autonomy, innovative flair, and even creativity. These 'skills' which evolution so mysteriously endows upon our computers open up a whole new way of using computers in design. Today our former 'glorified typewriters' or 'overcomplicated drawing boards' can do everything from generating new ideas and concepts in design, to improving the performance of designs well beyond the abilities of even the most skilled human designer. Evolving designs on computers now enables us to employ computers in every stage of the design process. This is no longer computer aided design - this is becoming computer design. The pages of this book testify to the ability of today's evolutionary computer techniques in design. Flick through them and you will see designs of satellite booms, load cells, flywheels, computer networks, artistic images, sculptures, virtual creatures, house and hospital architectural plans, bridges, cranes, analogue circuits and even coffee tables. Out of all of the designs in the world, the collection you see in this book has a unique history: they were all evolved by computer, not designed by humans.
series other
last changed 2003/04/23 13:14

_id cf2013_159
id cf2013_159
authors Celani, Gabriela; Vilson Zattera, Marcelo Fernandes de Oliveira, and Jorge Vicente Lopes da Silva
year 2013
title “Seeing” with the Hands: Teaching Architecture for the Visually-Impaired with Digitally-Fabricated Scale Models
source Global Design and Local Materialization [Proceedings of the 15th International Conference on Computer Aided Architectural Design Futures / ISBN 978-3-642-38973-3] Shanghai, China, July 3-5, 2013, pp. 159-166.
summary Accessibility of information for the visually-impaired has greatly benefited from information and communication technologies (ICTs) in the past decades. However, the interpretation of images by the blind still represents a challenge. Bidimensional representations can be understood by those who have seen at some time in their lives, but they are too abstract for those with congenital blindness, for whom three-dimensional representations are more effective, especially during the conceptualization phase, when children are still forming mental images of the world. Ideally, educators who work with the visually-impaired should be able to produce custom 3D models as they are needed for the explanation of concepts. This paper presents an ongoing project that aims at developing a protocol for making 3D technologies technically and economically available to them.
keywords Tactile models, rapid prototyping, architectural concepts
series CAAD Futures
email celani@fec.unicamp.br
last changed 2014/03/24 06:08

_id caadria2003_0
id caadria2003_0
authors Choutgrajank, A., Charoensilp, E., Keatruangkamala, K. and Nakapan, W. (eds.)
year 2003
title CAADRIA 2003
source Proceedings of the 8th International Conference on Computer Aided Architectural Design Research in Asia / ISBN 974-9584-13-9 / Bangkok (Thailand) 18-20 October 2003, 370 p.
summary The proceedings of the Eighth International Conference on Computer-Aided Architectural Design Research in Asia present 70 papers. These papers were selected from the 180 submissions through a blind review by a 46-member international review committee. Each submission was reviewed by three reviewers and final acceptance was based on the reviewers' recommendations. Introduced in these proceedings are the papers presented at the Conference under the following session headings: - Collaborative Design - Knowledge Representation - Design Education - Virtual Environment and Computer Media - Information Systems - Simulations
series CAADRIA
type normal paper
email araya@rangsit.rsu.ac.th
more www.caadria.org
last changed 2007/07/23 05:33

_id cf2015_434
id cf2015_434
authors Dalla Vecchia, Luisa Félix; da Silva, Adriane Borda; Pires, Janice; Veiga, Mônica; Vasconselos, Tássia and Borges, Letícia
year 2015
title Tactile models of elements of architectural heritage: from the building scale to the detail
source The next city - New technologies and the future of the built environment [16th International Conference CAAD Futures 2015. Sao Paulo, July 8-10, 2015. Electronic Proceedings/ ISBN 978-85-85783-53-2] Sao Paulo, Brazil, July 8-10, 2015, pp. 434-446.
summary This paper describes the development of three-dimensional models, produced using digital fabrication techniques with the goal of providing a haptic experience of architectural heritage. These models were produced in three different representations: the building as a whole, elements and details. This study first undertakes a process of analysis and formal decomposition of architectural components to identify basic or simplified elements which make it easier to understand the represented object through touch. The results obtained come from assessment tests of the tactile models as experienced mainly by blind individuals. Secondly, as part of this process, a method of constructing such models is defined. This study facilitates a greater understanding of the relationship between the represented objects (historic buildings) and the tactile models, and provides a technological and discursive basis for future implementation of tactile models in a specific context.
keywords tactile models, architectural heritage, digital fabrication, haptic experience.
series CAAD Futures
email luisafelixd@gmail.com
last changed 2015/06/29 05:55

_id 7e01
authors Mark, Earl
year 2000
title A Prospectus on Computers Throughout the Design Curriculum
source Promise and Reality: State of the Art versus State of Practice in Computing for the Design and Planning Process [18th eCAADe Conference Proceedings / ISBN 0-9523687-6-5] Weimar (Germany) 22-24 June 2000, pp. 77-83
summary Computer aided architectural design has spread throughout architecture schools in the United States as if sown upon the wind. Yet, the proliferation alone may not be a good measure of the computer's impact on the curriculum or signify the true emergence of a digital design culture. The aura of a relatively new technology may blind us to its actual place in the continuum of design education. The promise of the technology is to completely revolutionize design; however, the reality of change is perhaps rooted in an underlying connection to core design methods. This paper considers a transitional phase within a School reviewing its entire curriculum. Lessons may be found in the Bauhaus educational program at the beginning of the 20th century and its response to the changing shape of society and industry.
keywords Pedagogy, Computer Based Visualization, Spatial and Data Analysis Methods, Interdisciplinary Computer Based Models
series eCAADe
email ejmark@virginia.edu
more http://www.uni-weimar.de/ecaade/
last changed 2003/05/29 04:45

_id caadria2008_34_session4a_278
id caadria2008_34_session4a_278
authors Fischer, Thomas
year 2008
title Obstructed Magic: On the Myths of Observing Designing and of Sharing Design Observations
source CAADRIA 2008 [Proceedings of the 13th International Conference on Computer Aided Architectural Design Research in Asia] Chiang Mai (Thailand) 9-12 April 2008, pp. 278-284
summary Much design research, including much research in the computer-aided architectural design field, is based on the assumption that the process of designing is observable and that what happens in designing can be known, explicitly described and shared. In this paper I examine this assumption from my subjective viewpoint and conclude that designing occurs behind a blind spot. It can be concluded that existing design process models used in the “science of design” are based on invention rather than on empirical evidence, which in turn suggests that science should be studied as a form of design instead of studying designing scientifically.
keywords Design research, observation, design process, language
series CAADRIA
email sdtom@polyu.edu.hk
last changed 2012/05/30 19:29

_id 2068
authors Frazer, John
year 1995
title AN EVOLUTIONARY ARCHITECTURE
source London: Architectural Association
summary In "An Evolutionary Architecture", John Frazer presents an overview of his work for the past 30 years. Attempting to develop a theoretical basis for architecture using analogies with nature's processes of evolution and morphogenesis. Frazer's vision of the future of architecture is to construct organic buildings. Thermodynamically open systems which are more environmentally aware and sustainable physically, sociologically and economically. The range of topics which Frazer discusses is a good illustration of the breadth and depth of the evolutionary design problem. Environmental Modelling One of the first topics dealt with is the importance of environmental modelling within the design process. Frazer shows how environmental modelling is often misused or misinterpreted by architects with particular reference to solar modelling. From the discussion given it would seem that simplifications of the environmental models is the prime culprit resulting in misinterpretation and misuse. The simplifications are understandable given the amount of information needed for accurate modelling. By simplifying the model of the environmental conditions the architect is able to make informed judgments within reasonable amounts of time and effort. Unfortunately the simplications result in errors which compound and cause the resulting structures to fall short of their anticipated performance. Frazer obviously believes that the computer can be a great aid in the harnessing of environmental modelling data, providing that the same simplifying assumptions are not made and that better models and interfaces are possible. Physical Modelling Physical modelling has played an important role in Frazer's research. Leading to the construction of several novel machine readable interactive models, ranging from lego-like building blocks to beermat cellular automata and wall partitioning systems. Ultimately this line of research has led to the Universal Constructor and the Universal Interactor. The Universal Constructor The Universal Constructor features on the cover of the book. It consists of a base plug-board, called the "landscape", on top of which "smart" blocks, or cells, can be stacked vertically. The cells are individually identified and can communicate with neighbours above and below. Cells communicate with users through a bank of LEDs displaying the current state of the cell. The whole structure is machine readable and so can be interpreted by a computer. The computer can interpret the states of the cells as either colour or geometrical transformations allowing a wide range of possible interpretations. The user interacts with the computer display through direct manipulation of the cells. The computer can communicate and even direct the actions of the user through feedback with the cells to display various states. The direct manipulation of the cells encourages experimentation by the user and demonstrates basic concepts of the system. The Universal Interactor The Universal Interactor is a whole series of experimental projects investigating novel input and output devices. All of the devices speak a common binary language and so can communicate through a mediating central hub. The result is that input, from say a body-suit, can be used to drive the out of a sound system or vice versa. 
The Universal Interactor opens up many possibilities for expression when using a CAD system that may at first seem very strange. However, some of these feedback systems may prove superior in the hands of skilled technicians to more standard devices. Imagine how a musician might be able to devise structures by playing melodies which express their character. Of course the interpretation of input in this form poses a difficult problem which will take a great deal of research to solve. The Universal Interactor has been used to provide environmental feedback to affect the development of evolving genetic codes. The feedback given by the Universal Interactor has been used to guide selection of individuals from a population. Adaptive Computing: Frazer completes his introduction to the range of tools used in his research by giving a brief tour of adaptive computing techniques, covering topics including cellular automata, genetic algorithms, classifier systems and artificial evolution. Cellular Automata: As previously mentioned, Frazer has done some work using cellular automata in both physical and simulated environments. Frazer discusses how surprisingly complex behaviour can result from the simple local rules executed by cellular automata. Cellular automata are also capable of computation, in fact able to perform any computation possible by a finite state machine. Note that this does not mean that cellular automata are capable of any general computation, as this would require the construction of a Turing machine, which is beyond the capabilities of a finite state machine. Genetic Algorithms: Genetic algorithms were first presented by Holland and have since become an important tool for many researchers in various areas, having originally been developed for problem-solving and optimization problems with clearly stated criteria and goals. Frazer fails to mention one of the most important differences between genetic algorithms and other adaptive problem-solving techniques, such as neural networks: genetic algorithms have the advantage that criteria can be clearly stated and controlled within the fitness function. The learning by example which neural networks rely upon does not afford this level of control over what is to be learned. (A minimal sketch of such a fitness-driven loop follows this record.) Classifier Systems: Holland went on to develop genetic algorithms into classifier systems. Classifier systems are more focussed upon the problem of learning appropriate responses to stimuli than searching for solutions to problems. Classifier systems receive information from the environment and respond according to rules, or classifiers. Successful classifiers are rewarded, creating a reinforcement learning environment. Obviously, the mapping between classifier systems and the cybernetic view of organisms sensing, processing and responding to environmental stimuli is strong. It would seem that a central process similar to a classifier system would be appropriate at the core of an organic building, learning appropriate responses to environmental conditions over time. Artificial Evolution: Artificial evolution traces its roots back to the Biomorph program which was described by Dawkins in his book "The Blind Watchmaker". Essentially, artificial evolution requires that a user supplements the standard fitness function in genetic algorithms to guide evolution. The user may provide selection pressures which are unquantifiable in a stated problem and thus provide a means for dealing with ill-defined criteria.
Frazer notes that solving problems with ill-defined criteria using artificial evolution seriously limits the scope of problems that can be tackled. The reliance upon user interaction in artificial evolution reduces the practical size of populations and the duration of evolutionary runs. Coding Schemes: Frazer goes on to discuss the encoding of architectural designs and their subsequent evolution, introducing two major systems, the Reptile system and the Universal State Space Modeller. Blueprint vs. Recipe: Frazer points out the inadequacies of using standard "blueprint" design techniques in developing organic structures. Using a "recipe" to describe the process of constructing a building is presented as an alternative. Recipes for construction are discussed with reference to the analogous process description given by DNA to construct an organism. The Reptile System: The Reptile System is an ingenious construction set capable of producing a wide range of structures using just two simple components. Frazer saw the advantages of this system for rule-based and evolutionary systems in the compactness of its structure descriptions. Compactness was essential for the early computational work, when computer memory and storage space were scarce. However, compact representations such as those described form very rugged fitness landscapes which are not well suited to evolutionary search techniques. Structures are created from an initial "seed" or minimal construction, for example a compact spherical structure. The seed is then manipulated using a series of processes or transformations, for example stretching, shearing or bending. The structure would grow according to the transformations applied to it. Obviously, the transformations could be a predetermined sequence of actions which would always yield the same final structure given the same initial seed. Alternatively, the series of transformations applied could be environmentally sensitive, resulting in forms which were also sensitive to their location. The idea of taking a geometrical form as a seed and transforming it using a series of processes to create complex structures is similar in many ways to the early work of Latham creating large morphological charts. Latham went on to develop his ideas into the "Mutator" system, which he used to create organic artworks. Generalising the Reptile System: Frazer has proposed a generalised version of the Reptile System to tackle more realistic building problems, generating the seed or minimal configuration from design requirements automatically. From this starting point (or set of starting points) solutions could be evolved using artificial evolution. Quantifiable and specific aspects of the design brief define the formal criteria which are used as a standard fitness function. Non-quantifiable criteria, including aesthetic judgments, are evaluated by the user. The proposed system would be able to learn successful strategies for satisfying both formal and user criteria. In doing so the system would become a personalised tool of the designer: a personal assistant able to anticipate aesthetic judgements and other criteria by employing previously successful strategies. Ultimately, this is a similar concept to Negroponte's "Architecture Machine", which he proposed would be a computer system so personalised as to be almost unusable by other people. The Universal State Space Modeller: The Universal State Space Modeller is the basis of Frazer's current work.
It is a system which can be used to model any structure, hence the universal claim in its title. The datastructure underlying the modeller is a state space of scaleless logical points, called motes. Motes are arranged in a close-packing sphere arrangement, which makes each one equidistant from its twelve neighbours. Any point can be broken down into a self-similar tetrahedral structure of logical points, giving the state space a fractal nature which allows modelling at many different levels at once. Each mote can be thought of as analogous to a cell in a biological organism. Every mote carries a copy of the architectural genetic code in the same way that each cell within an organism carries a copy of its DNA. The genetic code of a mote is stored as a sequence of binary "morons" which are grouped together into spatial configurations which are interpreted as the state of the mote. The developmental process begins with a seed. The seed develops through cellular duplication according to the rules of the genetic code. In the beginning the seed develops mainly in response to the internal genetic code, but as the development progresses the environment plays a greater role. Cells communicate by passing messages to their immediate twelve neighbours. However, a cell can also send messages directed at remote cells, without knowledge of their spatial relationship. During development cells take on specialised functions, becoming environmental sensors or producers of raw materials. The resulting system is process driven, without presupposing the existence of a construction set to use. The datastructure can be interpreted in many ways to derive various phenotypes. The resulting structure is a by-product of the cellular activity during development and in response to the environment. As such the resulting structures have much in common with living organisms, which are also the emergent result or by-product of local cellular activity. Primordial Architectural Soups: To conclude, Frazer presents some of the most recent work done, evolving fundamental structures using limited raw materials, an initial seed and massive feedback. Frazer proposes to go further and do away with the need for an initial seed, starting instead with a primordial soup of basic architectural concepts. The research is attempting to evolve the starting conditions and evolutionary processes without any preconditions. Is there enough time to evolve a complex system from the basic building blocks which Frazer proposes? The computational complexity of the task being embarked upon is not discussed. There is an implicit assumption that the "superb tactics" of natural selection are enough to cut through the complexity of the task. However, Kauffman has shown how self-organisation plays a major role in the early development of replicating systems which we may call alive. Natural selection requires a solid basis upon which it can act. Is the primordial soup which Frazer proposes of the correct constitution to support self-organisation? Kauffman suggests that one of the most important attributes of a primordial soup capable of self-organisation is a complex network of catalysts, together with the controlling mechanisms to stop the reactions from going supracritical. Can such a network be provided out of primitive architectural concepts? What does it mean to have a catalyst in this domain? Conclusion: Frazer shows some interesting work both in the areas of evolutionary design and self-organising systems.
It is obvious from his work that he sympathizes with the opinions put forward by Kauffman that the order found in living organisms comes from both external evolutionary pressure and internal self-organisation. His final remarks underline this by paraphrasing the words of Kauffman, that life is always to be found on the edge of chaos. By the "edge of chaos" Kauffman is referring to the area within the ordered regime of a system close to the "phase transition" to chaotic behaviour. Unfortunately, Frazer does not demonstrate that the systems he has presented have the necessary qualities to derive useful order at the edge of chaos. He does not demonstrate, as Kauffman does repeatedly, that there exists a "phase transition" between ordered and chaotic regimes of his systems. Nor does he study the relationship between the useful forms generated by his work and the phase-transition regions of his systems, should they exist. If we are to find an organic architecture, in more than name alone, it is surely to reside close to the phase transition of the construction system of which it is built. Only there, if we are to believe Kauffman, are we to find useful order together with environmentally sensitive and thermodynamically open systems which can approach the utility of living organisms.
series other
type normal paper
last changed 2004/05/22 12:12
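Editor's sketch of the fitness-function point flagged in the review above: a minimal genetic algorithm in Python whose explicit fitness function is the controllable criterion the review contrasts with neural-network learning. The genome and criterion are stand-ins chosen for illustration, not anything from Frazer's systems.

    import random

    GENOME_LEN, POP_SIZE, GENERATIONS = 32, 40, 60

    def fitness(genome):
        # Explicit, clearly stated criterion -- here simply maximise the 1s.
        return sum(genome)

    def crossover(a, b):
        # Single-point crossover of two parent genomes.
        cut = random.randrange(1, GENOME_LEN)
        return a[:cut] + b[cut:]

    def mutate(genome, rate=0.01):
        # Flip each bit with a small probability.
        return [1 - g if random.random() < rate else g for g in genome]

    population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
                  for _ in range(POP_SIZE)]

    def pick():
        # Tournament selection: fitter genomes reproduce more often.
        return max(random.sample(population, 3), key=fitness)

    for _ in range(GENERATIONS):
        population = [mutate(crossover(pick(), pick())) for _ in range(POP_SIZE)]

    best = max(population, key=fitness)
    print(fitness(best), "".join(map(str, best)))

Because the criterion sits in one function, it can be stated, inspected and tuned directly; replacing fitness() with interactive user judgements turns the same loop into the artificial evolution the review describes, with the population-size and run-length costs it notes.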

_id 7ce5
authors Gal, Shahaf
year 1992
title Computers and Design Activities: Their Mediating Role in Engineering Education
source Sociomedia, ed. Edward Barrett. MIT Press
summary Sociomedia: With all the new words used to describe electronic communication (multimedia, hypertext, cyberspace, etc.), do we need another one? Edward Barrett thinks we do; hence, he coins the term "sociomedia." It is meant to displace a computing economy in which technicity is hypostasized over sociality. Sociomedia, a compilation of twenty-five articles on the theory, design and practice of educational multimedia and hypermedia, attempts to re-value the communicational face of computing. Value, of course, is "ultimately a social construct." As such, it has everything to do with knowledge, power, education and technology. The projects discussed in this book represent the leading edge of electronic knowledge production in academia (not to mention major funding) and are determining the future of educational media. For these reasons, Sociomedia warrants close inspection. Barrett's introduction sets the tone. For him, designing computer media involves hardwiring a mechanism for the social construction of knowledge (1). He links computing to a process of social and communicative interactivity for constructing and disseminating knowledge. Through a mechanistic mapping of the university as hypercontext (a huge network that includes classrooms as well as services and offices), Barrett models intellectual work in such a way as to avoid "limiting definitions of human nature or human development." Education, then, can remain "where it should be--in the human domain (public and private) of sharing ideas and information through the medium of language." By leaving education in a virtual realm (where we can continue to disagree about its meaning and execution), it remains viral, mutating and contaminating in an intellectually healthy way. He concludes that his mechanistic model, by means of its reductionist approach, preserves value (7). This "value" is the social construction of knowledge. While I support the social orientation of Barrett's argument, discussions of value are related to power. I am not referring to the traditional teacher-student power structure that is supposedly dismantled through cooperative and constructivist learning strategies. The power to be reckoned with in the educational arena is foundational, that which (pre)determines value and the circulation of knowledge. "Since each of you reading this paragraph has a different perspective on the meaning of 'education' or 'learning,' and on the processes involved in 'getting an education,' think of the hybris in trying to capture education in a programmable function, in a displayable object, in a 'teaching machine'" (7). Actually, we must think about that hybris because it is, precisely, what informs teaching machines. Moreover, the basic epistemological premises that give rise to such productions are too often assumed. In the case of instructional design, the episteme of cognitive sciences is often taken for granted. It is ironic that many of the "postmodernists" who support electronic hypertextuality seem to have missed Jacques Derrida's and Michel Foucault's "deconstructions" of the epistemology underpinning cognitive sciences (if not of epistemology itself). Perhaps it is the glitz of the technology that blinds some users (qua developers) to the belief systems operating beneath the surface. Barrett is not guilty of reactionary thinking or politics; he is, in fact, quite in line with much American deconstructive and postmodern thinking.
The problem arises in that he leaves open the definitions of "education," "learning" and "getting an education." One cannot engage in the production of new knowledge without orienting its design, production and dissemination, and without negotiating with others' orientations, especially where large-scale funding is involved. Notions of human nature and development are structural, even infrastructural, whatever the medium of the teaching machine. Although he addresses some dynamics of power, money and politics when he talks about the recession and its effects on the conference, they are readily visible dynamics of power (3-4). Where does the critical factor of value determination, of power, of who gets what and why, get mapped onto a mechanistic model of learning institutions? Perhaps a mapping of contributors' institutions, of the funding sources for the projects showcased and for participation in the conference, and of the disciplines receiving funding for these sorts of projects would help visualize the configurations of power operative in the rising field of educational multimedia. Questions of power and money notwithstanding, Barrett's introduction sets the social and textual thematics for the collection of essays. His stress on interactivity, on communal knowledge production, on the society of texts, and on media producers and users is carried forward through the other essays, two of which I will discuss. Section I of the book, "Perspectives...," highlights the foundations, uses and possible consequences of multimedia and hypertextuality. The second essay in this section, "Is There a Class in This Text?," plays on the robust exchange surrounding Stanley Fish's book, Is There a Text in This Class?, which presents an attack on authority in reading. The author, John Slatin, has introduced electronic hypertextuality and interaction into his courses. His article maps the transformations in "the content and nature of work, and the workplace itself"--which, in this case, is not industry but an English poetry class (25). Slatin discovered an increase of productive and cooperative learning in his electronically-mediated classroom. For him, creating knowledge in the electronic classroom involves interaction between students, instructors and course materials through the medium of interactive written discourse. These interactions lead to a new and persistent understanding of the course materials and of the participants' relation to the materials and to one another. The work of the course is to build relationships that, in my view, constitute not only the meaning of individual poems, but poetry itself. The class carries out its work in the continual and usually interactive production of text (31). While I applaud his strategies which dismantle traditional hierarchical structures in academia, the evidence does not convince me that the students know enough to ask important questions or to form a self-directing, learning community. Stanley Fish has not relinquished professing, though he, too, espouses the indeterminacy of the sign. By the fourth week of his course, Slatin's input is, by his own reckoning, reduced to 4% (39). In the transcript of the "controversial" Week 6 exchange on Gertrude Stein--the most disliked poet they were discussing at the time (40)--we see the blind leading the blind. One student parodies Stein for three lines and sums up his input with "I like it." Another finds Stein's poetry "almost completey [sic] lacking in emotion or any artistic merit" (emphasis added).
On what grounds has this student become an arbiter of "artistic merit"? Another student, after admitting being "lost" during the Wallace Stevens discussion, talks of having more "respect for Stevens' work than Stein's" and adds that Stein's poetry lacks "conceptual significance[, s]omething which people of varied opinion can intelligently discuss without feeling like total dimwits...." This student has progressed from admitted incomprehension of Stevens' work to imposing her (groundless) respect for his work over Stein's. Then, she exposes her real dislike for Stein's poetry: that she (the student) missed the "conceptual significance" and hence cannot, being a person "of varied opinion," intelligently discuss it "without feeling like [a] total dimwit." Slatin's comment is frightening: "...by this point in the semester students have come to feel increasingly free to challenge the instructor" (41). The students that I have cited are neither thinking critically nor are their preconceptions challenged by student-governed interaction. Thanks to the class format, one student feels self-righteous in her ignorance, and empowered to censure. I believe strongly in student empowerment in the classroom, but only once students have accrued enough knowledge to make informed judgments. Admittedly, Slatin's essay presents only partial data (there are six hundred pages of course transcripts!); still, I wonder how much valuable knowledge and metaknowledge was gained by the students. I also question the extent to which authority and professorial dictature were addressed in this course format. The power structures that make it possible for a college to require such a course, and the choice of texts and pedagogy, were not "on the table." The traditional professorial position may have been displaced, but what took its place?--the authority of consensus with its unidentifiable strong arm, and the faceless reign of software design? Despite Slatin's claim that the students learned about the learning process, there is no evidence (in the article) that the students considered where their attitudes came from, how consensus operates in the construction of knowledge, how power is established and what relationship they have to bureaucratic institutions. How do we, as teaching professionals, negotiate a balance between an enlightened despotism in education and student-created knowledge? Slatin, and other authors in this book, bring this fundamental question to the fore. There is no definitive answer because the factors involved are ultimately social, and hence, always shifting and reconfiguring. Slatin ends his article with the caveat that computerization can bring about greater estrangement between students, faculty and administration through greater regimentation and control. Of course, it can also "distribute authority and power more widely" (50). Power or authority without a specific face, however, is not necessarily good or just. Shahaf Gal's "Computers and Design Activities: Their Mediating Role in Engineering Education" is found in the second half of the volume, and does not allow for a theory/praxis dichotomy. Gal recounts a brief history of engineering education up to the introduction of Growltiger (GT), a computer-assisted learning aid for design. He demonstrates GT's potential to impact the learning of engineering design by tracking its use by four students in a bridge-building contest.
What his text demonstrates clearly is that computers are "inscribing and imaging devices" that add another viewpoint to an on-going dialogue between student, teacher, earlier coursework, and other teaching/learning tools. The less proficient students made a serious error by relying too heavily on the technology, or treating it as a "blueprint provider." They "interacted with GT in a way that trusted the data to represent reality. They did not see their interaction with GT as a negotiation between two knowledge systems" (495). Students who were more thoroughly informed in engineering discourses knew to use the technology as one voice among others--they knew enough not simply to accept the input of the computer as authoritative. The less-advanced students learned a valuable lesson from the competition itself: the fact that their designs were not able to hold up under pressure (literally) brought the fact of their insufficient knowledge crashing down on them (and their bridges). They also had, post factum, several other designs to study, especially the winning one. Although competition and comparison are not good pedagogical strategies for everyone (in this case the competitors had volunteered), at some point what we think we know has to be challenged within the society of discourses to which it belongs. Students need critique in order to learn to push their learning into auto-critique. This is what is lacking in Slatin's discussion and in the writings of other avatars of constructivist, collaborative and computer-mediated pedagogies. Obviously there are differences between instrumental types of knowledge acquisition and discursive knowledge accumulation. Indeed, I do not promote the teaching of reading, thinking and writing as "skills" per se (then again, Gal's teaching of design is quite discursive, if not dialogic). Nevertheless, the "soft" sciences might benefit from "bridge-building" competitions or the re-institution of some forms of agonia. Not everything agonistic is inhuman agony--the joy of confronting or creating a sound argument supported by defensible evidence, for example. Students need to know that soundbites are not sound arguments despite predictions that electronic writing will be aphoristic rather than periodic. Just because writing and learning can be conceived of hypertextually does not mean that rigor goes the way of the dinosaur. Rigor and hypertextuality are not mutually incompatible. Nor are rigorous thinking and hard intellectual work unpleasurable, although American anti-intellectualism, especially in the mass media, would make them so. At a time when the spurious dogmatics of a Rush Limbaugh and Holocaust revisionist historians circulate "aphoristically" in cyberspace, and at a time when knowledge is becoming increasingly textualized, the role of critical thinking in education will ultimately determine the value(s) of socially constructed knowledge. This volume affords the reader an opportunity to reconsider knowledge, power, and new communications technologies with respect to social dynamics and power relationships.
series other
last changed 2003/04/23 13:14

_id b07d
authors Gero, J.S., Chase, S. and Rosenman, M. (Eds.)
year 2001
title CAADRIA 2001 [Conference Proceedings]
source Proceedings of the Sixth Conference on Computer Aided Architectural Design Research in Asia / ISBN 1-86487-096-6 / Sydney 19-21 April 2001, 506 p.
summary Computer-aided architectural design research and teaching has a long history going back to the 1960s. However, the last ten years have seen a dramatic upsurge in activity brought about by factors such as the increasing use of CAD systems in practice, the increase in computer literacy generally and, perhaps equally importantly, the development and widespread use of the World Wide Web. The CAADRIA conference series provides a forum for the presentation and exchange of ideas and experiences in CAAD, particularly focussed on Asian research and teaching. This, the proceedings of the Sixth International Conference on Computer-Aided Architectural Design Research in Asia, presents 57 (31 long and 26 short) papers. The 57 papers were selected from the 114 submissions following a blind review of extended abstracts. Each submission was reviewed by two referees and the decision to accept was based on a committee's assessment of all the submissions. The final papers were assessed to determine whether the reviewers' recommendations had been complied with. The authors of the selected papers represent 17 countries, making this an international as well as an Asian conference.
series CAADRIA
email john@arch.usyd.edu.au
more http://www.caadria.org
last changed 2001/05/27 16:27

_id ga9928
id ga9928
authors Goulthorpe
year 1999
title Hyposurface: from Autoplastic to Alloplastic Space
source International Conference on Generative Art
summary By way of immediate qualification to an essay which attempts to orient current technical developments in relation to a series of dECOi projects, I would suggest that the greatest liberation offered by new technology in architecture is not its formal potential as much as the patterns of creativity and practice it engenders. For increasingly in the projects presented here dECOi operates as an extended network of technical expertise: Mark Burry and his research team at Deakin University in Australia as architects and parametric/programmatic designers; Peter Wood in New Zealand as programmer; Alex Scott in London as mathematician; Chris Glasow in London as systems engineer; and the engineers (structural/services) of David Glover's team at Ove Arup in London. This reflects how we're working in a new technical environment - a new form of practice, in a sense - a loose and light network which deploys highly specialist technical skill to suit a particular project. By way of a second disclaimer, I would suggest that the rapid technological development we're witnessing, which we struggle to comprehend given the sheer pace of change that overwhelms us, is somehow of a different order than previous technological revolutions. For the shift from an industrial society to a society of mass communication, which is the essential transformation taking place in the present, seems to be a subliminal and almost inexpressive technological transition - is formless, in a sense - which begs the question of how it may be expressed in form. If one holds that architecture is somehow the crystallization of cultural change in concrete form, one suspects that in the present there is no simple physical equivalent for the burst of communication technologies that colour contemporary life. But I think that one might effectively raise a series of questions apropos technology by briefly looking at 3 or 4 of our current projects, which suggest a range of possibilities fostered by new technology. By way of a third doubt, we might qualify in advance the apparent optimism of architects for CAD technology by thinking back to Thomas More and his island 'Utopia', which marks in some way the advent of Modern rationalism. This was, if not quite a technological utopia, certainly a metaphysical one, More's vision typically deductive, prognostic, causal. By the time of Francis Bacon's New Atlantis, however, it had become a technological utopia availing itself of all the possibilities put at humanity's disposal by the known machines of the time. There's a sort of implicit sanction within these two accounts which lies in their nature as reality optimized by rational DESIGN, as if the very ethos of design were sponsored by Modern rationalist thought and its utopian leanings. The faintly euphoric 'technological' discourse of architecture at present - a sort of Neue Bauhaus - then seems curiously misplaced historically given the 20th century's general anti-, dis-, or counter-utopian discourse. But even this seems to have finally run its course, dissolving into the electronic heterotopia of the present with its diverse opportunities of irony and distortion (as it's been said) as a liberating potential.1 This would seem to mark the dissolution of design ethos into non-causal process(ing), which begs the question of 'design' itself: who 'designs' anymore? Or rather, has 'design' not become uncoupled from its rational, deterministic tradition?
The utopianism that attaches to technological discourse in the present seems blind to the counter-finality of technology's own accomplishments - that transparency has, as it were, by its own more and more perfect fulfillment, failed by its own success. For what we seem to have inherited is not the warped utopia depicted in countless visions of a singular and tyrannical technology (such as that in Orwell's 1984), but a rich and diverse heterotopia which has opened the possibility of countless channels of local dialect competing directly with the channels of power. Undoubtedly such multiplicitous and global connectivity has sent creative thought in multiple directions…
series other
more http://www.generativeart.com/
last changed 2003/08/07 15:25

_id 64b8
authors Grabowski, M. and Barner, K.
year 1998
title Data Visualization Methods for the Blind using Force Feedback and Sonification
source Stein, M. (Ed.). Telemanipulator and Telepresence Technologies V. Vol. 3524
summary Research in the field of scientific visualization has begun to articulate systematic methods for mapping data to a perceivable form. Most work in this area has focused on the visual sense. For blind people in particular, systematic visualization methods which utilize other senses need further development. In this work we develop methods for adding aural feedback to an existing haptic force feedback interface to create a multimodal visualization system. We examine some fundamental components of a visualization system, which include the following: characterization of the data, definition of user directives and interpretation aims, cataloging of sensual representations of information, and finally, matching the data and user interpretation aims with the appropriate sensual representations. We focus on the last two components as they relate to the aural and haptic senses. Cataloging of sensual representations is drawn from current research in sonification and haptics. The matching procedure can be thought of as a type of encoding which should be the inverse of the decoding mechanism of our aural and haptic systems. Topics in human perception are discussed, and issues related to the interaction between the two senses are addressed. We have implemented a visualization system in the form of a Windows NT application using a sound card and a 3-DOF point-interaction haptic interface. The system presents a 2D data set primarily as a polygonal haptic surface, although other capabilities of the haptic interface are utilized, such as texture discernment. In addition, real-time aural feedback is presented as the user explores the surface. Parameters of sound such as pitch and spectral content are used to convey information. Evaluation of the system's effectiveness as an assistive technology for the blind reveals that combining aural and haptic feedback improves visualization over using either of the two senses alone. (A minimal sketch of such a pitch mapping follows this record.)
series other
last changed 2003/04/23 13:14
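Editor's sketch of the parameter mapping flagged in the summary above; the names, ranges and probe position are assumptions for illustration, not the authors' code. The data value under the haptic probe is mapped linearly onto a pitch range, so higher values sound higher as the surface is explored.

    def value_to_pitch(value, v_min, v_max, f_low=220.0, f_high=880.0):
        # Linear map from the data range onto a pitch range in Hz.
        t = (value - v_min) / (v_max - v_min)
        return f_low + t * (f_high - f_low)

    # 2D data set rendered as a haptic surface; the probe position (i, j)
    # selects the sample whose pitch would be played back in real time.
    data = [[0.0, 0.2, 0.5],
            [0.3, 0.7, 1.0]]
    flat = [v for row in data for v in row]
    i, j = 1, 2  # hypothetical probe position
    print(round(value_to_pitch(data[i][j], min(flat), max(flat)), 1), "Hz")

A logarithmic map would track pitch perception more closely; the linear one keeps the sketch short, and spectral content could be varied by the same kind of mapping.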

_id caadria2014_000
id caadria2014_000
authors Gu, Ning; Shun Watanabe, Halil Erhan, Matthias Hank Haeusler, Weixin Huang and Ricardo Sosa (eds.)
year 2014
title Rethinking Comprehensive Design: Speculative Counterculture
source Proceedings of the 19th International Conference on Computer-Aided Architectural Design Research in Asia (CAADRIA 2014) / Kyoto 14-16 May 2014, 994 p.
summary Rethinking Comprehensive Design—the 19th International Conference on Computer-Aided Architectural Design Research in Asia (CAADRIA 2014)—emphasises a cross-disciplinary context to challenge the mainstream culture of computational design in architecture. It aims to (re)explore the potential of computational design methods and technologies in architecture from a holistic perspective. The conference provides an international forum where academics and practitioners share their novel research development and reflection for defining the future of computation in architectural design. Hosted by the Department of Design, Engineering and Management at the Kyoto Institute of Technology, CAADRIA 2014 presents 88 peer-reviewed full papers from all over the world. These high-quality research papers are complemented by 34 short work-in-progress papers submitted for the poster session of the conference. The conference proceedings were produced by a motivated team of volunteers from the CAADRIA community through an extensive collaboration. The 88 full papers, rigorously double-blind reviewed by the dedicated International Review Committee (consisting of 74 experts), testify to CAADRIA's highly respectable international standing. The call for abstracts, sent out in July 2013, attracted 298 submissions. They were initially reviewed by the Paper Selection Committee, who accepted 198 abstracts for further development. Of these, 118 full papers were eventually submitted in the final stage. Each submitted paper was then assessed by at least two members of the International Review Committee. Following the reviewers' recommendations, 91 papers were accepted by the conference, of which 88 are included in this volume and for presentation in CAADRIA 2014. Collectively, these 88 papers define Rethinking Comprehensive Design in terms of the following research streams: Shape Studies; User Participation in Design; Human-Computer Interaction; Digital Fabrication and Construction; Computational Design Analysis; New Digital Design Concepts and Strategies; Practice-Based and Interdisciplinary Computational Design Research; Collaborative and Collective Design; Generative, Parametric and Evolutionary Design; Design Cognition and Creativity; Virtual / Augmented Reality and Interactive Environments; Computational Design Research and Education; and Theory, Philosophy and Methodology of Computational Design Research. In the following pages, you will find a wide range of scholarly papers organised under these streams that truly capture the quintessence of the research concepts. This volume will certainly inspire you and facilitate your journey in Rethinking Comprehensive Design.
series CAADRIA
last changed 2014/04/22 08:23

_id cf2011_p027
id cf2011_p027
authors Herssens, Jasmien; Heylighen Ann
year 2011
title A Framework of Haptic Design Parameters for Architects: Sensory Paradox Between Content and Representation
source Computer Aided Architectural Design Futures 2011 [Proceedings of the 14th International Conference on Computer Aided Architectural Design Futures / ISBN 9782874561429] Liege (Belgium) 4-8 July 2011, pp. 685-700.
summary Architects, like other designers, tend to think, know and work in a visual way. In design research, this way of knowing and working is highly valued as paramount to design expertise (Cross 1982, 2006). In the case of architecture, however, it is not only a particular strength, but may also be regarded as a serious weakness. The absence of non-visual features in traditional architectural spatial representations indicates how these are disregarded as important elements in conceiving space (Dischinger 2006). This bias towards vision, and the suppression of other senses in the way architecture is conceived, taught and critiqued, results in a disappearance of sensory qualities (Pallasmaa 2005). Nevertheless, if architects design with more attention to non-visual senses, they are able to contribute to more inclusive environments. Indeed, if an environment offers a range of sensory triggers, people with different sensory capacities are able to navigate and enjoy it. Rather than implementing as many sensory triggers as possible, the intention is to make buildings and spaces accessible and enjoyable for more people, in line with the objective of inclusive design (Clarkson et al. 2007), also called Design for All or Universal Design (Ostroff 2001). Within this overall objective, the aim of our study is to develop haptic design parameters that support architects during design in paying more attention to the role of haptics, i.e. the sense of touch, in the built environment by informing them about the haptic implications of their design decisions. In the context of our study, haptic design parameters are defined as variables that can be decided upon by designers throughout the design process, and the value of which determines the haptic characteristics of the resulting design. These characteristics are based on the expertise of people who are congenitally blind, as they are more attentive to non-visual information, and of professional caregivers working with them. The parameters do not intend to be prescriptive, nor to impose a particular method. Instead they seek to facilitate a more inclusive design attitude by informing designers and helping them to think differently. As the insights from the empirical studies with people born blind and caregivers have been reported elsewhere (Authors 2010), this paper starts by outlining the haptic design parameters resulting from them. Following the classification of haptics into active, dynamic and passive touch, the built environment unfolds into surfaces that can act as "movement", "guiding" and/or "rest" planes. Furthermore, design techniques are suggested to check the haptic qualities during the design process. Subsequently, the paper reports on a focus group interview/workshop with professional architects to assess the usability of the haptic design parameters for design practice. The architects were then asked to try out the parameters in the context of a concrete design project. The reactions suggest that the participating architects immediately picked up the underlying idea of the parameters, and recognized their relevance in relation to the design project at hand, but that their representation confronts us with a sensory paradox: although the parameters question the impact of the visual in architectural design, they are meant to be used by designers, who are used to thinking, knowing and working in a visual way.
keywords blindness, design parameters, haptics, inclusive design, vision
series CAAD Futures
email jherssens@gmail.com
last changed 2012/02/11 18:21

_id ecaade2008_110
id ecaade2008_110
authors Ireland, Tim
year 2008
title Space Diagrams
source Architecture in Computro [26th eCAADe Conference Proceedings / ISBN 978-0-9541183-7-2] Antwerpen (Belgium) 17-20 September 2008, pp. 91-98
summary Decomposing typical hierarchies of architectural space, we look to the use of agents to generate architectonic form in a process of distributed representation. This paper forms part of this ongoing research: a component focusing on the problem of circulation. The work presented looks to swarm intelligence and the well-trodden field of computational wayfinding techniques based on the route-finding means of social insects. Ant-foraging algorithms are generally used for optimization and tend to rely on a priori knowledge of the environment. Outlined here is an investigation of emergent route formation and spatial connectivity based on simple agent and pheromone interaction. The key is not optimization but emergent connectivity through blind local communication. (A minimal pheromone-agent sketch follows this entry.)
keywords Agents, self-organisation, circulation, ants, pheromones
series eCAADe
email t.ireland@ucl.ac.uk
last changed 2008/09/09 13:55
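
A minimal sketch of the stigmergic mechanism the abstract names: agents communicate only through deposited pheromone, which evaporates each step, so trails (and hence routes) emerge from blind local exchange. This is not the paper's implementation; grid size, deposit and evaporation rates are illustrative assumptions.

```python
import random

W = H = 40
EVAPORATION = 0.05
DEPOSIT = 1.0

pheromone = [[0.0] * W for _ in range(H)]
agents = [(random.randrange(H), random.randrange(W)) for _ in range(30)]

def neighbours(r, c):
    """Eight surrounding cells on a toroidal grid."""
    return [((r + dr) % H, (c + dc) % W)
            for dr in (-1, 0, 1) for dc in (-1, 0, 1)
            if (dr, dc) != (0, 0)]

def step():
    global agents
    moved = []
    for r, c in agents:
        opts = neighbours(r, c)
        # Bias the move towards stronger pheromone; the small constant
        # keeps some random exploration so trails can still form.
        weights = [pheromone[nr][nc] + 0.1 for nr, nc in opts]
        nr, nc = random.choices(opts, weights=weights)[0]
        pheromone[nr][nc] += DEPOSIT          # stigmergic deposit
        moved.append((nr, nc))
    agents = moved
    for row in pheromone:                     # global evaporation
        for i in range(W):
            row[i] *= (1.0 - EVAPORATION)

for _ in range(200):
    step()
```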

_id caadria2015_054
id caadria2015_054
authors Joseph, Daniel; Alan Kim, Andrew Butler and M. Hank Haeusler
year 2015
title Optimisation for Sport Stadium Designs
source Emerging Experience in Past, Present and Future of Digital Architecture, Proceedings of the 20th International Conference of the Association for Computer-Aided Architectural Design Research in Asia (CAADRIA 2015) / Daegu 20-22 May 2015, pp. 573-582
summary Applying computational optimisation tools to sport stadium designs has become common practice. However, optimisation often occurs only on a macro level (analysing the stadium as a whole) and not on a micro level (the view from each seat). Consequently, micro-level design details like guardrails can be overlooked, leading to financial losses for operators. Hence, the research argues that every seat should have a clear field of view to avoid financial complications. To address this problem, the research team developed and evaluated a script that allows an existing design to be imported into Rhino. Firstly, the script evaluates the view from each seat via a colour-coded response system. Secondly, the designer can select the respective seat and trace the sightline from the occupant's eye position to various spots on the field to analyse where the obstruction occurs. This 'binocular view' enables the designer to evaluate blind spots from each seat prior to project completion. As the script allows the designer to automate the micro-level analysis, the research arguably provides a significant improvement for stadium design, demonstrated by comparing the time needed for a conventional design optimisation with the automated one. (A simplified sightline test follows this entry.)
keywords Stadium design; Design optimisation; Design analysis; Customised software development; Grasshopper scripting.
series CAADRIA
email m.haeusler@unsw.edu.au
last changed 2015/06/05 05:14
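
A simplified version of the per-seat check described above: test the eye-to-field segment against obstruction boxes (e.g. guardrails) and colour-code the result. The paper's script runs inside Rhino/Grasshopper; this standalone sketch uses the standard slab method for segment-box intersection, and all geometry here is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Box:
    lo: tuple  # (x, y, z) min corner
    hi: tuple  # (x, y, z) max corner

def segment_hits_box(p0, p1, box):
    """Slab test: does the segment p0 -> p1 pass through the box?"""
    t_enter, t_exit = 0.0, 1.0
    for a in range(3):
        d = p1[a] - p0[a]
        if abs(d) < 1e-12:
            if not (box.lo[a] <= p0[a] <= box.hi[a]):
                return False
        else:
            t0 = (box.lo[a] - p0[a]) / d
            t1 = (box.hi[a] - p0[a]) / d
            t0, t1 = min(t0, t1), max(t0, t1)
            t_enter, t_exit = max(t_enter, t0), min(t_exit, t1)
            if t_enter > t_exit:
                return False
    return True

def seat_colour(eye, field_points, rails):
    """Green if every sampled field point is visible, red otherwise,
    mimicking the colour-coded seat report described in the abstract."""
    blocked = any(segment_hits_box(eye, p, rail)
                  for p in field_points for rail in rails)
    return "red" if blocked else "green"

rail = Box(lo=(0.0, 4.0, 0.9), hi=(10.0, 4.1, 1.2))   # hypothetical guardrail
print(seat_colour((5.0, 0.0, 1.2), [(5.0, 30.0, 0.0)], [rail]))  # -> "red"
```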

_id caadria2012_045
id caadria2012_045
authors Khoo, C. K. and F. D. Salim
year 2012
title A responsive morphing media skin
source Proceedings of the 17th International Conference on Computer Aided Architectural Design Research in Asia / Chennai 25-28 April 2012, pp. 517–526
summary Existing media façades do not function as fenestration devices; they have been used mainly for visual communication and aesthetic purposes. This paper introduces a responsive morphing skin that can act as an active fenestration device as well as a media skin. We investigate new possibilities of using form-changing materials in designing responsive morphing skins that respond to environmental conditions and act as a communicative display. The design experiment that embodied this investigation, namely Blind, serves as a new layer of analogue media brise-soleil for an existing space. It communicates the relationships between interior and exterior spaces visually and projects mutable imagery onto the surrounding environment through sunlight. The design process of Blind simulates the responsive behaviour of the intended architectural skin by integrating physical computing and parametric design tools. This process includes the integration of soft apertures and an architectural morphing skin to introduce a novel design method that enables an architectural skin to be a means of communication and fenestration. The skin responds to changing stimuli and aims to improve the spatial quality of existing environments through two types of transformations: morphological and patterned. (A simple responsive-aperture sketch follows this entry.)
keywords Media façades; elasticity; responsive architecture; form-changing materials; kinetic skin
series CAADRIA
email mosphosis@hotmail.com
last changed 2012/05/29 07:34
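
A simple sketch of the kind of responsive behaviour such a design process might simulate: a light reading drives the opening ratio of each soft aperture, so the skin acts as fenestration and, cell by cell, as a low-resolution display. This is not the project's code; the lux thresholds, function names, and the multiplicative blending of image and light response are all assumptions.

```python
def aperture_opening(lux, lo=200.0, hi=20000.0):
    """Clamp and normalise an illuminance reading to an opening ratio:
    dim exterior -> fully open (1.0), harsh sun -> fully closed (0.0)."""
    t = (min(max(lux, lo), hi) - lo) / (hi - lo)
    return 1.0 - t

def skin_state(sensor_grid, image_mask):
    """Combine environmental response with a displayed pattern: an
    aperture closes further where the media image is dark (pixel < 1)."""
    return [[aperture_opening(lux) * pixel
             for lux, pixel in zip(row_s, row_m)]
            for row_s, row_m in zip(sensor_grid, image_mask)]

state = skin_state([[150.0, 25000.0]], [[1.0, 0.5]])
print(state)   # [[1.0, 0.0]]
```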
