CumInCAD is a Cumulative Index of publications in Computer Aided Architectural Design
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD futures

Hits 1 to 15 of 15

_id 7670
authors Sawicki, Bogumil
year 1995
title Ray Tracing – New Chances, Possibilities and Limitations in AutoCAD
source CAD Space [Proceedings of the III International Conference Computer in Architectural Design] Bialystock 27-29 April 1995, pp. 121-136
summary Realistic image synthesis is nowadays widely used in engineering applications. Some of these applications, such as architectural, interior, lighting and industrial design, demand accurate visualization of non-existent scenes as they would look to us when built in reality. This can only be achieved by using physically based models of light interaction with surfaces and by simulating the propagation of light through an environment. Ray tracing is one of the most powerful techniques used in computer graphics, and it can produce such very realistic images. The ray tracing algorithm follows the paths of light rays backwards from the observer into the scene. It is a very time-consuming process and as such could not be developed until suitable computers appeared. In recent years the technological improvements in the computer industry have brought more powerful machines with bigger storage capacities and better graphic devices. Owing to these increased hardware capabilities, successful implementation of ray tracing in different CAD software became possible, even on PC machines. Ray tracing in AutoCAD r.12, the most popular CAD package in the world, is the best example of this. AccuRender and AutoVision are AutoCAD Development System (ADS) applications that use ray tracing to create photorealistic images from 3D AutoCAD models. These "internal" applications let users generate synthetic images of three-dimensional models and scenes entirely within AutoCAD space and show the effects directly on the main AutoCAD screen. The ray tracing algorithm accurately calculates and displays shadows, transparency, diffusion, reflection, and refraction from the surface qualities of user-defined materials. The accurate modelling of light lets users produce sophisticated effects and high-quality images, which these ray tracers always generate at 24-bit pixel depth, providing 16.7 million colours. The results can be quite impressive for some architects and are almost acceptable for others, but the coloured virtual world presented by ray tracing in AutoCAD space in such a convincing way is still not exactly the same as the real world. The main limitations of realism are due to the nature of the ray tracing method. The classical ray tracing technique takes into account the effects of light reflection from neighbouring surfaces but leaves out of account the ambient and global illumination arising from complex interreflections in an environment. So models generated by ray tracing belong to an "ideal" world where real materials and environments cannot find their right place. We complain about that fact and say that ray tracing shows us a "too specular world", but (...) is there anything better on the horizon? It should be concluded that the typical abilities of today's graphics software and hardware are far from exploited. As observed in the literature, various works have been carried out with the explicit intention of overcoming these ray tracing limitations. This research seems very promising and lets us hope that its results will be seen in CAD applications soon. As happens with modelling, perhaps the answer will come from a variety of techniques that can be combined with ray tracing depending on the case we are dealing with.
Therefore, from the point of view of architects who try to keep alive some interest in the nature of materials and their interaction with form, ray tracing seems to be the right path of research and development, one that we can still follow a long way. From the point of view of the school, a critical assimilation of ray tracing processes is required, one that might help to determine exactly their distortions and to indicate the correct way of their development and their right place in CAAD education. I trust that ray tracing will become standard not only in AutoCAD but in all architectural space-modelling CAD applications, and will be established as a powerful and real tool for experimental research in the architectural design process. Will technological progress in the near future be as significant as anticipated?
series plCAD
last changed 2000/01/24 10:08
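
The backward ray tracing the abstract describes (rays followed from the observer into the scene, with shadow rays and mirror bounces at each hit) can be reduced to a short sketch. This is a generic Whitted-style toy, not AccuRender or AutoVision: the spheres, light position and reflectivity values are invented for the example, and the refraction and global illumination the author discusses are omitted.

```python
import math

def sphere_hit(origin, direction, center, radius):
    """Distance t along a unit-direction ray to a sphere, or None if missed."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4.0 * c              # a == 1 since direction is unit length
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None      # small threshold avoids self-hits

def trace(origin, direction, spheres, light, depth=2):
    """Follow one ray backwards from the eye; return a grey level in [0, 1]."""
    hit = None
    for center, radius, refl in spheres:
        t = sphere_hit(origin, direction, center, radius)
        if t is not None and (hit is None or t < hit[0]):
            hit = (t, center, radius, refl)
    if hit is None:
        return 0.0                      # ray escaped to the background
    t, center, radius, refl = hit
    point = [o + t * d for o, d in zip(origin, direction)]
    normal = [(p - c) / radius for p, c in zip(point, center)]
    to_light = [l - p for l, p in zip(light, point)]
    dist = math.sqrt(sum(x * x for x in to_light))
    to_light = [x / dist for x in to_light]
    # Shadow ray: is any sphere between the hit point and the light?
    shadowed = any(sphere_hit(point, to_light, c, r) for c, r, _ in spheres)
    diffuse = 0.0 if shadowed else max(0.0, sum(n * l for n, l in zip(normal, to_light)))
    grey = (1.0 - refl) * diffuse
    if refl > 0.0 and depth > 0:        # mirror bounce: the "too specular" part
        d_n = sum(d * n for d, n in zip(direction, normal))
        bounce = [d - 2.0 * d_n * n for d, n in zip(direction, normal)]
        grey += refl * trace(point, bounce, spheres, light, depth - 1)
    return min(grey, 1.0)

# Tiny ASCII rendering: a part-reflective sphere above a large matte "floor".
spheres = [([0.0, 0.0, 3.0], 1.0, 0.4), ([0.0, -101.0, 3.0], 100.0, 0.0)]
light = [5.0, 5.0, -2.0]
for j in range(12):
    row = ""
    for i in range(36):
        x, y = i / 12.0 - 1.5, 1.0 - j / 5.5
        n = math.sqrt(x * x + y * y + 1.0)
        g = trace([0.0, 0.0, 0.0], [x / n, y / n, 1.0 / n], spheres, light)
        row += " .:-=+*#%@"[min(9, int(g * 9.99))]
    print(row)
```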

_id 819d
authors Eiteljorg, H.
year 1988
title Computer-Assisted Drafting and Design: new technologies for old problems
source Center for the Study of Architecture, Bryn Mawr, Pennsylvania
summary In past issues of the Newsletter, George Tressel and I have written about virtual reality and renderings. We have each discussed particular problems with the technology, and both of us mentioned how compelling computer visualizations can be. In my article ("Virtual Reality and Rendering," February, 1995, Vol. 7, no. 4), I indicated my concerns about the quality of the scholarship and the level of detail used in making renderings or virtual worlds. Mr. Tressel (in "Visualizing the Ancient World," November, 1996, Vol. IX, no. 3) wrote about the need to distinguish between real and hypothetical parts of a visualization, the need to differentiate materials, and the difficulties involved in creating the visualizations (some of which were included in the Newsletter in black-and-white and on the Web in color). I am returning to this topic now, in part because the quality of the images available to us is improving so fast and in part because it seems now that neither Mr. Tressel nor I treated all the issues raised by the use of high-quality visualizations. The quality may be illustrated by new images of the older propylon that were created by Mr. Tressel (Figs. 1 - 3); these images are significantly more realistic than the earlier ones, but they do not represent the ultimate in quality, since they were created on a personal computer.
series other
last changed 2003/04/23 15:50

_id 14b5
authors Fang, Lian and Gossard, David C.
year 1995
title Multidimensional curve fitting to unorganized data points by nonlinear minimization
source Computer-Aided Design, Vol. 27 (1) (1995) pp. 48-58
summary Many papers have addressed the problem of fitting curves to data points. However, most of the approaches are subject to a restriction that the data points must be ordered. The paper presents a method for generating a piecewise continuous parametric curve from a set of unordered and error-filled data points. The resulting curve not only provides a good fit to the original data but also possesses good fairness. Excluding the endpoints of the curve, none of the connectivity information needs to be specified, thus eliminating the necessity of an initial parameterization. The standard regularization method for univariate functions is modified for multidimensional parametric functions and results in a nonlinear minimization problem. Successive quadratic programming is applied to find the optimal solution. A physical model is also supplied to facilitate an intuitive understanding of the mathematical background.
keywords Data Interpolation, Regularization, Nonlinear Minimization
series journal paper
last changed 2003/05/15 21:33
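
The abstract describes regularized fitting of a parametric curve to unordered points, solved by successive quadratic programming. The sketch below illustrates the same regularization idea (a data term plus a fairness penalty on discrete second differences) but is not the authors' algorithm: it uses SciPy's general-purpose L-BFGS-B minimizer rather than SQP and nearest-sample distances rather than exact point-to-curve distances, and the point set, curve resolution M and weight lam are illustrative assumptions.

```python
# Requires numpy and scipy.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
# Noisy, SHUFFLED samples of a quarter circle: no ordering information is kept.
t_true = rng.uniform(0.0, np.pi / 2.0, 80)
points = np.c_[np.cos(t_true), np.sin(t_true)] + rng.normal(0.0, 0.02, (80, 2))

M = 12    # samples of the parametric curve (x(u), y(u)) at u_0 .. u_{M-1}

def objective(flat, lam=1e-2):
    curve = flat.reshape(M, 2)
    # Data term: squared distance from each point to its nearest curve sample.
    d2 = ((points[:, None, :] - curve[None, :, :]) ** 2).sum(axis=-1)
    fit = d2.min(axis=1).sum()
    # Regularization: discrete second differences penalise bending,
    # which is what keeps the fitted curve "fair".
    bend = ((curve[2:] - 2.0 * curve[1:-1] + curve[:-2]) ** 2).sum()
    return fit + lam * bend

x0 = rng.normal(0.0, 0.5, (M, 2)).ravel()   # no initial parameterization given
res = minimize(objective, x0, method="L-BFGS-B")
print("final objective:", res.fun)
```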

_id d8ea
authors Kumar, Subodh and Manocha, Dinesh
year 1995
title Efficient rendering of trimmed NURBS surfaces
source Computer-Aided Design, Vol. 27 (7) (1995) pp. 509-521
summary An algorithm for the interactive display of trimmed NURBS surfaces is presented. The algorithm converts the NURBS surfaces to Bézier surfaces, and NURBS trimming curves to Bézier curves. It tessellates each trimmed Bézier surface into triangles, and renders them using the triangle rendering capabilities common in current graphics systems. It makes use of tight bounds for the uniform tessellation of Bézier surfaces into cells, and it traces the trimming curves to compute the trimmed regions of each cell. This operation is based on the tracing of trimming curves, intersection computation with the cells, and triangulation of the cells. The resulting technique also makes use of spatial and temporal coherence between successive frames for cell computation and triangulation. Polygonization anomalies such as cracks and angularities are avoided as well. The algorithm can display trimmed models described using thousands of Bézier surfaces at interactive frame rates on high-end graphics systems.
keywords Trimmed NURBS, Bézier Surfaces, Rendering
series journal paper
last changed 2003/05/15 21:33
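
The uniform-tessellation step the abstract describes (sampling each Bézier patch into a grid of cells and triangulating the grid) can be sketched as below. Trimming-curve tracing, the paper's tight step-size bounds, and frame-to-frame coherence are all omitted; the bicubic control net and the fixed step counts nu and nv are illustrative assumptions.

```python
import numpy as np

def de_casteljau(ctrl, t):
    """Evaluate a Bézier curve (n x 3 array of control points) at t."""
    pts = ctrl.astype(float)
    while len(pts) > 1:
        pts = (1.0 - t) * pts[:-1] + t * pts[1:]
    return pts[0]

def eval_patch(net, u, v):
    """Evaluate a Bézier patch given an (n x m x 3) control net."""
    rows = np.array([de_casteljau(row, v) for row in net])
    return de_casteljau(rows, u)

def tessellate(net, nu=8, nv=8):
    """Sample the patch on a uniform grid and split each cell into triangles."""
    grid = np.array([[eval_patch(net, i / nu, j / nv)
                      for j in range(nv + 1)] for i in range(nu + 1)])
    tris = []
    for i in range(nu):
        for j in range(nv):
            a, b = grid[i, j], grid[i + 1, j]
            c, d = grid[i + 1, j + 1], grid[i, j + 1]
            tris += [(a, b, c), (a, c, d)]   # two triangles per grid cell
    return tris

# A 4x4 control net for a gently undulating bicubic patch.
net = np.array([[[i, j, np.sin(i) * np.cos(j)] for j in range(4)]
                for i in range(4)], dtype=float)
print(len(tessellate(net)), "triangles")     # 8 * 8 cells * 2 = 128
```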

_id 3d4a
authors Kasprisin, Ronald J.
year 1995
title Visual Thinking For Architects And Designers: Visualizing Context In Design
source Van Nostrand Reinhold
summary Here at last is a book that will help architects and designers avoid the pitfall of creating buildings that battle aesthetically with everything within a three-block radius. In Visual Thinking for Architects and Designers, Ron Kasprisin and James Pettinari unveil a solution to designing for the complex urban landscape: visual thinking. A concept twenty-five years in the making, this integrative approach will help harried professionals prevent environmental disasters. The authors present three-dimensional drawing (visual thinking) as a communication and decision-making tool to be used during the design and planning process. Because architects, landscape architects, and urban designers often work independently, on different scales, and at different interludes, no one can truly envision the completed project. Visual thinking is a way of getting input from every member of the team. Here, you'll learn how to use graphics, whether hand-drawn or computer-generated, as a language to express complex systems, interrelationships, and environments. Using over 300 high-quality drawings that are connected at many different scales, from aerial perspectives of entire regions to individual rooms and buildings, this groundbreaking book lays out an urban design process and methodology in a sequential and easily understood manner. The book is illustrated by the authors' own work, which has been recognized in national design competitions and by the AIA, APA, and NEA. The authors masterfully cover the use of drawing to analyze and create spaces, drawing technique, and communicating complex information to the public. Case studies convincingly illustrate the authors' approach.
series other
last changed 2003/04/23 15:14

_id 2068
authors Frazer, John
year 1995
title AN EVOLUTIONARY ARCHITECTURE
source London: Architectural Association
summary In "An Evolutionary Architecture", John Frazer presents an overview of his work for the past 30 years. Attempting to develop a theoretical basis for architecture using analogies with nature's processes of evolution and morphogenesis. Frazer's vision of the future of architecture is to construct organic buildings. Thermodynamically open systems which are more environmentally aware and sustainable physically, sociologically and economically. The range of topics which Frazer discusses is a good illustration of the breadth and depth of the evolutionary design problem. Environmental Modelling One of the first topics dealt with is the importance of environmental modelling within the design process. Frazer shows how environmental modelling is often misused or misinterpreted by architects with particular reference to solar modelling. From the discussion given it would seem that simplifications of the environmental models is the prime culprit resulting in misinterpretation and misuse. The simplifications are understandable given the amount of information needed for accurate modelling. By simplifying the model of the environmental conditions the architect is able to make informed judgments within reasonable amounts of time and effort. Unfortunately the simplications result in errors which compound and cause the resulting structures to fall short of their anticipated performance. Frazer obviously believes that the computer can be a great aid in the harnessing of environmental modelling data, providing that the same simplifying assumptions are not made and that better models and interfaces are possible. Physical Modelling Physical modelling has played an important role in Frazer's research. Leading to the construction of several novel machine readable interactive models, ranging from lego-like building blocks to beermat cellular automata and wall partitioning systems. Ultimately this line of research has led to the Universal Constructor and the Universal Interactor. The Universal Constructor The Universal Constructor features on the cover of the book. It consists of a base plug-board, called the "landscape", on top of which "smart" blocks, or cells, can be stacked vertically. The cells are individually identified and can communicate with neighbours above and below. Cells communicate with users through a bank of LEDs displaying the current state of the cell. The whole structure is machine readable and so can be interpreted by a computer. The computer can interpret the states of the cells as either colour or geometrical transformations allowing a wide range of possible interpretations. The user interacts with the computer display through direct manipulation of the cells. The computer can communicate and even direct the actions of the user through feedback with the cells to display various states. The direct manipulation of the cells encourages experimentation by the user and demonstrates basic concepts of the system. The Universal Interactor The Universal Interactor is a whole series of experimental projects investigating novel input and output devices. All of the devices speak a common binary language and so can communicate through a mediating central hub. The result is that input, from say a body-suit, can be used to drive the out of a sound system or vice versa. 
The Universal Interactor opens up many possibilities for expression when using a CAD system that may at first seem very strange.However, some of these feedback systems may prove superior in the hands of skilled technicians than more standard devices. Imagine how a musician might be able to devise structures by playing melodies which express the character. Of course the interpretation of input in this form poses a difficult problem which will take a great deal of research to achieve. The Universal Interactor has been used to provide environmental feedback to affect the development of evolving genetic codes. The feedback given by the Universal Interactor has been used to guide selection of individuals from a population. Adaptive Computing Frazer completes his introduction to the range of tools used in his research by giving a brief tour of adaptive computing techniques. Covering topics including cellular automata, genetic algorithms, classifier systems and artificial evolution. Cellular Automata As previously mentioned Frazer has done some work using cellular automata in both physical and simulated environments. Frazer discusses how surprisingly complex behaviour can result from the simple local rules executed by cellular automata. Cellular automata are also capable of computation, in fact able to perform any computation possible by a finite state machine. Note that this does not mean that cellular automata are capable of any general computation as this would require the construction of a Turing machine which is beyond the capabilities of a finite state machine. Genetic Algorithms Genetic algorithms were first presented by Holland and since have become a important tool for many researchers in various areas.Originally developed for problem-solving and optimization problems with clearly stated criteria and goals. Frazer fails to mention one of the most important differences between genetic algorithms and other adaptive problem-solving techniques, ie. neural networks. Genetic algorithms have the advantage that criteria can be clearly stated and controlled within the fitness function. The learning by example which neural networks rely upon does not afford this level of control over what is to be learned. Classifier Systems Holland went on to develop genetic algorithms into classifier systems. Classifier systems are more focussed upon the problem of learning appropriate responses to stimuli, than searching for solutions to problems. Classifier systems receive information from the environment and respond according to rules, or classifiers. Successful classifiers are rewarded, creating a reinforcement learning environment. Obviously, the mapping between classifier systems and the cybernetic view of organisms sensing, processing and responding to environmental stimuli is strong. It would seem that a central process similar to a classifier system would be appropriate at the core of an organic building. Learning appropriate responses to environmental conditions over time. Artificial Evolution Artificial evolution traces it's roots back to the Biomorph program which was described by Dawkins in his book "The Blind Watchmaker". Essentially, artificial evolution requires that a user supplements the standard fitness function in genetic algorithms to guide evolution. The user may provide selection pressures which are unquantifiable in a stated problem and thus provide a means for dealing ill-defined criteria. 
Frazer notes that solving problems with ill-defined criteria using artificial evolution seriously limits the scope of problems that can be tackled. The reliance upon user interaction in artificial evolution reduces the practical size of populations and the duration of evolutionary runs. Coding Schemes Frazer goes on to discuss the encoding of architectural designs and their subsequent evolution. Introducing two major systems, the Reptile system and the Universal State Space Modeller. Blueprint vs. Recipe Frazer points out the inadequacies of using standard "blueprint" design techniques in developing organic structures. Using a "recipe" to describe the process of constructing a building is presented as an alternative. Recipes for construction are discussed with reference to the analogous process description given by DNA to construct an organism. The Reptile System The Reptile System is an ingenious construction set capable of producing a wide range of structures using just two simple components. Frazer saw the advantages of this system for rule-based and evolutionary systems in the compactness of structure descriptions. Compactness was essential for the early computational work when computer memory and storage space was scarce. However, compact representations such as those described form very rugged fitness landscapes which are not well suited to evolutionary search techniques. Structures are created from an initial "seed" or minimal construction, for example a compact spherical structure. The seed is then manipulated using a series of processes or transformations, for example stretching, shearing or bending. The structure would grow according to the transformations applied to it. Obviously, the transformations could be a predetermined sequence of actions which would always yield the same final structure given the same initial seed. Alternatively, the series of transformations applied could be environmentally sensitive resulting in forms which were also sensitive to their location. The idea of taking a geometrical form as a seed and transforming it using a series of processes to create complex structures is similar in many ways to the early work of Latham creating large morphological charts. Latham went on to develop his ideas into the "Mutator" system which he used to create organic artworks. Generalising the Reptile System Frazer has proposed a generalised version of the Reptile System to tackle more realistic building problems. Generating the seed or minimal configuration from design requirements automatically. From this starting point (or set of starting points) solutions could be evolved using artificial evolution. Quantifiable and specific aspects of the design brief define the formal criteria which are used as a standard fitness function. Non-quantifiable criteria, including aesthetic judgments, are evaluated by the user. The proposed system would be able to learn successful strategies for satisfying both formal and user criteria. In doing so the system would become a personalised tool of the designer. A personal assistant which would be able to anticipate aesthetic judgements and other criteria by employing previously successful strategies. Ultimately, this is a similar concept to Negroponte's "Architecture Machine" which he proposed would be computer system so personalised so as to be almost unusable by other people. The Universal State Space Modeller The Universal State Space Modeller is the basis of Frazer's current work. 
It is a system which can be used to model any structure, hence the universal claim in it's title. The datastructure underlying the modeller is a state space of scaleless logical points, called motes. Motes are arranged in a close-packing sphere arrangement, which makes each one equidistant from it's twelve neighbours. Any point can be broken down into a self-similar tetrahedral structure of logical points. Giving the state space a fractal nature which allows modelling at many different levels at once. Each mote can be thought of as analogous to a cell in a biological organism. Every mote carries a copy of the architectural genetic code in the same way that each cell within a organism carries a copy of it's DNA. The genetic code of a mote is stored as a sequence of binary "morons" which are grouped together into spatial configurations which are interpreted as the state of the mote. The developmental process begins with a seed. The seed develops through cellular duplication according to the rules of the genetic code. In the beginning the seed develops mainly in response to the internal genetic code, but as the development progresses the environment plays a greater role. Cells communicate by passing messages to their immediate twelve neighbours. However, it can send messages directed at remote cells, without knowledge of it's spatial relationship. During the development cells take on specialised functions, including environmental sensors or producers of raw materials. The resulting system is process driven, without presupposing the existence of a construction set to use. The datastructure can be interpreted in many ways to derive various phenotypes. The resulting structure is a by-product of the cellular activity during development and in response to the environment. As such the resulting structures have much in common with living organisms which are also the emergent result or by-product of local cellular activity. Primordial Architectural Soups To conclude, Frazer presents some of the most recent work done, evolving fundamental structures using limited raw materials, an initial seed and massive feedback. Frazer proposes to go further and do away with the need for initial seed and start with a primordial soup of basic architectural concepts. The research is attempting to evolve the starting conditions and evolutionary processes without any preconditions. Is there enough time to evolve a complex system from the basic building blocks which Frazer proposes? The computational complexity of the task being embarked upon is not discussed. There is an implicit assumption that the "superb tactics" of natural selection are enough to cut through the complexity of the task. However, Kauffman has shown how self-organisation plays a major role in the early development of replicating systems which we may call alive. Natural selection requires a solid basis upon which it can act. Is the primordial soup which Frazer proposes of the correct constitution to support self-organisation? Kauffman suggests that one of the most important attributes of a primordial soup to be capable of self-organisation is the need for a complex network of catalysts and the controlling mechanisms to stop the reactions from going supracritical. Can such a network be provided of primitive architectural concepts? What does it mean to have a catalyst in this domain? Conclusion Frazer shows some interesting work both in the areas of evolutionary design and self-organising systems. 
It is obvious from his work that he sympathizes with the opinions put forward by Kauffman that the order found in living organisms comes from both external evolutionary pressure and internal self-organisation. His final remarks underly this by paraphrasing the words of Kauffman, that life is always to found on the edge of chaos. By the "edge of chaos" Kauffman is referring to the area within the ordered regime of a system close to the "phase transition" to chaotic behaviour. Unfortunately, Frazer does not demonstrate that the systems he has presented have the necessary qualities to derive useful order at the edge of chaos. He does not demonstrate, as Kauffman does repeatedly, that there exists a "phase transition" between ordered and chaotic regimes of his systems. He also does not make any studies of the relationship of useful forms generated by his work to phase transition regions of his systems should they exist. If we are to find an organic architecture, in more than name alone, it is surely to reside close to the phase transition of the construction system of which is it built. Only there, if we are to believe Kauffman, are we to find useful order together with environmentally sensitive and thermodynamically open systems which can approach the utility of living organisms.
series other
type normal paper
last changed 2004/05/22 14:12
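
The review's tour of Holland-style genetic algorithms (selection, crossover and mutation against a clearly stated fitness function) can be made concrete with a toy sketch. This is a generic GA, not Frazer's Reptile system or Universal State Space Modeller: the bit-counting fitness is a stand-in for the formal criteria of a design brief, and in the artificial evolution the review describes it would be supplemented or replaced by a human judgement.

```python
import random

GENOME_LEN, POP, GENERATIONS = 32, 40, 60

def fitness(genome):
    """Toy 'clearly stated criterion': count the 1-bits."""
    return sum(genome)

def select(pop):
    """Tournament selection: the fitter of two random individuals wins."""
    a, b = random.sample(pop, 2)
    return max(a, b, key=fitness)

def crossover(p1, p2):
    cut = random.randrange(1, GENOME_LEN)
    return p1[:cut] + p2[cut:]

def mutate(genome, rate=0.02):
    return [g ^ 1 if random.random() < rate else g for g in genome]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP)]
for _ in range(GENERATIONS):
    population = [mutate(crossover(select(population), select(population)))
                  for _ in range(POP)]
print("best fitness:", fitness(max(population, key=fitness)), "of", GENOME_LEN)
```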

_id c05a
authors Bridges, Alan
year 1995
title Design Precedents for Virtual Worlds
source Sixth International Conference on Computer-Aided Architectural Design Futures [ISBN 9971-62-423-0] Singapore, 24-26 September 1995, pp. 293-302
summary The usual precedents cited in relation to Cyberspace are William Gibson's book "Neuromancer" and Ridley Scott's film "Blade Runner". This paper argues that, whilst literature and film are appropriate precedents, there are more suitable sources to refer to when designing virtual worlds. The paper discusses the use of computer modelling in exploring architectonic concepts in three-dimensional space. In doing so it draws on the philosophy of simulation and gives examples from alternative film and literature sources, but concludes that one of the most appropriate metaphors is widely available in the form of the television soap opera.
keywords Design Simulation, Space, Time, Virtual Reality
series CAAD Futures
email
last changed 2003/11/21 15:16

_id 913a
authors Brutzman, D.P., Macedonia, M.R. and Zyda, M.J.
year 1995
title Internetwork Infrastructure Requirements for Virtual Environments
source NIl 2000 Forum of the Computer Science and Telecommunications Board, National Research Council, Washington, D.C., May 1995
summary Virtual environments (VEs) are a broad multidisciplinary research area that includes all aspects of computer science, virtual reality, virtual worlds, teleoperation and telepresence. A variety of network elements are required to scale up virtual environments to arbitrarily large sizes, simultaneously connecting thousands of interacting players and all kinds of information objects. Four key communications components for virtual environments are found within the Internet Protocol (IP) suite: light-weight messages, network pointers, heavy-weight objects and real-time streams. Software and hardware shortfalls and successes for internetworked virtual environments provide specific research conclusions and recommendations. Since large-scale networked virtual environments are intended to include all possible types of content and interaction, they are expected to enable new classes of interdisciplinary research and sophisticated applications that are particularly suitable for implementation using the Virtual Reality Modeling Language (VRML).
series other
last changed 2003/04/23 15:50
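
Of the four communications components the paper identifies, the light-weight message is the simplest to illustrate: a small, fixed-format entity-state update sent unreliably and often. The sketch below is an illustrative assumption in the spirit of DIS-style state updates, not a message format defined by the paper; the field layout, port and entity values are invented for the example.

```python
import socket
import struct

# Network byte order: entity id, position (x, y, z), velocity (vx, vy, vz).
STATE_FMT = "!I6f"

def pack_state(entity_id, pos, vel):
    return struct.pack(STATE_FMT, entity_id, *pos, *vel)

def unpack_state(data):
    entity_id, *rest = struct.unpack(STATE_FMT, data)
    return entity_id, tuple(rest[:3]), tuple(rest[3:])

msg = pack_state(42, (1.0, 2.0, 0.5), (0.0, 0.1, 0.0))
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(msg, ("127.0.0.1", 9999))   # fire-and-forget; loss is tolerated
print(len(msg), "bytes:", unpack_state(msg))
```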

_id 679e
authors Coyne, R.
year 1995
title Designing Information Technology in the Postmodern Age
source The MIT Press, Cambridge, Ma and London UK
summary Designing Information Technology in the Postmodern Age puts the theoretical discussion of computer systems and information technology on a new footing. Shifting the discourse from its usual rationalistic framework, Richard Coyne shows how the conception, development, and application of computer systems is challenged and enhanced by postmodern philosophical thought. He places particular emphasis on the theory of metaphor, showing how it has more to offer than notions of method and models appropriated from science. Coyne examines the entire range of contemporary philosophical thinking -- including logical positivism, analytic philosophy, pragmatism, phenomenology, critical theory, hermeneutics, and deconstruction -- comparing them and showing how they differ in their consequences for design and development issues in electronic communications, computer representation, virtual reality, artificial intelligence, and multimedia. He also probes the claims made of information technology, including its presumptions of control, its so-called radicality, even its ability to make virtual worlds, and shows that many of these claims are poorly founded. Among the writings Coyne visits are works by Heidegger, Adorno, Benjamin, Gadamer, Derrida, Habermas, Rorty, and Foucault. He relates their views to information technology designers and critics such as Herbert Simon, Alan Kay, Terry Winograd, Hubert Dreyfus, and Joseph Weizenbaum. In particular, Coyne draws extensively from the writing of Martin Heidegger, who has presented one of the most radical critiques of technology to date.
series other
email
last changed 2003/04/23 15:14

_id ae9f
authors Damer, B.
year 1996
title Inhabited Virtual Worlds: A New Frontier for Interaction Design
source Interactions, Vol.3, No.5 ACM
summary In April of 1995 the Internet took a step into the third dimension with the introduction of the Virtual Reality Modeling Language (VRML) as a commercial standard. Another event that month caused fewer headlines but in retrospect was just as significant. A small company from San Francisco, Worlds Incorporated, launched WorldsChat, a three dimensional environment allowing any Internet user to don a digital costume, or avatar, and travel about and converse with other people inhabiting the space. WorldsChat was appropriately modeled on a space station complete with a central hub, hallways, sliding doors, windows, and escalators to outlying pods.
series journal paper
last changed 2003/04/23 15:50

_id 600e
authors Gavin, Lesley
year 1999
title Architecture of the Virtual Place
source Architectural Computing from Turing to 2000 [eCAADe Conference Proceedings / ISBN 0-9523687-5-7] Liverpool (UK) 15-17 September 1999, pp. 418-423
doi https://doi.org/10.52842/conf.ecaade.1999.418
summary The Bartlett School of Graduate Studies, University College London (UCL), set up the first MSc in Virtual Environments in the UK in 1995. The course aims to synthesise and build on research work undertaken in the arts, architecture, computing and biological sciences in exploring the realms of the creation of digital and virtual immersive spaces. The MSc is concerned primarily with equipping students from design backgrounds with the skills, techniques and theories necessary in the production of virtual environments. The course examines virtual worlds as prototypes for real urban or built form and, over the last few years, has also developed an increasing interest in the practice of architecture in purely virtual contexts. The MSc course is embedded in the UK government sponsored Virtual Reality Centre for the Built Environment, which is hosted by the Bartlett School of Architecture. This centre involves the UCL departments of architecture, computer science and geography and includes industrial partners from a number of areas concerned with the built environment, including architectural practice, surveying and estate management, as well as some software companies and the telecoms industry. The first cohort of students graduated in 1997 and predominantly found work in companies working in the new market area of digital media. This paper aims to outline the nature of the course as it stands, examine the new and ever increasing market for designers within digital media and propose possible future directions for the course.
keywords Virtual Reality, Immersive Spaces, Digital Media, Education
series eCAADe
email
more http://www.bartlett.ucl.ac.uk/ve/
last changed 2022/06/07 07:51

_id d5b3
authors Knight, Michael and Brown, Andre
year 1999
title Working in Virtual Environments through appropriate Physical Interfaces
source Architectural Computing from Turing to 2000 [eCAADe Conference Proceedings / ISBN 0-9523687-5-7] Liverpool (UK) 15-17 September 1999, pp. 431-436
doi https://doi.org/10.52842/conf.ecaade.1999.431
summary The work described here is aimed at contributing towards the debate and development relating to the construction of interfaces to explore buildings and their environs through virtual worlds. We describe a particular hardware and software configuration which derives from the use of low-cost games software to create the Virtual Environment. The Physical Interface responds to the work of other researchers in this area, in particular Shaw (1994) and Vasquez de Velasco & Trigo (1997). Virtual Environments might have the potential to be "a magical window into other worlds, from molecules to minds" (Rheingold, 1992), but what is the nature of that window? Currently it is often a translucent opening which gives a hazy and distorted (disembodied) view, and many versions of such openings are relatively expensive. We consider ways towards clearing the haze without too much expense, adapting techniques proposed by developers of low-cost virtual reality systems (Hollands, 1995) for use in an architectural setting.
keywords Virtual Environments, Games Software
series eCAADe
email
last changed 2022/06/07 07:51

_id 02f7
authors Liebich, Thomas and Kim, Inhan
year 1995
title ID'EST: An Integrated Modelling Framework for Management of Architectural Data
source Sixth International Conference on Computer-Aided Architectural Design Futures [ISBN 9971-62-423-0] Singapore, 24-26 September 1995, pp. 377-387
summary An Integrated Design Environment, IDE, facilitates cooperation between different disciplines. The paper investigates the data modelling framework, distinguishes between homogeneous and heterogeneous model worlds, discusses the formal mapping mechanisms available to establish a heterogeneous model world, and introduces a way to incorporate CAD systems into IDE. A prototype IDE has been developed to prove these methods. The ID'EST prototype comprises its own core data model, different schemas to cope with several design views, and interfaces to incorporate external CAD systems. A prototype architectural data model has been defined that includes core data models and aspect models for the enclosure system and the spatial system. Conventional CAD systems can be integrated into ID'EST if they are able to map data from the aspect models into their own data structure, and vice versa, on a high semantic level. The inherent methods of classifying data in CAD (layers, macros and attached attributes) have been used to retrieve product data from CAD data files. The usability of conventional CAD systems as data instantiation tools for IDE has been proved, and a path has been shown by which existing tools can be integrated into new technology solutions.
keywords Product Modelling, Formal Mapping Specification, Computer-Aided Design
series CAAD Futures
email
last changed 2003/05/16 20:58
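
The abstract's closing point, that layers, macros and attached attributes can be used to retrieve product data from CAD files, can be sketched as a simple layer-name-to-class mapping. The layer naming convention and the Wall/Space classes below are hypothetical illustrations, not the ID'EST core or aspect schemas.

```python
from dataclasses import dataclass

@dataclass
class Wall:
    points: list

@dataclass
class Space:
    points: list

# Assumed layer naming convention; a real project would need a richer scheme.
LAYER_TO_CLASS = {"WALL": Wall, "SPACE": Space}

def map_entities(cad_entities):
    """Lift (layer, geometry) pairs from a CAD file into model objects."""
    model = []
    for layer, geometry in cad_entities:
        cls = LAYER_TO_CLASS.get(layer.upper())
        if cls is not None:              # entities on unknown layers are skipped
            model.append(cls(points=geometry))
    return model

print(map_entities([("wall", [(0, 0), (5, 0)]),
                    ("SPACE", [(0, 0), (5, 0), (5, 4), (0, 4)])]))
```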

_id c7e9
authors Maver, T.W.
year 2002
title Predicting the Past, Remembering the Future
source SIGraDi 2002 - [Proceedings of the 6th Iberoamerican Congress of Digital Graphics] Caracas (Venezuela) 27-29 november 2002, pp. 2-3
summary Charlas Magistrales 2. There never has been such an exciting moment in the extraordinary 30-year history of our subject area as NOW, when the philosophical, theoretical and practical issues of virtuality are taking centre stage.
The Past: There have, of course, been other defining moments during these exciting 30 years:
• the first algorithms for generating building layouts (circa 1965)
• the first use of computer graphics for building appraisal (circa 1966)
• the first integrated package for building performance appraisal (circa 1972)
• the first computer generated perspective drawings (circa 1973)
• the first robust drafting systems (circa 1975)
• the first dynamic energy models (circa 1982)
• the first photorealistic colour imaging (circa 1986)
• the first animations (circa 1988)
• the first multimedia systems (circa 1995), and
• the first convincing demonstrations of virtual reality (circa 1996).
Whereas the CAAD community has been hugely inventive in the development of ICT applications to building design, it has been woefully remiss in its attempts to evaluate the contribution of those developments to the quality of the built environment or to the efficiency of the design process. In the absence of any real evidence, one can only conjecture regarding the real benefits, which fall, it is suggested, under the following headings:
• Verisimilitude: The extraordinary quality of still and animated images of the formal qualities of the interiors and exteriors of individual buildings and of whole neighbourhoods must surely give great comfort to practitioners and their clients that what is intended, formally, is what will be delivered, i.e. WYSIWYG: what you see is what you get.
• Sustainability: The power of «first-principle» models of the dynamic energetic behaviour of buildings in response to changing diurnal and seasonal conditions has the potential to save millions of dollars and dramatically to reduce the damaging environmental pollution created by badly designed and managed buildings.
• Productivity: CAD is now a multi-billion dollar business which offers design decision support systems which operate, effectively, across continents, time-zones, professions and companies.
• Communication: Multimedia technology, cheap to deliver but high in value, is changing the way in which we can explain and understand the past and envisage and anticipate the future; virtual past and virtual future!
Macromyopia: The late John Lansdown offered the view, in his wonderfully prophetic way, that "the future will be just like the past, only more so". So what can we expect the extraordinary trajectory of our subject area to be? To have any chance of being accurate we have to have an understanding of the phenomenon of macromyopia: the phenomenon exhibited by society of greatly exaggerating the immediate short-term impact of new technologies (particularly the information technologies) but, more importantly, seriously underestimating their sustained long-term impacts, socially, economically and intellectually.
Examples of flawed predictions regarding the future application of information technologies include:
• The British Government in 1880 declined to support the idea of a national telephonic system, backed by the argument that there were sufficient small boys in the countryside to run with messages.
• Alexander Bell was modest enough to say: «I am not boasting or exaggerating but I believe, one day, there will be a telephone in every American city».
• Tom Watson, in 1943, said: «I think there is a world market for about 5 computers».
• In 1977, Ken Olsen of Digital said: «There is no reason for any individuals to have a computer in their home».
The Future: Just as the ascent of woman/man-kind can be attributed to her/his capacity to discover amplifiers of the modest human capability, so we shall discover how best to exploit our most important amplifier, that of the intellect. The more we know the more we can figure; the more we can figure the more we understand; the more we understand the more we can appraise; the more we can appraise the more we can decide; the more we can decide the more we can act; the more we can act the more we can shape; and the more we can shape, the better the chance that we can leave for future generations a truly sustainable built environment which is fit-for-purpose, cost-beneficial, environmentally friendly and culturally significant. Central to this aspiration will be our understanding of the relationship between real and virtual worlds and how to move effortlessly between them. We need to be able to design, from within the virtual world, environments which may be real or may remain virtual or, perhaps, be part real and part virtual. What is certain is that the next 30 years will be every bit as exciting and challenging as the first 30 years.
series SIGRADI
email
last changed 2016/03/10 09:55

_id f2c8
authors Mine, M.
year 1995
title ISAAC: A virtual environment tool for the interactive construction of virtual worlds
source UNC Chapel Hill Computer Science Technical Report. TR95-020
summary abstract in folder as EPS
series report
email
last changed 2003/04/23 15:50

No more hits.
