CumInCAD is a Cumulative Index of publications in Computer Aided Architectural Design, supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD Futures.


Hits 1 to 20 of 84

_id ascaad2016_001
id ascaad2016_001
authors Al-Attili, Aghlab; Anastasia Karandinou and Ben Daley
year 2016
title Parametricism vs Materialism - Evolution of digital technologies for development
source Parametricism Vs. Materialism: Evolution of Digital Technologies for Development [8th ASCAAD Conference Proceedings ISBN 978-0-9955691-0-2] London (United Kingdom) 7-8 November 2016, pp. VII-VIII
summary We build on previous technological developments in CAAD by looking into parametric design exploration and the development of the concept of parametricism. We use the phenomenological backdrop to account for our physical experiences and encounters as well as our mental ones; both are evident in the link between parametric design as a process and an outcome. Specifically, we previously examined two particular metaphors. The first metaphor addressed aspects of virtual environments that resemble our physical world; in other words, the computer model as physical model and the digital world as material world. In this volume, we extend the exploration into aspects of virtual environments and their resemblance to physical environments by looking at ‘performance’ aspects: the way in which environments are sensed, measured, tracked and visualised. Moreover, we reflect on matters and materiality in both virtual and physical space philosophically, theoretically, practically and reflectively. The second metaphor looked into the modes and means of interaction between our bodies and such virtual environments. Here we extend the investigation to look at the ways in which measures of environmental performance influence human interaction in real environments. The exploration takes us further into the area of design fabrication of the built environment, the methods by which developed processes meet environmental performance requirements, and the innovative outcomes that lead to disruptive technologies being introduced into design; we revisit parametric design under this focus area.
series ASCAAD
email alattili@gmail.com
last changed 2017/05/25 11:06

_id caadria2014_042
id caadria2014_042
authors Alam, Jack and Jeremy J. Ham
year 2014
title Towards a BIM-Based Energy Rating System
source Rethinking Comprehensive Design: Speculative Counterculture, Proceedings of the 19th International Conference on Computer-Aided Architectural Design Research in Asia (CAADRIA 2014) / Kyoto 14-16 May 2014, pp. 285–294
summary Governments in Australia are faced with policy implementation that mandates more energy-efficient housing (Foran, Lenzen & Dey 2005). To this effect, the National Construction Code (NCC) 2013 stipulates the minimum energy performance for residential buildings as 114 MJ/m2 per annum, or 6 stars on an energy rating scale. Compliance with this minimum is mandatory, but there are several methods through which residential buildings can be rated to comply with the deemed-to-satisfy provisions outlined in the NCC. FirstRate5 is by far the most commonly used simulation software in Victoria, Australia. Meanwhile, Building Information Modelling (BIM), using software such as ArchiCAD, has gained a foothold in the industry. The energy simulation software within ArchiCAD, EcoDesigner, enables reporting on energy performance based on BIM elements that contain thermal information. This research is founded on a comparative study between FirstRate5 and EcoDesigner. Three building types were analysed and compared. The comparison finds significant differences between the simulations in measured areas and thermal loads, as well as potentially serious shortcomings within FirstRate5; these are discussed along with the future potential of a fully BIM-integrated model for energy rating certification in Victoria.
keywords Building Information Modelling, energy rating, FirstRate 5, ArchiCAD EcoDesigner, Building Energy Model
series CAADRIA
email Jack.alam@gmail.com
last changed 2014/04/22 08:23

_id ddssup9601
id ddssup9601
authors Aoke, Yoshitsugu and Muraoka, Naoto
year 1996
title An optimization method of the facility location by genetic algorithm
source Timmermans, Harry (Ed.), Third Design and Decision Support Systems in Architecture and Urban Planning - Part two: Urban Planning Proceedings (Spa, Belgium), August 18-21, 1996
summary In planning community facilities, it is important to decide the facility location that provides effective service for residents. The behavior of residents using the facility and the evaluation methods of the location have been studied. But finding the optimum location is very hard in actual planning, because the volume of calculation depends on the number of feasible locating points of facilities. To overcome the difficulty of searching for the optimum location, we propose an optimization method using a Genetic Algorithm. An alternative location is expressed by a chromosome. Each chromosome consists of genes, and each gene expresses a zone in which a facility is located. We give definitions of the genetic procedures: crossing-over, mutation and selection. Alternatives of the facility location are generated by these genetic procedures, analogously to biological evolution. For each alternative, the behavior of users is estimated by a spatial-interaction model, and the facilities that residents in each place choose are determined. The effectiveness of the location is measured by the total sum of distances between the facility and the user. After confirming the effectiveness of our method by applying it to idealized example problems, we applied it to an actual problem in a Japanese town. With this method we could find the optimum location in about one-third of the time and effort compared with the ordinary method.
series DDSS
last changed 2003/11/21 14:15
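The abstract above describes a concrete encoding: a chromosome is a list of genes, each gene the zone in which one facility is located, and fitness is the total distance between residents and their facility. The following is a minimal sketch of that kind of GA; the zone coordinates, resident locations, operators and parameter values are illustrative assumptions, not the authors' implementation or data, and the spatial-interaction model is reduced here to nearest-facility assignment.

# Minimal sketch of the GA encoding described in the abstract: a chromosome is a
# list of zone indices (one gene per facility); fitness is the total distance
# from each resident location to its nearest facility. Names and parameters are
# illustrative, not taken from the paper.
import random

ZONES = [(0, 0), (5, 2), (2, 7), (8, 8), (4, 4)]      # candidate facility zones (x, y)
RESIDENTS = [(1, 1), (6, 2), (3, 6), (7, 7), (4, 5)]  # resident locations (x, y)
N_FACILITIES = 2

def total_distance(chromosome):
    """Sum of each resident's distance to the nearest located facility."""
    sites = [ZONES[g] for g in chromosome]
    return sum(min(((rx - sx) ** 2 + (ry - sy) ** 2) ** 0.5 for sx, sy in sites)
               for rx, ry in RESIDENTS)

def crossover(a, b):
    cut = random.randint(1, len(a) - 1)
    return a[:cut] + b[cut:]

def mutate(chromosome, rate=0.1):
    return [random.randrange(len(ZONES)) if random.random() < rate else g
            for g in chromosome]

def evolve(pop_size=30, generations=100):
    pop = [[random.randrange(len(ZONES)) for _ in range(N_FACILITIES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=total_distance)                  # selection: keep the fittest half
        parents = pop[:pop_size // 2]
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return min(pop, key=total_distance)

best = evolve()
print(best, total_distance(best))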

_id 174f
authors Bakker, N.H.
year 2001
title Spatial Orientation in Virtual Environments
source Delft University of Technology
summary Recently, a growing interest can be detected in the application of Virtual Environment (VE) technology as an operator interface. VEs are three-dimensional computer-generated images that can be shown on a conventional monitor, on a large screen display, or on a head-mounted display. In order to use these three-dimensional interfaces for finding and retrieving information, users must be able to orient themselves spatially. Different types of VE technology are available for navigating in these VEs, and different types of navigation can be enabled. A choice has to be made between the different options to enable good spatial orientation of the user. There are two main types of VE interface: an immersive interface that provides rich sensory feedback to the user when moving around in the VE, and a non-immersive interface that provides only visual feedback. Furthermore, navigation through the VE can either be continuous, providing fluent motion, or discontinuous, which means that the viewpoint is displaced instantaneously over a large distance. To provide insight into the possible effects of these options, a series of nine experiments was carried out. In the experiments the quality of the spatial orientation behaviour of test subjects was measured while they used the different types of interface and the different types of navigation. The results of the experiments indicate that immersive navigation improves the perception of displacement through the VE, which in turn aids the acquisition of spatial knowledge. However, as soon as the spatial layout of the VE is learned, the two types of navigation interface do not lead to differences in spatial orientation performance. A discontinuous displacement leads to temporary disorientation, which hinders the acquisition of spatial knowledge. The type of discontinuous displacement has an effect on the time needed for anticipation. The disorienting effects of a discontinuous displacement can be compensated for by enabling cognitive anticipation of the destination of the displacement. These results suggest that immersive navigation might only be beneficial for application domains in which new spatial layouts have to be learned every time, or in domains where the primary users are novices: for instance, training firemen on the layout of new buildings with a VE, or using architectural walkthroughs in VE to show new building designs to potential buyers. Discontinuous movement should not be allowed when exploring a new environment. Once the environment is learned, and if fast displacement is essential, then discontinuous displacement should be preferred. In this case, the interface designer must make sure that information is provided about the destination of a discontinuous displacement.
series thesis:PhD
last changed 2003/11/21 14:16

_id ddssar0206
id ddssar0206
authors Bax, M.F.Th. and Trum, H.M.G.J.
year 2002
title Faculties of Architecture
source Timmermans, Harry (Ed.), Sixth Design and Decision Support Systems in Architecture and Urban Planning - Part one: Architecture Proceedings (Avegoor, the Netherlands), 2002
summary In order to be inscribed in the European Architects' register, the study program leading to the diploma 'Architect' has to meet the criteria of the EC Architects' Directive (1985). The criteria are enumerated in 11 principles of Article 3 of the Directive. The Advisory Committee, established by the European Council, was given the task of examining such diplomas in cases where doubts are raised by other Member States. To carry out this task a matrix was designed as an independent interpreting framework that mediates between the principles of Article 3 and the actual study program of a faculty. Such a tool was needed because of inconsistencies in the list of principles, differences between linguistic versions of the Directive, and problems with quantifying the time devoted to the principles in the study programs. The core of the matrix, its headings, is a categorisation of the principles on a higher level of abstraction in the form of a taxonomy of domains and corresponding concepts. Filling in the matrix means that each study element of the study programs is analysed according to its content in terms of domains; the summation of study time devoted to the various domains results in a so-called 'profile of a faculty'. Judgement of that profile takes place by a committee of peers. The domains of the taxonomy are intrinsically the same as the concepts and categories needed for the description of an architectural design object: the faculties of architecture. This correspondence relates the taxonomy to the field of design theory and philosophy. The taxonomy is an application of Domain theory. This theory, developed by the authors since 1977, takes the view that the architectural object can only be described fully as an integration of all types of domains. The theory supports the idea of a participatory and interdisciplinary approach to design, which has proved rewarding from both a scientific and a social point of view. All types of domains have in common that they are measured in three dimensions: form, function and process, connecting the material aspects of the object with its social and procedural aspects. In the taxonomy the function dimension is emphasised. It will be argued in the paper that the taxonomy is a categorisation following the pragmatist philosophy of Charles Sanders Peirce. It will be demonstrated as well that the taxonomy is easy to handle, by giving examples of its application in various countries over the last 5 years. The taxonomy proved to be an adequate tool for the judgement of study programs and their subsequent improvement, as constituted by the faculties of a Faculty of Architecture. The matrix is described as the result of theoretical reflection and the practical application of a matrix already in use since 1995. The major improvement of the matrix is its direct connection with Peirce's universal categories and the self-explanatory character of its structure. The connection with Peirce's categories gave the matrix a more universal character, which enables application in other fields where the term 'architecture' is used as a metaphor for artefacts.
series DDSS
last changed 2003/11/21 14:16
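Computationally, the matrix evaluation the abstract describes reduces to distributing each study element's time over the taxonomy's domains and summing per domain to obtain the 'profile of a faculty'. The sketch below illustrates only that aggregation step, with invented course names, hours and domain shares; the actual taxonomy and the peer-judgement stage are not reproduced.

# Illustrative sketch of the "profile of a faculty" computation the abstract
# describes: study time of each course is distributed over the taxonomy's
# domains and summed per domain. Domain names and figures are invented examples,
# not data from the Directive or the paper.
study_elements = [
    # (course, total hours, share of time per domain)
    ("Design studio",        300, {"form": 0.5, "function": 0.3, "process": 0.2}),
    ("Building technology",  120, {"form": 0.2, "function": 0.5, "process": 0.3}),
    ("History and theory",    90, {"form": 0.4, "function": 0.4, "process": 0.2}),
]

profile = {}
for _course, hours, shares in study_elements:
    for domain, share in shares.items():
        profile[domain] = profile.get(domain, 0.0) + hours * share

print(profile)   # e.g. {'form': 210.0, 'function': 186.0, 'process': 114.0}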

_id 8625
authors Caneparo, Luca
year 1995
title Simulation of Shape, Light, Color and Material in Design
source Multimedia and Architectural Disciplines [Proceedings of the 13th European Conference on Education in Computer Aided Architectural Design in Europe / ISBN 0-9523687-1-4] Palermo (Italy) 16-18 November 1995, pp. 417-426
summary The purpose of this paper is to analyze the role of simulation in architectural design. The concept of simulation is taken from physics, where analytical relationships are set up between measured phenomena and mathematical models. Computer visualization applies quantitative models to physical phenomena to simulate analytical aspects, and offers the designer programs to evaluate, in CAD models, the visual qualities and the numerical quantities of the interactions between shape, light, color and material. Regarding light-matter simulation, the paper presents recent developments in geometry which make it possible to visualize not only the surface appearance but also the in-depth structure of building materials.
series eCAADe
email caneparo@polito.it
more http://dpce.ing.unipa.it/Webshare/Wwwroot/ecaade95/Pag_35.htm
last changed 2000/12/02 12:26

_id caadria2008_59_session6a_487
id caadria2008_59_session6a_487
authors Chevrier, C.; J.P. Perrin
year 2008
title Interactive parametric modelling: POG, a tool for cultural heritage monument 3D reconstruction
source CAADRIA 2008 [Proceedings of the 13th International Conference on Computer Aided Architectural Design Research in Asia] Chiang Mai (Thailand) 9-12 April 2008, pp. 487-493
summary Historic monument and archaeological site 3D reconstruction is nowadays often required for many applications (scientific and architectural studies, virtual visits for a better understanding of the monument, etc.). This task is very time-consuming. Automating the modelling of the most common components could ease this 3D work and produce accurate, consistent and re-usable models. Based upon compound rules of architectural elements, but also upon various other data sources such as photographs and 3D laser scans, we have conceived and developed an interactive tool for virtual 3D reconstruction of heritage monuments. It allows quick modelling and accurate adjustment to the measured data. This tool could be a great help for architects and archaeologists. Research first began with the study of classical architecture and has continued with other architectural styles. Architectural elements are described with parametric data, then generated by our tool. Our main application context was the town of Nancy in France, where there are many classical buildings on which to test our tool. It will be further extended to other architectural styles and will be combined with photogrammetry methods.
keywords parametric modelling, cultural heritage, 3D model
series CAADRIA
email chevrier@crai.archi.fr, perrin@crai.archi.fr
last changed 2012/05/30 19:29

_id cf2009_105
id cf2009_105
authors Chevrier, Christine; Perrin, Jean-Pierre
year 2009
title Generation of architectural parametric components: Cultural heritage 3D modelling
source T. Tidafi and T. Dorta (eds) Joining Languages, Cultures and Visions: CAADFutures 2009, PUM, 2009, pp. 105-118
summary This paper deals with 3D modeling of complex architectural elements for virtual 3D scene reconstruction based on images or point clouds. It presents a new method that proceeds in the opposite direction to classical photogrammetry and lasergrammetry techniques: parametric components are created and then adapted to the measured data. We have conceived and developed a parametric shape generator tool for virtual 3D reconstruction of cultural heritage monuments. We present a geometrical study of cupola shapes in all their diversity, illustrated with the Suleymaniyé Mosque in Turkey. The results are promising. The modeling time is greatly reduced.
keywords 3D modeling, architectural component, parametric modeling, cultural heritage
series CAAD Futures
email chevrier@crai.archi.fr
last changed 2009/06/08 18:53
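As an illustration of the kind of parametric component generation described above, the sketch below produces points on a spherical-cap cupola from two parameters (base radius and rise). The parameterization and function names are assumptions made for illustration and do not reproduce the paper's geometry study or the POG tool.

# A minimal sketch of parametric cupola generation: a spherical-cap dome defined
# by its base radius and rise, sampled into surface points. Illustrative only,
# not the POG tool's actual formulation.
import math

def cupola_points(base_radius, rise, n_meridians=16, n_parallels=8):
    """Return (x, y, z) points on a spherical-cap cupola."""
    R = (base_radius ** 2 + rise ** 2) / (2 * rise)   # radius of generating sphere
    z_center = rise - R                               # sphere centre sits below the springing
    pts = []
    for i in range(n_parallels + 1):
        z = rise * i / n_parallels
        r = math.sqrt(max(R ** 2 - (z - z_center) ** 2, 0.0))  # parallel radius at height z
        for j in range(n_meridians):
            a = 2 * math.pi * j / n_meridians
            pts.append((r * math.cos(a), r * math.sin(a), z))
    return pts

pts = cupola_points(base_radius=10.0, rise=6.0)
print(len(pts), pts[0])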

_id ddss9419
id ddss9419
authors Choukry, Maha
year 1994
title Knowledge Acquisition by Measurement: The Domain of Building Change
source Second Design and Decision Support Systems in Architecture & Urban Planning (Vaals, the Netherlands), August 15-19, 1994
summary This paper presents a study that is aimed at finding a basis for systematic knowledge acquisition. More specifically, it attempts to introduce knowledge acquisition by measurement: a method that allows objective evaluation of empirical observations. Measurement has proven to be a significant tool to acquire, evaluate, and upgrade knowledge in some knowledge domains. In other domains, such as the domain of building change, measurement is barely a subject of study. Building change knowledge acquisition by measurement seems to be becoming a significant subject of study for several reasons: (i) to increase our objective knowledge of previous building changes, (ii) to allow systematic monitoring of present changes, and (iii) to assist decisions in planning for change in new buildings. In current studies, questions such as what changes were required, which building elements fulfilled a change, how often a building changed, and what costs were related to a change often get no systematic or objective answers. Hence, to overcome that, I am concerned with finding a method that answers the following questions: 1) What is the domain of building change; 2) Is a method of knowledge acquisition by measurement adequate to represent building changes; 3) Can empirical observations of building change be systematically represented and objectively evaluated using this method; and 4) How can this method be applied to assist the understanding of previous changes, the control of present changes, and planning for building change. The method introduced is based on three modules: (i) the domain of building change; (ii) modelling this domain; and (iii) measurement. These three modules enable the formulation of the measurement of building change, namely the change indicator. Multiple change indicators, such as a cost change indicator or an occurrence change indicator, can measure empirical observations of building change. The sequential steps that lead to the development of this method start in section 1, where the domain of building change is specified. In section 2 this domain is modelled, and in section 3 the knowledge acquisition by measurement method is introduced. A case study showing how empirical building changes can be measured is explained in section 4. In section 5 three possible applications are introduced, and in section 6 I explain how a computerized prototype would enhance the efficiency of using such applications. Findings and conclusions resulting from this study are summarized in section 7.
series DDSS
email bwrbmc@urc.tue.nl
last changed 2003/08/07 14:36
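A hedged reading of the 'change indicator' idea above: empirical observations of building change are aggregated into, for example, an occurrence change indicator and a cost change indicator per building element. The sketch below shows only that aggregation; the record structure and figures are invented and do not come from the paper.

# Hedged sketch of change indicators: given empirical observations of building
# changes, aggregate an occurrence indicator (how often an element type changed)
# and a cost indicator (total cost of those changes). Illustrative data only.
from collections import defaultdict

observed_changes = [
    # (building element, year, cost)
    ("partition wall", 1991, 4200.0),
    ("partition wall", 1993, 3900.0),
    ("services duct",  1992, 7500.0),
]

occurrence = defaultdict(int)
cost = defaultdict(float)
for element, _year, change_cost in observed_changes:
    occurrence[element] += 1        # occurrence change indicator
    cost[element] += change_cost    # cost change indicator

for element in occurrence:
    print(element, occurrence[element], cost[element])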

_id 93cc
authors Colajanni, B., Pellitteri, G. and Concialdi, S.
year 2000
title Retrieval Tools in Building Case Bases
source Promise and Reality: State of the Art versus State of Practice in Computing for the Design and Planning Process [18th eCAADe Conference Proceedings / ISBN 0-9523687-6-5] Weimar (Germany) 22-24 June 2000, pp. 113-116
summary Most of the existing aids to building design rely on databases of cases representing solutions to problems that are expected to recur, at least in a similar way. Crucial for the success of such an aid is the retrieval engine. In turn, its efficiency depends on the way the cases are encoded. Whatever that way is, cases will be represented at different levels of abstraction. The highest level will probably consist of an accessibility and adjacency graph. Another level could be a wire plan of the building. An easily workable representation of a graph is a square matrix. For any given building typology it is possible to write a list of encoded space types. This allows forming matrices that can be compared and their diversity measured. Here we present an algorithm that performs this comparison. Such an algorithm can be one of the case retrieval tools in the database. It is likely that the designer already has some idea of the shape he wants for the building he is designing. A comparison between some geometric characteristics of the wire representation of the retrieved case and the corresponding ones of the imagined solution of the design problem can constitute a second test. The matching can be done
keywords Knowledge, Case Bases, Building, Tools
series eCAADe
email bcolajan@unipa.it, pellitt@unipa.it
more http://www.uni-weimar.de/ecaade/
last changed 2002/11/23 05:59
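The abstract encodes each case as a square matrix over a fixed list of space types so that matrices "can be compared and their diversity measured". The sketch below illustrates one such comparison, counting differing adjacency entries; the space types, matrices and the diversity measure itself are assumptions, since the paper's actual algorithm is not given in the abstract.

# Sketch of comparing two cases encoded as adjacency matrices over the same
# ordered list of space types; "diversity" is measured here as the number of
# differing entries. Only the general idea, not the paper's retrieval algorithm.
SPACE_TYPES = ["entrance", "living", "kitchen", "bedroom"]

case_a = [  # 1 = adjacent / accessible, 0 = not
    [0, 1, 0, 0],
    [1, 0, 1, 1],
    [0, 1, 0, 0],
    [0, 1, 0, 0],
]
case_b = [
    [0, 1, 1, 0],
    [1, 0, 0, 1],
    [1, 0, 0, 0],
    [0, 1, 0, 0],
]

def diversity(m1, m2):
    """Count of entries in which the two adjacency matrices differ."""
    return sum(a != b for row1, row2 in zip(m1, m2) for a, b in zip(row1, row2))

print(diversity(case_a, case_b))  # 4 differing entries for these example cases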

_id cf2011_p051
id cf2011_p051
authors Cote, Pierre; Mohamed-Ahmed Ashraf, Tremblay Sebastien
year 2011
title A Quantitative Method to Compare the Impact of Design Mediums on the Architectural Ideation Process.
source Computer Aided Architectural Design Futures 2011 [Proceedings of the 14th International Conference on Computer Aided Architectural Design Futures / ISBN 9782874561429] Liege (Belgium) 4-8 July 2011, pp. 539-556.
summary If we compare the architectural design process to a black-box system, we can assume that we now know quite well both the inputs and the outputs of the system. Indeed, everything about the early project, whether feasibility studies, programming, context integration, or site analysis (urban, rural or natural), as well as the integration of participants in a collaborative process, can be considered to initiate and sustain the architectural design and ideation process. Similarly, outputs from that process are also, to some extent, well known and identifiable. We are referring here, among others, to the project representations or even to the concrete building construction and its post-evaluation. But what about the black box itself that produces the ideation? This is the question that this research attempts to answer. Currently, very few research works try to identify how the human brain accomplishes those tasks: which cognitive functions play this role, to what extent they operate and complement each other, and, among other things, whether there is possibly a chain of causality between these functions. Therefore, this study proposes to define a model that reflects the activity of the black box based on the cognitive activity of the human brain. From an extensive literature review, two cognitive functions have been identified and are investigated to account for some of the complex cognitive activity that occurs during a design process, namely mental workload and mental imagery. These two variables are measured quantitatively in the context of a real design task. Essentially, mental workload is measured using a Bakan test and mental imagery with eye tracking. The statistical software G*Power was used to identify the number of subjects necessary to obtain significant variance and correlation analysis results. Thus, in the context of exploratory research, to ensure an effect size of 0.25 and a statistical power of 0.80, 32 participants are needed. All participants are 3rd-, 4th- or 5th-year architecture students. They are also very familiar with the architectural design process and the design mediums used, i.e., analog models, freehand drawing and the CAD software SketchUp. In three experimental sessions, participants were asked to design three different projects, namely a bus shelter, a recycling station and a public toilet. These projects were selected and defined for their similar complexity, taking into account the available time of 22 minutes; all three design mediums were used, in a random order to avoid order effects. To analyze the two cognitive functions (mental workload and mental imagery), two instruments are used. Mental imagery is measured using eye movement tracking, with monitoring and quantitative analysis of scan paths and of the resulting number and duration of participant eye fixations (Johansson et al., 2005). Mental workload is measured using performance on an auditory secondary task inspired by Bakan's work (Bakan et al., 1963). Each of the three experimental sessions, lasting 90 minutes, was composed of two phases: 1. After calibrating the glasses for eye movement, the subject had to exercise freely for 3 minutes while wearing the glasses and headphones (Bakan task) to get used to wearing the hardware. Then, after reading the guidelines and criteria for the design project (± 5 minutes), he had 22 minutes to execute the design task on a drawing table allowing an upright posture.
Once the task was completed, the subject had to take the NASA TLX test on the assessment of mental workload (± 5 minutes) and a written post-experimental questionnaire on his impressions of the experiment (± 10 minutes). 2. After a break of 5-10 minutes, the participant answered a psychometric test, which is different for each session. These tests (± 20 minutes) are administered in the same order to each participant. Thus, in the first experimental session, the subject had to take the psychometric test from Ekstrom et al. (1978) on spatial performance (Factor-Referenced Cognitive Tests Kit). During the second session, cognitive style is evaluated using Oltman's test (1971). Finally, in the third and final session, participant creativity is evaluated using the Delis-Kaplan test (D-KEFS; Delis et al., 2001). This study will thus present the first results of quantitative measures to establish and validate the proposed model. Furthermore, the paper will also discuss the relevance of the proposed approach, considering that the teaching of ideation in our schools of architecture in North America is currently done essentially in a holistic manner through the architectural project.
keywords design, ideation process, mental workload, mental imagery, quantitative measure
series CAAD Futures
email pierre.cote@arc.ulaval.ca
last changed 2012/02/11 18:21

_id acadia17_202
id acadia17_202
authors Cupkova, Dana; Promoppatum, Patcharapit
year 2017
title Modulating Thermal Mass Behavior Through Surface Figuration
source ACADIA 2017: DISCIPLINES & DISRUPTION [Proceedings of the 37th Annual Conference of the Association for Computer Aided Design in Architecture (ACADIA) ISBN 978-0-692-96506-1] Cambridge, MA, 2-4 November 2017, pp. 202-211
summary This research builds upon a previous body of work focused on the relationship between surface geometry and heat transfer coefficients in thermal mass passive systems. It argues for the design of passive systems with higher fidelity to the multivariable space between performance and perception. Rooted in the combination of form and matter, the intention is to instrumentalize design principles for the choreography of thermal gradients between buildings and their environment from experiential, spatial and topological perspectives (Figure 1). Our work is built upon the premise that complex geometries can be used to improve both the aesthetic and thermodynamic performance of passive building systems (Cupkova and Azel 2015) by actuating thermal performance through geometric parameters, primarily due to convection. Currently, the engineering-oriented approach to the design of thermal mass relies on averaged thermal calculations (Holman 2002), which do not adequately describe the nuanced differences that can be produced by complex three-dimensional geometries of passive thermal mass systems. Using a combination of computational fluid dynamics simulations and physically measured data, we investigate the relationship of heat transfer coefficients to parameters of surface geometry. Our measured results suggest that we can deliberately and significantly delay heat absorption and re-radiation purely by changing the geometric surface pattern over the same thermal mass. The goal of this work is to offer designers a more robust rule set for understanding approximate thermal lag behaviors of complex geometric systems, with a focus on the design of geometric properties rather than complex thermal calculations.
keywords design methods; information processing; physics; smart materials
series ACADIA
email danacupkova@gmail.com
last changed 2017/10/17 09:12
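The thermal-lag idea above can be illustrated with the textbook lumped-capacitance model, in which the convective coefficient h and exposed area A set a time constant tau = rho*V*c/(h*A); a surface figuration that changes the effective h (or A) shifts how quickly the mass tracks the air temperature. This is only a first-order sketch with generic concrete properties and assumed h values, not the CFD-plus-measurement approach used in the paper.

# First-order (lumped-capacitance) sketch of thermal lag: the convective heat
# transfer coefficient h and exposed surface area A set the time constant
# tau = rho*V*c / (h*A) of a thermal mass element. Generic concrete properties;
# the "flat" and "patterned" h values are assumptions for illustration.
import math

def temperature(t, T0, T_env, h, area, volume, rho=2300.0, c=880.0):
    """Temperature of a lumped thermal mass after time t (seconds)."""
    tau = rho * volume * c / (h * area)          # time constant in seconds
    return T_env + (T0 - T_env) * math.exp(-t / tau)

# Same concrete block, two surface figurations: the patterned surface is assumed
# to raise the effective h and exposed area relative to the flat one.
flat      = dict(h=5.0, area=1.0, volume=0.2)
patterned = dict(h=8.0, area=1.3, volume=0.2)

for name, p in (("flat", flat), ("patterned", patterned)):
    T = temperature(t=3600 * 4, T0=20.0, T_env=30.0, **p)
    print(name, round(T, 2))    # the patterned block tracks the warmer air faster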

_id eaca
authors Davis, L. (ed.)
year 1991
title Handbook of genetic algorithms
source Van Nostrand Reinhold, New York
summary This book sets out to explain what genetic algorithms are and how they can be used to solve real-world problems. The first objective is tackled by the editor, Lawrence Davis. The remainder of the book is turned over to a series of short review articles by a collection of authors, each explaining how genetic algorithms have been applied to problems in their own specific area of interest. The first part of the book introduces the fundamental genetic algorithm (GA), explains how it has traditionally been designed and implemented and shows how the basic technique may be applied to a very simple numerical optimisation problem. The basic technique is then altered and refined in a number of ways, with the effects of each change being measured by comparison against the performance of the original. In this way, the reader is provided with an uncluttered introduction to the technique and learns to appreciate why certain variants of GA have become more popular than others in the scientific community. Davis stresses that the choice of a suitable representation for the problem in hand is a key step in applying the GA, as is the selection of suitable techniques for generating new solutions from old. He is refreshingly open in admitting that much of the business of adapting the GA to specific problems owes more to art than to science. It is nice to see the terminology associated with this subject explained, with the author stressing that much of the field is still an active area of research. Few assumptions are made about the reader's mathematical background. The second part of the book contains thirteen cameo descriptions of how genetic algorithmic techniques have been, or are being, applied to a diverse range of problems. Thus, one group of authors explains how the technique has been used for modelling arms races between neighbouring countries (a non- linear, dynamical system), while another group describes its use in deciding design trade-offs for military aircraft. My own favourite is a rather charming account of how the GA was applied to a series of scheduling problems. Having attempted something of this sort with Simulated Annealing, I found it refreshing to see the authors highlighting some of the problems that they had encountered, rather than sweeping them under the carpet as is so often done in the scientific literature. The editor points out that there are standard GA tools available for either play or serious development work. Two of these (GENESIS and OOGA) are described in a short, third part of the book. As is so often the case nowadays, it is possible to obtain a diskette containing both systems by sending your Visa card details (or $60) to an address in the USA.
series other
last changed 2003/04/23 13:14
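For readers unfamiliar with the 'fundamental genetic algorithm' the first part of the handbook introduces, the sketch below shows a generic textbook version (binary encoding, one-point crossover, bit mutation, truncation selection) on a toy numerical optimisation problem. It is not code from the handbook, GENESIS or OOGA, and all parameter choices are arbitrary.

# Generic textbook-style GA: binary encoding, one-point crossover, bit mutation,
# truncation selection, applied to maximising f(x) = x^2 over 5-bit integers.
import random

BITS = 5

def decode(bits):
    return int("".join(map(str, bits)), 2)

def fitness(bits):
    x = decode(bits)
    return x * x

def one_point_crossover(a, b):
    cut = random.randint(1, BITS - 1)
    return a[:cut] + b[cut:]

def mutate(bits, rate=0.05):
    return [1 - b if random.random() < rate else b for b in bits]

pop = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(20)]
for _ in range(40):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                               # truncation selection
    pop = parents + [mutate(one_point_crossover(random.choice(parents),
                                                random.choice(parents)))
                     for _ in range(10)]

best = max(pop, key=fitness)
print(decode(best), fitness(best))                   # should approach 31, 961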

_id e5a2
authors Debevec, P.
year 1998
title Rendering synthetic objects into real scenes: Bridging traditional and image-based graphics with global illumination and high dynamic range photography
source Proc. ACM SIGGRAPH 98, M. Cohen, Ed., 189–198
summary We present a method that uses measured scene radiance and global illumination in order to add new objects to light-based models with correct lighting. The method uses a high dynamic range image-based model of the scene, rather than synthetic light sources, to illuminate the new objects. To compute the illumination, the scene is considered as three components: the distant scene, the local scene, and the synthetic objects. The distant scene is assumed to be photometrically unaffected by the objects, obviating the need for reflectance model information. The local scene is endowed with estimated reflectance model information so that it can catch shadows and receive reflected light from the new objects. Renderings are created with a standard global illumination method by simulating the interaction of light amongst the three components. A differential rendering technique allows for good results to be obtained when only an estimate of the local scene reflectance properties is known. We apply the general method to the problem of rendering synthetic objects into real scenes. The light-based model is constructed from an approximate geometric model of the scene and by using a light probe to measure the incident illumination at the location of the synthetic objects. The global illumination solution is then composited into a photograph of the scene using the differential rendering technique. We conclude by discussing the relevance of the technique to recovering surface reflectance properties in uncontrolled lighting situations. Applications of the method include visual effects, interior design, and architectural visualization.
series other
last changed 2003/04/23 13:50
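The differential rendering step summarised above can be written compactly: outside the synthetic objects, the composite is the photograph plus the difference between the global-illumination rendering with the objects and the rendering of the local scene alone. The sketch below assumes linear, equally sized float images; the array and function names are illustrative, not from the paper's implementation.

# Differential rendering composite: object pixels come from the rendering; all
# other pixels are the photograph plus the rendered change in the local scene
# (shadows, reflected light). Images are assumed to be linear float arrays.
import numpy as np

def differential_composite(background, render_with_objects, render_local_only, obj_mask):
    """Composite synthetic objects into the photograph."""
    delta = render_with_objects - render_local_only
    composite = np.where(obj_mask.astype(bool),
                         render_with_objects,
                         background + delta)
    return np.clip(composite, 0.0, None)

# Toy 2x2 single-channel example: the object occupies one pixel and slightly
# darkens a neighbouring pixel with a shadow.
bg    = np.array([[0.5, 0.5], [0.5, 0.5]])
with_ = np.array([[0.2, 0.45], [0.5, 0.5]])
local = np.array([[0.6, 0.5],  [0.5, 0.5]])
mask  = np.array([[1, 0], [0, 0]])
print(differential_composite(bg, with_, local, mask))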

_id bsct_dervishi
id bsct_dervishi
authors Dervishi, Sokol
year 2006
title Computational Derivation of Incident Irradiance on Building Facades based on Measured Global Horizontal Irradiance Data
source Vienna University of Technology; Building Science & Technology
summary Reliable simulation of buildings' energy performance requires, amongst other things, the availability of detailed information on the magnitudes of incident solar radiation on building facades. However, the availability of measured data concerning incident solar radiation on vertical surfaces is restricted to only a few locations. In addition, concurrent measurements of horizontal global and horizontal diffuse (or direct normal) irradiance data are likewise available only for a limited number of locations. In contrast, global horizontal irradiance data is available for many locations. This research demonstrates how to computationally derive incident irradiance values on vertical (or otherwise inclined) building surfaces from measured global irradiance values. Given this context, three methods are considered to compute incident vertical irradiance values based on measured global horizontal irradiance data. Vertical solar irradiance measurements are described. Then, the computationally derived values are compared with corresponding measurements. The results are evaluated based on their correlation coefficients and relative error. Finally, the application of horizontal-to-vertical irradiance mapping is demonstrated using the case of an office building at Vienna University of Technology.
keywords Horizontal and vertical irradiance, measurement and simulation, energy performance
series thesis:MSc
type normal paper
email buildingscience@tuwien.ac.at
more http://cec.tuwien.ac.at
last changed 2006/07/02 20:30
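One common form of the horizontal-to-vertical mapping the thesis evaluates splits global horizontal irradiance into direct and diffuse parts and transposes them onto the facade. The sketch below uses the standard isotropic sky model and simply takes the diffuse fraction and solar position as inputs (standing in for a decomposition model and a solar-position routine); it is a generic textbook formulation and not necessarily one of the three methods compared in the thesis.

# Isotropic-sky transposition of global horizontal irradiance onto a vertical
# facade. The diffuse fraction and solar position are taken as given inputs; in
# practice they would come from a decomposition model and a solar-position
# algorithm not shown here.
import math

def vertical_irradiance(ghi, diffuse_fraction, sun_altitude_deg, sun_azimuth_deg,
                        facade_azimuth_deg, albedo=0.2):
    """Incident irradiance (W/m2) on a vertical facade, isotropic sky model."""
    alt = math.radians(sun_altitude_deg)
    dhi = ghi * diffuse_fraction                 # diffuse horizontal part
    bhi = ghi - dhi                              # beam horizontal part
    if alt <= 0:
        return 0.5 * dhi                         # sun below horizon: sky diffuse only
    dni = bhi / math.sin(alt)                    # direct normal irradiance
    cos_incidence = math.cos(alt) * math.cos(math.radians(sun_azimuth_deg -
                                                          facade_azimuth_deg))
    beam = dni * max(cos_incidence, 0.0)         # beam component hitting the facade
    sky_diffuse = 0.5 * dhi                      # (1 + cos 90deg)/2 sky view factor
    ground_reflected = 0.5 * albedo * ghi        # (1 - cos 90deg)/2 ground view factor
    return beam + sky_diffuse + ground_reflected

print(round(vertical_irradiance(ghi=600.0, diffuse_fraction=0.35,
                                sun_altitude_deg=35.0, sun_azimuth_deg=180.0,
                                facade_azimuth_deg=180.0), 1))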

_id a15e
authors Dijkstra, J. and Timmermans, H.J.P.
year 1998
title Conjoint Analysis and Virtual Reality - A Review
source Timmermans (ed.), 4th Design and Decision Support Systems in Architecture and Urban Planning Conference.
summary This paper describes a review of an ongoing research project which aims to develop a conjoint analysis and virtual reality (CA&VR) system as part of a design information system in virtual reality. The research project aims to develop a design system that can be used for interactive design and evaluation of design alternatives. A design can be presented by a virtual environment model and dynamic virtual objects representing the different design aspects of interest. The different design aspects are called attributes. Each attribute level is a different state of the virtual object concerned. In the case of a virtual walk through a building design, the system can be viewed as a visual simulation of the environment. The CA&VR system has the potential advantage that individuals' preferences can be measured in designed hypothetical choice situations. As part of the ongoing research project, the principles underlying the CA&VR system will be illustrated by simple examples. The status of this research project, both in retrospect and in prospect, will be described.
series other
last changed 2003/04/23 13:50

_id ddss9818
id ddss9818
authors Dijkstra, Jan and Timmermans, Harry J.P.
year 1998
title Conjoint Analysis and Virtual Reality – a Review
source Timmermans, Harry (Ed.), Fourth Design and Decision Support Systems in Architecture and Urban Planning (Maastricht, the Netherlands), ISBN 90-6814-081-7, July 26-29, 1998
summary This paper describes a review of an ongoing research project which aims to develop a conjoint analysis and virtual reality (CA&VR) system as part of a design information system in virtual reality. The research project aims to develop a design system that can be used for interactive design and evaluation of design alternatives. A design can be presented by a virtual environment model and dynamic virtual objects representing the different design aspects of interest. The different design aspects are called attributes. Each attribute level is a different state of the virtual object concerned. In the case of a virtual walk through a building design, the system can be viewed as a visual simulation of the environment. The CA&VR system has the potential advantage that individuals' preferences can be measured in designed hypothetical choice situations. As part of the ongoing research project, the principles underlying the CA&VR system will be illustrated by simple examples. The status of this research project, both in retrospect and in prospect, will be described.
series DDSS
last changed 2003/08/07 14:36
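On the conjoint-analysis side of the CA&VR idea, attribute levels of hypothetical design alternatives are related to stated preferences. The sketch below estimates part-worth utilities from ratings with ordinary least squares on dummy-coded attributes; the attributes, levels, ratings and the rating-based (rather than choice-based) model are all assumptions made for illustration, not the authors' system.

# Rating-based conjoint sketch: design profiles are dummy-coded over attribute
# levels and part-worths are estimated by least squares. Invented example data.
import numpy as np

# Each profile: (facade material, window size), plus the respondent's rating.
profiles = [
    (("brick", "small"), 4.0),
    (("brick", "large"), 7.0),
    (("glass", "small"), 5.5),
    (("glass", "large"), 9.0),
]

def dummy_code(profile):
    # Reference levels are "brick" and "small"; columns flag "glass" and "large".
    material, window = profile
    return [1.0 if material == "glass" else 0.0,
            1.0 if window == "large" else 0.0]

X = np.array([dummy_code(p) + [1.0] for p, _ in profiles])   # + intercept column
y = np.array([rating for _, rating in profiles])
coefs, *_ = np.linalg.lstsq(X, y, rcond=None)

for name, w in zip(["glass vs brick", "large vs small", "intercept"], coefs):
    print(name, round(float(w), 2))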

_id 6915
authors Dorninger, Peter
year 2003
title XML Technologies and Geodata
source CORP 2003, Vienna University of Technology, 25.2.-28.2.2003 [Proceedings on CD-Rom]
summary Photogrammetry and Remote Sensing are very important methods for acquisition of geodata. During the previous decade, several revolutionary changes occurred in this area. Until the appearance of automated image analysis tools, it was necessary to measure selected points in the images given. At that time, it was much faster and even cheaper to get images of real world objects compared to the time and money consuming process of manual analyses. So one tried to minimize this effort by measuring only characteristic points such as edges, break-lines, peaks and valleys and, for sure, a grid with a given grid step which was selected to meet the efforts. Lots of information in the images was neglected. Digital point matching algorithms and airborne laser scanning provide many new possibilities. The only restriction on spatial resolution is the one of the used sensors. Given a more precise image sensor, the matching algorithm will be able to match more surface points; given a higher frequency laser scanner, more points can be measured of the same area. And those sensors get more and more precise every day. Besides, those techniques allow for fast repetition which is necessary to create time series as a basis for 4D modeling! However, this fact is accompanied by several problems concerning the capability of available computers. Some years ago, as the first ideas of 3D city models arose, it was very difficult to acquire the necessary data. Today the new sensors and methods have the necessary capability, but we are not able to handle the available datasets efficiently, because of shortcomings in the past. In a time of world wide data exchange through the internet and global datasets, it is necessary to have efficient methods and algorithms to manage the available data. There is a need for international, vendor independent data exchange and management standards that have to be accepted and supported by the industry. This article is going to present several methods of data encoding using standardized data formats based on eXtensible Markup Language (XML). After an introduction to this kind of data encoding, two derived applications for management, storage and presentation of geodata are described. As XML data is written in text format, the datasets have the ability to become rather long. Therefore some promising methods to reduce the amount of data are introduced afterwards. XML documents are mainly used for data exchange between databases. Therefore the capabilities of commonly used database systems for storage of geodata are described in the end and current implementation results of the Institute of Photogrammetry and Remote Sensing (I.P.F.) are presented.
series other
email pdo@ipf.tuwien.ac.at
last changed 2003/03/11 19:39
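As a small illustration of XML-encoded geodata of the kind the article discusses, the sketch below serialises a few surveyed points with an ad-hoc schema; the element names, coordinate reference system and values are invented, and the snippet is not GML or any standard exchange format. It also hints at why plain XML text grows quickly, which motivates the data-reduction methods the article introduces.

# Minimal XML encoding of a point set with an invented ad-hoc schema (not GML).
import xml.etree.ElementTree as ET

points = [(1, 601234.12, 5340123.45, 171.3),
          (2, 601240.88, 5340119.02, 171.8)]

root = ET.Element("PointSet", crs="EPSG:31256")      # example CRS, chosen arbitrarily
for pid, x, y, z in points:
    pt = ET.SubElement(root, "Point", id=str(pid))
    ET.SubElement(pt, "coordinates").text = f"{x} {y} {z}"

xml_text = ET.tostring(root, encoding="unicode")
print(xml_text)
print(len(xml_text), "characters for", len(points), "points")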

_id 6252
authors Drewe, Paul
year 2003
title The Relation Between the Internet Infrastructure and the Internet Industry
source CORP 2003, Vienna University of Technology, 25.2.-28.2.2003 [Proceedings on CD-Rom]
summary The scene is set by a survey of new location factors for mobile investment in Europe, published by the European Commission in 1993. This leads to two questions the first of which concerns the exact definition of the Internet industry in order to avoid confusion. The definitional issue appears to be far from simple. The second question is about the Internet infrastructure. This infrastructure, although new and almost invisible, can nevertheless be mapped and measured with less ambiguity than the Internet industry. How to connect the two? How to establish the importance of the Internet infrastructure for the location of the Internet industry? Technological determinism and urban dissolution are debunked as myths. A conceptual innovation is called for: to conceive of the connection between infrastructure and industry as a match between networks. By way of conclusion, this match is discussed from the viewpoint of non-hub cities or regions.
series other
email p.drewe@bk.tudelft.nl
last changed 2003/03/11 19:39

_id 4743
authors Dvorak, Robert W.
year 1988
title Designing in the CAD Studio
source Computing in Design Education [ACADIA Conference Proceedings] Ann Arbor (Michigan / USA) 28-30 October 1988, pp. 123-134
summary The "CAD Studio" is one of many design options that fourth year students may select in the College of Architecture. In this electronic environment, the students analyze and present their designs totally on the computer. The vehicle used is a fifteen week architectural problem called the "Calor Redesign Project".

The "Calor" problem requires the move of a famous residence to a hot arid climate. The residence must then be redesigned in the original architect's style so the building becomes as energy efficient as possible in its new arid environment. The students are required to use as design criteria a new building program, the design philosophy of the original architect, and appropriate passive energy techniques that will reduce the thermal stress on the building. The building's energy response is measured by using an envelope energy analysis program called "Calor".

Much of the learning comes from imposing a new set of restraints on a famous piece of architecture and asking the student to redesign it. The students not only need to learn and use a different design philosophy, but also develop new skills to communicate their ideas on the computer. Both Macintosh and IBM computers are used with software ranging from Microsoft Works, Superpaint, AutoCAD, MegaCAD, Dr Halo, to Calor.

series ACADIA
last changed 1999/01/01 18:28
