CumInCAD is a Cumulative Index of publications in Computer Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD Futures.


Hits 1 to 20 of 27

_id acadia17_110
id acadia17_110
authors Arnowitz, Ethan; Morse, Christopher; Greenberg, Donald P.
year 2017
title vSpline: Physical Design and the Perception of Scale in Virtual Reality
source ACADIA 2017: DISCIPLINES & DISRUPTION [Proceedings of the 37th Annual Conference of the Association for Computer Aided Design in Architecture (ACADIA) ISBN 978-0-692-96506-1], Cambridge, MA, 2-4 November 2017, pp. 110-117
summary Virtual reality provides a heightened sense of immersion and spatial awareness that offers a unique opportunity for designers to perceive and evaluate scale and space. At the same time, traditional sketches and small-scale physical models provide tactile feedback that allows designers to create, comprehend, and explore complex geometric relationships. Through the development of vSpline, a modeling application for virtual reality, we explore the potential for design within a virtual spatial environment to blur the boundaries between the digital and physical stages of design, and seek to combine the best of both virtual and analog worlds. By using spline-based closed meshes created directly in three-dimensional space, our software provides the capabilities to design, modify, and save the information in the virtual world and seamlessly convert the data to evaluate the printing of 3D physical models. We identify and discuss important questions that arise regarding relationships of perception of scale, digital-to-physical domains, and new methods of input and manipulation within a 3D immersive space.
keywords design methods; information processing; hci; vr; ar; mixed reality; digital craft; manual craft
series ACADIA
email eha38@cornell.edu
last changed 2017/10/17 09:12

_id a426
authors Barsky, Brian A. and Greenberg, Donald P.
year 1982
title Interactive Surface Representation System Using a B-spline Formulation with Interpolation Capability
source Computer Aided Design. July, 1982. vol. 14: pp. 187-194 : col. ill. includes bibliography
summary An interactive surface representation system is described that uses a parametric uniform bicubic B-spline formulation capable of describing a surface initially defined to interpolate a specified network of points.
keywords CAD, curved surfaces, computational geometry, interpolation, B-splines
series CADline
last changed 2003/06/02 11:58
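
As a rough illustration of the uniform cubic B-spline formulation the Barsky and Greenberg entry above builds on (and not code from the paper), the sketch below evaluates one segment of such a spline from four consecutive control points; the control points and the sample parameter are invented.

    import numpy as np

    # Uniform cubic B-spline basis matrix (the 1/6 scale factor is folded in).
    B = np.array([[-1.,  3., -3., 1.],
                  [ 3., -6.,  3., 0.],
                  [-3.,  0.,  3., 0.],
                  [ 1.,  4.,  1., 0.]]) / 6.0

    def bspline_point(p0, p1, p2, p3, t):
        """Point on the segment defined by four consecutive control points, t in [0, 1]."""
        T = np.array([t**3, t**2, t, 1.0])
        G = np.array([p0, p1, p2, p3], dtype=float)  # geometry matrix, one control point per row
        return T @ B @ G

    # Example: sample a planar segment at its midpoint.
    print(bspline_point([0, 0], [1, 2], [2, 2], [3, 0], 0.5))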

_id a9b1
authors Cohen, M.F., Greenberg, D.P. and Immel, D.S. et al.
year 1986
title An Efficient Radiosity Approach for Realistic Image Synthesis
source IEEE Computer Graphics and Applications March, 1986. vol. 6: pp. 26-35 : col. ill. includes bibliography.
summary The radiosity method models the interaction of light between diffusely reflecting surfaces and accurately predicts the global illumination effects. Procedures are now available to simulate complex environments including occluded and textured surfaces. For accurate rendering, the environment must be discretized into a fine mesh, particularly in areas of high intensity gradients. The interdependence between surfaces implies solution techniques which are computationally intractable. This article describes new procedures to predict the global illumination function without excessive computational expense. Statistics indicate the enormous potential of this approach for realistic image synthesis, particularly for dynamic images of static environments
keywords computer graphics, radiosity, rendering, algorithms
series CADline
last changed 2003/06/02 12:42
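
As a minimal sketch of the radiosity balance the abstract above refers to, B_i = E_i + rho_i * sum_j F_ij B_j, the toy solver below iterates the equation to convergence; the form factors, emissions and reflectances are invented illustrative values, not data from the paper.

    import numpy as np

    def solve_radiosity(F, E, rho, iterations=200):
        """Jacobi-style iteration of the radiosity equation for n diffuse patches."""
        B = E.copy()
        for _ in range(iterations):
            B = E + rho * (F @ B)   # gather the light arriving from every other patch
        return B

    # Toy two-patch environment: patch 0 emits, patch 1 only reflects.
    F = np.array([[0.0, 0.5],
                  [0.5, 0.0]])      # form factors F_ij
    E = np.array([1.0, 0.0])        # emitted radiosity
    rho = np.array([0.3, 0.8])      # diffuse reflectances
    print(solve_radiosity(F, E, rho))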

_id c341
authors Cohen, Michael F. and Greenberg, Donald P.
year 1985
title The Hemi-Cube: A Radiosity Solution for Complex Environments
source SIGGRAPH '85 conference proceedings. July, 1985. vol. 19 ; no. 3: pp. 31-39 : ill. (some col.). includes bibliography
summary This paper presents a comprehensive method to calculate object to object diffuse reflections within complex environments containing hidden surfaces and shadows. In essence, each object in the environment is treated as a secondary light source. The method provides an accurate representation of the 'diffuse' and 'ambient' terms found in typical image synthesis algorithms. The phenomena of 'color bleeding' from one surface to another, shading within shadow envelopes, and penumbras along shadow boundaries are accurately reproduced. Additional advantages result because computations are independent of viewer position. This allows the efficient rendering of multiple views of the same scene for dynamic sequences. Light sources can be modulated and object reflectivities can be changed, with minimal extra computation. The procedures extend the radiosity method beyond the bounds previously imposed
keywords hidden surfaces, shadowing, computer graphics, geometric modeling, radiosity
series CADline
last changed 2003/06/02 11:58
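
The hemi-cube method replaces analytic form-factor integration with a table of per-pixel "delta form factors" on the faces of a half cube placed over each patch. The sketch below covers only the top face and assumes the standard published delta-form-factor expression for that face; the resolution is an arbitrary illustrative choice.

    import numpy as np

    def top_face_delta_form_factors(res):
        """Delta form factors for the top face of a unit hemi-cube (the z = 1 plane)."""
        dA = (2.0 / res) ** 2                              # pixel area on the [-1,1] x [-1,1] face
        c = (np.arange(res) + 0.5) / res * 2.0 - 1.0       # pixel-centre coordinates
        x, y = np.meshgrid(c, c)
        return dA / (np.pi * (x**2 + y**2 + 1.0) ** 2)

    dF = top_face_delta_form_factors(64)
    # The top face plus the four half side faces together sum to ~1 over the hemisphere.
    print(dF.sum())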

_id acadia10_258
id acadia10_258
authors Doumpioti, Christina; Greenberg, Evan L.; Karatzas, Konstantinos
year 2010
title Embedded Intelligence: Material Responsiveness in Façade Systems
source ACADIA 10: LIFE in:formation, On Responsive Information and Variations in Architecture [Proceedings of the 30th Annual Conference of the Association for Computer Aided Design in Architecture (ACADIA) ISBN 978-1-4507-3471-4], New York, 21-24 October 2010, pp. 258-262
summary This paper presents recent research for new mechanical systems and façade designs that are able to respond to environmental changes through local interactions, inspired by biological systems. These are based on a model of distributed intelligence founded on insect and animal collectives, from which intelligent behavior emerges through simple local associations. Biological collective systems integrate material form and responsiveness and have the potential to inform new architectural and engineering strategies. The proposed façade system uses integrated sensors and actuators that moderate their local environments through simple interactions with their immediate neighbors. Computational techniques coupled to manufacturing methods and material logics create an integral design framework leading to heterogeneous environmental and structural conditions, producing local responses to environmental stimuli, and ultimately, effective performance of the whole system.
series ACADIA
type normal paper
email evanlgreenberg@gmail.com
last changed 2010/11/10 06:27

_id ecaade2014_078
id ecaade2014_078
authors Elif Erdine and Evan Greenberg
year 2014
title Computing the Urban Block - Local Climate Analysis and Design Strategies
source Thompson, Emine Mine (ed.), Fusion - Proceedings of the 32nd eCAADe Conference - Volume 1, Department of Architecture and Built Environment, Faculty of Engineering and Environment, Newcastle upon Tyne, England, UK, 10-12 September 2014, pp. 145-152
wos WOS:000361384700014
summary This research develops a method for the analysis, integration and visualisation of climatic parameters in a dense urban block. In order to test this method, a typical urban block in Manila, Philippines, is investigated and results are represented through computational simulation. The translation of latent spatial qualities into visual data with common tools and techniques allows designers to gain an understanding of how to design local microclimates, and inhabitants to gain greater knowledge of the environment. In this regard, this research proposes, contrary to conventional methodologies, the use of analytical tools as the impetus to, rather than the outcome of, architectural design.
keywords Computation; urban design; environmental analysis; computational fluid dynamics; simulation
series eCAADe
email evan.greenberg@aaschool.ac.uk
last changed 2016/05/16 09:08

_id ga9811
id ga9811
authors Feuerstein, Penny L.
year 1998
title Collage, Technology, and Creative Process
source International Conference on Generative Art
summary Since the turn of the twentieth century artists have been using collage to suggest new realities and changing concepts of time. Appropriation and simulation can be found in the earliest recycled scraps in Cubist collages. Picasso and Braque liberated the art world with cubism, which integrated all planes and surfaces of the artists' subjects and combined them into a new, radical form. The computer is a natural extension of their work on collage. The identifying characteristics of the computer are integration, simultaneity and evolution, which are inherent in collage. Further, the computer is about "converting information". There is something very fascinating about scanning an object into the computer, creating a texture brush and drawing with the object's texture. It is as if the computer not only integrates information but different levels of awareness as well. In the act of converting the object from atoms to bits the object is portrayed at the same conscious level as the spiritual act of drawing. The speed and malleability of transforming an image on the computer can be compared to the speed and malleability of the thought processes of the mind. David Salle said, "one of the impulses in new art is the desire to be a mutant, whether it involves artificial intelligence, gender or robotic parts. It is about the desire to get outside the self and the desire to transcend one's place." I use the computer to transcend, to work in different levels of awareness at the same time - the spiritual and the physical. In the creative process of working with the computer, many new images are generated from previous ones. An image can be processed in unlimited ways without degradation of information. There is no concept of original and copy. The computer alters the image and changes it back to its original in seconds. Each image is not a fixed object in time, but the result of dynamic aspects which are acquired from previous works and each new moment. In this way, using the computer to assist the mind in the creative processes of making art mirrors the changing concepts of time, space, and reality that have evolved as the twentieth century has progressed. Nineteenth-century concepts of the monolithic truth have been replaced with dualism and pluralism. In other words, the objective world independent of the observer, which assumes the mind is separate from the body, has been replaced with the mind and body as inseparable, connected to the objective world through our perception and awareness. Marshall McLuhan said, "All media as extensions of ourselves serve to provide new transforming vision and awareness." The computer can bring such complexities and at the same time be very calming because it can be ultrafocused, promoting a higher level of awareness where life can be experienced more vividly. Nicholas Negroponte pointed out that "we are passing into a post-information age, often having an audience of just one." By using the computer to juxtapose disparate elements, I create an impossible coherence, a hodgepodge of imagery not wholly illusory. Interestingly, what separates the elements also joins them. Clement Greenberg states that "the collage medium has played a pivotal role in twentieth century painting and sculpture" (1). Perspective, developed by the Renaissance architect Alberti and echoing the optically perceived world as reality, was replaced with Cubism.
Cubism brought about the destruction of the illusionist means and effects that had characterized Western painting since the fifteenth century. (2) Clement Greenberg describes the way in which physical and spiritual realities are combined in cubist collages: "By pasting a piece of newspaper lettering to the canvas one called attention to the physical reality of the work of art and made that reality the same as the art." (3) Before I discuss some of the concepts that relate collage to working with the computer, I would like to define some of the theories behind them. The French word collage means pasting, or gluing. Today the concept may include all forms of composite art and processes of photomontage and assemblage. In the foreword to Katherine Hoffman's book on collage, Kim Levin writes: "This technique - which takes bits and pieces out of context to patch them into new contexts - keeps changing, adapting to various styles and concerns. And it's perfectly apt that interpretations of collage have varied according to the intellectual inquiries of the time. From our vantage point near the end of the century we can now begin to see that collage has all along carried postmodern genes." (4) The computer, on the other hand, is not another medium. It is a visual tool that may be used in the creative process. Patrick D. Prince's view is, "Computer art is not concrete. There is no artifact in digital art. The images exist in the computer's memory and can be viewed on a monitor: they are pure visual information." (5) In this way it relates more to conceptual art such as performance art. Timothy Binkley explains that, "I believe we will find the concept of the computer as a medium to be more misleading than useful. Computer art will be better understood and more readily accepted by a skeptical artworld if we acknowledge how different it is from traditional tools. The computer is an extension of the mind, not of the hand or eye, and, unlike cinema or photography, it does not simply add a new medium to the artist's repertoire, based on a new technology." (6) Conceptual art marked a watershed between the progress of modern art and the pluralism of postmodernism. (7) Once the art comes out of the computer, it can take a variety of forms or be used with many different media. The artist does not have to write his/her own program to be creative with the computer. The work may have the thumbprint of a specific program, but the creative possibilities are up to the artist. Computer artist John Pearson feels that "One cannot overlook the fact that no matter how technically interesting the artwork is, it has to withstand analysis. Only the creative imagination of the artist, cultivated from a solid conceptual base and tempered by a sophisticated visual sensitivity, can develop and resolve the problems of art." (8) The artist has to be even more focused and selective when using the computer in the creative process because of the multitude of options it creates and its generative qualities.
series other
email pennyf@mcs.net
more http://www.generativeart.com/
last changed 2003/08/07 15:25

_id a081
authors Greenberg S., Roseman M. and Webster, D.
year 1992
title Issues and Experiences Designing and Implementing Two Group Drawing Tools
source Readings in Groupware, 609-620
summary Groupware designers are now developing multi-user equivalents of popular paint and draw applications. Their job is not an easy one. First, human factors issues peculiar to group interaction appear that, if ignored, seriously limit the usability of the group tool. Second, implementation is fraught with considerable hurdles. This paper describes the issues and experiences we have met and handled in the design of two systems supporting remote real time group interaction: GroupSketch, a multi-user sketchpad; and GroupDraw, an object-based multi-user draw package. On the human factors side, we summarize empirically-derived design principles that we believe are critical to building useful and usable collaborative drawing tools. On the implementation side, we describe our experiences with replicated versus centralized architectures, schemes for participant registration, multiple cursors, network requirements, and the structure of the drawing primitives.
series other
last changed 2003/04/23 13:50

_id 68aa
authors Greenberg, Donald P.
year 1986
title Computer Graphics and Visualization
source Computer-Aided Architectural Design Futures [CAAD Futures Conference Proceedings / ISBN 0-408-05300-3] Delft (The Netherlands), 18-19 September 1985, pp. 63-67
summary The field of computer graphics has made enormous progress during the past decade. It is rapidly approaching the time when we will be able to create images of such realism that it will be possible to 'walk through' nonexistent spaces and to evaluate their aesthetic quality based on the simulations. In this chapter we wish to document the historical development of computer graphics image creation and describe some techniques which are currently being developed. We will try to explain some pilot projects that we are just beginning to undertake at the Program of Computer Graphics and the Center for Theory and Simulation in Science and Engineering at Cornell University.
series CAAD Futures
last changed 1999/04/03 15:58

_id acadia08_316
id acadia08_316
authors Greenberg, Evan
year 2008
title Observation, Analysis, and Computation of Branching Patterns in Natural Systems
source Silicon + Skin: Biological Processes and Computation, [Proceedings of the 28th Annual Conference of the Association for Computer Aided Design in Architecture (ACADIA) / ISBN 978-0-9789463-4-0] Minneapolis 16-19 October 2008, 316-323
summary Branching occurs in natural systems for functional reasons. However, the branching logic for each specific system is quite different due to environmental and mathematical factors. In the computation of branching systems, these mathematical factors can be incorporated quite easily into the coding of each system. However, it is the environmental components that must be given further consideration in the simulation of these natural systems. Through the engine of genetic algorithms based on evolutionary developmental theory, the specific logics observed and analyzed in branching patterns of river systems, trees, and insect tracheae can be simulated and optimized in a digital environment.
keywords Algorithm; Branching; Emergence; Genetic; Simulation
series ACADIA
last changed 2009/02/26 07:39
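
The abstract above describes branching logics observed in rivers, trees and insect tracheae. Purely as an illustration of the idea, and not the paper's genetic-algorithm method, the toy sketch below grows a recursive two-way branching pattern; the angle, length decay and depth parameters are invented.

    import math, random

    def branch(x, y, angle, length, depth, segments):
        """Recursively grow two child branches from the tip of the current segment."""
        if depth == 0:
            return
        x2 = x + length * math.cos(angle)
        y2 = y + length * math.sin(angle)
        segments.append(((x, y), (x2, y2)))
        spread = math.radians(25) + random.uniform(-0.1, 0.1)   # a little environmental noise
        branch(x2, y2, angle - spread, length * 0.7, depth - 1, segments)
        branch(x2, y2, angle + spread, length * 0.7, depth - 1, segments)

    segments = []
    branch(0.0, 0.0, math.pi / 2, 1.0, 6, segments)
    print(len(segments), "segments")   # 2**6 - 1 = 63 segments at depth 6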

_id b0d2
authors Greenberg, S. and Roseman, M.
year 1998
title Groupware Toolkits for Synchronous Work
source Beaudouin-Lafon, M. (ed.) Computer - Supported Cooperative Work, Trends in Software Series, John Wiley
summary Groupware toolkits let developers build applications for synchronous and distributed computer-based conferencing. This chapter describes four components that we believe toolkits must provide. A run-time architecture automatically manages the creation, interconnection, and communications of both centralized and distributed processes that comprise conference sessions. A set of groupware programming abstractions allows developers to control the behaviour of distributed processes, to take action on state changes, and to share relevant data. Groupware widgets let interface features of value to conference participants be added easily to groupware applications. Session managers let people create and manage their meetings and are built by developers to accommodate the group's working style. We illustrate the many ways these components can be designed by drawing on our own experiences with GroupKit, and by reviewing approaches taken by other toolkit developers.
series other
last changed 2003/04/23 13:50

_id e65f
authors Haines, Eric A. and Greenberg, Donald P.
year 1986
title The Light Buffer: A Shadow-Testing Accelerator
source IEEE Computer Graphics and Applications. September, 1986. vol. 6: pp. 6-16 : col. ill. includes bibliography
summary In one area of computer graphics, realistic image synthesis, the ultimate goal is to produce a picture indistinguishable from a photograph of a real environment. A particularly powerful technique for simulating light reflection - an important element in creating this realism - is called ray tracing. This method produces images of excellent quality, but suffers from lengthy computation time that limits its practical use. This article presents a new method to reduce shadow testing time during ray tracing. The technique involves generating light buffers, each of which partitions the environment with respect to an individual light source. These partition descriptions are then used during shadow testing to quickly determine a small subset of objects that may have to be tested for intersection. The results of timing tests illustrate the beneficial performance of these techniques. The tests compare the standard ray-tracing algorithm to light buffers of varying resolution
keywords realism, synthesis, ray tracing, algorithms, computer graphics, shadowing
series CADline
last changed 2003/06/02 11:58

_id d37c
authors Haines, Eric A. and Greenberg, Donald P.
year 1986
title The Light Buffer: A Ray Tracer Shadow Testing Accelerator
source Comput. Graph. and App., vol. 6, no. 9, pp. 6-16, IEEE, Sept. 1986
summary The basic ideas presented are classifying objects from the light's viewpoint, and caching shadowing objects. The classification scheme uses a modified z-buffer to create lists of objects in sorted order for each "pixel" the light sees and to determine depths beyond which no light passes. The other technique presented is caching the object that was last intersected by a shadow ray and immediately testing this object for the next shadow ray for the same light at the same location in the ray tree. Shadow caching is simple and applicable to almost any ray tracer. Dieter Bayer implemented the light buffer for POV-Ray.
series other
email erich@acm.org
last changed 2003/04/23 13:14
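
A minimal sketch of the shadow-caching idea summarized above: remember the last occluder found for each light and test it first on the next shadow ray. The scene interface here (an intersects_segment method, light.id, light.position) is an assumption for illustration, not an API from the paper or from POV-Ray.

    class ShadowCache:
        """Per-light cache of the object that most recently blocked a shadow ray."""

        def __init__(self):
            self.last_blocker = {}                     # light id -> cached occluder (or None)

        def in_shadow(self, point, light, objects):
            cached = self.last_blocker.get(light.id)
            if cached is not None and cached.intersects_segment(point, light.position):
                return True                            # cache hit: the same occluder still blocks
            for obj in objects:
                if obj is cached:
                    continue                           # already tested above
                if obj.intersects_segment(point, light.position):
                    self.last_blocker[light.id] = obj  # remember the new occluder
                    return True
            self.last_blocker[light.id] = None
            return False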

_id e663
authors Hanna, Samir L., Abel, John F. and Greenberg, Donald P.
year 1983
title Intersection of Parametric Surfaces by Means of Look-Up Tables
source IEEE Computer Graphics and Applications. October, 1983. vol. 3: pp. 39-48 : ill. (some col.). includes bibliography
summary The intersection curve between parametric surfaces is important in such computer-aided design and manufacturing functions as shape design, analysis of drawing, design of fillets, and computation of numerically controlled tooling paths. The algorithm presented here provides an adequately accurate mathematical representation of the intersection curve. It also provides a database to simplify such operations as hidden-surface removal, surface rendering, profile identification, and interference or clearance computations. Further the algorithm facilitates creating and changing a finite element mesh in the intersection region
keywords parametrization, curves, curved surfaces, algorithms, intersection
series CADline
last changed 2003/06/02 11:58

_id 634c
authors Joblove, G.H. and Greenberg, D.
year 1978
title Color spaces for computer graphics
source Computer Graphics, vol. 12, pp. 20-25, August 1978
summary Normal human color perception is a product of three independent sensory systems. By mirroring this mechanism, full-color display devices create colors as mixtures of three primaries. Any displayable color can be described by the corresponding values of these primaries. Frequently it is more convenient to define various other color spaces, or coordinate systems, for color representation or manipulation. Several such color spaces are presented which are suitable for applications involving user specification of color, along with the defining equations and illustrations. The use of special color spaces for particular kinds of color computations is discussed.
series journal paper
last changed 2003/04/23 13:50
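
As a small illustration of the kind of alternative color coordinates the Joblove and Greenberg entry discusses, the sketch below expresses the same displayable color as RGB primaries and as a hue/lightness/saturation triple, using Python's standard-library colorsys module (which is unrelated to the paper's own implementation).

    import colorsys

    r, g, b = 0.8, 0.3, 0.1                     # an orange, specified as display primaries
    h, l, s = colorsys.rgb_to_hls(r, g, b)      # the same color in hue/lightness/saturation
    print(f"HLS: hue={h:.3f} lightness={l:.3f} saturation={s:.3f}")
    print("round trip:", colorsys.hls_to_rgb(h, l, s))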

_id 4966
authors Kaplan, Michael and Greenberg, Donald P.
year 1979
title Parallel Processing Techniques for Hidden Surface Removal
source SIGGRAPH '79 Conference Proceedings. 1979. vol. 13 ; no. 2: pp. 300-307 : ill. includes bibliography
summary Previous work in the hidden-surface problem has revealed two key concepts. First, the removal of non-visible surfaces is essentially a sorting problem. Second, some form of coherence is essential for the efficient solution of this problem. In order to provide real-time simulations, it is not only the amount of sorting which must be reduced, but the total time required for computation. One potentially economic strategy to attain this goal is the use of parallel processor systems. This approach implies that the computational time will no longer be dependent on the total amount of sorting, but more on the appropriate division of responsibility. This paper investigates two existing algorithmic approaches to the hidden-surface problem with a view towards their applicability to implementation on a parallel machine organization. In particular, the statistical results of a parallel processor implementation indicate the difficulties stemming from a loss of coherence and imply potentially important design criteria for a parallel configuration
keywords computer graphics, rendering, display, hidden surfaces, parallel processing, algorithms
series CADline
last changed 2003/06/02 11:58
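
To make the abstract's "hidden-surface removal is essentially a sorting problem" framing concrete, here is a toy serial z-buffer sketch; it illustrates only the per-pixel depth comparison, not the parallel-processor scheme the paper investigates, and the axis-aligned rectangle scene is invented.

    import numpy as np

    W, H = 64, 64
    depth = np.full((H, W), np.inf)       # nearest depth seen so far at each pixel
    image = np.zeros((H, W), dtype=int)   # id of the currently visible surface at each pixel

    def rasterize_rect(surf_id, x0, y0, x1, y1, z):
        """Write a constant-depth axis-aligned rectangle, keeping only nearer samples."""
        region = depth[y0:y1, x0:x1]
        mask = z < region                 # the per-pixel depth comparison ("sort")
        region[mask] = z
        image[y0:y1, x0:x1][mask] = surf_id

    rasterize_rect(1, 10, 10, 50, 50, z=5.0)
    rasterize_rect(2, 30, 30, 60, 60, z=2.0)  # nearer, so it wins where the two overlap
    print(np.unique(image))                   # [0 1 2]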

_id c6a9
authors Kay, Douglas Scott and Greenberg, Donald P.
year 1979
title Transparency for Computer Synthesized Images
source SIGGRAPH '79 Conference Proceedings. August, 1979. vol. 13 ; no. 2: pp. 158-164 : ill. (some col.). includes bibliography
summary Simple transparency algorithms which assume a linear transparency over an entire surface are the type most often employed to produce computer synthesized images of transparent objects with curved surfaces. Although most of the images created with these algorithms do give the impression of transparency, they usually do not look realistic. One of the most serious problems is that the intensity of the light that is transmitted through the objects is generally not proportional to the amount of material through which it must pass. Another problem is that the image seen behind the objects is not distorted as would naturally occur when the light is refracted as it passes through a material of different density. Use of a non-linear transparency algorithm can provide a great improvement in the realism of an image at a small additional cost. Making the transparency proportional to the normal to the surface causes it to decrease towards the edges of the surface where the path of the light through the object is longer. The exact simulation of refraction, however, requires that each sight ray be individually traced from the observer, through the picture plane and through each transparent object until an opaque surface is intersected. Since the direction of the ray would change as each material of differing optical density was entered, the hidden surface calculations required would be very time consuming. However, if a few assumptions are made about the geometry of each object and about the conditions under which they are viewed, a much simpler algorithm can be used to approximate the refractive effect. This method proceeds in a back-to-front order, mapping the current background image onto the next surface, until all surfaces have been considered
keywords computer graphics, shading, transformation, display, visualization, algorithms, realism
series CADline
last changed 2003/06/02 11:58
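
The abstract's key idea is that transparency should fall off toward silhouette edges, where light travels a longer path through the object. The sketch below reproduces that qualitative behaviour with a facing-ratio falloff; the exponent and the minimum/maximum transparency values are invented and this is not the paper's exact formula.

    import numpy as np

    def transparency(normal, view_dir, t_min=0.1, t_max=0.9, falloff=2.0):
        """Transparency as a function of how directly the surface faces the viewer."""
        n = normal / np.linalg.norm(normal)
        v = view_dir / np.linalg.norm(view_dir)
        facing = abs(np.dot(n, v))              # 1.0 face-on, 0.0 at the silhouette
        return t_min + (t_max - t_min) * facing ** falloff

    print(transparency(np.array([0.0, 0.0, 1.0]), np.array([0.0, 0.0, 1.0])))  # face-on -> 0.9
    print(transparency(np.array([1.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0])))  # edge-on -> 0.1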

_id aba4
authors Lischinski, D., Tampieri, F. and Greenberg, D.P.
year 1992
title Discontinuity Meshing for Accurate Radiosity
source IEEE Computer Graphics & Applications, November 1992, pp.25-38
summary We discuss the problem of accurately computing the illumination of a diffuse polyhedral environment due to an area light source. We show how umbra and penumbra boundaries and other illumination details correspond to discontinuities in the radiance function and its derivatives. The shape, location, and order of these discontinuities are determined by the geometry of the light sources and obstacles in the environment. We describe an object-space algorithm that accurately reproduces the radiance across a surface by constructing a discontinuity mesh that explicitly represents various discontinuities in the radiance function as boundaries between mesh elements. A piecewise quadratic interpolant is used to approximate the radiance function, preserving the discontinuities associated with the edges in the mesh. This algorithm can be used in the framework of a progressive refinement radiosity system to solve the diffuse global illumination problem. Results produced by the new method are compared with ones obtained using a standard radiosity system.
series journal paper
last changed 2003/04/23 13:50

_id a91c
authors Meyer, G., Rushmeier, H., Cohen, M., Greenberg, D. and Torrance, K.
year 1986
title An Experimental Evaluation of Computer Graphics Imagery
source ACM Transactions on Graphics, 5, No. 1
summary Accurate simulation of light propagation within an environment and perceptually based imaging techniques are necessary for the creation of realistic images. A physical experiment that verifies the simulation of reflected light intensities for diffuse environments was conducted. Measurements of radiant energy flux densities are compared with predictions using the radiosity method for those physical environments. By using color science procedures the results of the light model simulation are then transformed to produce a color television image. The final image compares favorably with the original physical model. The experiment indicates that, when the physical model and the simulation were viewed through a view camera, subjects could not distinguish between them. The results and comparison of both test procedures are presented within this paper.
series journal paper
last changed 2003/04/23 13:50

_id f851
authors Ramasubramanian, M., Pattanaik, S.N. and Greenberg, D.P.
year 1999
title A perceptually based physical error metric for realistic image synthesis
source Alyn Rockwood, editor, SIGGRAPH 99 Conference Proceedings, Annual Conference Series, ACM SIGGRAPH, Addison Wesley
summary We introduce a new concept for accelerating realistic image synthesis algorithms. At the core of this procedure is a novel physical error metric that correctly predicts the perceptual threshold for detecting artifacts in scene features. Built into this metric is a computational model of the human visual system's loss of sensitivity at high background illumination levels, high spatial frequencies, and high contrast levels (visual masking). An important feature of our model is that it handles the luminance-dependent processing and spatially-dependent processing independently. This allows us to precompute the expensive spatially-dependent component, making our model extremely efficient. We illustrate the utility of our procedure with global illumination algorithms used for realistic image synthesis. The expense of global illumination computations is many orders of magnitude higher than the expense of direct illumination computations and can greatly benefit by applying our perceptually based technique. Results show our method preserves visual quality while achieving significant computational gains in areas of images with high frequency texture patterns, geometric details, and lighting variations.
series other
last changed 2003/04/23 13:50
