CumInCAD is a cumulative index of publications in Computer Aided Architectural Design,
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD Futures


Hits 1 to 20 of 23

_id ce52
authors Abram, Greg, Westover, Lee and Whitted, Turner
year 1985
title Efficient Alias-Free Rendering using Bit-masks and Look-up Tables
source SIGGRAPH '85 Conference Proceedings. July, 1985. vol. 19 ; no. 3: pp. 53-59 : ill. (some col.). includes bibliography
summary The authors demonstrate methods of rendering alias-free synthetic images using a precomputed convolution integral. The method is based on the observation that a visible polygon fragment's contribution to an image is solely a function of its position and shape, and that, within a reasonable level of accuracy, a limited number of shapes represent the majority of cases encountered in commonly rendered images. The basic technique has been applied to several different rendering algorithms. A version of the new non-uniform sampling technique, implemented in the same program but with different table values, is also introduced
keywords algorithms, computer graphics, anti-aliasing
series CADline
last changed 2003/06/02 11:58
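The record above only summarizes the technique, so the following is merely a rough sketch of the general bit-mask/look-up-table idea it names (not the authors' actual tables, filter, or sampling scheme): a pixel is a grid of sub-samples packed into a bit-mask, and a precomputed table maps any mask directly to a coverage fraction.

```python
# Minimal sketch of bit-mask/look-up-table anti-aliasing (illustrative only;
# not the Abram/Westover/Whitted implementation). A pixel is a 4x4 grid of
# sub-samples packed into a 16-bit mask; a precomputed table maps any mask
# to its coverage fraction, so shading needs no per-pixel sampling loop.

N = 4  # sub-samples per pixel axis

# Precompute: coverage fraction for every possible 16-bit sub-sample mask.
COVERAGE = [bin(m).count("1") / (N * N) for m in range(1 << (N * N))]

def edge_mask(a, b, c):
    """Bit-mask of sub-samples (x, y) in the unit pixel with a*x + b*y + c >= 0."""
    mask = 0
    for j in range(N):
        for i in range(N):
            x, y = (i + 0.5) / N, (j + 0.5) / N  # sub-sample centre
            if a * x + b * y + c >= 0:
                mask |= 1 << (j * N + i)
    return mask

def shade(poly_edges, fg, bg):
    """Blend fg over bg by the fragment's pixel coverage (AND of edge masks)."""
    mask = (1 << (N * N)) - 1
    for a, b, c in poly_edges:
        mask &= edge_mask(a, b, c)
    alpha = COVERAGE[mask]
    return tuple(f * alpha + g * (1 - alpha) for f, g in zip(fg, bg))

# A half-plane x >= 0.5 covers half the pixel: expect 50% grey.
print(shade([(1, 0, -0.5)], (1.0, 1.0, 1.0), (0.0, 0.0, 0.0)))  # → (0.5, 0.5, 0.5)
```

The table turns per-fragment filtering into a single indexed lookup, which is what made the approach attractive for hardware and fast software renderers.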

_id 8ccf
authors Alvarez, Darío
year 2000
title Atravesando el portal digital: la novísima Arquitectura de los tiempos de la Internet. - (Crossing the Digital Gateway: The Latest Architecture of the Times of the Internet)
source SIGraDi’2000 - Construindo (n)o espacio digital (constructing the digital Space) [4th SIGRADI Conference Proceedings / ISBN 85-88027-02-X] Rio de Janeiro (Brazil) 25-28 september 2000, pp. 30-33
summary Our architectural environment is based on a material concept-entity whose control marked the twentieth century: the atom. Across the threshold of the twenty-first century a new virtual concept-entity, the bit, spreads to become the basic unit of power, control and production. Its most dynamic evidence is the phenomenon known as the Internet, which establishes complex relationships with groups constituted on the net, such as virtual communities, and outlines metaphors that involve urbanists and architects, inviting them to become protagonists. Faced with this new reality, the architect should move beyond the typical vision of CAAD work carried out in relative isolation with his computer and cross the doors of “digital reality”. We seek to show the contemporary architect as a manager coordinating multiple resources of differing importance in the alternative of building digital realities, and to invite architecture students to join these virtual communities or to form their own.
series SIGRADI
email dalvarez@posta.arq.ucv.ve, alvarezd@camelot.rect.ucv.ve
last changed 2016/03/10 08:47

_id acadia18_36
id acadia18_36
authors Austin, Matthew; Matthews, Linda
year 2018
title Drawing Imprecision. The digital drawing as bits and pixels
source ACADIA // 2018: Recalibration. On imprecision and infidelity. [Proceedings of the 38th Annual Conference of the Association for Computer Aided Design in Architecture (ACADIA) ISBN 978-0-692-17729-7] Mexico City, Mexico 18-20 October, 2018, pp. 36-45
summary This paper explores the consequences of digitizing the architectural drawing. It argues that the fundamental unit of drawing has shifted from “the line” to an interactive partnership between bits and pixels. It also reveals how the developmental focus of imaging technology has been to synthesize and imitate the line using bits and pixels, rather than to explore their innate productive value and aesthetic potential.

Referring to variations of the architectural drawing from a domestic typology, the paper uses high-precision digital tools tailored to quantitative image analysis and digital tools that sit outside the remit of architectural production, such as word processing, to present a new range of drawing techniques. By applying a series of traditional analytical procedures to the image, it reveals how these maneuvers can interrogate and dislocate any predetermined formal normalization.

The paper reveals that the interdisciplinary repurposing of precise digital toolsets therefore has explicit disciplinary consequences. These arise as a direct result of the recalibration of scale, the liberation of the bit’s representational capacity, and the pixel’s properties of color and brightness. It concludes by proposing that deliberate instances of translational imprecision are highly productive, because by liberating the fundamental qualitative properties of the fundamental digital units, these techniques shift the disciplinary agency of the architectural drawing.

keywords full paper, imprecision, representation, recalibration, theory, glitch aesthetics, algorithmic design, process
series ACADIA
type paper
email matthew.austin@uts.edu.au
last changed 2019/01/07 11:21

_id 536e
authors Bouman, Ole
year 1997
title RealSpace in QuickTimes: architecture and digitization
source Rotterdam: Nai Publishers
summary Time and space, drastically compressed by the computer, have become interchangeable. Time is compressed in that once everything has been reduced to 'bits' of information, it becomes simultaneously accessible. Space is compressed in that once everything has been reduced to 'bits' of information, it can be conveyed from A to B with the speed of light. As a result of digitization, everything is in the here and now. Before very long, the whole world will be on disk. Salvation is but a modem away. The digitization process is often seen in terms of (information) technology. That is to say, one hears a lot of talk about the digital media, about computer hardware, about the modem, mobile phone, dictaphone, remote control, buzzer, data glove and the cable or satellite links in between. Besides, our heads are spinning from the progress made in the field of software, in which multimedia applications, with their integration of text, image and sound, especially attract our attention. But digitization is not just a question of technology, it also involves a cultural reorganization. The question is not just what the cultural implications of digitization will be, but also why our culture should give rise to digitization in the first place. Culture is not simply a function of technology; the reverse is surely also true. Anyone who thinks about cultural implications, is interested in the effects of the computer. And indeed, those effects are overwhelming, providing enough material for endless speculation. The digital paradigm will entail a new image of humankind and a further dilution of the notion of social perfectibility; it will create new notions of time and space, a new concept of cause and effect and of hierarchy, a different sort of public sphere, a new view of matter, and so on. In the process it will indubitably alter our environment. 
Offices, shopping centres, dockyards, schools, hospitals, prisons, cultural institutions, even the private domain of the home: all the familiar design types will be up for review. Fascinated, we watch how the new wave accelerates the process of social change. The most popular sport nowadays is 'surfing' - because everyone is keen to display their grasp of dirty realism. But there is another way of looking at it: under what sort of circumstances is the process of digitization actually taking place? What conditions do we provide that enable technology to exert the influence it does? This is a perspective that leaves room for individual and collective responsibility. Technology is not some inevitable process sweeping history along in a dynamic of its own. Rather, it is the result of choices we ourselves make and these choices can be debated in a way that is rarely done at present: digitization thanks to or in spite of human culture, that is the question. In addition to the distinction between culture as the cause or the effect of digitization, there are a number of other distinctions that are accentuated by the computer. The best known and most widely reported is the generation gap. It is certainly stretching things a bit to write off everybody over the age of 35, as sometimes happens, but there is no getting around the fact that for a large group of people digitization simply does not exist. Anyone who has been in the bit business for a few years can't help noticing that mum and dad are living in a different place altogether. (But they, at least, still have a sense of place!) In addition to this, it is gradually becoming clear that the age-old distinction between market and individual interests is still relevant in the digital era. On the one hand, the advance of cybernetics is determined by the laws of the marketplace which this capital-intensive industry must satisfy. Increased efficiency, labour productivity and cost-effectiveness play a leading role.
The consumer market is chiefly interested in what is 'marketable': info- and edutainment. On the other hand, an increasing number of people are not prepared to wait for what the market has to offer them. They set to work on their own, appropriate networks and software programs, create their own domains in cyberspace, domains that are free from the principle whereby the computer simply reproduces the old world, only faster and better. Here it is possible to create a different world, one that has never existed before. One in which the Other finds a place. The computer works out a new paradigm for these creative spirits. In all these distinctions, architecture plays a key role. Owing to its many-sidedness, it excludes nothing and no one in advance. It is faced with the prospect of historic changes yet it has also created the preconditions for a digital culture. It is geared to the future, but has had plenty of experience with eternity. Owing to its status as the most expensive of arts, it is bound hand and foot to the laws of the marketplace. Yet it retains its capacity to provide scope for creativity and innovation, a margin of action that is free from standardization and regulation. The aim of RealSpace in QuickTimes is to show that the discipline of designing buildings, cities and landscapes is not only an exemplary illustration of the digital era but that it also provides scope for both collective and individual activity. It is not just architecture's charter that has been changed by the computer, but also its mandate. RealSpace in QuickTimes consists of an exhibition and an essay.
series other
email oleb@xs4all.nl
last changed 2003/04/23 13:14

_id 22fd
authors Chou, Wen Huey
year 1996
title An Empirical Study of 2d Static Computer Art: An Investigation of How Contemporary Computer Art is Affected by Media
source CAADRIA ‘96 [Proceedings of The First Conference on Computer Aided Architectural Design Research in Asia / ISBN 9627-75-703-9] Hong Kong (Hong Kong) 25-27 April 1996, pp. 81-89
summary We are in the act of forming a technology and electronics society: a society whose cultural, psychological, social and economic facets take shape according to the development of technology and electronics, especially in the fields of computing and information. The influence of these mighty functions, produced by the bit, is prevalent in all the sciences and social disciplines; in fact, it has already invaded the artistic world. It did not take long after the birth of the computer for it to become a new tool for artistic production; it revolutionized traditional production habits, production procedures, methods of expression and the workplace of artistic creativity, thus bringing tides of change in the artistic context and in attitudes towards the study of the arts.
series CAADRIA
last changed 1999/01/31 14:00

_id 01bb
authors Er, M.C.
year 1981
title A Representation Approach to the Tower of Hanoi Problem
source 22 p. : ill. Wollongong: Department of Computing Science, University of Wollongong, August, 1981. includes bibliography
summary By making the moving direction of each disc explicit in the representation, a bit-string so constructed can be used to drive the Tower of Hanoi algorithm. The behavior of disc moves is further analyzed based on the bit-string representation. It has been shown that the bit-string for moving n discs can be used to generate successively the Gray codes of n bits
keywords representation, programming, combinatorics, algorithms, recursion
series CADline
last changed 2003/06/02 11:58
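The correspondence this record describes is a classical one, and it can be checked in a few lines. The sketch below illustrates the textbook identity (not Er's own bit-string representation, which the record does not detail): after move k of the optimal 2^n - 1 move solution, the set of discs that have moved an odd number of times is exactly the reflected Gray code of k.

```python
# Sketch of the Hanoi/Gray-code correspondence (a well-known identity, not
# Er's bit-string representation itself): the disc moved at step k of the
# optimal solution is the index of k's lowest set bit (the "ruler function"),
# and the parity-of-moves state after k steps equals the Gray code of k.

def gray(k):
    """Reflected binary Gray code of k."""
    return k ^ (k >> 1)

def disc_moved(k):
    """Disc moved at step k (1-based): index of k's lowest set bit."""
    return (k & -k).bit_length() - 1

def hanoi_state(n, k):
    """Bit i set iff disc i has moved an odd number of times after k steps."""
    state = 0
    for step in range(1, k + 1):
        state ^= 1 << disc_moved(step)
    return state

n = 3
assert all(hanoi_state(n, k) == gray(k) for k in range(2 ** n))
print([bin(gray(k)) for k in range(8)])
```

Because successive Gray codes differ in exactly one bit, the sequence of flipped bits is itself the sequence of disc moves, which is the driving idea behind bit-string representations of the puzzle.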

_id e191
authors Fuchs, Henry, Goldfeather, Jack and Hultquist, Jeff P.
year 1985
title Fast Spheres, Shadows, Textures, Transparencies, and Image Enhancements in Pixel-Planes
source SIGGRAPH '85 Conference Proceedings. July, 1985. 1985. vol. 19 ; no. 3: pp. 111-120 : ill. includes bibliography
summary Pixel-planes is a logic-enhanced memory system for raster graphics and imaging. Although each pixel-memory is enhanced with a one-bit ALU, the system's real power comes from a tree of one-bit address that can evaluate linear expressions Ax + By + C for every pixel (x,y) simultaneously, as fast as the ALUs and the memory circuits can accept the results. The development of a variety of algorithms that exploit this fast linear expression evaluation capability has started. The paper reports some of those results. Illustrated in this paper is a sample image from a small working prototype of the Pixel- planes hardware and a variety of images from simulations of a full-scale system. Timing estimates indicate that 30,000 smooth shaded triangles can be generated per second, or 21, 000 smooth-shaded and shadowed triangles can be generated per second, or over 25,000 shaded spheres can be generated per second. Image-enhancement by adaptive histogram equalization can be performed within 4 seconds on a 512 x 512 image
keywords shadowing, image processing, algorithms, polygons, clipping, computer graphics, technology, hardware
series CADline
last changed 2003/06/02 08:24
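The core primitive the Pixel-planes record describes, evaluating Ax + By + C for every pixel (x, y) at once, can be mimicked in software. The sketch below is only an illustration of that primitive and of how sign tests against edge expressions scan-convert a polygon; it is not the bit-serial hardware tree the paper presents.

```python
# Software sketch of Pixel-planes' core primitive (the real machine evaluates
# this in logic-enhanced memory, for all pixels simultaneously): given
# coefficients (A, B, C), produce A*x + B*y + C at every pixel, then use
# sign tests against three edge expressions to rasterize a triangle.

W = H = 8  # toy frame buffer size

def evaluate(a, b, c):
    """Value of a*x + b*y + c at every pixel, as a row-major grid."""
    return [[a * x + b * y + c for x in range(W)] for y in range(H)]

def inside(edges):
    """Pixels on the non-negative side of all edges (a crude rasterizer)."""
    planes = [evaluate(a, b, c) for a, b, c in edges]
    return [[all(p[y][x] >= 0 for p in planes) for x in range(W)]
            for y in range(H)]

# Right triangle bounded by x >= 0, y >= 0 and x + y <= 7.
tri = inside([(1, 0, 0), (0, 1, 0), (-1, -1, 7)])
print(sum(map(sum, tri)))  # number of covered pixels
```

In the hardware, the per-pixel loop disappears: the tree of one-bit adders delivers the linear expression to every pixel memory in parallel, which is why shapes whose boundaries are linear (or reducible to linear tests, as with spheres) render so quickly.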

_id sigradi2006_e090b
id sigradi2006_e090b
authors Hanna, Sean and Turner, Alasdair
year 2006
title Teaching parametric design in code and construction
source SIGraDi 2006 - [Proceedings of the 10th Iberoamerican Congress of Digital Graphics] Santiago de Chile - Chile 21-23 November 2006, pp. 158-161
summary Automated manufacturing processes with the ability to translate digital models into physical form promise both an increase in the complexity of what can be built, and through rapid prototyping, a possibility to experiment easily with tangible examples of the evolving design. The increasing literacy of designers in computer languages, on the other hand, offers a new range of techniques through which the models themselves might be generated. This paper reviews the results of an integrated parametric modelling and digital manufacturing workshop combining participants with a background in computer programming with those with a background in fabrication. Its aim was both to encourage collaboration in a domain that overlaps both backgrounds, as well as to explore the ways in which the two working methods naturally extend the boundaries of traditional parametric design. The types of projects chosen by the students, the working methods adopted and progress made will be discussed in light of future educational possibilities, and of the future direction of parametric tools themselves. Where standard CAD constructs isolated geometric primitives, parametric models allow the user to set up a hierarchy of relationships, deferring such details as specific dimension and sometimes quantity to a later point. Usually these are captured by a geometric schema. Many such relationships in real design however, can not be defined in terms of geometry alone. Logical operations, environmental effects such as lighting and air flow, the behaviour of people and the dynamic behaviour of materials are all essential design parameters that require other methods of definition, including the algorithm. It has been our position that the skills of the programmer are necessary in the future of design. Bentley’s Generative Components software was used as the primary vehicle for the workshop design projects. 
Built within the familiar MicroStation framework, it enables the construction of a parametric model at a range of different interfaces, from purely graphic through to entirely code based, thus allowing the manipulation of such non-geometric, algorithmic relationships as described above. Two-dimensional laser cutting was the primary fabrication method, allowing for rapid manufacturing, and in some cases iterative physical testing. The two technologies have led in the workshop to working methods that extend the geometric schema: the first, by forcing an explicit understanding of design as procedural, and the second by encouraging physical experimentation and optimisation. The resulting projects have tended to focus on responsiveness to conditions either coded or incorporated into an experimental loop. Examples will be discussed. While programming languages and geometry are universal in intent, their constraints on the design process were still notable. The default data structures of computer languages (in particular the rectangular array) replace one schema limitation with another. The indexing of data in this way is conceptually hard-wired into much of our thinking both in CAD and in code. Thankfully this can be overcome with a bit of programming, but the number of projects which have required this suggests that more intuitive, or spatial, methods of data access might be developed in the future.
keywords generative design; parametric model; teaching
series SIGRADI
email s.hanna@cs.ucl.ac.uk
last changed 2016/03/10 08:53

_id ga9905
id ga9905
authors Maldonado, Gabriel
year 1999
title Generating digital music with DirectCsound & VMCI
source International Conference on Generative Art
summary This paper concerns two computer-music programs: DirectCsound, a real-time version of the well-known sound-synthesis language Csound, and VMCI, a GUI program that allows the user to control DirectCsound in real time. DirectCsound allows total live control of the synthesis process. The aim of the DirectCsound project is to give the user a powerful and low-cost workstation for producing new sounds and new music interactively, and for live performances with the computer; think of DirectCsound as a universal musical instrument. VMCI (Virtual Midi Control Interface) is a program which allows the user to send any kind of MIDI message by means of the mouse and the alphanumeric keyboard. It is intended to be used together with DirectCsound, but it can also control any MIDI instrument. It provides several panels with virtual sliders, virtual joysticks and a virtual piano keyboard. The newer version of the program (VMCI Plus 2.0) allows the user to change more than one parameter at the same time by means of the new Hyper-Vectorial-Synthesis control. VMCI supports seven-bit data as well as higher-resolution fourteen-bit data, both supported by the newest versions of Csound.
series other
email g.maldonado@tiscalinet.it
more http://www.generativeart.com/
last changed 2003/08/07 15:25

_id c7e9
authors Maver, T.W.
year 2002
title Predicting the Past, Remembering the Future
source SIGraDi 2002 - [Proceedings of the 6th Iberoamerican Congress of Digital Graphics] Caracas (Venezuela) 27-29 november 2002, pp. 2-3
summary Charlas Magistrales 2
There never has been such an exciting moment, in the extraordinary 30-year history of our subject area, as NOW, when the philosophical, theoretical and practical issues of virtuality are taking centre stage.
The Past. There have, of course, been other defining moments during these exciting 30 years:
• the first algorithms for generating building layouts (circa 1965)
• the first use of computer graphics for building appraisal (circa 1966)
• the first integrated package for building performance appraisal (circa 1972)
• the first computer-generated perspective drawings (circa 1973)
• the first robust drafting systems (circa 1975)
• the first dynamic energy models (circa 1982)
• the first photorealistic colour imaging (circa 1986)
• the first animations (circa 1988)
• the first multimedia systems (circa 1995), and
• the first convincing demonstrations of virtual reality (circa 1996).
Whereas the CAAD community has been hugely inventive in the development of ICT applications to building design, it has been woefully remiss in its attempts to evaluate the contribution of those developments to the quality of the built environment or to the efficiency of the design process. In the absence of any real evidence, one can only conjecture regarding the real benefits, which fall, it is suggested, under the following headings:
• Verisimilitude: the extraordinary quality of still and animated images of the formal qualities of the interiors and exteriors of individual buildings and of whole neighbourhoods must surely give great comfort to practitioners and their clients that what is intended, formally, is what will be delivered, i.e. WYSIWYG - what you see is what you get.
• Sustainability: the power of «first-principle» models of the dynamic energetic behaviour of buildings in response to changing diurnal and seasonal conditions has the potential to save millions of dollars and dramatically reduce the damaging environmental pollution created by badly designed and managed buildings.
• Productivity: CAD is now a multi-billion-dollar business which offers design decision support systems that operate, effectively, across continents, time zones, professions and companies.
• Communication: multimedia technology - cheap to deliver but high in value - is changing the way in which we can explain and understand the past, and envisage and anticipate the future; virtual past and virtual future!
Macromyopia. The late John Lansdown offered the view, in his wonderfully prophetic way, that “the future will be just like the past, only more so”. So what can we expect the extraordinary trajectory of our subject area to be? To have any chance of being accurate we have to understand the phenomenon of macromyopia: the phenomenon, exhibited by society, of greatly exaggerating the immediate short-term impact of new technologies (particularly the information technologies) but, more importantly, seriously underestimating their sustained long-term impacts - socially, economically and intellectually.
Examples of flawed predictions regarding the future application of information technologies include:
• The British Government in 1880 declined to support the idea of a national telephonic system, backed by the argument that there were sufficient small boys in the countryside to run with messages.
• Alexander Bell was modest enough to say: «I am not boasting or exaggerating but I believe, one day, there will be a telephone in every American city».
• Tom Watson, in 1943, said: «I think there is a world market for about 5 computers».
• In 1977, Ken Olsen of Digital said: «There is no reason for any individuals to have a computer in their home».
The Future. Just as the ascent of woman/man-kind can be attributed to her/his capacity to discover amplifiers of the modest human capability, so we shall discover how best to exploit our most important amplifier - that of the intellect. The more we know the more we can figure; the more we can figure the more we understand; the more we understand the more we can appraise; the more we can appraise the more we can decide; the more we can decide the more we can act; the more we can act the more we can shape; and the more we can shape, the better the chance that we can leave for future generations a truly sustainable built environment which is fit for purpose, cost-beneficial, environmentally friendly and culturally significant. Central to this aspiration will be our understanding of the relationship between real and virtual worlds and how to move effortlessly between them. We need to be able to design, from within the virtual world, environments which may be real, may remain virtual or, perhaps, may be part real and part virtual. What is certain is that the next 30 years will be every bit as exciting and challenging as the first 30.
series SIGRADI
email t.w.maver@strath.ac.uk
last changed 2016/03/10 08:55

_id ed9e
authors Mendez, Ricardo and Pimentel, Diego
year 2000
title Internet: Características de la Información, de la Base de Datos al e-Commerce (Internet: Characteristics of the Information, from d-base to e-Commerce)
source SIGraDi’2000 - Construindo (n)o espacio digital (constructing the digital Space) [4th SIGRADI Conference Proceedings / ISBN 85-88027-02-X] Rio de Janeiro (Brazil) 25-28 september 2000, pp. 38-40
summary In the early nineties, when the WWW started to become popular, it had a configuration given by academics: sites worked as databases where one could obtain specific information. Very quickly, commercial sites joined the WWW simply to be there, with a strong institutional presence. In the mid nineties, the possibility of incorporating sound and video into the Internet allowed entertainment sites to appear as an alternative to television. During the last year we have seen a new change in the digital landscape of the Internet: commercial sites have started to move towards a new concept, e-commerce. The WWW is no longer just a means of communication; it is a distribution channel. In a society where information is one of the most demanded commodities, the bit is not just information; it is the digital tool of the new global economy. Because the laws of the market do not share the anarchic and chaotic spirit that gave the net of nets its first breath, this phenomenon has a great impact on the Internet. The subject of this work is the study of this phenomenon, analysing the behaviour of these “new kids on the Net” and the alternatives facing the advance of the market against the right of free access to information.
series SIGRADI
email rmendez@fadu.uba.ar, d@fadu.uba.ar
last changed 2016/03/10 08:55

_id ac8b
authors Mitchell, W.
year 1984
title CAD Technology, Its Effects on Practice and the Response of Education - an Overview
source The Third European Conference on CAD in the Education of Architecture [eCAADe Conference Proceedings] Helsinki (Finland) 20-22 September 1984.
summary Related to the evolution of hardware, there is also an evolution of CAD techniques. The very first CAD/CAM packages were developed on mainframes. They moved into practice when 16-bit minicomputers became available; the packages were mainly production drafting applications. The 32-bit super minicomputers give wider possibilities, but at the same time some software problems arise, namely the complexity of CAD databases and the development and maintenance cost of large programs. With VLSI the distribution of intelligence becomes possible and the enthusiasm for CAD increases, but the gap between available hardware and high-quality software remains wide. Concerning CAD teaching there are severe problems. First of all, there are not enough really good designers who know CAD well enough to teach it. Second, there is a shortage of equipment and a financial problem. Thirdly, there is the question of what students need to know about CAD, which is not clear at the moment. At the University of California, Los Angeles, the following five subjects are taught: Computer Support, Computer Literacy, Professional Practice Implications, Exploration of CAD as a Design Medium and Theoretical Foundations of CAD. To use computers as a medium it is necessary to understand architecture, its objects, its operators and its evaluation criteria. The last topic is considered at research level.
series eCAADe
email wjm@mit.edu
more www.ecaade.org
last changed 2001/10/20 08:23

_id 4c7e
authors Mitchell, W.
year 1995
title City of Bits: space, place, and the infobahn
source The MIT Press
summary Entertaining, concise, and relentlessly probing, City of Bits is a comprehensive introduction to a new type of city, a largely invisible but increasingly important system of virtual spaces interconnected by the emerging information superhighway. William Mitchell makes extensive use of concrete, practical examples and illustrations in a technically well-grounded yet accessible examination of architecture and urbanism in the context of the digital telecommunications revolution, the ongoing miniaturization of electronics, the commodification of bits, and the growing domination of software over materialized form. In seven chapters - Pulling Glass, Electronic Agoras, Cyborg Citizens, Recombinant Architecture, Soft Cities, Bit Biz, and Getting to the Good Bits - Mitchell argues that the crucial issue before us is not one of putting in place the digital plumbing of telecommunications links and associated electronic appliances, nor even of producing content for electronic delivery, but rather one of creating electronically mediated environments for the kinds of lives that we want to lead.
series other
last changed 2003/04/23 13:14

_id fdb8
authors Montagu, A., Rodriguez Barros, D. and Chernobilsky, L.
year 2000
title The New Reality through Virtuality
source Promise and Reality: State of the Art versus State of Practice in Computing for the Design and Planning Process [18th eCAADe Conference Proceedings / ISBN 0-9523687-6-5] Weimar (Germany) 22-24 June 2000, pp. 225-229
summary In this paper we develop some conceptual reflections on the processes of virtualization, with the aim of indicating a series of misfits and mutations that arise as by-products of the “digital-graphic culture” (DGC) when we are dealing with the perception of “digital space”. Considering the present situation, somewhat chaotic from a pedagogical point of view, we also propose a set of “virtual space parameters” in order to organize in a systemic way the teaching of architectural design with digital technology. Nowadays there is a great variety of computer graphics applications comprising practically all the fields of “science & technology”, “architecture, design & urbanism”, “video & film” and “sound”, together with a massive amount of information technology protocols. This fact obliges us to take an overall view of the meaning of “the new reality through virtuality”. The paper is divided into two sections and one appendix. In the first section we recognise the relationships among the sensory apparatus, the cognitive structures of perception and the cultural models involved in the process of understanding reality. In the second section we argue that, as architects, we have “a global set of social and technical responsibilities” to organize the physical space, but that we must now also be able to organize the “virtual space” obtained from a multidimensional set of computer simulations. There are certain features that can be used as “sensory parameters” when dealing with architectural design in the “virtual world”, taking into consideration the differences between “immersive virtual reality” and “non-immersive virtual reality”. In the appendix we present a summary of conclusions based on a set of pedagogical applications, analysing the positive and negative consequences of working exclusively in a “virtual world”.
keywords Virtualisation Processes, Simulation, Philosophy, Space, Design, Cyberspace
series eCAADe
email amontagu@fadu.uba.ar
more http://www.uni-weimar.de/ecaade/
last changed 2002/11/23 05:59

_id ddss9468
id ddss9468
authors Mustoe, I. and Bridges, A.
year 1994
title An Intelligent Architectural Design Resource
source Second Design and Decision Support Systems in Architecture & Urban Planning (Vaals, the Netherlands), August 15-19, 1994
summary With the development of optical disc technology, very large resources of visual material are becoming available to designers. For example, the School of Architecture at University College Dublin has compiled a 30 cm Philips LaserVision disc containing some 20,000 images of buildings from all parts of Europe. Conventional methods of accessing such large bodies of information tend to be based on formal query languages and are unsuitable for designers searching for design precedents or other forms of inspiration. Conventional expert systems, based on deductive inference engines, are equally unsuitable: the difficulty stems from design being an exploratory rather than a deductive process. The paper describes a novel type of pattern-matching expert system, referred to as "image", which has been developed to provide a method of search more appropriate to designers. By using image, designers can make meaningful but non-deductive connections between their attitudes towards design and the contents of an optical disc. The bit-string manipulation algorithm underlying image is explained, and an example of the use of the system in controlling the Dublin disc is also described.
series DDSS
email abacus@strath.ac.uk
last changed 2003/08/07 14:36
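
The record above describes ranking disc images by bit-string manipulation rather than formal queries. An illustrative sketch (not the paper's actual algorithm) of one such non-deductive match, assuming each image carries an attribute bit-string and candidates are ranked by set bits shared with the designer's query string; the function and catalogue names are hypothetical:

```python
# Hypothetical bit-string ranking: images whose attribute bits overlap
# most with the query bit-string come first. Python ints serve as
# arbitrary-width bit-strings.

def rank_images(query, catalogue):
    """Return image ids sorted by descending bit overlap with the query."""
    def overlap(bits):
        # Count the set bits shared between query and this image.
        return bin(query & bits).count("1")
    return sorted(catalogue, key=lambda img_id: -overlap(catalogue[img_id]))
```

A query of `0b1010` against `{"a": 0b1110, "b": 0b0001, "c": 0b1010}` ranks "a" and "c" (two shared bits each) ahead of "b" (none).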

_id ecaade2008_000
id ecaade2008_000
authors Muylle, Marc (ed.)
year 2008
title ARCHITECTURE ‘in computro’ - Integrating methods and techniques
source 26th eCAADe Conference Proceedings [ISBN 978-0-9541183-7-2], Antwerp (Belgium) 26-29 September 2008, 968 p.
summary The presence of both visible and hidden digital resources in daily life is overwhelming, and their presence continues to grow exponentially. It is surprising how little the impact of this evolution is questioned, especially in education. Reflecting on past experiences of this subject in order to learn for the future seems rarely to be done, and the sheer fact that a digital method exists is often seen as sufficient justification for its use. Are these the perceptions of serious misgivings, or isolated views that circulate in the educational world and beyond? We would suggest that eCAADe (Education and Research in Computer Aided Architectural Design in Europe) and its conferences provide the ideal forum to provide answers in this debate. For the first conference in what is to be the next quarter century of the existence of eCAADe, a theme was chosen that could easily include all aspects of this debate: ARCHITECTURE ‘in computro’ - Integrating methods and techniques. It seems a bit vulgar to use the dog-Latin phrase ‘in computro’ for such a serious matter, but at least it now has a place between ‘in vivo’ and ‘in vitro’. For more than 25 years CAAD has been available, and it has been more and more successfully used in research and commercial architectural practice. In education, which by definition should prepare students for the future, the constantly evolving CAAD metaphor is provoking a challenge to cope with the ever expanding scope of related topics. It is not surprising that this has led to differing opinions as to how CAAD should be taught. Questions such as how advanced research results can be incorporated in teaching, or whether the Internet is provoking self-education by students, are in striking contrast with more fundamental issues such as the discussion of analogue versus digital design methods. Is CAAD a part of design teaching, or is it its logical successor in a global E-topia?
Although the E of education is a prominent factor in the ‘raison d’être’ of the organisation, the papers presented at this conference illustrate that eCAADe is open to all other relevant contributions in the area of computer-aided architectural design. It will be a fortunate coincidence that this exchange of knowledge and opinions on such state-of-the-art subjects will be hosted by The Higher Institute of Architectural Sciences, Henry van de Velde, located in the historical buildings of the Royal Academy of Fine Art, founded in 1662.
series eCAADe
type normal paper
email marc.muylle@artesis.be
more http://www.ecaade.org
last changed 2008/09/09 15:20

_id ecaade2015_110
id ecaade2015_110
authors Nagakura, Takehiko; Tsai, Daniel and Choi, Joshua
year 2015
title Capturing History Bit by Bit - Architectural Database of Photogrammetric Model and Panoramic Video
source Martens, B, Wurzer, G, Grasl T, Lorenz, WE and Schaffranek, R (eds.), Real Time - Proceedings of the 33rd eCAADe Conference - Volume 1, Vienna University of Technology, Vienna, Austria, 16-18 September 2015, pp. 685-694
wos WOS:000372317300074
summary Architecture changes in real time. It appears differently as the sun and weather shift. And over a long span, it naturally wears and decays or may be renovated. This paper discusses the use of two emerging low-cost technologies, photogrammetric modeling and panoramic video, for recording such transformations of buildings. These methods uniquely capture a moment in the existence of a building, and deliver its three-dimensional appearance and the sense of traversing it like no other conventional medium. An approach with a database platform is proposed as a solution for storing recordings amassed from fieldwork and making useful heterogeneous representations out of these unique contents for studying architectural designs.
series eCAADe
email takehiko@mit.edu
more https://mh-engage.ltcc.tuwien.ac.at/engage/ui/watch.html?id=e74479fc-7029-11e5-9c41-d78521461413
last changed 2016/05/16 09:08

_id 2b17
authors Salesin, David and Barzel, Ronen
year 1986
title Two-Bit Graphics
source IEEE Computer Graphics and Applications. June, 1986. vol. 6: pp. 36-42 : ill. includes bibliography
summary Ordinary bitmaps allow pixels to be black or white. The authors introduce a second bitmap, the 'alpha' bitmap, which allows pixels to be transparent as well. The alpha bitmap makes it possible to have black-and-white images that are nonrectangular or that have holes in them. It also provides a richer set of operations for working with bitmaps. The article presents the mathematics for a two-bit compositing algebra, and suggests extensions for two-bit compositing, painting, and region filling. Each of these operations can be implemented with ordinary bitblts and presented on ordinary bitmap displays. The authors analyze the cost of each two-bit operation in terms of the number of bitblts it requires
keywords computer graphics, algorithms, display
series CADline
last changed 2003/06/02 11:58
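
The record above describes a compositing algebra over paired colour/alpha bitmaps. A minimal sketch in that spirit, assuming each scanline of the colour and alpha bitmaps is packed into an integer (the function name and packing are illustrative assumptions, not the paper's notation):

```python
# Illustrative two-bit "over" compositing on packed scanlines: where the
# foreground alpha bit is 1 the foreground pixel wins; elsewhere the
# background shows through. Python ints act as arbitrary-width bitmaps.

def over(fg_color, fg_alpha, bg_color, bg_alpha):
    """Composite foreground over background, independently per bit position."""
    color = (fg_color & fg_alpha) | (bg_color & ~fg_alpha)
    alpha = fg_alpha | bg_alpha      # covered where either layer is opaque
    return color, alpha
```

Each bitwise operator here corresponds to one bitblt-style pass over the bitmaps, which is what makes the cost countable in bitblts as the paper describes.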

_id 7670
authors Sawicki, Bogumil
year 1995
title Ray Tracing – New Chances, Possibilities and Limitations in AutoCAD
source CAD Space [Proceedings of the III International Conference Computer in Architectural Design] Bialystock 27-29 April 1995, pp. 121-136
summary Realistic image synthesis is nowadays widely used in engineering applications. Some of these applications, such as architectural, interior, lighting and industrial design, demand accurate visualization of non-existent scenes as they would look to us when built in reality. This can only be achieved by using physically based models of light interaction with surfaces and by simulating the propagation of light through an environment. Ray tracing is one of the most powerful techniques used in computer graphics, and it can produce such very realistic images. The ray tracing algorithm follows the paths of light rays backwards from the observer into the scene. It is a very time-consuming process and as such could not be developed until sufficiently powerful computers appeared. In recent years the technological improvements in the computer industry have brought more powerful machines with bigger storage capacities and better graphic devices. Owing to these increased hardware capabilities, successful implementation of ray tracing in different CAD software became possible also on PC machines. Ray tracing in AutoCAD r.12 - the most popular CAD package in the world - is the best example of that. AccuRender and AutoVision are AutoCAD Development System (ADS) applications that use ray tracing to create photorealistic images from 3D AutoCAD models. These "internal" applications let users generate synthetic images of three-dimensional models and scenes entirely within AutoCAD space and show the effects directly on the main AutoCAD screen. The ray tracing algorithm accurately calculates and displays shadows, transparency, diffusion, reflection, and refraction from the surface qualities of user-defined materials. The accurate modelling of light lets users produce sophisticated effects and high-quality images, which these ray tracers always generate at 24-bit pixel depth, providing 16.7 million colours.
These results can be quite impressive for some architects and are almost acceptable for others, but that coloured virtual world, which is presented by ray tracing in AutoCAD space in such a convincing way, is still not exactly the same as the real world. The main limitations of realism are due to the nature of the ray tracing method. The classical ray tracing technique takes into account the effects of light reflection from neighbouring surfaces but leaves out of account the ambient and global illumination arising from complex interreflections in an environment. So models generated by ray tracing belong to an "ideal" world where real materials and environments can't find their right place. We complain about that fact and say that ray tracing shows us a "too specular world", but (...) (...) is there anything better on the horizon? It should be concluded that the typical abilities of today's graphics software and hardware are far from exploited. As has been observed in the literature, various works have been carried out with the explicit intention of overcoming all these ray tracing limitations. These researches seem very promising and let us hope that their results will be seen in CAD applications soon. As with modelling, perhaps the answer will come from a variety of techniques that can be combined with ray tracing depending on the case we are dealing with. Therefore, from the point of view of architects who try to keep alive some interest in the nature of materials and their interaction with form, "ray tracing" seems to be the right path of research and development that we can still follow a long way. From the point of view of the school, a critical assimilation of "ray tracing" processes is required, one that might help to determine exactly their distortions and to indicate the correct way of their development and right place in CAAD education.
I trust that ray tracing will become standard not only in AutoCAD but in all architectural space modelling CAD applications, and that it will be established as a powerful and real tool for experimental research in the architectural design process. Will technological progress be as significant in the near future as is anticipated?
series plCAD
last changed 2000/01/24 09:08
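
The record above describes following light rays backwards from the observer into the scene. A minimal sketch of that idea (not AccuRender's or AutoVision's code), assuming a one-sphere scene and simple Lambertian shading; all names and the scene setup are illustrative:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

def hit_sphere(origin, direction, center, radius):
    """Return the nearest positive ray parameter t, or None on a miss."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c            # direction is unit length, so a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def shade(origin, direction, center, radius, light_dir):
    """Trace one ray backwards from the eye; return a grey level in [0, 1]."""
    t = hit_sphere(origin, direction, center, radius)
    if t is None:
        return 0.0                    # ray escapes to the background
    hit = tuple(o + t * d for o, d in zip(origin, direction))
    normal = normalize(tuple(h - c for h, c in zip(hit, center)))
    return max(0.0, dot(normal, normalize(light_dir)))
```

Repeating `shade` for one ray per pixel yields the image; the shadow, reflection and refraction effects the abstract mentions come from recursively spawning further rays at each hit point, which this sketch omits.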

_id 861a
authors Sedas, Sergio W. and Talukdar, Sarosh N.
year 1987
title A Disassembly Planner for Redesign
source The Winter Annual Meeting of the American Society of Mechanical Engineers. Symposium of Intelligent and Integrated Manufacturing Analysis and Synthesis. December, 1987. Pittsburgh, PA: Engineering Design Research Center, CMU, 1988. [6] p. : ill. includes bibliography
summary This paper describes an algorithm for generating plans for disassembling given objects. The plans are produced by a set of knowledge sources acting on a set of representations for the object. Both sets are arbitrarily expandable, so programs using the approach can grow continually in capability. Our present complement of knowledge sources and representations can tackle relatively difficult problems. Three examples are included. The first requires a good bit of geometric reasoning before appropriate subassemblies can be selected. The second and third require certain movable parts to be repositioned before disassembly can be achieved
keywords algorithms, representation, synthesis, assemblies, knowledge, reasoning, mechanical engineering
series CADline
last changed 2003/06/02 11:58
