CumInCAD is a Cumulative Index of publications in Computer Aided Architectural Design
supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD futures


Hits 1 to 20 of 37

_id 59ca
authors Bhavnani, S.K. and Bates, M.J.
year 2002
title Separating the Knowledge Layers: Cognitive Analysis of Search Knowledge Through Hierarchical Goal Decompositions
source Proceedings of the ASIST'2002 (2002), 204-213
summary Hierarchical goal decompositions have proved to be a useful method to make explicit the knowledge required by users to perform tasks in a wide range of applications such as computer-aided drafting (CAD) systems. This analysis method progressively decomposes a given task starting from the task layer on the top of the decomposition, to the keystroke layer at the bottom. The analysis enables a close inspection of the knowledge required to perform the task at each layer of the decomposition. In this paper we show how the method of hierarchical goal decomposition can be used to understand more precisely the knowledge that is required to perform information search tasks. The analysis pinpoints: (1) the critical strategies in the intermediate layers of knowledge that are known by expert searchers; (2) why such knowledge is difficult for novice searchers to acquire; (3) how the analysis provides testable predictions of behavior based on the acquisition of different types of knowledge. We conclude by discussing the advantages provided by hierarchical goal decompositions, and how such an approach can lead to the design of systems and training.
series other
email bhavnani@umich.edu
last changed 2003/11/21 14:16
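
A hierarchical goal decomposition of the kind described in this abstract can be pictured as a small tree walked from the task layer down to the keystroke layer. The following is a minimal Python sketch with invented layer names and an invented search task; it is not the authors' notation, only an illustration of the structure.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Goal:
    name: str
    layer: str                      # e.g. "task", "strategy", "tactic", "keystroke"
    subgoals: List["Goal"] = field(default_factory=list)

def keystroke_layer(goal: Goal) -> List[str]:
    """Collect the bottom-layer actions reached by the decomposition."""
    if not goal.subgoals:
        return [f"{goal.layer}: {goal.name}"]
    actions = []
    for sub in goal.subgoals:
        actions.extend(keystroke_layer(sub))
    return actions

# Hypothetical decomposition of an information search task.
task = Goal("find recent articles on CAD usability", "task", [
    Goal("narrow by date", "strategy", [
        Goal("apply year filter", "tactic", [
            Goal("type 2000..2002 in year field", "keystroke"),
            Goal("press Enter", "keystroke"),
        ]),
    ]),
])

for action in keystroke_layer(task):
    print(action)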

_id acadia13_071
id acadia13_071
authors Burry, Jane; Salim, Flora; Williams, Mani; Anton Nielsen, Stig; Pena de Leon, Alex; Sharaidin, Kamil; Burry, Mark
year 2013
title Understanding Heat Transfer Performance for Designing Better Façades
source ACADIA 13: Adaptive Architecture [Proceedings of the 33rd Annual Conference of the Association for Computer Aided Design in Architecture (ACADIA) ISBN 978-1-926724-22-5] Cambridge 24-26 October, 2013, pp. 71-78
summary This early research focuses on the design of building façades to mediate external and internal thermal conditions. It explores a new workflow for accessible feedback into the early design of façade systems. Specifically, this research aims to explore the level of corroboration or the gap between predictions of thermal behavior using digital modeling and simulation, and the empirical measurement of thermal behavior in physical analog models for façade design.
keywords Tools and Interfaces: façade design, heat transfer, performance-based design, simulation, data visualization.
series ACADIA
type normal paper
email jane.burry@gmail.com
last changed 2014/01/11 08:13

_id ddssar0229
id ddssar0229
authors De Vries, B., Jessurun, A.J. and Dijkstra, J.
year 2002
title Conformance Checking by Capturing and Simulating Human Behaviour in the Built Environment
source Timmermans, Harry (Ed.), Sixth Design and Decision Support Systems in Architecture and Urban Planning - Part one: Architecture Proceedings, Avegoor, the Netherlands, 2002
summary In order to model natural human behaviour, it is necessary to capture this behaviour. First, we will start out by modelling behaviour for specific situations, such as taking a seat in a theatre. To capture human behaviour, the following experiment is performed: Given a virtual environment, a sufficient number of subjects (real humans) are asked to execute a human task in this virtual environment (e.g. take a seat in the theatre). Whenever the subject deviates from the shortest path, the system will ask for a clue why this is done. The hypothesis is that the combination of the motion paths and the clues for making/changing decisions will provide decision rules to make reliable predictions about human behaviour under the same conditions when using virtual persons. To test the hypothesis, we propose to use the university’s main conference and presentation hall as a test case. A 3D model and a motion path graph are constructed that enables a virtual person to find its way to a selected chair. The clues from the experiment are implemented as decision rules that determine a virtual person’s behaviour. Running the simulation will result in the following data: Time per person to find a chair, Deviation from the shortest path, Distance covered per person to find a chair, Distribution of seated persons over time and Relocation of persons. To validate the test case, the process of people entering the hall and finding a chair is recorded on videotape. The walking behaviour of the people observed on the video is analysed and compared with the data from the simulation.
series DDSS
last changed 2003/08/07 14:36
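
The deviation-from-shortest-path measure listed among the simulation outputs above can be illustrated with a toy motion-path graph. A minimal sketch, assuming a hypothetical hall layout and Dijkstra for the optimal route; the paper itself works with a 3D model and a motion path graph of the actual hall.

import heapq

def shortest_path_length(graph, start, goal):
    """Dijkstra over a dict-of-dicts graph {node: {neighbour: distance}}."""
    dist = {start: 0.0}
    queue = [(0.0, start)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == goal:
            return d
        if d > dist.get(node, float("inf")):
            continue
        for nbr, w in graph[node].items():
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(queue, (nd, nbr))
    return float("inf")

def path_length(graph, path):
    return sum(graph[a][b] for a, b in zip(path, path[1:]))

# Hypothetical hall layout: entrance -> aisle nodes -> chair (distances in metres).
hall = {
    "entrance": {"aisle1": 4.0, "aisle2": 6.0},
    "aisle1": {"entrance": 4.0, "chair": 5.0, "aisle2": 3.0},
    "aisle2": {"entrance": 6.0, "aisle1": 3.0, "chair": 2.5},
    "chair": {"aisle1": 5.0, "aisle2": 2.5},
}
observed = ["entrance", "aisle1", "aisle2", "chair"]      # recorded walking path
covered = path_length(hall, observed)
optimal = shortest_path_length(hall, "entrance", "chair")
print(f"distance covered: {covered:.1f} m, deviation from shortest path: {covered - optimal:.1f} m")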

_id acadia13_301
id acadia13_301
authors Dierichs, Karola; Menges, Achim
year 2013
title Aggregate Architecture: Simulation Models for Synthetic Non-convex Granulates
source ACADIA 13: Adaptive Architecture [Proceedings of the 33rd Annual Conference of the Association for Computer Aided Design in Architecture (ACADIA) ISBN 978-1-926724-22-5] Cambridge 24-26 October, 2013, pp. 301-310
summary Aggregate Architectures challenge the common notion of architectural structures as being immutable, permanent and controllable. Aggregate Architectures are understood as material systems consisting of large masses of granules—designed or natural—interacting with each other only through loose, frictional contact. As a consequence, they take the realm of structural stability and architectural planning into entire re-configurability and into merely probable predictions of their prospective behavior. This renders them relevant within the paradigm of Adaptive Architecture. The challenge to the designer is to move away from thinking in terms of clearly defined local and global assembly systems and to acquire tools and modes of design that allow for observation and interaction with the evolving granular architectures. In this context, the focus of the presented research project is on the relevance of mathematically based simulations as tools of investigation and design. The paper introduces the field of Aggregate Architectures. Subsequently, experimental and simulation methods for granulates are outlined and compared. Different modeling and collision-detection methods for non-convex particles are shown and applied in benchmarking simulations for a full-scale architectural prototype. The potential for micro-mechanical simulation analysis within architectural applications is demonstrated and further areas of research outlined.
keywords Tools and Interfaces; aggregate architecture, designed granular matter, discrete element modeling, non-convex particles
series ACADIA
type Normal Paper
email karola.dierichs@icd.uni-stuttgart.de
last changed 2014/01/11 08:13
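
The granular simulations discussed above rest on discrete element modeling: pairwise contact forces integrated explicitly over time. The sketch below shows one time step for spherical particles with a linear contact spring only; it is an assumed, heavily simplified stand-in, since the paper's contribution concerns non-convex particles and dedicated collision detection.

import numpy as np

def dem_step(pos, vel, radius, mass, dt, k=1e4, gravity=-9.81):
    """Advance particle positions one explicit step with pairwise contact forces."""
    n = len(pos)
    forces = np.zeros_like(pos)
    forces[:, 1] += mass * gravity                  # gravity on every particle
    for i in range(n):
        for j in range(i + 1, n):
            d = pos[j] - pos[i]
            dist = np.linalg.norm(d)
            overlap = 2 * radius - dist
            if overlap > 0:                         # particles in contact
                normal = d / dist
                forces[i] -= k * overlap * normal   # equal and opposite push apart
                forces[j] += k * overlap * normal
    vel = vel + forces / mass * dt
    pos = pos + vel * dt
    return pos, vel

# Two slightly overlapping 1 cm particles; all parameter values are assumed.
pos = np.array([[0.0, 0.0], [0.015, 0.0]])
vel = np.zeros_like(pos)
pos, vel = dem_step(pos, vel, radius=0.01, mass=0.05, dt=1e-4)
print(pos)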

_id ecaade2009_160
id ecaade2009_160
authors Erinsel Önder, Deniz; Gigi, Yildirim
year 2009
title Urban Spaces by the Space Syntax Method: A Proposal for the South Haliç Region
source Computation: The New Realm of Architectural Design [27th eCAADe Conference Proceedings / ISBN 978-0-9541183-8-9] Istanbul (Turkey) 16-19 September 2009, pp. 827-834
wos WOS:000334282200101
summary For a designer-architect to be able to make accurate predictions for any particular urban space, he/she needs to know the development stages of the city, as well as the city’s various features across time. Thus, it is necessary to read the different segments that constitute the city to reveal its historical, cultural, social, physical and symbolic features. The aim of this study is to determine the social and physical problems of a historical urban space and subsequently to introduce physical and functional suggestions to improve the identified problems, and for the development of the area. The South Haliç Area was chosen as a work-space because of its special importance in protecting the historical and cultural heritage found there and transferring it to future generations. With this in mind, in addition to literature studies, on-site observations and interviews, the area has been analyzed and evaluated using the space syntax method. The suggestions developed for the identified problems and solutions have been re-analyzed, and both the present data and the data obtained after the suggestions have been examined and the results have been presented.
keywords Reading space, space syntax, The South Haliç Region
series eCAADe
email erinselonder@gmail.com, yildirimgigi@gmail.com
last changed 2016/05/16 09:08
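
The space syntax analysis referred to above is built on topological depth between spaces. A minimal sketch of mean depth and relative asymmetry on an invented convex-space graph; the South Haliç study would use the real axial map, and integration is usually reported as the inverse of RA or of the normalised RRA.

from collections import deque

def mean_depth(adjacency, root):
    """Breadth-first topological depth from one space to all others."""
    depth = {root: 0}
    queue = deque([root])
    while queue:
        node = queue.popleft()
        for nbr in adjacency[node]:
            if nbr not in depth:
                depth[nbr] = depth[node] + 1
                queue.append(nbr)
    k = len(adjacency)
    return sum(depth.values()) / (k - 1)

def relative_asymmetry(adjacency, root):
    k = len(adjacency)
    return 2 * (mean_depth(adjacency, root) - 1) / (k - 2)

# Hypothetical convex-space graph: a street, a square, an alley and a courtyard.
spaces = {
    "street": ["square"],
    "square": ["street", "alley", "courtyard"],
    "alley": ["square", "courtyard"],
    "courtyard": ["square", "alley"],
}
for space in spaces:
    print(space, round(relative_asymmetry(spaces, space), 3))   # lower RA = better integrated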

_id ascaad2009_samir_foura
id ascaad2009_samir_foura
authors Foura, Samir and Debache, Samira
year 2009
title Thermal Simulation In Residential Building Within Computer Aided Architectural Design: Integrated model
source Digitizing Architecture: Formalization and Content [4th International Conference Proceedings of the Arab Society for Computer Aided Architectural Design (ASCAAD 2009) / ISBN 978-99901-06-77-0], Manama (Kingdom of Bahrain), 11-12 May 2009, pp. 235-243
summary Nowadays, the architectural profession is seeking better energy savings in the design of buildings. The fear of energy shortage in the very near future, together with the rapid rise in energy prices, puts pressure on researchers in this field to develop buildings with more efficient heating and energy systems. This work is concerned mainly with the development of a software program analyzing comfort in buildings, integrated in CAD architectural systems. The problem of presenting the computer with information concerning the building itself has been overcome through integration of thermal analysis with the building capabilities of the CAD system. Mainly, such experience concerns the rules for calculating heat loss and heat gain of buildings in Algeria. The program has been developed in order to demonstrate the importance of the innovation of the computer-aided architectural design (CAAD) field in the technology of buildings, such as three-dimensional modeling offering environmental thermal analysis. CAAD is an integrated architectural design system which can be used to carry out many tasks such as working drawings, perspectives and thermal studies, etc., all from the same data. Results are obtained in tabular form or in graphical output on the visual display. The principle of this program is that all input data should be readily available to the designer at the early stages of the design before the user starts to run the integrated model. Particular attention is given to the analysis of thermal aspects including solar radiation gains. Average monthly energy requirement predictions have been estimated depending on the building design aspect. So, this integrated model (CAAD and comfort simulation) is supposed to help architects to decide on the best options for improving the design of buildings. Some of these options may be included in the early design stage analysis. Indications may also be given on how to improve the design. The model stored in the CAAD system provides a valuable database for all sorts of analytical programs to be integrated into the system. The amount of time and expertise required to use complex analytical methods in architectural practice can be successfully overcome by integration with the CAAD system.
series ASCAAD
email sfoura@gmail.com
last changed 2009/06/30 06:12
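
The heat-loss rules mentioned in the abstract reduce, in the steady-state case, to summing U * A over the envelope and multiplying by the indoor-outdoor temperature difference. A minimal sketch with assumed U-values and areas; the actual program also handles solar gains and Algerian climate data.

def envelope_heat_loss(elements, t_inside, t_outside):
    """elements: list of (name, U-value in W/m2K, area in m2). Returns watts."""
    delta_t = t_inside - t_outside
    return sum(u * area for _, u, area in elements) * delta_t

# Hypothetical envelope taken from a CAAD model; coefficients are placeholders.
facade = [
    ("external wall", 0.45, 120.0),
    ("windows",       2.80,  24.0),
    ("roof",          0.30,  80.0),
]
print(f"{envelope_heat_loss(facade, t_inside=20.0, t_outside=5.0):.0f} W")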

_id 7ce5
authors Gal, Shahaf
year 1992
title Computers and Design Activities: Their Mediating Role in Engineering Education
source Sociomedia, ed. Edward Barrett. MIT Press
summary Sociomedia: With all the new words used to describe electronic communication (multimedia, hypertext, cyberspace, etc.), do we need another one? Edward Barrett thinks we do; hence, he coins the term "sociomedia." It is meant to displace a computing economy in which technicity is hypostasized over sociality. Sociomedia, a compilation of twenty-five articles on the theory, design and practice of educational multimedia and hypermedia, attempts to re-value the communicational face of computing. Value, of course, is "ultimately a social construct." As such, it has everything to do with knowledge, power, education and technology. The projects discussed in this book represent the leading edge of electronic knowledge production in academia (not to mention major funding) and are determining the future of educational media. For these reasons, Sociomedia warrants close inspection. Barrett's introduction sets the tone. For him, designing computer media involves hardwiring a mechanism for the social construction of knowledge (1). He links computing to a process of social and communicative interactivity for constructing and disseminating knowledge. Through a mechanistic mapping of the university as hypercontext (a huge network that includes classrooms as well as services and offices), Barrett models intellectual work in such a way as to avoid "limiting definitions of human nature or human development." Education, then, can remain "where it should be--in the human domain (public and private) of sharing ideas and information through the medium of language." By leaving education in a virtual realm (where we can continue to disagree about its meaning and execution), it remains viral, mutating and contaminating in an intellectually healthy way. He concludes that his mechanistic model, by means of its reductionist approach, preserves value (7). This "value" is the social construction of knowledge. While I support the social orientation of Barrett's argument, discussions of value are related to power. I am not referring to the traditional teacher-student power structure that is supposedly dismantled through cooperative and constructivist learning strategies. The power to be reckoned with in the educational arena is foundational, that which (pre)determines value and the circulation of knowledge. "Since each of you reading this paragraph has a different perspective on the meaning of 'education' or 'learning,' and on the processes involved in 'getting an education,' think of the hybris in trying to capture education in a programmable function, in a displayable object, in a 'teaching machine'" (7). Actually, we must think about that hybris because it is, precisely, what informs teaching machines. Moreover, the basic epistemological premises that give rise to such productions are too often assumed. In the case of instructional design, the episteme of cognitive sciences is often taken for granted. It is ironic that many of the "postmodernists" who support electronic hypertextuality seem to have missed Jacques Derrida's and Michel Foucault's "deconstructions" of the epistemology underpinning cognitive sciences (if not of epistemology itself). Perhaps it is the glitz of the technology that blinds some users (qua developers) to the belief systems operating beneath the surface. Barrett is not guilty of reactionary thinking or politics; he is, in fact, quite in line with much American deconstructive and postmodern thinking.
The problem arises in that he leaves open the definitions of "education," "learning" and "getting an education." One cannot engage in the production of new knowledge without orienting its design, production and dissemination, and without negotiating with others' orientations, especially where large-scale funding is involved. Notions of human nature and development are structural, even infrastructural, whatever the medium of the teaching machine. Although he addresses some dynamics of power, money and politics when he talks about the recession and its effects on the conference, they are readily visible dynamics of power (3-4). Where does the critical factor of value determination, of power, of who gets what and why, get mapped onto a mechanistic model of learning institutions? Perhaps a mapping of contributors' institutions, of the funding sources for the projects showcased and for participation in the conference, and of the disciplines receiving funding for these sorts of projects would help visualize the configurations of power operative in the rising field of educational multimedia. Questions of power and money notwithstanding, Barrett's introduction sets the social and textual thematics for the collection of essays. His stress on interactivity, on communal knowledge production, on the society of texts, and on media producers and users is carried forward through the other essays, two of which I will discuss. Section I of the book, "Perspectives...," highlights the foundations, uses and possible consequences of multimedia and hypertextuality. The second essay in this section, "Is There a Class in This Text?," plays on the robust exchange surrounding Stanley Fish's book, Is There a Text in This Class?, which presents an attack on authority in reading. The author, John Slatin, has introduced electronic hypertextuality and interaction into his courses. His article maps the transformations in "the content and nature of work, and the workplace itself"-- which, in this case, is not industry but an English poetry class (25). Slatin discovered an increase of productive and cooperative learning in his electronically-mediated classroom. For him, creating knowledge in the electronic classroom involves interaction between students, instructors and course materials through the medium of interactive written discourse. These interactions lead to a new and persistent understanding of the course materials and of the participants' relation to the materials and to one another. The work of the course is to build relationships that, in my view, constitute not only the meaning of individual poems, but poetry itself. The class carries out its work in the continual and usually interactive production of text (31). While I applaud his strategies which dismantle traditional hierarchical structures in academia, the evidence does not convince me that the students know enough to ask important questions or to form a self-directing, learning community. Stanley Fish has not relinquished professing, though he, too, espouses the indeterminacy of the sign. By the fourth week of his course, Slatin's input is, by his own reckoning, reduced to 4% (39). In the transcript of the "controversial" Week 6 exchange on Gertrude Stein--the most disliked poet they were discussing at the time (40)--we see the blind leading the blind. One student parodies Stein for three lines and sums up his input with "I like it." Another finds Stein's poetry "almost completey [sic] lacking in emotion or any artistic merit" (emphasis added).
On what grounds has this student become an arbiter of "artistic merit"? Another student, after admitting being "lost" during the Wallace Stevens discussion, talks of having more "respect for Stevens' work than Stein's" and adds that Stein's poetry lacks "conceptual significance[, s]omething which people of varied opinion can intelligently discuss without feeling like total dimwits...." This student has progressed from admitted incomprehension of Stevens' work to imposing her (groundless) respect for his work over Stein's. Then, she exposes her real dislike for Stein's poetry: that she (the student) missed the "conceptual significance" and hence cannot, being a person "of varied opinion," intelligently discuss it "without feeling like [a] total dimwit." Slatin's comment is frightening: "...by this point in the semester students have come to feel increasingly free to challenge the instructor" (41). The students that I have cited are neither thinking critically nor are their preconceptions challenged by student-governed interaction. Thanks to the class format, one student feels self-righteous in her ignorance, and empowered to censure. I believe strongly in student empowerment in the classroom, but only once students have accrued enough knowledge to make informed judgments. Admittedly, Slatin's essay presents only partial data (there are six hundred pages of course transcripts!); still, I wonder how much valuable knowledge and metaknowledge was gained by the students. I also question the extent to which authority and professorial dictature were addressed in this course format. The power structures that make it possible for a college to require such a course, and the choice of texts and pedagogy, were not "on the table." The traditional professorial position may have been displaced, but what took its place?--the authority of consensus with its unidentifiable strong arm, and the faceless reign of software design? Despite Slatin's claim that the students learned about the learning process, there is no evidence (in the article) that the students considered where their attitudes came from, how consensus operates in the construction of knowledge, how power is established and what relationship they have to bureaucratic institutions. How do we, as teaching professionals, negotiate a balance between an enlightened despotism in education and student-created knowledge? Slatin, and other authors in this book, bring this fundamental question to the fore. There is no definitive answer because the factors involved are ultimately social, and hence, always shifting and reconfiguring. Slatin ends his article with the caveat that computerization can bring about greater estrangement between students, faculty and administration through greater regimentation and control. Of course, it can also "distribute authority and power more widely" (50). Power or authority without a specific face, however, is not necessarily good or just. Shahaf Gal's "Computers and Design Activities: Their Mediating Role in Engineering Education" is found in the second half of the volume, and does not allow for a theory/praxis dichotomy. Gal recounts a brief history of engineering education up to the introduction of Growltiger (GT), a computer-assisted learning aid for design. He demonstrates GT's potential to impact the learning of engineering design by tracking its use by four students in a bridge-building contest.
What his text demonstrates clearly is that computers are "inscribing and imaging devices" that add another viewpoint to an on-going dialogue between student, teacher, earlier coursework, and other teaching/learning tools. The less proficient students made a serious error by relying too heavily on the technology, or treating it as a "blueprint provider." They "interacted with GT in a way that trusted the data to represent reality. They did not see their interaction with GT as a negotiation between two knowledge systems" (495). Students who were more thoroughly informed in engineering discourses knew to use the technology as one voice among others--they knew enough not simply to accept the input of the computer as authoritative. The less-advanced students learned a valuable lesson from the competition itself: the fact that their designs were not able to hold up under pressure (literally) brought the fact of their insufficient knowledge crashing down on them (and their bridges). They also had, post factum, several other designs to study, especially the winning one. Although competition and comparison are not good pedagogical strategies for everyone (in this case the competitors had volunteered), at some point what we think we know has to be challenged within the society of discourses to which it belongs. Students need critique in order to learn to push their learning into auto-critique. This is what is lacking in Slatin's discussion and in the writings of other avatars of constructivist, collaborative and computer-mediated pedagogies. Obviously there are differences between instrumental types of knowledge acquisition and discoursive knowledge accumulation. Indeed, I do not promote the teaching of reading, thinking and writing as "skills" per se (then again, Gal's teaching of design is quite discursive, if not dialogic). Nevertheless, the "soft" sciences might benefit from "bridge-building" competitions or the re-institution of some forms of agonia. Not everything agonistic is inhuman agony--the joy of confronting or creating a sound argument supported by defensible evidence, for example. Students need to know that soundbites are not sound arguments despite predictions that electronic writing will be aphoristic rather than periodic. Just because writing and learning can be conceived of hypertextually does not mean that rigor goes the way of the dinosaur. Rigor and hypertextuality are not mutually incompatible. Nor is rigorous thinking and hard intellectual work unpleasurable, although American anti-intellectualism, especially in the mass media, would make it so. At a time when the spurious dogmatics of a Rush Limbaugh and Holocaust revisionist historians circulate "aphoristically" in cyberspace, and at a time when knowledge is becoming increasingly textualized, the role of critical thinking in education will ultimately determine the value(s) of socially constructed knowledge. This volume affords the reader an opportunity to reconsider knowledge, power, and new communications technologies with respect to social dynamics and power relationships.
series other
last changed 2003/04/23 13:14

_id 9a6b
authors Hofmeyer, Herm
year 2000
title Combined web crippling and bending moment failure of first-generation trapezoidal steel sheeting : experiments, finite element models, mechanical models
source Eindhoven University of Technology
summary Cold-formed trapezoidal sheeting of thin steel plate is a very popular product for building construction. It combines low weight and high strength and is economical in use. Current design rules, which predict sheeting failure for an interior support, do not provide sufficient insight into the sheeting behaviour, and can differ up to 40% in their predictions. To develop a new design rule, this thesis presents new experiments in which first-generation sheeting behaviour is studied for practical situations. The experiments show that after ultimate load, three different post-failure modes arise. Mechanical models have been developed for the three post-failure modes. These models can help to explain why a certain post-failure mode occurs. Finite element models were used to simulate the experiments. Studying stress distributions with finite element simulations, it can be seen that there are only two ultimate failure modes at ultimate load. One of these ultimate failure modes is not relevant for practice. A mechanical model has been developed for the other ultimate failure mode. This model performs as well as the current design rules, and it provides insight into the sheeting behaviour.
keywords Steel Structures; Constructive Design; Thin Walled Beams; Local Buckling; Steel Profiles
series thesis:PhD
email h.hofmeijer@bwk.tue.nl
last changed 2003/02/12 21:37

_id ddss9828
id ddss9828
authors Holmberg, Stig C.
year 1998
title Anticipation in Evaluation and Assessment of Urban and Regional Plans
source Timmermans, Harry (Ed.), Fourth Design and Decision Support Systems in Architecture and Urban Planning, Maastricht, the Netherlands, ISBN 90-6814-081-7, July 26-29, 1998
summary In order to start a move toward better computer-based supporting tools for the assessment of urban and regional plans, a new research and development endeavour is proposed. In so doing, anticipation and anticipatory computing, i.e. a technique applying modelling and simulation, is found to be an interesting and promising point of departure. Hence, a fuzzy cellular automata computer model (STF) for simulation and anticipation of geographical or physical space is constructed. The main idea is to map an urban plan onto the STF for its assessment. STF has a normalised and continuous, i.e. fuzzy, system variable while both the time and space dimensions take on discrete values. Further, while ordinary cellular automata use local rules, global ones are employed in STF, i.e. there is a total interdependence among all the cells of the automata. Outcomes of STF can be interpreted more as possible future states than exact predictions. Preliminary results seem to be well in line with main characteristics of planned urban or regional geographical spaces. Further, for the managing of multi-criteria choice situations, a fuzzy procedure – the Ordered Weighted Average (OWA) procedure – with continuous control over the degree of ANDOR-ness and with independent control over the degree of tradeoff, is proposed.
keywords Geographical space, Anticipatory Computing, Cellular Automata, Spatio Temporal Fuzzy Model (STF)
series DDSS
last changed 2003/08/07 14:36
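
The Ordered Weighted Average procedure named in the abstract aggregates criterion scores after sorting them, so the weight vector sets the position between AND (minimum) and OR (maximum). A minimal sketch with illustrative scores and weights, not the paper's data.

def owa(scores, weights):
    """Ordered weighted average of normalised criterion scores in [0, 1]."""
    assert abs(sum(weights) - 1.0) < 1e-9 and len(scores) == len(weights)
    ordered = sorted(scores, reverse=True)          # largest score receives weights[0]
    return sum(w * s for w, s in zip(weights, ordered))

criteria = [0.9, 0.4, 0.7]                          # e.g. access, density, greenery (assumed)
print(owa(criteria, [1.0, 0.0, 0.0]))               # pure OR-ness -> max = 0.9
print(owa(criteria, [0.0, 0.0, 1.0]))               # pure AND-ness -> min = 0.4
print(owa(criteria, [1/3, 1/3, 1/3]))               # full trade-off -> plain mean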

_id 869d
authors Howard, Rob
year 1991
title Building IT 2000 -- A Hypertext Database of Predictions on the Use of Information Technology in Building
source The Computer Integrated Future, CIB W78 Seminar September, 1991. Unnumbered : ill.
summary Hypertext is a medium particularly suitable for providing easy access to diverse information and maintaining it. It was used for a database of papers on the future of many aspects of information technology and their likely use by the year 2000. The recommendations include the development of project databases to integrate the use of computers by all parties to a building project, and the establishment of a building IT forum in the UK. CICA acted as research coordinator for the project and already carries out many of the functions of the building IT forum which will also need to include other organizations in the UK and in other countries. The data in Building IT 2000 will be maintained on hypertext and will take advantage of future developments in hypermedia. These new techniques, with the ability to provide selective access to, and payment for, digital data could help solve the problems of managing building project data. Building IT 2000 will be demonstrated at this conference to show its flexibility. It is available as a printed report or on disk for Macintosh or PC Windows 3.0 computers
keywords hypertext, database, construction, building process, information, hypermedia
series CADline
last changed 1999/02/12 14:08

_id acadia12_305
id acadia12_305
authors Kock, Jeffrey ; Bradley, Benjamin ; Levelle, Evan
year 2012
title The Digital-Physical Feedback Loop: A Case Study
source ACADIA 12: Synthetic Digital Ecologies [Proceedings of the 32nd Annual Conference of the Association for Computer Aided Design in Architecture (ACADIA) ISBN 978-1-62407-267-3] San Francisco 18-21 October, 2012, pp. 305-314
summary Kukje Art Center, Seoul’s new gallery designed by SO-IL, features a totally bespoke chainmail mesh system (submission note: the authors are not affiliated with SO-IL). A single sheet of complex-curved, tensioned mesh, made up of interlocking 40mm diameter stainless steel rings, wraps the building. This paper discusses the stages of a feedback loop process employed by the authors to refine a digital model of the mesh. The mesh’s perimeter attachment system does not prescribe ring locations, allowing the mesh to form find for itself during installation. As a result, the digital model must capture the behavioral tendencies of the mesh as it negotiates the building’s geometry. Paramount in meeting this challenge was the use of physical mockups. At each stage of the feedback loop process, the working digital model was used to develop a physical mockup of increased scale and complexity, and this mockup was used to refine the digital model. Ultimately, the model output of a mesh relaxation algorithm was used as the basis for engineering simulations and predictions of the mesh vertical ringcount needed at specific locations around the building. Mesh vertical ringcount predictions are validated relative to a 1:1 mockup and the installed Kukje Art Center mesh.
keywords minimal surface, chainmail mesh, form finding, dynamic relaxation, finite element analysis, feedback loop, tensioned fabric, physical mockup, bespoke cladding, Kukje, Seoul
series ACADIA
type normal paper
email jeffkock@gmail.com
last changed 2013/01/09 10:06
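
The mesh relaxation algorithm mentioned above belongs to the family of dynamic relaxation methods: nodes are moved under out-of-balance forces with damping until they settle. A minimal sketch for a damped spring chain with fixed ends; stiffness, mass and damping values are assumed, and the project's model resolves interlocking rings rather than a simple chain.

import numpy as np

def relax(nodes, fixed, rest_len, k=50.0, mass=0.1, damping=0.95,
          gravity=np.array([0.0, -9.81]), dt=0.01, steps=2000):
    vel = np.zeros_like(nodes)
    for _ in range(steps):
        forces = np.tile(mass * gravity, (len(nodes), 1))
        for i in range(len(nodes) - 1):             # spring between node i and i+1
            d = nodes[i + 1] - nodes[i]
            length = np.linalg.norm(d)
            f = k * (length - rest_len) * d / length
            forces[i] += f
            forces[i + 1] -= f
        forces[fixed] = 0.0                         # supports carry their reactions
        vel = damping * (vel + forces / mass * dt)  # viscous damping bleeds off motion
        nodes = nodes + vel * dt
    return nodes

chain = np.array([[x, 0.0] for x in np.linspace(0.0, 1.0, 6)])
relaxed = relax(chain, fixed=[0, 5], rest_len=0.18)
print(relaxed.round(3))                             # interior nodes sag toward equilibrium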

_id 470c
authors Kuenstle, Michael W.
year 2001
title COMPUTATIONAL FLUID DYNAMIC APPLICATIONS IN WIND ENGINEERING FOR THE DESIGN OF BUILDING STRUCTURES IN WIND HAZARD PRONE AREAS (Computational Flow Dynamic Applications in Wind Engineering for the Design of Building Structures in Wind Hazard Prone Urban Areas)
source SIGraDi biobio2001 - [Proceedings of the 5th Iberoamerican Congress of Digital Graphics / ISBN 956-7813-12-4] Concepcion (Chile) 21-23 november 2001, pp. 67-70
summary This paper documents an initial study investigating the integration of Computational Fluid Dynamics (CFD) simulation modeling into wind mitigation design for building structures located in wind hazard prone areas. Some of the basic principles and theoretical concepts of fluid flow and wind pressure as well as their translation into design criteria for structural analysis and design are reviewed, followed by a discussion of a CFD application case study for a simulated hurricane force wind flow over a low rectangular building using the k-epsilon turbulence model. The techniques and parameters for development of the simulation are discussed and some preliminary interpretations of the results are evaluated by comparing its predictions against existing experimental and analytical data, with special attention paid to the American Society of Civil Engineers, Minimum Design Loads for Buildings and Other Structures, ASCE 7-98 and the Uniform Building Code.
series SIGRADI
email kuenstle@ufl.edu
last changed 2016/03/10 08:54
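
The ASCE 7-98 procedure used as the analytical benchmark in this and the following Kuenstle entries starts from the velocity pressure qz = 0.00256 Kz Kzt Kd V^2 I (q in psf, V in mph). A minimal sketch with placeholder coefficient values, not a worked code case from the paper.

def velocity_pressure(V_mph, Kz, Kzt=1.0, Kd=0.85, I=1.0):
    """ASCE 7-98 velocity pressure at height z, in pounds per square foot."""
    return 0.00256 * Kz * Kzt * Kd * V_mph ** 2 * I

def design_pressure(qz, G=0.85, Cp=0.8):
    """External design pressure p = qz * G * Cp (internal pressure ignored here)."""
    return qz * G * Cp

qz = velocity_pressure(V_mph=130.0, Kz=0.85)        # hurricane-level 3-second gust, assumed exposure
print(f"qz = {qz:.1f} psf, windward wall p = {design_pressure(qz):.1f} psf")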

_id 3071
authors Kuenstle, Michael W.
year 2002
title Escarpment Study in a Virtual Flow Environment A Comparative Analysis of a Single Building Type Modeled in Varying Topological Situations [Escarpment Study in a Virtual Flow Environment. A Comparative Analysis of a Single Building Type Modeled in Varying Topological Situations]
source SIGraDi 2002 - [Proceedings of the 6th Iberoamerican Congress of Digital Graphics] Caracas (Venezuela) 27-29 november 2002, pp. 167-171
summary This paper documents the progress of research to investigate the integration of 3-dimensional computational modeling techniques into wind mitigation analysis and design for building structures located in high wind prone areas. Some of the basic mechanics and theoretical concepts of fluid flow and wind pressure as well as their translation into design criteria for structural analysis and design are reviewed, followed by a discussion of a detailed Computational Fluid Dynamics (CFD) application case study for a simulated “3-second gust” hurricane force wind flow over a low rectangular building located in a coastal region of south Florida. The case study project models the wind flow behavior and pressure distribution over the building structure when situated in three varying conditions within a single terrain exposure category. The simulations include three-dimensional modeling of the building type constructed (1) on-grade in a flat coastal area, (2) above grade with the building elevated on structural columns, and (3) on-grade downwind of an escarpment. The techniques and parameters for development of the simulations are discussed and some preliminary interpretations of the results are evaluated by comparing their predictions to existing experimental and analytical data, with special attention paid to the numerical methods outlined in the American Society of Civil Engineers, Minimum Design Loads for Buildings and Other Structures, ASCE 7-98.
series SIGRADI
email kuenstle@ufl.edu
last changed 2016/03/10 08:54

_id ed3c
authors Kuenstle, Michael W.
year 2002
title Escarpment Study in a Virtual Flow Environment: A Comparative Analysis of a Single Building Type Modeled in Varying Topological Situations
source Thresholds - Design, Research, Education and Practice, in the Space Between the Physical and the Virtual [Proceedings of the 2002 Annual Conference of the Association for Computer Aided Design In Architecture / ISBN 1-880250-11-X] Pomona (California) 24-27 October 2002, pp. 239-247
summary This paper documents the progress of research to investigate the integration of 3-dimensional computational modeling techniques into wind mitigation analysis and design for building structures located in high wind prone areas. Some of the basic mechanics and theoretical concepts of fluid flow and wind pressure as well as their translation into design criteria for structural analysis and design are reviewed, followed by a discussion of a detailed Computational Fluid Dynamics (CFD) application case study for a simulated "3-second gust" hurricane force wind flow over a low rectangular building located in a coastal region of south Florida. The case study project models the wind flow behavior and pressure distribution over the building structure when situated in three varying conditions within a single terrain exposure category. The simulations include three-dimensional modeling of the building type constructed (1) on-grade in a flat coastal area, (2) above grade with the building elevated on structural columns, and (3) on-grade downwind of an escarpment. The techniques and parameters for development of the simulations are discussed and some preliminary interpretations of the results are evaluated by comparing their predictions to existing experimental and analytical data, with special attention paid to the numerical methods outlined in the American Society of Civil Engineers, Minimum Design Loads for Buildings and Other Structures, ASCE 7-98.
series ACADIA
email kuenstle@ufl.edu
last changed 2002/10/26 23:25

_id 79b4
authors Kuenstle, Michael W.
year 2002
title Flow Structure Environment Simulation - A Comparative Analysis of Wind Flow Phenomena and Building Structure Interaction
source Connecting the Real and the Virtual - design e-ducation [20th eCAADe Conference Proceedings / ISBN 0-9541183-0-8] Warsaw (Poland) 18-20 September 2002, pp. 564-568
summary This paper documents the progress of research to investigate the integration of computational modeling techniques into wind mitigation analysis and design for building structures located in high wind prone areas. Some of the basic mechanics and theoretical concepts of fluid flow and wind pressure as well as their translation into design criteria for structural analysis and design are reviewed, followed by a discussion of a detailed Computational Fluid Dynamics (CFD) application case study for a simulated “3-second gust” wind flow over a low rectangular building located in a coastal region. The case study project models the wind flow behavior and pressure distribution over the building structure when situated in three varying conditions within a single terrain exposure category. The simulations include three-dimensional modeling of the building type constructed (1) on-grade in a flat coastal area, (2) above grade with the building elevated on structural columns, and (3) on-grade downwind of an escarpment. The techniques and parameters for development of the simulations are discussed and some preliminary interpretations of the results are evaluated by comparing their predictions to existing experimental and analytical data.
series eCAADe
email kuenstle@ufl.edu
last changed 2002/09/09 17:19

_id c38b
authors Kunz, J.C., Christiansen, T.R., Cohen, G.P., Jin, Y. and Levitt, R.E.
year 1998
title The Virtual Design Team
source Communications of The ACM, Vol. 41, No. 11, November, 1998
summary The long-range goal of the "Virtual Design Team" (VDT) research program is to develop computational tools to analyze decision making and communication behavior and thereby to support true organizational (re)engineering. This article introduces the underlying theory, the implementation of the theory as a computational model, and results from industrial test cases. Organization theory traditionally describes organizations only at an aggregate level, describing and predicting the behavior of entire organizations in terms of general qualitative predictions. We define and implement a "micro" theory of the structure and behavior of components of organizations, explicitly representing activities, groups of people called "actors," and organizational structure and policies for project teams. A VDT model can be "run" by a discrete event simulation. Emergent aggregate model output behaviors include the predicted time to complete a project, the total effort to do the project, and a measure of process quality. More detailed model behaviors include the time-varying backlog of individual actors and the "exceptions" associated with activities. The results are detailed and specific, so they can guide specific managerial interventions in a project team and can support sensitivity studies of the relative impact of different organizational changes. We conclude that such a theory is tractable and predictive for complex but relatively routine, project-oriented design tasks. The application for which VDT offers unique new kinds of insights is where an organization is striving to shrink time to market dramatically for a product that is similar to ones it has previously developed. Reducing time to market dramatically almost always requires that previously sequential activities are executed more concurrently. In this situation, experienced managers can still correctly identify the required activities and estimate their durations and skill requirements; but they almost always underestimate the increased workload arising from exponentially higher coordination needs and the propagation of rework between the now highly concurrent activities. The VDT framework, which explicitly models information dependency and failure propagation between concurrent activities, has proven to be far more accurate, and to incorporate a wider range of parameters, than CPM/PERT process models for these fast-paced development projects.
series journal paper
last changed 2003/04/23 13:50
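
VDT is run as a discrete event simulation of actors working through backlogs of activities. The sketch below is only a bare-bones event loop with invented tasks; the actual micro-theory also models the coordination volume, exceptions and rework propagation described in the abstract.

import heapq
from collections import defaultdict, deque

def simulate(tasks):
    """tasks: list of (arrival_time, actor, work_hours). Returns the predicted completion time."""
    events = [(t, i, "arrive", actor, hours) for i, (t, actor, hours) in enumerate(tasks)]
    heapq.heapify(events)
    busy_until = defaultdict(float)
    backlog = defaultdict(deque)
    seq = len(events)
    finish = 0.0
    while events:
        time, _, kind, actor, hours = heapq.heappop(events)
        if kind == "arrive":
            backlog[actor].append(hours)            # task joins the actor's backlog
        else:                                       # "done": the actor is freed at this time
            finish = max(finish, time)
        if backlog[actor] and busy_until[actor] <= time:
            work = backlog[actor].popleft()         # start the next queued activity
            busy_until[actor] = time + work
            heapq.heappush(events, (time + work, seq, "done", actor, work))
            seq += 1
    return finish

# Hypothetical project: two actors, three activities (hours are assumed).
project = [(0.0, "architect", 8.0), (0.0, "engineer", 6.0), (4.0, "architect", 10.0)]
print(f"predicted completion: {simulate(project):.1f} h")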

_id 4604
authors Laveau, S. and Faugeras, O.
year 1994
title 3D Scene Representation as a Collection of Images and Fundamental Matrices
source INRIA Report
summary The problem we solve in this paper is the following. Suppose we are given N views of a static scene obtained from different viewpoints, perhaps with different cameras. These viewpoints we call reference viewpoints since they are all we know of the scene. We would like to decide if it is possible to predict another view of the scene taken by a camera from a viewpoint which is arbitrary and a priori different from all the reference viewpoints. One method for doing this would be to use these viewpoints to construct a three-dimensional representation of the scene and reproject this representation on the retinal plane of the virtual camera. In order to achieve this goal, we would have to establish some sort of calibration of our system of cameras, fuse the three-dimensional representations obtained from, say, pairs of cameras thereby obtaining a set of 3-D points, the scene. We would then have to approximate this set of points by surfaces, a segmentation problem which is still mostly unsolved, and then intersect the optical rays from the virtual camera with these surfaces. This is the most straightforward way of going from a set of images to a new image using the current computer vision paradigm of first building a three-dimensional representation of the environment from which the rest is derived. We do not claim that there does not exist any simpler way of using the three-dimensional representation than the one we just sketched, but this is just simply not our point. Our point is that it is possible to avoid entirely the explicit three-dimensional reconstruction process: the scene is represented by its images and by some basically linear relations that govern the way points can be put in correspondence between views when they are the images of the same scene-point. These images and their algebraic relations are all we need for predicting a new image. This approach is similar in spirit to the one that has been used in trinocular stereo. Hypotheses of correspondences between two of the images are used to predict features in the third. These predictions can then be checked to validate or invalidate the initial correspondence. This approach has proved to be quite efficient and accurate. Related to these ideas are those developed in the photogrammetric community under the name of transfer methods which find, for one or more image points in a given image set, the corresponding points in some new image set.
series report
last changed 2003/04/23 13:50
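
The prediction of a new view without 3D reconstruction works through epipolar transfer: with fundamental matrices relating views 1 and 2 to the virtual view 3 (convention x3^T F3i xi = 0), the transferred point is the intersection of its two epipolar lines. A minimal sketch for the simple case of cameras displaced by pure translations; the matrices below are placeholders, not calibrated values from the report.

import numpy as np

def transfer_point(x1, x2, F31, F32):
    """Predict the homogeneous image point in view 3 from matches in views 1 and 2."""
    l1 = F31 @ x1                    # epipolar line of x1 in view 3
    l2 = F32 @ x2                    # epipolar line of x2 in view 3
    x3 = np.cross(l1, l2)            # two lines intersect at their cross product
    return x3 / x3[2]                # normalise to (u, v, 1)

# Assumed setup: view 3 differs from view 1 by an x-translation and from view 2 by a
# y-translation, giving the rank-2 fundamental matrices [t]_x below.
F31 = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, -1.0], [0.0, 1.0, 0.0]])
F32 = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 0.0], [-1.0, 0.0, 0.0]])
x1 = np.array([120.0, 85.0, 1.0])
x2 = np.array([131.0, 88.0, 1.0])
print(transfer_point(x1, x2, F31, F32))   # -> [131. 85. 1.]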

_id e2ea
authors Lee, Hwa-Ryong
year 1999
title The Changing Face of Architectural Computing Research
source Architectural Computing from Turing to 2000 [eCAADe Conference Proceedings / ISBN 0-9523687-5-7] Liverpool (UK) 15-17 September 1999, pp. 11-17
summary This paper examines the existing commercial and on-going research computer applications for architectural design. It investigates their uses, predictions and limitations; and reviews the teleology, technologies and theories exploited for computerising design. Finally, I will discuss two trends in the developments of CAAD, and present the new directions in CAAD research. This study will be based on understanding the computer's roles in designing, and further on establishing a new theoretical paradigm for mediating a computer system.
keywords Historical Context, Theoretical Paradigms
series eCAADe
email hlee@moe.go.kr
last changed 1999/10/10 12:53

_id ascaad2004_paper22
id ascaad2004_paper22
authors Leifer, David and John M. Leifer
year 2004
title Towards Computer Aided Life-Cycle Costing
source eDesign in Architecture: ASCAAD's First International Conference on Computer Aided Architectural Design, 7-9 December 2004, KFUPM, Saudi Arabia
summary Sustainability is recognised as a necessary public good. Building sustainable buildings requires architectural methods, specifically CAD systems, that include suitable predictions of long term performance. Unfortunately the predominant view in the Building Industries of the Developed world is essentially short term; this is because building developers – not being the end users – are essentially interested in short term profit. Until they can see the ‘value-added’ by sustainability impacting on the selling price of their buildings, they will not be motivated to build ‘sustainably’. This paper describes the issues that have led to this situation. It discusses how the advent of computers has allowed life-cycle data to be gathered over time, and may be included into CAD system databases to enable sustainability performance predictions to be made. Once made, we are now able to reap the benefits by performance benchmarking. The availability of this building performance information on-line is making life-cycle costing more readily available, and more accurate, allowing building developers, owners and users to make rapid and timely feasibility studies well in advance of design. This also allows owners to test various capital to operating cost options in order to get the best economic performance over time, as well as map future capital replacement cycles. These emerging possibilities are discussed in this paper.
series ASCAAD
email dleifer@arch.usyd.edu.au
last changed 2007/04/08 17:47
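
Life-cycle costing of the kind argued for above compares capital cost plus discounted operating and replacement costs across design options. A minimal sketch with assumed figures and a flat discount rate, not data from the paper.

def life_cycle_cost(capital, annual_operating, years, discount_rate, replacements=()):
    """replacements: iterable of (year, cost) capital replacement events."""
    npv = capital
    for year in range(1, years + 1):
        npv += annual_operating / (1 + discount_rate) ** year     # discounted running cost
    for year, cost in replacements:
        npv += cost / (1 + discount_rate) ** year                 # discounted refit cost
    return npv

# Hypothetical comparison: cheap-to-build vs efficient-to-run over 30 years at 5%.
option_a = life_cycle_cost(1_000_000, 60_000, 30, 0.05)
option_b = life_cycle_cost(1_200_000, 40_000, 30, 0.05, [(15, 50_000)])
print(f"A: {option_a:,.0f}  B: {option_b:,.0f}")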

_id 4744
authors Livingstone, Margaret and Hubel, David
year 1988
title Segregation of Form, Color, Movement, and Depth : Anatomy, Physiology, and Perception
source Science. May, 1988. vol. 240: pp. 740-750 : ill. some col. includes bibliography
summary Anatomical and physiological observations in monkeys indicate that the primate visual system consists of several separate and independent subdivisions that analyze different aspects of the same retinal image: cells in cortical visual areas 1 and 2 and higher visual areas are segregated into three interdigitating subdivisions that differ in their selectivity for color, stereopsis, movement, and orientation. The pathways selective for form and color seem to be derived mainly from the parvocellular geniculate subdivisions, the depth- and movement-selective components from the magnocellular. At lower levels, in the retina and in the geniculate, cells in these two subdivisions differ in their color selectivity, contrast sensitivity, temporal properties, and spatial resolution. These major differences in the properties of cells at lower levels in each of the subdivisions led to the prediction that different visual functions, such as color, depth, movement, and form perception, should exhibit corresponding differences. Human perceptual experiments are remarkably consistent with these predictions. Moreover, perceptual experiments can be designed to ask which subdivisions of the system are responsible for particular visual abilities, such as figure/ground discrimination or perception of depth from perspective or relative movement-functions that might be difficult to deduce from single-cell response properties
keywords color, theory, perception
series CADline
last changed 2003/06/02 08:24
