CumInCAD is a Cumulative Index of publications in Computer Aided Architectural Design, supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD Futures


Hits 1 to 20 of 76

_id cf2011_p027
id cf2011_p027
authors Herssens, Jasmien; Heylighen, Ann
year 2011
title A Framework of Haptic Design Parameters for Architects: Sensory Paradox Between Content and Representation
source Computer Aided Architectural Design Futures 2011 [Proceedings of the 14th International Conference on Computer Aided Architectural Design Futures / ISBN 9782874561429] Liege (Belgium) 4-8 July 2011, pp. 685-700.
summary Architects—like other designers—tend to think, know and work in a visual way. In design research, this way of knowing and working is highly valued as paramount to design expertise (Cross 1982, 2006). In the case of architecture, however, it is not only a particular strength, but may as well be regarded as a serious weakness. The absence of non-visual features in traditional architectural spatial representations indicates how these are disregarded as important elements in conceiving space (Dischinger 2006). This bias towards vision, and the suppression of other senses—in the way architecture is conceived, taught and critiqued—results in a disappearance of sensory qualities (Pallasmaa 2005). Nevertheless, if architects design with more attention to non-visual senses, they are able to contribute to more inclusive environments. Indeed, if an environment offers a range of sensory triggers, people with different sensory capacities are able to navigate and enjoy it. Rather than implementing as many sensory triggers as possible, the intention is to make buildings and spaces accessible and enjoyable for more people, in line with the objective of inclusive design (Clarkson et al. 2007), also called Design for All or Universal Design (Ostroff 2001). Within this overall objective, the aim of our study is to develop haptic design parameters that support architects during design in paying more attention to the role of haptics, i.e. the sense of touch, in the built environment by informing them about the haptic implications of their design decisions. In the context of our study, haptic design parameters are defined as variables that can be decided upon by designers throughout the design process, and the value of which determines the haptic characteristics of the resulting design. These characteristics are based on the expertise of people who are congenitally blind, as they are more attentive to non-visual information, and of professional caregivers working with them. The parameters are not intended to be prescriptive, nor to impose a particular method. Instead they seek to facilitate a more inclusive design attitude by informing designers and helping them to think differently. As the insights from the empirical studies with people born blind and caregivers have been reported elsewhere (Authors 2010), this paper starts by outlining the haptic design parameters resulting from them. Following the classification of haptics into active, dynamic and passive touch, the built environment unfolds into surfaces that can act as “movement”, “guiding” and/or “rest” plane. Furthermore, design techniques are suggested to check the haptic qualities during the design process. Subsequently, the paper reports on a focus group interview/workshop with professional architects to assess the usability of the haptic design parameters for design practice. The architects were then asked to try out the parameters in the context of a concrete design project. The reactions suggest that the participating architects immediately picked up the underlying idea of the parameters, and recognized their relevance in relation to the design project at stake, but that their representation confronts us with a sensory paradox: although the parameters question the impact of the visual in architectural design, they are meant to be used by designers, who are used to thinking, knowing and working in a visual way.
keywords blindness, design parameters, haptics, inclusive design, vision
series CAAD Futures
email
last changed 2012/02/11 19:21

_id 8c27
authors Kalay, Yehuda E.
year 1982
title Determining the Spatial Containment of a Point in General Polyhedra
source Computer graphics and Image Processing. 1982. vol. 19: pp. 303-334 : ill. includes bibliography. See also criticism and improvements in Orlowski, Marian
summary Determining the inclusion of a point in volume-enclosing polyhedra (shapes) in 3D space is, in principle, the extension of the well-known problem of determining the inclusion of a point in a polygon in 2D space. However, the extra degree of freedom makes 3D point-polyhedron containment analysis much more difficult to solve than the 2D point polygon problem, mainly because of the nonsequential ordering of the shape elements, which requires global shape data to be applied for resolving special cases. Two general O(n) algorithms for solving the problem by reducing the 3D case into the solvable 2D case are presented. The first algorithm, denoted 'the projection method,' is applicable to any planar- faced polyhedron, reducing the dimensionality by employing parallel projection to generate planar images of the shape faces, together with an image of the point being tested for inclusion. The containment relationship of these images is used to increment a global parity-counter when appropriate, representing an abstraction for counting the intersections between the surface of the shape and a halfline extending from the point to infinity. An 'inside' relationship is established when the parity-count is odd. Special cases (coincidence of the halfline with edges or vertices of the shape) are resolved by eliminating the coincidental elements and re-projecting the merged faces. The second algorithm, denoted 'the intersection method,' is applicable to any well- formed shape, including curved-surfaced ones. It reduces the dimensionality by intersecting the polygonal trace of the shape surface at the plane of intersection, which is tested for containing the trace of the point in the plane, directly establishing the overall 3D containment relationship. A particular O(n) implementation of the 2D point-in-polygon inclusion algorithm, which is used for solving the problem once reduced in dimensionality, is also presented. The presentation is complemented by discussions of the problems associated with point-polyhedron relationship determination in general, and comparative analysis of the two particular algorithms presented
keywords geometric modeling, point inclusion, polygons, polyhedra, computational geometry, algorithms, search, B-rep
series CADline
email
last changed 2003/06/02 10:24
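
The projection method in the Kalay record above ultimately reduces the 3D query to the classic 2D parity (ray-crossing) test. The following is a minimal sketch of that 2D test only, under the usual even-odd convention; it is not Kalay's full 3D algorithm and does not handle his special coincidence cases. Function and variable names are illustrative assumptions.

```python
def point_in_polygon(px, py, vertices):
    """Even-odd (parity) test: count crossings of a ray from (px, py)
    extending in the +x direction against each polygon edge.
    `vertices` is a list of (x, y) tuples in order around the polygon."""
    inside = False
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        if (y1 > py) != (y2 > py):                    # edge straddles the line y = py
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > px:                          # crossing lies on the ray
                inside = not inside                   # flip parity
    return inside

# Example: unit square, one point inside and one outside.
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(point_in_polygon(0.5, 0.5, square))  # True
print(point_in_polygon(1.5, 0.5, square))  # False
```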

_id avocaad_2001_16
id avocaad_2001_16
authors Yu-Ying Chang, Yu-Tung Liu, Chien-Hui Wong
year 2001
title Some Phenomena of Spatial Characteristics of Cyberspace
source AVOCAAD - ADDED VALUE OF COMPUTER AIDED ARCHITECTURAL DESIGN, Nys Koenraad, Provoost Tom, Verbeke Johan, Verleye Johan (Eds.), (2001) Hogeschool voor Wetenschap en Kunst - Departement Architectuur Sint-Lucas, Campus Brussel, ISBN 80-76101-05-1
summary "Space," which has long been an important concept in architecture (Bloomer & Moore, 1977; Mitchell, 1995, 1999), has attracted interest of researchers from various academic disciplines in recent years (Agnew, 1993; Benko & Strohmayer, 1996; Chang, 1999; Foucault, 1982; Gould, 1998). Researchers from disciplines such as anthropology, geography, sociology, philosophy, and linguistics regard it as the basis of the discussion of various theories in social sciences and humanities (Chen, 1999). On the other hand, since the invention of Internet, Internet users have been experiencing a new and magic "world." According to the definitions in traditional architecture theories, "space" is generated whenever people define a finite void by some physical elements (Zevi, 1985). However, although Internet is a virtual, immense, invisible and intangible world, navigating in it, we can still sense the very presence of ourselves and others in a wonderland. This sense could be testified by our naming of Internet as Cyberspace -- an exotic kind of space. Therefore, as people nowadays rely more and more on the Internet in their daily life, and as more and more architectural scholars and designers begin to invest their efforts in the design of virtual places online (e.g., Maher, 1999; Li & Maher, 2000), we cannot help but ask whether there are indeed sensible spaces in Internet. And if yes, these spaces exist in terms of what forms and created by what ways?To join the current interdisciplinary discussion on the issue of space, and to obtain new definition as well as insightful understanding of "space", this study explores the spatial phenomena in Internet. We hope that our findings would ultimately be also useful for contemporary architectural designers and scholars in their designs in the real world.As a preliminary exploration, the main objective of this study is to discover the elements involved in the creation/construction of Internet spaces and to examine the relationship between human participants and Internet spaces. In addition, this study also attempts to investigate whether participants from different academic disciplines define or experience Internet spaces in different ways, and to find what spatial elements of Internet they emphasize the most.In order to achieve a more comprehensive understanding of the spatial phenomena in Internet and to overcome the subjectivity of the members of the research team, the research design of this study was divided into two stages. At the first stage, we conducted literature review to study existing theories of space (which are based on observations and investigations of the physical world). At the second stage of this study, we recruited 8 Internet regular users to approach this topic from different point of views, and to see whether people with different academic training would define and experience Internet spaces differently.The results of this study reveal that the relationship between human participants and Internet spaces is different from that between human participants and physical spaces. In the physical world, physical elements of space must be established first; it then begins to be regarded as a place after interaction between/among human participants or interaction between human participants and the physical environment. In contrast, in Internet, a sense of place is first created through human interactions (or activities), Internet participants then begin to sense the existence of a space. 
Therefore, it seems that, among the many spatial elements of the Internet we found, "interaction/reciprocity" -- either between/among human participants or between human participants and the computer interface -- is the most crucial element. In addition, another interesting result of this study is that verbal (linguistic) elements can provoke a sense of space to a degree higher than 2D visual representation and no less than 3D visual simulations. Nevertheless, verbal and 3D visual elements seem to work in different ways in terms of cognitive behaviors: verbal elements provoke visual imagery and other sensory perceptions by "imagining" and then excite personal experiences of space; visual elements, on the other hand, provoke and excite visual experiences of space directly by "mapping." Finally, it was found that participants with different academic training did experience and define space differently. For example, when experiencing and analyzing Internet spaces, architectural designers, the creators of the physical world, emphasize the design of circulation and orientation, while participants with linguistics training focus more on subtle language usage. Visual designers tend to analyze the graphical elements of virtual spaces based on traditional painting theories; industrial designers, on the other hand, tend to treat these spaces as industrial products, emphasizing the concept of user-centeredness and the control of the computer interface. The findings of this study seem to add new information to our understanding of virtual space. It would be interesting for future studies to investigate how this information influences architectural designers in their real-world practices in this digital age. In addition, to obtain a fuller picture of Internet space, further research is needed to study the same issue by examining more Internet participants who have no formal linguistics or graphical training.
series AVOCAAD
email
last changed 2005/09/09 10:48

_id 8a88
authors Anderson, David P.
year 1982
title Hidden Line Elimination in Projected Grid Surfaces
source ACM Transactions on Graphics. October, 1982. vol. 1: pp. 274-288 : ill. includes a short bibliography
summary The hidden line and hidden surface problems are simpler when restricted to special classes of objects. An example is the class of grid surfaces, that is, graphs of bivariate functions represented by their values on a set of grid points. Projected grid surfaces have geometric properties which permit hidden line or hidden surface elimination to be done more easily than in the general case. These properties are discussed in this paper, and an algorithm is given which exploits them
keywords algorithms, hidden lines, hidden surfaces, grids, computer graphics
series CADline
last changed 2003/06/02 10:24
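
The abstract above does not spell the algorithm out, but a standard way to exploit the structure of projected grid surfaces is the floating-horizon idea: process grid rows from front to back and keep only the samples that rise above the horizon accumulated so far. The sketch below illustrates that general idea under assumed conventions (a height array indexed by row and column, rows ordered front to back, a crude additive projection, and only the upper horizon tracked); it is not necessarily the algorithm given in the paper.

```python
import numpy as np

def visible_samples(z, row_offset=0.5):
    """Floating-horizon visibility for a grid surface z[row, col].
    Rows are assumed ordered front to back; row r is projected by lifting
    it by row_offset * r (a crude oblique projection). A sample is kept
    only if it rises above the highest projected value seen so far in
    its column (the 'floating horizon')."""
    n_rows, n_cols = z.shape
    horizon = np.full(n_cols, -np.inf)        # highest curve drawn so far
    visible = np.zeros_like(z, dtype=bool)
    for r in range(n_rows):
        projected = z[r] + row_offset * r     # screen-space height of row r
        visible[r] = projected > horizon
        horizon = np.maximum(horizon, projected)
    return visible

# Tiny example: a single bump in the middle of a 5x5 grid hides samples behind it.
zz = np.zeros((5, 5))
zz[2, 2] = 3.0
print(visible_samples(zz).astype(int))
```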

_id 8239
authors Campello, Ruy Eduardo and Maculan, Nelson
year 1982
title On Deep Disjunctive Cutting Planes for Set Partitioning : A Computationally Oriented Research.
source Pittsburgh: Design Research Center, CMU [DRC-70-11-82], 10 p.
summary Several mathematical programming problems can be formulated as Disjunctive Programming Problems. This approach offers a powerful procedure for the generation of new and strong cutting planes with desirable properties. For general integer programs, the traditional cutting plane methodologies proved less efficient than enumerative techniques. However, for certain classes of problems, such as set partitioning, cutting planes are known to be efficient. Since the disjunctive cuts are strong, they can be expected to perform better. This paper reports on computational results with disjunctive B(.) cuts for the set partitioning problem, evaluated in terms of computer resources and other independent measures in solving specific randomly generated test problems under controlled conditions. [includes bibliography].
keywords Mathematics, Operations Research, Integer Programming, Optimization
series CADline
last changed 1999/02/15 15:17
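
For readers unfamiliar with the model being cut: set partitioning asks for a minimum-cost selection of columns (subsets) such that every row (element) is covered exactly once, i.e. minimize c'x subject to Ax = 1 with x binary. The brute-force sketch below only states and checks the problem on an invented toy instance; it has nothing to do with the disjunctive cutting planes studied in the paper.

```python
from itertools import product

def set_partition_bruteforce(subsets, costs, universe):
    """Enumerate all 0/1 column selections and keep the cheapest one whose
    chosen subsets partition `universe` (every element covered exactly once)."""
    best_cost, best_pick = None, None
    for pick in product([0, 1], repeat=len(subsets)):
        covered = [e for s, x in zip(subsets, pick) if x for e in s]
        if sorted(covered) == sorted(universe):            # exact cover
            cost = sum(c for c, x in zip(costs, pick) if x)
            if best_cost is None or cost < best_cost:
                best_cost, best_pick = cost, pick
    return best_cost, best_pick

# Invented toy instance: partition {1, 2, 3, 4} from four candidate subsets.
subsets = [{1, 2}, {3, 4}, {1, 3}, {2, 4}]
costs = [3, 2, 2, 2]
print(set_partition_bruteforce(subsets, costs, [1, 2, 3, 4]))  # (4, (0, 0, 1, 1))
```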

_id 66df
authors Cendes, Z.J., Minhas, F.U. and Silvester, P.P.
year 1982
title Universal Finite Element Matrices for Tetrahedra
source 45, [22] p Pittsburgh: Design Research Center, CMU, December, 1982. DRC- 18-58-82. includes bibliography.
summary Methods are described for forming finite element matrices for a wide variety of operators on tetrahedral finite elements, in a manner similar to that previously employed for line segments and triangles. This technique models the differentiation and product-embedding operators as rectangular matrices, and produces finite element matrices by replacing all required analytic operations by their finite matrix analogues. The method is illustrated by deriving the conventional matrix representation for Laplace's equation. Brief computer programs are given, which generate universal finite element matrices for use in various applications
keywords mathematics, computational geometry, finite elements, analysis
series CADline
last changed 2003/06/02 13:58
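
The paper's worked illustration is the conventional matrix representation of Laplace's equation. As a point of reference, the sketch below assembles that conventional element stiffness matrix for a single linear (4-node) tetrahedron in the usual way, with shape-function gradients taken from the inverse of the node-coordinate matrix; it is not the universal rectangular-matrix construction the paper itself proposes.

```python
import numpy as np

def laplace_stiffness_tet(nodes):
    """Conventional element matrix K[i, j] = V * grad(N_i) . grad(N_j) for
    Laplace's equation on a linear tetrahedron; `nodes` is a 4x3 array of
    vertex coordinates."""
    nodes = np.asarray(nodes, dtype=float)
    M = np.hstack([np.ones((4, 1)), nodes])   # row j: [1, x_j, y_j, z_j]
    volume = abs(np.linalg.det(M)) / 6.0
    C = np.linalg.inv(M)                      # N_i = C[0, i] + C[1:, i] . (x, y, z)
    grads = C[1:, :]                          # column i holds grad(N_i)
    return volume * grads.T @ grads           # 4x4 symmetric element matrix

# Reference tetrahedron with unit legs along the coordinate axes.
K = laplace_stiffness_tet([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
print(np.round(K, 3))                         # each row sums to zero, as expected
```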

_id 2415
authors Nievergelt, J. and Preparata, Franco P.
year 1982
title Plane-Sweep Algorithms for Intersecting Geometric Figures
source Communications of the ACM. October, 1982. vol. 25: pp. 739-747 : ill. includes bibliography
summary Algorithms in computational geometry are of increasing importance in computer-aided design, for example, in the layout of integrated circuits. The efficient computation of the intersection of several superimposed figures is a basic problem. Plane figures defined by points connected by straight line segments are considered, for example, polygons (not necessarily simple) and maps (embedded planar graphs). The regions into which the plane is partitioned by these intersecting figures are to be processed in various ways such as listing the boundary of each region in cyclic order or sweeping the interior of each region. Let n be the total number of points of all the figures involved and s be the total number of intersections of all line segments. A plane-sweep algorithm that solves the problems above is presented; in the general case (non-convexity) it runs in time O((n+s) log n) and space O(n+s); when the regions of each given figure are convex, the same can be achieved in time O(n log n + s) and space O(n)
keywords computational geometry, algorithms, intersection, mapping, polygons, data structures, analysis
series CADline
last changed 2003/06/02 10:24
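
The algorithm in the paper maintains a sweep-line status structure to reach O((n+s) log n); the sketch below shows only the bare plane-sweep idea in a much simpler, assumed form: sort segment endpoints as events, keep an active set of segments currently spanned by the sweep line, and test each newly activated segment against that active set. It is illustrative code with a quadratic worst case, not the authors' algorithm.

```python
def segments_intersect(p, q, r, s):
    """True if segments pq and rs intersect (properly or by touching)."""
    def orient(a, b, c):
        return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])
    def on_seg(a, b, c):
        return (min(a[0], b[0]) <= c[0] <= max(a[0], b[0]) and
                min(a[1], b[1]) <= c[1] <= max(a[1], b[1]))
    o1, o2 = orient(p, q, r), orient(p, q, s)
    o3, o4 = orient(r, s, p), orient(r, s, q)
    if o1 * o2 < 0 and o3 * o4 < 0:
        return True
    return any(o == 0 and on_seg(a, b, c) for o, (a, b, c) in
               [(o1, (p, q, r)), (o2, (p, q, s)), (o3, (r, s, p)), (o4, (r, s, q))])

def sweep_intersections(segments):
    """Sweep over endpoint events sorted by x; a new segment is tested only
    against segments whose x-range still overlaps the sweep position."""
    events = []                                 # (x, is_right_endpoint, index)
    for i, (a, b) in enumerate(segments):
        lo, hi = sorted([a, b])
        events += [(lo[0], 0, i), (hi[0], 1, i)]
    active, found = set(), []
    for _, is_right, i in sorted(events):
        if is_right:
            active.discard(i)
        else:
            a, b = segments[i]
            for j in sorted(active):
                c, d = segments[j]
                if segments_intersect(a, b, c, d):
                    found.append((j, i))
            active.add(i)
    return found

segs = [((0, 0), (4, 4)), ((0, 4), (4, 0)), ((5, 0), (6, 1))]
print(sweep_intersections(segs))                # [(0, 1)]: only the first two cross
```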

_id 2243
authors O'Rourke, J., Chien, C.-B. and Olson, Th. (et al)
year 1982
title A New Linear Algorithm for Intersecting Convex Polygons
source Computer Graphics and Image Processing. 1982. vol. 19: pp. 384-391 : ill. includes a short bibliography
summary An algorithm is presented that computes the intersection of two convex polygons in linear time. The algorithm is fundamentally different from the only known linear algorithms for this problem, due to Shamos and to Hoey. These algorithms depend on a division of the plane into either angular sectors (Shamos) or parallel slabs (Hoey), and are mildly complex. The authors' algorithm searches for the intersection points of the polygons by advancing a single pointer around each polygon, and is very easy to program
keywords algorithms, boolean operations, polygons, intersection, search
series CADline
last changed 2003/06/02 14:42
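
For comparison with the pointer-advancing idea described above, a much simpler (though O(nm) rather than linear) way to intersect two convex polygons is successive half-plane clipping in the style of Sutherland-Hodgman. The sketch below uses that well-known alternative; it is not the authors' algorithm, and the polygons in the example are invented.

```python
def clip_convex(subject, clip):
    """Intersect convex polygon `subject` with convex polygon `clip` (both CCW
    lists of (x, y)) by clipping against each edge of `clip` in turn."""
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    def intersect(p1, p2, a, b):
        d1, d2 = cross(a, b, p1), cross(a, b, p2)     # signed distances to line ab
        t = d1 / (d1 - d2)
        return (p1[0] + t * (p2[0] - p1[0]), p1[1] + t * (p2[1] - p1[1]))
    output = list(subject)
    m = len(clip)
    for i in range(m):
        a, b = clip[i], clip[(i + 1) % m]
        inputs, output = output, []
        if not inputs:
            break
        for j in range(len(inputs)):
            p_prev, p = inputs[j - 1], inputs[j]
            if cross(a, b, p) >= 0:                    # p inside this half-plane
                if cross(a, b, p_prev) < 0:
                    output.append(intersect(p_prev, p, a, b))
                output.append(p)
            elif cross(a, b, p_prev) >= 0:             # edge leaves the half-plane
                output.append(intersect(p_prev, p, a, b))
    return output

square = [(1, 1), (3, 1), (3, 3), (1, 3)]
triangle = [(0, 0), (5, 0), (0, 5)]
print(clip_convex(square, triangle))   # pentagon: the part of the square inside the triangle
```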

_id e1d1
authors Shafer, Steven A. and Kanade, Takeo
year 1982
title Using Shadows in Finding Surface Orientations
source 61 p. : ill.` Pittsburgh, PA: Department of Computer Science, CMU, January, 1982. CMU-CS- 82-100
summary Given a line drawing from an image with shadow regions identified, the shapes of the shadows can be used to generate constraints on the orientations of the surfaces involved. This paper describes the theory which governs those constraints under orthography. A 'Basic Shadow Problem' is first posed, in which there is a single light source, and a single surface casts a shadow on another (background) surface. There are six parameters to determine: the orientation (2 parameters) for each surface, and the direction of the vector (2 parameters) pointing at the light source. If some set of 3 of these are given in advance, the remaining 3 can then be determined geometrically. The solution method consists of identifying 'illumination surfaces' consisting of illumination vectors, assigning Huffman-Clowes line labels to
keywords
series CADline
last changed 2003/06/02 13:58

_id eabb
authors Boeykens, St. Geebelen, B. and Neuckermans, H.
year 2002
title Design phase transitions in object-oriented modeling of architecture
source Connecting the Real and the Virtual - design e-ducation [20th eCAADe Conference Proceedings / ISBN 0-9541183-0-8] Warsaw (Poland) 18-20 September 2002, pp. 310-313
doi https://doi.org/10.52842/conf.ecaade.2002.310
summary The project IDEA+ aims to develop an “Integrated Design Environment for Architecture”. Its goal is providing a tool for the designer-architect that can be of assistance in the early-design phases. It should provide the possibility to perform tests (like heat or cost calculations) and simple simulations in the different (early) design phases, without the need for a fully detailed design or remodeling in a different application. The test for daylighting is already in development (Geebelen, to be published). The conceptual foundation for this design environment has been laid out in a scheme in which different design phases and scales are defined, together with appropriate tests at the different levels (Neuckermans, 1992). It is a translation of the “designerly” way of thinking of the architect (Cross, 1982). This conceptual model has been translated into a “Core Object Model” (Hendricx, 2000), which defines a structured object model to describe the necessary building model. These developments form the theoretical basis for the implementation of IDEA+ (both the data structure & prototype software), which is currently in progress. The research project addresses some issues, which are at the forefront of the architect’s interest while designing with CAAD. These are treated from the point of view of a practicing architect.
series eCAADe
email
last changed 2022/06/07 07:52

_id 898a
authors Bay, J.H.
year 2002
title Cognitive Biases and Precedent Knowledge in Human and Computer-Aided Design Thinking
source CAADRIA 2002 [Proceedings of the 7th International Conference on Computer Aided Architectural Design Research in Asia / ISBN 983-2473-42-X] Cyberjaya (Malaysia) 18–20 April 2002, pp. 213-220
doi https://doi.org/10.52842/conf.caadria.2002.213
summary Cognitive biases (illusions) and potential errors can occur when using precedent knowledge for analogical, pre-parametric and qualitative design thinking. This paper refers largely to part of a completed research project (Bay 2001) on how heuristic biases, discussed by Tversky and Kahneman (1982) in cognitive psychology, can affect judgement and learning of facts from precedents in architectural design, made explicit using a kernel of conceptual system (Tzonis et al., 1978) and a framework of architectural representation (Tzonis 1992). These are used here to consider how such illusions and errors may be transferred to computer-aided design thinking.
series CAADRIA
email
last changed 2022/06/07 07:54

_id cf2003_m_040
id cf2003_m_040
authors BAY, Joo-Hwa
year 2003
title Making Rebuttals Available Digitally for Minimising Biases in Mental Judgements
source Digital Design - Research and Practice [Proceedings of the 10th International Conference on Computer Aided Architectural Design Futures / ISBN 1-4020-1210-1] Tainan (Taiwan) 13–15 October 2003, pp. 147-156
summary This paper addresses the problem of heuristic biases (illusions), discussed by Tversky and Kahneman (1982), that can lead to errors in judgement by human designers when they use precedent knowledge presented graphically (Bay 2001). A cognitive framework of belief, goal, and decision, and a framework of representation of architectural knowledge by Tzonis are used to map out the problem of heuristic biases in the human mind. These are used to discuss what aspects of knowledge can be presented explicitly and digitally to users to make rebuttal more available to human thinking at the cognitive level. The discussion is applicable to both inductive and analytic digital knowledge systems that use precedent knowledge. This discussion is targeted directly at means of addressing bias in the human mind using digital means. The problem of human bias in machine learning and generalisation is discussed in a different paper, and the problems of intentional or non-intentional machine bias are not part of the discussion in this paper.
keywords analogy, bias, design thinking, environmental design, heuristics
series CAAD Futures
last changed 2003/11/22 07:26

_id 1b10
id 1b10
authors Bay, Joo-Hwa
year 2001
title Cognitive Biases - The case of tropical architecture
source Delft University of Technology
summary This dissertation investigates, i) How cognitive biases (or illusions) may lead to errors in design thinking, ii) Why architects use architectural precedents as heuristics despite such possible errors, and iii) develops a design tool that can overcome this type of error through the introduction of a rebuttal mechanism. The mechanism controls biases and improves accuracy in architectural thinking. // The research method applied is interdisciplinary. It employs knowledge from cognitive science, environmental engineering, and architectural theory. The case study approach is also used. The investigation is made in the case of tropical architecture. The investigation of architectural biases draws from work by A. Tversky and D. Kahneman in 1982 on “Heuristics and biases”. According to Tversky and Kahneman, the use of heuristics of representativeness (based on similarity) and availability (based on ease of recall and imaginability) for judgement of probability can result in cognitive biases of illusions of validity and biases due to imaginability respectively. This theory can be used analogically to understand how errors arise in the judgement of environmental behaviour anticipated from various spatial configurations, leading to designs with dysfunctional performances when built. Incomplete information, limited time, and limited human mental resources make design problems in practice difficult, and often impossible, to solve completely. It is not possible to analyse all possible alternative solutions, multiple contingencies, and multiple conflicting demands, as doing so would lead to combinatorial explosion. One of the ways to cope with a difficult design problem is to use precedents as heuristic devices, as shortcuts in design thinking, at the risk of errors. This is done with analogical, pre-parametric, and qualitative means of thinking, without quantitative calculations. Heuristics can be efficient and reasonably effective, but may not always be good enough or even correct, because they can have associated cognitive biases that lead to errors. Several debiasing strategies are discussed; one possibility is to introduce a rebuttal mechanism that refocuses the designer’s thinking on the negative and opposite outcomes of his judgements, in order to debias these illusions. The research is carried out within the framework of design theory developed by the Design Knowledge System Research Centre, TUDelft. This strategy is tested with an experiment. The results show that the introduction of a rebuttal mechanism can debias and improve design judgements substantially in environmental control. The tool developed has possible applications in design practice and education, in particular in the designing of sustainable environments.
keywords Design bias; Design knowledge; Design rebuttal; Design Precedent; Pre-parametric design; Tropical architecture; Sustainability
series thesis:PhD
type normal paper
email
last changed 2006/05/28 07:42

_id 6094
authors Blinn, J.I.
year 1982
title A Generalization of Algebraic Surface Drawing
source ACM Transaction on Graphics, vol. 1, no. 3, pp. 235-256, 1982
summary The technology of creating realistic and visually interesting images of three-dimensional shapes is advancing on many fronts. One such front is the development of algorithms for drawing curved surfaces directly from their mathematical definitions rather than by dividing them into large numbers of polygons. Two classes of surfaces which have received attention are the quadric and the bivariate parametric surfaces. Bivariate parametric surfaces are generated by three functions of two variables (most popularly polynomials), as the variables take on different values. Algorithms dealing with such surfaces are due to Catmull; Lane, Carpenter, Whitted and Blinn; and Clark.
series journal paper
last changed 2003/11/21 15:16
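
As a small illustration of "drawing curved surfaces directly from their mathematical definitions", the sketch below samples a bivariate parametric surface from its three coordinate functions of two variables, which is exactly the class the abstract describes. The particular surface (a torus) and the grid resolution are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def torus(u, v, R=2.0, r=0.5):
    """Three coordinate functions of the two parameters (u, v) in [0, 2*pi)."""
    x = (R + r * np.cos(v)) * np.cos(u)
    y = (R + r * np.cos(v)) * np.sin(u)
    z = r * np.sin(v)
    return x, y, z

# Sample the surface on a (u, v) grid straight from its definition,
# with no intermediate polygonal subdivision of the surface itself.
u, v = np.meshgrid(np.linspace(0, 2 * np.pi, 64), np.linspace(0, 2 * np.pi, 32))
x, y, z = torus(u, v)
print(x.shape)   # (32, 64): one 3D surface point per parameter pair
```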

_id 89e4
authors Cendes, Z.J., Shenton, D. and H. Shahnasser
year 1982
title Adaptive Finite Element Mesh Generation Using the Delaunay Algorithm
source 3 p. : ill. Pittsburgh: Design Research Center, CMU, December, 1982. includes bibliography
summary A two-dimensional generator is described which automatically creates optimal finite element meshes using the Delaunay triangulation algorithm. The mesh generator is adaptive in the sense that elements containing the largest normalized errors are automatically refined, providing meshes with a uniform error density. The system runs on a PERQ computer made by Three Rivers Computer Company. It is menu oriented and utilizes multiple command and display windows to create and edit the object description interactively. Mesh generation from the object data base is automatic, although it may be modified interactively by the user if desired. Application of the mesh generator to electric machine design and to magnetic bubble simulation shows it to be one of the most powerful and easy to use systems yet devised
keywords electrical engineering, triangulation, algorithms, OOPS, finite elements, analysis
series CADline
last changed 2003/06/02 13:58
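
A compact way to reproduce the adaptive idea described above, assuming SciPy is available, is to loop: triangulate, estimate an error for every element, and insert new points into the worst elements before re-triangulating. The error measure and refinement rule below are placeholders chosen for illustration, not those of the system described in the paper.

```python
import numpy as np
from scipy.spatial import Delaunay

def refine(points, error_of_triangle, rounds=3, frac=0.25):
    """Adaptive Delaunay refinement: each round, re-triangulate and add the
    centroids of the `frac` worst triangles under the supplied error measure."""
    pts = np.asarray(points, dtype=float)
    for _ in range(rounds):
        tri = Delaunay(pts)
        errors = np.array([error_of_triangle(pts[s]) for s in tri.simplices])
        worst = np.argsort(errors)[::-1][:max(1, int(frac * len(errors)))]
        centroids = pts[tri.simplices[worst]].mean(axis=1)
        pts = np.vstack([pts, centroids])
    return pts, Delaunay(pts)

def area(tri_pts):
    """Placeholder error estimate: plain triangle area (refines large elements)."""
    a, b, c = tri_pts
    return 0.5 * abs((b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0]))

corners = [(0, 0), (1, 0), (1, 1), (0, 1), (0.5, 0.5)]
pts, mesh = refine(corners, area)
print(len(pts), "points,", len(mesh.simplices), "triangles")
```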

_id 482a
authors Cole, Sam
year 1982
title A Microprocessor Revolution and the World Distribution of Income: A General Equilibrium Approach
source International Political Science Review. 1982. vol.3: pp. 434- 454 ; ill. includes bibliography
summary This article shows that even if the world economy is able to withstand and surmount the present world crisis, the combination of market forces and rapid technical change that would be the result of a microprocessor revolution will give rise to large shifts in the distribution of income within and between both rich and poor countries. Some developed and developing economies may be unable to join the move to new technologies. In a world governed by only economic forces, all countries, whether they choose to adopt new systems of production or not, will be affected. Indeed, whatever their degree of involvement, all countries are beginning to feel in varying degrees the chain reaction that reverberates through and between all sectors of their domestic and the world economies. To gain insights into interrelations between technological change and global markets, this article uses a special type of model -- a general equilibrium model -- that enables the study to focus on exactly these variables
keywords technology, economics
series CADline
last changed 1999/02/12 15:07

_id sigradi2006_e183a
id sigradi2006_e183a
authors Costa Couceiro, Mauro
year 2006
title La Arquitectura como Extensión Fenotípica Humana - Un Acercamiento Basado en Análisis Computacionales [Architecture as human phenotypic extension – An approach based on computational explorations]
source SIGraDi 2006 - [Proceedings of the 10th Iberoamerican Congress of Digital Graphics] Santiago de Chile - Chile 21-23 November 2006, pp. 56-60
summary The study describes some of the aspects tackled within a current Ph.D. research project in which architectural applications of constructive, structural and organization processes existing in biological systems are considered. The present information-processing capacity of computers and the development of specific software have made it possible to create a bridge between two holistic disciplines: architecture and biology. The crossover between those disciplines entails a methodological paradigm change towards a new one based on the dynamical aspects of forms and compositions. Recent studies about artificial-natural intelligence (Hawkins, 2004) and developmental-evolutionary biology (Maturana, 2004) have added fundamental knowledge about the role of analogy in the creative process and the relationship between forms and functions. The dimensions and restrictions of the Evo-Devo concepts are analyzed, developed and tested by software that combines parametric geometries, L-systems (Lindenmayer, 1990), shape grammars (Stiny and Gips, 1971) and evolutionary algorithms (Holland, 1975) as a way of testing new architectural solutions within computable environments. Lamarck's (1744-1829) and Weismann's (1834-1914) theoretical approaches to evolution, in which significantly opposing views can be found, are considered. Lamarck's theory assumes that an individual effort towards a specific evolutionary goal can cause change to descendants. Weismann, on the other hand, defended that the germ cells are not affected by anything the body learns or any ability it acquires during its life, and cannot pass this information on to the next generation; this is called the Weismann barrier. Lamarck's widely rejected theory has recently found a new place in artificial and natural intelligence research as a valid explanation of some aspects of the evolution of human knowledge, that is, the deliberate change of paradigms in the intentional search for solutions. Just as the analogy between genetics and architecture (Estévez and Shu, 2000) is useful for understanding and programming emergent complexity phenomena (Hopfield, 1982) for architectural solutions, so the consideration of architecture as a product of a human extended phenotype can help us to better understand its cultural dimension.
keywords evolutionary computation; genetic architectures; artificial/natural intelligence
series SIGRADI
email
last changed 2016/03/10 09:49
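
Of the generative techniques the abstract combines, L-systems are the easiest to show compactly: a string is rewritten in parallel, generation by generation, according to production rules. The sketch below uses Lindenmayer's classic algae system as an assumed illustration; the parametric geometries, shape grammars and evolutionary algorithms combined in the actual research are not shown.

```python
def l_system(axiom, rules, generations):
    """Deterministic, context-free L-system: rewrite every symbol in parallel
    each generation; symbols with no rule are copied unchanged."""
    s = axiom
    for _ in range(generations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Lindenmayer's algae system: A -> AB, B -> A
rules = {"A": "AB", "B": "A"}
for g in range(5):
    print(g, l_system("A", rules, g))
# 0 A
# 1 AB
# 2 ABA
# 3 ABAAB
# 4 ABAABABA
```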

_id e7b8
authors Dahl, Veronica
year 1983
title Logic Programming as a Representation of Knowledge
source IEEE Computer. IEEE Computer Society, October, 1983. vol. 16: pp. 106-110 : ill. includes bibliography
summary Logic has traditionally provided a firm conceptual framework for representing knowledge. As it can formally deal with the notion of logical consequence, the introduction of Prolog has made it possible to represent knowledge in terms of logic and also to expect appropriate inferences to be drawn from it automatically. This article illustrates and explores these ideas with respect to two central representational issues: problem solving knowledge and database knowledge. The technical aspects of both subjects have been covered elsewhere (Kowalski, R., Logic for Problem Solving, North-Holland, 1979; Dahl, V., On Database System Development Through Logic, ACM Trans., vol. 7, no. 3, Mar. 1982, p. 102). This explanation uses simple, nontechnical terms
keywords PROLOG, knowledge, representation, logic, programming, problem solving, database
series CADline
last changed 1999/02/12 15:08
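
The article's central point is that, once knowledge is stated as facts and rules, logical consequences can be drawn from it automatically. The toy forward-chaining sketch below illustrates that idea in Python rather than in Prolog itself, on invented propositional facts; it demonstrates the principle, not Dahl's own examples.

```python
def forward_chain(facts, rules):
    """Repeatedly apply rules of the form (premises, conclusion) until no new
    fact can be derived; returns the deductive closure of the fact base."""
    known = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= known and conclusion not in known:
                known.add(conclusion)
                changed = True
    return known

# Invented example: two chained rules fire from a single starting fact.
facts = {"socrates_is_a_man"}
rules = [
    ({"socrates_is_a_man"}, "socrates_is_mortal"),
    ({"socrates_is_mortal"}, "socrates_will_die"),
]
print(sorted(forward_chain(facts, rules)))
# ['socrates_is_a_man', 'socrates_is_mortal', 'socrates_will_die']
```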

_id 0650
authors Fenves, Stephen J. and Rasdorf, William J.
year 1982
title Role of Database Management Systems in Structural Engineering
source 15 p.: ill Pittsburgh: Design Research Center, CMU, December, 1982. includes bibliography.
summary Presented at the International Association of Bridge and Structural Engineers Symposium on Informatics in Structural Engineering (1982: Bergamo, Italy). The future integration of structural engineering application programs will depend critically on integrated databases which provide access to information in essentially arbitrary sequences, and which automatically perform a large portion of integrity checking on the data. One source of such design databases is the class of database management systems (DBMS) evolving from management applications. The paper surveys such systems and presents some extensions needed
keywords integration, database, DBMS, systems, civil engineering
series CADline
last changed 2003/06/02 10:24

_id 8d70
authors Fenves, Stephen J.
year 1982
title A Note on Flow Networks and Structures
source Pittsburgh: Design Research Center, CMU, April, 1982. 7, [11] p. : ill
summary The purpose of this note is to explore certain common features of flow and structural networks, in terms of the effect of the (usually implicit) global equilibrium equations on the range of problems amenable to analysis. The objective is to extend the standard analysis problem and explore the possibility of 'tuning' flow and structural networks by preassigning values of certain inflow and reaction components
keywords networks, civil engineering, structures, analysis
series CADline
last changed 2003/06/02 10:24
