CumInCAD is a cumulative index of publications in Computer Aided Architectural Design, supported by the sibling associations ACADIA, CAADRIA, eCAADe, SIGraDi, ASCAAD and CAAD Futures.


Hits 1 to 20 of 620

_id ga0020
id ga0020
authors Codignola, G. Matteo
year 2000
title [Title missing]
source International Conference on Generative Art
summary This paper summarises my degree thesis in architecture (defended in December 1999) with Prof. Celestino Soddu and Prof. Enrica Colabella. In this work I was able to reach complexity through a generative approach, constructing a paradigm that organises the different codes of a project's identity. My general objective was to design shape complexity in variable categories: 3D space surfaces, 2D drawings and 2D textures. My aim was to discover the identity of one of my favourite architects of the 20th century, Antoni Gaudí, by constructing codes relative to shape complexity. I defined my particular objective as the possibility of abducting, from Gaudí's imaginary of reference, the generative codes that operate in the logical processing I use to create a possible species of project. The next step was to verify the correct working of the new generative codes by means of 3D scenarios that are recognisable as architecture of the "Antoni Gaudí species". By working on the generative codes rather than on a single possible resulting shape, I was able to organise, through my general paradigm, the attributes of the project's species: different shapes and different attributes (colour, scale, proportion), leading to many possible and different scenarios, all recognisable by their relative class codes. I chose three examples built in Barcelona between 1902 and 1914: Park Güell, Casa Batlló and Casa Milà are the three reference scenarios that I used to create the generative codes. In the second step I defined different codes that operate in sequence, as defined in the paradigm. The generative codes are only subjective; they are one possible solution, my interpretation of Antoni Gaudí's identity. These codes operate in four different ways: geometrical codes for 2D shapes; geometrical codes for interface relations; spatial codes for the 3D extrusion of 2D shapes; and geometrical codes for the 2D and 3D texturing of the generated surfaces. By a stratified application of these codes I arrived at one idea for the whole generative process but many different possible scenarios, all recognisable as belonging to Gaudí's species. My final result therefore made possible scenarios belonging to the species defined previously. At the end of my research I designed a project by combination: using Antoni Gaudí's generative codes on a new 3D scenario with a shape catalyst, the Frank Lloyd Wright Guggenheim Museum in New York. In this process I created a "hybrid scenario": a new species of architectural look, a Guggenheim Museum planned by Wright with a good pinch of Gaudí.
series other
email
more http://www.generativeart.com/
last changed 2003/08/07 17:25
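The record above describes a stratified sequence of generative codes: 2D shape codes, interface-relation codes, 3D extrusion codes and texturing codes applied one after another. A minimal sketch of how such a staged pipeline could be composed; every function and data structure here is hypothetical and not taken from the thesis:

```python
# Minimal sketch of a staged "generative codes" pipeline, loosely following the
# sequence described in the record (2D shapes -> relations -> 3D extrusion ->
# texturing). All functions and data structures are hypothetical placeholders.

def code_2d_shapes(seed):
    """Generate a set of 2D profiles from a seed value (stand-in for shape codes)."""
    return [{"profile": i, "points": [(i, 0), (i + 1, 0), (i + 1, 1)]} for i in range(seed)]

def code_relations(shapes):
    """Attach interface relations between neighbouring profiles."""
    for a, b in zip(shapes, shapes[1:]):
        a["joined_to"] = b["profile"]
    return shapes

def code_extrude(shapes, height):
    """Lift 2D profiles into simple 3D solids (stand-in for spatial extrusion codes)."""
    return [{"base": s, "height": height} for s in shapes]

def code_texture(solids, texture):
    """Assign a surface texture label to every generated solid."""
    for s in solids:
        s["texture"] = texture
    return solids

def generate_scenario(seed, height=3.0, texture="trencadis"):
    """One idea, many scenarios: the same code sequence applied to different seeds."""
    return code_texture(code_extrude(code_relations(code_2d_shapes(seed)), height), texture)

if __name__ == "__main__":
    # Different seeds yield different but structurally related scenarios.
    for seed in (3, 5):
        print(generate_scenario(seed))
```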

_id 99ce
authors Forowicz, T.
year 1999
title Modeling of energy demands for residential buildings with HTML interface
source Automation in Construction 8 (4) (1999) pp. 481-487
summary This paper presents a package for calculating energy and cost demands for heating, cooling and hot water. The package represents a new approach to developing software, employing user (client) and server (program provider) computers connected via the Internet. It is hosted on the owner's server and is available worldwide through a Web browser. The package was developed as a simplified tool for estimating energy use in four types of new and old houses located in 900 US cities. The computing engine utilises a database compiled by LBL in support of the 'Affordable Housing through Energy Conservation' project, comprising over 10,000 DOE-2.1 simulations. The package consists of 69 routines and scripts coded in four languages: HTML, Perl, C, and FORTRAN. The modelling, the programming, and the future perspectives of this new kind of computational tool are presented. The paper discusses current technical limitations as well as suggestions for further improvements and development. Especially important is the problem of multi-user access; ways to solve it are proposed.
series journal paper
more http://www.elsevier.com/locate/autcon
last changed 2003/05/15 21:22
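The record above describes a client-server package in which a Web form feeds a computing engine that looks up precomputed DOE-2.1 simulation results by house type and location. A minimal sketch of that lookup pattern; the table keys, values and function names are invented for illustration and are not data from the paper:

```python
# Minimal sketch of a lookup-style energy estimator of the kind described above:
# precomputed simulation results keyed by (house type, city), queried per request.
# All values are illustrative placeholders, not results from the LBL database.

PRECOMPUTED_KWH = {
    ("ranch_new", "Denver"): {"heating": 9200, "cooling": 1400, "hot_water": 3100},
    ("ranch_old", "Denver"): {"heating": 14800, "cooling": 1600, "hot_water": 3400},
}

def estimate(house_type: str, city: str, price_per_kwh: float = 0.11) -> dict:
    """Return annual energy use and cost for a (house type, city) pair."""
    loads = PRECOMPUTED_KWH.get((house_type, city))
    if loads is None:
        raise KeyError(f"No precomputed result for {(house_type, city)!r}")
    total_kwh = sum(loads.values())
    return {"loads_kwh": loads,
            "total_kwh": total_kwh,
            "cost_usd": round(total_kwh * price_per_kwh, 2)}

if __name__ == "__main__":
    print(estimate("ranch_new", "Denver"))
```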

_id cf2011_p109
id cf2011_p109
authors Abdelmohsen, Sherif; Lee, Jinkook; Eastman, Chuck
year 2011
title Automated Cost Analysis of Concept Design BIM Models
source Computer Aided Architectural Design Futures 2011 [Proceedings of the 14th International Conference on Computer Aided Architectural Design Futures / ISBN 9782874561429] Liege (Belgium) 4-8 July 2011, pp. 403-418.
summary This paper introduces the automated cost analysis developed for the General Services Administration (GSA) and the analysis results of a case study involving a concept design courthouse BIM model. The purpose of this study is to investigate interoperability issues related to integrating design and analysis tools, specifically BIM models and cost models. Previous efforts to generate cost estimates from BIM models have focused on developing two necessary but disjoint processes: 1) extracting accurate quantity takeoff data from BIM models, and 2) manipulating cost analysis results to provide informative feedback. Some recent efforts involve developing detailed definitions, enhanced IFC-based formats and in-house standards for assemblies that encompass building models (e.g. US Corps of Engineers). Some commercial applications enhance the level of detail associated with BIM objects with assembly descriptions to produce lightweight BIM models that can be used by different applications for various purposes (e.g. Autodesk for design review, Navisworks for scheduling, Innovaya for visual estimating, etc.). This study suggests the integration of design and analysis tools by means of managing all building data in one shared repository accessible to multiple domains in the AEC industry (Eastman, 1999; Eastman et al., 2008; authors, 2010). Our approach aims at providing an integrated platform that incorporates a quantity takeoff extraction method from IFC models, a cost analysis model, and a comprehensive cost reporting scheme, using the Solibri Model Checker (SMC) development environment. Approach: as part of the effort to improve the performance of federal buildings, GSA evaluates concept design alternatives based on their compliance with specific requirements, including cost analysis. Two basic challenges emerge in the process of automating cost analysis for BIM models: 1) at this early concept design stage, only minimal information is available to produce a reliable analysis, such as space names and areas, and building gross area; 2) design alternatives share many programmatic requirements such as location, functional spaces and other data. It is thus crucial to integrate other factors that contribute to substantial cost differences, such as perimeter, and exterior wall and roof areas. These are extracted from BIM models using IFC data and input through XML into the Parametric Cost Engineering System (PACES, 2010) software to generate cost analysis reports. PACES uses this limited dataset at a conceptual stage, together with RSMeans (2010) data, to infer cost assemblies at different levels of detail. Cost model import module: this module has three main functionalities: generating the input dataset necessary for the cost model, performing a semantic mapping between building-type-specific names and name aggregation structures in PACES known as functional space areas (FSAs), and managing cost data external to the BIM model, such as location and construction duration. The module computes building data such as footprint, gross area, perimeter, external wall and roof area, and building space areas. This data is generated through SMC in the form of an XML file and imported into PACES. Reporting module: this module uses the cost report generated by PACES to develop a comprehensive report in the form of an Excel spreadsheet.
This report consists of a systems-elemental estimate that shows the main systems of the building in terms of UniFormat categories, escalation, markups, overhead and conditions, a UniFormat Level III report, and a cost breakdown that provides a summary of material, equipment, labor and total costs. Building parameters are integrated in the report to provide insight into the variations among design alternatives.
keywords building information modeling, interoperability, cost analysis, IFC
series CAAD Futures
email
last changed 2012/02/11 19:21
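The import module described in the record above computes early-design quantities (footprint, gross area, perimeter, exterior wall and roof areas, space areas) and exports them as an XML file for the cost engine. A minimal sketch of serialising such a dataset; the element names are hypothetical and do not reproduce the actual PACES input schema:

```python
# Minimal sketch of exporting concept-design building quantities as XML, in the
# spirit of the cost model import module described above. Element and attribute
# names are invented placeholders.
import xml.etree.ElementTree as ET

def quantities_to_xml(quantities: dict, spaces: dict) -> bytes:
    """Serialise building-level quantities and space areas into a small XML document."""
    root = ET.Element("ConceptDesignQuantities")
    bldg = ET.SubElement(root, "Building")
    for name, value in quantities.items():
        ET.SubElement(bldg, name).text = str(value)
    areas = ET.SubElement(root, "FunctionalSpaceAreas")
    for space_name, area in spaces.items():
        ET.SubElement(areas, "Space", name=space_name, area=str(area))
    return ET.tostring(root, encoding="utf-8")

if __name__ == "__main__":
    xml_bytes = quantities_to_xml(
        {"GrossArea": 12500.0, "Footprint": 4100.0, "Perimeter": 310.0,
         "ExteriorWallArea": 5200.0, "RoofArea": 4100.0},
        {"Courtroom": 800.0, "JudgesChambers": 350.0},
    )
    print(xml_bytes.decode("utf-8"))
```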

_id 96dd
authors Naja, H.
year 1999
title Multiview databases for building modelling
source Automation in Construction 8 (5) (1999) pp. 567-579
summary Database systems provide various facilities including modelling data, queries, semantic integrity control, concurrency control, recovery and authorisation. The transition from relational database technology to object technology is characterised by a richer data model to meet the requirements of new applications such as computer-aided design (CAD) systems. However, object technology still has several shortcomings. One of these is that the conventional object model is not able to deal with data that can be described and queried according to different viewpoints. Building practice, for example, is characterised by the organisation of different participants who work towards the elaboration of the building; each one performs a specific role and has a specific view of the building project data. This paper proposes the CEDAR model, which specifies object-oriented multiview databases that can represent data and ensure their integrity according to different viewpoints. The approach is illustrated with an outline of a building project.
series journal paper
more http://www.elsevier.com/locate/autcon
last changed 2003/05/15 21:23
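The CEDAR model summarised above lets different participants describe and query the same building object according to their own viewpoint. A minimal sketch of one shared object exposing viewpoint-filtered data; the viewpoint names and attributes are hypothetical and are not taken from the paper:

```python
# Minimal sketch of a "multiview" object: one shared building element whose
# attributes are grouped by viewpoint, so each participant sees and edits only
# the data relevant to their role. All names and values are hypothetical.

class MultiviewWall:
    def __init__(self):
        self._data = {
            "architect": {"finish": "plaster", "colour": "white"},
            "structural_engineer": {"thickness_mm": 200, "load_bearing": True},
            "hvac_engineer": {"u_value": 0.35},
        }

    def view(self, viewpoint: str) -> dict:
        """Return the attributes visible from one participant's viewpoint."""
        return dict(self._data[viewpoint])

    def update(self, viewpoint: str, **changes) -> None:
        """Allow a participant to modify only attributes owned by their viewpoint."""
        unknown = set(changes) - set(self._data[viewpoint])
        if unknown:
            raise KeyError(f"Attributes {unknown} are not part of the {viewpoint} view")
        self._data[viewpoint].update(changes)

if __name__ == "__main__":
    wall = MultiviewWall()
    print(wall.view("structural_engineer"))
    wall.update("architect", colour="ochre")
    print(wall.view("architect"))
```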

_id 7cb1
authors Stuurstraat, N. and Tolman, F.
year 1999
title A product modeling approach to building knowledge integration
source Automation in Construction 8 (3) (1999) pp. 269-275
summary Knowledge informatics still plays only a minor role in the design process of buildings and civil engineering efforts, particularly in the inception stage. The primary reason that most knowledge tools are not well integrated into the process is that most tend to be based on stand-alone expert system technology. Improving the re-use of existing knowledge is required to increase industry performance. A solution could be a new generation of integrated knowledge systems. One problem that must be addressed is how to cope with the conflicting requirements of each particular subsystem when each is optimized for its own knowledge domain. No optimum solution exists that is able to simultaneously optimize each subsystem for a total solution. This paper discusses an approach to building knowledge integration that attempts to address these shortcomings through the use of a combined product-model and meta-knowledge approach.
series journal paper
more http://www.elsevier.com/locate/autcon
last changed 2003/05/15 21:23

_id 44c0
authors Van Leeuwen, Jos P.
year 1999
title Modelling architectural design information by features : an approach to dynamic product modelling for application in architectural design
source Eindhoven University of Technology
summary Architectural design, like many other human activities, benefits more and more from the ongoing development of information and communication technologies. The traditional paper documents for the representation and communication of design are now replaced by digital media. CAD systems have replaced the drawing board and knowledge systems are used to integrate expert knowledge in the design process. Product modelling is one of the most promising approaches in the developments of the last two decades, aiming in the architectural context at the representation and communication of the information related to a building in all its aspects and during its complete life-cycle. However, after studying both the characteristics of the product modelling approach and the characteristics of architectural design, it is concluded in this research project that product modelling does not suffice for the support of architectural design. Architectural design is characterised mainly as a problem-solving process, involving ill-defined problems that require a very dynamic way of dealing with information concerning both the problem and the emerging solutions. Furthermore, architectural design is in many ways an evolutionary process. In the short term this is because of the incremental approach to problem solving in design projects; in the long term it is because of the stylistic development of designers and the continuous developments in the building and construction industry in general. The requirements posed by architectural design are concentrated in the keywords extensibility and flexibility of the design information models. Extensibility means that designers can extend conceptual models with definitions that best suit the design concepts they wish to utilise. Flexibility means that information in design models can be structured in a way that accurately represents the design rationale. This includes the modelling of incidental characteristics and relationships of the entities in the model that are not necessarily predefined in a conceptual model. In general, product modelling does not adequately support this dynamic nature of design. Therefore, this research project has studied the concepts developed in the technology of Feature-based modelling, which originates from the area of mechanical engineering. These concepts include the usage of Features as the primitives for defining and reasoning about a product. Features have an autonomous function in the information model, which, as a result, constitutes a flexible network of relationships between Features that are established during the design process. The definition of Features can be specified by designers to formalise new design concepts. This allows the design tools to be adapted to the specific needs of the individual designer, enlarging the library of available resources for design. In addition to these key concepts in Feature-based modelling as developed in the mechanical engineering context, the project has determined the following principles for a Feature-based approach in the architectural context. Features in mechanical engineering are used mainly to describe the lowest level of detail in a product's design, namely the characteristics of its parts. In architecture the design process does not normally follow a strictly hierarchical approach and therefore requires that the building be modelled as a whole.
This implies that multiple levels of abstraction are modelled and that Features are used to describe information at the various abstraction levels. Furthermore, architectural design involves concepts that are non-physical as well as physical; Features are to be used for modelling both kinds. The term Feature is defined in this research project to reflect the above key concepts of this modelling approach. A Feature is an autonomous, coherent collection of information, with semantic meaning to a designer and possibly emerging during design, that is defined to formalise a design concept at any level of abstraction, either physical or non-physical, as part of a building model. Feature models are built up entirely of Features and are structured in the form of a directed graph. The nodes in the graph are the Features, whereas the arcs are the relationships between the Features. Features can be of user-defined types, and incidental relationships can be added that are not defined at the typological level. An inventory of the kinds of information involved in the practice of modelling architectural design is based on the analysis of a selection of sources of architectural design information. This inventory is deepened by a case study and results in the proposition of a categorisation of architectural Feature types.
keywords Automated Management Information Systems; Computer Aided Architectural Design; Information Systems; Modelling
series thesis:PhD
email
more http://www.ds.arch.tue.nl/jos/thesis/
last changed 2003/02/12 22:37
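The thesis summarised above defines a Feature model as a directed graph whose nodes are Features of user-defined types and whose arcs include both typologically defined and incidental relationships. A minimal sketch of such a graph; the type names, properties and relationship labels are hypothetical and not taken from the thesis:

```python
# Minimal sketch of a Feature model as a directed graph: nodes are Features of
# user-defined types, arcs are relationships, and incidental relationships can
# be added that are not declared at the type level. All names are hypothetical.

class FeatureType:
    def __init__(self, name, allowed_relations=()):
        self.name = name
        self.allowed_relations = set(allowed_relations)

class Feature:
    def __init__(self, ftype: FeatureType, **properties):
        self.ftype = ftype
        self.properties = properties
        self.relations = []  # outgoing arcs: (relation_name, target, incidental?)

    def relate(self, name, target, incidental=False):
        """Add an arc; relations outside the type definition must be flagged incidental."""
        if not incidental and name not in self.ftype.allowed_relations:
            raise ValueError(f"{name!r} is not defined for type {self.ftype.name}")
        self.relations.append((name, target, incidental))

if __name__ == "__main__":
    space_t = FeatureType("Space", allowed_relations={"bounded_by"})
    wall_t = FeatureType("Wall")
    atrium = Feature(space_t, area=120.0)
    wall = Feature(wall_t, height=9.0)
    atrium.relate("bounded_by", wall)                        # typologically defined arc
    atrium.relate("inspired_by", "case study", incidental=True)  # incidental arc
    for name, target, inc in atrium.relations:
        print(name, "->", target, "(incidental)" if inc else "")
```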

_id cf4d
authors Zamanian, M.K. and Pittman, J.H.
year 1999
title A software industry perspective on AEC information models for distributed collaboration
source Automation in Construction 8 (3) (1999) pp. 237-248
summary This paper focuses on information modeling and computing technologies that are most relevant to the emerging software for the Architecture, Engineering, and Construction (AEC) industry. After a brief introduction to the AEC industry and its present state of computer-based information sharing and collaboration, a set of requirements for AEC information models is identified. Next, a number of key information modeling and standards initiatives for the AEC domain are briefly discussed, followed by a review of the emerging object and Internet technologies. The paper then presents our perspective on the challenges and potential directions for using object-based information models in a new generation of AEC software systems intended to offer a component-based open architecture for distributed collaboration.
series journal paper
more http://www.elsevier.com/locate/autcon
last changed 2003/05/15 21:23

_id 9745
authors De Paoli, Giovanni and Bogdan, Marius
year 1999
title The Backstage of Vitruvius' Roman Theatre: A New Method of Computer-Aided Design that Reduces the Gap between the Functional and the Operational
source CAADRIA '99 [Proceedings of The Fourth Conference on Computer Aided Architectural Design Research in Asia / ISBN 7-5439-1233-3] Shanghai (China) 5-7 May 1999, pp. 411-422
doi https://doi.org/10.52842/conf.caadria.1999.411
summary Computers are increasingly being used in professional design studios and by students of architecture. However, their use is limited to technical functions (tekhne); what one usually calls computer-aided design is often no more than computer-aided drawing. In this research paper we reflect on the architect's working methods, and suggest an approach to design based on the "projection" of properties of the object (i.e. operators), rather than on geometric primitives. We propose a method of design using procedural models, and encourage a re-evaluation of current programs of study with their traditional subdivision into separate disciplines. By means of a procedural model of Vitruvius' Roman theatre, we show that from a generic model we can produce a three-dimensional (volumetric) model with all the characteristics belonging to a single family of objects. In order to clarify the method of construction, we use a functional language that allows us to model the actions. Similarly, we can use this functional language to encapsulate the properties of the building. The scientific result of this experiment is the understanding and confirmation of the hypothesis that, by means of computers, we can find operators that help the architect assimilate a complex building design.
keywords Architecture, CAD, Discipline, Functions, Modeling, Operator
series CAADRIA
last changed 2022/06/07 07:55
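The paper above proposes procedural ("generic") models in which construction actions are encoded functionally, so one description can produce a whole family of three-dimensional variants. A minimal sketch of that idea; the parameters and the simplified geometry are hypothetical and do not reproduce the authors' Vitruvian model:

```python
# Minimal sketch of a procedural model: one function encodes the construction
# rule, and each parameter set yields a member of the same family of objects.
# The geometry and parameter names are hypothetical simplifications.
import math

def roman_theatre(orchestra_radius: float, tiers: int, tier_depth: float) -> dict:
    """Derive cavea ring radii and a stage width from a few generic parameters."""
    cavea_rings = [orchestra_radius + i * tier_depth for i in range(1, tiers + 1)]
    # Illustrative rule only: take the stage width as the chord subtending 120 degrees.
    stage_width = 2 * orchestra_radius * math.sin(math.radians(60))
    return {"orchestra_radius": orchestra_radius,
            "cavea_rings": cavea_rings,
            "stage_width": round(stage_width, 2)}

if __name__ == "__main__":
    # Two members of the same "family of objects" produced by the same procedure.
    print(roman_theatre(orchestra_radius=12.0, tiers=3, tier_depth=4.0))
    print(roman_theatre(orchestra_radius=18.0, tiers=5, tier_depth=3.5))
```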

_id b0c3
authors Flanagan, Robert
year 1999
title Designing by Simulation
source III Congreso Iberoamericano de Grafico Digital [SIGRADI Conference Proceedings] Montevideo (Uruguay) September 29th - October 1st 1999, pp. 25-30
summary This article evaluates 'simulation' as a contributing factor in architectural design. While computers enhance simulation, they have yet to transform the art of architecture. A partial explanation is found at the extremes of design processes: Gaudí's Sagrada Família in Barcelona is an empiricist's culminating achievement -- faith expressed in stone. By contrast, SOM's Sears Tower in Chicago is the modernist monument to rational process -- (financial) faith engineered in steel and glass. Gaudí employed an understanding of the heritage of stone and masonry to fashion his design, while SOM used precise relationships of mathematics and steel. However, the designs of both the Sears Tower and the Sagrada Família are restricted by the solutions inherent in their methods. In contrast, student designs often have no inherent approach to building. While the solution may appear to be evident, the method must often be invented; this is potentially more costly and complex than the design itself. This issue is not new to computers, but its hyper-reality is potentially more complex and disruptive. In evaluating the role of computer simulation in architectural design, this article employs two methods: 1) exoskeletal design: a limited collection of connected plates is formed and designed through warping, bending and forming (reference architect: Buckminster Fuller); 2) endoskeletal design: curtain wall construction is taken to its minimalist extreme, using pure structure and membrane (reference artist: Christo).
series SIGRADI
email
last changed 2016/03/10 09:52

_id avocaad_2001_22
id avocaad_2001_22
authors van Leeuwen, Jos and Jessurun, Joran
year 2001
title XML for Flexibility and Extensibility of Design Information Models
source AVOCAAD - ADDED VALUE OF COMPUTER AIDED ARCHITECTURAL DESIGN, Nys Koenraad, Provoost Tom, Verbeke Johan, Verleye Johan (Eds.), (2001) Hogeschool voor Wetenschap en Kunst - Departement Architectuur Sint-Lucas, Campus Brussel, ISBN 80-76101-05-1
summary The VR-DIS research programme aims at the development of a Virtual Reality – Design Information System. This is a design and decision support system for collaborative design that provides a VR interface for the interaction with both the geometric representation of a design and the non-geometric information concerning the design throughout the design process. The major part of the research programme focuses on early stages of design. The programme is carried out by a large number of researchers from a variety of disciplines in the domain of construction and architecture, including architectural design, building physics, structural design, construction management, etc. Management of design information is at the core of this design and decision support system. Much effort in the development of the system has been and still is dedicated to the underlying theory for information management and its implementation in an Application Programming Interface (API) that the various modules of the system use. The theory is based on a so-called Feature-based modelling approach and is described in the PhD thesis by [first author, 1999] and in [first author et al., 2000a]. This information modelling approach provides three major capabilities: (1) it allows for extensibility of conceptual schemas, which is used to enable a designer to define new typologies to model with; (2) it supports sharing of conceptual schemas, called type-libraries; and (3) it provides a high level of flexibility that offers the designer the opportunity to easily reuse design information and to model information constructs that are not foreseen in any existing typologies. The latter aspect involves the capability to expand information entities in a model with relationships and properties that are not typologically defined but applicable to a particular design situation only; this helps the designer to represent the actual design concepts more accurately. The functional design of the information modelling system is based on a three-layered framework. In the bottom layer, the actual design data is stored in so-called Feature Instances. The middle layer defines the typologies of these instances in so-called Feature Types. The top layer is called the meta-layer because it provides the class definitions for both the Types layer and the Instances layer; both Feature Types and Feature Instances are objects of the classes defined in the top layer. This top layer ensures that types can be defined on the fly and that instances can be created from these types, as well as expanded with non-typological properties and relationships while still conforming to the information structures laid out in the meta-layer. The VR-DIS system consists of a growing number of modules for different kinds of functionality in relation to the design task. These modules access the design information through the API that implements the meta-layer of the framework. This API has previously been implemented using an Object-Oriented Database (OODB), but this implementation had a number of disadvantages. The dependency on the OODB, a commercial software library, was considered the most problematic. Not only are licenses of the OODB library rather expensive; the fact that this library is not common technology that can easily be shared among a wide range of applications, including existing applications, also reduces its suitability for a system with the aforementioned specifications.
In addition, the OODB approach required a relatively large effort to implement the desired functionality. It lacked adequate support for generating unique identifications for worldwide information sources that are understandable for human interpretation. This strongly limited the capabilities of the system to share conceptual schemas. The approach that is currently being implemented for the core of the VR-DIS system is based on eXtensible Markup Language (XML). Rather than implementing the meta-layer of the framework into classes of Feature Types and Feature Instances, this level of meta-definitions is provided in a document type definition (DTD). The DTD is complemented with a set of rules that are implemented in a parser API, based on the Document Object Model (DOM). The advantages of the XML approach for the modelling framework are immediate. Type-libraries distributed through the Internet are now supported through the mechanisms of namespaces and XLink. The implementation of the API is no longer dependent on a particular database system. This provides much more flexibility in the implementation of the various modules of the VR-DIS system. Being based on what is expected to become the standard of XML, the implementation is much more versatile in its future usage, specifically in a distributed, Internet-based environment. These immediate advantages of the XML approach opened the door to a wide range of applications that are and will be developed on top of the VR-DIS core. Examples of these are the VR-based 3D sketching module [VR-DIS ref., 2000]; the VR-based information-modelling tool that allows the management and manipulation of information models for design in a VR environment [VR-DIS ref., 2000]; and a design-knowledge capturing module that is now under development [first author et al., 2000a and 2000b]. The latter module aims to assist the designer in the recognition and utilisation of existing and new typologies in a design situation. The replacement of the OODB implementation of the API by the XML implementation enables these modules to use distributed Feature databases through the Internet, without many changes to their own code, and without the loss of the flexibility and extensibility of conceptual schemas that are implemented as part of the API. Research in the near future will result in Internet-based applications that support designers in the utilisation of distributed libraries of product-information, design-knowledge, case-bases, etc. The paper roughly follows the outline of the abstract, starting with an introduction to the VR-DIS project, its objectives, and the developed theory of the Feature-modelling framework that forms the core of it. It briefly discusses the necessity of schema evolution, flexibility and extensibility of conceptual schemas, and how these capabilities have been addressed in the framework. The major part of the paper describes how the previously mentioned aspects of the framework are implemented in the XML-based approach, providing details on the so-called meta-layer, its definition in the DTD, and the parser rules that complement it. The impact of the XML approach on the functionality of the VR-DIS modules and the system as a whole is demonstrated by a discussion of these modules and scenarios of their usage for design tasks. The paper is concluded with an overview of future work on the sharing of Internet-based design information and design knowledge.
series AVOCAAD
email
last changed 2005/09/09 10:48
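The record above describes replacing an OODB implementation of the meta-layer with XML: type definitions and instances expressed as documents and accessed through a DOM-based parser API. A minimal sketch of reading such a model through the standard DOM interface; the element and attribute names are hypothetical and are not the project's actual DTD:

```python
# Minimal sketch of the XML-based direction described above: a Feature type and
# an instance expressed as XML and read back through a DOM parser. The document
# structure is an invented placeholder, not the VR-DIS schema.
from xml.dom.minidom import parseString

MODEL = """<featureModel>
  <featureType name="Space">
    <property name="area" datatype="float"/>
  </featureType>
  <featureInstance type="Space" id="atrium-01">
    <value property="area">120.0</value>
    <relation name="adjacent_to" target="hall-02" incidental="true"/>
  </featureInstance>
</featureModel>"""

def describe(xml_text: str) -> None:
    """Walk the DOM and print each Feature instance with its relations."""
    doc = parseString(xml_text)
    for inst in doc.getElementsByTagName("featureInstance"):
        print("instance", inst.getAttribute("id"), "of type", inst.getAttribute("type"))
        for rel in inst.getElementsByTagName("relation"):
            flag = " (incidental)" if rel.getAttribute("incidental") == "true" else ""
            print("  relation", rel.getAttribute("name"), "->", rel.getAttribute("target") + flag)

if __name__ == "__main__":
    describe(MODEL)
```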

_id 9580
authors Sprekelsen, Martin and Pittioni, Gernot
year 1999
title AVOCAAD Exercises Expanding on Interactive Operations
source AVOCAAD Second International Conference [AVOCAAD Conference Proceedings / ISBN 90-76101-02-07] Brussels (Belgium) 8-10 April 1999, pp. 89-94
summary The web is a vital element for realising the AVOCAAD project. Its features and functionality provide a splendid platform. The following discusses several advantageous options this new medium offers the AVOCAAD project. All data are permanently available on a central server, accessible to an unlimited number of clients anytime, anywhere in the world. Clients access the centrally stored information and work locally with the material, thus using the common server-to-client publishing set-up. Dynamic database functions available to the general user can control various aspects of data flow. This procedure is used by the AVOCAAD web system. Recent developments on the web will enable even more sophisticated use, thus widening the range of application. The online material may present interactive properties, meaning that the user is able to observe changes in processes in relation to the influence he actually exerts on the material within his subsystem. We focus on this material in our paper, exploring its possible impact on the AVOCAAD exercises.
series AVOCAAD
email
last changed 2005/09/09 10:48

_id 5de1
authors Tah, J.H.M., Howes, R. and Losifidis, P.
year 1999
title Integration of design and construction through shared objects in the CO-CIS project
source CIDAC, Volume 1 Issue 1 May 1999, pp.
summary This paper presents an Integrated Building Project Model (IBPM) which provides the basis on which software applications can share objects and shows how integration is achieved at the conceptual level. It is used to develop a pure object-oriented database server which acts as a central object repository, and facilitates the sharing of objects between multiple applications. The IBPM provided the basis for the development of the COllaborative Construction Integrated System (CO-CIS) based on the principles of client/server computing and utilising dynamic common object sharing in real-time between CAD and project management packages. The work demonstrates the capabilities of pure object technologies and should encourage industry to adopt the approach and facilitate the development of the common information standards.
keywords Object-Oriented Project Modelling, Object-Oriented Databases, Integration, Collaborative Environment, CAD, Project Management
series journal paper
last changed 2003/05/15 21:23

_id edf5
authors Arnold, J.A., Teicholz, P. and Kunz, J.
year 1999
title An approach for the interoperation of web-distributed applications with a design model
source Automation in Construction 8 (3) (1999) pp. 291-303
summary This paper defines the data and inference requirements for the integration of analysis applications with a product model described by a CAD/CAE application. Application input conditions often require sets of complex data that may be considered views of a product model database. We introduce a method that is compatible with the STEP and PLIB product description standards to define an intermediate model that selects, extracts, and validates views of information from a product model to serve as input for an engineering CAD/CAE application. The intermediate model framework was built and tested in a software prototype, the Internet Broker for Engineering Services (IBES). The first research case for IBES integrates applications that specify certain components, for example pumps and valves, with a CAD/CAE application. This paper therefore explores a sub-set of the general problem of integrating product data semantics between various engineering applications. The IBES integration method provides support for a general set of services that effectively assist interpretation and validate information from a product model for an engineering purpose. Such methods can enable application interoperation for the automation of typical engineering tasks, such as component specification and procurement.
series journal paper
more http://www.elsevier.com/locate/autcon
last changed 2003/05/15 21:22
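The paper above describes an intermediate model that selects, extracts and validates a view of product-model data to serve as input to an engineering application. A minimal sketch of that select-extract-validate step; the view definition, attribute names and validation rule are hypothetical and not drawn from IBES:

```python
# Minimal sketch of an "intermediate model" in the sense described above: a view
# definition selects attributes from a product model, extracts them, and
# validates them before handing them to an analysis application.

PUMP_VIEW = {
    "required": {"flow_rate_m3h": float, "head_m": float, "fluid": str},
    "optional": {"max_temperature_c": float},
}

def extract_view(product_model: dict, view: dict) -> dict:
    """Select and validate the attributes an application needs from the product model."""
    extracted = {}
    for name, typ in view["required"].items():
        if name not in product_model:
            raise ValueError(f"Product model is missing required attribute {name!r}")
        extracted[name] = typ(product_model[name])
    for name, typ in view["optional"].items():
        if name in product_model:
            extracted[name] = typ(product_model[name])
    return extracted

if __name__ == "__main__":
    model = {"flow_rate_m3h": "45", "head_m": 22.5, "fluid": "water",
             "colour": "blue"}              # attribute irrelevant to this view
    print(extract_view(model, PUMP_VIEW))   # only the pump-relevant view is passed on
```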

_id f371
authors Hadjisophocleous, G.V. and Benichou, N.
year 1999
title Performance criteria used in fire safety design
source Automation in Construction 8 (4) (1999) pp. 489-501
summary In many countries around the world, building codes are shifting from prescriptive- to performance-based for technical, economic, and social reasons. This move is made possible by progress in fire safety technologies, including the development of the engineering tools required to implement performance codes. The development of performance-based codes follows a transparent, hierarchical structure in which there are usually three levels of objectives. The top-level objectives usually state the functional requirements and the lowest level the performance criteria. Usually one middle level exists; however, more levels can be used in this hierarchical structure depending on the complexity of the requirements. The success of performance-based codes depends on the ability to establish performance criteria that are verifiable and enforceable. The performance criteria should be such that designers can easily demonstrate, using engineering tools, that their designs meet them and that the code authority can enforce them. This paper presents the performance criteria that are currently used by fire protection engineers in designing fire safety systems in buildings. These include deterministic and probabilistic design criteria as well as safety factors. The deterministic criteria relate mainly to life safety levels, fire growth and spread levels, fire exposure and structural performance. The probabilistic criteria focus on incident severity and incident likelihood. Finally, the inclusion of safety factors permits a conservative design and allows for a margin of error due to uncertainty in the models and the input data.
series journal paper
more http://www.elsevier.com/locate/autcon
last changed 2003/05/15 21:22

_id 6810
authors Makkonen, Petri
year 1999
title On multi body systems simulation in product design
source KTH Stockholm
summary The aim of this thesis is to provide a basis for efficient modelling and software use in simulation driven product development. The capabilities of modern commercial computer software for design are analysed experimentally and qualitatively. An integrated simulation model for design of mechanical systems, based on four different "simulation views" is proposed: An integrated CAE (Computer Aided Engineering) model using Solid Geometry (CAD), Finite Element Modelling (FEM), Multi Body Systems Modelling (MBS) and Dynamic System Simulation utilising Block System Modelling tools is presented. A theoretical design process model for simulation driven design based on the theory of product chromosome is introduced. This thesis comprises a summary and six papers. Paper A presents the general framework and a distributed model for simulation based on CAD, FEM, MBS and Block Systems modelling. Paper B outlines a framework to integrate all these models into MBS simulation for performance prediction and optimisation of mechanical systems, using a modular approach. This methodology has been applied to design of industrial robots of parallel robot type. During the development process, from concept design to detail design, models have been refined from kinematic to dynamic and to elastodynamic models, finally including joint backlash. A method for analysing the kinematic Jacobian by using MBS simulation is presented. Motor torque requirements are studied by varying major robot geometry parameters, in dimensionless form for generality. The robot TCP (Tool Center Point) path in time space, predicted from elastodynamic model simulations, has been transformed to the frequency space by Fourier analysis. By comparison of this result with linear (modal) eigen frequency analysis from the elastodynamic MBS model, internal model validation is obtained. Paper C presents a study of joint backlash. An impact model for joint clearance, utilised in paper B, has been developed and compared to a simplified spring-damper model. The impact model was found to predict contact loss over a wider range of rotational speed than the spring-damper model. Increased joint bearing stiffness was found to widen the speed region of chaotic behaviour, due to loss of contact, while increased damping will reduce the chaotic range. The impact model was found to have stable under- and overcritical speed ranges, around the loss of contact region. The undercritical limit depends on the gravitational load on the clearance joint. Papers D and E give examples of the distributed simulation model approach proposed in paper A. Paper D presents simulation and optimisation of linear servo drives for a 3-axis gantry robot, using block systems modelling. The specified kinematic behaviour is simulated with multi body modelling, while drive systems and control system are modelled using a block system model for each drive. The block system model has been used for optimisation of the transmission and motor selection. Paper E presents an approach for re-using CAD geometry for multi body modelling of a rock drilling rig boom. Paper F presents synthesis methods for mechanical systems. Joint and part number synthesis is performed using the Grübler and Euler equations. The synthesis is continued by applying the theory of generative grammar, from which the grammatical rules of planar mechanisms have been formulated. An example of topological synthesis of mechanisms utilising this grammar is presented. 
Finally, dimensional synthesis of the mechanism is carried out by utilising non-linear programming with addition of a penalty function to avoid singularities.
keywords Simulation; Optimisation; Control Systems; Computer Aided Engineering; Multi Body Systems; Finite Element Method; Backlash; Clearance; Industrial Robots; Parallel Robots
series thesis:PhD
last changed 2003/02/12 22:37
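The synthesis work summarised above performs joint and part number synthesis with the Grübler and Euler equations. For reference, the standard Grübler/Kutzbach mobility criterion for planar mechanisms is given below; the thesis may use an equivalent variant of this form.

```latex
% Grübler/Kutzbach mobility criterion for planar mechanisms:
%   n   = number of links (including the frame)
%   j_1 = number of one-degree-of-freedom joints (revolute, prismatic)
%   j_2 = number of two-degree-of-freedom joints
M = 3\,(n - 1) - 2 j_1 - j_2
```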

_id 3815
authors Qaqish, Ra’ed
year 2001
title VDS/DDS Practice Hinges on Interventions and Simplicity - A Case Study of Hard Realism vs. Distorted Idealism
source Architectural Information Management [19th eCAADe Conference Proceedings / ISBN 0-9523687-8-1] Helsinki (Finland) 29-31 August 2001, pp. 249-255
doi https://doi.org/10.52842/conf.ecaade.2001.249
summary This paper reports on ongoing experimental work initiated in September 1999 by CAAD tutors at the University of Petra (UOP) during the establishment of a new virtual/digital design studio (VDS). The new VDS/DDS now serves as an experimental laboratory for exploring solutions to problems of efficiency in design teaching as a new digital design studio paradigm, together with CAD/design staff, the design studio environment, materials and facilities. Two groups of graduating-level students participated as volunteers in this experiment. The first group comprised three fifth-year architectural design students, while the second comprised two fourth-year interior design students. The media currently in use are ArchiCAD 6.5 as a design tool along with CorelDraw 9 as a presentation tool, running on Pentium III computers. The series of experiments evaluated the impact on architectural design studio tuition requirements of the changes brought about by the implementation of the new CAD pedagogical approach (VDS/DDS) at UOP. The findings highlight several key issues related to CAAD, such as the changes brought about by the new design strategies, adaptation in problem-solving and decision-making techniques, and studio employment in terms of environment, means and methods. Other issues are the VDS/DDS integration schemes carried out by both students and staff as one team, in design studio practice on the one hand and in the curriculum on the other. Finally, the paper discusses the negative impact of hardline advocates of the conventional design studio, teachers and students alike, whose outlook and impressions undermine and deplete effective CAAD integration and obstruct, in many instances, the improvement of such experiments in a VDS environment.
keywords Design Studio Strategies, Problem Solving Decisions, Transformation And Integration Policies
series eCAADe
email
last changed 2022/06/07 08:00

_id 9dfa
authors Ries, R. and Mahdavi, A.
year 1999
title Environmental Life Cycle Assessment in an Integrated CAD Environment: The Ecologue Approach
source Proceedings of the Eighth International Conference on Computer Aided Architectural Design Futures [ISBN 0-7923-8536-5] Atlanta, 7-8 June 1999, pp. 351-363
summary The construction and operation of buildings is a major cause of resource depletion and environmental pollution. Computational performance evaluation tools could support the decision-making process in environmentally responsive building design and play an important role in environmental impact assessment, especially when a life cycle assessment (LCA) approach is used. The building domain, however, presents notable challenges to the application of LCA methods. For comprehensive environmental impact analysis to be realized in a computational support tool for the building design domain, such a tool must a) have an analysis method that considers the life cycle of building construction, operation, and decommissioning, b) have a representation that is able to accommodate the data and computability requirements of the analysis method and the analysis tool, and c) be seamlessly integrated within a multi-aspect design analysis environment that can provide data on environmentally relevant building operation criteria. This paper reviews the current state of assessment methods and computational support tools for LCA and their application to building design. Then, the implementation of an application (ECOLOGUE) for comprehensive computational assessment of environmental impact indicators over the building life cycle is presented. The application is a component in a multi-aspect, space-based CAD and evaluation environment (SEMPER). The paper describes the use and typical results of the ECOLOGUE system via illustrative examples.
keywords Life Cycle Assessment, Integrated Computational Environmental Analysis
series CAAD Futures
email
last changed 2006/11/07 07:22
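The approach described above requires aggregating environmental impacts over the construction, operation and decommissioning phases of a building's life cycle. A minimal sketch of that phase-wise aggregation; the impact factors and quantities are invented placeholders, not data from the paper or from any LCA database:

```python
# Minimal sketch of life-cycle impact aggregation across building phases, in the
# spirit of the LCA approach described above. All numbers are illustrative only.

IMPACT_FACTORS = {                      # kg CO2-equivalent per unit, illustrative
    "concrete_m3": 300.0,
    "steel_kg": 1.8,
    "electricity_kwh": 0.5,
    "demolition_m2": 15.0,
}

def phase_impact(inventory: dict) -> float:
    """Sum the impacts of one life-cycle phase from its quantity inventory."""
    return sum(IMPACT_FACTORS[item] * qty for item, qty in inventory.items())

def life_cycle_impact(construction: dict, annual_operation: dict,
                      decommissioning: dict, service_life_years: int) -> dict:
    """Aggregate impacts over construction, operation and decommissioning."""
    phases = {
        "construction": phase_impact(construction),
        "operation": phase_impact(annual_operation) * service_life_years,
        "decommissioning": phase_impact(decommissioning),
    }
    phases["total"] = sum(phases.values())
    return phases

if __name__ == "__main__":
    print(life_cycle_impact(
        construction={"concrete_m3": 500, "steel_kg": 40000},
        annual_operation={"electricity_kwh": 90000},
        decommissioning={"demolition_m2": 1200},
        service_life_years=50,
    ))
```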

_id 3d23
authors Sellgren, Ulf
year 1999
title Simulation-driven Design
source KTH Stockholm
summary Efficiency and innovative problem solving are contradictory requirements for product development (PD), and both requirements must be satisfied in companies that strive to remain or to become competitive. Efficiency is strongly related to "doing things right", whereas innovative problem solving and creativity are focused on "doing the right things". Engineering design, which is a sub-process within PD, can be viewed as a problem-solving or decision-making process. New technologies in computer science and new software tools open the way to new approaches for the solution of mechanical problems. Product data management (PDM) technology and tools can enable concurrent engineering (CE) by managing the formal product data, the relations between the individual data objects, and their relation to the PD process. Many engineering activities deal with the relation between behavior and shape. Modern CAD systems are highly productive tools for concept embodiment and detailing. The finite element (FE) method is a general tool used to study the physical behavior of objects with arbitrary shapes. Since modern CAD technology enables design modification and change, it can support the innovative dimension of engineering as well as the verification of physical properties and behavior. Concepts and detailed solutions have traditionally been evaluated and verified with physical testing. Numerical modeling and simulation is in many cases a far more time-efficient method than testing for verifying the properties of an artifact. Numerical modeling can also support the innovative dimension of problem solving by enabling parameter studies and observations of real and synthetic behavior. Simulation-driven design is defined as a design process where decisions related to the behavior and performance of the artifact are significantly supported by computer-based product modeling and simulation. A framework for product modeling that is based on a modern CAD system with fully integrated FE modeling and simulation functionality provides the engineer with tools capable of supporting a number of engineering steps in all life-cycle phases of a product. Such a conceptual framework, based on a moderately coupled approach to integrating commercial PDM, CAD, and FE software, is presented. An object model and a supporting modular modeling methodology are also presented. Two industrial cases are used to illustrate the possibilities and some of the opportunities offered by simulation-driven design with the presented methodology and framework.
keywords CAE; FE Method; Metamodel; Object Model; PDM; Physical Behavior, System
series thesis:PhD
email
last changed 2003/02/12 22:37

_id ga9902
id ga9902
authors Soddu, Celestino
year 1999
title Recognizability of designer imprinting in Generative artwork
source International Conference on Generative Art
summary Design lives within two fundamental stages, the creative and the evolutionary. The first is that of producing the idea: this approach is built by activating a logical jump between the existing world and the possible worlds that represent our wishes and thoughts. A design idea is the identification of a set of possibilities that goes beyond specific "solutions" and identifies the sense, or the attainable quality. The field involved in this design stage is "how" the world may be transformed, not what the possible scenario may be. The second is the evolutionary stage, that of the development of the idea. This approach runs along paths of refinement and increasing complexity of the project. It involves the management of the project to reach the desired quality. Generative design is founded on the possibility of clearly separating the creative and the evolutionary stages of the idea. The first is reserved for humans (because creative processes, being activated by subjective interpretations and being abductive rather than deductive, inductive or analytical paths, cannot be emulated by machines), while the second may be carried out using artificial devices able to emulate logical procedures. The emulation of evolutionary logic is useful for a very simple reason: to gain the best operative design control over complexity. Designers know very well that the quality of a project depends, to an important degree, on the time spent designing. If the time is limited, the project cannot evolve enough to attain the desired quality. If the available time is increased, the project acquires a higher quality, thanks to the possibility of crossing various parallel evolutionary paths, developing them and verifying their relative potential by running the idea/evolution cycle several times and in progress (scheme 1). This is not all. In a time-limited design activity, the architect is pressed into facing the formalisation of performance requirements in terms of directly answering specific questions. He is pressed into analytically systematising the requirements before him in order to work quickly on the evolution of the project. The design solution can be effective but not at all flexible. If the real need of the user is even slightly different from the hypothesised requirement, the quality of the project, as its ability to respond to needs, breaks down. Projects approached in this way, which we could call "analytical", quickly become obsolete, being tied to the flow of fashion. A more "creative" approach needs, without doubt, more time: one in which we do not try to accelerate (and thereby simplify) the design development path by "deducing" the formalisation choices from the requirements, but instead develop our idea using the requirements and constraints as opportunities to increase the complexity of the idea, enriching the design development path to reach a higher quality. It is, of course, a creative and non-analytical approach. This design approach, which is "the" design path, is a voyage of discovery comparable to that of scientific research. Its fundamental structure is the idea as a "non-deduced" hypothesis concerning the quality and recognisability of the attainable artwork, according to the architect's "subjective" point of view. The needs and the constraints, identified as fields of possible development of the project, are opportunities for the idea to develop and acquire a specific identity and complexity.
Once possible scenarios of a project are formed, the same requirements and constraints take part, as a "verification of congruity", in the increase in quality. Then the cycle, once more, will be run again to reach more satisfactory results. It is, without doubt, an approach that requires time.
series other
email
more http://www.generativeart.com/
last changed 2003/08/07 17:25

_id a875
authors Suwa, M., Gero, J.S. and Purcell, T.
year 1999
title How an Architect Created Design Requirements
source G. Goldschmidt and W. Porter (eds), Design Thinking Research Symposium: Design Representation, MIT, Cambridge, pp. II.101-124
summary There is an anecdotal view that designers, during a conceptual design process, do not just synthesise solutions that satisfy the initially given requirements, but also themselves create novel design requirements that capture important aspects of the given problem. Further, it is believed that design sketches serve as a thinking tool that enables designers to do this. What kinds of cognitive interaction with their own sketches, then, enable designers to create novel requirements? The purpose of this paper is to answer this question. We examined the cognitive processes of a practising architect using a protocol analysis technique. Our examination focused on whether particular types of cognitive action account for the creation of novel design requirements. We found that intensive occurrences of a certain type of perceptual action, acts of establishing new relations or visual features on the sketches, are likely to co-occur with the creation of requirements. This suggests that this type of perceptual action is a key constituent of the act of creating novel requirements, and therefore one of the important actions in sketching activities. This presents evidence for the view that designing is a situated act, and also has implications for design education.
keywords Design Requirements; Sketches; Design Cognition; Protocol Analysis
series journal paper
email
last changed 2003/03/31 08:37
