The schedule is the basic element of organizing a construction project. Preparing the data necessary to specify the completion dates indicated in the schedule involves information that is uncertain and hard to quantify. The article presents methods of building a schedule that incorporate fuzzy amounts of labour, time standards and numbers of workers. The proposed procedure allows the real project completion deadline to be determined, taking into account the variable factors affecting the duration of individual works.
During the implementation of construction projects, the durations of activities are affected by various factors. Because of this, both during the planning phase and the construction phase, managers try to estimate, or predict, the length of any delays that may occur. Such estimates make it possible to take appropriate planning and management action during the execution of construction works. This paper presents the use of a non-deterministic concept, based on the theory of fuzzy sets, for describing the uncertainty of estimating the duration of works. The author describes a method for the fuzzy estimation of construction works duration, based on the fact that uncertain data are an inherent feature of construction projects. An example application of the method is presented: a fuzzy estimate of an activity's duration, taking into account the distorting influence of malfunctioning construction equipment and delayed deliveries of construction materials.
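The paper's exact procedure is not reproduced here, but the idea of combining a fuzzy base duration with fuzzy delay factors can be sketched with triangular fuzzy numbers; all numeric values below are hypothetical:

```python
# Illustrative sketch (not the paper's method): an activity duration as a
# triangular fuzzy number (optimistic, most likely, pessimistic), extended
# by assumed fuzzy delay factors for equipment malfunction and late deliveries.

def tfn_add(a, b):
    """Add two triangular fuzzy numbers (l, m, u) component-wise."""
    return tuple(x + y for x, y in zip(a, b))

base_duration = (8.0, 10.0, 13.0)    # days: optimistic / most likely / pessimistic
equipment_delay = (0.0, 0.5, 2.0)    # assumed fuzzy delay factors
delivery_delay = (0.0, 1.0, 3.0)

total = tfn_add(tfn_add(base_duration, equipment_delay), delivery_delay)
print(total)  # (8.0, 11.5, 18.0)

# A crisp deadline estimate via centroid defuzzification of the triangle:
l, m, u = total
print((l + m + u) / 3)  # 12.5
```

The centroid is only one possible crisp reading; a planner may instead report the whole support [8, 18] days as the range of feasible completion times.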
We discuss epistemological and methodological aspects of the Bayesian approach in astrophysics and cosmology. An introduction to the Bayesian framework is given as background for a discussion of Bayesian inference in physics. The interplay between modern cosmology, Bayesian statistics, and the philosophy of science is presented. We consider paradoxes of confirmation, such as Goodman's paradox, that appear in the Bayesian theory of confirmation. As Goodman's paradox shows, Bayesian inference is susceptible to certain epistemic limitations in the logic of induction. However, Goodman's paradox applied to cosmological hypotheses seems to be resolved thanks to the evolutionary character of cosmology and the accumulation of new empirical evidence. We argue that the Bayesian framework is useful in the context of the falsifiability of quantum cosmological models, as well as the contemporary dark energy and dark matter problems.
Recently, the topic of ontologies has been attracting growing attention from the IT community. Various processes of ontology creation, integration, and deployment have been proposed. As a consequence, an urgent need has emerged for evaluating the resulting ontologies in a quantitative way. A number of metrics have been defined, along with different approaches to measuring the properties of ontologies. In the first part of this paper we review the state of the art in this domain. Special attention is devoted to discussing the differences between syntactic measures (referring to various properties of the graphs that represent ontologies) and semantic measures (reflecting the properties of the space of ontology models). In the second part we propose an alternative approach to the quantification of the semantics of an ontology. The original proposal presented here exploits specific methods of representing the space of semantic models used for the optimization of reasoning. We argue that this approach enables us to capture different kinds of relations among ontology terms and offers possibilities of devising new useful measures.
This article proposes a model of magnetic memory based on nanometre-sized iron cells. To determine the model specifications, a group of magnetic probes with different geometrical parameters was examined using numerical simulations, for two different durations of the transitions among the quasi-stable magnetic distributions found in the system, derived from the energy minima. A range of geometrical parameters was found for which 16 quasi-stable energy states exist for each probe. Based on these results, 4-bit magnetic cell systems can be designed whose state is changed by a spin-polarized current. Time-dependent current densities and the electron spin polarization directions of the current were determined for all transitions among the quasi-stable states, for the discovered set of 4-bit cells with different geometrical parameters. The 16-state cells with the smallest area achieved a writing density 300 times greater than that of current semiconductor solutions with the largest writing densities. The transitions among the quasi-stable states of the cells were examined for durations 10⁵ times shorter than those of up-to-date solutions.
Due to the increase in threats posed by offshore foundries, companies outsourcing IPs are forced to protect their designs from them. These threats include IP piracy, counterfeiting and reverse engineering. Logic encryption has been observed to be a leading countermeasure: it introduces extra gates into the design, known as key gates, which hide the functionality of the design unless the correct keys are fed to them. Scan tests are used in many designs to observe fault coverage, but scan chains can become vulnerable to side-channel attacks. A potential protection against this vulnerability is obfuscation of the scan chain output, which involves shuffling the operation of the cells in the scan chain when an incorrect test key is fed. In this paper, we propose a method to overcome the threats posed to both the scan design and the logic circuit. The efficiency of the secured design is verified on ISCAS'89 circuits and the results demonstrate the security of the proposed method against the threats posed.
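As an illustration of the key-gate idea (a toy example, not the circuits or keys used in the paper), a small function can be locked with one XNOR and one XOR key gate so that only the correct key restores its truth table:

```python
# Toy logic-locking sketch: key gates hide the function f = (a AND b) OR c.
def original(a, b, c):
    return (a & b) | c

def locked(a, b, c, key):
    k0, k1 = key
    w = 1 ^ (a & b) ^ k0   # XNOR key gate: transparent only when k0 == 1
    return (w | c) ^ k1    # XOR key gate: transparent only when k1 == 0

CORRECT_KEY = (1, 0)
inputs = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]

# The correct key reproduces the original truth table...
assert all(locked(*i, CORRECT_KEY) == original(*i) for i in inputs)
# ...while a wrong key corrupts at least one output.
assert any(locked(*i, (0, 0)) != original(*i) for i in inputs)
```

Mixing XOR and XNOR key gates, as above, prevents an attacker from recovering the key by simply setting all key bits to zero.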
This paper presents an innovative method of technology mapping of circuits into the ALM blocks found in Intel FPGA devices. The essence of the idea is the use of triangle tables associated with different configurations of the blocks. The innovation of the proposed method lies in the possibility of choosing an appropriate configuration of an ALM block, which is connected with choosing an appropriate decomposition path. The effectiveness of the proposed technology mapping technique is demonstrated by experiments conducted on combinational and sequential circuits.
The paper shows methods of analysing and assessing the partnering relations of construction enterprises with the use of questionnaires, statistics, and fuzzy logic. The results were obtained from Polish, Slovak and Ukrainian enterprises. The definition of partnering in the construction industry indicates that it is a qualitative concept. By applying a scale in the questionnaire and mathematically analysing the data, the final research result, showing the level of partnering relations of construction enterprises, is rendered quantitatively.
The paper concerns the problem of state assignment for finite state machines (FSMs), targeting PAL-based CPLD implementations. The approach presented in the paper is dedicated to the state encoding of fast automata. The main idea is to determine the number of logic levels of the transition function before the state encoding process and to maintain these constraints during the process. The number of implicants of every single transition function must be known while assigning states, so elements of two-level minimization based on Primary and Secondary Merging Conditions are implemented in the algorithm. The method relies on code length extension when necessary. Thus, in one of the most basic stages of the logic synthesis of sequential devices, constraints specific to PAL-based CPLDs are taken into account.
The protection of Polish architectural heritage in the former eastern borderlands, accomplished through the conservation and technical securing of historical structures, constitutes one of the main programmes implemented by the Ministry of Culture and National Heritage. Currently, many Polish historical buildings in the former eastern borderlands are in very bad technical condition. The load-bearing systems of these buildings, as well as elements of their finish, require immediate emergency securing work. The basic step that precedes conservation work is emergency structural work, which guarantees the durability and stability of the entire historical substance. The specifics and complexity of the problem of the failure of historical buildings often demand an in-depth analysis of a series of factors that are difficult to measure and that govern the cause-and-effect relationships during the early stage of the technical evaluation of a structure. The analyses of failures of numerous historical structures carried out by the authors have become the inspiration for the search for effective methods that would allow an in-depth analysis of the causes and effects of the failures in question. The DEMATEL method (Decision Making Trial and Evaluation Laboratory) presented in this work, together with its fuzzy extension, has lately become one of the more popular methods used in the cause-and-effect analysis of various phenomena. The authors demonstrate how this method works on the example of the evaluation and securing of the load-bearing system of the 17th-century Collegiate Church of the Holy Trinity in the town of Olykha in the Volhynskiy Oblast, Ukraine.
This paper presents a novel approach to the design of Huffman asynchronous sequential circuits, i.e. two-valued Boolean switching systems. The algorithm is implemented as software using a distributed, service-oriented application model based on web service component design. The paper considers the challenges of implementing the method for both Moore and Mealy structures, with particular attention to estimating the computational complexity of Huffman's minimization algorithm. It provides implementation details, a theoretical model estimation and experimental results that confirm the theoretical approach in practice. The paper also examines the implementation of the multistep design process and the problems inherent in a web-service-based environment, both for development and educational purposes.
The article introduces a new proposal for a defuzzification method that can be implemented in fuzzy controllers. The first chapter refers to the origin of fuzzy sets. Next, a modern development based on this theory is presented in the form of ordered fuzzy numbers (OFN), together with their most important characteristics. In the following chapter, details of the defuzzification process are given as part of the fuzzy controller model. Then a new defuzzification method, named center of circles intersection (CCI), is presented. The authors compare this method with similar geometric solutions: triangular expanding (TE) and geometric mean (GM). The results are also compared with other methods such as center of gravity (COG), first of maxima (FOM) and last of maxima (LOM). The analysis shows that the proposed solution works correctly and provides results for traditional fuzzy numbers as well as directed fuzzy numbers. The last chapter contains a summary with more detailed conclusions and indicates further directions of research.
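For reference, the classical operators named above can be sketched numerically on a triangular membership function; the grid and the triangle (2, 6, 8) are arbitrary choices, and the proposed CCI method itself is not reproduced here:

```python
# Defuzzify a triangular membership function with support [2, 8], peak at 6.
xs = [i * 0.001 for i in range(10001)]                     # domain [0, 10]
mu = [max(0.0, min((x - 2) / 4, (8 - x) / 2)) for x in xs]

cog = sum(x * m for x, m in zip(xs, mu)) / sum(mu)         # center of gravity
peak = max(mu)
fom = next(x for x, m in zip(xs, mu) if m == peak)         # first of maxima
lom = next(x for x, m in zip(reversed(xs), reversed(mu))
           if m == peak)                                   # last of maxima

print(round(cog, 3), fom, lom)  # COG ≈ (2 + 6 + 8) / 3 ≈ 5.333; FOM = LOM = 6
```

For a triangle with a single peak, FOM and LOM coincide; the methods differ only on flat-topped (trapezoidal) or multimodal membership functions.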
A novel dual mode logic (DML) model has superior energy-performance characteristics compared to CMOS logic. The DML model has a unique feature that allows switching between its two modes of operation according to real-time system requirements. DML functions in two dissimilar modes of operation (static and dynamic), each with specific features, to selectively obtain either low energy or high performance. In the sub-threshold region, DML achieves minimum energy; however, the performance penalty in the sub-threshold region is enormous. In this paper, the operation of the DML model in the moderate inversion region is explored. The near-threshold region retains much of the energy saving of sub-threshold designs, along with improved performance. Furthermore, robustness to supply voltage variations and sensitivity to process and temperature variations are presented. Monte Carlo analysis shows that the proposed near-threshold design achieves minimum energy along with moderate performance.
Selected scientific contacts of Jacek Hawranek and Jan Zygmunt with Professor Bogusław Wolniewicz in the period from the end of the 1980s to the beginning of the 21st century are presented in this essay. They concerned the algebraic aspects of the ontology of situations and, from a certain moment, a single question posed by Wolniewicz in his note A question about join-semilattices (Bulletin of the Section of Logic, 19/3, 1990, p. 108), which resulted in the Hawranek & Zygmunt paper Wokół pewnego zagadnienia z dziedziny półkrat górnych z jednością (“Some comments on a question about semilattices with unit”) (Acta Universitatis Wratislaviensis 1445, Logika 15 (1993), pp. 59–68) containing an answer to Wolniewicz’s question. The Hawranek & Zygmunt paper is reprinted below, and the essay may also be treated as a kind of analytical and historical introduction to it. The story of the contacts between Wolniewicz and Hawranek & Zygmunt is told with the help of the preserved correspondence between the three. In his letters Professor Wolniewicz appears as a passionate researcher, open to discussion and ready to share his research successes and difficulties with others.
The paper presents issues related to the design of a robust adaptive fuzzy estimator for a drive system with a flexible joint. The proposed estimator provides a variable Kalman gain (based on the Mahalanobis distance) as well as the estimation of the system parameters (based on a fuzzy system). The obtained value of the time constant of the load machine is used to update the system state matrix and to retune the parameters of the state controller. The proposed control structure (fuzzy Kalman filter and adaptive state controller) is investigated in simulation and experimental tests.
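The estimator itself is not reproduced here, but the Mahalanobis-distance idea behind a variable Kalman gain can be sketched on a scalar toy filter; all models, noise values and the threshold below are assumed for illustration only:

```python
import math

def robust_update(x, P, z, R, Q=0.01, threshold=3.0):
    """One predict/update step; the gain is de-weighted for outliers."""
    P = P + Q                        # predict (random-walk state model)
    S = P + R                        # innovation covariance
    d = abs(z - x) / math.sqrt(S)    # Mahalanobis distance (scalar case)
    K = P / S                        # nominal Kalman gain
    if d > threshold:                # suspected outlier: shrink the gain
        K *= threshold / d
    x = x + K * (z - x)
    P = (1 - K) * P
    return x, P

x, P = 0.0, 1.0
for z in (5.1, 4.9, 5.0, 50.0, 5.05, 4.95):   # 50.0 is an injected outlier
    x, P = robust_update(x, P, z, R=0.1)
print(round(x, 2))   # the estimate stays close to the true level of about 5
```

With a fixed gain, the outlier at 50.0 would pull the estimate far off; scaling the gain by the Mahalanobis distance limits its influence to a small correction.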
The ultrasonic flowmeter described in this paper measures the transit time of an ultrasonic pulse. The device consists of two ultrasonic transducers and a high-resolution time interval measurement module. An ultrasonic transducer emits a characteristic wave packet (transmit mode); when the transducer is in receive mode, the received wave packet is fed to the inputs of the time interval measurement module. The module allows the registration of the transit time differences of several pulses in the packet, so in practice several time-stamps are registered during a single measuring cycle. Moreover, the measurement process is synchronous and, by applying statistics, the time interval measurement uncertainty is improved even within a single measurement. Besides a detailed discussion of the principle of operation of the ultrasonic flowmeter implemented in an FPGA structure, the article also presents and discusses the test results.
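The transit-time principle underlying such flowmeters can be sketched as follows; the geometry and numbers are assumed for illustration and are not taken from the paper:

```python
# Transit-time sketch: flow velocity from the upstream/downstream transit
# times along an acoustic path of length L inclined at angle theta to the
# pipe axis. A pulse travelling with the flow arrives sooner than one
# travelling against it; the difference encodes the flow velocity.
import math

def flow_velocity(t_down, t_up, L, theta_deg):
    """v = L * (t_up - t_down) / (2 * cos(theta) * t_up * t_down)."""
    theta = math.radians(theta_deg)
    return L * (t_up - t_down) / (2 * math.cos(theta) * t_up * t_down)

L = 0.12         # acoustic path length [m] (assumed)
theta = 45.0     # path angle [deg] (assumed)
c = 1480.0       # speed of sound in water [m/s]
v_true = 2.0     # assumed flow velocity [m/s]

# Simulated transit times with and against the flow component along the path:
t_down = L / (c + v_true * math.cos(math.radians(theta)))
t_up = L / (c - v_true * math.cos(math.radians(theta)))

print(round(flow_velocity(t_down, t_up, L, theta), 6))  # → 2.0
```

Note that the formula cancels the speed of sound c, which is why transit-time flowmeters are largely insensitive to temperature-induced changes in c; the demanding part, addressed by the high-resolution module in the paper, is resolving the nanosecond-scale difference t_up − t_down.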