A hallmark of professionalism in probabilistic analysis is the quantification of uncertainties in construction materials, which are subject to intrinsic randomness in their physical and mechanical properties; such analysis is now gaining popularity in the civil engineering arena. Moreover, knowledge of material behaviour is continuously evolving, and its statistical descriptors change as more data are collected or existing data are updated; hence reliability analysis has to be carried out with the updated data as a continuous process. According to committee report ACI 544.2R, no attempt has been made to establish a probabilistic relation between cube compressive strength and cylinder compressive strength for fiber reinforced concrete. In response to this report, a robust relation between experimentally measured cube and cylinder compressive strengths was established by the Monte Carlo simulation technique for different types of fibrous concrete (steel, alkali-resistant glass and polyester) before and after thermal shock, considering various uncertainties. Furthermore, simulated probabilistic models, characteristic models, optimized factors of safety and allowable design cylinder compressive strengths have been developed from the probability-of-failure graph, which exhibits robust performance for realistic civil engineering materials and structures.
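The Monte Carlo step described above can be sketched in a few lines; note that the cube-strength statistics, the cylinder/cube strength ratio and the 25 MPa design threshold below are illustrative assumptions, not values taken from the paper:

```python
import random
import statistics

random.seed(42)

# Hypothetical statistics (NOT from the paper): cube compressive strength
# modelled as normal with mean 40 MPa and std 5 MPa; the cylinder/cube
# strength ratio modelled as normal with mean 0.80 and std 0.05.
N = 100_000
cylinder = []
for _ in range(N):
    f_cube = random.gauss(40.0, 5.0)   # sampled cube strength [MPa]
    ratio = random.gauss(0.80, 0.05)   # sampled cylinder/cube ratio
    cylinder.append(f_cube * ratio)    # implied cylinder strength [MPa]

mean_cyl = statistics.fmean(cylinder)
# Probability of failure: cylinder strength below an assumed 25 MPa design value.
p_fail = sum(f < 25.0 for f in cylinder) / N

print(f"mean cylinder strength ~ {mean_cyl:.1f} MPa, P(f < 25 MPa) ~ {p_fail:.3f}")
```

Repeating this for each fibre type and exposure condition, and reading the strength at an acceptable probability of failure, yields characteristic values of the kind the abstract refers to.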
The objective of this paper is to present a probabilistic method of analyzing combinations of snow and wind loads using meteorological data and to determine their combination factors. Calculations are based on data measured at twelve Polish meteorological stations operated by the Institute for Meteorology and Water Management. The data provided cover the years 1966–2010. Five combinations of snow load and 10-minute mean wind velocity pressure have been considered. The Gumbel probability distribution has been used to fit the empirical distributions of the data. As a result, the interdependence between wind velocity pressure and snow load on the ground for a return period of 50 years has been provided, and values of the combination factors for snow loads and wind actions are proposed.
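A minimal sketch of the kind of Gumbel fit mentioned above, using method-of-moments parameter estimates and a 50-year return period; the annual-maximum snow loads below are hypothetical, not the station data used in the paper:

```python
import math
import statistics

# Hypothetical annual-maximum snow loads [kN/m^2] (NOT the paper's station data):
annual_max = [0.52, 0.71, 0.48, 0.90, 0.63, 0.55, 0.80, 0.45, 0.68, 0.74,
              0.59, 0.66, 0.83, 0.50, 0.61]

mean = statistics.fmean(annual_max)
std = statistics.stdev(annual_max)

# Method-of-moments Gumbel parameters: std = pi*beta/sqrt(6), mean = mu + gamma*beta.
beta = std * math.sqrt(6) / math.pi       # scale
mu = mean - 0.5772156649 * beta           # location (gamma = Euler-Mascheroni constant)

# Characteristic value for a 50-year return period, i.e. the level exceeded
# with annual probability 1/T:
T = 50
s_50 = mu - beta * math.log(-math.log(1 - 1 / T))
print(f"50-year snow load ~ {s_50:.2f} kN/m^2")
```

The same fit applied to wind velocity pressure, together with the joint occurrence statistics of the two actions, underlies the proposed combination factors.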
The paper formulates some objections to the methods of evaluating uncertainty in noise measurement presented in two standards: ISO 9612 (2009) and DIN 45641 (1990). In particular, it focuses on the approximation of an equivalent sound level by a function that depends on the arithmetic average of sound levels. Depending on the nature of a random sample, the exact value of the equivalent sound level may differ significantly from the approximate one, which might lead to erroneous estimation of the uncertainty of noise indicators. The article presents an analysis of this problem and of the adequacy of the solution depending on the type of random sample.
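The discrepancy at issue can be illustrated numerically with hypothetical readings: the equivalent level is an energy average, so a single loud event dominates it, while the arithmetic average of the levels can fall far below it:

```python
import math

def leq_exact(levels):
    """Equivalent continuous sound level: 10*log10 of the mean sound energy."""
    return 10 * math.log10(sum(10 ** (L / 10) for L in levels) / len(levels))

# A sample with large spread (hypothetical dB readings): one 90 dB event
# among readings near 60 dB.
levels = [60, 62, 61, 90, 63]
arith = sum(levels) / len(levels)      # arithmetic average of the levels
exact = leq_exact(levels)              # exact equivalent level

print(f"arithmetic mean = {arith:.1f} dB, exact L_eq = {exact:.1f} dB")
```

For this sample the arithmetic mean is about 67 dB while the exact equivalent level exceeds 80 dB, which is the kind of gap that can distort an uncertainty estimate built on the arithmetic average.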
From the theory of reliability it follows that the greater the observational redundancy in a network, the higher its level of internal reliability. However, taking into account the physical nature of the measurement process, one may notice that planned additional observations may increase the number of potential gross errors in a network without raising the internal reliability to the theoretically expected degree. Hence, it is necessary to set realistic limits on a sufficient number of observations in a network. An attempt to provide principles for finding such limits is undertaken in the present paper. An empirically obtained formula (Adamczewski 2003), called there the law of gross errors, determining the chances that a certain number of gross errors may occur in a network, was taken as the starting point of the analysis. With the aid of an auxiliary formula derived on the basis of the Gaussian law, the Adamczewski formula was modified to become an explicit function of the number of observations in a network. This made it possible to construct the tools necessary for the analysis and, finally, to formulate guidelines for determining the upper bounds of internal reliability indices. Since the Adamczewski formula was obtained for classical networks, the guidelines should be considered an introductory proposal requiring verification with reference to modern measuring techniques.
An embedded time-interval data acquisition system (DAS) has been developed for zero power reactor (ZPR) noise experiments. The system is capable of measuring the correlation or probability distribution of a random process. The design is implemented entirely on a single Field Programmable Gate Array (FPGA). The architecture has been tested on different FPGA platforms with different speed grades and hardware resources. Typical experimental values for the time resolution and inter-event dead time of the system are 2.22 ns and 6.67 ns, respectively. The DAS can record around 790 kS/s of 48-bit samples using its built-in fast memory. The system can measure very long time intervals owing to its 48-bit timing structure. Since the architecture runs on a typical FPGA, it is a low-cost experimental tool that requires little time to set up. In addition, revisions are easily made through its reprogramming capability. The performance of the system has been checked and verified experimentally.
Together with the dynamic development of modern computer systems, the possibilities of applying refined methods of nonparametric estimation to control engineering tasks have grown just as fast. This broad and complex theme is presented in this paper for the case of estimating the density of a random variable's distribution. Nonparametric methods allow a useful characterization of probability distributions without arbitrary assumptions about their membership in a fixed class. Following an illustrative description of the fundamental procedures used to this end, results of research on the application of kernel estimators, dominant here, are generalized and synthetically presented for problems of Bayes parameter estimation with an asymmetrical polynomial loss function, as well as for fault detection in dynamical systems as objects of automatic control, in the scope of detection, diagnosis and prognosis of malfunctions. To this end, the basics of data analysis and exploration tasks (recognition of outliers, clustering and classification), solved using a uniform mathematical apparatus based on the kernel estimators methodology, were also investigated.
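A minimal sketch of a kernel density estimator of the kind discussed above, using a Gaussian kernel and Silverman's rule-of-thumb bandwidth; the data are synthetic, and the paper's actual procedures may differ:

```python
import math
import random
import statistics

def kde(sample, x, h=None):
    """Gaussian kernel density estimate at point x.

    If no bandwidth h is given, Silverman's rule of thumb is used:
    h = 1.06 * sigma * n^(-1/5).
    """
    n = len(sample)
    if h is None:
        h = 1.06 * statistics.stdev(sample) * n ** (-1 / 5)
    return sum(math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in sample) \
        / (n * h * math.sqrt(2 * math.pi))

# Synthetic data drawn from a standard normal distribution:
random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(1000)]

# The true density at 0 is 1/sqrt(2*pi) ~ 0.399; the estimate should be close.
d0 = kde(data, 0.0)
print(f"estimated density at 0: {d0:.3f}")
```

No distributional class is assumed in advance: the estimate is built directly from the sample, which is the property that makes such estimators attractive for the outlier-recognition, clustering and classification tasks the paper mentions.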