Search results

Number of results: 12

Abstract

The paper refines Lenk’s concept of improving the performance of the harmonic mean estimator (HME) in three directions. First, the adjusted HME is derived from an exact analytical identity. Second, Lenk’s assumption concerning the appropriate subset A of the parameter space is significantly weakened. Third, it is shown that, under certain restrictions imposed on A, a fundamental identity underlying the HME also holds for improper prior densities, which substantially extends the applicability of the adjusted HME.
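
For orientation, a sketch of the type of identity on which such subset adjustments of the HME are commonly based (generic notation assumed here, not necessarily the paper's exact formulation): for a subset A of the parameter space with positive prior probability,

```latex
% Illustrative identity behind a subset-adjusted harmonic mean estimator
% (generic notation; A is a parameter subset with prior probability P(theta in A) > 0):
\[
  \int_A \frac{p(\theta \mid y)}{p(y \mid \theta)}\, d\theta
  = \int_A \frac{p(\theta)}{p(y)}\, d\theta
  = \frac{P(\theta \in A)}{p(y)},
\]
% so the marginal data density can be estimated from posterior draws theta^(i):
\[
  \widehat{p}(y) = \frac{P(\theta \in A)}
  {\frac{1}{N}\sum_{i=1}^{N} \frac{\mathbf{1}_A\!\left(\theta^{(i)}\right)}{p\!\left(y \mid \theta^{(i)}\right)}},
  \qquad \theta^{(i)} \sim p(\theta \mid y).
\]
```

Taking A equal to the whole parameter space recovers the classical HME; restricting A is what yields an adjusted estimator.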

Go to article

Authors and Affiliations

Anna Pajor
Jacek Osiewalski

Abstract

One of the prime tools in non-invasive cardiac electrophysiology is the recording of the electrocardiographic (ECG) signal, the analysis of which is greatly useful in the screening and diagnosis of cardiovascular diseases. One of the greatest problems, however, is that the recording of the electrical activity of the heart is usually performed in the presence of noise. The paper presents Bayesian and empirical Bayesian approaches to the problem of weighted signal averaging in the time domain, which is commonly used to extract a useful signal distorted by noise. Averaging is especially useful for biomedical signals such as the ECG, where the spectra of the signal and the noise significantly overlap. The use of weighted averaging methods is motivated by the variability of noise power from cycle to cycle, often observed in practice. It is demonstrated that exploiting a probabilistic Bayesian learning framework leads to accurate prediction models. Additionally, even in the presence of nuisance parameters, the empirical Bayesian approach offers a method for their automatic estimation, which reduces the number of preset parameters. The performance of the new method is experimentally compared with traditional averaging using the arithmetic mean and with a weighted averaging method based on criterion function minimization.
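
A minimal sketch of inverse-variance weighted averaging of aligned cycles, the generic idea behind the methods compared above (the iterative scheme, function names and toy data below are illustrative assumptions, not the authors' algorithm):

```python
import numpy as np

def weighted_average(cycles, n_iter=20):
    """Average aligned ECG cycles with per-cycle weights proportional to 1 / noise variance."""
    cycles = np.asarray(cycles, dtype=float)
    weights = np.full(len(cycles), 1.0 / len(cycles))         # start from the plain arithmetic mean
    for _ in range(n_iter):
        template = weights @ cycles                           # current estimate of the useful signal
        noise_var = np.mean((cycles - template) ** 2, axis=1)   # residual power of each cycle
        weights = (1.0 / noise_var) / np.sum(1.0 / noise_var)   # normalized inverse-variance weights
    return template, weights

# Toy usage: one clean waveform observed with noise power varying from cycle to cycle.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
clean = np.exp(-((t - 0.5) ** 2) / 0.01) * np.sin(2 * np.pi * 5 * t)
noise_sd = rng.uniform(0.1, 2.0, size=(30, 1))                # per-cycle noise level
noisy_cycles = clean + noise_sd * rng.standard_normal((30, t.size))
template, w = weighted_average(noisy_cycles)
```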

Go to article

Authors and Affiliations

A. Momot
M. Momot
J. Łęski

Abstract

Finite mixture and Markov-switching models generalize and, therefore, nest specifications featuring only one component. When specifying priors in the general (mixture) model and its special (single-component) case, it may be desirable to ensure that the prior assumptions introduced into both structures are compatible, in the sense that the prior distribution in the nested model amounts to the conditional prior in the mixture model under the relevant parametric restriction. The study provides the rudiments of setting compatible priors in Bayesian univariate finite mixture and Markov-switching models. After establishing some preliminary results, we derive specific conditions for compatibility in the case of three types of continuous priors commonly employed in Bayesian modeling: the normal, inverse gamma, and gamma distributions. Further, we study how introducing additional constraints into the mixture model’s prior affects these conditions. Finally, the methodology is illustrated through a discussion of setting compatible priors for Markov-switching AR(2) models.
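
A schematic statement of the compatibility requirement, in generic notation assumed here for illustration: write the mixture-model parameters as (θ, η), where fixing η = η₀ collapses the mixture to its single-component case; then

```latex
% Prior compatibility between a mixture model and its nested single-component case
% (generic notation assumed for illustration):
\[
  p_{\mathrm{nested}}(\theta)
  \;=\; p_{\mathrm{mixture}}(\theta \mid \eta = \eta_0)
  \;=\; \frac{p_{\mathrm{mixture}}(\theta, \eta_0)}
             {\int p_{\mathrm{mixture}}(\tilde{\theta}, \eta_0)\, d\tilde{\theta}} .
\]
```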

Go to article

Authors and Affiliations

Łukasz Kwiatkowski

Abstract

The paper discusses Bayesian productivity analysis of the 27 EU Member States, the USA, Japan and Switzerland. Bayesian Stochastic Frontier Analysis and a two-stage structural decomposition of output growth are used to trace the sources of output growth. This allows us to separate the impacts of capital accumulation, labour growth, technical progress and technical efficiency change on economic development. Since estimates of the growth components are conditioned upon model parameterisation and the underlying assumptions, a number of possible specifications are considered. The best model for decomposing output growth is chosen based on the highest marginal data density, which is calculated using the adjusted harmonic mean estimator.
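
A common textbook form of such a decomposition, shown here only to illustrate the idea (the elasticities and component names below are generic assumptions, not the paper's exact formulation):

```latex
% Illustrative structural decomposition of output growth in a stochastic frontier setting:
% output elasticities eps_K, eps_L, frontier shift TC, technical efficiency TE.
\[
  \Delta \ln y_{it} \;\approx\;
  \underbrace{\varepsilon_{K}\,\Delta \ln K_{it} + \varepsilon_{L}\,\Delta \ln L_{it}}_{\text{capital and labour accumulation}}
  \;+\; \underbrace{\mathrm{TC}_{it}}_{\text{technical progress}}
  \;+\; \underbrace{\Delta \ln \mathrm{TE}_{it}}_{\text{efficiency change}} .
\]
```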

Go to article

Authors and Affiliations

Kamil Makieła

Abstract

The paper is devoted to discussing the consequences of the so-called Frisch-Waugh Theorem for posterior inference and Bayesian model comparison. We adopt a generalised normal linear regression framework and weaken its assumptions in order to cover non-normal, jointly elliptical sampling distributions, autoregressive specifications, additional nuisance parameters and multi-equation SURE or VAR models. The main result is that inference based on the original full Bayesian model can be obtained using transformed data and reduced parameter spaces, provided the prior density for scale or precision parameters is appropriately modified.
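
For background, a small numerical check of the classical Frisch-Waugh(-Lovell) result that the paper builds on (a sketch of the sampling-theory version only, not the paper's Bayesian generalization):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
X1 = np.column_stack([np.ones(n), rng.standard_normal((n, 2))])  # block to be partialled out
X2 = rng.standard_normal((n, 2))                                 # block of interest
y = X1 @ np.array([1.0, 2.0, -1.0]) + X2 @ np.array([0.5, -0.3]) + rng.standard_normal(n)

# OLS on the full design [X1, X2]; keep only the coefficients on X2.
beta_full = np.linalg.lstsq(np.column_stack([X1, X2]), y, rcond=None)[0][X1.shape[1]:]

# Residualize y and X2 on X1, then regress the residuals on each other.
M1 = np.eye(n) - X1 @ np.linalg.pinv(X1)      # annihilator ("residual maker") matrix of X1
beta_fwl = np.linalg.lstsq(M1 @ X2, M1 @ y, rcond=None)[0]

print(np.allclose(beta_full, beta_fwl))       # True: Frisch-Waugh(-Lovell) equality
```

The coefficients on the block of interest coincide whether obtained from the full regression or from the regression on residualized (partialled-out) data.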

Go to article

Authors and Affiliations

Jacek Osiewalski

Abstract

We apply Bayesian inference to estimate the transformation matrix that converts the vector of industry outputs from the NACE Rev. 1.1 to the NACE Rev. 2 classification. In formal terms, the studied issue is an instance of the class of matrix balancing (updating, disaggregation) problems often arising in the field of multi-sector economic modelling. These problems are characterised by the availability of only partial, limited data and a strong role for prior assumptions, and are typically solved using bi-proportional balancing or cross-entropy minimisation methods. Building on a Bayesian highest posterior density formulation for a similarly structured case, we extend the model with a specification of prior information based on the Dirichlet distribution, and employ MCMC sampling. The model features a specific likelihood, representing accounting restrictions in the form of an underdetermined system of equations. The primary contribution, compared with the alternative, widespread approaches, is in providing a clear account of uncertainty.
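
A schematic of this type of structure, in notation assumed here purely for illustration: let t_ij be the share of old-classification industry i's output assigned to new-classification industry j; then

```latex
% Illustrative reclassification / matrix-balancing structure (all notation assumed):
\[
  x^{\mathrm{new}}_{j} = \sum_{i} t_{ij}\, x^{\mathrm{old}}_{i},
  \qquad t_{ij} \ge 0, \quad \sum_{j} t_{ij} = 1 \ \ \text{for every } i,
  \qquad (t_{i1},\dots,t_{iJ}) \sim \mathrm{Dirichlet}(\alpha_{i1},\dots,\alpha_{iJ}),
\]
```

with the known totals in both classifications entering as an underdetermined system of linear accounting restrictions on the t_ij, to be explored by MCMC.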

Go to article

Authors and Affiliations

Jakub Boratyński

Abstract

In recent years, autoregressive conditional duration (ACD) models, introduced by Engle and Russell in 1998, have become very popular in modelling the durations between selected events of the transaction process (trade durations or price durations) and in modelling financial market microstructure effects. The aim of the paper is to develop Bayesian inference for ACD models. Different specifications of ACD models are considered and compared, with particular emphasis on the linear ACD model, the Box-Cox ACD model, the augmented Box-Cox ACD model and the augmented (Hentschel) ACD model. The analysis considers models with the Burr distribution and the generalized gamma distribution for the innovation term. Bayesian inference is presented and applied in the estimation of, and prediction within, ACD models describing trade durations. MCMC methods, including the Metropolis-Hastings algorithm, are suitably adapted to obtain samples from the posterior densities of interest. The empirical part of the work includes modelling of trade durations of selected equities from the Polish stock market.
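
For reference, the baseline linear ACD recursion and its Box-Cox generalization in the generic form they usually take (shown as an assumed illustration, not the paper's exact parameterization): the observed duration is x_i = ψ_i ε_i, with ψ_i the conditional expected duration and ε_i a positive i.i.d. innovation (e.g. Burr or generalized gamma),

```latex
% Linear and Box-Cox ACD recursions (generic forms, assumed for illustration):
\[
  x_i = \psi_i\, \varepsilon_i, \qquad
  \text{linear ACD: } \psi_i = \omega + \alpha x_{i-1} + \beta \psi_{i-1}, \qquad
  \text{Box-Cox ACD: } \psi_i^{\delta} = \omega + \alpha x_{i-1}^{\delta} + \beta \psi_{i-1}^{\delta}.
\]
```

The augmented variants add further flexibility, e.g. asymmetric, news-impact-type terms.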

Go to article

Authors and Affiliations

Roman Huptas

Abstract

We discuss the empirical importance of long term cyclical effects in the volatility of financial returns. Following Amado and Teräsvirta (2009), Čížek and Spokoiny (2009) and others, we consider a general conditionally heteroscedastic process whose stationarity property is distorted by a deterministic function that governs the possible time variability of the unconditional variance. The function proposed in this paper can be interpreted as a finite Fourier approximation of an Almost Periodic (AP) function as defined by Corduneanu (1989). The resulting model has a particular form of a GARCH process with time-varying parameters, intensively discussed in the recent literature.

In the empirical analyses we apply a generalisation of the Bayesian AR(1)-GARCH model for daily returns of the S&P500, covering a period of sixty years of the US postwar economy, including the recently observed global financial crisis. The results of a formal Bayesian model comparison clearly indicate the existence of significant long term cyclical patterns in volatility, with a strongly supported periodic component corresponding to a 14-year cycle. Our main results are invariant with respect to changes of the conditional distribution from Normal to Student-t and to changes of the volatility equation from regular GARCH to asymmetric GARCH.
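
A generic sketch of this model class (notation assumed here for illustration): the conditional variance is the product of a stationary GARCH component and a deterministic, almost periodic component approximated by a finite Fourier sum,

```latex
% GARCH with a deterministic, almost periodic long-run variance component
% (generic decomposition, assumed for illustration):
\[
  r_t = \mu + \varepsilon_t, \qquad
  \varepsilon_t = \sqrt{g_t\, h_t}\, z_t, \qquad z_t \sim \text{i.i.d.}(0,1),
\]
\[
  h_t = \omega + \alpha\, \frac{\varepsilon_{t-1}^2}{g_{t-1}} + \beta h_{t-1}, \qquad
  g_t = \exp\!\Bigg\{ \sum_{k=1}^{K} \Big[ a_k \cos(2\pi \lambda_k t) + b_k \sin(2\pi \lambda_k t) \Big] \Bigg\},
\]
```

so that g_t captures the slowly varying, cyclical part of the unconditional variance while h_t retains the usual short-run GARCH dynamics.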

Go to article

Authors and Affiliations

Błażej Mazur
Mateusz Pipień

Abstract

The main goal of this paper is to propose a probabilistic description of cyclical (business) fluctuations. We generalize a fixed deterministic cycle model by incorporating a time-varying amplitude. More specifically, we assume that the mean function of cyclical fluctuations depends on unknown frequencies (related to the lengths of the cyclical fluctuations) in a similar way to the almost periodic mean function in a fixed deterministic cycle, while the assumption of constant amplitude is relaxed. We assume that the amplitude associated with a given frequency is time-varying and is a spline function. Finally, using a Bayesian approach and under standard prior assumptions, we obtain the explicit marginal posterior distribution for the vector of frequency parameters. In our empirical analysis, we consider monthly industrial production in most European countries. Based on the highest marginal data density value, we choose the best model to describe the considered growth cycle. In most cases, the data support the model with a time-varying amplitude. In addition, the expectation of the posterior distribution of the deterministic cycle for the considered growth cycles has dynamics similar to cycles extracted by standard band-pass filtering methods.
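
A schematic of the mean function described above (generic notation assumed for illustration): a finite sum of harmonics at unknown frequencies λ_k whose amplitudes vary over time through spline functions,

```latex
% Almost periodic mean with time-varying (spline) amplitudes -- generic illustration:
\[
  \mu(t) = \sum_{k=1}^{K} \Big[ a_k(t) \cos(2\pi \lambda_k t) + b_k(t) \sin(2\pi \lambda_k t) \Big],
  \qquad a_k(\cdot),\, b_k(\cdot) \ \text{splines}, \quad \lambda_k \ \text{unknown frequencies}.
\]
```

Taking a_k(t) and b_k(t) constant recovers the fixed deterministic cycle with constant amplitude.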

Go to article

Authors and Affiliations

Łukasz Lenart

Abstract

In empirical research on financial market microstructure, and in testing some predictions from the market microstructure literature, the behavior of some characteristics of the trading process can be very important and useful. Among all the characteristics associated with tick-by-tick data, the trading time and the price seem the most important. The very first joint model for prices and durations, the so-called UHF-GARCH, was introduced by Engle (2000). The main aim of this paper is to propose a simple, novel extension of Engle’s specification based on trade-to-trade data and to develop and apply the Bayesian approach to the estimation of this model. The intraday dynamics of return volatility are modelled by an EGARCH-type specification adapted to irregularly spaced data. In the analysis of price durations, the Box-Cox ACD model with the generalized gamma distribution for the error term is considered. To the best of our knowledge, the UHF-GARCH model with such a combination of the EGARCH and Box-Cox ACD structures has not been studied in the literature so far. To estimate the model, the Bayesian approach is adopted. Finally, the methodology developed in the paper is employed to analyze transaction data from the Polish Stock Market.
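
A rough sketch of how such a joint duration-return specification is typically assembled (all notation here is an assumption for illustration, not the authors' exact model): a Box-Cox ACD equation for the durations and an EGARCH-type equation for the per-trade return variance, scaled by the duration to account for irregular spacing,

```latex
% Generic joint duration-return structure (notation assumed for illustration):
\[
  x_i = \psi_i\, \varepsilon_i, \qquad
  \psi_i^{\delta} = \omega_d + \alpha_d x_{i-1}^{\delta} + \beta_d \psi_{i-1}^{\delta}, \qquad
  \varepsilon_i \sim \text{generalized gamma},
\]
\[
  r_i \mid x_i \sim \mathcal{N}\!\left(0,\; x_i\, \sigma_i^2\right), \qquad
  \ln \sigma_i^2 = \omega_r + \gamma\, |z_{i-1}| + \kappa\, z_{i-1} + \beta_r \ln \sigma_{i-1}^2,
  \qquad z_{i-1} = \frac{r_{i-1}}{\sqrt{x_{i-1}}\, \sigma_{i-1}} .
\]
```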

Go to article

Authors and Affiliations

Roman Huptas

Abstract

The problem of estimating long-term environmental noise hazard indicators and their uncertainty is presented in this paper. The type A standard uncertainty is defined by the standard deviation of the mean, and the rules given in the ISO/IEC Guide 98 are used in the calculations. It is usually determined by means of classic variance estimators, under the following assumptions: normality of the measurement results, adequate sample size, lack of correlation between elements of the sample, and observation equivalence. However, such assumptions are rather questionable in relation to acoustic measurements. This is why the authors indicate the necessity of implementing non-classical statistical solutions. An estimation approach that seeks the density function of the distribution of long-term noise indicators by means of kernel density estimation, the bootstrap method and Bayesian inference has been formulated. These methods do not impose limitations on the form and properties of the analyzed statistics. The theoretical basis of the proposed methods is presented in this paper, as well as an example of the calculation of the expected value and variance of the long-term noise indicators LDEN and LN. The indicated solutions are illustrated, and their usefulness analyzed, using traffic noise monitoring results recorded in Cracow, Poland.
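
A minimal sketch of the bootstrap part of such an analysis (purely illustrative; the energy-average form of the long-term indicator is the standard one, but the function names and toy data are assumptions):

```python
import numpy as np

def long_term_level(levels_db):
    """Long-term noise indicator as an energy (logarithmic) average of single-period levels, in dB."""
    return 10.0 * np.log10(np.mean(10.0 ** (np.asarray(levels_db, dtype=float) / 10.0)))

def bootstrap_indicator(levels_db, n_boot=10_000, seed=0):
    """Bootstrap the distribution of the long-term indicator from measured single-period levels."""
    rng = np.random.default_rng(seed)
    levels_db = np.asarray(levels_db, dtype=float)
    resamples = rng.choice(levels_db, size=(n_boot, levels_db.size), replace=True)
    stats = 10.0 * np.log10(np.mean(10.0 ** (resamples / 10.0), axis=1))
    return stats.mean(), stats.std(ddof=1)    # expected value and spread of the long-term indicator

# Toy usage with simulated daily LDEN-like values (dB):
daily_levels = np.random.default_rng(1).normal(68.0, 2.5, size=365)
expected_value, std_uncertainty = bootstrap_indicator(daily_levels)
```
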
Go to article

Authors and Affiliations

Wojciech Michał Batko
Bartłomiej Stępień

Abstract

The paper investigates a Bayesian approach to the estimation of generalized true random-effects (GTRE) models. The analysis shows that, under suitably defined priors for the transient and persistent inefficiency terms, the posterior characteristics of such models are well approximated using simple Gibbs sampling. No model re-parameterization is required. The proposed modification not only allows us to make more reasonable (less informative) assumptions as regards the prior distributions of transient and persistent inefficiency, but also appears to be more reliable in handling particularly noisy datasets. The empirical application furthers research into stochastic frontier analysis using GTRE models by examining the relationship between the inefficiency terms in the GTRE, true random-effects, generalized stochastic frontier and standard stochastic frontier models.
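
For orientation, the generic form of a GTRE production frontier (notation assumed here for illustration): the composed error combines a firm-specific random effect, persistent inefficiency, transient inefficiency and noise,

```latex
% Generic GTRE frontier (notation assumed for illustration):
\[
  y_{it} = \alpha + x_{it}^{\top} \beta
  + \underbrace{\omega_i}_{\text{random effect}}
  - \underbrace{\eta_i}_{\text{persistent inefficiency}}
  + v_{it}
  - \underbrace{u_{it}}_{\text{transient inefficiency}},
  \qquad v_{it} \sim \mathcal{N}(0, \sigma_v^2), \quad \omega_i \sim \mathcal{N}(0, \sigma_\omega^2),
  \quad \eta_i,\, u_{it} \ge 0 .
\]
```

Restricting subsets of these components yields the true random-effects, generalized stochastic frontier and standard stochastic frontier models that the empirical application compares.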

Go to article

Authors and Affiliations

Kamil Makieła
