Theme of Conference
Applications of mathematics and statistics in economics.
Aim of Conference
To acquaint the participants of the conference with the latest mathematical and statistical methods that can be used in solving theoretical and practical economic problems, and to present the research results achieved by the participants in this field.
Doc. Ing. Miroslav Abrahám, PhD. (Banská Bystrica), Chairman
Doc. RNDr. Rudolf Zimka, CSc. (Banská Bystrica)
Prof. Ing. Richard Hindls, CSc. (Prague)
Prof. Ing. Stanislava Hronová, CSc. (Prague)
Prof. Ing. Ilja Novák, CSc. (Prague)
Dr Ludwik Adamczyk (Wroclaw)
Prof. dr hab. Walenty Ostasiewicz (Wroclaw)
Doc. Ing. Miroslav Abrahám, PhD. (Banská Bystrica), Chairman
Doc. RNDr. Rudolf Zimka, CSc. (Banská Bystrica)
Ing. Peter Laco, PhD. (Banská Bystrica), Secretary
Mgr. Leontína Striežovská (Banská Bystrica)
Application of Statistical Commercial Software or Freeware in Academic Environment
Current trends in teaching statistics require active use of specialized statistical software to satisfy demands of the labour market on qualifications of graduates. Software of either commercial or open-source origin can be considered. The article describes a framework of software use in graduate statistical courses offered at a public academic institution in the United States. Software incorporation in the teaching process and research computing are mentioned. Further, characteristics and comparisons of available types of statistical software are provided with respect to analytic potential, programmability, graphical capabilities, data or graphical connectivity, hardware requirements, product support, user comfort and switching ease. Recently developed open-source statistical packages implementing features of Unix-based operating systems can be additionally applied to academic purposes. The most frequently used titles were outlined and contrasted with current commercial packages. Finally, general aspects of statistical software selection for academic use are discussed.
Micro- and Macroeconomic Models and Derive5
Micro- and macroeconomic models belong to the field of economics that uses classical tools of mathematical analysis and algebra for solving problems. We can use mathematical software for solving and representing those problems. The contribution is concerned with the utilization of mathematical software that is convenient for this type of mathematical and economic problems.
Smooth-Transition Regression of the Phillips Curve for Germany
This paper presents the application of smooth transition regression to a specification of the Phillips curve. After a short discussion of the basic relationship and its empirical implementation we first discuss linear estimates and subsequently provide some evidence of the capability of smooth transition regressions to represent salient features of this relationship in view of structural changes of the economy. The empirical applications are devoted to the German economy.
Some Nonparametric Two-sample Tests of Scale
For testing the difference in scale of two independent samples, the classical parametric F-test and many nonparametric tests based on the ranks of observations can be used. The best-known tests will be presented, together with some results on the robustness of the F-test and of some of the nonparametric tests.
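As an illustration of the two approaches, the sketch below computes the parametric F statistic (a ratio of sample variances) and the rank-based Ansari-Bradley statistic, in which scores run inward from both ends of the pooled ordered sample. The data are invented and tie-free; p-values are omitted.

```python
# Two classical scale statistics for two independent samples: the parametric
# F statistic (assumes normality) and the rank-based Ansari-Bradley statistic.

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def f_statistic(x, y):
    """Ratio of sample variances, the classical parametric F-test statistic."""
    return variance(x) / variance(y)

def ansari_bradley(x, y):
    """Sum of Ansari-Bradley scores of sample x in the pooled ordered sample.

    Scores run 1, 2, ... inward from both ends, so extreme observations get
    small scores and central ones large scores (ties are ignored here).
    """
    pooled = sorted([(v, 0) for v in x] + [(v, 1) for v in y])
    n = len(pooled)
    scores = [min(i + 1, n - i) for i in range(n)]
    return sum(s for (v, grp), s in zip(pooled, scores) if grp == 0)

x = [1.1, 2.0, 2.9, 4.2, 5.1]        # wider spread
y = [2.85, 2.95, 3.0, 3.1, 3.2]      # narrow spread
print(f_statistic(x, y))             # much larger than 1
print(ansari_bradley(x, y))          # small sum: x occupies the extremes
```

A small Ansari-Bradley sum for the first sample indicates that its observations sit at the extremes of the pooled sample, i.e. it has the larger scale.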
Actuarial Method Standards for Latvian Private Pension Funds Accounting
The article reviews modern standards of pension mathematics – from traditional actuarial methods of funding up to stochastic models of pension systems as systems with dynamic management. Methods of estimating pension fund assets are described.
CASE systems and their importance for the optimization of enterprise databases
CASE (Computer Aided Software Engineering) systems are tools which can increase the productivity of work in an enterprise. The main effect of CASE is its permanent application to create, store, update and present information about the enterprise system, its aims, structure and functions. CASE systems use a central database which contains technical, organizational and control information. CASE tools are intended for each phase of the life cycle of an information system.
On the Existence of the Liapunov Bifurcation Constant in a Macroeconomic Model
A non-linear macroeconomic model of an open economy describing the development of output and the exchange rate is investigated from the point of view of the existence of business cycles. The crucial role in the existence of business cycles is played by the Liapunov constant in the bifurcation equation. In the paper, sufficient conditions for its existence are found.
Obstacles to Using GDP Estimates in the Czech Republic
This contribution deals with the improvement of quarterly information on gross domestic product by type of expenditure, from the first estimates to the final data. It concludes that the first estimate of GDP is under-estimated. This underestimation is caused by underestimation of final household consumption expenditure and of gross fixed capital formation. In the balancing process, when the above-mentioned expenditure items are under-estimated, the surplus on the resources side is allocated to the change in stock of inventories item. Thus this item contains the discrepancy between the resources side and the expenditures side of GDP and is disqualified from its role in business-cycle analysis. In later revisions of the GDP data, the final household consumption expenditure and gross fixed capital formation items are increased and the discrepancy included in the change in stock of inventories is decreased. Differences in biases between constant-price and current-price data are also stated.
IT utilization in the education process of managers
In the contribution we mention some possibilities of improving and enhancing the education process at an economics faculty with the use of available IT equipment. We are working on establishing e-learning in our subjects and try to create and use educational multimedia for a more effective achievement of educational goals. Moreover, we present our personal experience with the NetOp School software, which we use in our lessons and which is designed to make computer-based teaching more effective. This software makes the teaching process easier: the teacher can instruct, monitor and assist students on their PCs from his master PC.
A Model of Moderation: Finding Skiba Points on a Slippery Slope
A simple model is considered that rewards “moderation” – finding the right balance between sliding down either of two “slippery slopes”. Optimal solutions are computed as a function of two key parameters: (1) the cost of resisting the underlying uncontrolled dynamics and (2) the discount rate. Analytical expressions are derived for bifurcation lines separating regions where it is optimal to fight to stay balanced, to give in to the attraction of the “left” or the “right”, or to decide based on one’s initial state. The latter case includes situations both with and without so-called Dechert-Nishimura-Skiba (DNS) points defining optimal solution strategies. The model is unusual for having two DNS points in a one-state model, having a single DNS point that bifurcates into two DNS points, and for the ability to explicitly graph regions within which DNS points occur in the 2-D parameter space. The latter helps give intuition and insight concerning conditions under which these interesting points occur.
Application of classification methods in the mobile telecommunication sector
Tough competition in the wireless industry on the domestic market forces wireless carriers to look for methods that would help them manage customer relationships (CRM) ever more effectively. Acquiring new customers, growing their value, maintaining their life cycle by offering cross-sell and up-sell products and services and, in particular, controlling churn require, besides regular marketing tools, some alternative quantitative techniques.
Here two classification techniques are applied to real data sets in order to build a predictive model identifying potential churners. The performance of C5.0, a commercial implementation of the decision tree algorithm C4.5, and of CART, classification and regression trees, is evaluated in predicting the targeted class over two consecutive time periods. While some recent experiments with these two techniques showed that C5.0 somewhat outperformed CART in accuracy when validated on a withheld test data set, CART seems to be more accurate over time, during which distributional changes in the underlying population may occur. The hypothesis formulated here is whether a combination of both techniques may lead to improved accuracy over time.
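The splitting mechanism underlying tree learners such as CART can be sketched in a few lines. The toy example below finds the single threshold on one numeric feature that minimises weighted Gini impurity; it is not the actual C5.0 or CART implementation, and the usage feature and churn labels are invented.

```python
# Minimal sketch of CART-style splitting: choose the threshold on a single
# numeric feature that minimises weighted Gini impurity. Real churn models
# use many features and recursive splitting.

def gini(labels):
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n               # fraction of churners (label 1)
    return 2 * p * (1 - p)

def best_split(xs, ys):
    """Return (threshold, impurity) of the best binary split x <= t."""
    best = (None, float("inf"))
    for t in sorted(set(xs))[:-1]:
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        w = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if w < best[1]:
            best = (t, w)
    return best

# Toy data: a monthly-usage feature and a churn label (1 = churned).
usage = [5, 8, 10, 40, 45, 50, 55, 60]
churn = [1, 1, 1, 0, 0, 0, 0, 0]
t, imp = best_split(usage, churn)
print(t, imp)   # threshold 10 separates the classes perfectly, impurity 0
```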
The Dynamics of an optimal control model to reduce costs of drug addiction
This paper presents a control model which studies optimal spending for drug substitution programmes given that drug addiction causes economic costs to society. Two types of costs are considered: First, there are social costs resulting from individuals being addicted to drugs. Second, there exist additional costs due to injection drug addicts being vulnerable to infections like hepatitis C or HIV. The analysis of the model demonstrates that the long-run equilibrium is not necessarily unique. Instead, there may be multiple equilibria. Which of these equilibria is optimal depends on the initial conditions as to the number of drug addicts and as to the number of infected.
The possibilities of sectorial analyses
National accounting is a rich source of information about the economic results of a national economy. One possible way to use this information is to analyse the data appearing in the accounts of institutional sectors. Approaches describing the economic behaviour of sectors with the help of relevant indicators appear only seldom. The aim of the submitted contribution is to show how the data in sectorial accounts can be used and, thus, how to bring the meaning of national accounts closer to the user.
Analysis of interval-censored data
Interval-censored data are a form of incomplete observation of survival time. Interval censoring describes a situation where a subject's survival time is known only to lie between two values. The term survival time denotes the time to the occurrence of an event. The Cox regression model, which is commonly used to analyze survival data, assumes right-censored data. The problem of interval-censored data is solved by means of a regression model with a binary dependent variable. A linear model for the complementary log-log transformation of the conditional probability of an occurrence in the j-th interval is used, that is, a generalized linear model with the complementary log-log link function. The model can be approximated by the logistic regression model. The aim of the article is to investigate the influence of using interval-censored data instead of "exact" times on parameter estimation.
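The complementary log-log link described above can be sketched directly: the conditional probability of the event in interval j is p_j = 1 - exp(-exp(eta_j)), where eta_j is the linear predictor, and survival to interval j is the product of (1 - p_k). The predictor values below are hypothetical, not estimates.

```python
import math

# Sketch of the complementary log-log link used for interval-censored
# (discrete-time) survival models. cloglog() is the link, inv_cloglog()
# its inverse; etas are hypothetical per-interval linear predictors.

def inv_cloglog(eta):
    return 1.0 - math.exp(-math.exp(eta))

def cloglog(p):
    return math.log(-math.log(1.0 - p))

# Survival over the first three intervals: S = prod (1 - p_j).
etas = [-2.0, -1.5, -1.0]
surv = 1.0
for eta in etas:
    surv *= 1.0 - inv_cloglog(eta)
print(surv)   # probability of surviving all three intervals
```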
The Usage of Coherence in Statistical Practice of the Czech Republic
The authors present an algorithm for making a flash estimate of the quarterly change in gross domestic product at constant prices in the conditions of the Czech Republic. This algorithm is based on monthly/quarterly statistics on the output of individual branches, i.e. on data measuring trends in structured domestic supply. Estimates produced by the Czech Statistical Office and covering the six preceding quarters make up the part of history on which the flash estimate is built. It depends therefore on both the stability of the methodology used for compiling quarterly national accounts and the perfection of its application. The authors assume that the innovated algorithm should be accessible to experts as well as the general public, and that it can serve them to arrive at well-founded rough information on the total q-o-q change in GDP at constant prices a month or so before the CZSO official estimate is released.
Models for the Prediction of Enterprise Survival
This paper deals with the prediction of the survival of Slovak enterprises utilizing various econometric models. The total sample of investigated enterprises is divided into the group of existing enterprises and the group of defunct enterprises. The financial situation of Slovak enterprises is described using various financial indicators which were chosen by experts for the prediction of enterprise prosperity, e.g. the Tamari index, Altman's z-score, the „bonity“ index, etc. The financial indicators are used in the models as independent variables. The suitability of the methods used is assessed.
abstract not available
One-input power production functions of Slovak construction companies
The author estimates production functions for the construction sector in Slovakia. Real data from the year 2001 (output: value added (in millions of Slovak crowns); inputs: total assets (in millions of Slovak crowns) and average number of employees) show that private construction companies follow a one-input power production function.
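A one-input power production function Q = a·K^b is usually estimated by ordinary least squares on logs, since ln Q = ln a + b·ln K is linear. The sketch below recovers the parameters from synthetic data; it does not use the 2001 construction-sector data.

```python
import math

# Estimating a one-input power production function Q = a * K^b by OLS on
# logs: ln Q = ln a + b ln K. The data are synthetic and exactly power-form,
# so the fit recovers the true parameters.

def fit_power(K, Q):
    x = [math.log(k) for k in K]
    y = [math.log(q) for q in Q]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = math.exp(my - b * mx)
    return a, b

K = [10, 50, 100, 500, 1000]          # input (e.g. assets)
Q = [2 * k ** 0.7 for k in K]         # output generated as Q = 2 * K^0.7
a, b = fit_power(K, Q)
print(a, b)   # ~2.0, ~0.7
```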
Tax equity in the Polish fiscal system
The aim of this paper is to analyse the range of tax and tax-related burdens in the Polish fiscal system in relation to the level of earned income. This analysis enables us to verify the interpretation of tax equity implicitly resulting from the actual law regulations. Although the Polish fiscal system, especially seen from the perspective of the income tax, is said to be significantly progressive, a detailed analysis of the most important types of public-law burdens suggests that the system is almost “flat”, which would imply that the rate of taxes paid is independent of the level of income.
Development of public service quality measurement scale
Research in social and economic sciences is often based on the measurement of latent variables. Service quality in the public sector can be considered either from the operational or from the client-perception point of view. In the latter case, service quality is a latent variable that can be measured through the application of an adequate measurement scale. Scale development is the process of defining the measured concept, generating scale items in a specific format, collecting data by administering the scale to a development sample, and assessing dimensionality, reliability and validity through statistical analyses.
The paper presents results of factor analysis, reliability analysis and validity analysis of public service quality measurement among students of the Wroclaw University of Economics, who evaluated the quality of different state and local government offices. Two main dimensions of public service quality were identified – functional aspects and personal aspects. The initial scale contained 46 items in Likert format and 23 items in semantic differential format. For practical purposes, in subsequent measurements of public service quality the initial scale was purified, i.e. items only slightly correlated with the latent variable were deleted. Properties of the purified scale were assessed in a new statistical analysis.
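The reliability analysis mentioned above is commonly based on Cronbach's alpha, computed from item variances and the variance of respondent totals. A minimal sketch on toy 5-point Likert responses (not the study's data):

```python
# Cronbach's alpha for a small set of Likert items: alpha = k/(k-1) *
# (1 - sum of item variances / variance of total scores). Toy data only.

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one list of respondent scores per item (columns of the data)."""
    k = len(items)
    totals = [sum(resp) for resp in zip(*items)]
    item_var = sum(variance(col) for col in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

items = [
    [4, 5, 3, 4, 2, 5],   # item 1
    [4, 4, 3, 5, 2, 5],   # item 2, correlates strongly with item 1
    [3, 5, 2, 4, 1, 5],   # item 3
]
alpha = cronbach_alpha(items)
print(alpha)   # high: the items move together
```

In scale purification, items whose removal raises alpha (i.e. items weakly correlated with the rest) are candidates for deletion.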
Possibilities of educational potential quantification
In the paper some basic theoretical concepts for the possible quantification of the educational potential of a population are discussed. The synergic effect of various parameters characterizing the population is stressed. Some possible verifications of the educational potential are discussed with regard to several economic parameters as well. We also deal with the analogy of using some statistical and mathematical methods for the quantification of educational potential and of other qualitative parameters describing the state of a population.
Application of time series analysis in hotel management
Methods of time series analysis and specific features of time series in hotels are described. Methods that seem suitable for application in hotel management are recommended for this sector. Possibilities of applying these methods with the help of statistical software are mentioned and discussed. We present a tool we have prepared in MS Excel that allows hotel managers to investigate time series with seasonality, as well as some results of our analysis of the hotel sector in Slovakia.
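The seasonality investigation described above can be sketched with simple multiplicative seasonal indices: the mean of each quarter divided by the overall mean. The occupancy-like figures are invented, and a full analysis would first remove trend (e.g. by ratio-to-moving-average).

```python
# Simple multiplicative seasonal indices for a quarterly series: mean of
# each quarter divided by the overall mean. Toy data with a Q3 peak.

def seasonal_indices(series, period):
    overall = sum(series) / len(series)
    idx = []
    for s in range(period):
        vals = series[s::period]
        idx.append((sum(vals) / len(vals)) / overall)
    return idx

occupancy = [50, 60, 90, 40,   52, 62, 94, 44,   54, 64, 98, 46]  # 3 years
idx = seasonal_indices(occupancy, 4)
print(idx)   # Q3 index well above 1, Q4 well below
```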
The Model of Demand for Substitute Durable Goods and Its Stability
In the paper we construct a model of demand for substitute (i.e. mutually competitive) durable goods at a certain market. The problems of durable goods are encountered mainly in relation to the services which accompany the consumption. The main feature of demand for durable goods is the fact that it forms a nondecreasing function of time. The assumption of nondecreasing demand is supported also by the different types of loyalty systems utilized by firms, and by their other activities to keep a permanent clientele of customers.
The model forms a nonlinear system of n differential equations. It takes into account the factors as price, quality, advertising and market capacity. The output of the model is a vector function of demand dependent on time.
Further, we analyze the stability of equilibrium solutions of the model and give their economic interpretations. An equilibrium solution represents the state of the competitive environment in which the market sizes of the substitute goods do not change in time.
Regression methods for survival data
Survival time is defined as the time to the occurrence of a given event and this type of data can include various medical variables as well as economical, marketing or sociological data.
In the paper, parametric and non-parametric approaches are formulated and compared with respect to their properties for different types of data (censoring, distributional assumptions, etc.). Identification of factors that influence the occurrence of the event, or of covariates that are important for the prognosis of survival time, will be of interest, and regression analysis procedures (which can contain both qualitative and quantitative variables) will be used in order to find the most relevant independent variables.
Stochastic interest rate models in life insurance
In this paper we consider some models of the interest rate in life insurance. Typically, for simplicity, it is assumed that the underlying interest rate is fixed and the same for all years. However, the interest rate that will apply in future years is of course neither known nor constant. We describe three cases. In the first we assume that annual rates of interest are independent random variables with common mean and variance. The following two cases take the force of interest to be an autoregressive process and an arithmetic Brownian motion, respectively. The primary aim is the estimation of parameters on the basis of real data. We use our results to calculate the expected value and variance of the present value of a benefit in life insurance. Finally, we compare the proposed methods.
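The first case can be sketched by Monte Carlo: with i.i.d. annual rates of common mean and variance, simulate the discounted value of a unit benefit paid after n years and estimate its mean and variance. The rate parameters are illustrative, not taken from real data.

```python
import random

# Monte Carlo sketch: annual interest rates are i.i.d. normal with given
# mean and standard deviation; we estimate mean and variance of the present
# value of a unit benefit paid after n_years. Parameters are illustrative.

def simulate_pv(n_years, mean_rate, sd_rate, n_paths, seed=1):
    rng = random.Random(seed)
    pvs = []
    for _ in range(n_paths):
        discount = 1.0
        for _ in range(n_years):
            rate = rng.gauss(mean_rate, sd_rate)
            discount /= 1.0 + rate
        pvs.append(discount)
    m = sum(pvs) / len(pvs)
    var = sum((p - m) ** 2 for p in pvs) / (len(pvs) - 1)
    return m, var

mean_pv, var_pv = simulate_pv(n_years=10, mean_rate=0.04, sd_rate=0.01,
                              n_paths=20000)
print(mean_pv, var_pv)   # mean close to (1.04)**-10, small positive variance
```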
Forecasts of Czech Export Time Series
I will demonstrate in my article the relationship between the time series of Czech imports into European Union countries and the exchange rate of the Czech crown against the euro. The dependency of imports on the exchange rate will be described by a transfer function model of the form
Yt = [ω(B)B^b / δ(B)] Xt + Nt,
where the output series Yt is the series of imports (probably simply and seasonally differenced) and the input series Xt is the series of the exchange rate of the Czech crown (probably simply differenced). A time lag b is assumed in the model; this is the time that elapses before an impulse in the input variable produces an effect on the output variable. The values of the output series are contaminated by the noise series Nt in the model. I use the SCA statistical package for my computations. Finally, the future values of the export time series will be forecasted using the constructed models.
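The time lag b in such a model can be illustrated with a toy cross-correlation search: shift the input series and pick the lag with the strongest correlation with the output. The data are invented, and a full Box-Jenkins analysis would also model the noise series Nt.

```python
# Toy illustration of identifying the time lag b in a transfer function
# relationship: choose the shift of the input with the highest absolute
# correlation with the output.

def corr(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb)

def best_lag(x, y, max_lag):
    """Lag b maximising |corr(x_{t-b}, y_t)|."""
    scores = {}
    for b in range(max_lag + 1):
        xs, ys = x[: len(x) - b], y[b:]
        scores[b] = abs(corr(xs, ys))
    return max(scores, key=scores.get)

x = [1, 3, 2, 5, 4, 6, 5, 8, 7, 9]          # input (e.g. exchange rate)
y = [0, 0, 2, 6, 4, 10, 8, 12, 10, 16]      # output reacts with lag 2
print(best_lag(x, y, max_lag=4))            # 2
```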
Risk Management in a Defined-Benefit Pension Plan with Amortized Actuarial Gains/Losses
The subject matter of this article is a stochastic investment model for a defined benefit pension plan in which actuarial gains/losses are amortized over a period of m years. The rates of investment return on pension fund assets are represented by a stationary AR(1) process. Two types of risk are identified, the “contribution rate risk” and the “solvency risk”, which are concerned with the stability of the contributions and the security of the pension fund, respectively. A performance criterion is introduced to deal with the simultaneous minimization of these two risks, using the amortization period m as the control variable. A numerical investigation of the optimal values of m is provided. The results lead to practical conclusions about the optimal funding strategy and, hence, about the optimal choice of the contribution rate subject to the constraints needed for the convergence of the performance criterion.
Life insurance with stochastic interest rates
The paper deals with valuation in life insurance and with the principles of its modelling. Applications of the bond price approximation in life insurance and the results of some computations will be presented.
Optimality of life insurance
Assuming the standard axioms of rational behavior, some conclusions are derived concerning purchases of life insurance and annuities. These known results are critically examined by comparing different attitudes towards risk exhibited by purchasers who desire to reduce their risk to life and limb as well as to protect their dependents or families from financial perturbations.
The paper presents a critical overview of general problems of health economics. Assessment of quality of life from the health perspective is one of the particular subjects of special interest.
After short historical remarks, two widely used measures are discussed, namely QALY (Quality Adjusted Life Years) and PYLL (Potential Years of Life Lost).
These two measures are illustrated by means of real data from the northern part of Poland.
Calculation of the option price with Markov switching in the financial market
We analyse capital invested in shares and in risk-free bonds. Part of the capital is transferred from shares to bonds and vice versa at random time moments. This process is described by a system of impulse equations with a small parameter and with some conditions. Then the diffusion approximation proposed in my thesis can be applied. We obtain that the share price converges to the solution of a diffusion equation of Ornstein-Uhlenbeck type. One can then use the results proposed by Black and Scholes for the calculation of the price of a European call option. The price of the option as a function of several parameters is obtained. This dependence is analysed for the optimisation of the call option process. Some numerical examples are considered as well.
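The Black-Scholes closed form for the European call price, to which such diffusion approximations reduce, is standard; a minimal sketch with illustrative parameter values:

```python
import math

# Black-Scholes price of a European call option:
# C = S N(d1) - K exp(-rT) N(d2), with N the standard normal CDF.

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, r, sigma, T):
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

price = bs_call(S=100, K=100, r=0.05, sigma=0.2, T=1.0)
print(price)   # about 10.45
```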
Logit Models with Ordinal Dependent Variables
To explain the variation in the variable of interest, regression-type statistical models can be used that predict the expected value of the dependent variable as a regression function of independent variables. When we treat categorical variables as dependent variables, special methods are required, because classical regression analysis is simply inapplicable. Logit models with a binary dependent variable (logistic regression models) are relatively well known. This contribution deals with logit models with an ordinal dependent variable.
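A common ordinal logit specification is the cumulative (proportional-odds) model, P(Y <= j | x) = 1 / (1 + exp(-(theta_j - beta·x))), from which category probabilities are obtained by differencing. The thresholds and slope below are illustrative values, not estimates.

```python
import math

# Sketch of the cumulative (proportional-odds) logit model for an ordinal
# dependent variable with K = 4 ordered categories.

def logistic(z):
    return 1.0 / (1.0 + math.exp(-z))

def category_probs(x, thresholds, beta):
    """Probabilities of the ordered categories 1..K for predictor value x."""
    cum = [logistic(t - beta * x) for t in thresholds] + [1.0]
    return [cum[0]] + [cum[j] - cum[j - 1] for j in range(1, len(cum))]

thresholds = [-1.0, 0.5, 2.0]   # K - 1 = 3 increasing thresholds
beta = 0.8
p = category_probs(x=1.0, thresholds=thresholds, beta=beta)
print(p)                        # four probabilities summing to 1
```

Because the thresholds are increasing and the logistic function is monotone, the differenced probabilities are all positive and sum to one.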
Special electronic communication tools
The process of economic globalisation is a transfer of production from one's own domestic production to a foreign market. The strategic goal of that process is the creation of a highly effective, socially oriented economic system. It must be able to develop itself and to motivate an increase in human activity, which is the main factor in improving the population's material prosperity, as well as to create the necessary conditions for integration into the world economic community. The usage of special electronic communication tools creates a possibility for more detailed and flexible monitoring of a company's activities, gives space for the firm's presentation and assists in the efficient management of enterprises. This paper describes ways to achieve success in the process of building and restructuring an enterprise strategy by using new information technologies.
The geometry of dynamic bifurcations
Dynamical systems with slowly varying parameters are a class of systems of considerable practical importance. In many circumstances the assumption of a slowly varying parameter corresponding to a slowly changing environment or experimental situation is more natural than the quasistatic variation of a bifurcation parameter. For ordinary differential equations, in the simplest possible case this leads to problems of the form
x' = f(x, µ), (1)
to which the equation
µ' = ε (2)
is appended with ε > 0 small. The dynamics of (1), (2) can be surprisingly different from the bifurcation behavior of the quasistatic problem (1) for ε = 0, as in some cases the effects of a bifurcation in (1), (2) become noticeable only after a certain time delay. The corresponding phenomena which occur in this context are often referred to as dynamic bifurcations.
We give an overview of the cases where f in (1) has a fold, transcritical, or pitchfork singularity, as those are the least degenerate situations possible, as well as of the case when (1) undergoes a Hopf bifurcation. Our principal tool is the blow-up technique, which allows a detailed geometric analysis of such problems.
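The delay phenomenon described above can be illustrated with a minimal Euler simulation of the slow-fast system (1)-(2), here using the transcritical case f(x, µ) = µx - x² as an illustrative choice; step sizes and parameters are invented.

```python
# Euler simulation of x' = mu*x - x**2, mu' = eps. For the quasistatic
# problem the stable branch switches at mu = 0; with slowly drifting mu the
# solution stays near x = 0 well past mu = 0 -- the delay characteristic of
# dynamic bifurcations.

def simulate(eps, x0=0.01, mu0=-1.0, dt=0.001, t_max=3000.0):
    """Return the mu-value at which x first exceeds 0.5, or None."""
    x, mu, t = x0, mu0, 0.0
    while t < t_max:
        x += dt * (mu * x - x * x)
        mu += dt * eps
        t += dt
        if x > 0.5:
            return mu
    return None

mu_exit = simulate(eps=0.01)
print(mu_exit)   # noticeably larger than 0: delayed loss of stability
```

The exit value of µ stays bounded away from 0 even as ε shrinks, which is exactly the "certain time delay" after the bifurcation mentioned above.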
abstract not available
Actuarial Methods of Funding for Supplementary Pension Insurance
Evaluation of pension schemes is one of the chief tasks of an actuary in all European Union member countries. In these calculations the pension scheme is treated, on the one hand, as a succession of incomes, i.e. contributions to the fund together with investment yields, and on the other hand, on the side of expenditures, as a succession of paid claims and costs of the given pension scheme. These most recent methods of funding and procedures of calculation will also have to be dealt with by our supplementary pension insurance. There are four most frequently applied methods: the age-at-entry method, the age-at-maturity method, the method of the projected unit, and the method of the present unit.
Fluctuations in an open economy model with a two-dimensional parameter
In the article “Fluctuations in an open economy model under fixed exchange rate regime”, presented in 2002, a dynamic model of an open economy with a one-dimensional parameter was solved. The questions of the existence and stability of an equilibrium and the existence of business cycles were analyzed. In this paper the model is solved with a two-dimensional parameter.
Clustering Textual Documents
Textual documents are characterized by terms. For statistical analysis, we can create an input data matrix in which rows correspond to individual documents and columns contain values concerning individual terms. These values can be binary (a certain term is absent or present in a certain document) or they can reflect frequencies of the occurrence of terms. In the second case, we can either use the frequencies TFij (the frequency of the term tj in the document Di) directly, or we can compute weights wij. The weight can be expressed as a product of the values TFij and IDFj, where IDFj is an inverse frequency of the term in all documents. For the use of clustering methods, we have to consider which type of data is to be analyzed. The data can be binary, integer values or decimal values in the interval from 0 to 1. In the first case, special techniques can be used in statistical packages, for example monothetic cluster analysis and special dissimilarity or similarity measures in hierarchical cluster analysis and multidimensional scaling. For frequencies, we can use a chi-square dissimilarity measure. Further, we should consider that documents cannot always be assigned to groups directly: a document can contain terms which can be assigned to more than one group. In this case, fuzzy cluster analysis can be useful.
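The TF-IDF weighting described above can be sketched in a few lines, with IDFj taken as log(N / dfj) for N documents (one common variant among several). The toy documents are invented.

```python
import math

# TF-IDF weights w_ij = TF_ij * IDF_j, with IDF_j = log(N / df_j), where
# df_j is the number of documents containing term j.

docs = [
    ["price", "market", "demand"],
    ["market", "model", "demand", "demand"],
    ["model", "estimation"],
]

N = len(docs)
vocab = sorted({t for d in docs for t in d})
df = {t: sum(t in d for d in docs) for t in vocab}

def tfidf(doc):
    return {t: doc.count(t) * math.log(N / df[t]) for t in set(doc)}

weights = [tfidf(d) for d in docs]
print(weights[1])   # "demand" occurs twice, so it outweighs "market"
```

The resulting rows of weights are exactly the kind of decimal-valued data matrix, with entries scaled per term, to which the clustering methods above apply.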
Neural Networks for Economic Data Prediction
It is true to say that regional prediction and planning have become very important recently. Researchers tend to apply different methods to analyze social and economic processes. In many practical applications, like the selection of a global optimal supervisory control schema and the reconstruction of missing data, Neural Network modelling appears to be a powerful method.
In our study, the application of different types of Neural Networks to economic and demographic factors was examined. An automated program which includes a Neural Network simulator was created. Different types of Neural Networks were tried in order to obtain a network with better generalization, such as General Regression and Polynomial Networks. A method of combining Neural Networks and Genetic Algorithms in order to work with short data series was proposed.
Our work presents a methodology of economic and demographic factor analysis. The methodology consists of the following steps:
1. Definition of examined factors and data preanalysis which includes data representation inside the range [0,1] and correlation testing.
2. Determining the most significant parameters inside object region.
3. Creating Neural Networks for process approximation and forecasting.
4. Building up a modular Neural Network in order to predict parameters and realizing forecasts.
5. Checking the adequacy of the generated models.
Genetic Algorithms were used in order to choose an optimal structure of Neural Networks.
Some economic and demographic parameters for Russia and Slovakia were calculated and compared. To sum up the outcomes, we conclude that our system gives good results for a small search space.
In future work we will try to compare demographic status for different territories and add some additional advantages to create better Neural Network models.
Technical Analysis of Stock Prices
Technical analysis (TA) of stock prices serves for the generation of trading signals of the type BUY or SELL or HOLD. It is based on the use of either price or volume indicators, which are generally dependent on present and past prices and trade volumes. The purpose of this contribution is to compare the efficiency of the most used price indicators from the point of view of the number of trade signals, the rough profit obtained and the indicator parameters. Input data are created by the “blue chip” prices at the Prague Stock Exchange. The possibility of the use of neural networks is discussed.
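One of the simplest price indicators of this kind is a moving-average crossover: a BUY signal when the short moving average crosses above the long one, a SELL signal on the opposite crossing, and HOLD otherwise. The window lengths and prices below are illustrative.

```python
# Moving-average crossover indicator generating BUY / SELL / HOLD signals.

def sma(prices, window, t):
    """Simple moving average of the `window` prices ending at index t."""
    return sum(prices[t - window + 1 : t + 1]) / window

def crossover_signal(prices, short=3, long=5):
    """Signal at the last time point: BUY if the short MA crosses above the
    long MA, SELL if it crosses below, HOLD otherwise."""
    t = len(prices) - 1
    prev = sma(prices, short, t - 1) - sma(prices, long, t - 1)
    curr = sma(prices, short, t) - sma(prices, long, t)
    if prev <= 0 < curr:
        return "BUY"
    if prev >= 0 > curr:
        return "SELL"
    return "HOLD"

prices = [100, 99, 98, 97, 96, 97, 99, 102]   # downtrend, then a rebound
print(crossover_signal(prices))               # BUY
```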
Learning in Heterogeneous Agent Model with the WOA
The Efficient Markets Hypothesis provides a theoretical basis for evaluating trading rules. Technical trading rules provide the user with a signal of when to buy or sell an asset based on price patterns. Technical traders tend to put little faith in strict market efficiency. Fundamentalists rely on a model employing fundamental information to forecast the price in the next period. The traders determine whether current conditions call for the acquisition of fundamental information in a forward-looking manner rather than relying on past performance. This approach relies on heterogeneity in the agent information and subsequent decisions either as fundamentalists or as chartists. It was shown that implementation of the learning-agents process can significantly change the preferences of trader strategies. The Worst Out Algorithm (WOA) is used with this heterogeneous agent model to simulate more realistic market conditions. After every i iterations the WOA replaces the worst performing trading strategy (belief type) with a new one.
This paper shows an influence of the learning agents process on a heterogeneous agent model with the WOA.
Using Kaplan-Meier methodology and Cox regression in colon cancer analysis
We were asked to help with the statistical evaluation of a new operative procedure in colon cancer surgery. This new procedure was applied by the surgical team of Thomayer's Hospital in Krc, Prague. The main aim was hypothesis testing of two possible procedures. The first procedure is the "old" one; the second is new and has been used for only four years. We tested whether a statistically significant difference exists between these two procedures and whether the new one is better than the old. We assumed, of course, that a patient operated on by the new procedure lives longer than a patient operated on by the old one. We used the Kaplan-Meier methodology and Cox regression for our statistical descriptions, testing and decisions.
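The Kaplan-Meier estimator itself is compact: at each event time t the survival curve is multiplied by (1 - d/n), where d is the number of deaths at t and n the number at risk just before t, with censored subjects leaving the risk set. The toy survival times below are invented.

```python
# Kaplan-Meier estimate of the survival function. times are follow-up
# times; events[i] = 1 means the event (death) occurred, 0 means censored.
# Censored observations at time t leave the risk set after the events at t.

def kaplan_meier(times, events):
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(1 for tt, e in data if tt == t and e == 1)
        c = sum(1 for tt, e in data if tt == t and e == 0)
        if d > 0:
            surv *= 1.0 - d / n_at_risk
            curve.append((t, surv))
        n_at_risk -= d + c
        i += d + c
    return curve

times  = [3, 5, 5, 8, 10, 12, 12, 15]
events = [1, 1, 0, 1, 0,  1,  1,  0]
for t, s in kaplan_meier(times, events):
    print(t, round(s, 3))
```

Comparing two such curves (one per surgical procedure) is then done formally with a log-rank test or a Cox regression with the procedure as a covariate.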
Density preserving decoder in evolutionary optimization
Evolutionary algorithms are a class of multi-dimensional global optimization methods inspired by natural evolution. Their search is naturally limited to "box" feasible spaces defined by lower and upper bounds for each variable. To handle other types of constraints, several methods have been devised in the literature, one of them being the method of decoders, which transforms the vector of variables before each objective function evaluation. In this paper, it is stressed that a well-designed decoder should not prefer any part of the search space over any other. A transformation formula for such a density-preserving decoder is derived for a constraint met in portfolio optimization. The decoder is then compared to two other decoders lacking the density-preserving property. Comparisons performed from three different viewpoints show a decisive advantage for the density-preserving decoder. The decoder is also applicable with other optimization methods.
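The paper's own transformation formula is not reproduced in the abstract. As a hedged illustration of the density-preserving idea for the classic portfolio constraint (non-negative weights summing to one), the well-known exponential-spacings map sends a uniform point of the unit box to a uniform point of the simplex:

```python
import math

def simplex_decoder(u):
    """Map a point u in the open unit box (0,1)^n to portfolio weights.

    If u is uniform on the box, the output is uniform on the simplex
    {w : w_i >= 0, sum(w) = 1}, because -log(u_i) is Exp(1) distributed
    and normalised Exp(1) variables are Dirichlet(1,...,1), i.e. uniform.
    No region of the feasible set is preferred over another.
    """
    e = [-math.log(ui) for ui in u]
    total = sum(e)
    return [ei / total for ei in e]

w = simplex_decoder([0.5, 0.25, 0.9])
# w is a valid weight vector: non-negative entries summing to one
```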
Classical and Bayesian Experimental Design Methods – Similarities and Differences
The purpose of experimental design methods is to achieve optimal model parameter estimates through the proper setting of experiment conditions. Bayesian experimental design methods can improve the effectiveness of these estimates by incorporating prior knowledge about the model parameters into the design-building procedure. It turns out that using prior knowledge can significantly change the optimal experiment setup as well as the optimal number of trials.
The paper presents differences and similarities between the classical and Bayesian approaches with respect to the design problem definition, the definitions of optimality criteria, and the dependence of optimal discrete designs on the number of experiments performed. In particular, we show the decision-theoretic approach to Bayesian experimental design and how choosing particular loss functions leads to optimality criteria corresponding to the classical ones. We also present some numerical methods used to find optimal experimental designs.
The use of Monte Carlo simulations in evaluation of geographic properties of survey designs
The paper investigates the possibilities of using Monte Carlo simulation techniques to evaluate geographic properties of survey designs. A two-stage stratified design with variations in stratification parameters will be analysed, and the empirical distribution of the number of municipalities to be visited will be developed.
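One way such a simulation can be set up (the sampling frame below is invented; the paper's actual design parameters are not given): draw first-stage units with probability proportional to size, with replacement, and record how many distinct municipalities each simulated run would visit.

```python
import random

def distinct_municipalities(sizes, draws, runs=5000, seed=1):
    """Monte Carlo distribution of the number of distinct municipalities
    visited when first-stage units are drawn with probability
    proportional to size, with replacement.

    sizes : hypothetical population sizes of the municipalities in one stratum
    draws : number of first-stage draws per simulated survey
    """
    rng = random.Random(seed)
    ids = range(len(sizes))
    results = []
    for _ in range(runs):
        selected = rng.choices(ids, weights=sizes, k=draws)
        results.append(len(set(selected)))
    return results

counts = distinct_municipalities([100, 400, 250, 50, 200], draws=3)
# each simulated survey visits between 1 and 3 distinct municipalities;
# the empirical frequencies of 1, 2 and 3 form the distribution of interest
```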
abstract not available
On the Possibilities of Regulation of Macroeconomic Processes in a Single Open Economy
A single open economy model describing the development of output, the exchange rate, the interest rate and the money supply is analyzed under alternative exchange rate regimes. Stress will be put on the existence of an equilibrium of this model, its stability, and the existence of business cycles.
Determinants of the Slovak Core Inflation
The central banks of the so-called Central European Countries (CEC) have clearly declared their aim to join the Eurozone as soon as possible after entering the European Union. The level of inflation is one of the Maastricht criteria that must be met. The contribution deals with the transmission mechanism of the Slovak central bank's monetary policy and describes possibilities of modeling core inflation by stepwise regression. The model of hidden periods is applied.
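A minimal pure-Python sketch of forward stepwise selection, the technique named above: at each step, regress the current residual on every remaining candidate series and keep the one that reduces the residual sum of squares the most. The paper's actual inflation regressors are not specified in the abstract, so the data below are simulated.

```python
import random

def forward_stepwise(xs, y, steps=2):
    """Forward stepwise selection via repeated simple regressions.

    xs    : list of candidate regressor series
    y     : response series
    steps : how many regressors to select
    Returns the indices of the selected regressors, in selection order.
    """
    n = len(y)
    resid = list(y)
    selected, remaining = [], list(range(len(xs)))
    for _ in range(steps):
        def sse_after(j, resid=resid):
            # residual sum of squares after a simple fit of resid on xs[j]
            x = xs[j]
            mx, mr = sum(x) / n, sum(resid) / n
            b = (sum((x[i] - mx) * (resid[i] - mr) for i in range(n))
                 / sum((v - mx) ** 2 for v in x))
            a = mr - b * mx
            return sum((resid[i] - a - b * x[i]) ** 2 for i in range(n))
        best = min(remaining, key=sse_after)
        # update the residual with the chosen simple regression
        x = xs[best]
        mx, mr = sum(x) / n, sum(resid) / n
        b = (sum((x[i] - mx) * (resid[i] - mr) for i in range(n))
             / sum((v - mx) ** 2 for v in x))
        a = mr - b * mx
        resid = [resid[i] - a - b * x[i] for i in range(n)]
        selected.append(best)
        remaining.remove(best)
    return selected

# Simulated data: y depends only on series 1 (strongly) and series 3 (weakly).
rng = random.Random(0)
n = 200
xs = [[rng.gauss(0, 1) for _ in range(n)] for _ in range(4)]
y = [2.0 * xs[1][i] + 0.5 * xs[3][i] + rng.gauss(0, 0.1) for i in range(n)]
order = forward_stepwise(xs, y)
# the two true drivers are picked, the strong one first
```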
Bayesian approach to the analysis of incomplete data
In many censuses and sample surveys some units do not respond to at least some of the items asked. These missing values not only mean less efficient estimates because of the reduced size of the data base, but also that standard complete-data methods cannot be immediately used to analyze the data. Moreover, possible biases exist because respondents are often systematically different from nonrespondents. To solve this problem, one can use the method of multiple imputation, which follows from Bayesian theory. We will consider the theoretical background of multiple imputation and the models used in this method.
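The pooling step of multiple imputation can be illustrated with Rubin's combining rules, which merge the m completed-data analyses into one estimate and one variance; the numbers in the example are invented.

```python
def rubin_combine(estimates, variances):
    """Combine m completed-data analyses via Rubin's rules.

    estimates : point estimates from each of the m imputed data sets
    variances : corresponding within-imputation variances
    Returns (pooled estimate, total variance), where the total variance
    is the within-imputation part plus (1 + 1/m) times the
    between-imputation part.
    """
    m = len(estimates)
    q_bar = sum(estimates) / m
    u_bar = sum(variances) / m                                # within
    b = sum((q - q_bar) ** 2 for q in estimates) / (m - 1)    # between
    total = u_bar + (1 + 1 / m) * b
    return q_bar, total

q, t = rubin_combine([10.2, 9.8, 10.0], [0.25, 0.30, 0.20])
# pooled estimate 10.0; total variance exceeds the average within-imputation
# variance because the imputations disagree
```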
 The loss of customers who switch from one carrier to another or quit wireless services.