Search results: 1-15 of 34 records found for "Entropy" under Management Science (query time: 0.187 s)
Entropy and random feedback
Entropy; random feedback
2015/7/10
The gamma-entropy is a convex function of matrices that is closely related to the Frobenius and spectral (maximum singular value) norms. It comes up in several applications such as central H-infinity ...
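The abstract cuts off before stating a definition. One common form, for a constant matrix M with singular values sigma_i below gamma, is -gamma^2 * sum_i log(1 - sigma_i^2 / gamma^2); whether this matches the paper's gamma-entropy exactly is an assumption, but the sketch below illustrates the stated relations: the quantity is finite only when gamma exceeds the spectral norm, and it approaches the squared Frobenius norm as gamma grows.

```python
# Hypothetical sketch: a gamma-entropy of a constant matrix computed from its singular
# values, under the assumed definition -gamma^2 * sum(log(1 - s_i^2 / gamma^2)).
# Finite only when gamma exceeds the spectral norm; for small s_i / gamma it
# approaches the squared Frobenius norm.
import numpy as np

def gamma_entropy(M: np.ndarray, gamma: float) -> float:
    s = np.linalg.svd(M, compute_uv=False)      # singular values
    if gamma <= s.max():                        # finite only above the spectral norm
        return np.inf
    return -gamma**2 * np.sum(np.log1p(-(s / gamma) ** 2))

M = np.array([[1.0, 0.2], [0.0, 0.5]])
print(gamma_entropy(M, gamma=2.0))              # finite: gamma > ||M||_2
print(np.linalg.norm(M, "fro") ** 2)            # squared Frobenius norm
print(gamma_entropy(M, gamma=100.0))            # ~ ||M||_F^2 as gamma grows
```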
Entropy and Mutual Information for Markov Channels with General Inputs
Entropy Mutual Information Markov Channels General Inputs
2015/7/8
We study new formulas based on Lyapunov exponents for entropy, mutual information, and capacity of finite state discrete time Markov channels. We also develop a method for directly computing mutual in...
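The Lyapunov-exponent formulas themselves are not reproduced here; as a much simpler point of reference, the sketch below estimates the entropy rate of a finite-state Markov source from one long realization via -(1/n) log p(x_1, ..., x_n) and checks it against the closed form sum_i pi_i H(P_i), with pi the stationary distribution. The two-state transition matrix is an arbitrary illustration.

```python
# Illustrative only: entropy rate of a finite-state Markov chain estimated from one
# long sample path as -(1/n) log p(x_1, ..., x_n); this is not the Lyapunov-exponent
# machinery of the paper, just the baseline quantity it computes.
import numpy as np

rng = np.random.default_rng(0)
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])                    # transition matrix (assumed example)
n = 200_000
x = np.empty(n, dtype=int)
x[0] = 0
for t in range(1, n):
    x[t] = rng.choice(2, p=P[x[t - 1]])

log_p = np.sum(np.log(P[x[:-1], x[1:]]))      # log-likelihood of the path given x[0]
print("empirical entropy rate:", -log_p / (n - 1))

# closed form: H = sum_i pi_i * H(P[i, :]) with pi the stationary distribution
eigval, eigvec = np.linalg.eig(P.T)
pi = np.real(eigvec[:, np.argmax(np.real(eigval))])
pi = pi / pi.sum()
print("exact entropy rate:", -np.sum(pi[:, None] * P * np.log(P)))
```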
A Comparison of Cross-Entropy and Variance Minimization Strategies
variance minimization; cross-entropy; importance sampling; rare-event simulation; likelihood ratio degeneracy
2015/7/6
The variance minimization (VM) and cross-entropy (CE) methods are two versatile adaptive importance sampling procedures that have been successfully applied to a wide variety of difficult rare-event es...
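Both methods search a parametric family of sampling densities for one that makes the likelihood-ratio estimator of a rare-event probability accurate. As a hedged baseline (not the VM or CE optimization itself), the sketch below shows that estimator for P(X > gamma) with X ~ Exp(1), using a hand-picked exponential tilt; the choice theta = 1/gamma is illustrative only.

```python
# Baseline likelihood-ratio importance sampling for ell = P(X > gamma), X ~ Exp(1).
# Both VM and CE would choose the tilting parameter within this family; here theta
# is fixed by hand just to show the estimator and its likelihood ratio.
import numpy as np

rng = np.random.default_rng(1)
gamma, n = 20.0, 100_000
theta = 1.0 / gamma                                   # a sensible (not optimized) tilt

x = rng.exponential(scale=1.0 / theta, size=n)        # sample from Exp(theta)
w = np.exp(-x) / (theta * np.exp(-theta * x))         # likelihood ratio f(x) / g(x)
est = np.mean((x > gamma) * w)

print("IS estimate:", est, " exact:", np.exp(-gamma))
print("relative error:", np.std((x > gamma) * w) / (np.sqrt(n) * est))
```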
The Cross-Entropy Method for Estimation
cross-entropy; estimation; rare events; importance sampling; adaptive Monte Carlo; zero-variance distribution
2015/7/6
This chapter describes how difficult statistical estimation problems can often be solved efficiently by means of the cross-entropy (CE) method. The CE method can be viewed as an adaptive importance sa...
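A minimal sketch of the CE loop for the same toy problem, P(X > gamma) with X ~ Exp(1): raise an intermediate level each iteration, keep the "elite" samples above it, and refit the sampling parameter by a likelihood-ratio-weighted mean (the CE update for the exponential family). The elite fraction rho, the iteration cap, and the sample sizes are illustrative and not taken from the chapter.

```python
# Minimal cross-entropy (CE) sketch for ell = P(X > gamma), X ~ Exp(1), tilting
# family Exp(mean v). For this family the CE update reduces to a likelihood-ratio-
# weighted mean of the elite samples.
import numpy as np

rng = np.random.default_rng(2)
gamma, n, rho = 20.0, 10_000, 0.1
v = 1.0                                                # start from the nominal Exp(1)

for _ in range(20):
    x = rng.exponential(scale=v, size=n)               # Exp with mean v
    level = min(gamma, np.quantile(x, 1.0 - rho))      # raise the level gradually
    elite = x[x >= level]
    w = np.exp(-elite) / (np.exp(-elite / v) / v)      # LR of Exp(1) w.r.t. Exp(mean v)
    v = np.sum(w * elite) / np.sum(w)                  # CE update: weighted mean
    if level >= gamma:
        break

x = rng.exponential(scale=v, size=10 * n)              # final estimate with fitted v
w = np.exp(-x) / (np.exp(-x / v) / v)
print("CE estimate:", np.mean((x > gamma) * w), " exact:", np.exp(-gamma))
```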
Range-Renewal Speed and Entropy for I.I.D Models
Range-Renewal Speed; Entropy; I.I.D. Models
2013/6/17
In this note the relation between the range-renewal speed and entropy for i.i.d. models is discussed.
Partial Transfer Entropy on Rank Vectors
Partial Transfer Entropy; Rank Vectors
2013/4/28
For the evaluation of information flow in bivariate time series, information measures have been employed, such as the transfer entropy (TE), the symbolic transfer entropy (STE), defined similarly to T...
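The rank-vector and partialized variants studied in the paper are not reproduced here; the sketch below is only the basic plug-in transfer entropy TE(X -> Y) = sum p(y_{t+1}, y_t, x_t) log[ p(y_{t+1} | y_t, x_t) / p(y_{t+1} | y_t) ] for two discrete-valued series with one-step conditioning, with a toy coupled pair of series as a sanity check.

```python
# Plug-in transfer entropy TE(X -> Y) for discrete series with one-step conditioning.
# This is the basic TE, not the rank-vector or partial variants studied in the paper.
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))     # (y_{t+1}, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))           # (y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))            # (y_{t+1}, y_t)
    single_y = Counter(y[:-1])                        # y_t
    n = len(x) - 1
    te = 0.0
    for (yn, yp, xp), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(yp, xp)]
        p_cond_y = pairs_yy[(yn, yp)] / single_y[yp]
        te += p_joint * np.log(p_cond_full / p_cond_y)
    return te

rng = np.random.default_rng(3)
x = rng.integers(0, 2, size=50_000)
y = np.empty_like(x)
y[0] = 0
for t in range(1, len(x)):
    y[t] = x[t - 1] if rng.random() < 0.8 else rng.integers(0, 2)   # Y driven by past X
print("TE(X->Y):", transfer_entropy(x, y))   # clearly positive
print("TE(Y->X):", transfer_entropy(y, x))   # near zero
```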
Testing Exponentiality Based on Rényi Entropy With Progressively Type-II Censored Data
Rényi entropy; hazard function; Monte Carlo simulation; order statistics; Type-II progressively censored data
2013/4/28
We express the joint Rényi entropy of progressively censored order statistics in terms of an incomplete integral of the hazard function, and provide a simple estimate of the joint Rényi entropy of...
Refinement revisited with connections to Bayes error, conditional entropy and calibrated classifiers
Refinement Score; Probability Elicitation; Calibrated Classifier; Bayes Error Bound; Conditional Entropy; Proper Loss
2013/4/27
The concept of refinement from probability elicitation is considered for proper scoring rules. Taking directions from the axioms of probability, refinement is further clarified using a Hilbert space i...
Statistical estimation of quadratic Rényi entropy for a stationary m-dependent sequence
Entropy estimation; quadratic Rényi entropy; stationary m-dependent sequence; inter-point distances; U-statistics
2013/4/27
The Rényi entropy is a generalization of the Shannon entropy and is widely used in mathematical statistics and applied sciences for quantifying the uncertainty in a probability distribution. We cons...
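The quadratic Rényi entropy is H_2 = -log ∫ f^2, and a standard route (presumably close in spirit to the estimator analyzed in the paper, though its m-dependent setting and asymptotics are not reproduced) plugs kernel evaluations of the inter-point distances into a U-statistic for ∫ f^2. The bandwidth and the i.i.d. Gaussian test case below are illustrative choices.

```python
# U-statistic estimate of q = integral of f^2 via a Gaussian kernel evaluated at
# inter-point distances, then H_2 = -log(q). The bandwidth h trades variance against
# smoothing bias; the i.i.d. N(0,1) sample is only a sanity check.
import numpy as np

rng = np.random.default_rng(4)
n, h = 2000, 0.5
x = rng.normal(size=n)

d = x[:, None] - x[None, :]                                  # all pairwise differences
k = np.exp(-0.5 * (d / h) ** 2) / (h * np.sqrt(2 * np.pi))   # kernel K_h(x_i - x_j)
q_hat = (k.sum() - np.trace(k)) / (n * (n - 1))              # average over i != j
print("H2 estimate:", -np.log(q_hat))
print("H2 exact for N(0,1):", 0.5 * np.log(4 * np.pi))       # -log(1 / (2*sqrt(pi)))
```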
Variance estimation and asymptotic confidence bands for the mean estimator of sampled functional data with high entropy unequal probability sampling designs
covariance function; finite population; Hájek approximation; Horvitz-Thompson estimator; Kullback-Leibler divergence; rejective sampling; unequal probability sampling without replacement
2012/11/23
For fixed size sampling designs with high entropy it is well known that the variance of the Horvitz-Thompson estimator can be approximated by the Hájek formula. The interest of this asymptotic varia...
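For orientation, the Horvitz-Thompson estimator of a total is sum_{k in s} y_k / pi_k, and the Hájek approximation replaces the exact variance (which needs joint inclusion probabilities) with an expression in the first-order probabilities alone. The sketch below uses one commonly cited form of that approximation and of its sample-based plug-in; the exact constants vary between references, and the sampling step is only a crude stand-in for a genuine high-entropy fixed-size design.

```python
# Sketch of the Horvitz-Thompson total and a Hajek-type variance approximation for a
# fixed-size, unequal-probability design. Only first-order inclusion probabilities
# pi_k are used; constants and the sampling scheme are assumed, illustrative choices.
import numpy as np

rng = np.random.default_rng(5)
N, n = 500, 50
size = rng.uniform(1, 10, N)                  # auxiliary "size" variable
pi = n * size / size.sum()                    # first-order inclusion probabilities
y = 3 * size + rng.normal(scale=2, size=N)    # study variable, roughly size-proportional

# Population-level Hajek approximation of Var(HT total)
c = pi * (1 - pi)
A = np.sum(c * y / pi) / np.sum(c)
print("Hajek variance approximation:", np.sum(c * (y / pi - A) ** 2))

# HT point estimate and a plug-in Hajek-type variance estimator from one sample
s = rng.choice(N, size=n, replace=False, p=pi / n)   # crude stand-in for a pi-ps design
ys, ps = y[s], pi[s]
cs = 1 - ps
A_hat = np.sum(cs * ys / ps) / np.sum(cs)
print("HT estimate of total:", np.sum(ys / ps), " true total:", y.sum())
print("estimated variance:", np.sum(cs * (ys / ps - A_hat) ** 2))
```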
Estimation of entropy-type integral functionals
U-statistics; estimation of divergence; density power divergence; asymptotic normality; entropy estimation; Rényi entropy
2012/11/22
Integrated powers of densities of one or two multidimensional random variables appear in a variety of problems in mathematical statistics, information theory, and computer science. We study U-statist...
Learning Theory Approach to Minimum Error Entropy Criterion
minimum error entropy; learning theory; Rényi's entropy; empirical risk minimization; approximation error
2012/9/18
We consider the minimum error entropy (MEE) criterion and an empirical risk minimization learning algorithm in a regression setting. A learning theory approach is presented for this MEE algorithm and ex...
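With Rényi's quadratic entropy and a Gaussian kernel, the empirical MEE criterion amounts to maximizing the information potential V(e) = (1/n^2) sum_{i,j} G_h(e_i - e_j) of the residuals; because the criterion is invariant to shifting all errors, the intercept has to be recovered separately. The sketch below is one plausible instantiation for a straight-line fit, not the paper's algorithm or analysis; the bandwidth, step size, and warm start are illustrative.

```python
# Minimum error entropy (MEE) refinement of a linear fit: minimize the empirical
# quadratic Renyi entropy of the residuals, i.e. maximize the information potential
# V(e) = (1/n^2) sum_ij G_h(e_i - e_j) with a Gaussian kernel G_h. Bandwidth, step
# size, and the least-squares warm start are illustrative choices.
import numpy as np

rng = np.random.default_rng(6)
n, h, lr = 300, 1.0, 0.01
x = rng.uniform(-3, 3, n)
y = 2.0 * x + 1.0 + rng.standard_t(df=3, size=n)     # heavy-tailed noise

w = np.polyfit(x, y, 1)[0]                           # warm start: least-squares slope
for _ in range(300):
    e = y - w * x
    d = e[:, None] - e[None, :]                      # pairwise residual differences
    g = np.exp(-d**2 / (2 * h**2))
    V = g.sum() / n**2                               # information potential
    dV_dw = np.sum(g * d * (x[:, None] - x[None, :])) / (h**2 * n**2)
    w += lr * dV_dw / V                              # gradient step on -log V
b = np.mean(y - w * x)                               # recover the intercept (MEE is shift-invariant)
print("MEE fit: slope", w, "intercept", b)
```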
Scaling of Model Approximation Errors and Expected Entropy Distances
Scaling of Model Approximation Errors; Expected Entropy Distances
2012/9/19
We compute the expected value of the Kullback-Leibler divergence to various fundamental statistical models with respect to canonical priors on the probability simplex. This yields information about th...
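As a toy instance of the quantity being averaged, the sketch below draws points from the canonical (uniform Dirichlet) prior on the probability simplex and computes the expected Kullback-Leibler divergence to the uniform distribution, for which a closed form is available; the models treated in the paper are more general than this single example.

```python
# Monte Carlo sketch: expected Kullback-Leibler divergence D(p || u) from a point p
# drawn uniformly (Dirichlet(1,...,1)) on the probability simplex to the uniform
# distribution u, as a toy instance of an "expected divergence to a model" quantity.
import numpy as np

rng = np.random.default_rng(7)
k, m = 4, 200_000                                   # alphabet size, Monte Carlo samples
p = rng.dirichlet(np.ones(k), size=m)
u = np.full(k, 1.0 / k)
kl = np.sum(p * np.log(p / u), axis=1)              # D(p || u) for each sample
print("E[D(p||u)] (Monte Carlo):", kl.mean())

# closed form for this toy case: log k - (H_k - 1), with H_k the k-th harmonic number
# (the expected Shannon entropy of a uniform Dirichlet point is H_k - 1, in nats)
H_k = np.sum(1.0 / np.arange(1, k + 1))
print("E[D(p||u)] (exact):", np.log(k) - (H_k - 1))
```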
Using Moving Average Method To Estimate Entropy In Testing Exponentiality For Type-II Censored Data
Entropy; Monte Carlo simulation; Kullback-Leibler distance; moving average method; hazard function
2012/9/19
In this paper, we introduce a modified test statistic by applying the moving average method and present a new cdf estimator to estimate the joint entropy of the type-II censored data. We also establish a...
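The modified, censoring-adapted statistic itself is not reproduced here. The classical building block behind such tests is the Vasicek spacings estimator of entropy with window m (a moving average over order statistics), compared against the entropy implied by a fitted exponential model; the sketch below shows that complete-data version, with m and the sample being illustrative choices.

```python
# Vasicek-type spacings estimator of Shannon entropy with window m, the classical
# ingredient of entropy-based exponentiality tests; the paper's modified,
# type-II-censored version is not reproduced here.
import numpy as np

def vasicek_entropy(sample, m):
    x = np.sort(sample)
    n = len(x)
    upper = x[np.minimum(np.arange(n) + m, n - 1)]   # X_(i+m), clipped at the maximum
    lower = x[np.maximum(np.arange(n) - m, 0)]       # X_(i-m), clipped at the minimum
    return np.mean(np.log(n * (upper - lower) / (2 * m)))

rng = np.random.default_rng(8)
x = rng.exponential(scale=2.0, size=400)
H_hat = vasicek_entropy(x, m=8)
H_exp = 1 + np.log(np.mean(x))        # entropy of the exponential with the fitted mean
print("spacings estimate:", H_hat, " exponential-model entropy:", H_exp)
# an exponentiality test compares these two quantities (e.g. via H_exp - H_hat)
```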