Results 1-10 of 10,635
Entropic measure on multidimensional spaces, preprint
Cited by 5 (0 self)
Abstract. We construct the entropic measure P^β on compact manifolds of any dimension. It is defined as the push-forward of the Dirichlet process (another random probability measure, well-known to exist on spaces of any dimension) under the conjugation map C: P(M) → P(M). This conjugation map is a ...
Entropic measures of individual mobility patterns
Cited by 1 (0 self)
Abstract. Understanding human mobility from a microscopic point of view may represent a fundamental breakthrough for the development of a statistical physics for cognitive systems, and it can shed light on the applicability of macroscopic statistical laws for social systems. Even if the complexity of individual behaviors prevents a true microscopic approach, the introduction of mesoscopic models allows the study of the dynamical properties for the non-stationary states of the considered system. We propose to compute various entropy measures of the individual mobility patterns obtained from GPS data ...
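A minimal sketch of the kind of entropy measure this abstract refers to: the Shannon entropy of an individual's visit frequencies over distinct locations. This assumes the GPS trajectory has already been discretized into labelled places; it is an illustration of the general idea, not the paper's specific estimators.

```python
import math
from collections import Counter

def visitation_entropy(locations):
    """Shannon entropy (in bits) of the visit frequencies over distinct
    locations in a discretized trajectory. Low entropy indicates a
    predictable, routine-bound individual; high entropy indicates visits
    spread evenly over many places."""
    counts = Counter(locations)
    n = len(locations)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A commuter alternating between two places carries 1 bit of uncertainty.
trajectory = ["home", "work", "home", "work", "home", "work"]
print(visitation_entropy(trajectory))  # 1.0
```

A trajectory concentrated on a single location scores 0 bits; spreading visits uniformly over k locations scores log2(k) bits, the maximum possible.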
ENTROPIC MEASURES FOR STUDENT ENROLMENTS AT UNISA
Abstract. Entropy originated in physics; it was introduced by Boltzmann in 1872 and expresses a degree of uncertainty or chaos in a system/subsystem. The versatility of the entropy concept has been proven across several fields of science: chemical engineering, electrical engineering, metallurgical engineering, communication ... at Unisa. Entropic computations revealed, preliminarily, the degree of uncertainty of enrolment; hence the information contained in enrolments did not vary appreciably from college to college. Similar results were also obtained for graduations as well as staff (HC). Around 16 to 17% degree of uncertainty ...
SAFEM: Scalable Analysis of Flows with Entropic Measures and SVM
2012
Abstract. This paper describes a new approach for the detection of large-scale anomalies or malicious events in Netflow records. This approach allows Internet operators, to whom botnets and spam are major threats, to detect large-scale distributed attacks. The prototype SAFEM (Scalable Analysis of Flows with Entropic Measures) uses spatial-temporal Netflow record aggregation and applies entropic measures to traffic. The aggregation scheme greatly reduces data storage, making such an approach viable in an Internet Service Provider network.
Entropic Measure and Wasserstein Diffusion
Abstract. We construct a new random probability measure on the sphere and on the unit interval which in both cases has a Gibbs structure with the relative entropy functional as Hamiltonian. It satisfies a quasi-invariance formula with respect to the action of smooth diffeomorphisms of the sphere and the interval ...
Entropic Measure of Freedom and Social Representation Principle
Abstract. The discussion about freedom in the economic literature is essentially about the measurement of the value of an opportunity set. The problem is to find a reasonable criterion that involves both objective and subjective considerations. Jones and Sugden's idea of “potential preferences” provides a cardinality criterion to work within a domain of relevant opportunities. It shapes an optimal conceptual structure, but the definition of the preferences that should be considered relevant is missing. This study proposes a new perspective on freedom that allows the entropic measure of freedom to be a satisfying ...
Entropic Measures of Mixing Tailored for Various Applications
Abstract. Mixing is an important component in most processing operations, including but not limited to polymer processing. Generically, mixing refers to the system's capability to reduce composition non-uniformity. Since entropy is the rigorous measure of disorder or system homogeneity, we will expl ...
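The idea this abstract rests on — entropy as a quantitative measure of composition homogeneity — can be sketched as a normalized mixing index. The binning scheme and the normalization by maximum entropy are assumptions for illustration, not the paper's tailored measures.

```python
import math

def mixing_index(counts):
    """Shannon-entropy mixing index for one species counted in spatial bins.

    Returns a value in [0, 1]: 0 for a fully segregated state (all material
    in one bin), 1 for a perfectly uniform composition across bins.
    """
    total = sum(counts)
    entropy = 0.0
    for c in counts:
        if c > 0:
            p = c / total
            entropy -= p * math.log(p)
    # Normalize by the maximum entropy, attained at uniform composition.
    return entropy / math.log(len(counts))

# A segregated state scores 0; a perfectly mixed one scores 1.
print(mixing_index([100, 0, 0, 0]))    # 0.0
print(mixing_index([25, 25, 25, 25]))  # 1.0
```

Intermediate states fall between the two extremes, so the index tracks the progress of a mixing operation as composition non-uniformity is reduced.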
Key words: Entropic Measure of Relative Dispersion, Uncertainty, Measure of Dispersion
2002
Abstract. We propose a new measure of dispersion, the Entropic Measure of Relative Dispersion. It uses the variance to measure the dispersion of the typical values around the mean. In order to compare distributions, we need a normalization. This normalization is carried out in such a way that we divide the variance ...
Symplectic invariants, entropic measures and correlations of Gaussian states
2004
Cited by 2 (1 self)
Abstract. We present a derivation of the von Neumann entropy and mutual information of arbitrary two-mode Gaussian states, based on the explicit determination of the symplectic eigenvalues of a generic covariance matrix. The key role of the symplectic invariants in such a determination is pointed out. We show that the von Neumann entropy depends on two symplectic invariants, while the purity (or the linear entropy) is determined by only one invariant, so that the two quantities provide two different hierarchies of mixed Gaussian states. A comparison between mutual information and entanglement of formation for symmetric states is considered, highlighting the crucial role of the symplectic eigenvalues in qualifying and quantifying the correlations present in a generic state.
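The computation this abstract describes can be sketched concretely: for a two-mode covariance matrix in block form σ = [[A, C], [Cᵀ, B]], the two symplectic eigenvalues follow from the invariants Δ = det A + det B + 2 det C and det σ, and the von Neumann entropy is a fixed function of them. This is a minimal numerical sketch (vacuum-equals-identity convention assumed), not the paper's full derivation.

```python
import numpy as np

def symplectic_eigenvalues(sigma):
    """Symplectic eigenvalues nu± of a 4x4 two-mode covariance matrix
    sigma = [[A, C], [C.T, B]] (convention: vacuum covariance = identity)."""
    A, B, C = sigma[:2, :2], sigma[2:, 2:], sigma[:2, 2:]
    # The invariant Delta and det(sigma) determine both eigenvalues:
    # nu±^2 = (Delta ± sqrt(Delta^2 - 4 det sigma)) / 2.
    delta = np.linalg.det(A) + np.linalg.det(B) + 2 * np.linalg.det(C)
    disc = np.sqrt(max(delta**2 - 4 * np.linalg.det(sigma), 0.0))
    return np.sqrt((delta + disc) / 2), np.sqrt((delta - disc) / 2)

def von_neumann_entropy(sigma):
    """S = sum_i f(nu_i), with
    f(nu) = ((nu+1)/2) ln((nu+1)/2) - ((nu-1)/2) ln((nu-1)/2)."""
    def f(nu):
        if nu <= 1.0:  # a symplectic eigenvalue of 1 contributes nothing
            return 0.0
        p, m = (nu + 1) / 2, (nu - 1) / 2
        return p * np.log(p) - m * np.log(m)
    return sum(f(nu) for nu in symplectic_eigenvalues(sigma))

# Two-mode squeezed vacuum: globally pure, so nu+ = nu- = 1 and S = 0.
r = 0.5
c, s = np.cosh(2 * r), np.sinh(2 * r)
sigma = np.array([[c, 0, s, 0],
                  [0, c, 0, -s],
                  [s, 0, c, 0],
                  [0, -s, 0, c]], dtype=float)
print(von_neumann_entropy(sigma))  # ~0 (pure state)
```

Tracing out one mode of the same state leaves a thermal mode with a single symplectic eigenvalue cosh(2r) > 1, so the reduced entropy is strictly positive — the entanglement entropy of the squeezed pair.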