Spatiotemporal transform based video hashing
 IEEE Transactions on Multimedia
, 2006
"... Abstract—Identification and verification of a video clip via its fingerprint find applications in video browsing, database search and security. For this purpose, the video sequence must be collapsed into a short fingerprint using a robust hash function based on signal processing operations. We propo ..."
Abstract

Cited by 24 (0 self)
 Add to MetaCart
Abstract—Identification and verification of a video clip via its fingerprint find applications in video browsing, database search and security. For this purpose, the video sequence must be collapsed into a short fingerprint using a robust hash function based on signal processing operations. We propose two robust hash algorithms for video based both on the Discrete Cosine Transform (DCT), one on the classical basis set and the other on a novel randomized basis set (RBT). The robustness and randomness properties of the proposed hash functions are investigated in detail. It is found that these hash functions are resistant to signal processing and transmission impairments, and therefore can be instrumental in building database search, broadcast monitoring and watermarking applications for video. The DCT hash is more robust, but lacks a security aspect, as it is easy to find different video clips with the same hash value. The RBT based hash, being secret key based, does not allow this and is more secure at the cost of a slight loss in the receiver operating curves. Index Terms—Broadcast monitoring, multimedia content authentication, robust hash, video database indexing, video hash.
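As a rough illustration of the DCT-hash idea in this abstract, here is a minimal Python sketch. It is not the authors' algorithm (the paper's spatiotemporal block selection, key-based randomization, and quantization details are omitted); it simply binarizes low-frequency DCT coefficients of a single frame's row means, which makes the bits invariant to a uniform brightness shift:

```python
import math

def dct2(signal):
    # Naive O(n^2) DCT-II; adequate for a short feature vector.
    n = len(signal)
    return [sum(x * math.cos(math.pi * k * (2 * i + 1) / (2 * n))
                for i, x in enumerate(signal))
            for k in range(n)]

def frame_hash(frame, n_bits=8):
    # frame: 2-D list of luminance values. Collapse rows to their means,
    # transform, then binarize low-frequency AC coefficients against
    # their median. The DC term is dropped, so adding a constant to
    # every pixel (a brightness shift) leaves the hash unchanged.
    row_means = [sum(r) / len(r) for r in frame]
    coeffs = dct2(row_means)[1:n_bits + 1]
    med = sorted(coeffs)[len(coeffs) // 2]
    return tuple(int(c > med) for c in coeffs)
```

A full video hash would aggregate such per-frame (or per-block) bits over time; this sketch only shows why transform-domain features tolerate mild signal processing.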
A hardware Gaussian noise generator using the Wallace method
 IEEE Transactions on VLSI
, 2005
"... Abstract—We describe a hardware Gaussian noise generator based on the Wallace method used for a hardware simulation system. Our noise generator accurately models a true Gaussian probability density function even at high values. We evaluate its properties using: 1) several different statistical tests ..."
Abstract

Cited by 23 (9 self)
 Add to MetaCart
(Show Context)
Abstract—We describe a hardware Gaussian noise generator based on the Wallace method used for a hardware simulation system. Our noise generator accurately models a true Gaussian probability density function even at high values. We evaluate its properties using: 1) several different statistical tests, including the chi-square test and the Anderson–Darling test and 2) an application for decoding of low-density parity-check (LDPC) codes. Our design is implemented on a Xilinx Virtex-II XC2V4000-6 field-programmable gate array (FPGA) at 155 MHz; it takes up 3% of the device and produces 155 million samples per second, which is three times faster than a 2.6-GHz Pentium-IV PC. Another implementation on a Xilinx Spartan-III XC3S200E-5 FPGA at 106 MHz is two times faster than the software version. Further improvement in performance can be obtained by concurrent execution: 20 parallel instances of the noise generator on an XC2V4000-6 FPGA at 115 MHz can run 51 times faster than software on a 2.6-GHz Pentium-IV PC. Index Terms—Channel coding, communication channels, field-programmable gate arrays (FPGAs), Gaussian noise, high performance, Monte Carlo methods, reconfigurable computing, technology mapping.
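The Wallace method this abstract relies on can be sketched in a few lines of software (a simplified model, not the paper's hardware design). The pool is seeded with Gaussian variates once; afterwards new variates are produced using only shuffles, additions, and one fixed scaling, because an orthogonal transform of i.i.d. Gaussians is again Gaussian:

```python
import math
import random

def wallace_pool(size=1024, seed=1):
    # Seed the pool with true Gaussians once; no transcendental
    # functions are needed after this, which is the method's appeal
    # in hardware.
    rng = random.Random(seed)
    return [rng.gauss(0.0, 1.0) for _ in range(size)], rng

def wallace_step(pool, rng):
    # One transformation pass: shuffle, then replace each disjoint
    # pair (a, b) with the orthogonal mix ((a+b)/sqrt(2), (a-b)/sqrt(2)).
    # The mix preserves a^2 + b^2, so the pool's energy is constant.
    rng.shuffle(pool)
    s = 1.0 / math.sqrt(2.0)
    for i in range(0, len(pool) - 1, 2):
        a, b = pool[i], pool[i + 1]
        pool[i], pool[i + 1] = (a + b) * s, (a - b) * s
    return pool
```

The hardware version uses larger orthogonal combinations and careful address generation to avoid correlations; this sketch only demonstrates the core transform.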
Exact skewness-kurtosis tests for multivariate normality and goodness-of-fit in multivariate regressions with application to asset pricing models
, 2003
"... ..."
Cognitive radios for dynamic spectrum access - dynamic spectrum access in the time domain: Modeling and exploiting white space
 IEEE Communications Magazine
, 2007
"... Dynamic spectrum access is a promising approach to alleviate the spectrum scarcity that wireless communications face today. In short, it aims at reusing sparsely occupied frequency bands while causing no (or insignificant) interference to the actual licensees. This article focuses on applying this c ..."
Abstract

Cited by 21 (0 self)
 Add to MetaCart
(Show Context)
Dynamic spectrum access is a promising approach to alleviate the spectrum scarcity that wireless communications face today. In short, it aims at reusing sparsely occupied frequency bands while causing no (or insignificant) interference to the actual licensees. This article focuses on applying this concept in the time domain by exploiting idle periods between bursty transmissions of multi-access communication channels and addresses WLAN as an example of practical importance. A statistical model based on empirical data is presented, and it is shown how to use this model for deriving access strategies. The coexistence of Bluetooth and WLAN is considered as a concrete example.
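The access-strategy idea, transmitting in an idle period only when the risk of colliding with the licensee stays below a target, can be illustrated directly from empirical idle-period samples. This is a hypothetical toy; the article's statistical model of WLAN white space is considerably richer:

```python
def idle_survival(samples, t):
    # Empirical probability that an idle period lasts at least t,
    # estimated from measured idle-period durations.
    return sum(1 for s in samples if s >= t) / len(samples)

def can_transmit(samples, packet_time, max_collision_prob=0.1):
    # Transmit at the start of an idle period only if the chance that
    # the period ends inside our packet, i.e. that we interfere with
    # the licensee, stays at or below the target.
    return 1.0 - idle_survival(samples, packet_time) <= max_collision_prob
```

With a fitted parametric model of idle durations (as in the article) the survival function would be evaluated analytically rather than counted from samples.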
Exploring the Tail of Patented Invention Value Distributions
 WZB Working Paper
, 1997
"... Assessing value of patent protection has long been a central issue in the economic literature studying the incentives for innovation in market economies. A number of authors have exploited a particular feature of many (though not all) national patent systems: the renewal fee structure. In many coun ..."
Abstract

Cited by 20 (2 self)
 Add to MetaCart
Assessing the value of patent protection has long been a central issue in the economic literature studying the incentives for innovation in market economies. A number of authors have exploited a particular feature of many (though not all) national patent systems: the renewal fee structure. In many countries, patent holders need to pay annual renewal fees in order to keep patent protection in force. The fees usually increase over time in order to weed out less important patents. Using information on the timing of non-renewal, some studies have been able to infer the value of patent protection. However, these studies cannot use observable information on value differences among those patents that were renewed for the maximum duration of patent protection. But these are presumably the most important patented inventions. Since the valuation distributions are extremely skewed, actually observed information in most renewal studies is only available on a small share of the total value of the overall national patent portfolio. We circumvent this problem by using a novel and more direct approach to the measurement of patent valuation. The paper focuses on the full-term patents of the application year 1977 held by West German and U.S. residents. For the German patents, a two-stage methodology was pursued. A preliminary telephone and telefax screening elicited patent value estimates and identified the most valuable patents, about which on-site interviews were then sought to develop more detailed historical information. For the U.S. patents, only an extended first stage was executed. Previous studies have found the distribution of patented invention values to be highly skewed. We confirm this result, using actual information on the most valuable patents in the 1977 cohort. Several tests were conducted to pin down more precisely the nature of the high-value tail distribution. Three highly skewed alternatives were evaluated by graphical and maximum likelihood techniques: the two-parameter log normal, the one-parameter Pareto-Levy, and the three-parameter Singh-Maddala distribution. A two-parameter log normal distribution appears to provide the best fit to our patented invention value data.

We explore the tail of patented invention value distributions by using value estimates obtained directly from patent holders. The paper focuses on those full-term German patents of the application year 1977 which were held by West German and U.S. residents. The most valuable patents in our data account for a large fraction of the cumulative value over all observations. Several tests are conducted to pin down more precisely the nature of the high-value tail distribution. Among the Pareto, Singh-Maddala and log normal distributions, the log normal appears to provide the best fit to our patented invention value data.
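The maximum likelihood fit of the two-parameter log normal mentioned in this abstract is particularly simple: the MLEs of mu and sigma are just the sample mean and (population) standard deviation of the log values. A minimal sketch:

```python
import math

def lognormal_mle(values):
    # MLE for a two-parameter log normal: take logs, then the
    # estimates are the mean and population standard deviation
    # of the log-transformed data.
    logs = [math.log(v) for v in values]
    mu = sum(logs) / len(logs)
    var = sum((x - mu) ** 2 for x in logs) / len(logs)
    return mu, math.sqrt(var)
```

Goodness-of-fit against the Pareto-Levy and Singh-Maddala alternatives, as in the paper, would then compare likelihoods or tail plots under each fitted model.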
Analytical and Empirical Analysis of Countermeasures to Traffic Analysis Attacks
 Proc. 32nd Int’l Conf. Parallel Processing (ICPP ’03)
, 2003
"... This paper studies countermeasures to traffic analysis attacks. A common strategy for such countermeasures is traffic padding. We consider systems where payload traffic may be padded to have either constant interarrival times or variable interarrival times for their packets. The adversary applies ..."
Abstract

Cited by 20 (5 self)
 Add to MetaCart
(Show Context)
This paper studies countermeasures to traffic analysis attacks. A common strategy for such countermeasures is traffic padding. We consider systems where payload traffic may be padded to have either constant interarrival times or variable interarrival times for their packets. The adversary applies statistical recognition techniques to detect the payload traffic rates and may use statistical measures, such as sample mean, sample variance, or sample entropy, to perform such a detection. We evaluate quantitatively the ability of the adversary to make a correct detection. We derive closed-form formulas for the detection rate based on analytical models we establish. Extensive experiments were carried out to validate the system performance predicted by the analytical method. Based on the systematic evaluations, we develop design guidelines that allow a manager to properly configure a system in order to minimize the detection rate.
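The variance-based detection the abstract describes can be sketched as a toy adversary. This is an illustration only (the paper derives closed-form detection rates from analytical models), under the assumption that constant-interarrival padding yields near-zero sample variance while payload bursts raise it:

```python
def sample_stats(interarrivals):
    # Sample mean and (unbiased) sample variance of packet
    # inter-arrival times observed by the adversary.
    n = len(interarrivals)
    mean = sum(interarrivals) / n
    var = sum((x - mean) ** 2 for x in interarrivals) / (n - 1)
    return mean, var

def detect_payload(interarrivals, var_threshold=1e-9):
    # Flag traffic as carrying payload when the sample variance
    # exceeds the threshold; perfectly padded constant-rate traffic
    # stays below it.
    _, var = sample_stats(interarrivals)
    return var > var_threshold
```

The paper's adversary also considers sample mean and sample entropy; the design guidelines then choose padding parameters that keep all such statistics indistinguishable from cover traffic.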
Data driven smooth tests for composite hypotheses: comparison of powers
, 1997
"... The classical problem of testing goodnessoft of a parametric family is reconsidered. A new test for this problem is proposed and investigated. The new test statistic is a combination of the smooth test statistic and Schwarz's selection rule. More precisely, as the sample size increases, an in ..."
Abstract

Cited by 20 (9 self)
 Add to MetaCart
The classical problem of testing goodness-of-fit of a parametric family is reconsidered. A new test for this problem is proposed and investigated. The new test statistic is a combination of the smooth test statistic and Schwarz's selection rule. More precisely, as the sample size increases, an increasing family of exponential models describing departures from the null model is introduced and Schwarz's selection rule is presented to select among them. Schwarz's rule provides the "right" dimension given by the data, while the smooth test in the "right" dimension finishes the job. Theoretical properties of the selection rules are derived under null and alternative hypotheses. They imply consistency of data driven smooth tests for composite hypotheses at essentially any alternative.
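Schwarz's selection rule described here amounts to maximizing a log(n)-penalized statistic over candidate dimensions. A minimal sketch, assuming the smooth-test statistics T_k for each dimension k have already been computed (the names are illustrative, not from the paper):

```python
import math

def schwarz_dimension(stats, n):
    # stats[k-1] holds the smooth-test statistic T_k for dimension k;
    # n is the sample size. Schwarz's rule keeps the dimension that
    # maximizes T_k - k * log(n); the smooth test is then carried out
    # in the selected dimension.
    return max(range(1, len(stats) + 1),
               key=lambda k: stats[k - 1] - k * math.log(n))
```

The log(n) penalty grows with the sample size, so spurious high-dimensional departures are suppressed while genuine ones are eventually selected, which is the source of the consistency result.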
Estimation of the number of true null hypotheses in multivariate analysis of neuroimaging data
 NeuroImage
, 2001
"... The repeated testing of a null univariate hypothesis in each of many sites (either regions of interest or voxels) is a common approach to the statistical analysis of brain functional images. Procedures, such as the Bonferroni, are available to maintain the Type I error of the set of tests at a speci ..."
Abstract

Cited by 19 (1 self)
 Add to MetaCart
The repeated testing of a null univariate hypothesis in each of many sites (either regions of interest or voxels) is a common approach to the statistical analysis of brain functional images. Procedures, such as the Bonferroni, are available to maintain the Type I error of the set of tests at a specified level. An initial assumption of these methods is a “global null hypothesis,” i.e., the statistics computed on each site are assumed to be generated by null distributions. This framework may be too conservative when a significant proportion of the sites is affected by the experimental manipulation. This report presents the development of a rigorous statistical procedure for use with a previously reported graphical method, the P plot, for estimation of the number of “true” null hypotheses in the set. This estimate can then be used to sharpen existing multiple comparison procedures. Performance of the P plot method in the multiple comparison problem is investigated in simulation studies and in the analysis of autoradiographic data. © 2001 Academic Press Key Words: PET; autoradiography; multiple comparisons; P plot.
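A simple tail-based estimator of the number of true nulls, related in spirit to the P plot (true-null p-values are uniform, so counts in the upper tail scale with the number of true nulls) but not the paper's exact procedure, can be sketched as:

```python
def estimate_true_nulls(pvalues, lam=0.5):
    # True-null p-values are Uniform(0, 1), so the count above the
    # cutoff lam should be about m0 * (1 - lam), where m0 is the
    # number of true null hypotheses. Effects concentrate near zero
    # and barely disturb the upper tail.
    m = len(pvalues)
    m0 = sum(1 for p in pvalues if p > lam) / (1.0 - lam)
    return min(m, round(m0))
```

Such an estimate of m0 can then replace the total test count in a Bonferroni-type correction, which is how the paper sharpens existing multiple comparison procedures.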