Results 1–9 of 9
2001: Impact of improved initialization of mesoscale features on convective system rainfall in the 10km Eta simulations
 Wea. Forecasting
Cited by 5 (0 self)
Fisher Information Test of Normality
1998
Abstract

Cited by 1 (0 self)
An extremal property of normal distributions is that they have the smallest Fisher Information for location among all distributions with the same variance. A new test of normality proposed by Terrell (1995) utilizes this property by finding the maximum-likelihood density constrained to have the expected Fisher Information under normality, based on the sample variance. The test statistic is then constructed as a ratio of the resulting likelihood against that of normality. Since the asymptotic distribution of this test statistic is not available, the critical values for n = 3 to 200 have been obtained by simulation and smoothed using polynomials. An extensive power study shows that the test has superior power against distributions that are symmetric and leptokurtic (long-tailed). Another advantage of the test over existing ones is the direct depiction of any deviation from normality in the form of a density estimate. This is evident when the test is applied to several real data sets. Testing of normality in residuals is also investigated. Various approaches to dealing with residuals that may be heteroscedastic and correlated suffer from a loss of power. The approach with the fewest undesirable features is to use the Ordinary Least …
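The extremal property this abstract relies on can be checked numerically. The sketch below is my own illustration, not code from the paper: it estimates the Fisher information for location of a normal sample by Monte Carlo and compares it against a Laplace distribution of the same variance, whose information is known in closed form. All names and parameters are assumptions for the demo.

```python
# Illustrative sketch (not from the cited paper): among distributions with
# equal variance, the normal has the smallest Fisher information for location.
import math
import random

def fisher_info_normal_mc(sigma: float, n: int = 200_000, seed: int = 0) -> float:
    """Monte Carlo estimate of E[(d/dx log f(x))^2] for N(0, sigma^2).

    For the normal, the location score is x / sigma^2, so the true value
    is 1 / sigma^2.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, sigma)
        score = x / sigma**2          # d/dx log f for the normal density
        total += score * score
    return total / n

# Laplace(0, b) has variance 2*b^2 and location score -sign(x)/b, so its
# Fisher information for location is exactly 1 / b^2 -- no simulation needed.
b = 1.0
sigma = math.sqrt(2.0) * b            # normal matched to the Laplace variance 2*b^2
i_normal = fisher_info_normal_mc(sigma)
i_laplace = 1.0 / b**2
# The normal attains the smaller value, as the extremal property predicts:
# i_normal is close to 1/sigma^2 = 0.5, below i_laplace = 1.0.
```

The same comparison works for any long-tailed competitor; the normal's estimated information should always come out smallest at fixed variance.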
PERFORMANCE AND ACCURACY ENHANCEMENTS OF RADIATIVE HEAT TRANSFER MODELING
2002
Abstract

Cited by 1 (0 self)
Two ways to reduce the computational requirements of radiative heat transfer Monte Carlo simulation are explored. First, an efficient algorithm for tracing particles in large, arbitrarily complex, planar geometries containing nonparticipating media is presented. For arbitrary triangles and/or convex planar quadrilaterals, an efficient intersection algorithm is discussed in detail. After surveying several techniques used in ray tracing to limit the number of surfaces tested, the method of Uniform Spatial Division (USD) is implemented. The efficiency of the intersection algorithm and USD is demonstrated by timing results. Second, improving the accuracy of the Monte Carlo results by applying reciprocity and closure is explored. Statistical theory is applied to the reciprocity estimation smoothing (RES) technique, which combines reciprocity enforcement through estimation and closure enforcement through the technique of least-squares smoothing. By examining a large number of runs of two large geometries, several RES methods are compared to find the best method. The effects of the RES method on surfaces and individual results between surfaces are also explored. Estimates of the improvements caused by the RES method that …
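As a rough illustration of the kind of ray/surface intersection primitive the abstract discusses, here is the standard Möller–Trumbore ray/triangle test in plain Python. This is a generic sketch, not necessarily the paper's algorithm; vectors are 3-tuples and all helper names are mine.

```python
# Illustrative sketch (standard Moller-Trumbore, not the paper's code):
# test whether a ray hits a triangle and return the ray parameter t.
EPS = 1e-9

def sub(a, b):   return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def cross(a, b): return (a[1]*b[2] - a[2]*b[1],
                         a[2]*b[0] - a[0]*b[2],
                         a[0]*b[1] - a[1]*b[0])
def dot(a, b):   return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def ray_triangle(orig, direc, v0, v1, v2):
    """Return the ray parameter t of the hit point, or None if no hit."""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direc, e2)
    det = dot(e1, p)
    if abs(det) < EPS:                 # ray parallel to the triangle plane
        return None
    inv = 1.0 / det
    s = sub(orig, v0)
    u = dot(s, p) * inv                # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = dot(direc, q) * inv            # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv
    return t if t > EPS else None      # hit only in front of the origin

# A ray pointing down the z-axis through the unit triangle in the z=0 plane:
t = ray_triangle((0.25, 0.25, 1.0), (0.0, 0.0, -1.0),
                 (0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
```

An acceleration structure such as the USD grid mentioned in the abstract would sit above a primitive like this, limiting how many triangles each ray must test.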
Visual EDF Software to Check the Normality Assumption
Abstract
A host of materials in an introductory level statistics course relies on the “normality assumption.” However, assessing normality is a very subtle and difficult task, even for expert data analysts. To aid our students’ understanding we have written a program that implements a visual version of the Lilliefors test for normality and the powerful Anderson-Darling test for normality. In this paper we introduce these tests and show their usefulness through examples. The software, which is freely available to colleagues and students at …
GARFIELD AND THE IMPACT FACTOR: THE CREATION, UTILIZATION, AND VALIDATION OF A CITATION MEASURE. Part 2, The Probabilistic, Statistical, and Sociological Bases of the Measure
Abstract
Eugene Garfield laid one of the key empirical foundations for modern information science through the innovation of the citation indexing of science. He created the impact factor as part of the process of developing the Science Citation Index (SCI) produced by his company, the Institute for Scientific Information (ISI). In determining the coverage of this index, Garfield …
NORMALITY AND DATA TRANSFORMATION FOR APPLIED STATISTICAL ANALYSIS
Abstract
Applications of statistical techniques are common in scientific and multidisciplinary research. Statistical tools are useful to describe numerical facts, to model relationships between factors, and to test the independence of attributes or variables. Some researchers use statistics to explain the findings of a phenomenon; others, however, use statistical tools without understanding the underlying technicalities. Statistical errors are very common in scientific research, and 50 percent of published articles have at least some error. These errors mainly concern the important assumption of normality. The concepts of normality and data transformation are the most important considerations when using statistical techniques. The assumption of normality is required for most statistical tools, namely correlation, regression and parametric tests, because their validity is based on normality. Data transformations are commonly used tools that can serve many functions in quantitative data analysis. The main purpose of this paper is to highlight the basic and important assumptions based on the normal distribution in terms of normality testing. The paper describes the concept of normality and how to test the normality of data; it also describes tools for normality in terms of standardized data transformation.
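The data-transformation idea this abstract covers can be shown in a few lines. The sketch below is my own illustration, not code from the paper: a log transform pulls a right-skewed lognormal sample toward normality, as measured by sample skewness.

```python
# Illustrative sketch (not from the cited paper): a log transform as a
# normalizing data transformation, judged by sample skewness.
import math
import random
import statistics

def skewness(xs):
    """Adjusted Fisher-Pearson sample skewness (near 0 for symmetric data)."""
    n = len(xs)
    m = statistics.fmean(xs)
    s = statistics.stdev(xs)
    return (n / ((n - 1) * (n - 2))) * sum(((x - m) / s) ** 3 for x in xs)

rng = random.Random(1)
# Lognormal data: strongly right-skewed before the transform.
raw = [math.exp(rng.gauss(0.0, 1.0)) for _ in range(5000)]
logged = [math.log(x) for x in raw]   # exactly normal by construction here
skew_raw, skew_log = skewness(raw), skewness(logged)
# skew_raw is large and positive; skew_log sits near zero.
```

In practice one would pair such a transform with a formal normality test; the choice of transform (log, square root, Box-Cox) depends on the direction and severity of the skew.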
and
Abstract
Probability plots are widely used tools for assessing distributional assumptions, but accurate interpretations from these plots are sometimes elusive. An interactive Visual Basic program that plots the empirical distribution function (e.d.f.) of the data in a coordinate system with Kolmogorov-Smirnov bounds is offered as a visual aid to probability plotting. Additionally, the software runs two tests based on the empirical distribution function to check distributional assumptions: the powerful Anderson-Darling test and the well-known Lilliefors test. In a desktop environment, this software offers users options for checking whether a given data set is consistent with either the normality or exponentiality assumption.
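The EDF-versus-fitted-CDF idea the abstract describes can be sketched without the original software. The code below is my stdlib-only illustration, not the described Visual Basic program: it computes the Kolmogorov-Smirnov-type distance that the Lilliefors test thresholds (the Lilliefors critical values themselves are not reproduced here).

```python
# Illustrative sketch (not the described software): distance between the
# empirical distribution function and a normal CDF fitted to the sample.
import math
import random
import statistics

def normal_cdf(z: float) -> float:
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def lilliefors_statistic(xs):
    """Max distance D_n between the EDF and the fitted normal CDF.

    Mean and standard deviation are estimated from the sample, as in the
    Lilliefors test, so D_n must be compared against Lilliefors (not
    plain Kolmogorov-Smirnov) critical values.
    """
    n = len(xs)
    m = statistics.fmean(xs)
    s = statistics.stdev(xs)
    zs = sorted((x - m) / s for x in xs)
    d = 0.0
    for i, z in enumerate(zs):
        f = normal_cdf(z)
        # check the gap on both sides of the EDF step at z
        d = max(d, f - i / n, (i + 1) / n - f)
    return d

rng = random.Random(0)
d_normal = lilliefors_statistic([rng.gauss(0.0, 1.0) for _ in range(2000)])
d_uniform = lilliefors_statistic([rng.uniform(0.0, 1.0) for _ in range(2000)])
# Near-normal data give a small D_n; uniform data a visibly larger one.
```

Plotting the sorted standardized values against `normal_cdf` with a band of half-width equal to the critical D_n reproduces the visual bounds the software draws.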