Results 1 - 10 of 36,806
Semantics of Context-Free Languages
- In Mathematical Systems Theory
, 1968
"... "Meaning " may be assigned to a string in a context-free language by defining "at-tributes " of the symbols in a derivation tree for that string. The attributes can be de-fined by functions associated with each production in the grammar. This paper examines the implications of th ..."
Abstract - Cited by 569 (0 self)
specification of semantics which have appeared in the literature. A simple technique for specifying the "meaning" of languages defined by context-free grammars is introduced in Section 1 of this paper, and its basic mathematical properties are investigated in Sections 2 and 3. An example which
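The attribute-grammar idea summarized above lends itself to a compact sketch. The Python fragment below is illustrative only (the Node class, the binary-numeral grammar, and the val function are invented, not taken from the paper): a synthesized attribute is computed bottom-up over a derivation tree by a function attached to each production.

```python
# Minimal sketch of synthesized attributes over a derivation tree,
# using the classic binary-numeral example for illustration.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    symbol: str                                   # grammar symbol, e.g. "B", or terminal "0"/"1"
    children: List["Node"] = field(default_factory=list)

def val(node: Node) -> int:
    """Synthesized attribute: value of the binary string derived from this node."""
    if node.symbol in ("0", "1"):                 # terminal symbol
        return int(node.symbol)
    if len(node.children) == 1:                   # production B -> b
        return val(node.children[0])
    left, right = node.children                   # production B -> B b
    return 2 * val(left) + val(right)

# Derivation tree for the string "101"; its attribute value is 5.
tree = Node("B", [Node("B", [Node("B", [Node("1")]), Node("0")]), Node("1")])
print(val(tree))  # 5
```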
Topic-Sensitive PageRank
, 2002
"... In the original PageRank algorithm for improving the ranking of search-query results, a single PageRank vector is computed, using the link structure of the Web, to capture the relative "importance" of Web pages, independent of any particular search query. To yield more accurate search resu ..."
Abstract - Cited by 543 (10 self)
results, we propose computing a set of PageRank vectors, biased using a set of representative topics, to capture more accurately the notion of importance with respect to a particular topic. By using these (precomputed) biased PageRank vectors to generate query-specific importance scores for pages at query
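As a rough sketch of the biased-vector idea (not the authors' implementation), the fragment below runs a plain power iteration in which the teleport distribution is concentrated on a topic's pages; the toy link graph, topic set, and parameter values are invented.

```python
# Toy biased PageRank: the teleport (personalization) vector is uniform over a
# topic's pages instead of over all pages. Graph and parameters are made up.
def biased_pagerank(links, topic_pages, damping=0.85, iters=100):
    nodes = list(links)
    n = len(nodes)
    teleport = {p: (1.0 / len(topic_pages) if p in topic_pages else 0.0) for p in nodes}
    rank = {p: 1.0 / n for p in nodes}
    for _ in range(iters):
        new = {p: (1 - damping) * teleport[p] for p in nodes}
        for p in nodes:
            out = links[p]
            if not out:                            # dangling page: redistribute via teleport
                for q in nodes:
                    new[q] += damping * rank[p] * teleport[q]
            else:
                share = damping * rank[p] / len(out)
                for q in out:
                    new[q] += share
        rank = new
    return rank

links = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
# One precomputed vector per topic; at query time such vectors are combined
# according to the query's topic relevance.
print(biased_pagerank(links, topic_pages={"a"}))
```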
Mixtures of Probabilistic Principal Component Analysers
, 1998
"... Principal component analysis (PCA) is one of the most popular techniques for processing, compressing and visualising data, although its effectiveness is limited by its global linearity. While nonlinear variants of PCA have been proposed, an alternative paradigm is to capture data complexity by a com ..."
Abstract - Cited by 532 (6 self)
maximum-likelihood framework, based on a specific form of Gaussian latent variable model. This leads to a well-defined mixture model for probabilistic principal component analysers, whose parameters can be determined using an EM algorithm. We discuss the advantages of this model in the context
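The snippet below sketches only the single-component probabilistic PCA building block, via its closed-form maximum-likelihood solution; the mixture model described above fits several such components jointly with EM and responsibilities. The data and the choice of q here are arbitrary placeholders.

```python
# Single-component probabilistic PCA: closed-form ML load matrix W and noise
# variance sigma^2 from the sample covariance eigendecomposition (illustrative).
import numpy as np

def ppca_ml(X, q):
    X = X - X.mean(axis=0)
    n, d = X.shape
    cov = X.T @ X / n
    evals, evecs = np.linalg.eigh(cov)              # ascending eigenvalues
    evals, evecs = evals[::-1], evecs[:, ::-1]      # reorder to descending
    sigma2 = evals[q:].mean()                       # average discarded variance
    W = evecs[:, :q] @ np.diag(np.sqrt(np.maximum(evals[:q] - sigma2, 0.0)))
    return W, sigma2

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5)) @ rng.normal(size=(5, 5))   # synthetic data
W, sigma2 = ppca_ml(X, q=2)
print(W.shape, sigma2)
```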
The Application of Petri Nets to Workflow Management
, 1998
"... Workflow management promises a new solution to an age-old problem: controlling, monitoring, optimizing and supporting business processes. What is new about workflow management is the explicit representation of the business process logic which allows for computerized support. This paper discusses the ..."
Abstract - Cited by 533 (64 self)
the use of Petri nets in the context of workflow management. Petri nets are an established tool for modeling and analyzing processes. On the one hand, Petri nets can be used as a design language for the specification of complex workflows. On the other hand, Petri net theory provides for powerful analysis
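To make the modeling idea concrete, here is a toy Petri-net interpreter (invented for illustration, not from the paper): places hold tokens, a transition is enabled when every input place is marked, and firing moves tokens along a small made-up order-handling workflow.

```python
# Toy Petri net: transitions consume a token from each input place and
# produce a token in each output place. The workflow below is invented.
class PetriNet:
    def __init__(self, transitions, marking):
        self.transitions = transitions        # name -> (input places, output places)
        self.marking = dict(marking)          # place -> token count

    def enabled(self, t):
        inputs, _ = self.transitions[t]
        return all(self.marking.get(p, 0) > 0 for p in inputs)

    def fire(self, t):
        if not self.enabled(t):
            raise ValueError(f"transition {t} is not enabled")
        inputs, outputs = self.transitions[t]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

net = PetriNet(
    transitions={
        "register": (["start"], ["registered"]),
        "check":    (["registered"], ["checked"]),
        "ship":     (["checked"], ["done"]),
    },
    marking={"start": 1},
)
for task in ["register", "check", "ship"]:
    net.fire(task)
print(net.marking)   # {'start': 0, 'registered': 0, 'checked': 0, 'done': 1}
```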
Developing a Context-aware Electronic Tourist Guide: Some Issues and Experiences
, 2000
"... In this paper, we describe our experiences of developing and evaluating GUIDE, an intelligent electronic tourist guide. The GUIDE system has been built to overcome many of the limitations of the traditional information and navigation tools available to city visitors. For example, group-based tours a ..."
Abstract - Cited by 442 (20 self)
are inherently inflexible with fixed starting times and fixed durations and (like most guidebooks) are constrained by the need to satisfy the interests of the majority rather than the specific interests of individuals. Following a period of requirements capture, involving experts in the field of tourism, we
MediaBench: A Tool for Evaluating and Synthesizing Multimedia and Communications Systems
"... Over the last decade, significant advances have been made in compilation technology for capitalizing on instruction-level parallelism (ILP). The vast majority of ILP compilation research has been conducted in the context of generalpurpose computing, and more specifically the SPEC benchmark suite. At ..."
Abstract - Cited by 966 (22 self)
From Data Mining to Knowledge Discovery in Databases.
- AI Magazine
, 1996
"... ■ Data mining and knowledge discovery in databases have been attracting a significant amount of research, industry, and media attention of late. What is all the excitement about? This article provides an overview of this emerging field, clarifying how data mining and knowledge discovery in database ..."
Abstract - Cited by 538 (0 self)
predictive model for estimating the value of future cases). At the core of the process is the application of specific data-mining methods for pattern discovery and extraction. This article begins by discussing the historical context of KDD and data mining and their intersection with other related fields. A
The “What” and “Why” of Goal Pursuits: Human Needs and the Self-Determination of Behavior
, 2000
"... Self-determination theory (SDT) maintains that an understanding of human motiva-tion requires a consideration of innate psychological needs for competence, auton-omy, and relatedness. We discuss the SDT concept of needs as it relates to previous need theories, emphasizing that needs specify the nece ..."
Abstract - Cited by 1105 (36 self)
relations to the quality of behavior and mental health, specifically because different regulatory processes and different goal contents are associated with differing degrees of need satisfaction. Social contexts and individual differences that support satisfaction of the basic needs facilitate natural
The SLAM project: debugging system software via static analysis
- SIGPLAN Not
"... Abstract. The goal of the SLAM project is to check whether or not a program obeys "API usage rules " that specif[y what it means to be a good client of an API. The SLAM toolkit statically analyzes a C program to determine whether or not it violates given usage rules. The toolkit has two un ..."
Abstract - Cited by 472 (17 self)
Abstract. The goal of the SLAM project is to check whether or not a program obeys "API usage rules" that specify what it means to be a good client of an API. The SLAM toolkit statically analyzes a C program to determine whether or not it violates given usage rules. The toolkit has two
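As an informal illustration of what an "API usage rule" is (SLAM itself checks C programs statically, which this sketch does not attempt), the fragment below encodes a made-up locking rule as a finite-state monitor and checks call traces against it.

```python
# Toy "API usage rule" as a finite-state monitor over a call trace.
# The locking rule and the example traces are invented for illustration.
RULE = {                                   # (state, api_call) -> next state
    ("unlocked", "acquire"): "locked",
    ("locked", "release"):   "unlocked",
    ("locked", "acquire"):   "error",      # double acquire violates the rule
    ("unlocked", "release"): "error",      # release without acquire
}

def check(trace):
    state = "unlocked"
    for call in trace:
        state = RULE.get((state, call), state)   # calls outside the rule are ignored
        if state == "error":
            return f"rule violated at call '{call}'"
    return "trace obeys the locking rule"

print(check(["acquire", "read", "release"]))
print(check(["acquire", "acquire"]))
```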