CiteSeerX

Results 1 - 10 of 42,267

Matching words and pictures

by Kobus Barnard, Pinar Duygulu, David Forsyth, Nando De Freitas, David M. Blei, Michael I. Jordan - JOURNAL OF MACHINE LEARNING RESEARCH, 2003
"... We present a new approach for modeling multi-modal data sets, focusing on the specific case of segmented images with associated text. Learning the joint distribution of image regions and words has many applications. We consider in detail predicting words associated with whole images (auto-annotation ..."
Abstract - Cited by 665 (40 self)
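
The snippet above is about learning a joint distribution of image regions and words and using it to predict words for whole images (auto-annotation). As a rough illustration only, the sketch below scores words for an image by summing a conditional word-given-region-cluster table over the image's regions; the table, the blob ids, and the simple sum are assumptions for the example, not the paper's actual mixture models.

    from collections import defaultdict

    def annotate(image_blobs, p_word_given_blob, top_k=5):
        # Score candidate words for an image by summing P(word | blob) over
        # the image's region clusters ("blobs"); a toy stand-in for the
        # paper's mixture models, shown only to illustrate auto-annotation.
        scores = defaultdict(float)
        for blob in image_blobs:
            for word, p in p_word_given_blob.get(blob, {}).items():
                scores[word] += p
        return sorted(scores, key=scores.get, reverse=True)[:top_k]

    # Hypothetical toy table: blob id -> distribution over words.
    p_word_given_blob = {
        7:  {"sky": 0.6, "water": 0.3, "cloud": 0.1},
        12: {"grass": 0.5, "tree": 0.4, "field": 0.1},
    }
    print(annotate([7, 12, 7], p_word_given_blob))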

The creation of new words

by John Haiman (Macalester College) - Linguistics, 2010
"... The creation of new words* ..."
Abstract - Cited by 1 (0 self)

Automatic Retrieval and Clustering of Similar Words

by Dekang Lin, 1998
"... greatest challenges in natural language learning. We first define a word similarity measure based on the distributional pattern of words. The similarity measure allows us to construct a thesaurus using a parsed corpus. We then present a new evaluation methodology for the automatically constructed th ..."
Abstract - Cited by 943 (15 self)
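
Lin's similarity measure is information-theoretic and is computed over dependency triples extracted from a parsed corpus. The sketch below is a deliberately simplified stand-in that conveys the distributional idea using plain co-occurrence windows and cosine similarity; the toy sentences, window size, and function names are assumptions for illustration, not the paper's method.

    import math
    from collections import Counter, defaultdict

    def context_vectors(sentences, window=2):
        # Build word -> Counter(context word -> count) from tokenized sentences.
        vectors = defaultdict(Counter)
        for tokens in sentences:
            for i, w in enumerate(tokens):
                lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
                for j in range(lo, hi):
                    if j != i:
                        vectors[w][tokens[j]] += 1
        return vectors

    def cosine(u, v):
        # Cosine similarity between two sparse count vectors.
        dot = sum(u[k] * v.get(k, 0) for k in u)
        norm = math.sqrt(sum(x * x for x in u.values())) * math.sqrt(sum(x * x for x in v.values()))
        return dot / norm if norm else 0.0

    sents = [["the", "cat", "chased", "the", "mouse"],
             ["the", "dog", "chased", "the", "cat"]]
    vecs = context_vectors(sents)
    print(cosine(vecs["cat"], vecs["dog"]))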

Hierarchically Classifying Documents Using Very Few Words

by Daphne Koller, Mehran Sahami, 1997
"... The proliferation of topic hierarchies for text documents has resulted in a need for tools that automatically classify new documents within such hierarchies. Existing classification schemes which ignore the hierarchical structure and treat the topics as separate classes are often inadequate in text ..."
Abstract - Cited by 521 (8 self)
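
The snippet above argues for exploiting the topic hierarchy itself, with a small classifier at each node, rather than flattening the topics into one large class set. Below is a minimal sketch of that top-down routing, assuming a trained per-node classifier exposed as a callable; the Node class, the keyword-based stand-in classifiers, and the example hierarchy are illustrative assumptions, not the paper's Bayesian classifiers with feature selection.

    class Node:
        # One node of a topic hierarchy: a leaf topic, or an internal node
        # holding a classifier that routes documents to one of its children.
        def __init__(self, name, children=None, classifier=None):
            self.name = name
            self.children = children or {}   # child name -> Node
            self.classifier = classifier     # callable: doc -> child name

    def classify(node, doc):
        # Walk the hierarchy top-down, letting each node's local classifier
        # (trained on few, locally discriminative words) pick the next branch.
        while node.children:
            node = node.children[node.classifier(doc)]
        return node.name

    # Hypothetical two-level hierarchy with keyword-based stand-in classifiers.
    root = Node(
        "root",
        children={
            "science": Node("science",
                            children={"physics": Node("physics"),
                                      "biology": Node("biology")},
                            classifier=lambda d: "physics" if "quark" in d else "biology"),
            "sports": Node("sports"),
        },
        classifier=lambda d: "science" if "experiment" in d else "sports",
    )
    print(classify(root, "an experiment measuring quark spin"))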

Dynamic programming algorithm optimization for spoken word recognition

by Hiroaki Sakoe, Seibi Chiba - IEEE TRANSACTIONS ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 1978
"... This paper reports on an optimum dynamic programming (DP) based time-normalization algorithm for spoken word recognition. First, a general principle of time-normalization is given using timewarping function. Then, two time-normalized distance definitions, ded symmetric and asymmetric forms, are der ..."
Abstract - Cited by 788 (3 self)
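
The snippet above describes DP-based time normalization with symmetric and asymmetric distance forms and slope constraints on the warping function. The sketch below is only the textbook unconstrained DTW recurrence on one-dimensional feature sequences, using absolute difference as the local cost; the slope constraints and the paper's symmetric/asymmetric weightings are omitted.

    def dtw_distance(a, b):
        # Classic dynamic-programming time warping between two sequences of
        # frame features (here plain numbers; absolute difference as local cost).
        n, m = len(a), len(b)
        INF = float("inf")
        D = [[INF] * (m + 1) for _ in range(n + 1)]
        D[0][0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(a[i - 1] - b[j - 1])
                # Allow match, insertion, and deletion steps on the warping path.
                D[i][j] = cost + min(D[i - 1][j - 1], D[i - 1][j], D[i][j - 1])
        return D[n][m]

    print(dtw_distance([1, 2, 3, 4], [1, 1, 2, 3, 3, 4]))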

Understanding Normal and Impaired Word Reading: Computational Principles in Quasi-Regular Domains

by David C. Plaut, James L. McClelland, Mark S. Seidenberg, Karalyn Patterson - PSYCHOLOGICAL REVIEW, 1996
"... We develop a connectionist approach to processing in quasi-regular domains, as exemplified by English word reading. A consideration of the shortcomings of a previous implementation (Seidenberg & McClelland, 1989, Psych. Rev.) in reading nonwords leads to the development of orthographic and phono ..."
Abstract - Cited by 613 (94 self) - Add to MetaCart
and phonological representations that capture better the relevant structure among the written and spoken forms of words. In a number of simulation experiments, networks using the new representations learn to read both regular and exception words, including low-frequency exception words, and yet are still able

Detection and transcription of new words

by B. Suhm, M. Woszczyna, A. Waibel - In Proceedings of EUROSPEECH, 1993
"... This paper describes a model which enables a speech recognition system to automatically detect new words and to provide a rough phonetic transcription. In our approach to the new word problem the decision whether new words occurred in the speech input is not based exclusively on acoustic evidence bu ..."
Abstract - Cited by 15 (0 self)

Parallel Networks that Learn to Pronounce English Text

by Terrence J. Sejnowski, Charles R. Rosenberg - COMPLEX SYSTEMS, 1987
"... This paper describes NETtalk, a class of massively-parallel network systems that learn to convert English text to speech. The memory representations for pronunciations are learned by practice and are shared among many processing units. The performance of NETtalk has some similarities with observed h ..."
Abstract - Cited by 549 (5 self) - Add to MetaCart
human performance. (i) The learning follows a power law. (;i) The more words the network learns, the better it is at generalizing and correctly pronouncing new words, (iii) The performance of the network degrades very slowly as connections in the network are damaged: no single link or processing unit
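
NETtalk reads text through a sliding window of letters and maps each window to a phoneme code with one hidden layer. The sketch below shows only the input side of that setup, cutting text into fixed-width letter windows and one-hot encoding them the way such a network would consume them; the 7-letter window matches the published description, but the symbol set and encoding here are simplified assumptions, and no network is trained.

    import string

    ALPHABET = string.ascii_lowercase + " "   # simplified symbol set

    def letter_windows(text, width=7):
        # Yield fixed-width letter windows centred on each character, the way
        # NETtalk-style networks consume text (7-letter window -> one phoneme).
        pad = " " * (width // 2)
        padded = pad + text.lower() + pad
        for i in range(len(text)):
            yield padded[i:i + width]

    def one_hot(window):
        # Concatenate one-hot codes for each letter in the window.
        vec = []
        for ch in window:
            unit = [0] * len(ALPHABET)
            unit[ALPHABET.index(ch)] = 1
            vec.extend(unit)
        return vec

    for w in letter_windows("hello"):
        print(w, len(one_hot(w)))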

Metaphors We Live By

by George Lakoff, Mark Johnson, 1980
"... 1. Make a list of some of the metaphors discussed by Lakoff and Johnson. Try inserting new words that convey a different meaning. For example, consider the expression, “I’d like to share some time with you ” rather than “spend some time with you.” 2. Make a list of “language asymmetries ” (see Part ..."
Abstract - Cited by 3387 (7 self)

A New Statistical Parser Based on Bigram Lexical Dependencies

by Michael John Collins, 1996
"... This paper describes a new statistical parser which is based on probabilities of dependencies between head-words in the parse tree. Standard bigram probability estimation techniques are extended to calculate probabilities of dependencies between pairs of words. Tests using Wall Street Journal ..."
Abstract - Cited by 490 (4 self)
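
The snippet above describes a parser driven by probabilities of dependencies between head-words. As a hedged illustration, the sketch below computes the simplest relative-frequency estimate of P(modifier head-word | governing head-word) from counted (head, modifier) pairs; Collins' model conditions on much richer context (tags, direction, distance) and uses backoff estimation, all omitted here, and the toy pairs are invented.

    from collections import Counter

    def dependency_probs(pairs):
        # Relative-frequency estimate of P(modifier | head) from (head, modifier)
        # dependency pairs; a toy stand-in for Collins-style estimation.
        pair_counts = Counter(pairs)
        head_counts = Counter(h for h, _ in pairs)
        return {(h, m): c / head_counts[h] for (h, m), c in pair_counts.items()}

    # Hypothetical treebank fragment as (head, modifier) pairs.
    pairs = [("bought", "IBM"), ("bought", "Lotus"), ("bought", "IBM"),
             ("sold", "shares")]
    probs = dependency_probs(pairs)
    print(probs[("bought", "IBM")])   # 2/3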