Results 1–10 of 15
A Lightweight Implementation of Generics and Dynamics
, 2002
"... The recent years have seen a number of proposals for extending statically typed languages by dynamics or generics. Most proposals  if not all  require significant extensions to the underlying language. In this paper we show that this need not be the case. We propose a particularly lightweight ..."
Abstract

Cited by 77 (5 self)
 Add to MetaCart
(Show Context)
Recent years have seen a number of proposals for extending statically typed languages by dynamics or generics. Most proposals, if not all, require significant extensions to the underlying language. In this paper we show that this need not be the case. We propose a particularly lightweight extension that supports both dynamics and generics. Furthermore, the two features are smoothly integrated: dynamic values, for instance, can be passed to generic functions. Our proposal makes do with a standard Hindley-Milner type system augmented by existential types. Building upon these ideas we have implemented a small library that is readily usable both with Hugs and with the Glasgow Haskell Compiler.
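The core idea can be sketched in a few lines of Haskell. The names below (`Rep`, `Dyn`, `cast`) are ours, not the paper's library interface, and modern GADT syntax stands in for the paper's original encoding:

```haskell
{-# LANGUAGE GADTs, ExistentialQuantification #-}

-- A type representation: one constructor per representable type.
data Rep t where
  RInt  :: Rep Int
  RChar :: Rep Char
  RList :: Rep a -> Rep [a]

-- A dynamic value packages a value with its representation,
-- hiding the type behind an existential quantifier.
data Dynamic = forall t. Dyn (Rep t) t

-- Type-safe cast: succeeds only if the representations match.
cast :: Rep s -> Dynamic -> Maybe s
cast RInt      (Dyn RInt  x)       = Just x
cast RChar     (Dyn RChar x)       = Just x
cast (RList r) (Dyn (RList r') xs) = mapM (\x -> cast r (Dyn r' x)) xs
cast _         _                   = Nothing
```

Generic functions can then be written by case analysis on `Rep`, which is how dynamics and generics interact: unpacking a `Dynamic` yields a representation that a generic function can consume.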
Representations of stream processors using nested fixed points
 Logical Methods in Computer Science
"... Abstract. We define representations of continuous functions on infinite streams of discrete values, both in the case of discretevalued functions, and in the case of streamvalued functions. We define also an operation on the representations of two continuous functions between streams that yields a ..."
Abstract

Cited by 24 (2 self)
 Add to MetaCart
(Show Context)
Abstract. We define representations of continuous functions on infinite streams of discrete values, both in the case of discrete-valued functions, and in the case of stream-valued functions. We define also an operation on the representations of two continuous functions between streams that yields a representation of their composite. In the case of discrete-valued functions, the representatives are well-founded (finite-path) trees of a certain kind. The underlying idea can be traced back to Brouwer’s justification of bar induction, or to Kreisel and Troelstra’s elimination of choice sequences. In the case of stream-valued functions, the representatives are non-well-founded trees pieced together in a coinductive fashion from well-founded trees. The definition requires an alternating fixpoint construction of some ubiquity.
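The well-founded representation of a discrete-valued stream function can be sketched as a small Haskell datatype (our notation, not the paper's): a tree either outputs its result or branches on the next input element, so only finitely many elements are ever inspected, witnessing continuity.

```haskell
-- Representative of a continuous, discrete-valued function on streams:
-- a well-founded tree of reads ending in a single output.
data SPd i o = Putd o | Getd (i -> SPd i o)

-- Apply a representative to a stream (here modelled as a list).
runSPd :: SPd i o -> [i] -> o
runSPd (Putd o) _      = o
runSPd (Getd f) (i:is) = runSPd (f i) is
runSPd (Getd _) []     = error "input stream exhausted"

-- Example: the function returning the sum of the first two elements.
sumTwo :: SPd Int Int
sumTwo = Getd (\x -> Getd (\y -> Putd (x + y)))
```

For stream-valued functions the paper pieces such trees together coinductively, roughly `data SP i o = Put o (SP i o) | Get (i -> SP i o)`, where the `Put` spine may be infinite.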
Generic Unification via Two-Level Types and Parameterized Modules (Functional Pearl)
, 2001
"... As a functional pearl, we describe an efficient, modularized implementation of unification using the state of mutable reference cells to encode substitutions. We abstract our algorithms along two dimensions, first abstracting away from the structure of the terms to be unified, and second over the mo ..."
Abstract

Cited by 16 (1 self)
 Add to MetaCart
As a functional pearl, we describe an efficient, modularized implementation of unification using the state of mutable reference cells to encode substitutions. We abstract our algorithms along two dimensions, first abstracting away from the structure of the terms to be unified, and second over the monad in which the mutable state is encapsulated.
We choose this example to illustrate two important techniques that we believe many functional programmers would find useful. The first of these is the definition of recursive data types using two levels: a structure-defining level, and a recursive knot-tying level. The second is the use of rank-2 polymorphism inside Haskell’s record types to implement a form of type-parameterized modules.
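Both techniques fit in a short sketch. This is our illustration under assumed names (`TermF`, `Term`, `TermOps`), not the pearl's actual unification code:

```haskell
{-# LANGUAGE RankNTypes #-}

-- Structure-defining level: the shape of terms, non-recursive.
data TermF r = Var String | App r r

instance Functor TermF where
  fmap _ (Var s)   = Var s
  fmap f (App a b) = App (f a) (f b)

-- Recursive knot-tying level: tie the structure back on itself.
newtype Term = In (TermF Term)

-- Rank-2 polymorphism inside a record: a tiny parameterized "module"
-- of operations over terms.
newtype TermOps = TermOps
  { fold :: forall a. (TermF a -> a) -> Term -> a }

termOps :: TermOps
termOps = TermOps (\alg -> let go (In t) = alg (fmap go t) in go)

-- Counting nodes via the module's fold.
size :: Term -> Int
size = fold termOps alg
  where alg (Var _)   = 1
        alg (App m n) = 1 + m + n
```

The point of the two levels is that the same `TermF` structure can be re-tied differently, for instance with mutable reference cells at the recursive positions to encode substitutions, as the pearl does.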
Representations of First Order Function Types as Terminal Coalgebras
 In Typed Lambda Calculi and Applications, TLCA 2001, number 2044 in Lecture Notes in Computer Science
, 2001
"... terminal coalgebras ..."
Generic programming, now
 Generic Programming, Advanced Lectures, LNCS
, 2006
"... Abstract. Tired of writing boilerplate code? Tired of repeating essentially the same function definition for lots of different datatypes? Datatypegeneric programming promises to end these coding nightmares. In these lecture notes, we present the key abstractions of datatypegeneric programming, giv ..."
Abstract

Cited by 7 (1 self)
 Add to MetaCart
(Show Context)
Abstract. Tired of writing boilerplate code? Tired of repeating essentially the same function definition for lots of different datatypes? Datatype-generic programming promises to end these coding nightmares. In these lecture notes, we present the key abstractions of datatype-generic programming, give several applications, and provide an elegant embedding of generic programming into Haskell. The embedding builds on recent advances in type theory: generalised algebraic datatypes and open datatypes. We hope to convince you that generic programming is useful and that you can use generic programming techniques today!
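The GADT-based embedding the notes describe can be sketched as follows. The names (`Type`, `gsum`) and the particular generic function are our illustration, not the notes' library:

```haskell
{-# LANGUAGE GADTs #-}

-- A GADT of type representations: each constructor refines the index.
data Type a where
  TInt  :: Type Int
  TPair :: Type a -> Type b -> Type (a, b)
  TList :: Type a -> Type [a]

-- One generic definition, usable at many types: sum every Int
-- occurring anywhere in a value, by case analysis on the representation.
gsum :: Type a -> a -> Int
gsum TInt        n      = n
gsum (TPair a b) (x, y) = gsum a x + gsum b y
gsum (TList a)   xs     = sum (map (gsum a) xs)
```

Pattern matching on `Type a` refines `a` in each branch, which is exactly what lets a single definition cover arbitrarily nested pairs and lists without boilerplate per datatype.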
Functional Pearl: Trouble Shared is Trouble Halved
, 2003
"... than one incoming arc. Shared nodes are created in almost every functional programfor instance, when updating a purely functional data structurethough programmers are seldom aware of this. In fact, there are only a few algorithms that exploit sharing of nodes consciously. One example is constr ..."
Abstract

Cited by 6 (0 self)
 Add to MetaCart
than one incoming arc. Shared nodes are created in almost every functional program, for instance when updating a purely functional data structure, though programmers are seldom aware of this. In fact, there are only a few algorithms that consciously exploit sharing of nodes. One example is constructing a tree in sublinear time. In this pearl we discuss an intriguing application of nexuses; we show that they serve admirably as memo structures featuring constant-time access to memoized function calls. Along the way we encounter Boolean lattices and binomial trees.
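The sublinear tree construction mentioned above rests on sharing, which a minimal sketch (our example, not the pearl's nexus library) makes visible:

```haskell
data Tree = Leaf | Node Tree Tree

-- A full binary tree of depth n, built in O(n) steps: both children
-- are the *same* shared node, not copies.
full :: Int -> Tree
full 0 = Leaf
full n = let t = full (n - 1) in Node t t

-- Counting leaves traverses the shared nodes repeatedly, so it takes
-- O(2^n) time even though construction took O(n): exploiting sharing,
-- rather than merely creating it, is the pearl's subject.
leaves :: Tree -> Integer
leaves Leaf       = 1
leaves (Node l r) = leaves l + leaves r
```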
Monadic memoization mixins
, 2007
"... Memoization is a familiar technique for improving the performance of programs: computed answers are saved so that they can be reused later instead of being recomputed. In a pure functional language, memoization of a function is complicated by the need to manage the table of saved answers between cal ..."
Abstract

Cited by 4 (2 self)
 Add to MetaCart
(Show Context)
Memoization is a familiar technique for improving the performance of programs: computed answers are saved so that they can be reused later instead of being recomputed. In a pure functional language, memoization of a function is complicated by the need to manage the table of saved answers between calls to the function, including recursive calls within the function itself. A lazy recursive data structure can be used to maintain past answers, although achieving an efficient algorithm can require a complex rewrite of the function into a special form. Memoization can also be defined as a language primitive, but to be useful it would need to support a range of memoization strategies. In this paper we develop a technique for modular memoization within a pure functional language. We define monadic memoization mixins that are composed (via inheritance) with an ordinary monadic function to create a memoized version of the function. As a case study, we memoize a recursive-descent parser written using standard parser combinators. A comparison of the performance of different approaches shows that memoization mixins are efficient for a small example.
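The mixin idea can be sketched with an open-recursive function whose recursive calls go through a parameter, so a memo table can be interposed. This is our minimal reconstruction (assuming the `mtl` package for `Control.Monad.State`), not the paper's library:

```haskell
import qualified Data.Map as Map
import Control.Monad.State

-- Open recursion: recursive calls go through the 'self' parameter.
fibStep :: Monad m => (Int -> m Integer) -> Int -> m Integer
fibStep _    n | n < 2 = return (toInteger n)
fibStep self n         = (+) <$> self (n - 1) <*> self (n - 2)

-- Memoization mixin: consult and update a table around each call.
memo :: (Int -> State (Map.Map Int Integer) Integer)
     -> Int -> State (Map.Map Int Integer) Integer
memo f n = do
  table <- get
  case Map.lookup n table of
    Just v  -> return v
    Nothing -> do v <- f n
                  modify (Map.insert n v)
                  return v

-- Compose, inheritance-style: tie the knot through the mixin, so even
-- internal recursive calls hit the memo table.
fibMemo :: Int -> State (Map.Map Int Integer) Integer
fibMemo = memo (fibStep fibMemo)

runFib :: Int -> Integer
runFib n = evalState (fibMemo n) Map.empty
```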
Type Fusion
"... Fusion is an indispensable tool in the arsenal of techniques for program derivation. Less wellknown, but equally valuable is type fusion, which states conditions for fusing an application of a functor with an initial algebra to form another initial algebra. We provide a novel proof of type fusion b ..."
Abstract

Cited by 2 (0 self)
 Add to MetaCart
(Show Context)
Fusion is an indispensable tool in the arsenal of techniques for program derivation. Less well-known, but equally valuable, is type fusion, which states conditions for fusing an application of a functor with an initial algebra to form another initial algebra. We provide a novel proof of type fusion based on adjoint folds and discuss several applications: type firstification, type specialisation and tabulation.
Pull-Ups, Push-Downs, and Passing It Around: Exercises in Functional Incrementalization
, 2009
"... Programs in functional programming languages with algebraic datatypes are often datatypecentric and use folds or foldlike functions. Incrementalization of such a program can significantly improve its performance. Functional incrementalization separates the recursion from the calculation and sign ..."
Abstract
 Add to MetaCart
(Show Context)
Programs in functional programming languages with algebraic datatypes are often datatype-centric and use folds or fold-like functions. Incrementalization of such a program can significantly improve its performance. Functional incrementalization separates the recursion from the calculation and significantly reduces redundant computation. In this paper, we motivate incrementalization with a simple example and present a library for transforming programs using upwards, downwards, and circular incrementalization. We also give a datatype-generic implementation for the library and demonstrate the incremental zipper, a zipper extended with attributes.
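Upwards incrementalization can be sketched by caching an aggregate at every node in one bottom-up pass, so a local edit only has to recompute annotations along one path. This is our own toy example, not the library's API:

```haskell
data Tree = Leaf Int | Node Tree Tree

-- Annotated tree: each node carries the sum of its subtree.
data STree = SLeaf Int | SNode Int STree STree

annot :: STree -> Int
annot (SLeaf s)     = s
annot (SNode s _ _) = s

-- One bottom-up ("upwards") pass decorates every node; afterwards the
-- aggregate at any node is available in constant time.
upwards :: Tree -> STree
upwards (Leaf x)   = SLeaf x
upwards (Node l r) = SNode (annot l' + annot r') l' r'
  where l' = upwards l
        r' = upwards r
```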
Function Inheritance: Monadic Memoization Mixins
"... Abstract. Inheritance is a mechanism for incrementally modifying recursive definitions. While inheritance is typically used in objectoriented languages, inheritance also has something to offer to functional programming. In this paper we illustrate the use of inheritance in a pure functional languag ..."
Abstract
 Add to MetaCart
(Show Context)
Abstract. Inheritance is a mechanism for incrementally modifying recursive definitions. While inheritance is typically used in object-oriented languages, inheritance also has something to offer to functional programming. In this paper we illustrate the use of inheritance in a pure functional language by developing a small library for memoization. We define monadic memoization mixins that compose, via inheritance, with an ordinary monadic function to create a memoized version of the function. A comparison of the performance of different approaches shows that memoization mixins are efficient for a small example.