Refactoring Object-Oriented Frameworks, 1992
"... This thesis defines a set of program restructuring operations (refactorings) that support the design, evolution and reuse of object-oriented application frameworks. The focus of the thesis is on automating the refactorings in a way that preserves the behavior of a program. The refactorings are defin ..."
Cited by 489 (4 self)
This thesis defines a set of program restructuring operations (refactorings) that support the design, evolution and reuse of object-oriented application frameworks. The focus of the thesis is on automating the refactorings in a way that preserves the behavior of a program. The refactorings are defined to be behavior preserving, provided that their preconditions are met. Most of the refactorings are simple to implement and it is almost trivial to show that they are behavior preserving. However, for a few refactorings, one or more of their preconditions are in general undecidable. Fortunately, for some cases it can be determined whether these refactorings can be applied safely. Three of the most complex refactorings are defined in detail: generalizing the inheritance hierarchy, specializing the inheritance hierarchy and using aggregations to model the relationships among classes. These operations are decomposed into more primitive parts, and the power of these operations is discussed from the perspectives of automatability and usefulness in supporting design. Two design constraints needed in refactoring are class invariants and exclusive components. These constraints are needed to ensure that behavior is preserved across some refactorings. This thesis gives some conservative algorithms for determining whether a program satisfies these constraints, and describes how to use this design information to refactor a program.
Program Restructuring as an Aid to Software Maintenance, 1991
"... Maintenance tends to degrade the structure of software, ultimately making maintenance more costly. At times, then, it is worthwhile to manipulate the structure of a system to make changes easier. However, it is shown that manual restructuring is an error-prone and expensive activity. By separating ..."
Cited by 115 (11 self)
Maintenance tends to degrade the structure of software, ultimately making maintenance more costly. At times, then, it is worthwhile to manipulate the structure of a system to make changes easier. However, it is shown that manual restructuring is an error-prone and expensive activity. By separating structural manipulations from other maintenance activities, the semantics of a system can be held constant by a tool, assuring that no errors are introduced by restructuring. To allow the maintenance team to focus on the aspects of restructuring and maintenance requiring human judgment, a transformation-based tool can be provided, based on a model that preserves data-flow and control-flow dependences, to automate the repetitive, error-prone, and computationally demanding aspects of re...
Structured programming with go to statements - Computing Surveys, 1974
"... A consideration of several different examples sheds new light on the problem of ereat-ing reliable, well-structured programs that behave efficiently. This study focuses largely on two issues: (a) improved syntax for iterations and error exits, making it possible to write a larger class of programs c ..."
Cited by 82 (3 self)
A consideration of several different examples sheds new light on the problem of creating reliable, well-structured programs that behave efficiently. This study focuses largely on two issues: (a) improved syntax for iterations and error exits, making it possible to write a larger class of programs clearly and efficiently without go to statements ...
A view of 20th and 21st century software engineering - In Proceedings of the 28th International Conference on Software Engineering (ICSE), 2006
"... George Santayana's statement, "Those who cannot remember the past are condemned to repeat it, " is only half true. The past also includes successful histories. If you haven't been made aware of them, you're often condemned not to repeat their successes. In a rapidly expandin ..."
Cited by 63 (1 self)
George Santayana's statement, "Those who cannot remember the past are condemned to repeat it," is only half true. The past also includes successful histories. If you haven't been made aware of them, you're often condemned not to repeat their successes. In a rapidly expanding field such as software engineering, this happens a lot. Extensive studies of many software projects, such as the Standish Reports, offer convincing evidence that many projects fail to repeat past successes. This paper tries to identify at least some of the major past software experiences that were well worth repeating, and some that were not. It also tries to identify underlying phenomena influencing the evolution of software engineering practices that have at least helped the author appreciate how our field has gotten to where it has been and where it is. A counterpart Santayana-like statement about the past and future might say, "In an era of rapid change, those who repeat the past are condemned to a bleak future." (Think about the dinosaurs, and think carefully about software engineering maturity models that emphasize repeatability.) This paper also tries to identify some of the major sources of change that will affect software engineering practices in the next couple of decades, and identifies some strategies for assessing and adapting to these sources of change. It also makes some first steps towards distinguishing relatively timeless software engineering principles that are risky not to repeat, and conditions of change under which aging practices will become increasingly risky to repeat.
A Control-Flow Normalization Algorithm and Its Complexity - IEEE Transactions on Software Engineering, 1992
"... We present a simple method for normalizing the control-flow of programs to facilitate program transformations, program analysis, and automatic parallelization. While previous methods result in programs whose control flowgraphs are reducible, programs normalized by this technique satisfy a stronger c ..."
Cited by 54 (0 self)
We present a simple method for normalizing the control flow of programs to facilitate program transformations, program analysis, and automatic parallelization. While previous methods result in programs whose control flowgraphs are reducible, programs normalized by this technique satisfy a stronger condition than reducibility and are therefore simpler in syntax and structure than those produced by previous methods. In particular, all control-flow cycles are normalized into single-entry, single-exit while loops, and all gotos are eliminated. Furthermore, the method avoids the code replication problems that are characteristic of node-splitting techniques. This restructuring obviates the control dependence graph, since afterwards control dependence relations are manifest in the syntax tree of the program. In this paper we present transformations that effect this normalization, and study the complexity of the method.
Index Terms: continuations, control flow, elimination algorithms, normalization, ...
Send-receive considered harmful: Myths and realities of message passing - ACM Transactions on Programming Languages and Systems
"... During the software crisis of the 1960s, Dijkstra’s famous thesis “goto considered harmful ” paved the way for structured programming. This short communication suggests that many current difficulties of parallel programming based on message passing are caused by poorly structured communication, whic ..."
Cited by 48 (1 self)
During the software crisis of the 1960s, Dijkstra’s famous thesis “goto considered harmful” paved the way for structured programming. This short communication suggests that many current difficulties of parallel programming based on message passing are caused by poorly structured communication, which is a consequence of using low-level send-receive primitives. We argue that, like goto in sequential programs, send-receive should be avoided as far as possible and replaced by collective operations in the setting of message passing. We dispute some widely held opinions about the apparent superiority of pairwise communication over collective communication and present substantial theoretical and empirical evidence to the contrary in the context of MPI (Message Passing Interface).
Taming Control Flow: A Structured Approach to Eliminating Goto Statements - In Proceedings of the 1994 IEEE International Conference on Computer Languages, 1994
"... In designing optimizing and parallelizing compilers, it is often simpler and more efficient to deal with programs that have structured control flow. Although most programmers naturally program in a structured fashion, there remain many important programs and benchmarks that include some number of go ..."
Cited by 48 (7 self)
In designing optimizing and parallelizing compilers, it is often simpler and more efficient to deal with programs that have structured control flow. Although most programmers naturally program in a structured fashion, there remain many important programs and benchmarks that include some number of goto statements, thus rendering the entire program unstructured. Such unstructured programs cannot be handled by compilers built with analyses and transformations for structured programs. In this paper we present a straightforward algorithm to structure C programs by eliminating all goto statements. The method works directly on a high-level abstract syntax tree (AST) representation of the program and could easily be integrated into any compiler that uses an AST-based intermediate representation. The actual algorithm proceeds by eliminating each goto by first applying a sequence of goto-movement transformations followed by the appropriate goto-elimination transformation. We have implemented...
Loop Distribution with Arbitrary Control Flow - In Proceedings of Supercomputing '90, 1990
"... Loop distribution is an integral part of transforming a sequential program into a parallel one. It is used extensively in parallelization, vectorization, and memory management. For loops with control flow, previous methods for loop distribution have significant drawbacks. We present a new algorithm ..."
Cited by 47 (5 self)
Loop distribution is an integral part of transforming a sequential program into a parallel one. It is used extensively in parallelization, vectorization, and memory management. For loops with control flow, previous methods for loop distribution have significant drawbacks. We present a new algorithm for loop distribution in the presence of control flow modeled by a control dependence graph. This algorithm is shown to be optimal in that it generates the minimum possible number of new arrays and tests. We also present a code generation algorithm that produces code for the resulting program without replicating statements or conditions. Although these algorithms are being developed for use in an interactive parallel programming environment for Fortran, they are very general and can be used in automatic parallelization and vectorization systems.
Keywords: parallelization, vectorization, transformation, control dependence, data dependence, loop distribution
Systems Development Methodologies: The Problem of Tenses - Information Technology & People, 2000
"... This paper presents two fundamental arguments. Firstly, it is proposed that most of the currently-available systems development methodologies are founded on concepts which emerged in the period from about 1967 to 1977. This argument is presented through the use of literature references. The second a ..."
Cited by 43 (0 self)
This paper presents two fundamental arguments. Firstly, it is proposed that most of the currently available systems development methodologies are founded on concepts which emerged in the period from about 1967 to 1977. This argument is presented through the use of literature references. The second argument is that the profile of the development environment now faced in organisations is very different from that which prevailed in the period 1967 to 1977. This is illustrated through original empirical research carried out by the author which supports this argument, and by contrasting these findings with those of previous studies in the literature. It is therefore argued that there is a need to update ‘tenses’ by deriving new methodological canons more appropriate to the needs of the current development environment. Some suggestions for new methodological canons appropriate to the current development environment are provided.