Opcodes as predictor for malware
Cited by 24 (0 self)
Abstract: This paper discusses a detection mechanism for malicious code through statistical analysis of opcode distributions. A total of 67 malware executables were sampled, statically disassembled, and their statistical opcode frequency distributions compared with the aggregate statistics of 20 non-malicious samples. We find that malware opcode distributions differ statistically significantly from those of non-malicious software. Furthermore, rare opcodes seem to be a stronger predictor, explaining 12–63% of frequency variation.
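The comparison described above can be sketched with a toy frequency analysis. The mnemonics, the sample listings, and the distance measure below are illustrative assumptions, not the paper's actual statistical test; real input would come from running a static disassembler over each executable:

```python
from collections import Counter

def opcode_distribution(opcodes):
    """Relative frequency of each opcode mnemonic in a disassembly listing."""
    counts = Counter(opcodes)
    total = sum(counts.values())
    return {op: n / total for op, n in counts.items()}

def distribution_distance(p, q):
    """Total-variation distance between two opcode distributions (0 to 1)."""
    ops = set(p) | set(q)
    return 0.5 * sum(abs(p.get(op, 0.0) - q.get(op, 0.0)) for op in ops)

# Hypothetical disassembly output (mnemonics only), for illustration.
benign = ["mov", "mov", "push", "call", "mov", "ret", "add"]
malware = ["mov", "xor", "jmp", "call", "int", "xor", "ret"]

d = distribution_distance(opcode_distribution(benign),
                          opcode_distribution(malware))
```

A large distance between a sample's distribution and the benign aggregate would then flag the sample for closer inspection.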
The Interactive Nature of Computing: Refuting the Strong Church-Turing Thesis
, 2007
Cited by 14 (0 self)
The classical view of computing positions computation as a closed-box transformation of inputs (rational numbers or finite strings) to outputs. According to the interactive view of computing, computation is an ongoing interactive process rather than a function-based transformation of an input to an output. Specifically, communication with the outside world happens during the computation, not before or after it. This approach radically changes our understanding of what computation is and how it is modeled. The acceptance of interaction as a new paradigm is hindered by the Strong Church-Turing Thesis (SCT), the widespread belief that Turing Machines (TMs) capture all computation, so that models of computation more expressive than TMs are impossible. In this paper, we show that SCT reinterprets the original Church-Turing Thesis (CTT) in a way that Turing never intended; its commonly assumed equivalence to the original is a myth. We identify and analyze the historical reasons for the widespread belief in SCT. Only by accepting that it is false can we begin to adopt interaction as an alternative paradigm of computation. We present Persistent Turing Machines (PTMs), which extend TMs to capture sequential interaction. PTMs allow us to formulate the Sequential Interaction Thesis, going beyond the expressiveness of TMs and of the CTT. The paradigm shift to interaction provides an alternative understanding of the nature of computing that better reflects the services provided by today’s computing technology.
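The notion of persistence can be sketched as a machine whose worktape survives between interaction steps. The class and toy transition below are our own illustration, not Goldin and Wegner's formal PTM definition:

```python
class PersistentMachine:
    """Sketch of a persistent machine: a transition function plus a
    worktape (here, a dict) that survives between macro-steps, so each
    output can depend on the whole interaction history, not just the
    current input."""

    def __init__(self):
        self.worktape = {}  # persists across macro-steps

    def step(self, token):
        # Toy transition: count how often each token has been seen and
        # answer with that count. A function-based (classical) view could
        # not produce these outputs from the current input alone.
        self.worktape[token] = self.worktape.get(token, 0) + 1
        return self.worktape[token]

m = PersistentMachine()
outputs = [m.step(t) for t in ["a", "b", "a", "a"]]  # history-dependent
```

The same input token ("a") yields different outputs at different points in the interaction, which is exactly what distinguishes a sequential interactive process from a single input-to-output transformation.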
Zeno machines and hypercomputation
 Theoretical Computer Science
Cited by 8 (0 self)
This paper reviews the Church-Turing Thesis (or rather, theses) with reference to their origin and application, and considers some models of “hypercomputation”, concentrating on perhaps the most straightforward option: Zeno machines (Turing machines with an accelerating clock). The halting problem is briefly discussed in a general context, and the suggestion that it is an inevitable companion of any reasonable computational model is emphasised. It is suggested that claims to have “broken the Turing barrier” could be toned down, and that the important and well-founded rôle of Turing computability in the mathematical sciences stands unchallenged.
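The accelerating-clock idea rests on a convergent geometric series, which a short sketch can make concrete (the function name is ours):

```python
# A Zeno (accelerated) Turing machine performs step n in 2**-n seconds,
# so infinitely many steps fit into a finite wall-clock interval:
#   sum over n >= 0 of 2**-n  =  2.

def elapsed_after(steps):
    """Wall-clock time consumed by the first `steps` accelerated steps."""
    return sum(2.0 ** -n for n in range(steps))

# The partial sums approach, but never reach, 2 seconds: this is why a
# Zeno machine could in principle run an unboundedly long simulation to
# completion inside a bounded interval.
times = [elapsed_after(k) for k in (1, 10, 50)]
```

The mathematical convergence is uncontroversial; as the abstract notes, it is the physical plausibility of such machines, and the claimed escape from the halting problem, that the paper scrutinises.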
The Mathematician's Bias and the Return to Embodied Computation. In H. Zenil (Ed.), A Computable Universe: Understanding Computation & Exploring Nature As Computation. World Scientific: New York/London/Singapore
, 2012
Cited by 7 (0 self)
There are growing uncertainties surrounding the classical model of computation established by Gödel, Church, Kleene, Turing and others from the 1930s onwards. The mismatch between the Turing-machine conception and the experiences of those more practically engaged in computing has parallels with the wider one between science and those working creatively or intuitively out in the ‘real’ world. The scientific outlook is more flexible and basic than some understand or want to admit. The science is subject to limitations which threaten careers. We look at embodiment and disembodiment of computation as the key to the mismatch, and find Turing had the right idea all along, amongst a productive confusion of ideas about computation in the real and the abstract worlds. When we get out of bed in the morning, we approach a complicated world of information with a determination not just to survive the day, though that may be hard enough: we mean to “compute” our way towards various vaguely defined ...
Definability as hypercomputational effect
 Applied Mathematics and Computation
Cited by 7 (6 self)
The classical simulation of physical processes using standard models of computation is fraught with problems. On the other hand, attempts at modelling real-world computation with the aim of isolating its hypercomputational content have struggled to convince. We argue that a better basic understanding can be achieved through computability-theoretic deconstruction of those physical phenomena most resistant to classical simulation. From this we may be able to better assess whether the hypercomputational enterprise is proleptic computer science, or of mainly philosophical interest.
The role of agent interaction in models of computation (panel summary)
 In Workshop on Foundations of Interactive Computation
, 2005
Semantics of Information as Interactive Computation
 in Moeller, M., Neuser, W. & Roth-Berghofer, T. (Eds.), Fifth International Workshop on Philosophy and Informatics, Kaiserslautern
, 2008
Cited by 2 (1 self)
Abstract. Computers today are not only calculation tools: they are directly (inter)acting in the physical world, which itself may be conceived of as the universal computer (Zuse, Fredkin, Wolfram, Chaitin, Lloyd). In expanding its domains from abstract logical symbol manipulation to physically embedded and networked devices, computing goes beyond the Church-Turing limit (Copeland, Siegelman, Burgin, Schachter). Computational processes are distributed, reactive, interactive, agent-based and concurrent. The main criterion of success of a computation is not its termination, but the adequacy of its response: its speed, generality, flexibility, adaptability, and tolerance to noise, error, faults, and damage. Interactive computing is a generalization of Turing computing, and it calls for new conceptualizations (Goldin, Wegner). In the info-computationalist framework, with computation seen as information processing, natural computation appears as the most suitable paradigm of computation, and information semantics requires logical pluralism.
Evolvable Virtual Machines
 Information Science Department, University of Otago
Cited by 1 (1 self)
The Evolvable Virtual Machine abstract architecture (EVMA) is a computational architecture for dynamic, hierarchically organised virtual machines. The concrete EVM instantiation (EVMI) builds on traditional stack-based models of computation and extends them with notions of hierarchy and reflection at the virtual-machine level. The EVM Universe is composed of a number of autonomous, asynchronously communicating EVM machines. The main contribution of this work lies in the new model of computation and in the architecture itself: a novel, compact, flexible and expressive representation of distributed concurrent computation. The EVMA provides a way of expressing and modelling autocatalytic networks composed of a hierarchical hypercycle of autopoietic subsystems characterised by self-adaptable structural tendencies and self-organised criticality. The EVMA provides capabilities for: a) self-learning of dynamical patterns through continuous observation of computable environments, b) self-compacting and generalisation of existing program structures, and c) emergence of efficient and robust communication code through appropriate machine assembly on both ends of a communication channel. The EVMA is, in one sense, a multidimensional generalisation of the stack machine for the purpose of modelling concurrent asynchronous processing. The EVMA approach can also be seen as a meta-evolutionary theory of evolution. The EVMA is designed to model systems that mimic living, autonomous, adaptable computational processes. The EVMI prototype has been designed and developed to conduct experimental studies on complex evolving systems. The generality of our approach not only provides the means to experiment with complex hierarchical, computational and evolutionary systems, but also a useful model with which to evaluate, share and discuss complex hierarchical systems in general.
The EVMA provides a novel methodology and language with which to pursue research on, understand, and talk about the evolution of complexity in living systems. In this thesis, we present the simple single-cell EVMI framework, discuss the multi-cell EVM Universe architecture, present experimental results, and propose further extensions, experimental studies, and possible hardware implementations of the EVMI.
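The stack-based core that the EVMI extends can be sketched as a minimal interpreter. The opcode names below are invented for illustration and are not the EVMI instruction set:

```python
def run(program, stack=None):
    """Minimal stack-machine interpreter in the spirit of the stack-based
    model the EVMI builds on. Integer literals push themselves; the small
    opcode set here is our own, chosen only to show the execution model
    that hierarchy and reflection would be layered on top of."""
    stack = [] if stack is None else stack
    for op in program:
        if isinstance(op, int):          # literals push themselves
            stack.append(op)
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "dup":
            stack.append(stack[-1])
        elif op == "swap":
            stack[-1], stack[-2] = stack[-2], stack[-1]
        else:
            raise ValueError(f"unknown opcode: {op}")
    return stack

result = run([2, 3, "add", "dup", "add"])  # (2+3)=5, duplicated, 5+5=10
```

An EVM Universe in this sketch would correspond to many such interpreters running concurrently and exchanging programs and data asynchronously.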
Cyber-Physical Systems and Events
Cited by 1 (0 self)
Abstract. This paper discusses event-based semantics in the context of the emerging concept of Cyber-Physical Systems and describes two related formal models concerning policy-based coordination and Interactive Agents.
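The flavour of event-based, policy-mediated coordination can be sketched as follows; the `EventBus` class, the policy hook, and the handler names are our own illustrative assumptions, not the paper's formal models:

```python
class EventBus:
    """Sketch of event-based coordination: handlers subscribe to the bus,
    and a policy function decides which subscribers each event is
    delivered to (a policy-based coordination hook)."""

    def __init__(self, policy=lambda event, handler: True):
        self.handlers = []
        self.policy = policy

    def subscribe(self, handler):
        self.handlers.append(handler)

    def publish(self, event):
        # Deliver the event only to handlers the policy admits.
        return [h(event) for h in self.handlers if self.policy(event, h)]

# Illustrative policy: handlers tagged `sensor_only` receive only
# "sensor" events; untagged handlers receive everything.
def sensor_policy(event, handler):
    return event.get("kind") == "sensor" or not getattr(handler, "sensor_only", False)

bus = EventBus(policy=sensor_policy)
def log(event): return ("log", event["kind"])
def react(event): return ("react", event["kind"])
react.sensor_only = True
bus.subscribe(log)
bus.subscribe(react)
delivered = bus.publish({"kind": "sensor"})  # both handlers fire
```

Changing the policy changes the coordination behaviour without touching the handlers themselves, which is the point of separating the two.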
Technical Report 2013-608: WHAT IS COMPUTATION?
Abstract: Three conditions are usually given that must be satisfied by a process in order for it to be called a computation: there must exist a finite-length algorithm for the process; the algorithm must terminate in finite time for valid inputs and return a valid output; and finally, the algorithm must never return an output for invalid inputs. These three conditions are advanced as being necessary and sufficient for the process to be computable by a universal model of computation. In fact, these conditions are neither necessary nor sufficient. On the one hand, recently defined paradigms show how certain processes that do not satisfy one or more of the aforementioned properties can indeed be carried out in principle on new, more powerful types of computers, and hence can be considered computations. Thus the conditions are not necessary. On the other hand, contemporary work in unconventional computation has demonstrated the existence of processes that satisfy the three stated conditions, yet contradict the Church-Turing Thesis and, more generally, the principle of universality in computer science. Thus the conditions are not sufficient.
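The three conditions can be made concrete with a toy partial function; the validity check and the deliberately diverging branch below are our own illustration:

```python
import itertools

def gcd_partial(a, b):
    """Illustrates the three classical conditions: a finite-length
    algorithm (Euclid's) that terminates with a valid output on valid
    inputs (positive integers) and, by design, never returns on invalid
    ones. The invalid branch diverges rather than raising, mirroring the
    condition that no output at all is produced for invalid inputs."""
    if not (isinstance(a, int) and isinstance(b, int) and a > 0 and b > 0):
        for _ in itertools.count():  # diverge: no output for invalid input
            pass
    while b:
        a, b = b, a % b
    return a
```

Note that satisfying all three conditions is exactly what the abstract argues is neither necessary nor sufficient for a process to count as a computation.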