Results 1 - 10 of 15
Vision of the Unseen: Current Trends and Challenges in Digital Image and Video Forensics
Cited by 20 (6 self)
Digital images are everywhere—from our cell phones to the pages of our online news sites. How we choose to use digital image processing raises a surprising host of legal and ethical questions that we must address. What are the ramifications of hiding data within an innocent image? Is this an intentional security practice when used legitimately, or intentional deception? Is tampering with an image appropriate in cases where the image might affect public behavior? Does an image represent a crime, or is it simply a representation of a scene that has never existed? Before action can even be taken on the basis of a questionable image, we must detect something about the image itself. Investigators from a diverse set of fields require the best possible tools to tackle the challenges presented by the malicious use of today's digital image processing techniques. In this survey, we introduce the emerging field of digital image forensics, including the main topic areas of source camera identification, forgery detection, and steganalysis. In source camera identification, we seek to identify the particular model of camera, or the exact camera, that produced an image. Forgery detection aims to establish the authenticity of an image, or to expose any tampering the image might have undergone. Steganalysis detects hidden data within an image and may attempt to recover any detected data. Each of these components of digital image forensics is described in detail, along with a critical analysis of the state of the art and recommendations for the direction of future research.
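The hidden-data detection task the survey describes can be illustrated with the classic pairs-of-values statistic behind the chi-square attack on LSB replacement. The sketch below is a minimal numpy illustration, not any method from the survey; the Gaussian test image is an assumed stand-in for real cover content.

```python
import numpy as np

def lsb_pair_statistic(pixels):
    """Pairs-of-values statistic behind the classic chi-square LSB attack.

    LSB replacement only moves values within the pairs (2k, 2k+1), so
    full-capacity embedding equalizes the two histogram bins of every
    pair; a small statistic on a non-uniform image is evidence of
    embedding.
    """
    hist = np.bincount(pixels.ravel(), minlength=256).astype(float)
    even, odd = hist[0::2], hist[1::2]
    expected = (even + odd) / 2.0
    mask = expected > 0                       # skip empty pairs
    return float(np.sum((even[mask] - expected[mask]) ** 2
                        / expected[mask]))

rng = np.random.default_rng(0)
cover = np.clip(rng.normal(128, 10, (256, 256)), 0, 255).astype(np.uint8)
# simulate full-capacity LSB replacement with random message bits
stego = (cover & 0xFE) | rng.integers(0, 2, cover.shape, dtype=np.uint8)

stat_cover = lsb_pair_statistic(cover)
stat_stego = lsb_pair_statistic(stego)   # much smaller: pair bins equalized
```

On the non-uniform cover the pair bins differ and the statistic is large; after full embedding it collapses toward its chance level.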
From Blind to Quantitative Steganalysis
Cited by 11 (8 self)
A quantitative steganalyzer is an estimator of the number of embedding changes introduced by a specific embedding operation. Since for most algorithms the number of embedding changes correlates with the message length, quantitative steganalyzers are important forensic tools. In this paper, a general method for constructing quantitative steganalyzers from features used in blind detectors is proposed. The core of the method is support vector regression, which is used to learn the mapping between a feature vector extracted from the investigated object and the embedding change rate. To demonstrate the generality of the proposed approach, quantitative steganalyzers are constructed for a variety of steganographic algorithms in both JPEG transform and spatial domains. The estimation accuracy is investigated in detail and compares favorably with state-of-the-art quantitative steganalyzers.
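As a rough illustration of the paper's idea (regressing from features to the change rate), here is a sketch using scikit-learn's SVR on synthetic data; the 4-D features and their linear dependence on the change rate are invented for the demo and stand in for real blind-steganalysis features.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)

# Hypothetical stand-in for blind-steganalysis features: a 4-D feature
# vector that drifts linearly with the embedding change rate, plus noise.
n = 400
rates = rng.uniform(0.0, 0.5, n)                       # true change rates
X = rates[:, None] * np.array([1.0, -0.5, 2.0, 0.3]) \
    + 0.02 * rng.standard_normal((n, 4))

# Support vector regression learns the feature -> change-rate mapping,
# turning a blind feature set into a quantitative steganalyzer.
reg = SVR(kernel="rbf", C=10.0, epsilon=0.005)
reg.fit(X[:300], rates[:300])
pred = reg.predict(X[300:])
mae = float(np.mean(np.abs(pred - rates[300:])))       # estimation accuracy
```

The same recipe works for any feature set: swap the synthetic `X` for features extracted from real images embedded at known change rates.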
Moving Steganography and Steganalysis from the Laboratory into the Real World
Cited by 9 (7 self)
There has been an explosion of academic literature on steganography and steganalysis in the past two decades. With a few exceptions, such papers address abstractions of the hiding and detection problems, which arguably have become disconnected from the real world. Most published results, including by the authors of this paper, apply "in laboratory conditions" and some are heavily hedged by assumptions and caveats; significant challenges must still be solved before good steganography and steganalysis can be implemented in practice. This position paper sets out some of the important questions which have been left unanswered, as well as highlighting some that have already been addressed successfully, for steganography and steganalysis to be used in the real world.
Selection-channel-aware rich model for steganalysis of digital images
- In Proc. IEEE WIFS, 2014
Cited by 6 (6 self)
From the perspective of signal detection theory, it seems obvious that knowing the probabilities with which the individual cover elements are modified during message embedding (the so-called probabilistic selection channel) should improve steganalysis. It is, however, not clear how to incorporate this information into steganalysis features when the detector is built as a classifier. In this paper, we propose a variant of the popular spatial rich model (SRM) that makes use of the selection channel. We demonstrate on three state-of-the-art content-adaptive steganographic schemes that even an imprecise knowledge of the embedding probabilities can substantially increase the detection accuracy in comparison with feature sets that do not consider the selection channel. Overly adaptive embedding schemes seem to be more vulnerable than schemes that spread the embedding changes more evenly throughout the cover.
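The core trick can be sketched as follows: when accumulating a residual co-occurrence, add the largest embedding-change probability among the pixels involved instead of adding 1 (in the spirit of the proposed selection-channel-aware features). The single first-order residual, truncation threshold, and weighting rule below are simplifying assumptions, not the full rich model.

```python
import numpy as np

def selection_aware_cooccurrence(image, beta, T=2):
    """Sketch of a selection-channel-aware co-occurrence.

    image : 2-D float array of pixel values
    beta  : 2-D array of estimated per-pixel embedding-change probabilities
    Bins touched by likely-modified pixels carry more weight, because each
    horizontal residual pair contributes the maximum change probability
    over its pixels rather than a plain count.
    """
    # first-order horizontal residual, quantized and truncated to [-T, T]
    r = np.clip(np.round(image[:, 1:] - image[:, :-1]), -T, T).astype(int)
    co = np.zeros((2 * T + 1, 2 * T + 1))
    for i in range(r.shape[0]):
        for j in range(r.shape[1] - 1):
            w = beta[i, j:j + 3].max()   # max probability over the clique
            co[r[i, j] + T, r[i, j + 1] + T] += w
    return co / max(co.sum(), 1e-12)     # normalize to a distribution
```

With a uniform `beta` this reduces (up to normalization) to an ordinary co-occurrence; the gain appears when `beta` concentrates on texture, as content-adaptive schemes do.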
Quantitative Structural Steganalysis of Jsteg
- 2010
Cited by 5 (2 self)
Quantitative steganalysis strives to estimate the change rate, defined as the relative number of embedding changes introduced by steganography. In this paper, we propose two new classes of quantitative steganalysis methods for the steganographic algorithm Jsteg. The first class obtains the change-rate estimate using a maximum likelihood estimator equipped with a precover model. While this approach provides better accuracy than existing structural attacks, it becomes computationally intractable with increasing complexity of the cover model. The second class of methods computes the change-rate estimate by minimizing an objective function constructed from a heuristically formed zero message hypothesis. The advantage of this heuristic approach is a low implementation complexity and a modular architecture that allows flexible incorporation of higher-order statistics of DCT coefficients. The proposed methods are experimentally compared with current state-of-the-art methods.
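For intuition about what these estimators target, the Jsteg embedding operation itself can be sketched in a few lines: LSB replacement in quantized DCT coefficients, skipping the values 0 and 1. This is a simplified illustration; in particular, handling negative coefficients via their two's-complement LSB is an assumption of this sketch.

```python
import numpy as np

def jsteg_embed(coeffs, bits):
    """Toy sketch of the Jsteg embedding rule (illustration only).

    Jsteg performs LSB replacement in quantized DCT coefficients while
    skipping the values 0 and 1, which it leaves untouched.
    """
    out = np.asarray(coeffs).copy()
    flat = out.ravel()
    k = 0
    for i in range(flat.size):
        if k == len(bits):
            break
        c = int(flat[i])
        if c in (0, 1):                      # Jsteg never embeds into 0 or 1
            continue
        flat[i] = (c & ~1) | int(bits[k])    # replace the LSB
        k += 1
    return out, k                            # stego coefficients, bits used
```

The change rate the paper estimates is the fraction of usable coefficients whose LSB actually flipped, which for a random message is about half the embedded bits.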
Estimating the Information Theoretic Optimal Stego Noise
- To appear in Proc. 8th International Workshop on Digital Watermarking, 2009
Cited by 3 (3 self)
We recently developed a new benchmark for steganography, underpinned by the square root law of capacity, called Steganographic Fisher Information (SFI). It is related to the multiplicative constant for the square root capacity rate and represents a truly information theoretic measure of asymptotic evidence. Given a very large corpus of covers from which the joint histograms can be estimated, an estimator for SFI was derived in [1], and certain aspects of embedding and detection were compared using this benchmark. In this paper we concentrate on the evidence presented by various spatial-domain embedding operations. We extend the technology of [1] in two ways, to convex combinations of arbitrary so-called independent embedding functions. We then apply the new techniques to estimate, in genuine sets of cover images, the spatial-domain stego noise shape which optimally trades evidence – in terms of asymptotic KL divergence – for capacity. The results suggest that the smallest embedding changes are optimal for cover images not exhibiting much noise, and also for cover images with significant saturation, but in noisy images it is superior to embed with more stego noise in fewer locations.
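The role of KL divergence as a measure of evidence can be illustrated with a small numpy sketch: take a cover histogram, derive the stego histogram for ±1 embedding at change rate β by convolving with the kernel [β/2, 1−β, β/2], and compute the divergence. The Gaussian cover model is an assumption for the demo and unrelated to the SFI estimator of the paper.

```python
import numpy as np

def kl_divergence(p_counts, q_counts, eps=1e-12):
    """Empirical KL divergence D(P||Q) between two histograms, in nats."""
    p = np.asarray(p_counts, float)
    p = p / p.sum()
    q = np.asarray(q_counts, float)
    q = q / q.sum()
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / np.maximum(q[mask], eps))))

# Assumed Gaussian cover pmf over 256 pixel values; +/-1 embedding at
# change rate beta convolves it with the kernel [beta/2, 1-beta, beta/2].
bins = np.arange(256)
p = np.exp(-0.5 * ((bins - 128.0) / 10.0) ** 2)
p /= p.sum()

def stego_pmf(p, beta):
    return np.convolve(p, [beta / 2, 1 - beta, beta / 2], mode="same")

kl_small = kl_divergence(p, stego_pmf(p, 0.05))
kl_large = kl_divergence(p, stego_pmf(p, 0.10))
```

For small β the divergence grows roughly quadratically in the change rate, which is exactly the scaling the square root law rests on.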
Rich model for steganalysis of color images
- In Proc. IEEE WIFS, 2014
Cited by 3 (3 self)
In this paper, we propose an extension of the spatial rich model for steganalysis of color images. The additional features are formed by three-dimensional co-occurrences of residuals computed from all three color channels, and their role is to capture dependencies across color channels. These CRM (color rich model) features are extremely powerful for detection of steganography in images that exhibit traces of color interpolation. Content-adaptive algorithms seem to be hurt much more because of their tendency to modify the same pixels in each channel. The efficiency of the proposed feature set is demonstrated on three different color versions of BOSSbase 1.01 and two steganographic algorithms – the non-adaptive LSB matching and WOW.
Index Terms: Steganalysis, steganography, color, rich models, security
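A minimal sketch of the cross-channel idea, assuming a single first-order horizontal residual and truncation threshold T=2 (the actual CRM uses many residuals and symmetrized co-occurrences):

```python
import numpy as np

def color_cooccurrence(img, T=2):
    """Sketch of a CRM-style three-dimensional co-occurrence.

    For each pixel, the quantized horizontal residuals of the R, G and B
    channels form a single 3-D index; the histogram of these triples
    captures dependencies across color channels, such as those left by
    color interpolation.
    """
    x = img.astype(float)
    res = np.clip(np.round(x[:, 1:, :] - x[:, :-1, :]), -T, T).astype(int)
    co = np.zeros((2 * T + 1,) * 3)
    idx = (res[..., 0] + T, res[..., 1] + T, res[..., 2] + T)
    np.add.at(co, idx, 1)            # one count per pixel triple
    return co / co.sum()
```

Because all three channel residuals index one joint bin, an embedding that modifies the same pixel in each channel perturbs these bins far more than three independent per-channel histograms would reveal.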
JPEG-Compatibility Steganalysis Using Block-Histogram of Recompression Artifacts
Cited by 2 (2 self)
JPEG-compatibility steganalysis detects the presence of embedding changes using the fact that the stego image was previously JPEG compressed. Following the previous art, we work with the difference between the stego image and an estimate of the cover image obtained by recompression with a JPEG quantization table estimated from the stego image. To better distinguish recompression artifacts from embedding changes, the difference image is represented using a feature vector in the form of a histogram of the number of mismatched pixels in 8 × 8 blocks. Three types of classifiers are built to assess the detection accuracy and compare the performance to prior art: a clairvoyant detector trained for a fixed embedding change rate, a constant false-alarm rate detector for an unknown change rate, and a quantitative detector. The proposed approach offers significantly more accurate detection across a wide range of quality factors and embedding operations, especially for very small change rates. The technique requires an accurate estimate of the JPEG compression parameters.
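The block-histogram feature itself is easy to sketch: count mismatched pixels per 8 × 8 block between the stego image and its recompressed cover estimate, then histogram the counts. The sketch below assumes the recompressed estimate has already been computed.

```python
import numpy as np

def block_mismatch_histogram(stego, recompressed):
    """Histogram of the number of mismatched pixels per 8x8 block.

    `recompressed` plays the role of the cover estimate obtained by
    re-JPEG-compressing the stego image; the feature is the histogram,
    over all 8x8 blocks, of how many of the 64 pixels differ.
    """
    diff = (stego != recompressed)
    h, w = diff.shape
    h8, w8 = h - h % 8, w - w % 8            # crop to whole blocks
    blocks = diff[:h8, :w8].reshape(h8 // 8, 8, w8 // 8, 8)
    counts = blocks.sum(axis=(1, 3)).ravel() # mismatches per block
    hist = np.bincount(counts, minlength=65)[:65]
    return hist / hist.sum()                 # 65 bins: 0..64 mismatches
```

For a genuine cover, recompression reproduces most blocks exactly, so the mass concentrates in bin 0; embedding shifts mass toward small nonzero counts.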
An epistemological approach to steganography
- In Information Hiding, 2009
Cited by 2 (1 self)
Steganography has been studied extensively in the light of information, complexity, probability and signal processing theory. This paper adds epistemology to the list and argues that Simmons' seminal prisoners' problem has an empirical dimension, which cannot be ignored (or defined away) without simplifying the problem substantially. An introduction to the epistemological perspective on steganography is given, along with a structured discussion on how the novel perspective fits into the existing body of literature.
A Survey of Steganography and Steganalysis Technique in Image, Text, Audio and Video as Cover Carrier
- 2011
Cited by 2 (0 self)
The staggering growth in communication technology and usage of public domain channels (i.e., the Internet) has greatly facilitated the transfer of data. However, such open communication channels are highly vulnerable to security threats that enable unauthorized information access. Traditionally, encryption is used to realize communication security; however, the information is no longer protected once it is decrypted. Steganography is the art and science of communicating in a way which hides the existence of the communication: important information is first hidden in host data, such as a digital image, text, video or audio file, and then transmitted secretly to the receiver. Steganalysis is another important topic in information hiding: the art of detecting the presence of steganography. This paper provides a critical review of steganography and analyzes the characteristics of the various cover media, namely image, text, audio and video, with respect to the fundamental concepts, the progress of steganographic methods, and the development of the corresponding steganalysis schemes.
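As a concrete example of the image case described above, textbook LSB replacement in a grayscale image can be written in a few lines of numpy; this is the simplest possible scheme, shown for illustration only, and it is the kind of embedding much of the surveyed steganalysis literature is designed to detect.

```python
import numpy as np

def lsb_embed(cover, bits):
    """Textbook LSB replacement in a grayscale image (illustration only)."""
    stego = cover.ravel().copy()
    bits = np.asarray(bits, dtype=np.uint8)
    stego[:bits.size] = (stego[:bits.size] & 0xFE) | bits  # overwrite LSBs
    return stego.reshape(cover.shape)

def lsb_extract(stego, n_bits):
    """Read the message back from the first n_bits least significant bits."""
    return stego.ravel()[:n_bits] & 1
```

Each pixel changes by at most one gray level, so the stego image is visually identical to the cover; the statistical traces, however, are exactly what attacks such as the chi-square test exploit.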