Results 1–5 of 5
A Fast O(N) Multiresolution Polygonal Approximation Algorithm for GPS Trajectory Simplification (2012)
Abstract

Cited by 10 (3 self)
Recent advances in geopositioning mobile phones have made it possible for users to collect large numbers of GPS trajectories by recording their location information. However, these mobile phones with built-in GPS devices usually record far more data than needed, which imposes both a heavy data-storage cost and a computationally expensive rendering burden on the Web browser. To address this practical problem, we present a fast polygonal approximation algorithm in 2D space for GPS trajectory simplification under the so-called integral square synchronous distance error criterion, with linear time complexity. The algorithm is designed and implemented as a bottom-up multiresolution method, where the input to the polygonal approximation at a coarser resolution is the polygonal curve obtained at the finer resolution. For each resolution (map scale), a priority-queue structure is exploited in graph construction to build the initial approximated curve. Once the polygonal curve is initialized, two fine-tuning algorithms are employed to reach the desired quality level. Experimental results validate that the proposed algorithm is fast and achieves better approximation results than existing competitive methods.
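The synchronous distance in the error criterion above compares a dropped point against the position interpolated at the same timestamp on the approximating segment. A minimal sketch of that idea (function and variable names are illustrative, not taken from the paper):

```python
def sed(p, a, b):
    """Squared synchronized Euclidean distance of point p=(t,x,y)
    to the segment a->b, measured at p's own timestamp."""
    ta, xa, ya = a
    tb, xb, yb = b
    t, x, y = p
    r = (t - ta) / (tb - ta)          # time ratio along the segment
    xs = xa + r * (xb - xa)           # synchronized (interpolated) position
    ys = ya + r * (yb - ya)
    return (x - xs) ** 2 + (y - ys) ** 2

def segment_error(points, i, j):
    """Integral-square SED incurred by dropping points i+1..j-1
    and replacing them with the single segment points[i]->points[j]."""
    return sum(sed(points[k], points[i], points[j]) for k in range(i + 1, j))
```

A simplification algorithm under this criterion would keep merging segments while the accumulated `segment_error` stays below the tolerance for the current resolution.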
The Single Pixel GPS: Learning Big Data Signals from Tiny Coresets
Abstract

Cited by 1 (1 self)
We present algorithms for simplifying and clustering patterns from sensors such as GPS, LiDAR, and other devices that can produce high-dimensional signals. The algorithms are suitable for handling very large (e.g. terabytes of) streaming data and can be run in parallel on networks or clouds. Applications include compression, denoising, activity recognition, road matching, and map generation. We encode these problems as (k, m)-segment mean problems. Formally, we provide (1 + ε)-approximations to the k-segment and (k, m)-segment mean of a d-dimensional discrete-time signal. The k-segment mean is a k-piecewise linear function that minimizes the regression distance to the signal. The (k, m)-segment mean has an additional constraint that the projection of the k segments on R^d consists of only m ≤ k segments. Existing algorithms for these problems take O(kn²) and n^O(mk) time respectively and O(kn²) space, where n is the length of the signal. Our main tool is a new coreset for discrete-time signals. The coreset is a smart compression of the input signal that allows computation of a (1 + ε)-approximation to the k-segment or (k, m)-segment mean in O(n log n) time for arbitrary constants ε, k, and m. We use coresets to obtain a parallel algorithm that scans the signal in one pass, using space and update time per point that are polynomial in log n. We provide empirical evaluations of the quality of our coreset and experimental results that show how our coreset boosts both inefficient optimal algorithms and existing heuristics. We demonstrate our results by extracting signals from GPS traces. However, the results are more general and applicable to other types of sensors.
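The O(kn²)-style exact baseline that the coreset accelerates can be sketched as a dynamic program over least-squares segment costs. The sketch below is for a 1-D signal and is purely illustrative of the k-segment mean objective; it is not the paper's coreset construction:

```python
import numpy as np

def fit_cost(t, x):
    """Least-squares cost of fitting one linear segment to points (t, x)."""
    A = np.vstack([t, np.ones_like(t)]).T
    coef, res, *_ = np.linalg.lstsq(A, x, rcond=None)
    if res.size:                       # lstsq returns the residual SSE directly
        return float(res[0])
    pred = A @ coef                    # underdetermined case: compute manually
    return float(((x - pred) ** 2).sum())

def k_segment_mean(t, x, k):
    """Naive DP for the k-segment mean cost of a 1-D discrete-time signal.
    dp[j][m] = best cost of covering the first j points with m segments."""
    n = len(t)
    cost = [[fit_cost(t[i:j + 1], x[i:j + 1]) for j in range(n)]
            for i in range(n)]
    INF = float("inf")
    dp = [[INF] * (k + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for j in range(1, n + 1):
        for m in range(1, k + 1):
            for i in range(j):
                c = dp[i][m - 1] + cost[i][j - 1]
                if c < dp[j][m]:
                    dp[j][m] = c
    return dp[n][k]
```

The coreset approach replaces the O(n²) cost table with a compressed summary, which is what brings the running time down to O(n log n).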
Compression of GPS Trajectories using Optimized Approximation
Abstract
A large number of GPS trajectories, which include users' spatial and temporal information, have been collected by geopositioning mobile phones in recent years. The massive volume of trajectory data imposes heavy burdens on both network transmission and data storage. To overcome these difficulties, the GPS trajectory compression (GTC) algorithm was recently proposed; it optimizes both the data reduction by trajectory simplification and the coding procedure using the quantized data. In this paper, instead of using the greedy solution of the GTC algorithm, the approximation process is optimized jointly with the encoding step via dynamic programming. In addition, Bayes' theorem is applied to improve the robustness of probability estimation for encoded values. The proposed solution has the same time complexity as the GTC algorithm in the decoding procedure, and experimental results show that its bitrate is around 80% of that of the GTC algorithm.
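The abstract does not detail how Bayes' theorem is applied; a common and simple Bayesian way to robustify symbol-probability estimates for an entropy coder is additive (Laplace) smoothing, shown below as an illustrative stand-in rather than the paper's exact estimator:

```python
from collections import Counter

def smoothed_probs(symbols, alphabet, alpha=1.0):
    """Additive (Laplace) smoothing: a posterior-mean estimate under a
    uniform Dirichlet prior. Keeps unseen symbols from getting zero
    probability, which would be fatal for an entropy coder."""
    counts = Counter(symbols)
    total = len(symbols) + alpha * len(alphabet)
    return {s: (counts[s] + alpha) / total for s in alphabet}
```

With a short sample of encoded values, unseen alphabet symbols still receive small nonzero probabilities instead of zero.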
Low Complexity Spatial Similarity Measure of GPS Trajectories
Abstract
We attack the problem of trajectory similarity by approximating the trajectories using a geographical grid based on the MGRS 2D coordinate system. We propose a spatial similarity measure that is computationally feasible for big data collections. The proposed measure is based on cell matching with a similarity metric derived from the Jaccard index. We equip the proposed method with interpolation and dilation to overcome the problems of missing data and different sampling frequencies when comparing two trajectories. The proposed measure is implemented online in the framework of Mopsi (cs.uef.fi/mopsi).
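The core of the measure — rasterize each trajectory to grid cells, then compare the cell sets with the Jaccard index — can be sketched as follows. A plain axis-aligned grid stands in for the MGRS cells here, and the interpolation/dilation steps are omitted:

```python
def to_cells(points, cell=0.001):
    """Map (lat, lon) points to a set of grid-cell IDs (simple axis-aligned
    grid used here in place of the paper's MGRS cells)."""
    return {(int(lat // cell), int(lon // cell)) for lat, lon in points}

def jaccard_similarity(traj_a, traj_b, cell=0.001):
    """Jaccard index of the two trajectories' occupied-cell sets."""
    a, b = to_cells(traj_a, cell), to_cells(traj_b, cell)
    union = len(a | b)
    return len(a & b) / union if union else 1.0
```

Because each trajectory reduces to a set of cell IDs, comparing two trajectories is a set intersection/union rather than a quadratic point-to-point alignment, which is what makes the measure cheap at scale.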
Trajectory Simplification: On Minimizing the Direction-based Error
Abstract
Trajectory data is central to many applications with moving objects. Raw trajectory data is usually very large, and so is simplified before it is stored and processed. Many trajectory simplification notions have been proposed; among them, direction-preserving trajectory simplification (DPTS), which aims at protecting the direction information, has been shown to perform quite well. However, existing studies on DPTS require users to specify an error tolerance, which users might not know how to set properly in some cases (e.g., the error tolerance may only be known at some future time, and setting a single error tolerance does not meet the needs, since the simplified trajectories would usually be used in many different applications that accept different error tolerances). In these cases, a better solution is to minimize the error while achieving a predefined simplification size. For this purpose, in this paper, we define a problem called MinError and develop two exact algorithms and one 2-factor approximation algorithm for the problem. Extensive experiments on real datasets verify our algorithms.
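The direction-based error that DPTS controls can be illustrated as the largest angular deviation between each original segment and the single simplified segment replacing it. This is a minimal sketch of that error measure, not the paper's MinError algorithms:

```python
import math

def heading(p, q):
    """Heading angle (radians) of the segment p -> q."""
    return math.atan2(q[1] - p[1], q[0] - p[0])

def angle_diff(a, b):
    """Smallest absolute difference between two angles, in [0, pi]."""
    d = abs(a - b) % (2 * math.pi)
    return min(d, 2 * math.pi - d)

def direction_error(points, i, j):
    """Max angular deviation between each original segment in i..j-1 and
    the single segment points[i] -> points[j] that replaces them."""
    h = heading(points[i], points[j])
    return max(angle_diff(heading(points[k], points[k + 1]), h)
               for k in range(i, j))
```

A tolerance-driven DPTS method keeps this quantity below a user-supplied bound; MinError instead fixes the number of kept points and minimizes it.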