Data Reduction Based on GPU
Abstract
Graphics Processing Units (GPUs) are becoming increasingly popular among application developers as data-parallel coprocessors. Research on general-purpose programming for graphics processors is also growing rapidly: the GPU, originally designed for graphics computation, is now used not only for rendering graphics but also for many other applications. Moreover, GPUs are becoming cheaper while offering high computational power. Data reduction is a key problem in rough set theory and its applications, and rough set theory is one of the most useful data mining techniques. This paper focuses on data reduction (finding minimal sets of data) on the GPU using CUDA programming. Two major processes must be solved: first, the data reduction process itself (finding a minimal data set) using rough set theory; second, executing that data reduction on the GPU using CUDA programming. By using shared memory and thread blocks on the GPU, the data reduction process becomes faster and more efficient. The experimental results show that computing on the GPU is faster and more efficient than computing on the CPU.
Keywords: GPU, CUDA, Rough Set, Data Reduction
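To make the rough-set side of the abstract concrete, the following is a minimal sketch of attribute (data) reduction on a decision table, using a greedy positive-region heuristic. This is an illustration of the general technique, not the paper's exact algorithm or its CUDA kernel; the table, attribute names (`a`, `b`, `c`, `d`), and helper functions are assumptions for the example.

```python
# Sketch of rough-set attribute reduction via the positive region.
# NOT the paper's implementation: a plain-Python greedy reduct search.

def partition(rows, attrs):
    """Group row indices into equivalence classes by their values on attrs."""
    blocks = {}
    for i, row in enumerate(rows):
        key = tuple(row[a] for a in attrs)
        blocks.setdefault(key, []).append(i)
    return list(blocks.values())

def positive_region_size(rows, cond_attrs, decision):
    """Count rows whose equivalence class is consistent on the decision."""
    count = 0
    for block in partition(rows, cond_attrs):
        decisions = {rows[i][decision] for i in block}
        if len(decisions) == 1:  # the class determines the decision uniquely
            count += len(block)
    return count

def greedy_reduct(rows, cond_attrs, decision):
    """Greedily add the attribute that grows the positive region the most."""
    full = positive_region_size(rows, cond_attrs, decision)
    reduct = []
    while positive_region_size(rows, reduct, decision) < full:
        best = max(
            (a for a in cond_attrs if a not in reduct),
            key=lambda a: positive_region_size(rows, reduct + [a], decision),
        )
        reduct.append(best)
    return reduct

# Hypothetical 4-row decision table: conditions a, b, c; decision d.
table = [
    {'a': 0, 'b': 0, 'c': 0, 'd': 0},
    {'a': 0, 'b': 1, 'c': 1, 'd': 0},
    {'a': 1, 'b': 0, 'c': 1, 'd': 1},
    {'a': 1, 'b': 1, 'c': 0, 'd': 1},
]
# Here 'a' alone determines 'd', so the reduct is just ['a'].
reduct = greedy_reduct(table, ['a', 'b', 'c'], 'd')
```

On the GPU, the expensive inner step (partitioning rows and checking decision consistency) is what the paper maps onto thread blocks with shared memory, since each equivalence-class check over the rows is independent and data-parallel.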