U. Raju, Hillol Barman, R. K. Netalkar, Sanjay Kumar, Hariom Kumar
{"title":"基于Map-Reduce范式的分布式JPEG大图像数据压缩和解压缩","authors":"U. Raju, Hillol Barman, R. K. Netalkar, Sanjay Kumar, Hariom Kumar","doi":"10.13052/jmm1550-4646.1863","DOIUrl":null,"url":null,"abstract":"Digital data is primarily created and delivered in the form of images and videos in today’s world. Storing and transmitting such a large number of images necessitates a lot of computer resources, such as storage and bandwidth. So, rather than keeping the image data as is, the data could be compressed and then stored, which saves a lot of space. Image compression is the process of removing as much redundant data from an image as feasible while retaining only the non-redundant data. In this paper, the traditional JPEG compression technique is executed in the distributed environment with map-reduce paradigm on big image data. This technique is carried out in serial as well as in parallel fashion with different number of workers in order to show the time comparisons between these setups with the self-created large image dataset. In this, more than one Lakh (121,856) images are compressed and decompressed and the execution times are compared with three different setups: single system, Map-Reduce (MR) with 2 workers and MR with 4 workers. Compression on more than one Million (1,096,704) images using single system and MR with 4 workers is also done. To evaluate the efficiency of JPEG technique, two performance measures such as Compression Ratio (CR) and Peak Signal to Noise Ratio (PSNR) are used.","PeriodicalId":425561,"journal":{"name":"J. Mobile Multimedia","volume":"103 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-07-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Distributed JPEG Compression and Decompression for Big Image Data Using Map-Reduce Paradigm\",\"authors\":\"U. Raju, Hillol Barman, R. K. 
Netalkar, Sanjay Kumar, Hariom Kumar\",\"doi\":\"10.13052/jmm1550-4646.1863\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Digital data is primarily created and delivered in the form of images and videos in today’s world. Storing and transmitting such a large number of images necessitates a lot of computer resources, such as storage and bandwidth. So, rather than keeping the image data as is, the data could be compressed and then stored, which saves a lot of space. Image compression is the process of removing as much redundant data from an image as feasible while retaining only the non-redundant data. In this paper, the traditional JPEG compression technique is executed in the distributed environment with map-reduce paradigm on big image data. This technique is carried out in serial as well as in parallel fashion with different number of workers in order to show the time comparisons between these setups with the self-created large image dataset. In this, more than one Lakh (121,856) images are compressed and decompressed and the execution times are compared with three different setups: single system, Map-Reduce (MR) with 2 workers and MR with 4 workers. Compression on more than one Million (1,096,704) images using single system and MR with 4 workers is also done. To evaluate the efficiency of JPEG technique, two performance measures such as Compression Ratio (CR) and Peak Signal to Noise Ratio (PSNR) are used.\",\"PeriodicalId\":425561,\"journal\":{\"name\":\"J. Mobile Multimedia\",\"volume\":\"103 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-07-02\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"J. 
Mobile Multimedia\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.13052/jmm1550-4646.1863\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"J. Mobile Multimedia","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.13052/jmm1550-4646.1863","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Distributed JPEG Compression and Decompression for Big Image Data Using Map-Reduce Paradigm
Digital data today is created and delivered primarily as images and videos. Storing and transmitting such large numbers of images demands substantial computing resources, such as storage and bandwidth. Rather than keeping image data as-is, it can be compressed before storage, saving considerable space. Image compression is the process of removing as much redundant data from an image as feasible while retaining only the non-redundant data. In this paper, the traditional JPEG compression technique is executed in a distributed environment using the map-reduce paradigm on big image data. The technique is run both serially and in parallel with different numbers of workers to compare execution times across these setups on a self-created large image dataset. More than one lakh (121,856) images are compressed and decompressed, and the execution times are compared across three setups: a single system, Map-Reduce (MR) with 2 workers, and MR with 4 workers. Compression of more than one million (1,096,704) images is also performed on a single system and with MR using 4 workers. To evaluate the efficiency of the JPEG technique, two performance measures are used: Compression Ratio (CR) and Peak Signal-to-Noise Ratio (PSNR).
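The workflow the abstract describes — fan image compression out to a pool of map workers, then reduce the per-image metrics — can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses Python's `multiprocessing.Pool` as a stand-in for a map-reduce cluster, and simple pixel quantization plus `zlib` as a stand-in for the JPEG codec, while computing the two metrics the paper uses (CR and PSNR) with their standard definitions.

```python
import math
import zlib
from multiprocessing import Pool

def compress_image(pixels: bytes, q: int = 16) -> bytes:
    """Lossy 'compress': quantize 8-bit pixel values (a crude stand-in
    for JPEG's DCT + quantization step), then entropy-code with zlib."""
    quantized = bytes((p // q) * q for p in pixels)
    return zlib.compress(quantized)

def decompress_image(blob: bytes) -> bytes:
    return zlib.decompress(blob)

def metrics(original: bytes, blob: bytes):
    """Compression Ratio and PSNR for one image.
    CR = original size / compressed size; PSNR = 10*log10(255^2 / MSE)."""
    recon = decompress_image(blob)
    cr = len(original) / len(blob)
    mse = sum((a - b) ** 2 for a, b in zip(original, recon)) / len(original)
    psnr = float("inf") if mse == 0 else 10 * math.log10(255 ** 2 / mse)
    return cr, psnr

def map_worker(img: bytes):
    """Map step: compress one image and emit its metrics."""
    return metrics(img, compress_image(img))

if __name__ == "__main__":
    # Synthetic 8-bit grayscale "images" as flat 64x64 byte buffers.
    images = [bytes((i * 7 + j) % 256 for j in range(64 * 64))
              for i in range(8)]
    with Pool(processes=4) as pool:        # analogous to "MR with 4 workers"
        results = pool.map(map_worker, images)
    # Reduce step: aggregate per-image metrics into dataset averages.
    avg_cr = sum(cr for cr, _ in results) / len(results)
    avg_psnr = sum(p for _, p in results) / len(results)
    print(f"avg CR = {avg_cr:.2f}, avg PSNR = {avg_psnr:.2f} dB")
```

Swapping the quantize-plus-zlib stand-in for a real JPEG encoder (e.g. via an imaging library) and the `Pool` for Hadoop/Spark workers recovers the paper's setup; the map/reduce split is the same either way.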