{"title":"Data Cleansing Processing using Pentaho Data Integration: Case Study Data Deduplication","authors":"D. Setyawan, T. F. Kusumasari, E. N. Alam","doi":"10.1109/ICST50505.2020.9732824","DOIUrl":null,"url":null,"abstract":"Now is the era of data. Every field has data and uses it to progress towards an innovative future. But often, the amount of data that is not balanced with good data quality ranges from differences in data formats, duplicate data, and errors in the data input process. One technique for maintaining and improving data quality is the data cleansing technique. This paper aims to propose data cleansing processing in the case of data deduplication cases using Pentaho Data Integration tools. Pentaho Data Integration done in 4 phases: Analyze, Mapping function, Design and setting, and Evaluation and test. PDI results are tested and compared with the Talend Open Studio tool. The dataset tested was data on factory names at a company in Indonesia tasked with overseeing the distribution of medicines and food. This research is expected to meet the needs of companies, especially in the field of data quality management, especially cases of data duplication and to find out the comparative results of the tools used.","PeriodicalId":125807,"journal":{"name":"2020 6th International Conference on Science and Technology (ICST)","volume":"39 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-09-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 6th International Conference on Science and Technology (ICST)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICST50505.2020.9732824","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0
Abstract
We now live in the era of data. Every field produces data and uses it to progress toward an innovative future. However, the volume of data is often not matched by good data quality: problems range from inconsistent data formats and duplicate records to errors introduced during data entry. Data cleansing is one technique for maintaining and improving data quality. This paper proposes a data cleansing process for the case of data deduplication using the Pentaho Data Integration (PDI) tool. The process is carried out in four phases: analysis, mapping of functions, design and configuration, and evaluation and testing. The PDI results are tested and compared with the Talend Open Studio tool. The dataset tested contains factory names from a company in Indonesia tasked with overseeing the distribution of medicines and food. This research is expected to meet the needs of companies in the field of data quality management, particularly for cases of data duplication, and to report the comparative results of the tools used.
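The abstract does not include the PDI transformation itself. As a minimal illustration of the kind of deduplication step it describes (matching duplicate factory-name records), the following Python/pandas sketch normalizes names and drops duplicates of the normalized key. The column name `factory_name` and the sample values are assumptions for illustration only, not data or code from the paper, and the normalization shown is a simplified stand-in for the steps a PDI or Talend transformation would typically chain.

```python
import pandas as pd

# Hypothetical sample of factory-name records; the real dataset
# (factory names from an Indonesian medicine/food distribution
# oversight company) is not published with the abstract.
records = pd.DataFrame({
    "factory_name": [
        "PT Kimia Farma Tbk",
        "pt kimia farma tbk",
        "PT  Kimia Farma Tbk ",
        "PT Kalbe Farma",
    ]
})

# Normalize case and whitespace before matching, then drop exact
# duplicates of the normalized key -- a simplified example of a
# deduplication rule, not the paper's actual cleansing workflow.
records["normalized"] = (
    records["factory_name"]
    .str.strip()
    .str.lower()
    .str.replace(r"\s+", " ", regex=True)
)
deduplicated = records.drop_duplicates(subset="normalized")
print(deduplicated[["factory_name"]])
```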