{"title":"Effect of Dropout layer on Classical Regression Problems","authors":"Atilla Özgür, Fatih Nar","doi":"10.1109/SIU49456.2020.9302054","DOIUrl":null,"url":null,"abstract":"In the last decade, deep learning architectures have provided good accuracy as they become deeper and wider in addition to other theoretical improvements. However, despite their current success, they initially faced with overfitting issue that limits their usage. The first practical and usable solution to overfitting in deep neural networks is a simple approach known as the dropout. Dropout is a regularization approach that randomly drops connections from earlier layers during training of neural nets. Dropout is a widely used technique, especially in image classification, speech recognition and natural language processing tasks, where features created by earlier layers are mostly redundant. Usage of the dropout layer in other tasks is largely unexplored. In this study, we seek an answer to question if the dropout layer is also useful for classical regression problems. A 3 layer deep learning net with a single dropout layer with various dropout levels tested on 8 real regression datasets. According to the experiments, the dropout layer does not help over fitting.","PeriodicalId":312627,"journal":{"name":"2020 28th Signal Processing and Communications Applications Conference (SIU)","volume":"7 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-10-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"11","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 28th Signal Processing and Communications Applications Conference (SIU)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SIU49456.2020.9302054","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 11
Abstract
In the last decade, deep learning architectures have provided good accuracy as they have become deeper and wider, in addition to other theoretical improvements. However, despite their current success, they initially faced an overfitting issue that limited their usage. The first practical and usable solution to overfitting in deep neural networks is a simple approach known as dropout. Dropout is a regularization approach that randomly drops units, along with their connections, from earlier layers during the training of neural nets. Dropout is a widely used technique, especially in image classification, speech recognition, and natural language processing tasks, where features created by earlier layers are mostly redundant. Usage of the dropout layer in other tasks is largely unexplored. In this study, we seek an answer to the question of whether the dropout layer is also useful for classical regression problems. A 3-layer deep learning net with a single dropout layer, at various dropout levels, was tested on 8 real regression datasets. According to the experiments, the dropout layer does not help with overfitting.
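A minimal sketch of the kind of setup the abstract describes: a 3-layer fully connected regression net with a single dropout layer, evaluated at several dropout rates. The layer widths, optimizer, epoch count, and placeholder data are illustrative assumptions (the paper does not specify them here); the sketch assumes a Keras implementation.

```python
# Sketch only: a 3-layer dense regression net with one dropout layer,
# swept over several dropout rates. All hyperparameters are assumptions.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

def build_regression_net(n_features: int, dropout_rate: float) -> keras.Model:
    """3-layer fully connected net with a single dropout layer."""
    model = keras.Sequential([
        layers.Input(shape=(n_features,)),
        layers.Dense(64, activation="relu"),
        layers.Dropout(dropout_rate),   # the single dropout layer under test
        layers.Dense(64, activation="relu"),
        layers.Dense(1),                # linear output for regression
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

# Placeholder data standing in for one of the real regression datasets.
X = np.random.rand(1000, 10).astype("float32")
y = np.random.rand(1000).astype("float32")

# Compare validation error across dropout rates, as in the experiments.
for rate in [0.0, 0.1, 0.25, 0.5]:
    model = build_regression_net(n_features=10, dropout_rate=rate)
    history = model.fit(X, y, validation_split=0.2, epochs=10, verbose=0)
    print(f"dropout={rate}: val_mse={history.history['val_loss'][-1]:.4f}")
```

The paper's finding corresponds to the case where nonzero dropout rates yield no improvement in validation error over the `rate=0.0` baseline.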