{"title":"基于每像素片段列表的实时运动模糊","authors":"Jinhyung Choi, Kyoungsu Oh","doi":"10.1145/3129676.3129692","DOIUrl":null,"url":null,"abstract":"Motion-blur effect helps users recognize fast-moving objects in 3D scenes and virtual environments. Recently, the post-processing technique is one of the most commonly used techniques for motion-blur rendering. However, this algorithm has artifacts when there are complex moving directions. In this paper, we present a new algorithm to resolve those artifacts. First, we find pixel locations between t0 and t1 for all moving pixels. t0 and t1 are meant the start and end time respectively, during the period of one frame in which some object moves. We find pixel locations passing between two times on the screen with Bresenham's algorithm. And we store fragments to linked-lists on this position. Theses fragments contain information depth, time and color of a pixel. After we run visibility testing for every fragment and we set the average color from t0 and t1 with determined visible fragment's data. The result of our algorithm can render similar to the accumulation buffer algorithm without artifacts in interactively. We try to contribute a better quality image for motion-blurred. And we suggest a forward processing motion blur in real-time by linked-list. This method is a part of graphics techniques for complex reality 3D scenes. Therefore, we expect to make the better quality and speed of 3D games and virtual reality though this paper.","PeriodicalId":326100,"journal":{"name":"Proceedings of the International Conference on Research in Adaptive and Convergent Systems","volume":"63 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-09-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Real-time motion blur based on per pixel fragment list\",\"authors\":\"Jinhyung Choi, Kyoungsu Oh\",\"doi\":\"10.1145/3129676.3129692\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Motion-blur effect helps users recognize fast-moving objects in 3D scenes and virtual environments. Recently, the post-processing technique is one of the most commonly used techniques for motion-blur rendering. However, this algorithm has artifacts when there are complex moving directions. In this paper, we present a new algorithm to resolve those artifacts. First, we find pixel locations between t0 and t1 for all moving pixels. t0 and t1 are meant the start and end time respectively, during the period of one frame in which some object moves. We find pixel locations passing between two times on the screen with Bresenham's algorithm. And we store fragments to linked-lists on this position. Theses fragments contain information depth, time and color of a pixel. After we run visibility testing for every fragment and we set the average color from t0 and t1 with determined visible fragment's data. The result of our algorithm can render similar to the accumulation buffer algorithm without artifacts in interactively. We try to contribute a better quality image for motion-blurred. And we suggest a forward processing motion blur in real-time by linked-list. This method is a part of graphics techniques for complex reality 3D scenes. 
Therefore, we expect to make the better quality and speed of 3D games and virtual reality though this paper.\",\"PeriodicalId\":326100,\"journal\":{\"name\":\"Proceedings of the International Conference on Research in Adaptive and Convergent Systems\",\"volume\":\"63 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-09-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the International Conference on Research in Adaptive and Convergent Systems\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3129676.3129692\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the International Conference on Research in Adaptive and Convergent Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3129676.3129692","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Real-time motion blur based on per pixel fragment list
The motion-blur effect helps users recognize fast-moving objects in 3D scenes and virtual environments. Post-processing is currently one of the most commonly used techniques for motion-blur rendering, but it produces artifacts when motion directions are complex. In this paper, we present a new algorithm that resolves those artifacts. First, for every moving pixel, we find the screen positions it passes through between t0 and t1, where t0 and t1 denote the start and end of the frame interval during which the object moves. We traverse these positions with Bresenham's line algorithm and, at each position, store a fragment in a per-pixel linked list. Each fragment contains the depth, time, and color of the pixel. We then run a visibility test for every fragment and compute the final pixel color as the average of the visible fragments over the interval from t0 to t1. Our algorithm produces results comparable to the accumulation-buffer method, without its artifacts, at interactive rates. We aim to contribute higher-quality motion-blurred images and propose a forward, real-time motion-blur technique based on per-pixel linked lists. This method is part of a family of graphics techniques for complex, realistic 3D scenes; we therefore expect it to improve both the quality and the speed of 3D games and virtual reality applications.
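The abstract itself contains no code, so the following is a minimal CPU sketch of the per-pixel fragment-list idea it describes: fragments stamped with depth, time, and color are scattered along a Bresenham line between a pixel's positions at t0 and t1, collected in per-pixel linked lists, and then resolved by a simple visibility test and a temporal average. All names (FragmentLists, scatterMotionPath, resolvePixel, timeSamples), the 0xAARRGGBB color packing, and the time-bucketed visibility test are illustrative assumptions, not the authors' implementation, which targets the GPU.

```cpp
// Minimal CPU sketch of per-pixel fragment-list motion blur (assumptions noted above).
#include <algorithm>
#include <cstdint>
#include <cstdlib>
#include <limits>
#include <vector>

struct Fragment {
    float depth;   // depth of the moving surface at time t
    float t;       // normalized time in [0, 1] within the frame (t0 = 0, t1 = 1)
    uint32_t rgba; // packed 0xAARRGGBB color
    int next;      // index of the next fragment in this pixel's list, -1 = end
};

struct FragmentLists {
    int width, height;
    std::vector<int> head;      // per-pixel head index into the pool, -1 = empty list
    std::vector<Fragment> pool; // shared node storage for all linked lists

    FragmentLists(int w, int h) : width(w), height(h), head(w * h, -1) {}

    void insert(int x, int y, Fragment f) {
        if (x < 0 || y < 0 || x >= width || y >= height) return;
        f.next = head[y * width + x];          // push-front, as a GPU linked list would
        head[y * width + x] = (int)pool.size();
        pool.push_back(f);
    }
};

// Walk the screen-space path of a fragment moving from (x0, y0) at t0 to (x1, y1)
// at t1 with Bresenham's line algorithm, storing one time-stamped fragment per
// covered pixel.
void scatterMotionPath(FragmentLists& lists, int x0, int y0, int x1, int y1,
                       float depth0, float depth1, uint32_t color) {
    int dx = std::abs(x1 - x0), sx = x0 < x1 ? 1 : -1;
    int dy = -std::abs(y1 - y0), sy = y0 < y1 ? 1 : -1;
    int err = dx + dy;
    int steps = std::max(dx, -dy), step = 0;
    int x = x0, y = y0;
    while (true) {
        float t = steps > 0 ? (float)step / (float)steps : 0.0f;
        lists.insert(x, y, {depth0 + t * (depth1 - depth0), t, color, -1});
        if (x == x1 && y == y1) break;
        int e2 = 2 * err;
        if (e2 >= dy) { err += dy; x += sx; }
        if (e2 <= dx) { err += dx; y += sy; }
        ++step;
    }
}

// Resolve pass: discretize the frame interval into a few time samples, keep the
// nearest fragment per sample (a simple visibility test), and average the
// surviving colors with the background to get the blurred pixel.
uint32_t resolvePixel(const FragmentLists& lists, int x, int y,
                      uint32_t background, int timeSamples = 8) {
    std::vector<float> bestDepth(timeSamples, std::numeric_limits<float>::max());
    std::vector<uint32_t> bestColor(timeSamples, background);

    for (int i = lists.head[y * lists.width + x]; i != -1; i = lists.pool[i].next) {
        const Fragment& f = lists.pool[i];
        int slot = std::min(timeSamples - 1, (int)(f.t * timeSamples));
        if (f.depth < bestDepth[slot]) {   // nearest fragment wins at this time sample
            bestDepth[slot] = f.depth;
            bestColor[slot] = f.rgba;
        }
    }

    // Box-filter the visible colors over the frame interval.
    unsigned r = 0, g = 0, b = 0;
    for (int s = 0; s < timeSamples; ++s) {
        r += (bestColor[s] >> 16) & 0xffu;
        g += (bestColor[s] >> 8) & 0xffu;
        b += bestColor[s] & 0xffu;
    }
    r /= timeSamples; g /= timeSamples; b /= timeSamples;
    return 0xff000000u | (r << 16) | (g << 8) | b;
}
```

In a real renderer the scatter pass would run in a fragment or compute shader, building the linked lists with an atomic counter and a head-pointer image as in order-independent-transparency A-buffers; the CPU version above only illustrates the data flow from Bresenham traversal to per-pixel lists to the averaged result.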