Latest publications from the 2011 International Conference on Advanced Computer Science and Information Systems

Optimization of knowledge sharing through Multi-Forum using cloud computing architecture
M. Sriram, Srivatsan Sankaran
Knowledge sharing is carried out through various forums, each of which requires a separate login in its own browser instance. Here a single Multi-Forum knowledge sharing concept is introduced that requires only one login session, allowing the user to connect to multiple forums and view their data in a single browser window. A few optimization techniques based on a cloud computing architecture are also introduced to speed up access time.
Pub Date: 2011-12-24 DOI: 10.1117/12.920151
Citations: 1
Enriching time series datasets using Nonparametric kernel regression to improve forecasting accuracy
Pub Date: 2011-12-01 DOI: 10.6084/M9.FIGSHARE.1609661.V1
Agus Widodo, Mohamad Ivan Fanani, I. Budi
Improving the accuracy of predicting future values from past and current observations has been pursued by enhancing prediction methods, combining those methods, or performing data pre-processing. In this paper another approach is taken: increasing the number of inputs in the dataset. This approach is especially useful for shorter time series. By filling in the in-between values of a time series, the size of the training set can be increased, improving the generalization capability of the predictor. The prediction algorithm is a neural network, as it is widely used in the literature for time series tasks; support vector regression is also employed for comparison. The datasets used in the experiments are the frequencies of USPTO patents and of PubMed scientific publications in the field of health, namely on apnea, arrhythmia, and sleep stages. A time series from the NN3 Competition in the field of transportation is also used for benchmarking. The experimental results show that prediction performance can be significantly improved by filling in in-between data. Furthermore, detrending and deseasonalization, which separate the data into trend, seasonal, and stationary components, also improve prediction performance on both the original and the enriched datasets. The optimal enlargement of the dataset in this experiment is about five times the length of the original dataset.
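The enrichment step described in the abstract, filling in-between values of a short series with nonparametric kernel regression, can be sketched as follows. This is a minimal Nadaraya-Watson estimator with a Gaussian kernel; the series values and the bandwidth are hypothetical, not taken from the paper.

```python
import numpy as np

def nw_kernel_regression(t, y, t_query, bandwidth=1.0):
    """Nadaraya-Watson kernel regression with a Gaussian kernel.

    Each queried point is a weighted average of the observed values,
    with weights decaying smoothly with temporal distance."""
    t = np.asarray(t, dtype=float)
    y = np.asarray(y, dtype=float)
    preds = []
    for tq in np.atleast_1d(t_query):
        w = np.exp(-0.5 * ((t - tq) / bandwidth) ** 2)
        preds.append(np.sum(w * y) / np.sum(w))
    return np.array(preds)

# A short observed series (hypothetical yearly counts).
t_obs = np.arange(10)
y_obs = np.array([3, 5, 4, 6, 8, 7, 9, 12, 11, 14], dtype=float)

# Query at 5x the original resolution, roughly matching the
# enlargement the paper reports as optimal.
t_dense = np.linspace(t_obs[0], t_obs[-1], 5 * len(t_obs))
y_dense = nw_kernel_regression(t_obs, y_obs, t_dense, bandwidth=0.6)
```

The enriched pairs `(t_dense, y_dense)` would then serve as the enlarged training set for the neural network or support vector regressor.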
Citations: 1
A normalization method of converting online handwritten Chinese character to Stroke-Segment-Mesh Glyph
Pub Date: 2011-12-01 DOI: 10.1007/978-3-642-27951-5_32
Hanquan Huang
Citations: 1
Model simplification in Petri net models
R. Davidrajuh
Model simplification is a methodology for reducing the size and complexity of models, e.g. by moving some of the details out of the model and into the model implementation code. This paper discusses support for Petri net model simplification in a new tool for modeling and simulation of discrete event dynamic systems. First, it gives a brief introduction to model abstraction and model simplification. Second, it introduces the new tool, known as GPenSIM. Third, through a case study, it shows how model simplification can be done in GPenSIM and how effective and useful such simplification can be.
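The core idea, moving details out of the net structure and into implementation code, can be illustrated with a minimal Python sketch. This is not GPenSIM's actual API (GPenSIM is MATLAB-based); the class, the place names, and the guard rule are all hypothetical, chosen only to show a condition expressed as code rather than as extra places and arcs.

```python
# Minimal Petri net sketch (hypothetical, not GPenSIM's API).
# A transition may carry a guard function: a firing condition kept
# in code instead of being encoded with extra places and arcs.

class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)   # place -> token count
        self.transitions = []          # (name, inputs, outputs, guard)

    def add_transition(self, name, inputs, outputs, guard=None):
        self.transitions.append((name, inputs, outputs, guard))

    def enabled(self, t):
        name, inputs, outputs, guard = t
        if any(self.marking.get(p, 0) < n for p, n in inputs.items()):
            return False
        return guard(self.marking) if guard else True

    def fire(self, name):
        for t in self.transitions:
            if t[0] == name and self.enabled(t):
                for p, n in t[1].items():
                    self.marking[p] -= n
                for p, n in t[2].items():
                    self.marking[p] = self.marking.get(p, 0) + n
                return True
        return False

# "Ship only when 3 parts are ready" lives in a guard function,
# keeping the net itself small.
net = PetriNet({"parts": 3, "shipped": 0})
net.add_transition("ship", {"parts": 3}, {"shipped": 1},
                   guard=lambda m: m["parts"] >= 3)
net.fire("ship")   # consumes 3 "parts" tokens, produces 1 "shipped" token
```

The trade-off the paper examines is exactly this: a simpler net diagram at the cost of behavior that is only visible in the implementation code.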
Pub Date: 2011-11-16 DOI: 10.1109/EMS.2011.91
Citations: 0