Pub Date : 2013-01-01 DOI: 10.5176/2251-3043_3.1.230
Victor A. Clincy, Brandon Wilgor
{"title":"Qualitative Evaluation of Latency and Packet Loss in a Cloud-based Games","authors":"Victor A. Clincy, Brandon Wilgor","doi":"10.5176/2251-3043_3.1.230","DOIUrl":"https://doi.org/10.5176/2251-3043_3.1.230","url":null,"abstract":"","PeriodicalId":91079,"journal":{"name":"GSTF international journal on computing","volume":"06 1","pages":"48"},"PeriodicalIF":0.0,"publicationDate":"2013-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"85976115","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
H. Ziaimatin, T. Groza, Georgeta Bordea, P. Buitelaar, J. Hunter
Expertise modeling has been the subject of extensive research in two main disciplines: Information Retrieval (IR) and Social Network Analysis (SNA). Both IR and SNA approaches build the expertise model through a document-centric approach that provides a macro-perspective on the knowledge emerging from large corpora of static documents. With the emergence of the Web of Data, there has been a significant shift from static to evolving documents, through micro-contributions. Thus, the existing macro-perspective is no longer sufficient to track the evolution of both knowledge and expertise. In this paper we present a comprehensive, domain-agnostic model for expertise profiling in the context of dynamic, living documents and evolving knowledge bases. We showcase its application in the biomedical domain and analyze its performance using two manually created datasets.
{"title":"Expertise Profiling in Evolving Knowledge- curation Platforms","authors":"H. Ziaimatin, T. Groza, Georgeta Bordea, P. Buitelaar, J. Hunter","doi":"10.5176/2010-3043_2","DOIUrl":"https://doi.org/10.5176/2010-3043_2","url":null,"abstract":"Expertise modeling has been the subject of extensive research in two main disciplines: Information Retrieval (IR) and Social Network Analysis (SNA). Both IR and SNA approaches build the expertise model through a document-centric approach providing a macro-perspective on the knowledge emerging from large corpus of static documents. With the emergence of the Web of Data there has been a significant shift from static to evolving documents, through micro-contributions. Thus, the existing macro-perspective is no longer sufficient to track the evolution of both knowledge and expertise. In this paper we present a comprehensive, domain-agnostic model for expertise profiling in the context of dynamic, living documents and evolving knowledge bases. We showcase its application in the biomedical domain and analyze its performance using two manually created datasets.","PeriodicalId":91079,"journal":{"name":"GSTF international journal on computing","volume":"73 1","pages":"118-127"},"PeriodicalIF":0.0,"publicationDate":"2012-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"90919067","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2012-09-17 DOI: 10.5176/2251-1652_ADPC12.05
Keiichi Tamura, K. Hirahara, H. Kitakami, Shingo Tamura
Online documents on the Internet are represented as a document stream because they have a temporal order. This has resulted in numerous studies on extracting a frequent phenomenon (involving keywords, users, locations, etc.) known as a burst. Recently, with the growth of interest in social media, the number of documents created on the Internet has increased exponentially. Therefore, speeding up burst detection in a large-scale document stream is one of the most important challenges. In this paper, we propose a novel parallelization method for the parallel processing of Kleinberg’s burst detection algorithm in a large-scale document stream. Specifically, we present a technique that combines the inter-task parallelization model with the intra-task parallelization model. This combination achieves seamless dynamic load balancing and detects bursts in large-scale document streams in memory.
{"title":"Parallel Processing of Burst Detection in Large-Scale Document Streams and Its Performance Evaluation","authors":"Keiichi Tamura, K. Hirahara, H. Kitakami, Shingo Tamura","doi":"10.5176/2251-1652_ADPC12.05","DOIUrl":"https://doi.org/10.5176/2251-1652_ADPC12.05","url":null,"abstract":"Online documents on the Internet are represented as a document stream because the documents have a temporal order. This has resulted in numerous studies on extracting a frequent phenomenon (involving keywords, users, locations etc.) known as a burst. Recently, with the growth of interest in social media, the number of documents created on the Internet has increased exponentially. Therefore, the speed-up of burst detection in a large-scale document stream is one of the most important challenges. In this paper, we propose a novel parallelization method for the parallel processing of Kleinberg’s burst detection algorithm in a large-scale document stream. Specifically, we present a technique to combine the inter-task parallelization model with the intra-task parallelization model. This combination can achieve seamless dynamic load balancing and detect bursts in a large-scale document streams in memory.","PeriodicalId":91079,"journal":{"name":"GSTF international journal on computing","volume":"48 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2012-09-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"78980834","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
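As context for the parallelization described above, the underlying serial algorithm can be sketched as a small dynamic program. This is a minimal, illustrative two-state variant of Kleinberg's burst detector, not the authors' parallel implementation; the function name and the `s` and `gamma` parameters are assumptions made for the example:

```python
import math

def kleinberg_bursts(counts, totals, s=2.0, gamma=1.0):
    """Two-state Kleinberg-style burst detection over a document stream.

    counts[t]: relevant documents in window t; totals[t]: all documents.
    Returns the most likely state sequence (0 = base rate, 1 = burst)
    via a Viterbi-style dynamic program.
    """
    n = len(counts)
    p0 = sum(counts) / sum(totals)            # baseline relevance rate
    p1 = min(0.99, p0 * s)                    # elevated (burst) rate
    trans = gamma * math.log(n) if n > 1 else 0.0  # cost of entering burst state

    def cost(state, r, d):
        # negative binomial log-likelihood of r relevant docs out of d
        p = p1 if state else p0
        return -(r * math.log(p) + (d - r) * math.log(1 - p))

    INF = float("inf")
    best = [0.0, INF]                         # start in the base state
    back = []
    for r, d in zip(counts, totals):
        nxt, ptr = [INF, INF], [0, 0]
        for j in (0, 1):                      # next state
            for i in (0, 1):                  # previous state
                c = best[i] + (trans if (i, j) == (0, 1) else 0.0) + cost(j, r, d)
                if c < nxt[j]:
                    nxt[j], ptr[j] = c, i
        best = nxt
        back.append(ptr)
    # trace back the optimal state sequence
    states = [0] * n
    states[-1] = 0 if best[0] <= best[1] else 1
    for t in range(n - 1, 0, -1):
        states[t - 1] = back[t][states[t]]
    return states
```

Given per-window counts of relevant versus total documents, the returned 0/1 sequence marks the windows the model considers bursty; the inter-/intra-task parallelization in the paper distributes exactly this kind of per-stream computation.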
This paper presents a study of a one-year m-learning pilot project at Central University College in Ghana. The study was conducted through a user trial in which the m-learning tool AD-CONNECT was introduced in 44 courses with a total of 500 students and 22 lecturers at the College. The paper reports on the first experiences gained by both teachers and students by asking the following questions: What are the perceptions of teachers on m-learning? What are the effects of m-learning on students? What does m-learning contribute to face-to-face teaching and learning? Questionnaires were administered to students and lecturers to gather quantitative data on their views on the use of m-learning, particularly after using the AD-CONNECT m-learning system. Observations and interviews were also used to collect qualitative data from users.
{"title":"Mobile Learning Platform: A case study of introducing m-learning in Tertiary Education.","authors":"N. K. Annan, George Ofori-Dwumfou, M. Falch","doi":"10.1037/E527372013-005","DOIUrl":"https://doi.org/10.1037/E527372013-005","url":null,"abstract":"This paper presents a study on a one year m-learning pilot project at Central University College in Ghana. This was done through a user trial, where the m-learning tool AD-CONNECT is introduced in 44 courses with a total of 500 students and 22 lecturers at the College. The paper reports on the first experiences gained by both teachers and students by asking the following questions: What are the perceptions of teachers on m-learning? What are the effects of m-learning on students? What does m-learning contribute to face-to-face teaching and learning? Questionnaires were administered to students and lecturers to gather quantitative data on their views on the use of m-learning, particularly after using the AD-CONNECT M-Learning system. Also, observations and interviews were used to collect data from users which provided us with some qualitative data.","PeriodicalId":91079,"journal":{"name":"GSTF international journal on computing","volume":"21 1","pages":"23-28"},"PeriodicalIF":0.0,"publicationDate":"2012-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"80183576","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Lixin Li, Jie Tian, Xingyou Zhang, James B Holt, Reinhard Piltner
This paper investigates spatiotemporal interpolation methods for the application of air pollution assessment. The air pollutant of interest is fine particulate matter, PM2.5. The choice of time scale is investigated when applying the shape function-based method. It is found that the measurement scale of the time dimension has an impact on the quality of the interpolation results. Based on 10-fold cross-validation, the most effective of four experimental time scales was selected for the PM2.5 interpolation. The paper also estimates population exposure to ambient PM2.5 air pollution at the county level in the contiguous U.S. in 2009. The interpolated county-level PM2.5 was linked to 2009 population data, and the population with a risky PM2.5 exposure was estimated, where a risky exposure means a PM2.5 concentration exceeding the National Ambient Air Quality Standards. The geographic distribution of the counties with a risky PM2.5 exposure is visualized. This work is essential to understanding the associations between ambient air pollution exposure and population health outcomes.
{"title":"Estimating Population Exposure to Fine Particulate Matter in the Conterminous U.S. using Shape Function-based Spatiotemporal Interpolation Method: A County Level Analysis.","authors":"Lixin Li, Jie Tian, Xingyou Zhang, James B Holt, Reinhard Piltner","doi":"","DOIUrl":"","url":null,"abstract":"<p><p>This paper investigates spatiotemporal interpolation methods for the application of air pollution assessment. The air pollutant of interest in this paper is fine particulate matter PM<sub>2.5</sub>. The choice of the time scale is investigated when applying the shape function-based method. It is found that the measurement scale of the time dimension has an impact on the quality of interpolation results. Based upon the result of 10-fold cross validation, the most effective time scale out of four experimental ones was selected for the PM<sub>2.5</sub> interpolation. The paper also estimates the population exposure to the ambient air pollution of PM<sub>2.5</sub> at the county-level in the contiguous U.S. in 2009. The interpolated county-level PM<sub>2.5</sub> has been linked to 2009 population data and the population with a risky PM<sub>2.5</sub> exposure has been estimated. The risky PM<sub>2.5</sub> exposure means the PM<sub>2.5</sub> concentration exceeding the National Ambient Air Quality Standards. The geographic distribution of the counties with a risky PM<sub>2.5</sub> exposure is visualized. 
This work is essential to understanding the associations between ambient air pollution exposure and population health outcomes.</p>","PeriodicalId":91079,"journal":{"name":"GSTF international journal on computing","volume":"1 4","pages":"24-30"},"PeriodicalIF":0.0,"publicationDate":"2012-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4583366/pdf/nihms717903.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"34109239","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
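To make the shape-function idea concrete, here is a minimal sketch of spatiotemporal interpolation in which the time axis is rescaled before triangulation, so the chosen time scale changes which samples form each element. It leans on SciPy's Delaunay-based linear interpolator as a stand-in for the paper's shape functions; the function name and `time_scale` parameter are illustrative assumptions, not the authors' code:

```python
import numpy as np
from scipy.interpolate import LinearNDInterpolator

def st_interpolate(xy, t, values, query_xy, query_t, time_scale=1.0):
    """Linear shape-function interpolation in scaled (x, y, t) space.

    Stations at (x, y) with measurement times t are embedded in 3-D by
    multiplying the time coordinate by time_scale. LinearNDInterpolator
    then applies barycentric shape functions over a Delaunay
    tetrahedralization; changing time_scale changes the mesh and hence
    the interpolated values.
    """
    pts = np.column_stack([np.asarray(xy, float),
                           np.asarray(t, float) * time_scale])
    interp = LinearNDInterpolator(pts, values)
    q = np.array([[query_xy[0], query_xy[1], query_t * time_scale]])
    return float(interp(q)[0])
```

Because linear shape functions reproduce linear fields exactly, a pollutant field that varies linearly in time is recovered exactly at any interior query point, which makes the sketch easy to sanity-check.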
Discovered gene regulation networks are very helpful for predicting unknown gene functions. The activating and deactivating relations between genes are mined from microarray gene expression data. There is evidence that delays of multiple time units exist in gene regulation processes. Association rule mining is well suited to finding regulation relations among genes; however, current association rule mining techniques cannot handle temporally ordered transactions. We propose a modified association rule mining technique for efficiently discovering time-delayed regulation relationships among genes. By analyzing gene expression data we can discover such relations, so we use modified association rules to mine gene regulation patterns. Our proposed method, BC3, is designed to mine time-delayed gene regulation patterns of length 3 from time-series gene expression data, where the first two items are regulators and the last item is their target. First, we use Apriori to find frequent 2-itemsets and work backward to BL1. Since Apriori mines frequent 2-itemsets at the same time point, we split L2 into items of length one that are related at the same time point. We then combine BL1 with L1 into a new ordered set, BC2, with time-delayed relations. After pruning BC2 with the support threshold, BL2 is derived. Finally, BL2 is joined with itself to form BC3, and BL3 is sifted from BC3. We use yeast gene expression data to evaluate our method and analyze the results to show that our approach is efficient.
{"title":"Mining Time-delayed Gene Regulation Patterns from Gene Expression Data","authors":"Huang-Cheng Kuo, Pei-Cheng Tsai","doi":"10.1037/e527372013-011","DOIUrl":"https://doi.org/10.1037/e527372013-011","url":null,"abstract":"Discovered gene regulation networks are very helpful to predict unknown gene functions. The activating and deactivating relations between genes and genes are mined from microarray gene expression data. There are evidences showing that multiple time units delay exist in a gene regulation process. Association rule mining technique is very suitable for finding regulation relations among genes. However, current association rule mining techniques cannot handle temporally ordered transactions. We propose a modified association rule mining technique for efficiently discovering time-delayed regulation relationships among genes. By analyzing gene expression data, we can discover gene relations. Thus, we use modified association rule to mine gene regulation patterns. Our proposed method, BC3, is designed to mine time-delayed gene regulation patterns with length 3 from time series gene expression data. However, the front two items are regulators, and the last item is their affecting target. First we use Apriori to find frequent 2-itemset in order to figure backward to BL1. The Apriori mined the frequent 2-itemset in the same time point, so we make the L2 split to length one for having relation in the same time point. Then we combine BL1 with L1 to a new ordered-set BC2 with time-delayed relations. After pruning BC2 with the threshold, BL2 is derived. The results are worked out by BL2 joining itself to BC3, and sifting BL3 from BC3. 
We use yeast gene expression data to evaluate our method and analyze the results to show our work is efficient.","PeriodicalId":91079,"journal":{"name":"GSTF international journal on computing","volume":"153 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2012-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"78063026","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
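The first step the abstract describes, finding frequent pairs with a time delay between regulator and target, can be sketched in a few lines. This is a simplified stand-in for the BC3 pipeline (it stops at delayed 2-itemsets, and the name `time_delayed_pairs` is invented for the example), not the authors' implementation:

```python
from collections import Counter

def time_delayed_pairs(series, delay=1, min_support=2):
    """Count delayed co-activations: gene a active at time t and gene b
    active at time t + delay. series maps gene -> 0/1 activity list,
    one entry per time point. Pairs meeting min_support are the kind
    of time-delayed 2-itemsets the BC3 method builds patterns from.
    """
    genes = list(series)
    T = len(next(iter(series.values())))
    support = Counter()
    for a in genes:
        for b in genes:
            if a == b:
                continue
            support[(a, b)] = sum(
                1 for t in range(T - delay)
                if series[a][t] and series[b][t + delay])
    return {pair: s for pair, s in support.items() if s >= min_support}
```

Joining the surviving pairs with a further delayed item would yield the length-3 candidates (BC3) that the paper prunes into BL3.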
Over its useful life, a railway track is subject to many mechanical and environmental stresses that gradually lead to its deterioration. Monitoring the wear condition of the railway superstructure is one of the key points in guaranteeing an adequate safety level for the railway transport system. In this field, the use of high-efficiency laser techniques has become consolidated and is implemented in diagnostic trains (e.g. the “Archimede train” and the “Talete train”), which can detect the track geometric parameters (gauge, alignment, longitudinal level, cross level, superelevation defect, etc.) and the state of rail wear (vertical, horizontal, 45-degree, etc.) with very high accuracy. The objective of this paper is to describe a new, non-conventional procedure for detecting the transverse profile of worn-out rails by means of image-processing techniques. This methodological approach is based on the analysis of the information contained in high-resolution photographic images of rails and on specific algorithms that yield the exact geometric profile and the measurement of the relevant deviations from new rails of the same type. The analyses and first results, obtained in laboratory research, concern rail cross-sections taken from railway lines under upgrading. The procedure has shown high precision in wear evaluation as well as great rapidity of execution.
{"title":"Image analysis for detecting the transverse profile of worn-out rails","authors":"G. Parla, M. Guerrieri, D. Ticali","doi":"10.1037/E527372013-012","DOIUrl":"https://doi.org/10.1037/E527372013-012","url":null,"abstract":"Over its useful life a railway track is subject to many mechanical and environmental stresses which gradually lead to its deterioration. Monitoring the wear condition of the railway superstructure is one of the key points to guarantee an adequate safety level of the railway transport system; in this field, the use of high-efficiency laser techniques has become consolidated and implemented in diagnostic trains (e.g. the “Archimede train” and the “Talete train” ) which allow to detect the track geometric parameters (gauge, alignment, longitudinal level, cross level, superelevation defect, etc.) and the state of rail wear (vertical, horizontal, 45-degree etc.) with very high accuracy. The objective of this paper is to describe a new nonconventional procedure for detecting the transverse profile of worn-out rails by means of image-processing technique. This methodological approach is based on the analysis of the information contained in high-resolution photographic images of rails and on specific algorithms which allow to obtain the exact geometric profile and the measurement of the relevant deviations compared to new rails of the same typology. The analyses and the first results, obtained from laboratory researches, concern rails cross sections taken from railway lines under upgrading. 
The procedure has shown high precision in the wear evaluation as well as great rapidity in being performed.","PeriodicalId":91079,"journal":{"name":"GSTF international journal on computing","volume":"1 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2012-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"90884272","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
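As a toy illustration of the image-processing step, the following sketch traces a boundary in a grayscale rail-section image by scanning each pixel column for the first bright pixel. The function name and threshold are assumptions for the example; the paper's actual algorithms recover the full transverse profile with far more sophistication:

```python
import numpy as np

def rail_profile_from_image(gray, threshold=128):
    """Toy profile extraction: for each pixel column of a grayscale
    rail-section image (2-D array), find the first row whose intensity
    reaches the threshold, a crude stand-in for tracing the worn
    rail's transverse profile. Returns one boundary row index per
    column (or -1 where no pixel crosses the threshold)."""
    mask = np.asarray(gray) >= threshold
    has_edge = mask.any(axis=0)        # columns containing bright pixels
    first_row = mask.argmax(axis=0)    # first True per column
    return np.where(has_edge, first_row, -1)
```

Applied column by column to a backlit cross-section photograph, such a scan yields a polyline that can be compared against the nominal profile of a new rail to measure wear.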
Pub Date : 2011-02-01 DOI: 10.5176/2010-2283_1.2.42
N. Pavaday, Insah Bhurtah, K. Soyjaudah
A wide variety of systems, ubiquitous in our daily activities, require personal identification schemes that verify the identity of individuals requesting their services. A non-exhaustive list of such applications includes secure access to buildings, computer systems, cellular phones, and ATMs, as well as the crossing of national borders and the boarding of planes. In the absence of robust schemes, these systems are vulnerable to the wiles of an impostor. Current systems are based on the three vertices of the authentication triangle: possession of a token, knowledge of a secret, and possession of the required biometric. Owing to the weaknesses of the de facto password scheme, the inclusion of its inherent keystroke rhythms has been proposed, and systems that implement such security measures are already on the market. This correspondence investigates the possibility of, and ways of, optimising the performance of the hardened password mechanism using the widely accepted neural network classifier. It continues previous work in that direction.
{"title":"How to improve performance of Neural Network in the hardened password mechanism","authors":"N. Pavaday, Insah Bhurtah, K. Soyjaudah","doi":"10.5176/2010-2283_1.2.42","DOIUrl":"https://doi.org/10.5176/2010-2283_1.2.42","url":null,"abstract":"A wide variety of systems, ubiquitous in our daily activities, require personal identification schemes that verify the identity of individual requesting their services. A non exhaustive list of such application includes secure access to buildings, computer systems, cellular phones, ATMs, crossing of national borders, boarding of planes among others. In the absence of robust schemes, these systems are vulnerable to the wiles of an impostor. Current systems are based on the three vertex of the authentication triangle which are, possession of the token, knowledge of a secret and possessing the required biometric. Due to weaknesses of the de facto password scheme, inclusion of its inherent keystroke rhythms, have been proposed and systems that implement such security measures are also on the market. This correspondence investigates possibility and ways for optimising performance of hardened password mechanism using the widely accepted Neural Network classifier. It represents continuation of a previous work in that direction.","PeriodicalId":91079,"journal":{"name":"GSTF international journal on computing","volume":"1 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2011-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"82968454","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
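To make the classifier idea concrete, here is a minimal sketch of supervised training on keystroke-timing features. For brevity it uses a single logistic neuron trained by stochastic gradient descent rather than the paper's multi-layer neural network; the function name, learning rate, and feature encoding are assumptions for the example:

```python
import math

def train_keystroke_classifier(samples, labels, epochs=500, lr=0.5):
    """Train a one-neuron logistic classifier on keystroke rhythms.

    samples: timing-feature vectors (e.g. key hold times and digraph
    latencies, in seconds); labels: 1 = legitimate user, 0 = impostor.
    Returns a predict(x) function implementing the learned boundary.
    """
    n = len(samples[0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - y  # gradient of the log-loss w.r.t. z
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    def predict(x):
        return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
    return predict
```

A hardened-password scheme would combine this accept/reject decision with the password check itself, so an impostor must match both the secret and its typing rhythm.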
Pub Date : 2011-02-01 DOI: 10.5176/2010-2283_1.2.50
J. Cartlidge, S. Phelps
Economic theory suggests sellers can increase revenue through dynamic pricing: selling identical goods or services at different prices. However, such discrimination requires knowledge of the maximum price that each consumer is willing to pay, information that is often unavailable. Fortunately, electronic markets offer a solution by generating vast quantities of transaction data that, if used intelligently, enable consumer behaviour to be modelled and predicted. Using eBay as an exemplar market, we introduce a model for dynamic pricing that uses a statistical method for deriving the structure of demand from temporal bidding data. This work is a tentative first step in a wider research program to discover a practical methodology for automatically generating dynamic pricing models for the provision of cloud computing services, a pertinent problem with widespread commercial and theoretical interest.
{"title":"Estimating Demand for Dynamic Pricing in Electronic Markets","authors":"J. Cartlidge, S. Phelps","doi":"10.5176/2010-2283_1.2.50","DOIUrl":"https://doi.org/10.5176/2010-2283_1.2.50","url":null,"abstract":"Economic theory suggests sellers can increase revenue through dynamic pricing; selling identical goods or services at different prices. However, such discrimination requires knowledge of the maximum price that each consumer is willing to pay; information that is often unavailable. Fortunately, electronic markets offer a solution; generating vast quantities of transaction data that, if used intelligently, enable consumer behaviour to be modelled and predicted. Using eBay as an exemplar market, we introduce a model for dynamic pricing that uses a statistical method for deriving the structure of demand from temporal bidding data. This work is a tentative first step of a wider research program to discover a practical methodology for automatically generating dynamic pricing models for the provision of cloud computing services; a pertinent problem with widespread commercial and theoretical interest.","PeriodicalId":91079,"journal":{"name":"GSTF international journal on computing","volume":"147 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2011-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"86540545","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
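The core idea, deriving a demand structure from bidding data, can be sketched with a toy calculation. Treating each bidder's highest observed bid as a lower bound on their willingness to pay gives an empirical demand curve and a revenue-maximizing posted price; the function name and the simplifications are assumptions for the example, not the paper's statistical method:

```python
def demand_curve(max_bids):
    """Empirical demand from bidding data: max_bids holds each bidder's
    highest observed bid. For every candidate price p, demand is the
    number of bidders with a bid of at least p; return the (price,
    revenue) pair maximizing p * demand(p).
    """
    best_price, best_revenue = None, 0.0
    for p in sorted(set(max_bids)):
        revenue = p * sum(1 for b in max_bids if b >= p)
        if revenue > best_revenue:
            best_price, best_revenue = p, revenue
    return best_price, best_revenue
```

In a dynamic-pricing loop the bid data would be refreshed continuously, so the posted price tracks the evolving demand curve rather than a one-off estimate.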