Title: Linking Twitter Sentiment Knowledge with Infrastructure Development
Authors: Zakya Reyhana, K. Fithriasari, M. Atok, Nur Iriawan
Pub Date: 2018-12-31 | DOI: 10.11113/MATEMATIKA.V34.N3.1142
Abstract: Sentiment analysis, a specialized text mining application, is the automatic extraction of positive or negative opinions from text. It is important for classifying the implicit content of citizens' tweets. This research aimed to gauge opinion on the infrastructure sustaining urban development in Surabaya, Indonesia's second-largest city. The text mining analysis begins with preprocessing: removing links, retweet markers (RT), usernames, punctuation, digits, and stopwords, followed by case folding and tokenizing. The opinions were then classified into positive and negative comments using two methods: the support vector machine (SVM) and the neural network (NN). The results showed that the NN classifier performed better than the SVM.
Title: The Mechanization of the Comrade Matrix Approach in Determining the GCD of Orthogonal Polynomials
Authors: Siti Nor Asiah Isa, Nor'aini Aris, Shazirawati Mohd Puzi, Y. Hoe
Pub Date: 2018-12-31 | DOI: 10.11113/MATEMATIKA.V34.N3.1136
Abstract: This paper revisits the comrade matrix approach to finding the greatest common divisor (GCD) of two orthogonal polynomials. The present work investigates the application of QR decomposition with iterative refinement (QRIR) to solve certain systems of linear equations generated from the comrade matrix. Besides iterative refinement, an alternative approach of improving the conditioning of the coefficient matrix by normalizing its columns is also considered. As expected, the results reveal that QRIR improves the solutions given by QR decomposition, while normalizing the matrix entries improves the conditioning of the coefficient matrix, leading to good approximate solutions of the GCD.
Title: An Analytic Valuation of a Deposit Insurance
Authors: E. R. Putri, V. Tjahjono, D. B. Utomo
Pub Date: 2018-12-31 | DOI: 10.11113/MATEMATIKA.V34.N3.1144
Abstract: Deposit insurance is a measure that protects a bank's depositors, fully or partly, from the risk of losses caused by the bank's failure to pay its debts when due. If the bank cannot meet a payment because its asset value is less than its debt, the guarantor makes the payment and takes over the bank's assets; this guarantor role constitutes the deposit insurance. The similarity of this mechanism to a European put option motivates the use of a Black-Scholes model in the valuation. The deposit insurance model is solved analytically using a Fourier transform method. Numerical results based on the solution confirm those obtained in previous research. The behaviour of the deposit insurance premium with respect to the interest rate, volatility, and deposit-to-asset value ratio is also presented.
Title: Parameter Estimation for a Model of Ionizing Radiation Effects on Targeted Cells using Genetic Algorithm and Pattern Search Method
Authors: Hamizah Rashid, Fuaada Mohd Siam, N. Maan, W. N. Rahman
Pub Date: 2018-12-31 | DOI: 10.11113/MATEMATIKA.V34.N3.1134
Abstract: A mechanistic model has been used to explain the effects of radiation. The model consists of parameters representing the biological processes that follow ionizing radiation. These parameters are estimated using local and global optimization algorithms. The aim of this study is to compare the efficiency of a local and a global optimization method, namely pattern search and the genetic algorithm, respectively. Experimental data on the survival of irradiated HeLa cells are used to find the minimum of the sum of squared errors (SSE) between the experimental data and simulated data from the model. The performance of both methods is compared based on computational time and the value of the objective function, the SSE. The optimization is carried out using built-in functions in MATLAB. The parameter estimation results show that the genetic algorithm is superior to pattern search for this problem.
Title: Estimation of Rainfall Curve by using Functional Data Analysis and Ordinary Kriging Approach
Authors: Muhammad Fauzee Hamdan, A. Jemain, Shariffah Suraya Syed Jamaludin
Pub Date: 2018-12-31 | DOI: 10.11113/MATEMATIKA.V34.N3.1148
Abstract: Rainfall is an interesting phenomenon to investigate since it is directly related to all aspects of life on earth. One important task is to investigate and understand the rainfall patterns that occur throughout the year. Identifying these patterns requires a rainfall curve representing the daily rainfall observations received during the year. Functional data analysis converts the discrete data into a function that represents the rainfall curve and, in doing so, helps reveal hidden patterns in the rainfall. This study focused on the distribution of daily rainfall amounts using functional data analysis, with Fourier basis functions used for the periodic rainfall data. Generalized cross-validation showed that 123 basis functions were sufficient to describe the pattern of daily rainfall amounts. The north and west areas of the peninsula show a pronounced bimodal pattern, with the curve declining between two peaks at mid-year. The east shows a unimodal pattern that peaks in the last three months of the year, while the southern areas show more uniform trends throughout the year. Finally, a functional spatial method is introduced to estimate the rainfall curve at locations with no recorded data, using the coefficients of the basis functions to construct the predicted curve. Leave-one-out cross-validation is used to compare the real and predicted curves. The proposed spatial prediction method was found to match existing spatial prediction methods in accuracy while requiring a simpler calculation.
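Smoothing a daily series with a Fourier basis, as described above, amounts to a least-squares fit of basis coefficients. A minimal sketch with NumPy follows; the paper selects 123 basis functions by generalized cross-validation, while K below is kept small and the synthetic "rainfall" series is illustrative.

```python
# Fit a periodic curve to daily data with a truncated Fourier basis.
import numpy as np

def fourier_basis(t, K, period=365.0):
    # columns: 1, sin(wt), cos(wt), sin(2wt), cos(2wt), ... (K odd)
    w = 2.0 * np.pi / period
    cols = [np.ones_like(t)]
    for k in range(1, (K - 1) // 2 + 1):
        cols.append(np.sin(k * w * t))
        cols.append(np.cos(k * w * t))
    return np.column_stack(cols)

t = np.arange(365.0)
rng = np.random.default_rng(0)
y = 5 + 3 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 0.5, t.size)  # toy data

B = fourier_basis(t, K=7)
coef, *_ = np.linalg.lstsq(B, y, rcond=None)   # basis coefficients
smooth = B @ coef                              # the fitted "rainfall curve"
print(coef[:3])
```

In the spatial step of the paper, it is these fitted coefficients, rather than the raw observations, that are predicted at unobserved locations.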
Title: Scenario Based Two-Stage Stochastic Programming Approach for the Midterm Production Planning of Oil Refinery
Authors: Norshela Mohd Noh, A. Bahar, Z. Zainuddin
Pub Date: 2018-12-31 | DOI: 10.11113/MATEMATIKA.V34.N3.1138
Abstract: Recently, the oil refining industry has faced lower profit margins due to uncertainty, which drives refineries to include stochastic optimization in decisions that maximize profit. In the past, deterministic linear programming was widely used in oil refinery optimization problems. However, given the volatility and unpredictability of oil prices over the past ten years, a deterministic model may fail to reflect reality because it does not account for uncertainty, leading to non-optimal solutions. Therefore, this study develops a two-stage stochastic linear program for the midterm production planning of an oil refinery to handle oil price volatility. Geometric Brownian motion (GBM) is used to describe the uncertainty in the crude oil price, petroleum product prices, and demand for petroleum products. The model generates future realizations of prices and demands as a scenario tree, based on the statistical specification of the GBM fitted by the method of moments, as input to the stochastic program. The model developed in this paper was tested on Malaysian oil refinery data. The results indicate that the stochastic approach gives a better prediction of the profit margin.
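Generating price realizations from a GBM, the first step toward the scenario tree described above, can be sketched with the exact discretization of dS = mu*S dt + sigma*S dW. The drift, volatility, and initial price below are illustrative, not the fitted Malaysian refinery parameters.

```python
# Simulate GBM price paths via the exact log-normal discretization:
# S_{t+dt} = S_t * exp((mu - sigma^2/2) dt + sigma sqrt(dt) Z).
import numpy as np

def gbm_paths(s0, mu, sigma, T, steps, n_paths, seed=0):
    rng = np.random.default_rng(seed)
    dt = T / steps
    z = rng.standard_normal((n_paths, steps))
    increments = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
    return s0 * np.exp(np.cumsum(increments, axis=1))

paths = gbm_paths(s0=70.0, mu=0.03, sigma=0.25, T=1.0, steps=12, n_paths=1000)
print(paths.mean(axis=0)[-1])   # near s0 * exp(mu * T) for many paths
```

A scenario tree would then be built by clustering these simulated paths into a small number of representative scenarios with attached probabilities.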
Title: On the Comparison of Deep Learning Neural Network and Binary Logistic Regression for Classifying the Acceptance Status of Bidikmisi Scholarship Applicants in East Java
Authors: N. Cahyani, K. Fithriasari, Irhamah Irhamah, Nur Iriawan
Pub Date: 2018-12-31 | DOI: 10.11113/MATEMATIKA.V34.N3.1141
Abstract: Neural networks and binary logistic regression are, respectively, modern and classical data mining tools that can be used to classify data on Bidikmisi scholarship acceptance in East Java Province, Indonesia. One widely applicable neural network model is the resilient backpropagation neural network (resilient BPNN). This study compares the performance of the resilient BPNN, as a deep learning neural network, with binary logistic regression for classifying Bidikmisi scholarship acceptance in East Java Province. After preprocessing the data and dividing it into training and testing sets with a 10-fold cross-validation procedure, the resilient BPNN and binary logistic regression methods are applied. The results show that a resilient BPNN with two hidden layers is the best network model. Comparing the classification G-means of the two methods, the resilient BPNN with two hidden layers is more representative, with better performance than binary logistic regression, and is recommended for predicting the acceptance of Bidikmisi applicants each year.
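The comparison metric above, the G-mean, is the geometric mean of sensitivity and specificity, which makes it robust to class imbalance such as an uneven accepted/rejected split. A minimal stdlib sketch (the toy label vectors are illustrative):

```python
# G-mean from binary labels: sqrt(sensitivity * specificity),
# with the positive class coded as 1.
from math import sqrt

def g_mean(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    sensitivity = tp / (tp + fn) if tp + fn else 0.0
    specificity = tn / (tn + fp) if tn + fp else 0.0
    return sqrt(sensitivity * specificity)

print(g_mean([1, 1, 0, 0, 0, 0], [1, 0, 0, 0, 0, 1]))
```

Unlike plain accuracy, the G-mean collapses to zero if a classifier ignores either class entirely.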
Title: Tumour-Immune Interaction Model with Cell Cycle Effects Including G0 Phase
Authors: N. Awang, N. Maan, Dasuki Sul'ain
Pub Date: 2018-12-31 | DOI: 10.11113/MATEMATIKA.V34.N3.1137
Abstract: Tumour cells behave differently from normal cells in the body. They grow and divide in an uncontrolled manner (actively proliferating) and fail to respond to signals. However, some cells become inactive and reside in the quiescent phase (G0). These quiescent cells are less sensitive to drug treatments (radiotherapy and chemotherapy) than actively proliferating cells. This paper proposes a new mathematical model of the interaction between tumour growth and the immune response, with the tumour population divided into three phases: interphase, mitosis, and G0. The model consists of a system of delay differential equations, where the delay represents the time a tumour cell resides in interphase before entering mitosis. Stability analysis of the equilibrium points was performed to determine the dynamic behaviour of the system. The results showed that the tumour population depends on the number of tumour cells entering the active (interphase and mitosis) and G0 phases. This study is important for treatment planning, since tumour cells can resist treatment when they take refuge in a quiescent state.
Title: VARX and GSTARX Models for Forecasting Currency Inflow and Outflow with Multiple Calendar Variations Effect
Authors: S. Suhartono, M. Gazali, D. Prastyo
Pub Date: 2018-12-31 | DOI: 10.11113/MATEMATIKA.V34.N3.1139
Abstract: The VARX and GSTARX models extend the vector autoregressive (VAR) and generalized space-time autoregressive (GSTAR) models by including exogenous variables to increase forecast accuracy. The objective of this research is to develop and compare the forecast accuracy of VARX and GSTARX models in predicting currency inflow and outflow in Bali, West Nusa Tenggara, and East Nusa Tenggara, series that contain multiple calendar variation effects. The exogenous variables are the holidays observed in these three locations: Eid Fitr, Galungan, and Nyepi. The proposed models are first evaluated in simulation studies on data containing trend, seasonality, and multiple calendar variations representing these holidays, with the root mean square error (RMSE) as the criterion for selecting the best forecasting model. The simulation study shows that the VARX and GSTARX models provide similar forecast accuracy. For the real currency inflow and outflow data, the best models for forecasting inflow and outflow in the three locations are the VARX model and the GSTARX model with uniform weights, respectively. Both models show that currency inflow and outflow in Bali, West Nusa Tenggara, and East Nusa Tenggara are related in space and time, and contain trend, seasonality, and multiple calendar variations.
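The core fitting step of a VARX model, regressing each series on lagged values of all series plus an exogenous variable, can be sketched by least squares. This uses synthetic data with an assumed coefficient matrix, one illustrative calendar dummy, and an in-sample RMSE; it is a sketch of the model form, not the paper's estimation procedure.

```python
# Fit a VARX(1) model y_t = A y_{t-1} + b x_t + e_t by least squares
# on synthetic data for three "locations", and score it with RMSE.
import numpy as np

rng = np.random.default_rng(1)
T, k = 200, 3                        # 3 series, as in the study
A_true = np.array([[0.5, 0.1, 0.0],  # assumed, for simulation only
                   [0.0, 0.4, 0.1],
                   [0.1, 0.0, 0.3]])
x = rng.standard_normal(T)           # illustrative exogenous regressor
y = np.zeros((T, k))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + 0.8 * x[t] + rng.standard_normal(k) * 0.1

Z = np.column_stack([y[:-1], x[1:]])        # regressors: y_{t-1} and x_t
coef, *_ = np.linalg.lstsq(Z, y[1:], rcond=None)
pred = Z @ coef
rmse = np.sqrt(np.mean((y[1:] - pred) ** 2))
print(rmse)   # near the simulated noise level when the fit is adequate
```

A GSTARX model restricts the lag matrix to a location-specific diagonal plus a spatially weighted term; with uniform weights, each location's neighbours enter with equal weight.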
Title: Micro and Macro Determinants of Delisting and Liquidity in Indonesian Stock Market: A Time-Dependent Covariate of Survival Cox Approach
Authors: D. Prastyo, Yurike Nurmala Rucy, Advendos D.C. Sigalingging, S. Suhartono, S. Fam
Pub Date: 2018-12-31 | DOI: 10.11113/MATEMATIKA.V34.N3.1140
Abstract: The Cox model is popular in survival analysis. With time-varying covariates, some subject-specific attributes may change more frequently than others; this paper deals with that issue. The study analyzes survival data with time-varying covariates using a time-dependent covariate Cox model. The two case studies are (1) the delisting time of companies from the Indonesia Stock Exchange (IDX) and (2) the delisting time of companies from the LQ45 liquidity index. The survival time is the time until a company is delisted from the IDX or LQ45. The determinants are eighteen quarterly financial ratios and two macroeconomic indicators that change more frequently, namely the Jakarta Composite Index (JCI) and the BI (Bank Indonesia) interest rate. The empirical results show that the JCI is significant for both delisting and liquidity, whereas the BI rate is significant only for liquidity. The significant firm-specific financial ratios vary between delisting and liquidity.