{"title":"Fractional calculus pertaining to multivariable I-function defined by Prathima","authors":"D. Kumar, F. Ayant","doi":"10.2478/jamsi-2019-0009","DOIUrl":"https://doi.org/10.2478/jamsi-2019-0009","url":null,"abstract":"Abstract In this paper, we study a pair of unified and extended fractional integral operators involving the multivariable I-functions and a general class of multivariable polynomials. We use Mellin transforms to obtain our main results. Certain properties of these operators concerning their Mellin transforms have been investigated. On account of the general nature of the functions involved, a large number of known (and possibly new) fractional integral operators involving simpler functions can be obtained. We also state the particular case of the multivariable H-function.","PeriodicalId":43016,"journal":{"name":"Journal of Applied Mathematics Statistics and Informatics","volume":"15 1","pages":"61 - 73"},"PeriodicalIF":0.3,"publicationDate":"2019-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44879941","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Data approximation using Lotka-Volterra models and a software minimization function","authors":"Michal Feckan, J. Pacuta","doi":"10.2478/jamsi-2019-0005","DOIUrl":"https://doi.org/10.2478/jamsi-2019-0005","url":null,"abstract":"Abstract In recent years, a lot of effort has been put into finding suitable mathematical models that fit historical data sets. Such models often include coefficients, and the accuracy of the data approximation depends on them. The goal is therefore to choose the unknown coefficients so that the corresponding solution of the model approximates the data as well as possible. One of the standard methods for coefficient estimation is the least squares method. It provides a data approximation by itself, but it can also serve as a starting point for further minimization, for example with the Matlab function fminsearch.","PeriodicalId":43016,"journal":{"name":"Journal of Applied Mathematics Statistics and Informatics","volume":"15 1","pages":"5 - 14"},"PeriodicalIF":0.3,"publicationDate":"2019-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45617024","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
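A minimal sketch of the workflow this abstract describes: least-squares fitting of Lotka-Volterra coefficients, with a derivative-free simplex search standing in for Matlab's fminsearch (scipy's Nelder-Mead is the closest Python analogue). The coefficient names `a, b, c, d` and the synthetic "historical" data are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

def lotka_volterra(t, z, a, b, c, d):
    # Classic predator-prey model: x' = a x - b x y, y' = -c y + d x y
    x, y = z
    return [a * x - b * x * y, -c * y + d * x * y]

def sse(params, t_data, data):
    # Sum of squared errors between the model solution and observed data
    sol = solve_ivp(lotka_volterra, (t_data[0], t_data[-1]), data[0],
                    t_eval=t_data, args=tuple(params), rtol=1e-6)
    if not sol.success or sol.y.shape[1] != len(t_data):
        return 1e12  # penalize parameter sets the solver cannot handle
    val = np.sum((sol.y.T - data) ** 2)
    return float(val) if np.isfinite(val) else 1e12

# Synthetic data generated from known coefficients, for illustration only
true = (1.0, 0.3, 1.0, 0.2)
t_data = np.linspace(0, 10, 50)
data = solve_ivp(lotka_volterra, (0, 10), [5.0, 2.0], t_eval=t_data,
                 args=true, rtol=1e-8).y.T

# Nelder-Mead plays the role of Matlab's fminsearch here
res = minimize(sse, x0=[0.8, 0.2, 0.8, 0.3], args=(t_data, data),
               method="Nelder-Mead", options={"maxiter": 300})
print(res.x)
```

The least-squares value at the starting point serves only as an upper bound; in practice one would refine the start (e.g. by a coarse grid search) before handing it to the simplex method.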
{"title":"Statistical learning for recommending (robust) nonlinear regression methods","authors":"J. Kalina, J. Tichavský","doi":"10.2478/jamsi-2019-0008","DOIUrl":"https://doi.org/10.2478/jamsi-2019-0008","url":null,"abstract":"Abstract We are interested in comparing the performance of various nonlinear estimators of the parameters of the standard nonlinear regression model. While the standard nonlinear least squares estimator is vulnerable to the presence of outlying measurements in the data, several robust alternatives exist. However, it is not clear which estimator should be used for a given dataset, and this question remains extremely difficult (or perhaps infeasible) to answer theoretically. Metalearning represents a computationally intensive methodology for the optimal selection of algorithms (or methods) and is used here to predict the most suitable nonlinear estimator for a particular dataset. The classification rule is learned over a training database of 24 publicly available datasets. The results of the primary learning give an interesting argument in favor of the nonlinear least weighted squares estimator, which turns out to be the most suitable one for the majority of the datasets. The subsequent metalearning reveals that tests of normality and heteroscedasticity play a crucial role in finding the most suitable nonlinear estimator.","PeriodicalId":43016,"journal":{"name":"Journal of Applied Mathematics Statistics and Informatics","volume":"15 1","pages":"47 - 59"},"PeriodicalIF":0.3,"publicationDate":"2019-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44540628","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
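The (nonlinear) least weighted squares estimator favoured in this abstract assigns weights to the *ordered* squared residuals, so the largest residuals, which are likely to come from outliers, receive small weights. A minimal sketch under assumed linearly decreasing weights and a hypothetical exponential model; the actual weight function, model, and optimizer used in the paper may differ.

```python
import numpy as np
from scipy.optimize import minimize

def lws_objective(params, model, x, y, weights):
    # Least weighted squares: sort the squared residuals ascending and
    # pair them with decreasing weights, down-weighting likely outliers.
    resid2 = np.sort((y - model(x, *params)) ** 2)
    return float(np.dot(weights, resid2))

def model(x, a, b):
    # Hypothetical nonlinear regression model, for illustration only
    return a * np.exp(b * x)

rng = np.random.default_rng(0)
x = np.linspace(0, 2, 40)
y = model(x, 2.0, 0.5) + rng.normal(0, 0.05, x.size)
y[::10] += 3.0  # inject a few outlying measurements

# Linearly decreasing weights: an assumed, simple choice
weights = np.linspace(1.0, 0.0, x.size)

res = minimize(lws_objective, x0=[1.0, 1.0],
               args=(model, x, y, weights), method="Nelder-Mead")
print(res.x)
```

The sorted-residual objective is non-smooth, which is why a derivative-free method such as Nelder-Mead is used in this sketch.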
{"title":"Towards the non-stretchable and non-elongating string with stress-strain handling","authors":"R. Durikovic, E. Siebenstich","doi":"10.2478/jamsi-2019-0007","DOIUrl":"https://doi.org/10.2478/jamsi-2019-0007","url":null,"abstract":"Abstract We propose an approach for real-time, physically semi-realistic animation of strings which directly manipulates the string positions by position based dynamics. The main advantage of position based dynamics is its controllability; instability problems of explicit integration schemes can be avoided. Specifically, we offer the following three contributions. We introduce a non-elongating and non-stretchable mass spring dynamics model based on Position Based Dynamics to simulate a 1D string. We introduce a method for propagating the twisting angle along the chain of segments. In addition, we solve collision constraints by regularly distributing spheres along the chain segments, followed by particle projection to validate the positions. The proposed strain limiting constraint can handle strings fixed in multiple locations, in contrast to the single fixed side that is common for hair models. The use of multiple constraints provides an efficient treatment of the stiff twisting and non-stretchable mass spring dynamics model.","PeriodicalId":43016,"journal":{"name":"Journal of Applied Mathematics Statistics and Informatics","volume":"15 1","pages":"29 - 46"},"PeriodicalIF":0.3,"publicationDate":"2019-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"42328534","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
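Position Based Dynamics enforces non-stretching by projecting particle positions onto distance constraints, which is the core step the abstract builds on. A minimal sketch of that projection for a 1D chain fixed at several points (the multi-attachment setting mentioned above); the particle positions, iteration count, and two-dimensional setting are illustrative assumptions.

```python
import numpy as np

def project_distance_constraints(p, rest_len, fixed, iters=50):
    # PBD core step: repeatedly project each pair of neighbouring
    # particles back to its rest distance (Gauss-Seidel style).
    # Particles listed in `fixed` (attachment points) never move.
    p = p.copy()
    for _ in range(iters):
        for i in range(len(p) - 1):
            d = p[i + 1] - p[i]
            dist = np.linalg.norm(d)
            if dist == 0.0:
                continue
            corr = (dist - rest_len) * d / dist  # violation along the segment
            w0 = 0.0 if i in fixed else 1.0       # inverse-mass style weights
            w1 = 0.0 if (i + 1) in fixed else 1.0
            if w0 + w1 == 0.0:
                continue
            p[i] += (w0 / (w0 + w1)) * corr
            p[i + 1] -= (w1 / (w0 + w1)) * corr
    return p

# A perturbed 5-particle string fixed at BOTH ends (two fixed locations)
pts = np.array([[0.0, 0.0], [1.2, 0.1], [2.3, -0.1], [3.1, 0.2], [4.0, 0.0]])
out = project_distance_constraints(pts, rest_len=1.0, fixed={0, 4})
print(np.linalg.norm(np.diff(out, axis=0), axis=1))  # segment lengths
```

In a full simulator this projection runs once per time step after a prediction step; here it is isolated to show why the string neither stretches nor elongates.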
{"title":"A Comparative Study on Financial Statements Preparing under CAS and IFRS in Cambodia","authors":"Vannida Kimrong","doi":"10.22457/jmi.141av16a9","DOIUrl":"https://doi.org/10.22457/jmi.141av16a9","url":null,"abstract":"","PeriodicalId":43016,"journal":{"name":"Journal of Applied Mathematics Statistics and Informatics","volume":"54 1","pages":""},"PeriodicalIF":0.3,"publicationDate":"2019-06-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"80414628","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Modified Numerical Method for Solving Fredholm Integral Equations","authors":"M. H. Suhhiem, M. Lafta","doi":"10.22457/jmi.136av16a6","DOIUrl":"https://doi.org/10.22457/jmi.136av16a6","url":null,"abstract":"Abstract. In this work, we present a novel method to find the numerical solution of the linear Fredholm integral equation of the second kind. This method is based on a Taylor series multiplied by an exponential function to approximate the kernel as a sum of products of functions. The presented method is highly accurate when its results are compared with those of other numerical methods.","PeriodicalId":43016,"journal":{"name":"Journal of Applied Mathematics Statistics and Informatics","volume":"42 1","pages":""},"PeriodicalIF":0.3,"publicationDate":"2019-06-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"78844040","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
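Approximating the kernel as a sum of separable (degenerate) terms, as this abstract describes, reduces the integral equation u(x) = f(x) + λ∫K(x,t)u(t)dt to a small linear system. A sketch of that degenerate-kernel reduction, using an exactly separable toy kernel K(x,t) = xt on [0,1] rather than the paper's Taylor-times-exponential expansion; for this kernel and f(x) = x, λ = 1, the closed-form solution is u(x) = 3x/2.

```python
import numpy as np

def _trap(y, x):
    # Trapezoidal rule (written out to stay NumPy-version agnostic)
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def solve_degenerate_fredholm(a_funcs, b_funcs, f, lam, grid):
    # Once K(x, t) ~ sum_i a_i(x) b_i(t), the solution has the form
    # u(x) = f(x) + lam * sum_i c_i a_i(x), where the coefficients c
    # solve (I - lam*A) c = rhs with
    #   A[i, j] = int b_i(t) a_j(t) dt,  rhs[i] = int b_i(t) f(t) dt.
    n = len(a_funcs)
    A = np.empty((n, n))
    rhs = np.empty(n)
    for i in range(n):
        bi = b_funcs[i](grid)
        rhs[i] = _trap(bi * f(grid), grid)
        for j in range(n):
            A[i, j] = _trap(bi * a_funcs[j](grid), grid)
    c = np.linalg.solve(np.eye(n) - lam * A, rhs)
    return lambda x: f(x) + lam * sum(ci * ai(x) for ci, ai in zip(c, a_funcs))

# Toy problem: u(x) = x + int_0^1 x*t*u(t) dt, exact solution u(x) = 1.5*x
grid = np.linspace(0.0, 1.0, 2001)
u = solve_degenerate_fredholm([lambda x: x], [lambda t: t],
                              lambda x: x, lam=1.0, grid=grid)
print(u(0.5))  # close to 0.75
```

The paper's contribution lies in how the separable terms are constructed; the reduction above is the standard machinery any such approximation plugs into.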
{"title":"Human Iris Localization Combined with Ant Colony and Improved Hough Circle Detection","authors":"Jinhui Gong, Guicang Zhang, Kai Wang","doi":"10.22457/jmi.138av16a3","DOIUrl":"https://doi.org/10.22457/jmi.138av16a3","url":null,"abstract":"When the traditional Hough transform based on circle detection locates the human iris, it involves a three-dimensional parameter space, which incurs large computational time and space overhead. Aiming at this problem, a Hough transform circle detection algorithm that uses the gradient to reduce the dimension of the parameter space is proposed. Firstly, the image is preprocessed by mathematical morphology to reduce noise and eyelash interference. Secondly, the ant colony optimization algorithm is used to preprocess the image: edge extraction is performed to reduce the number of points participating in the Hough transform. Finally, the improved Hough transform is used to locate the iris. High-quality and low-quality images are used to compare the proposed method with the traditional Hough transform and the method of reference [13]. The results show that the method improves both the positioning speed and the positioning accuracy, and the requirements on image quality are also significantly reduced.","PeriodicalId":43016,"journal":{"name":"Journal of Applied Mathematics Statistics and Informatics","volume":"8 1","pages":""},"PeriodicalIF":0.3,"publicationDate":"2019-06-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"83989254","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
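The gradient trick behind this abstract can be sketched as follows: instead of filling a 3D (a, b, r) accumulator, each edge point votes for candidate centres along its gradient direction, collapsing the search to a 2D centre accumulator. The synthetic image, array names, and vote range below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def hough_circle_center(edges, gx, gy, r_min=5):
    # Gradient-based Hough voting: walk along each edge pixel's gradient
    # direction (both ways, since the centre may lie on either side) and
    # accumulate votes in a 2D centre accumulator.
    h, w = edges.shape
    acc = np.zeros((h, w), dtype=np.int64)
    for y, x in zip(*np.nonzero(edges)):
        g = np.hypot(gx[y, x], gy[y, x])
        if g == 0:
            continue
        dy, dx = gy[y, x] / g, gx[y, x] / g
        for sign in (1, -1):
            for r in range(r_min, max(h, w)):
                cy = int(round(y + sign * r * dy))
                cx = int(round(x + sign * r * dx))
                if not (0 <= cy < h and 0 <= cx < w):
                    break
                acc[cy, cx] += 1
    return np.unravel_index(np.argmax(acc), acc.shape)

# Synthetic test image: a circle of radius 20 centred at (50, 50)
h = w = 100
yy, xx = np.mgrid[0:h, 0:w]
dist = np.hypot(yy - 50, xx - 50)
edges = np.abs(dist - 20) < 0.5
# On a circle, gradients are radial; fake them accordingly
gy = np.where(edges, yy - 50, 0).astype(float)
gx = np.where(edges, xx - 50, 0).astype(float)

cy, cx = hough_circle_center(edges, gx, gy)
print(cy, cx)
```

Once the centre is found, the radius can be recovered from a 1D histogram of edge-to-centre distances, which is what keeps the whole search out of a 3D parameter space.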
{"title":"Research on the Influence of e-commerce Discount Promotion on Consumer's Psychological Gap","authors":"Lianghao Yu, Jing Liu, J. Chen, C. Zhou","doi":"10.22457/jmi.144av16a10","DOIUrl":"https://doi.org/10.22457/jmi.144av16a10","url":null,"abstract":"This paper investigates whether there is a psychological gap (degree of regret and claim willingness) between different types of products (high price or low price) under price discount promotion. Empirical research is used to understand the consumer's reaction and draw corresponding conclusions, and marketing suggestions are then given. The results show that the interaction between income and discount is significantly negatively correlated with the two dimensions of the psychological gap (degree of regret and claim willingness).","PeriodicalId":43016,"journal":{"name":"Journal of Applied Mathematics Statistics and Informatics","volume":"43 1","pages":""},"PeriodicalIF":0.3,"publicationDate":"2019-06-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"81499419","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Measuring Productivity of Construction Industry in Vietnam: Based on Data Envelopment Analysis (DEA) Method from 2007-2016","authors":"Li Wei, Le Nhi","doi":"10.22457/jmi.137av16a2","DOIUrl":"https://doi.org/10.22457/jmi.137av16a2","url":null,"abstract":"","PeriodicalId":43016,"journal":{"name":"Journal of Applied Mathematics Statistics and Informatics","volume":"14 1","pages":""},"PeriodicalIF":0.3,"publicationDate":"2019-06-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"75115754","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}