Implementation of DNA Pattern Recognition in Turing Machines
Pub Date: 2014-09-04  DOI: 10.5176/2010-2283_1.2.32
Sumitha Ch
Pattern recognition is the act of taking in raw data and taking an action based on the category of the pattern. DNA pattern recognition has applications in almost every field, including forensics, genetic engineering, bioinformatics, DNA nanotechnology, and history. DNA molecules can be so large that performing pattern recognition on them with common techniques becomes tedious. This paper therefore describes pattern recognition for DNA molecules using the concept of Turing Machines, and simulates the standard Turing Machine that performs DNA pattern recognition on a Universal Turing Machine.
{"title":"Implementation of DNA Pattern Recognition in Turing Machines","authors":"Sumitha Ch","doi":"10.5176/2010-2283_1.2.32","DOIUrl":"https://doi.org/10.5176/2010-2283_1.2.32","url":null,"abstract":"Pattern recognition is the act of taking in raw data and taking an action based on the category of the pattern. DNA pattern recognition has applications in almost any field. It has applications in forensics, genetic engineering, bio informatics, DNA nanotechnology, history and so on. The size of the DNA molecules can be very large that it is a tedious task to perform pattern recognition for the same using common techniques. Hence this paper describes the pattern recognition for DNA molecules using the concept of Turing Machines. It also performs a simulation of the standard Turing Machine that performs DNA pattern recognition on the Universal Turing Machine.","PeriodicalId":91079,"journal":{"name":"GSTF international journal on computing","volume":"30 3 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2014-09-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"85765079","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Issues and Challenges in Applying Computer-Based Distance Learning System as an Alternative to Traditional Training Methods
Pub Date: 2014-09-04  DOI: 10.5176/2010-2283_1.2.30
T. Ahmad, Huda Ibrahim, S. M. Yusof
Many scholars have listed the problems that prevent organizations' employees from attending face-to-face training, and have presented Information and Communication Technology (ICT), especially distance learning systems, as an important way to overcome these obstacles. However, they have not relied on empirical studies to identify those problems or to compare traditional training methods with computer-based distance learning systems. This survey therefore distinguishes between traditional training methods and computer-based distance learning systems as a way to overcome employees' problems with traditional training, including the associated issues and challenges.
{"title":"Issues and Challenges in Applying Computer-Based Distance Learning system as an alternative to traditional training methods","authors":"T. Ahmad, Huda Ibrahim, S. M. Yusof","doi":"10.5176/2010-2283_1.2.30","DOIUrl":"https://doi.org/10.5176/2010-2283_1.2.30","url":null,"abstract":"Many scholars have listed the problems that prevent organizations’ employees to attend face to face training methods. Additionally, they have presented Information and Communication Technology (ICT) especially distance learning system as important way to overcome these obstacles. However, they did not depend on empirical studies to mention those problems and to compare between traditional training method and applying computer-based distance learning system. Therefore, this survey aims to distinguish between the traditional training methods and computer-based distance learning system as an important way to overcome employees’ problems with traditional training, including the challenges and some issues.","PeriodicalId":91079,"journal":{"name":"GSTF international journal on computing","volume":"13 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2014-09-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"77068643","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Image Segmentation using Two-Layer Pulse Coupled Neural Network with Inhibitory Linking Field
Pub Date: 2014-09-04  DOI: 10.5176/2010-2283_1.2.35
H. Ranganath, A. Bhatnagar
For over a decade, Pulse Coupled Neural Network (PCNN) based algorithms have been used for image segmentation. Though there are several versions of PCNN-based image segmentation methods, almost all of them use a single-layer PCNN with excitatory linking inputs. Several issues associated with the single-burst PCNN need attention: the PCNN parameters, including the linking coefficient, are often determined by trial and error; the segmentation accuracy of the single-layer PCNN is highly sensitive to the value of the linking coefficient; and, in single-burst mode, neurons corresponding to background pixels do not participate in the segmentation process. This paper presents a new two-layer PCNN organization with both excitatory and inhibitory linking inputs. The value of the linking coefficient and the threshold at which primary firing of neurons starts are determined directly from the image statistics. Simulation results show that the new PCNN achieves a significant improvement in segmentation accuracy over Kuntimad's widely known single-burst image segmentation approach. The two-layer PCNN-based image segmentation method overcomes all three drawbacks of the single-layer PCNN.
{"title":"Image Segmentation using Two-Layer Pulse Coupled Neural Network with Inhibitory Linking Field","authors":"H. Ranganath, A. Bhatnagar","doi":"10.5176/2010-2283_1.2.35","DOIUrl":"https://doi.org/10.5176/2010-2283_1.2.35","url":null,"abstract":"For over a decade, the Pulse Coupled Neural Network (PCNN) based algorithms have been used for image segmentation. Though there are several versions of the PCNN based image segmentation methods, almost all of them use singlelayer PCNN with excitatory linking inputs. There are four major issues associated with the single-burst PCNN which need attention. Often, the PCNN parameters including the linking coefficient are determined by trial and error. The segmentation accuracy of the single-layer PCNN is highly sensitive to the value of the linking coefficient. Finally, in the single-burst mode, neurons corresponding to background pixels do not participate in the segmentation process. This paper presents a new 2-layer network organization of PCNN in which excitatory and inhibitory linking inputs exist. The value of the linking coefficient and the threshold signal at which primary firing of neurons start are determined directly from the image statistics. Simulation results show that the new PCNN achieves significant improvement in the segmentation accuracy over the widely known Kuntimad’s single burst image segmentation approach. The two-layer PCNN based image segmentation method overcomes all three drawbacks of the single-layer PCNN.","PeriodicalId":91079,"journal":{"name":"GSTF international journal on computing","volume":"1 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2014-09-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"82903398","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Generating Embedded Systems Software using a Component Based Development Approach
Pub Date: 2014-09-02  DOI: 10.1037/e525192013-013
Mark Dixon
This work examines how a predominantly graphical approach to software development, designed to be deployment-platform agnostic, can be used to target embedded software systems. The general aim of the approach is to provide engineers with a development method general enough to be applied across a multitude of problem domains. The development technique employs a component-centric approach in which target-platform specifics are hidden from the language design; deployment-specific mapping tools are then used to target each type of system. Embedded software systems, however, are probably the most demanding type of target system, owing to limited resources and a lack of software infrastructure support. This paper describes a method of mapping an example component-based design to a target embedded system.
Collecting neurophysiological data to investigate users' cognitive states during game play
Pub Date: 2014-09-02  DOI: 10.1037/e525192013-005
Patrick Charland, Geneviève Allaire-Duquette, Pierre-Majorique Léger
This paper explores the potential of collecting neurophysiological data in order to better understand users' learning experience. The experimental setup involves collecting electroencephalographic (EEG) signals from the brain cortex to infer users' cognitive states while they played an educational video game designed to support the learning of Newtonian mechanics. Preliminary results suggest that this neuroscience perspective is promising for quantitatively characterizing users' learning experience, and could be an innovative avenue for general game development as well as for research on educational video games.
{"title":"Collecting neurophysiological data to investigate users’ cognitive states during game play","authors":"Patrick Charland, Geneviève Allaire-Duquette, Pierre-Majorique Léger","doi":"10.1037/e525192013-005","DOIUrl":"https://doi.org/10.1037/e525192013-005","url":null,"abstract":"This paper explores the potential of collecting neurophysiological data in order to further understand user’s learning experience. The experimental setup involves collecting electroencephalographic signal (EEG) from the brain cortex to infer users’ cognitive state while they played an educational video game designed to support the learning of Newtonian mechanics. Preliminary results suggest that this neuroscience perspective is quite promising in the idea of quantitatively characterizing users’ learning experience. This could be an innovative and promising avenue in general game development or in educational videogame research field.","PeriodicalId":91079,"journal":{"name":"GSTF international journal on computing","volume":"1 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2014-09-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"78730386","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Analyzing the Learning Modes of Learners using Time-Management Modules in Self-Paced Learning
Pub Date: 2014-09-02  DOI: 10.1037/e525192013-011
Juin-Ling Tseng, Ing-Chyi Pai
Enhancing the effectiveness of e-learning is an important topic today. Many factors influence learning effectiveness, among which time management has the most direct impact on self-paced learning. This study developed a calendar time-management module to record the learning process in a self-paced learning environment. After analyzing the learning modes, we identified learners who concentrated their learning toward the end of a course period. We implemented two types of time-management modules for these learners, a countdown timer and a course schedule module, and then analyzed the influence of the modules on self-paced learning. The objective was to promote diligence by helping learners begin studying earlier in the course period. Our results demonstrate that incorporating the countdown timer and course schedule modules altered the distribution of study times and prompted all of the learners to complete the reading of the course materials. The countdown timer module showed a stronger correlation with time-management tendencies and with use of the time modules, indicating that learners who are sensitive to changing numbers are more likely to follow a set schedule. Overall, the impact of the time modules varied with the characteristics of the learners; however, their use was shown to enhance the effectiveness of studying.
Intelligent Multidimensional Modelling
Pub Date: 2014-09-02  DOI: 10.1037/e525192013-016
Swati Hira, P. Deshpande
On-Line Analytical Processing (OLAP) systems considerably ease the analysis of business data and have become widely used in industry. Such systems primarily employ multidimensional data models to structure their data. However, building multidimensional models of the complex data found in some real-world application domains demands considerable time and skill. Multidimensional analysis is based on measures, dimensions, and hierarchies, and identifying them manually is a critical and time-consuming process because large, complex data spanning multiple regions, products, and employees is involved. This paper presents an intelligent multidimensional modelling system that helps the modeller build a multidimensional model and work at the logical level by hiding the heterogeneity of the physical database. The paper proposes a process to identify measures, dimensions, and hierarchies and to generate the multidimensional model.
Artificial Education: Expert Systems Used to Assist and Support 21st Century Education
Pub Date: 2014-09-02  DOI: 10.1037/e525192013-002
C. Sora, A. S. Sora
This paper offers a new concept in education called Artificial Education. Though the term might disturb many educators, parents, and students, it is important to understand what it is and the potential it has for the educational success of all learners. Artificial Education refers to using artificially intelligent systems, also known as expert systems, to educate students and teachers. This is a short introductory article on what Artificial Education refers to and how intelligent or expert systems can assist students and teachers at the elementary level.
Cloud Computing Security for Organizations using Live Signature – TPALM Printing Client Service
Pub Date: 2014-09-02  DOI: 10.1037/e525192013-004
Atif Farid Mohammad, S. E. Grant
Cloud computing is taking over the computing environment in both the public and private sectors, which has increased the use of service-oriented architecture (SOA) for developing services that are later deployed in the Cloud. This paper presents a Cloud security algorithm using SOA 3.0 for secured transactions on data that regulations such as the United States International Traffic in Arms Regulations (ITAR) and Export Administration Regulations (EAR) require to be used and distributed only within the United States, and only by security-cleared personnel. We describe a novel algorithm and a corresponding cloud service, the Cloud Monitoring Gateway (CMG). The current service prototype simulates the behavior of an actual Cloud Security Gateway Application (CSGA) using an algorithm called TPALM (The Privacy Authentication Latency Management). The simulation is coarse-grained, but it is capable of measuring privacy authentication on the given variables of a legitimate user. We also present an evaluation of this service utilization on actual data.
BI Application in Financial Sector - Credit Scoring of Retail Loans Using a Hybrid Modeling Approach
Pub Date: 2014-09-02  DOI: 10.1037/e525192013-023
S. Chandrasekhar, B.Tech., M.Tech.
Retail loans now form a major proportion of the loan portfolio. Broadly, they can be classified as (i) loans for the small and medium sector and (ii) loans for individuals. The objective of credit scoring is to take enough precaution before sanctioning a loan so that the loan does not go bad after disbursement; this adds to the bottom line of the financial institution and also reduces credit risk. The techniques used to perform credit scoring vary across these two classes of loans. In this paper, we concentrate on the application of credit scoring to individual, so-called personal loans, such as auto loans and loans for goods like televisions and refrigerators. Large numbers of loans are disbursed in these areas, and although each loan may be small compared to a small or medium scale industry loan, failure to control defaults has disastrous consequences. From the borrower and product characteristics, a credit score is computed for each applicant. If the score exceeds a given threshold, the loan is sanctioned; if it is below the threshold, the loan is rejected. In practice, a buffer zone is created near the threshold so that applications whose scores fall within it receive a detailed investigation before a decision is taken. Two broad classes of scoring models exist: (i) subjective scoring and (ii) statistical scoring. Subjective scoring is based on intuitive judgement. It works, but there is scope for improvement: the risk prediction is person dependent, focuses on only a few characteristics, and may mistakenly focus on the wrong characteristics. Statistical scoring uses hard data on borrower and product characteristics and applies mathematical models to predict the risk. The relation is expressed in the form of an equation that is finally converted to a score. Subjectivity is reduced, and the variables that are important to scoring are identified on a strong mathematical foundation. Different models have been used in credit scoring, such as regression, decision trees, discriminant analysis, and logistic regression. Most of the time a single model is used to compute the credit score; this works well when the underlying decision rule is simple, but as the rule becomes complex the accuracy of the model diminishes very quickly. In this research paper, a combination of a decision tree and logistic regression is used to determine the weights to be assigned to the different characteristics of the borrower. The decision tree is used at the first level of analysis to narrow down the important variables and the overall weights that need to be assigned, and also to find optimum groupings of numeric and non-numeric variables. At the second level, logistic regression is used to compute odds ratios, a variant of probability, which in turn are used to assign weights to an attribute and to the individual levels within an attribute.
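A minimal sketch of the hybrid idea, assuming scikit-learn and synthetic data: a decision tree first ranks borrower characteristics, and a logistic regression fitted on the retained characteristics yields odds ratios that can serve as scorecard weights. The column names, the importance cut-off, and the synthetic default model are illustrative assumptions, not the paper's data or exact procedure.

```python
# Hypothetical two-level credit-scoring sketch: decision tree for variable
# selection, logistic regression for odds ratios / scorecard weights.
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
X = pd.DataFrame({
    "income":       rng.normal(50_000, 15_000, n),
    "loan_amount":  rng.normal(8_000, 3_000, n),
    "age":          rng.integers(21, 65, n),
    "num_defaults": rng.poisson(0.3, n),
})
# Synthetic "bad loan" flag: more likely for low income, high loan amount,
# and a history of past defaults.
logit = -2 + 0.00004 * (X["loan_amount"] - 0.2 * X["income"]) + 0.8 * X["num_defaults"]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# Level 1: decision tree to rank characteristics and keep the important ones.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
ranked = sorted(zip(X.columns, tree.feature_importances_), key=lambda t: -t[1])
selected = [name for name, imp in ranked if imp > 0.05]   # 0.05 cut-off is assumed

# Level 2: logistic regression on the selected characteristics;
# exp(coefficient) gives the odds ratio used as a scorecard weight.
logreg = LogisticRegression(max_iter=1000).fit(X[selected], y)
odds_ratios = dict(zip(selected, np.exp(logreg.coef_[0])))
print("selected characteristics:", selected)
print("odds ratios:", odds_ratios)

# Scoring an applicant: the log-odds of default from the fitted model.
applicant = X.iloc[[0]][selected]
print("applicant log-odds:", float(logreg.decision_function(applicant)))
```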