In a 24/7 banking environment with branches in different parts of the world, a disaster can strike at any time and cause financial loss in addition to operational loss. The financial risk can be very high and may result in unbearable losses. For organizations that require zero downtime and high availability of data, the disaster recovery (DR) mechanism and the data replication scope must be well defined and strictly followed. Even when the network, storage, and hardware are available at the DR site and the replication of critical data has been verified, several objects may still be left behind because they were never covered under the replication scope. These objects are created by programs themselves during execution and therefore fall outside the replication scope. In such a case the switchover cannot be called successful, because those objects will not be available at the DR site and their absence may lead to business interruption or financial loss. The problem arises when developers have access to production servers and create new libraries or objects for their program execution without informing the implementers through a proper change management process. This paper provides guidelines and a framework for deploying programs on production servers, along with a guide for smoothly switching (changing the role of) an entire core banking environment running on IBM System i to the DR site.
{"title":"Repercussion Of Program Generated Objects In Smooth Operations Running From Disaster Recovery Site","authors":"M. J. Ashraf, M. Mukati","doi":"10.31645/2014.12.1.7","DOIUrl":"https://doi.org/10.31645/2014.12.1.7","url":null,"abstract":"The disaster could happen at any point of time and due to which there could be financial loss in addition to operational loss, considering 24/7 banking environment and having its branches in different parts of world. The risk involved in financial loss may be very high and could result in unbearable loss. In such cases where organizations have zero downtime and having high availability of data, the disaster recovery mechanism and data replication scope has to be well defined and should be strictly followed. Assuming that the network, storage and hardware is available at DR site and the replication of critical data is also verified but still there are several objects left behind which were not covered under replication scope. These objects were created by programs themselves while executing (not in replication scope). In such case the successful switch might not be called as successful because those objects will not be available at DR site and may lead to business interruption or financial loss. This problem arises when developers have access on production servers and they create new libraries/objects for their program execution without informing the implementers through proper change management process. In this paper we will provide the guidelines and framework for deployment of programs on production servers also the guide for smooth switching (change of role) of entire core banking environment running on IBM System-i to DR site.","PeriodicalId":412730,"journal":{"name":"Journal of Independent Studies and Research Computing","volume":"30 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124929511","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Web technology has become an important part of online business, supporting information sharing, social interaction, and business activities such as online payment methods, credit card handling, bank transactions, online banking, and many other applications. The exponential growth in the use of web technology introduces new possibilities for criminal activity and creates serious threats to information and business transactions, mainly because the World Wide Web and cloud computing are widely used for sharing and storing information regardless of who the user is. As web technology grows more advanced, it also becomes more complex in terms of security and privacy, driving new challenges and threats that make the internet unsafe for business applications. Many currently available applications have loopholes that allow cyber criminals to exploit them easily. The main threats include cyber theft, vandalism, web jacking, theft of credit card information, privacy and security violations, cyber terrorism, and spam. An ongoing challenge for the cyber security community is handling the transition of technology to the commercial and open source web applications available in the market. This paper provides a strategy to overcome these vulnerabilities and potential threats to business applications.
{"title":"The Vulnerability Of Cyber Security And Strategy To Conquer The Potential Threats On Business Applications","authors":"Muhammad Altaf Mukati, Syed Muzammil Ali","doi":"10.31645/.2014.12.1.9","DOIUrl":"https://doi.org/10.31645/.2014.12.1.9","url":null,"abstract":"Web technology has become an important part of online business such as sharing information, social interaction, activities related to business especially online payment methods, credit card information, bank transactions, online banking and many other applications. The exponential use of web technology introduces new possibilities of criminal activities and creates unbearable threats to the information and business trades. This is mainly because of the widely use of the World Wide Web and cloud computing for information sharing and saving regardless of the user. As technology grows with the passage of time, the web technology becomes more advanced and complex in terms of security and privacy and drives new challenges and threats which makes the internet unsafe for business applications. Many of the currently available applications have loopholes which makes cyber criminals to exploit the applications easily. The main vulnerabilities are Cyber theft, vandalism, web jacking, credit card information stolen, privacy and security issues, cyber terrorism, spam and etc. An ongoing challenge related to cyber security community is related to handle the transition of technology to commercial or open source web applications available in the market. This paper provides the strategy to overcome the vulnerabilities and potential threats to the business applications.","PeriodicalId":412730,"journal":{"name":"Journal of Independent Studies and Research Computing","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114575584","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
This study evaluates patient-facing problems (PFP) by applying the Late Acceptance Hill Climbing (LAHC) algorithm in hospital settings. The proposed LAHC procedure is a metaheuristic linked with a one-point clarification method. Patient satisfaction with a hospital's performance is a composite measure, and the underlying optimization task is NP-hard, which closely reflects the practical problems patients face. These problems concern assigning the groups of patients who visit the hospital to receive healthcare services. Common issues reported by patients include communication gaps, the response time taken to attend to patients, early symptomatic relief, proper advice on the dosage and usage of medicines, and a clean hospital environment; patient education and guidance before discharge from hospital are also often missing. The suggested application of LAHC to PFP is developed in two phases: the first phase produces an initial feasible solution using a communication-oriented methodology, and the second phase uses three neighborhood structures embedded in the LAHC-based PFP component to further improve the initial feasible solution from the first phase.
{"title":"Late Acceptance Hill Climbing Algorithm For Solving Patient Facing Problems In Hospitals","authors":"Irfan Majeed, Mansoor Alam, Mustaneer Noor, Rizwan Ahmed, Muhammad Humayoun, Javeria Iftikhar","doi":"10.31645/24","DOIUrl":"https://doi.org/10.31645/24","url":null,"abstract":"The current study is to evaluate the patient facing problems by applying Late Acceptance Hill Climbing Algorithm (LAHC) in hospital settings. The recent proposed procedure of LAHC is based on metaheuristic algorithm which is linked with one-point clarification method. Patient’s satisfaction regarding the performance of the hospital is a composite mechanism. The optimization procedure is connected with NP-hard problems which is practically related to the problems faced by patients. These problems are concerned with assigning the group of patients visiting the hospital for receiving healthcare services. The common issues faced by patients include communication gap, response time to attend patients, early symptomatic relief, getting proper advice for dosage and usage of medicines, clean hospital environment. Moreover, patient education and guidance before discharge from hospital is also the missing element. The suggested algorithm of LAHC to PFP is developed and it has two phases: the first phase includes providing the initial feasible solution using communication-oriented methodology. The second phase uses three neighborhood framework which are implanted inside the segment of PFP based on LAHC to additionally upgrade the underlying feasible solution of the introductory phase.","PeriodicalId":412730,"journal":{"name":"Journal of Independent Studies and Research Computing","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122132731","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
With the advent of blogging, more and more content is published on the web. Micro-blogging is used to convey ideas through short text messages, video links, or images, and micro-blogging platforms allow the public to share their ideas and news through new technologies in a concise manner. This paper aims to use a text mining algorithm, which gathers relevant information from passages of text, to derive a context from which predictive analysis can be made. The goal is to use this technique to provide a predictive analysis of electoral polls using tweets from the micro-blogging platform Twitter.
{"title":"Predictive Analysis on Electoral Poll using micro- blogging (twitter)","authors":"Tabraiz Anwer, Adeel Ahmed","doi":"10.31645/2013.11.1.6","DOIUrl":"https://doi.org/10.31645/2013.11.1.6","url":null,"abstract":"The advent of blogging more and more content is published on the web. Micro blogging is used to convey ideas using short text messages; video links or images .micro blogging platforms allow the public to convey their ideas through new technologies and to convey ideas and news in precise manner. This paper aims to use to text mining algorithm, which is used for gathering relevant information from passages of text, to derive a context through which predictive analysis can be made. The goal will be to use this technique to provide a predictive analysis on the electoral polls using tweets from the micro-blogging platform twitter.","PeriodicalId":412730,"journal":{"name":"Journal of Independent Studies and Research Computing","volume":"103 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133810232","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The manufacturing industry needs a system to meet its quality assurance requirements, especially when it has to produce high-quality, precise components for use in mechanical machines. Reaching this quality standard requires a solution that enables manufacturers to meet both speed and accuracy requirements, because inspecting the quality parameters of mechanical components is a difficult and time-consuming process. Computer vision and image processing can play a key role in developing a solution that makes the quality assurance process easier and faster.
{"title":"Verifying Practical Implementation Of QA Process Of Mechanical Component By Using Digital Image Processing","authors":"Nazim Badar","doi":"10.31645/2014.12.1.11","DOIUrl":"https://doi.org/10.31645/2014.12.1.11","url":null,"abstract":"Manufacturing industry in need of a system to meet their quality assurance requirements specially when they have to produce quality and precise components which require to be used in mechanical machines. To reach this quality standard there is a need to solution which enable manufacturing industry to overcome speed and accuracy requirements. Inspection of quality parameters in mechanical components is very difficult and time consuming process. Computer vision / Image processing may contribute key role in developing solution through which quality assurance process become easy and fast.","PeriodicalId":412730,"journal":{"name":"Journal of Independent Studies and Research Computing","volume":"49 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127373872","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The Turing Machine model has proven capable of simulating every known computational model, and since its inception it has served as the basis on which computational devices have been constructed. This paper highlights that the Turing Machine serves as a precise model for defining and executing a specific algorithm based on computability theory, and that when an algorithm is executed on a Turing Machine its time and space complexity can be analyzed based on complexity theory. The model can therefore serve as a superb abstraction of computational devices, since the number of steps and the amount of space required to execute an algorithm can be predicted. It is worth mentioning that the simulator engine named Reflector designed in this study is the foremost simulator with regard to having the distinct capabilities of, firstly, dynamically modelling a Turing Machine based on its manifestation and, secondly, performing time and space complexity analysis of an algorithm executed on that Turing Machine.
{"title":"Reflector – A Dynamic Manifestation of Turing Machines with Time and Space Complexity Analysis","authors":"Behroz Mirza, Muhammad Rafi","doi":"10.31645/2013.11.2.4","DOIUrl":"https://doi.org/10.31645/2013.11.2.4","url":null,"abstract":"The Turing Machine model has proven to be capable of simulating every known computational model. Since its inception the model has served as the basis on which computational devices have been constructed. The paper focuses on highlighting the fact that Turing Machine serves as a precise model for defining and executing a specific algorithm based on the Computability Theory. The paper also highlights the fact that when executed on a Turing Machine an algorithm’s time and space complexity can be analyzed based on the Complexity Theory. Therefore, this model can serve as a superb abstraction for the computational devices as number of steps and space required to execute an algorithm can be predicted. It is worth mentioning that the simulator engine named Reflector designed in this study is the foremost simulator in the regards to have the distinct capabilities of; firstly dynamically modelling a Turing Machine based on its manifestations and secondly performing Time & Space Complexity Analysis of an algorithm executed on the Turing Machine.","PeriodicalId":412730,"journal":{"name":"Journal of Independent Studies and Research Computing","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128690785","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Requirements engineering is a crucial phase in system development. If the requirements are correct, the entire project can succeed, whereas faulty requirements have a negative impact on the project. In this research, an ontology for requirements reuse is proposed, applicable either across different projects or within a single project. Once requirements are elicited, designed, verified, and validated, they produce sets of artefacts that are used to build the system. These requirements become reusable if they are stored in a repository and searched against similar criteria in future scenarios, saving time and effort. This paper proposes an ontology for the requirements reuse process.
{"title":"Proposed Ontology for Requirements Reuse","authors":"Sugandh Wafai, Z. Jan","doi":"10.31645/2014.12.2.3","DOIUrl":"https://doi.org/10.31645/2014.12.2.3","url":null,"abstract":"Requirements engineering is a crucial phase in system development. If the requirements are correct, the entire project will succeed but the failure of requirements will have negative impact on the project. In this research, ontology for requirement reuse is proposed either across different projects or within a single project. The requirements once elicited, designed, verified and validated, produce sets of artefacts that are used to build the system. These requirements can be useful if stored in a repository and searched for similar criteria in future scenario thus, saving time and effort. This paper proposes ontology for the reuse process of the requirements.","PeriodicalId":412730,"journal":{"name":"Journal of Independent Studies and Research Computing","volume":"61 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134528450","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
This paper discusses secure user authentication mechanisms based on graphical passwords. A graphical password is an alternative to a textual password that uses images, designs, patterns, and similar visual elements instead of an alphanumeric string. Graphical passwords offer better security and usability than textual passwords, but alongside their many advantages they suffer from a major weakness: the shoulder surfing attack. This report discusses different graphical password techniques, and to combat shoulder surfing, two different graphical password based authentication techniques are implemented as part of this project. Several user surveys are conducted, and based on their results a comparative analysis is carried out between the two prototypes developed in this project. The conclusion evaluates the applications in terms of security, reliability, user convenience, and resistance to the shoulder surfing attack.
{"title":"Secure User Authentication Using Graphical Passwords","authors":"Fariya Ghori, Kashif Abbasi","doi":"10.31645/2013.11.2.5","DOIUrl":"https://doi.org/10.31645/2013.11.2.5","url":null,"abstract":"The paper discusses secure user authentication mechanisms using graphical passwords. Graphical Password is an alternative to a textual password, which uses images, designs patterns etc. as a password instead of alphanumeric password. Graphical passwords provide better security and usability over textual password, but along with so many advantages of graphical password, they have a major issue of Shoulder Surfing Attack. In this report different techniques of graphical passwords are discussed. To combat shoulder surfing attack, two different techniques of graphical password based authentication are implemented as part of this project. Different user based surveys are conducted. Based on results of user surveys, a comparative analysis is carried out between two prototypes developed in this project. In the end, conclusion is written in terms of application’s security, reliability, user convenience and security against the shoulder surfing attack.","PeriodicalId":412730,"journal":{"name":"Journal of Independent Studies and Research Computing","volume":"61 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127688003","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Social networks have become an important part of life, and their use has increased greatly during the last decade. A huge amount of data is published on these networks: people post statuses, links, videos, images, and comments (for example on Facebook), and share personal information and insights into their lives. In this paper we study the Big Five personality traits and conclude that neuroticism best reflects a criminal mindset. We use a text mining algorithm to extract the necessary information from text messages (statuses, comments, etc.) containing neuroticism-related words such as anger, anxiety, stress, and depression, and from these words a prediction can be made. The main aim of this paper is to identify a person's criminal tendency from the information they share on the social networking site Facebook.
{"title":"To Identify the Criminal tendency of a Person by analysis of their social media profiles (Facebook)","authors":"Sajjad Khan, Z. Jan","doi":"10.31645/2013.11.2.7","DOIUrl":"https://doi.org/10.31645/2013.11.2.7","url":null,"abstract":"The social networks become the important part of the life. During the last decade, use of social networks has increased to a great extent. People publish their content on the web they use to convey their idea by posting statuses, links, videos, images, comments(Facebook). The use of social networks rapidly increased during the last decade and there is huge amount of data publish on these networks People share their personal information and insights of their lives. In this paper we thoroughly studied five big personality models and come to the point that neuroticism reflects the criminal mind of person, we are using the text mining algorithm, which will be used to extract necessary information from the text massages (statuses, comments etc.) containing neuroticism words e.g. anger, anxiety stress, depression etc. and by using these words prediction can be made. The main aim of this paper is to find the crime tendency in person from the information they share on social networking site (FACEBOOK).","PeriodicalId":412730,"journal":{"name":"Journal of Independent Studies and Research Computing","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125841620","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}