Analysing connectivity patterns and centrality metrics for opportunistic networks
Pub Date: 2017-03-01 | DOI: 10.1109/C-CODE.2017.7918903
Muhammad Arshad Islam, M. Iqbal, Muhammad Aleem, Z. Halim
Network centrality measures are used to identify important and influential nodes in a network. Several centrality measures have been proposed in the literature for static networks; however, these measures cannot be used for routing in opportunistic networks, which are dynamic in nature. Opportunistic networks utilise pocket switching to route messages, where each device attempts to forward its messages to a suitable next node. Appropriate selection of the forwarder node is crucial to the performance of a routing protocol in opportunistic networks, and in any such network some nodes play a more important role in the routing process than the rest. In this paper, we analyse metrics that can be used to simplify the computation of centrality measures in opportunistic networks. We investigate the relationship between these node (ego) characteristics and centrality measures computed using novel network transformations. The aim of the transformation mechanisms is to aggregate the links between any two nodes in a way that strongly relates nodes which frequently come into contact with each other. Our experiments show that ego characteristics can be used to estimate centrality measures for dense opportunistic networks.
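A minimal sketch of the idea in Python (not the authors' code): pairwise contacts are aggregated into a weighted static graph so that frequently meeting nodes end up strongly related, and an ego-level characteristic (weighted degree) is correlated against a global centrality (betweenness). The contact log and the "1/encounters" distance rule are hypothetical stand-ins for the paper's transformations.

```python
# Hedged sketch, not the authors' code: aggregate contacts into a weighted
# graph, then compare an ego metric against a global centrality measure.
import networkx as nx
import numpy as np

# Hypothetical contact log: (node_a, node_b, number_of_encounters)
contacts = [("a", "b", 9), ("a", "c", 4), ("b", "c", 7),
            ("b", "d", 2), ("c", "d", 6), ("d", "e", 1)]

G = nx.Graph()
for u, v, n in contacts:
    # Frequent contact -> large weight and small distance, so nodes that
    # often meet are strongly related in the aggregated graph.
    G.add_edge(u, v, weight=n, distance=1.0 / n)

ego_degree = {v: sum(d["weight"] for _, _, d in G.edges(v, data=True)) for v in G}
betweenness = nx.betweenness_centrality(G, weight="distance")

nodes = sorted(G)
x = np.array([ego_degree[v] for v in nodes])
y = np.array([betweenness[v] for v in nodes])
print("ego degree vs betweenness correlation:", np.corrcoef(x, y)[0, 1])
```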
Effect of coupling on software faults: An empirical study
Pub Date: 2017-03-01 | DOI: 10.1109/C-CODE.2017.7918930
Sajid Anwer, Ahmad Adbellatif, M. Alshayeb, Muhammad Shakeel Anjum
A software product's quality is one of the important aspects that affect the user, the developer, and the product itself. Measuring quality in the early phases of the project life cycle is a major goal of project planning, and several research studies have proposed ways to measure software product quality attributes. In this paper, we empirically study the impact of the afferent coupling (Ca), efferent coupling (Ce), and coupling between objects (CBO) metrics on fault prediction using bivariate correlation. We build a prediction model over these metrics using multivariate logistic regression. A case study of an open-source object-oriented system is used to evaluate the correlation between coupling metrics and faults. The results indicate that efferent coupling (Ce) is a better indicator of faults than afferent coupling (Ca) and coupling between objects (CBO).
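To make the modelling step concrete, here is a hedged sketch with scikit-learn on a made-up toy dataset of per-class coupling metrics and binary fault labels; the paper's actual data comes from an open-source system, and these numbers are invented for illustration only.

```python
# Toy data: per-class coupling metrics (Ca, Ce, CBO) and a fault label.
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import LogisticRegression

X = np.array([[2, 8, 5], [1, 1, 2], [4, 9, 7], [0, 2, 1],
              [3, 7, 6], [1, 0, 1], [5, 6, 8], [0, 3, 2]])
y = np.array([1, 0, 1, 0, 1, 0, 1, 0])  # 1 = class had a reported fault

# Bivariate correlation of each metric with faultiness
for name, col in zip(["Ca", "Ce", "CBO"], X.T):
    r, p = pearsonr(col, y)
    print(f"{name}: r={r:.2f}, p={p:.3f}")

# Multivariate logistic regression over all three metrics together
model = LogisticRegression().fit(X, y)
print("coefficients (Ca, Ce, CBO):", model.coef_[0])
```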
An Efficient Dynamic Round Robin algorithm for CPU scheduling
Pub Date: 2017-03-01 | DOI: 10.1109/C-CODE.2017.7918936
M. U. Farooq, Aamna Shakoor, A. Siddique
The efficiency of embedded systems depends mainly on the process scheduling policy of the operating systems running on them: better scheduling algorithms make a system faster while using fewer resources. Among the important scheduling algorithms, Round Robin is one of the most efficient, but its efficiency depends heavily on the chosen time quantum. In this paper, we develop an efficient Round Robin algorithm with a dynamic time quantum. Similar systems have been developed before, but they rely on other algorithms, and their running time is much higher because they sort the processes, which is impractical. Our goal is therefore to reduce the running time of the algorithm while also improving efficiency metrics such as the number of context switches and the average waiting and turnaround times: the lower these are, the more efficient the operating system, and thus the better the embedded system. In the last section of this paper, we present a comparison of our system with previously developed algorithms.
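As an illustration of a dynamic time quantum (one plausible variant, not necessarily the authors' exact rule), the sketch below recomputes the quantum each round as the mean of the remaining burst times and reports context switches plus average waiting and turnaround times, assuming all processes arrive at time zero.

```python
# Hedged sketch of dynamic-quantum Round Robin; arrival times are assumed
# to be zero and the mean-of-remaining-bursts rule is an illustration.
from collections import deque

def dynamic_round_robin(bursts):
    remaining = dict(enumerate(bursts))
    queue = deque(remaining)
    time, switches = 0, 0
    finish = {}
    while queue:
        # Dynamic quantum: mean remaining burst of the processes in queue
        quantum = sum(remaining[p] for p in queue) / len(queue)
        for _ in range(len(queue)):          # one round over current queue
            p = queue.popleft()
            run = min(remaining[p], quantum)
            time += run
            remaining[p] -= run
            if remaining[p] > 0:
                queue.append(p)              # preempted: requeue
                switches += 1
            else:
                finish[p] = time
    turnaround = [finish[p] for p in range(len(bursts))]
    waiting = [t - b for t, b in zip(turnaround, bursts)]
    return switches, sum(waiting) / len(bursts), sum(turnaround) / len(bursts)

print(dynamic_round_robin([24, 3, 3]))  # (switches, avg wait, avg turnaround)
```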
Assessment of quality of rice grain using optical and image processing technique
Pub Date: 2017-03-01 | DOI: 10.1109/C-CODE.2017.7918940
Z. Parveen, Muhammad Anzar Alam, Hina Shakir
Rice is among the most popular and widely consumed foods in the world, and researchers are working to improve its quality. Measuring rice quality matters both because rice is consumed as food and because it goes through milling for the national and international markets. Many researchers have already worked on grain quality and proposed different techniques to characterize the quality of rice. Chalkiness, a white opaque region in the rice grain, is one of the most important parameters used to evaluate grain quality. We propose an image processing technique that uses the extended maxima operator to detect the chalky area in a rice grain, and we also measure grain dimensions and color to classify rice grains. The experiment was performed on 22 sample images of rice grains to test the proposed method, and the results were validated by visual inspection.
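The extended maxima operator is the set of regional maxima that survive the h-maxima transform (suppression of peaks shallower than h). A minimal Python sketch using scikit-image is shown below, assuming a hypothetical single-grain image rice_grain.png on a dark background; the h value and the Otsu split are illustrative, not the paper's calibrated settings.

```python
# Sketch under stated assumptions: "rice_grain.png" is a hypothetical
# single-grain image; h=0.2 is a guess, not the paper's parameter.
from skimage import io, color, filters
from skimage.morphology import h_maxima

gray = color.rgb2gray(io.imread("rice_grain.png"))
grain_mask = gray > filters.threshold_otsu(gray)   # grain vs background

# Extended maxima: regional maxima remaining after suppressing peaks
# shallower than h, which picks out the bright chalky patch.
chalky = h_maxima(gray, h=0.2).astype(bool) & grain_mask

ratio = chalky.sum() / max(grain_mask.sum(), 1)
print(f"chalky area: {100 * ratio:.1f}% of grain")
```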
Comparison of different de-noising techniques for removal of poison noise from cervical X-Rays images
Pub Date: 2017-03-01 | DOI: 10.1109/C-CODE.2017.7918943
Muhammad Asim, M. Akram, A. A. Salam
Digital X-ray images often suffer from poor quality, and one of the main reasons is Poisson noise, which is introduced during the acquisition process. Different de-noising techniques for enhancing X-ray images have been presented in the literature. This paper presents a comparative study of noise removal from X-ray images. The comparison covers bilateral filtering, the Dual Tree Complex Wavelet Transform, the Gaussian filter, the Wiener filter, and Non-Local Means filtering, and the techniques are evaluated on cervical X-ray images from the NHANES-II database. The significance of each technique is assessed using different performance measures.
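A hedged sketch of such a comparison pipeline on a standard test image: add Poisson noise, denoise with a few of the studied filters, and score each result with PSNR. The filter parameters are illustrative rather than the paper's settings, and the bilateral and DT-CWT variants are omitted for brevity.

```python
# Compare denoisers against simulated Poisson (photon) noise via PSNR.
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.signal import wiener
from skimage import data, img_as_float
from skimage.restoration import denoise_nl_means
from skimage.metrics import peak_signal_noise_ratio

clean = img_as_float(data.camera())
rng = np.random.default_rng(0)
noisy = rng.poisson(clean * 255) / 255.0   # Poisson noise model

results = {
    "Gaussian": gaussian_filter(noisy, sigma=1),
    "Wiener": wiener(noisy, mysize=5),
    "NL-means": denoise_nl_means(noisy, h=0.05, fast_mode=True),
}
for name, out in results.items():
    psnr = peak_signal_noise_ratio(clean, out, data_range=1.0)
    print(f"{name}: PSNR = {psnr:.2f} dB")
```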
Morphological operations and re-projection based novel low-dose CT reconstruction scheme
Pub Date: 2017-03-01 | DOI: 10.1109/C-CODE.2017.7918965
Umar Talha, Tariq Mairaj, Waleed Bin Yousuf, Omer Zia
Computed Tomography (CT) has been a major contributor to revolutionizing and commercializing the medical imaging industry, but it comes with significant drawbacks. Most commonly used CT reconstruction algorithms need a heavy dose of hazardous X-ray radiation: a higher X-ray dose produces better reconstructed image resolution, whereas projection data acquired at a low radiation dose induces artifacts in the reconstructed image and reduces the image quality of conventional CT reconstruction algorithms [1]. This paper presents a novel algorithm for efficient CT reconstruction from low-dose projection data. In the proposed method, the phantom is subjected to a low X-ray dose, and the projection data is then enhanced using a series of post-processing algorithms and a new interpolation technique. The proposed algorithm is supported by computer simulation, in which promising results were observed: it reduces the radiation dose to a great extent while producing a reconstructed image of good quality.
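The general low-dose pipeline (noisy few-view sinogram, sinogram restoration, reconstruction) can be sketched as below. This is not the authors' specific morphological/re-projection scheme, just a minimal stand-in using filtered back-projection from scikit-image, with invented noise and filter settings.

```python
# Simulate a low-dose acquisition, restore the sinogram, reconstruct.
import numpy as np
from scipy.ndimage import median_filter
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

phantom = rescale(shepp_logan_phantom(), 0.5)
theta = np.linspace(0.0, 180.0, 45, endpoint=False)   # few views ~ low dose
sino = radon(phantom, theta=theta)
noisy_sino = np.random.default_rng(0).poisson(sino * 10) / 10.0  # photon noise

smoothed = median_filter(noisy_sino, size=3)   # simple sinogram restoration
recon = iradon(smoothed, theta=theta, filter_name="ramp")
print("reconstruction MSE:", float(np.mean((recon - phantom) ** 2)))
```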
Adaptive processing of image using DWT and FFT OFDM in AWGN and Rayleigh channel
Pub Date: 2017-03-01 | DOI: 10.1109/C-CODE.2017.7918955
A. Mannan, A. Habib
Wireless communication plays a very important role in everyone's day-to-day life. Wireless technology is growing rapidly, and there is a growing demand for high-performance, high-capacity, high-bit-rate wireless communication systems that can support services such as high-speed data, video, and voice. Multicarrier modulation schemes such as OFDM provide an efficient solution to this problem. In this paper, a grayscale image is processed using an LMS algorithm with a wavelet-based OFDM system employing QPSK modulation over AWGN and Rayleigh channels in a SISO environment, and the results are compared with a conventional adaptive FFT-based OFDM system. In both systems we reconstruct the transmitted signal at the receiver by minimizing the error with an adaptive filter, but the computational complexity of the FFT-based system is higher than that of the DWT-based system. Results are compared in terms of SNR versus BER and show that the adaptive DWT-based OFDM system performs better than the conventional adaptive FFT-based OFDM system.
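The LMS adaptive filter is the component shared by both receivers; a self-contained sketch of complex LMS adaptation over a toy multipath channel with QPSK training symbols is given below. The channel taps, step size, and filter length are illustrative assumptions, and the OFDM framing (FFT or DWT) around the equalizer is omitted.

```python
# Complex-LMS equalizer sketch; all parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)
tx = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], 2000)  # QPSK symbols
rx = np.convolve(tx, [1.0, 0.4 + 0.2j])[: len(tx)]         # toy multipath
rx += 0.05 * (rng.standard_normal(2000) + 1j * rng.standard_normal(2000))

n_taps, mu = 5, 0.01
w = np.zeros(n_taps, dtype=complex)
for k in range(n_taps, len(tx)):
    x = rx[k - n_taps:k][::-1]      # most recent sample first
    e = tx[k] - np.vdot(w, x)       # error against known training symbol
    w += mu * np.conj(e) * x        # LMS weight update

print("final |error|:", abs(e))
```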
Cross gender voice morphing using Canonical Correlation Analysis
Pub Date: 2017-03-01 | DOI: 10.1109/C-CODE.2017.7918947
I. Baseer, Rabeea Basir
Voice morphing, one of the speech synthesis frameworks, aims in the simplest terms to transform a speaker's identity from a source to a target speaker while preserving the original content of the message. This paper presents a novel spectral envelope mapping algorithm based on Canonical Correlation Analysis (CCA) that finds the association between the spectral envelope characteristics of the source and target speakers, using correlation as the similarity metric. The speech also undergoes prosodic modification using PSOLA, since pitch frequency is another important parameter for varying identity. The morphing algorithm is evaluated on utterances from the freely available CMU-ARCTIC speech dataset. Subjective experiments show that the proposed method successfully transforms speaker identity and produces a high-quality morphed signal.
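A hedged sketch of the CCA step using scikit-learn, with random matrices standing in for paired per-frame spectral-envelope features of time-aligned source and target utterances; feature extraction from CMU-ARCTIC is not shown.

```python
# Random stand-ins for paired envelope features of aligned utterances.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
src = rng.standard_normal((500, 20))               # source envelope frames
tgt = (src @ rng.standard_normal((20, 20)) * 0.5
       + 0.1 * rng.standard_normal((500, 20)))     # correlated target frames

cca = CCA(n_components=8).fit(src, tgt)
src_c, tgt_c = cca.transform(src, tgt)
# Correlation of the first canonical pair: the similarity metric here
print("1st canonical correlation:",
      np.corrcoef(src_c[:, 0], tgt_c[:, 0])[0, 1])

mapped = cca.predict(src[:1])   # map a new source frame toward the target
```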
Quantification of PIR protocols privacy
Pub Date: 2017-03-01 | DOI: 10.1109/C-CODE.2017.7918908
R. Khan, Mohibullah, Muhammad Arshad Islam
In the current era, search engines are the best way to find information on the internet. Web search engines maintain user profiles to provide better search results, which can raise serious privacy issues. To keep a user's privacy intact with respect to a web search engine, private information retrieval (PIR) protocols are used, which hide the identity of the user by submitting his or her query through another group member. A basic problem with these protocols is their predictability. This paper extends previous work in which a person issuing an anonymous query was successfully identified; here, the aim is to find all queries submitted by a target user under the UPIR and UUP protocols. For experimentation, a machine-learning-based adversarial model is proposed that identifies the actual queries of the user of interest from his or her previous profile. The results show that the precision, recall, and F-measure of J48 in finding the user's real queries are above 0.70 on average; J48 also reported the highest true positive rate, above 0.7, and the lowest false positive rate, 0.006. It was also observed that the size of the training data had very little effect on accuracy in this experiment.
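A rough sketch of such an adversarial classifier, using scikit-learn's CART decision tree as a stand-in for Weka's J48 (C4.5): train on the target user's historical queries versus others', then flag which anonymised queries coming out of the PIR group look like the target's. All queries below are invented examples.

```python
# Invented example queries; sklearn's CART tree stands in for Weka's J48.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.tree import DecisionTreeClassifier

history = ["cheap flights lahore", "cricket scores today", "agile scrum tips",
           "garden tools", "knitting patterns", "stock prices today"]
labels = [1, 1, 1, 0, 0, 0]          # 1 = issued by the target user

vec = TfidfVectorizer().fit(history)
clf = DecisionTreeClassifier(random_state=0).fit(vec.transform(history), labels)

# Queries observed via the PIR group: which look like the target's?
mixed = ["cricket fixtures today", "knitting wool shop"]
print(dict(zip(mixed, clf.predict(vec.transform(mixed)))))
```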
A model for applying Agile practices in Distributed environment: A case of local software industry
Pub Date: 2017-03-01 | DOI: 10.1109/C-CODE.2017.7918933
K. Awar, M. M. I. Sameem, Y. Hafeez
Distributed teams face many obstacles when applying Agile practices in a distributed environment, and these obstacles are poorly understood by the local software industry. In the market, Agile and distributed software development have become a need more than a trend, yet merging the two appears problematic. There is a need to identify those Agile practices that are adoptable by distributed teams and that can help the local software industry produce quality software. This study contributes an empirical investigation of the critical factors affecting distributed Agile development, and it inductively formulates a theoretical model of how specific Agile alignment practices help mitigate problems in the distributed Agile software development process. The model presented in this paper captures the state of the art of the critical factors affecting distributed Agile environments as reported in the literature. As a further proof of concept, a case study was conducted to test the applicability of the proposed model in a local environment. The results provide a basis for further research in the local industry; the proposed model enables the application of Agile practices in a distributed environment by tailoring the Scrum and XP methods.