Computing feature vectors of students for face recognition using Apache Spark
Pub Date: 2019-12-01 | DOI: 10.1109/ICECCO48375.2019.9043282
D. Kariboz, A. Bogdanchikov, K. Orynbekova
With the massive increase of images and videos on the internet, processing these data for face detection and recognition has become a bottleneck; in particular, there is a lack of frameworks and tools that can process large volumes of data efficiently and without excessive time cost. The system proposed in this paper offers a face recognition framework based on Apache Spark to analyze large numbers of images and videos from Instagram. Since the people to be searched for and recognized are university students, the feature vectors used for recognition are computed in advance and can be extended at any moment. Spark’s Resilient Distributed Datasets (RDDs) make it possible to keep all intermediate data directly in memory, speeding up computation while preserving resilience.
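As a rough illustration of the pipeline described above, the sketch below distributes per-image face encoding over a Spark RDD so that the resulting feature vectors stay in memory between stages. The paper does not name its embedding model or storage layout; the face_recognition library, the HDFS paths, and the pickle output used here are assumptions for illustration only.

```python
# Hedged sketch: distributing per-image face-embedding computation with Spark RDDs.
# The embedding backend (face_recognition) and the HDFS paths are assumptions, not
# the authors' implementation.
from pyspark import SparkContext
import face_recognition  # assumed embedding backend, not named in the paper

def encode(path):
    image = face_recognition.load_image_file(path)
    encodings = face_recognition.face_encodings(image)   # 128-d vector per detected face
    return [(path, e.tolist()) for e in encodings]

sc = SparkContext(appName="StudentFaceVectors")
paths = sc.textFile("hdfs:///instagram/images/paths.txt")   # one image path per line (hypothetical)
vectors = paths.flatMap(encode).cache()                      # kept in memory across jobs
vectors.saveAsPickleFile("hdfs:///students/feature_vectors")
sc.stop()
```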
{"title":"Computing feature vectors of students for face recognition using Apache Spark","authors":"D. Kariboz, A. Bogdanchikov, K. Orynbekova","doi":"10.1109/ICECCO48375.2019.9043282","DOIUrl":"https://doi.org/10.1109/ICECCO48375.2019.9043282","url":null,"abstract":"With a massive increase of images and videos in the internet, processing these data for face detection and recognition is a bottleneck in current situation. At least there is a lack of frameworks and tools to process lots of data efficiently and not time consuming. System proposed in this paper offers face recognition framework based on Apache Spark to analyze lots images and videos from Instagram. As a people need to be searched and recognized are university students, their feature vectors used for recognition initially computed and can be enlarged at any moment. Spark’s Resilient Distributed Databases gives ability to compute all the intermediate data directly in memory making computations faster meanwhile keeping resilience property.","PeriodicalId":166322,"journal":{"name":"2019 15th International Conference on Electronics, Computer and Computation (ICECCO)","volume":"64 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127105223","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Improving the Result of Personalized Questionnaire Towards Solving Cold User Problem
Pub Date: 2019-12-01 | DOI: 10.1109/ICECCO48375.2019.9043224
M. Abubakar, K. Umar
Collaborative filtering is among the most popular approaches used in product recommender systems; it uses ratings and predictions to make new suggestions. However, the major weakness of collaborative filtering approaches is the cold user problem. The literature shows that the cold user problem can be effectively addressed with the active learning technique of administering a personalized questionnaire. Unfortunately, the result of the personalized questionnaire technique can contain user preference uncertainties when the product database is very large (as at Amazon.com). This work addresses that weakness by applying the active learning technique of uncertainty reduction to the result obtained from administering the personalized questionnaire. This strategy has the potential to resolve the user preference uncertainties associated with the questionnaire result. The research is in progress, and preliminary results are encouraging.
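The abstract does not spell out how the uncertainty-reduction step works; the sketch below is a minimal, hypothetical illustration of one such active learning step: asking the cold user about the catalogue items whose community ratings disagree the most, on the assumption that their answers remove the most preference uncertainty.

```python
# Hedged illustration (not the authors' algorithm): pick the items with the highest
# rating variance across users, i.e. the items whose answers would most reduce
# uncertainty about a cold user's preferences.
import numpy as np

def select_questionnaire_items(ratings, k=5):
    """ratings: (n_users, n_items) array with np.nan for missing ratings."""
    variance = np.nanvar(ratings, axis=0)        # community disagreement per item
    return np.argsort(variance)[::-1][:k]        # most uncertain items first

rng = np.random.default_rng(0)
demo = rng.integers(1, 6, size=(100, 20)).astype(float)
demo[rng.random(demo.shape) < 0.7] = np.nan      # sparse, as in real rating data
print(select_questionnaire_items(demo))
```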
{"title":"Improving the Result of Personalized Questionnaire Towards Solving Cold User Problem","authors":"M. Abubakar, K. Umar","doi":"10.1109/ICECCO48375.2019.9043224","DOIUrl":"https://doi.org/10.1109/ICECCO48375.2019.9043224","url":null,"abstract":"Collaborative filtering techniques is among the popular approaches used in addressing product recommender systems, which uses ratings and predictions to make new suggestions. However the major weakness of collaborative filtering approaches is cold user problem. Literature investigation has shown that cold user problem could be effectively addressed using active learning technique of administering personalized questionnaire. Unfortunately, the result of personalized questionnaire technique could contain some user preference uncertainties where the product database is too large (as in Amazon.com). This research work tends to address the weakness of personalized questionnaire technique by applying the active learning technique of uncertainty reduction over the result obtained from administering personalized questionnaire. This strategy has the tendency of resolving user preference uncertainties that could be associated with the result of personalized questionnaire. This research work is in progress. Preliminary result is encouraging.","PeriodicalId":166322,"journal":{"name":"2019 15th International Conference on Electronics, Computer and Computation (ICECCO)","volume":"52 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124346623","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Tagging Part of Speech in Hausa Sentences
Pub Date: 2019-12-01 | DOI: 10.1109/ICECCO48375.2019.9043198
A. Tukur, K. Umar, A. Muhammad
Despite the recent exponential growth of Hausa online content on websites such as Aminiya.dailytrust.com.ng, Freedomradio.com, Hausa.Leadership.ng, Arewa24.com, and BBC.com/hausa, sentiment analysis of such Hausa web content has not been explored in the research community and remains an open research area. Sentiment analysis, however, has prerequisite activities such as stemming/lemmatization of words and part-of-speech (POS) tagging. Unfortunately, for the Hausa language, research on tools and techniques for these prerequisite activities is still at an early stage, with little work done on stemming and, so far, none on lemmatization or POS tagging. Consequently, this work proposes a technique and model for POS tagging of Hausa sentences towards the realization of sentiment analysis of Hausa web content. The work is in progress, and preliminary results are encouraging.
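As a minimal sketch of the kind of statistical POS tagger such work builds towards (not the authors' model), the example below trains an NLTK unigram tagger with a most-frequent-tag backoff on a tiny hand-made Hausa sample; the sentences and tag set are illustrative only.

```python
# Hedged sketch: a unigram POS tagger with a default-tag backoff, trained on a
# tiny illustrative Hausa sample (not the paper's corpus or tag set).
import nltk

toy_tagged = [
    [("yaro", "NOUN"), ("ya", "PRON"), ("tafi", "VERB"), ("kasuwa", "NOUN")],
    [("ta", "PRON"), ("ci", "VERB"), ("abinci", "NOUN")],
]
backoff = nltk.DefaultTagger("NOUN")                      # fall back to the most common tag
tagger = nltk.UnigramTagger(toy_tagged, backoff=backoff)  # per-word most-frequent tag
print(tagger.tag(["yaro", "ya", "ci", "abinci"]))
```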
{"title":"Tagging Part of Speech in Hausa Sentences","authors":"A. Tukur, K. Umar, A. Muhammad","doi":"10.1109/ICECCO48375.2019.9043198","DOIUrl":"https://doi.org/10.1109/ICECCO48375.2019.9043198","url":null,"abstract":"Despite the recent exponential growth of Hausa online content on websites like Aminiya.dailytrust.com.ng, Freedomradio.com, Hausa.Leadership.ng, Arewa24.com, and BBC.com/hausa, Sentiment Analysis on such Hausa web content have not been explored in the research community. Thus, Sentiment Analysis of Hausa based web content is a virgin researchable area. However, there are some pre-requisite activities prior to conducting sentiment analysis such as stemming/Lemmatization of words, and Part of Speech (POS) tagging. Unfortunately, for Hausa language, researches towards tools and techniques for these pre-requisite activities are still in the infancy stage, with little work done on stemming of words, but no work so far on Lemmatization and POS tagging. Consequently, this research work proposes a technique/model for POS tagging of Hausa sentences towards the realization of sentiment analysis of Hausa web content. The work is in progress. Preliminary results are encouraging.","PeriodicalId":166322,"journal":{"name":"2019 15th International Conference on Electronics, Computer and Computation (ICECCO)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130535284","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A Microcontroller Based Heart Rate Monitoring and Conditions Detection System
Pub Date: 2019-12-01 | DOI: 10.1109/ICECCO48375.2019.9043190
Faiza Jibril, Steve A. Adeshina, Sadiq Thomas, Ali Nyangwarimam Obadia, Oiza Sala, Oluwatomisin E. Aina
Cardiovascular disease is one of the leading causes of death in Nigeria, with a staggering figure of about 150,000 deaths annually as compiled by the Nigerian Heart Foundation (NHF). A good number of these deaths result from incorrect observation and late detection of the disease. This system provides a means of reducing the number of deaths caused by cardiovascular diseases through the design and implementation of a portable and cost-effective device for detecting and monitoring heart rate conditions. System performance testing was carried out by measuring the pulse rate and detecting the state of the heart at 5-minute intervals for 50 minutes. The results show an average error of 2.8% in comparison with the data measured using the Cardiac Diagnosis application and an error of 2.2% in comparison with the traditional means of measuring heart rate. Hence the designed system can be used effectively to monitor and detect cardiovascular diseases.
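A minimal sketch of the error comparison reported above: the average percentage error of the device's readings against a reference measurement. The sample readings are invented; the paper reports only the resulting averages (2.8% and 2.2%).

```python
# Hedged sketch of the reported comparison; the sample values below are hypothetical.
def mean_percent_error(device_bpm, reference_bpm):
    errors = [abs(d - r) / r * 100.0 for d, r in zip(device_bpm, reference_bpm)]
    return sum(errors) / len(errors)

device = [72, 75, 80, 78]      # hypothetical readings at 5-minute intervals
reference = [71, 77, 79, 80]   # hypothetical reference (e.g. manual pulse count)
print(f"average error: {mean_percent_error(device, reference):.1f}%")
```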
{"title":"A Microcontroller Based Heart Rate Monitoring and Conditions Detection System","authors":"Faiza Jibril, Steve A. Adeshina, Sadiq Thomas, Ali Nyangwarimam Obadia, Oiza Sala, Oluwatomisin E. Aina","doi":"10.1109/ICECCO48375.2019.9043190","DOIUrl":"https://doi.org/10.1109/ICECCO48375.2019.9043190","url":null,"abstract":"Cardio Vascular Disease is one of the leading causes of deaths in Nigeria with a staggering number of about 150,000 deaths annually as compiled by the Nigerian Heart Foundation (NHF). A good number of these deaths are a result of wrong observation and late detection of the disease. This system provides a means of reducing the number of deaths caused by cardiovascular diseases through the design and implementation of a portable and cost-effective device for the detection and monitoring of heart rate conditions. System performance testing was carried out by measuring the pulse rate and detecting the state of the heart at 5-minute intervals for 50 minutes. The produced results of the system show an average error of 2.8% in comparison to the data measured by the application of Cardiac diagnosis and an error of 2.2% in comparison to the traditional means of measuring heart rate. Hence the designed system can be used effectively to monitor and detect cardiovascular diseases.","PeriodicalId":166322,"journal":{"name":"2019 15th International Conference on Electronics, Computer and Computation (ICECCO)","volume":"55 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124630134","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Sensor Placement For Wireless Localization
Pub Date: 2019-12-01 | DOI: 10.1109/ICECCO48375.2019.9043187
Omotayo Oshiga, Oluwatomisin E. Aina, A. Obadiah, A. Salihu, Gokhan Koyunlu
Given a number of sensors and their characteristics, how should the sensors be deployed so as to minimize location estimation errors? The design of sensor deployment techniques still receives little attention. In this paper, we address the problem of optimally deploying sensors using tight frames to minimize the mean square error of the user location. By employing some of the properties of tight frames, it is seen that the existence of an optimal geometry is governed by the fundamental Parseval equality of tight frames. A CRLB-based algorithm, utilizing the properties of tight frames together with a coordinate descent method, is presented for placing sensors so as to minimize the mean square error for multiple users in a coverage area. Finally, simulation analysis illustrates the achievable gains in user mean square error when sensors are optimally placed using tight frames.
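The Parseval condition mentioned above can be checked numerically: bearing vectors spread uniformly around a target form a tight frame, whereas clustered bearings do not. The sketch below is an illustrative check of that property, not the authors' CRLB-based placement algorithm.

```python
# Hedged illustration of the tight-frame property: N >= 3 uniformly spread unit bearing
# vectors have frame operator (N/2) * I (the Parseval-type condition), while a clustered
# geometry does not.
import numpy as np

def frame_operator(angles):
    directions = np.stack([np.cos(angles), np.sin(angles)], axis=1)  # unit bearing vectors
    return directions.T @ directions                                 # sum of outer products

N = 5
uniform = frame_operator(2 * np.pi * np.arange(N) / N)
clustered = frame_operator(np.deg2rad([0, 5, 10, 15, 20]))
print(np.allclose(uniform, (N / 2) * np.eye(2)))    # True: tight frame (good geometry)
print(np.allclose(clustered, (N / 2) * np.eye(2)))  # False: poorly spread geometry
```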
{"title":"Sensor Placement For Wireless Localization","authors":"Omotayo Oshiga, Oluwatomisin E. Aina, A. Obadiah, A. Salihu, Gokhan Koyunlu","doi":"10.1109/ICECCO48375.2019.9043187","DOIUrl":"https://doi.org/10.1109/ICECCO48375.2019.9043187","url":null,"abstract":"Given a number of sensors, their characteristics, how should the sensors be deployed so as to minimize the location estimation errors. Yet, the design of sensor deployment techniques still receives little attention. In this paper, we address the problem of optimal deployment of sensors using tight frames for the minimization of the mean square error on the user location. By employing, some of the properties of tight frames, it is seen that the existence of an optimal geometry is governed by the fundamental parseval equality of tight frames. A CRLB-based algorithm utilizing the properties of tight frames and a coordinate descent method is presented for placing sensors which minimize the mean square error of multiple users in a coverage area. Finally, simulation analysis is performed to illustrate the achievable gains in minimizing the mean square error of users by optimally placing sensors using tight frames.","PeriodicalId":166322,"journal":{"name":"2019 15th International Conference on Electronics, Computer and Computation (ICECCO)","volume":"244 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124691622","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Smart Traffic Management System
Pub Date: 2019-12-01 | DOI: 10.1109/ICECCO48375.2019.9043219
A. Miyim, M. A. Muhammed
Millions of vehicles pass through roads and cities every day, and various economic, social, and cultural factors drive the growth of traffic congestion. Congestion has major impacts in the form of accidents, loss of time, cost, delays to emergency services, and more: workers lose productivity, people lose time, trade opportunities are missed, and deliveries are delayed, all of which increase costs. To address these congestion problems, a new robust and smart solution based on Vehicle-to-Infrastructure (V2I) technology, capable of addressing road accidents and traffic management in Nigeria’s mega cities, is proposed. The proposed system serves as an alternative to the existing traffic management system, with an intersection control station that communicates through the V2I network with vehicles approaching the intersection. The vehicles are equipped with Dashboard Traffic Light (DBTL) sensors that communicate with the infrastructure. A Safe-to-Pass-First (SPF) algorithm was designed that considers real-time speed, vehicle position, and other data to decide when to allow a vehicle to pass through the intersection; it checks the status of conflicting lanes to ensure that vehicles pass the intersection safely. This method was found to be more efficient than existing methods: the average waiting time at the intersection is reduced by 23% and improved throughput is recorded. Python code and SUMO were used for the simulation.
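The abstract does not give the SPF rule in detail, so the sketch below is a simplified, hypothetical reconstruction: a vehicle is granted passage only if no conflicting lane is occupied and its time to the intersection exceeds an assumed safety gap.

```python
# Hedged, simplified reconstruction of a Safe-to-Pass-First style check; the time-gap
# threshold and the conflicting-lane test are assumptions, not the authors' exact rule.
from dataclasses import dataclass

@dataclass
class Vehicle:
    speed_mps: float            # real-time speed reported over V2I
    distance_to_stop_m: float   # position relative to the intersection

def safe_to_pass_first(vehicle, conflicting_lane_occupied, min_gap_s=3.0):
    if conflicting_lane_occupied:
        return False                                    # a conflicting vehicle holds the intersection
    if vehicle.speed_mps <= 0.1:
        return False                                    # stationary vehicles wait for a grant
    time_to_intersection = vehicle.distance_to_stop_m / vehicle.speed_mps
    return time_to_intersection >= min_gap_s            # enough margin for the DBTL to signal "pass"

print(safe_to_pass_first(Vehicle(speed_mps=12.0, distance_to_stop_m=60.0),
                         conflicting_lane_occupied=False))
```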
{"title":"Smart Traffic Management System","authors":"A. Miyim, M. A. Muhammed","doi":"10.1109/ICECCO48375.2019.9043219","DOIUrl":"https://doi.org/10.1109/ICECCO48375.2019.9043219","url":null,"abstract":"Millions of vehicles pass via roads and cities every day. Various economic, social and cultural factors affect growth of traffic congestion. The effect of traffic congestion has major impacts on accidents, loss of time, cost, delay of emergency, etc. Due to traffic congestions there is a loss in productivity from workers, people lose time, trade opportunities are lost, delivery gets delayed leading to increasing cost. In providing solutions to these congestion problems, a new robust and smart solution that is based on Vehicle-to-Infrastructure (V2I) technology capable of addressing road accident and traffic management in Nigeria’s mega cities is proposed. In this paper, the proposed system serves as an alternative to the existing traffic management system with an intersection control station that communicates with vehicles approaching the intersection through the V2I network. The vehicles are equipped with Dashboard Traffic Light (DBTL) sensors that communicate with the infrastructure. A Safe-to-Pass-First (SPF) algorithm was designed that considered real time speed, vehicle position and data to decide when to allow the vehicle to pass through the intersection. The algorithm checks the status of conflicting lanes to ensure that vehicle pass the intersection safely. This method has been found to be more efficient than the existing methods as the average waiting time at the intersection is reduced by 23% and improved throughput recorded, Python code and SUMO were used for the simulation.","PeriodicalId":166322,"journal":{"name":"2019 15th International Conference on Electronics, Computer and Computation (ICECCO)","volume":"10 34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123372252","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Determination of Coverage Area of Nigeria Television Authority (NTA), Television Signal in Kebbi State, Nigeria
Pub Date: 2019-12-01 | DOI: 10.1109/ICECCO48375.2019.9043244
Aaron Tsado Kolo, Moses Stephen Abiodun, J. A. Ezenwora
The worldwide development of thousands of terrestrial broadcasting networks over the past 90 years has depended crucially upon the prediction and measurement of radio field strength. This work determines the actual coverage area of the NTA television signal in Kebbi State, Nigeria, by quantitatively measuring the signal level. The signal level of the Nigeria Television Authority (NTA) Birnin-Kebbi channel 39 transmitter (615.25 MHz), together with the corresponding distances, was measured along several radial routes with the transmitting station as the reference point. The measurements were taken using a digital signal level meter and a Global Positioning System (GPS) receiver. From the data obtained, the Surfer 13 software application was used to draw a contour map of the signal level around the transmitting station and thereby determine the station's coverage area. The results show that the present configuration of the station's transmitter does not give optimal coverage of the state: only 4.05% of the state's entire land mass has television signal coverage, so the greater part of Kebbi State is completely outside NTA television coverage. There is therefore a need for repeater stations at intervals to ensure reception of the television signal throughout the state.
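As a stand-in for the Surfer 13 contouring step (the paper's actual tool), the sketch below interpolates scattered signal-level measurements onto a grid and estimates the fraction of the area above a reception threshold; the coordinates, levels, and threshold are illustrative, not the survey data.

```python
# Hedged stand-in for the contour-mapping step: interpolate measured signal levels onto
# a grid and estimate the covered-area percentage. All values below are synthetic.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(1)
points = rng.uniform(0, 100, size=(40, 2))                 # measurement sites (km grid)
levels = 60 - 0.8 * np.linalg.norm(points - 50, axis=1)    # level falls off with distance

gx, gy = np.mgrid[0:100:200j, 0:100:200j]
grid = griddata(points, levels, (gx, gy), method="linear") # contour surface (NaN outside hull)
threshold = 30                                             # assumed minimum usable level
valid = grid[~np.isnan(grid)]
print(f"estimated covered area: {np.mean(valid >= threshold) * 100:.1f}%")
```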
{"title":"Determination of Coverage Area of Nigeria Television Authority (NTA), Television Signal in Kebbi State, Nigeria","authors":"Aaron Tsado Kolo, Moses Stephen Abiodun, J. A. Ezenwora","doi":"10.1109/ICECCO48375.2019.9043244","DOIUrl":"https://doi.org/10.1109/ICECCO48375.2019.9043244","url":null,"abstract":"The worldwide development of thousands of terrestrial broadcasting networks over the past 90 years depended crucially upon the prediction and measurement of radio field strength. This work determines the actual coverage area of NTA television signal in Kebbi State, Nigeria, by quantitatively measuring the signal level of the signal. The signal level of the transmitter of Nigeria Television Authority (NTA), Birnin-Kebbi channel 39 (615.25 MHz), and the corresponding distances were measured along some radial routes with the transmitting stations as the reference point. This measurement was taken using Digital Signal Level Meter and Global Positioning System (GPS). From the data obtained, Surfer 13 software application was used to draw contour map of the signal level around the transmitting station to determine the coverage areas of the station. The result obtained show that the present configuration of the transmitter of the television station does not give an optimal coverage of the state. Only 4.05% of the entire land mass of the state has television signal coverage. Consequently, greater percentage of Kebbi State is completely out of NTA television signal coverage. So, there is need to have repeater stations at some intervals to ensure reception of the television signal throughout the state.","PeriodicalId":166322,"journal":{"name":"2019 15th International Conference on Electronics, Computer and Computation (ICECCO)","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129801523","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Modeling, Simulation and Analysis of Automotive Radar Signal Using Wavelet Transform Technique
Pub Date: 2019-12-01 | DOI: 10.1109/ICECCO48375.2019.9043256
N. Nathaniel, E. Ashigwuike, Abubkar U. Sadiq, N. A. Obadiah
This paper concerns the application of radar sensors in self-driven vehicles, where they are used to detect obstacles and provide accurate information about the vehicle’s ambient environment so that appropriate control commands can be activated. The sensor needs a computing platform that can ensure real-time processing of the received signals, and previous work shows that the main challenges lie in an appropriate algorithm, chipset, memory, and related resources capable of performing these tasks sufficiently. This work models and simulates radar sensor signals for automated driving using the Fast Fourier Transform (FFT) technique and analyses the FFT technique in terms of its merits and demerits for this application. The applicability of the Wavelet Transform (WT) technique for processing the Automotive Radar Signal (ARS) is then demonstrated by offering WT solutions to the FFT problems, modeling and simulating the following: (a) 1-D multi-signal WT operations; (b) a solution to the noise problem through wavelet denoising; (c) the use of WT for time-frequency reassignment and mode extraction with synchrosqueezing; and (d) the Discrete Wavelet Transform (DWT) and Continuous Wavelet Transform (CWT) of an ARS with a frequency break. All simulations are done using the MATLAB R2017b software. The focus of this work is the algorithmic aspect: showing how the WT technique, and which of its tools, could be used to develop an appropriate algorithm for Automotive Radar Signal Processing (ARSP) as applied in self-driven vehicles.
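The paper's simulations are in MATLAB R2017b; as a hedged Python analogue of experiment (d), the sketch below takes the discrete wavelet transform of a signal with an abrupt frequency break, the kind of transient whose location the detail coefficients localize in time. The signal parameters and wavelet choice are assumptions.

```python
# Hedged Python analogue (the paper uses MATLAB): multilevel DWT of a signal with a
# frequency break; the finest-scale detail coefficients localize the break in time.
import numpy as np
import pywt

fs = 1000
t = np.arange(0, 1, 1 / fs)
signal = np.where(t < 0.5,
                  np.sin(2 * np.pi * 50 * t),     # 50 Hz before the break
                  np.sin(2 * np.pi * 200 * t))    # 200 Hz after the break
coeffs = pywt.wavedec(signal, "db4", level=4)     # [cA4, cD4, cD3, cD2, cD1]
finest_detail = coeffs[-1]                        # spikes near the break at t = 0.5 s
print(len(coeffs), finest_detail.shape)
```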
{"title":"Modeling, Simulation and Analysis of Automotive Radar Signal Using Wavelet Transform Technique","authors":"N. Nathaniel, E. Ashigwuike, Abubkar U. Sadiq, N. A. Obadiah","doi":"10.1109/ICECCO48375.2019.9043256","DOIUrl":"https://doi.org/10.1109/ICECCO48375.2019.9043256","url":null,"abstract":"An application of radar sensor in self-driven vehicles, to be used in detecting obstacles and providing accurate information about the vehicle’s ambient environment to activate appropriate control commands. There is need for the sensor to have a computing platform that can ensure real-time processing of the received signals. From previous works, appropriate algorithm, chip-set, memory, etc. capable of performing these tasks sufficiently, are the main challenges. This work model and simulate Radar Sensor signals; Radar signal for automated driving using Fast Fourier Transform (FFT) Technique. Analysis on the FFT Technique is carried out; in terms of its merits and demerits in this application. Applicability of Wavelet Transform (WT) technique for processing of Automotive Radar Signal (ARS) is demonstrated by offering WT Technique Solutions to FFT Problems for ARS by modeling and simulating the following: (a) 1-D Multi-signal WT Operations; (b) Solution to the Noise Problems – Wavelet Denoising; (c) Use of WT for Time-Frequency Reassignment and Mode Extraction with Synchrosqueezing; (d) Discrete Wavelet Transform (DWT) and Continuous Wavelet Transform (CWT) of an ARS with a Frequency Break. All simulations are done using the MATLAB R2017b software. The focused of this work is in the area of appropriate algorithm: to show how the WT technique and which of its tools, and how those tools could be used in developing appropriate algorithm for Automotive Radar Signal Processing (ARSP) as applied in self-driven vehicles.","PeriodicalId":166322,"journal":{"name":"2019 15th International Conference on Electronics, Computer and Computation (ICECCO)","volume":"46 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134024998","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Time Series Analysis and prediction of bitcoin using Long Short Term Memory Neural Network
Pub Date: 2019-12-01 | DOI: 10.1109/ICECCO48375.2019.9043229
Temiloluwa I. Adegboruwa, Steve A. Adeshina, Moussa Mahamat Boukar
Bitcoin is the first digital currency that uses decentralization to solve the issue of trust in performing the functions of a digital currency successfully. This digital currency has shown extraordinary growth and intermittent plunges in value and market capitalization over time, which makes it important to understand what determines the volatility of bitcoin and to what extent it is predictable. Long Short-Term Memory neural networks (LSTM-NN) have recently grown popular for time series prediction, but there is no consensus on how to model time series inputs for LSTMs. This paper argues that this problem should be addressed through experimental research on the efficacy of an LSTM-NN as a function of the form of its time-series input features.
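A minimal sketch of the experimental setup the paper calls for: shaping a univariate price series into (samples, timesteps, features) windows and fitting a small Keras LSTM. The window length, network size, and the synthetic series are assumptions; the input shaping is exactly the design choice the paper identifies as unsettled.

```python
# Hedged sketch: windowed input shaping for an LSTM price predictor. The series is
# synthetic; window length and layer sizes are arbitrary choices for illustration.
import numpy as np
from tensorflow import keras

def make_windows(series, window=30):
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X[..., np.newaxis], y                   # (samples, timesteps, 1 feature)

prices = np.cumsum(np.random.randn(500)) + 100     # synthetic stand-in for BTC closes
X, y = make_windows(prices)

model = keras.Sequential([
    keras.layers.LSTM(32, input_shape=(X.shape[1], 1)),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
print(model.predict(X[-1:]).ravel())               # next-step prediction from the last window
```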
{"title":"Time Series Analysis and prediction of bitcoin using Long Short Term Memory Neural Network","authors":"Temiloluwa I. Adegboruwa, Steve A. Adeshina, Moussa Mahamat Boukar","doi":"10.1109/ICECCO48375.2019.9043229","DOIUrl":"https://doi.org/10.1109/ICECCO48375.2019.9043229","url":null,"abstract":"Bitcoin is the first digital currency that uses decentralization to solve the issue of trust in performing the functions of a digital currency successfully. This digital currency has shown extraordinary growth and intermittent plunge in value and market capitalization over time. This makes it important to understand what determines the volatility of bitcoin and to what extent they are predictable. Long Short Term Memory Neural Networks (LSTM-NN) have recently grown popular for time series prediction systems but there has been no consensus on methods to model time series inputs for LSTMs, this paper proposes the need for this problem to be solved by conducting an experimental research on the efficacy of an LSTM-NN given the form of its time-series input features.","PeriodicalId":166322,"journal":{"name":"2019 15th International Conference on Electronics, Computer and Computation (ICECCO)","volume":"220 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132684988","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Memristor-Based CiM Architecture for Big Data Era
Pub Date: 2019-12-01 | DOI: 10.1109/ICECCO48375.2019.9043218
E. Apollos, Steve A. Adeshina, N. A. Nnanna
In this paper, a memristor-based Computation-in-Memory (CiM) architecture is introduced to mitigate the challenges that conventional CMOS technologies and the von Neumann architecture face with the emergence of the Big Data era. As shown through the design and simulation presented in this paper, memristor devices offer high switching speed, non-volatility, compact density through crossbar arrays, chaotic and non-binary behaviour, and almost zero power and current leakage, making a memristor-based Computation-in-Memory architecture the technological shift needed to overcome the Big Data computing limits imposed by conventional computer architecture and CMOS process technologies. CMOS technology and the von Neumann architecture have reached their physical fabrication limits as transistor scaling goes below the 45 nm technology node, resulting in increasing delays in the metal interconnect for signal propagation, power leakage, low data reliability, security issues, and a high cost of developing and building CMOS chip-fabrication facilities.
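The computation-in-memory idea can be illustrated in a few lines: in a memristor crossbar, applying input voltages to the rows produces column currents equal to a conductance-matrix times voltage-vector product (Ohm's and Kirchhoff's laws), so the multiply-accumulate happens inside the memory array. The conductance and voltage values below are arbitrary and purely illustrative.

```python
# Hedged illustration of crossbar computation-in-memory: column currents realize an
# analog matrix-vector product without moving the stored weights to a CPU.
import numpy as np

conductances = np.array([[1.0e-4, 2.0e-4, 0.5e-4],
                         [3.0e-4, 1.0e-4, 2.5e-4]])  # programmed memristor states (siemens)
voltages = np.array([0.2, 0.1])                      # inputs applied on the word lines (volts)
column_currents = conductances.T @ voltages          # analog MAC read out per bit line (amps)
print(column_currents)
```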
{"title":"Memristor-Based CiM Architecture for Big Data Era","authors":"E. Apollos, Steve A. Adeshina, N. A. Nnanna","doi":"10.1109/ICECCO48375.2019.9043218","DOIUrl":"https://doi.org/10.1109/ICECCO48375.2019.9043218","url":null,"abstract":"In this paper, memristor based Computation-in-Memory (CiM) architecture is introduced to mitigate today’s challenges faced with the conventional CMOS technologies and von Neumann architecture due to the emergence of Big data era. Memristor material has shown through design and simulation as presented in this paper where necessary to have high switching speed, non-volatile capability, compact density using crossbar array, chaotic and non-binary ability, almost zero power and current leakage thus making memristor-based Computation-in-Memory architecture the needed technology revolution to mitigate these Big data computing limits caused by the conventional computer architecture and CMOS process technologies. The CMOS technologies and von Neumann architecture have reached fabrication physical limit as transistor scaling goes below 45nm technology node thus resulting to increasing delays that occur in the metal interconnect for signal propagation in transistors, power leakages, low data reliability, security issues, and high cost of developing and building CMOS chip-fabrication facilities as scaling goes down below 45nm.","PeriodicalId":166322,"journal":{"name":"2019 15th International Conference on Electronics, Computer and Computation (ICECCO)","volume":"2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132529478","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}