Introduction: Liver biopsy is the main method in the diagnosis and treatment of paediatric liver pathologies. In the literature, major complication rates of paediatric liver biopsies range from 0% to 6.6%, and minor complication rates from 0% to 25%. In this study, we aimed to review the indications, results and complications of sonography-guided percutaneous core liver biopsies performed by an interventional radiologist in paediatric patients at a tertiary care centre.
Methods: We retrospectively evaluated the results, indications and complications of paediatric liver biopsies performed in our tertiary health centre between January 2017 and December 2020. Biopsies were performed with a 16G semi-automatic needle in 17 patients (29.8%) and with an 18G semi-automatic needle in 40 patients (70.2%). In patients older than 12 years, biopsies were performed under local anaesthesia alone; in younger patients, they were performed under general anaesthesia.
Results: Fifty-eight liver biopsies were obtained from 57 children (34 males, 23 females). The most common indications were elevated liver enzymes (33 patients), cholestasis (14 patients), and adiposity and metabolic problems (6 patients). The most common pathological diagnoses were chronic hepatitis (33 patients) and steatosis (10 patients). A major complication, a symptomatic subcapsular haematoma, developed after a biopsy performed with an 18G needle in only one patient (1.8%).
Conclusions: Consistent with the literature, sonography-guided percutaneous biopsy performed by interventional radiologists in paediatric patients is a safe method that can be used in diagnosis and treatment, with a low complication rate.
Omer Kazci, Ozlem Kadirhan, Cigdem Uner, Erdal Karavas, Berna Ucan, Sonay Aydin. "Paediatric liver biopsies: A single-centre experience in Erzincan Binali Yıldırım University." Published 2024-02-01, pp. 4-10. doi:10.1177/1742271X231157634.
Through technology, the world has become a global village, connecting geographically distant people as virtual neighbours. This presents many opportunities for individuals, businesses, and remote communities. The advantages of this connectivity were brought to the fore by COVID-19, as technology and the Internet kept the world connected and functioning. However, in Africa, although much has been done to connect the unconnected, rural underserved communities remain. Notwithstanding the benefits of technology and the Internet, the cyber-safety and cybersecurity of those communities living on the fringes of society – rural underserved communities – is unknown. Consequently, many researchers, noting the benefits of technology, have embarked on ICT for Development (ICT4D) projects. However, community-based ICT4D projects have been failing due to a myriad of factors, among them engagement misalignment, cultural faux pas, poor communication, power dynamics, and lack of community buy-in of end products. Participatory Design (PD) processes may be utilised to circumvent these shortcomings. Co-design extends PD by allowing researchers and participants to be equals in creating solutions to problems, contrary to traditional research approaches that abstract researchers from the community within which they work. To address these challenges, we present a co-designed community engagement protocol (co-CEP) for rural underserved communities in Oukwanyama, northern Namibia. This qualitative study was guided by the Ubuntu and Uushiindaism tenets of trust, neighbourliness, respect, familiarity, hospitality, and collective unity as pillars of fostering communication within four villages in Oukwanyama, Namibia. The aim was to understand how information flows in order to gain engagement from community members for cybersecurity research. We held four co-design sessions, one in each village, with each feeding into the next. The presented co-CEP comprises 13 elements that are key to successful community engagement in community-based technology projects, especially cybersecurity research. The co-CEP is part of a research study on co-designing cybersecurity practices with rural underserved communities.
G. Nhinda, Fungai Bhunu Shava. "Co-CEP: A co-designed community engagement protocol as a catalyst for cybersecurity research in Africa: The case of northern Namibia." Journal of Information & Optimization Sciences, 2023. doi:10.47974/jios-1407.
Wanting Chen, Xuanyi Wu, Z. Y. Chen, Y. Meng, Ruei-yuan Wang, Timothy Chen
Small, medium, and micro enterprises (SMMEs) started late in China and typically have poor financial information transparency and relatively weak, unstable profitability and asset strength. Commercial banks therefore bear greater risk when lending to SMMEs than to large enterprises, and the risk is greater still for SMMEs without credit records, which also makes it harder for such enterprises to obtain credit from commercial banks. We first analyse enterprise credit risk comprehensively and establish a TOPSIS comprehensive evaluation model; we then build constraints from the given data and calculate the bank's lending interest rate for each type of enterprise, thereby determining the bank's credit strategy. The paper seeks the bank's credit strategy for these enterprises when the total annual credit amount is fixed. We use AHP (Analytic Hierarchy Process) to normalise the credit ratings, rank the enterprises with credit records through the TOPSIS comprehensive evaluation model, divide them into nine categories by cluster analysis, and derive the optimal credit strategy in terms of interest rate and credit line.
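The TOPSIS ranking step described above can be sketched in a few lines. This is a generic, stdlib-only illustration of the technique; the matrix, weights and criterion directions in the usage example are hypothetical and are not the paper's data:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS.
    matrix: rows = alternatives, columns = criteria
    weights: criterion weights (summing to 1)
    benefit: per criterion, True if higher values are better
    Returns closeness scores in [0, 1]; higher = better."""
    n_crit = len(weights)
    # Vector-normalise each column, then apply the weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n_crit)]
    v = [[w * x / nm for x, w, nm in zip(row, weights, norms)] for row in matrix]
    # Ideal best and worst points, criterion by criterion.
    best = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    worst = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_best = math.dist(row, best)    # distance to the ideal solution
        d_worst = math.dist(row, worst)  # distance to the anti-ideal solution
        scores.append(d_worst / (d_best + d_worst))
    return scores

# Hypothetical example: criterion 0 is revenue (higher better),
# criterion 1 is default rate (lower better).
scores = topsis([[0.9, 0.1], [0.5, 0.5], [0.2, 0.8]],
                weights=[0.5, 0.5], benefit=[True, False])
```

With these made-up figures the first enterprise dominates on both criteria and ranks first.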
Wanting Chen, Xuanyi Wu, Z. Y. Chen, Y. Meng, Ruei-yuan Wang, Timothy Chen. "Credit strategy of micro, small, and medium enterprises with known reputation risk: Evidence from a comprehensive evaluation model." Journal of Information & Optimization Sciences, 2023. doi:10.47974/jios-1183.
Though visual identification of plants seems easy for trained botanists or agriculturists, automated identification of plants from leaf images remains a challenging task. Proper identification of plants is the most important phase, as it leads to the use of plants for various purposes. In this paper, we manually collected about 30 leaves per species from five medicinal plant species. The dataset was created from scans of the adaxial and abaxial sides of the leaves. As the small number of images makes it difficult for a convolutional neural network to learn the features, we augmented the dataset using a Deep Convolutional Generative Adversarial Network (DCGAN). This paper shows that the low-quality images obtained by the scanner can be effectively augmented with the DCGAN, increasing the variance in the dataset. A comparison of the proposed versions of the deep learning models VGG16, ResNet50 and DenseNet121 is presented. To validate the results, 5-fold cross-validation was used.
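The 5-fold cross-validation used for validation amounts to partitioning the sample indices into five disjoint folds, each serving once as the validation set. A minimal stdlib sketch (not the authors' code, and independent of any deep-learning framework) is:

```python
import random

def k_fold_indices(n_samples, k=5, seed=0):
    """Yield (train_idx, val_idx) pairs for k-fold cross-validation.
    Samples are shuffled once, then split into k near-equal folds;
    each fold is the validation set exactly once."""
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)
    # First (n_samples % k) folds get one extra sample.
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        val = idx[start:start + size]
        train = idx[:start] + idx[start + size:]
        yield train, val
        start += size

# Example: 150 scans split into 5 folds of 30 validation images each.
folds = list(k_fold_indices(150, k=5))
```

Each model (VGG16, ResNet50, DenseNet121) would be trained and scored once per fold, and the five scores averaged.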
S. Sachar, Anuj Kumar. "DCGAN-based deep learning approach for medicinal leaf identification." Journal of Information & Optimization Sciences, 2023. doi:10.47974/jios-1270.
This paper presents a modified version of the Vedic multiplier based on the sutras of Vedic mathematics, implemented with a 13T hybrid full adder. A conventional multiplier is considered for comparative analysis of existing Vedic versions and the modified Vedic multiplier, which better reflects the timing and device usage. The design was developed and implemented with EDA tools. The proposed 13T hybrid full adder reduces static power consumption by 12.12% and dynamic power consumption by 15.7%. The modified Vedic multiplier implemented with the 13T hybrid full adder reduces power consumption by 10.08% and delay by 2.068%. The circuit and simulation are executed for 4-bit multiplication and can be extended to 8-bit, 16-bit or 32-bit operands. Simulation results are shown only for the 4-bit Vedic multiplication technique and are compared with existing Vedic multiplier circuits.
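The 4-bit Vedic multiplication the circuit implements follows the Urdhva Tiryagbhyam ("vertical and crosswise") sutra: partial products are summed column by column with carry propagation. A behavioural software sketch of that scheme (a functional model only, not a description of the 13T-adder hardware) is:

```python
def vedic_multiply_4bit(a, b):
    """4-bit Urdhva Tiryagbhyam (vertical-and-crosswise) multiplication.
    a, b: integers in [0, 15]. Returns the 8-bit product."""
    assert 0 <= a <= 15 and 0 <= b <= 15
    x = [(a >> i) & 1 for i in range(4)]  # LSB-first bits of a
    y = [(b >> i) & 1 for i in range(4)]  # LSB-first bits of b
    result, carry = 0, 0
    for col in range(7):  # 7 crosswise columns of partial products
        # Vertical/crosswise sum for this column, plus incoming carry.
        s = carry + sum(x[i] * y[col - i] for i in range(4) if 0 <= col - i < 4)
        result |= (s & 1) << col  # keep this column's bit
        carry = s >> 1            # propagate the rest
    result |= carry << 7          # final carry lands in bit 7
    return result
```

In hardware each column sum is realised by the full-adder cells, which is why the adder's power and delay dominate the multiplier's figures of merit.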
Mansi Jhamb, M. Kumar. "Optimized vedic multiplier using low power 13T hybrid full adder." Journal of Information & Optimization Sciences, 2023. doi:10.47974/jios-1222.
This paper proposes a lasso estimator in the parametric frailty model. The lasso (least absolute shrinkage and selection operator) and maximum likelihood (ML) estimators are compared in terms of scalar mean square error (MSE). The performance of the lasso estimator is examined through a simulation study. Furthermore, the approach is applied to analyse infant mortality in India.
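The lasso-versus-ML comparison by scalar MSE can be illustrated with a toy Monte-Carlo simulation. The sketch below shrinks a simple normal-mean ML estimate with the lasso soft-thresholding operator; it is only an illustration of the MSE comparison, not the paper's frailty-model estimator, and the parameter values are hypothetical:

```python
import math
import random
import statistics

def soft_threshold(z, lam):
    """Lasso shrinkage: sign(z) * max(|z| - lam, 0)."""
    return math.copysign(max(abs(z) - lam, 0.0), z)

def compare_mse(theta=0.0, n=25, lam=0.2, reps=2000, seed=1):
    """Monte-Carlo MSE of the ML estimate (sample mean) versus its
    lasso-shrunk version, for a normal mean theta with unit variance."""
    rng = random.Random(seed)
    se_ml, se_lasso = 0.0, 0.0
    for _ in range(reps):
        xbar = statistics.fmean(rng.gauss(theta, 1.0) for _ in range(n))
        se_ml += (xbar - theta) ** 2
        se_lasso += (soft_threshold(xbar, lam) - theta) ** 2
    return se_ml / reps, se_lasso / reps

mse_ml, mse_lasso = compare_mse()
```

When the true parameter is zero (or near zero), the shrunk estimator wins on MSE, which is the kind of trade-off a scalar-MSE comparison makes visible.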
Anu Sirohi, Prem Shenkar Jha. "Lasso estimation in parametric frailty model." Journal of Information & Optimization Sciences, 2023. doi:10.47974/jios-1291.
India is a country with an abundant supply of human resources, but the quality and standard of its workforce are very low. According to a United Nations report, India ranked 131st on the Human Development Index (HDI); even compared with less developed countries, the index is low. Since independence, the Government of India has launched various programmes and policies to improve working efficiency, but it has not achieved the desired results. Many further policies have sought to ensure a more productive workforce and to raise the standard and quality of the production process. In the Indian context, programmes such as social security and compensation policies not only safeguard the interests of the working class but also provide fair and competitive wages. Proper HR policies are needed to tap workers' full capacity and to improve the status of the firm. More precisely, the working class in India has faced many challenges and risks during the globalisation period. The authors attempt to analyse the role of human resources in the Indian context, and suggestive measures to improve the workforce are brought out in detail in this paper with some illustrations.
S. B. Roopaa, N. Kogila, P. Balasubramanian. "A perspective analysis on human resources and suggestive measures to improve their workforce in Indian context." Journal of Information & Optimization Sciences, 2023. doi:10.47974/jios-1310.
Prof. K. R. Chowdhary, Rajendra Purohit, S. Purohit
Multi-core design intends to serve a large market with user-oriented, high-productivity management, as opposed to other parallel systems. Small numbers of processors, a frequent feature of current multi-core systems, are ideal for future generations of CPUs, where automated parallelization succeeds on shared-memory architectures. The multi-core compiler optimization platform CETUS (a high-level, source-to-source compiler) initiates automatic parallelization in compiled programs. The compiler's infrastructure is built with C programs in mind and is user-friendly and simple to use. It provides the significant parallelization passes as well as the underlying enabling techniques, and supports source-to-source conversions. The compiler has undergone numerous benchmark investigations and implementation iterations, and it can enhance programs' parallel performance. The main drawback of advanced optimizing compilers, however, is that they lack runtime details such as the program's input data. The approaches presented in this paper facilitate dynamic optimization using CETUS. A further contribution is the large number of proposed compiler analyses and transformations for parallelization. To study the behaviour as well as the throughput gains, we investigated both non-CETUS-based and CETUS-based parallelized program features in this work.
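As a toy illustration of what a source-to-source parallelization pass does (not CETUS itself, whose dependence analyses are far more sophisticated), the following sketch rewrites C source text by inserting an OpenMP pragma above simple `for` loops that pass a deliberately crude dependence check:

```python
import re

def annotate_parallel_loops(c_source):
    """Toy source-to-source pass: insert '#pragma omp parallel for'
    before each 'for' loop whose body line contains no '+=' (a crude
    stand-in for a scalar-reduction / loop-carried-dependence check)."""
    out = []
    lines = c_source.splitlines()
    for i, line in enumerate(lines):
        if re.match(r'\s*for\s*\(', line):
            # Peek at the loop body on the next line for the crude check.
            body = lines[i + 1] if i + 1 < len(lines) else ''
            if '+=' not in body:
                indent = line[:len(line) - len(line.lstrip())]
                out.append(indent + '#pragma omp parallel for')
        out.append(line)
    return '\n'.join(out)

src = "for (i = 0; i < n; i++)\n    a[i] = b[i] * 2;"
annotated = annotate_parallel_loops(src)
```

The output is again C source (hence "source-to-source"): the independent element-wise loop gains a pragma, while a loop accumulating into `sum` with `+=` is left untouched.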
Prof. K. R. Chowdhary, Rajendra Purohit, S. Purohit. "Source-to-source translation for code-optimization." Journal of Information & Optimization Sciences, 2023. doi:10.47974/jios-1350.
Sakshi Kalra, Chitneedi Hemanth Sai Kumar, Yashvardhan Sharma, G. S. Chauhan
Using social media for news has its pros and cons. There are several reasons why people seek out and read news through Internet media: on the one hand it is easier to access; on the other, social media's dynamic content and misinformation pose serious problems for both government and public institutions. Several past studies have classified online reviews and their textual content. The current paper proposes a multimodal strategy for the fake news detection (FND) task that covers both text and image. The proposed model (FakeExpose) is designed to learn a variety of discriminative features automatically, instead of relying on manually created features. Several pre-trained word and image embedding models, such as DistilRoBERTa and Vision Transformers (ViTs), are used and fine-tuned for the best feature extraction and the various word dependencies. Data augmentation is used to address the issue that pre-trained textual feature extractors process at most 512 tokens at a time. The accuracy of the presented model on PolitiFact and GossipCop is 91.35 percent and 98.59 percent, respectively, based on current standards. To our knowledge, this is the first attempt to use the FakeNewsNet repository to reach the maximum multimodal accuracy. The results show that combining text and image data improves accuracy compared to using only text or images (unimodal). Moreover, the outcomes imply that adding more data improved the model's accuracy rather than degrading it.
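The multimodal idea, combining text and image representations before classification, can be illustrated with a minimal late-fusion head. The embeddings and weights below are hypothetical stand-ins for the fine-tuned DistilRoBERTa/ViT features and a trained classifier layer; this is a sketch of the fusion step only, not the FakeExpose architecture:

```python
import math

def fuse_and_score(text_emb, image_emb, weights, bias=0.0):
    """Minimal late-fusion head: concatenate the text and image
    embeddings, then apply a single logistic unit to get P(fake)."""
    fused = list(text_emb) + list(image_emb)  # multimodal feature vector
    assert len(fused) == len(weights)
    z = bias + sum(w * x for w, x in zip(weights, fused))
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid

# Hypothetical 2-dim text embedding, 1-dim image embedding, fixed weights.
score = fuse_and_score([1.0, 0.0], [0.5], [2.0, -1.0, 1.0])
```

Because the classifier sees both modalities at once, evidence from the image can override or reinforce the text signal, which is the intuition behind the multimodal accuracy gains reported above.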
Sakshi Kalra, Chitneedi Hemanth Sai Kumar, Yashvardhan Sharma, G. S. Chauhan. "FakeExpose: Uncovering the falsity of news by targeting the multimodality via transfer learning." Journal of Information & Optimization Sciences, 2023. doi:10.47974/jios-1342.
The research methodology employed in the study was a hybrid approach, combining exploratory and descriptive research as the primary method and drawing on both primary and secondary data. Digital marketing has been shown to affect consumer decision-making, with the greatest impact on problem recognition, purchase decisions and purchase behaviour. In this paper we implement and analyse digital marketing strategies that target a specific group of consumers, with the aim of maximising the benefit measured through website analytics and improving the number of conversions, that is, converting viewers into customers. We developed a website for the business and evaluated marketing techniques by applying them to the website and identifying which performed best. Our goal is to provide a better way of doing digital marketing, so that businesses can face upcoming market crises in India and market their products at low cost.
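The conversion metric referred to above (viewers converted to customers) is straightforward to compute from analytics counts; a minimal sketch, with hypothetical figures in the example, is:

```python
def conversion_rate(visitors, customers):
    """Conversion rate: share of site visitors who become customers."""
    if visitors == 0:
        return 0.0
    return customers / visitors

def uplift(rate_before, rate_after):
    """Relative improvement in conversion rate after a strategy change."""
    return (rate_after - rate_before) / rate_before

# Hypothetical: 10 purchases from 200 visitors = 5% conversion;
# raising that to 6% is a 20% relative uplift.
rate = conversion_rate(200, 10)
gain = uplift(0.05, 0.06)
```

Comparing this uplift across the strategies applied to the website is one simple way to decide which technique performed best.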
Priyanka Panday. "Digital marketing in India: Navigating the future with effective strategies and performance metrics." Journal of Information & Optimization Sciences, 2023. doi:10.47974/jios-1418.