{"title":"使用朴素贝叶斯作为判别模型","authors":"E. Azeraf, E. Monfrini, W. Pieczynski","doi":"10.1145/3457682.3457697","DOIUrl":null,"url":null,"abstract":"For classification tasks, probabilistic graphical models are usually categorized into two disjoint classes: generative or discriminative. It depends on the posterior probability p(x|y) of the label x given the observation y computation. On the one hand, generative models, like the Naive Bayes or the Hidden Markov Model (HMM), need the computation of the joint probability p(x, y), before using the Bayes rule to compute p(x|y). On the other hand, discriminative models compute p(x|y) directly, regardless of the observations’ law. They are intensively used nowadays, with models as Logistic Regression or Conditional Random Fields (CRF). However, the recent Entropic Forward-Backward algorithm shows that the HMM, considered as a generative model, can also match the discriminative one’s definition. This example leads to question if it is the case for other generative models. In this paper, we show that the Naive Bayes can also match the discriminative model definition, so it can be used in either a generative or a discriminative way. Moreover, this observation also discusses the notion of Generative-Discriminative pairs, linking, for example, Naive Bayes and Logistic Regression, or HMM and CRF. Related to this point, we show that the Logistic Regression can be viewed as a particular case of the Naive Bayes used in a discriminative way.","PeriodicalId":142045,"journal":{"name":"2021 13th International Conference on Machine Learning and Computing","volume":"60 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-02-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"Using the Naive Bayes as a discriminative model\",\"authors\":\"E. Azeraf, E. Monfrini, W. 
Pieczynski\",\"doi\":\"10.1145/3457682.3457697\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"For classification tasks, probabilistic graphical models are usually categorized into two disjoint classes: generative or discriminative. It depends on the posterior probability p(x|y) of the label x given the observation y computation. On the one hand, generative models, like the Naive Bayes or the Hidden Markov Model (HMM), need the computation of the joint probability p(x, y), before using the Bayes rule to compute p(x|y). On the other hand, discriminative models compute p(x|y) directly, regardless of the observations’ law. They are intensively used nowadays, with models as Logistic Regression or Conditional Random Fields (CRF). However, the recent Entropic Forward-Backward algorithm shows that the HMM, considered as a generative model, can also match the discriminative one’s definition. This example leads to question if it is the case for other generative models. In this paper, we show that the Naive Bayes can also match the discriminative model definition, so it can be used in either a generative or a discriminative way. Moreover, this observation also discusses the notion of Generative-Discriminative pairs, linking, for example, Naive Bayes and Logistic Regression, or HMM and CRF. 
Related to this point, we show that the Logistic Regression can be viewed as a particular case of the Naive Bayes used in a discriminative way.\",\"PeriodicalId\":142045,\"journal\":{\"name\":\"2021 13th International Conference on Machine Learning and Computing\",\"volume\":\"60 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-02-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 13th International Conference on Machine Learning and Computing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3457682.3457697\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 13th International Conference on Machine Learning and Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3457682.3457697","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract: For classification tasks, probabilistic graphical models are usually divided into two disjoint classes: generative and discriminative. The distinction depends on how the posterior probability p(x|y) of the label x given the observation y is computed. On the one hand, generative models, such as Naive Bayes or the Hidden Markov Model (HMM), compute the joint probability p(x, y) before applying Bayes' rule to obtain p(x|y). On the other hand, discriminative models compute p(x|y) directly, regardless of the law of the observations. They are intensively used nowadays, with models such as Logistic Regression or Conditional Random Fields (CRF). However, the recent Entropic Forward-Backward algorithm shows that the HMM, usually considered a generative model, can also match the definition of a discriminative one. This example raises the question of whether the same holds for other generative models. In this paper, we show that Naive Bayes can also match the discriminative model definition, so it can be used in either a generative or a discriminative way. Moreover, this observation calls into question the notion of Generative-Discriminative pairs, which links, for example, Naive Bayes with Logistic Regression, or the HMM with the CRF. Related to this point, we show that Logistic Regression can be viewed as a particular case of Naive Bayes used in a discriminative way.
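The two routes to the posterior that the abstract contrasts can be sketched on a toy categorical Naive Bayes. The sketch below is illustrative only (the parameters and the two-feature setup are invented, not taken from the paper): the generative route multiplies the prior by the feature likelihoods p(y_i|x) and normalizes via Bayes' rule, while a discriminative-style route rewrites the same posterior using only the per-feature posteriors p(x|y_i) and the prior, never touching the law of the observations p(y_i). Both computations agree after normalization, which is the kind of equivalence the paper exploits.

```python
import numpy as np

# Toy Naive Bayes: 2 classes, 2 binary features (hypothetical parameters).
prior = np.array([0.6, 0.4])       # p(x) for classes 0 and 1
lik = np.array([[0.7, 0.2],        # p(y_0 = 1 | x)
                [0.1, 0.8]])       # p(y_1 = 1 | x)

def posterior_generative(y):
    """Generative route: form the joint p(x, y) = p(x) * prod_i p(y_i | x),
    then apply Bayes' rule by normalizing over the classes."""
    joint = prior.copy()
    for i, yi in enumerate(y):
        joint *= lik[i] if yi == 1 else (1.0 - lik[i])
    return joint / joint.sum()

def posterior_discriminative(y):
    """Discriminative-style route: p(x | y) ∝ p(x) * prod_i p(x | y_i) / p(x).
    Only the prior and the per-feature posteriors p(x | y_i) appear;
    the marginal law of the observations p(y_i) cancels out."""
    score = prior.copy()
    for i, yi in enumerate(y):
        p_yi_given_x = lik[i] if yi == 1 else (1.0 - lik[i])
        p_x_given_yi = prior * p_yi_given_x
        p_x_given_yi /= p_x_given_yi.sum()   # Bayes' rule on one feature
        score *= p_x_given_yi / prior
    return score / score.sum()

y = (1, 0)
print(posterior_generative(y))       # both routes give the same posterior
print(posterior_discriminative(y))
```

The point of the second function is that it is parameterized entirely by quantities conditioned on single observations, which is what lets the same model be read as discriminative.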