Pub Date: 2020-12-17 | DOI: 10.1093/acrefore/9780190236557.013.457
S. Kazarian
Societies around the world are a tapestry of cultural diversity woven through globalization to narrate the inherent value of pluralism as a panacea for good mental health, happiness, and the good life. The scientific construction of culture is also a mosaic of ethnic and racial proxies; national worldviews such as individualism and collectivism; and construals of the self as independent and interdependent. Similarly, the culture of psychological health has been informed by the ethnocentric Western paradigm of clinical psychology looking at the “dark” psychopathological side of life and positive psychology focusing on the hedonic and eudaimonic traditions of well-being. Nevertheless, cultural pluralism (multiculturalism) and globalization have contributed to unveiling the limits of the Western paradigm in which both clinical psychology and positive psychology have been embedded, and the imperative for a paradigm shift beyond it. The revisioning of clinical psychology as cultural clinical psychology and positive psychology as cultural positive psychology has contributed to the emergence of the more inclusive cultural psychological health perspective. Cultural psychological health considers the culture and psychological health interface to shed light on an integrated approach that narrates how mental health problems are conceptualized, expressed, and ameliorated culturally and how positive mental health is understood, desired, pursued, and promoted culturally. In addition to inclusivity, cultural psychological health pursues scientific inquiry and knowledge through both quantitative and qualitative methodologies and invokes a science and practice informed by the ethical imperatives of cultural competence and cultural humility with social responsiveness to local and global suffering, happiness, and flourishing.
Title: "Culture and Psychological Health" (Oxford Research Encyclopedia of Psychology)
Pub Date: 2020-12-17 | DOI: 10.1093/acrefore/9780190236557.013.514
Glenn Adams, Annabella Osei‐Tutu, A. A. Affram
Standard constructions of history pose a celebratory narrative of progress via modern individualist development. In contrast, decolonial perspectives emphasize the coloniality inherent both in Eurocentric modernity and in the individualist selfways associated with Eurocentric modernity. The coloniality of modern individualist selfways is evident not only in the racialized violence that enabled their characteristic experience of freedom from constraint, but also in the epistemic violence that results from the imposition of these ways of being as a developmental standard. Research in West African settings illuminates these forms of epistemic violence. Standard accounts tend to pathologize West African ways of being as immature or suboptimal in relation to a presumed universal developmental pathway toward psychological autonomy. A decolonial response, rooted in decolonial perspectives of Southern theory or epistemology, follows two analytic strategies that disrupt standard accounts. One strategy draws upon local understanding to illuminate the adaptive value of West African patterns. Rather than manifestations of backwardness on a trajectory of modern individualist development, these ways of being reflect developmental trajectories that emerged as an adaptation to cultural ecologies of embeddedness. The other strategy draws upon West African settings as a standpoint from which to denaturalize the modern individualist selfways that hegemonic perspectives regard as just-natural standards. Rather than naturally superior forms, the widespread promotion of modern individualist selfways has harmful consequences related to the narrow pursuit of personal fulfillment and corresponding disinvestment in broader solidarities. With the growth orientation of modern individualist development pushing the planet toward a future of ecological catastrophe, decolonial perspectives direct attention to West African and other communities in the Global South for ways of being, rooted in Other understandings of the past, as a pathway to a sustainable and just future.
Title: "Decolonial Perspectives on Psychology and Development" (Oxford Research Encyclopedia of Psychology)
Pub Date: 2020-12-17 | DOI: 10.1093/acrefore/9780190236557.013.509
T. Pettigrew
The discipline of psychology has an extremely broad range—from the life sciences to the social sciences, from neuroscience to social psychology. These distinctly different components have varying histories of their own. Social psychology is psychology’s social science wing. The major social sciences—anthropology, economics, sociology, and political science—all had their origins in the 19th century or even earlier. But social psychology is much younger; it developed both in Europe and North America in the 20th century. The field’s enormous growth over the past century began modestly with a handful of locations, several textbooks, and a single journal in the 1920s. Today’s social psychologists would barely recognize their discipline in the years prior to World War II. But trends forming in the 1920s and 1930s would become important years later. With steady growth, especially starting in the 1960s, the discipline gained thousands of new doctorates and multiple journals scattered throughout the world. Social psychology has become a recognized, influential, and often-cited social science. It is the basis, for example, of behavioral economics as well as such key theories as authoritarianism in political science. Central to this extraordinary expansion were the principal events of the mid-20th century. World War II, the growth of universities and the social sciences in general, rising prosperity, statistical advances, and other global changes set the stage for the discipline’s rapid development. Together with this growth, social psychology has expanded its topics in both the affective and cognitive domains. Indeed, new theories are so numerous that theoretical integration has become a prime need for the discipline.
Title: "History of Social Psychology at Mid-20th Century" (Oxford Research Encyclopedia of Psychology)
Pub Date: 2020-11-19 | DOI: 10.1093/acrefore/9780190236557.013.694
Annette Mülberger Rogele
The intelligence test consists of a series of exercises designed to measure intelligence. Intelligence is generally understood as the mental capacity that enables a person to learn at school or, more generally, to reason, to solve problems, and to adapt to new (challenging) situations. There are many types of intelligence tests, depending on the kind of person (age, profession, culture, etc.) and the way intelligence is understood. Some tests are general; others focus on evaluating language skills, memory, abstract and logical thinking, or abilities in a wide variety of areas, such as recognizing and matching implicit visual patterns. Scores may be presented as an IQ (intelligence quotient), as a mental age, or simply as a point on a scale. Intelligence tests are instrumental in ordering, ranking, and comparing individuals and groups. The testing of intelligence started in the 19th century and became a common practice in schools and universities, psychotechnical institutions, courts, asylums, and private companies on an international level during the 20th century. It is generally assumed that the first test was designed by the French scholars A. Binet and T. Simon in 1905, but the historical link between testing and experimenting points to previous tests, such as the word association test. Testing was practiced and understood in different ways, depending not only on the time, but also on the concrete local (cultural and institutional) conditions. For example, in the United States and Brazil, testing was immediately linked to race differences and eugenic programs, while in other places, such as Spain, it was part of an attempt to detect “feebleness” and to grade students at certain schools. Since its beginning, the intelligence test has received harsh criticism and triggered massive protests. The debate played out in the mass media, leading to the infamous “IQ test wars.” Thus, nowadays, psychologists are aware of the inherent danger of cultural discrimination and social marginalization, and they are more careful in the promotion of intelligence testing. In order to understand the role the intelligence test plays in today’s society, it is necessary to explore its history with the help of well-documented case studies. Such studies show how the testing practice was employed in national contexts and how it was received, used, or rejected by different social groups or professionals. Current historical research adopts a more inclusive perspective, moving away from a narrative focused on the role testing played in North America. New work has appeared that explores how testing took place in different national and cultural environments, such as Russia (the former Soviet Union), India, Italy, the Netherlands, Sweden, Argentina, Chile, and many other places.
Title: "Biographies of a Scientific Subject: The Intelligence Test" (Oxford Research Encyclopedia of Psychology)
Pub Date: 2020-09-28 | DOI: 10.1093/ACREFORE/9780190236557.013.253
W. Steinel, F. Harinck
Bargaining and negotiation are the most constructive ways to handle conflict. Economic prosperity, order, harmony, and enduring social relationships are more likely to be reached by parties who decide to work together toward agreements that satisfy everyone’s interests than by parties who fight openly, dominate one another, break off contact, or take their dispute to an authority to resolve. There are two major research paradigms: distributive and integrative negotiation. Distributive negotiation (“bargaining”) focuses on dividing scarce resources and is studied in social dilemma research. Integrative negotiation focuses on finding mutually beneficial agreements and is studied in decision-making negotiation tasks with multiple issues. Negotiation behavior can be categorized by five different styles: distributive negotiation is characterized by forcing, compromising, or yielding behavior, in which each party gives and takes; integrative negotiation is characterized by problem-solving behavior, in which parties search for mutually beneficial agreements. Avoiding is the fifth negotiation style, in which parties do not negotiate. Cognitions (what people think about the negotiation) and emotions (how they feel about the negotiation and the other party) affect negotiation behavior and outcomes. Most cognitive biases hinder the attainment of integrative agreements. Emotions have intrapersonal and interpersonal effects, and can help or hinder the negotiation. Aspects of the social context, such as gender, power, cultural differences, and group constellations, affect negotiation behaviors and outcomes as well. Although gender differences in negotiation exist, they are generally small and are usually caused by stereotypical ideas about gender and negotiation. Power differences affect negotiation in such a way that the more powerful party usually has an advantage. Different cultural norms dictate how people will behave in a negotiation. Aspects of the situational context of a negotiation include, for example, time, communication media, and conflict issues. Communication media differ in whether they contain visual and acoustic channels, and whether they permit synchronous communication. The richness of the communication channel can help unacquainted negotiators to reach a good agreement, yet it can lead negotiators with a negative relationship into a conflict spiral. Conflict issues can be roughly categorized into scarce resources (money, time, land) on the one hand, and norms and values on the other. Negotiation is more feasible when dividing scarce resources; when norms and values are at play, people generally have a harder time finding agreements, since the usual give-and-take is no longer feasible. Areas of future research include communication, ethics, physiological or hormonal correlates, and personality factors in negotiations.
Title: "Negotiation and Bargaining" (Oxford Research Encyclopedia of Psychology)
Pub Date: 2020-08-27 | DOI: 10.1093/acrefore/9780190236557.013.313
Rhiannon N. Turner
Scholars have developed a plethora of approaches to reducing prejudice and discrimination, many of which have been successfully applied in schools, workplaces, and community settings. Research on intergroup contact suggests that contact between members of different groups, particularly when that contact is warm and positive (for example, through friendships), reduces negative emotional reactions (e.g., anxiety) and promotes positive emotions (e.g., empathy), resulting in more positive attitudes toward members of that group. One might expect that, in an increasingly connected world characterized by global mobility and diversity, higher levels of contact would be associated with a significant lessening of prejudice and discrimination. However, critics have pointed out that changes in attitudes at the individual level do not necessarily translate into reduced prejudice and discrimination at a societal level. Moreover, not everyone has the opportunity to engage in meaningful contact with members of other groups, and even when they do, these opportunities are not always capitalized on. One solution to lack of opportunities for contact is to capitalize on “indirect contact”: interventions based on the principles of contact that do not involve a face-to-face encounter. Extended contact, which refers to knowing in-group members who have out-group friends; vicarious contact, which involves learning about the positive contact experiences of fellow group members, for example via the media; online intergroup contact; and imagined intergroup contact have each been shown to promote more positive intergroup attitudes. Another way to reduce prejudice and discrimination is to change the way people categorize social groups. When people perceive members of their own group and another group to belong to the same overarching group—that is, they hold a common in-group identity—there is evidence of reduced intergroup bias. However, when our group membership is important to us, this may constitute a threat to our identity and lead to a reactive increase in bias in order to reassert the distinctiveness of our group. One solution to this is to encourage a dual identity, whereby an individual holds both the original group membership and a common in-group identity that encompasses both groups simultaneously. Alternatively, emphasizing the many and varied group memberships that individuals hold makes social categories less useful as a way of categorizing people. There is also evidence that taking a multicultural approach, where differences are acknowledged, rather than a color-blind approach, where differences are ignored, is less likely to result in prejudice and discrimination. Finally, there is evidence that teaching people about other groups, and about the biases they hold but perhaps are not aware of, can help to reduce prejudice and discrimination.
Title: "Reducing Prejudice and Discrimination" (Oxford Research Encyclopedia of Psychology)
Pub Date: 2020-08-27 · DOI: 10.1093/acrefore/9780190236557.013.742
Kimberly Rios, C. Mackey
The definition of group cohesion has been debated since the formal introduction of the concept in social psychology. Group cohesion has undergone a variety of conceptualizations over the years, stemming from several theoretical perspectives. Many models of group cohesion have been introduced; however, research with each model is largely confined to the field (e.g., psychology) or subfield (e.g., sports psychology) in which it originated. Initially, unidimensional models of group cohesion were popular, with proponents of these models arguing that cohesion would have the same consequences regardless of its operationalization. However, later research found that group cohesion may be multidimensional in nature. Several two-dimensional models have been proposed, the most popular of which distinguishes between group members working together to attain common goals (task cohesion) and group members interacting with one another on a more personal level (social cohesion). Another multidimensional model builds on the social-task cohesion distinction but further divides social and task cohesion into Group Integration and Individual Attractiveness to Group subcomponents, thus creating a four-factor model.
Group cohesion has been applied to a variety of group contexts, including sports teams, military squads, and work groups. The amount of cohesion in a given group depends on the properties of the group being investigated. Groups that have formed naturally (i.e., “real” groups) show higher levels of group cohesion than groups created for the purpose of a study (i.e., “artificial” groups). Other factors that affect group cohesion include the type of group (e.g., interdependent vs. co-acting) and the level of analysis (i.e., individual or group). Research on group cohesion has focused on the consequences of cohesion rather than on what causes it in the first place. Furthermore, although much research has detailed the relationship between cohesion and performance, many other positive consequences of group cohesion have not been assessed in depth. Finally, group cohesion is also associated with potential negative consequences, such as groupthink.
{"title":"Group Cohesion","authors":"Kimberly Rios, C. Mackey","doi":"10.1093/acrefore/9780190236557.013.742","DOIUrl":"https://doi.org/10.1093/acrefore/9780190236557.013.742","url":null,"abstract":"The definition of group cohesion has been debated since the formal introduction of the concept in social psychology. Group cohesion has undergone a variety of conceptualizations over the years stemming from several theoretical perspectives. Many models of group cohesion have been introduced; however, research with these models is largely confined to the field (e.g., psychology) or subfield (e.g., sports psychology) in which it originated. Initially, unidimensional models of group cohesion were popular, with proponents of these models arguing that cohesion would have the same consequences regardless of its operationalization. However, later research found that group cohesion may be multidimensional in nature. Several two-dimensional models have been proposed, the most popular of which distinguishes between group members working together to attain common goals (task cohesion) and group members interacting with one another on a more personal level (social cohesion). Another multidimensional model of group cohesion builds on the social-task cohesion distinction but further divides social and task cohesion into Group Integration and Individual Attractiveness to Group sub-components, thus creating a four-factor model.\u0000 Group cohesion has been applied to a variety of group contexts, including sports teams, military squads, and work groups. The amount of cohesion in each group is dependent upon the properties of the group being investigated. Groups that have naturally formed (i.e., “real” groups) have higher rates of group cohesion than groups created for the purpose of a study (i.e., “artificial” groups). Other factors that affect group cohesion include type of group (e.g., interdependent vs. co-acting) and level of analysis (i.e., individual or group). 
Research on group cohesion has focused on the consequences of group cohesion in lieu of what causes group cohesion in the first place. Furthermore, although much research has detailed the relationship between cohesion and performance, many other positive consequences of group cohesion have not been assessed in depth. Finally, group cohesion is also associated with potential negative consequences, such as groupthink.","PeriodicalId":339030,"journal":{"name":"Oxford Research Encyclopedia of Psychology","volume":"30 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-08-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126519157","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2020-08-27 · DOI: 10.1016/b978-0-08-097086-8.55032-6
G. Teskey
{"title":"Kindling","authors":"G. Teskey","doi":"10.1016/b978-0-08-097086-8.55032-6","DOIUrl":"https://doi.org/10.1016/b978-0-08-097086-8.55032-6","url":null,"abstract":"","PeriodicalId":339030,"journal":{"name":"Oxford Research Encyclopedia of Psychology","volume":"294 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-08-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123920395","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2020-08-27 · DOI: 10.1093/acrefore/9780190236557.013.711
I. Whishaw, Megan Sholomiski
A brain lesion is an area of damage, injury, or abnormal change to a part of the brain. Brain lesions may be caused by head injury, disease, surgery, or congenital disorders, and they are classified by the cause, extent, and locus of injury. Lesions cause many behavioral symptoms. Symptom severity generally corresponds to the region and extent of damaged brain. Thus, behavior is often a reliable indicator of the type and extent of a lesion. Observations of patients suffering brain lesions were first recorded in detail in the 18th century, and lesion studies continue to shape modern neuroscience and to give insight into the functions of brain regions. Recovery, defined as any return of lost behavioral or cognitive function, depends on the age, sex, genetics, and lifestyle of patients, and recovery may be predicted by the cause of injury. Most recovery occurs within the first 6 to 9 months after injury and likely involves a combination of compensatory behaviors and physiological changes in the brain. Children often recover some function after brain lesions better than adults, though both children and adults experience residual deficits. Brain lesion survival rates are improved by better diagnostic tools and treatments. Therapeutic interventions and treatments for brain lesions include surgery, pharmaceuticals, transplants, and temperature regulation, each with varying degrees of success. Research in treating brain lesions is progressing, but in principle a cure will only be complete when brain lesions are replaced with healthy tissue.
{"title":"Brain Lesions","authors":"I. Whishaw, Megan Sholomiski","doi":"10.1093/acrefore/9780190236557.013.711","DOIUrl":"https://doi.org/10.1093/acrefore/9780190236557.013.711","url":null,"abstract":"A brain lesion is an area of damage, injury, or abnormal change to a part of the brain. Brain lesions may be caused by head injury, disease, surgery, or congenital disorders, and they are classified by the cause, extent, and locus of injury. Lesions cause many behavioral symptoms. Symptom severity generally corresponds to the region and extent of damaged brain. Thus, behavior is often a reliable indicator of the type and extent of a lesion. Observations of patients suffering brain lesions were first recorded in detail in the 18th century, and lesion studies continue to shape modern neuroscience and to give insight into the functions of brain regions. Recovery, defined as any return of lost behavioral or cognitive function, depends on the age, sex, genetics, and lifestyle of patients, and recovery may be predicted by the cause of injury. Most recovery occurs within the first 6 to 9 months after injury and likely involves a combination of compensatory behaviors and physiological changes in the brain. Children often recover some function after brain lesions better than adults, though both children and adults experience residual deficits. Brain lesion survival rates are improved by better diagnostic tools and treatments. Therapeutic interventions and treatments for brain lesions include surgery, pharmaceuticals, transplants, and temperature regulation, each with varying degrees of success. 
Research in treating brain lesions is progressing, but in principle a cure will only be complete when brain lesions are replaced with healthy tissue.","PeriodicalId":339030,"journal":{"name":"Oxford Research Encyclopedia of Psychology","volume":"58 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-08-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133157709","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2020-08-27 · DOI: 10.1093/acrefore/9780190236557.013.785
R. Gibb
The process of brain development begins shortly after conception and in humans takes decades to complete. Indeed, it has been argued that brain development occurs over the lifespan. A complex genetic blueprint provides the intricate details of the process of brain construction. Additional operational instructions that control gene and protein expression are derived from experience, and these operational instructions allow an individual to meet and uniquely adapt to the environmental demands they face. The science of epigenetics provides an explanation of how an individual’s experience adds a layer of instruction to the existing DNA that ultimately controls the phenotypic expression of that individual and can contribute to gene and protein expression in their children, grandchildren, and ensuing generations. Experiences that contribute to alterations in gene expression include gonadal hormones, diet, toxic stress, microbiota, and positive nurturing relationships, to name but a few. There are seven phases of brain development, and each phase is defined by its timing and purpose. As the brain proceeds through these genetically predetermined steps, various experiences have the potential to alter its final form and behavioral output. Brain plasticity refers to the brain’s ability to change in response to environmental cues or demands. Sensitive periods in brain development are times during which a part of the brain is particularly malleable and dependent on the occurrence of specific experiences in order for the brain to tune its connections and optimize its function. These periods open at different time points for various brain regions, and the closing of a sensitive period is dependent on the development of inhibitory circuitry. Some experiences have negative consequences for brain development, whereas others promote positive outcomes. It is the accumulation of these experiences that shapes the brain and determines the behavioral outcomes for an individual.
{"title":"Brain Development","authors":"R. Gibb","doi":"10.1093/acrefore/9780190236557.013.785","DOIUrl":"https://doi.org/10.1093/acrefore/9780190236557.013.785","url":null,"abstract":"The process of brain development begins shortly after conception and in humans takes decades to complete. Indeed, it has been argued that brain development occurs over the lifespan. A complex genetic blueprint provides the intricate details of the process of brain construction. Additional operational instructions that control gene and protein expression are derived from experience, and these operational instructions allow an individual to meet and uniquely adapt to the environmental demands they face. The science of epigenetics provides an explanation of how an individual’s experience adds a layer of instruction to the existing DNA that ultimately controls the phenotypic expression of that individual and can contribute to gene and protein expression in their children, grandchildren, and ensuing generations. Experiences that contribute to alterations in gene expression include gonadal hormones, diet, toxic stress, microbiota, and positive nurturing relationships, to name but a few. There are seven phases of brain development and each phase is defined by timing and purpose. As the brain proceeds through these genetically predetermined steps, various experiences have the potential to alter its final form and behavioral output. Brain plasticity refers to the brain’s ability to change in response to environmental cues or demands. Sensitive periods in brain development are times during which a part of the brain is particularly malleable and dependent on the occurrence of specific experiences in order for the brain to tune its connections and optimize its function. These periods open at different time points for various brain regions and the closing of a sensitive period is dependent on the development of inhibitory circuitry. 
Some experiences have negative consequences for brain development, whereas other experiences promote positive outcomes. It is the accumulation of these experiences that shape the brain and determine the behavioral outcomes for an individual.","PeriodicalId":339030,"journal":{"name":"Oxford Research Encyclopedia of Psychology","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-08-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130294984","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}