Beyond banning: Our digital world must be made safer for young people

Health Promotion Journal of Australia · Published 2024-12-18 · DOI: 10.1002/hpja.938
Nicholas Carah, Sandro Demaio, Louise Holly, Ilona Kickbusch, Carmel Williams
{"title":"除了禁止:我们的数字世界必须对年轻人更安全。","authors":"Nicholas Carah,&nbsp;Sandro Demaio,&nbsp;Louise Holly,&nbsp;Ilona Kickbusch,&nbsp;Carmel Williams","doi":"10.1002/hpja.938","DOIUrl":null,"url":null,"abstract":"<p>Social media offers young people opportunities for connection, self-expression and learning, but not without increasingly evidenced health risks. A new report from the WHO suggests that 11% of adolescents show signs of problematic social media behaviours and experience negative consequences such as disrupted sleep.<span><sup>1</sup></span> As governments around the world grapple to find the balance between access and protections, the question arises: can we build a safer, more balanced digital space for children?</p><p>The regulation of children's use of social media is a growing global public health priority.<span><sup>2</sup></span> In the United States, legislation sets the age of 13 as the threshold for children creating new social media accounts. The European Union has considered raising the minimum age for social media access to 16, and in France, social media platforms must refuse access to children under 15 unless they have parental permission.</p><p>Australia's recently announced proposal is expected to go further, completely banning young people from social media.<span><sup>3</sup></span> This controversial idea hinges on the outcomes of ongoing trials involving age verification and assurance technologies.<span><sup>4</sup></span> While discussions around banning children from social media are happening in several countries, Australia's approach could reshape how social media companies manage user access for minors. It focuses on the effectiveness of two key technological solutions: age assurance and age verification.</p><p>Age assurance includes a variety of techniques designed to estimate or determine a user's age. Methods include self-reporting or parental verification, and even using technology like facial recognition or analysing scrolling habits. These methods, however, can easily be circumvented by savvy teens making enforcement of any ban difficult.</p><p>Meanwhile, age verification involves confirming a person's age by matching their identity to a verified source of information, such as a government-issued document. However, concerns arise over privacy and security risks, whether personal data are managed by the social media companies themselves or with third parties.<span><sup>5</sup></span></p><p>Beyond the technical challenges of determining user age, there are social and cultural risks associated with age verification systems. Young people often use social media to explore their identities and seek information they may not feel comfortable seeking from parents, teachers or peers.<span><sup>6</sup></span> Social media provides a level of anonymity, allowing them to ask questions about personal topics such as body image, sexuality and relationships. Age verification requirements could undermine this anonymity, stripping young users of their right to remain forgotten online.<span><sup>7</sup></span></p><p>Another issue is the potential to exacerbate digital divides. Navigating age verification systems may prove more difficult for young people with lower digital access or literacy, or limited access to the necessary identification documents. 
These individuals could be excluded from mainstream social media platforms altogether, potentially pushing them into more dangerous, less regulated corners of the internet.</p><p>Despite the many challenges, some social media companies have begun to address public concerns over the safety of children on their platforms. Two common strategies are the creation of advisory groups focused on user safety and the development of specialised apps and accounts for children. For example, Meta has established a ‘Safety Advisory Council’ and an ‘Instagram Suicide and Self-Injury Advisory Group’ to guide its child safety policies. However, little is known about how these groups operate or what impact their advice has had.</p><p>Meanwhile, platforms like YouTube have introduced child-specific versions, such as YouTube Kids, which offers a commercially lucrative ‘walled garden’ of content specifically curated for an ever-younger audience.</p><p>In September, Meta announced the creation of ‘teen accounts’ for Instagram, a feature aimed at the safety of younger users.<span><sup>8</sup></span> These accounts will have several key features, including accounts for users under 18 defaulting to private, and users under 16 needing parental permission to switch to a public profile. Teen accounts will limit interactions with strangers, restrict tagging and mentions, and apply content filters aimed at reducing exposure to harmful material. Young users will also receive notifications prompting them to log off after 60 min of use and turn on sleep mode, which mutes notifications overnight.</p><p>While these features may provide enhanced child and teen safety, they also raise concerns. Many of these ‘innovations’ are simply repackaged versions of pre-existing features and fail to represent protections on a scale that matches the known public health risks.<span><sup>9</sup></span> Moreover, these changes appear to further commercial interests. Instagram's vision for teens now focuses on fostering a more intimate, chat-based experience akin to Snapchat, while also incorporating TikTok-style entertainment through Reels.</p><p>Whether we ultimately decide to limit young people's use of social media or not, the more important task is to create online environments that enable these generations to flourish and be healthy in a digitalised world. Standalone bans are a potential quick-win for tech companies and policymakers but risk displacing important responsibilities that digital platforms have as commercial enterprises that make enormous profits from children's culture and private lives.</p><p>Young people do not want to be excluded from the digital world but do want to be better protected from inaccurate information and digital harms.<span><sup>10</sup></span> We must think critically about building social media spaces where young people are not targeted by advertisers looking to exploit their emotions or sell harmful products. Social media should prioritise users' health and well-being rather than aim to maximise attention and engagement at any cost.</p><p>Ultimately, young people deserve online spaces where they can explore their identities, form social connections, and express themselves freely, without being inundated with harmful content or manipulated by advertisers. 
Rather than focusing on restricting access, policymakers and platforms should be expected to provide high-quality, educational and supportive content that fosters healthy online experiences.</p><p>There are opportunities to draw on the lessons and experiences from the fields of public health and health promotion to apply similar approaches to tackle the risks and benefits associated with the digitalised world.<span><sup>2</sup></span> For example, public health and health promotion approaches have provided both safeguards to limit young people's exposure to hazardous products and also provided clear guidance on the safe levels of access and strategic incentives. Tobacco, alcohol and car and road safety are good examples of public health success, where the evidence is clear that unfettered access and/or exposure damages the health and well-being of young people.<span><sup>2, 11, 12</sup></span> In Australia and other similar countries, a diverse range of public health actions have been put in place to protect young people from the harm caused by unlimited exposure to these products. These include regulation and legislation, such as setting age limits; fiscal responses including taxes and levees on the products; reducing access through restrictions on where products can be sold and consumed; providing clear and accurate information and advice to the community; and guidelines on safe use of these products.</p><p>Holding multinational corporations, including the owners of social media platforms, accountable for the harm their products cause young people is critical, but it is the responsibility of governments to act to protect young people's rights both to engage in the digital world and to health and well-being.</p><p>Carmel Williams is the Editor-in-Chief of <i>HPJA</i> and a co-author of this article. They were excluded from editorial decision-making related to the acceptance and publication of this article. All authors declare no conflict of interest.</p>","PeriodicalId":47379,"journal":{"name":"Health Promotion Journal of Australia","volume":"36 1","pages":""},"PeriodicalIF":1.4000,"publicationDate":"2024-12-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1002/hpja.938","citationCount":"0","resultStr":"{\"title\":\"Beyond banning: Our digital world must be made safer for young people\",\"authors\":\"Nicholas Carah,&nbsp;Sandro Demaio,&nbsp;Louise Holly,&nbsp;Ilona Kickbusch,&nbsp;Carmel Williams\",\"doi\":\"10.1002/hpja.938\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Social media offers young people opportunities for connection, self-expression and learning, but not without increasingly evidenced health risks. A new report from the WHO suggests that 11% of adolescents show signs of problematic social media behaviours and experience negative consequences such as disrupted sleep.<span><sup>1</sup></span> As governments around the world grapple to find the balance between access and protections, the question arises: can we build a safer, more balanced digital space for children?</p><p>The regulation of children's use of social media is a growing global public health priority.<span><sup>2</sup></span> In the United States, legislation sets the age of 13 as the threshold for children creating new social media accounts. 
The European Union has considered raising the minimum age for social media access to 16, and in France, social media platforms must refuse access to children under 15 unless they have parental permission.</p><p>Australia's recently announced proposal is expected to go further, completely banning young people from social media.<span><sup>3</sup></span> This controversial idea hinges on the outcomes of ongoing trials involving age verification and assurance technologies.<span><sup>4</sup></span> While discussions around banning children from social media are happening in several countries, Australia's approach could reshape how social media companies manage user access for minors. It focuses on the effectiveness of two key technological solutions: age assurance and age verification.</p><p>Age assurance includes a variety of techniques designed to estimate or determine a user's age. Methods include self-reporting or parental verification, and even using technology like facial recognition or analysing scrolling habits. These methods, however, can easily be circumvented by savvy teens making enforcement of any ban difficult.</p><p>Meanwhile, age verification involves confirming a person's age by matching their identity to a verified source of information, such as a government-issued document. However, concerns arise over privacy and security risks, whether personal data are managed by the social media companies themselves or with third parties.<span><sup>5</sup></span></p><p>Beyond the technical challenges of determining user age, there are social and cultural risks associated with age verification systems. Young people often use social media to explore their identities and seek information they may not feel comfortable seeking from parents, teachers or peers.<span><sup>6</sup></span> Social media provides a level of anonymity, allowing them to ask questions about personal topics such as body image, sexuality and relationships. Age verification requirements could undermine this anonymity, stripping young users of their right to remain forgotten online.<span><sup>7</sup></span></p><p>Another issue is the potential to exacerbate digital divides. Navigating age verification systems may prove more difficult for young people with lower digital access or literacy, or limited access to the necessary identification documents. These individuals could be excluded from mainstream social media platforms altogether, potentially pushing them into more dangerous, less regulated corners of the internet.</p><p>Despite the many challenges, some social media companies have begun to address public concerns over the safety of children on their platforms. Two common strategies are the creation of advisory groups focused on user safety and the development of specialised apps and accounts for children. For example, Meta has established a ‘Safety Advisory Council’ and an ‘Instagram Suicide and Self-Injury Advisory Group’ to guide its child safety policies. 
However, little is known about how these groups operate or what impact their advice has had.</p><p>Meanwhile, platforms like YouTube have introduced child-specific versions, such as YouTube Kids, which offers a commercially lucrative ‘walled garden’ of content specifically curated for an ever-younger audience.</p><p>In September, Meta announced the creation of ‘teen accounts’ for Instagram, a feature aimed at the safety of younger users.<span><sup>8</sup></span> These accounts will have several key features, including accounts for users under 18 defaulting to private, and users under 16 needing parental permission to switch to a public profile. Teen accounts will limit interactions with strangers, restrict tagging and mentions, and apply content filters aimed at reducing exposure to harmful material. Young users will also receive notifications prompting them to log off after 60 min of use and turn on sleep mode, which mutes notifications overnight.</p><p>While these features may provide enhanced child and teen safety, they also raise concerns. Many of these ‘innovations’ are simply repackaged versions of pre-existing features and fail to represent protections on a scale that matches the known public health risks.<span><sup>9</sup></span> Moreover, these changes appear to further commercial interests. Instagram's vision for teens now focuses on fostering a more intimate, chat-based experience akin to Snapchat, while also incorporating TikTok-style entertainment through Reels.</p><p>Whether we ultimately decide to limit young people's use of social media or not, the more important task is to create online environments that enable these generations to flourish and be healthy in a digitalised world. Standalone bans are a potential quick-win for tech companies and policymakers but risk displacing important responsibilities that digital platforms have as commercial enterprises that make enormous profits from children's culture and private lives.</p><p>Young people do not want to be excluded from the digital world but do want to be better protected from inaccurate information and digital harms.<span><sup>10</sup></span> We must think critically about building social media spaces where young people are not targeted by advertisers looking to exploit their emotions or sell harmful products. Social media should prioritise users' health and well-being rather than aim to maximise attention and engagement at any cost.</p><p>Ultimately, young people deserve online spaces where they can explore their identities, form social connections, and express themselves freely, without being inundated with harmful content or manipulated by advertisers. Rather than focusing on restricting access, policymakers and platforms should be expected to provide high-quality, educational and supportive content that fosters healthy online experiences.</p><p>There are opportunities to draw on the lessons and experiences from the fields of public health and health promotion to apply similar approaches to tackle the risks and benefits associated with the digitalised world.<span><sup>2</sup></span> For example, public health and health promotion approaches have provided both safeguards to limit young people's exposure to hazardous products and also provided clear guidance on the safe levels of access and strategic incentives. 
Tobacco, alcohol and car and road safety are good examples of public health success, where the evidence is clear that unfettered access and/or exposure damages the health and well-being of young people.<span><sup>2, 11, 12</sup></span> In Australia and other similar countries, a diverse range of public health actions have been put in place to protect young people from the harm caused by unlimited exposure to these products. These include regulation and legislation, such as setting age limits; fiscal responses including taxes and levees on the products; reducing access through restrictions on where products can be sold and consumed; providing clear and accurate information and advice to the community; and guidelines on safe use of these products.</p><p>Holding multinational corporations, including the owners of social media platforms, accountable for the harm their products cause young people is critical, but it is the responsibility of governments to act to protect young people's rights both to engage in the digital world and to health and well-being.</p><p>Carmel Williams is the Editor-in-Chief of <i>HPJA</i> and a co-author of this article. They were excluded from editorial decision-making related to the acceptance and publication of this article. All authors declare no conflict of interest.</p>\",\"PeriodicalId\":47379,\"journal\":{\"name\":\"Health Promotion Journal of Australia\",\"volume\":\"36 1\",\"pages\":\"\"},\"PeriodicalIF\":1.4000,\"publicationDate\":\"2024-12-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://onlinelibrary.wiley.com/doi/epdf/10.1002/hpja.938\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Health Promotion Journal of Australia\",\"FirstCategoryId\":\"3\",\"ListUrlMain\":\"https://onlinelibrary.wiley.com/doi/10.1002/hpja.938\",\"RegionNum\":4,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"PUBLIC, ENVIRONMENTAL & OCCUPATIONAL HEALTH\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Health Promotion Journal of Australia","FirstCategoryId":"3","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1002/hpja.938","RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"PUBLIC, ENVIRONMENTAL & OCCUPATIONAL HEALTH","Score":null,"Total":0}
引用次数: 0

摘要

无论我们最终是否决定限制年轻人使用社交媒体,更重要的任务是创造网络环境,使这几代人能够在数字化世界中茁壮成长、健康成长。对科技公司和政策制定者来说,单独的禁令可能是一种速胜手段,但有可能取代数字平台作为从儿童文化和私人生活中赚取巨额利润的商业企业所承担的重要责任。年轻人不希望被排除在数字世界之外,但他们确实希望得到更好的保护,免受不准确信息和数字伤害我们必须批判性地考虑建立社交媒体空间,让年轻人不会成为广告商利用他们的情绪或销售有害产品的目标。社交媒体应该优先考虑用户的健康和幸福,而不是不惜一切代价最大化注意力和参与度。最终,年轻人应该拥有网络空间,在那里他们可以探索自己的身份,建立社会联系,自由表达自己,而不会被有害内容淹没或被广告商操纵。政策制定者和平台不应专注于限制访问,而应提供高质量、有教育意义和支持性的内容,促进健康的在线体验。有机会借鉴公共卫生和健康促进领域的教训和经验,采用类似的方法来应对与数字化世界相关的风险和利益例如,公共卫生和促进健康的办法既提供了限制青年人接触危险产品的保障,又就获取危险产品的安全程度和战略激励措施提供了明确的指导。烟草、酒精以及汽车和道路安全是公共卫生成功的很好例子,在这些领域,有证据清楚表明,不受限制地获取和/或接触烟草和酒精会损害年轻人的健康和福祉。2,11,12在澳大利亚和其他类似国家,已经采取了各种各样的公共卫生行动,以保护年轻人免受无限制接触这些产品所造成的伤害。其中包括法规和立法,例如设定年龄限制;财政对策,包括对产品征税和征税;通过限制产品的销售和消费地点来减少获取途径;向社会提供清晰准确的资讯和意见;以及这些产品的安全使用指南。让包括社交媒体平台所有者在内的跨国公司对其产品对年轻人造成的伤害负责至关重要,但政府有责任采取行动,保护年轻人参与数字世界以及健康和福祉的权利。卡梅尔·威廉姆斯是HPJA的主编,也是本文的合著者。他们被排除在与本文的接受和发表相关的编辑决策之外。所有作者声明无利益冲突。
本文章由计算机程序翻译,如有差异,请以英文原文为准。
查看原文
分享 分享
微信好友 朋友圈 QQ好友 复制链接
本刊更多论文
Beyond banning: Our digital world must be made safer for young people

Social media offers young people opportunities for connection, self-expression and learning, but not without increasingly well-evidenced health risks. A new report from the WHO suggests that 11% of adolescents show signs of problematic social media behaviours and experience negative consequences such as disrupted sleep.1 As governments around the world grapple with finding a balance between access and protection, the question arises: can we build a safer, more balanced digital space for children?

The regulation of children's use of social media is a growing global public health priority.2 In the United States, legislation sets the age of 13 as the threshold for children creating new social media accounts. The European Union has considered raising the minimum age for social media access to 16, and in France, social media platforms must refuse access to children under 15 unless they have parental permission.

Australia's recently announced proposal is expected to go further, completely banning young people from social media.3 This controversial idea hinges on the outcomes of ongoing trials involving age verification and assurance technologies.4 While discussions around banning children from social media are happening in several countries, Australia's approach could reshape how social media companies manage user access for minors. It focuses on the effectiveness of two key technological solutions: age assurance and age verification.

Age assurance includes a variety of techniques designed to estimate or determine a user's age. Methods range from self-reporting and parental verification to technologies such as facial recognition and the analysis of scrolling habits. These methods, however, can easily be circumvented by savvy teens, making enforcement of any ban difficult.

Meanwhile, age verification involves confirming a person's age by matching their identity to a verified source of information, such as a government-issued document. However, concerns arise over privacy and security risks, whether personal data are managed by the social media companies themselves or entrusted to third parties.5

Beyond the technical challenges of determining user age, there are social and cultural risks associated with age verification systems. Young people often use social media to explore their identities and seek information they may not feel comfortable seeking from parents, teachers or peers.6 Social media provides a level of anonymity, allowing them to ask questions about personal topics such as body image, sexuality and relationships. Age verification requirements could undermine this anonymity, stripping young users of their right to remain forgotten online.7

Another issue is the potential to exacerbate digital divides. Navigating age verification systems may prove more difficult for young people with lower digital access or literacy, or limited access to the necessary identification documents. These individuals could be excluded from mainstream social media platforms altogether, potentially pushing them into more dangerous, less regulated corners of the internet.

Despite the many challenges, some social media companies have begun to address public concerns over the safety of children on their platforms. Two common strategies are the creation of advisory groups focused on user safety and the development of specialised apps and accounts for children. For example, Meta has established a ‘Safety Advisory Council’ and an ‘Instagram Suicide and Self-Injury Advisory Group’ to guide its child safety policies. However, little is known about how these groups operate or what impact their advice has had.

Meanwhile, platforms like YouTube have introduced child-specific versions, such as YouTube Kids, which offers a commercially lucrative ‘walled garden’ of content specifically curated for an ever-younger audience.

In September, Meta announced the creation of ‘teen accounts’ for Instagram, a feature aimed at the safety of younger users.8 These accounts will have several key features, including accounts for users under 18 defaulting to private, and users under 16 needing parental permission to switch to a public profile. Teen accounts will limit interactions with strangers, restrict tagging and mentions, and apply content filters aimed at reducing exposure to harmful material. Young users will also receive notifications prompting them to log off after 60 min of use and turn on sleep mode, which mutes notifications overnight.

While these features may provide enhanced child and teen safety, they also raise concerns. Many of these ‘innovations’ are simply repackaged versions of pre-existing features and fail to represent protections on a scale that matches the known public health risks.9 Moreover, these changes appear to further commercial interests. Instagram's vision for teens now focuses on fostering a more intimate, chat-based experience akin to Snapchat, while also incorporating TikTok-style entertainment through Reels.

Whether we ultimately decide to limit young people's use of social media or not, the more important task is to create online environments that enable these generations to flourish and be healthy in a digitalised world. Standalone bans are a potential quick-win for tech companies and policymakers but risk displacing important responsibilities that digital platforms have as commercial enterprises that make enormous profits from children's culture and private lives.

Young people do not want to be excluded from the digital world but do want to be better protected from inaccurate information and digital harms.10 We must think critically about building social media spaces where young people are not targeted by advertisers looking to exploit their emotions or sell harmful products. Social media should prioritise users' health and well-being rather than aim to maximise attention and engagement at any cost.

Ultimately, young people deserve online spaces where they can explore their identities, form social connections, and express themselves freely, without being inundated with harmful content or manipulated by advertisers. Rather than focusing on restricting access, policymakers and platforms should be expected to provide high-quality, educational and supportive content that fosters healthy online experiences.

There are opportunities to draw on the lessons and experiences of public health and health promotion, applying similar approaches to the risks and benefits of the digitalised world.2 For example, public health and health promotion approaches have provided safeguards to limit young people's exposure to hazardous products, as well as clear guidance on safe levels of access and strategic incentives. Tobacco, alcohol and car and road safety are good examples of public health success, where the evidence is clear that unfettered access and/or exposure damages the health and well-being of young people.2, 11, 12 In Australia and other similar countries, a diverse range of public health actions have been put in place to protect young people from the harm caused by unlimited exposure to these products. These include regulation and legislation, such as setting age limits; fiscal responses, including taxes and levies on the products; reducing access through restrictions on where products can be sold and consumed; providing clear and accurate information and advice to the community; and guidelines on the safe use of these products.

Holding multinational corporations, including the owners of social media platforms, accountable for the harm their products cause young people is critical, but it is the responsibility of governments to act to protect young people's rights both to engage in the digital world and to health and well-being.

Carmel Williams is the Editor-in-Chief of HPJA and a co-author of this article. They were excluded from editorial decision-making related to the acceptance and publication of this article. All authors declare no conflict of interest.
