  • Thesis


Analyzing the Quality Differences of Mega-journals by Comparing Author Sources and Citation Performances: Using Top Journals as Benchmark

Advisor: 林奇秀


Abstract


Mega-journals are a collective term for journals characterized by very large publication volumes. In recent years, the academic community has used journal blocklists to categorize mega-journals as questionable and not recommended for submission. However, this approach fails to reflect the characteristics of individual mega-journals. This study employs bibliometric analysis to compare the author sources and citation performance of mega-journals. More specifically, the analysis of author sources examines the distribution of publications by country and institution within each journal; the analysis of citation performance examines the distribution of citation counts, the composition of citation sources, and journal and publisher self-citation. The results represent the degree of recognition that mega-journals receive and thereby reflect their quality differences.

The study compares the three multidisciplinary mega-journals with the largest publication volumes and similar JIF quartiles: PLOS ONE, Scientific Reports, and MDPI's Sustainability. In addition, the top multidisciplinary journals Nature and Science serve as benchmarks for examining how closely the mega-journals' performance resembles that of top journals.

The results show that among the three mega-journals, PLOS ONE's composition of country and institution sources is the most similar to that of the benchmark journals. In both PLOS ONE and Scientific Reports, more than half of the publications come from scientifically leading countries and from developed countries, differing little from the benchmark journals. Furthermore, about 50% of each journal's publications come from the world's top 1,000 universities, and about 60% come from institutions ranked in the top 10% globally by h-index, indicating that the two journals attract very similar shares of publications from top institutions.

In contrast, the country and institution distributions of Sustainability differ markedly from those of the benchmark journals and the other two mega-journals. In Sustainability, only about 40% of publications come from scientifically leading countries, and around 50% come from developed countries. Likewise, only about 30% of its publications come from scientifically leading institutions, about 40% from the world's top 1,000 universities, and less than 50% from institutions ranked in the top 10% globally by h-index. These results suggest that Sustainability is less recognized by authors from top countries and institutions.

In terms of citation performance, without excluding self-citations, Scientific Reports performs best among the three mega-journals for both its low-cited and its highly cited papers. Regarding citation sources, Sustainability's journal self-citation rate reaches 26.01% and its publisher self-citation rate 47.62%, far higher than those of the other four journals, highlighting the self-citation problem of MDPI's mega-journals.

These findings show that quality varies greatly even among mega-journals. This study therefore suggests that when academic institutions formulate reward and penalty policies for mega-journals, or when scholars consider submitting to a mega-journal, they should further evaluate author sources and citation performance rather than relying only on journal blocklists or other journal evaluation lists.
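The two indicator families the abstract describes reduce to simple ratios over per-paper records: the share of publications whose authors come from a given set of countries or institutions, and the fraction of incoming citations that originate from the citing journal's own journal or publisher. A minimal Python sketch, with an invented record schema (the `journal`, `publisher`, `country`, and `citations` fields are illustrative assumptions, not the thesis's actual data model):

```python
# Toy records: each paper lists its publishing journal/publisher, an
# author country, and the journal/publisher of each citing paper.
# (Schema is hypothetical, for illustration only.)
papers = [
    {"journal": "J1", "publisher": "P1", "country": "US",
     "citations": [{"journal": "J1", "publisher": "P1"},
                   {"journal": "J2", "publisher": "P2"}]},
    {"journal": "J1", "publisher": "P1", "country": "CN",
     "citations": [{"journal": "J3", "publisher": "P1"}]},
]

def source_share(papers, countries):
    """Share of papers whose author country falls in `countries`
    (e.g. a set of scientifically leading or developed countries)."""
    return sum(p["country"] in countries for p in papers) / len(papers)

def self_citation_rates(papers):
    """Journal and publisher self-citation rates: the fraction of all
    incoming citations that come from the same journal, and from the
    same publisher, as the cited paper."""
    total = journal_self = publisher_self = 0
    for p in papers:
        for c in p["citations"]:
            total += 1
            journal_self += c["journal"] == p["journal"]
            publisher_self += c["publisher"] == p["publisher"]
    return journal_self / total, publisher_self / total

print(source_share(papers, {"US"}))   # 0.5
print(self_citation_rates(papers))    # (0.333..., 0.666...)
```

The same ratios, computed over Web of Science citation records instead of this toy list, would yield figures like the 26.01% journal and 47.62% publisher self-citation rates reported for Sustainability.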

