mastouille.fr is one of the many independent Mastodon servers you can use to participate in the fediverse.
Mastouille is a sustainable, open Mastodon instance hosted in France.


#scopus


✍️ Práticas da História: Journal on Theory, Historiography and Uses of the Past has a permanent call for papers. It is an #OpenAccess journal indexed in #Scopus. It accepts proposals for articles, bibliographical essays, interviews, critical reviews, issues and thematic dossiers. The texts must be unpublished and can be written in Portuguese, English, Spanish or French.

👉 praticasdahistoria.pt/

@histodons
@histodon

📢 #Scientometric indicators in #research evaluation and research #misconduct: analysis of the Russian #university excellence initiative

👉 "The results showed that #RUEI #universities had a significantly higher number of retracted #publications in #WoS- and #Scopus-indexed #journals, suggesting that pressure to meet quantitative scientometric #indicators may have encouraged unethical research practices and #researchmisconduct."

link.springer.com/article/10.1

SpringerLink · Scientometric indicators in research evaluation and research misconduct: analysis of the Russian university excellence initiative - Scientometrics

This study aimed to examine the impact of the Russian University Excellence Initiative (RUEI), also known as Project 5–100, on research misconduct in Russian higher education. Launched in 2013, the RUEI incentivized universities to increase the number of publications in internationally indexed journals. The analysis compares the prevalence of retracted publications—as a proxy for research misconduct—between universities that participated in the RUEI and a control group of universities that did not. A total of 2621 retracted papers affiliated with at least one Russian institution were identified, of which 203 papers were indexed in Web of Science (WoS) and/or Scopus databases. The results showed that RUEI universities had a significantly higher number of retracted publications in WoS- and Scopus-indexed journals, suggesting that pressure to meet quantitative scientometric indicators may have encouraged unethical research practices and research misconduct. In addition, different reasons for retraction were found between publications indexed and not indexed in WoS and/or Scopus databases. These findings suggest that the direct and irresponsible use of scientometric indicators as performance measures may have unintended negative consequences that may undermine research integrity.

New #preprint 📢 - Can #OpenAlex compete with #Scopus in bibliometric analysis?

👉 arxiv.org/abs/2502.18427

@OpenAlex has broader coverage and shows higher correlation with certain expert assessments.

At the same time, it has issues with metadata completeness and document classification.

❗ Most intriguingly: it turns out that raw #citation counts perform just as well as, and in some cases even better than, normalized indicators, which have long been considered the standard in #scientometrics.

arXiv.org · Is OpenAlex Suitable for Research Quality Evaluation and Which Citation Indicator is Best?

This article compares (1) citation analysis with OpenAlex and Scopus, testing their citation counts, document type/coverage and subject classifications and (2) three citation-based indicators: raw counts, (field and year) Normalised Citation Scores (NCS) and Normalised Log-transformed Citation Scores (NLCS). Methods (1&2): The indicators calculated from 28.6 million articles were compared through 8,704 correlations on two gold standards for 97,816 UK Research Excellence Framework (REF) 2021 articles. The primary gold standard is ChatGPT scores, and the secondary is the average REF2021 expert review score for the department submitting the article. Results: (1) OpenAlex provides better citation counts than Scopus and its inclusive document classification/scope does not seem to cause substantial field normalisation problems. The broadest OpenAlex classification scheme provides the best indicators. (2) Counterintuitively, raw citation counts are at least as good as nearly all field normalised indicators, and better for single years, and NCS is better than NLCS. (1&2) There are substantial field differences. Thus, (1) OpenAlex is suitable for citation analysis in most fields and (2) the major citation-based indicators seem to work counterintuitively compared to quality judgements. Field normalisation seems ineffective because more cited fields tend to produce higher quality work, affecting interdisciplinary research or within-field topic differences.
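For readers unfamiliar with the indicator the preprint compares against raw counts: a field-and-year Normalised Citation Score divides a paper's citation count by the mean count of papers in the same field and publication year. A minimal toy sketch of that normalisation (my own illustration with made-up records, not the paper's code or data):

```python
from collections import defaultdict

def normalized_citation_scores(papers):
    """Field- and year-normalized citation score (NCS): a paper's raw
    citation count divided by the mean citation count of all papers
    sharing its field and publication year."""
    totals = defaultdict(lambda: [0, 0])  # (field, year) -> [citation sum, paper count]
    for p in papers:
        key = (p["field"], p["year"])
        totals[key][0] += p["citations"]
        totals[key][1] += 1
    mean = {k: s / n for k, (s, n) in totals.items()}
    return {p["id"]: p["citations"] / mean[(p["field"], p["year"])]
            for p in papers}

# Toy records, purely for illustration.
papers = [
    {"id": "a", "field": "bio",  "year": 2020, "citations": 10},
    {"id": "b", "field": "bio",  "year": 2020, "citations": 30},
    {"id": "c", "field": "math", "year": 2020, "citations": 2},
]
scores = normalized_citation_scores(papers)  # bio mean = 20, math mean = 2
```

Paper "c" gets an NCS of 1.0 despite only 2 citations, which is exactly the cross-field comparability that normalisation promises and that the preprint finds surprisingly little benefit from.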

Earlier this week an opinion piece authored by me and a number of great colleagues was published on the @upstream blog. Our piece introduces criteria for innovation-friendly bibliographic databases doi.org/10.54900/d3ck1-skq19.

We express our deep concerns about the treatment of @eLife by the #WebOfScience and #Scopus databases. We see this as an example of databases hindering rather than supporting innovation in scholarly communication and research assessment.

@cwts

Upstream · Criteria for Bibliographic Databases in a Well-Functioning Scholarly Communication and Research Assessment Ecosystem

Bibliographic databases should support innovation and experimentation. Here, we offer four criteria for innovation-friendly bibliographic databases. We urge the global research community to use databases that support and do not hinder innovation in scholarly communication and research assessment.

Good news at #CNRS Open Science Day:

"CNRS's cancellation of #Scopus subscription will help support its full transition to open, non-commercial model, a point reiterated by Antoine Petit ... 'We will eventually need to stop using commercial databases for bibliometrics and bibliography'. In the meantime CNRS has maintained subscription to Clarivate's #WebOfScience database while free bibliographic databases are being developed like open access not-for-profit solution @OpenAlex."

@BarcelonaDORI

New study: "Non-selective databases (#Dimensions, #OpenAlex, #Scilit, and #TheLens) index a greater amount of retracted literature than do databases that rely their indexation on venue selection (#PubMed, #Scopus, and #WoS)…The high coverage of OpenAlex and Scilit could be explained by the inaccurate labeling of retracted documents in #Scopus, Dimensions, and The Lens."
link.springer.com/article/10.1

SpringerLink · The indexation of retracted literature in seven principal scholarly databases: a coverage comparison of Dimensions, OpenAlex, PubMed, Scilit, Scopus, The Lens and Web of Science - Scientometrics

In this study, the coverage and overlap of retracted publications, retraction notices and withdrawals are compared across seven significant scholarly databases, with the aim to check for discrepancies, pinpoint the causes of those discrepancies, and choose the best product to produce the most accurate picture of retracted literature. Seven scholarly databases were searched to obtain all the retracted publications, retraction notices and withdrawals from 2000. Only web search interfaces were used, excepting in OpenAlex and Scilit. The findings demonstrate that non-selective databases (Dimensions, OpenAlex, Scilit, and The Lens) index a greater amount of retracted literature than do databases that rely their indexation on venue selection (PubMed, Scopus, and WoS). The key factors explaining these discrepancies are the indexation of withdrawals and proceeding articles. Additionally, the high coverage of OpenAlex and Scilit could be explained by the inaccurate labeling of retracted documents in Scopus, Dimensions, and The Lens. 99% of the sample is jointly covered by OpenAlex, Scilit and WoS. The study suggests that research on retracted literature would require querying more than one source and that it should be advisable to accurately identify and label this literature in academic databases.
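As a practical aside: OpenAlex (one of the high-coverage databases in the study) exposes retraction status directly on its work records, so counting retracted items per year is a one-request job. A hedged sketch of such a query, not the study's own method; the endpoint and `is_retracted` filter name are taken from OpenAlex's public API documentation and should be verified before use:

```python
# Base URL of the OpenAlex works endpoint (public, no API key required).
OPENALEX_WORKS = "https://api.openalex.org/works"

def retracted_query(year=None, per_page=25):
    """Build a works-endpoint URL restricted to retracted records,
    optionally narrowed to one publication year."""
    filters = ["is_retracted:true"]
    if year is not None:
        filters.append(f"publication_year:{year}")
    return f"{OPENALEX_WORKS}?filter={','.join(filters)}&per-page={per_page}"

url = retracted_query(year=2020)
# Fetching `url` (e.g. with urllib.request) returns JSON whose
# meta["count"] field reports how many works match the filter.
```

The study's point still stands, though: a single source is not enough, since labelling accuracy differs across databases.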

New study: "Journals published in Europe, Oceania and North America were more likely to be indexed in #Scopus and #WebOfScience compared to other world regions. Journals published in sub-Saharan #Africa were the most underrepresented and were four times less likely to be indexed than those published in #Europe."
link.springer.com/article/10.1

PS: I'm sorry that this comparison did not include @OpenAlex (#OpenAlex).

SpringerLink · Regional disparities in Web of Science and Scopus journal coverage - Scientometrics

The two most important citation indexes used by the global science community contain marked regional disparities in their representation of academic journals. Existing work on the geographical coverage of Web of Science and Scopus citation indexes compared their coverage of journals in a small sample of ‘top’ countries. This paper offers the first regional analysis of journal representation in these two indexes across all eight UNESCO world regions, compared to the total number of active Ulrich’s directory academic journals in these regions. Journal lists from 239 countries/territories were collected from Ulrich’s periodical directory and analyzed by region. This enables a comparison of the regional distribution of journals within Web of Science (20,255 matched journals) and Scopus (23,348 matched journals) with those in Ulrich’s directory (83,429 journals). Journals published in Europe, Oceania and North America were more likely to be indexed in Scopus and Web of Science compared to other world regions. Journals published in sub-Saharan Africa were the most underrepresented and were four times less likely to be indexed than those published in Europe. The analysis also offers a quantitative breakdown of journal publication languages, highlighting how Scopus and Web of Science disproportionately index English language publications in all world regions. Finally, the analysis shows how field coverage by Web of Science and Scopus differs between the regions, with the Social Sciences and Humanities still under-represented, in comparison to Natural Sciences and Medical & Health Science.

CNRS has unsubscribed from Scopus publications database cnrs.fr/en/cnrsinfo/cnrs-has-u

"Unsubscribing from #Scopus bibliographic database is first stage of process of freeing #CNRS from commercial databases and gradually switching to free bibliographic tools"

"#WOS subscription will be maintained during this transition"

"CNRS's decision is in line with international vision underpinning announcement by @cwts that it is launching transparent reproducible version of its world ranking of universities"

CNRS · The CNRS has unsubscribed from the Scopus publications database

The CNRS committed to open science several years ago and of course this includes publication databases for which sustainable open solutions need to be found.