The process of peer review has existed for a long time, attempting to ensure that scholarly materials are accurate, factual, and in many cases reproducible. The process has relied upon a combination of expert editors, volunteer reviewers, and blind reviewing to identify accurate and important contributions to a discipline.
This form of collegial review has been relatively satisfactory for a long time; however, there have always been concerns about whether novel ideas will be accepted by editors who serve as gatekeepers, often with very traditional perspectives. Consequently, some important paradigm shifts did not originate in articles published in the most respected journals; these radical ideas were often initially accepted only in less prominent journals.
Pressures have been introduced into the scholarly communication system as scholarship has become more of a profession. The "publish or perish" situation has grown more intense as additional research institutions have created greater competition. As a result, traditional society-based publications, many with author page charges, have been challenged by commercial publications (originally without author fees) but with much higher subscription prices. In addition, large, popular, interdisciplinary journals have seen greater competition from more targeted niche journals. All of these editorial boards claimed to be using blind peer review to guarantee quality.
Over time, additional publishers entered the market, creating outlets for many articles that were not accepted in the leading journals. These publishers also claimed to follow the same quality controls through blind peer review. Citation rankings attempt to represent the relative quality of these journals, but there are many questions about how accurately citations represent readership and impact. Altmetrics are now starting to measure social impact in hopes of capturing additional dimensions of value.
NOTE: Open Access journals, which are simply peer-reviewed journals with a different business model, have been wrongly accused of providing lower-quality material. In actuality, the business model is completely separate from the peer review process, so these initial fears, based upon misunderstandings, should disappear if adequate peer review is utilized.
Lately, the ease of distribution on the web has meant that new journals can be started without a great deal of capital and infrastructure. As a result, there has been a growth in predatory publishers -- those more interested in generating a profit from publication than in screening for quality articles. One article that analyzes this situation is "'Predatory' open access: a longitudinal study of article volumes and market characteristics" by Cenyu Shen and Bo-Christer Björk, BMC Medicine 2015, 13:230.
Regardless of the quality of the specific peer review process used, there have been increasing reports of intentional gaming of, and flaws in, the peer review system across all types of journal publications.
Authors considering submission of materials to a journal should consider the types of questions addressed by the Think.Check.Submit collaborative initiative to identify trusted publishers.
To monitor the latest reports of retractions of peer-reviewed articles, see Retraction Watch.
Scholars have increasing opportunities to publish their work in both traditionally published and online-only journals. Publishers such as Oxford, Wiley, and Elsevier, as well as professional associations, are expanding opportunities in open access or hybrid journals. Cabell's Directory of Publishing Opportunities is one of many sources to consult when selecting the right journal in the social sciences. There are also useful online resources to guide authors in selecting a journal for publishing their research.
Sherpa/Romeo is a database of publisher contracts and copyright agreements. It can be searched by journal title or publisher, with direct links to publishers' journal websites.
The four tools listed below use similar search algorithms to find the best-fitting journals by matching the abstracts or titles of manuscripts.
JANE: Journal/Author Name Estimator identifies potential journals from the text, keywords, or title of a paper to be published, comparing it to similar texts in MEDLINE to find the best matches.
Elsevier Journal Finder works on the same principle as JANE but allows limiting the selection to disciplines.
Springer Journal Selector is similar to the Elsevier tool, matching the manuscript abstract or title against 2,600 Springer publications.
Edanz Journal Selector will find a journal match among 18,000 journals.
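The matching these tools perform can be pictured as ordinary text-similarity scoring. The sketch below is a minimal, hypothetical illustration using TF-IDF weighting and cosine similarity over a toy journal corpus; the actual algorithms and corpora behind JANE and the Elsevier, Springer, and Edanz tools are proprietary, and every name and data structure here is invented for illustration.

```python
# Hypothetical sketch of abstract-to-journal matching: TF-IDF vectors
# plus cosine similarity. Not the real algorithm of any listed tool.
import math
from collections import Counter

def tfidf_vectors(docs):
    """Build simple TF-IDF vectors for a list of token lists."""
    n = len(docs)
    df = Counter()                       # document frequency per term
    for doc in docs:
        df.update(set(doc))
    vecs = []
    for doc in docs:
        tf = Counter(doc)
        vecs.append({t: (tf[t] / len(doc)) * math.log(n / df[t]) for t in tf})
    return vecs

def cosine(a, b):
    """Cosine similarity between two sparse vectors (dicts)."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank_journals(abstract, journal_profiles):
    """Rank candidate journals by similarity to a submitted abstract.
    journal_profiles maps a journal name to representative text
    (illustrative data, not any real corpus)."""
    names = list(journal_profiles)
    docs = [abstract.lower().split()] + [
        journal_profiles[name].lower().split() for name in names
    ]
    vecs = tfidf_vectors(docs)
    scores = {name: cosine(vecs[0], vecs[i + 1]) for i, name in enumerate(names)}
    return sorted(scores.items(), key=lambda kv: -kv[1])
```

In practice these services also weight by journal scope, recency, and citation data, but term-overlap scoring of this kind is the common core of "matching an abstract to a journal."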
Google Scholar has added a Google Scholar Citations tool that allows authors to track citations to their works, create profiles, and connect with other researchers in the field. A Google account is required to use Google Scholar Citations.
The traditional peer review metric has been the citation impact factor. This counts the number of times an article has been cited since publication, assuming a correlation between readership, citations, and impact. These values are often still based upon journal rankings rather than individual articles, and recent studies have shown that outliers can significantly skew such assessments. Various modifications have been made to the algorithm to make it more meaningful, including averaging the number of citations over a three-year period and normalizing the numbers for cross-disciplinary comparison. Impact factors should be used as one factor among many to determine value and impact, and should not be the sole tool for measuring author performance.
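As a concrete illustration of the three-year averaging idea, the standard two-year journal impact factor for a given year divides the citations received that year to the journal's items from the two preceding years by the number of citable items published in those years. A minimal sketch, assuming simple dictionary inputs (not any real citation database's schema):

```python
def impact_factor(citations, items_published, year):
    """Two-year journal impact factor for `year`.

    citations: {(citing_year, cited_publication_year): count}
    items_published: {publication_year: number of citable items}
    (Illustrative data structures chosen for this sketch.)
    """
    # Citations received in `year` to items from the two prior years.
    cites = sum(citations.get((year, y), 0) for y in (year - 1, year - 2))
    # Citable items the journal published in those two years.
    items = sum(items_published.get(y, 0) for y in (year - 1, year - 2))
    return cites / items if items else 0.0
```

Note how citations to older items, however numerous, fall outside the window: a single heavily cited outlier inside the window can dominate the ratio, which is exactly the skew the surrounding text describes.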
A more detailed description of the tools used to measure the impact of journals, authors, and articles can be found on our On Scholarly Publishing page.
A new Relative Citation Ratio (RCR) calculation attempts to place an individual article's citation count in context with other articles in the same field. See iCite, the NIH tool for calculating RCRs. There is also an article describing the RCR approach.
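At its core, the RCR compares an article's citation rate with an expected rate for its field. The sketch below is a deliberate simplification: in iCite the field benchmark is derived from the article's co-citation network and NIH-funded comparison articles, which are collapsed here into a single supplied `field_rate` parameter.

```python
def relative_citation_ratio(article_cites, years_since_pub, field_rate):
    """Simplified sketch of the Relative Citation Ratio idea:
    the article's citations per year divided by the expected
    citations per year for its field. `field_rate` stands in for
    the co-citation-network benchmark that iCite actually computes."""
    if years_since_pub <= 0 or field_rate <= 0:
        raise ValueError("need a positive time window and benchmark rate")
    article_rate = article_cites / years_since_pub
    return article_rate / field_rate
```

On this reading, an RCR of 1.0 means the article is cited at the rate expected for its field, and values above 1.0 indicate above-expectation influence, which is what makes the ratio comparable across disciplines with different citation norms.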
Archived version of Beall's List -- a list of potentially predatory publishers created by a librarian.
An excellent introduction with a listing of print-on-demand publishers originally compiled by a librarian.