Mission and Editorial Policies
The purpose of the Journal of Cognitive Neuroscience (JoCN) is to provide a single forum for research in the biological bases of mental events and behavior. The Journal publishes papers that bridge the gap between descriptions of information processing and specifications of brain activity. As such, much of the work will be interdisciplinary in nature, drawing on developments in fields including neuroscience, neuropsychology, experimental psychology, neurology, computational modeling, engineering, and statistics. The Journal will not publish research reports that bear solely on descriptions of function without addressing the underlying brain events, or that deal solely with descriptions of neurophysiology or neuroanatomy without regard to function. The Journal also will not publish papers where the emphasis is on defining the cognitive deficits associated with a particular clinical condition, nor on the assessment of treatments, if the findings do not inform an understanding of the neural bases of cognition and behavior. All papers should be written to be accessible to an interdisciplinary readership.
Types of Papers
The Journal will consider empirical and methodological papers, as well as reviews of recent experimental research on a timely topic that explores the relationship between brain and behavior. The Journal also publishes “Essays,” which provide a place to introduce new ideas, offer reflection, challenge current dogmas, or present ideas about future directions, placed in the historical context of past discoveries.
For empirical papers, the Journal of Cognitive Neuroscience gives preference to reports that present results from rigorously and transparently conducted studies that are adequately powered and statistically sound, and that make a theoretical contribution to the field of cognitive neuroscience. Note that theoretical significance does not imply that results have to be positive and novel. Null findings from well-conducted research can be equally meaningful when supported statistically (e.g., by Bayesian statistics).
Preregistration and Preregistered Research Reports
JoCN encourages, but does not require, preregistration of studies and their analysis plans. Preregistration shifts the evaluation of research from its outcomes (which should not be under the control of the experimenter) to the quality of the question addressed and the methods used in the research (Nosek, Ebersole, DeHaven, & Mellor, 2018). The journal offers the option to preregister a study and its data analysis plan for peer review before the study is conducted via the Preregistered Research Reports mechanism. In principle, acceptance of a "Stage 1" Preregistered Research Report ensures that the results will be published regardless of their outcomes. (Procedures and policies for Preregistered Research Reports are detailed below.) Studies can also be preregistered in an independent registry (i.e., non-peer-reviewed preregistration; e.g., https://osf.io/prereg/ or http://aspredicted.org/).
Scientific Transparency and Reproducibility
Authors should provide all details of the research design, study materials and data analysis in their Methods sections, permitting replication by any researcher. They should also disclose any data exclusions, all the conditions/groups tested, and all the dependent measures collected, even if not included in the analyses presented in their submission. Guidelines for reporting fMRI and M/EEG studies that can be followed are described in Poldrack et al. (2008) and Keil et al. (2013), respectively.
JoCN endorses, but does not require, the depositing of materials, data, and analysis scripts into an open access repository for publication. In manuscripts accepted by the Journal, authors will be asked to state how researchers may access the study data and materials (e.g., via a link to an open access repository, or by email to the lead author), or describe why the data cannot be shared (e.g., no IRB approval for data posting/sharing).
Statistical Rigor and Reporting
Studies should include adequate control conditions/groups, have adequate statistical power, and avoid other common statistical mistakes (Makin & Orban de Xivry, 2019).
Authors must explicitly justify their sample size(s) in the Methods section, e.g., based on an a priori power analysis or previous work.
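As one illustration only (the journal does not prescribe any particular tool), an a priori power analysis for a two-sided, two-sample t-test can be sketched with SciPy's noncentral t distribution. The effect size, alpha, and target power below are hypothetical placeholders, not journal thresholds:

```python
from scipy.stats import t, nct

def two_sample_power(n_per_group: int, d: float, alpha: float = 0.05) -> float:
    """Power of a two-sided, two-sample t-test with n_per_group subjects
    per group and a true effect of Cohen's d."""
    df = 2 * n_per_group - 2
    ncp = d * (n_per_group / 2) ** 0.5      # noncentrality parameter
    t_crit = t.ppf(1 - alpha / 2, df)       # two-sided critical value
    # P(|T| > t_crit) when the test statistic follows a noncentral t
    return 1 - nct.cdf(t_crit, df, ncp) + nct.cdf(-t_crit, df, ncp)

def required_n(d: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Smallest n per group that reaches the target power."""
    n = 2
    while two_sample_power(n, d, alpha) < power:
        n += 1
    return n

# e.g., a hypothetical medium effect (d = 0.5) at 80% power:
print(required_n(0.5))  # 64 subjects per group
```

Reporting the assumed effect size and the resulting n (or an equivalent justification from previous work) in the Methods section satisfies the requirement above.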
JoCN encourages researchers to also report effect sizes to avoid problems associated with null-hypothesis significance testing (e.g., Szucs & Ioannidis, 2017) and/or include Bayesian statistics that can assess evidence for the null hypothesis (e.g., Harms & Lakens, 2018).
Test statistics should be reported to two decimal places (e.g., t(49) = 4.28). In addition, exact p values should be reported for all results greater than .001 (to three decimal places for p values between .001 and .10, and to two decimal places for p values greater than .10); p values below .001 should be reported as “p < .001”. Exact Bayes Factors should also be reported (to between two and four decimal places, depending on the precise BF; e.g., BF01 = .0067; BF01 = .67; BF10 = 11.58).
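As a convenience only (the journal does not supply such a tool), the reporting rules above can be encoded in a small helper; the function names here are our own:

```python
def format_p(p: float) -> str:
    """Format a p value per the rules above (no leading zero,
    since a p value cannot exceed 1)."""
    if p < 0.001:
        return "p < .001"
    # three decimal places between .001 and .10, two above .10
    digits = 3 if p <= 0.10 else 2
    return f"p = {p:.{digits}f}".replace("0.", ".", 1)

def format_t(df: int, t_value: float, p: float) -> str:
    """Report a t statistic to two decimal places with its p value."""
    return f"t({df}) = {t_value:.2f}, {format_p(p)}"

print(format_t(49, 4.28, 0.00008))  # t(49) = 4.28, p < .001
print(format_p(0.0234))             # p = .023
print(format_p(0.456))              # p = .46
```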
JoCN supports posting article preprints (e.g., on https://www.biorxiv.org/) to enable more rapid dissemination of results.
Diversity in Citation Practices
Longstanding inequities in our field unfairly hinder career advancement for nonmale-identifying individuals and for individuals from underrepresented racial, ethnic, and socioeconomic groups. They also hinder the development and advancement of our science. An analysis of JoCN papers published from 2009 through 2020 has shown that papers by nonmale-identifying research groups have been systematically undercited relative to proportions of authorship in the journal (Fulvio et al., 2020). In an effort to call attention to this problem, and perhaps to begin remedying it, JoCN encourages authors to estimate the gender citation balance indices (GCBIs) of the Reference section of their manuscript (using the tool at https://postlab.psych.wisc.edu/gcbialyzer/) and to include these metadata along with their submission. Reviewers will be instructed not to consider the GCBIs in their assessment of the suitability of the manuscript for publication, but to take them into account when recommending to the authors papers from undercited groups that the authors may consider including in subsequent revisions. Beginning with vol. 33, all papers published in JoCN will include a Citation Diversity Statement, and authors will be invited (but not obligated) to report their paper’s gender citation balance in this statement. The journal will keep a record of GCBIs, labeled by iteration of submission, and will periodically report the aggregate data, as a way of assessing the influence of this initiative.
Publishing Open Access
The Journal of Cognitive Neuroscience offers authors the opportunity to publish their articles open access. Please see MIT Press's information page, Publishing Open Access, for details.
Peer Review Taxonomy
Journal of Cognitive Neuroscience and the MIT Press are participating in a pilot of STM's Working Group on Peer Review Taxonomy.
Background statement: "STM, the International Association of Scientific, Technical and Medical Publishers, has recognised a need to identify and standardise definitions and terminology in peer review practices in order to help align nomenclature as more publishers use open peer review models. A peer review taxonomy that is used across publishers will help make the peer review process for articles and journals more transparent, and will enable the community to better assess and compare peer review practices between different journals."
- Identity transparency: Double Anonymized
- Reviewer interacts with: Editor
- Review information published: None
Submission of Manuscripts
The Journal of Cognitive Neuroscience only accepts online submissions. To submit a manuscript for review visit the JOCN online manuscript submission tool. There is a flat fee of $1500 charged for publishing a manuscript; this fee offsets some of the expenses of the journal's editorial operations. The Journal of Cognitive Neuroscience no longer charges authors to publish color images.
The Journal of Cognitive Neuroscience does not accept supplemental material for manuscripts, except in rare cases and only with the explicit approval of an editor. To be clear, this policy does not relate to the "depositing of materials, data, and analysis scripts into an open access repository for publication," which is strongly encouraged. Rather, it means that manuscripts must be fully self-contained, and cannot refer the reader to supplementary methods, figures, or analyses that are needed to fully evaluate the results and/or interpretations presented in the manuscript. The rationale is twofold: first, such supplementary material puts undue burdens on reviewers, and subsequently on readers; second, such supplementary material can compromise the archival integrity of a paper published in JoCN, because the journal can control neither the content nor the accessibility of this material once the paper is published.
Organization of Manuscripts
A major objective of the Journal is to promote interdisciplinary understanding and communication in the mind sciences. Accordingly, the abstract should clearly state the relevance of the article to fields outside of the subdiscipline being reported. It should be written in complete sentences, without subheadings. The background, objectives, and hypotheses of the study should be presented in the Introduction, followed by the experimental Methods. The Results and Discussion sections should follow, and in some instances may be combined. Sections and subheadings should not be numbered. References should be formatted as specified in the American Psychological Association Publication Manual and must include a Crossref DOI in the format https://doi.org/10.5334/joc.98. There is no page limit for submitted manuscripts. Manuscripts should be submitted in the .doc format (Word). For initial submissions, it is permissible to paste figures into the manuscript file. For all revisions, figures must be uploaded as separate files, and image resolution for figures should be 300 dpi for color images (in RGB format) and 600 dpi for grayscale art and line art. Acceptable image formats are: jpeg, jpg, tif, or tiff.
Preregistered Research Reports
Preregistered Research Reports (also known as "Registered Reports") are a type of empirical article in which the theoretical motivation, the hypotheses, and the proposed methods have been peer reviewed and approved by the journal prior to the research being carried out. If the “Stage 1” submission receives in-principle acceptance, the journal guarantees publication of a “Stage 2” manuscript that presents the results and their interpretation, on the condition that the methods used to collect and analyze the data did not deviate from what was approved at Stage 1. There are several ways in which Preregistered Research Reports encourage best practices. At Stage 1, peer review will necessarily focus on the importance of the question being addressed, and the rigor of the methods proposed to address this question. Therefore, reviews of a Stage 1 manuscript cannot be biased by whether or not a reviewer “likes” the conclusions drawn from the results. This prereview of the methods also decreases the likelihood that time and money will be wasted by carrying out an experiment that, in retrospect, is found to contain an experimental confound that renders its results uninterpretable. At Stage 2, publication bias is reduced because researchers cannot carry out multiple post hoc analyses in search of a statistically significant result. Indeed, the guaranteed publication of null results will contribute to the robustness of our science, because the review of the Stage 1 proposal increases the likelihood that these null results will be interpretable and potentially important. The format also allows for the reporting of results from analyses conceived after in-principle acceptance, provided these are separated from the results of the preregistered analyses and explicitly labeled as exploratory. (For more on registered reports, see Nosek & Lakens, 2014.)
From initial submission through to publication, Preregistered Research Reports go through two editorial stages. The initial submission to the journal is the Stage 1 Preregistered Research Report, which comprises the Abstract, Introduction and Methods. The Abstract is similar to conventional manuscripts, with the exception that it will describe the predicted pattern of results, and what would be the theoretical implications. The Introduction is similar to conventional manuscripts, with the primary exception being that it concludes with an explicit enumerated list of hypotheses. The hypotheses should be stated in terms of the concrete independent and dependent variables that make up the experimental design, not in terms of the rationale underlying them, nor in terms of the theoretical constructs operationalized by the experimental variables. The statement of these hypotheses should be concise, with no more than two or three sentences for each. It is understood that the statement of a hypothesis may need to refer to methods and/or measures that won’t be fully explicated until the Methods section. Sections of the Introduction referring to the planned experiments and possible outcomes should be written in the future tense. The Methods section must include an explicit a priori justification of statistical power (to increase the likelihood that possible null results will be interpretable and potentially important).
It can also happen that authors want to include pilot data in a Stage 1 manuscript. In some cases, such data help illustrate a novel method or demonstrate the plausibility of some aspect of the proposed experiment. Another scenario is when the authors have discovered an unexpected result in a completed experiment, and are using the Preregistered Research Report mechanism to carry out a peer reviewed, a priori replication of this result. If it is decided that the inclusion of pilot data would be an important element of an eventual published Preregistered Research Report, the description of these data should be included in the Introduction. Figures illustrating pilot data should be kept to the minimum of what is deemed necessary for the thorough evaluation of the Stage 1 manuscript.
An important element of the Stage 1 submission is the cover letter that accompanies the manuscript. This cover letter must include the following elements:
- A statement confirming that all necessary support (e.g., research funding), approvals (e.g., for human subjects or nonhuman animal research), and resources (e.g., equipment, research personnel) are in place, and that the experiment will be started as soon as the proposal receives in-principle approval (or as soon as is practicable).
- A pledge that, upon in-principle acceptance, the Stage 1 manuscript will be placed in a recognized repository (either immediately accessible or under embargo until acceptance of the Stage 2 manuscript). This archived document must be unchanged from the manuscript that received in-principle acceptance. The intended repository should be named and a URL supplied, and part of the peer-review process may involve verification of the suitability of the proposed repository.
- An anticipated timeline for carrying out the proposed research.
- A statement confirming that, upon acceptance of the Stage 2 manuscript, the authors will make their data freely accessible, with details about how this will be accomplished (e.g., naming the repository where the data will be hosted).
- A statement confirming that, if the authors withdraw their submission at any time after in-principle acceptance of the Stage 1 manuscript, the journal can publish the abstract from that manuscript in a section for Withdrawn Preregistrations.
Peer review of Stage 1 submissions will proceed in the same manner as for conventional submissions, with “Reject” or “Revise and Resubmit” the two most likely outcomes, and the latter requiring that a point-by-point response accompany the revised Stage 1 manuscript.
The Stage 2 manuscript adds Results and Discussion sections to the reviewed Stage 1 document. The Abstract, Introduction, and Methods may change from the accepted Stage 1 document only in the following ways. The Abstract can replace mention of anticipated results with a sentence or two summarizing the actual results and a sentence summarizing their interpretation. On rare occasions, it might be acceptable to summarize the results from exploratory analyses. Where appropriate, verb tense in the Introduction can be changed from future to present or past tense. Importantly, no mention of exploratory results is permitted in the Introduction. Changes to the Methods section may arise from minor and/or unavoidable deviations from the procedures as described in the Stage 1 manuscript, or from the performance of additional exploratory analyses. For deviations from approved procedures for the planned experiment, the original wording should be kept as-is, followed by one or more sentences stating how the procedure for the actual experiment was changed, and explaining the reason. Additional exploratory analyses should be described in a newly added final subsection of the Methods section. The Results should, to the extent possible, be organized in terms of their associated hypotheses. The Discussion should first summarize the results from the proposed analyses, including specific statements about the outcome of each hypothesis test, and summarize the implications of these results for the theoretical questions raised in the Introduction. If additional exploratory analyses were carried out, their summary and possible integration with the planned analyses should only appear after the conclusions based on the planned analyses.
The cover letter accompanying the Stage 2 submission must include the following:
- A statement confirming that none of the data presented in the Results section were collected prior to the date of in-principle acceptance of the Stage 1 submission.
- Specification of every change made to the Abstract, Introduction, and Methods relative to the accepted Stage 1 manuscript.
- The page number in the Stage 2 manuscript containing the URL for the archived Stage 1 proposal.
- The page number in the Stage 2 manuscript containing the URL for the archived study data and accompanying materials.
- The page number in the Stage 2 manuscript containing the URL for the archived processing and analysis code, if archived separately from the data.
Rejection of a Stage 2 Manuscript
Eventual publication of the proposed research is assumed, regardless of the pattern of the results, upon in-principle acceptance of a Stage 1 manuscript. Reasons for rejection of a Stage 2 manuscript are limited to circumstances that violate the spirit of the Preregistered Research Reports mechanism. These include deviation from the Stage 1 experimental procedure that is deemed nontrivial by reviewers and/or the editor, and evidence that a condition for publication was not met (for example, archived Stage 1 manuscript differs from the accepted version; data collection was started prior to date of in-principle acceptance; research ethics approvals are found to have not been obtained prior to submission).
Nonhuman Primate Neurophysiology
Many nonhuman primate (NHP) neurophysiology studies entail recording from hundreds or even thousands of neurons and/or electrode penetrations, across multiple experimental sessions (each replicating the same experimental conditions), all within the same animal. For the typical experiment that also entails the performance of a behavioral task (i.e., during chronic recording from an awake, behaving animal), the neurophysiological recording sessions are preceded by months of training the animal to perform the task to a predefined criterion level of performance. For studies like these, the operative statistical tests are whether an experimental manipulation produces a systematic effect in the signals recorded from this experimental animal over these multiple sessions (e.g., the degrees of freedom for an analysis might be the number of neurons recorded within a particular region). The results from such an experiment can be viewed as a case study – a detailed analysis of the behavior and neurophysiology of a single organism.
For decades, the convention for NHP neurophysiology studies has been that results obtained in one animal should be replicated in a second animal before they are written up for publication. Although this practice may boost subjective assessment of the robustness of a set of results, it does not support stronger inference in any formal way. The generation of a second case study does not, for example, support inference to the population from which the two sample cases were drawn (cf. Fries & Maris, preprint). (Indeed, in practice one often reads that, upon observation that the results obtained from a second animal did not differ appreciably from those obtained in a first animal, data from the two animals are pooled before the final statistical analyses are carried out.) Additionally, although it has been suggested that the “two-animal rule” acts as a deterrent to engaging in “questionable research practices” (cf. Martinson, Anderson, & de Vries, 2005), this is a supposition for which we know of no definitive evidence. JoCN’s policy is that the number of animals included in the experiment does not carry special status relative to other aspects of the experimental procedure that are also evaluated when a manuscript is being considered for publication. Thus, for example, an n of 1 in an NHP neurophysiology study is not, by itself, a valid a priori reason for rejecting a manuscript. Rather, just as is the case with experiments carried out in other species (including humans), the n selected for an NHP neurophysiological study needs to be explicitly justified. Factors such as ethical considerations, effort, and/or expense can, in principle, be satisfactory justification for an n of 1, so long as the authors can successfully argue that the results from the experiment are scientifically valuable. (We note that the merits of peer-reviewed preregistration are the same for NHP neurophysiology research as for other types of cognitive neuroscience research.)
Journal of Cognitive Neuroscience
Dept. of Psychology
1202 West Johnson St.
Madison, WI 53706
Dworkin, J. D., Linn, K. A., Teich, E. G., Zurn, P., Shinohara, R. T., & Bassett, D. S. (2020). The extent and drivers of gender imbalance in neuroscience reference lists. Nature Neuroscience. https://doi.org/10.1038/s41593-020-0658-y
Fries, P., & Maris, E. (preprint). What to do if N is two? arXiv. https://arxiv.org/abs/2106.14562
Fulvio, J. M., Akinnola, I., & Postle, B. R. (2020). Gender (im)balance in citation practices in cognitive neuroscience. bioRxiv. https://doi.org/10.1101/2020.08.19.257402
Harms, C., & Lakens, D. (2018). Making ‘null effects’ informative: Statistical techniques and inferential frameworks. Journal of Clinical and Translational Research, 3, 7. https://doi.org/10.18053/jctres.03.2017S2.007
Keil, A., Debener, S., Gratton, G., Junghöfer, M., Kappenman, E. S., Luck, S. J., . . . Yee, C. M. (2013). Committee report: Publication guidelines and recommendations for studies using electroencephalography and magnetoencephalography. Psychophysiology, 51, 1–21. https://doi.org/10.1111/psyp.12147
Makin, T. R., & Orban de Xivry, J.-J. (2019). Science Forum: Ten common statistical mistakes to watch out for when writing or reviewing a manuscript. eLife, 8, e48175. https://doi.org/10.7554/eLife.48175
Martinson, B. C., Anderson, M. S., & de Vries, R. (2005). Scientists behaving badly. Nature, 435, 737–738. https://doi.org/10.1038/435737a
Nosek, B. A., Ebersole, C. R., DeHaven, A. C., & Mellor, D. T. (2018). The preregistration revolution. Proceedings of the National Academy of Sciences, USA, 115, 2600–2606. https://doi.org/10.1073/pnas.1708274114
Nosek, B. A., & Lakens, D. (2014). Registered reports: A method to increase the credibility of published results. Social Psychology, 45, 137–141. https://doi.org/10.1027/1864-9335/a000192
Poldrack, R. A., Fletcher, P. C., Henson, R. N., Worsley, K. J., Brett, M., & Nichols, T. E. (2008). Guidelines for reporting an fMRI study. NeuroImage, 40, 409–414. https://doi.org/10.1016/j.neuroimage.2007.11.048
Szucs, D., & Ioannidis, J. P. A. (2017). When null hypothesis significance testing is unsuitable for research: A reassessment. Frontiers in Human Neuroscience, 11, Article 390. https://doi.org/10.3389/fnhum.2017.00390