Chair & Reviewer Guidelines
The Program Committee’s job is to select the talks that maximize the benefit to the entire conference audience, within the space and time constraints given.
SciPy Japan 2020 will implement a double-open review: authors and reviewers are known to each other. Area chairs will make acceptance decisions, informed by reviewer comments and conference schedule limits. The intention is to offer transparency in the process, building trust in the community. (There is also research attesting to higher quality in signed reviews, compared to anonymous reviews.)
But research shows that peer review, just like hiring processes, can suffer from implicit bias that disadvantages women, minorities, and people from less prestigious institutions. For this reason, we want to educate our reviewers about bias in the review process, and equip them with tools to interrupt this bias.
Bias in the review process
Human beings are consistently, routinely, and profoundly biased. Not only are we profoundly biased, but we almost never know when we are being biased.
First, by realizing and accepting that we all have bias, we can learn to watch for it in ourselves and help others who work with us to do the same. This process of building awareness is analogous to what happens when we step on the clutch in a manual-transmission automobile. The motor doesn’t stop running (bias doesn’t stop), but the car is no longer moving forward. When we are on the lookout for our biases, they are less likely to blindly dictate our decisions.
Be on the lookout! As you review, ask yourself …
Are there things about the proposal that particularly influence my impression? Are they relevant to the talk?
What assessments have I already made about the speaker? Are these grounded in solid information or are they simply my interpretations?
Does this person’s work remind me in any way about myself? Is my own agenda influencing my assessment of this proposal? Are there past experiences of mine that are influencing factors?
Does this person remind me of anybody, positively or negatively? (Be careful to observe rater biases relating to current projects or work. It is easy to slip into evaluating proposals more or less positively because they happen to be on a project that is important to you.)
Does this person’s academic institution, place of employment, or any other affiliations affect how I perceive their work?
Things to think about as you prepare to review:
Bias in pattern-recognition responses—Does one person benefit because they do things “the way we do it around here,” rather than because it is the most innovative, productive, and effective way?
Pay attention to your projections about the work being evaluated.
Look for patterns of assessment among different groups. For example, are women in general rated differently than men?
Look out for semantic gender priming: exposure to words more strongly associated with male (e.g., aggressive, competitive) or female (e.g., supportive, nurturing) stereotypes affects subsequent evaluation of male or female targets.
Inferences about an individual based on aggregate data or assumptions about a group: making assumptions about individuals based on cultural stereotypes is a type of ecological fallacy. For example, because fewer women lead large center grants, making the assumption that an individual woman is less able to lead a large program; or because overall Black student achievement is lower than for White students, making the assumption that an individual Black student is less prepared.
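One way to check for patterns of assessment across groups, as suggested above, is to aggregate scores by group after a review round. Below is a minimal sketch in Python; the data layout, field names, and scores are purely illustrative assumptions, not part of any SciPy review tooling.

```python
from statistics import mean

# Hypothetical review data: each record holds a submission, the speaker's
# self-reported group, and the review score. All values are illustrative.
reviews = [
    {"submission": "talk-01", "group": "A", "score": 4.5},
    {"submission": "talk-02", "group": "B", "score": 3.0},
    {"submission": "talk-03", "group": "A", "score": 4.0},
    {"submission": "talk-04", "group": "B", "score": 3.5},
]

def mean_score_by_group(reviews):
    """Return the mean review score for each speaker group."""
    by_group = {}
    for r in reviews:
        by_group.setdefault(r["group"], []).append(r["score"])
    return {g: mean(scores) for g, scores in by_group.items()}

print(mean_score_by_group(reviews))  # prints {'A': 4.25, 'B': 3.25}
```

A large, consistent gap between groups does not prove bias on its own, but it is a signal that the committee should look more closely at how those proposals were evaluated.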
Tips for the reviewer
Interrupt your bias: after each section, refresh yourself on the rubric/metrics you have established and make sure you are reviewing the submission accordingly. Be polite. Instead of telling the author what to do, ask questions that suggest new approaches.
Don’t be rude. Always say at least one good thing.
Be grateful. Point out what you learned and say thanks.
Stay fresh while reading. Long review sessions can cause some spots to be left uncovered.
Reviewers have multiple demands on their time and are frequently under time constraints to finish reviews. Be aware that cognitive biases are efficient mental shortcuts, and time pressure promotes their influence on decision-making.
You should analyze the proposal along the aspects described below (thanks to Doug Hellmann for these tips):
The Abstract: occasionally a title comes along that is so compelling that I have to remind myself to keep reading before voting +1 and moving on to the next talk. It isn’t enough for a proposal to cover an interesting topic. It has to indicate that the talk will be interesting, too. While I am reading, I look for several factors.
Is the abstract clear? The speaker should describe the topic they plan to talk about in terms I can understand, even if I don’t know anything about that subject area. A clearly written abstract, without a lot of domain-specific jargon, tells me the speaker will be able to communicate with the audience.
Is the abstract complete? An incomplete proposal is the largest negative factor I consider. If a proposal is incomplete, I can’t really say what the speaker will talk about, or even if they know the subject matter for their talk. If a proposal does not have a detailed summary or outline, I ask the submitter to provide more detail.
Is the abstract compelling? Without regard to the actual subject, is the abstract written in a way to attract an audience? I look for an abstract that shows the speaker is excited about the topic, and that they will be conveying that same excitement to the audience.
The Topic: for some people, the subject matter of a talk is the most important, or only, aspect taken into consideration when voting. I have seen presentations on topics I thought would be boring, but which were delivered with such enthusiasm that I enjoyed them more than talks I thought would be interesting from the outset. In my mind, the topic is an important factor, but not necessarily the most important, to be considered.
How relevant, immediately useful and novel is the topic? I look first at whether the topic is relevant to the conference attendees. Attendees will have a range of experience levels, interests, and backgrounds. Although we want a broad set of topics, we do need to be careful to avoid talks that are too narrowly focused on a niche. As a counterpoint to considering whether a topic is too niche, I also try to take into account whether the audience will take away something immediately useful.
Conflicts of interest
The definition of a conflict of interest in peer review is a circumstance that makes you “unable to make an impartial scientific judgment or evaluation” (PNAS Conflict of Interest Policy).
Conflict of Interest (COI) is different from individual bias. It is concerned with independence from any interest that could impair your objective assessment, any concern other than the intellectual quality of the work.
As a reviewer, your present or previous association with certain authors poses a textbook COI: recent collaborators in research funding or publications (within the past 4 years), family members, business partners, and thesis students/advisors or mentors. Other associations can be more tenuous: being employed by the same institution, for example. In that case, your obligation as a reviewer is to disclose the potential COI to the area chair or Program Chair, and state whether you think you can make an impartial assessment of the work.
If you have a conflict of interest with a submission, you should decline to review it and disclose the specific reason to the Program Chairs.
As Area Chair of a conference track or mini symposium, you should avoid conflicts of interest in making acceptance decisions: you should recuse yourself from decisions on the work of your collaborators (or indeed, your own submissions), and let that decision be handled by your co-chair (or Program Chairs).
Declaring the potential of a conflict of interest is required under professional ethics. If in doubt: ask the Program Chairs.
Sources & Citations
Prepared by the SciPy 2017 Diversity Committee and Program Chairs. Thanks to Philip B. Stark for useful links.
This document is a compilation from different sources listed below:
Everyday Bias: Identifying and Navigating Unconscious Judgments in Our Daily Lives. Howard J. Ross.