For two decades, unconscious bias theory depended both conceptually and empirically on the Implicit Association Test (IAT). During the last 3-4 years, however, it has become increasingly clear that the IAT does not assess unconscious beliefs or attitudes and does not predict discriminatory behavior better than self-reported attitudes and beliefs. Reliance is now increasingly placed on the theory that underlies the IAT, the 2-system theory of the mind: System 1 is the automatic and fast information processing system that operates through stereotypes – and is vulnerable to biases – while System 2 has the potential to use rational deliberation to correct the automatic reactions of System 1.
It is commonly assumed that the stereotypes of System 1 are unconscious, are largely dissociated from conscious beliefs and attitudes, and give rise to for example gender disparities.
In this paper I argue that stereotypes are not unconscious in the sense of being inaccessible to introspection. Further, empirical evidence does not support the notion of a dissociation of conscious and unconscious beliefs and attitudes. Finally, a systematic review of the published literature about unconscious gender bias in organizations did not disclose empirical studies that document gender inequalities arising from unconscious bias. Studies assumed implicit bias to be the explanation of their findings rather than providing evidence that this is the case.
It is noted that there are important moral and practical consequences in organizations of assuming that stereotypes are unconscious rather than accessible for rational deliberation and correction.
Keywords: unconscious gender bias; implicit gender bias; organizations; gender equality; gender equity
In 2017 the Editor in Chief of Science, Jeremy Berg, took an Implicit Association Test (IAT) at a website at Harvard University and found that he was biased against women in science. He confessed:
As someone who grew up with a mother who was a medical researcher, who has been married to a woman very active in scientific research for more than 30 years, and who has had many female colleagues and students, I was surprised when I first took a test to measure implicit gender bias and found that I have a strong automatic association between being male and being involved in science (Berg 2017).
Obviously, Berg believed that he harbored unconscious attitudes that played tricks on him behind his back. Similar beliefs had been expressed by the previous Editor in Chief of Science, Marcia McNutt (2016). The Danish documentarist Mette Korsgaard wrote a book on the premise that we are all male chauvinists – even she, who has been a feminist for decades (Korsgaard 2018) – and for her, too, it was the IAT that had opened her eyes to her sexism towards her own gender.
The belief that one can be biased against women without wanting to be, and that the IAT pinpoints this unconscious bias, is widespread, as is the belief that there is a disconnect between consciously held attitudes and the unconscious. “The implicit associations we hold do not necessarily align with our declared beliefs or even reflect stances we would explicitly endorse” (Staats et al. 2015).
Much of this stems from the theory behind the IAT and the data that have been produced with the test. And most of it is wrong.
The Implicit Association Test
In the 1990s several papers noted the disappearance of overt expressions of racist and sexist beliefs in American public discourse, whereas no corresponding decline could be discerned in the discrimination against, and disadvantaged position of, the affected groups. It was proposed that previously overt biases had gone undercover and become covert (Dasgupta 2004; Swim et al. 1995).
One strand of research originating from these observations drew on new methods that enabled psychologists to measure unconscious memory, and introduced the idea of unconscious or implicit beliefs and attitudes. The theoretical understanding encompassed three factors influencing behavior: the person’s unconscious beliefs and attitudes, the covert beliefs and attitudes, and the explicit beliefs and attitudes (Dasgupta 2004). Figure 1 summarizes the model’s assumptions that all three factors have a direct impact on behavior. The dashed lines indicate that beliefs may be correlated. A central claim of the model is, however, that the correlation between the unconscious beliefs and attitudes and the conscious, overt (explicit) beliefs and attitudes is weak.
Figure 1. Model of relation between beliefs and attitudes and behavior
In empirical research the Implicit Association Test (IAT) was introduced as a method to assess what respondents were not able to self-report (Dasgupta 2004). This was a major methodological breakthrough, and the IAT has been a mainstay of research on unconscious bias ever since. Figure 2 depicts a typical empirical set-up of lab experiments inspired by the above model. Explicit beliefs and attitudes are assessed by self-reports in questionnaires or interviews, and the attitudes and beliefs that people are not able or willing to self-report are assessed by the IAT.
Figure 2. Laboratory study setup corresponding to the model in Figure 1
Figure 2 brings out an important respect in which the unconscious bias theory that relies on the IAT is not concerned solely with unconscious attitudes: the IAT also captures covert attitudes. The assessments are of attitudes that people are “not able to or willing to self-report” (Dasgupta 2004). The method has no means of distinguishing the contribution of each.
Further, the founders of the IAT, Anthony Greenwald and Mahzarin Banaji, made clear in their 2017 paper that unconscious beliefs and attitudes as measured by the IAT should not be understood literally as unconscious:
many publications continue to use implicit and explicit with conceptual meaning (as unconscious vs. conscious), rather than with their better justified empirical (indirect vs. direct) meaning. Even though the present authors find themselves occasionally lapsing to use implicit and explicit as if they had conceptual meaning, they strongly endorse the empirical understanding of the implicit–explicit distinction (Greenwald and Banaji 2017, 862).
Concerning the “dissociation” of conscious and unconscious attitudes, Nosek (2005) investigated the correlation between implicit and explicit measures in a large study and found an average correlation of 0.36. Items concerning feminism / traditional values showed an implicit-explicit correlation of 0.69. Hofmann et al. reported an average correlation between self-reported and implicit measures of 0.24 (Hofmann et al. 2005, 1376). These correlations are not negligible, and the claims of dissociation seem ill-founded in empirical data.
More importantly, a meta-analysis by Oswald et al. (2013) showed that, except for brain correlation studies, implicit measures did not fare better than explicit assessments in criterion studies. Finally, and most importantly, the large recent meta-analysis by Forscher et al. (2019) showed that people’s scores on implicit measures can be changed by interventions, but that changes in behavior are not correlated with the changes in implicit measures, casting doubt on the validity of the IAT as a predictor of behavior.
There are thus two reasons why Berg and others should not be worried about their IAT findings: IAT does not necessarily measure something they do not know about, and IAT does not predict their behavior.
Stereotypes and bias
This is not to say that we cannot be biased against women in the workplace; it is just a good deal more complicated than it seems at first sight. To understand how, we must take a step back and consider the 2-system theory of the mind. According to this influential theory we have two main information processing systems, aptly termed System 1 and System 2, where System 1 employs heuristics for quick decision making, and System 2 examines the available information more thoroughly and makes deliberate and reasoned decisions (Kahneman 2011; Kahneman and Frederick 2002).
The Royal Society’s video Understanding unconscious bias (2015) illustrates unconscious bias with a well-known finding about the shortcomings of System 1, the bat-and-ball problem: a bat and a ball cost 1 pound 10 pence. The bat costs 1 pound more than the ball. How much does the ball cost? The video goes on to say that most people get this wrong. That is true for the setting in which the data were generated, namely US college students responding in a test set-up where they have little at stake. If their lives depended on getting the answer right, no doubt most would figure out the correct answer after considering the arithmetic for a few minutes and checking the result. They would be mobilizing their System 2.
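Mobilizing System 2 here amounts to no more than solving a small linear equation. With x denoting the price of the ball in pounds, the bat costs x + 1, so:

```latex
x + (x + 1) = 1.10 \;\Rightarrow\; 2x = 0.10 \;\Rightarrow\; x = 0.05
```

The intuitive System 1 answer of 10 pence fails the check: a 10-pence ball would make the bat cost 1 pound 10 pence and the total 1 pound 20 pence.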
There is ample experimental evidence that the mind can be tricked into being biased in time-pressed situations and with limited information available. Such experiments produce errors of reasoning like the bat-and-ball problem, and a large number have been described, see e.g. (Kahneman 2011).
It is this theory that underlies the IAT as well as a wealth of publications about unconscious gender bias. Implicit attitudes as processed by System 1 are not unconscious in the sense of being inaccessible to introspection (Brownstein 2017). Implicit attitudes are part of the mind’s automatic information processing, but as illustrated by the bat-and-ball problem, System 2 can stop the automatic processing and subject the problem to conscious scrutiny. In many cases implicit attitudes may be viewed as priors in the Bayesian sense (sometimes called base rates, e.g. Jussim et al. 2018) that are activated when little other information is available and a decision has to be made.
This is now well established, and the interesting research question, which is being addressed by for example economists under the heading of stereotypes (Bordalo et al. 2016), is under which circumstances such priors are biased or accurate respectively. Bordalo et al.’s simplest example of a stereotype is the belief that “Florida residents are elderly”. The stereotype has a “kernel of truth”, the age distribution of the population of Florida is indeed shifted to the right compared to the whole US population, but the shift is too small to justify the general statement that Florida residents are elderly. Bordalo et al. analyze several datasets and demonstrate that some stereotypes are quite accurate, whereas some are biased in a predictable direction.
As in the case of the bat-and-ball problem, it is obvious that the stereotype about the age of residents of Florida is not unconscious to those who harbor it. The application of it to a specific situation – say, receiving an e-mail from an unknown resident of Florida – may be quick and automatic: the recipient may assume that the sender of the e-mail is an octogenarian until proven otherwise. But the stereotype itself can be subjected to critical assessment by consulting population statistics for Florida (where one would find that only 17.4% are above 65, compared to 13.1% in the US population (Bordalo et al. 2016)) and can subsequently be corrected.
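The “kernel of truth” can be made explicit with these figures as a likelihood ratio (a simplified sketch of Bordalo et al.’s representativeness ratio, here using the whole US population as the comparison group): being above 65 is only modestly more representative of Florida residents than of Americans in general,

```latex
\frac{P(\text{above 65} \mid \text{Florida})}{P(\text{above 65} \mid \text{US})}
\;=\; \frac{0.174}{0.131} \;\approx\; 1.33
```

A decision maker using the actual base rate as a Bayesian prior would assign the unknown correspondent only a 17.4% probability of being above 65; the stereotype inflates this modest shift into a near-certain expectation.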
The interest in the shortcomings of reasoning seems to have shifted the theoretical and practical emphasis to the fallacies of System 1 processing, whereas the corrective actions of System 2 receive little attention. A “reconceptualization of the relation between the conscious and the unconscious” proposed by Greenwald & Banaji (2017, 861) hardly mentions rational thought. Although Kahneman and Frederick (2002), Kahneman (2011), and Stanovich (e.g. 2011) have made efforts to establish a more balanced view of dual system processing, a one-sided emphasis on the biases of System 1 has set the stage for explaining discrimination against minorities as well as women as the result of biased information processing carried out by System 1.
There is a large body of literature that describes men’s and women’s stereotypes about what men and women can and do in the workplace and what characterizes men’s and women’s styles and expectations. Theoretically, biases could arise from automatic application of such stereotypes to small and large decisions in the workplace, and could add up to gender disparities in representation and pay.
The extent to which this happens in real life is what I am interested in in this paper. I am, in essence, looking for empirical evidence to support statements of the form:
- Women scientists in STEM, due to their underrepresentation, embody a critical point for the whole research system. This is broadly considered as the result of cultural embedded unconscious bias that affects female professional career paths (Di Tullio 2019)
- The way leaders recruit and promote employees is influenced by gendered stereotypes that cause us to evaluate men as better leaders than women. (Muhr 2019), (my translation).
- There is ample evidence that implicit bias is a (if not the) major cause of less favourable assessment of women’s academic capacities in research, teaching and leadership. This bias is present in access to power and to resources, including salaries and research funding (Gvozdanovic and Maes 2018).
These statements commit to a number of assumptions. The unconscious nature of the biases can be understood in the strong sense, where beliefs and attitudes are inaccessible to introspection, or in a weaker sense that commits only to automatic – one could say thoughtless – application of stereotypes. To empirically substantiate the statements, studies would have to assess unconscious stereotypes (strong version of the theory), or the application of stereotypes that are in principle accessible, including an assessment that the stereotype was indeed applied automatically rather than with intent (weak version of the theory). Further, there is the commitment to the effect being present at the group level.
This amounts to measuring the left and right sides of Figure 1, and analyzing their relationship, while controlling for the consciously held attitudes and beliefs, in groups of individuals in organizations.
This may seem a tall order for empirical studies, but the complexity of the theory calls for complex studies. The statements cannot be substantiated by theory alone, since the theory implies that stereotypes can be checked by System 2.
Do studies exist which provide compelling evidence for the above three statements?
I have approached the question with a scoping review of the literature about unconscious or implicit gender bias. The findings relevant to academia are published in NN 2020, and here the review is extended to other organizations and to management.
Material, methods, results
The literature search was made in Scopus and PsycINFO. Scopus is the largest database of peer-reviewed papers, encompassing 46 million references from 19,500 journals from a wide range of disciplines. PsycINFO is the most comprehensive database of psychological literature relevant for this review.
Titles, abstracts, and keywords were searched for the terms:
(((“unconscious bias” AND (gender OR women OR woman)) OR (“implicit bias” AND (gender OR women OR woman))))
A total of 425 publications were identified in the database search. One study was identified from other sources (Gaustad and Raknes 2015) and was included in the review, for a total of 426. Of these, 312 were evaluated on the abstract alone and 108 on the full paper; the remaining 6 could not be evaluated on the abstract alone and could not be found at the Royal Danish Library or on the Web, and were thus excluded.
After removal of book reviews, conference papers, conference reports, dissertations/theses, and publications about healthcare disparities, legal aspects of discrimination, and other publications not concerned with gender bias in organizations, 156 remained.
Of these 156 publications, 65 reported original empirical data about gender bias, and 91 were editorials, commentaries, and reviews based on existing data.
In 22 of the 65 publications with original data, the data did not provide evidence of bias against women – the findings were either neutral or there seemed to be a bias in favor of women. The remaining 43 publications reported bias against women in organizations (Akingbade 2010; Beasley et al. 2019; Beeler et al. 2019; Braddy et al. 2019; Cochran et al. 2013; Davids et al. 2019; Davies et al. 2019; Dayal et al. 2017; Di Tullio 2019; Dion et al. 2018; Dixon et al. 2019; Dresden et al. 2018; Fan et al. 2019; Files et al. 2017; Foley and Williamson 2019; Gaustad and Raknes 2015; Gerull et al. 2019; Girod et al. 2016; Hansen et al. 2019; Hardcastle et al. 2019; Heath et al. 2019; Holman and Morandin 2019; Jackson et al. 2014; James et al. 2019; Kalejta and Palmenberg 2017; Khoo-Lattimore et al. 2019; Krishnan et al. 2019; Lincoln et al. 2012; Lukela et al. 2019; Magua et al. 2017; Malmström et al. 2020; Manlove and Belou 2018; Martin 2011; Miller-Friedmann et al. 2018; O’Meara et al. 2018; Pardal et al. 2020; Ramos et al. 2016; Rojek et al. 2019; Salerno et al. 2019; Salles et al. 2019; Tanwir and Khemka 2018; Thomson et al. 2019; Turrentine et al. 2019). Two of the 43 (Kalejta and Palmenberg 2017; Martin 2011) did not clearly propose unconscious or implicit bias as an explanation of the findings.
The 41 studies that proposed unconscious or implicit bias as an explanation of the findings were reviewed in order to identify any that showed an association of a measure of implicit or unconscious bias, or stereotype, base rate or heuristic in the sense of System 1 and an endpoint that could at least hypothetically indicate lesser employment or career opportunities for women.
The simple conclusion is that no publications of empirical data documenting such an association could be identified.
Evans et al. (2014) is an example of a study that, despite claims to the contrary, does not study unconscious bias. The data collection used a “mixed methods approach…, encompassing qualitative ‘one to one’ interviews and focus groups” (Evans et al. 2014, 540). “Questions were posed explicitly to provoke some reflection on how inclusive the workplace culture was for women”. Based on the interviewees’ reports about their perceptions of inclusiveness, the authors concluded that the study provided empirical evidence for the existence of unconscious bias against women in the workplace (Evans et al. 2014, 507).
Upon a little reflection it becomes clear that this study only assessed the outcome side of the equation – the right side of Figure 1, as it were. Study participants were asked about their colleagues’ behavior and about their opinion of the causes of such behavior. But this cannot count as an assessment of the unconscious attitudes or beliefs of their colleagues.
Though the study is compatible with unconscious bias, it does not demonstrate it. The study is typical in that it presents statistics about differential experiences or treatment of men and women in an organization and concludes offhand that the statistics show unconscious bias. What should be demonstrated is taken for granted; the reasoning is circular.
Lincoln et al. (2012) analyzed data from 13 STEM societies about receipts of awards by men and women. Women received fewer research awards than expected based on their representation among the nominees. The authors ascribed this in part to award committees being implicitly biased. The award committee members’ attitudes were however not assessed in the study. They were only assumed on the basis of the skewed statistic about awards. As in the Evans et al. study, what should be demonstrated by data collection and analysis of an association, was instead assumed.
Of the 41 studies, 31 shared the property that they only measured the outcome side of Figure 1. In 4 studies the IAT was measured as an independent variable, but the outcome variable was either irrelevant (Pardal et al. 2020) or not measured at all (Dresden et al. 2018; Hansen et al. 2019; Salles et al. 2019). Five studies measured neither dependent nor independent variables (Di Tullio 2019; Foley and Williamson 2019; Miller-Friedmann et al. 2018; O’Meara et al. 2018; Tanwir and Khemka 2018). A single study manipulated stereotypical scenarios presented to the participants in an experimental set-up but failed to control for the participants’ conscious decision rules (Martin 2011).
The scoping review revealed a number of studies that purportedly were concerned with unconscious bias but fell short of fulfilling the requirement of measuring dependent as well as independent variables and controlling for extraneous factors, such as conscious beliefs.
The study by Teelken et al. (2019) provides an interesting illustration of the complex relations between overtly or automatically expressed stereotypes on the one hand and behavior on the other. Fourteen full professors participated, most of whom expressed “male” stereotypes about the preferred profile of professors. Notwithstanding this, when presented with similarly qualified CVs, 12 of the 14 respondents preferred the female candidate. With the limited number of respondents in mind, the findings should be interpreted with caution, but the study clearly demonstrates that stereotypes can be overridden.
More studies of this type, with considerably larger numbers of respondents, will be needed to confirm or refute the hypothesis that unconscious bias contributes to gender disparities in organizations. A better method than the IAT to assess the automatic application of stereotypes would have to be employed.
The scoping review only included publications that self-declared to address unconscious or implicit gender bias. Informal searches were made in Scopus and Google Scholar for studies about automatically applied stereotypes, but none surfaced that fulfilled the criteria of connecting the two sides of Figure 1 and controlling for conscious attitudes and beliefs. It is unlikely that studies have been missed, given the great interest in the scientific community in these issues.
It may be noted that the research question whether unconscious bias plays a role for gender inequalities in organizations is different from the question whether individuals may experience sexist behavior from colleagues or superiors who thoughtlessly display stereotypical attitudes and beliefs. The literature as well as the public discourse abound with stories which lend themselves to this interpretation. In an organization the effect of some individuals’ automatically applied stereotypes can be balanced out by others who are aware of and override their stereotypes or who have opposite stereotypes. The net effect of these factors is not a given and will have to be documented in organizations instead of the psychology lab.
There is an ambiguity in the theory about unconscious bias which stems from the lack of clarity as to the meaning of “unconscious”. In the common understanding, an unconscious memory, belief or attitude is inaccessible to introspection. Only to those familiar with the research field will it be clear that “unconscious” may also be understood in a weaker sense, as the application of stereotypes, heuristics or base rates that the acting person fails to control even though he or she could have done so. The misconception of what lies in “unconscious” is widespread, and it would seem timely to abandon a wording that is this prone to create misleading connotations.
How difficult is it to get to know and correct one’s stereotypes? The stereotype about the age of Florida residents seemed to be readily available for inspection and self-correction. Switching to System 2 to solve the bat-and-ball problem did not seem difficult either.
This question has a bearing on unconscious bias training. The rationale behind unconscious bias training is that employees, although not overt racists or sexists, conduct themselves in sexist and racist manners. Taking seriously the theory that prejudices and beliefs are inaccessible to conscious consideration has important implications for this training. Firstly, if the beliefs and attitudes are truly unconscious, the bearer is relieved of the moral responsibility for their actions (Noon 2018). It stands to reason that we cannot be held responsible for the tricks our unconscious plays on us. To begin with at least, our prejudices, biases, maybe even discriminatory behavior, are not our fault.
This may be seen as an advantage in that conflicts over values are avoided. But this applies only until course participants are told of their unconscious prejudices. If they take an IAT (as they likely will), like Jeremy Berg they will be induced to believe that they have biases they did not know of. While they were initially absolved of their biases and discrimination, they now learn that they are morally corrupt – even when they try to be good.
Although somewhat counterintuitive, it is possible to imagine how one could consciously work on unconscious prejudices without actually knowing them. The trick is that stereotypes can be re-learned. Unconscious attitudes and beliefs are the results of our upbringing, our taking in the cultural products around us throughout the years, our whole experience as human beings. If we are vigilant and persistent, we may be able to replace the unwanted stereotypes with more appropriate ones.
The process is, however, difficult because – in the strong version of the theory – we cannot know which unconscious beliefs and attitudes are in need of replacement; we cannot know what it takes to form new and better unconscious beliefs and attitudes, except that it somehow takes a radical replacement of our past experience of important aspects of life involving our relationships with people of other gender(s), colors, ages, faiths, etc.; and we cannot check whether we succeeded in replacing the harmful stereotypes, except by taking another IAT. So the task of repairing our morally corrupt mental set-up is a daunting one.
It should be no surprise that bias training that uses this theoretical framework can create a backlash among participants, especially if they are not convinced of the scientific validity of the theory.
Atewologun et al. (2018) reviewed the evidence as to the effectiveness of unconscious bias training and noted that most programs involve an IAT and that the interventions typically aim at supplanting unwanted stereotypes with more desirable ones and at implementing measures that are wholly outside the minds of participants, such as blinding of hiring and promotion procedures. This seems to indicate that unconscious bias training does subscribe to the strong version of the unconscious bias theory.
However, this is not entirely consistent. For example, the Google course available on the web (Unconscious Bias @ Work | Google Ventures 2014) urges participants to question their first impressions, justify their decisions even when not being challenged, challenge themselves, and endeavor to “make the unconscious conscious”, in other words activate their System 2 to challenge the automatic reactions generated by System 1.
To conclude on unconscious bias training: there are practical implications of the relative weight given to the accessibility of the unconscious, and courses struggle to find their feet, claiming on the one hand that prejudices are all-pervasive and impossible to get rid of, and on the other that the unconscious should be made conscious. This is probably not conducive to attaining the aims of the training. Where to set the cut-point is, on the other hand, difficult to define. The theoretical framework of the 2-system theory would seem to indicate that stereotypes are not all too difficult to control – for those who want to and who invest some time and critical thought in their decision making.
Considering the lack of empirical evidence for the assertion that gender disparities in organizations are due to unconscious biases, the question remains whether studies have not been adequately designed or whether unconscious bias plays no role in practice. Both options are open. Studies are indeed poorly designed: most of the reviewed studies only measured the outcome and assumed the conclusion, and a few attempted to measure an independent variable and an outcome but failed to control for conscious attitudes. The deficits could be remedied by more stringent study design, particularly if better methods to assess unconscious attitudes or beliefs and/or automatic stereotyped decisions could be developed – and covert attitudes and beliefs could be ruled out.
Akingbade RE 2010. Between a rock and a hard place: Blacklash towards agentic women aspiring to high ranking jobs in Nigeria. Gender & Behaviour 8(2): 3265–3278. DOI: 10.4314/gab.v8i2.61946.
Atewologun D, Cornish T and Tresh F 2018. Unconscious bias training: An assessment of the evidence for effectiveness. Manchester: Equality and Human Rights Commission. Available at: www.equalityhumanrights.com.
Beasley SW, Khor S-L, Boakes C, et al. 2019. Paradox of meritocracy in surgical selection, and of variation in the attractiveness of individual specialties: to what extent are women still disadvantaged? ANZ Journal of Surgery 89(3): 171–175. DOI: 10.1111/ans.14862.
Beeler WH, Griffith KA, Jones RD, et al. 2019. Gender, Professional Experiences, and Personal Characteristics of Academic Radiation Oncology Chairs: Data to Inform the Pipeline for the 21st Century. International Journal of Radiation Oncology Biology Physics 104(5): 979–986. DOI: 10.1016/j.ijrobp.2019.01.074.
Berg J 2017. Measuring and managing bias. Science 357(6354): 849. DOI: 10.1126/science.aap7679.
Bordalo P, Coffman K, Gennaioli N, et al. 2016. Stereotypes. The Quarterly Journal of Economics 131(4): 1753–1794. DOI: 10.1093/qje/qjw029.
Braddy PW, Sturm RE, Atwater L, et al. 2019. Gender Bias Still Plagues the Workplace: Looking at Derailment Risk and Performance With Self–Other Ratings. Group and Organization Management. DOI: 10.1177/1059601119867780.
Brownstein M 2017. Implicit Bias and Race. In: The Routledge Companion to the Philosophy of Race. Taylor & Francis Group, pp. 261–276. Available at: https://www.taylorfrancis.com/books/e/9781315884424/chapters/10.4324/9781315884424-19 (accessed 20 April 2020).
Cochran A, Hauschild T, Elder WB, et al. 2013. Perceived gender-based barriers to careers in academic surgery. American Journal of Surgery 206(2): 263–268. DOI: 10.1016/j.amjsurg.2012.07.044.
Dasgupta N 2004. Implicit Ingroup Favoritism, Outgroup Favoritism, and Their Behavioral Manifestations. Social Justice Research 17(2): 143–169. DOI: 10.1023/B:SORE.0000027407.70241.15.
Davids JS, Lyu HG, Hoang CM, et al. 2019. Female representation and implicit gender bias at the 2017 American society of colon and rectal surgeons’ annual scientific and tripartite meeting. Diseases of the Colon and Rectum 62(3): 357–362. DOI: 10.1097/DCR.0000000000001274.
Davies R, Potter TG and Gray T 2019. Diverse perspectives: gender and leadership in the outdoor education workplace. Journal of Outdoor and Environmental Education 22(3): 217–235. DOI: 10.1007/s42322-019-00040-8.
Dayal A, O’Connor DM, Qadri U, et al. 2017. Comparison of male vs female resident milestone evaluations by faculty during emergency medicine residency training. JAMA Internal Medicine 177(5): 651–657. DOI: 10.1001/jamainternmed.2016.9616.
Di Tullio I 2019. Gender equality in stem: Exploring self-efficacy through gender awareness. Italian Journal of Sociology of Education 11(3): 226–245. DOI: 10.14658/pupj-ijse-2019-3-13.
Dion ML, Sumner JL and Mitchell SM 2018. Gendered Citation Patterns across Political Science and Social Science Methodology Fields. Political Analysis 26(3): 312–327. DOI: 10.1017/pan.2018.12.
Dixon G, Kind T, Wright J, et al. 2019. Factors that influence the choice of academic pediatrics by underrepresented minorities. Pediatrics 144(2). DOI: 10.1542/peds.2018-2759.
Dresden BE, Dresden AY, Ridge RD, et al. 2018. No Girls Allowed: Women in Male-Dominated Majors Experience Increased Gender Harassment and Bias. Psychological Reports 121(3): 459–474. DOI: 10.1177/0033294117730357.
Evans M, Edwards M, Burmester B, et al. 2014. ‘Not yet 50/50’ – Barriers to the progress of senior women in the Australian Public Service. Australian Journal of Public Administration 73(4): 501–510. DOI: 10.1111/1467-8500.12100.
Fan Y, Shepherd LJ, Slavich E, et al. 2019. Gender and cultural bias in student evaluations: Why representation matters. PLoS ONE 14(2). DOI: 10.1371/journal.pone.0209749.
Files JA, Mayer AP, Ko MG, et al. 2017. Speaker Introductions at Internal Medicine Grand Rounds: Forms of Address Reveal Gender Bias. Journal of Women’s Health 26(5): 413–419. DOI: 10.1089/jwh.2016.6044.
Foley M and Williamson S 2019. Managerial Perspectives on Implicit Bias, Affirmative Action, and Merit. Public Administration Review 79(1): 35–45. DOI: 10.1111/puar.12955.
Forscher PS, Lai CK, Axt JR, et al. 2019. A meta-analysis of procedures to change implicit measures. Journal of Personality and Social Psychology 117(3): 522–559. DOI: 10.1037/pspa0000160.
Gaustad T and Raknes K 2015. Menn som ikke liker karrierekvinner. Hovedresultater fra en eksperimentell undersøkelse [Men who do not like career women. Main results from an experimental study]. Agenda. Available at: https://tankesmienagenda.no/uploads/images/medias/tankesmien_agenda_rapport_menn_som_ikke_liker_karrierekvinner_1__1567709038980.pdf.
Gerull KM, Loe M, Seiler K, et al. 2019. Assessing gender bias in qualitative evaluations of surgical residents. American Journal of Surgery 217(2): 306–313. DOI: 10.1016/j.amjsurg.2018.09.029.
Girod S, Fassiotto M, Grewal D, et al. 2016. Reducing implicit gender leadership bias in academic medicine with an educational intervention. Academic Medicine 91(8): 1143–1150. DOI: 10.1097/ACM.0000000000001099.
Greenwald AG and Banaji MR 2017. The implicit revolution: Reconceiving the relation between conscious and unconscious. American Psychologist 72: 861–871. DOI: 10.1037/amp0000238.
Gvozdanovic J and Maes K 2018. Implicit bias in academia: A challenge to the meritocratic principle and to women’s careers – And what to do about it. League of European Research Universities (LERU).
Hansen M, Schoonover A, Skarica B, et al. 2019. Implicit gender bias among US resident physicians. BMC Medical Education 19(1). DOI: 10.1186/s12909-019-1818-1.
Hardcastle VG, Furst-Holloway S, Kallen R, et al. 2019. It’s complicated: a multi-method approach to broadening participation in STEM. Equality, Diversity and Inclusion 38(3): 349–361. DOI: 10.1108/EDI-09-2017-0200.
Heath JK, Weissman GE, Clancy CB, et al. 2019. Assessment of Gender-Based Linguistic Differences in Physician Trainee Evaluations of Medical Faculty Using Automated Text Mining. JAMA network open 2(5): e193520. DOI: 10.1001/jamanetworkopen.2019.3520.
Hofmann W, Gawronski B, Gschwendner T, et al. 2005. A Meta-Analysis on the Correlation Between the Implicit Association Test and Explicit Self-Report Measures. Personality and Social Psychology Bulletin 31(10): 1369–1385. DOI: 10.1177/0146167205275613.
Holman L and Morandin C 2019. Researchers collaborate with same-gendered colleagues more often than expected across the life sciences. PLoS ONE 14(4). DOI: 10.1371/journal.pone.0216128.
Jackson SM, Hillard AL and Schneider TR 2014. Using implicit bias training to improve attitudes toward women in STEM. Social Psychology of Education: An International Journal 17(3): 419–438. DOI: 10.1007/s11218-014-9259-5.
James A, Chisnall R and Plank MJ 2019. Gender and societies: A grassroots approach to women in science. Royal Society Open Science 6(9). DOI: 10.1098/rsos.190633.
Jussim L, Stevens ST and Honeycutt N 2018. Unasked questions about stereotype accuracy. Archives of Scientific Psychology 6(1): 214–229. DOI: 10.1037/arc0000055.
Kahneman D 2011. Thinking, Fast and Slow. Penguin.
Kahneman D and Frederick S 2002. Representativeness revisited: attribute substitution in intuitive judgement. In: Gilovich T, Griffin D and Kahneman D (eds) Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge: Cambridge University Press.
Kalejta RF and Palmenberg AC 2017. Gender parity trends for invited speakers at four prominent virology conference series. Journal of Virology 91(16). DOI: 10.1128/JVI.00739-17.
Khoo-Lattimore C, Yang ECL and Je JS 2019. Assessing gender representation in knowledge production: a critical analysis of UNWTO’s planned events. Journal of Sustainable Tourism 27(7): 920–938. DOI: 10.1080/09669582.2019.1566347.
Korsgaard M 2018. Min Uimodståelige Mand. Vi Er Alle Mandschauvinister [My Irresistible Husband. We Are All Male Chauvinists]. København: Gyldendal.
Krishnan N, Biggerstaff D, Szczepura A, et al. 2019. Glass Slippers and Glass Cliffs: Fitting In and Falling Off. Transplantation 103(7): 1486–1493. DOI: 10.1097/TP.0000000000002603.
Lincoln AE, Pincus S, Koster JB, et al. 2012. The Matilda Effect in science: Awards and prizes in the US, 1990s and 2000s. Social Studies of Science 42(2): 307–320. DOI: 10.1177/0306312711435830.
Lukela JR, Ramakrishnan A, Hadeed N, et al. 2019. When perception is reality: Resident perception of faculty gender parity in a university-based internal medicine residency program. Perspectives on Medical Education 8(6): 346–352. DOI: 10.1007/s40037-019-00532-9.
Magua W, Zhu X, Bhattacharya A, et al. 2017. Are female applicants disadvantaged in National Institutes of Health peer review? Combining algorithmic text mining and qualitative methods to detect evaluative differences in R01 reviewers’ critiques. Journal of Women’s Health 26(5): 560–570. DOI: 10.1089/jwh.2016.6021.
Malmström M, Voitkane A, Johansson J, et al. 2020. What do they think and what do they say? Gender bias, entrepreneurial attitude in writing and venture capitalists’ funding decisions. Journal of Business Venturing Insights 13. DOI: 10.1016/j.jbvi.2019.e00154.
Manlove KR and Belou RM 2018. Authors and editors assort on gender and geography in high-rank ecological publications. PLoS ONE 13(2). DOI: 10.1371/journal.pone.0192481.
Martin DE 2011. Internal compensation structuring and social bias: Experimental examinations of point. Personnel Review 40(6): 785–804. DOI: 10.1108/00483481111169689.
McNutt M 2016. Implicit bias. Science 352(6289): 1035.
Miller-Friedmann J, Childs A and Hillier J 2018. Approaching gender equity in academic chemistry: lessons learned from successful female chemists in the UK. Chemistry Education Research and Practice 19(1): 24–41. DOI: 10.1039/C6RP00252H.
Muhr SL 2019. Ledelse af køn [Leading gender]. Ledelse i Dag, Lederne. Available at: https://www.lederne.dk/ledelse-i-dag/ny-viden/2019/ledelse-i-dag-juli-2019/frihed-til-selv-at-vaelge (accessed 13 November 2019).
Noon M 2018. Pointless Diversity Training: Unconscious Bias, New Racism and Agency. Work, Employment and Society 32(1): 198–209. DOI: 10.1177/0950017017719841.
Nosek BA 2005. Moderators of the Relationship Between Implicit and Explicit Evaluation. Journal of Experimental Psychology: General 134(4): 565–584. DOI: 10.1037/0096-3445.134.4.565.
O’Meara K, Templeton L and Nyunt G 2018. Earning professional legitimacy: Challenges faced by women, underrepresented minority, and non-tenure-track faculty. Teachers College Record 121(12). Available at: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85068429541&partnerID=40&md5=c32a8cf9a3c3f28223a6221aadb6342d.
Oswald FL, Mitchell G, Blanton H, et al. 2013. Predicting ethnic and racial discrimination: A meta-analysis of IAT criterion studies. Journal of Personality and Social Psychology 105(2): 171–192. DOI: 10.1037/a0032734.
Pardal V, Alger M and Latu I 2020. Implicit and explicit gender stereotypes at the bargaining table: Male counterparts’ stereotypes predict women’s lower performance in dyadic face-to-face negotiations. Sex Roles: A Journal of Research. DOI: 10.1007/s11199-019-01112-1.
Ramos MR, Barreto M, Ellemers N, et al. 2016. Exposure to sexism can decrease implicit gender stereotype bias. European Journal of Social Psychology 46(4): 455–466. DOI: 10.1002/ejsp.2165.
Rojek AE, Khanna R, Yim JWL, et al. 2019. Differences in Narrative Language in Evaluations of Medical Students by Gender and Under-represented Minority Status. Journal of General Internal Medicine 34(5): 684–691. DOI: 10.1007/s11606-019-04889-9.
Salerno PE, Páez-Vacas M, Guayasamin JM, et al. 2019. Male principal investigators (almost) don’t publish with women in ecology and zoology. PLoS ONE 14(6). DOI: 10.1371/journal.pone.0218598.
Salles A, Awad M, Goldin L, et al. 2019. Estimating Implicit and Explicit Gender Bias among Health Care Professionals and Surgeons. JAMA Network Open 2(7). DOI: 10.1001/jamanetworkopen.2019.6545.
Staats C, Capatosto K, Wright RA, et al. 2015. Understanding Implicit Bias. Kirwan Institute. Available at: http://kirwaninstitute.osu.edu/research/understanding-implicit-bias/ (accessed 30 March 2020).
Stanovich KE 2011. Rationality and the Reflective Mind. Oxford: Oxford University Press.
Swim JK, Aikin KJ, Hall WS, et al. 1995. Sexism and racism: Old-fashioned and modern prejudices. Journal of Personality and Social Psychology 68(2): 199–214. DOI: 10.1037/0022-3514.68.2.199.
Tanwir M and Khemka N 2018. Breaking the silicon ceiling: Gender equality and information technology in Pakistan. Gender, Technology and Development 22(2): 109–129. DOI: 10.1080/09718524.2018.1496695.
Teelken C, Taminiau Y and Rosenmöller C 2019. Career mobility from associate to full professor in academia: micro-political practices and implicit gender stereotypes. Studies in Higher Education. DOI: 10.1080/03075079.2019.1655725.
Thomson A, Horne R, Chung C, et al. 2019. Visibility and representation of women in multiple sclerosis research. Neurology 92(15): 713–719. DOI: 10.1212/WNL.0000000000007276.
Turrentine FE, Dreisbach CN, St Ivany AR, et al. 2019. Influence of Gender on Surgical Residency Applicants’ Recommendation Letters. Journal of the American College of Surgeons 228(4): 356-365.e3. DOI: 10.1016/j.jamcollsurg.2018.12.020.
Unconscious Bias @ Work | Google Ventures 2014. Available at: https://www.youtube.com/watch?v=nLjFTHTgEVU (accessed 9 September 2020).
Understanding unconscious bias 2015. Available at: https://www.youtube.com/watch?time_continue=4&v=dVp9Z5k0dEE (accessed 8 September 2020).