{"id":4490,"date":"2022-03-14T17:25:04","date_gmt":"2022-03-14T16:25:04","guid":{"rendered":"https:\/\/eticasfoundation.org\/?p=4490"},"modified":"2022-03-14T17:25:04","modified_gmt":"2022-03-14T16:25:04","slug":"gender-discrimination-in-the-algorithmic-field-a-look-at-the-algorithms-in-the-oasi-register","status":"publish","type":"post","link":"https:\/\/dev.eticasfoundation.org\/es\/gender-discrimination-in-the-algorithmic-field-a-look-at-the-algorithms-in-the-oasi-register\/","title":{"rendered":"Gender discrimination in the algorithmic field: a look at the algorithms in the OASI Register"},"content":{"rendered":"<p><span style=\"font-weight: 400;\">Throughout history, and still today, there have been large groups of society that have been or continue to be discriminated against for reasons such as color, origin, beliefs, sexual orientation or gender. One of the most salient and established kind of discrimination is the one affecting half of the human population: women and girls. The world as we see it today, the way the history is told and the decisions that shaped it, are highly designed by men. Caroline Criado Perez, journalist and feminist activist, said: \u201cThe result of this deeply male-dominated culture is that the male experience, the male perspective, has come to be seen as universal, while the female experience -that of half the global population, after all- is seen as, well, niche.\u201d And, as she also said: \u201cWhen we exclude half of humanity from the production of knowledge we lose out on potentially transformative insights.\u201d<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Whether it\u2019s about education, work, the arts, science or any other kind of social and economic activity, women have been continuously restricted from access and full participation. And when women and girls did succeed, their contributions were not registered on equal terms as those of men. 
As a result, the documentation, information and data our societies have been producing are biased against women, who are underrepresented and treated unjustly compared to men across all that information.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Interestingly enough, in the field of computer science, traditionally seen as completely male-dominated, one of the most prominent pioneers was a woman, <\/span><a href=\"https:\/\/en.wikipedia.org\/wiki\/Ada_Lovelace\"><span style=\"font-weight: 400;\">Ada Lovelace<\/span><\/a><span style=\"font-weight: 400;\">, an English mathematician who wrote what is considered the first computer programme in history (she died at the young age of 36 in 1852).<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Thanks to Lovelace\u2019s and other people\u2019s work (<\/span><a href=\"https:\/\/en.wikipedia.org\/wiki\/Women_in_computing\"><span style=\"font-weight: 400;\">including more key contributions by other women<\/span><\/a><span style=\"font-weight: 400;\">), computer programming and digital technology evolved to the point where we are today, when most of us carry supercomputers in our pockets (i.e. our smartphones) that can instantly access a big part of all the information ever produced by humans (i.e. 
the internet).<\/span><\/p>\n<p><span style=\"font-weight: 400;\">And thanks to the advances and proliferation of computer and digital technology, these days we are also seeing the development and deployment of more and more <\/span><a href=\"https:\/\/eticasfoundation.org\/oasi\/what-are-algorithms\/\"><span style=\"font-weight: 400;\">algorithmic and so-called artificial intelligence (AI) systems<\/span><\/a><span style=\"font-weight: 400;\">, which are used to automatise activities and run complex tasks, including processing huge amounts of data, that would take us humans much longer, or that we simply could not do ourselves.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The way many of these algorithms work is by being trained on existing data, and when we speak of machine learning that means that an algorithm <\/span><i><span style=\"font-weight: 400;\">learns<\/span><\/i><span style=\"font-weight: 400;\"> to extract patterns and other significant correlations from the data it\u2019s fed and trained on.<\/span><\/p>\n<p>&nbsp;<\/p>\n<h3><span style=\"font-weight: 400;\">But what happens if such data are biased, for example against women, as we described above?<\/span><\/h3>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">If a company or a state body uses an algorithm to offer personalised recommendations to job applicants, and if that algorithm is trained on existing data about jobs and professions, then the algorithm may recommend jobs like banker, engineer or airline pilot to men, while recommending jobs like cleaner, care-giver or sales assistant to women.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">And that would be because existing data show that, traditionally, women have held such jobs, and neither the data nor the algorithm take into account that such bias is the result of social inequality and structural discrimination against women. 
In effect, the algorithm would be reproducing and maybe even reinforcing existing biases against women, and that\u2019s why we speak of gender discrimination as a <\/span><a href=\"https:\/\/eticasfoundation.org\/oasi\/social-impact-algorithms\/\"><span style=\"font-weight: 400;\">potential social impact of algorithms<\/span><\/a><span style=\"font-weight: 400;\">.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">To make things more problematic, most people working in AI, machine learning and algorithms, and in IT and technology generally, are still men today; and that is even more pronounced in top and decision-making roles. That means that most AI programmes and algorithmic systems have been developed by men or by mostly-male teams, and that may make it more difficult to compensate for existing biases that discriminate against women.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The <\/span><a href=\"https:\/\/eticasfoundation.org\/oasi\/register\/\"><span style=\"font-weight: 400;\">OASI Register<\/span><\/a><span style=\"font-weight: 400;\"> can already offer an insightful overview of algorithmic systems that may result in gender discrimination. 
If you go to the <\/span><a href=\"https:\/\/airtable.com\/shrsAN2oTf68kM6O9\/tblG2604tSoMOcwWX?backgroundColor=teal&amp;viewControls=on\"><span style=\"font-weight: 400;\">Register<\/span><\/a><span style=\"font-weight: 400;\"> and filter the \u201cSocial impact\u201d column by the \u201cgender discrimination\u201d category, you get a list of quite a few entries, encompassing many of the <\/span><a href=\"https:\/\/eticasfoundation.org\/oasi\/methodology-categories\/\"><span style=\"font-weight: 400;\">different domains catalogued by the OASI Register<\/span><\/a><span style=\"font-weight: 400;\"> and including algorithms used by some of the so-called tech giants: Google, Facebook, Amazon and Microsoft.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Google (which is owned by Alphabet) appears three times. There is the <\/span><a href=\"https:\/\/airtable.com\/shrsAN2oTf68kM6O9\/tblG2604tSoMOcwWX\/viwoQmFjcPVR6VOSE\/recX3VJ1Safif1OXo?backgroundColor=teal&amp;viewControls=on\"><span style=\"font-weight: 400;\">algorithm of Google AdSense<\/span><\/a><span style=\"font-weight: 400;\">, a system developed by Google that serves targeted adverts on the websites of participating publishers (most of Google\u2019s money comes from its advertising business). Already back in 2015, a study by academic researchers found that women were far less likely than men to be shown adverts for highly paid jobs through the Google AdSense system. 
The researchers concluded there was no way to know why that happened, because the AdSense algorithm was opaque (in 2013, AdSense was also <\/span><a href=\"https:\/\/edri.org\/wp-content\/uploads\/2021\/06\/EDRi_Discrimination_Online.pdf\"><span style=\"font-weight: 400;\">found to reproduce racist biases [PDF]<\/span><\/a><span style=\"font-weight: 400;\">).<\/span><\/p>\n<p><span style=\"font-weight: 400;\">A similar trend was observed when researchers looked at the <\/span><a href=\"https:\/\/airtable.com\/shrsAN2oTf68kM6O9\/tblG2604tSoMOcwWX\/viwoQmFjcPVR6VOSE\/recLWJOXy1CocwODH?backgroundColor=teal&amp;viewControls=on\"><span style=\"font-weight: 400;\">algorithm behind Google\u2019s Image Search<\/span><\/a><span style=\"font-weight: 400;\">, which lists images found online based on the keywords entered into the search field by the user. As the OASI Register reports, in 2015 a team of academic researchers found that if you searched for \u201cCEO\u201d on Google Images, almost all the results were men (or, rather, white men). Likewise, if you searched for \u201cdoctor\u201d on Google Images most of the images listed were of male doctors. However, if you searched for \u201cnurse\u201d, then almost all the results were of female nurses.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">That was in 2015, and a casual test done while writing this article showed that the situation has barely improved: around 90% of the images shown after searching for \u201cCEO\u201d were of white men, and almost all the images found under \u201cnurse\u201d were of women. However, when searching for \u201cdoctor\u201d the results did include a substantial number of female doctors, even though slightly more than half of the images were still of men. 
More in-depth research, published in February 2022, agrees that <\/span><a href=\"https:\/\/www.washington.edu\/news\/2022\/02\/16\/googles-ceo-image-search-gender-bias-hasnt-really-been-fixed\/\"><span style=\"font-weight: 400;\">Google Image Search still has a gender bias problem<\/span><\/a><span style=\"font-weight: 400;\"> (other image search engines also showed biased results, according to that research).<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The third current Google example in the OASI Register is the <\/span><a href=\"https:\/\/airtable.com\/shrsAN2oTf68kM6O9\/tblG2604tSoMOcwWX\/viwoQmFjcPVR6VOSE\/rec3TxGSNdEpgKCLl?backgroundColor=teal&amp;viewControls=on\"><span style=\"font-weight: 400;\">Google Translate algorithm<\/span><\/a><span style=\"font-weight: 400;\">. In 2018, this software was also shown to reproduce discriminatory gender biases when translating non-gendered English words for jobs, like \u201cdoctor\u201d, into gendered words in other languages, like Spanish, where it would systematically be translated as the male versions (\u201cm\u00e9dico\u201d or \u201cdoctor\u201d and not \u201cm\u00e9dica\u201d or \u201cdoctora\u201d). The same bias was present when translating English words like \u201cnurse\u201d and also \u201cbeautiful\u201d, which would be translated into the female versions of the words in other languages. 
Today, this bias seems to have been corrected, and, at least when translating from English to Spanish, Google Translate now offers both gendered versions of \u201cdoctor\u201d and \u201cnurse\u201d as potential translations.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Still in the labour field, an <\/span><a href=\"https:\/\/airtable.com\/shrsAN2oTf68kM6O9\/tblG2604tSoMOcwWX\/viwoQmFjcPVR6VOSE\/recU8umW4bPksZLU7?backgroundColor=teal&amp;viewControls=on\"><span style=\"font-weight: 400;\">internal algorithm developed by Amazon to streamline its own recruitment process<\/span><\/a><span style=\"font-weight: 400;\"> was abandoned after the company discovered that the algorithm was systematically penalising female candidates and giving them a lower score simply because they were women. The reason was that the algorithm was trained on data from Amazon\u2019s workforce and recruitment process, to which mostly men applied, and because of that the algorithmic system<\/span><i><span style=\"font-weight: 400;\"> learnt<\/span><\/i><span style=\"font-weight: 400;\"> that male candidates were preferable. After reportedly trying to fix it, Amazon discontinued the use of this algorithm in 2017.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">A similar bias was observed in another internal algorithm used in recruitment, <\/span><a href=\"https:\/\/airtable.com\/shrsAN2oTf68kM6O9\/tblG2604tSoMOcwWX\/viwoQmFjcPVR6VOSE\/recxAZCVLEBvfghPT?backgroundColor=teal&amp;viewControls=on\"><span style=\"font-weight: 400;\">this time at Uber<\/span><\/a><span style=\"font-weight: 400;\">. In 2019, it was reported that the ride-hailing company had been using an algorithm to pre-select job candidates by looking at successful past applicants and other recruitment data. 
Given that nearly 90% of Uber\u2019s workforce was male, the algorithm started favouring male candidates too.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">While Google, Facebook and Uber are well-known representatives of today\u2019s tech companies, similar problems were already plaguing some of the earliest algorithms developed to try to make application processes more efficient. That was the case with the oldest algorithm in the OASI Register, <\/span><a href=\"https:\/\/airtable.com\/shrsAN2oTf68kM6O9\/tblG2604tSoMOcwWX\/viwoQmFjcPVR6VOSE\/rechdCcfW7B0IJcPm?backgroundColor=teal&amp;viewControls=on\"><span style=\"font-weight: 400;\">implemented in the 1970s at St. George\u2019s Hospital Medical School in the UK<\/span><\/a><span style=\"font-weight: 400;\">. Back then, this school started using a computer programme to automate the applicant selection process. As is still the case today, the algorithm was based on existing data from successful past applicants, most of whom were men. As a result, the algorithm systematically denied interviews to female applicants. The algorithm reportedly stopped being used in 1988.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Those examples show that training algorithms on historical data from the fields of labour and education (as well as many other fields) is problematic because of the implicit biases contained in such datasets. That\u2019s why the OASI Register also flags \u201cgender discrimination\u201d as a <\/span><a href=\"https:\/\/airtable.com\/shrsAN2oTf68kM6O9\/tblG2604tSoMOcwWX\/viwoQmFjcPVR6VOSE\/recJqS2RmR5d5ufMk?backgroundColor=teal&amp;viewControls=on\"><span style=\"font-weight: 400;\">potential impact of algorithms like Send@<\/span><\/a><span style=\"font-weight: 400;\">, which, unlike the previous cases, does not come from the private sector but has been developed and is being used by the Spanish Public Employment Service. 
While the stated aim of Send@ is to offer personalised advice to job-seekers, the fact that its algorithm is trained on data from past successful job candidates, and that we know that recruitment processes have traditionally and implicitly been discriminatory towards women, means that Send@ could inadvertently reproduce and maybe also reinforce such discriminatory biases.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Another revealing kind of bias was found in the <\/span><a href=\"https:\/\/airtable.com\/shrsAN2oTf68kM6O9\/tblG2604tSoMOcwWX\/viwoQmFjcPVR6VOSE\/recmOf1wRdkJWlLPL?backgroundColor=teal&amp;viewControls=on\"><span style=\"font-weight: 400;\">search algorithm of Facebook<\/span><\/a><span style=\"font-weight: 400;\"> (now owned by Meta). In 2019, research published in Wired magazine showed that the algorithm readily generated results when the search was \u201cphotos of my female friends\u201d, while no results were given for \u201cphotos of my male friends\u201d. The reason was that Facebook assumed \u201cmale\u201d was a typo for the word \u201cfemale\u201d. To make things more problematic, if you started typing \u201cphotos of my female friends\u201d in the search box on Facebook, then the algorithm offered to complete your search with \u201cin bikinis\u201d or \u201cat the beach\u201d.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Facebook defended itself by saying that the autocomplete function simply offers the terms most searched for by users. 
But that\u2019s a troubling answer: by presenting such an automated function as neutral, Facebook was reproducing the sexist behaviour of many of its users, automating and reinforcing it, and offering it as the normal thing to do even to users who had shown no sexist bias.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The last two current examples in the OASI Register help show how widespread gender discrimination is in the algorithmic field. One of them involves another of the big tech companies, Microsoft, which in 2016 launched a <\/span><a href=\"https:\/\/airtable.com\/shrsAN2oTf68kM6O9\/tblG2604tSoMOcwWX\/viwoQmFjcPVR6VOSE\/recFoU9md8sYEDa08?backgroundColor=teal&amp;viewControls=on\"><span style=\"font-weight: 400;\">Twitter account named Tay and managed by an algorithmic system<\/span><\/a><span style=\"font-weight: 400;\">. Tay was supposed to talk like a teenage girl and it was meant to learn as it interacted with people on social media.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">At the beginning, things looked OK and Tay seemed to talk like a teenage girl. However, after just a few hours of use, Tay started posting messages that were explicitly sexist, as well as racist and anti-Semitic. Microsoft\u2019s answer was along the lines of that of Facebook described above. Microsoft said that the algorithm responded that way because human Twitter users were talking to Tay in a sexist, racist and anti-Semitic way. That again raises the question of how to deal with algorithms that aim to be neutral but promptly start replicating abusive human behaviour because such behaviour is very common online.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The final case is a cautionary tale. 
In 2013, two organisations working on homelessness launched <\/span><a href=\"https:\/\/airtable.com\/shrsAN2oTf68kM6O9\/tblG2604tSoMOcwWX\/viwoQmFjcPVR6VOSE\/recZlV0vxBkcPWCi3?backgroundColor=teal&amp;viewControls=on\"><span style=\"font-weight: 400;\">VI-SPDAT<\/span><\/a><span style=\"font-weight: 400;\">, an algorithmic system designed to help civil servants or community workers in the process of providing housing services to homeless people. The algorithm, which was free to use, was intended as the first step in the complex process of assessing each individual case.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">However, in the real world, where civil servants and NGO workers are overwhelmed and work under a lot of stress, VI-SPDAT was mostly used as the only step in deciding whether or not to offer housing to someone. That turned the whole process into an automatised one that provided standard answers to very diverse cases, and which also tended to discriminate against women (and other disadvantaged groups), as other algorithms tend to do. In December 2020, the developers of VI-SPDAT said they regretted its misuse and that they would stop supporting the algorithm. However, all the organisations that had downloaded it and were using it remain free to keep doing so on their own.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Gender discrimination is probably the most prevalent form of discrimination in the algorithmic field, and that\u2019s because it is probably the most prevalent form of discrimination in the human world, from which AI and algorithms come. 
And, as happens with gender, algorithmic systems may reproduce and reinforce other unjust and discriminatory biases present in the way human societies behave and in all the data we\u2019ve been generating.<\/span><\/p>\n<p><a href=\"https:\/\/www.eticasconsulting.com\/guide-to-algorithmic-auditing\/\"><span style=\"font-weight: 400;\">Algorithms should be explainable<\/span><\/a><span style=\"font-weight: 400;\">, so that all of us affected by their use can understand how the algorithms work, and so that we can hold those responsible to account. <\/span><a href=\"https:\/\/eticasfoundation.org\/oasi\/\"><span style=\"font-weight: 400;\">Eticas Foundation\u2019s OASI project<\/span><\/a><span style=\"font-weight: 400;\"> is an effort in that direction. On the OASI pages you can read more <\/span><a href=\"https:\/\/eticasfoundation.org\/oasi\/what-are-algorithms\/\"><span style=\"font-weight: 400;\">about algorithms<\/span><\/a><span style=\"font-weight: 400;\"> and their <\/span><a href=\"https:\/\/eticasfoundation.org\/oasi\/social-impact-algorithms\/\"><span style=\"font-weight: 400;\">social impact<\/span><\/a><span style=\"font-weight: 400;\">, and in <\/span><a href=\"https:\/\/eticasfoundation.org\/oasi\/register\/\"><span style=\"font-weight: 400;\">the Register <\/span><\/a><span style=\"font-weight: 400;\">you can browse an ever-growing list of algorithms sorted by different categories. And if you know about an algorithmic system that\u2019s not in the Register and you think it should be added, <\/span><a href=\"https:\/\/eticasfoundation.org\/oasi\/tell-us-about-an-algorithm\/\"><span style=\"font-weight: 400;\">please let us know<\/span><\/a><span style=\"font-weight: 400;\">.<\/span><\/p>","protected":false},"excerpt":{"rendered":"<p>Throughout history, and still today, there have been large groups of society that have been or continue to be discriminated against for reasons such as color, origin, beliefs, sexual orientation or gender. 
One of the most salient and established kind of discrimination is the one affecting half of the human population: women and girls. The [&hellip;]<\/p>","protected":false},"author":8,"featured_media":4491,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[19,17],"tags":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v19.5.1 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Gender discrimination in the algorithmic field: a look at the algorithms in the OASI Register - Eticas Foundation<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/dev.eticasfoundation.org\/es\/gender-discrimination-in-the-algorithmic-field-a-look-at-the-algorithms-in-the-oasi-register\/\" \/>\n<meta property=\"og:locale\" content=\"es_ES\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Gender discrimination in the algorithmic field: a look at the algorithms in the OASI Register - Eticas Foundation\" \/>\n<meta property=\"og:description\" content=\"Throughout history, and still today, there have been large groups of society that have been or continue to be discriminated against for reasons such as color, origin, beliefs, sexual orientation or gender. One of the most salient and established kind of discrimination is the one affecting half of the human population: women and girls. 
The [&hellip;]\" \/>\n<meta property=\"og:url\" content=\"https:\/\/dev.eticasfoundation.org\/es\/gender-discrimination-in-the-algorithmic-field-a-look-at-the-algorithms-in-the-oasi-register\/\" \/>\n<meta property=\"og:site_name\" content=\"Eticas Foundation\" \/>\n<meta property=\"article:published_time\" content=\"2022-03-14T16:25:04+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/dev.eticasfoundation.org\/wp-content\/uploads\/2022\/03\/OASI-gender_Mesa-de-trabajo-1.png\" \/>\n\t<meta property=\"og:image:width\" content=\"2134\" \/>\n\t<meta property=\"og:image:height\" content=\"1600\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Nour Salih\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Escrito por\" \/>\n\t<meta name=\"twitter:data1\" content=\"Nour Salih\" \/>\n\t<meta name=\"twitter:label2\" content=\"Tiempo de lectura\" \/>\n\t<meta name=\"twitter:data2\" content=\"11 minutos\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/dev.eticasfoundation.org\/gender-discrimination-in-the-algorithmic-field-a-look-at-the-algorithms-in-the-oasi-register\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/dev.eticasfoundation.org\/gender-discrimination-in-the-algorithmic-field-a-look-at-the-algorithms-in-the-oasi-register\/\"},\"author\":{\"name\":\"Nour Salih\",\"@id\":\"https:\/\/dev.eticasfoundation.org\/#\/schema\/person\/3b4e17941f5b7d8e5a7a633164309942\"},\"headline\":\"Gender discrimination in the algorithmic field: a look at the algorithms in the OASI 
Register\",\"datePublished\":\"2022-03-14T16:25:04+00:00\",\"dateModified\":\"2022-03-14T16:25:04+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/dev.eticasfoundation.org\/gender-discrimination-in-the-algorithmic-field-a-look-at-the-algorithms-in-the-oasi-register\/\"},\"wordCount\":2260,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\/\/dev.eticasfoundation.org\/#organization\"},\"articleSection\":[\"Algorithms\",\"Gender\"],\"inLanguage\":\"es\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\/\/dev.eticasfoundation.org\/gender-discrimination-in-the-algorithmic-field-a-look-at-the-algorithms-in-the-oasi-register\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/dev.eticasfoundation.org\/gender-discrimination-in-the-algorithmic-field-a-look-at-the-algorithms-in-the-oasi-register\/\",\"url\":\"https:\/\/dev.eticasfoundation.org\/gender-discrimination-in-the-algorithmic-field-a-look-at-the-algorithms-in-the-oasi-register\/\",\"name\":\"Gender discrimination in the algorithmic field: a look at the algorithms in the OASI Register - Eticas 
Foundation\",\"isPartOf\":{\"@id\":\"https:\/\/dev.eticasfoundation.org\/#website\"},\"datePublished\":\"2022-03-14T16:25:04+00:00\",\"dateModified\":\"2022-03-14T16:25:04+00:00\",\"breadcrumb\":{\"@id\":\"https:\/\/dev.eticasfoundation.org\/gender-discrimination-in-the-algorithmic-field-a-look-at-the-algorithms-in-the-oasi-register\/#breadcrumb\"},\"inLanguage\":\"es\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/dev.eticasfoundation.org\/gender-discrimination-in-the-algorithmic-field-a-look-at-the-algorithms-in-the-oasi-register\/\"]}]},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/dev.eticasfoundation.org\/gender-discrimination-in-the-algorithmic-field-a-look-at-the-algorithms-in-the-oasi-register\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/dev.eticasfoundation.org\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Gender discrimination in the algorithmic field: a look at the algorithms in the OASI Register\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/dev.eticasfoundation.org\/#website\",\"url\":\"https:\/\/dev.eticasfoundation.org\/\",\"name\":\"Eticas Foundation\",\"description\":\"Society \u00b7 Technology \u00b7 Responsibility\",\"publisher\":{\"@id\":\"https:\/\/dev.eticasfoundation.org\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/dev.eticasfoundation.org\/?s={search_term_string}\"},\"query-input\":\"required name=search_term_string\"}],\"inLanguage\":\"es\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/dev.eticasfoundation.org\/#organization\",\"name\":\"Eticas 
Foundation\",\"url\":\"https:\/\/dev.eticasfoundation.org\/\",\"sameAs\":[],\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"es\",\"@id\":\"https:\/\/dev.eticasfoundation.org\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/eticasfoundation.org\/wp-content\/uploads\/2020\/04\/eticas-foundation-logo-grande.png\",\"contentUrl\":\"https:\/\/eticasfoundation.org\/wp-content\/uploads\/2020\/04\/eticas-foundation-logo-grande.png\",\"width\":750,\"height\":400,\"caption\":\"Eticas Foundation\"},\"image\":{\"@id\":\"https:\/\/dev.eticasfoundation.org\/#\/schema\/logo\/image\/\"}},{\"@type\":\"Person\",\"@id\":\"https:\/\/dev.eticasfoundation.org\/#\/schema\/person\/3b4e17941f5b7d8e5a7a633164309942\",\"name\":\"Nour Salih\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"es\",\"@id\":\"https:\/\/dev.eticasfoundation.org\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/c09ca17aa5e384bf61e21ecd1b91b3c1?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/c09ca17aa5e384bf61e21ecd1b91b3c1?s=96&d=mm&r=g\",\"caption\":\"Nour Salih\"},\"url\":\"https:\/\/dev.eticasfoundation.org\/es\/author\/nour\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
Written by Nour Salih · Published 14 March 2022 · Reading time: 11 minutes