Copilot stops working on gender-related subjects
#72603
8 comments · 2 replies
-
Good luck debugging! Sometimes you can add an x_ at the front or something to get around bugs like that. Or basically create codewords/encryptions.
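For what it's worth, a minimal sketch of that codeword idea in Python. The `gndr` alias and `FIELD_ALIASES` table are hypothetical names, not an established convention; the point is just to confine the flagged word to a single lookup:

```python
# Hypothetical workaround: keep the flagged word out of identifiers so
# completions keep flowing, and decode it in exactly one place.
# "gndr" and FIELD_ALIASES are made-up names for illustration.

FIELD_ALIASES = {"gndr": "gender"}  # codeword -> real field name

def decode_fields(record: dict) -> dict:
    """Translate codeword keys back to their real names at the API boundary."""
    return {FIELD_ALIASES.get(key, key): value for key, value in record.items()}

user = {"gndr": "female", "age": 34}
print(decode_fields(user))  # {'gender': 'female', 'age': 34}
```

Keeping the real word in one small mapping module means only that one file risks losing completions.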
-
Wow, that's awful. Devs, please fix this!
-
Also frustrated by this issue. I work in the fashion industry, where demographic classifications like age and gender are integral to the domain model and entirely apolitical. The idea of "banned words" is itself biased: an American bias. It has long been established that ban-list content filtering works terribly. This artifact of the propagandized American political environment isn't necessary; funding schools and improving public literacy would do more to keep AI-generated propaganda in its place than naive measures like this...
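For context, a minimal sketch of how mundane such fields are in an apparel domain model, assuming Python; the class and field names (`Garment`, `target_gender`, and so on) are hypothetical:

```python
# Illustrative fashion-domain model; names are hypothetical.
# Demographic attributes here are plain business data, not commentary.
from dataclasses import dataclass
from enum import Enum

class Gender(Enum):
    WOMEN = "women"
    MEN = "men"
    UNISEX = "unisex"

@dataclass
class Garment:
    sku: str
    name: str
    target_gender: Gender  # sizing/cut line, standard in apparel catalogs
    target_age_min: int
    target_age_max: int

coat = Garment("FW25-001", "Wool Trench", Gender.WOMEN, 18, 99)
```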
-
This is idiotic. Technology should not be mixed with ideology. We don't care about this kind of thing while we're working, and we shouldn't have to!
-
This behavior is still happening today, and it seems like it's not going away anytime soon. So frustrating, and completely unnecessary.
-
As some people already mentioned here or here, Copilot purposely stops working on code that contains hardcoded banned words from GitHub, such as `gender` or `sex`. I am labelling this as a bug because this behavior is unexpected and undocumented.
I guess you might be embarrassed by what your AI says when it autocompletes gender-related matters, since it is probably trained on biased datasets. However, disabling autocompletion is not a great solution, since gender can be very present in:

- domain models and database schemas
- user profiles, forms, and validation code
- datasets and analytics pipelines

For all of which Copilot shutting down on most of the files is disappointing and pretty annoying.
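For concreteness, a hypothetical example of the kind of everyday code in question; reportedly, files like this stop receiving suggestions entirely:

```python
# Hypothetical profile-form validator: ordinary CRUD code in which the
# word "gender" naturally appears many times.
VALID_GENDERS = {"female", "male", "non-binary", "prefer not to say"}

def validate_profile(form: dict) -> list[str]:
    """Return validation errors for a user-profile form."""
    errors = []
    gender = form.get("gender", "").strip().lower()
    if gender not in VALID_GENDERS:
        errors.append("gender must be one of: " + ", ".join(sorted(VALID_GENDERS)))
    return errors

print(validate_profile({"gender": "Female"}))  # []
print(validate_profile({"gender": "xyz"}))     # ['gender must be one of: ...']
```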
I don't have any elegant solution in mind and I know this seems like an edge case, but I hope this will be taken into account someday.