Reflect on the experiences and insights shared by Chancey Fleet and Max Evans in their talks. What does Chancey mean by “dark patterns in accessibility?”
Chancey describes a “dark pattern” as “[something that] is designed to exclude you. It keeps you from collaborating; it slows you down. It’s demoralizing. Sometimes it means that you can’t pursue your education and you can’t pursue your job. So I think even when it’s a question of neglect rather than malice, when tech tools are designed without being accessible, you know, that is a genuine dark pattern.”
Chancey shines a light on one of Twitter’s dark patterns: there is a way for a user to describe images in a text field called “alt text,” but it is disabled by default. Chancey goes on to say, “You have to know already that you want to be an ally and go to settings and enable Image Descriptions for people who are blind or visually impaired.” This shows how Twitter, even while offering the option, enables inaccessibility and limits awareness of blind and visually impaired users. Twitter gives users the “option” to be an ally, and still that option is nearly hidden.
Chancey says, “I wonder what would happen if, not only it was displayed by default, but it was a required field and you couldn’t tweet until you described. I think we’d see a very different Twitter landscape very quickly if people felt that it was an expectation and a need to make everything described, and it also helps with discoverability and searchability. So everybody wins.”
I agree with Chancey, and I had not noticed this dark pattern on Twitter before; that is exactly the problem.
What does Max mean by companies that “cede their authority to the algorithm?”
Max uses the example of how, before his top surgery, he would never have been able to post a picture of his bare chest, specifically the nipple, without getting flagged or banned. Post-surgery, Max posts a picture without his shirt on, with the “same nipple just sewed back onto my body,” and receives no notice from Instagram to take it down.
Max notes that big tech companies will create an algorithm they don’t fully understand, “unleash it onto the world,” and then, when harmful consequences arise, blame the algorithm and abdicate responsibility (cede their authority to the algorithm) rather than engage constructively with the people affected and with the algorithm itself. Max says, “there’s a lot of things you can do to engage with that technology in a more responsible way.”