
One of my favorite web designers, Stephanie Stimac, asked me to write the foreword for her amazing new book, Design for Developers. With her permission, and Manning’s, I’m reprinting it here.
In reading through Joe Dolson’s recent piece on the intersection of AI and accessibility, I absolutely appreciated the skepticism he has for AI in general as well as the ways in which many have been using it. In fact, I am very skeptical of AI myself, despite my role at Microsoft being that of an Accessibility Innovation Strategist helping run the AI for Accessibility grant program. As with any tool, AI can be used in very constructive, inclusive, and accessible ways and it can be used in destructive, exclusive, and harmful ones. And there are a ton of uses somewhere in the mediocre middle as well.
I love Jeremy’s proposed compromise on JavaScript in web apps:
Your app should work in a read-only mode without JavaScript.
Interesting examination of label positioning relative to checkboxes and radio controls in forms. While ostensibly web-focused, it applies equally to any GUI.
I’m very excited to see Apple roll out greater support for PWAs (though I’d bet good money on them never using that term publicly) in macOS Safari! I sincerely hope this is the beginning of many good things to come.
The old computer programming adage “garbage in, garbage out” is going to ring even more true as search engine crawlers consume more and more empty calories in the form of AI-generated bullshit and misinformation.
The question is why: why do rings of fake websites like these even exist?
Part of the answer is, of course, money. Fake websites can be used to sell real advertisements.
This is an excellent post from Steve Faulkner on some of the issues with Large Language Models like ChatGPT, especially when it comes to accessibility. He clearly outlines three key areas where we are failing:
One of the features I really love about Mastodon is their first-class Content Warning feature. With one additional step, you can add any warning of your choice to your post and it will be hidden by default, showing only the content warning text. It’s a super-simple idea, but so powerful when it comes to reducing the likelihood of causing our readers to experience the kinds of trauma that could have severe consequences.
I’m not a metal fan, but I love everything about this: Metallica’s new record, 72 Seasons, will have an ASL interpretation video for every song!
First up, the title track, “72 Seasons”:
I absolutely love Amber Galloway’s signing (and enthusiasm) on this video. Kudos to Metallica for doing this!
Why do companies release software before it’s safe? Chances are they actually consider their product to be their stock price rather than their software… yet another victim of the financialization of our economy.
One worker’s conclusion: Bard was “a pathological liar,” according to screenshots of the internal discussion. Another called it “cringe-worthy.” One employee wrote that when they asked Bard suggestions for how to land a plane, it regularly gave advice that would lead to a crash; another said it gave answers on scuba diving “which would likely result in serious injury or death.”
Google launched Bard anyway. The trusted internet-search giant is providing low-quality information in a race to keep up with the competition, while giving less priority to its ethical commitments, according to 18 current and former workers at the company and internal documentation reviewed by Bloomberg. The Alphabet Inc.-owned company had pledged in 2021 to double its team studying the ethics of artificial intelligence and to pour more resources into assessing the technology’s potential harms. But the November 2022 debut of rival OpenAI’s popular chatbot sent Google scrambling to weave generative AI into all its most important products in a matter of months.