
Recognition, Representation, and the Future of Tech

  • tessmack96
  • 3 days ago
  • 4 min read

Updated: 2 days ago



The key to good technology is not good software or even good design. Good technology meets the consumer where they are. In this article, we recognize two Black women who center the idea that building good technology begins with “building from the margins”: when a product anticipates the needs of the whole rather than just some, “good technology makes life richer for all.” As Google’s Director of Products for All, Annie Jean-Baptiste stresses the urgency of including the voices at the margins in the design process. Tarika Barrett, CEO of Girls Who Code, similarly understands the importance of elevating women in the AI industry, a decisive factor that may profoundly affect consumers, particularly those at the margins. Here, we celebrate these two women, who are redefining what it means to build technology that works for everyone.


(Mercury Public Affairs via AP)
(Girls Who Code via Girls Who Code website)

Designing for Recognition: Why Being Seen Matters


In technology, you have the tangible product, the functional technology, the marketing, and so on; but all of it is useless unless your vision for the product recognizes the consumer. Annie Jean-Baptiste, Director of Products for All at Google, describes “recognition” as the feeling you get when you are truly seen by a product. As a spokeswoman for inclusive and equitable design, Jean-Baptiste is helping the industry define this standard. In her role at Google, she pushes her teammates to consider the full range of human ability rather than a narrow few, enabling Google teams to expand their input beyond specific disability examples and reach broader audiences. By anticipating the ways traditionally marginalized communities may be disadvantaged, as equitable design requires, Google is able to truly delight.


Expanding Input, Expanding Impact


Many consumer products, technological or otherwise, use inclusive and equitable design in their planning. Many of you will recognize examples of both and intuitively understand the phenomenon of delight. The consumer products company OXO responded to the needs of its arthritic customers by designing kitchen tools with strategically comfortable grips. The end product delighted not just the intended customer segment; it elevated the kitchen experience for all customers. Similarly, early in her role at Google, Jean-Baptiste helped consumers understand the true meaning of nostalgia by leading teams to create camera technology that recognizes darker skin tones just as well as lighter ones.


(Joyce Kim with Pixel 6 Pro)
(Deun Ivory with Pixel 6 Pro)

The Stakes of Exclusion in AI


The phenomenon of feeling seen is just as important in the AI industry, though there, recognition produces a distinct form of satisfaction. Tarika Barrett and her team at Girls Who Code are pushing girls toward success in AI not only because women represent a disproportionately low share of the workforce (26%), but also because the more voices represented in AI-driven products, the more accurately those products can address diverse concerns. As with Google’s Pixel camera, greater diversity in AI development allows companies to serve the whole more aptly. Unlike with consumer tech offerings, however, the consequences of biased AI include denied mortgages, unfair hiring decisions, and disproportionate body image issues.


(Alexander Stein @Pixabay)

How Bias Becomes Harm


The stakes are dire. When the AI field includes only a limited range of voices, the risks include not only algorithmic bias but also epistemic exclusion and unequal impacts across diverse user groups. Barrett is doing her best to counter that narrative. At a time when 50% of women leave tech by the age of 35, Barrett and her team are making sure to “[spark] the interest of girls in junior high school, [sustain] their commitment in high school… and [inspire] college undergraduates by reframing computer curricula.” Sustained change requires women role models and a less discriminatory environment. Each step Barrett and her team take in this urgent mission brings us closer to AI solutions that serve not just some, but all.


A New Standard for Good Tech


In the tech industry, products have historically been centered on a small number of voices. There is nothing inherently wrong with that, except that when products are created with one type of audience in mind, the output will invariably fail to serve other populations, and the company will miss its opportunity to delight a larger audience. Jean-Baptiste and Barrett have both taken the opportunity to shape what it means to be a tech company in America. As we can see, by identifying the challenges facing those on the margins, we learn to build solutions that benefit all. By recognizing the industry’s true audience, Jean-Baptiste and Barrett are helping ensure that technology offers not just solutions, but richer lives for all.




(@StockCake)

Rigou, Vasia. “Annie Jean-Baptiste on Why Design Starts at the Margins.” IIDA, 26 June 2025, iida.org/articles/annie-jean-baptiste-on-why-design-starts-at-the-margins.

UC Berkeley Haas School of Business. “Inclusive by Design: The Evolution of Google’s Product Design Practices.” Berkeley Haas Case Series, cases.haas.berkeley.edu/cases/inclusive-by-design-the-evolution-of-googles-product-design-practices/.


 
 
 
