Book Summary: More than a Glitch – Confronting Race, Gender, and Ability Bias in Tech

Recommendation

You may be tempted to dismiss instances of machine bias as “glitches.” However, they’re structural and reflective of real-world racism, sexism and ableism, says data journalism professor Meredith Broussard. Technology should work for everyone – nobody should feel barred from using technology based on their skin color, gender, age or ability. Broussard presents several case studies of machine bias, detailing the harm it has caused in areas including policing and health care. She urges Big Tech to embrace accountability and work toward the public interest.

Take-Aways

  • Machine bias is a structural problem requiring complex solutions.
  • Machines “learn” to uphold the status quo and replicate oppressive systems.
  • Law enforcement’s faith in biased algorithms has a human cost.
  • Big Tech sometimes fails to design for disability and foster inclusive cultures.
  • Computer systems often enforce the gender binary, showing bias against LGBTQIA+ people.
  • Using AI as a medical diagnostic tool can be unhelpful and even dangerous.
  • Embed an algorithmic review process into your business model.
  • Build a better world with algorithmic auditing and accountability.

Summary

Machine bias is a structural problem requiring complex solutions.

People assume computers can solve social problems, but this isn’t always true. Machines can only calculate mathematical fairness, which is different from social fairness. A programmer may design a solution that is mathematically fair, but that doesn’t mean the resulting algorithm is free of bias or that its decisions are neutral. Programmers are humans who bring their biases, such as those rooted in racism, privilege, self-delusion and greed, to work with them. The belief that technology will solve social problems indicates “technochauvinism,” as it ignores the fact that machine bias exists and that equality often differs from justice or equity.

“Digital technology is wonderful and world-changing; it is also racist, sexist and ableist.”

People rarely build biased technology intentionally. Most engineers probably assume, incorrectly, that they’re building a “neutral” technology. As an example of machine bias, consider the video of a racist soap dispenser that went viral in 2017: The dispenser recognized only lighter-skinned hands, so it failed to dispense soap for a darker-skinned man whose hands it didn’t register as human. When technology shows biases like this, it’s often because engineers tested it on a homogeneous group of subjects. Machine bias isn’t a “glitch” – it’s a structural problem demanding complex solutions.

Machines “learn” to uphold the status quo and replicate oppressive systems.

When a machine learns, it doesn’t do so as a human would; rather, it detects and replicates patterns in data. If an algorithmic system is “trained” on data that reflect racist policies and actions, it will replicate those patterns, maintaining the status quo. For example, if you train an AI model on real data about past loan recipients in the United States, the model will continue to give Black and Brown people fewer loans, perpetuating the financial services industry’s history of bias.
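This pattern replication is easy to demonstrate. The sketch below is illustrative only – the data and the naive nearest-neighbor “model” are invented for this summary, not drawn from the book. A biased approval rule generates the training history, and the model, which never sees the rule itself, faithfully reproduces the disparity:

```python
import random

random.seed(0)

# Synthetic "historical" loan decisions encoding a biased rule:
# group A is approved above income 45, group B only above income 60.
history = []
for _ in range(1000):
    group = random.choice(["A", "B"])
    income = random.gauss(50, 10)
    approved = income > (45 if group == "A" else 60)
    history.append((group, income, approved))

def predict(group, income, k=25):
    """Approve if most of the k most similar past applicants
    from the same group were approved."""
    nearest = sorted((abs(income - i), a)
                     for g, i, a in history if g == group)[:k]
    return sum(a for _, a in nearest) > k / 2

def approval_rate(group):
    """Average predicted approvals over incomes 40-70."""
    return sum(predict(group, inc) for inc in range(40, 71)) / 31

print(f"group A: {approval_rate('A'):.0%}")  # high
print(f"group B: {approval_rate('B'):.0%}")  # markedly lower
```

Nothing in the model references the biased threshold directly; it simply mirrors the patterns in its training data – which is exactly the book’s point.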

“Let’s not trust in it when such trust is unfounded and unexamined. Let’s stop ignoring discrimination inside technical systems.”

When giving AI models statistical data, one must remember that the statistical methodologies people rely on today were developed by the outspoken eugenicists and racists Karl Pearson, Ronald Fisher and Francis Galton. Tukufu Zuberi, a sociologist at the University of Pennsylvania, calls for a “deracialization” of statistical analysis. Zuberi says it’s time to replace studies that use racial data without context with studies that more fully capture all dimensions of identity and the broader social circumstances of people’s lives. In a racist society, simply saying “I’m not racist” isn’t enough. Endeavor to be anti-racist, which entails critically examining your assumptions about race and working to eliminate racist practices when you see them. Embedding anti-racism into machine learning systems means going beyond implementing mathematically fair solutions. It requires actively building technologies that challenge current systems of oppression and end white supremacy.

Law enforcement’s faith in biased algorithms has a human cost.

Facial recognition technology (FRT) relies on biased algorithms, so when law enforcement uses it, harm can result. The technology works better on people with lighter skin tones than on those with darker tones. FRT is also better at recognizing men than women, and it frequently misgenders nonbinary, trans and gender-nonconforming people. FRT doesn’t calculate definite matches; it only detects similarities. Yet police use it to make arrests, routinely deploying it against the Black and Brown communities for whom the technology works worst. Although humans check the accuracy of the matches, this doesn’t prevent bias: Humans have biases of their own and can confirm a false match – for example, if they believe all people of a certain race look alike. In multiple cases, police using FRT to identify suspects have arrested Black people for crimes they didn’t commit.

“Intelligence-led policing is not data science. It’s not scientific. It is just an oppressive strategy that upholds white supremacy.”

Crime-predicting algorithms also show racial bias. For example, HunchLab, crime-predicting software developed in Philadelphia, maps the neighborhoods where it claims crime is most likely to occur, prompting the Philadelphia Police to maintain a greater presence in those areas in the name of public safety. Often, these maps depict communities where Black people live, and increased police patrols can actually make communities less safe, given the prevalence of police brutality against Black people in the United States. HunchLab’s maps don’t send patrols to neighborhoods where white-collar financial crimes, such as tax evasion, occur. A counter-mapping project called “White Collar Crime Risk Zones” locates those crimes in heavily white areas, such as Manhattan’s Wall Street, exposing a bias in which types of crime police consider worth policing. Police forces across the United States may claim to be doing intelligence-led policing, but they’re presenting a false image of objectivity to maintain the status quo.

Big Tech sometimes fails to design for disability and foster inclusive cultures.

People with disabilities need an outlet for their concerns in today’s discussions about equity and technology. For a start, tech companies could use appropriate language when referring to users with disabilities. For example, tech companies may refer to Deaf people as hearing-impaired, but calling someone impaired can make them feel broken when they’re simply different from those the companies perceive as typical users. Richard Dahan, who used to work at a Maryland Apple Store, found that Apple failed to honor its commitments to inclusivity: His manager didn’t properly accommodate his needs as a Deaf person (for example, by making information accessible via a video interpreter) and allowed customers to refuse to be served by him because he was Deaf. His story, shared through the viral #AppleToo movement, highlighted disability bias within Apple’s culture.

“Technology is a crucial component of accessibility, but more of what is currently called cutting-edge technology is not necessarily the answer.”

Big Tech often uses images of people with disabilities to promote “inclusive” technologies, even when those technologies are rarely accessible. For example, there have been reports of delivery robots making life harder for people with disabilities, such as blocking the path of a Blind woman and her guide dog. Yet “inspiration porn” is rampant: Activist Stella Young coined the term for imagery that showcases people with disabilities in order to inspire people without them, as if simply existing with a disability were an inspirational feat. Rather than objectify people with disabilities, Elise Roy, a Deaf “human-centered designer,” urges people to start designing for disability. Universal design benefits everyone, not just people with disabilities, she stresses, because it yields innovative solutions that make technology more user-friendly. Ruha Benjamin, in her book Race After Technology, takes the conversation a step further, calling for “design justice”: People from marginalized communities should lead design initiatives, and design shouldn’t reproduce structural inequalities.

Computer systems often enforce the gender binary, showing bias against LGBTQIA+ people.

Technology often enforces the gender binary, displaying bias against nonbinary, trans and gender-nonconforming individuals. Most computer systems today encode gender as a fixed, rather than changeable, binary value. The way computers encode gender hasn’t changed much since 1951, when UNIVAC, an early commercial computer used for large-scale data processing, gave people only two gender options (M/F). In many ways, this 1950s ideology persists thanks to “hegemonic cis-normativity and math”: When doing data analysis, it’s easier to put people into clean, simple categories than to embrace their complex, shifting identities, so programmers default to a binary representation of gender in their code.

“The rigid requirements of ‘elegant code’ enforce cis-normativity, the assumption that everyone’s gender identity is the same as the sex they were assigned at birth.”

Big Tech companies that claim to support people with LGBTQIA+ identities often fail to make their algorithms inclusive. For example, trans people using Google Photos often find that the software identifies them as different people before and after their transition. Facebook was among the first social media companies to let users change their gender identity, but its system still didn’t encode nonbinary identities, storing only male, female or null. Rather than “nullify” those who don’t fit the gender binary, it’s time to extensively update computer systems to correct past biases.
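As a hypothetical sketch of the encoding problem (the class and field names below are invented for illustration, not drawn from any real system or from the book), compare a rigid two-value field with a representation that treats gender as optional, self-described and changeable over time:

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional

# The rigid 1950s-style pattern the book critiques: gender as a
# fixed, two-value field with no way to change it or opt out.
class BinaryGender(Enum):
    M = "male"
    F = "female"

# One more flexible alternative: optional, free-text and mutable,
# with prior values retained so a transition isn't erased or "nullified".
@dataclass
class UserProfile:
    name: str
    gender: Optional[str] = None  # self-described, may be absent
    gender_history: List[Optional[str]] = field(default_factory=list)

    def update_gender(self, new_value: Optional[str]) -> None:
        self.gender_history.append(self.gender)
        self.gender = new_value

profile = UserProfile("Sam", gender="nonbinary")
profile.update_gender("genderfluid")
print(profile.gender)          # genderfluid
print(profile.gender_history)  # ['nonbinary']
```

The design choice the sketch highlights is the one the summary describes: the bias lives in the schema itself, so inclusivity requires changing how the data is modeled, not just how it is displayed.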

Using AI as a medical diagnostic tool can be unhelpful and even dangerous.

Some envision a future in which AI diagnostics replace those of human doctors, but research suggests AI has a long way to go (if such a future is even possible). In a 2021 study of three deep learning models, researchers found that each AI performed well as a diagnostic tool only on data from the specific hospital it was trained on (the National Institutes of Health Clinical Center, Stanford Health Care and Beth Israel Deaconess Medical Center). When researchers introduced data from different hospitals, the results went “haywire.” Even in hospitals that use diagnostic AI, doctors often ignore it. For example, a study of AI use among diagnostic radiologists found that doctors making bone-age determinations and diagnosing breast cancer viewed the AI results as “opaque and unhelpful.”

“The fantasy of computerized medicine sounds a lot like the fantasy of the self-driving car: fascinating, but impractical.”

AI diagnostic tools aren’t just unreliable – they can also display harmful bias. For example, Google launched an AI-powered dermatology tool in 2021 designed to help people detect various skin issues. Google had its own goals when releasing the tool, as it wanted to motivate users to search more. The tool contained bias, as Google trained its AI on images from patients in only two US states, and the vast majority of these images featured patients with light and medium-toned skin, while only 3.5% featured patients with darker-toned skin. Given that skin cancer appears different in people with different skin tones, rolling out a product without training the AI model on a wide range of skin tones demonstrates a lack of concern for the public interest.

Embed an algorithmic review process into your business model.

Moving forward, organizations should embrace the following organizational process to embed responsible AI into their business processes:

  1. Take inventory – Examine your company’s algorithms and system vendors.
  2. Audit a single algorithm – Just as engineers inspect roads and bridges, you must audit your algorithms; start by auditing one.
  3. Remediate “harms” – If you discover biases, take corrective action and reshape your business processes to avoid future harm.
  4. Learn from your mistakes – Learn from this process, proactively looking for similar problems.
  5. Make algorithmic review an ongoing process – Auditing may have less marketing hype than innovation, but it’s a vital business process. Update your business processes, ensuring you have funding in place for infrastructure – not just innovation.
  6. Repeat these steps – Audit more algorithms, remediating harms where necessary. Work with people from diverse backgrounds, as people with varying perspectives will be better equipped to flag potential issues.
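As a minimal illustration of the auditing step, an audit might compare an algorithm’s positive-outcome rates across demographic groups. The sketch below is a simplified, hypothetical check (not the book’s methodology): it flags any group whose rate falls below 80% of the best-served group’s, one common yardstick known as the “80% rule”:

```python
def audit_outcomes(decisions):
    """decisions: iterable of (group, got_positive_outcome) pairs.
    Returns per-group positive rates and any groups flagged under the
    80% rule (rate below 0.8x the best-served group's rate)."""
    totals, positives = {}, {}
    for group, positive in decisions:
        totals[group] = totals.get(group, 0) + 1
        positives[group] = positives.get(group, 0) + int(positive)
    rates = {g: positives[g] / totals[g] for g in totals}
    best = max(rates.values())
    flagged = {g: r for g, r in rates.items() if r < 0.8 * best}
    return rates, flagged

# Hypothetical decisions: group A gets 80% positives, group B only 50%.
rates, flagged = audit_outcomes(
    [("A", True)] * 80 + [("A", False)] * 20 +
    [("B", True)] * 50 + [("B", False)] * 50
)
print(rates)    # {'A': 0.8, 'B': 0.5}
print(flagged)  # {'B': 0.5}, since 0.5 < 0.8 * 0.8
```

Real audits examine far more than a single rate ratio, but even a check this simple makes disparities visible, repeatable and trackable over time – which is what embedding review into a business process requires.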

Build a better world with algorithmic auditing and accountability.

Tech companies often use the “lean start-up” frame, searching for pain points, solutions and target markets on their journey toward scaling up. When tackling complicated problems, organizations would be better served by the frame of “public interest technology.” Those creating public interest technology aspire to work toward the public good and advance collective well-being.

“Algorithmic auditing and algorithmic accountability reporting are two strains of public interest technology that I think show the most promise for remedying algorithmic harms along the axes of race, gender and ability.”

In particular, creating public interest technology that supports algorithmic accountability reporting and auditing could remedy some harms caused by machine bias. In the United States, new bills limit the activities of Big Tech, such as the Algorithmic Accountability Act of 2022, which would legally require companies to check algorithms for both effectiveness and bias before introducing them to the public. It’s time to forgo technochauvinist optimism for a more discerning, critical view of technology while working toward a more just, equitable world.

About the Author

Meredith Broussard is a data journalist and the author of multiple books, including Artificial Unintelligence: How Computers Misunderstand the World. She’s also an associate professor at New York University’s Arthur L. Carter Journalism Institute and a research director at the NYU Alliance for Public Interest Technology.

Review 1

More than a Glitch is a book about the ways in which race, gender, and ability bias are encoded in technology. Broussard, a data journalist and computer scientist, argues that these biases can have a profound impact on people’s lives, from the way they are treated by algorithms to the opportunities they have to succeed.

The book is divided into three parts:

  • Part One: The Problem introduces the concept of bias in technology and provides examples of how it can manifest.
  • Part Two: The Solutions explores the different ways to address bias in technology.
  • Part Three: The Future looks at the challenges and opportunities that lie ahead in the fight against bias in technology.

More than a Glitch is an important and timely book. It provides a much-needed overview of the problem of bias in technology and offers concrete solutions for addressing it. Broussard’s writing is clear and engaging, and she does a great job of making complex concepts accessible to a general audience.

Here are some of the things that I liked about the book:

  • Broussard does an excellent job of explaining complex concepts in a way that is easy to understand.
  • She provides a comprehensive overview of the problem of bias in technology.
  • She offers concrete solutions for addressing bias in technology.
  • Her writing is clear, engaging, and persuasive.

Here are some of the things that I didn’t like about the book:

  • The book can be a bit dense at times.
  • Some of the solutions that Broussard proposes are not yet feasible.
  • The book could have been more international in scope.

Overall, I thought More than a Glitch was an important and timely book. It provides a much-needed overview of the problem of bias in technology and offers concrete solutions for addressing it. I would highly recommend it to anyone who is interested in this topic.

Here are some additional thoughts on the book:

  • Broussard does a great job of showing how bias can be encoded in technology in both intentional and unintentional ways. She also shows how bias can have a disproportionate impact on people of color, women, and people with disabilities.
  • I appreciate that Broussard does not shy away from the challenges of addressing bias in technology. She acknowledges that it is a complex problem with no easy solutions. However, she also provides concrete examples of how progress is being made.
  • I think More than a Glitch is an important book for anyone who cares about the future of technology. It is a wake-up call that we need to do more to address the problem of bias in technology if we want to create a more equitable and just society.

Review 2

“More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech” by Meredith Broussard is a thought-provoking and timely book that sheds light on the pervasive biases present in the tech industry. Broussard, an expert in data journalism, delves into the intersectionality of race, gender, and ability within the tech sector and offers insights on how to address these biases.

One of the book’s main strengths is its extensive research and analysis. Broussard provides a comprehensive examination of the biases that exist in tech, drawing from a wide range of sources including academic studies, industry reports, and personal anecdotes. This evidence-based approach lends credibility to her arguments and helps readers understand the gravity of the issues at hand.

Broussard’s writing style is engaging and accessible, making complex concepts understandable for readers with varying levels of familiarity with the tech industry. She breaks down technical jargon and presents her ideas in a clear and concise manner, allowing readers to grasp the key points without feeling overwhelmed.

What sets “More than a Glitch” apart is its focus on practical solutions. Broussard goes beyond simply highlighting the problems and provides actionable steps that individuals and organizations can take to address bias in tech. Whether it’s advocating for inclusive hiring practices, promoting diverse representation, or reimagining algorithms, Broussard offers tangible strategies for creating a more equitable industry.

The book’s emphasis on intersectionality is particularly noteworthy. Broussard highlights how bias in tech is not limited to just one dimension but often intersects with race, gender, and ability. By exploring these intersections, she brings awareness to the complex and nuanced challenges faced by marginalized communities in the tech sector.

One area where the book could have been stronger is in providing more in-depth case studies and examples. While Broussard incorporates real-life anecdotes, a deeper exploration of specific instances and their consequences could have added further impact to her arguments.

In conclusion, “More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech” is a significant contribution to the ongoing conversation surrounding bias and inequality in the tech industry. Meredith Broussard’s research-driven approach, practical solutions, and emphasis on intersectionality make this book a valuable resource for anyone interested in addressing bias and fostering a more inclusive tech sector. By shedding light on these issues and offering actionable steps, Broussard challenges readers to critically examine the biases present in tech and work towards positive change.

Review 3

More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech by Meredith Broussard is a timely and insightful book that exposes the hidden biases embedded in technology and calls for a radical redesign of our systems to create a more equitable world. Broussard, a data journalist and one of the few Black female researchers in artificial intelligence, draws on concepts from computer science and sociology to illustrate how racism, sexism, and ableism are not just incidental errors, but structural features of the algorithms that shape our lives. She provides a range of examples, such as facial recognition technology that fails to recognize darker skin tones, mortgage-approval algorithms that promote discriminatory lending, and medical diagnostic algorithms that produce dangerous feedback loops when trained on insufficiently diverse data. She argues that the solution is not merely to make tech more inclusive, but to eliminate the algorithms that target certain demographics as “other” in the first place. With implications for fields from law to medicine, this book is essential reading for anyone who cares about building a more just and fair future.

Review 4

“More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech” by Meredith Broussard is a thought-provoking and eye-opening examination of the pervasive biases embedded in technology and their far-reaching impacts on society. Broussard’s book offers a critical analysis of the tech industry’s shortcomings in addressing issues of race, gender, and ability, while also providing actionable insights for creating more inclusive and equitable technological solutions.

Broussard’s exploration of bias in technology is a central theme of the book. She skillfully uncovers how algorithms and AI systems can perpetuate existing inequalities and reinforce harmful stereotypes. By dissecting real-world examples and case studies, she demonstrates how biases can manifest in tech products and services, leading to discriminatory outcomes.

The book’s emphasis on intersectionality is a strong point. Broussard delves into the interconnected nature of bias, highlighting how marginalized communities often face compounding effects of discrimination. Her insights into the ways in which biases intersect and amplify each other provide readers with a deeper understanding of the complexities at play.

One of the book’s key strengths lies in its actionable recommendations. Broussard doesn’t just uncover problems; she offers tangible solutions for individuals, organizations, and the tech industry as a whole. Her call for more diverse representation in tech, algorithmic transparency, and ethical considerations provides a roadmap for creating more inclusive and equitable technological systems.

Broussard’s writing is engaging and well-researched, making complex concepts accessible to readers. She uses clear language and relatable examples to illustrate her points, ensuring that readers can grasp the significance of bias in technology. The book’s structure, with a mix of analysis and practical suggestions, keeps readers engaged from start to finish.

In conclusion, “More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech” is an essential read for anyone interested in understanding the hidden biases present in technology and their societal implications. Meredith Broussard’s insightful analysis, combined with actionable recommendations, provides readers with a deeper awareness of the need for change in the tech industry. This book serves as a wake-up call and a call to action for creating a more just and inclusive technological landscape.

Review 5

“More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech” by Meredith Broussard is a compelling and eye-opening exploration of the biases and inequalities prevalent in the tech industry. Broussard’s expertise in both technology and social issues shines through as she delves into the complex intersection of race, gender, and ability within the tech world, and offers insightful analysis and practical solutions.

One of the book’s notable strengths is its comprehensive examination of the various biases that exist in the tech industry. Broussard unpacks the ways in which race, gender, and ability impact access, representation, and opportunities in the field. She presents a compelling case for why these biases are not mere glitches but deeply ingrained structural issues that need to be addressed. By providing concrete examples and research-backed evidence, the author highlights the urgency of confronting these biases and their detrimental effects.

The book offers a balanced perspective, combining personal narratives, case studies, and data-driven analysis to support its arguments. Broussard’s writing is accessible and engaging, making the book suitable for both tech industry insiders and general readers interested in understanding the societal implications of technology. The author’s ability to translate complex concepts into relatable language ensures that readers can grasp the significance of the issues at hand.

Additionally, “More than a Glitch” goes beyond simply highlighting problems by offering actionable solutions. Broussard proposes strategies to mitigate bias in hiring and promotion practices, foster inclusive work environments, and create more equitable systems within the tech industry. The book encourages readers to take responsibility for challenging and dismantling biased systems, and provides valuable insights on how individuals, organizations, and society as a whole can work towards a more inclusive and just tech landscape.

Furthermore, Broussard does an excellent job of emphasizing the intersectionality of biases in tech. She highlights the experiences of individuals who face multiple forms of discrimination, shedding light on the unique challenges they encounter and the need for inclusive solutions that address the complexities of intersecting identities. This intersectional lens adds depth and nuance to the book’s analysis, making it a valuable resource for understanding the broader implications of bias in the tech industry.

One potential limitation of the book is that it primarily focuses on the biases and challenges within the tech industry, and may not provide as much in-depth analysis or solutions outside of that specific context. However, the insights and strategies presented can still be relevant and applicable to other fields grappling with similar issues of bias and inequality.

In conclusion, “More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech” is a thought-provoking and informative book that tackles the biases and inequalities prevalent in the tech industry. Meredith Broussard’s expertise and comprehensive analysis make this book a vital resource for individuals and organizations seeking to understand and address the intersectional biases that perpetuate inequalities in technology.

Review 6

This book examines biases against marginalized groups in the technology industry and how tech can help reduce broader social inequities.

The author, a data journalist, argues that tech products and systems reflect the perspectives and limitations of their creators, resulting in biases against women, people of color and those with disabilities. She discusses examples of algorithms discriminating against minorities in hiring and criminal justice systems.

The book highlights several approaches to counter these biases, including increasing diversity in tech workforces, implementing ethical design practices, auditing algorithms for fairness, and developing technologies that empower underserved communities. Broussard advocates for a “just tech” movement that centers equity and values over narrow profit motives.

The book’s strengths lie in its wealth of insights from technologists, researchers and people harmed by biased systems. However, some proposals for reforming the tech industry may seem vague at times.

Overall, More than a Glitch makes a compelling case that tackling bias in tech is necessary both to improve products and to build a more just society. The book serves as an important wake-up call for the industry to recognize its blind spots and work toward inclusive innovation that benefits all people.

In summary, the book reveals racial, gender and ability biases endemic in today’s technology, argues that reforming the industry requires centering equity and broad social responsibility, and calls for a “just tech” movement to confront these entrenched biases and create technologies that empower historically marginalized groups.

Review 7

“More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech” by Meredith Broussard is a compelling and timely exploration of the biases that exist in the tech industry, particularly in relation to race, gender, and ability. Through insightful analysis and real-world examples, Broussard highlights the systemic issues that perpetuate bias and offers strategies for creating a more inclusive and equitable tech industry.

The book’s strength lies in its comprehensive examination of the various forms of bias that marginalized groups face in the tech world. Broussard delves into the intersectionality of race, gender, and ability, shedding light on the unique challenges experienced by individuals who are often underrepresented in the industry. Her analysis is well-researched and supported by compelling evidence, providing a thorough understanding of the complexities of bias in tech.

Broussard’s writing style is accessible and engaging, making the book suitable for both tech professionals and general readers. She effectively breaks down complex concepts and presents them in a clear and concise manner. The author’s passion for the subject matter shines through, and her commitment to addressing bias and promoting inclusivity is evident throughout the book.

One of the book’s strengths is its emphasis on actionable solutions. Broussard goes beyond simply highlighting the problems and offers practical strategies for confronting and mitigating bias in the tech industry. She explores the importance of diverse representation, inclusive hiring practices, and the need for comprehensive training and education. These recommendations provide readers with tangible steps they can take to effect change in their own organizations and communities.

While the book provides a comprehensive overview of bias in tech, some readers may find that certain topics or experiences are not covered in depth. Given the breadth and complexity of the subject matter, it is understandable that not every aspect can be thoroughly explored within the book’s scope. However, Broussard’s analysis serves as an excellent starting point for further exploration and discussion.

In conclusion, “More than a Glitch” is a thought-provoking and insightful book that exposes the biases present in the tech industry and offers strategies for creating a more inclusive and equitable future. Meredith Broussard’s well-researched analysis and actionable recommendations make this book a valuable resource for anyone interested in understanding and addressing bias in tech. By confronting these issues head-on and working towards a more inclusive industry, we can create a tech landscape that embraces diversity and fosters innovation.

Review 8

“More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech” by Meredith Broussard is a thought-provoking and insightful book that examines the various forms of bias that exist in the tech industry. Broussard, a professor of computer science and journalism, leverages her extensive research and personal experiences to shed light on the systemic issues that affect diverse groups in tech, including women, people of color, and individuals with disabilities.

The book is divided into three parts. The first part provides an overview of the tech industry’s history, culture, and structures, highlighting the ways in which bias has been embedded in the sector from its inception. Broussard skillfully explains how the tech industry’s reliance on algorithms, AI, and machine learning can perpetuate and amplify existing biases, leading to a lack of diversity and inclusivity in the workplace.

The second part of the book delves into the various forms of bias that exist in tech, including implicit bias, confirmation bias, and cultural bias. Broussard offers concrete examples and case studies to illustrate how these biases manifest in the workplace, from sexist and racist hiring practices to inaccessible technology design. This section is particularly insightful, as it highlights the ways in which well-intentioned individuals can perpetuate bias without realizing it.

The third part of the book offers solutions and strategies for confronting and overcoming bias in tech. Broussard provides actionable advice for individuals and organizations looking to create a more inclusive and diverse tech industry. She emphasizes the importance of diversity and inclusion initiatives, mentorship programs, and diversity and inclusion training.

Throughout the book, Broussard’s writing is clear, concise, and accessible. She uses relatable anecdotes and real-life examples to make the book engaging and easy to understand. The book’s strength lies in its ability to convey complex concepts in a digestible way, making it an excellent resource for anyone interested in understanding and combating bias in tech.

The book’s weaknesses are minor. Some readers may find the book’s focus on the tech industry’s biases and shortcomings to be confronting, but Broussard’s approach is crucial in sparking necessary conversations about diversity, equity, and inclusion in tech.

In conclusion, “More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech” is an essential read for anyone interested in creating a more inclusive and diverse tech industry. Broussard’s book offers a unique perspective on the systemic issues that affect diverse groups in tech, providing actionable advice and strategies for overcoming bias.

Overall, “More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech” is a crucial book that encourages readers to confront their biases, leading to a more inclusive and diverse tech industry.

I highly recommend this book to anyone interested in creating a better tech industry.
