When I was eight years old and in the third grade, my elementary school connected me and my classmates to our own Google accounts. Little did I realize then, sitting in that classroom, that this would be the start of a long and intimate collection of personal data.

By middle school, many of my classmates were dedicated Facebook, Instagram, and Snapchat users, and by high school, these networks and technologies were omnipresent in our lives. They were so omnipresent, in fact, that my school made passing a digital literacy class a graduation requirement. Importantly, I was taught that my digital footprint would affect me for life. Having inappropriate content memorialized on social media sites, for example, could prevent me from accessing educational and professional opportunities. While I took these warnings seriously, I was still ill-equipped to grasp the depth of this surveillance, and the outsize impact this information would have on my adult life.

Every person using these technologies generates a vast amount of data, whether through online shopping, social media interaction, or even physical movements captured through geolocation tools. Notably, the Federal Trade Commission recently found that retailers use this data to charge people different prices for the same product; someone may be charged more for a product based on their perceived ability to pay, or because an algorithm has determined that they are more likely to make an impulse purchase. This type of surveillance pricing is powered by algorithms that analyze a range of consumer behaviors, including purchase history, browsing habits, location, and even psychological profiles. Similarly, employers may use personal information to determine a worker’s compensation, often based on their online behavior, social media presence, or other personal attributes unrelated to their job performance.

While these practices may appear economically beneficial to businesses, they raise alarming privacy concerns. Surveillance pricing and wage setting do not only intrude on our personal lives — they also create systemic risks to autonomy, equality, and fairness in economic exchanges.

Often, individuals — even those in my generation — fail to grasp the extent to which their personal information is being harvested, traded, and sold. There is little transparency about how our personal data is being used or the implications it might have for our lives. This practice not only compromises consumer privacy but also gives businesses unprecedented power over individuals, making it difficult for consumers to protect themselves from exploitation.

Wage setting that uses personal data, such as social media activity or online behaviors, introduces a serious breach of privacy. When pay is determined by a digital profile rather than a person’s actual qualifications or performance at work, workers’ private lives are intruded upon in ways that may be irrelevant or damaging. This can lead to discrimination or unjust pay disparities based on factors unrelated to work. For example, Uber and Lyft use algorithms to compensate drivers differently based on a multitude of factors.

One of the more alarming consequences of surveillance pricing and wage setting is the potential for discrimination. When businesses or employers rely on algorithms that assess consumer behavior or employee profiles to determine pricing or pay, there is a high risk of reinforcing existing biases. Algorithms are often trained on historical data, which may contain biases related to race, gender, socioeconomic status, or other demographic factors. This can lead to discriminatory outcomes, where certain groups of people remain systematically disadvantaged. 

While I have been fortunate enough to grow up in the digital age, this also means that I have had years of data generation available for retailers and employers to use to their advantage. I worry not only about my ability to afford necessities, but also about my ability to live and work in a state like Colorado, where the cost of living is still far too high. Surveillance pricing and wage setting undermine the potential for all individuals to participate in the economy free from deception and discrimination. These practices also ignore fundamental privacy rights, erode personal autonomy, perpetuate discrimination, and undermine trust in the marketplace.

As this technology becomes more entrenched in our lives, Coloradans must advocate for stronger privacy protections. Policymakers have proposed HB25-1264, which — if passed — would prohibit unfair surveillance pricing and wage-setting practices. Without these safeguards, retailers and employers may be empowered to continue engaging in these unfair practices, resulting in weaker financial autonomy and less financial stability for many Coloradans and their families. Our legislators must support this bill to ensure that our rights are protected for generations to come and make clear that our digital footprint should not haunt our pocketbooks. 

Date

Wednesday, April 16, 2025 - 8:30am

Related issues

Privacy & Technology

Author:
Mika Alexander, Policy Fellow (she/they)

Teaser subhead

Surveillance pricing is a problem. Legislators must pass HB25-1264 to address it.

D. K.

As a Pawnee woman who is Deaf, I’ve long faced barriers to being evaluated fairly, not because I lack experience or qualifications, but because of systemic bias and technology that wasn’t built with people like me in mind.

So when I was offered a job at Intuit, a financial software company, in late 2019 as a tax associate, I was thrilled. In this role, I helped customers with their tax questions and consistently received high ratings for my service. I took pride in being able to resolve customer concerns quickly and with empathy.

But during my first year, I was shocked to learn that one of my key performance indicator scores was unusually low. After meeting with my manager, I learned that Intuit’s artificial intelligence (AI) software—used to measure how closely employees followed call scripts—wasn’t accurately recognizing my speech because of my Deaf accent. Instead of correcting the problem, Intuit reassigned me to a role that no longer involved answering customer calls. Even after that setback, I stayed committed to my work. In 2021, I was promoted to Tax Expert Lead. Over the next three tax seasons, I consistently hit high performance metrics and received positive feedback.

In 2023, I joined Intuit’s Accessibility Team to help identify and address barriers that people with disabilities face across the company’s services. During that time, I raised concerns about Intuit’s use of HireVue—a vendor that provides AI-based video interviewing software—as part of the company’s hiring process. I specifically noted that the platform posed challenges for deaf and hard-of-hearing applicants. The Accessibility Team chair said they would look into it, but I never heard about any follow-up or action taken.

After the 2023 tax season, my manager—who was also part of the hiring committee—encouraged me to apply for a seasonal manager position. It was the next logical step in my career, and I knew I was qualified. I applied in spring 2024. Soon after, I received an invitation to complete a video interview using the HireVue platform. I immediately knew this would be a problem because the platform didn’t provide consistent subtitles for all audio content. In fact, studies show that the technology underlying HireVue performs worse for non-white speakers and even worse for speakers with a deaf accent.

I requested an accommodation: human-generated captioning for the interview. Unfortunately, Intuit did not provide me with this requested accommodation, instead saying that HireVue had built-in subtitles. But, when I began the interview, those subtitles weren’t there for all the content. I had to rely on Google Chrome’s auto-captions, which were full of errors and made it hard to fully understand the questions. Still, I pushed forward. I did my best, confident in my qualifications and experience.

Weeks later, I got an email letting me know Intuit had moved on with other candidates. The feedback I received was devastating: I was told to improve my communication by being more concise, adapting my style to different audiences, and projecting more confidence. What hurt the most was the suggestion that I “practice active listening.” As a Deaf woman, that comment was not only ignorant—it was deeply offensive. It made me feel like the HireVue system had completely failed to assess me fairly. Worse, it made clear that the people interpreting the HireVue results didn’t understand the realities of Deaf communication.

My experience reflects a bigger problem: the systemic discrimination embedded in AI-powered hiring tools. These systems were not built for people like me. Native professionals, deaf individuals, and countless others are being unfairly screened out by biased technology that prioritizes data over human understanding.

That’s why the ACLU, the ACLU of Colorado, Public Justice, and Eisenberg & Baum, LLP have filed a complaint with the Colorado Civil Rights Division and the Equal Employment Opportunity Commission. The complaint charges Intuit and HireVue with violating the Colorado Anti-Discrimination Act (CADA), the Americans with Disabilities Act (ADA), and Title VII of the Civil Rights Act.

Real change is needed. Companies must stop using hiring technologies that discriminate against disabled and non-white applicants. They must implement accessible, equitable hiring practices that evaluate people based on their skills, experience, and potential—not on biased algorithms.

AI should never be used as a barrier. It’s time for action, accountability, and justice.

Date

Monday, April 7, 2025 - 1:30pm

Teaser subhead

D.K., a Pawnee woman who is Deaf, was denied a promotion after being assessed through biased automated hiring technology.

DENVER — Today, a U.S. District Judge issued a preliminary injunction in a lawsuit over the Elizabeth School District’s ban and removal of 19 books from its school libraries. The Elizabeth School District must now return the books to school libraries by March 25, 2025, and is enjoined from removing additional books because the board disagrees with their content or viewpoint.

The following statement can be attributed to Tim Macdonald, ACLU of Colorado Legal Director: 

“This is a major victory for the students of Elizabeth and all Coloradans. Having access to a diversity of viewpoints is integral to the well-being and education of all students, and this injunction gives them that opportunity. 

“School districts that ban books because officials disagree with the content or viewpoints expressed in those books do a disservice to students, authors, and the community. Such book bans violate the Constitution — period. We’ll keep fighting to ensure a permanent end to this practice.”

Date

Wednesday, March 19, 2025 - 5:15pm

Related issues

Freedom of Expression & Religion
