The Algorithm Will See You Now

Welcome to the age of the algorithm. If you live in the US, you might’ve heard that the retailer Target has recently started playing a new game with the data it knows about you, and as a result, your data just made your groceries more expensive. An algorithm scans what it knows: where you live, what you earn, how you shop, and then adjusts prices accordingly. The carton of eggs that costs $1.99 in Rochester is $2.29 in Tribeca. Not because the eggs are different, but because you, the customer, are. This is ‘personalised pricing’, where everything costs exactly what the algorithm thinks you can afford to pay.

The technology that was supposed to make markets more efficient and transparent is now being used to make them more opaque and exploitative. And the kicker? Most people won’t even notice it’s happening.

Let’s be clear about what’s going on here. Dynamic pricing isn’t new; airlines have been doing it for decades. But there’s a difference between adjusting prices based on demand and adjusting them based on who’s buying. When an algorithm looks at your postcode (or zip code), browsing history, purchase patterns, and income markers to decide you can afford an extra thirty cents, that’s not market efficiency. That’s personalised gouging dressed up as ‘innovation’.
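To see how little machinery this actually takes, here’s a deliberately crude sketch of the kind of segmentation described above. Everything in it is hypothetical: the field names, the multipliers, and the logic are invented for illustration, not drawn from any retailer’s actual system.

```python
# Hypothetical sketch of personalised pricing. Real systems use far more
# signals and trained models; these fields and multipliers are invented.

BASE_PRICE = 1.99  # the "list" price of a carton of eggs

def personalised_price(customer: dict) -> float:
    """Nudge the base price up using crude proxies for ability to pay."""
    multiplier = 1.0
    # Postcode as an income proxy: affluent areas pay more.
    if customer.get("postcode_income_band") == "high":
        multiplier += 0.10
    # Shoppers who rarely compare prices are less price-sensitive.
    if customer.get("rarely_compares_prices"):
        multiplier += 0.05
    return round(BASE_PRICE * multiplier, 2)

rochester = {"postcode_income_band": "low"}
tribeca = {"postcode_income_band": "high", "rarely_compares_prices": True}

print(personalised_price(rochester))  # 1.99
print(personalised_price(tribeca))    # 2.29
```

The point falls out directly: both profiles buy the same carton of eggs, but one pays $1.99 and the other $2.29, and neither shopper ever sees the multiplier applied to them.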

The argument in favour goes something like this: companies have always tried to charge what the market will bear; this approach is just more precise. In other words, if someone in Tribeca can afford to pay more, why shouldn’t they? This is capitalism working as intended, maximising profit (for the owners of capital, the 1%) while theoretically still making goods available to everyone.

Except that’s not really what’s happening. This isn’t about charging luxury prices for luxury goods. It’s about charging different prices for the exact same necessities based on data you as a consumer didn’t knowingly provide for that purpose. You’re not opting into personalised pricing. You’re being quietly segmented and charged accordingly, with no visibility into why or how.

The problem runs deeper than fairness. When prices become individually targeted, we lose the ability to comparison shop meaningfully. You can’t know if you’re getting a good deal because you can’t know what anyone else is paying. The shared reference point that makes markets function, the idea that a thing has a price, essentially disappears. Instead, everything has your price, calculated in real time based on data you can’t see or challenge.

This should worry everyone. I think it’s just the beginning. Today it might be eggs and groceries. But the same technology could be applied to anything. Insurance premiums. Loan interest rates. Rent. Medicine. Once companies realise that they can extract maximum value from every transaction by treating each customer as their own market segment, why would they stop?

The defence you’ll hear is that consumers always have the choice to shop elsewhere if they don’t like it. But that only works if competitors aren’t doing the same thing. When every major retailer adopts algorithmic pricing, there’s nowhere else to go. The choice becomes paying the personalised price or going without. That’s not a functioning market; it’s a coordinated shakedown enabled by technology.

There’s also something fundamentally corrosive about this approach. It takes the data exhaust from our daily lives, where we live, what we browse, what we buy, and weaponises it against us. Every interaction becomes a data point that might cost us more later. That’s not innovation serving people. That’s technology serving profit (capitalism) at people’s expense.

We need to be honest about what’s being built here. This is a system designed to extract maximum value from people based on their vulnerability, circumstance, and digital footprint. It isn’t smart pricing or market optimisation. The fact that it’s done by algorithms rather than a person doesn’t make it less exploitative; it just makes it scale better.

The uncomfortable truth is that technology doesn’t have values. We give it values through how we choose to deploy it. Right now, we’re choosing to build systems that treat people as revenue opportunities to be optimised rather than customers to be served. We should be demanding better than this.

Imagine the end result: a world where everything costs exactly what an algorithm thinks you can afford to pay, which, conveniently, will always be just a little more than you wanted to spend.