‘Fairness in the Feed’ launches to demand LinkedIn algorithm action
- Leah Morris
- 35 minutes ago
- 3 min read

Refusing to be silenced by algorithmic bias, business leaders Cindy Gallop, Samantha Katz and Jane Evans have this week launched an international petition demanding action from the tech giant.
‘Fairness In The Feed’ is demanding transparency, fairness and equal opportunity on LinkedIn.
Cindy Gallop, who is the Founder & CEO of MakeLoveNotPorn, has been using LinkedIn for twenty years and used to recommend it to everyone as ‘the platform that makes your professional goals happen’.
Despite having 140,000 LinkedIn followers from around the world – collected over a long and fruitful career including BBH New York (where she was Chair), TED, Cannes Lions and just about every other advertising industry conference and podcast of note – Gallop is now averaging a few hundred impressions per post.
This ‘catastrophic drop in reach’, which began when LinkedIn changed its algorithm in January 2025, has posed a serious business problem for the launch of her new venture, MakeLoveNotPorn.Academy, an ethical sex education platform that already faces censorship challenges because of its stigmatised category.
Creative Director and author Jane Evans has faced similar strife. This year, she launched a new, solutions-focused community for women, The Seventh Tribe. Yet, her reach is averaging 8% of her 19,000 followers. Says Evans:
“I feel as though the contract has been broken between content creators and LinkedIn. They are the only platform with no financial incentive; the deal was we’d create great content and build an audience, and in return LinkedIn allowed you to promote your business. Now if I add a sales link to any of my content it goes straight into LinkedIn jail.”
In June, Evans and Gallop began their independent investigation to better understand the biases at play. They posted the same content as two male colleagues, using banned words (including those the Trump administration had specifically barred from the NSA website). Where Gallop achieved 0.6% reach, one of the men obtained a whopping 143% reach for the same content.
Gallop and Evans teamed up with Samantha Katz, a strategist, community builder and advocate for inclusion and economic belonging, who had been working on the issue of algorithmic discrimination with community leaders and change makers for the last two years.
They identified that ‘proxy bias’ was likely to be the cause.
Proxy bias occurs when an AI model, despite being prohibited from using sensitive attributes like race or gender, learns to rely on correlated, seemingly neutral features (like post codes or education history) as ‘proxies’ to reproduce existing systemic biases.
Says Gallop: "Proxy bias stifles economic growth and benefit, for anyone whose tone, content or identity falls outside traditional norms. That includes global-majority professionals; women and non-binary people; disabled and neurodivergent users; LGBTQI+ and trans voices; migrant, working class or refugee professionals; non-native English speakers; creators outside the US; and small business owners.”
This indirect discrimination means the model's decisions, in this case whose content to grant visibility, perpetuate historical inequities, effectively achieving the same biased outcome without explicitly using the forbidden characteristic.
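The mechanism can be sketched in a few lines of Python. This is a toy simulation, not LinkedIn's actual system: the groups, postcodes and visibility rates below are invented purely for illustration. A "model" that scores visibility from postcode alone, trained on historically biased outcomes, still reproduces a group-level gap, because postcode correlates with group.

```python
import random
from collections import defaultdict

random.seed(0)

def make_person():
    """Generate one synthetic user (all values invented for illustration)."""
    group = random.choice(["A", "B"])
    # The proxy: postcode correlates strongly (90%) with group.
    if group == "A":
        postcode = "1xxx" if random.random() < 0.9 else "2xxx"
    else:
        postcode = "2xxx" if random.random() < 0.9 else "1xxx"
    # Historical outcomes are biased: group A's content was more visible.
    visible = random.random() < (0.8 if group == "A" else 0.3)
    return group, postcode, visible

data = [make_person() for _ in range(10_000)]

# "Train" a model that never sees `group`: estimate P(visible | postcode).
counts = defaultdict(lambda: [0, 0])  # postcode -> [visible count, total]
for group, postcode, visible in data:
    counts[postcode][0] += visible
    counts[postcode][1] += 1
rate = {pc: v / t for pc, (v, t) in counts.items()}

# Despite never using the sensitive attribute, predicted visibility
# still differs sharply by group -- the proxy carries the bias through.
avg_pred = {}
for g in ("A", "B"):
    preds = [rate[pc] for grp, pc, _ in data if grp == g]
    avg_pred[g] = sum(preds) / len(preds)
    print(g, round(avg_pred[g], 2))
```

With these invented parameters, group A's average predicted visibility comes out roughly twice group B's, even though the model was only ever shown postcodes. This is the general shape of the proxy-bias argument, not a claim about any specific feature LinkedIn uses.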
Naturally, this left Gallop and Evans wondering what ranking mechanisms LinkedIn applies to the feed to mitigate proxy bias. And given that organisations and governments make big decisions based on LinkedIn’s Economic Graph data, what does this mean for women and minorities?
Gallop, Evans and Katz believe LinkedIn has a responsibility to act now.
“The more this continues, the more the algorithm perpetuates digital, economic and societal inequity that becomes the systemic hallmark of an AI-driven future.” – Cindy Gallop
"Leadership today means ensuring our technology strengthens opportunity for all and elevates the performance we expect from world-class organisations." – Samantha Katz
Through Fairness In The Feed, Gallop, Katz and Evans are calling on LinkedIn for:
- A formal process to report unexplained reach collapses and a clear commitment to investigating them.
- Transparency on how posts are categorised and ranked, including what signals are prioritised and how decisions are made.
- An independent equity audit of the algorithm and its impact on underrepresented voices.
Sign the petition calling on LinkedIn for fair visibility here.
Experienced algorithmic bias? Call it out using these campaign assets and hashtag #FairnessInTheFeed.
