When the Tools Forget the Hands That Hold Them
As algorithms grow smarter, real progress in healthcare depends on understanding biology, not just data
This Post Will Not Go Viral.
There’s a growing genre of online writing that reads like it was made in a sterile room, piped through a focus group, and polished by a mildly attentive AI before it’s posted. My LinkedIn feed has become a museum of it - immaculate paragraphs from people who, until recently, I would have bet good money could write something far more interesting themselves.
Post-docs, pharma execs, even Nobel laureates. People whose minds are capable of sparking with wit, nuance, and original thought - all producing “content” that feels like it’s been through the same rinse cycle. The tone is consistent. The phrasing familiar. The cadence suspiciously machine-like.
It’s not that there’s anything wrong with using AI to draft your updates - we’ve all done it. But the result is that everything starts to sound the same. The quirks vanish. The edges are sanded down. You lose the little linguistic fingerprints that make human writing feel alive.
And then there’s the bigger, more insidious problem: people are no longer writing for their audience - they’re writing for an algorithm.
LinkedIn, for all its faults, was once a place where original thinking could reach the right audience without an advertising budget. That’s becoming rare.
Now, it’s increasingly pay-to-play. Organic reach has plummeted unless you either pay for ads or feed the algorithm exactly what it craves. The result? An outbreak of uniform posts designed to juice engagement metrics rather than spark meaningful discussion.
We’ve entered the era of the humblebrag case study, the faux-vulnerable leadership lesson, the relentless carousel of success theatre. Engagement pods, bots, and interaction for the sake of boosting one’s own algorithmic score rather than adding any real value to the conversation.
All trying to trick the algorithm into thinking something’s hot.
But something is most definitely not.
Here’s a perfect parody of the modern LinkedIn post, shared with me by a friend, original source unknown to me - it made me laugh, and then die a little inside:
What does it take to be a LinkedInfluencer? Let me share the formula I’ve worked out.
I used to write normally.
Paragraphs. Flow. Structure.
Then I discovered something.
✨ White space.
✨ Fragmented wisdom.
✨ The art of… dramatic pacing.
Now?
I write like this.
Because it makes you stop.
Scroll.
Stare.
What am I saying?
Does it matter?
Influence.
Isn’t built on clarity.
It’s built.
On rhythm.
👇
If this moved you, let’s connect.
If it didn’t, think deeper.
This is strategy.
This is provocative.
This
Is
Content. 🔥
The Disappearing Craft
The deeper loss here isn’t just that our feeds are less interesting - it’s that the underlying skill of constructing an idea in words may be eroding. Writing isn’t just a delivery mechanism for information. It’s multiple cognitive processes working together - reasoning, structuring, and refining in real time.
When we outsource too much of that to AI, we risk hollowing out the skill itself. And that doesn’t just affect writers. Scientists, mathematicians, and engineers also think through making - whether that’s in words, diagrams, models, or code. The process itself is the crucible for insight. Remove it, and you risk losing the deeper problem-solving instincts that emerge only through the act of wrestling with an idea.
A young researcher who never has to iterate on a paper draft may never develop the discipline to spot weak arguments or poorly structured logic. A data scientist who always uses automated model selection may never learn how subtle parameter changes shift results.
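To make that second example concrete, here’s a minimal sketch - synthetic data, an arbitrary model, purely for illustration. Sweeping a single hyperparameter by hand exposes exactly the sensitivity that a fully automated search quietly absorbs on your behalf.

```python
# A toy illustration, not a real pipeline: the dataset, model, and parameter
# values are arbitrary choices made for the sake of the example.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=20, n_informative=3,
                           random_state=0)

# Sweeping the regularisation strength by hand shows how much the result
# moves with a "subtle" parameter change - the intuition an automated
# model-selection step hides behind a single best score.
for C in (0.001, 0.01, 0.1, 1.0, 10.0):
    score = cross_val_score(LogisticRegression(C=C, max_iter=1000),
                            X, y, cv=5).mean()
    print(f"C={C:<6} mean CV accuracy = {score:.3f}")
```

Run it once and the point makes itself: the numbers drift as the parameter moves, and someone who has never watched that drift has no feel for when to distrust the “best” model a search hands back.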
Over time, my fear is that these small absences compound into a profound skills gap.
AI in Healthcare: Promise and Precaution
Psychologist Mihaly Csikszentmihalyi argued that creativity arises from a dynamic interaction in which the individual engages deeply with a field and its established knowledge, shaping it to introduce something new - a feedback loop where the act of making reshapes the maker. Automating too much of that loop risks breaking it entirely.
Nowhere is this tension more visible to me than in healthcare. AI is now helping radiologists detect tumours earlier, predicting disease outbreaks before they happen, summarising clinical trial literature for overburdened researchers, and, remarkably, managing to transcribe doctors’ handwriting so that their notes may actually be read.
Incredible.
The upside is real: faster decisions, earlier interventions, and the possibility of extending high-quality care to communities that have long been underserved.
But medicine is more than the application of pattern recognition to a dataset. Biology does not behave like data - it is complex, dynamic, and context-dependent. Where algorithms see correlations, biology demands causation. A model might correctly associate certain symptoms or biomarkers with a condition, but without understanding the underlying mechanisms, it cannot explain why that pattern exists or how it might differ from one patient to the next.
AI’s biggest limitation is that it inherits the biases and blind spots of its training data. If the data underrepresents certain populations or diseases, the AI will too. Biology, by contrast, is inherently unbiased. The mechanisms of disease operate independently of who a person is or where they live. The challenge is that we still lack the tools to read those mechanisms clearly and connect them to meaningful patient outcomes.
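A toy sketch makes the point - synthetic data, deliberately exaggerated, not a model of any real population. Train a classifier on data in which one group is barely represented and its relationship to the outcome differs, and performance for that group collapses.

```python
# A deliberately exaggerated toy example of training-data bias; the "groups"
# and their feature/outcome relationships are invented for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, sign):
    # Each group's outcome depends on the first feature, but with opposite sign.
    X = rng.normal(0.0, 1.0, size=(n, 5))
    y = (sign * X[:, 0] + rng.normal(0.0, 0.3, size=n) > 0).astype(int)
    return X, y

# Group A dominates the training data; group B is barely present.
Xa, ya = make_group(2000, +1.0)
Xb, yb = make_group(100, -1.0)
model = LogisticRegression(max_iter=1000).fit(np.vstack([Xa, Xb]),
                                              np.concatenate([ya, yb]))

# Fresh samples from each group: the model works well for the group it
# mostly saw, and badly for the group it barely saw.
for name, sign in (("group A", +1.0), ("group B", -1.0)):
    X_test, y_test = make_group(1000, sign)
    print(f"{name} accuracy: {model.score(X_test, y_test):.3f}")
```

The contrast here is manufactured, but the mechanism is the real one: a model can only reflect the populations it was shown, however confident its outputs look.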
That is where the future of healthcare lies - not in replacing clinicians with black box algorithms, but in giving them better visibility into the causal biology of disease. We need analytics that can identify the biological drivers of disease rather than just describe associations. We need mechanistic diagnostic tests that can detect those biological signatures in patients early, inform clinicians of underlying risk factors, and triage people more precisely to the right specialist or care pathway. These tools can guide targeted interventions before conditions worsen, enabling a shift from reactive to predictive, preventive medicine.
AI can help accelerate that shift, but only if it is used to illuminate biology, not obscure it. That means building explainable systems rooted in mechanistic understanding - systems that enhance clinical decision-making by showing why a disease occurs and how to intervene effectively.
Because in healthcare, as in writing, science, and art, understanding the process matters as much as the product. And when the process is lost, so too is our ability to question, to adapt, and to truly heal.
Holding on to the Human Parts
Every technological leap has sparked fears of skill loss. The printing press, the calculator, the camera - each forced adaptation. The skills didn’t vanish, but they changed. The difference with AI is its scope: it touches not one craft, but many, simultaneously. It’s not just a single skill we risk losing, but the broader capacity to think creatively across domains.
The danger is that, in skipping the wrong turns, we lose the part of the journey that makes us capable of right turns in the first place. In healthcare, that could mean missing the nuance that turns a standard treatment into the wrong one for a particular patient. In every field, it means risking the erosion of our most human abilities.
The risk is not that AI will make us worse creators. The risk is that we will stop being creators altogether - in our writing, in our science, and even in our medicine. The tools are powerful, but without the hands to guide them, we may end up with precision and speed at the cost of empathy and understanding.
This post will not go viral. But it is mine - written by a human brain, flawed, subjective, and shaped by experience.
Signal over noise.
Thanks for reading Signal Over Noise
If you made it all the way here, either you’re deeply interested in the subject or you’re just too polite to stop scrolling – either way, I’m grateful.
If you think this piece is worth sharing, please do. It helps get these ideas in front of more people like you.



