The Hidden Hand Behind Your Feed: Who Decides What We See on Instagram?
Riya Kumari | Feb 27, 2025, 00:00 IST
Instagram is not just an app. It’s a curator, a gatekeeper, a quiet force that decides what enters our minds. We scroll, thinking we’re choosing what to consume, but the truth is, that decision was made long before we even opened the app. Because someone—or rather, something—has already filtered it for us.
Instagram used to be about one thing: connection. It was a way to share moments with friends, capture beautiful scenes, and discover stories that would never have crossed our paths otherwise. But now? Now, it feels like a never-ending scroll of curated chaos. It’s a place where you can’t help but ask: Who’s in charge here? What is Instagram, really, when we peel away the filters? Explicit posts, violent clips, and cruel jokes keep showing up on your feed and getting countless views. These aren’t just exceptions; they’re regular, creeping reminders that the platform’s filters, those supposed safeguards against inappropriate content, are failing us. Instagram, it seems, has forgotten one fundamental thing: it’s not just about what’s posted. It’s about who gets to see it and the impact it has.
For Instagram to regain its purpose, it needs to take a hard look at how it moderates content. It’s not enough to just remove content after the fact—it needs a system that actively filters out harmful, explicit, and inappropriate posts from the very beginning.
1. The Hidden Harm in What We See
It’s one thing to have explicit content on the platform—it’s another when that content is being shown to people without their consent. You can be minding your own business, scrolling through your feed, and suddenly, you’re faced with nudity or explicit videos that have no place being there. The unsettling part is that these posts don’t need you to click on anything to show up—they’re just there.
- Why does this matter? Because it says something about how we, as users, are valued: only as passive consumers, susceptible to whatever is pushed into our view. Nudity, when it appears out of context, isn’t just a random video; it’s a direct challenge to our sense of privacy, consent, and emotional boundaries.
2. Violence as Entertainment: A Digital Paradox
Then there’s the violence. Fight videos, accidents, raw footage of physical confrontations—content that would typically make anyone cringe or recoil is somehow just part of the daily scroll. You’d think, given Instagram’s own rules, these kinds of posts wouldn’t make it past the gatekeepers, but time and time again, they slip through.
- What’s the bigger issue here? It's not just that violence is being tolerated. It’s that violence has become part of what’s deemed “entertaining.” The algorithm rewards shock value over sensitivity. When we repeatedly see violence in this context, it stops feeling shocking. It becomes just another video to swipe past.
3. Humor or Harm? The Thin Line That’s Vanishing
Instagram has become a breeding ground for jokes and memes, often shared under the guise of “dark humor.” We’ve all seen it: memes that cross the line from funny to outright disrespectful, filled with bullying, body-shaming, and sometimes flat-out cruelty. And it’s not just the jokes themselves. It’s how the platform facilitates this culture by allowing these posts to be tagged as “funny” or “relatable,” as if that somehow absolves them of their potential harm.
- Why is this dangerous? Because humor, when it perpetuates hate, bullying, or harmful stereotypes, isn’t just humor—it’s a tool of division and pain. The fact that it’s allowed to thrive on a platform with millions of young minds only amplifies its impact. When something is laughed at, it’s normalized. And when something is normalized, it becomes part of our social fabric.
4. A Platform for All, But Not All Should Be Seen
What’s most alarming about all of this isn’t just the content itself—it’s the message Instagram is sending about what’s acceptable and what isn’t. By allowing this kind of content to be so readily accessible, Instagram is implicitly saying that there’s no real line between what’s “for everyone” and what’s meant for a specific audience. It’s about as far from being a safe space as you can get, and it’s about time we start calling it what it is: a failure in its responsibility to users.
The children who scroll through Instagram deserve better. The teenagers and young adults who are already navigating a world of complex relationships, self-image issues, and mental health challenges should not have to fight through explicit content, violence, and cruel jokes just to enjoy a harmless scroll. It’s not about censorship—it’s about context, moderation, and intention.
Instagram was never meant to be a platform that promotes exploitation or normalizes toxic behavior. It was built to be a social space: a place to connect people, share stories, and engage with the world in meaningful ways. But right now, the conversation is about how far the app has strayed from that vision, and about how we, as users, have to navigate a minefield of inappropriate content to get to the good stuff.
What Needs to Change?
- User Control: Give users more control over what they see. Make it possible to block explicit or harmful content altogether.
- Better Moderation: Moderation decisions should involve human reviewers, not just engagement metrics like views and likes. We need responsible human oversight, not purely automated calls.
- A Return to Respect: It’s time Instagram returns to the roots of respect—respect for its users, their boundaries, and their well-being.