
The Hidden Cost of Being Connected in the Age of AI

by Louis Byrd
9 min read

In an increasingly connected world, AI is transforming the way we communicate and interact — but at what cost? This thought-provoking article dives into the hidden downsides of AI-driven connectivity: the quiet commodification of personal data, heightened privacy risks, and the subtle weakening of human connection. From the fine print we agree to without reading to the algorithms that shape what we see and say, it challenges us to rethink how we balance technological convenience with transparency and responsible usage. Prepare to explore the unseen consequences of our digital age and the future it shapes.

It was one of those moments that sticks with you, not because it was particularly profound, but because of the weight of the ignorance I had to confront. The “mentor,” someone I was assigned to as part of the accelerator program, was offering advice as if it were gospel.

“What do you mean you’re not offering a freemium?” he asked, his voice laced with that faux-privileged tone of authority, the kind people use when they think they’re saying something revolutionary. “You have to give people an opportunity to use your product for free! Nobody’s going to pay for your product this early. You’ve got to give it away — think Meta, Facebook, Instagram, LinkedIn. These are all platforms that are free, and billions of people use them.”

I sat back, letting his words linger in the space between us. It was as if he couldn’t hear how ridiculous it all sounded. Free. That word, so deceptively simple, so full of promise. But nothing is ever free — not really. And certainly not in this digital age, where every click, every post, every scroll comes with a price tag most folks aren’t even aware of.

He went on, oblivious to the storm that was brewing in my mind. “These companies make their money off premium subscriptions and ads, but they hook people with free!” he said as if he had just cracked the code to success.

That was when I knew — this man didn’t understand the world he was trying to “mentor” me through. Tech? Business? He didn’t have the faintest clue.

I looked him in the eye, not out of anger, but out of a deep, aching frustration. “That’s true,” I replied slowly, letting the words settle. “But you’re missing something. People are paying. Every day. Perpetually.”

He blinked, confused. I could see his brain scrambling, trying to catch up. “They pay in personal data,” I said.

He stopped, frozen in thought, the weight of my words finally sinking in. For the first time, he seemed to realize he might have overlooked something. “I never really thought of it that way,” he muttered.

And that was the problem, wasn’t it? So many people never really think of it that way. We walk around, tethered to our devices, giving away pieces of ourselves for convenience, connection, and what we’ve been led to believe is free.

But nothing in this world is free — not your time, not your labor, and certainly not your data. The currency may have changed, but the transaction remains the same. The question is, how long are we willing to pay before we realize what we’ve lost?


We’ve all done it — rushed to join a new app or platform, eager to dive in. Without thinking, we scroll past endless walls of tiny legal text and hit “I agree.” Just let me in already! Right?

But with that single click, we give up ourselves. We give up control.

Our data becomes the price for a few moments of distraction, connection, or the illusion of productivity. We trade privacy for convenience, often without even realizing it — or caring, for that matter.

It’s a strange thing, to live in a world where every word, every image, every fleeting moment you share is no longer yours the moment you press “post.” You sit down thinking you’re simply connecting with friends, showing off a vacation, or sharing a thought — but somewhere, in a boardroom, a scrum team, or a research lab far removed from your life, they’re mining every piece of you for profit.

Take LinkedIn, for instance. You think you’re building a professional network, but behind the scenes, your resume, your posts, your connections — they’re not just tools for you, they’re tools for them.

You get a LinkedIn message: “Congratulations! You’re one of the few selected experts invited to answer a business question.” It feels good, right? They’re appealing to your ego, making you feel special and valued, like your expertise is in high demand.

But what’s happening? You’re handing over your ideas for free, fueling their algorithms while they profit from your knowledge. And sure, they’ll let you opt out now, but what’s already been taken? What’s already been used? There’s no undoing that. The damage is done.

Or Snapchat, with its playful “My Selfie” feature. You think it’s innocent fun, just a filter to make you smile. But how quickly that smile turns into something else — a face in an ad, a face sold to the highest bidder? And they tell you, “Oh, it’s just for research, just for improving the tech.” But we know better than that, don’t we? They’re turning you, your very likeness, into a commodity — a product.

Even Facebook and Instagram — places where we connect and share our lives with the people we care about — have already admitted it. They’ve been using our words, our images, our moments, to train their AI, to build their empire. And they don’t even need your permission if a friend tags you in a photo. You’re caught in the web, no matter how far you try to run.

And then there’s Reddit — no illusions there. If you’re posting, they’re using it. Every word you write is feeding into a machine, making it smarter and more efficient, while you sit there thinking you’re having a conversation. But conversations don’t belong to us anymore. They belong to them — not to the AI, but to its owners.

These are just a few examples, not to mention the endless stream of startups across industries, or the data-hungry AI giants like Google and OpenAI.

You’re not just a user. You’re not just a person. You’re a resource — their resource. While they profit off your data, you’re left questioning how much of yourself you’ve already given away. How much of you is still yours? Lately, I’ve been asking myself these very questions. And for me, the answer is somewhere between very little and nothing at all.

The truth about data collection isn’t buried in a checkbox. It’s a matter of trust, respect, and understanding. It’s about more than a quick agreement — it’s about ensuring people truly know how their digital lives are being shaped, manipulated, and used. We deserve more than vague assurances. We deserve transparency, plain and simple.

Imagine a world where, instead of hiding behind technical jargon, those who build AI take the time to explain — really explain — what’s happening with your data. Where your questions aren’t just tolerated but invited. Where your concerns aren’t dismissed but discussed, because you have a right to know how much of your life is being turned into profit.

In this world, ethical data collection isn’t just a checkbox; it’s a dialogue. A conversation that continues. And it’s built on respect, where your privacy isn’t something to be exploited but something to be protected.

You should never have to trade your dignity for convenience or feel trapped into giving more of yourself than you’re willing to. No one should feel like opting out means losing out. You should have the power to decide — without fear, without penalty, and without prejudice.

Yet too often, our freedom of choice when it comes to our data is not clearly defined. And so we must ask ourselves: what is the hidden cost of connection in the age of AI?


I’ve been giving away parts of myself since 1999. I was 15, and my girlfriend talked me into joining Black Planet — a corner of the internet where Black folks could come together, create profiles, share pictures, and message people we’d never met.

It felt like a world of possibility, a space that was ours. If you knew a little HTML, you could even customize your page. Looking back, I realize it was one of the first real social media platforms — a place where we unknowingly began trading pieces of our identity for connection.

A few years later, I was pouring my heart into music, sharing tracks on this place called MySpace, hoping it might be my ticket to something bigger. Around the same time, I joined Facebook, back when you needed a college email to even get in.

It felt exclusive, like we were part of some special circle. I was sharing my thoughts, creating groups, and posting all the wild moments from my college days. Then my mama joined. That changed everything. Suddenly, she was tagging me in baby pictures, kindergarten shots, and even prom photos with the same girlfriend who introduced me to this whole world of social media.

And with every new platform — Twitter, LinkedIn, Vine, Instagram, Snapchat, Mastodon, Fanbase — I was right there, giving these companies more and more of myself. It became a routine, almost an addiction. I’d post something, and then I’d check back every few minutes, searching for the validation: the heart, the thumbs-up, the smiley face, the sad tear, a reshare, or a comment.

When a post didn’t land, I’d tweak the next one, trying to crack the code, chasing engagement like it held the key to my value.

Then the algorithms changed. Suddenly, it was pay to play — if you wanted your posts to reach your audience, you had to pay for it. And when the engagement started to drop, especially on platforms like LinkedIn and Instagram, where I’d once thrived, I couldn’t help but question my worth.

Even knowing it was a rigged system, I felt the weight of it. I watched my network and my colleagues get the accolades, the congratulations, and the engagement I was no longer seeing. It ate at me. The anxiety crept in, and soon after, the depression followed.

All of this, before AI even became the driving force behind these platforms.


Now, the algorithms have grown more powerful, supercharged by trillions of data points, quietly steering our choices — what music we listen to, what movies we watch, even which presidential candidate seems to align with our interests.

The lines are blurring. I’ll read an article and wonder if the words are the author’s own or just another AI-generated entry. It’s getting harder to tell the difference, harder to trust what’s real.

And it’s not just the content — it’s the control. I can’t even express myself freely anymore. If I try to celebrate Black pride, speak on issues like Black Lives Matter, or raise my voice on something that matters to me, I risk being flagged or shadowbanned by the same AI that’s supposed to “connect” us.

Let me be clear: I don’t blame the technology. This is about us — about our humanity, about the choices we’ve made as individuals, technologists, business leaders, and as a society. We’ve let that primitive part of ourselves, the one that craves shortcuts and convenience, take control.

Instead of using AI to lift us to new heights, it feels like we’re letting it replace who we are at our core. Not in some dramatic, fear-mongering way — this isn’t about Skynet or some doomsday scenario. It’s more subtle, simpler than that. We’re losing the ability to just be — to be present, to exist fully in the world around us, without distraction, without something artificial pulling us away from what’s real.

This is the hidden cost. We take our lived experiences, the most personal parts of ourselves, and share them with the world. But instead of those experiences staying true, they’re twisted into something artificial, something that no longer uplifts us. Instead, they become fuel for others to exploit and profit from — while we’re left wondering what, if anything, we’ve gained from it.

And now we’re paying the price, watching the world slip further into a disillusioned reality shaped by artificial intelligence, losing touch with what’s real, with what truly makes us human.

In the end, we have to look ourselves in the mirror and ask: was it worth it? We handed over our data, our lives, to fuel Big Tech’s digital goldmine. And what did we get in return? Fool’s gold — fleeting moments of connection, empty engagement, and the illusion of control.


The article originally appeared on Medium.

Featured image courtesy: Gaelle Marcel.

Louis Byrd
With over a decade of experience in Human Experience Design, Louis Byrd brings a wealth of expertise in design, technology, and business leadership. Louis is the co-founder and Chief Visionary Officer of Zanago, a social impact technology company known for its flagship product, Kataba. Throughout his career, Louis has designed experiences for a diverse range of clients, including small businesses, non-profits, and major brands such as Mazda, Dell, Coca-Cola, Hallmark, and Microsoft. Louis is passionate about leveraging technology and Human Experience Design to develop solutions that positively impact marginalized communities.

Ideas In Brief
  • The article argues that “free” platforms are never free — users pay perpetually with their personal data.
  • It examines how AI-driven platforms commodify users’ words, images, and likenesses, eroding privacy and authentic connection.
  • The article calls for transparency and ethical data collection, so people can understand and control how their digital lives are used.

