
Siloed Security? Forget AI Adoption

by Josh Tyson
1 min read

As AI agents become more autonomous, generating software on the fly from human prompts, one question looms larger than ever: how do we keep them secure? In this episode of Invisible Machines, Robb Wilson and Josh Tyson sit down with Omar Santos, Distinguished Engineer of AI Security at Cisco and co-chair of the Coalition for Secure AI, to explore the evolving landscape of AI security in the agentic era.

Omar argues that traditional security models are no longer sufficient. The idea of a siloed security department feels both antiquated and woefully inadequate. As AI agents create complex software environments dynamically, security must become an ever-present, integrated layer, supported by constant human oversight and the ability to simulate potential outcomes to mitigate risk. For organizations racing toward AI adoption, ignoring security isn't just risky; it's a barrier to progress.

The conversation dives deep into how AI agents are transforming work, teams, and technology ecosystems. Omar explains how advanced orchestration combines human judgment with AI capabilities, and why simulations and real-time risk assessments will be critical as agents evolve. He also shares insights from his work leading AI security at Cisco and guiding industry standards like CSAF and VEX.

For anyone exploring agentic AI, this episode is a masterclass in responsible innovation. It challenges leaders to rethink security as a core part of AI design, adoption, and management, because in the age of agentic AI, security is fundamental.

Josh Tyson
Josh Tyson is the co-author of the bestselling book about conversational AI, Age of Invisible Machines. He is also the Director of Creative Content at OneReach.ai and co-host of both the Invisible Machines and N9K podcasts. His writing has appeared in numerous publications over the years, including Chicago Reader, Fast Company, FLAUNT, The New York Times, Observer, SLAP, Stop Smiling, Thrasher, and Westword.

