
Figproxy: A Free and Open Tool to Connect Figma and Arduino

by Dave Vondle
4 min read

Figproxy enables rapid prototyping of tangible user experiences by allowing Figma prototypes to talk to the external world. This article dives into what it is and why I made it.

What Is Figproxy?

Figproxy is a utility that allows bidirectional communication between Figma and physical hardware for prototyping interactions that involve screens and physical elements like motors, lights and sensors. It’s designed to talk to hardware prototyping platforms like Arduino.

Some potential use cases include:

  • Kiosks – Soda Machines, Jukeboxes, Movie Ticket Printers, ATMs
  • Vehicle UI – Control lights, radio, seats, etc.
  • Museum Exhibits – Make a button or action that changes what is on the screen
  • Home Automation – Prototype a UI to trigger lights, locks, shades, etc., and make it actually work
  • Hardware “Sketching” – Quickly test out functionality with a physical controller and digital twin before building a more complicated physical prototype
  • Games – Make a physical spinner or gameplay element that talks to a Figma game

It’s really great if you have UX designers working in Figma already and want to quickly connect a design to hardware.

It’s also really valuable if you want to get a prototype working in a matter of hours, not days. It’s intended to be utilized when building-to-think right after a brainstorm sketch, before you spend a lot of time refining the design.

A two-hour “hardware sketch” of a handheld device for triggering screens in Figma.

As a demonstration, I have supplied a couple of examples in the GitHub repo: one where a Figma prototype lets you choose the colors that an LED strip lights up, and another where a knob controls an on-screen representation in real time.

Changing the colors of an LED strip from a Figma prototype.
Using a knob to control an on-screen animation in real time.

How It Works

Figma’s API does not support communication between a running prototype and other software. Because we can’t go the official route, Figproxy uses two different “hacks” to achieve communication.

Speaking Out (Figma → Arduino)

Note: I will be using “Arduino” as shorthand for any hardware that can speak over a serial connection. There are a lot of platforms that can communicate over serial, but Arduino is the most common in this space.

When you set a Figma interaction to open a link, Figproxy looks at the link; if it starts with “send” (and not, for instance, “http://”), it knows the message is intended to be routed to hardware.

Figma UI: send a

In Arduino, you can listen for a character and perform some action like this:

  void setup() {
    Serial.begin(9600);              // set this to the baud rate Figproxy is configured to use
    pinMode(LED_BUILTIN, OUTPUT);
  }

  void loop() {
    if (Serial.available() > 0) {
      // get the incoming byte:
      char incomingByte = Serial.read();
      // in Figma, the "Turn LED On" button sends "a" and "Turn LED Off" sends "b"
      if (incomingByte == 'a') {
        digitalWrite(LED_BUILTIN, HIGH);
      } else if (incomingByte == 'b') {
        digitalWrite(LED_BUILTIN, LOW);
      }
    }
  }

If there is more complex data you need to send, you can send a string like “hello world!”:

Figma UI: send hello world!

You can even send hexadecimal characters by preceding the string with “0x”.

Figma UI: send hex code
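
On the Arduino side, a single character can be read as shown above, but a longer string needs to be buffered until the whole message has arrived. Below is a minimal sketch for that case; it assumes the incoming message is newline-terminated, so check the Figproxy documentation and examples for the exact framing your setup uses.

  // Minimal sketch for receiving a whole string.
  // Assumption: messages arrive over serial terminated by a newline;
  // adjust the terminator to match your setup.
  void setup() {
    Serial.begin(9600);  // set this to the baud rate Figproxy is configured to use
  }

  void loop() {
    if (Serial.available() > 0) {
      String message = Serial.readStringUntil('\n');  // read one full message
      message.trim();                                 // strip any trailing whitespace
      if (message == "hello world!") {
        // react to the message here
      }
    }
  }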

Speaking In (Arduino → Figma)

In Arduino, you can send a character like this:

Serial.print('c');

To get data into Figma, Figproxy sends characters as keypress events.

Figma UI: receiving characters from keypresses
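
To make this concrete, here is a small sketch of the Arduino side using a hypothetical pushbutton wired between pin 2 and ground. Each press sends a single “c” over serial, which Figproxy then delivers to the prototype as a keypress.

  // Hypothetical wiring: a pushbutton between pin 2 and GND, using the
  // internal pull-up so the pin reads LOW while the button is held down.
  const int BUTTON_PIN = 2;
  bool wasPressed = false;

  void setup() {
    Serial.begin(9600);                  // set this to the baud rate Figproxy is configured to use
    pinMode(BUTTON_PIN, INPUT_PULLUP);
  }

  void loop() {
    bool isPressed = (digitalRead(BUTTON_PIN) == LOW);
    if (isPressed && !wasPressed) {
      Serial.print('c');                 // Figproxy forwards this as a "c" keypress
    }
    wasPressed = isPressed;
    delay(10);                           // crude debounce
  }

In the Figma prototype, a keyboard trigger bound to the “c” key then responds to each button press.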

Why I Made It

At IDEO I work on a lot of physical product designs that incorporate displays. I commonly work with UX designers whose tool of choice for rapid iteration of experiences is Figma.

I was recently working on a project that was a kind of kiosk – a touch screen that has external hardware elements. We already had a phenomenal UI prototype built in Figma, and I had built some of the digital hardware elements out with Arduino to deliver an interactive model we refer to as an “experience prototype.”

I wanted to link the external LED animations to the UI prototype so that certain LED behaviors could be choreographed to moments in the user flow. I was floored when I realized this was not possible. ProtoPie did not import the Figma screens properly, and switching would have meant days of rework and forcing the UI team onto different software. I also tried the fantastic software Blokdots (co-created by ex-IDEO’er Olivier Brückner), but it only allows hardware to talk to Figma’s “design view,” whereas what I needed was communication with the “prototype view.”

I’ve got a bit of a soft spot for making prototyping tools for hardware, so honestly I was a bit excited that no one had figured this one out yet. After digging into the Figma API, I realized why Blokdots hadn’t done it: Figma doesn’t support any communication to and from the prototype view in its API. I had to figure out a workaround. After looking at what prototypes could do, I had the idea to make a proxy browser, and Figproxy started taking shape.

Try It Out

Detailed instructions for installation, examples, and use can be found in the GitHub repository here. The Figma files for the examples are here. I hope you like it!


Dave Vondle
Dave Vondle heads the Emerging Tech Lab at IDEO Chicago. Over a 19-year history at IDEO, he has specialized in blending electronic hardware with user experience, crafting products that not only meet essential needs but also bring a sense of playfulness, vital to the human experience.

Ideas In Brief
  • The article details the creation and functionality of Figproxy, a tool that bridges Figma prototypes and physical hardware, providing an efficient solution for rapid prototyping in interactive and tangible user experiences.
