Ithaca's AI Agents

Designing trust, transparency, and a touch of humanity into an AI-driven DeFi experience.

The Quiet Power of Ithaca

Ithaca is a protocol built for serious DeFi players — it allows users to earn yield through advanced options strategies. The infrastructure is robust, the engine powerful, and under the hood, it does a lot.

But there was a gap.
A gap between what the protocol could do, and how it felt to use it.
Because most users don’t want to trade options — they just want to earn.

So I was brought in to rethink the experience.

The Problem

For the average user, options trading is a black box.
Greeks. Volatility. Expiries. All necessary, but overwhelming.

Even in DeFi, where interfaces have improved, the emotional gap remains:
People feel like they’re entering a space not made for them. They’re afraid of making the wrong move.

Ithaca wanted to change that.
The idea was bold: let users simply deposit, and let AI agents do the rest — automatically selecting and managing options strategies to grow their funds.

But that posed a new challenge:

How do you get people to trust something they don’t understand?

The Work

Our job wasn’t just to simplify. It was to abstract without alienating.

We didn’t want users to feel like they were missing something.
We wanted them to feel like someone had their back.

That’s where the design began — not with dashboards or charts, but with a feeling:
Calm. Capable. Human.

Making the AI Feel Human

These agents weren’t just technical engines — they were the heart of the product.
And if users were going to trust them with their funds, the interface needed to feel less like a machine, and more like a familiar guide.

This meant:

  • A conversational tone of voice

  • Human-like check-ins ("Your strategy is being rebalanced")

  • Simple summaries of performance, not technical logs

  • Interfaces that felt soft, intuitive, and quietly confident

We weren’t showing everything. But what we showed, we made count.

Transparency Without Complexity

We didn’t want users clicking through endless tabs to find out what was happening.
Instead, we surfaced insights just when they were needed, and in ways they could easily understand.

Rather than throwing charts or data dumps at users, we reframed information in plain English:

  • “Your funds are allocated across three strategies.”

  • “This week, volatility increased. The agent adjusted accordingly.”

  • “You’ve earned 3.2% this month.”

Each update was a moment to build trust.
Each message said, in its own way: we’ve got you.
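Updates like these could come from a thin templating layer sitting between the agent's raw state and the interface. The sketch below is purely illustrative — the `StrategyState` object and function names are hypothetical, not Ithaca's actual code — but it shows the idea: one small layer that turns numbers into calm sentences.

```python
from dataclasses import dataclass

@dataclass
class StrategyState:
    """Hypothetical snapshot of what an agent might report back."""
    active_strategies: int
    monthly_return_pct: float
    volatility_change_pct: float

def plain_english_updates(state: StrategyState) -> list[str]:
    """Turn raw agent state into plain-English updates --
    no charts, no technical logs, just short sentences."""
    updates = [
        f"Your funds are allocated across {state.active_strategies} strategies.",
    ]
    if state.volatility_change_pct > 0:
        updates.append(
            "This week, volatility increased. The agent adjusted accordingly."
        )
    updates.append(f"You've earned {state.monthly_return_pct:.1f}% this month.")
    return updates

for message in plain_english_updates(StrategyState(3, 3.2, 1.5)):
    print(message)
```

The point of the layer isn't the formatting — it's that every update is authored as a sentence a person would say, rather than exposed as raw data.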

A Shift in Mindset

What we saw during user testing was telling.
People stopped asking, “How does this work?”
Instead, they asked, “How’s it doing?”

That subtle shift — from curiosity to confidence — told us we were on the right track.
They weren’t trying to outsmart the protocol anymore. They were starting to trust it.

The Takeaway

Designing for AI in DeFi isn’t about making things invisible.
It’s about creating relationships — between humans and systems — where trust doesn’t come from control, but from clarity, tone, and care.

We didn’t just make options trading simple.
We made it feel safe.
And sometimes, that’s the most powerful thing design can do.

What’s Next

This is just the beginning.

We’ve launched the first version and tested it with a small group of early users. The feedback so far has been promising — people felt calmer, more in control, and surprisingly... trusting. That was the signal we were looking for.

But we’re not done yet.

There’s more to refine:

  • How we communicate risk, not just performance

  • How much agency users want over strategy selection

  • How the AI can “check in” with users without feeling intrusive

We’re also exploring different ways to make transparency more ambient — always there, never overwhelming.

The goal is still the same:
Make something powerful feel simple.
Make something unfamiliar feel safe.

And we’re just getting started.