Integrated Planning Environments: Lessons from the Evolution of IDEs
Derek Rosenzweig · Runtime Labs · July 20, 2025
Introduction: Externalized Memory and Planning
Information retrieval has advanced rapidly. Search is fast, semantic, and increasingly reliable across public and private data. But our ability to store and structure personal information—especially across time—has not kept pace.
It's not just that search should feel instant; storage and organization should feel seamless as well.
Óra is one implementation of this idea, and a first step toward Integrated Planning Environments (IPEs): systems that present familiar calendar interfaces yet are natively architected to surface interaction between humans and reasoning agents.
Reducing Friction to Enable Flow
A core trend in interface design is reducing configuration overhead—removing friction during onboarding and helping users quickly enter a flow state. When tools guide users smoothly into action, they enable focus on high-level reasoning rather than low-level setup.
IDEs as Testing Grounds for Reasoning
The evolution of IDEs exemplifies this principle. The developer experience in Cursor shows that language models, especially reasoning models, perform best when embedded within structured contexts.
Calendars as Ubiquitous Surfaces for Human-Agent Interaction
There are many tasks beyond programming that can benefit from agent-in-the-loop workflows. Calendars are ubiquitous across personal and professional endeavors—making them ideal environments for training and personalizing agents.
Óra as an Integrated Planning Environment (IPE)
Óra reframes the calendar as a multimodal surface for memory, reflection, and reasoning. We call this an Integrated Planning Environment: a time-based interface where reasoning models can interact with plans, tasks, notes, and images, all tied to time.
From Prompt to Plan
In Óra, a user can articulate their intent—"Breakfast with Tim next Friday at 9 am, tag it #founderchat"—and the system instantly translates that into a structured, editable event.
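One way to picture that translation step is sketched below: the model is asked to return JSON describing the event, and the app validates it into an editable draft before anything touches the calendar. The `EventDraft` type, its field names, and the `toEventDraft` helper are illustrative assumptions, not Óra's actual schema.

```typescript
// Hypothetical shape of a structured, editable event draft.
interface EventDraft {
  title: string;
  start: string;            // ISO 8601 datetime, e.g. "2025-07-25T09:00:00"
  durationMinutes: number;
  tags: string[];           // e.g. ["#founderchat"]
}

// Validate and normalize whatever JSON the model returns, so the user
// always edits a well-formed draft rather than raw model output.
function toEventDraft(raw: unknown): EventDraft | null {
  if (typeof raw !== "object" || raw === null) return null;
  const r = raw as Record<string, unknown>;
  if (typeof r.title !== "string" || typeof r.start !== "string") return null;
  if (Number.isNaN(Date.parse(r.start))) return null;
  return {
    title: r.title,
    start: r.start,
    durationMinutes: typeof r.durationMinutes === "number" ? r.durationMinutes : 60,
    tags: Array.isArray(r.tags)
      ? r.tags.filter((t): t is string => typeof t === "string")
      : [],
  };
}

// "Breakfast with Tim next Friday at 9 am, tag it #founderchat" might come back as:
const draft = toEventDraft({
  title: "Breakfast with Tim",
  start: "2025-07-25T09:00:00",
  durationMinutes: 60,
  tags: ["#founderchat"],
});
```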
Voice as Input: Seamless Expression and Storage
Voice input offers a fast, accessible way to specify a title, describe an event, or attach a photo—often in the moment. The more expressive and multimodal these entries become, the more future inference-time compute has to work with.
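As a rough illustration, spoken input can be routed through the same prompt-to-plan path as typed input. The sketch below uses the browser's Web Speech API; support varies by browser, and `captureSpokenIntent` is a hypothetical helper, not part of Óra's API.

```typescript
// Minimal voice capture sketch using the Web Speech API. The cast through
// `any` avoids missing type definitions for the (often prefixed) constructor.
function captureSpokenIntent(onTranscript: (text: string) => void): void {
  const SpeechRecognitionImpl =
    (window as any).SpeechRecognition ?? (window as any).webkitSpeechRecognition;
  if (!SpeechRecognitionImpl) {
    console.warn("Speech recognition is not supported in this browser.");
    return;
  }
  const recognition = new SpeechRecognitionImpl();
  recognition.lang = "en-US";
  recognition.interimResults = false;
  recognition.onresult = (event: any) => {
    const transcript: string = event.results[0][0].transcript;
    onTranscript(transcript); // hand off to the same prompt-to-plan path as typed input
  };
  recognition.start();
}
```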
Guiding Design Principles
- Blank canvas paralysis is real. Scaffolding helps even technical users in structured tasks.
- Constraint as affordance: Fewer, clearer primitives accelerate fluency.
- Structured surfaces enable model clarity: Anchoring context in time improves long-form coherence.
Multimodal Memory: Images, Notes, and Feedback
Every event in Óra can be enriched with images, files, notes, themes, and feedback—turning the calendar into a timeline of structured memory.
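One way to picture this is as a single enriched record per event. The field names below are assumptions about what such a record might hold, not Óra's schema.

```typescript
// Illustrative shape of an enriched calendar entry.
interface Attachment {
  kind: "image" | "file";
  uri: string;
  caption?: string;
}

interface EnrichedEvent {
  id: string;
  title: string;
  start: string;             // ISO 8601 datetime
  tags: string[];            // e.g. ["#mlinfra"]
  notes: string[];           // free-form reflections
  attachments: Attachment[];
  theme?: string;
  feedback?: { rating: 1 | 2 | 3 | 4 | 5; comment?: string };
}

// Viewed this way, the calendar is an append-only timeline of structured memory.
type Timeline = EnrichedEvent[];
```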
Grepping Across Your Calendar
Users can query their structured event logs with questions like: “What events did I reflect on with an image last week?” or “Show me all events tagged #mlinfra since April.”
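Under that model, “grepping” is just structured filtering over the timeline. The sketch below reuses the hypothetical `EnrichedEvent` shape from the previous section; the function names are illustrative.

```typescript
// Events from the last seven days that carry at least one image attachment.
function reflectedWithImageLastWeek(
  timeline: EnrichedEvent[],
  now: Date = new Date(),
): EnrichedEvent[] {
  const weekAgo = new Date(now.getTime() - 7 * 24 * 60 * 60 * 1000);
  return timeline.filter(
    (e) =>
      new Date(e.start) >= weekAgo &&
      new Date(e.start) <= now &&
      e.attachments.some((a) => a.kind === "image"),
  );
}

// Everything carrying a given tag since a given date.
function taggedSince(timeline: EnrichedEvent[], tag: string, since: Date): EnrichedEvent[] {
  return timeline.filter((e) => e.tags.includes(tag) && new Date(e.start) >= since);
}

// Usage (hypothetical): all #mlinfra events since April 1 of the current year.
// taggedSince(timeline, "#mlinfra", new Date(new Date().getFullYear(), 3, 1));
```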
Scaling with Inference
- Context length: Models can now read event traces spanning weeks to months (see the serialization sketch after this list).
- Multimodal reasoning: Vision + language unlock richer summarization and linking.
- Personalization: Óra helps models learn user planning rhythms and event semantics.
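A minimal sketch of that serialization, again assuming the hypothetical `EnrichedEvent` shape from earlier: a window of events is flattened into a compact, time-anchored text trace that a long-context model can read in a single pass. The line format is an assumption, not Óra's actual prompt layout.

```typescript
// Serialize a window of events into a compact, time-ordered text trace
// suitable for inclusion in a long-context prompt.
function serializeTrace(timeline: EnrichedEvent[], from: Date, to: Date): string {
  return timeline
    .filter((e) => new Date(e.start) >= from && new Date(e.start) <= to)
    .sort((a, b) => a.start.localeCompare(b.start))
    .map((e) => {
      const imageCount = e.attachments.filter((a) => a.kind === "image").length;
      return `${e.start} | ${e.title} | ${e.tags.join(" ")} | notes:${e.notes.length} images:${imageCount}`;
    })
    .join("\n");
}
```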
Óra is part of a broader trend: turning familiar interfaces into environments for agent reasoning, reflection, and feedback.