
Hybrid Workflows: The Context Switch

Alexandru Mareș

Published 14/02/2026 · 2 min read
Topics: Architecture, AIYON

The Continuity Problem

I noticed it during a debugging session. I was reading a chat log to understand why an agent made a wrong call. The reasoning was in the chat. The action was in the system log. The link between them was gone.

Intelligence does not separate itself into bins. A human working on a problem moves from doubt to calculation. We sketch ideas in prose. We validate them with math. We execute them with code. The process is fluid.

Current software architectures force a choice. You are either chatting in Markdown or executing in JSON. The mode is binary. This separation is artificial. It breaks the line of continuity.

The Friction of Modes

We ask agents to context-switch at the parser level. To speak to a human, the agent outputs text. To speak to a machine, it outputs a JSON object.

This creates a fracture in the record. The reasoning lives in the chat log. The action lives in the system log. When an error occurs, we see what broke. We rarely see why the agent thought it was the right solution.
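To make the fracture concrete, here is a contrived sketch (the log shapes are invented for illustration, not taken from any real system): the reasoning and the action land in two records that share nothing but a rough timestamp.

```python
import json

# Hypothetical "before" state: two records of the same incident.
# The reasoning lives in a human-readable chat log...
chat_log = "10:42 agent: I suspect a memory leak, reading src/main.py to check"

# ...while the action lives in a structured system log.
system_log = json.dumps(
    {"ts": "10:42:07", "op": "fs.read", "file": "src/main.py"}
)

# Nothing in either record points at the other. Reconnecting "why"
# to "what" means fuzzy timestamp matching across files.
action = json.loads(system_log)
```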

The Hybrid Stream

I built hybrid mode to resolve this tension. It allows structured data, unstructured thought, and executable payloads to coexist in one stream.

The parser does not toggle. It reads the intent of each line.

@NOTE captures the conversation.

@THOUGHT captures the reasoning.

@STEP captures the action.

@BEGIN captures the payload.

The agent moves between these states without breaking the syntax. It is discipline with flow.

@DOC id=deploy-fix | title="Hotfix Deployment" | mode=hybrid | profile=agent

@NOTE text="The server latency is spiking. I suspect a memory leak."
@THOUGHT rid=t:1 | type=hypothesis | content="Recent commit 4f5a might be the cause."

@STEP rid=s:1 | op=std:fs.read | in=[file:src/main.py]
@BEGIN python
def aggressive_cache():
    # ... logic ...
@END python

@NOTE text="Found the issue. The cache isn't clearing. Patching now."
@STEP rid=s:2 | op=std:fs.patch | in=[file:src/main.py]
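As an illustration of line-level dispatch, here is a minimal parser sketch in Python. The directive names (@NOTE, @THOUGHT, @STEP, @BEGIN/@END) come from the example above; the attribute syntax rules and the (kind, attrs) event shape are my assumptions, not the actual implementation.

```python
import re

def parse_hybrid(stream: str):
    """Return (kind, data) events from a hybrid-mode document, in order.

    Simplified sketch: attributes are assumed to be "key=value | key=value"
    pairs, and payload blocks are assumed to end at the next @END line.
    """
    events = []
    payload_lang = None
    payload_lines = []
    for line in stream.splitlines():
        # Inside a payload block, collect raw lines until @END.
        if payload_lang is not None:
            if line.startswith("@END"):
                events.append(("payload", {"lang": payload_lang,
                                           "code": "\n".join(payload_lines)}))
                payload_lang, payload_lines = None, []
            else:
                payload_lines.append(line)
            continue
        m = re.match(r"@(\w+)\s*(.*)", line)
        if not m:
            continue  # blank line or free text outside any directive
        kind, rest = m.group(1), m.group(2)
        if kind == "BEGIN":
            payload_lang = rest.strip()
        else:
            # Split "key=value | key=value" pairs; strip surrounding quotes.
            attrs = {}
            for part in rest.split("|"):
                if "=" in part:
                    k, _, v = part.partition("=")
                    attrs[k.strip()] = v.strip().strip('"')
            events.append((kind.lower(), attrs))
    return events
```

The key property is that the parser never toggles a global mode: each line declares its own intent, and prose, reasoning, and payloads interleave freely in one pass.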

The Unified Record

The result is a single timeline of work. We do not have to correlate timestamps across three different log files. We read the story of the task.

We see the doubt. We see the investigation. We see the code change. We see the confirmation.

This makes the reasoning visible at every step. We are not just logging the output. We are logging the cognition.
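One way to see the payoff: once the stream is parsed into ordered events, attaching reasoning to each action is a single linear scan rather than a timestamp join across log files. The (kind, attrs) event shape below is an assumption for illustration, not the real format.

```python
def steps_with_reasoning(events):
    """Attach the most recent note/thought to each step.

    Assumes events is a list of (kind, attrs) tuples in stream order.
    """
    context = None
    paired = []
    for kind, attrs in events:
        if kind in ("note", "thought"):
            context = attrs.get("text") or attrs.get("content")
        elif kind == "step":
            paired.append({"op": attrs.get("op"), "why": context})
    return paired

# Example stream: a hypothesis, an investigation, a finding, a patch.
events = [
    ("thought", {"content": "Recent commit 4f5a might be the cause."}),
    ("step", {"op": "std:fs.read"}),
    ("note", {"text": "Found the issue. Patching now."}),
    ("step", {"op": "std:fs.patch"}),
]
```

Because order alone carries the correlation, every action arrives with the cognition that produced it.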

Continuity as Integrity

Context is not a mode to be toggled. It is the substrate of work.

When an agent is forced to split its personality between chat and code, it loses effectiveness. It wastes tokens managing the format. It wastes memory tracking the switch.

By unifying the stream, we reduce cognitive load. The machine focuses on the problem. The human focuses on the solution.

Work is a stream. The format must honor the flow.