
Alexandru Mareș

The Moment AI Stopped Being a Tool

[Figure: Three silver-glass plates labeled Computer Use, Agentforce, and Operator on an off-white paper baseline, joined by a single unbroken amber thread labeled OPERATE.]
References
  1. Anthropic (2024). Introducing computer use, a new Claude 3.5 Sonnet, and Claude 3.5 Haiku.
  2. Salesforce (2024). Salesforce's Agentforce Is Here: Trusted, Autonomous AI Agents to Scale Your Workforce.
  3. OpenAI (2025). Introducing Operator.
  4. Gartner (2025). Gartner Predicts 40% of Enterprise Apps Will Feature Task-Specific AI Agents by 2026, Up from Less Than 5% in 2025.
  5. Hannah Arendt (1958). The Human Condition.
  6. Alexandru Mareș (2026). Elastic Automators: A Diagnostic Vocabulary for Language-Model-Driven Workflow Systems.

Published: 05/05/2026 · Read time: 3 min
Topics: General, AI, Agentic AI

    The cursor moved across my screen. My hand wasn't on the mouse. Nothing on the desk had moved. The pointer slid across an open browser, found a button, and clicked it — then it kept going. It opened a tab, typed an address, scrolled, and a form filled itself out, field by field. I watched a small task get done by nobody, and for a few seconds I forgot what I was looking at.

    That moment is not a capability surprise. It is a category surprise. The same kind of machine that had been answering my questions for two years stopped waiting for them — and eighteen months later, in 2026, the scene above is normal.

    A small cohort, one verb

    Back in October 2024, Anthropic shipped Computer Use. The model looks at the screen, moves the cursor, clicks, and types. Salesforce shipped Agentforce within the same week; it began running CRM workflows on its own. A few months later, in January 2025, OpenAI shipped Operator — a research preview with its own browser, filling forms and placing orders. Three companies. Three surfaces. One verb.

    A year and a half on, Gartner forecasts that forty percent of enterprise applications will feature task-specific AI agents by year-end 2026, up from less than five percent in 2025; the press release ran on August 26, 2025. The cohort was small. The verb it shipped is now everywhere inside the building.

    When a thing acts

    When a thing acts, somebody owns the consequence. With a tool, that's the user. A hammer doesn't make a mistake — the carpenter does. A compiler doesn't ship a bug — the engineer does. With an actor, the chain breaks somewhere in the middle of the action, and the question of who is at fault gets harder to answer cleanly.

    Hannah Arendt drew a line through this in 1958, in The Human Condition. She split human activity into three. Labor keeps the body alive. Work makes the durable thing. Action is the one that begins something new in the world. Eighteen months ago, software did labor and work — it computed, it rendered, it stored. It didn't act. Now part of it does. That changes who owns what when it goes wrong.

    The verb, and only the verb

    AI didn't get smarter eighteen months ago. The capability curve kept climbing the way it had been climbing. What changed sat one level up. The verb changed. From query to operate. From answer the question to perform the action. From look up the address to book the flight. That is a different category of thing.

    This is the category I named elastic automation. What people call AI is automation flexible enough to negotiate language — and now, flexible enough to negotiate outcomes. I have watched the same loop run inside my own work for a year and a half. It reads context, picks a move, does the move, checks the result, and picks the next one. Same grammar. Larger stage now.
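    That loop is simple enough to sketch. Here is a minimal, illustrative version of it: read context, pick a move, do the move, check the result, pick the next one. The environment and policy below are toy stand-ins (the names `FormEnv`, `pick_move`, and `operate` are my own, not any vendor's API); a real system would put a language model behind `pick_move` and a browser or CRM behind the environment.

```python
class FormEnv:
    """Toy environment: a web form with fields that need filling."""

    def __init__(self, fields):
        self.pending = list(fields)  # fields still empty
        self.filled = []             # fields the agent has completed

    def read_context(self):
        # What the agent "sees" on this step: the empty fields.
        return {"pending": list(self.pending)}

    def apply(self, move):
        # Execute one action and report the outcome.
        if move == "submit":
            return "done" if not self.pending else "blocked: empty fields"
        self.pending.remove(move)
        self.filled.append(move)
        return f"filled {move}"


def pick_move(context):
    # Trivial policy: fill the next empty field, then submit.
    # A language model would sit here in a real elastic automator.
    return context["pending"][0] if context["pending"] else "submit"


def operate(env, max_steps=10):
    trace = []
    for _ in range(max_steps):
        context = env.read_context()   # read context
        move = pick_move(context)      # pick a move
        result = env.apply(move)       # do the move
        trace.append((move, result))   # check the result
        if result == "done":           # outcome reached: stop acting
            break
    return trace


trace = operate(FormEnv(["name", "email"]))
```

The point of the sketch is the shape, not the parts: swap in a smarter policy and a richer environment and the grammar stays identical, which is why the same loop scales from a toy form to a production workflow.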

    Once the grammar moves, the risk calculus moves with it. Decision-making is no longer downstream of the user; it is co-located with the system. Responsibility attribution becomes a design problem, not a courtesy footnote. The systems we ship now have to declare what they will operate on, what they will not, and where the human is in the loop on purpose, not by accident.

    Why this moment matters

    That was the moment. Three companies. One verb. Eighteen months on, it's everywhere. The grammar changed first. The risk calculus followed. A tool does not act. An actor does. That is the shift.