Omar Shahine has a personal AI agent named Lobster. It tracks his packages across 25 different carriers. It triages his email by urgency before he opens his inbox. It warned him about an early-morning meeting the night before. It coordinated a Super Bowl watch party with his family. It has nine sub-agents running in parallel, doing different things simultaneously, and it writes its own diary entries.
Shahine is not a hobbyist blogger. He is a corporate vice president at Microsoft, the former leader of Microsoft Word, and as of March 2026, the person running Microsoft's push to bring that same kind of always-on, task-completing AI agent to 400 million Microsoft 365 users. His personal agent is the demo.
Microsoft confirmed this week that it is building autonomous AI agents, software that acts on your behalf without waiting to be asked, into its Microsoft 365 Copilot product, the suite of AI tools embedded in Word, Excel, Teams, and Outlook. The company created a new team called Ocean 11, led by Shahine, to build what he describes as "a new generation of proactive assistants, ones that lighten your load by taking on tasks end-to-end, and that they can also step in proactively when they can help." The earliest public preview is expected at Microsoft's Build conference in June.
The announcement was framed as exploration. The blog posts are not. Shahine has spent the past eight weeks documenting Lobster's development in public: nine dedicated agents, 34 software updates, a changelog, screenshots of real iMessage conversations, and a multi-agent architecture where family members each get restricted sub-agents with no access to his private email or financial data. He wrote: "The harness matters as much as the model. Sessions, memory, channels: these are the OS."
What Shahine is describing, always-on agents that monitor your calendar, act on your email, coordinate across your apps, and operate with persistent memory and a sense of identity, is exactly what Microsoft is now planning to sell to enterprise customers. And it is already live in one form inside Microsoft's own products. There is a fully integrated OpenClaw plugin for Microsoft Teams, enabling direct messages, group chats, or channel conversations with an OpenClaw agent running inside Microsoft's own infrastructure.
OpenClaw is the open-source agent framework at the center of this shift. Created by developer Peter Steinberger, who joined OpenAI in February 2026 to work on bringing agents to everyone, OpenClaw lets software operate autonomously on a machine: managing files, sending emails, browsing the web, automating workflows across existing applications. It has accumulated more than 354,000 stars on GitHub, been forked more than 70,000 times, and generated nearly 50,000 related code repositories. More than 44,000 agent skills, the modular capability packs that extend what an agent can do, are listed on ClawHub as of April 2026. That is a developer community voting with its attention in a way that is hard to dismiss.
The enterprise world is taking notice. Tencent launched its own OpenClaw product suite in March: QClaw for individual users, Lighthouse for developers, and WorkBuddy for enterprises, with WorkBuddy supporting more than 20 skill packages and the Model Context Protocol, the open standard that lets different AI systems talk to each other. Alibaba Cloud, Moonshot, and Xiaomi have released OpenClaw-compatible applications. Nvidia built NemoClaw, an enterprise-grade security layer on top of OpenClaw, with Adobe, IBM's Red Hat, and Box signaling interest. These are not startups guessing at a market. They are infrastructure companies placing bets on a specific technical architecture.
What Microsoft is attempting is a specific kind of integration: not just embedding a chatbot into Office, but giving AI agents persistent memory, cross-application coordination, and the ability to operate over hours or days rather than in single question-and-answer exchanges. The company has already incorporated technology from Anthropic's Claude Cowork into Copilot, enabling long-running, multi-step work that executes across apps and files over time. The distinction matters. A chatbot responds when asked. An agent acts before it is asked, then reports what it did.
This is where Shahine's framing is useful. He describes his agent's security model as raising a teenager: start with strict rules, watch how they handle small responsibilities, and gradually grant more autonomy based on trust. That is not a metaphor for a product demo. It is a description of how enterprises will actually need to deploy and govern these systems: with explicit permission boundaries, audit logs, and revocation controls that most organizations have not yet built.
The governance gap is real. OpenClaw has been shadowed by security incidents: vulnerabilities that let agents download data they should not have accessed, and attack patterns like prompt injection that can manipulate agent behavior through ordinary-looking inputs. Nvidia's NemoClaw exists precisely because enterprise buyers want security controls that the base OpenClaw platform does not provide by default. Microsoft will have to solve this before IT departments and legal teams sign off on always-on agents that have access to every email, every document, and every conversation in a Microsoft 365 tenant.
The OpenClaw ecosystem itself is in a period of structural transition. Steinberger's move to OpenAI means the framework's original creator is now inside one of the major model providers, while the framework itself transitions to an independent foundation. How that governance structure evolves will determine whether OpenClaw remains genuinely open or becomes another proprietary layer in an increasingly crowded AI infrastructure stack.
What Shahine's blog posts describe is the consumer proof of concept for what Microsoft is now architecting for the enterprise. Whether that translation from a personal assistant running on one executive's laptop to a managed, governed, enterprise-deployable product can happen cleanly, and on the timeline Microsoft needs, is the question that Build in June will start to answer.