Having a gander at Goose
Today I’ve been having a play with Goose, a new AI agent that you can run on your local machine. No subscriptions, highly configurable and extensible, and open source.
I’d been planning on getting around to setting up Claude Cowork, but I really want to stick to my local-and-open-first principle, so Goose sounded like a great alternative.
A few weeks ago I tried playing with OpenClaw, getting it to run a local model instead of burning through 120,000 tokens for every request, but it didn’t work. Goose feels similar but there’s more control over what it can do on your machine.
Setting it up
Setting up Goose and getting it connected to LM Studio was super easy, thanks to Goose’s native LM Studio support. This works because LM Studio exposes the OpenAI API, which has become a de facto standard for local model servers – another win for standards.
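For anyone wanting to try the same setup, this is roughly what the relevant bits of Goose’s config look like. The key names here are my best recollection of Goose’s docs, so treat them as assumptions – `goose configure` walks you through it interactively anyway – but the LM Studio endpoint really does default to `http://localhost:1234/v1`.

```yaml
# ~/.config/goose/config.yaml (illustrative sketch – exact keys may differ)
GOOSE_PROVIDER: lmstudio          # assumption: the provider name for LM Studio
GOOSE_MODEL: qwen2.5-7b-instruct  # whichever model LM Studio has loaded

# LM Studio's local server listens on http://localhost:1234/v1 by default,
# speaking the OpenAI-compatible chat completions API – which is why Goose
# (or any OpenAI-style client) can talk to it without special glue code.
```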
Goose has a bunch of built-in extensions that provide its agent capabilities: general software development tools, web scraping, file caching, automations, to-do lists and sub-agents.
Adding more tools and data
You can add your own extensions through the MCP server standard (thanks, Anthropic). I’ve started using Colin Devroe’s Signboard to manage tasks and projects, as it’s built on the ‘file over app’ principle, meaning everything is stored as Markdown. It comes with its own MCP server, so I followed the MCP_README to get it up and running with Goose.
That was surprisingly easy too. Mostly click-ops, two lines updated in a config file, and that was it. Goose could see my project boards, read my task list, move things around, add stuff. There’s real potential for automated task management emerging here.
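For the curious, here’s the shape of what those config lines were. An MCP server run over stdio gets registered as an extension in Goose’s config – the key names below are my best recollection of Goose’s docs, and the command and path are placeholders, not Signboard’s real ones (its MCP_README has the actual values):

```yaml
# extensions section of ~/.config/goose/config.yaml (illustrative sketch)
extensions:
  signboard:
    enabled: true
    type: stdio        # the MCP server is spoken to over stdin/stdout
    cmd: node          # hypothetical – use the command from Signboard's MCP_README
    args:
      - /path/to/signboard-mcp-server.js   # hypothetical path
```

The nice design choice in MCP is that the client doesn’t need to know anything about the tool beyond how to launch it; the server describes its own capabilities once the connection is up.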
Next I hooked it up to my highlights and reading list through the Readwise MCP. One of the tasks on my board is to ‘read about rhizomatic learning’, and I’ve saved five posts about that. I asked Goose to look at them and tell me which to read first for a base understanding and which I should read next. Then I told it to plonk that ordered list in the card on my board, which it did.
Straight away I started looking at what other MCP servers exist, for example, whether I can hook it up to my calendar or email. A new world of automation is opening up, but all of it’s happening on my local machine, securely, for free, using energy from my supplier (with a heavy mix of renewables).
Markdown and standard config files make all of this so easy to implement too. Configuring APIs to pass data around from different cloud services is such a faff, but I can do this all locally. In fact, if you pass README files and config examples to Goose, it’ll use whatever model is loaded to figure out how to implement the extension for you. It feels really accessible.
So what?
In the early 2010s, SaaS products with extensible APIs were exciting. Then companies removed or limited their APIs, prices went up, and you had to be picky about which things you subscribed to.
It’s the same with AI subscriptions now. I know some people have two or three, but it’s possible to have none if you run everything locally.
It’s kind of exciting. We’re in a place where it’s easy enough for relatively technical people to build their own boring tiny tools. For free.
Note: Last week I went to Goose in Worthing, an independent pub on the seafront. It gets the sun all day and is the perfect spot for a sunny pint – hence the OpenGraph image on this post!
