Federation

We GLOW together.*

Every existing AI product makes you choose: own your data and learn alone, or share everything and benefit from collective learning. We refused that trade. Federation is the third path — sovereignty and compounding intelligence, by architecture.

The current landscape

Two options, both broken.

AI systems today fit neatly into two camps. Both make you give up something you shouldn't have to give up.

Option 1

The Silo.

Run your own AI in isolation. Self-hosted models, personal RAG setups, ChatGPT's per-account memory, locally-cached knowledge bases. Your data is yours; nothing leaves the machine.

What you get

  • Complete data sovereignty.
  • Offline capability.
  • No vendor lock-in.

What you give up

  • You learn alone. Your AI never benefits from anyone else's experience. The wheel gets reinvented every time you ask a question someone else has already answered better.
  • No compounding intelligence. If a workflow pattern succeeds for ten thousand other users, your AI doesn't know. It has to discover the pattern from scratch on your machine.
  • Quality plateaus at your patience. The system gets smarter only as fast as you can teach it. That's slow.
  • You become your own software-supply-chain officer. Want a new integration? Search online, vet a stranger's GitHub, audit their code, install at your own risk. Heavily downloaded packages on major registries have repeatedly turned out to be malicious — typosquats, post-acquisition payload swaps, plain old supply-chain attacks. Silos make every upgrade a security decision you weren't trained to make.
  • The developers who keep silos working are running on fumes. Open-source community projects rely on goodwill and donation buttons. Maintainers burn out. Critical infrastructure gets handed off to whoever volunteers — sometimes to the wrong person. The incentives don't pay for the work, so the work erodes. You're depending on a system whose economics are broken.

Silos are private and inert. Worse, they push every safety decision down to you, and they run on the goodwill of burnt-out maintainers.

Option 2

The Cloud Service.

Use a centralized AI provider. ChatGPT, Claude.ai, Gemini, Copilot, every major frontier API. The model is hosted by a company. Your conversations route through their servers. The model improves on their schedule.

What you get

  • Frontier-quality capability.
  • Improvements every release.
  • No infrastructure to run.

What you give up

  • You pay twice — once in money, once in data. The subscription buys you access. Your conversations, your queries, and the patterns of your work fund the model's next generation. You're billed and you're harvested. Most users never realize this is the structure.
  • You're locked into a system you don't control. The vendor sets the terms: the pricing, the price hikes, the deprecation schedule, the geographic availability, the rate limits, the data retention, the censorship lines. You don't get a vote. You can leave, but your accumulated context goes with the relationship.
  • The "secure and private" pitch is neither. Encrypted in transit, sure. Encrypted at rest with their key, sure. But "their key" is the exact opposite of secure-against-them. Their employees can be compelled, breached, suborned, or simply curious. Their servers are subject to whatever legal process reaches them. Whatever data-handling commitments they make today are revisable when the company gets acquired, sued, or short on quarterly revenue. The privacy posture is a marketing position, not a structural property.
  • Your data, by default. Your conversations and files become training material — sometimes by terms-of-service consent, sometimes by ambiguous data-handling terms, occasionally just by breach. The default posture is "we keep what you send us."
  • The improvement loop never closes for you. Your specific patterns get aggregated into a training set; you contribute. You never see the loop come back. Your AI doesn't get particularly better at your work — just generally better.
  • The model improves on their schedule, not yours. Your sessions reset between major versions. Your accumulated context evaporates when the company decides to ship a new generation.
  • You don't own anything. Not the model, not the memory, not the relationship. The product is a rental, structurally.

Cloud services improve, but you pay twice for that improvement and end up with neither real security nor real privacy in exchange. You're a paying training-data source; the relationship is structurally backwards.

The third path

Federation.

Each participant runs their own AI on their own hardware. Memory stays local. Conversations stay local. Files stay local. Sovereignty is total.

And the system still gets smarter — for you and for everyone — because anonymized patterns of what worked flow back to a shared learning layer. Not your content. Not your identity. Just the signal that "this kind of approach worked for this kind of task." That signal, multiplied across thousands of participants, becomes a tide that lifts every client on the network.

We named the company after the natural phenomenon for a reason. Alpenglow is the warm light reflected between peaks at dawn and dusk — each mountain illuminated by sunlight bouncing off the others. No peak glows alone. Each is warmer because the rest are warmer.

That's federation, made literal. We glow together.

How it works

Anonymized signals, not raw content.*

Federation runs as a regular cycle (weekly during BETA — the cadence is tuned for stability and anti-abuse). On each cycle, your client computes anonymized signals from what's worked and what hasn't, encrypts them, and contributes them to a shared pool. The pool aggregates across all participants and produces refinements that flow back to every client on the network.

What flows out

Aggregate signals.

Which models succeeded at which kinds of tasks. Which workflows produced confirmed-good outcomes. Which marketplace artifacts got installed and kept. Which integrations broke. Patterns of behavior, structurally decoupled from specific content.

What does NOT flow out

Your content.

Conversations. Documents. Files. Working memory. Names. Anything traceable to you, your account, or a specific exchange. None of it transmits during federation. None. That's not a policy; it's an architectural property.
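To make the out/not-out split concrete, here is a sketch of what a per-cycle contribution could look like. Everything in it — the field names, the `FederationSignal` type, the `build_contribution` helper — is illustrative, not the actual Alpenglow wire format; the point is that the payload carries only aggregate pattern counts, with no content fields to leak.

```python
from dataclasses import dataclass, asdict
import json

# Hypothetical contribution record. Field names are illustrative,
# not the real protocol. Note what has no place to live here:
# no user id, no conversation text, no file paths, no timestamps
# tied to a specific exchange.
@dataclass(frozen=True)
class FederationSignal:
    task_kind: str   # coarse task category, e.g. "code-review"
    model_id: str    # which model handled it
    outcome: str     # "confirmed_good" | "failed" | "abandoned"
    count: int       # how many times this pattern occurred locally

def build_contribution(signals: list[FederationSignal]) -> bytes:
    """Serialize aggregate signals, ready for encryption and upload."""
    payload = {"signals": [asdict(s) for s in signals]}
    return json.dumps(payload, sort_keys=True).encode()

contribution = build_contribution([
    FederationSignal("code-review", "model-a", "confirmed_good", 14),
    FederationSignal("summarize", "model-b", "failed", 3),
])
```

In this framing, "anonymized at the device level" means the schema itself has nowhere to put identifying data before anything leaves the machine — a structural property rather than a filtering step.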

What's emergent in the third path

Things only federation lets us do.

Your AI gets better at your work, specifically.

Federation doesn't just push generic intelligence downward. It surfaces the patterns relevant to people doing what you do — engineers benefit from engineering patterns, lawyers from legal patterns, researchers from research patterns. Your agent is shaped by the work it actually sees.

The marketplace becomes a meritocracy by usage.

Federation telemetry is what ranks marketplace artifacts. Things that work get surfaced; things that don't fade. Nobody — including us — can buy ranking. The signal comes from the user base.
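One way to picture usage-driven ranking — a minimal sketch, not the production algorithm — is a smoothed keep-rate: rank each artifact by the share of installs that were kept, shrunk toward a neutral prior so that an artifact with two lucky installs can't outrank one proven across a thousand.

```python
def keep_score(installs: int, kept: int, prior: float = 0.5, weight: int = 10) -> float:
    """Smoothed keep-rate. Shrinks toward `prior` when data is thin,
    so 2/2 keeps doesn't beat 900/1000. Parameters are illustrative."""
    return (kept + prior * weight) / (installs + weight)

# installs, kept — hypothetical telemetry aggregates
artifacts = {
    "tool-a": (1000, 900),
    "tool-b": (2, 2),
}
ranked = sorted(artifacts, key=lambda a: keep_score(*artifacts[a]), reverse=True)
# tool-a ranks first despite tool-b's perfect raw keep-rate
```

The design point: because the inputs are aggregate install/keep counts from federation telemetry, there is no per-artifact knob anyone can pay to turn — ranking moves only when real usage moves.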

Integrations heal themselves.

When a third-party API breaks, federation surfaces it within hours — across thousands of failed attempts at once. Publishers see the signal and patch fast. Your agent doesn't need to be the canary that finds the bug.
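The detection side of that loop can be sketched as a threshold on aggregated failure reports — illustrative names and numbers, not the real detector:

```python
from collections import Counter

def broken_integrations(failure_reports: list[str], min_clients: int = 100) -> list[str]:
    """Flag integrations whose failure reports cross a network-wide
    threshold. Each report is just an integration name — one per
    affected client per cycle — so individual clients stay anonymous,
    yet a mass breakage is unmistakable in aggregate."""
    counts = Counter(failure_reports)
    return [name for name, n in counts.items() if n >= min_clients]

# Hypothetical cycle: one integration broke for thousands of clients,
# another saw only scattered, below-threshold failures.
reports = ["calendar-api"] * 4200 + ["weather-api"] * 12
alerts = broken_integrations(reports)
# ["calendar-api"] — surfaced network-wide; "weather-api" stays quiet
```

A single silo sees one failed call and can't tell a flaky network from a broken API; the aggregate sees four thousand at once and can.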

Improvement is continuous, not gated.

No "next major version" reset. No "we're retraining, check back in six months." The system gets better constantly, in increments small enough you don't notice individually but large enough that month-twelve Alpenglow is meaningfully different from month-one.

The collective future

A network that grows in your favor.

Today's AI economy runs one direction. Platforms harvest data from billions of people, train on it, and sell access back to the same people. The value created by participant activity flows upward — to model providers, ad networks, data brokers — and never returns to the people who created it.

Federation is the architecture to invert that. Not today. Today the network is small, and the federation signal stays inside it for everyone's benefit. But the ground is laid, and at scale, structurally different things become possible — none of them defaults, all of them participant-controlled:

  • Leverage over integrated data flows. Right now, when you connect Alpenglow to a third-party service, that service's terms apply — we don't control them. At network scale, that changes. Millions of sovereign agents represent collective weight that one participant alone doesn't have, and the terms participants can negotiate with upstream providers, third-party platforms, and data infrastructure look structurally different at that size.
  • Compensation returns for participants — opt-in only.* Today, federation signal stays internal to the platform; nothing is monetized. One day, with explicit participant opt-in, the network may support arrangements where value created by aggregated, anonymized signal flows back to the participants who chose to take part. The default stays full privacy and zero monetization. The opt-in is yours. The proceeds are yours.
  • Scale-only capabilities. Some things only become possible at network scale — kinds of federated learning, kinds of cross-participant integrity checks, kinds of safety signal that no single participant could generate alone — and federation is what makes them computable without ever centralizing raw data.

The point is structural. The bigger this network gets, the more leverage participants collectively have, and the more value flows into the network instead of out of it. That's a long-term frame, not a short-term promise. We're at the start. The scale is what we earn together.

The commitments

What federation will never become.

  • Not a data sale. Federation contributions are not sold, licensed, or shared with any third party. Not advertisers. Not data brokers. Not research firms. Not AI labs. Not the providers whose APIs you use.
  • Not a back door for surveillance. Federation contributions are anonymized at the device level before transmission. There is no key on our side that decrypts an individual's contribution back to their identity.
  • Not optional in name only. If you opt out of federation in Settings, your client genuinely stops contributing. You still benefit from improvements other people produce; you just don't contribute back. Flip it any time.
  • Not training data for an external model. Federation feeds back into Alpenglow itself. We don't bundle contributions into training sets for upstream providers.
  • Not an attention surface. No advertising, no recommendation feed, no engagement metric. Federation is the substrate getting smarter, full stop.

For the precise specification — what gets observed, what doesn't, when contributions are deleted, how anonymization works — see the Privacy page.

The bigger frame

A different default for AI.

For most of computing's history, the trade has been: keep your stuff private and build alone, or join the network and accept the surveillance. The web inherited that trade. Cloud AI doubled down on it.

Federation is what happens when you refuse the trade and build the math to back it up. Sovereignty by architecture. Collective intelligence by architecture. Both, at once, because neither alone is enough.

Alpenglow is the first product where this is the default, not the exception.

Glow with us.

Get on the BETA. Run your own client. Watch it get smarter — without ever sending us anything we shouldn't have.

Join the BETA