fed_AI: A Federated Approach to Accessible, Local, and Trustworthy AI

fed_AI is an attempt to rethink how artificial intelligence services are built, deployed, and accessed. Instead of concentrating capability in a handful of large data centres owned by a few companies, fed_AI proposes a federated network of independently operated machines that collectively provide AI services. The goal is not to compete head-on with hyperscalers on raw model size, but to offer a practical alternative that prioritises accessibility, resilience, cost transparency, and user control.

At its core, fed_AI separates the system into roles. Some machines act as routers, responsible for receiving requests, selecting appropriate compute resources, and coordinating execution. Other machines act as nodes, running AI workloads based on their hardware capacity. This distinction allows low-spec devices to participate meaningfully alongside more powerful systems. A small server, desktop PC, or even repurposed hardware can contribute, as long as it can perform a clearly defined task reliably.
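
As a rough sketch of that split, the fragment below models a router choosing among nodes by advertised capability and current load. The class and field names are purely illustrative and are not taken from the fed_AI codebase:

    # Hypothetical sketch of the router/node split described above.
    from dataclasses import dataclass, field

    @dataclass
    class Node:
        node_id: str
        capabilities: set        # e.g. {"summarise", "translate"}
        max_concurrent: int      # what this hardware can handle in parallel
        active_jobs: int = 0

        def can_accept(self, task: str) -> bool:
            return task in self.capabilities and self.active_jobs < self.max_concurrent

    @dataclass
    class Router:
        nodes: list = field(default_factory=list)

        def select_node(self, task: str):
            # Pick the least-loaded node that advertises the requested capability.
            candidates = [n for n in self.nodes if n.can_accept(task)]
            return min(candidates, key=lambda n: n.active_jobs) if candidates else None

    router = Router(nodes=[
        Node("pi-cluster-01", {"summarise"}, max_concurrent=1),
        Node("workstation-07", {"summarise", "image-gen"}, max_concurrent=4),
    ])
    print(router.select_node("summarise").node_id)   # a low-spec node can still win the job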

What fed_AI hopes to achieve

The first objective is decentralisation with intent. fed_AI is not decentralised for ideological purity alone. It aims to reduce single points of failure, vendor lock-in, and opaque pricing models. By distributing workloads across many independent operators, the network becomes harder to censor, harder to monopolise, and more adaptable to local needs.

The second objective is practical affordability. Many users and small organisations cannot justify ongoing subscriptions to proprietary AI platforms, especially when usage is sporadic or highly variable. fed_AI is designed around pay-as-you-go principles, with an emphasis on micro-payments and peer-to-peer settlement. This allows costs to more closely track actual usage, rather than forcing users into fixed plans that assume constant demand.
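
To make the pay-as-you-go idea concrete, here is a minimal metering sketch. The rate, unit, and function names are assumptions for illustration; fed_AI's actual settlement mechanism may work differently:

    # Illustrative per-request metering; the rate and unit are made up for the example.
    from math import ceil

    RATE_PER_SECOND = 2          # hypothetical price per second of compute, in micro-payment units

    def settle(job_seconds: float) -> int:
        # Round each job up to a whole unit so even very small jobs settle cleanly.
        return ceil(job_seconds * RATE_PER_SECOND)

    usage = [0.8, 3.2, 12.0]     # three sporadic requests in a month
    print(sum(settle(s) for s in usage))   # cost tracks actual usage rather than a fixed plan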

The third objective is locality and data minimisation. In a federated model, requests can be routed to nearby or trusted nodes. This reduces latency and, more importantly, reduces unnecessary data exposure. Sensitive inputs do not need to traverse global infrastructures if a local or community-run node can perform the task just as well.
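
One way to express that preference is a simple scoring rule that ranks trusted, nearby nodes ahead of remote ones. The sketch below is a hypothetical policy, not fed_AI's actual routing logic:

    # Hypothetical locality-aware selection: prefer trusted nodes, then the lowest latency.
    def rank_nodes(nodes, trusted_ids):
        def score(n):
            trust_penalty = 0 if n["node_id"] in trusted_ids else 1
            return (trust_penalty, n["latency_ms"])
        return sorted(nodes, key=score)

    nodes = [
        {"node_id": "community-hall", "latency_ms": 8},
        {"node_id": "overseas-dc", "latency_ms": 190},
    ]
    best = rank_nodes(nodes, trusted_ids={"community-hall"})[0]
    print(best["node_id"])   # the sensitive input never needs to leave the local node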

Who could benefit

Independent developers and startups are clear beneficiaries. fed_AI lowers the barrier to experimentation by allowing developers to deploy AI-powered features without committing to a single vendor or architectural model. Because nodes are pluggable, developers can choose models and capabilities that suit their use case rather than defaulting to whatever a large provider offers.
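
From a developer's perspective this could be as simple as naming a capability and a model preference in a single request. The endpoint path and payload fields below are assumptions for illustration only, not a published fed_AI API:

    # Hypothetical client call; the endpoint and JSON fields are illustrative.
    import json
    import urllib.request

    payload = {
        "task": "summarise",
        "model_preference": "any-open-weights",   # pluggable: the router picks a matching node
        "input": "Long document text goes here...",
    }
    request = urllib.request.Request(
        "http://localhost:8080/v1/jobs",          # assumed address of a locally run router
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    # response = urllib.request.urlopen(request)  # uncomment once a router is actually running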

Communities and cooperatives are another key audience. Local groups could operate their own fed_AI infrastructure to provide services to members, such as document analysis, translation, summarisation, or image generation, while retaining governance over how the system is used.

Researchers and educators can benefit from a system that encourages transparency and inspection. Unlike closed platforms, a federated network makes it easier to study performance, bias, and failure modes across diverse hardware and operators.

Finally, individuals with spare compute resources can participate as node operators. Instead of leaving capable machines idle, they can contribute capacity and receive compensation in return. This helps rebalance value away from centralised providers and towards participants at the edge.

Why it should be implemented

The current AI landscape optimises for scale, not necessarily for trust, resilience, or long-term sustainability. Centralised models create dependencies that are difficult to unwind once embedded. fed_AI offers a way to explore a different equilibrium, one where capability is shared, incentives are aligned with contribution, and failure in one part of the system does not compromise the whole.

Importantly, fed_AI is designed to integrate with decentralised identity and discovery mechanisms, drawing inspiration from ecosystems such as Nostr. This makes service discovery, routing, and trust establishment possible without relying on traditional DNS or central registries.
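
As an illustration, a node could announce its capabilities as a signed event that relays distribute and routers subscribe to, so discovery works without DNS or a central registry. The event below follows the general shape of a Nostr event, but the kind number, tags, and simplified ID calculation are assumptions rather than a fed_AI specification:

    # Sketch of a Nostr-style capability announcement; kind number and tag names are assumed.
    import hashlib
    import json
    import time

    announcement = {
        "pubkey": "node-operator-public-key-hex",   # placeholder, not a real key
        "created_at": int(time.time()),
        "kind": 31990,                              # assumed application-specific event kind
        "tags": [["capability", "summarise"], ["capability", "translate"]],
        "content": json.dumps({"max_concurrent": 4, "region": "local"}),
    }
    # Real Nostr event IDs hash a canonical serialisation defined by NIP-01; simplified here.
    announcement["id"] = hashlib.sha256(
        json.dumps(announcement, sort_keys=True).encode()
    ).hexdigest()
    print(announcement["id"][:16], "-> publish to relays so routers can discover the node")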

fed_AI is not a silver bullet, and it does not claim to replace large-scale AI providers. Instead, it fills a growing gap between local needs and global platforms. By making AI more modular, federated, and economically fair, fed_AI represents a practical step toward an AI ecosystem that serves users and communities, not just platforms.

Download or inspect the code at -> https://github.com/imattau/fed_AI

Follow @fed_AI

