This installment of #NEARVISION covers the Governance and Primitives of the NEAR protocol.

NEAR insights
17 min read · Aug 24, 2021


Governance defines how the protocol is updated (“Technical Governance”) and how its resources are allocated (“Resource Governance”). Technical governance generally includes fixing bugs, updating system parameters and rolling out larger scale changes in the fundamental technology of the protocol. Resource governance generally includes the allocation of grant funding from community-initiated sources (like the allocation provided to the foundation).

Technical governance is particularly complex because of the required coordination between potentially thousands of independent node operators around the world. Each of those nodes must go through the upgrade process in order to participate in the newest version of the network; any node that does not may end up running (or attempting to start) a separate chain. It is therefore important that the upgrade process is smooth and that the nodes it affects buy into the decisions that have been made.

Many protocols perform the decision-making aspect of governance “off chain”, meaning it occurs in text channels, in-person and via phone calls where key stakeholders or their representatives decide on the best course of action. This uses the dynamic nature of fluid human communication to debug major issues but is also subject to all the challenges of having big personalities and soft power structures in place.

Other protocols lean heavily on “on chain” governance, where decisions are explicitly made by the holders of the protocol’s key resources (eg tokens) via an online voting mechanism. This provides explicit clarity around decision making and rollouts but suffers from the need to very clearly specify each case. It also has potential problems arising from a lack of “human common sense” around some decisions and is therefore vulnerable to certain attacks that an off-chain process would not be.

Governance Design Principles

NEAR Governance Principles

Here is how NEAR’s core design principles apply to governance:

  1. Usability: Governance processes should be clear and understandable. Mechanisms for active participation and for voting (where available) should be simple and straightforward. Governance should be effective and efficient, arriving at decisions quickly and implementing them smoothly. The community of stakeholders should have sufficient voice that they support the legitimacy of decisions and do not exit or fork the platform.
  2. Scalability: Governance should scale as the scope and complexity of the platform itself grows, as the diversity of its stakeholders increases and as the breadth of participation expands.
  3. Simplicity: The most robust processes tend to be the simplest, so good governance should avoid overengineering its processes and acknowledge that human-to-human communication is often the simplest approach.
  4. Sustainable Decentralization: Governance should allow participation from the full breadth of stakeholders in the platform but be resilient against capture by any one of these over time.

It is important that governance design balances efficiency against resiliency. Decisions must be made and implemented efficiently if a technical platform is to continue to evolve and provide the best value for its stakeholders, but the platform must also ensure that it cannot be captured over time by any particular group of stakeholders.


NEAR’s governance is designed to provide for efficient improvement of the protocol while allowing the community sufficient voice and oversight to ensure the protocol maintains its independence. The long-term goals are to combine community-led innovation with effective decision making and execution, and to give proper representation to each of the key stakeholder roles in the network.

For example, the NEAR community initially includes token holders, validators, application developers, protocol developers, community leaders and more. Each of these stakeholders has a different set of views, opinions and inputs to various key issues.

Having proper representation means that decisions will require deliberation and discussion, which can slow down the necessary evolution of the protocol if left unchecked. To maintain a bias for efficient execution, a highly qualified entity is needed to maintain the Reference Implementation of core protocol code. This maintainer, who is called the Reference Maintainer, should be selected and overseen by the community.

Initially, governance activities are coordinated by the NEAR Foundation, an independent nonprofit entity whose mission is well-aligned with improving the long-term usefulness of the ecosystem. These activities include maintaining oversight over the Reference Maintainer, supporting the build-up of governance coordination tools, handling certain token distributions and laying the groundwork for community-operated governance.


As a decentralized network, no single entity can ever force changes to the full NEAR network. Any changes made to the reference code base by its core contributors must be individually accepted by the nodes who are running the network.

It is still important to understand the core process which is used to push changes to the reference code base because these are most likely to represent the will of the community and thus receive acceptance by the network nodes which form part of that community.

NEAR’s governance defines a Reference Maintainer, an entity responsible for technical upgrades to the NEAR network. This entity has been selected to maintain the Reference Implementation and to continue suggesting improvements to the specification. All major releases are protected by community discussion and a veto process (a two-week challenge period), while smaller bug fixes can be rolled out quickly and delivered to node operators.

Initially, the Maintainer is selected by the Foundation Board and serves until the board votes to replace them. Over time, oversight of the Maintainer will shift to an election process that represents the community.


Resources provided by the network itself to the Protocol Treasury are governed and distributed by the NEAR Foundation. This foundation operates independently and will provide structured and transparent funding for projects and activities that are deemed to be most helpful to the ongoing health of the protocol’s ecosystem. This may include technical projects (like the Reference Maintainer) and nontechnical projects or initiatives that support the commons and the community at large.


With the emergence of every new computing paradigm, there is a significant amount of uncertainty about exactly how it can be most effectively utilized and what it means for the future of innovation. This time is no different.

A blockchain-based application platform like NEAR combines two existing cloud services — compute and storage — in a trustless and permissionless manner. The combination of these services in this way creates a set of brand new primitives which can be used to build new applications, new value chains and new businesses.

With something so new, it is generally best to start at this fundamental level in order to understand what it can — and also what it should not — be used for. It is important to acknowledge that the universe of possibilities created when new primitives and new value chains are introduced cannot be fully known during the early phases of the technology’s introduction. Few, for example, could have predicted when the first smartphones came out in the early 2000s how the camera and GPS in everyone’s pocket would change the world.

The following sections will explore what the technology is good for, what it is not good for, and many of the specific primitives that it enables today. They also provide a clear mental model for understanding when blockchain should be applied and when it should be avoided. Discussion of the future is left for the concluding section of this paper.

Technology creates primitives which enable use cases which are implemented by applications.

What the Technology is *Not* Best At

Before discussing primitives, it is important to acknowledge what this technology is *not* best for in order to dispel some persistent misconceptions.

A community-operated cloud like NEAR is neither inexpensive nor fast relative to existing compute and storage solutions alone. This is by definition — the specific benefits of using a community-operated cloud are that it leverages redundancy in both computation and storage to create the security that provides the network with its greatest benefits.

This can be understood on the most basic level by examining how these networks function — by aggregating compute and storage across a number of individual nodes.

If the network is made up of one shard containing 100 nodes and each node is running on its own individual hardware in parallel with the others, then by definition running computation on the network will cost at least 100x more than running it on a single piece of hardware, and it will be slower by at least the time it takes these nodes to communicate with one another over the network.

Similarly, while the storage capacity of the network is theoretically infinite, it is practically limited by the rate and cost at which new validating nodes can be added to the network. The storage capacity for each shard is fixed at a level which enables new validators to participate in the network and to sync with a new shard in time for each new shuffling action. Since each of the validators within a shard replicate its storage, a new shard must be created in order to add new storage capacity to the network. Each new shard requires the addition of a new set of validating nodes. Using the example above, this means that adding new storage capacity to the network will require bringing in another 100 validating nodes, whether as brand new validators or by enabling existing validators to span a greater number of shards. The economics of storage on the network must reflect this reality so it will always cost at least multiples more than running a single piece of new storage hardware.
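The arithmetic above can be made concrete with a minimal sketch in Python. All of the numbers (100 nodes per shard, normalized unit costs, shard capacity) are illustrative assumptions taken from the example, not actual NEAR parameters:

```python
import math

# Illustrative replication arithmetic for a community-operated cloud.
# All figures here are hypothetical, for the sake of the example.

NODES_PER_SHARD = 100          # validators replicating each shard
SINGLE_NODE_COST = 1.0         # normalized cost of one unit of compute/storage

def network_cost(units: float, nodes_per_shard: int = NODES_PER_SHARD) -> float:
    """Every node in a shard performs the same work and stores the same
    data, so the network pays for it nodes_per_shard times over."""
    return units * SINGLE_NODE_COST * nodes_per_shard

def shards_needed(total_storage: float, shard_capacity: float) -> int:
    """Storage grows only by adding whole shards (each with its own
    full validator set), so capacity is added in fixed increments."""
    return math.ceil(total_storage / shard_capacity)

# One unit of computation costs at least 100x a single machine:
print(network_cost(1))            # 100.0
# Storing 250 units with 100-unit shards needs 3 shards (300 validator slots):
print(shards_needed(250, 100))    # 3
```

This is why the cost floor is a structural property of the design rather than an engineering inefficiency: redundancy is the very thing that produces the security.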

Advances in storage technology do allow for some optimizations on how much storage is required but do not remove the fundamental reality of these economics.

Thus it is impossible for a blockchain-based system to credibly claim to be faster or cheaper than a centralized cloud computing provider like Amazon’s AWS or Google Cloud for traditional compute or storage use cases. Applications which require optimization along these dimensions are *not* the use cases that are best enabled by the initial implementation of the NEAR platform.

New Benefits of the Technology

Before examining the specific primitives which are enabled by the platform, we examine the fundamental benefits that are provided by combining compute and storage in a permissionless and trustless way via the community cloud.

This high-level conceptual understanding is important because it moves us beyond simply asking how blockchain can do *existing* tasks cheaper or faster than current cloud architectures and into the realm of what *new* use cases are enabled.



Decentralization can be thought of as a spectrum along an axis which represents how many actors must be corrupted in order to compromise the system. On one end, most of today’s systems (including today’s cloud compute and storage solutions) have a centralization factor of one: by holding the access keys of a single player, data on that system can be arbitrarily modified. On the other end is a fully decentralized system, where one must corrupt dozens, hundreds or even thousands of actors in order to corrupt the underlying data.

Many of today’s blockchain-based systems are actually fairly centralized, whether by design, because their systems incentivize pooling or delegation in a way that creates a small number of very powerful players, or because their governance processes are easily captured.

As previously discussed, there is an additional cost associated with the redundancy provided by decentralization and thus the strongest use cases are those which benefit sufficiently from what this decentralization provides.

Most of the data which is stored in today’s applications is low value and high volume. Decentralization is most important in cases where the data being stored is highly sensitive or vulnerable to censorship, theft, corruption or other forms of modification.

In particular, this means data representing digital money, identity and asset ownership benefits the most from storage on the NEAR platform.


The NEAR platform allows applications to access the same pool of shared data which makes it easy for multiple applications to share state with each other. This is different from traditional web applications, where each application typically stores its data in a proprietary database and these databases typically do not provide easy communication between each other.

The state which is shared across applications can be any of the data types previously mentioned — digital currency, identity, assets and more. This data is cryptographically secured by default so only applications which have the user’s permission can modify their data. Because the users effectively own their own data, they are able to modify — or transfer — it without the permission of a third party application.

This means that not only is NEAR able to store high value data like monetary assets but users and applications can also transfer those assets easily between each other in a way which isn’t possible using today’s platforms.

It should also be noted that data can still be encrypted and protected to preserve security and privacy.


In addition to its core benefits, NEAR gives developers access to a distributed architecture that typically takes months of setup to accomplish. This includes partition-resilient networking, a high-availability database and internationally distributed endpoints.

Developers also gain easy access to a number of primitives which typically require substantial development effort to implement in the current web world. As one example, they have native access to cryptographic primitives, which allows for sensitive use cases. As another, their applications have easy access to Single Sign On (SSO) for all users of the platform so those users experience little to no friction when trying out new applications.

Native Primitives


Now that we’ve introduced some of the high-level benefits of combining compute and storage into a single decentralized and shared data layer, let’s look at the new primitives this enables. Primitives are the fundamental building blocks of use cases.

These primitives fall into the following categories:

  1. Assets: Assets of all types (from money to data) are now digitally native, meaning they are verifiably unique, individually owned and completely programmable.
  2. Accounts: Every actor in the ecosystem has an account which gives them secure storage for their assets, an easy way to verify their identity to applications and an accumulation of reputation over time.
  3. Transactions: Because assets are digitally native and accounts are part of the global pool, programmable transactions between parties are simple, cheap, secure and near-instant.
  4. Verification: Because NEAR’s storage is an immutable public ledger, data and code that are saved to the platform are publicly verifiable for both timing and content.

We’ll examine each of these in greater depth in the following sections.


Asset Primitives

Assets are now digitally native, verifiably unique and fully programmable. This can be used to bring the benefits of digitization to existing assets or to create entirely new categories.

For example, whereas money used to be a single one-dimensional commodity, now it is fully programmable — everything from the terms of its issuance to conditions of its use can be baked directly into the asset itself.

Asset primitives include:

  1. Programmatic Ownership: Each account has verifiable control of its money, its digital goods and its data (which can represent real-world things like identification) as well as the ability to programmatically determine (or split) that ownership.
  2. Digital Uniqueness: A digital asset can be 100% unique, representing anything from a specific fungible coin to a completely nonfungible token.
  3. Programmable Assets: Assets can be programmatically created, evolved and destroyed.
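As an illustration of the Programmatic Ownership primitive, here is a minimal Python sketch. It is not NEAR smart-contract code; the `Asset` class, its `split` method and the account names are all invented for the example:

```python
# Illustrative sketch of programmatic, splittable ownership.
# The Asset class and account names are hypothetical, shown only
# to make the primitive concrete.

class Asset:
    def __init__(self, asset_id: str, owners: dict[str, float]):
        assert abs(sum(owners.values()) - 1.0) < 1e-9, "shares must sum to 1"
        self.asset_id = asset_id
        self.owners = owners  # account -> fractional share

    def split(self, owner: str, new_owner: str, fraction: float) -> None:
        """Programmatically transfer part of one owner's share to another."""
        share = self.owners[owner]
        assert 0 < fraction <= share, "cannot transfer more than is owned"
        self.owners[owner] = share - fraction
        self.owners[new_owner] = self.owners.get(new_owner, 0.0) + fraction

art = Asset("artwork-1", {"alice.near": 1.0})
art.split("alice.near", "bob.near", 0.25)
print(art.owners)  # {'alice.near': 0.75, 'bob.near': 0.25}
```

The point is that ownership rules live in code attached to the asset itself, rather than in a third party’s proprietary database.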


Account Primitives

Every actor in the ecosystem, whether human, contract or device, is treated as having a top-level account. Treating each of these as first-class citizens of the platform allows for both old-world identity and new styles of interaction between autonomous or semi-autonomous actors.

Account primitives include:

  1. Autonomous: Everything on the chain gets assigned to an account. Accounts can represent a person, company, app or even a thing (eg a refrigerator). Accounts are each first-class citizens regardless.
  2. Single Sign-on (SSO): One account works across the whole world of apps and anything else that wants to tie into the chain.
  3. Reputation/History: Every account’s transaction history gives it reputation which can be used across services.
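A hypothetical illustration of the Reputation/History primitive in Python. The scoring rule and the transaction records are invented; real services would define their own metrics over the shared public history:

```python
# Illustrative sketch: deriving a reputation score from an account's
# on-chain history. The scoring rule is an invented example.

from datetime import date

history = [  # (date, counterparty, completed_successfully)
    (date(2021, 1, 5), "bob.near", True),
    (date(2021, 2, 9), "carol.near", True),
    (date(2021, 3, 1), "dave.near", False),
]

def reputation(txs) -> float:
    """Fraction of successful transactions. Because the history is
    public and shared, any app can recompute this independently."""
    if not txs:
        return 0.0
    return sum(ok for _, _, ok in txs) / len(txs)

print(round(reputation(history), 2))  # 0.67
```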


Transaction Primitives

The provision of Asset and Account primitives in the same shared data pool makes it trivially simple to create seamless interactions between those elements in ways which are almost impossible outside of the digitally native medium. The most commonly cited use case involves permissionless peer-to-peer transfer of money but this applies more broadly to any kind of digital asset.

Transaction primitives include:

  1. Direct: Transfers can be made directly between accounts without requiring the code or permission of a specific application, allowing peer-to-peer marketplaces or transfers.
  2. Instant: Financial and digital-good transactions have finality in seconds and do not require long waiting periods for confirmation.
  3. Micro: Negligible fees make high frequency or small quantity transfers significantly more viable than current financial infrastructure allows.
  4. Conditional: A smart contract can easily add logic to transactions, for example to create conditional escrow or time-based releases of currency or data.


Verification Primitives

The immutable public ledger used to store data on NEAR creates both a verifiable record of what has occurred in the past and a verifiable repository of the code which is being run behind particular applications. This can be used in a number of creative ways from the small (is a dice game actually using randomness?) to the large (creating audit trails for supply chains).

Verification primitives include:

  1. Checkpointing: Cryptographic timestamping means it is easy to store verifiable checkpoints of state at a particular time, allowing applications to verify the authenticity or occurrence of previous activity.
  2. Verify Process Integrity: The code which runs apps deployed to the platform can be verified in a way that current server-side code cannot.

Combinatorial Primitives


While it isn’t hard to examine the low-level primitives which are natively provided by the technology of the platform, perhaps the most exciting possibilities come from combining multiple primitives to create higher level primitives. While many of these will be discovered over time, some examples include:

  1. Permissionless Markets: Most markets today require permission from someone in order to function, for example the provider of the marketplace where activity occurs. The combination of multiple low-level primitives disintermediates this control and allows permissionless markets to flourish in places where there was no room to operate previously. This requires the combination of:
     - A native medium of exchange (to transact in) and unit of account (to price in). Note that dynamic cross-currency conversion can make this easier across currencies but users generally still prefer a single Schelling point currency.
     - Verifiable ownership of an asset
     - Peer-to-peer/permissionless transfer of the asset
     - A censorship-resistant marketplace application (which provides discoverability, matching and pricing)

  2. Derivative Assets: While it’s significant to provide censorship-resistant asset storage in the first place, combining multiple low-level primitives allows us to create an infinite variety of new assets which combine existing assets, transactions and logic to meet the risk management needs of anyone so inclined to use them. This requires the combination of:
     - Verifiable ownership of an asset
     - Programmatic escrow of an asset
     - Programmatic rights transfer
  3. Open State Components: Any app has access (when granted) to the shared pool of state data, whether in regards to specific assets or users of the platform. This allows components to operate as microservices do in today’s applications — performing specific functions that can be composed together to achieve larger business goals. Because they are public, competition will ensure that the best of these achieve usage. This requires the combination of:
     - A shared data pool
     - User-sovereign data
     - Verifiable process integrity

The Future

The future, like the Internet itself, is infinitely configurable. We don’t know what it will look like but we can both identify some of the key forces which will govern its path and predict some of the major tools that will take us there.

Privacy: By default, activity and data on the blockchain happen in plain sight. The essence of privacy, however, is choice: the ability to decide whether one’s activity is made transparent or hidden from view. Though the default technical tooling doesn’t provide this privacy protection, a number of solutions applied on top of it make it possible.

In the weak form, data can be encrypted before writing to the chain. This generally protects the integrity of the data itself, but transactions remain vulnerable to tracking, and good analysis can often piece together what actually occurred. New technologies like zero-knowledge proofs therefore present an interesting opportunity to make not just the data private but also the very computations that modify it.
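The weak form can be illustrated with a simple commitment scheme (salted hashing rather than full encryption, to keep the example dependency-free). The content written on chain stays hidden, but the fact that a transaction occurred remains visible, which is exactly the weakness described above:

```python
# Illustrative sketch of weak-form privacy: write a salted hash (a
# commitment) on chain instead of plaintext. The content is hidden,
# but the transaction itself is still observable.

import hashlib
import secrets

def commit(data: bytes) -> tuple[str, bytes]:
    salt = secrets.token_bytes(16)
    digest = hashlib.sha256(salt + data).hexdigest()
    return digest, salt   # digest goes on chain; salt stays with the user

def verify(digest: str, salt: bytes, data: bytes) -> bool:
    """Later, the user can reveal salt + data to prove what was committed."""
    return hashlib.sha256(salt + data).hexdigest() == digest

on_chain, salt = commit(b"my private bid: 42")
print(verify(on_chain, salt, b"my private bid: 42"))  # True
print(verify(on_chain, salt, b"my private bid: 99"))  # False
```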

This technology isn’t baked into the NEAR platform from day one but, should the community push for it, it is possible to implement.

Private Shards: Not all blockchain use cases demand the full security and protection of the public chain for each transaction. Sometimes a consortium of users or even a single entity would prefer to run their own chain, where they control all of the validation and periodically checkpoint back to the main chain for security or verification and communication of activity. In this case, particular shards could potentially be configured to use a special predetermined validator set, thus making them “private”.

Mobile Nodes: Sharding is a horizontal scaling technology where the total processing power of the network is proportional to the number of CPUs attached to it. In simpler terms, the more devices which are supporting the community cloud by participating in the validation process, the more transaction throughput the network can handle.

There are billions of devices with viable CPUs spread across the world. The network will be able to achieve extraordinary scalability by tapping into even just a fraction of the more robust of these devices. But should additional capacity be required, the requirements for running nodes could be adjusted such that even mobile devices could participate. While the engineering tradeoffs are important, this could provide access to another billion nodes running in everyone’s pockets and is thus an interesting area for future exploration.

Internet of Things (IoT): IoT devices are an even more specialized case than mobile devices because they represent the lowest processing power and the highest number of available CPUs.

Composable Components (The Open Web): It starts with a global, free currency and continues with unkillable applications but, eventually, the dream of the open web becomes one where all the available applications can be easily assembled to create new functionality. Consider the hardware analogy of what the GPS, camera and an internet connection of the modern smartphone have unlocked and apply the fluidity of software to it. There’s no telling how rapidly innovation can occur in a world where this is possible. With NEAR’s global state accessible to all applications, this future will become reality.

What’s Next?

Take the first step! Stay in touch by joining the mailing list.

Development of the protocol is open source, and you can learn more about how the code works and see examples of what you can build in the project documentation. You can ask questions in the community channels or on Stack Overflow.

You can visit NEAR Mates, NEAR insight and NEARians for daily news and infographics about the NEAR ecosystem, and consider giving us a follow to keep up with the fastest updates.

Regardless of your experience level or skills, there is a way for you to participate so please join the journey and help NEAR build the future.

And that’s the end of the post. What do you think? Comment down below and let us know.

FOLLOW US ON SOCIAL MEDIA: NEAR insight (@insight_near) / Twitter


In the next episode, we’ll take a deep dive into “Nightshade”, the key technology of NEAR Protocol. Stay tuned!




Insights, news and information on NEAR Ecosystem, focusing on NEAR Protocol’s Technology