PUP-4: Adjust DAOAllocation to Limit Service Node Counts

Note: this proposal has been edited with the consent of the author (Andrew Nguyen) because an alternate solution achieved rough consensus in the thread below. You can view the original proposal by clicking the pencil icon in the top right.

Attributes

  • Author(s): Andrew Nguyen, Adam Liposky
  • Parameter: DAOAllocation
  • Current Value: 10%
  • New Value: grant the Foundation permission to modify the value according to the schedule detailed below

Summary

Re-allocate servicer rewards to the DAO as the node count approaches 5,000, to prevent P2P scalability issues from degrading the reliability of service.

Abstract

Even with the impending separation of validators/servicers in RC 0.6.0, which removes consensus scalability as a concern, Tendermint has P2P scalability issues that are likely to cause degradation of service beyond 5,000 nodes. Therefore, to maintain network reliability as we approach 5,000 nodes, we recommend re-allocating servicer rewards to the DAO to limit the incentive to spin up new nodes beyond that point.

Motivation

We all strive for Pocket to have as many nodes as possible, but there are technical challenges on the road to that vision. These will be addressed with future features that will be shared soon, but the recent explosion in node counts is an unexpected trend that we need to counteract with incentives in order to buy time to build those solutions.

Solution

To be clear, the curve would stop beyond 5,000, meaning the floor of service rewards would be roughly 9%, to ensure that nodes can still break even.

The Foundation would have the discretion to make the changes to the DAO Allocation when it sees fit, as long as those changes are in line with the curve above, and will not need to communicate these changes until after the fact. These changes will be announced in the Foundation category of the forum and the #dao-hq channel in Discord.

This method of implementation will avoid the collective gaming issues raised by @Garandor.
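The curve referenced above is not reproduced in this text. Purely to illustrate the mechanism, here is a minimal sketch assuming a simple linear ramp of DAOAllocation between an assumed starting node count and the 5,000-node cap, with the servicer share floored at roughly 9% as stated above. The breakpoints, the ProposerAllocation value, and the curve shape are placeholders, not the proposal's actual schedule.

```go
// Illustrative sketch only, not the adopted curve: assumes a linear ramp of
// DAOAllocation with node count, starting from the current 10% and capped so
// that the servicer share never drops below the ~9% floor mentioned above.
// proposerAllocation and startNodes are placeholder assumptions.
package main

import "fmt"

const (
	proposerAllocation = 1.0  // assumed, percent of block reward
	servicerFloor      = 9.0  // floor of servicer rewards, percent
	startDAOAllocation = 10.0 // current DAOAllocation, percent
	startNodes         = 2000 // assumed node count where the ramp begins
	capNodes           = 5000 // node count at which the curve stops
)

// daoAllocationFor returns the DAOAllocation (percent) for a given node count.
func daoAllocationFor(nodes int) float64 {
	maxDAO := 100 - proposerAllocation - servicerFloor // DAOAllocation at the cap
	switch {
	case nodes <= startNodes:
		return startDAOAllocation
	case nodes >= capNodes:
		return maxDAO // curve stops here; servicer share floors at ~9%
	default:
		frac := float64(nodes-startNodes) / float64(capNodes-startNodes)
		return startDAOAllocation + frac*(maxDAO-startDAOAllocation)
	}
}

func main() {
	for _, n := range []int{2000, 3000, 4000, 5000, 6000} {
		dao := daoAllocationFor(n)
		fmt.Printf("%d nodes: DAOAllocation %.1f%%, servicer share %.1f%%\n",
			n, dao, 100-proposerAllocation-dao)
	}
}
```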

Rationale

Currently, in the bootstrapping phase of Pocket Network, nodes are receiving very high bootstrapping rewards to incentivize new nodes to join the network. It seems these rewards did a very good job, so now we need to adjust the incentives.

There are many economic knobs that can be turned to curb the growth of nodes:

  1. Reducing RelaysToTokenMultiplier
  2. Increasing the ProposerAllocation
  3. Upping the minimum stake for Validators
  4. Increasing the DAOAllocation

Option 1 was the original proposal. Because we're early in the protocol's life, some felt it would be better for the ecosystem as a whole to maintain inflation at current levels, so that newcomers have the chance to catch up if they're effective node runners and the DAO has more funds with which to reward future contributors.

One counter-argument to this is that the high inflation is precisely what incentivizes node runners to run as many nodes as possible, because they want to minimize their dilution.

Option 2 would have little effect, because one node with 3X stake earns the same expected block reward as three nodes with X stake each. And even if it did have an effect, we would be promoting centralization of the network, which is antithetical to our goals.

Option 3 is untenable because it would force-unstake anyone who is below the new minimum, burning their entire stake. We'd therefore need 100% perfect coordination to ensure everyone is able to top up their stake, which is unlikely. The risk that even one node runner would lose their stake is not something we should take lightly.

That leaves option 4, the solution that is most incentive-aligned with our goals, and which has the added benefit of increasing the DAO's treasury, which it can re-invest in the growth of the ecosystem.

Dissenting Opinions

Copyright

Copyright and related rights waived via CC0.

Side note to voters reading this: check out the List of Governable Parameters if you need a refresher on what each parameter means.

I'd be interested in discussing more reasons for/against reducing the mint rate vs these alternatives, as well as what equivalent values these parameters would need to have to achieve the same effect.

For example, some of my own thoughts:

  • Increasing ProposerAllocation should not be the solution, as this will have no net effect on validator incentives while service/validator nodes remain bundled.
  • Reducing mint rate will dilute existing nodes less than dividing the mint (the Allocation alternatives). Whether that's a good thing or not I don't know.
  • What would the minimum stake need to be to achieve the same effect? My hunch is that increasing the minimum stake would deter new entry-level node operators, but our professional node operators would continue as normal. So I see this having a bad long-term effect on the diversity of our nodes.

Based on the above, I see two options if we're opting to change a parameter:

  • The option proposed, if we want to minimize dilution of existing nodes
  • Increase DAOAllocation by 10% (same net effect), if we want to maintain current bootstrapping incentives but redirect a portion to the DAO, where it can allocate to contributors at its discretion

Echoing Andrew's response to this, I don't think such an undertaking should be rushed.

Some other thoughts:

Be mindful of interplay with wPOKT incentives

If we reduce validator incentives too much, the concerns that some have expressed about wPOKT farming incentives cannibalizing node incentives could become more realistic. To be clear, I don't think reducing node incentives by 1/10 would do that, but I just wanted to make sure we're mindful of that.

Give Foundation permission to change parameter if thresholds are crossed?

One of the main risks I see with this approach is that it leaves us vulnerable to sudden exponential growth. This is especially important when you consider that there will be a lag before nodes change their behavior once the parameter has been adjusted, due to the time taken to communicate the change to all node operators. If we see the kind of growth that warrants this change, it might already be too late.

A middle ground between Approach 2 and changing the parameter now might be to decide that, should the number of validators cross threshold X, the Foundation can change the parameter to value Y. We can extrapolate this and give the Foundation permission at different stages up to the 5,000 limit.

For example:

  • 2,000 nodes → Change RelaysToTokenMultiplier to 9,000
  • 2,500 nodes → 8,000
  • 3,000 nodes → 7,000
  • 3,500 nodes → 6,000
  • and so on…

This would allow the Foundation to dampen the effects of exponential growth while we decide on alternatives, if it comes to that.
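As a rough sketch of how such a trigger schedule could be applied (the thresholds and values are just the examples above, not an adopted schedule, and the assumed current RelaysToTokenMultiplier of 10,000 is an illustration implied by the 10% step, not a confirmed parameter):

```go
// Illustrative sketch of the stepwise schedule idea above: the Foundation
// would set RelaysToTokenMultiplier to the value for the highest threshold
// the active node count has crossed. Thresholds/values are the examples
// above, not adopted parameters.
package main

import "fmt"

type step struct {
	nodeThreshold int
	multiplier    int64 // RelaysToTokenMultiplier, uPOKT minted per relay
}

// schedule must be ordered by ascending nodeThreshold.
var schedule = []step{
	{2000, 9000},
	{2500, 8000},
	{3000, 7000},
	{3500, 6000},
}

// multiplierFor returns the multiplier for the highest crossed threshold,
// or the current value if no threshold has been crossed yet.
func multiplierFor(activeNodes int, current int64) int64 {
	m := current
	for _, s := range schedule {
		if activeNodes >= s.nodeThreshold {
			m = s.multiplier
		}
	}
	return m
}

func main() {
	fmt.Println(multiplierFor(1800, 10000)) // 10000: below the first threshold
	fmt.Println(multiplierFor(3200, 10000)) // 7000: crossed the 3,000-node threshold
}
```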

Actually this isn't true. I understand where your head is at, but the proposer reward differs from the relay reward in that it is weighted proportionally to the amount staked.

Say we make it 25% proposer and 75% Relay (25% of all Relay rewards go to proposer)

There is now a much higher incentive to stake well over the minimum per Validator, as proposing becomes quite profitable!
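To make the stake-weighting concrete, here is a hedged back-of-the-envelope example (all numbers are hypothetical): if proposer selection probability equals stake share, one validator with 3X stake has the same expected proposer reward as three validators with X each, but only one node's worth of infrastructure cost, which is the consolidation incentive discussed in the replies below.

```go
// Hedged back-of-the-envelope example (hypothetical numbers): with proposer
// selection weighted by stake, consolidating three X-stake validators into
// one 3X-stake validator keeps the same expected proposer reward but cuts
// infrastructure cost to one node.
package main

import "fmt"

// expectedProposerReward returns a validator's expected share of the total
// proposer reward per block, assuming selection probability = stake share.
func expectedProposerReward(stake, totalStake, proposerRewardPerBlock float64) float64 {
	return stake / totalStake * proposerRewardPerBlock
}

func main() {
	const (
		totalStake     = 1_000_000 // network-wide stake (hypothetical units)
		rewardPerBlock = 100       // proposer reward per block (hypothetical)
		x              = 15_000    // one stake unit (hypothetical)
	)
	three := 3 * expectedProposerReward(x, totalStake, rewardPerBlock) // three nodes, X each
	one := expectedProposerReward(3*x, totalStake, rewardPerBlock)     // one node, 3X
	fmt.Printf("three X-stake nodes: %.2f, one 3X-stake node: %.2f\n", three, one)
}
```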

  • Reducing mint rate will dilute existing nodes less than dividing the mint (the Allocation alternatives). Whether that's a good thing or not I don't know.

This is also my thought… But if liquidity is a big concern, there are alternatives.

  • What would the minimum stake need to be to achieve the same effect? My hunch is that increasing the minimum stake would deter new entry-level node operators, but our professional node operators would continue as normal. So I see this having a bad long-term effect on the diversity of our nodes.

This is also possible, but the same thing could happen with any economic reduction. Small businesses usually fall before the corporate chains do, due to profit margins.

I like this approach.


Ah this is a good point. So it incentivizes consolidation of validator stakes by our biggest node operators, which should reduce the total node count.

Are there any negative side effects? One that I can think of is that it also dilutes smaller node operators in relative terms, which might lead to centralization of our node ecosystem over time.


It's easy enough to spin up additional nodes at the moment, as relays are still relatively low, meaning that one ETH node can support quite a lot of POKT nodes.

What I'm getting at is that I expect a drop-off in nodes as demand for relays picks up, since that will increase the devops and infra costs (i.e. more than one ETH node) for node runners, which should constrain the increase in active validators.

Does the distinction between active nodes (i.e. not jailed) and inactive nodes staked in the network matter for the 5,000 validator limit?

What I'm getting at is that I expect a drop-off in nodes as demand for relays picks up, since that will increase the devops and infra

While I agree with your assessment, I also recognize that Node growth and App growth (Relay demand) are two completely different things. Node growth will likely always outpace App growth in this bootstrapping phase, where the margins are so high.

I don't expect App Growth to offset this naturally in the bootstrapping phase; rather, I expect this behavior at maturity.

Does the distinction between active nodes (i.e. not jailed) and inactive nodes staked in the network matter for the 5,000 validator limit?

Yes, there is a distinction between jailed Validators and non-jailed Validators when it comes to the 5K limit. 5K active Validators is the current theoretical limit.

@Andrew I would argue that this isn't aggressive enough. Based on the growth of nodes we've seen thus far, decreasing revenue by 10% isn't likely to dissuade many nodes from coming online.

Jack's suggestion of decreasing rewards with the node count could have unforeseen consequences, but generally I'd be in favor of it. I'd like to lock in the amounts (for example, 3,500 nodes → 6,000) ahead of time so everyone knows what to expect.


Percentage decrease in reward and exponential increase in minStake.
[Screenshot: proposed schedule of reward decreases and minStake increases]

This is interesting as well…

The only caveat with the minimum stake refactor is that anyone who is 'below' the new minimum would be force-unstaked if burned (burning and falling below the minimum stake results in force unstake). This means we would need to ensure the param change is properly communicated, that enough time is given for those who are too close to the old minimum, and that we consider an adjustment to the force-unstake 'feature' (which could be considered a bug) in a protocol upgrade.
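A minimal sketch of the behavior described above, purely for illustration (the struct, field names, and amounts are placeholders, not Pocket Core's actual types or values):

```go
// Hedged sketch, not Pocket Core's implementation: a burn that drops a
// validator's stake below the minimum triggers a force unstake, which is why
// raising the minimum stake risks burning existing nodes.
package main

import "fmt"

// Validator is a pared-down stand-in for a staked node.
type Validator struct {
	Address      string
	StakedTokens int64 // uPOKT
}

// applyBurn deducts a burn from the validator's stake and reports whether the
// validator would be force-unstaked because its stake fell below the minimum.
func applyBurn(v *Validator, burn, minStake int64) bool {
	v.StakedTokens -= burn
	if v.StakedTokens < 0 {
		v.StakedTokens = 0
	}
	return v.StakedTokens < minStake
}

func main() {
	// Illustrative values: a node staked just above the minimum.
	v := &Validator{Address: "node1", StakedTokens: 15_100_000_000}
	minStake := int64(15_000_000_000)
	if applyBurn(v, 200_000_000, minStake) {
		fmt.Println(v.Address, "fell below the minimum and would be force-unstaked")
	}
}
```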


For this reason I don't think we should use min stake. I don't think it will be possible to communicate this to 100% of nodes, and burning anyone should be avoided. So I would rather create positive incentives for the more prolific node operators to consolidate their node counts (e.g. by increasing ProposerAllocation).

Side note: is there a reason you opted to reduce the relay multiple by 10% each time, @BenVan?

I'm wondering if we should work out a rough breakeven relay multiple (at which running a node is no longer profitable), then make this the 4,500-validator value (or even 4,000 to allow for lag) and work backwards from there.
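One hedged way to structure that back-of-the-envelope calculation; every input here (costs, relay volume, reward split, price) is a hypothetical placeholder, not a protocol or market value:

```go
// Hedged sketch of the "breakeven relay multiple" idea above.
package main

import "fmt"

// breakevenMultiplier returns the RelaysToTokenMultiplier (uPOKT minted per
// relay) at which a node's monthly revenue just covers its monthly cost.
func breakevenMultiplier(monthlyCostUSD, relaysPerDay, servicerShare, poktPriceUSD float64) float64 {
	monthlyRelays := relaysPerDay * 30
	// revenue(USD) = relays * multiplier / 1e6 * servicerShare * price
	// solve revenue == cost for multiplier
	return monthlyCostUSD * 1e6 / (monthlyRelays * servicerShare * poktPriceUSD)
}

func main() {
	m := breakevenMultiplier(
		500,    // assumed monthly infra/devops cost in USD
		25_000, // assumed relays serviced per node per day
		0.89,   // assumed servicer share of minted rewards (89%)
		0.10,   // assumed POKT price in USD
	)
	fmt.Printf("breakeven RelaysToTokenMultiplier ≈ %.0f uPOKT per relay\n", m)
}
```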

No particular reason for choosing 10%. The original proposed reduction was 10%, and I prefer to copy and paste in Excel rather than retype.


Ouch… I assumed that old nodes would be grandfathered in, and I guess they are, but the death-on-first-offense penalty is a deal breaker for sure. We will need a separate discussion for stake-change possibilities. Too many ramifications for the issue being discussed here. I withdraw the suggestion.


I agree about the separate discussion. It's worth discussing soon if we want to rope it into 6.0 or 7.0.

I like this idea, but the unintended ramifications that have been pointed out don't seem worth it. It'd be great if we could incorporate this in the future in a way that could grandfather in existing nodes.

When we approve a change, let's approve something that scales with growth.

Can I challenge the initial assumption? Why is 5,000 so problematic, and why can't we do something about that instead of changing other parameters? If we want Pocket Network to grow and service hundreds of blockchains with billions of relays, won't we need more than 5,000 nodes anyway? So, forgive my ignorance, but can someone please tell me why we can't raise the 5,000 glass ceiling to a practically unlimited number?

@Andrew can provide more detail about the technical limitations, which as I understand it revolve around Tendermint consensus.

We share your vision for scaling Pocket to those levels, but need to work around the technical constraints of the network architecture.

One solution we've been talking about for a while now is to unbundle service nodes from validator nodes, meaning nodes can service relays without having to be involved in consensus, allowing us to achieve the scale of relays that you talk about. This is what the original proposal references.


Right, to be clear, this 5K assumption is based on previous tests with RC-0.5.0 and could prove not to be as problematic as I make it out to be. We are currently running a simulation with 10K nodes, and that could certainly alleviate the concern a bit.

PNI has many scalability features on the roadmap, including servicer/validator separation, and Tendermint is currently overhauling its P2P module (which seems to be the current bottleneck), so I'm not concerned about the future of Pocket Network.

What concerns me is the rate of node growth, which has exploded with the current crypto market.

In highly volatile situations like this (1K nodes in a few weeks), governance is the first line of defense, not 'crisis-level engineering' (which is a likely next step).

EDIT: Oh, and to answer your question, the symptoms of going beyond 5K at RC-0.5.0 were node crashes due to high resource consumption, blocks not being produced (before the default timeouts, which can be adjusted), and something that seemed similar to peer pooling.
