PUP-14: Increase MaxValidators to improve economic security

Since the cat is out of the bag now, solution-wise I lean with @JackALaing: increasing the block producer share of the reward would incentivize a more secure network while not adding validation bloat (which is incurred when raising the validator count from 1k to 5k).

I feel what's important to understand is that this is just a v0 issue, so any solution now is really just temporary. From what I see now, increasing the number of validators doesn't decrease the number of nodes on the network or incentivize larger amounts of POKT to be locked up.

I'm open to being convinced otherwise, but increasing block rewards consolidates the number of nodes on the network while creating a race-to-the-top incentive for larger POKT holders, resulting in more POKT being locked up. Node consolidation and more POKT being locked up are both positives for the network as a whole.

1 Like

@pierre I appreciate the work you do and the research you are involved with, and I want to encourage you to keep at it. I would ask that you please see my comment above about my concern with discussing vulnerabilities publicly before discussing them with the core-team.

If a vulnerability requires DAO involvement, I would ask that public disclosures be done in tandem with the core-team. I believe we need to establish a community culture where we work with the core-team doing the development. I know that for myself, I will share issues directly with them before talking about them publicly.

I feel that is not only safer for the ecosystem, but it also encourages collaboration between the core-team and contributors. Please consider going to the core-team first about exploitable issues.

2 Likes

I strongly support this proposal along with increasing the validator rewards to make it happen like @JackALaing is suggesting. I also agree with @shane that a formal process for reporting potential network vulnerabilities needs to be put in place. But regardless, I appreciate @addison taking the initiative here.

5 Likes

Ironically, this entire post you just made could've been sent to Addison directly, not posted here.

The vulnerabilities brought up in this proposal are properties of any BFT/Tendermint-based blockchain. Gaining 1/3 or 2/3 of the voting power is a textbook vulnerability that has been outlined since the dawn of time. Let's be glad that an involved community member brought up this proposal and potential solutions rather than some FUD/shit poster. One of the bigger complaints we have is that there are too many behind-the-scenes conversations. We are a decentralized community and have to act as one. Someone has put forth a problem and kickstarted the discussion; now let's work together to get it solved.

2 Likes

I appreciate where you are coming from @poktblade.

I did message him about why I feel it’s important to talk publicly about how disclosures should be handled. @addison has posted many forum posts and he’s a quality contributor on every level. I’ve been supportive of his initiatives before, so I hope that my criticism here isn’t taken in bad faith.

We haven't established a community process for disclosing attack vectors before. Community processes are a public matter, so in this case it does feel appropriate to state my concerns here. The proposal itself was great; I'm just concerned with establishing a disclosure process for the future.

I am glad indeed. I'm just hoping to encourage a disclosure process that fosters further cooperation with the core-team.

I feel that standardizing a process of first consulting with the core-team about potential vulnerabilities doesn't undermine the idea of decentralization. Communities require processes, and I believe it is in the best interest of the community to have a defined process for potential exploits. Right now, a simple DM to @JackALaing could be process enough IMO while we are still small.

Agreed. Though I was aware of the mechanics, I personally wasn't aware of the numbers until InfraCon. I'd still prefer that all perceived vulnerabilities be disclosed in cooperation with the core-team, who are closer to the network's development. I personally feel that working with the core-team mitigates risk.

1 Like

Having a process is fine, but people have no obligation to follow it. Maybe establishing a bounty program for the disclosure of security vulnerabilities would incentivize reporting these issues in an organized manner, through an established process.

5 Likes

You are totally right. A bounty program as part of the process makes a lot of sense.

Great proposal. I agree with the points outlined and think this is a change we should make.

On the community side of things, I think we should end these behind-the-door discussions, as they are detrimental to the collaborative & decentralized nature we are trying to build. We need not only the collaboration of the core team; the core team can also benefit from collaborating with community members. How else can talented engineers begin to break into the space and offer their skill set if we're constantly having secret talks behind the door with only a select few? This type of behavior is completely unacceptable and we should break that precedent if we are to mature and evolve… The OP has outlined the issue, proposed a GOOD solution, and now it's time to discuss and vote. What other process could you possibly want?

And if we are to follow a different process, will the information be made public? Will this purely be private submissions to a centralized team? How will other people outside of that space know about the problem and offer their knowledge?

I think the process of

identify issue → if you have a solution, post it or collab with other people to come up with a solution then post → let everyone discuss → vote

is exactly what we want and what was done here.

I'm in two minds about this proposal, and that is not because I run Liquify :see_no_evil:. I for one am shocked at the numbers and surprised more runners aren't above the 1000 threshold.

5x-ing the count will add significant network bloat and additional load on validators. Personally, we have seen that running a node over the threshold puts significantly more load on it than on a standard relayer node, and this load will only increase with a larger count, as well as being more widespread across a larger validator set.

I also totally agree with what @shane said, this shouldn’t be public on an open forum.

Maybe we can have a compromise: a 2-2.5x increase plus a means to reduce stake. I'm happy to give up some validator seats (though a means to lower stake probably has knock-on effects elsewhere?). This may mitigate the issue until v1 is ready…

This should 2x the price to attack and add further decentralization.

Or we increase the threshold over time; I'm really worried that 5x-ing it out of the blue will have large implications.
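To put rough numbers on both options, here's a minimal back-of-the-envelope sketch. It assumes every validator slot is staked near the 15k POKT minimum, which is a simplification, so treat these as lower bounds on the cost to reach 2/3 of voting power:

```python
# Rough cost for a single entity to control ~2/3 of voting power, assuming
# every validator slot holds roughly the 15k POKT minimum stake (real stakes
# vary, so these are lower bounds).
MIN_STAKE_POKT = 15_000

def takeover_cost(max_validators: int, threshold: float = 2 / 3) -> float:
    """POKT needed to hold ~2/3 of the slots at just above the minimum stake."""
    return max_validators * threshold * MIN_STAKE_POKT

for max_validators in (1_000, 2_000, 2_500, 5_000):
    print(f"{max_validators:>5} validators -> ~{takeover_cost(max_validators) / 1e6:.0f}M POKT to attack")

# ~ output:
#  1000 validators -> ~10M POKT to attack
#  2000 validators -> ~20M POKT to attack
#  2500 validators -> ~25M POKT to attack
#  5000 validators -> ~50M POKT to attack
```

So under these simplifying assumptions, a 2-2.5x increase roughly 2-2.5x's the takeover cost, and the full 5x takes it to ~50M POKT.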

3 Likes

Good points.

Do you have any data you could share to give everyone a better idea of how much extra load a validator takes on?

1 Like

Thanks for the proposal @addison

Very valid concerns.

I agree with the proposal to increase network security by a material amount.

However, rather than increasing the validator set size, I agree with the comments that instead propose increasing the rewards to validators and letting the market dictate the minimum stake for validators.

That said, it would be best IMO to only do so in the knowledge that the minimum stake per node is already sufficiently high. For example, 15k POKT per node was always a very low figure. Now I think it's drastically too low given the importance of the network and the network overhead caused by incentivising nodes to scale horizontally. So why not choose a per-node stake 10x higher? Or something in between, e.g. 100k POKT per node?

My ideal proposal would be as follows:

  • increase minimum stake per Pocket node to 100k POKT (see the rough cost comparison sketched after this list)
  • increase validator share of relay revenue to 5% (this number is somewhat arbitrary, but we should aim to incentivise the most technically minded infrastructure providers in the network to participate as validators)
  • consider - in light of the above - whether it is still beneficial to increase the validator set to a number > 1,000. Perhaps we need a community call or another forum to discuss this point?
  • in an ideal world, implement an “edit stake” function to allow all node runners to increase their stake without having to unstake first. Pocket’s economics are complex, and I expect that this won’t be the last time that the minimum POKT per node is changed, so having a straightforward method to edit stake would be valuable IMO.
  • provide all node infrastructure providers sufficient time - 6-8 weeks? - to coordinate with their customers about 1) the impact of increasing the minimum amount staked per node and 2) the game theory around optimising stake per node to be in the validator set of X validators
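Applying the same kind of rough arithmetic to the stake lever (a sketch assuming every slot sits near the minimum stake, which of course it won't in practice; the scenario labels are just illustrative):

```python
# Rough ~2/3-takeover cost under the two levers being discussed: raising
# MaxValidators vs raising the minimum stake per node. Assumes every slot
# is staked near the minimum, which is a simplification.
def takeover_cost(validators: int, min_stake: int, threshold: float = 2 / 3) -> float:
    return validators * threshold * min_stake

scenarios = [
    ("today:      1,000 validators x  15k POKT", 1_000, 15_000),
    ("PUP-14:     5,000 validators x  15k POKT", 5_000, 15_000),
    ("100k stake: 1,000 validators x 100k POKT", 1_000, 100_000),
]
for label, validators, min_stake in scenarios:
    print(f"{label} -> ~{takeover_cost(validators, min_stake) / 1e6:.0f}M POKT")

# today:      1,000 validators x  15k POKT -> ~10M POKT
# PUP-14:     5,000 validators x  15k POKT -> ~50M POKT
# 100k stake: 1,000 validators x 100k POKT -> ~67M POKT
```

In other words, under these simplifying assumptions, raising the per-node minimum to 100k gets a comparable or higher takeover cost than 5x-ing the validator set, without the extra consensus overhead.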

Keen to hear everyone’s thoughts

1 Like

I agree with that. I offer this as a change that does not require significant overhead or friction. Additional incentivization of validators would likely require a MaxValidators increase anyway, to prevent a single entity from taking control, since you're now incentivizing them to increase their stakes and fill the top 1k slots.

I understand this concern and did not intend for this proposal to have this effect; however, this isn't a complicated issue: do not give a single entity 51%/66% of your mining/voting power.

I am a transparent and candid individual so I state things how they are, admittedly, sometimes this is a fault. :slight_smile:

+++

Prior to moving to 1k validators, we had ~15k validators and the chain functioned. Also, as mentioned in my post, the additional chain bloat from having extra validators is insignificant. I could be wrong here, but the core team can comment on block sizes as I'm not sure where to find that information.

Yes, I do think it would be optimal to slowly raise the validator count over a period of a few days.

The issue with having only 1k validators is that it would take tremendous incentivization to create a safe validator set, due to the increased probability of session selection from scaling horizontally. Right now there isn't an honest economic incentive to take control of the validator set, since the rewards are so low; however, if the block proposer reward were increased, it would become an economically viable strategy to try to take all of the validator slots. This is not a secure solution. Even with 1k validators, it's extremely easy for a single provider or entity to take control of all the slots, even with other competition. Increasing the block proposer reward further incentivizes a single entity to take control.

The problem with increasing the stake size is the amount of friction it causes for people. They would have to unstake and wait 21 days before being able to consolidate. If we were to offer instant unstaking, this would require significant coordination.

This exists currently :slight_smile:

In summary,

I recognize the flaws in how this issue was presented and I plan on being significantly more mindful moving forward. I do think, though, that regardless of the disclosure, we should stay focused on finding a solution and fixing this issue.

I don't think that increasing the proposer allocation is a catch-all solution, nor is just increasing MaxValidators. Both should probably happen in tandem in order to create a safe validator set. MaxValidators is just easier to increase, since it does not involve a complicated economic discussion and it significantly bolsters economic security: one would need ~50M POKT to take control of a 5k validator set. This is a multi-faceted problem, and this is one of many potential solutions. I find it to be the smoothest, and something we will likely need to do anyway after discussions of increasing the block proposer reward.

I discussed this proposal with the core devs and their main concerns were the possibility of nodes falling into the top 5k who are optimized for servicing and thus running non-validator-friendly setups.

Therefore it's recommended that we do a phased implementation, with advance notice in each phase to the node runners who would become validators when the parameter changes.

I would suggest that we increase by 50% each time, since 67% is the validator power required to keep the chain moving. This would make the increments: 1500, 2250, 3375, 5000
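For concreteness, a small sketch of that schedule, showing the share of voting power the pre-existing validators retain at each step under the simplifying assumption of roughly equal stakes (the rounding here is illustrative, not a spec):

```python
# The +50% phased schedule, capped at the 5,000 target. At each step the
# pre-existing validators keep current/new of the voting power (assuming
# roughly equal stakes), which stays around the ~67% mentioned above as
# needed to keep the chain moving even if the newcomers aren't ready yet.
current, target = 1_000, 5_000
while current < target:
    new = min(round(current * 1.5), target)
    print(f"{current:>5} -> {new:>5}  (existing set keeps {current / new:.1%} of power)")
    current = new

#  1000 ->  1500  (existing set keeps 66.7% of power)
#  1500 ->  2250  (existing set keeps 66.7% of power)
#  2250 ->  3375  (existing set keeps 66.7% of power)
#  3375 ->  5000  (existing set keeps 67.5% of power)
```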

If this proposal passes, I would suggest that the Foundation can coordinate these phased parameter changes and take as much time as needed to communicate to the validators and confirm that consensus isn’t affected with each change.

If you agree @addison, please add this to the proposal text under an Implementation section.

3 Likes

I also support the idea of a phased approach with advance notice in each phase; a fantastic idea to minimize impact.

I am ready to vote on this proposal as soon as possible. This proposal is the lowest-risk option to fast-track while still improving the security of the network by a significant margin.

1 Like

tl;dr I support this proposal provided we follow @JackALaing's phased approach, but I was wondering if @addison or others could collect additional stats on the recent block size distribution.


Potential Cons

In addition to what Jack mentioned w.r.t. Node Runners, the issues I personally see (which have mostly been covered by others) are:

  1. Network Congestion
  2. Block Bloat

Network Congestion

I haven't delved deep into or benchmarked Tendermint's gossip, but in 2018 it was determined to have cubic complexity [1]. Even with the refactor [2] completed in Q3 2021 [3], I haven't been able to find benchmarks showing what the new gossip complexity is.

With that being said, since the range we're dealing with is quite small (1000-5000), I don't think we need a deep investigation into the benchmarking difference, as long as we keep an eye on the network during the phased increase.
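As a very rough bound on why that monitoring matters: if the 2018 cubic result still held (which the refactor may well have changed), a 5x larger set could mean up to ~125x more gossip traffic in the worst case; closer to linear, only ~5x. A sketch of the multipliers:

```python
# Worst-case bound on gossip growth from 1k -> 5k validators, depending on
# whether message complexity is linear, quadratic, or still cubic as in the
# 2018 analysis [1]; the post-refactor complexity is unknown.
ratio = 5_000 / 1_000
for name, exponent in (("linear", 1), ("quadratic", 2), ("cubic", 3)):
    print(f"{name:>9}: ~{ratio ** exponent:.0f}x more gossip traffic")

#    linear: ~5x more gossip traffic
# quadratic: ~25x more gossip traffic
#     cubic: ~125x more gossip traffic
```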

Block Bloat

Since Tendermint doesn't do any sort of signature aggregation, but simply collects the signatures into a list, I'm less worried about the chain bloat on disk for a single node, and more about the proportion of space taken up in a single block and the size of the messages being sent around.
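For a rough sense of scale, here's a sketch assuming each serialized CommitSig is on the order of ~110 bytes (20-byte address, timestamp, 64-byte ed25519 signature, plus framing); that per-entry size is an assumption, not a measured figure:

```python
# Rough size of the last-commit section of a block, assuming each validator's
# serialized CommitSig is ~110 bytes. This per-entry size is an assumption,
# not a measured number.
BYTES_PER_COMMIT_SIG = 110
MAX_BLOCK_BYTES = 4 * 1024 * 1024  # 4MB max block size

for validators in (1_000, 5_000):
    commit_bytes = validators * BYTES_PER_COMMIT_SIG
    print(f"{validators:>5} validators: ~{commit_bytes / 1024:.0f} KiB of commit sigs "
          f"(~{commit_bytes / MAX_BLOCK_BYTES:.0%} of the 4MB max block)")

#  1000 validators: ~107 KiB of commit sigs (~3% of the 4MB max block)
#  5000 validators: ~537 KiB of commit sigs (~13% of the 4MB max block)
```

That extra few hundred KiB per block is why the question below about how close recent blocks already get to the 4MB limit matters.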

@addison / @pierre / @poktblade Given that the max block size is 4MB, have any of you looked at the distribution of the block sizes recently to see how close we’d be to the limit?


Pros - Network Security

In addition to this, it’s worth noting that some of the other ongoing economic proposals may result in a reduction of this value back to 1,000 (or somewhere in that range), but still require more discussion and evaluation. For that reason, this is a quick and simple approach to increasing the network’s security.

[1] https://arxiv.org/pdf/1809.09858.pdf
[2] tendermint/adr-062-p2p-architecture.md at 7172862786cabaeb8ac06a6d646955a2faa6da31 · tendermint/tendermint · GitHub
[3] Tendermint Roadmap | Tendermint Core

Thanks Jack for the insight and for looking into this. I definitely think we should do that.

I will add it.

No. I would need some insight from the v0 team to figure out how to do this. I will get that from v0-contributors :smiley:

I was planning on waiting to see what happens with GV etc. before putting this to a vote. I think we will need to increase the parameter in any case, but we should see what happens first before determining what value we pick. I'm now thinking, though, that we should just increase it to something in the interim to improve security and reevaluate after the other proposals are passed.

Maybe we can increase the validator count by 4k in the next 4-6 weeks to increase our security while giving us optionality in the midst of GV etc?

Thoughts? @Olshansky @pierre @poktblade @JackALaing

Appreciate the research and thoughtful response Daniel.

1 Like

Here's a metric from live nodes; this graph covers the last 7 days. Seems like we've hit the limit a few times.

1 Like

IMO it doesn't make sense to vote on this proposal until there has been a chance to see what the validator stake changes to with the implementation of PUP-19.

If it doesn't strain the chain, it doesn't hurt to have extra consensus security, as well as to expand the number of nodes that can participate in the new 5% validator rewards.