Understood, I was just trying to make sure that once a gateway stakes and requests relays, no one else can discover the service and start using it for their own relays.
If the answer to that is yes, and the main reason to set the committee pricing is to encourage the growth of gateway traffic / business models, why not go into Shannon with two sets of pricing per chain: one at a higher burn/mint rate for a public endpoint, and one at the lower burn/mint rate for the registered gateways?
Ramiro and Tracie, thank you for your work and for publishing this long-expected material on the new tokenomics. From my perspective the material has different sides, both positive and controversial. I would like to comment on it not from a position of criticism, but from how I would approach the issue of tokenomics.
1. So far the description of the new tokenomics seems too complicated for the average uninitiated person, with the introduction of the Compute Unit. Rhetorical questions then arise. What is the Compute Unit responsible for: specific electricity consumption, specific processor load, specific bandwidth requirements depending on the type of blockchain? Will these Compute Units be adjusted? How often can they be changed? Which parameters will influence a change of the Compute Unit, and how? It appears that the Compute Unit can be considered a constant value. In any case, it seems difficult to understand how the price of the service is formed. If I understand correctly, the price formation for the service can be expressed as follows:
Daily Demand for Relays (in USD) = Number of Daily Relays * Average CU per Relay (per chain) * CUTTM (POKT per Compute Unit) * POKT Market Price
2.5 USD = 1,000,000 daily relays * 2,500 (average CU per relay) * 100,000 x 10^-12 POKT/CU * 0.01 USD (market price of 1 POKT)
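The arithmetic above can be checked with a short sketch. All figures are the illustrative ones from the example, not protocol constants:

```python
# Illustrative check of the example pricing formula:
# Daily Demand (USD) = relays * CU_per_relay * CUTTM * POKT_price

daily_relays = 1_000_000        # example daily relays
cu_per_relay = 2_500            # assumed average CU per relay for a chain
cuttm = 100_000e-12             # POKT per Compute Unit (example value)
pokt_price_usd = 0.01           # example POKT market price

pokt_burned = daily_relays * cu_per_relay * cuttm   # ~250 POKT
daily_usd = pokt_burned * pokt_price_usd            # ~2.5 USD
print(pokt_burned, daily_usd)
```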
The tricky thing here is that, in order to keep the price of the provided RPC service stable, the product of the two numbers, CUTTM (POKT per Compute Unit) and the market price of 1 POKT, must be constant. That means that as the POKT price increases, the CUTTM has to be changed automatically, and changed permanently with every tick in the POKT price. It was mentioned that CUTTM is subject to periodic control by PNF. So how will this parameter be controlled? Automatically? By a voting process? In that case it could be simpler to initially set fixed USD prices for RPC services per blockchain, per 1 mln relays, without involving CUTTM and the POKT market price.
2. OK, the first point was more philosophical, because CUTTM is already set in stone, and maybe this approach with CUTTM is more convenient for the project than it seems at first sight. The second point concerns the creation of the deflationary mechanism. This question is far more urgent and of current interest, because in the absence of clear rules set in advance it will either be put to a vote or, riskier still, won't be put to a vote at all, as happened with the uncontrolled maturity phase. And the voting activity of the former PNF on some questions showed that it was inefficient, slow and belated.
Now an equilibrium mechanism is going to be launched where Burn = Mint, and this approach is more or less clear. What is hidden at first sight is how the deflationary mechanism will work. How? Under what conditions? Will it be algorithmically automated or voted on manually? Why aren't there any examples with numbers to make the whole picture transparent? The only thing mentioned in the TG group and in the official tokenomics docs is that the deflationary mechanism could be triggered by high demand for relays. What does "high demand" mean? 1 bln or 100 bln daily relays? The official tokenomics docs indicated the number of 20 bln. Why exactly 20 bln and not 50 bln?
It is clear that a deflationary mechanism can be carried out by minting fewer tokens than are burned, so under different circumstances the corresponding burn/mint ratios can be established, for instance burning 1 POKT and minting 0.5 POKT. For clients (apps, gateways) the market POKT price and the burn/mint ratio make no difference, because the RPC service is priced in USD and their cost stays the same at constant relay demand. But for other participants, such as third-party node runners, servicers and validators, the burn/mint ratio plays a big role, as they receive their rewards in tokens.
A third-party node runner receiving POKT from revshare is interested in crossing the break-even point and getting a commensurate net profit within reasonable limits. Servicers and validators are interested in receiving an APY bigger than the Fed rate plus inflation, or than the average staking APY of the most reliable, well-known, well-funded crypto projects, and in establishing a deflationary process, because it will positively impact the POKT price even without marketing or KOLs. (Of course, every participant would like cosmic profits, but then we would be talking about a scam or hype project and not a serious business.) Holders are interested in a deflationary mechanism too, and in active marketing to raise the token price.
When the interests of all participants of the protocol come together, the viability of the network is ensured and the protocol continues to function well. Of course, this state of optimal protocol functioning requires corresponding demand for relays from clients. As of now, it is commonly known that the network is overprovisioned with 12,000 nodes. The system can self-balance through the exit of node runners, servicers and validators, but that would reduce the security and decentralization of the whole protocol while increasing the circulating free float relative to the outstanding supply. This could put the protocol's survival at risk. Therefore, in order to preserve the status quo of the protocol, this overprovisioning can be resolved only by higher demand for relays from clients.
The protocol's activity is a business, and like every business the project should understand all its expenses, revenues, profits, etc. So the whole scheme for creating a future deflationary mechanism (which is not possible now), with a preliminary calculation of burn/mint token ratios depending on (high) demand for relays, subject to keeping the protocol viable, can be fulfilled by the following actions:
clarification of the break-even point of node runners (Jinx mentioned this number) and their desired (rational and reasonable, not hyped) net profit for their stable functioning;
clarification, or theoretical setting, of a (rational and reasonable, not hyped) APY (with an inflation premium) for servicers and validators;
calculation of the minimum demand for relays from clients that provides the point of convergence of the financial interests of node runners, servicers and validators for the whole protocol at the current number of nodes (according to my preliminary estimate, this viable state of protocol functioning starts from 25,000 USD, which corresponds to a demand of 10 bln daily relays from clients; Jinx believes this number can be reached in several months, and I have every reason to believe in Jinx's personal efforts and vision);
calculation of burn/mint ratios depending on the token price, which can stay the same, fall, or rise on the back of organic, sustainable relay demand growth or marketing activity; for instance, it could be 1 minted to 1.15 burned, or 1 minted to 1.78 burned, etc.;
as of that date, in the case of high demand for relays, calculation of the circulating free float out of the outstanding supply, and adding more APY for validators to secure the number of staked tokens at 1.2-1.5 bln (a 0.5-0.6 ratio, which is reckoned normal for the stock market) (Jinx mentioned his plans to raise the validator stake, and I completely support his ideas).
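For what it's worth, the 10 bln daily relays / 25,000 USD correspondence in my estimate can be cross-checked against a constant network cost of 1 USD per billion CUs, assuming the same illustrative 2,500 CU per relay as in the pricing example earlier in this post:

```python
# Cross-check of the estimate: 10 bln daily relays at an assumed
# 2,500 CU/relay, priced at a constant 1 USD per billion CUs.
daily_relays = 10_000_000_000
cu_per_relay = 2_500              # assumed average, as in the earlier example
usd_per_cu = 1 / 1_000_000_000    # 1 USD per billion CUs

daily_usd = daily_relays * cu_per_relay * usd_per_cu
print(daily_usd)  # ~25,000 USD per day
```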
3. The third point doesn't touch tokenomics; it is about convenience for servicers and validators. In any case, there should appear a portal listing all third-party node runners with one-click staking, upstaking and unstaking. Moreover, there should appear a service to unstake instantly, for a percentage fee, if a person doesn't want to wait the 21 days. Certainly this is not easy to implement and requires time, but this point should be on the top-priority list (it seems there were talks about it, but it is not known what state the development is in now).
4. The question of the deflationary mechanism stays open when the token price is high, for instance 5 USD (which seems unrealistic to me, at least in the medium term, but who knows). At a daily demand for relays of 25,000 USD, this means that the net purchase from the market, excluding other trading and speculative activity, will only be 5,000 tokens. How will it be possible to provide a deflationary mechanism at a total supply of 2,400,000,000 tokens? This point needs to be addressed with further consideration. Maybe there is a place for a buyback of tokens from the market, if there were funding, or other ideas.
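To put this concern in numbers, using the illustrative figures from this point:

```python
# Rough timescale of deflation at the numbers above:
# 25,000 USD of daily relay demand at a 5 USD token means
# only 5,000 POKT of net daily burn even if minting were zero.
daily_demand_usd = 25_000
token_price_usd = 5
total_supply = 2_400_000_000

daily_burn_pokt = daily_demand_usd / token_price_usd   # 5,000 POKT/day
annual_burn_pokt = daily_burn_pokt * 365               # 1,825,000 POKT/year
annual_supply_reduction = annual_burn_pokt / total_supply
print(round(annual_supply_reduction * 100, 4))  # ~0.076 % of supply per year
```

At that rate, even with zero minting, reducing the supply meaningfully would take decades, which is the crux of point 4.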
5. The transparency and clarity of the tokenomics rules is a question of high importance, because it may be connected with a listing on Coinbase. As was mentioned on one of the Office Hours calls, Coinbase representatives wished for more stability and transparency in the tokenomics in order to move further along the path to listing. Of course, that alone is not sufficient and much higher trading volumes are required, but it is one of the conditions. Therefore, publishing the tokenomics in advance (with different calculated options depending on price variations, to show the likely scenarios of mint/burn equilibrium and deflation, along with the forecasted levels of relay demand from clients) would be topical and timely.
I think you are interested in reducing the total supply, as everyone is, and in knowing when that is going to happen. As a general answer, I cannot tell you when or how that can happen; there are too many moving parts right now. We are on the verge of a paradigm shift, and modelling a multivariable thing like the Pocket Network tokenomics at this point is not possible.
The spirit of this proposal is to set a solid basis; after the change is done, we can start building.
I'll try to answer some of your questions now:
All of that, and more. I have tried to give it a formal description, but the model is complex and off topic here.
Think of the CU as a conversion unit between a single request for work (a relay) and what you want to be paid for it. Each service will have a different cost (in CU) for different reasons (arbitrary, and chosen by the service/chain owner/creator).
Only by the owner of the service. Suppose you create a service named megachain and assign it a cost, let's say 100 CU/relay, just because you think that's the fair price. The cost of megachain will be 100 CUs until you, the service owner, change your mind and decide that, after some improvements, each relay of megachain will now cost 50 CUs. This TX will have a cost, but it enables dynamic prices and updates, and the owner of the service is always economically incentivized (owner share of 3%) to set a fair price that maximizes usage.
Each service has its own cost that is set upon the service's creation; see here the values for the Beta testnet.
Yes, but they are not each constant; the product of them is constant: CUTTM (POKT for 1 Compute Unit) [POKT/CU] * Market Price for 1 POKT [USD/POKT] = Constant [USD/CU] = 1 USD / 1 billion CUs.
Automated (partially; manually signed) and based on a simple calculation to keep the above constant value true.
Having the same price for each chain won't resolve the issue of having to dynamically change the on-chain parameters to achieve a constant USD cost per relay.
Since the whole network is denominated in POKT and we cannot print USD (sadly), we will always need to resort to a mechanism like the CUTTM to adjust the token value to the relay price.
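That adjustment can be sketched in a few lines, assuming the 1 USD per billion CUs target mentioned above (the function name and parameters are mine, for illustration, not the protocol's):

```python
# Recompute CUTTM so that CUTTM [POKT/CU] * price [USD/POKT]
# stays equal to the constant target [USD/CU].
TARGET_USD_PER_CU = 1 / 1_000_000_000   # 1 USD per billion CUs

def cuttm_for_price(pokt_price_usd: float) -> float:
    """POKT per CU needed to hold the USD cost per CU constant."""
    return TARGET_USD_PER_CU / pokt_price_usd

print(cuttm_for_price(0.01))  # ~1e-07 POKT/CU, i.e. the 100,000 x 10^-12 from the example
print(cuttm_for_price(0.02))  # halves when the POKT price doubles
```

The inverse relation is the whole mechanism: whenever the POKT price moves, CUTTM moves the other way so their product stays pinned to the USD target.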
This proposal does not deal with the "deflationary" mechanism because it is not proposing that. The proposal only deals with mint = burn.
Deflation, understood as the reduction of the total POKT in existence, can only be achieved if the amount being burned is higher than the minting: mint<burn. So, someone is paying more than what the other party is receiving.
So, in my opinion, to be able to set mint < burn we need to reach a level of scale (in total traffic) that is enough to keep the supply side happy, so that we can cut a little from their gains. We are not there yet; we need every penny that is burned to go to the supply side (and the DAO, validators and service owners). When will we be ready? I cannot say, and it is very difficult to model in Shannon, a network where different services have different prices and costs. Forget about metrics in relays, and forget about simple models with fixed CU targets; any of them would be oversimplifications (IMO).
It is worth noting, as @msa6867 proposed, that we could reach "deflation" today if we wanted: if the deflationary rate is very low (well below 1% annually), it just means a small reduction compared to the big reduction that is needed to get to mint = burn.
I want to make a technical note here: we have no capital controls (this is crypto, tokens come and go and are sold freely), and if we want a sovereign monetary policy (fixed supply or "deflation"), then there is no way we can control the APY; it will be adjusted by the (uncontrollable) POKT exchange rate (see the impossible trilemma).
So, no: an APY target should be off the table if supply controls are desired.
We know, and we expect this to happen. Regarding QoS, @Jinx and @TracieCMyers are working on plans for the node runners, to keep everything running.
I agree that we need to calculate this, but I don't think we are capable of doing it before the Shannon tokenomics kicks in.
The proposed tokenomics change will change the economic paradigm of the network; we need to wait for it to settle before trying to model anything. Getting many of the numbers you ask for (costs, APYs) directly from the sources (aka node runners and gateways) proved to be impossible, so if we hope to derive them from data, we need stabilized tokenomics at the very least.
We are building this, and we are behind schedule on it (the Morse → Shannon migration came earlier than expected). I assure you that it exists and is (almost) working on the Shannon Testnet. The product was commissioned by PNF.
This belongs in the (long) backlog of features for Shannon; there is no way to include it in the Shannon migration itself, maybe later (tech complexity).
With a token at 5 USD the DAO treasury would be huge; you could turn off DAO revenue (5% less minting than burning, without affecting other actors) and there would be runway for years and years.
At this point the DAO can get creative; many mechanisms can be used to absorb tokens when you have money to back your plays. I'm not an economist, but I know that that's the job of central banks, and PNF can act like one (to some extent).
I think that the current tokenomics plan is the easiest and clearest of all:
PNF will only change a single parameter to keep network cost fixed in USD: The CUTTM
PNF will keep the supply stable: mint = burn. Only very low inflation would be allowed (<2%), all assigned to the DAO (if it is implemented).
PNF expenses are at a minimum, and calculated to last for as long as possible. They know that if they step out of line, the treasury collapses (no new tokens are being printed).
The network won't promise APYs, because such promises are impossible to keep; the network will resize by market rules.
Ramiro, thank you for your answers. There may be points that are controversial from my perspective, but in any case it seems the protocol team did a great, thoughtful job on the new tokenomics.