Park Rate Voting

Even if every shareholder dropped to 0% right now I’d still have 5 days or so before the rate started dropping.

It will start dropping from the next block with about 50% probability.

But you are right: there will be enough time to lock in. The point is you don’t really know where the top is, because it follows the price of BTC.

We are in a situation where the last 10k blocks look like this:

- oldest 2,500 blocks: 0%
- middle 5,000 blocks: between 0% and 10%
- latest 2,500 blocks: 10% or higher

Assume all shareholders immediately begin voting for 0%. Until those oldest 2,500 blocks drop off, the rate will not change. That means that in this hypothetical situation the park rate would remain entirely unchanged for another 2,500 blocks.

If, when the next block comes, the oldest block leaving the window has a rate higher than the current rate (a 50% chance), then the sorted distribution will have one member less on the greater side of the median point. The median point will move to the neighboring member on the smaller side of the previous median member, which has an equal or lower rate. If the oldest block leaving the window has a rate lower than the current rate, then nothing changes.
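The rolling-median mechanics described in the last two posts can be sketched in a few lines of Python. This is a simplified model, assuming the effective park rate is simply the median of the votes in a fixed-size block window (the real protocol's details may differ), scaled down 10x from the thread's 10k-block scenario for speed:

```python
import random
import statistics
from collections import deque

def simulate_median_rate(initial_window, new_votes):
    """Record the median vote after each new block. Appending to a
    bounded deque evicts the oldest vote automatically, modeling the
    oldest block dropping out of the window."""
    window = deque(initial_window, maxlen=len(initial_window))
    history = []
    for vote in new_votes:
        window.append(vote)
        history.append(statistics.median(window))
    return history

# The thread's scenario, scaled down: a 1,000-block window whose oldest
# quarter voted 0%, middle half voted between 0% and 10% (in random time
# order), and newest quarter voted 10%. Every shareholder now votes 0%.
rng = random.Random(0)
middle = [i * 10 / 499 for i in range(500)]
rng.shuffle(middle)
start = [0.0] * 250 + middle + [10.0] * 250   # oldest vote first
rates = simulate_median_rate(start, [0.0] * 500)
```

While the oldest 0% votes drop off, each eviction is matched by an incoming 0% vote, so the median is frozen; only once the mixed middle blocks start leaving the window does the rate drift down, one sorted neighbor at a time.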

Right. This is the case currently and will be so for another half week or so.

OK I see. Park rate is really for the slow components (low frequency in Fourier space) of peg balancing. Don’t expect it to come or go quickly.

Realising this, it’s wrong to mandate that the T4 reserve is used before park rates. T4 and park rates should always be used together, as they cover different parts of frequency space. The optimum park rate could be crafted from, e.g., an MA5 of the buy/sell ratio. As the rate constantly changes, a bot can be used to update the park rate feed. This could save a lot of T4 reserve.


@desrever and @willy,

It’s important to record how many NBT are parked as a function of park rates, so future models can be made to set the optimum park rates. Now the rates are non-zero. Can you configure your server and record them?

I feel like that’s higher frequency than the standard. I really can’t think of a lower frequency smart trigger than the standard.


But for that you need a way to replenish T4 - which would be T6 in this case.
That’s why I prefer NSR sale to park rates.

Probably right, except for two problems: the Standard is currently only measured once a week, while MA5 is an integral; and the Standard is an absolute number instead of a ratio. Both problems can be solved.

Edit: I forgot the important and unsolvable part with the Standard: it is the spot ratio, not reflecting the low frequency component. MA5(Standard/circulating) might do.

Excellent. To which I replied


My issue with this is that park rates already account for the circulating dependency. Meaning that we don’t really get x NBT parked given a certain rate; we get x% of circulating NBT parked. As the network grows, the number of parked NuBits per park % will grow.

I agree. I imagine that the park rate will be continuously calculated based on MA5(Standard/circulating), where the Standard and circulating calculations have had the parked amount removed. As the parked amount increases, the Standard approaches 0.

Also, if one is using the Standard for the next week’s buyback, ideally one has to calculate the forward-looking Standard, which takes into account the to-be-freed-next-week parked amount.

If you take standard/circulating you’re double counting the circulating. Let’s imagine 2 circumstances, one where there are 1 mil circulating NBT and one where there are 100k. Let’s say the standard is -$5k. So if we use your calculation, the park trigger is 0.5% in one case and 5% in the other. Let’s pretend we use that as the park rate. Let’s also assume that for each park rate %, that % of the network’s NuBits gets parked. So in both cases, $5,000 NBT gets parked.
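The double-counting argument above can be checked with the numbers from the scenario. The "x% rate parks x% of circulating NBT" behavior is the simplifying assumption stated in the post:

```python
standard = -5_000  # dollars

for circulating in (1_000_000, 100_000):
    # Park trigger derived from standard/circulating.
    rate = abs(standard) / circulating
    # Under the assumption, that fraction of circulating NBT gets parked.
    parked = rate * circulating
    print(f"circulating {circulating:>9,} NBT -> rate {rate:.1%}, parked {parked:,.0f} NBT")
```

Both cases park the same 5,000 NBT: dividing the standard by circulating, then letting the parked amount scale with circulating, cancels the circulating term out entirely.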

Lol, I think I just convinced myself that you’re right and I’m wrong. I’ll think about it some more.


I don’t assume it’s a linear function. The function should ideally be determined by testing/optimizing against past data. So it doesn’t matter whether it is standard/circulating or just standard; it is f(standard, circulating, tolerance, many other parameters).
The question is: what is the goal of optimization? My rough guess is we optimize to minimize peg maintenance cost over a given period.


I don’t believe the parking function is fundamentally repeatable because the structure of Nu, the blockchain, and the risk associated with Nu are all changing. Finding a function for this seems like an impossible task.

There are methodologies: parameterize and fit; Monte Carlo simulation. It could make a pretty good Master’s thesis. The problem is we don’t have a lot of data.

The wall has been in a more or less balanced state for almost two days. If the wall stays balanced like this for another 15 hours, I think shareholders can consider lowering their voted park rates, knowing that it will take a few days to come down.

@willy, it’s likely that the 5-day moving average gives an important clue on how the park rates should be set. Could you add a 5-day MA to ? It doesn’t have to show by default, to avoid clutter.


Just to clarify. You want a five day moving average, not a five day window, right?

Right, five-day moving average. See here and here

Since the 5-day MA changes slowly, you can sample the 4-hour MA results every hour and average the latest 120 samples to get the 5-day MA, so the resource cost is small. There will be one new data point on the plot every hour.
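The sampling scheme described above can be sketched as follows (the class name is mine, for illustration): keep only the latest 120 hourly samples of the 4-hour MA and average them, so the storage cost is bounded at 120 numbers.

```python
from collections import deque

class FiveDayMA:
    """Approximate a 5-day moving average by sampling an existing
    4-hour MA once per hour and averaging the latest 120 samples
    (5 days x 24 hours)."""

    def __init__(self, samples=120):
        self.samples = deque(maxlen=samples)

    def add_hourly_sample(self, four_hour_ma):
        # Appending to the bounded deque discards the oldest sample,
        # so memory use never exceeds `samples` values.
        self.samples.append(four_hour_ma)
        return sum(self.samples) / len(self.samples)
```

Each hourly update produces one new data point for the plot, and no raw trade history needs to be kept beyond what the 4-hour MA already uses.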


There are only 2,000 NBT parked, but the wall is balanced. We need more parked NBT to help replenish the reserve.
I suggest the park rates be set to

- 32% @ 11 days
- 16% @ 23 days
- 8% @ 1.5 months
- 4% @ 3 months
- 0% @ 6 months

The above base-2 exponential distribution of park rates gives the same total premium for any non-zero period. It encourages choosing the shorter periods due to less opportunity cost. Shorter periods require frequent renewal, which aligns with Nu’s need to adjust the total amount of parked NBT according to liquidity and the Standard.
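The equal-premium claim can be checked arithmetically. Approximating 1.5 months as 45 days and 3 months as 91 days (my approximations, not from the proposal):

```python
# (annual park rate, parking period in days)
schedule = [(0.32, 11), (0.16, 23), (0.08, 45), (0.04, 91)]

# Total premium for a period = annual rate scaled by the fraction
# of a year the NBT stays parked.
premiums = [rate * days / 365 for rate, days in schedule]

for (rate, days), premium in zip(schedule, premiums):
    print(f"{rate:>4.0%} for {days:>2} days -> total premium {premium:.2%}")
```

Every non-zero tier pays roughly a 1% total premium, because each halving of the rate is paired with a doubling of the period. That is what leaves shareholders only one knob to turn: the overall premium level.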

With the base-2 exponential distribution, Nu has only one park-rate parameter to adjust: the total premium. This will make operation and analysis easier.