Park Rate Voting

Strange. The restart “removed” the parked NBT! 🙂

Let’s compare LP compensation and park rates

LP compensation:

  • paid at 4.2%-7.2% per month (50%-86% p.a.),
  • funds remain available, so opportunity cost is low,
  • carries exchange default risk (~50% p.a.?),
  • carries BTC price risk if in the NBT/BTC pair.

Park premium:

  • funds are unavailable while parked, so there is an opportunity cost,
  • no exchange default risk,
  • no BTC price risk.

To keep it simple, assume the exchange default risk cancels out the opportunity cost. I conclude that park rates have to reach the NBT/USD liquidity provider compensation rate (50% p.a.) to be attractive.
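
As a quick sanity check on those numbers: the quoted annual figures appear to be simple, non-compounding annualizations of the monthly rates.

```python
# Simple vs. compounded annualization of the quoted monthly LP rates.
for monthly in (0.042, 0.072):
    simple = monthly * 12
    compounded = (1 + monthly) ** 12 - 1
    print(f"{monthly:.1%}/month -> simple {simple:.0%}, compounded {compounded:.0%}")
# 4.2%/month -> simple 50%, compounded 64%
# 7.2%/month -> simple 86%, compounded 130%
```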

Edit to add: the park rate is now 7.5% p.a. and there are 139 NBT parked in total. A couple of percent used to attract tens of thousands of NBT. But I guess the novelty has worn off and expectations have changed.


I’m waiting to park till it reaches the numbers some others have quoted, like 10% or 25%. Why would I park now when I know people on the forum are talking about increasing rates?


Because the BTC price could reverse tomorrow, and the park rate would nosedive as a result?

Even if every shareholder dropped to 0% right now I’d still have 5 days or so before the rate started dropping.

It will start dropping from the next block with about 50% probability.

But you are right, there will be enough time to lock in. The point is that you don’t really know where the top is, because it follows the price of BTC.

We are in a situation where the last 10k blocks look like this:

  • the oldest 2500 blocks vote 0%,
  • the middle 5000 blocks vote between 0% and 10%,
  • the latest 2500 blocks vote 10% or higher.

Assume all shareholders immediately begin voting for 0%. Until those oldest 2500 blocks drop out of the window, the rate will not change: each new 0% vote replaces an old 0% vote, leaving the distribution and its median untouched. In this hypothetical situation, the park rate would remain unchanged with certainty for another 2500 blocks.

If, when the next block arrives, the vote leaving the window is higher than the current rate (a 50% chance), then the sorted distribution has one member fewer on the greater side of the median while the incoming 0% vote adds one to the smaller side, so the median slides down one position to an equal or smaller rate. If the vote leaving the window is lower than the current rate, nothing changes, since one low vote is simply replaced by another.
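
To make this concrete, here is a toy simulation, assuming the effective park rate is simply the median of the votes in the last 10,000 blocks as described above. It shows the rate staying flat for 2500 blocks of all-0% voting and only then decaying:

```python
import random
from statistics import median

random.seed(1)

# Hypothetical vote window matching the description above: oldest 2500
# votes at 0%, middle 5000 between 0% and 10%, newest 2500 at 10%+.
window = (
    [0.0] * 2500
    + [random.uniform(0.0, 10.0) for _ in range(5000)]
    + [random.uniform(10.0, 20.0) for _ in range(2500)]
)

print(f"initial rate: {median(window):.2f}%")
for block in range(1, 5001):   # every new block votes 0%
    window.pop(0)              # oldest vote drops out of the window
    window.append(0.0)         # new 0% vote enters
    if block in (2500, 5000):
        print(f"after {block} blocks: {median(window):.2f}%")
# The rate is identical after 2500 blocks (0% votes replaced 0% votes)
# and only then starts to fall toward 0%.
```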

Right. This is the case currently and will be so for another half week or so.

OK, I see. The park rate is really for the slow components (low frequency in Fourier space) of peg balancing. Don’t expect it to come or go quickly.

Realising this, it’s wrong to mandate that the T4 reserve be used before park rates. T4 and park rates should always be used together, as they cover different parts of the frequency space. The optimum park rate could be crafted from the MA5 of e.g. the buy/sell ratio. As the rate constantly changes, a bot could be used to update the park rate feed, along the lines of the sketch below. This could save a lot of T4 reserve.
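
For illustration only, a minimal sketch of such a bot, assuming a daily buy/sell ratio sample as input; the MA5-to-rate mapping and the 50% cap are placeholders rather than a fitted model:

```python
from collections import deque

class ParkRateFeed:
    """Toy park-rate feed driven by the MA5 of the buy/sell ratio."""

    def __init__(self, days=5):
        self.samples = deque(maxlen=days)  # one buy/sell ratio per day

    def update(self, buy_sell_ratio):
        """Record today's buy/sell ratio and return a suggested rate."""
        self.samples.append(buy_sell_ratio)
        ma5 = sum(self.samples) / len(self.samples)
        if ma5 >= 1.0:
            return 0.0                     # buy side healthy: no parking needed
        # Placeholder rule: ramp the rate up as the buy side thins out,
        # capped at the 50% p.a. figure discussed earlier in the thread.
        return min(0.50, (1.0 - ma5) * 0.50)

feed = ParkRateFeed()
for ratio in (1.2, 1.0, 0.8, 0.6, 0.5):
    print(f"MA5 input {ratio}: suggested rate {feed.update(ratio):.1%} p.a.")
```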


@desrever and @willy,

It’s important to record how many NBT are parked as a function of park rates, so future models can be built to set optimum park rates. Now that the rates are non-zero, can you configure your server to record them?

I feel like that’s higher frequency than the Standard. I really can’t think of a lower-frequency smart trigger than the Standard.

Right!

But for that you need a way to replenish T4 - which would be T6 in this case.
That’s why I prefer NSR sales to park rates.

Probably right, except for two problems: the Standard is currently only measured once a week, while MA5 is an integral; and the Standard is an absolute number instead of a ratio. Both problems can be solved.

Edit: I forgot the important and unsolvable part with the Standard: it is a spot value, not reflecting the low-frequency component. MA5(Standard/circulating) might do.

Excellent. To which I replied:


My issue with this is that park rates already account for the circulating dependency. Meaning that we don’t really get x NBT parked at a given rate; we get x% of circulating NBT parked. As the network grows, the number of parked NuBits per park % will grow.

I agree. I imagine that the park rate will be continuously calculated based on MA5(Standard/circulating), where the Standard and circulating calculations have had the parked amount removed. As the parked amount increases, the Standard approaches 0. Something like the sketch below.
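
A minimal sketch of that rule, assuming daily series for the Standard and for circulating NBT with the parked amount already excluded; the gain and cap are invented placeholders that would have to be fitted:

```python
def ma5(series):
    """Plain 5-day moving average over the most recent samples."""
    window = series[-5:]
    return sum(window) / len(window)

def suggested_park_rate(standard, circulating, gain=1.0, cap=0.50):
    """Map MA5(Standard/circulating) to an annualized park rate.

    `standard` and `circulating` are daily series with parked NBT removed.
    A negative Standard (sell-side pressure) pushes the rate up; the gain
    and cap would have to be fitted to data rather than guessed.
    """
    ratio = ma5([s / c for s, c in zip(standard, circulating)])
    return max(0.0, min(cap, -ratio * gain))

# Five days of a -$5k Standard against 100k circulating -> 5% suggested rate.
print(suggested_park_rate([-5_000.0] * 5, [100_000.0] * 5))  # 0.05
```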

Also, if one is using the Standard for the next week’s buyback, ideally one has to calculate a forward-looking Standard, which takes into account the parked amount that will be freed next week.

If you take Standard/circulating you’re double-counting the circulating. Let’s imagine two circumstances, one where there are 1 million circulating NBT and one where there are 100k. Let’s say the Standard is -$5k. With your calculation the park trigger is 0.5% in the first case and 5% in the second. Let’s pretend we use that as the park rate, and let’s also assume that for each park rate %, that % of the network’s NuBits gets parked. Then in both cases 5,000 NBT gets parked.
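
A quick check of that arithmetic, showing how the circulating supply cancels out, so both scenarios park the same 5,000 NBT:

```python
# Standard of -$5k against two different circulating supplies.
for circulating in (1_000_000, 100_000):
    standard = -5_000
    rate = -standard / circulating    # park trigger from Standard/circulating
    parked = rate * circulating       # assume rate% of circulating gets parked
    print(f"circulating {circulating:,}: trigger {rate:.1%}, parked {parked:,.0f} NBT")
# circulating 1,000,000: trigger 0.5%, parked 5,000 NBT
# circulating 100,000: trigger 5.0%, parked 5,000 NBT
```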

Lol, I think I just convinced myself that you’re right and I’m wrong. I’ll think about it some more.


I don’t assume it’s a linear function. The function should ideally be determined by testing and optimizing against past data. So it doesn’t matter whether it is Standard/circulating or just the Standard; it is f(Standard, circulating, tolerance, many other parameters).
The question is: what is the goal of the optimization? My rough guess is that we optimize to minimize peg maintenance cost over a given period.


I don’t believe the parking function is fundamentally repeatable because the structure of Nu, the blockchain, and the risk associated with Nu are all changing. Finding a function for this seems like an impossible task.

There are methodologies: parameterize and fit; Monte-Carlo simulation. It could make a pretty good Master’s thesis. The problem is we don’t have a lot of data.
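
To illustrate what “parameterize and fit” might look like: a toy Monte-Carlo search over the gain of a park-rate rule like the one sketched earlier, minimizing total peg-maintenance cost over simulated demand paths. The demand process and cost model here are pure placeholders; with real historical data this would be replaced by backtesting:

```python
import random

def simulate_cost(gain, days=90, seed=0):
    """Toy peg-maintenance cost for one parameter choice.

    Daily net demand is an invented Gaussian; cost = park interest owed
    plus whatever shortfall the T4 reserve still has to cover.
    """
    rng = random.Random(seed)
    circulating = 100_000.0
    cost = 0.0
    for _ in range(days):
        standard = rng.gauss(0.0, 5_000.0)                 # toy daily Standard
        rate = max(0.0, min(0.50, -standard / circulating * gain))
        parked = rate * circulating                        # toy parking response
        cost += parked * rate / 365                        # daily interest owed
        cost += max(0.0, -standard - parked)               # T4 covers the rest
    return cost

# Same seed for every candidate so the comparison is apples to apples.
best_cost, best_gain = min((simulate_cost(g), g) for g in (0.5, 1.0, 2.0, 4.0, 8.0))
print(f"best gain {best_gain} with cost {best_cost:,.0f}")
```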