I’m seeing 0.0 parked in my production client. Can you try restarting the daemon and let me know what it says after that?
You mean totalparked in B wallet mode?
After the restart:
nud -unit=B getinfo
{
"version" : "v2.1.1-RC1-15-gcf6a10f-beta",
"protocolversion" : 2000000,
"walletversion" : 1,
"walletunit" : "B",
"balance" : 0.0,
"newmint" : 0.0,
"stake" : 0.0,
"parked" : 0.0,
"blocks" : 837356,
"moneysupply" : 1291595.0429,
"totalparked" : 0.0,
"timeoffset" : 0,
"connections" : 1,
"proxy" : "",
"ip" : "obfuscated",
"difficulty" : 0.00019687,
"testnet" : false,
"keypoololdest" : 1460528227,
"keypoolsize" : 101,
"paytxfee" : 0.01,
"errors" : ""
}
Correct.
Strange. The restart “removed” the parked NBT!
Let’s compare LP compensation and park rates.
LP compensation:
- paid at 4.2%-7.2% per month (50%-86% p.a.),
- funds are available, so opportunity cost is low,
- has exchange default risk (~50% p.a.?),
- BTC price risk if in the NBT/BTC pair.
Park premium:
- funds are not available while parked, so there is an opportunity cost,
- no exchange default risk,
- no BTC price risk.
To keep it simple, assume the exchange default risk equals the opportunity cost; then I conclude that park rates have to reach the NBT/USD liquidity provider compensation rate (50% p.a.) to be attractive.
Edit to add: the park rate is now 7.5% p.a. and there are 139 NBT parked in total. A couple of percent used to attract tens of thousands. But I guess the novelty has worn off and expectations have changed.
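A quick arithmetic check of the comparison above, as a sketch: simple annualization with no compounding, and the break-even assumption is the one stated (exchange default risk roughly cancels the parking opportunity cost).

# Simple annualization of the quoted LP compensation (no compounding),
# matching the 50%-86% p.a. range above.
monthly_low, monthly_high = 0.042, 0.072
annual_low, annual_high = monthly_low * 12, monthly_high * 12
print(f"LP compensation: {annual_low:.0%} - {annual_high:.0%} p.a.")

# Under the stated assumption, the break-even park rate is the NBT/USD
# LP compensation rate itself.
breakeven_park_rate = annual_low
current_park_rate = 0.075  # 7.5% p.a., from the edit above
print(f"break-even ~ {breakeven_park_rate:.0%} p.a., current {current_park_rate:.1%} p.a.")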
I’m waiting to park till it reaches the numbers some others have quoted, like 10% or 25%. Why would I park now when I know people on the forum are talking about increasing rates?
Because what if BTC prices reverse tomorrow and the park rate nosedives as a result?
Even if every shareholder dropped to 0% right now I’d still have 5 days or so before the rate started dropping.
It will start dropping from the next block with about 50% probability.
But you are right that there will be enough time to lock in. The point is you don’t really know where the top is, because it follows the price of BTC.
We are in a situation where the last 10k blocks look like this:
- the oldest 2500 blocks are 0%,
- the middle 5000 blocks are between 0% and 10%,
- the latest 2500 blocks are 10% or higher.
Assume all shareholders immediately begin voting for 0%. Until those oldest 2500 blocks drop off, the rate will not change. That means that in this hypothetical situation the park rate would remain entirely unchanged 100% of the time for another 2500 blocks.
If, when the next block comes, the oldest block leaving the window has a rate higher than the current rate (a 50% chance), then the sorted distribution will have one member fewer on the greater side of the median point, and the median moves to the member on the smaller side of the previous median, which has an equal or lower rate. If the oldest block leaving the window has a rate lower than the current rate, then nothing changes.
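A minimal simulation of the behaviour described above, as a sketch: it assumes the effective rate is the median of the votes in a rolling 10,000-block window and seeds the window with the rough 2500/5000/2500 split from the earlier post; the exact protocol rule may differ.

import random
import statistics
from collections import deque

# Rough reconstruction of the current window: oldest 2500 blocks voted 0%,
# middle 5000 between 0% and 10%, newest 2500 at 10% or higher (p.a. %).
window = deque([0.0] * 2500
               + [random.uniform(0.0, 10.0) for _ in range(5000)]
               + [random.uniform(10.0, 15.0) for _ in range(2500)])

# Hypothetical: every shareholder votes 0% from the next block onward.
checkpoints = {1, 2500, 2501, 3000, 4000, 5000}
for block in range(1, 5001):
    window.popleft()     # oldest vote drops out of the window
    window.append(0.0)   # new block votes 0%
    if block in checkpoints:
        print(f"after {block:4d} blocks: median vote = "
              f"{statistics.median(window):.2f}%")

# The median stays put for the first ~2500 blocks (only 0% votes drop off),
# then declines as the higher old votes start leaving the window.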
Right. This is the case currently and will be so for another half week or so.
OK I see. Park rate is really for the slow components (low frequency in Fourier space) of peg balancing. Don’t expect it to come or go quickly.
Realising this, it’s wrong to mandate that the T4 reserve is used before park rates. T4 and park rates should always be used together, as they cover different parts of frequency space. The optimum park rate could be crafted from the MA5 of e.g. the buy/sell ratio. As the rate constantly changes, a bot can be used to update the park rate feed. This could save a lot of T4 reserve.
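A sketch of how such a bot might derive a rate for the feed; the mapping, the target ratio, the gain and the cap are all placeholders for illustration, not anything specified in the thread.

from statistics import mean

def park_rate_from_ratio(buy_sell_ratios, target=1.0, gain=0.5, cap=0.5):
    """Map the MA5 of the buy/sell liquidity ratio to an annual park rate:
    the further the smoothed ratio falls below the target (sell-side
    pressure), the higher the rate, capped at `cap`. Placeholder logic."""
    ma5 = mean(buy_sell_ratios[-5:])          # 5-sample moving average
    shortfall = max(0.0, target - ma5)
    return min(cap, gain * shortfall)

# Example: five daily readings trending sell-heavy.
print(f"{park_rate_from_ratio([1.10, 0.95, 0.80, 0.70, 0.65]):.1%} p.a.")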
It’s important to record how many NBT are parked as a function of park rates, so future models can be made to set the optimum park rates. Now the rates are non-zero. Can you configure your server and record them?
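A minimal logging sketch for that, assuming only the nud -unit=B getinfo call shown earlier in the thread; where the current park rate comes from is left to the caller, since the getinfo output above does not report it.

import csv, json, subprocess, time

def total_parked():
    """Read totalparked from the B-unit daemon via getinfo (as shown above)."""
    out = subprocess.check_output(["nud", "-unit=B", "getinfo"])
    return json.loads(out)["totalparked"]

def record(rate_source, path="parked_log.csv", interval=3600):
    """Append (unix time, park rate, totalparked) once per interval.
    rate_source is a caller-supplied function returning the current
    park rate, e.g. read from the rate feed; no RPC for it is assumed."""
    while True:
        row = [int(time.time()), rate_source(), total_parked()]
        with open(path, "a", newline="") as f:
            csv.writer(f).writerow(row)
        time.sleep(interval)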
I feel like that’s higher frequency than the Standard. I really can’t think of a lower frequency smart trigger than the Standard.
Right!
But for that you need a way to replenish T4 - which would be T6 in this case.
That’s why I prefer NSR sale to park rates.
Probably right, except for two problems: the Standard is currently only measured once a week, while MA5 is an integral; and the Standard is an absolute number instead of a ratio. Both problems can be solved.
Edit: I forgot the important and unsolvable part with the Standard: it is a spot value, not reflecting the low frequency component. MA5(Standard/circulating) might do.
Excellent. To which I replied:
My issue with this is that park rates already account for the circulating dependency. Meaning that we don’t really get X NBT parked given a certain rate; we get X% of circulating NBT parked. As the network grows, the number of parked NuBits per park % will grow.
I agree. I imagine that the park rate will be continuously calculated based on MA5(Standard/circulating), where the Standard and circulating calculations have had the parked amount removed. As the parked amount increases, the Standard approaches 0.
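One possible reading of that, as a sketch; the exact definition of the Standard is not given in this thread, so treat the adjustment below as an assumption.

from statistics import mean

def adjusted_ratio(standard, circulating, parked):
    """Standard/circulating with the parked amount removed from both,
    per the proposal above; as parked grows, the adjusted Standard
    shrinks toward 0. The exact adjustment is an assumption."""
    return (standard - parked) / (circulating - parked)

def ma5_standard_ratio(history):
    """MA5 of the adjusted ratio over the last five (standard,
    circulating, parked) weekly observations."""
    return mean(adjusted_ratio(s, c, p) for s, c, p in history[-5:])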
Also, if one is using the Standard for next week’s buyback, ideally one has to calculate a forward-looking Standard, which takes into account the parked amount that will be freed next week.
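A correspondingly hedged sketch of that forward-looking adjustment; the sign convention (freed parked NBT lowering the Standard as they re-enter circulation) is an assumption, not something the post spells out.

def forward_looking_standard(standard_now, parked_freed_next_week):
    """Adjust the current Standard for parked NBT whose park period ends
    within the coming week and which therefore re-enter circulation
    before the buyback. Sign convention is an assumption."""
    return standard_now - parked_freed_next_week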