=##=##=##=##=##=## Motion hash starts with this line ##=##=##=##=##=##=
The NuTeam will implement frequency voting, whereby the wallet client hosted on Nubits.com contains an additional column when voting on motions. This column, referred to in this motion as ‘voting probability’, holds a user-defined real number between 0 and 1 and defaults to zero. Upon minting, a wallet client will roll a random number between 0 and 1 for each motion; if that number is less than the user-defined voting probability, it will vote for that motion. Otherwise, it will not vote for that motion. Standard voting rules still apply, including a maximum number of motions voted for per block; when that number is exceeded, preference is given to motions with a higher voting probability.
The NuTeam will be awarded 50 NBT for doing so.
=##=##=##=##=##=## Motion hash ends with this line ##=##=##=##=##=##=
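A minimal sketch of the per-block roll described in the motion, in Python (the function and parameter names are mine for illustration; this is not existing Nu client code):

import random

def choose_motion_votes(voting_probabilities, max_motions_per_block):
    """Decide which motions a newly minted block votes for.

    `voting_probabilities` maps a motion hash to the user-defined
    voting probability in [0, 1] (default 0). Illustrative names only.
    """
    winners = []
    for motion_hash, p in voting_probabilities.items():
        if random.random() < p:       # one roll per motion per minted block
            winners.append((p, motion_hash))
    winners.sort(reverse=True)        # prefer higher voting probabilities
    return [h for _, h in winners[:max_motions_per_block]]

With the default probability of 0, a wallet never votes for a motion unless the user (or a default averaging rule) raises it.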
This seems incredibly simple to implement: just a few small changes to the standard wallet software. If we turn on frequency averaging by default, this implementation ends up looking like a decentralized data feed. I haven’t heard anyone mention any real downsides to it.
Please don’t let this be forgotten; it is one of the biggest breakthroughs in democratic implementation I’ve seen since the creation of Nu.
It seems the aim is to basically allow apathetic voters to stay “neutral”. One might have to be careful with the convergence properties of this system, as I have a feeling that 10000 blocks might no longer be robust enough.
Let’s say 80% of voters are apathetic. Then, due to some strange goings-on in the forums, all of the remaining 20% of voters vote yes on a motion for 1,000 blocks before deciding they don’t want that motion at all. Let’s say those 1,000 blocks were enough time for the apathetic voters to equilibrate at 100% voting. Now, as all the conscious shareholders stop voting for the motion, we can assume that in each subsequent 10 blocks roughly 8 apathetic voters vote for the motion and the 2 conscious shareholders vote against it. Depending on the window of averaging we choose (let’s take 1,000 blocks as an example), the apathetic yes-rate would drop by something like 20% every 1,000 blocks. The apathetic voters would reach equilibrium with the conscious voters before 5,000 blocks pass.
This brings up an important metric: (window of averaging) / (fraction of non-apathetic voters).
If this ‘apathy parameter’ is less than the minimum number of blocks required to reach consensus, we should be golden. Of course, the window of averaging needs to be big enough to smooth over statistical aberrations, but it need not be as robust as the actual vote criteria for Nu. That’s why I suggest 1,000 blocks, which would remain robust all the way up to 80% apathy (where the apathy parameter reaches the 5,000 blocks required to reach minimum consensus).
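To make the decay argument above explicit (my notation, not from the thread): if a fraction a of minters is apathetic and every conscious minter votes no, each averaging window multiplies the apathetic yes-rate by roughly a:

$$p_{n+1} \approx a\,p_n \quad\Longrightarrow\quad p_n \approx a^n p_0,$$

so equilibration takes on the order of 1/(1-a) averaging windows. At a = 0.8 with a 1,000-block window, that is 1000/0.2 = 5,000 blocks, matching the example above.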
We know its lower limit, which is 1 minus the fraction of minting shares. At current difficulty, this is about 1 - 380 million/880 million ≈ 57%. At least 57% of NSR don’t vote and don’t care.
This fraction can also be measured: everyone who cares to vote can vote yes on a test motion; then the yes percentage tells us how many are actively voting.
We could try to measure it, but when making changes we should try to make them general. I think a 1,000-block averaging window for default frequency voting is a very robust choice even at apathy fractions well above 50%. Only minting NSR holders matter for the apathy fraction; an apathy fraction above 50% means literally no motion or grant can pass in the current system.
Frequency voting would allow for many other benefits, such as the ability to subscribe to a linear combination of data feeds and to automatically train voter preferences with respect to data feeds (sketched below).
Finally, frequency voting would be purely optional: you can simply only vote 100% or 0% and turn off the default averaging.
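As a hypothetical sketch of the data-feed benefit (the feed names, weights, and blending function are all illustrative, not an existing API):

# Blend several data feeds' recommended probabilities into one
# per-motion voting probability, weighted by user-chosen trust.
feeds = {
    "feedA": {"motion_x": 1.0, "motion_y": 0.0},
    "feedB": {"motion_x": 0.6, "motion_y": 0.2},
}
weights = {"feedA": 0.7, "feedB": 0.3}   # user-chosen, summing to 1

def blended_probability(motion_hash):
    # Weighted average of each feed's recommendation for this motion.
    return sum(weights[name] * probs.get(motion_hash, 0.0)
               for name, probs in feeds.items())

print(blended_probability("motion_x"))   # 0.7*1.0 + 0.3*0.6 = 0.88

‘Training’ voter preferences could then amount to, for example, adjusting the weights based on how often each feed matched the user’s manual votes.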
@dysconnect, explore the convergence properties to your heart’s content. I’m a little put off that I was not able to find a full algebraic solution, but a Monte Carlo method isn’t terrible either. It is certainly illuminating for me.
Example: the idea behind these parameters is to make a big wave (all ‘no’ votes, then all ‘yes’ votes on a motion for 1,000 blocks straight) and see how the autovoting handles it with a large apathy percentage (80%) and only modest shareholder support (20%); see Graph 1 below.
If we take the voter apathy to 0%, we recover a simulation of the protocol as it exists presently. The following example compares a 0% apathy case to a 50% apathy case. If 50% apathy existed in the network as it currently stands, the entire system would shut down and no motion would be passable. Also, please remember that these are inherently stochastic (i.e. random) processes, and that the curve shown for the 50% apathy case was intentionally selected as one where the motion reached the 50% mark, to show contrast. I should probably be averaging many runs, but the program is slow. Anyway, see Graph 2 below.
The intent of frequency voting is to simulate the 0% apathy case with only a few voters. There is no direct analog to the current situation, because currently we would be totally frozen out at a 50% apathy rate.
Also, there are tons of other benefits to using the frequency voting mechanism. Default voting is just the lowest-hanging fruit.
And, finally, the attack vector. If someone were to hijack the network for 10 consecutive blocks (about 1% of minting shares would get you 10 blocks spread throughout 1,000 blocks, which is a weaker attack), even a 99.99% complacent network achieves convergence; see Graph 3 below.
Do I interpret the two graphs right if I understand
- the second one as showing the results with 0% and 50% voter apathy on a similar or almost identical level, and
- the third one as showing the resiliency of the frequency voting system against attack?
If my understanding is right, this looks like a very nice way to keep the Nu network able to make decisions even if some or many voters don’t care about voting and are fine with following the lead of active voters.
edit: damn I didn’t scroll far enough up. Corrected the references to the graphs…
Yah, I’m basically showing that with reasonable parameters the risk of frequency voting diverging is vanishingly small. What I’m getting at is that frequency voting has almost no downsides, is superior to the current voting method, and is easy to implement. With all the talk in the Bitcoin community about how to achieve consensus, implementing frequency voting could make some serious waves.
Graph 1. Even at 80% apathy the network converges efficiently
Graph 2. 50% apathy simulates 0% apathy well, as intended
Graph 3. Convergence is achieved even with 99.99% apathy, which is of course crazy big. That would mean that for every minter who cares there are roughly 10,000 minters who don’t.
If you ask the feed providers to slightly alter the motion hash, you know what percentage configures votes manually and which feed provider is responsible for how many votes.
Like an all-zeroes hash for manual configuration, a 1 at the beginning or end for @Cybnate’s feed, a 2 at the beginning or end for @cryptog’s feed.
And all with no extra bloating of the blockchain!
Assuming that the feed providers supported this test and the minters didn’t adjust their behaviour, this could quite accurately measure the percentages.
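Tallying such a test would be straightforward; a hypothetical sketch, assuming the variant marker is the last character of each voted hash:

from collections import Counter

# Illustrative mapping from hash-variant marker to source, following the
# suggestion above (0 = manual, 1 = @Cybnate's feed, 2 = @cryptog's feed).
SOURCES = {"0": "manual", "1": "Cybnate feed", "2": "cryptog feed"}

def vote_shares(voted_hashes):
    """Return the fraction of recent votes attributable to each source."""
    counts = Counter(SOURCES.get(h[-1], "unknown") for h in voted_hashes)
    total = sum(counts.values())
    return {source: n / total for source, n in counts.items()}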
That’s great. Basically, within 10,000 blocks something with even just a 30% approval rate could pass quite easily. Given your graphs and a few runs on my computer, I guess the voting window would have to be set somewhere over 20,000 if we were to implement this feature with 80% apathetic voting shares. Equivalently, one could use a somewhat slower or biased averaging process.
Data feeds aren’t bad, we just need to develop transparency mechanisms or stuff like this in the long term for robustness.
The following is a simulation with 80% of minters apathetic and, of the remaining 20%, 30% supporting the motion. It was performed using the parameters I am suggesting (keep the protocol the same and just institute a 1,000-block average for apathetic voters). As you can see, the motion does not really come close to passing. To pass, a motion would still require very close to 50% support. Note that as it stands, if 49% of minters are voting for a motion, it could potentially pass by random chance. Frequency voting does not change the protocol rules at all, so that is still true. With 30% support, however, a motion really doesn’t have any chance of passing, with or without frequency voting.
@dysconnect, I could really use your consensus here, so I’d like to understand more about your last comment. Can you send me the parameters of your run and the text file where you saw a motion pass with 30% support?
# Parameters
# Defined by protocol
votingwindow=10000
# Averaging window
window=1000
# what % of voters are apathetic?
apathetic=.8
# what % of non-apathetic voters consent to this motion?
consent=0.3
# how many 'yes' blocks in a row to start? (you can make your own initial array further down in the code if you want)
initial=1000
# how many data points after initialization?
iterations=100000
It returns “passed? True” quite often, on the order of half of the time.
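The script itself isn’t posted in the thread, so here is a minimal sketch consistent with the parameters above and the “passed? True” output; the voter model and pass check are my reading of the discussion, not the original code:

import random
from collections import deque

def simulate(votingwindow=10000, window=1000, apathetic=0.8,
             consent=0.3, initial=1000, iterations=100000):
    avg_buf = deque()    # recent votes seen by the apathetic averaging rule
    pass_buf = deque()   # recent votes counted by the protocol's pass check
    avg_yes = pass_yes = 0

    def push(vote):
        nonlocal avg_yes, pass_yes
        avg_buf.append(vote); avg_yes += vote
        pass_buf.append(vote); pass_yes += vote
        if len(avg_buf) > window:
            avg_yes -= avg_buf.popleft()
        if len(pass_buf) > votingwindow:
            pass_yes -= pass_buf.popleft()

    for _ in range(initial):    # seed the history with consecutive yes blocks
        push(1)

    passed = False
    for _ in range(iterations):
        p_recent = avg_yes / len(avg_buf) if avg_buf else 0.0
        if random.random() < apathetic:
            vote = int(random.random() < p_recent)  # mirror the recent yes rate
        else:
            vote = int(random.random() < consent)   # vote own preference
        push(vote)
        if len(pass_buf) == votingwindow and pass_yes > votingwindow // 2:
            passed = True    # majority of the last 10,000 blocks at some point
    return passed

print("passed?", simulate())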
Yah, so your initial is 1000, which means you’re starting out with 1000 yes blocks. That’s a highly improbable setup that I was just using as a proof of concept. Try it with initial=0 to get a much more realistic simulation.
You were simulating what happens when someone takes control of the network for 1,000 blocks and votes for a motion 100%, and then the shareholders proceed to support it at 30% with 80% of voters apathetic. It’s understandable that a motion would pass under those extremely artificial conditions.
If you are running it a bunch, I’d be curious to know at what level of consent a motion starts passing with initial=0. I’m guessing 45%. It totally depends on your apathy level and random luck, though.
I see. Setting initial=0 but apathetic=.9 still gave some high pass rates for consent=0.45. It’s debatable whether that’s significant, but I guess there may be cheap ways to patch up this small loss of robustness.
But is it a significant loss of robustness at all? At 0% apathy, which is the non-frequency-voting case, I think you’ll find a motion will often pass at 48% consent. Even at 1% shareholder support in the current system, given infinite time, everything has a statistical certainty to pass. If we really want, we can just raise the bar to 60% for passage. I don’t think we need to worry ourselves over a 3% difference at 90% apathy, however.
Thanks for verifying my plots and algorithm, I appreciate it.
It’s not that often that a 48% motion passes with 0% apathy. The gap is 200 votes, which is extremely hard to reach for a binomial distribution with n = 10000 and p = 0.48 (σ = √(10000 · 0.48 · 0.52) ≈ 50, so 200 votes is about 4 standard deviations). I don’t want to analyse the random process formally, but my gut feeling tells me it’s going to take millions of blocks for it to pass with at least 1% probability, whereas with 90% apathy this kind of probability might be seen with as little as 40% support.
As for setting a higher threshold, I agree it’s a matter of trade-off, but keeping the gap between consent and motion passage as small as possible is a good thing to think about, just not top priority.