I have to admit to still being very much on the fence on this issue.

I like the idea of being able to quickly pass a motion, but my caveat would be that there has been suitable visibility and discussion around it. There have been motions in the past which gained majority consensus through discussion but which still took an extra few days to pass.

That isn’t what this shortening of the voting period does, though. As has been pointed out, it could lead to motions passing without that discussion.

I like the flexibility a shorter voting time gives, but I don’t like the possibility that it could be more open to abuse.

I think that, like @Cybnate, I’ll have to digest this further.

Hi @willy

Here are the details for the Motion Vote on 04512be5c164e77dd354a8267d59f2f11fba29c2:

## 04512be5c164e77dd354a8267d59f2f11fba29c2

**Blocks**: 2332 (`23.320000%`)

**Share Days**: 2131811962 (`42.022881%`)

Hi @cryptog

Here are the details for the Motion Vote on 04512be5c164e77dd354a8267d59f2f11fba29c2:

## 04512be5c164e77dd354a8267d59f2f11fba29c2

**Blocks**: 2371 (`23.710000%`)

**Share Days**: 1721210943 (`38.787730%`)

The motion to make auction size adjustable took 10 days to pass.

It passed just after the deadline that set the size of the next NSR auction.

This lag showed that we need faster motions.

What about starting with 2001 blocks (2~4 days) instead of the originally proposed 1001 blocks (1~2 days)?

I would vote immediately for such a modified motion.

Both decisions were based on motions.

If motions had been altered to pass with fewer votes, both motions would have been affected by it.

So this is no reason for faster motions.

But as long as Nu doesn’t have automatic ways to burn NBT (and needs to rely on manual interaction like the NSR sale / NBT burn motion), it might be good to be more agile.

Still, the motion would move much slower than the market, and it wouldn’t have a feedback loop at the protocol level.

So this wouldn’t be a fix, but a workaround until Nu is equipped with an automatic, continuous and reliable way to do that (which Nu desperately needs!).

I very much like this draft:

It might have made the time between the auction motion passing and the auction control motion passing less than a week. Both would have gone out faster, and the time between them would also have shrunk.

That said, I think this situation was hyper specific and rare. I think we should continue to discuss this motion (and possible variants if we don’t think this motion has been getting support) but we should not be quick to make the current situation indicative of future events. We have a full history of motions to look upon.

The motion to make auction size adjustable also took longer because it didn’t have a majority of voters at times. Faster motions won’t really help with that anyway. I’m still not convinced that faster motions are the answer.

For burning we just have manual burning in place, and Jamie burnt a lot in the last day, so it is hard to evaluate what the effect is. Automatic mechanisms can be better, but it won’t hurt to keep a human factor in it to start with. But I’m digressing from the topic.

Hi @cryptog

Here are the details for the Motion Vote on 04512be5c164e77dd354a8267d59f2f11fba29c2:

## 04512be5c164e77dd354a8267d59f2f11fba29c2

**Blocks**: 1546 (`15.460000%`)

**Share Days**: 853507570 (`22.375634%`)

For reasons I don’t understand I have missed this thread for the past 25 days.

Anyway, I share some of the concerns. Here is my idea to check the advantage of a shorter motion window: introducing a cost.

If it is mandated that all motions passed with a 2,000-block window be nullified 7 days after passage, then a normal 10,000-block window motion will have to pass to make the motion permanent, with all due debate and deliberation.

The onus is on the motion proposer to do extra diligence to make a quick win permanent.

Very interesting. We could do the tiered motion this way too, with short-term, medium-term, and permanent motions. This would allow us to have a lower threshold for something like a pool’s continuation proposal if no additional funds are required. In theory, nullification windows are only one possibility. We could define a set of clear, undebatable criteria that a motion must satisfy in order to pass in a shorter time frame.

Where exactly would the danger from a reduced rolling window size come from?

I’m not talking of a rolling window size of 10 blocks here, but of 2,000 blocks like proposed.

While I agree that in general bigger window sizes aren’t less secure than smaller ones, they are not necessarily always more secure.

Basically this is what this security discussion is about: how far can we go below a 10,000-block window size before security is degraded by x%?

And what is “security” in this case?

One aspect of security is dealing with variance when minting: if variance plays a big role in the outcome of a vote, I consider it relevant for security.

I still believe there’s not necessarily a big difference between 10,000 blocks and 2,000 blocks.

So far this hasn’t been calculated and I need to rely on my gut feeling.

A tiered motion model would help us feel better, but as long as it’s not certain that a 2,000-block window is (much) less secure, it would only introduce unnecessary complexity.

Can we try to focus on a mathematical comparison between the security of a 10,000 and a 2,000 block window size?

How does the voting probability depend on window size (and on the number of votes someone has)?

It would be amazing to calculate the “security” of any window size.

I think the start for this could be to calculate the probability of a single vote being in a given window size.

I need to make some assumptions/roundings. I hope they don’t ruin the outcome.

850,000,000 NSR are distributed.

Lots of at least 10,000 NSR can cast a vote.

85,000 votes are available in total (some need to wait until they have enough age).

I don’t know how to calculate the currently minting NSR, but let’s introduce a factor of n.

Let’s call the window size w.

Say someone has a total number of available votes v.

So n*85,000 votes are competing to put votes in w blocks.

I want to leave n out for now and set it to 1, meaning all distributed NSR are minting (which is not the case).

And now comes the point where I need to make roundings: once one of the 85,000 votes is cast, it can’t be cast again within either a 2,000 or a 10,000 block window.

It’s beyond my skill level and the capabilities of discourse to express that in a formula.

I assume each vote has the same probability of being cast, which is not exactly true.

If you have v total votes, you should expect on average to mint a fraction v/85,000 of the blocks.

For both 10,000 and 2,000 block windows you can’t cast the same vote twice (7 days (10,080 blocks?) of output age are required to vote).

The probability of putting at least one vote in a window of size w is the complement of the probability of putting none of the votes there.

For w=2000 and v=100 the accurate calculation is

```
1-(84900/85000
*84899/84999
*84898/84998
*84897/84997
*84896/84996
[...]
*82901/83001)
=
1-(0.9988235294
*0.9988235156
*0.9988235017
*0.9988234879
*0.998823474
[...]
*0.9987951807)
=1-0.0922110982
=0.9077889018
```

So if you have 100 votes available, at least one vote gets cast within 2,000 blocks with a probability of a bit over 90%.

With a window size w=10000 and a total number of available votes v=100, the accurate calculation is

```
1-(84900/85000
*84899/84999
*84898/84998
*84897/84997
*84896/84996
[...]
*74901/75001)
=
1-(0.9988235294
*0.9988235156
*0.9988235017
*0.9988234879
*0.998823474
[...]
*0.9986666667)
=1-0.00000363311666069707
=0.999996366883
```
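The step-by-step products above can be computed exactly with a short script. This is a sketch of the same calculation, under the simplified model stated earlier (85,000 total votes, each non-matching block removed from the pool):

```python
def p_at_least_one_exact(v, w, total=85000):
    """Probability that at least one of v available votes lands in a
    window of w blocks: 1 minus the without-replacement product
    (total-v)/total * (total-v-1)/(total-1) * ... over w factors."""
    p_none = 1.0
    for i in range(w):
        p_none *= (total - v - i) / (total - i)
    return 1.0 - p_none

print(p_at_least_one_exact(100, 2000))   # a bit over 0.90
print(p_at_least_one_exact(100, 10000))  # very close to 1
```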

It’s not easy to calculate this for different window sizes w and different numbers of votes v.

That’s why I thought of an approximation.

The probability for casting 0 votes in w blocks is calculated by (I hope the brackets enhance the readability)

`((85000-v)/85000)^w`

So the probability for casting at least 1 vote (of v available) in a window sized w is calculated by

`1-((85000-v)/85000)^w`

The above is not accurate, because after each vote the number of available votes is reduced by one.

It deals with casting only one vote, and it is only close to the precise value for small numbers of v.

But apart from that, it’s quite a good approximation.

Allow me to compare the approximation with the precise calculation for v=100, w=10000.

The approximation is 1-(84900/85000)^10000

=0.999992279502

The precise value is

=0.999996366883

As you can see, the two values are close to each other.
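A small script makes this comparison between the approximation and the precise without-replacement product easy to repeat for other values (a sketch under the same 85,000-vote assumption):

```python
def p_exact(v, w, total=85000):
    """Without-replacement product, as in the step-by-step calculation."""
    p_none = 1.0
    for i in range(w):
        p_none *= (total - v - i) / (total - i)
    return 1.0 - p_none

def p_approx(v, w, total=85000):
    """The approximation 1 - ((total - v)/total)^w from above."""
    return 1.0 - ((total - v) / total) ** w

for w in (2000, 10000):
    print(w, p_approx(100, w), p_exact(100, w))
```

The approximation always slightly understates the precise value, because it ignores that each cast vote shrinks the competing pool.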

Let’s sum up:

there’s a 99.99 percent chance to put at least one of 100 available votes in a rolling window of 10,000 blocks.

And there’s a 90 percent chance to put at least one of 100 available votes in a rolling window of 2,000 blocks.

This might look like a big difference.

But if you do some calculations with other numbers of available votes, you’ll find out that the number of available votes has a bigger impact than the size of the rolling window.

If you increase the number of votes to v=200 the outcome is as follows:

Probability to cast at least one vote in w=10000 blocks:

1-(84800/85000)^10000=0.999999999941 (the precise value would be 0.999999999987006)

Probability to cast at least one vote in w=2000 blocks:

1-(84800/85000)^2000=0.991008066532 (the precise value would be 0.991521253204478)

As you can see there’s almost no difference between the 10,000 and the 2,000 window size for owners of 200 votes.

This gets much more complicated if you want to calculate the probability of m votes and I can’t think of a proper approximation.

But calculating the probability of 1 successful vote in different window sizes of w gives a hint about the differences.

Owners of small numbers of votes are more affected by a change of the window size than owners of big numbers of votes.

This is especially true if you do calculations for more than 1 vote.

Someone with 2 votes has only 1 left after the first one was successfully cast (50% of votes left) while someone with 100 votes still has 99 (99%) left.

I don’t know how many NSR owners use data feeds. But the data feeds mitigate these effects to some degree.

Sorry for the text wall.

But I wanted to contribute something “measurable”.

This is beautiful, and it reaffirms my intuitive feeling that this motion is bad for people with few votes; now we know precisely why and by how much. Thank you @masterOfDisaster

For the record some more numbers.

With v=10 (equals owning 100,000 NSR) the probability to cast at least one vote for

w=2000 is roughly 21% and for

w=10000 it’s roughly 70%.

with v=20 and

w=2000: 37%

w=10000: 90%

with v=40 and

w=2000: 61%

w=10000: 99.1%

with v=80 and

w=2000: 85%

w=10000: 99.99%

with v=160 and

w=2000: 97.7%

w=10000: 99.99%

with v=320 and

w=2000: 99.94%

w=10000: 99.99%

with v=640 and

w=2000: 99.99%

w=10000: 99.99%
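The figures above can be reproduced with the approximation formula; a minimal sketch (again using the simplifying assumption of 85,000 total votes):

```python
def p_approx(v, w, total=85000):
    """Approximate probability of casting at least one of v votes
    within a window of w blocks: 1 - ((total - v)/total)^w."""
    return 1.0 - ((total - v) / total) ** w

for v in (10, 20, 40, 80, 160, 320, 640):
    print(f"v={v:3d}  w=2000: {p_approx(v, 2000):6.2%}  "
          f"w=10000: {p_approx(v, 10000):6.2%}")
```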

Owners of several million NSR don’t need to care much about a reduced window size.

Owners of less than some million NSR might face variance issues.

Please bear in mind that there are some assumptions and roundings in my calculations.

Plus Nu is closer to 60,000 competing votes than 85,000.

This reduces the impact for people with few votes.

But the effect of already-cast votes, which need to be excluded when calculating the probability of the next successful vote, outweighs that by far.

This is once again a gut feeling, but I’m too lazy to calculate that.

Thanks @masterOfDisaster for these elaborations.

Now, what’s a sufficiently distributed Nu?

- how many shareholders?
- how many shares each?
- what’s a good w then?

http://blockexplorer.nu/topNSRaddresses/1

Sorry, I was in a rush, the y-axis is actually Log(v), which is Log(NSR)-4.

The fit for the linear part (given the log scale) is:

y = -0.0041x + 3.1013

The reasonable thing, in my mind, would be to find the median address and pass motions for it. The x-intercept is 756, so the median is #378, at v~35, or 350,000 NSR.

Next, we choose an allowable % and that gives us our w. Something like 90% might be a good choice. Plugging into @masterOfDisaster’s formula, I get **w~5600**
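Solving the approximation 1-((85000-v)/85000)^w = p for w gives the window size directly. A quick check of that figure, with the median v~35 and p=0.90 as assumptions:

```python
import math

def window_for_probability(v, p, total=85000):
    """Smallest window w with 1 - ((total - v)/total)**w >= p,
    i.e. w = log(1 - p) / log((total - v)/total), rounded up."""
    return math.ceil(math.log(1.0 - p) / math.log((total - v) / total))

print(window_for_probability(35, 0.90))  # roughly 5600 blocks
```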

Motion RIPEMD160 hash: **4a083131a9260114f1b14793449691965d6b4cfb**

`=##=##=##=##=##=## Motion hash starts with this line ##=##=##=##=##=##=`

Currently a motion must receive 5001 votes within 10000 blocks and also have the majority of share days voting in favor of it in the same 10000 blocks.

Passage of this motion will change what is required for all future motions to be regarded as passed in these two ways:

- Share days will no longer be a criteria for evaluating the passage of a motion. One block equals one vote.
- A motion passes with 2501 votes in any 5000 block period. This will permit decisions to be made more quickly, while still permitting a diverse set of shareholders to participate.

Custodial grants will continue to require 5001 out of 10000 votes because the security concerns for grants are much higher because they automatically issue blockchain funds, whereas no action is automated in the case of a passed motion.

The getmotion RPC should have the default block height changed from 10000 to 5000. Blockexplorer.nu needs to be changed as well. However, these changes need not be made in order for the new rules to take effect.

`=##=##=##=##=##=## Motion hash ends with this line ##=##=##=##=##=##=`

##### Verify. Use everything between and including the <motionhash></motionhash> tags.

I think you are not precise. Currently, custodial grants require 5001 out of 10000 votes. Do you propose to change that?