Seems to just work here. OK, it uses --home-network, and a pre-existing Docker setup helped me. That said, the README is good and @aatonnomicc published a handy script.
I think the team should consider throwing some effort behind this, if for no other reason than it's a LOT prettier than LaunchPad. And perception matters, so we are told.
46 replies and the OP hasn’t been addressed by anyone in the know.
I somehow doubt I will be educated, and now assume the negative has been confirmed, since no one in the know wants to address it specifically or explain how this was not a failure to test 32GB (average) sized nodes. Instead there are diversionary responses that don't address the issue.
And internal performance tests on DigitalOcean droplets do not count, since they were not run on this beta test network. Also, with optimised high-speed links (the maximum distance being across the pond), they do not show the true effects of 4MB chunks on retail ISP connections across the world (outside the EU with its superior retail connections).
Will those numbers (at least for the winner) be published so people know what traffic to expect depending on fullness? (I do believe traffic increases with fullness, but I'm not sure my current plan for 32GB nodes will stand the traffic of full large nodes, because we haven't seen them yet…)
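In the meantime, here is a minimal sketch (assuming Python with the third-party psutil package) for logging your own machine-wide traffic while a node runs, so you can build a baseline at different fullness levels; the 60-second interval and output format are just illustrative:

```python
import time
import psutil  # third-party package: pip install psutil

# Log machine-wide network I/O deltas once a minute.
# Note: these counters cover the whole machine, not a single node process,
# so run this on a box that is otherwise mostly idle for a usable baseline.
prev = psutil.net_io_counters()
while True:
    time.sleep(60)
    cur = psutil.net_io_counters()
    sent_mb = (cur.bytes_sent - prev.bytes_sent) / 1_000_000
    recv_mb = (cur.bytes_recv - prev.bytes_recv) / 1_000_000
    print(f"last 60s: sent {sent_mb:.1f} MB, received {recv_mb:.1f} MB")
    prev = cur
```

Comparing those logs as the node fills up would at least give a personal answer to the fullness-vs-traffic question, even without official numbers.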
All this test did was confirm that even 2GB is OK. You could have picked any size; this test did not test going to 32GB, just that you could increase the size above 2GB.
Nothing about testing the extreme case as you yourself sold it to us.
Again, this is not testing the 32GB case, just that changing the node size to anything above 1GB works.
Nothing to do with the 32GB extreme node size test case.
I did not say this test failed at everything, nor that it failed to test other needed things, just that it failed at testing the extreme case of 32GB nodes.
The topic title and subject were not answered. It wasn't that it didn't test things, just that it did not test the extreme 32GB size.
32GB beta test. Did it actually test 32GB average node size?
I am not upset about things, I just wanted to be educated and was not. If you had said that yes, it did not test the extreme case, which your reply confirmed, then that would be fine; we all make mistakes or sometimes can't actually do a test. That is fine and seems to be the case.
So all good and I was educated by the absence of education.
EDIT: Jim, we are 100% behind the project and on your side, so saying up front that it didn't get to test the extreme 32GB nodes is fine; we just want to know if something was missed.
More testing is good, further testing of the update tools and confirming the changes work is good, and more real-life testing is good. So it seems the testing worked but missed the 32GB part.
Yes, it does feel a bit like testing a car that's built to shatter speed and acceleration records… but instead of putting it on the track to push its limits, you let it sit there idling for a week, barely moving.
It would really be cool to see such a giga-network in the wild, try joining it with a couple of home nodes, and see whether they end up shunned or become part of the network, and what permanent load they then put on the internet connection…
Maybe MaidSafe wants to select some slightly tech-savvy people from different areas of the world with different setups to do the 'real-world test'… e.g. @neo for Australia, I could test from rural Germany, Dimitar maybe from a site in Bulgaria with less perfect internet than his home, etc. Maybe 10 or 20 volunteers (not doing it for incentives, just to provide additional data).
Filling, at least, is mainly download, one could hope… so I assume such a 16GB download might be possible within an hour for me… somewhat acceptable… but who knows…
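As a rough sanity check of that one-hour guess, here is a back-of-envelope sketch (assuming decimal gigabytes and a fully utilised link, ignoring protocol overhead) of the sustained rate needed:

```python
# Back-of-envelope: sustained bandwidth needed to download 16 GB in one hour.
# Assumes decimal gigabytes and a fully utilised link; real-world overhead
# and chunk-by-chunk fetching would push the requirement higher.
gigabytes = 16
seconds = 60 * 60
megabits = gigabytes * 8 * 1000          # 16 GB ≈ 128,000 Mbit
rate_mbit_s = megabits / seconds
print(f"~{rate_mbit_s:.0f} Mbit/s sustained")  # ≈ 36 Mbit/s
```

So a downstream connection of roughly 40 Mbit/s or better would be in the right ballpark for that one-hour estimate.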
I have been thinking along the same lines. Ultimately, a single tightly integrated Autonomi binary application that can run a number of (virtual) nodes will probably be easier to deploy and manage for the average (Windows/Mac) user than three components and numerous processes. A container is a good step towards that goal.
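Just to illustrate the idea of one application supervising several node processes, here is a minimal sketch in Python; the binary name `node-binary` and the `--port`/`--root-dir` flags are made-up placeholders (only `--home-network` is taken from earlier in the thread), so this is not the actual Autonomi CLI:

```python
import subprocess

# Hypothetical wrapper: one process supervising several node processes.
# The binary name and the --port/--root-dir flags are illustrative
# placeholders, not the real interface of any Autonomi component.
NODE_BINARY = "node-binary"
NODE_COUNT = 4
BASE_PORT = 12000

procs = []
for i in range(NODE_COUNT):
    cmd = [
        NODE_BINARY,
        "--home-network",              # flag mentioned earlier in the thread
        "--port", str(BASE_PORT + i),  # assumed per-node port flag
        "--root-dir", f"./node-{i}",   # assumed per-node data directory flag
    ]
    procs.append(subprocess.Popen(cmd))

# Wait for all nodes; a real manager would also monitor and restart crashed ones.
for p in procs:
    p.wait()
```

A tightly integrated binary would presumably do this in-process rather than by spawning executables, but the management problem (ports, data directories, supervision) is the same.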