Update 13th June, 2024

Our first wave boosters have taken us way up into the stratosphere, way higher than we ever expected to go. We are now on ~40k nodes! Do we dare to dream how high that number could go when Wave 2 testers come on board?

Indeed, some Wave 2 early birds are already strapped in, having received invite codes from Wave 1. The rest of you will need to be patient just a little while longer, though, as we charge around like lunatics, working tirelessly to make your user journey as comfortable as possible.

Thanks to the folks who have been testing out the v108.3 alpha node. This fixes a problem with Arm builds, and is now ready for primetime. See this post for info on how to reset your nodes if you’re a Node Launchpad user.

We’re working to build metrics into the Node Launchpad so we can see what’s going on with nodes that don’t start. This is particularly a problem on Windows, although it’s occurring elsewhere too. @roland has created a troubleshooter for Node Launchpad users on Windows whose nodes are stuck on ‘added’.

CLI users (safenode-manager) should replace the current safenode binary with the new one via safeup. See this post for details, and let us know if you need more explanation. The new binary also reduces the load on bootstrap peers, which helps networking performance.

There’s been good progress on the connectivity issues that were caused by bootstrap nodes also being used as relay nodes and becoming overloaded. We’re also experimenting with reducing the number of nodes per cloud instance to reduce CPU spiking.

The DAG is not 100% as it should be, and we’ve just deployed a new one. So you should hopefully see an updated score now. In our own tests, more than half of our nodes are in the same boat with zero nanos. This is partly due to DAG slowness, and also because of the rapid expansion of the network diluting the content - which all in all is a nice problem to have, but understandably irritating for testers who want to see themselves on the ladder or earning nanos to fund uploads. We have put some rockets under the uploaders there, to get things moving.

General progress

Our main activities this week have been getting those fixes in place. @mazzi has been mostly on the Discord Bot, directing it to the new DAG and rejigging the architecture to make it easier to test.

@roland has been looking at stalled nodes on Windows, and put in a fix there.

Meanwhile @qi_ma has been tinkering with some DAG and faucet issues, along with @anselme, and put in some PRs to improve the auditing process.

@bzee has been mostly debugging the node with relays and DCUtR, and will soon join in with the DAG work.

And @joshuef and @shu have been clearing the upgrade path to prepare for the latest release.

66 Likes

It’s great to see all the progress!

17 Likes

Second or maybe third?

Edit: hah, I got my head out of the beta servers a few seconds before you @happybeing

14 Likes

What is odd is the:

  • slowness of my uploads (minutes per chunk)
  • cost (I had 1700 nanos kindly donated, and managed only 4 chunks with that)

Slowness could be DAG. Cost though seems very odd given the sparseness of data on 40k nodes. That explains low earnings, but not why the cost is so high.

So for now it seems impossible to earn enough nanos for uploading even a handful of chunks.

I’ve just been given a further 700 and so have a total of 764 nanos for uploads, but not sure it’s wise to try at this point! :man_shrugging:
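For a rough sense of scale (a back-of-envelope sketch, treating the whole 1700 nanos as spent on those 4 chunks — the real per-chunk prices will have varied):

```rust
// Illustrative arithmetic only, using the numbers from the post above.
fn main() {
    let donated: u32 = 1700;    // nanos kindly donated
    let chunks: u32 = 4;        // chunks uploaded before the balance ran low
    let avg = donated / chunks; // rough average price per chunk
    println!("average cost: {avg} nanos per chunk");

    let remaining: u32 = 764;   // current balance for uploads
    println!("that balance covers ~{} more chunk(s)", remaining / avg);
}
```

At hundreds of nanos per chunk, the remaining 764 really does cover barely one more chunk, which is why uploading anything non-trivial feels out of reach right now.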

26 Likes

Thanks so much to the entire Autonomi team for all of your hard work! :man_factory_worker: :man_factory_worker: :man_factory_worker:

And may we have a stable strong network soon! :man_factory_worker: :man_factory_worker: :man_factory_worker:

15 Likes

The new nodes are working great for me, more records and nanos for sure. Good work team!

16 Likes

First time here… Sixth place is good…

18 Likes

Could I or we have some words about wallet integration? Thx

7 Likes

Here’s some

It’s an open source project; as always, you are free at any time to submit a PR to show your efforts to help the project along and boost the value of your tokens. Should you personally lack the skills to add the code, nothing stops you employing others to provide what you feel is missing. Sometimes you need to invest a little time, effort and even money to ensure your investments continue to provide an income. After all, it’s better than actually working, isn’t it?

If it wasn’t so bleeding obvious already - here is the score

There are a limited number of devs on the Maidsafe/Autonomi payroll. Every single one of them right now is working hard addressing issues that are vital to the launch and maintenance of the beta programme.
Feel free to suggest that they be dragged off this work to scratch your own personal itch RIGHT NOW rather than deal with the basics first.
Good luck with that BTW.

Is that enough words, cryptoidiot?

Nobody has forgotten about wallet integration. Nor will they.
Good stuff like wallet integration, fancy GUIs etc etc will happen when the underlying basics are sound and well tested.

Right now you are demanding we wallpaper the bedrooms before the plumbing and electrics are installed, while the cement is barely dry on the brickwork…

6 Likes

1700 nanos for 4 chunks? in the context of autonomi, you may have just become the “pizza” guy! :rofl:

21 Likes

There are plans afoot for this. We need transactions to be sub-second and, network-wide, in the billions of TPS. I know we can do this, and you are right that the DAG is slowing down some areas. It is being worked on, with some interesting angles on offer. Stick with it for now and we will get it sorted.

24 Likes

Nice :laughing:. Although this wasn’t really my money…

… which feels worse :cold_face:

People have sacrificed earnings to help get my app up for demo to everyone and it seems that’s been wasted. Donations are still coming in but I think it would be better for Autonomi to step in now.

I think they are considering it but it may just be too early so we may have to wait a while.

14 Likes

Do you know how much was each individual chunk?

3 Likes

No but I subsequently asked for an estimate for the remainder and it was also ridiculously high.

It feels like something is off, given how sparse the network is.

I started 22 nodes a few hours ago and only three have been asked to quote, and one of those has earned 10 for a single chunk.

What are the odds of four chunks landing on nodes with lots of active chunks?

7 Likes

fantastic as always thanks for the hard work to all the team :slight_smile:

Currently 3 of us in the top 4 have lost our places to, let’s say, a non-team player not wishing to join the nano donation cartel. Which is also fine, each to their own, live your best life :slight_smile:

but could the team please furnish @happybeing with some funds for uploading ?

Also possibly a multiplier for the participants of the ‘nanos for @happybeing’ fund :slight_smile:

15 Likes

Great start to the Beta program.

It must be pretty full on for the team, but it’s great seeing solutions to problems popping up quickly, and seeing the big number of nodes revealing areas that need work.

Keep up the amazing work, and make sure nobody gets burned out with all of the pressure & excitement!

10 Likes

@maidsafe honestly, it really doesn’t make sense that you haven’t supported this or communicated clearly that you will, etc. You could also ask for the files from @happybeing and upload them yourself on his behalf as part of the uploads you’re already handling (thus avoiding passing out tokens in case there are factors influencing your reticence to do that).

4 Likes

Thx 4 the update Maidsafe devs

This is incredible news, the more the Network moves @ the speed of space expansion the better.

Would be fun if a 1 TB upload was hardcoded to cost 1 nano; no more complaining about token shortage or limited uploads. If GETs cost 1 nano, that would stimulate uploads etc. We maybe need to experiment more than the 1 chunk, 1 nano (or worse, 4 chunks for 1700 nanos :scream:).

Use as little energy as possible (probably one of the universe’s fundamental rules)

I fear that when @mav finishes and we can use our own SAFE, we might willy-nilly accept paying more for storage (personally I won’t, because I’ve looked at what the free market is offering).

Hyperinflation is useful for nobody in the real world. The first game should be to upload as much as possible to the Network; the second game, getting paid for the GETs.

Keep coding/hacking/testing super ants

6 Likes

Loving the update, thank you for fixing the Arm build release :+1:

I know it’s still Beta, but I’m curious about how the relay peers are selected?

tl;dr: my Pi3 running 8 nodes has --port set. Before the upgrade, its network usage was roughly in line with my Pi4 node running --home-network.

Now it’s upgraded, the Pi3’s network usage has jumped, impacting CPU% to the point where I’ve had to drop it to 6 nodes.

I like to run with --port as it feels like I’m providing resources, but the poor Pi would benefit from some limits as it’s not as beefy as some of the other X64 machines people seem to be using.

  • Should I be using --port on Arm? Or leave that up to big setups to relay?
  • If we can use --port, then maybe a compile-time max relay peers for Arm to limit the load would really help, and maybe a max rate limiter (there’s a nice implementation in Go; not sure if there is a Rust equivalent)
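On the rate-limiter point: Go’s well-known limiter is golang.org/x/time/rate, and Rust does have equivalents (the `governor` crate, for instance). The underlying idea is just a token bucket; here is a minimal sketch with hypothetical names — not Autonomi code, purely to show the shape of the thing:

```rust
use std::time::Instant;

/// Minimal token bucket: holds up to `capacity` tokens,
/// refilled continuously at `rate` tokens per second.
struct TokenBucket {
    capacity: f64,
    tokens: f64,
    rate: f64,
    last: Instant,
}

impl TokenBucket {
    fn new(capacity: f64, rate: f64) -> Self {
        Self { capacity, tokens: capacity, rate, last: Instant::now() }
    }

    /// Try to consume one token; returns false if the bucket is empty,
    /// i.e. the caller should refuse (or delay) the relay request.
    fn try_acquire(&mut self) -> bool {
        let now = Instant::now();
        let elapsed = now.duration_since(self.last).as_secs_f64();
        self.last = now;
        // Refill based on elapsed time, capped at capacity.
        self.tokens = (self.tokens + elapsed * self.rate).min(self.capacity);
        if self.tokens >= 1.0 {
            self.tokens -= 1.0;
            true
        } else {
            false
        }
    }
}

fn main() {
    // Allow bursts of 3 relay connections, refilling one per second.
    let mut limiter = TokenBucket::new(3.0, 1.0);
    let granted = (0..5).filter(|_| limiter.try_acquire()).count();
    println!("granted {granted} of 5 immediate requests"); // bucket starts full, so 3
}
```

Something like this, with capacity/rate tuned down for Arm boards, is the kind of cap the bullet above is asking for; whether the node should expose it as a compile-time constant or a runtime flag is a separate design question.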

But it’s all good :slight_smile: can’t wait until we can start upload testing! Also, the Pi4 with --home-network has much better relay distribution now; before the upgrade it was all to a few ranges, and now it’s much more distributed :+1:

Well done, really great progress and such a nice community, can’t wait to see where this goes.

HeatMap of Relay peers before upgrade, and after…

11 Likes