Update 20th March, 2025

While we all wait for improvements to uploads to be rolled out (the latest fix is looking good but was not quite ready for prime time at time of writing) we have a new dev tool to announce. Ant Analyzer is a CLI tool created by @anselme to read and decode addresses on the Autonomi Network. It processes addresses to identify the type of data located there (e.g. Chunk, DataMap, Public Archive). It also decodes the content stored at those addresses, such as files, metadata or nested structures, and can inspect uploaded files or folders to reveal their structure, metadata and chunk distribution. Builders will be able to use this tool to check that files are correctly chunked, encrypted and stored, both for debugging and to better understand how data is laid out on the network.

For those that missed the explainer on emissions, we currently have four emission services running. Every 120 seconds, each service initiates a reward round, selecting 100 random nodes from the live network. Each selected node earns 0.05662462854 ANT tokens. Over the next 12 weeks, we will be correcting the emissions error spotted by the community so it matches the plans in the white paper.
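
As a back-of-the-envelope check on the scale of those numbers (using only the figures quoted above, i.e. the current setup before the correction), each round pays out roughly 5.66 ANT, and across the four services that comes to around 16,300 ANT per day:

```rust
fn main() {
    let services = 4.0_f64;           // emission services currently running
    let round_interval_secs = 120.0;  // each service starts a reward round every 120 s
    let nodes_per_round = 100.0;      // random nodes selected per round
    let ant_per_node = 0.05662462854; // ANT paid to each selected node

    let ant_per_round = nodes_per_round * ant_per_node;          // ≈ 5.66 ANT
    let rounds_per_day = 86_400.0 / round_interval_secs;         // 720 rounds per service per day
    let ant_per_day = services * rounds_per_day * ant_per_round; // ≈ 16,308 ANT per day

    println!("Per round: {ant_per_round:.2} ANT");
    println!("Per day (all services): {ant_per_day:.0} ANT");
}
```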

While the tasty carrot of encouragement :carrot: remains our social tool of choice, we’re also unveiling a wee punitive stick to persuade upgrade laggards to do the right thing. From Monday 24th March, to qualify for emissions payments you will need to be running the latest node version. More in this post, where you’ll also find info about the Impossible Futures programme to support app builders, which kicks off on April 22nd.

Please read @rusty.spork’s post about a vulnerability in crypto wallets, including MetaMask, in the Chrome browser. It includes a simple test to see if you may be compromised. Better safe than sorry.

Finally two big thumbs up (maybe three if you’re using Stable Diffusion) to @happybeing for his release of dweb which allows viewing and publishing of websites created using standard web tooling (e.g. Publii, Svelte, static site generators or plain HTML/CSS) on Autonomi, and to @bochaco for pushing at least two new releases of the Docker-based Formicacio node manager in the last week. :+1: :+1: :+1:

General progress

Anselme is working on a low-level network analyser to track Kad queries. The aim is to provide information about overall network health, the parameters we need to set to maximise performance, resilience of our API tools, and how our network works in practice.

In his spare time he also created Ant Analyzer, a versatile tool to read and decode addresses on the Autonomi Network. @vphongph has been testing this tool and is writing a tutorial on how to use it.

@bzee continued implementing Node.js bindings in the API. Most are now complete.

@dirvine has been digging deep into the libp2p implementation of Kademlia in search of efficiencies in the way nodes manage their routing tables.

@chriso continues to push fixes and improvements in our test networks and client as we seek to improve upload performance.

Ermine raised a PR to refactor the verify_data_location and verify_routing_table test cases to use nodespawner, on the road to finally removing the RPC code.

Lajos has been working on security considerations for the Foundation emissions as well as smart contract architecture for the Impossible Futures programme.

@mick.vandijke has been amending the emissions process so that it agrees with the original scheme set out in the white paper. Mick also found and fixed an error in testing whereby the emission service failed because the client did not populate its routing table in time.

@qi_ma raised two PRs (2846 & 2848) to address node-side issues that could cause payments to be rejected, leading to upload failures. Qi and @chriso also investigated the effect of large churn events on download and upload performance, finding that recent fixes do seem to improve the situation. Plus Qi advised community members on what to look for in the logs to track whether chunks are being returned properly.

@roland raised a PR to return exit codes on failed uploads and downloads for debugging.

And @shu worked on further revisions to the documentation for UPnP simulation. He also worked on a POC for Port Restricted Cone NAT simulation in Digital Ocean, where we run our tests. This means the team will eventually have UPnP, Randomised Symmetric NAT, Port Restricted Cone NAT, and Full Cone NAT simulation on Digital Ocean for internal and production testnets.

54 Likes

first haha

17 Likes

second now to read the peak of the week :slight_smile:

15 Likes

Thanks so much to the entire Autonomi team for all of your hard work! :man_factory_worker: :man_factory_worker: :man_factory_worker:

19 Likes

When uploads are ready, will partners also be ready to upload? And how about a few words on proving the network is permanent/stable?

6 Likes

Good news about the stick

12 Likes

Thanks as ever to everyone involved.
Waiting impatiently for the next new versions from @chriso and @QiMa and thrilled to discover the orange thing has gained length, rigidity and weight :slight_smile:
Let’s hope it’s effective and we will soon be able to use this network properly.
I expect we will see a contraction of the network once the “carrot” is wielded effectively, but that may not be entirely a bad thing, and @dirvine will get to see more mass movement on a large network to satisfy his curiosity and hopefully reveal previously hidden Kademlia behaviours.
Ant Analyser sounds very interesting and I look forward to reading the tutorial and trying it out.
Thanks also to @happybeing and @bochaco for their tireless work on what we will sooner or later acknowledge as crucial components of the network. It would be unfair to miss out @Josh and @traktion’s work as well. We will soon see their worth as up/downloads improve.

All in all an excellent update, well done everyone.

Avanti!!

22 Likes

The latter of those PRs has a description (my bolding):

There might be payee got blocked by us or churned out from our perspective.
We shall still consider the payment is valid whenever payees are close enough.
Hence using distance_range to verify payment payee as well

On the other hand, in the code there is this comment:

// In case we don’t have enough knowledge of the network, we shall trust the payment.

I’m curious, why is there even a need to know the payee? Wouldn’t it be enough to just have the payment done, no matter where the payment originated from?
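
For readers puzzling over what that check amounts to, the quoted description suggests a node accepts a payment as long as the payee sits within the node’s own estimate of the close group around the record’s address. A hand-wavy sketch of that idea, with hypothetical types (this is not the actual antnode code):

```rust
// Illustrative only: `XorName`, the field names and the comparison are
// hypothetical stand-ins, not the real antnode implementation.
struct XorName([u8; 32]);

/// XOR distance between two 256-bit names, compared lexicographically.
fn distance(a: &XorName, b: &XorName) -> [u8; 32] {
    let mut d = [0u8; 32];
    for i in 0..32 {
        d[i] = a.0[i] ^ b.0[i];
    }
    d
}

/// Accept the payment if the payee falls within our estimated "close enough"
/// range for this record, or if we know too little about the network to judge.
fn payee_is_acceptable(
    record_addr: &XorName,
    payee: &XorName,
    distance_range: Option<[u8; 32]>, // our current estimate of the close-group boundary
) -> bool {
    match distance_range {
        // "In case we don't have enough knowledge of the network, we shall
        // trust the payment."
        None => true,
        Some(range) => distance(record_addr, payee) <= range,
    }
}
```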

6 Likes

Some notes about the recent changes, some not quite public, in both the Autonomi APIs and dweb, which should make uploads and downloads start to work again.

Upcoming fixes have improved downloads

Both @Traktion and I have tried downloading with the improved Autonomi client API, which is in the latest release candidate but not yet the ‘stable’ release (so not yet in ant unless you build it yourself).

I used dweb to download and inspect the content of a directory both with and without these tweaks.

I was downloading a small directory (which @Southside created for his website) and found that instead of 3 of 5 downloads failing, 5 out of 5 succeeded, although they took a loooong time (1 to 3 minutes). @Traktion reported something similar.

This improvement in reliability is very good to see. I still can’t do anything over a mobile broadband hotspot, but if you have a good connection, or are on a VPS, things are improving.

Striving for uploads to publish websites

I haven’t had a chance to try uploads, but have also added code to make these more reliable in dweb. One change is that when an Autonomi API call fails, dweb will immediately try again and repeat this until it succeeds, or the universe ends. You can, though, set a limit on the number of tries if you don’t want to wait that long.
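
For the curious, the retry behaviour boils down to something like the sketch below. It is only a sketch (assuming a tokio runtime): upload_once stands in for whichever Autonomi API call dweb is wrapping, and the real dweb code will differ.

```rust
use std::time::Duration;

/// Keep calling `upload_once` until it succeeds, or until `max_tries`
/// attempts have been made (`None` means retry forever).
async fn upload_with_retry<F, Fut, T, E>(
    mut upload_once: F,
    max_tries: Option<u32>,
) -> Result<T, E>
where
    F: FnMut() -> Fut,
    Fut: std::future::Future<Output = Result<T, E>>,
{
    let mut attempt = 0u32;
    loop {
        attempt += 1;
        match upload_once().await {
            Ok(value) => return Ok(value),
            Err(e) => {
                if let Some(limit) = max_tries {
                    if attempt >= limit {
                        return Err(e);
                    }
                }
                // Brief pause before trying again.
                tokio::time::sleep(Duration::from_secs(2)).await;
            }
        }
    }
}
```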

There is also a problem when trying to upload a directory: when this fails, the Autonomi API (for putting a directory) starts again from scratch every time. Because you get charged for repeat uploads, even a small error rate makes a whole-directory upload very unlikely to succeed, and your wallet is drained far more than necessary on every retry. Individual charges are small but add up when dweb keeps repeating this over and over, so…

Upload file-by-file

I’ve also added the option (currently the default in dweb publish-new and dweb publish-update commands) for dweb to upload each file individually.

This, combined with the retry feature means that even if only a small percentage of PUT operations succeed, it will be possible to publish a directory or website (using dweb) if you wait long enough.
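
Roughly, the per-file approach looks like the sketch below (again hypothetical: put_file stands in for whatever per-file upload call dweb makes, and the types are stand-ins):

```rust
use std::path::PathBuf;

// Hypothetical stand-ins for this sketch; the real dweb/Autonomi types differ.
#[derive(Debug)]
struct UploadError;

async fn put_file(_path: PathBuf) -> Result<(), UploadError> {
    // In the real code this would be a single Autonomi PUT for one file.
    Ok(())
}

/// Upload each file individually, retrying only the file that failed, so one
/// bad PUT never forces (and charges for) re-uploading the whole directory.
async fn publish_directory(files: Vec<PathBuf>) -> Result<(), UploadError> {
    for path in files {
        let mut attempts = 0u32;
        loop {
            attempts += 1;
            match put_file(path.clone()).await {
                Ok(()) => break,
                Err(e) if attempts >= 10 => return Err(e),
                Err(_) => continue, // retry just this one file
            }
        }
    }
    Ok(())
}
```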

I’ll be trying this once the release candidate makes it into a release.

Inspecting a History and its Pointer

BTW dweb also has inspect commands, one of which shows you whether a pointer is pointing to the head of a graph for a History or is stuck pointing to an earlier entry.

This is another issue we have when using the History to implement versioned websites. But, again, dweb now has a feature to work around this: it can ignore the pointer if you want and scan the graph from the Pointer target to the end (the real head entry).
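
A rough picture of that scan, with entirely made-up types (this is not the dweb or Autonomi API, just the shape of the workaround):

```rust
// Illustrative only: addresses and entries here are hypothetical stand-ins.
#[derive(Clone, Copy)]
struct EntryAddr(u64);

struct GraphEntry {
    addr: EntryAddr,
    next: Option<EntryAddr>, // descendant entry, if one exists
}

/// Starting from the entry the Pointer targets, keep following descendants
/// until an entry with no descendant is found: that is the real head, even
/// if the Pointer itself is stuck on an earlier entry.
fn find_real_head(pointer_target: EntryAddr, fetch: impl Fn(EntryAddr) -> GraphEntry) -> EntryAddr {
    let mut current = fetch(pointer_target);
    while let Some(next_addr) = current.next {
        current = fetch(next_addr);
    }
    current.addr
}
```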

Uploads, downloads and web viewing may be close

All-in-all, it should be possible to publish directories and websites with dweb so long as a percentage of API calls are succeeding. This approach will mean upload and download will be very slow for now, but that’s better than not working well enough to publish and view websites at all.

This may be improved even more when nodes catch up with the last two releases.

Fingers crossed for next week that we’ll all be able to try out some of these features.

EDIT: all the above features of dweb are now available in dweb 0.3.1 so:

cargo install dweb-cli

and by all means beat me to seeing if it will upload, but watch your wallet if you leave this running! It will keep going and going until all the Autonomi APIs have returned success!

27 Likes

Great progress in general.

I’m particularly pleased by this, however, as ancillary tooling for exploration, verification, and debugging marks a new phase in the evolution of a tech.

10 Likes

Great job, applause for the Team :clap: :clap: :clap:
Although it may seem that the pace is no longer so fast, the size and breadth of the work being carried out is fascinating, bravo! :backhand_index_pointing_left: :ok_hand:

Is this talk of a preliminary tool for a future NRS/DNS system?

8 Likes

Good decision! That’s a big carrot! :slight_smile:

As @happybeing mentioned, this is much improved! I think the contrast was made stark by the old/current versions seemingly getting worse. Perhaps that is due to more nodes upgrading, which then works better with the new tooling? Either way, it means I can download things again.

Once the above is integrated into the API, I will cut some binaries on AntTP with streaming downloads. I am just testing an upgrade which fetches lots of chunks at the same time, rather than one by one. I want it somewhere in the middle, really, but will play with that next.
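
For what it’s worth, “somewhere in the middle” usually ends up meaning bounded concurrency: several chunk fetches in flight at once, but with a cap. A minimal sketch of that pattern (hypothetical fetch_chunk, not the AntTP code; assumes the futures crate):

```rust
use futures::stream::{self, StreamExt};

// Hypothetical stand-in: in AntTP this would be a real network fetch of a chunk.
async fn fetch_chunk(addr: u64) -> Vec<u8> {
    vec![addr as u8]
}

/// Fetch chunks with at most `max_in_flight` requests outstanding at a time,
/// preserving input order so the result can be streamed back in sequence.
async fn fetch_all(addrs: Vec<u64>, max_in_flight: usize) -> Vec<Vec<u8>> {
    stream::iter(addrs)
        .map(fetch_chunk)
        .buffered(max_in_flight)
        .collect()
        .await
}
```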

16 Likes

Will this be surfaced in the API too? Currently, I try to download files as archives and, if that blows up, assume it was a file. It would be nice if there was a better way to test this! :sweat_smile:

6 Likes

Just steal his code (and mine) :laughing:

7 Likes

This sounds great! Thanks for the hard work.

7 Likes

Thanks for all the hard work, team! We’re really making amazing steps already and I get the feeling we’re only seeing a small part of the improvements that have been made so far. Starting Monday, hopefully, we’ll see a much larger portion of the network on the latest Antnode version and see the real progress you’ve been making. Keep up the great work.

15 Likes

Thx 4 the update Maidsafe devs and all your hard work

Nice update and take your time with uploads, because you always give us the best outcome.

Super impressive @happybeing, @bochaco code speaks louder than…
@Josh, @traktion, @southside :clap: :clap: :clap:

Hey I didn’t know that super ants had spare :seven_thirty: :sweat_smile:, can’t wait to play with Ant Analyzer

@rusty.spork :+1:
Better SAFE than sorry.

Keep hacking super ants

13 Likes

The above mentioned features of dweb are now available in dweb 0.3.1 so:

cargo install dweb-cli

and by all means beat me to seeing if it will upload, but watch your wallet if you leave this running! It will keep going and going until all the Autonomi APIs have returned success!

14 Likes

Antalyze

And you have a really good team if people are keen enough to think of things like this and make them in their spare time.

11 Likes