Weather report

Hmm, on the second run it also failed.

Any pointers?

1 Like

Right, glad you got it resolved.

Has it downloaded on subsequent runs?

What about other files?

1 Like

well … those are network errors :man_shrugging:

… one of the issues when trying to use mainnet (instead of testnet / alpha)


interestingly, everything seems to work significantly worse locally than from my cloud dev machine …

from here, friends doesn’t load; friendship requests error out; … after restarting dweb it doesn’t even find the first pointer … I guess the ant client is pretty robust now, but when trying to use the Python client or dweb it’s not exactly running smoothly …

did I read somewhere that you increased the default number of parallel queries again?

1 Like

None of the defaults were changed in this release.

Sorry, I haven’t really been into testing this kind of stuff. I would like to help out, but I’m not really sure where to start.

2 Likes

okay - and all good :slight_smile: we’ll get there … it’s weekend now; get some rest @chriso

2 Likes

Yes.

They work, but this file took double the time on the second run:



toivo@toivo-HP-ProBook-450-G5:~$ time ant file download 6e93f32ac27275de1a17d104688bc3cab46b98d94d60ddd950cc18c925766041 .
Logging to directory: "/home/toivo/.local/share/autonomi/client/logs/log_2025-08-01_22-14-18"
Connecting to the Autonomi network…
Connected to the network
Fetching 3 encrypted data chunks from network.
Fetching chunk 1/3 …
Fetching chunk 2/3 …
Fetching chunk 3/3 …
Fetching chunk 1/3 [DONE]
Fetching chunk 3/3 [DONE]
Fetching chunk 2/3 [DONE]
Successfully fetched all 3 encrypted chunks
Successfully decrypted all 3 chunks
Fetching 15 encrypted data chunks from network.
Fetching chunk 1/15 …
Fetching chunk 2/15 …
Fetching chunk 3/15 …
Fetching chunk 4/15 …
Fetching chunk 5/15 …
Fetching chunk 6/15 …
Fetching chunk 6/15 [DONE]
Fetching chunk 7/15 …
Fetching chunk 5/15 [DONE]
Fetching chunk 8/15 …
Fetching chunk 1/15 [DONE]
Fetching chunk 9/15 …
Fetching chunk 4/15 [DONE]
Fetching chunk 10/15 …
Fetching chunk 7/15 [DONE]
Fetching chunk 11/15 …
Fetching chunk 2/15 [DONE]
Fetching chunk 12/15 …
Fetching chunk 8/15 [DONE]
Fetching chunk 13/15 …
Fetching chunk 3/15 [DONE]
Fetching chunk 14/15 …
Fetching chunk 11/15 [DONE]
Fetching chunk 15/15 …
Fetching chunk 10/15 [DONE]
Fetching chunk 15/15 [DONE]
Fetching chunk 9/15 [DONE]
Fetching chunk 12/15 [DONE]
Fetching chunk 13/15 [DONE]
Fetching chunk 14/15 [DONE]
Successfully fetched all 15 encrypted chunks
Successfully decrypted all 15 chunks
Successfully downloaded data at: 6e93f32ac27275de1a17d104688bc3cab46b98d94d60ddd950cc18c925766041

real 1m31.820s
user 0m27.541s
sys 0m16.719s

toivo@toivo-HP-ProBook-450-G5:~$ time ant file download 6e93f32ac27275de1a17d104688bc3cab46b98d94d60ddd950cc18c925766041 .
Logging to directory: "/home/toivo/.local/share/autonomi/client/logs/log_2025-08-01_22-16-16"
Connecting to the Autonomi network…
Connected to the network
Fetching 3 encrypted data chunks from network.
Fetching chunk 1/3 …
Fetching chunk 2/3 …
Fetching chunk 3/3 …
Fetching chunk 1/3 [DONE]
Fetching chunk 3/3 [DONE]
Fetching chunk 2/3 [DONE]
Successfully fetched all 3 encrypted chunks
Successfully decrypted all 3 chunks
Fetching 15 encrypted data chunks from network.
Fetching chunk 1/15 …
Fetching chunk 2/15 …
Fetching chunk 3/15 …
Fetching chunk 4/15 …
Fetching chunk 5/15 …
Fetching chunk 6/15 …
Fetching chunk 1/15 [DONE]
Fetching chunk 7/15 …
Fetching chunk 3/15 [DONE]
Fetching chunk 8/15 …
Fetching chunk 4/15 [DONE]
Fetching chunk 9/15 …
Fetching chunk 5/15 [DONE]
Fetching chunk 10/15 …
Fetching chunk 6/15 [DONE]
Fetching chunk 11/15 …
Fetching chunk 2/15 [DONE]
Fetching chunk 12/15 …
Fetching chunk 7/15 [DONE]
Fetching chunk 13/15 …
Fetching chunk 9/15 [DONE]
Fetching chunk 14/15 …
Fetching chunk 10/15 [DONE]
Fetching chunk 15/15 …
Fetching chunk 11/15 [DONE]
Fetching chunk 8/15 [DONE]
Fetching chunk 15/15 [DONE]
Fetching chunk 14/15 [DONE]
Fetching chunk 12/15 [DONE]
Fetching chunk 13/15 [DONE]
Successfully fetched all 15 encrypted chunks
Successfully decrypted all 15 chunks
Successfully downloaded data at: 6e93f32ac27275de1a17d104688bc3cab46b98d94d60ddd950cc18c925766041

real 3m0.432s
user 0m23.890s
sys 0m14.356s
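To quantify the difference between the two runs above, a small helper like this (a hypothetical stdlib-only sketch, not part of the ant tooling) can parse bash `time` output into seconds and compute the slowdown ratio:

```python
import re

def parse_time(s: str) -> float:
    """Parse a bash `time` value like '1m31.820s' into seconds."""
    m = re.fullmatch(r"(?:(\d+)m)?([\d.]+)s", s.strip())
    if not m:
        raise ValueError(f"unrecognised time format: {s!r}")
    minutes = int(m.group(1) or 0)
    return minutes * 60 + float(m.group(2))

# The two 'real' values from the runs above
run1 = parse_time("1m31.820s")
run2 = parse_time("3m0.432s")
print(f"run 1: {run1:.1f}s, run 2: {run2:.1f}s, ratio: {run2 / run1:.2f}x")
```

Which confirms the second run took roughly twice as long as the first.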

1 Like

It’s cool, I’m still working, and I’m disappointed to hear that you’re having a bad experience with your apps. I thought the closest-peers problem we resolved would carry over to app performance too, but if that isn’t the case, I’d like to help gather some information I can take back to the team.

4 Likes

OK thanks.

I think sometimes there will just be a bit of natural variance in the download speeds for a probabilistic network.

I would say if we are getting files to consistently download, that is a win for us, for now. We can look at performance soon I’m sure.

3 Likes

Yeah, I’m satisfied with the way it is now, not complaining at all. I assume the performance will get more stable with forthcoming changes.

No problem, thanks for your help.

I think somehow you had an old bootstrap cache lying around from a previous version or something. We will need to look out for that, and it’s good to know.

1 Like

Anything else I can do? Very keen to get us started investigating why the file upload/download would be reliable but the apps are not.

I haven’t had a single working run of that speed test script.

2 Likes

after cleaning my bootstrap cache I get…

What should I test now to help get this properly investigated?

1 Like

Sorry, I can’t do much right now… But then again, the Python issue here boils down to

This runtime error is the error we get directly from the maidsafe library…

And

  1. We’re executing a Scratchpad_update there… The error doesn’t make sense at all… Even if there’s a fork with multiple versions, the scratchpad update should just increase the counter and write the new version… Why would it throw that error?! (I complained about exactly this 2 days ago in the IF channel..)

  2. The error message is completely useless: it tells us there’s a fork, but not the content of the forked versions, so we’re not able to resolve it by hand (for a read, returning the forked versions would be helpful… raising an exception without the contents/counters isn’t…)
    For all I know it’s entirely possible that the error message is simply wrong: it might e.g. get returned the latest version plus a previous version with a lower counter and be confused by that :man_shrugging:

So no testing is needed - the error handling of the autonomi-client lib is the main issue at the moment (because we don’t get any diagnostic info about what might be going wrong here)
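The merge behaviour being asked for here could look something like this pure-Python sketch. To be clear: `Version` and `resolve_fork` are hypothetical illustrations, not the real autonomi-client API — the point is that a client holding all forked versions can resolve them mechanically with last-writer-wins, if only the error returned them:

```python
from dataclasses import dataclass

@dataclass
class Version:
    # Hypothetical stand-in for a scratchpad version: a monotonic
    # counter plus the stored payload.
    counter: int
    content: bytes

def resolve_fork(forked: list[Version]) -> Version:
    """Last-writer-wins merge: keep the version with the highest
    counter and bump the counter past it, so the next update
    supersedes every divergent copy."""
    winner = max(forked, key=lambda v: v.counter)
    return Version(counter=winner.counter + 1, content=winner.content)

# Two forked copies: the merge keeps the newer payload at counter 6.
merged = resolve_fork([Version(4, b"old"), Version(5, b"new")])
print(merged.counter, merged.content)
```

Which is why an exception that carries no contents or counters is so frustrating: the resolution itself is trivial once you can see the forked versions.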

1 Like

Actually, that 100% variance was probably due to me forgetting that my free-tier Proton VPN was on. Now I downloaded that file four times in a row. Times: 33s, 32s, 25s, 24s.

I’m proposing that each release be accompanied by a series of simple tests that should be run by every user, including but not limited to:

check time to download and checksum a series of files from 1 kB to 10 MB
check time to upload random files from 1 kB to 10 MB
check scratchpad performance

Also, how much hassle would it be to create a centralised (ooooh!!) page to report the results to?
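A minimal local sketch of that download-and-checksum matrix might look like this. It exercises only the timing and checksum plumbing against local temp files; in a real run against the network, the local read would be replaced by the actual `ant file download` invocation (whose exact arguments I’m not assuming here):

```python
import hashlib, os, tempfile, time

# File sizes from the proposal: 1 kB up to 10 MB.
SIZES = [1_000, 100_000, 10_000_000]

def checksum(path: str) -> str:
    """SHA-256 of a file, read in 64 KiB blocks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(65536), b""):
            h.update(block)
    return h.hexdigest()

results = []
for size in SIZES:
    # Stand-in for a downloaded file: random bytes on local disk.
    with tempfile.NamedTemporaryFile(delete=False) as f:
        f.write(os.urandom(size))
        path = f.name
    t0 = time.perf_counter()
    digest = checksum(path)  # a network run would download first, then checksum
    elapsed = time.perf_counter() - t0
    results.append((size, elapsed, digest))
    os.remove(path)

for size, elapsed, digest in results:
    print(f"{size:>10} bytes  {elapsed:.3f}s  sha256={digest[:16]}…")
```

Publishing the per-size times plus digests would give the centralised results page comparable numbers across users and releases.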

3 Likes

Well, if we’re honest with ourselves here, it’s immutable data uploads and downloads that are reliable… Those have worked well for over a year now…

The issue is Scratchpads and pointers …

(Indelible avoids using those too, btw, and) they have been the most brittle part of mainnet since it has existed…

2 Likes

There is nothing intentional about that.

OK, let’s see if we can maybe shift focus a little bit now over to these.

4 Likes

Sure - but it was the safest possible choice (which kind of makes sense for a short hackathon - but it still meant you only worked with the stuff that has been thoroughly tested for close to 2 years and has been working all along…)

It would have been more exciting to see you juggle pointers, Scratchpads and graphEntries :man_shrugging:

2 Likes

Let’s avoid uploading randomized junk files to mainnet.

2 Likes

Sorry, but can we avoid doing this please? You are making it sound like we intentionally avoided it, but there was nothing like that going on.

Let’s just move forward.

1 Like