IntolerantNodeNet [18/09/23 Testnet] [ Offline ]

There’s a Safe App opportunity for someone here, both as a standalone app and as a library for use by any app which stores data:

Uploads don’t happen immediately but enter a staging process where the app monitors the storage price and waits until a sensible opportunity to store each chunk arises. Lots of scope for fancy algos, a sexy UI and user twiddling of knobs and settings!
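A minimal sketch of the idea in shell, purely illustrative: the current_store_cost function, staging directory and price ceiling below are all hypothetical, not anything the current CLI provides:

#!/usr/bin/env bash
# Hypothetical staging uploader: park files in a queue and only push
# them to the network when the per-chunk store cost looks cheap enough.
MAX_NANOS_PER_CHUNK=2000         # user-twiddled price ceiling (made up)
STAGING_DIR="$HOME/safe-staging"

current_store_cost() {
    # Placeholder: return the network's current per-chunk quote in
    # nanos. No real price-query command is shown in this thread.
    echo 1500
}

while true; do
    if [ "$(current_store_cost)" -le "$MAX_NANOS_PER_CHUNK" ]; then
        for f in "$STAGING_DIR"/*; do
            [ -e "$f" ] || continue
            safe files upload "$f" && rm -- "$f"
        done
    fi
    sleep 600    # re-check the price every 10 minutes
done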

May be more popular in thrifty countries. :thinking:

6 Likes

I am not bothering anyone, let me service the outer reaches in peace :grin:

4 Likes

I just killed my node and did the

rm -rf ~/.local/share/safe

… in order to get log files from CLI operations.

So, is this possibly the right command to get CLI logs into a file? If not, can someone tell me how it’s done? If yes, where are the log files located?

export SN_LOG=all safe files

I just used --log-output-dest upload_log

Sigh… how? Not like either of these two ways I tried, I see:

topi@topi-HP-ProBook-450-G5:~$ --log-output-dest upload_log
--log-output-dest: command not found
topi@topi-HP-ProBook-450-G5:~$ time safe files upload -c 50 --batch-size 20 Waterfall_slo_mo.mp4 --log-output-dest upload_log
error: unexpected argument '--log-output-dest' found

  tip: to pass '--log-output-dest' as a value, use '-- --log-output-dest'

1 Like

Place it right after safe :smile:
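For example:

time safe --log-output-dest upload_log files upload -c 50 --batch-size 20 Waterfall_slo_mo.mp4

The logs should then end up under ./upload_log, though I’d treat the exact layout inside as version-dependent.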

3 Likes

Thanks, finally! :heartpulse:

4 Likes

OK, so I have found an error that I can reproduce:

  1. I successfully uploaded the same 59.7 MB file three times in a row with this command:
     time safe --log-output-dest upload_log files upload -c 50 --batch-size 20 Waterfall_slo_mo.mp4
  2. Then I aborted the fourth upload after one batch of 20 chunks.
  3. Then the upload failed three times in a row with:
     0: Transfer Error Failed to send tokens due to Network Error Could not retrieve the record after storing it: 570596b8d4e95ea8026cbc3045cf17b9f3e1a0c43bcca2e57ad84d63e0a2d01f..
     1: Failed to send tokens due to Network Error Could not retrieve the record after storing it: 570596b8d4e95ea8026cbc3045cf17b9f3e1a0c43bcca2e57ad84d63e0a2d01f.
     Logs attached.
     Toivos_upload_failure.zip (2.8 MB)

ping @joshuef

EDIT: On top of the above, all uploads are now failing in the same way. I tried three different files of about 50 kB.

9 Likes

Do we have any Aberdonian members?

1 Like

Can you try with -c 100 --batch-size 5?

1 Like

I’m now getting this when I try uploading:-

time safe files upload testfile5
Built with git version: 26c3d70 / main / 26c3d70
Instantiating a SAFE client...
🔗 Connected to the Network
Total number of chunks to be stored: 1
Transfers applied locally
Error: 
   0: Transfer Error Failed to send tokens due to Network Error Could not retrieve the record after storing it: ac9d2bafa0d601f7929ba664434979869f64a7c82eef57f2d6a7d32945e87ad0..
   1: Failed to send tokens due to Network Error Could not retrieve the record after storing it: ac9d2bafa0d601f7929ba664434979869f64a7c82eef57f2d6a7d32945e87ad0.

Location:
   sn_cli/src/subcommands/files.rs:179

Backtrace omitted. Run with RUST_BACKTRACE=1 environment variable to display it.
Run with RUST_BACKTRACE=full to include source snippets.

real	1m11.469s
user	0m11.507s
sys	0m0.418s

I’ve tried switching to the Safe nodes quoted in the startup top post. I can still use my own node for ‘safe wallet balance’ though, so I think it must be working as a target for clients.
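For reference, this is how I’m switching targets, assuming this testnet still takes peers via the SAFE_PEERS environment variable like previous ones (the multiaddr is a placeholder; substitute one from the OP):

export SAFE_PEERS="/ip4/<node-ip>/tcp/<port>/p2p/<peer-id>"   # placeholder multiaddr from the OP
safe wallet balance                                           # quick check the client can reach it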

2 Likes

This is new and somewhat terse…


safe@IntolerantNetSouthside01:~/.local/share/safe$ safe files upload -c30 --batch-size 40 /usr/share/doc/gawk/
Built with git version: 26c3d70 / main / 26c3d70
Instantiating a SAFE client...
🔗 Connected to the Network
Total number of chunks to be stored: 113
Killed

Not killed by me, may I add.

3 Likes

OK - why not?

1 Like

That looks like a kernel kill, most likely the OOM (out-of-memory) killer.
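If it was the OOM killer, the kernel will have logged it, so something like either of these should show the evidence:

sudo dmesg -T | grep -i 'oom\|out of memory'
journalctl -k --since '1 hour ago' | grep -i oom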

4 Likes

Mem use does not look too terrible right now


safe@IntolerantNetSouthside01:~/.local/share/safe/node$ free
               total        used        free      shared  buff/cache   available
Mem:         1962008     1811040       85936         768       65032       38500
Swap:              0           0           0
2 Likes

Can you see the mem history?

2 Likes

No. All I have are some lo-res graphs of CPU, disk and network utilisation.
@Josh has a cunning script that might do it, but it only takes a snapshot every 10 mins by default.

Right now I am waiting for this upload to finish with -c 100 --batch-size 5.
Not wanting to say anything in case I jinx it :slight_smile:

OK, that finished successfully:

safe@IntolerantNetSouthside01:~/.local/share/safe$ safe files upload -c100 --batch-size 5 /usr/share/doc/gawk/
Built with git version: 26c3d70 / main / 26c3d70
Instantiating a SAFE client...
🔗 Connected to the Network
Total number of chunks to be stored: 113
Transfers applied locally
After 15.217836169s, All transfers made for total payment of Token(8744) nano tokens for 5 chunks. 
Successfully made payment of 0.000008744 for 5 chunks.
Successfully stored wallet with cached payment proofs, and new balance 99.999479419.
After 29.030794695s, uploaded 5 chunks, current progress is 5/113.
…
After 17.556563423s, All transfers made for total payment of Token(1871) nano tokens for 1 chunks.
Uploaded chunk #f8afb0.. in 22 seconds
After 706.504649052s, verified 113 chunks
16 failed chunks were found, repaid & re-uploaded.
======= Verification: 16 chunks to be checked and repayed if required =============
======= Verification Completed! All chunks have been paid and stored! =============
Uploaded all chunks in 37 minutes 53 seconds
Writing 4538 bytes to "/home/safe/.local/share/safe/client/uploaded_files/file_names_2023-09-19_21-40-41"

So hat-tip to @TylerAbeoJordan for the -c100 --batch-size 5 suggestion. Verrrry slow though: nearly 38 minutes for 396 kB of gawk docs.

3 Likes

I ran the original command again and got a more familiar error:

safe@IntolerantNetSouthside01:~/.local/share/safe$ safe files upload -c30 --batch-size 40 /usr/share/doc/gawk/
Built with git version: 26c3d70 / main / 26c3d70
Instantiating a SAFE client...
🔗 Connected to the Network
Total number of chunks to be stored: 113
Error: 
   0: Transfer Error Failed to send tokens due to Network Error Not enough store cost quotes returned from the network to ensure a valid fee is paid..
   1: Failed to send tokens due to Network Error Not enough store cost quotes returned from the network to ensure a valid fee is paid.

Location:
   sn_cli/src/subcommands/files.rs:179

Backtrace omitted. Run with RUST_BACKTRACE=1 environment variable to display it.
Run with RUST_BACKTRACE=full to include source snippets.

I had a quick-and-dirty memory logger script running, which shows memory is tight (you try running 30+ nodes on a 2 GB instance) but not critical. I will steal inspiration from @Josh to graph this and play with various values for concurrency and batch size if time permits.

safe@IntolerantNetSouthside01:~/.local/share/safe/tools/mem_tracker$ ./track_mem_usage.sh 
Timestamp: 2023-09-19 21:55:56  Total: 1916MB  Used: 1753MB  Free: 95MB  Cached: 56MB
Timestamp: 2023-09-19 21:56:01  Total: 1916MB  Used: 1757MB  Free: 87MB  Cached: 52MB
Timestamp: 2023-09-19 21:56:06  Total: 1916MB  Used: 1749MB  Free: 101MB  Cached: 58MB
Timestamp: 2023-09-19 21:56:11  Total: 1916MB  Used: 1753MB  Free: 95MB  Cached: 55MB
Timestamp: 2023-09-19 21:56:16  Total: 1916MB  Used: 1751MB  Free: 93MB  Cached: 52MB
Timestamp: 2023-09-19 21:56:21  Total: 1916MB  Used: 1752MB  Free: 91MB  Cached: 50MB
Timestamp: 2023-09-19 21:56:26  Total: 1916MB  Used: 1767MB  Free: 88MB  Cached: 40MB
Timestamp: 2023-09-19 21:56:32  Total: 1916MB  Used: 1767MB  Free: 88MB  Cached: 40MB
Timestamp: 2023-09-19 21:56:37  Total: 1916MB  Used: 1769MB  Free: 84MB  Cached: 37MB
Timestamp: 2023-09-19 21:56:43  Total: 1916MB  Used: 1772MB  Free: 82MB  Cached: 34MB
Timestamp: 2023-09-19 21:56:48  Total: 1916MB  Used: 1774MB  Free: 80MB  Cached: 33MB
Timestamp: 2023-09-19 21:56:53  Total: 1916MB  Used: 1777MB  Free: 76MB  Cached: 29MB
Timestamp: 2023-09-19 21:56:58  Total: 1916MB  Used: 1777MB  Free: 77MB  Cached: 30MB
Timestamp: 2023-09-19 21:57:04  Total: 1916MB  Used: 1778MB  Free: 76MB  Cached: 28MB
Timestamp: 2023-09-19 21:57:09  Total: 1916MB  Used: 1777MB  Free: 76MB  Cached: 29MB
Timestamp: 2023-09-19 21:57:14  Total: 1916MB  Used: 1778MB  Free: 77MB  Cached: 29MB
Timestamp: 2023-09-19 21:57:19  Total: 1916MB  Used: 1778MB  Free: 73MB  Cached: 27MB
Timestamp: 2023-09-19 21:57:25  Total: 1916MB  Used: 1779MB  Free: 73MB  Cached: 27MB
Timestamp: 2023-09-19 21:57:30  Total: 1916MB  Used: 1779MB  Free: 73MB  Cached: 26MB
Timestamp: 2023-09-19 21:57:36  Total: 1916MB  Used: 1776MB  Free: 77MB  Cached: 30MB
Timestamp: 2023-09-19 21:57:41  Total: 1916MB  Used: 1753MB  Free: 93MB  Cached: 50MB
Timestamp: 2023-09-19 21:57:46  Total: 1916MB  Used: 1751MB  Free: 87MB  Cached: 50MB
Timestamp: 2023-09-19 21:57:51  Total: 1916MB  Used: 1750MB  Free: 94MB  Cached: 54MB
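The logger itself is nothing fancy; a minimal sketch of what it might look like, reconstructed from the output format above (the free column mapping is an assumption):

#!/usr/bin/env bash
# Quick-and-dirty memory logger: one summary line every 5 seconds.
# Column mapping from `free -m` (total/used/free/buff-cache) is assumed.
while true; do
    read -r total used free cached \
        < <(free -m | awk '/^Mem:/ {print $2, $3, $4, $6}')
    echo "Timestamp: $(date '+%Y-%m-%d %H:%M:%S')  Total: ${total}MB  Used: ${used}MB  Free: ${free}MB  Cached: ${cached}MB"
    sleep 5
done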
6 Likes

I’m quite surprised by the variance in the success rate of uploads: my attempts were all successful. I wonder if the difference lies more in bandwidth/latency than in CPU/memory constraints.
Below is a link to a (partial) grab of the terminal output of a ~600 MB upload. I wanted to do an analysis of the variance in the upload cost of chunks (the Token(xxx) part), but I’m flat out on a bid. If someone would like to take that on, I think a graph showing that variance over time would be useful. I can’t see how the price can vary so much between batches (5-chunk batches, with only a few minutes between them, and I can see some major variances).

https://we.tl/t-kC6skATtOH
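If anyone wants a head start on that, a rough sketch (assuming the grab is saved as upload.log; untested against the actual file):

grep -oE 'Token\([0-9]+\)' upload.log | tr -d 'Token()' | awk '{print NR, $1}' > costs.dat
# costs.dat holds "sequence-number cost-in-nanos" pairs, ready for gnuplot or a spreadsheet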

9 Likes

That sounds to me like you’ve aborted the command and corrupted the local wallet, so I’d assume it’s attempting to double-spend for all further uploads.

If you erase your wallet, get fresh tokens from the faucet and try again, does that go?
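Something like this, assuming the client wallet lives under ~/.local/share/safe/client/wallet and that this testnet’s faucet works via the wallet subcommand (check the OP for the exact faucet invocation):

rm -rf ~/.local/share/safe/client/wallet   # wipe only the possibly corrupted local wallet
safe wallet get-faucet <faucet-url>        # assumed faucet subcommand; use the OP's exact command
safe wallet balance                        # confirm the fresh balance before retrying the upload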

5 Likes