NoEncryptionNet [07/02/24 Testnet] [Offline]

Congratulations! This testnet is another level. More than 12 GB uploaded in just over three hours, and without errors.

10 Likes

I was able to download this full file successfully, which is fantastic. The download took a lot longer than the upload, which is not what I’d have expected.

Edit:

This may explain it. I am bandwidth limited due to an old 100 Mbps network cable between my PC and router, so that may be why my downloads are slow.
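As a back-of-envelope check (assuming the cable really caps throughput at 100 Mbps; the sizes are taken from the posts above):

```python
# Rough floor on transfer time over a 100 Mbps link.
# Figures are illustrative, taken from the posts above.
link_mbps = 100                      # old 100 Mbps cable, megabits/s
bytes_per_sec = link_mbps * 1e6 / 8  # 12.5 MB/s

file_gb = 12                         # ~12 GB testnet upload
seconds = file_gb * 1e9 / bytes_per_sec

print(f"{bytes_per_sec / 1e6:.1f} MB/s -> at least {seconds / 60:.0f} minutes")
```

So the cable alone puts a floor of roughly 16 minutes on a 12 GB transfer; anything beyond that is down to protocol round-trips and peer speeds rather than the link itself.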

9 Likes

Here it is explained:

7 Likes

Hopefully not too off topic…
how do I install vdash and start it up? Ubuntu.

4 Likes

If you download and run that script (you might need to chmod +x it to make it executable first), it will offer you the option to install vdash. After that, execute it again, select "run vdash", and everything should be done for you automatically.

(not my invention but @aatonnomicc's)

4 Likes

It needs testing on the vdash install. Did it work for you, @riddim?

2 Likes

If that worked for @riddim I’ll give it a go.

1 Like

testnet@testnet-MacBookPro:~$ safe files upload '/media/testnet/TEST FILES/Uploadfiles'
Logging to directory: "/home/testnet/.local/share/safe/client/logs/log_2024-02-07_11-20-40"
Built with git version: db930fd / main / db930fd
Instantiating a SAFE client...
Trying to fetch the bootstrap peers from https://sn-testnet.s3.eu-west-2.amazonaws.com/network-contacts
Connecting to the network with 97 peers
🔗 Connected to the Network
Starting to chunk "/media/testnet/TEST FILES/Uploadfiles" now.
Uploading 10203 chunks
⠋ [01:12:50] [##############################>---------] 7746/10203 Upload terminated due to un-recoverable error Err(SequentialUploadPaymentError)
Error:
0: Failed to upload chunk batch: Too many sequential upload payment failures

Location:
sn_cli/src/subcommands/files/mod.rs:415

Backtrace omitted. Run with RUST_BACKTRACE=1 environment variable to display it.
Run with RUST_BACKTRACE=full to include source snippets.


Not sure what happened here.
I'm going to try with a smaller batch of files.

3 Likes

Try clearing the client folder, downloading the tokens again, and uploading that file again using

safe files upload --batch-size 8 <different_file_path>

I was also experiencing this problem.
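The retry-with-a-smaller-batch advice above can be sketched as a small wrapper around the safe CLI. The halving schedule and the wrapper function are my own illustration, not part of the client; only the --batch-size flag comes from the post above:

```python
# Hypothetical sketch: retry an upload with progressively smaller
# batch sizes when the client hits sequential payment failures.
import subprocess

def batch_size_schedule(start=32, floor=1):
    """Yield halving batch sizes, e.g. 32, 16, 8, 4, 2, 1."""
    size = start
    while size >= floor:
        yield size
        size //= 2

def upload_with_retries(path):
    """Try `safe files upload` with ever-smaller batches until one succeeds."""
    for size in batch_size_schedule():
        result = subprocess.run(
            ["safe", "files", "upload", "--batch-size", str(size), path]
        )
        if result.returncode == 0:
            return size  # batch size that worked
    return None
```

Smaller batches mean fewer in-flight payments per round, which seems to be what the SequentialUploadPaymentError is sensitive to.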

I uploaded another file of 264 MB and it went very smoothly, especially as my connection speed is not too high: I use at least three other online programmes at the same time.

PS C:\Users\gggg> safe files upload --batch-size 8 E:\gggg\Desktop\Filmy_iphone\IMG_0038.MOV
Logging to directory: "C:\\Users\\gggg\\AppData\\Roaming\\safe\\client\\logs\\log_2024-02-07_21-39-37"
Built with git version: db930fd / main / db930fd
Instantiating a SAFE client...
Trying to fetch the bootstrap peers from https://sn-testnet.s3.eu-west-2.amazonaws.com/network-contacts
Connecting to the network with 97 peers
🔗 Connected to the Network
Starting to chunk "E:\\gggg\\Desktop\\Filmy_iphone\\IMG_0038.MOV" now.
Chunking 1 files...
Uploading 530 chunks
⠄ [00:07:08] [#######################################>] 523/530
Retrying failed chunks 7 ...
⠒ [00:09:42] [#######################################>] 529/530
Retrying failed chunks 1 ...
Unverified file "IMG_0038.MOV", suggest to re-upload again.
**************************************
*          Uploaded Files            *
*                                    *
*  These are not public by default.  *
*     Reupload with `-p` option      *
*      to publish the datamaps.      *
**************************************
Among 530 chunks, found 0 already existed in network, uploaded the leftover 530 chunks in 11 minutes 58 seconds
**************************************
*          Payment Details           *
**************************************
Made payment of 0.000579016 for 530 chunks
Made payment of 0.000101923 for royalties fees
New wallet balance: 99.998862890
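From the figures in that payment summary, the average price per chunk and the royalty share work out as follows (pure arithmetic on the numbers printed above, nothing protocol-specific):

```python
# Averages derived from the payment summary in the log above.
chunk_payment = 0.000579016   # paid for 530 chunks
royalties     = 0.000101923   # royalties fee
chunks        = 530

per_chunk = chunk_payment / chunks
royalty_share = royalties / (chunk_payment + royalties)

# Royalties come out at roughly 15% of the total payment here.
print(f"~{per_chunk:.3e} per chunk, royalties ~{royalty_share:.1%} of the total")
```

So each chunk cost on the order of a millionth of a token in this run.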
2 Likes

No clue :innocent: I had it installed before I had your script

Looking at the lines, I think the cargo install from apt is not needed, since cargo should come with rustup. But it won't be available immediately in the executing terminal unless we run the source command first.

1 Like

The script worked great for nodes and clients. I saw no option to install vdash. I may have somehow messed it up.

3 Likes

curl https://sh.rustup.rs -sSf | sh

(just going with the default settings)

And then in a new terminal

cargo install vdash

That's the manual installation process.

4 Likes

You will need Rust: Install Rust - Rust Programming Language
Once you have Rust running, it's as simple as

cargo install vdash

7 Likes

Forgive me, for I am slow: I see no chunks that are in plain text. In the past they were plentiful.

Who can point out the obvious here for me?

5 Likes

Previously, data_maps (plain text) were all over the place. Now this is only true if folk make them public. However, data_maps don't bother us.

7 Likes

Sorry David, I am still struggling, what was being encrypted that is no longer being encrypted, data_maps?

6 Likes

Yes, data_maps. It's not so much that they're unencrypted as that they're not stored unless you use -p. I doubt there is much plain text beyond that, unless folk are using the low-level API and creating their own chunks?

7 Likes

Ahh, the lights are turning on :slight_smile: . So is this the way it will be going forward, or just a test to determine whether it caused the memory issue?

9 Likes

I think the more we dive in, the less desirable it is for nodes to encrypt like in the last testnets. I am keen that we can restart the whole network after a catastrophe, i.e. a software bug or similar. I see that as more likely than a blackout of a country etc.

It's an ongoing convo though, but to store all humanity's knowledge and then lose it would be criminal.

12 Likes

Earlier testnets had self-encryption occurring for files larger than 3KB IIRC, so files under 3KB would be unencrypted and stored in one chunk.
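That recollection can be sketched as a toy decision function. The 3 KB threshold, the 1 MiB max chunk size, and the minimum of three chunks are assumptions based on the post above and older self_encryption behaviour, not checked against current code:

```python
# Toy model of how earlier testnets may have decided on chunking.
MIN_SELF_ENCRYPTABLE = 3 * 1024   # ~3 KB threshold recalled above (assumption)
MAX_CHUNK = 1024 * 1024           # assumed max chunk size (1 MiB)

def storage_plan(file_size: int) -> tuple[str, int]:
    """Return how a file of `file_size` bytes would be stored."""
    if file_size < MIN_SELF_ENCRYPTABLE:
        # too small to self-encrypt: stored whole, in plain text
        return ("single unencrypted chunk", 1)
    # self-encryption always yields at least three chunks
    n_chunks = max(3, -(-file_size // MAX_CHUNK))  # ceiling division
    return ("self-encrypted", n_chunks)

print(storage_plan(2_000))             # small file, below the threshold
print(storage_plan(10 * 1024 * 1024))  # 10 MiB file
```

The point is just that the plain-text case only ever applied to the very smallest files.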

I'm using a country as an example only because it's easy to visualise. But yes, malware that turns the computer off on a certain date at 00:00 UTC, or a Windows update issue, or a software bug, or some other zero-day malware is more likely.

I still believe that nodes encrypting the chunks as an additional measure is well worth the performance loss (assuming it's workable), since the ABC agencies and governments love locking things down as much as they can. So giving them no opportunity is my preference. And asking node operators for a recovery password, and storing the temp key among other things encrypted on disk, solves the loss of humanity's knowledge.

10 Likes