**************************************
* Uploaded Files *
**************************************
"Autonomi White Paper.pdf" 3d8b80031d088fd244ab79498d245e62e75ce48bedc9afd6d0f46724801f4277
Among 8 chunks, found 0 already existed in network, uploaded the leftover 8 chunks in 2 seconds
**************************************
* Payment Details *
**************************************
Made payment of NanoTokens(110) for 8 chunks
Made payment of NanoTokens(14) for royalties fees
New wallet balance: 1.999999876
And now for something bigger…
safe@BET-southside01:~$ safe files upload -p -b8 openSUSE-Tumbleweed-NET-x86_64-Current.iso
Logging to directory: "/home/safe/.local/share/safe/client/logs/log_2024-04-04_19-55-11"
Built with git version: f682844 / alpha-tolerance / f682844 / 2024-04-04
Instantiating a SAFE client...
Trying to fetch the bootstrap peers from https://sn-testnet.s3.eu-west-2.amazonaws.com/alpha-tolerance-network-contacts
Connecting to the network with 50 peers
🔗 Connected to the Network
Chunking 1 files...
"openSUSE-Tumbleweed-NET-x86_64-Current.iso" will be made public and linkable
⠁ [00:00:00] [----------------------------------------] 0/553 Splitting and uploading "openSUSE-Tumbleweed-NET-x86_64-Current.iso" into 553 chunks
**************************************
* Uploaded Files *
**************************************
"openSUSE-Tumbleweed-NET-x86_64-Current.iso" d143cc9cc859038d2380237f14244dee0292c210728232e1d5e4ebff57d1cc98
Among 553 chunks, found 0 already existed in network, uploaded the leftover 553 chunks in 1 minutes 43 seconds
**************************************
* Payment Details *
**************************************
Made payment of NanoTokens(8547) for 553 chunks
Made payment of NanoTokens(1129) for royalties fees
New wallet balance: 1.999990200
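For anyone sanity-checking the payment lines above, the arithmetic works out exactly, assuming 1 token = 10^9 NanoTokens and a starting balance of 2.000000000 (both my assumptions from the balances shown):

```shell
# Sanity check of the two wallet balances (assumption: 1 token = 1e9 NanoTokens).
# PDF upload:  8 chunks cost 110 + 14 royalties  = 124 nanos.
# ISO upload:  553 chunks cost 8547 + 1129 royalties = 9676 nanos.
awk 'BEGIN {
  printf "%.9f\n", 2.0         - (110  + 14)   / 1e9   # balance after the PDF
  printf "%.9f\n", 1.999999876 - (8547 + 1129) / 1e9   # balance after the ISO
}'
```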
Using screen, on a phone, remoting to cloud… the setenv thing is a total PITA in this scenario… I’ll sit on the bench on this one. Have fun peeps!
I’m with you @stout77, on holiday trying to do anything via phone is a PITA.
So I will stand by on the BasicEconomy testnet, as I believe we are keeping it going and will hopefully upgrade it.
Yeah, we want to upgrade. It should happen at some point next week I think.
@chriso - think we have a minor but annoying typo that needs fixing:
"multi-user.targett" is not a valid unit name.
safe@BET-southside01:~$ sudo /home/safe/.local/bin/safenode-manager add --count 20 --node-port 15000-15019 --peer /ip4/209.97.139.230/udp/59695/quic-v1/p2p/12D3KooWPTyqVdNzqcaeNGzjHA4nNnqJWkxU81gW7uBPYewu3jop
=================================================
Add Safenode Services
=================================================
20 service(s) to be added
The safe user already exists
Retrieving latest version for safenode...
Downloading safenode version 0.105.6-alpha.4...
Download completed: /tmp/a9173a5f-0a15-44e2-9126-9f46c32886f3/safenode
Failed to add 20 service(s):
✕ safenode1: Failed to enable unit: "multi-user.targett" is not a valid unit name.
✕ safenode2: Failed to enable unit: "multi-user.targett" is not a valid unit name.
✕ safenode3: Failed to enable unit: "multi-user.targett" is not a valid unit name.
✕ safenode4: Failed to enable unit: "multi-user.targett" is not a valid unit name.
✕ safenode5: Failed to enable unit: "multi-user.targett" is not a valid unit name.
✕ safenode6: Failed to enable unit: "multi-user.targett" is not a valid unit name.
✕ safenode7: Failed to enable unit: "multi-user.targett" is not a valid unit name.
Hmm. What OS are you running that on? It would be the underlying node manager crate that’s generating that I think.
Ubuntu 22.04
safe@BET-southside01:~$ safeup node-manager
**************************************
* *
* Installing safenode-manager *
* *
**************************************
Installing safenode-manager for x86_64-unknown-linux-musl at /home/safe/.local/bin...
Retrieving latest version for safenode-manager...
Installing safenode-manager version 0.7.4-alpha.1...
[00:00:01] [########################################] 5.20 MiB/5.20 MiB (0s) safenode-manager 0.7.4-alpha.1 is now available at /home/safe/.local/bin/safenode-manager
Hey Wes, thanks for reporting this. You’re correct, we were performing certain actions from client that should only be performed by nodes. I’ve put up a fix for that.
But I’m afraid that might not be the root cause for the issue you’re facing currently. If you could share your entire log file, I can take a look into it!
Hmm, that is really bizarre indeed, because nothing should have changed anywhere near that area.
Was thinking the same myself - it was a simple update to the payload rather than any structural change, surely?
Can anyone else replicate this?
Just do safeup node-manager
and watch the version number
I’ve not replicated it on a 22.04 VM:
vagrant@ubuntu2204:~$ lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 22.04.3 LTS
Release: 22.04
Codename: jammy
vagrant@ubuntu2204:~$ safenode-manager --version
sn-node-manager 0.7.4-alpha.1
vagrant@ubuntu2204:~$ sudo safenode-manager add --count 3 --version 0.105.6-alpha.4 --peer /ip4/209.97.139.230/udp/59695/quic-v1/p2p/12D3KooWPTyqVdNzqcaeNGzjHA4nNnqJWkxU81gW7uBPYewu3jop
=================================================
Add Safenode Services
=================================================
3 service(s) to be added
Created safe user account for running the service
Downloading safenode version 0.105.6-alpha.4...
Download completed: /tmp/8a3760ed-2c62-4529-9ebc-b7a54350861c/safenode
Services Added:
✓ safenode1
- Safenode path: /var/safenode-manager/services/safenode1/safenode
- Data path: /var/safenode-manager/services/safenode1
- Log path: /var/log/safenode/safenode1
- RPC port: 127.0.0.1:35129
✓ safenode2
- Safenode path: /var/safenode-manager/services/safenode2/safenode
- Data path: /var/safenode-manager/services/safenode2
- Log path: /var/log/safenode/safenode2
- RPC port: 127.0.0.1:36695
✓ safenode3
- Safenode path: /var/safenode-manager/services/safenode3/safenode
- Data path: /var/safenode-manager/services/safenode3
- Log path: /var/log/safenode/safenode3
- RPC port: 127.0.0.1:33557
[!] Note: newly added services have not been started
Can you cat the contents of one of the service definitions? They should be at /etc/systemd/system/safenode1.service.
[Unit]
Description=safenode1
[Service]
ExecStart=/var/safenode-manager/services/safenode1/safenode --rpc 127.0.0.1:36885 --root-dir /var/safenode-manager/services/safenode1 --log-output-dest /var/log/safenode/safenode1 --port 15900 --peer /ip4/209.97.139.230/udp/59695/quic-v1/p2p/12D3KooWPTyqVdNzqcaeNGzjHA4nNnqJWkxU81gW7uBPYewu3jop
Restart=on-failure
User=safe
[Install]
WantedBy=multi-user.targett
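Until a fixed node-manager release lands, a manual workaround is possible. This is only a sketch, assuming the stray "t" in the WantedBy line is the only defect in the generated units:

```shell
# Hypothetical workaround: correct the typo in every generated unit file,
# then reload systemd and re-enable the services.
sudo sed -i 's/^WantedBy=multi-user\.targett$/WantedBy=multi-user.target/' \
  /etc/systemd/system/safenode*.service
sudo systemctl daemon-reload
for unit in /etc/systemd/system/safenode*.service; do
  sudo systemctl enable "$(basename "$unit")"
done
```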
Yep right at the end there…
Is this a fresh machine you’re using? I just don’t know maybe if an existing service definition was overwritten and left some rogue character around or something.
It’s a VPS and I can wipe the lot and start again if I have to.
I would be very interested to know if you reproduced it from a scratch state.
OK I will kill this VPS and start from scratch
I have a problem uploading files. I tried uploading an 8.5 MB .mp4 with a batch size of 40, then 16, then 4, then the same file without setting a batch size; then I tried another file of 10.2 MB. In every case the file was split into chunks but the upload never progressed:
Connecting to the network with 50 peers
🔗 Connected to the Network
Splitting and uploading "E:\\gggg\\Videos\\signal-2023-07-04-102157_003.mp4" into 18 chunks
⠁ [00:24:44] [----------------------------------------] 0/18
Connecting to the network with 50 peers
🔗 Connected to the Network
Chunking 1 files...
Files upload attempted previously, verifying 21 chunks
21 chunks were uploaded in the past but failed to verify. Will attempt to upload them again...
Splitting and uploading "E:\\gggg\\Videos\\1677654404651_0.mp4" into 21 chunks
⠦ [00:18:46] [----------------------------------------] 0/21
Anyone also have a file transfer problem?
I will try to clear the client folder and upload again.
EDIT: Before clearing the client folder I tried downloading the White Paper from the testnet; it downloaded instantly with no problem.
Connecting to the network with 50 peers
🔗 Connected to the Network
Downloading "Autonomi White Paper.pdf" from 3d8b80031d088fd244ab79498d245e62e75ce48bedc9afd6d0f46724801f4277 with batch-size 16
Saved "Autonomi White Paper.pdf" at C:\Users\gggg\Autonomi White Paper.pdf
No Safe, no wave.