[MemDebugNet] [4/10/23 Testnet] [Offline]

Wow

Client downloaded file in 198.166923661s
Saved /volume1/ubuntu-18.04-desktop-amd64.iso at /volume1/ubuntu-18.04-desktop-amd64.iso

Cli 0.83.31

6 Likes

Jesus, my created_cash_notes folder is 9.2 GB! Is there a way to purge it? Or can it be placed on another volume? It’s stuffing my instance and not leaving space for the nodes…
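One possible workaround, assuming the client simply follows the filesystem path, is to move the directory to a bigger volume and leave a symlink behind. An untested sketch, using the wallet path reported later in this thread and a hypothetical /volume2 target:

# stop the client first, then move the cash notes dir to a larger volume
mv ~/.local/share/safe/client/wallet/created_cash_notes /volume2/created_cash_notes
# leave a symlink so the client still finds it at the old path
ln -s /volume2/created_cash_notes ~/.local/share/safe/client/wallet/created_cash_notes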

1 Like

Educate me please, I don’t even know what this is?

Cli 0.83.31 with cleanup. A couple of uploads were OK, but the next:

Error:
0: Transfer Error Failed to send tokens due to Network Error Record was not found locally…
1: Failed to send tokens due to Network Error Record was not found locally.

1 Like
~/.local/share/safe/client/wallet/created_cash_notes

Must be the record of all the payments made to upload data?

2 Likes

Well, you got me beat with my meager 5.6 GB :joy:. Mine is only an hour old though!

4 Likes

#MeToo

from home

willie@gagarin:~/projects/maidsafe/safe_network$ safe files download ubuntu-18.04-desktop-amd64.iso 6cfa28d385d5af711893744362aaa32e9116aacce06287614163c20e1b5064df
Logging to directory: "/home/willie/.local/share/safe/client/logs/log_2023-10-07_03-46-41"
Built with git version: 379bf7c / main / 379bf7c

🔗 Connected to the Network
Downloading ubuntu-18.04-desktop-amd64.iso from 6cfa28d385d5af711893744362aaa32e9116aacce06287614163c20e1b5064df
Error downloading "ubuntu-18.04-desktop-amd64.iso": Chunks error Chunk could not be retrieved from the network: d6116b(11010110)...

On my 2 GB cloud instance this download is killed / runs out of memory every time, even after I shut down the nodes.

Logs available if needed
Uploaded log_2023-10-07_03-46-41_safe.log to 15a0ba4634b99d27572c7368f4798bd5efff29d3937f5d62b9d0a681f16b3df3
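If you want to confirm it really is the kernel OOM killer doing the killing, the kernel log usually records it. A quick check, assuming a Linux instance (the journalctl variant needs systemd):

# look for out-of-memory kills in the kernel log
dmesg | grep -i 'out of memory'
# or, on systemd systems
journalctl -k | grep -i oom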

1 Like

I have found an interesting bug while trying downloads (on 0.83.31).

When it shows progress it finishes OK.

Downloading test.mp4 from b824ce7aa8266f2d3edef7245d64363d835220484ddd70b90f03cc01e9a4c9ce
Client (read all) download progress 1/287
Client (read all) download progress 2/287
Client (read all) download progress 3/287
...
Client (read all) download progress 285/287
Client (read all) download progress 286/287
Client (read all) download progress 287/287
Client downloaded file in 43.274878484s
Saved test.mp4 at /home/petr/.local/share/safe/client/test.mp4

With bigger files it downloads the data (about 2500 MB for a 500 MB file) but doesn’t show progress, and when that happens it always fails after a while.

Connected to the Network 
Downloading test.avi from c7d16ad1322785f17db116643240c92eb8b2e26663823907f487d470991a995e
Error downloading "test.avi": Chunks error Chunk could not be retrieved from the network: 54d5dc(01010100)...

safe.log.zip (538.0 KB)
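A rough way to verify how much data a download actually pulls, assuming a single active network interface (replace eth0 with yours), is to snapshot the interface byte counters around the run:

# bytes received before and after the download
rx_before=$(cat /sys/class/net/eth0/statistics/rx_bytes)
safe files download test.avi c7d16ad1322785f17db116643240c92eb8b2e26663823907f487d470991a995e
rx_after=$(cat /sys/class/net/eth0/statistics/rx_bytes)
echo "pulled $(( (rx_after - rx_before) / 1024 / 1024 )) MB over eth0"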

4 Likes

I am fearful of starting a cashnote discussion, as I have not really grokked it.
As @stout77 pointed out, the cash_notes dir grows incredibly fast.
After some light poking, it seems to be 5 notes per chunk, plus 1.

I accumulated over 5 GB of cashnotes in an hour.
It seems to me that if you lose these cashnotes, you are screwed?

I deleted them and my funds are gone. The name makes sense now: lost cash… is that correct?

Is there any recovery of my balance after losing a cash_notes dir?
Is this still a WIP, or do we need terabytes to keep track of wallet balances?

5 Likes

I don’t know, but I suspect almost all are spent and just not being deleted yet.

If so, a short term improvement would be to move spent notes to a subdirectory which can be safely purged by testers, unless of course there’s a good reason for keeping them at the moment.

3 Likes

As I understand it, there should be a way to consolidate all the cashnotes into one and store only that one. Maybe the wallet will do this automatically in the future?

4 Likes

I can randomly delete some and everything keeps working. I was hopeful that I only needed to keep the most recent one to retain my final balance, but that is wrong.

Delete the wrong one and it’s game over for now, it seems.
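Before experimenting like that, it’s probably worth snapshotting the whole wallet directory so a bad deletion is recoverable. A minimal sketch, assuming the default client path:

# timestamped copy of the wallet dir before deleting anything
cp -a ~/.local/share/safe/client/wallet ~/wallet-backup-$(date +%Y%m%d-%H%M%S)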

1 Like

I think the situation is that the directory created_cash_notes is the log of the cash_notes you have created to upload data.

On my instance, which is uploading 1 MB random data files, I have 2.3 GB of created_cash_notes. It’s 79,000 files, mainly 24 KB or 34 KB, with a few other smaller ones. That’s taken 3 days to accumulate.
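For anyone wanting to compare numbers, something like this gives the total size, file count, and a rough size histogram (assuming the default wallet path):

# total size and file count
du -sh ~/.local/share/safe/client/wallet/created_cash_notes
find ~/.local/share/safe/client/wallet/created_cash_notes -type f | wc -l
# rough histogram of file sizes in KB (GNU find)
find ~/.local/share/safe/client/wallet/created_cash_notes -type f -printf '%s\n' | awk '{ printf "%dKB\n", $1/1024 }' | sort | uniq -c | sort -rn | head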

I think they can be disposed of, because they are useless once they have been spent, but they are being kept around for now in case they are useful for debugging. I’m sure I remember a discussion like that a couple of testnets ago. (It’s all becoming a bit of a blur! In a good way!)

Maybe there has been a change with a recent client update that causes more files or larger files to be stored? I can’t see how you’d have got to 5GB already otherwise. Mine is still on 0.83.17.

I thought the situation was that you need the contents of the directory received_cash_notes to have tokens to spend. But looking at mine, it is empty! I thought I’d have ended up with some from when I hit the faucet. Maybe it is for cash_notes received from other people?

3 Likes

I only have a single dir, cash_notes.

Mine is safe 0.83.31.

Been playing with pretty big uploads; it adds up fast.

2 Likes

Yes, we need to purge spent cashnotes for sure. It’s really related to another issue, but there should be a live_cashnotes dir that holds only actual spendable notes. So yes, clean up/purge is necessary here IMO.

Good news, though: think of the number of nanopayments happening.

18 Likes

I think you’re right. I deleted everything in that directory, and while the safe wallet balance was unchanged, the next upload caused a ‘not enough cash’ error, necessitating a visit to the faucet for a top-up.

8 Likes

I’ve noticed something a bit odd. There seems to be a discrepancy between the number of PUTs shown in vdash and a count of the records. I’m sure in the past this matched exactly (as long as vdash had been running since the node was started). It’s on all the nodes I have; this one has been running since a few hours in.

ls "$HOME/.local/share/safe/node/" | while read f; do echo "${f}"; ls "$HOME/.local/share/safe/node/${f}/record_store" | wc -l; echo; done
12D3KooWRa1iMzVBgjXv5W2sR1Be2Yf3BXcKXH4Jh2H6pyJyu6tK
186

So 186 records.

vdash shows 241 PUTs.

Is it that a PUT is being counted even if the record is one that is already stored? For example, the repeated uploading of BegBlagandSteal? Or is there something else that counts as a PUT but isn’t a record? Or have I uncovered something that vdash is counting as a PUT when it shouldn’t?

4 Likes

Interesting that it seems to have changed. vdash counts every log message saying a record was successfully written to disk as a PUT. If there’s a change, it’s in what appears in the log with that message.

Maybe MaidSafe have made some changes that affect when that message happens? :man_shrugging:

I’d like a reliable measure of records stored and maximum records stored, so I could revive the storage space gauge on the right, because it looks cool as well as being useful, but I haven’t looked too hard at the logs for a while. I used to use a message that included space used and space available, but that was removed a long time ago. I’ve been meaning to look for something equivalent, maybe in the storage cost messages, that I could use.

If anyone has come across useful log messages on the latter or anything else interesting, let me know.
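For anyone who wants to cross-check vdash against a node, the comparison is essentially log lines versus files on disk. A sketch, where PATTERN stands in for whatever the current “record written to disk” log line actually says, and the logs path is an assumption:

# count matching log lines per log file (PATTERN and the logs path are assumptions; check your node's logs)
grep -rc 'PATTERN' "$HOME/.local/share/safe/node/<peer-id>/logs"
# count records actually in the store
ls "$HOME/.local/share/safe/node/<peer-id>/record_store" | wc -l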

5 Likes

Uploads with Cli 0.83.31 (after a cleanup):

File of 751 chunks → OK
File of 499 chunks → Error
File of 499 chunks (second attempt) → Error
File of 785 chunks → Error
File of 566 chunks → Error
File of 104 chunks → OK
File of 245 chunks → OK
File of 566 chunks → OK
File of 499 chunks (third attempt) → Error
File of 707 chunks → OK
File of 1369 chunks → Error
File of 959 chunks → OK

5 Likes

If you want your CPU to get some exercise, give it safe files upload --batch-size 2900 Downloads/ubuntu.iso :laughing:
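Going the other way, a smaller batch size should keep CPU and memory in check. A hedged example using the same flag (the value is an arbitrary choice, and I don’t know the default):

# same upload with a modest batch size (40 is arbitrary)
safe files upload --batch-size 40 Downloads/ubuntu.iso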

1 Like