Assemble at the Start Line. The Beta is About to Begin

Thanks for the clarification!

1 Like

It’s all available in theory

4 factors are defining the store cost:

let records_stored = quoting_metrics.close_records_stored;
let received_payment_count = quoting_metrics.received_payment_count;
let max_records = quoting_metrics.max_records;
let live_time = quoting_metrics.live_time;
  • Records stored are the active records and must, as of now, be extracted from the logs
  • Payment count appears in the logs too
  • Max records is 4096
  • Live time is the uptime of the node, available through the RPC interface (or could be read from the logs and extrapolated to the current time)
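A minimal sketch of how those four metrics might combine into a quote. The weighting below is purely illustrative (an assumption, not the real pricing function, which lives in the safenode codebase):

```rust
// Illustrative only: a hypothetical weighting of the four quoting metrics.
// The real pricing function in the safenode codebase will differ.

const MAX_RECORDS: u64 = 4096;

fn store_cost(records_stored: u64, received_payment_count: u64, live_time_secs: u64) -> u64 {
    // Fullness dominates: the quote grows as the node fills toward max_records.
    let fullness = records_stored as f64 / MAX_RECORDS as f64;
    let base = 1.0 + 100.0 * fullness * fullness;
    // Brand-new and never-paid nodes quote slightly cheaper so they win more bids.
    let youth_discount = if live_time_secs < 3600 { 0.8 } else { 1.0 };
    let unpaid_discount = if received_payment_count == 0 { 0.8 } else { 1.0 };
    (base * youth_discount * unpaid_discount).round() as u64
}

fn main() {
    // A nearly full node should quote far more than a mostly empty one.
    assert!(store_cost(4000, 50, 100_000) > store_cost(100, 50, 100_000));
    // A brand-new, never-paid node never quotes more than an established one.
    assert!(store_cost(100, 0, 60) <= store_cost(100, 50, 100_000));
    println!("near-full quote: {}", store_cost(4000, 50, 100_000));
}
```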
3 Likes

This is an excellent post and does raise a very important point.

From the rabbit hole bitcoin has sent me down over the years, I do now feel I understand that nothing can really be free and sound money should represent real energy in the economy.

Paying once for an upload of data does seem too good to be true and potentially goes against sound economic principles.

I also tend to worry about the pay-the-developer idea, which also potentially goes against a free market and a sound economy. I’d love to be wrong, but the fact that that part will not be automated at full launch makes me feel the idea should be dropped, to remove any centralization that might jeopardise the perception of the project. I believe that if the network is stable, decentralized, open to all, and easy to build on, that should be more than enough to see devs building on it.

Maybe I don’t understand these parts of the project well enough? These are just my thoughts at the moment. I sway more towards making sure the project is as decentralized as possible, and that the token/currency and economic system is as sound as possible, over many other nice-to-haves. History makes me believe these things should not be overlooked.

2 Likes

Just spotted something weird :thinking: on a system running 8 nodes. It’s been earning like 10 Nanos here and there so far, until today one node (node 6) earned 250 Nanos for a record and is now a top earner by a huge margin; the other 7 combined only made 60 Nanos.

250 Nanos feels a bit steep if the network is empty; at that rate it’s going to be thousands of Nanos to store a single cat meme :cat:

Also some weird things with cash notes? Or is that just noise?

Node 6 Log Snip

[2024-06-18T10:14:07.114345Z INFO sn_networking::cmd] Fetch b1de54(c45fca515c6064f6d5c17d5973e8c92f379c4cf40bebc1fbf57333f93a63466d) early completed, may fetched an old version record.
[2024-06-18T10:14:07.118361Z INFO sn_transfers::wallet] Attempting to read wallet file
[2024-06-18T10:14:07.124929Z DEBUG sn_transfers::wallet::watch_only] Loaded wallet from "/home/froq01/.local/share/safe/node/safenode6/wallet" with balance NanoTokens(250)
[2024-06-18T10:14:07.169382Z INFO sn_networking] Getting record from network of 21179f(5366f38442346b2177a8d7de4a5d26153aa2818e452bf8c469b972a8b2a5707a). with cfg GetRecordCfg { get_quorum: All, retry_strategy: Some(Quick), target_record: "None", expected_holders: {} }
[2024-06-18T10:14:07.175325Z INFO sn_networking::cmd] We now have 1 pending get record attempts and cached 0 fetched copies
[2024-06-18T10:14:07.513084Z INFO sn_networking] Record returned: 21179f(5366f38442346b2177a8d7de4a5d26153aa2818e452bf8c469b972a8b2a5707a).
[2024-06-18T10:14:07.590887Z INFO sn_networking] Getting record from network of 21179f(5366f38442346b2177a8d7de4a5d26153aa2818e452bf8c469b972a8b2a5707a). with cfg GetRecordCfg { get_quorum: All, retry_strategy: Some(Quick), target_record: "None", expected_holders: {} }
[2024-06-18T10:14:07.593748Z INFO sn_networking::cmd] We now have 1 pending get record attempts and cached 0 fetched copies
[2024-06-18T10:14:07.945313Z INFO sn_networking::replication_fetcher] Among 26 incoming replications from PeerId("12D3KooWSTG8toPyaYYqeNVDJFxzto2Pjh96YDbrT5st8UqUNib3"), found 26 out of range
[2024-06-18T10:14:08.256758Z INFO sn_networking] Record returned: 21179f(5366f38442346b2177a8d7de4a5d26153aa2818e452bf8c469b972a8b2a5707a).
[2024-06-18T10:14:08.284101Z WARN sn_node::put_validation] Invalid network royalties payment for record b1de54(c45fca515c6064f6d5c17d5973e8c92f379c4cf40bebc1fbf57333f93a63466d): InvalidTransfer("None of the CashNoteRedemptions are destined to our key")
[2024-06-18T10:14:08.287604Z INFO sn_node::put_validation] 1 cash note/s (for a total of NanoTokens(250)) are for us for b1de54(c45fca515c6064f6d5c17d5973e8c92f379c4cf40bebc1fbf57333f93a63466d)
[2024-06-18T10:14:08.294500Z INFO sn_transfers::wallet] Attempting to read wallet file
[2024-06-18T10:14:08.294930Z DEBUG sn_transfers::wallet::watch_only] Loaded wallet from "/home/froq01/.local/share/safe/node/safenode6/wallet" with balance NanoTokens(250)
[2024-06-18T10:14:08.295182Z DEBUG sn_transfers::wallet::wallet_file] Writing cash note to: "/home/froq01/.local/share/safe/node/safenode6/wallet/cash_notes/af7e4ad8aa7a4ee1e722fec7cfc79a6238838bca5f754a07219567b365703a5f.cash_note"
[2024-06-18T10:14:08.302030Z INFO sn_node::put_validation] The new wallet balance is 250, after earning 0
[2024-06-18T10:14:08.309961Z WARN sn_node::put_validation] No network royalties payment found for record b1de54(c45fca515c6064f6d5c17d5973e8c92f379c4cf40bebc1fbf57333f93a63466d)
[2024-06-18T10:14:08.314662Z INFO sn_node::log_markers] RecordRejected(b1de54(c45fca515c6064f6d5c17d5973e8c92f379c4cf40bebc1fbf57333f93a63466d), NoNetworkRoyaltiesPayment(b1de54(c45fca515c6064f6d5c17d5973e8c92f379c4cf40bebc1fbf57333f93a63466d)))

3 Likes

Similar situation for me. I have been running between 6 and 11 nodes since the start of the beta. After some tweaking here and there, I finally started earning Nanos on Saturday at a rate of 10 per day. Today one of my nodes raked in 280 Nanos, minus the 15% royalty, which gave me 230 Nanos in one go.

4 Likes

No wonder I’m dropping in the rankings :joy:

6 Likes

Yes, that does seem like a lot. Are you using the --peer option (so you’re forwarding ports and will be a relay node), or the --home-network option (so you’re relying on relay nodes)? Are you on Windows, Mac or Linux?

1 Like

I’m using --home-network; anyway, vdash says 1109 peers are connected to a single node, yet safenode manager, under the detail command, says that node is only connected to 161 peers, so a big discrepancy indeed.

1 Like

Thanks for this - interesting to see, though I can’t really follow that to figure much out.

With the nodes I’m running, I can’t see a close relationship between store cost and anything else, but maybe if I could see active records all pulled from the logs it would be a different story?

1 Like

I ran ‘/rank’ in Discord and was amazed to find I’d gone from position 68 in the rankings with 110 nanos I think it was to position 39 with 430 nanos! After some hunting through the 3 sites where I’m running nodes I found the culprit.

In the last 3 days since it was started it has had 2 payments of 10 nanos (good start), 13 payments of 20 nanos (great!) and then 2 payments of 280 nanos (niiiice!).

It has had 198 PUTs in total and currently has 111 records. There hasn’t been a big influx of PUTs apart from when it was started (which is normal) so it’s not like there were enough nodes in the area and some left. So it looks like that area of the hashspace doesn’t have many other nodes in it and hasn’t done for the last 3 days. Then when some records arrived into it my node was the best candidate.

So there is hope for us all and you can get a lottery win like this even running a small number of nodes.

6 Likes

In the purely logical, mathematical world where 2 + 2 equals 4 your feelings are right, but surely your very human brain will not generate a feeling of work if I send you a free pizza and all you have to do is go to the door to take it…

I’ve gotten thousands of dollars in crypto from free airdrops over the years and having to make a few mouse clicks doesn’t make me feel any different - free is free.

The fact that I will click Autonomi.exe and pay 1.5 euros more at the end of the month for electricity does not make me feel any different - it feels like getting free crypto.

There are free things because our feelings tell us they are free, and this comes from the disproportion between effort and result.


Privacy. Security. Freedom

4 Likes

The peer count obtained via rpc call may be “active” connections vs “known” peer nodes from the logs. I only look at the former to judge if the node is still part of the network or needs restarting.

3 Likes

The number of records compared to max records is the majority factor. The other 2 are supposed to tweak the amount to favour the brand new nodes.

So records stored / max records gives a %-full figure, which is the main factor;
the live time is there to tell whether the node is brand new;
the number of times paid is also there to favour the nodes that have yet to earn.

And from the figures you and others have gotten, there is a problem in the code, or the tweaks are doing way more than intended. Yeah, I queried the tweaks when they introduced them, and that is what I was told: to give a small benefit to new or not-yet-paid nodes, to help them while new.

3 Likes

Seems that the logs are not recording when a peer is no longer a peer, which would explain why vdash is showing so many.

4 Likes

Also, when giving practical examples of how you can store forever with one payment, it seems so foreign to people that even when given the figures they go “nope, cannot be done”.

Storage is one area where the cloud storage people are stealing from the world.

I can show that a home user who actually buys a hard drive to store chunks will still only pay 1/10th of the original cost in 5 years to store all that data on the brand-new drive they buy to replace the original, and in another 5 years it’s 1/100th of the cost to store that original amount of data, and so on.

So the formula to pay for perpetual data in a home system works out at about 1.111… times the cost of the original drive (the geometric series 1 + 1/10 + 1/100 + …). Home spare resources, though, mean the original drive costs zero, and when replacing it in 5 years there is still zero cost needed for chunk storage, since again it’s the spare resources.
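That 1.111… figure is just the geometric series of replacement costs. A quick check, assuming each 5-year replacement drive stores the original data at a tenth of the previous cost:

```rust
// Checking the 1.111... claim: each replacement drive (every ~5 years) stores
// the original data at a tenth of the previous cost, so the lifetime cost is
// the geometric series drive_cost * (1 + 1/10 + 1/100 + ...) = drive_cost * 10/9.

fn perpetual_cost(first_drive_cost: f64, generations: u32) -> f64 {
    (0..generations)
        .map(|n| first_drive_cost * 0.1_f64.powi(n as i32))
        .sum()
}

fn main() {
    let total = perpetual_cost(100.0, 50);
    // Converges to 100 * 10/9 = 111.11...
    assert!((total - 100.0 * 10.0 / 9.0).abs() < 1e-6);
    println!("lifetime cost for a 100-unit drive: {:.2}", total);
}
```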

And as you say the additional electricity cost is so small a cup of coffee from the cafe covers it for one year.

So really the earnings only have to cover the tiny overheads of spare resources, plus enough to make you feel good and like it’s worth it.

The problem, of course, is people want money-making machines, which is why many turn off when you talk to them about the network. Or they go the other way and want to run thousands of nodes wherever they can and build out their hardware etc., like the old home Bitcoin rigs with 8 GPU cards and so on.

Here is a real-life example: in 1985 I was running a 5 MB drive; for 2025 the trend predicts 50 TB. As of 2023 I can already buy 50 TB in SSD, or 24 TB (I have some) in rotating media.

So yeah, perpetual data is cost effective as long as you charge the extra needed and keep that extra for the future. The 5 MB drive’s cost in today’s terms would be close to the 24 TB drive’s, too.
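A rough check of that trend: 5 MB to 50 TB is a ten-million-fold increase over 40 years, i.e. capacity doubling roughly every 1.7 years:

```rust
// Sanity-checking the capacity trend above: 5 MB in 1985 to a predicted
// 50 TB in 2025 is a 10-million-fold increase over 40 years.

fn main() {
    let growth: f64 = 50e12 / 5e6; // 50 TB / 5 MB, units cancel
    let doublings = growth.log2();
    let years_per_doubling = 40.0 / doublings;
    assert!((growth - 1.0e7).abs() < 1.0);
    // Capacity doubled roughly every 1.7 years over that span.
    assert!(years_per_doubling > 1.5 && years_per_doubling < 2.0);
    println!("{:.1} doublings, one every {:.2} years", doublings, years_per_doubling);
}
```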

4 Likes

Thanks for your reply Dimitar, but I think it has further proven what I’m saying.

In a world where everyone starts to receive free pizza, it is very likely you will quickly see the demise of pizza restaurants and those that produce pizzas. Of course somebody can give another person a pizza for free, but the economic energy behind that pizza had to be sound enough to have a pizza ready and available. Lots of human energy goes into making the pizza, from growing the raw ingredients to transportation and cooking. None of those things are free.

If everyone on the planet received the airdrops you mentioned, it is very likely the token would be worthless. And most of these airdrops are worthless unless there is actually something of value tied to them and some level of scarcity to the token. Of course any shitcoin can have a pumped-up value in the short term, but long term the true value is found by the market.

2 Likes

What you say is true, all other things being equal.

But it is not equal: the daily advances in all fields of science let us get for free today things we paid for until yesterday, and in a world where countries print money out of thin air just by adding a few ones and zeros on a server, it is possible for other things to be free too.

Is it sustainable over time? Most likely no. Is it sustainable in our lifetime? Most likely yes.

So I say don’t stress so much; if the economic model doesn’t work, it will be changed. No need to pre-optimize before we know whether it’s possible or not.


Privacy. Security. Freedom

2 Likes

Yes, you are correct: technology is deflationary and the trajectory for probably all things is zero in the long run, but it takes sound economic systems to get us there.

I’d be very happy to be proven wrong about these concerns, as I love the idea of pay-once and pay-the-developer, but I believe this area should really be thought about deeply and steelmanned as much as possible.

4 Likes

I’ve created a GPT nodes assistant to help people set up nodes. Feel free to give me some feedback on this. Other community members could probably do a much better job.

5 Likes

A whole day without a message in this topic, so things must be running well! Or we’ve all had to at least pretend to do our day jobs for a bit!

I’d like to run my nodes at home in --node-port mode and not use --home-network, so that rather than adding to the load of relays I’m being a relay for others. I’m doing that and it seems to work: I’m getting PUTs, GETs and occasional Nanos. I think the port forwarding is working fine on the nodes at home, or surely I wouldn’t be getting these and would have a whole load of errors. There doesn’t seem to be any difference in the number of peers or the amount of shunning for them.

However, I don’t think I’m being a relay. I see only messages like this that have ‘relay client event’ in them:-

[2024-06-21T04:44:56.014547Z INFO sn_networking::event::swarm] relay client event event=OutboundCircuitEstablished { relay_peer_id: PeerId("12D3KooWKMxudHDuoNybcxFKfqe4vp6KYeScshYLxi93GEAtNeUo"), limit: None }

whereas on the AWS Instance I have running in --node-port mode I see messages like this with ‘relay server event’ in them :-

[2024-06-20T19:53:14.453319Z INFO sn_networking::event::swarm] relay server event event=CircuitReqAccepted { src_peer_id: PeerId("12D3KooWGxUfJadxJaSczKEayjYHaEQAJf4SZ48TrAHT6eY5uDW7"), dst_peer_id: PeerId("12D3KooWBqpik2LeBiCWrdJk9x2uoAmP5L7nsyVmzS5BPbjYpF8Q") }
[2024-06-20T19:53:14.643699Z INFO sn_networking::event::swarm] relay server event event=CircuitClosed { src_peer_id: PeerId("12D3KooWGxUfJadxJaSczKEayjYHaEQAJf4SZ48TrAHT6eY5uDW7"), dst_peer_id: PeerId("12D3KooWBqpik2LeBiCWrdJk9x2uoAmP5L7nsyVmzS5BPbjYpF8Q"), error: None }

along with the occasional one like this:-

[2024-06-20T19:54:59.504950Z INFO sn_networking::event::swarm] relay client event event=OutboundCircuitEstablished { relay_peer_id: PeerId("12D3KooWDvVM5USnyPYb4NR2ihfDcZeawk11NPsKeibXaz8YGYaq"), limit: None }

I see this message in the first log on both, which suggests the nodes are set up to not be relay clients:-

[2024-06-12T22:38:28.628980Z INFO sn_networking::relay_manager] Setting relay client mode to false

whereas when I was running nodes from home with the --home-network option I got this in the first logs:-

[2024-06-17T23:07:04.822169Z INFO sn_networking::relay_manager] Setting relay client mode to true

So am I being a relay on the home nodes? Doesn’t look like it.
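One way to answer that from the logs is to count “relay server event” lines (the node relayed traffic for others) against “relay client event” lines (others relayed for it). A small sketch; `relay_counts` is a hypothetical helper, with the marker strings taken from the excerpts above:

```rust
// Hypothetical helper: count "relay server event" lines (this node relayed
// traffic for someone) vs "relay client event" lines (someone relayed for
// this node). Marker strings taken from the log excerpts above.

fn relay_counts(log: &str) -> (usize, usize) {
    let server = log.lines().filter(|l| l.contains("relay server event")).count();
    let client = log.lines().filter(|l| l.contains("relay client event")).count();
    (server, client)
}

fn main() {
    let sample = "\
[..] INFO sn_networking::event::swarm] relay server event event=CircuitReqAccepted { .. }
[..] INFO sn_networking::event::swarm] relay client event event=OutboundCircuitEstablished { .. }";
    let (server, client) = relay_counts(sample);
    assert_eq!((server, client), (1, 1));
    // No server events at all would suggest the node never acted as a relay.
    println!("server events: {server}, client events: {client}");
}
```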

I don’t see any harm in running in this way so I’ll continue just in case but it would be good to know.

Interestingly, the network activity looks slightly different. This is the graph for the last 24 hours on the interface the RPi4 is on with 10 nodes:-

cutover from 10 nodes in --home-network mode to 10 nodes in --node-port mode was at 1200.

So the bandwidth used looks slightly higher on average, but with fewer peaks, which is actually better because it’s the peaks that are the problem for other things on the network. I should be able to push to more nodes. Unless something on the network changed around then.

But it would be good to know if they are working as relays and if not fix it.

Or is it that my home nodes are eligible to be relays, but there is some kind of selection process and they are just not very attractive, whereas the AWS Instance is?

3 Likes