Don't waste HQ resources

I beg your pardon if this idea has already been presented, but I couldn't find it elsewhere.
…And sorry for my English.

I worry about resources being wasted. I mean very good bandwidth, very fast storage, top-end CPU power, etc…

It would be nice if users could choose to buy a quality level of resources.

  • If I just want some disk space to store and secure a few GB of
    photos, I choose to buy REGULAR GB (ping OK, medium bandwidth) → I pay the basic Safecoin fees

  • If I need some space to store a small website or videos, and I know I will have to stream them on a daily basis, I choose to buy FAST GB (good ping, high bandwidth) → I pay the basic Safecoin fees × Bonus%

  • If I need some space to store a highly frequented website with streaming, I choose to buy VERY FAST GB (very low ping, very high bandwidth) → I pay the price: the basic Safecoin fees × BIGBonus%

The scarcer these resources are on the network → the higher the bonus to pay.

This way it is possible to regulate demand and spare the very good resources for those who need them.
I know about the cache system, but I don't think it is enough to deal with this problem.
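To make the idea concrete, here is a minimal sketch of how such scarcity-based tier pricing could work. The tier names, bonus multipliers, and scarcity formula are all hypothetical illustrations of the proposal above, not anything from the actual SAFE Network design:

```python
# Hypothetical sketch: scarcity-based pricing for resource quality tiers.
# BASE_FEE, the tier names, and the bonus formula are illustrative only.

BASE_FEE = 1.0  # basic Safecoin fee per GB (assumed unit)

# Each tier has a base bonus multiplier applied on top of the basic fee.
TIER_BONUS = {
    "REGULAR": 0.0,    # ping OK, medium bandwidth -> no bonus
    "FAST": 0.25,      # good ping, high bandwidth
    "VERY_FAST": 1.0,  # very low ping, very high bandwidth
}

def fee_per_gb(tier: str, scarcity: float) -> float:
    """Fee per GB for a tier: the scarcer that tier's resources, the higher the bonus.

    scarcity is the fraction of the tier's capacity already in use (0..1).
    """
    bonus = TIER_BONUS[tier] * (1.0 + scarcity)
    return BASE_FEE * (1.0 + bonus)

print(fee_per_gb("REGULAR", 0.9))    # 1.0 -- regular storage stays at the basic fee
print(fee_per_gb("FAST", 0.2))       # 1.3 -- plentiful fast storage, small bonus
print(fee_per_gb("VERY_FAST", 0.8))  # 2.8 -- scarce top-end storage, big bonus
```

The point is only that the bonus grows with scarcity, so demand for the top-end resources regulates itself.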

I think no one will know how well the latency-improving features of the network will perform until tests have been run at scale.

Your idea is an interesting one; however, my own personal hope is that the network performs sufficiently well for everyone that tiered services are not required. Time will tell, though!


Tiered performance sounds like it might go against the decentralisation and empowerment delivered by SAFE.

A great feature IMO is that anyone can create a product (website, app, etc.) on SAFE and, if it is valued, it automatically scales without the need for investment in infrastructure. This means anyone can build a SAFE Facebook or Twitter (that respects privacy) without the need for VC investment, which means no need to exploit us to pay back investors.

That’s a win on two counts:

  • anyone can build the next big thing, with a near-zero cost of entry
  • users can receive services at very low cost, just paying for the resources they use (and not with their privacy), and without having to fund share dividends for investors or pay for corporate cloud overheads

I just think… what a waste if I just want to store 2 TB of bulk data that I use once or twice a year, and it is located on powerful computers with a fibre connection! I understand it is quite random right now. Am I right? Is it really random?

It is clever, IMHO, to locate that data on regular computers and connections and leave the top-end stuff to those who need it.

It is unnecessary, complicated and unjustifiable.

You are not paying for bandwidth, and neither is the vault holder paid for it. You cannot "save" him any cost by downloading more slowly.

No, you don't, because if your web server is a gateway, it will cache any data that is actually frequently requested (and so will the caching layer), so paying more per GB wouldn't do anything.
If people are accessing data natively, they will most likely get it from caching nodes.

No, you don’t, because frequently accessed data will be cached by the intermediate nodes.

Any native apps will get the CDN effect from the caching layer.
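To illustrate why intermediate caching already gives that CDN-like effect, here is a minimal sketch of a bounded LRU cache at a relay node: hot chunks are served from the cache on repeat requests instead of being fetched from the vault each time. The `RelayCache` class and its names are hypothetical, not SAFE APIs:

```python
# Hypothetical sketch: an LRU cache at an intermediate (relay) node.
# Frequently requested chunks are served locally; cold chunks are evicted.
from collections import OrderedDict

class RelayCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self._store = OrderedDict()  # chunk_id -> data, in recency order
        self.hits = 0
        self.misses = 0

    def get(self, chunk_id, fetch_from_vault):
        if chunk_id in self._store:
            self.hits += 1
            self._store.move_to_end(chunk_id)  # mark as recently used
            return self._store[chunk_id]
        self.misses += 1
        data = fetch_from_vault(chunk_id)      # only on a miss
        self._store[chunk_id] = data
        if len(self._store) > self.capacity:
            self._store.popitem(last=False)    # evict least recently used
        return data

cache = RelayCache(capacity=2)
vault = {"a": b"A", "b": b"B", "c": b"C"}
for cid in ["a", "b", "a", "a", "c", "a"]:
    cache.get(cid, vault.__getitem__)
print(cache.hits, cache.misses)  # 3 3 -- the hot chunk "a" is mostly served from cache
```

The more often a chunk is requested, the more likely it is to sit in a cache close to the requester, which is exactly why paying extra per GB for "fast" storage at the vault would buy little for popular data.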

Tiering and QoS would be nice to have. At this stage, though, they are not essential and IMO feature creep should be kept in check until v1.0 is stable.


Hear! Hear!