TEST5 (AKA destructive test) is now running

A spider with a dictionary (my kind of spider) would have no trouble guessing some of those. But we’re talking about a spider that is so “simple” it wouldn’t make it up a drain pipe, never mind being washed down in the recent rain. :smile:

Oh my omnicoin! Who is running the Quake 3 safenet?!?!?

Actually, I do see the potential of a true search engine even at this early stage, and I have some ideas for implementing it. I think Safe-FS is missing the boat by focussing on files.


While we wait for yours, mine will have to do.

I expect it might be @Infining, based on a post suggesting it.

That will come after my messaging and game service, which might be ready for the next testnet.


The network seems to be quiet right now. For hours I have had just about 4.5 KB/sec upload and download speeds when I run the vault. This seems like the average data usage to keep the network alive with heartbeats and routing-table updates. Quite nice actually; I think most nodes (no matter how small) should at least be able to support the network and its security in this way.

My vault only has 13 MB stored now, in a folder called “chunk store”, which it started filling this morning. If in the next tests we’re allowed to store more data (like 1000 puts :heart_eyes:) we’ll probably see more activity as people start to store videos on their safenet sites to share with others.


@davidpbrown and @bluebird, such a spider, how is this done?

Trying to wrap my head around this:

  • To paraphrase someone here, once I log into the SAFE Network, I’m in a self encrypted decentralized colony of vaults
  • If it was available I could do “safeclient.exe dnsnames”?
  • Look in their public folder for files?
  • Pick out the ones with an html extension?
  • Look inside using one of the more or less established ways of filtering for, in this case, standard templates?

All this, of course, because I wonder how a search engine would work, itself decentralized at that.

  • To paraphrase someone here, once I log into the SAFE Network, I’m in a self encrypted decentralized colony of vaults

Websites are publicly accessible, which is made easier if you know they are there. If you choose a bitcoin-like address and do not declare it, then likely no one will ever find it. Anything simpler can be guessed. The best signal is a suggestion that a site exists. Over time the list of declared sites grows, and crawling those for links will throw up more.

And obviously private data is not public, so unless you share access to it, there is no route to finding it.

  • If it was available I could do “safeclient.exe dnsnames”?

There’s no access to a DNS address book that I’m aware of. The test messaging might contain that detail for dev purposes atm, but it’s unlikely all URLs will be available like that. My script remembers URLs from past testnets and retries them, which helps a lot.
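For what it’s worth, the “remember and retry” step needn’t be anything fancy; a hypothetical sketch (file names and URLs here are made up, not from my actual script) of merging newly found URLs into a persistent list:

```shell
# Hypothetical sketch: merge newly found safenet URLs into a persistent
# list so they can be retried on the next testnet. All names are examples.
printf '%s\n' http://www.bluebird.safenet/ http://www.demo.safenet/ > /tmp/known_urls.txt
printf '%s\n' http://www.demo.safenet/ http://www.newsite.safenet/ > /tmp/found_urls.txt

# sort -u deduplicates, so a URL refound on a later testnet appears once.
sort -u /tmp/known_urls.txt /tmp/found_urls.txt > /tmp/merged_urls.txt
cat /tmp/merged_urls.txt
```

The merged file then becomes the starting guess list for the next testnet.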

  • Look in their public folder for files?

There’s no reason to expect that will be possible; the intent is not for all public data to be easily found, it’s just accessible if you know it’s there.

  • Pick out the ones with an html extension?

Spiders can trivially extract links from webpages, and most websites have at least index.html… but that’s not even necessary: the request goes to the domain name, which serves up whatever page it considers the default, which atm (perhaps because SAFE has no server-side functions) is index.html. Links from that obviously suggest depth, but any orphan pages will remain hidden until made apparent.
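For the curious, link extraction from a static page really is trivial with grep and sed; a hypothetical sketch (the page content is a made-up stand-in for a fetched index.html):

```shell
# Hypothetical sketch: pull href targets out of a fetched page.
# The page below is a stand-in; a real spider would wget it first.
cat > /tmp/index.html <<'EOF'
<html><body>
<a href="about.html">About</a>
<a href="http://www.other.safenet/page.html">Other</a>
</body></html>
EOF

# Crude, but enough for simple static pages with double-quoted hrefs.
grep -o 'href="[^"]*"' /tmp/index.html | sed 's/^href="//; s/"$//'
```

Each extracted link then goes back onto the fetch queue, which is the whole crawl loop in miniature.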

  • Look inside using one of the more or less established ways of filtering for in this case, standard templates?

Yes, looking inside the index.html shows whether it is the standard default template. That changed for Test 4, but it’s still trivial to match against.
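A hypothetical sketch of that template match (the marker string here is a placeholder I made up, not the real template text):

```shell
# Hypothetical sketch: flag sites still serving an unmodified default
# template. MARKER is a placeholder; the real template text differs.
cat > /tmp/fetched.html <<'EOF'
<html><head><title>Example default page</title></head></html>
EOF
MARKER='<title>Example default page</title>'
if grep -qF "$MARKER" /tmp/fetched.html; then
  echo "default template"
else
  echo "customised site"
fi
```

Using `grep -F` (fixed string) avoids the template’s HTML being misread as a regex.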

So, my approach is a mix of noting URLs suggested in the forum, coupled with healthy doses of sed, grep, and guesswork.

The practical bit is using wget with http_proxy set, which fetches pages where they exist. A long list takes a while to run; so the trick is not to guess the entire dictionary of all possible Subdomains.PublicID.safenet … though I wonder if @bluebird will disagree for the hell of it :imp:
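A hypothetical sketch of the guessing side, expanding a small word list into candidate safenet URLs rather than the whole dictionary (the words are examples, not my actual list):

```shell
# Hypothetical sketch: expand a short word list into candidate safenet
# URLs instead of guessing the full dictionary. Words are examples.
printf '%s\n' blog home demo > /tmp/guesses.txt
while read -r id; do
  echo "http://www.${id}.safenet/"
done < /tmp/guesses.txt > /tmp/candidates.txt
cat /tmp/candidates.txt
```

Each candidate could then be fetched with something like `http_proxy=<your local proxy> wget -q -O - "$url"`, keeping whichever return a page; the proxy address depends on your local launcher setup, so that part is an assumption.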

LOL. I never disagree “for the hell of it”, only because I happen to disagree. Whether I disagree never takes account of social pressure, so to conformists it might indeed seem just as you frame it.

Thanks for the description of your spider, which is as I suspected, minus the tilting of the table against “overly simple” sites. :slight_smile:

My proposed spider also builds from what it knows, but in addition resorts to dictionary attempts, and in so doing is bound to find a few that haven’t been published. Such a dictionary would include the handle of anyone who has posted on the forum.

My simple spider, when asked, suggested that it already thought of that. What else have you got?..

Oh OK, but you didn’t mention that in your reply. Not being telepathic, I thought it might be original with me. :wink: The “what else” I’ve got is a dictionary of the commonest English words and first names. A few thousand of those shouldn’t take too long to search.

I would use @cretz’s go-safeclient, for example, scripting systematic attempts to register DNS names and saving the ones that report the name is already in use.

It’s been done…

There is an exercise where you get a room full of people to write down words associated with a given few words, then pool them to see who is the most creative in the room… the obvious jumps in thought are the ones that many people make.

:open_mouth: That’s an idea I won’t be pursuing, as it would be disruptive. You don’t need to register PublicIDs to test them. I’ll let you wonder a while about what more I do for that.

By whom? You? The context is the people in this conversation and their spiders.

As to disruptive: we’re talking about test networks that will be destroyed in a week or so. Also, the registration attempts will: 1. exclude all known handles, and 2. be late in the test, to give everyone a chance to put their stuff up. Then everything gets saved for future tests. I see no particular problem.

EDIT: Here’s another URL pattern to search: known handles with a short number appended, e.g. bluebird1 and so on.
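That suffix idea is easy to script; a hypothetical sketch (the handles and the suffix range 1–3 are just examples):

```shell
# Hypothetical sketch: expand known forum handles with short numeric
# suffixes (bluebird -> bluebird1, bluebird2, ...) to widen the guesses.
printf '%s\n' bluebird whiteoutmashups > /tmp/handles.txt
while read -r h; do
  echo "$h"
  for n in 1 2 3; do
    echo "${h}${n}"
  done
done < /tmp/handles.txt > /tmp/guesslist.txt
cat /tmp/guesslist.txt
```

The output feeds straight into whatever fetch-or-register loop is being used.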

Given the context, obviously yes.

Little point in something that’s not sustainable. The effort to register a large subset of potential PublicIDs, just to resolve some not apparent through other means, on the off chance those names would be used again, would exceed the benefit… and would be disruptive for latecomers wanting to register their preferred name.

I’ve done that where it’s obvious the host has done that before… so bluebird3 is on the list, as is whiteoutmashups4, but doing it for everyone just doubles or triples the overhead.

What I’ve not done is take the top 1 million clearnet URLs and map those as test data, because it’s unlikely those would find hits; but that might occur in future, especially if SAFE deliberately lets those owners grab their safenet address for a fee.

Wow, you’re so clever!! :clap: :hugging: :wink: What I suggest, it just so happens (we find out after the fact) you have already done or thought of!

But anyway… you asked so I answered.

Whether or not to attempt registrations comes down to what is faster, since, as I mentioned, the disposability of the networks, plus doing it late in a test, means it is no particular disruption. The user base will be aggregated rather than replaced, and people can be expected to keep naming things as they are accustomed to, so building such a database is a useful base for kickstarting a future search engine, whatever algorithms it uses.


The embryo is being tested against teratogens in the womb. Quickening. The dream is finally being tested.

Nobody here, I hope, wants it to have an arm growing out of its forehead.


Other way round.

If you can do better, knock yourself out.


Actually, the order of events seems pretty clear: 1. you and your spider challenged me, and 2. I had said nothing about my spider until asked.

and where did 1. occur?

You opened with

:rolling_eyes:
