Firefox Plugin Browser Leaks, SAFE Browser

The Firefox plugin is an interesting idea, yet there is a major possibility that the masses will use the browser in non-SAFE land and that cookies, etc. will leak information about which SAFE sites people have visited.

I think it necessary to create a SAFE browser, similar to the Tor Browser, that mitigates the possibility of leaking information about activities that happen within the confines of the SAFE network, i.e. what happens in the SAFE network stays in the SAFE network. Such a browser should also not have the possibility of leaking identifiable information, nor have data reporting, telemetry, WebRTC, speculative preconnections, or other so-called ‘features’. We need a ‘dumb browser’ that is safe to use. It should remove the ability of JavaScript, etc. to hijack browser functionality. There is plenty of browser code to draw on. A complementary web server might also be useful in the future.


So let’s spend 1 million bucks (that could be used for SAFE development) to save the stupid…

If anyone wants to go along with this idea I propose that the plugin blocks off everything except .safenet.

function FindProxyForURL(url, host) {
    // Only .safenet hosts go through the SAFE launcher proxy
    if (shExpMatch(host, "*.safenet"))
        return "PROXY localhost:8101";
    return "PROXY 127.0.0.1:1"; // dead-end proxy instead of DIRECT, so nothing else leaks
}

We’re just missing a big green button that says “Click here to enable Internet and SAFE access (WARNING: lower security!) for additional convenience” so the masses can instantly remove any protection and catch up with the latest FB happenings in another tab.


This was discussed a while back: SAFEspace browser

Not discussed lately, so any ideas you have would be welcome


Such a browser should ignore external requests.

( btw “RequestPolicy” for firefox / icecat does just that for the regular web )

While I do agree with your point of view, some users may not care about privacy or anonymity and would happily browse SAFE sites with a leaky thing like Firefox or Chrome.

I think I read on this forum that a safe browser, as you describe it, is a project that several people have already envisioned. In my opinion it is a necessary companion to the network/protocol so that the whole thing makes sense. I would use it every day.


That’s not a knife

Could one… create a proxy circuit or something for each tab? That is, one tab could be for SAFE and the next could be for the regular internet, and neither would know about the other. Every time you loaded a new web page it would be isolated and modular. Is that possible?

Such a browser should also not have the possibility of leaking identifiable information, nor have data reporting, telemetry, WebRTC, speculative preconnections, or other so-called ‘features’. We need a ‘dumb browser’ that is safe to use. It should remove the ability of JavaScript, etc. to hijack browser functionality

Could we elaborate on this idea? I am looking at the source of NetSurf, trying to find what could be stripped from it to achieve a really dumb browser. It would parse and display HTML, text and images, and follow safe links. NetSurf is all C and easy to read; it looks to me like a good candidate for an experimental fork.

Not pretending this would be THE safe browser, just an experiment in which I have personal interest.

I am already thinking about removing cookie support, the referrer header, external requests and JavaScript, and of course only allowing .safenet pages through launcher proxying.

History, bookmarks, caching if any, user preferences… should go into the network instead of locally

What else ?
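For a Firefox/IceCat-based experiment, much of that list can be switched off with preferences alone. A rough user.js sketch along those lines (the pref names are Firefox ones I believe existed at the time, and the PAC file path is a placeholder; treat both as assumptions to verify against the target fork):

```javascript
// user.js sketch: lock down a Firefox/IceCat fork for SAFE-only browsing
user_pref("network.proxy.type", 2);                      // use a PAC file
user_pref("network.proxy.autoconfig_url", "file:///path/to/safe.pac"); // placeholder path
user_pref("network.cookie.cookieBehavior", 2);           // block all cookies
user_pref("network.http.sendRefererHeader", 0);          // never send Referer
user_pref("javascript.enabled", false);                  // no JavaScript
user_pref("media.peerconnection.enabled", false);        // no WebRTC
user_pref("toolkit.telemetry.enabled", false);           // no telemetry
user_pref("network.prefetch-next", false);               // no link prefetch
user_pref("network.dns.disablePrefetch", true);          // no DNS prefetch
user_pref("network.predictor.enabled", false);           // no speculative connections
user_pref("network.http.speculative-parallel-limit", 0); // no speculative preconnections
user_pref("browser.cache.disk.enable", false);           // no disk cache
```

The PAC file it points at would be the one proposed earlier in the thread, routing *.safenet to the launcher proxy and dead-ending everything else.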

Is Servo, written in Rust, headed in the right direction?


Janitor be nice! :point_left:


Go to this address: http://prueba.piluso.safenet/
and tell me that it doesn’t freak you out to see your ip address.

Check out Brave by Brendan Eich, it’s pretty sweet and you may see more of it very soon :wink:


Cheers, I have Brave installed on Win64 and iOS. Why are you keen on Brave ahead of the Rust-based Servo, given your devs would be comfortable with the language now?


For the short term, simplicity. Longer term I think it may be different though. Front-end code is way more flexible than back-end or systems code, so we can use much more of what’s available now for much faster development of front-end apps. Rust is moving fast in that area as well now though.


Could you comment on how search might get implemented in the network. Is that likely to come from the research community or maybe something that can be worked out internally?

I have been so far away from the front end and its capabilities that I am now a bit in the dark. I have been stuck in some real low-level stuff I never wanted to get involved in, so I am a bit rusty just now with all the capabilities (tough year).

Search from a decentralised perspective, though, can make use of Lucene-type indexing, with the indexes using Immutable Data. This has to be arranged in a manner where it can append information as more links and data are found. As data can be referenced via a datamap hash, the end data will always be found (like a Wayback Machine built into the network).

So the research part, if you like, will be fitting this into a type of binary tree/directed graph where leaves can be updated to reflect new information that fits that part of the object (graph).

So it is very possible, and a few whiteboard sessions could produce a crude first attempt at this. If done correctly, and without doubt in the open, then I think we can have a decent, albeit crude, search.

More interesting though will be the use of deep learning algorithms to provide more than search, so more like wolfram type question engines with larger data sets. I spoke briefly with Wolfram and I am sure there is a very long conversation to be had there, with the new data structures available in SAFE as well as their capabilities in domain specific languages in this field.


I’m guessing search would provide a revenue stream that Maidsafe or the foundation would want to have a crack at.

So it’s front end via API and not in the core?


I actually just created a thread about the security bugs currently present in the SAFE browser plugin.

The concerns you mentioned are valid and reflect the concerns I raised in my thread.

Is it reasonable to think we can remove all avenues of attack? Limiting the SAFE web that heavily would make it worse than the current web, so it would never catch on for browser games or real-time WebRTC networking apps.

How could we possibly better secure the enormous environment that is JavaScript? It sounds like you’re talking about rewriting the browser’s entire JavaScript engine, but that would introduce new bugs anyway.

I think, to start with, a SAFE browser should have most features off by default but let the user turn them on with a single click. Currently, WebRTC and the other unnecessary protocols you mentioned are turned on by default and can be used by attackers.

You summed it up pretty nicely with this:

It’s obvious a SAFE browser is inevitable. I would not be surprised if @dirvine was working on that now. Perhaps replacing the Tor browser bundle with the SAFE browser bundle.


Yes I suspect it will be.


A browser that gives the end user total control over the way the interface functions. No pop-ups, no modals, no access to scrolling for anyone but the end user, no reloading, no access to back and forward, no access to volume control or autoplay.

I’m not sure how the full feature set of the current internet can be replicated on the SAFE network if everyone is using browsers that restrict its usage to that of the late 90s.
