
Cross-Domain AJAX

When making an XMLHttpRequest from a website, the browser restricts you to the domain the script came from. This is a security precaution: if sites could tell the browser to make requests to other domains, they could DDoS a site using its users' browsers. There are legitimate reasons to make requests to other sites, though.

Many sites offer web services, XML data and JSON-encoded data. These can provide almost anything from the weather, to search results, to advanced APIs. To use these services from your site with JavaScript you'll have to employ one of the methods below.
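As a baseline, this is the kind of same-origin request the restriction applies to (the endpoint URL and response fields here are hypothetical):

```javascript
// Fetch a JSON resource with XMLHttpRequest. Under the same-origin
// policy this only works when the URL is on the domain the page itself
// was loaded from.
function getJSON(url, callback) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", url, true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      callback(JSON.parse(xhr.responseText));
    }
  };
  xhr.send(null);
}

// Hypothetical usage against a service on our own domain:
// getJSON("/weather.json", function (data) { alert(data.temperature); });
```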

Signing Javascript

Firefox allows you to sign your JavaScript and place it in a JAR file, which gives your code extra privileges. You can also request these permissions explicitly without having your code signed, but having a dialog box appear for every AJAX request could get very tiring for the user. Another problem with this approach is that it isn't documented very well and it's Firefox-specific. The first link in the references section deals with this method.
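A sketch of the explicit-permission variant (Firefox-only, and long since removed from modern Firefox; the URL is hypothetical):

```javascript
// Firefox-only sketch: request the cross-domain privilege explicitly.
// Unless the script is signed, Firefox shows a confirmation dialog
// every time this runs.
function privilegedRequest(url, callback) {
  // The privilege must be requested in the same function that performs
  // the privileged operation.
  netscape.security.PrivilegeManager.enablePrivilege("UniversalBrowserRead");
  var xhr = new XMLHttpRequest();
  xhr.open("GET", url, true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4) callback(xhr.responseText);
  };
  xhr.send(null);
}
```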

Access-Control Headers

This is the W3C-approved method of allowing a client from another domain to access your web service. It is a server-side method and requires no changes on the client, which is both an advantage and a disadvantage: if you have control over the server this method is simple, but for services you don't control (such as the Yahoo APIs or other public services) you will not be able to implement it. It should also be noted that this first appeared in Firefox 3.5, so it can't be used with earlier versions or other browsers.

To use this method, your service outputs extra headers that tell the browser whether cross-domain access is allowed or denied.
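A minimal sketch of the idea, assuming a whitelist of allowed origins (the helper function and origin values are hypothetical; the header names are the real ones from the specification):

```javascript
// Decide which Access-Control headers to attach to a response, given
// the Origin the browser sent and a whitelist we control server-side.
function corsHeaders(requestOrigin, allowedOrigins) {
  // Only echo the origin back if it is on our whitelist; without the
  // header, the browser refuses to expose the response to the script.
  if (allowedOrigins.indexOf(requestOrigin) !== -1) {
    return {
      "Access-Control-Allow-Origin": requestOrigin,
      "Access-Control-Allow-Methods": "GET, POST"
    };
  }
  return {}; // no header: cross-domain access denied
}
```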

Flash-Enabled XMLHttpRequest

This method uses an invisible Flash movie to perform the actual request, then hands the result back to JavaScript for processing. Flash still performs its own permission check by looking for a /crossdomain.xml file in the root directory of the domain the request is being made to. Several libraries implement this approach, and a few even do so in a way that is compatible with XMLHttpRequest. One downside is that Flash is required, though most browsers have it installed since several major sites depend on it.
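For reference, a minimal /crossdomain.xml granting access to one domain looks like this (the domain is a placeholder):

```xml
<?xml version="1.0"?>
<cross-domain-policy>
  <!-- allow Flash movies served from example.com to read this domain -->
  <allow-access-from domain="example.com"/>
</cross-domain-policy>
```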

Add Sites To Trusted Zone

Internet Explorer allows or denies cross-domain XMLHttpRequests based on its security zone settings. This approach is unlikely to be used on the public Internet, as it requires user interaction and is Internet Explorer-specific. On a corporate intranet it is slightly less awkward, but not by much.

Apache mod_proxy

With this method you use the same server you served the page from to proxy requests on to the server holding the data you're fetching. For this to work your version of Apache has to be compiled with proxy support, or you need to have the mod_proxy DSO loaded. This method increases the latency of requests, as they must first go via your server. It should also be noted that this cannot be configured from a .htaccess file and must be done in the main configuration.
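As a sketch, the main server config might map a local path onto the remote service like this (the paths and hostname are hypothetical):

```apache
# Forward /weather/ on our own domain to the remote data service.
# ProxyRequests stays off: we want a reverse proxy, not an open proxy.
ProxyRequests Off
ProxyPass /weather/ http://api.example.com/weather/
ProxyPassReverse /weather/ http://api.example.com/weather/
```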

Manual Proxy

If you don't have control over your server's configuration, you can mimic the above method by writing a script that forwards the required variables to the remote service and hands back the data. This approach can even be preferable, as it lets you preprocess the variables and cache the data if required.

References

http://www.mozilla.org/projects/security/components/signed-scripts.html

http://dev.w3.org/2006/waf/access-control/

http://developer.yahoo.com/javascript/howto-proxy.html

https://developer.mozilla.org/En/HTTP_Access_Control

http://ejohn.org/blog/cross-site-xmlhttprequest/

http://ajaxpatterns.org/XMLHttpRequest_Call

http://ajaxpatterns.org/Flash-enabled_XHR

Random Thought: Can you use AJAX to make web applications cleaner?

Microsoft Word Banned

Judge Leonard Davis of the U.S. District Court for the Eastern District of Texas has ordered an injunction requiring that Microsoft stop selling Microsoft Word in the U.S. by the 12th of October. Less importantly, he also fined them $240 million. What company could bring Microsoft's office suite to its knees? A "Collaborative XML Content Company" called i4i. They applied for the patent in question in 1994 and it was granted in 1998. The versions of Word claimed to be in breach of the patent are Word 2003 and Word 2007.

Many people have been commenting on how nice it is to see Microsoft taking a dose of its own medicine, but I disagree. Sure, seeing Microsoft hit with a massive fine and an explosive ultimatum is pleasing, but the subject matter isn't. Patents were originally made to help the small guy, to aid innovation, and so that inventors could reveal their inventions without fear of having their ideas stolen and used to make money. Increasingly, though, patents are becoming weapons for large corporations to battle with. Large companies now apply for millions upon millions of patents in order to sue later, when another company unavoidably infringes on them.

The problem has gotten to the point where even looking through patents before you build something can make you more vulnerable to legal action. Worse, it is showing no signs of slowing down. Even companies that have publicly opposed software patents are being forced to build their own patent portfolios so they can defend themselves.

I am not alone in my view here; public opinion is beginning to turn. From colleagues and fellow bloggers to the EFF, major news outlets, and even large companies, almost everyone is beginning to see the flaws in the system.

Random Thought: If Firefox was going to set the web alight, what is Google Chrome trying to do?