Last night, while working on my Silicon Graphics Origin 300 and suffering through an old version of Mozilla circa 2005, as seen below:

[caption id="attachment_896" align="aligncenter" width="300" caption="Mozilla 1.7.12 on IRIX"][/caption]

I started wondering: these machines can function as a web server, MySQL server, firewall, etc., especially my quad-R14k Origin 300, yet web browsing on them is seriously lacking. Firefox 2 is available over at nekoware, but it is painfully slow. Granted, I don't use my Origin for web browsing, but when I was using an R12k 400MHz Octane as my primary machine a few years ago, as I am sure others around the world still are, it was painful. I don't think this problem is limited to those on EOL'd Silicon Graphics machines; it applies to any older piece of hardware that does everything but web browsing decently.

Thinking back to the Amazon Silk platform: using less powerful hardware but a brilliant software platform, Amazon is able to deliver more with less. The problem for the rest of the market is the sheer diversity of PCs and workstations. The way I see it, you've got two approaches to a "universal" cloud web renderer. You could either:
  1. Write a custom lightweight browser tied to an external WCF/SOAP Web Service
  2. Write a packet filter/inspector for each platform to intercept requests and return them from a WCF/SOAP service, either through Firefox extensions or a lower-level implementation, almost like a mini-proxy
Plan A has major problems because you've got various incarnations of Linux, IRIX, Solaris, VMS, Windows, etc., all with various levels of Java and .NET/Mono support (if any), so a Java or .NET/Mono implementation is probably not the right choice. Thus you're left trying to write a portable C/C++ application. To cut down on work, I'd probably use a platform-independent library like gSOAP to handle the web service calls, but either way the amount of work would be considerable. Plan B is nothing like anything I've done before, but I would imagine it would be a lot less work than Plan A.

I spent two hours this morning playing around with a WCF service and a WPF application doing something along the lines of Plan A:

[caption id="attachment_897" align="aligncenter" width="300" caption="jcW3CLOUD in action"][/caption]

But instead of writing my own browser, I simply used the WebBrowser control, which is just Internet Explorer. The Web Service itself is simply:

[csharp]
public JCW3CLOUDPage renderPage(string URL)
{
    using (WebClient wc = new WebClient())
    {
        JCW3CLOUDPage page = new JCW3CLOUDPage();

        // Assume HTTP if the client didn't supply a scheme
        if (!URL.StartsWith("http://"))
        {
            URL = "http://" + URL;
        }

        // Pull the page down server-side and hand the raw HTML back
        page.HTML = wc.DownloadString(URL);

        return page;
    }
}
[/csharp]

It simply makes a web request based on the URL from the client, converts the HTML page to a String object, and passes it into a JCW3CLOUDPage object (which would also contain images, although I did not implement image support). Client side (ignoring the WPF UI code):

[csharp]
private JCW3CLOUDReference.JCW3CLOUDClient _client = new JCW3CLOUDReference.JCW3CLOUDClient();

var page = _client.renderPage(url);
int request = getUniqueID();

// Dump the returned HTML to a temporary file the WebBrowser control can load
StreamWriter sw = new StreamWriter(System.AppDomain.CurrentDomain.BaseDirectory + request + ".html");
sw.Write(page.HTML);
sw.Close();

wbMain.Navigate(System.AppDomain.CurrentDomain.BaseDirectory + request + ".html");
[/csharp]

It simply makes the WCF request with the entered URL, takes the returned HTML, and writes it to a temporary HTML file for the WebBrowser control to read from. Nothing special; you'd probably want to add handling for specific pages, images, and caching, but this was already more than I had set out to play with. Hopefully it'll help someone get started on something cool. Note that it does not handle requests from within the WebBrowser control, so you would need to override those as well; otherwise only the initial request is returned from the "Cloud", while subsequent requests are made normally.
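For reference, I haven't shown the contract definitions behind renderPage above, so here is a minimal sketch of what they might look like (the Images member is purely a guess at where the unimplemented image support could live):

[csharp]
using System.Collections.Generic;
using System.Runtime.Serialization;
using System.ServiceModel;

[DataContract]
public class JCW3CLOUDPage
{
    [DataMember]
    public string HTML { get; set; }

    // Hypothetical: unimplemented image support, keyed by original image URL
    [DataMember]
    public Dictionary<string, byte[]> Images { get; set; }
}

[ServiceContract]
public interface IJCW3CLOUDService
{
    [OperationContract]
    JCW3CLOUDPage renderPage(string URL);
}
[/csharp]

Since those subsequent requests are the obvious gap, here is a rough, untested sketch of what intercepting them might look like: handle the WPF WebBrowser control's Navigating event, cancel the normal request, and route it back through the service. It assumes the same _client field, getUniqueID() helper, and temp-file approach shown above, wired up with wbMain.Navigating += wbMain_Navigating;

[csharp]
private void wbMain_Navigating(object sender, System.Windows.Navigation.NavigatingCancelEventArgs e)
{
    // Let our own temp-file loads (and anything non-HTTP) through untouched
    if (e.Uri == null || !e.Uri.IsAbsoluteUri || e.Uri.IsFile)
        return;

    // Stop the normal request and fetch the page from the "Cloud" instead
    e.Cancel = true;

    var page = _client.renderPage(e.Uri.ToString());
    string path = System.AppDomain.CurrentDomain.BaseDirectory + getUniqueID() + ".html";
    System.IO.File.WriteAllText(path, page.HTML);

    // Navigating from inside the handler can be re-entrant, so queue it instead
    Dispatcher.BeginInvoke(new Action(() => wbMain.Navigate(path)));
}
[/csharp]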
This project would be way too much for me to take on by myself, but it did bring up some interesting thoughts:
  1. For cloud-based rendering, would keeping images/CSS/etc. stored locally and doing modified-date checks on every request be faster than simply pulling down each request in full? (A conditional-GET sketch follows this list.)
  2. Would the extra costs incurred from the 3G/4G providers make it worthwhile?
  3. Would the savings from zipping content on one end and unzipping it on the other outweigh the processing time on both ends, especially if there was very limited space on the client? (A GZipStream sketch also follows below.)
  4. Is there really a need/want for such a product? Who would fund such a project, and would it be open source?
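On the first question, a modified-date check maps naturally onto an HTTP conditional GET. A quick client-side sketch (FetchWithCache, cachePath, and lastFetched are hypothetical names, not part of the project above):

[csharp]
using System;
using System.IO;
using System.Net;

// Ask the server whether the resource changed since we last fetched it;
// in .NET a 304 Not Modified surfaces as a WebException
private byte[] FetchWithCache(string url, string cachePath, DateTime lastFetched)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    request.IfModifiedSince = lastFetched;

    try
    {
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        using (MemoryStream ms = new MemoryStream())
        {
            response.GetResponseStream().CopyTo(ms);
            byte[] data = ms.ToArray();
            File.WriteAllBytes(cachePath, data); // refresh the local copy
            return data;
        }
    }
    catch (WebException ex)
    {
        HttpWebResponse response = ex.Response as HttpWebResponse;
        if (response != null && response.StatusCode == HttpStatusCode.NotModified)
            return File.ReadAllBytes(cachePath); // unchanged, serve from disk
        throw;
    }
}
[/csharp]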
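On the third question, GZipStream from System.IO.Compression would be the obvious way to measure it; something like the following on each end would let you benchmark bytes saved against CPU time spent (again, just a sketch):

[csharp]
using System.IO;
using System.IO.Compression;
using System.Text;

// Server side: compress the rendered HTML before it crosses the wire
static byte[] Compress(string html)
{
    byte[] raw = Encoding.UTF8.GetBytes(html);
    using (MemoryStream ms = new MemoryStream())
    {
        using (GZipStream gz = new GZipStream(ms, CompressionMode.Compress))
        {
            gz.Write(raw, 0, raw.Length);
        }
        return ms.ToArray(); // read only after the GZipStream has been closed
    }
}

// Client side: decompress before handing the HTML to the WebBrowser control
static string Decompress(byte[] compressed)
{
    using (GZipStream gz = new GZipStream(new MemoryStream(compressed), CompressionMode.Decompress))
    using (StreamReader reader = new StreamReader(gz, Encoding.UTF8))
    {
        return reader.ReadToEnd();
    }
}
[/csharp]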