Cloud Computing: coming full circle

Google Operating System has a recent article about a presentation made by Google China’s President Dr. Kai-Fu Lee on cloud computing. Cloud computing is what is gradually making Google a powerful force on the planet. The idea is to reduce a user’s dependence on any one computer or device. All data and applications are stored “in the cloud”, so to speak. Everything is backed up on multiple servers and is easily accessible from any Internet-connected device. This means that you can use your data, and the applications to manipulate that data, from almost anywhere in the world (anywhere with a decently fast Internet connection, at least). It’s an extremely powerful idea, with numerous benefits both for users and for whoever supplies the software.

As the user, the most obvious benefit is that you can access your (often very important) data from many different places. Also important is that you can take advantage of the massive amount of powerful hardware and computing resources available in the cloud. You no longer need a 3GHz processor with 4GB of RAM just to edit that letter to your boss; you just need enough power to run a modern browser and handle a fast Internet connection. The result of having so much power is already obvious: any web-based search is much faster than any desktop search, thanks to far more hardware and to powerful algorithms that take advantage of that hardware (Google’s MapReduce comes to mind).

For the company providing the cloud services, the advantages are just as attractive. There aren’t a dozen different versions to support, with all the related compatibility issues. Once bugs are tracked down and fixed, the patches can be rolled out without having to bother the user. Google has been using this in its gradual upgrades to Gmail and Google Documents, and it has been working quite well. For every piece of software that the user sees, there is a lot more that is carefully hidden away. When you’re running a cloud, you have control over what the user sees as well as what is behind the scenes. With some care you can make sure that the two software systems remain essentially separate but still work well together (unless you don’t control the cloud software, but that’s a different matter entirely).
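To give a flavor of the MapReduce idea mentioned above, here is a toy word-count sketch in Python. This is just the programming model in miniature, nothing like Google’s actual distributed implementation: a map phase emits (word, 1) pairs, and a reduce phase sums the counts for each word.

```python
from collections import defaultdict

def map_phase(documents):
    # Emit a (word, 1) pair for every word in every document.
    for doc in documents:
        for word in doc.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    # Group the pairs by word and sum the counts.
    counts = defaultdict(int)
    for word, count in pairs:
        counts[word] += count
    return dict(counts)

docs = ["the cloud stores the data", "the cloud scales"]
print(reduce_phase(map_phase(docs)))
```

In the real system, the map and reduce phases run in parallel across thousands of machines, which is where the speed comes from; the model stays this simple.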

Cloud computing seems to be the wave of the future, and with ubiquitous broadband, I think it will be a very efficient way of utilizing powerful computing resources. At the same time, cloud computing reminds me of the old days of time-sharing (which I was born far too late to be a part of). Data and software lived on a large mainframe (later minicomputers), either at a company’s computer headquarters or on a college campus. To access them, users logged in from remote terminals, sometimes dialing in over the phone lines. Of course, in those days users shared processor time and memory on just one computer, and data transfer speeds were slow, even for the plain text that went around. But the user’s data and applications were still centrally located, accessible uniformly from any terminal connected to the system.

How different is this from today’s cloud computing? Well, very. Data speeds are higher, communication is available not just to terminals wired into the system but to any Internet-connected device, and there is far more computing power available to users. But the core concept is nevertheless essentially the same: data located in one place, but uniformly accessible from multiple separate locations. In many ways it’s like the saber-tooth: a design that has appeared again and again throughout evolution precisely because it is a good design. Cloud computing too is a good design, and it is going to stay with us for a long time to come.

Finally some real work

I’ve been enjoying college life so much for the last two months that blogging was completely out of my mind. But now that the excitement has died down a bit, I think it’s time I started blogging regularly. Though I’ve always been interested in computers and I’ve always loved programming, I never really had the chance to do any real work. Sure, I’ve done work for CS courses and tried my hand at making a personal website, but nothing really serious.

However, that’s changed, as a direct result of coming to college. One of the projects for one of my classes is to make a web biography of an alumnus of our college who went on to win a Nobel Prize. As it turns out, I’m the only one in the class (including the professor) who has even the faintest idea of how to go about setting up a website. So I get to be the de facto webmaster. I only started really playing around with HTML and CSS earlier this year, so it’s good to have a real project to work on. Though I don’t intend to be a web designer for a living, it is one of those skills that every self-respecting computer scientist should possess. The project is still in the early stages, but I already have a template designed, and I’ve asked for space on the college servers to put it up and do proper testing. I’ll post a link here once it’s online.

Secondly, I’ve started working with a professor who is the Director of Institutional Research. Part of my job involves routinely searching the websites of other colleges for all sorts of information. Of course, Google is my friend, but manually pointing Google at websites and typing in search terms is dull, repetitive work, the sort of work that we have computers to do. So I need some way to search all those websites and see the results in one go. I had initially thought about using a Python program to query Google and then process the results into something readable. Unfortunately, the only way to query the Google engine is by using JavaScript, preferably embedded in a web page. This means that I can either attempt to create my own search engine using Python, or learn JavaScript to build upon Google. Both would have been quite challenging and would have taught me a lot, but I decided to learn JavaScript. I wasn’t quite sure I was up to the task of writing a proper search algorithm, and though I certainly could have learnt what I needed to, that isn’t an investment I feel like making right now. And since we’re now living in the age of Web 2.0 and AJAX goodness, JavaScript will be a good skill to pick up.
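The kind of automation I had in mind can at least be sketched in a few lines of Python. The site names and search terms below are made-up placeholders; the sketch just shows how site-restricted search URLs could be generated in one go instead of being typed by hand, using the standard library’s URL-quoting helpers.

```python
from urllib.parse import quote_plus

def build_queries(sites, terms, base="https://www.google.com/search?q="):
    # Combine every (site, term) pair into a site-restricted search URL.
    urls = []
    for site in sites:
        for term in terms:
            query = quote_plus(f"site:{site} {term}")
            urls.append(base + query)
    return urls

# Hypothetical example inputs.
sites = ["example.edu", "college.example.org"]
terms = ["enrollment statistics", "tuition"]
for url in build_queries(sites, terms):
    print(url)
```

Fetching and parsing the result pages is the part that would need a real API (or a lot of scraping), which is exactly where plain Python ran into trouble for me.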

Two new projects and quite a bit to learn. And my normal classes along with that. Throw blogging regularly into the mix and things might get nasty, but I’m ready for it. I promised myself that I would get the most out of my college years and it makes sense to start early. See you all tomorrow.

5 Public Computer Safety Tips

Now that I’ve started college, I’ve had to learn to live with not having my own computer. Like many people starting college, I’ve had to rely on computers in computer labs scattered throughout campus. While you might be able to get work done on a public computer just as easily as you could on your personal machine, there are some crucial differences, the most important of which is security. The crux of the matter is this: public computers are used by many different people every day. This means that any data that is on the computer can be seen by a lot of different people. And since you are one of the people using the computer, you just might leave behind data that other people might exploit. This can be anything from a copy of tomorrow’s history paper to important user IDs and passwords stored in a browser’s cache. But there are a number of things (of varying complexity) that you can do to make your work on a public computer safer.

1. Always Log Off

On a public computer you’ll have to manually log in to lots of different services, such as instant messaging software, email accounts and social networks. All this means that you will have to input your username and password. Never opt to store the password if prompted, and always log out of everything. That way, the next person using the computer won’t have direct access to everything you logged into. If you have to log into your school or corporate network at any time, log out of that as well. If most of your work is online (and involves multiple logins), you might want to clear the browser’s cache and cookies once you’re done. This can get tedious, but it will keep you safe. I’ll deal with a way around the tedium later.

2. Never leave anything on the hard disk

You probably won’t be able to avoid storing things on the hard drive at one time or another, whether it’s stuff you download from the net, or files that you are creating as part of your work. However, you probably don’t want to leave your documents available for everyone to see. The easiest way to make sure you delete everything you’ve created is to make a separate folder for yourself as soon as you start work and save everything to that folder. Once you’re done, just delete the whole folder. Also, please remember to check the Trash or Recycle Bin and permanently delete things from there as well.

Another solution is to not keep anything on the hard drive in the first place. If your school or company gives you network disk space, learn how to access it and try to save directly to that disk space; that way there are no local copies to worry about. If you don’t have such space at your disposal, carry around a USB pen drive (they are quite affordable nowadays) and save directly to that.

3. Don’t carry out money transactions

Don’t buy, sell, or in any other way give out any financial information while you’re on a public computer. You have no idea what sort of software may have been installed on the computer you are using. Losing your Facebook password is one thing, giving away your banking PIN is quite another.

4. Carry your own software

If you know that you’re going to be using public computers for a long time, it is worthwhile to invest some time (and a little money) in getting software that you can carry around in your pocket. Many of us carry around documents on USB drives, but you can also carry around software such as Firefox, Thunderbird, Gaim and even antivirus and encryption tools. There are excellent packages of popular and useful programs which can easily be installed to and run from a USB drive. These programs have been modified to work for people on the go. For example, the portable version of Firefox is designed not to leave cookies or a browsing history on the host computer (but you can still install your favorite themes and extensions).

5. Sit in a corner

Not all the technology and security software in the world is going to stop someone from looking over your shoulder and seeing what you’re typing. Try to sit somewhere where it’s hard for anyone to peek over your shoulder. Some cyber-cafes also offer private cubicles for a slightly higher price. Of course, you don’t have much control over this (it depends largely on how the computers are placed), but it never hurts to be a little careful. Some libraries will let you borrow a laptop for a few hours; try to use these if possible (and go look for a cosy spot at the back of the room).

Is Google Reader getting ignored?

Google Reader has been one of the more erratic Google products. When it was first launched, it didn’t garner much attention. But its 2.0 release almost a year ago turned quite a few heads, and it now has quite a large user base. There have been a few improvements since then (like support for video and a widget for the Google Personalized Homepage), but the only major development was probably the reporting of subscriber numbers to services like FeedBurner. Even though Reader is now over two-and-a-half years old, there is still one major feature that it lacks: search. Almost all the other Google services have search, including much less used ones like Notebook. A few hours ago, Google rolled out an updated user interface, including a bar on top linking the various Google services. Once again, Reader is one of the services that does not have this update. One would think that, given Reader’s substantial user base, Reader development would be one of the top priorities, but from the looks of it, Google Reader is getting ignored. With all the investment that Google is making in new services and software, is it too much to ask for a way to search through our feeds?

What Silverlight Means For You

If you keep tabs on the world of Web 2.0, then you’ll have heard something about Microsoft’s newest offering, Silverlight. Silverlight is an outstanding piece of technical wizardry, with even long-time Microsoft critics admitting that it is a very good product. But while the technical people and the application developers may be very happy about it, what does Silverlight mean for the tech-savvy Web 2.0 user who isn’t a developer, but simply a user? Right now, not much. However, given time and sustained interest in the new platform, it could mean a lot. Let’s take a look at what might come of Silverlight.

More Variety

When first announced, it sounded like Silverlight was being positioned as a direct competitor to Adobe’s popular Flash technology. But the latest announcement (the one that has garnered the most interest) has made it clear that Silverlight is not quite so simple. The Silverlight plugin (which weighs in at a mere 4MB) will contain a version of Microsoft’s .NET Common Language Runtime. The CLR allows programs written in a number of popular languages, like JavaScript, C#, Python and Ruby, to be run directly in the browser itself. While this gives developers a large choice in how to implement their web apps, it also means that users can expect to see a new generation of even richer, more feature-packed applications delivered right in the browser. It also frees users from having to understand what plugins or virtual machines are required for their chosen web app, and developers no longer have to bother with maintaining a plugin in addition to their web app; Silverlight does the worrying for them.

One of the major benefits for end users will be the greater responsiveness that Silverlight will allow for Internet-based applications. Recent demonstrations have shown that Silverlight can run JavaScript apps many times faster than native browser implementations. No more waiting around for an application to load before you can start using it. Heavy-duty applications like online office suites, image editors or publication tools similar to Yahoo! Pipes will benefit most from the speed increase, but some of the improvement will trickle down to even the smallest pieces of JavaScript. Silverlight applications will also be able to access and alter the basic structure (the DOM) of the web pages that they use as interfaces. This means that users can expect far richer, more interactive programs, where the program can keep track of any changes made and react accordingly.

Better Multimedia

Flash is currently the most popular technology on the market when it comes to delivering streaming media via a browser. But Silverlight promises to do all that Flash can do and much more. Silverlight will allow distribution of video at very high quality (720p, or high definition) and will also allow native full-screen viewing (as opposed to the current alternative of a maximized browser window). What might eventually make Silverlight a better option than Flash are the new web services that Microsoft is building around Silverlight (and currently distributing free of cost). A service called Silverlight Streaming already allows users to store their content and Silverlight-based web programs on Microsoft’s servers. If Microsoft handles this properly, we might soon see a large number of new multimedia sites springing up, offering richer multimedia and data services and overall better usability for the end user.

While Silverlight currently seems more like a developer tool than an end-user must-have, that might change very soon. Silverlight has a lot to offer developers, especially those who have been struggling for a long time to consolidate disparate technologies like JavaScript, XML and Flash into robust web products. Of course, Adobe stands to lose a lot if Silverlight eclipses Flash, but Adobe already has a firm grounding in the market, which it will be trying to consolidate as rich Web 2.0 applications grow. Caught in the middle will be the various web startups currently relying on AJAX alone, who might easily be outclassed if newer startups adopt Silverlight vigorously.