5 tips for computer usage during exam time

It’s been over a week since I last posted, and in that week I’ve been very busy with schoolwork and exam prep (not to mention catching up after being at NCUR for the better part of a week). It’s been hard for me to find time to do anything besides stay in step with my work, but necessity is the mother of invention, and I’ve managed to learn some lessons that help me squeeze a little more useful time out of each day. It helps that I’m on a fairly tech-savvy college campus and hence never more than a few short minutes away from a computer with a network connection. This post is devoted to making the best use of computer time when there’s a ton of other stuff to do.

1. Timebox communication

Timeboxing is a time management strategy where you group all related tasks into a single (slightly larger) time slot and get them done together. More importantly, when time is up you stop and move on to other things, no matter what’s left. This isn’t the best strategy for everything (creative work in particular), but it works great for more mundane tasks. Since most of my communication is electronic, timeboxing works very well for it. I have 30 minutes in the morning and another 30 minutes to an hour in the evening where I sit down with all my email and Facebook messages and get everything cleared out. I don’t check my email in between unless there’s something urgent.

2. Keep work online

College students are always on the move. One great way to squeeze more out of the day is to not have to go back to your room to get your work. Keeping most of your work online means that you can get a little done whenever you have a network connection (and on most college campuses you’re never too far from one). This doesn’t work if you need a textbook, but it’s great for writing papers or doing online research.

3. Don’t be afraid to disconnect

In contrast to my last point, the internet can easily become a distraction, and disconnecting can be the only way to actually get work done. Don’t be afraid to pull the plug if it’s cutting into work time.

4. Use calendars and reminders

With a mass of deadlines looming, it’s easy to forget what comes when and have everything coalesce into an indistinguishable mass. Having a good calendar with some sort of reminder feature is absolutely essential. I use Google Calendar, and for the past few weeks it’s been a lifesaver. I use it to schedule out my time as well as to keep track of deadlines. Scheduling is probably the best way to make sure that useful time doesn’t go to waste. Of course, you actually do need to follow the schedules you make for yourself.

5. Keep notes on a computer

This is most helpful if you do it throughout the semester, but it can be especially useful right before exams. Notes for classes are much easier to reorganize and restructure if they’re electronic, and print-outs are certainly easier to read. This doesn’t always work out for diagrams, but for plain text there’s almost no reason not to do it.

Wolfram Alpha Web Seminar report

I just finished participating in a Web seminar by Stephen Wolfram about his upcoming Wolfram|Alpha search engine. Wolfram Alpha has shaken me to my intellectual core like almost nothing else I have ever seen. Alpha isn’t really a search engine in the commonly understood sense of the word. The website calls it a “computational knowledge engine,” and that is a very accurate, if not very intuitive, description of what it is. You really have to see it in action to understand what it does. Its defining quality is that instead of just looking across the internet for information relevant to a query, it actually attempts to compute the answers. I’ll give some examples later on to show what I mean, but for now I’ll start by walking through the seminar while it’s still fresh in my head.

The very first thing that Wolfram said (and reiterated a number of times) was that Alpha was based on Mathematica and his New Kind of Science. While this probably isn’t surprising, it’s certainly not something that most software companies or even computer science researchers would actively think about doing. However, both of them are very powerful tools and Alpha is a testament to that. Later on I had the chance to ask if cellular automata (the foundation of NKS) were a core part of Alpha. Wolfram was very emphatic that they were. He went on to say that he had always wondered what the first killer app for CA would be and that Alpha was it, even if it was a ‘prosaic’ application of NKS. He also said that NKS methodology was used heavily in the construction of Alpha. He hoped that Alpha would help people to actively create new types of science and scientific models by exploiting computational analysis.

Wolfram showed a few examples of how Alpha could be used. He used it to look up data on Springfield, MA and showed how Alpha was capable of understanding queries and computing and intelligently displaying relevant data. For example, searching for a chemical compound showed its structure and information about its physical and chemical properties, as well as how to create it. Given a specific amount of a compound (4 molar sulphuric acid in this case), Alpha gave precise amounts of the other chemicals needed to create that amount. Another interesting example was when he typed in a DNA sequence and Alpha showed a possible human gene that matched it, as well as the relevant amino acids it encoded. That example almost blew me away.

Alpha has 4 major components:

  1. Data curation: Alpha doesn’t feed off the entire web but rather works off a managed database and certain trustworthy sources (Alexa and US Census info being among them). Data which does not change is managed and categorized whereas the sources are polled regularly for relevant, up-to-date information.
  2. Computation: 5-6 million lines of Mathematica spread across lots of parallel processors (10,000 in the production version) make up the heart of Alpha. They collectively encode a large segment of the algorithms and computer models known to man. They can be applied to theoretical problems (e.g., integration, series creation, airflow simulation) or to specific data (weather prediction, tide forecasts, etc.).
  3. Linguistic components: The demonstration makes it clear that there is a very powerful (though far from perfect) natural language processing system at work. This freeform linguistic analysis is essential to Alpha because without it, a manual to make proper use of Alpha would be thousands of pages long (according to Wolfram).
  4. Presentation: Alpha is very pleasing to look at. The information is shown in a way that makes it very easy to get a good grasp of what’s being displayed without being overwhelming. Though there is a standard overall format (individual data segments are arranged into ‘pods’ on the page), the actual display is very much tailored to the specific query. It is actually simple enough for a child to use.

Wolfram has very clear and powerful ideas about what Alpha should achieve once it goes live. His main recurring theme is that it should open up computation and data analysis to everyone. Over human history we have learned to calculate, compute and algorithmically manipulate across a very wide range of topics and data. However, gaining access to these powerful tools requires considerable training and resources. Wolfram wants Alpha to give everyone a personal scientist (that’s close to the actual words he used), just like search engines gave everyone a personal reference librarian.

Alpha focuses on questions that have definite answers or whose answers can be computed directly. In cases where there is confusion or dispute, or where Alpha cannot compute a sufficient answer, there will be the option of sidebar links to additional resources (like Wikipedia). Speaking of Wikipedia, Alpha won’t be open for everyone to contribute to; however, Wolfram said that there would be a smooth process for experts to contribute to Alpha’s knowledge base.

Talking about Alpha’s actual deployment, Wolfram said the free version would be open to everyone and would allow some amount of customization (like defining specific fields to perform specific operations). Alpha would have a set of APIs allowing data retrieval at multiple levels: whole pages of results, individual sections, or the underlying data and mathematical abstractions used to obtain those results. There would also be APIs to leverage the language-processing infrastructure. Commercial offerings will be available whereby the knowledge base could be augmented with a company’s internal information, and Alpha would then apply its computational analysis to that knowledge base. I think this is going to be very useful for companies, large and small alike.
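To make the idea of multi-level retrieval a little more concrete, here’s a speculative sketch of how a client might compose a query against such an API. Since no public spec exists yet, the endpoint URL, the parameter names (`input`, `appid`, `podtitle`) and the idea of filtering to specific result “pods” are all my assumptions based on the seminar’s description, not anything Wolfram announced:

```python
from urllib.parse import urlencode

# Hypothetical endpoint -- made up for illustration, not a published API.
API_BASE = "http://api.example-alpha.com/v1/query"

def build_query_url(query, app_id, pods=None):
    """Compose a request for whole-page results, or (if `pods` is given)
    for only certain result sections -- the 'pods' from the seminar demo."""
    params = {"input": query, "appid": app_id}
    if pods:
        # Assumed pod-filter parameter; the real API may work differently.
        params["podtitle"] = ",".join(pods)
    return API_BASE + "?" + urlencode(params)

# Whole-page query versus a single-pod query:
full = build_query_url("4 molar sulfuric acid", "DEMO-KEY")
pod = build_query_url("4 molar sulfuric acid", "DEMO-KEY", pods=["Structure"])
```

The nice part of a design like this is that the same query string drives both the human-facing page and the programmatic results, which is presumably how the “whole pages or just certain sections” retrieval would work.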

Throughout the webinar, Wolfram showed lots of examples of what Alpha could do, some of which were just plain neat and others which were awe-inspiring. Wolfram himself seemed very interested in finding out the limits of the system and would get somewhat distracted by bugs popping up when they weren’t supposed to. I suppose that’s a good thing considering how important and successful Alpha could be. He was always very good about answering questions.

My personal thoughts about Alpha are a bit hard to describe. I think it is a wonderful piece of technology that goes a long way toward making computation meaningful in people’s lives. Most people use computers like glorified typewriters and record players, but Alpha might just change that. From a computer scientist’s point of view, it is certainly a very interesting application of computer technology. The natural language processing on display seems considerably more capable than what is seen in most search technologies. I hope that as Alpha launches, more details of its implementation come to light. As an engineer, I’d love to know more about how Alpha’s computational and data-management systems are structured and how the massive parallelism is handled. Most importantly, I hope Alpha causes a fundamental change in how computers are used and in what people expect of software. Make no mistake, Alpha is important. I won’t say it’s the best thing since sliced bread, but it could be. A lot depends on how people actually use Alpha and how open Wolfram makes it. If enough data is made available (or if there is an easy way for people to supply their own), I can see it becoming a powerful tool for real scientific endeavor. Here’s wishing Alpha and Wolfram the best of luck for the future.

Life lessons learned from NCUR 2009

NCUR 2009 is over and I’m back at college. The last few posts are reports of the things I saw and found interesting. I came across quite a few good ideas, some of which I will be exploring in the future. However, there are some real-life, day-to-day lessons I learned from the experience of going to a conference for the first time. Most of the things I’m about to list are actually pretty common sense, but they’re not always actively on one’s mind. Without further ado, here’s the list:

  1. Make a checklist and actually use it. It’s amazing how easy it is to miss simple everyday things when you’re packing for a short trip. Keeping an actual list of things to pack and checking them off helps a great deal. Priority should be given to things like toothbrushes, razors, medicines and other sanitary and essential items.
  2. Always pack a few more clothes than you think you’ll need. This doesn’t mean pack a full wardrobe, but there’s nothing wrong with a spare T-shirt or two, no matter how light you want to travel.
  3. Figure out your look before you leave. If you’re going on stage or meeting a lot of people, then you probably have some image of yourself that you want to present (consciously or not). Please think through this image (if you care about it) and prepare accordingly. It’s fine to dress casual, but wearing a shirt and tie with running shoes is probably not something you want to do. On the same note, make sure everything you want to wear actually fits properly.
  4. Prepare yourself. Whether you’re giving an oral presentation, showing a poster or just mingling with the crowd, make sure you know about whatever it is you plan to talk about. In particular, rehearse an oral presentation, know the details behind everything you put on a poster and be prepared for questions. I’ve also learned that it’s better to admit you don’t know the answer to a question than to try and fudge your way out of it.
  5. Have your equipment ready. This is related to the last point, but not quite the same. For example, you can have a great rehearsed presentation, but that doesn’t mean you’re immune from technical difficulties and the like. In the case of computer science, if you want to show some program you wrote, make sure it’s up and running before you begin your presentation.
  6. Take your time and take a break. If there’s a lot going on at a conference, it’s very tempting to just keep moving from one thing to another as fast as possible. Depending on the specific circumstances, that could be a good idea. But sometimes it’s worth slowing down. I made a conscious choice to pick a few sessions and presenters that I thought would be interesting and spent most of my time there. I also left enough time to just walk around and relax. I think I had a better experience than if I had moved around at full speed.
  7. When it’s time to go, make an early start. I’ve never really been late for a departure, but I have come pretty close a few times. It’s much less stressful to be early, even if it requires sacrificing a few hours of sleep. Additionally, there’s time left to deal with any last-minute things that might pop up.
  8. Have fun and learn what you can. There’s not much point in going to a conference if you’re not going to enjoy yourself and come away with something interesting. I don’t think there’s any simple rule of thumb for how to do this, but following the above might help to some extent.

If you have any ideas on how to make the most of a conference or just travel in general, please drop me a comment.

NCUR 2009 Day 2

Due to a number of unforeseen circumstances (mostly involving lack of a wireless connection) I haven’t been able to live-blog the conference as I had intended to. Here’s a rather delayed roundup of the second day of the conference.

I didn’t actually make it to the conference until lunch. There was one session of presentations and posters in the morning, but I didn’t see anything of interest. I went to one poster session in the afternoon and saw a number of interesting posters. There was one study of using Lego Mindstorms to encourage children to study math, and the researchers had found it quite effective. I’m currently looking into building a Mindstorms-style interface for my own research project, but I wasn’t quite sure if it was the right thing to use. However, I now think it’s going to be a good bet.

The most interesting poster (and presenter) was Thomas Levine’s poster about how people position their keyboards. The work he had done was interesting, but it was the conversation I had with him that I found more useful. We talked a lot about different types of keyboards, seating positions, keyboard layouts and such things. Since I’m interested in keyboards too, I think I might stay in contact with him in the future.

There were other cool posters, including one about how the perception of death changes people’s attitudes towards loyalty, fairness, duty and other such moral qualities. Though I’m not much interested in psychology, I think it’s worth knowing what things my thought processes respond to and how. Coming back to technology, there was interesting work on document processing and structuring the extracted information into machine-processable data trees. Natural language processing isn’t one of my areas of interest, but it’s an important field with lots of open questions and I’m glad to see that there are smart people working on it.

After the poster session it was time for me to give my own presentation. I talked about how we had applied formal grammars to studying how complex systems evolve over time and how they can be controlled. I had to rush my presentation towards the end since I spent a bit too much time on introductory material. But it went off well, and judging by the questions I received, there was a good amount of interest. It helped that the people in the room were very tech-savvy. The presenters following me almost blew me away with the work they were doing. The next presenter, Abdulmajed Dakkak, showed how he had used a variety of tools and languages to create a powerful way to geographically track BitTorrent usage by looking at the IPs of people connected to a swarm. He also had a great-looking presentation in Flash instead of PowerPoint, which I found really attractive. What amazed me even more was that he had done all his work off a netbook. I’ve been looking to get my feet wet in networks and parallel programming, and I might consider duplicating some of his work, but using my college’s clusters instead of a netbook.

The last presentation was about a tool for authors called Story Signs, developed by John Murray. It is an interesting tool designed to help authors better structure their stories by adding tag-like information to different parts of a piece of text: what characters are involved, what the scene is about, what sort of scene it is and so on. Being interested in writing myself, I thought this was a really interesting tool. I would be interested in seeing the different ways in which this tag data could be used. Some sort of social network built on top of it might be interesting (think of it as a literature version of Flickr). Furthermore, the user interface was built in Flash and was really good-looking, very different from a run-of-the-mill desktop app. I think it might be a good idea to look into Flash for implementing desktop UIs in the near future. It’s not something I had considered before, but it might be worth thinking about.

That was the last session for the day. I spent the evening at a social event that had been organized for the attendees, which actually turned out to be a lot of fun. There was one more session on Saturday morning, but since we were leaving that morning, I decided to just sleep in. The trip back was uneventful. I tried to get onto the wifi at the Minneapolis airport but couldn’t; if I had, this post would have been up a day ago. I have a lot of schoolwork to catch up on at the moment, but once I’ve caught up a bit, I’ll post about the lessons I learned at NCUR (which I’ll really try to follow at my next conference).

NCUR 2009 Day 1 afternoon

The afternoon held an interesting group of sessions. It turns out that the organizers tried to group similar presentations into sessions. However, multiple sessions ran in parallel, meaning that there were multiple interesting things happening at the same time. In my case, there was a robotics group and a computer networks group running at the same time. I chose to go to the robotics session with a vague idea of leaving after the first two presentations to run halfway across the La Crosse campus to the networks session. I actually did do that, but the second robotics presentation ended early, letting me walk at a brisk pace to the Computer Science building.

The robotics presentations I attended were both by West Point students who were developing robotic equipment for use on the battlefield. The first was about an unmanned, remote-controlled vehicle which would travel at the head of a convoy, equipped with sensors to detect obstacles and possible explosive devices in its path. The on-board systems would automatically stop the vehicle and wait for an override from the operator. The system is really interesting, and I was impressed by the combination of electrical engineering and computer science that had to come together to make it happen. The fact that they wrote their own device drivers for some of their electronics makes it even more awesome. The second presentation was about smaller autonomous robots which could find their way through mapped rooms. A user would only need to select a point on a map and the robot would find its own way to the destination, avoiding obstacles by moving around them. The concept was interesting, but the presentation could have been better.

After that, a quick walk took me to the Wing Technology building where the computer science presentations were taking place. I had missed the first two, but was in time for the next two. The first was about a simple music-production system called COMPOSE, which allows users to select notes and beat patterns by clicking on icons and buttons. The user interface is simple enough for kids and non-musicians, and the software would be fun to use. Eventually it is supposed to be part of AI research into how humans identify pleasing sounds and note sequences. The next presentation was about using neural networks to predict battery status in solar-powered vehicles. Once again, this sits at the intersection of computer science and electrical engineering, and so is very interesting to me.

I decided to stick around for the next set of presentations, which covered a variety of subjects. The first was about automating genome research by building a set of extensible online tools; the second showed a way to classify medical images from different sources (MRI, CAT, X-ray, etc.) by analyzing the images themselves. The third was about using simple image color analysis to look for anomalies; it’s meant to let rescue agencies quickly analyze aerial video footage to look for lost people. The final presentation was about a digitization project to make available all the data collected by the Freedmen’s Bureau. This Bureau was created by Lincoln to help the slaves who were being freed, and its records contain a very large amount of information about the people and their lives. I think digitization projects are important because they help preserve a large amount of history and make it available for other interesting research projects.

That was the end of the presentations I could attend. I went on a river cruise, which I spent mostly talking with one of my professors. I’m now back in my room, watching old House episodes. I have to touch up my presentation a little for tomorrow. Other than that, I don’t have any plans and I’ll just make them up as I go along. I’ll try to keep posting as often as I can.