Semester V of Computer Science

Ah. Another semester gone by and another treasure trove of knowledge discovered and assimilated.
This was semester V and what a semester it was. We finally dove right into core Computer Science and Engineering with the following awesome subjects:

  • Algorithm Analysis and Design
  • Microprocessor Interfacing Techniques
  • Principles of Programming Languages
  • Computer Networks
  • Computer Graphics

Truly great subjects for a Computer Science major to undertake. So let’s dissect them and see what it meant to study these topics.

1. Principles Of Programming Languages
A subject that surveys the various programming paradigms so that one understands the benefits of the multitude of features being introduced in programming languages, the motivation behind them, and how to better use them in program design. Sadly, it was marred by not-so-good teachers who decided they would teach us C++ and Prolog programming rather than cool concepts like the Liskov Substitution Principle.

2. Microprocessor Interfacing Techniques
A fundamental and challenging subject, it made us truly think about the Simplicity vs. Efficiency trade-off in programming. Learning about the underlying hardware that controls our machines and lets you see this text here was simply great. I read about the PS3 Cell Broadband Engine after taking this course and found that I was really able to understand all the hardware jargon on the page. All in all, 8085 and 8086 coding was FUN!

3. Computer Networks
A genuine rote subject made interesting by the fervour and enthusiasm of the faculty, this subject stands as a fine example of how great education can be, provided the teacher has the motivation for it. Initially I was not a fan of Computer Networks due to the mugging tendency of the subject, but I was thrilled to see how succinctly all the various protocols and their algorithms were presented, in a manner that was perfectly logical. Doubts were effortlessly quelled, and the various abstractions used made the learning curve gentler. Coupled with some great laboratory sessions (the teacher actually wanted to do a lot more), this subject became one of my learning high-points this semester.

4. Computer Graphics
A subject with a bad reputation for an outdated curriculum, I took it because the faculty in charge was going to revamp the topics and introduce us to OpenGL. It started great, with awesome practical submissions and fun classes. It got a little drab in the middle because of redundant topics like matrices and vector geometry, which had already been covered in high-school maths. In the end, it was a good introductory class to the magical world of Computer Graphics.

5. Algorithm Analysis and Design
Saved the best for last. My favourite subject of the semester, as it coupled pure Computer Science and Mathematics to form a foundation for computing everywhere. I already had decent algorithm skills thanks to all those hours poured into Project Euler and UVa Toolkit, but this course really cleared my head and gave me the tools to understand just what the heck I was coming up with. Intuitive solutions were given names and ideas were presented as theorems. Various problem-solving approaches helped me better analyze and tackle problems, and applications were given as much importance as the concepts themselves. Someone once famously said, “To be a great coder, you can either write code for ten years or take a semester-long course on Algorithms.” Guess I saved myself 9 years of pain. 😛

Thus: a great semester, lots of nice topics, and we finally got some Computer Science and Engineering done. Let’s hope the new semester is as enthralling as this one.


Probabilistic Analysis of Algorithms

Probability Theory has always been one area of study that has made more than its fair share of students cringe. However, its importance in the field of Computer Science, especially in Algorithm Design and Artificial Intelligence, cannot be emphasized enough.

Here’s a question I recently encountered as part of my selection process for a Microsoft internship: Consider Randomized QuickSort on a set of n numbers. What is the probability that we always get the worst-case scenario for this algorithm?

You may want to try this on your own first, as it is deceptively simple.

Here’s the solution: By the analysis of QuickSort, we get the worst-case scenario when we select either the smallest or the largest element of the array of n numbers as the pivot. Thus the probability for n numbers is 2/n. We then recurse on the remaining n-1 numbers (Divide and Conquer), where the worst case is again obtained by selecting the smallest or largest element, giving a probability of 2/(n-1). This continues until we have just 2 elements left, which gives a probability of 2/2 = 1.

Here’s the smart part: When we analyze Randomized QuickSort, we notice that the pivot selection at each level of the recursion is independent of the pivot selections at every other level. Since the events are independent, we can multiply all our earlier probabilities: [ 2/n * 2/(n-1) * 2/(n-2) * … * 2/2 ] = 2^(n-1)/n! (the exponent is n-1 because there are n-1 factors of 2 in the numerator).
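The product above can be checked numerically. Here’s a small sketch that multiplies the per-level probabilities with exact rational arithmetic and compares the result against the closed form 2^(n-1)/n! (the function name is my own, not from any course material):

```python
from fractions import Fraction
from math import factorial

def worst_case_probability(n):
    """Probability that Randomized QuickSort picks an extreme
    (smallest or largest) pivot at every level of recursion."""
    p = Fraction(1)
    for k in range(n, 1, -1):   # subarray sizes n, n-1, ..., 2
        p *= Fraction(2, k)     # 2 "bad" pivots out of k candidates
    return p

# Check against the closed form derived above: 2^(n-1) / n!
for n in range(2, 10):
    assert worst_case_probability(n) == Fraction(2 ** (n - 1), factorial(n))

print(worst_case_probability(5))  # 2/15
```

For n = 5, for instance, the product 2/5 · 2/4 · 2/3 · 2/2 = 16/120 = 2/15, matching 2^4/5!.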

And thus we have the answer! Pretty simple, huh? All we needed to know was the worst-case criterion for QuickSort, and that each pivot selection is independent of the pivot selections in the recursion.

Similarly, probability theory plays a big part in Bayes Networks, which are essentially probability distributions structured as a graph, a core topic in AI. Knowing about Total Probability, Joint Probability and Bayes’ Rule really helps with inference in Bayes Nets. Combine that with Variable Independence, the Explaining-Away Effect and Enumeration, and you’ve got yourself a Smart System.

Just because all this seems simple doesn’t necessarily mean Probability Theory is simple. It does tend to get a tad confusing at times, but with ample practice, you should find yourself able to predict the future!