Monday, May 9, 2011

Google IO 2011

I will be going to Google IO 2011 and I am pretty excited about it!

Monday, December 22, 2008

Best wireless router which can be flashed with Linux

Two years ago, if I had thought about buying a router that could easily be flashed with a Linux distribution (DD-WRT, OpenWrt, etc.), I would have blindly bought the Linksys WRT54GL, the router of choice for open source enthusiasts and hobbyists. Two years on, not much has changed.
There are a few better variants of the WRT54GL, but they are hard to find; I would prefer them if I could find them :).

There are now more routers and more confusing standards, but none of them seem to have both good reviews and the right hardware for flashing with DD-WRT or OpenWrt. Some of the Netgear models claim open source support, but most have bad reviews, so I stayed away from them. Linksys has so many models to choose from, and picking the right one that can be flashed was a long process. Each model has different hardware revisions, and only the older revisions have more flash and more RAM, which is ideal for DD-WRT or anything else. So if you can get one of those (mostly used, through eBay or elsewhere), you can build a more powerful wireless router than the WRT54GL.
I also wanted to try out the 300+ N series (300N, 310N, etc., and the 350N with the fastest processor among the low-end Linksys models), but almost all of them had bad reviews on most websites, mostly with the default firmware; there were one or two reviews with DD-WRT that looked good. The OpenWrt description of the 300N series was very promising, with RedBoot as the boot loader and the chance to see your Linux boot from scratch, and I really wanted to check it out. I have worked with RedBoot and the Intel XScale processors, which have a separate coprocessor for encryption; that is useful when you want to run VPNs and leverage the XScale's encryption hardware. The new 300+ N routers cost at least $85, and the advantages were

  1. better range
  2. newer n standard with higher speeds
  3. more ram and flash memory
  4. better processor


Nos. 3 and 4 were the most tempting of all. I am not even sure nos. 1 and 2 are true: some reviews said the speed was worse than g and the range was similar or worse. I will wait a while longer for the n standard to take off and for prices to come down. Until then I am happy with the 54GL, and I hope to see more powerful routers that can run Linux.



Source: Wikipedia - (http://en.wikipedia.org/wiki/Linksys_WRT54G_series)

Tuesday, November 13, 2007

Using Cosmo (Chandlerserver) as a Git Server

I wanted my local Git to be backed up to a server and came across Git on Dreamhost, a nice link which explains how Dreamhost with WebDAV can be used as a Git remote repository. So any WebDAV server should work. I thought I would use Cosmo to host it - a really cool idea! I have never used Cosmo for anything other than calendars or the address book (which I worked on :) ) and had to dig through the documentation to find a way to create a collection. I used the cadaver client to create the collection

# sudo port install cadaver

# cadaver http://osaf.us/cosmo/dav/vinu

dav:/cosmo/dav/vinu> ls
dav:/cosmo/dav/vinu> MKCOL git

On Mac OS X Tiger I used Finder to connect to the server. I entered the URL as http://osaf.us/cosmo/dav/vinu/git, entered the username and password, and it mounted :). But when I tried to put any file on the server it failed :(. After hanging out in IRC for some time I figured out that Cosmo (Chandler Server) does not implement locking yet, and Mac OS X Finder expects the server to implement locking :(. I tried searching for a Finder-like client, but none of them had a way to put a complete folder into the collection.

I then tried to copy the Git repository using a Windows XP desktop I had: I created a web folder with the URL http://osaf.us/cosmo/dav/vinu/git, copied my bare Git repository from my Mac to Windows, and from there to the WebDAV server (Cosmo). It partially failed with a message saying some files could not be copied :(


Links
  • http://wiki.dreamhost.com/Git
  • https://osaf.us

Sunday, November 11, 2007

Interview Questions

After attending a lot of interviews I found that most interviewers are obsessed with HashMap and like to test your understanding of algorithmic complexity and how a hash map's O(1) average lookup can bring a search down from O(n^2) to O(n). The list problem shown below is an interesting example.
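As a toy illustration of what interviewers are usually after (my own example, not from any particular interview): looking an element up in a plain list is a linear scan, while a hash-backed set finds it in constant time on average.

import java.util.*;

public class LookupDemo {
    public static void main(String[] args) {
        List<String> list = Arrays.asList("red", "green", "blue");
        Set<String> set = new HashSet<String>(list);

        // O(n): walks the list element by element
        System.out.println(list.contains("blue"));
        // O(1) on average: hashes the key and jumps straight to its bucket
        System.out.println(set.contains("blue"));
    }
}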

The List Problem


It's a search problem where you need to find whether a small list is contained in a bigger list. The example below shows the goal of the problem.
Say you have two lists:

smallList = "red", "green";
largeList = "red", "green", "blue";

smallList is contained in largeList for the above data.

If you change the smallList to
smallList = "red", "green", "green";
then smallList is not contained in largeList, since "green" occurs only once in largeList. The order of the strings does not matter. What is the best way to implement a function which returns a boolean value indicating whether smallList is contained in largeList?

Solution 1: worst case implementation
// Note: this mutates largeList - each matched element is removed so it
// cannot be counted against more than one element of smallList.
boolean contains(List<String> smallList, List<String> largeList) {
    for (int i = 0; i < smallList.size(); i++) {
        boolean found = false;
        for (int j = 0; j < largeList.size(); j++) {
            if (smallList.get(i).equals(largeList.get(j))) {
                largeList.remove(j);
                found = true;
                break; // must stop here, since the removal shifts the indices
            }
        }
        if (!found) {
            return false;
        }
    }
    return true;
}

Complexity is O(n^2) (strictly O(m*n) for a small list of size m and a large list of size n).
Memory usage is good since it just uses the boolean variable found, though it destroys largeList as a side effect.

Solution 2: Similar to solution 1, if the method largeList.remove is not available
boolean contains(List<String> smallList, List<String> largeList) {
    // marks the elements of largeList that have already been consumed
    boolean[] largeListMatches = new boolean[largeList.size()];
    for (int i = 0; i < smallList.size(); i++) {
        boolean found = false;
        for (int j = 0; j < largeList.size(); j++) {
            if (!largeListMatches[j] && smallList.get(i).equals(largeList.get(j))) {
                largeListMatches[j] = true;
                found = true;
                break; // consume only one matching element per smallList entry
            }
        }
        if (!found) {
            return false;
        }
    }
    return true;
}

Complexity will be the same, around O(n^2). Memory usage will be higher since a boolean array of size largeList is used, so the memory required is roughly sizeof(largeList) + sizeof(smallList) + 1 byte per element of largeList + 1 byte for the boolean found variable.

Solution 3: Use a hash map to count the elements of the large list
boolean contains(List<String> smallList, List<String> largeList) {
    // first pass: count how many times each string occurs in largeList
    Map<String, Integer> largeHash = new HashMap<String, Integer>();
    for (int j = 0; j < largeList.size(); j++) {
        Integer value = largeHash.get(largeList.get(j));
        largeHash.put(largeList.get(j), value == null ? 1 : value + 1);
    }
    // second pass: consume one count per element of smallList
    for (int i = 0; i < smallList.size(); i++) {
        Integer value = largeHash.get(smallList.get(i));
        if (value == null || value == 0) {
            return false;
        }
        largeHash.put(smallList.get(i), value - 1);
    }
    return true;
}

Complexity will be reduced to O(n) + O(m), which is O(n). Memory usage will be the highest of the solutions: the hash map holds an entry for every distinct element of largeList, which becomes significant for very large lists. Still, this is the most efficient algorithm, assuming the system has enough memory.
Memory required is roughly 2 * sizeof(largeList) + sizeof(smallList), i.e. the map keys plus an integer count per distinct element of largeList.

Solution 4: Use binary search to find the elements. The largeList has to be sorted first; any sort such as quick sort or merge sort works, with complexity O(n log n). Binary search is O(log n), but it has to run once for each element of the small list, which is O(m log n).
Overall complexity is O(n log n) + O(m log n) = O(n log n) (ignoring the removes, which are each linear on an ArrayList).
Memory usage is equal to the size of largeList + smallList if the sort is done in place.

Collections.sort(largeList); // or any O(n log n) sort
for (int i = 0; i < smallList.size(); i++) {
    int pos = Collections.binarySearch(largeList, smallList.get(i));
    if (pos >= 0) {
        largeList.remove(pos); // keeps the list sorted and consumes the match
    } else {
        return false;
    }
}
return true;
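
As a quick sanity check, here is a small driver using the red/green example from the problem statement. The class and method names (ListProblemDemo, containsAll) are my own, and containsAll is just the hash-based Solution 3 wrapped up as a static method:

import java.util.*;

public class ListProblemDemo {
    public static void main(String[] args) {
        List<String> largeList = Arrays.asList("red", "green", "blue");

        // contained: every element of the small list can be matched one-for-one
        System.out.println(containsAll(Arrays.asList("red", "green"), largeList));          // true
        // not contained: "green" is needed twice but occurs only once in largeList
        System.out.println(containsAll(Arrays.asList("red", "green", "green"), largeList)); // false
    }

    // the hash-based check from Solution 3
    static boolean containsAll(List<String> smallList, List<String> largeList) {
        Map<String, Integer> counts = new HashMap<String, Integer>();
        for (String s : largeList) {
            Integer c = counts.get(s);
            counts.put(s, c == null ? 1 : c + 1);
        }
        for (String s : smallList) {
            Integer c = counts.get(s);
            if (c == null || c == 0) {
                return false;
            }
            counts.put(s, c - 1);
        }
        return true;
    }
}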

Tree problem

I don't remember the exact question, but here it is:
You have a class called DataNode.
class DataNode {
    int id;
    int parentId;

    DataNode parentNode;
    DataNode[] childNodes;
}

Define an efficient method called
public DataNode updateTree(DataNode[] dataNodes)
which will update all the parentNode and childNodes references, assuming that the id and parentId of all the nodes are already set. The parentId of the root node is set to zero.

Though the question is badly designed in asking you to write such a method inside a node class, it was interesting otherwise. An example will make things clearer.

Suppose you have a tree with 18 at the root and 5 and 3 as its children:

    18
   /  \
  5    3

The dataNodes array will then contain (as id, parentId, parentNode, childNodes):
18, 0, NULL, NULL
5, 18, NULL, NULL
3, 18, NULL, NULL


The goal of the question is to update the references parentNode and childNodes. So instead of NULL you will be able to navigate the tree structure.

Solution:
The O(n^2) solution is easy to design: for each node, scan the whole array to find its parent and its children. The better solution has something to do with the thing I mentioned at the top of this article :)
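
Here is a rough sketch of that approach: index the nodes by id in a hash map, then wire up the references in a couple of linear passes, for O(n) overall. The helper class name (TreeBuilder) and the temporary list per parent are my own choices; the question wanted the method on DataNode itself.

import java.util.*;

class TreeBuilder {

    // Fills in the parentNode/childNodes references and returns the root
    // (the node whose parentId is zero).
    public DataNode updateTree(DataNode[] dataNodes) {
        // first pass: index every node by its id for O(1) lookups
        Map<Integer, DataNode> byId = new HashMap<Integer, DataNode>();
        for (DataNode node : dataNodes) {
            byId.put(node.id, node);
        }

        // second pass: set parent references and group children by parentId
        DataNode root = null;
        Map<Integer, List<DataNode>> childrenByParent = new HashMap<Integer, List<DataNode>>();
        for (DataNode node : dataNodes) {
            if (node.parentId == 0) {
                root = node;
            } else {
                node.parentNode = byId.get(node.parentId);
                List<DataNode> siblings = childrenByParent.get(node.parentId);
                if (siblings == null) {
                    siblings = new ArrayList<DataNode>();
                    childrenByParent.put(node.parentId, siblings);
                }
                siblings.add(node);
            }
        }

        // final pass: copy the grouped children into each node's childNodes array
        for (DataNode node : dataNodes) {
            List<DataNode> kids = childrenByParent.get(node.id);
            node.childNodes = (kids == null)
                    ? new DataNode[0]
                    : kids.toArray(new DataNode[kids.size()]);
        }
        return root;
    }
}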

Saturday, November 3, 2007

Ubuntu on Macbook

I was thinking of installing Linux, but it seems the Boot Camp beta license is not valid after October, so I could not find a link to the beta and all the sites were updated to point to Leopard. The options I had were either to buy Boot Camp or to upgrade to Leopard. I had no intention of doing either; I just wanted an OSS operating system, and being charged to install OSS feels like losing freedom! The other option was to use rEFIt.
I had used rEFIt before and thought I would just use rEFIt (refit.org) and parted to resize the partition. rEFIt is a nice boot manager which shows the list of operating systems during boot, including booting from DVD/CD media. I installed rEFIt and downloaded Ubuntu Gutsy Gibbon (the code name for the latest 7.10 version). I had purchased some Kodak CD-R media from Staples and thought I would use them to burn the image. To my surprise I tried it three times and could not verify the burnt image, and so gave up for now. Either the CD-Rs have some issue or there is some compatibility problem between my Mac and these CD-Rs. I suspect the former is true, probably some manufacturing defect or something.

Still waiting and hoping to find a way to install! Fedora 8 is scheduled to be released on the 8th of November; I may install Fedora 8 instead if I don't resolve the CD media issue.

Wednesday, October 31, 2007

Git/git-svn on Mac OS X Tiger (Macbook)

Though Git is complex to use, I hope it will solve most of my version merge problems. I have used svk (a distributed version control tool written in Perl, built on top of SVN and the SVN Perl bindings) before, and it is easy to use once you settle on a basic workflow (which may be slightly hard for a novice SVN or CVS user). Unfortunately it dies in complex merge situations, mainly when there is too much reorganization in the main trunk compared to your local branch. For example, I worked/work (I try to work whenever I can but always get stupefied by the merge issues) on an open source project called Cosmo, now renamed Chandler Server; it is actively developed and the trunk undergoes huge changes quite often. The merge interface of svk was really nice and I used it to keep merging changes from the trunk into my local branch. I also kept my local branch merged into my own SVN sandbox so that if my system crashes I can restore from that sandbox :). My entire workflow is documented at
http://chandlerproject.org/bin/view/Journal/SvkUsage

Now that I have my almost-new MacBook back (repaired through Apple support) without any Linux on it (as of now), I thought I would get most of the Unix OSS goodies using MacPorts and Fink :). Git was one of my most wanted tools, and I thought of installing it soon to get started on a few projects. As some websites had pointed out, installing Git did not work using Fink.
I did
# fink install git
and was greeted with some dependency error messages. I also tried
# fink selfupdate
which asked me a lot of questions and took a lot of time :(

# fink install git
again gave some dependency error message :(.

I had always heard of MacPorts and wanted to give it a try. But installing MacPorts requires Xcode, available through ADC, which requires a login. I first installed MacPorts without Xcode hoping it would work, but later realized that MacPorts needs a compiler, as does any port-based system. Having used Gentoo's port-based system, I understood that everything is built from source and therefore requires a compiler.

I downloaded the 3.0 version of Xcode, but found that Xcode 3.0 does not work with OS X 10.4 Tiger and is meant for Leopard. I then downloaded the 2.5 version (around a 1 GB download) and installed it successfully. I could then issue port commands :)

# sudo port selfupdate

# sudo port install git-core

This printed a lot of output (downloading, building, etc.), took a lot of time for the compilation, and the installation was a success :)

Happy with the success, I typed
# git
and it showed me the usage.
I then typed
# git-svn
It said the command could not be found :(. A Google search for git-svn on Mac told me to use the command:
# sudo port install git +svn

That fetched a few more files and did some compilation and installation, again taking lots of time, and then I got an error that git-core was already installed. Obviously! But I wish the system were smarter and had just installed git-svn.
I tried
# git-svn
and was greeted with the same command-not-found error.
I did see that the port system had two entries for git, so I had to uninstall all of git-core:
# sudo port uninstall git-core @1.5.3.4_0+doc_svn
# sudo port uninstall git-core

Then I did a
# sudo port install git +svn

I tried
# git-svn
and yes, I did see it was installed, but a Perl module it depends on was missing: Error.pm. I searched for that package in port, but it was of no use, so I tried CPAN.
# sudo perl -MCPAN -e 'install Error'

It asked me a lot of questions and reported some missing programs, which I installed using either Fink or port, both taking a lot of time and thus patience.
# sudo port install lynx
# sudo fink install wget
The path for port-based installations is /opt/local, so I had to specify /opt/local/bin as the path for the lynx binary.
For Fink the path is /sw, so I had to specify /sw/bin as the path for the wget binary.

You could use a single package management system to install all of these. I tried Fink thinking it was more like apt-get, but no: it was fetching debs, though I think they are source debs since I could see the output of configure and make.

The other program required by CPAN was ncftpget; neither Fink nor MacPorts had that package :(. I just skipped that step.
Then I needed ncftp, which MacPorts had, so I did a
# sudo port install ncftp
and the next one
# sudo port install gpg-agent
But that did not give me the gpg binary, and there was no gpg package in the default MacPorts :(. I just skipped the remaining questions by pressing Enter continuously, and I got the Error.pm module installed :).
I then got an SVN/Core.pm not found error when I tried to run git-svn.

I first tried setting PERL5LIB to the MacPorts version of the SVN Perl library and got some weird error regarding a core bundler dependency. After struggling a bit I finally removed all Git and SVN packages from both MacPorts and Fink, downloaded Git from kernel.org, and compiled and installed it the Unix way :)
# wget http://kernel.org/pub/software/scm/git/git-1.5.3.5.tar.bz2
# tar -jxvf git-1.5.3.5.tar.bz2
# cd git-1.5.3.5
# ./configure --prefix=/usr
# make && sudo make install


I followed http://speirs.org/2007/07/22/getting-git-svn-working-on-the-mac/ and downloaded the SVN package precompiled with all the language bindings. After installing the dmg file, I had to set PERL5LIB to /usr/local/lib/svn-perl.

I got git-svn working finally :).


I still miss the apt-get based installation on Debian systems, which just installs the binary and is very fast. Of course a port-based system has its own benefits, like better optimized binaries, but it is really frustrating to wait 1-2 hours for a program to be installed :(.

Apple support

I recently had some issues with my MacBook and I am impressed with Apple support; the only disappointment was that they had to ship my MacBook to an Apple factory to get it repaired. I had to register for a Genius Bar appointment on the Apple website and drive to the store. I went to the Palo Alto Apple store, the nearest one to me, and was greeted by good support staff. I found the support guys to be really smart and very crisp compared to any other support I have dealt with. They immediately determined that the issue was not resolvable by them and that they would have to send the MacBook to the factory for repair. I was told it would take 10 days for the entire process and that everything would be covered under warranty :). They replaced my hard drive, RAM, and the case, which was slightly broken. I handle my MacBook with great care and still the case developed a small hole by itself :(. Thanks to Apple support they replaced the case with a new one, and I am happy about that :). They also replaced my AirPort network card for some reason; I don't know why, it was working perfectly! Their online tracking system also kept me updated on the status of the repair and told me when it was shipped overnight.
Overall I am happy that my Macbook is working again now :).