About internet service, mobile companies and IT trends in Bangladesh

12 08 2007

Note: This post is a response to the post মোবাইল দিয়ে ইন্টারনেট কতটুকু নির্ভরযোগ্য…. (“How reliable is internet over mobile…”) by Omi Azad.

Dear Omi Azad, thank you very much for raising this topic. I always wanted to have comparative knowledge about the internet services in Bangladesh, and I have a much clearer view now after reading the responses. So thanks to everyone else too. Here is my share of the picture, and my thoughts.

I have been using GP's internet service for a year and a half now. I am using it from the Chittagong University area, which is 14 km away from Chittagong city, so it is a fairly remote area. I used to get a flat 20 kBps of maximum bandwidth (please mind the big B; that is 160 kbps with a small b) for the first 10-12 months, then suddenly the line started to yield between 8 kBps and 20 kBps of maximum bandwidth at different times. So I am not getting 20 kBps every day anymore.

It is clear that bandwidth depends on the volume of users in an area. I used GP internet on the bus journey from Chittagong to Dhaka; the connection stayed persistent and fairly fast. I used it from Rampura for a few days and got 12 kBps. In Muhammadpur, less than 2 kBps, and in Panthopath I got 18 kBps or so. So I think it is a pain in some or most places, in Dhaka at least. My friend in Muhammadpur decided to give up his line.

[Note: You will get no more than 4-5 kBps if you use a GPRS-only set. I am talking about EDGE-supported mobile phones here. GP does not say anything clear about this. I went to the GP HQ to ask for advice before taking their internet service. All they told me was: the costlier the phone you use, the more bandwidth you get. And I asked this of people from the GP IT division, not customer care. Smart brains they have in GP!!!]

I was thinking of trying Aktel's or Citycell's internet service, but now I know that I should not. If Citycell is consistent, then they are a good choice for someone who is not going to do bulk downloading. I just checked my monthly statistics in DU Meter: I download more or less 7 GB per month. So 3 GB/month is not for me.

I would neither blame nor kiss GP or the mobile companies for their service. I could not have used the internet from such a remote area if there were no GP or other mobile company offering service over the mobile network. And also do not forget how unreliable the so-called “broadband” internet connections were. I used to telecommute to the USA from Nikunzo in Dhaka, where I had to use a resold ZIP line which used to yield 4-5 kBps with a lot of dropouts. If there was a storm, the line would go down for the night. If there was a power cut in the area, the line was unavailable, even if you had an IPS. So GP internet saved my life at that time. I later moved to the CU campus in Chittagong, my home, and have not had to move till now. It is ironic, though, that GP did not announce their service outside the metropolitan cities for a long time, yet the best service is available outside them.

I think we have bad times ahead; all the signs are there. They will go bad and will not do anything about it. The great service was a surplus that we enjoyed. GP is an imperially minded company. They fed us 7 tk/min for a long time, and now they are lowering costs as slowly as possible, just to keep their income optimal, nothing else. They turned almost all the engineering graduates into customer care executives. I don't want to talk about that a lot, but I see a generation of stereotypes coming out of “universities” with engineering and CS degrees to become zombies in the mobile industry. These mobile companies indirectly wrecked the vision of creating N thousand programmers per year from our education system. The rest was wrecked by the low (very, very, very low) paying software companies.

And I see the mobile companies always choosing the topics for their ads that are most negative for society. Of course you can chat with girls if you have an internet connection, but you can do a lot more. Why can't they tell the nation about MIT OpenCourseWare or Wikipedia? Instead they are busy devising ways to teach teenagers to talk on the phone all night. It is true that they changed our lives, but in the least positive way.

And since I started bashing, I have to bash the newspapers too. They are not doing anything for the IT generation. Prothom Alo features mobile technologies every second week on their weekly IT page, and that's all. (I always say that the lamest feature page of Prothom Alo is the IT page. They will review a game, and you will find it hard to extract even the name of the game from that one small paragraph of review. Utter carelessness!) We have 10+ (how many?) universities in our country that teach how to make microprocessors, OOP with Java, algorithms… where did all that collective knowledge go? Why is there no column by the ace programmers in any newspaper? Why are calling-bell circuit diagrams as important to the newspapers as they were 10 years ago? No newspaper or media outlet has done anything to boost the real stream of computer knowledge and education in Bangladesh. The IT brains have always had to swim against the tide, on their own. Can't people like Omi Azad do anything about that? Anyone from Prothom Alo or the Daily Star?

Blogged with Flock



Do not run after misleading benchmarks

10 07 2007

I just had to find some time to write about this. This post is aimed at anyone who feels elevated and impressed by the trendy benchmarks between programming languages and/or frameworks. Benchmarks are tools to help us judge; they have a purpose to serve. But using them at their lowest significance and publishing those results to the community does not help anybody. I'd rather say it misleads the rookies. And let's face it, there are more rookies in the industry right now than at any other time. High-level MVC frameworks are starting to get into good, stable shape, enabling developers to do more with less. But the problem is that doing more while knowing less is not good, nor is it sustainable.

Let’s have a look at some posts that prove the point. Someone benchmarked Code Igniter, CakePHP and Symfony in this blog. All the fuss is about these three frameworks printing “Hello World”. This benchmark uses artillery to kill mosquitoes, then honours and ranks the million-dollar artillery pieces for doing it better than each other. And then (the catch) it gets great appreciation too. What is the point of printing Hello World with RAD tools? Do you benchmark a sniper rifle, a rail gun and an AK-47 in an indoor fight and rank which one is better? Aren't they built for long range?

Some people did mention in the reactions that a benchmark like this should involve practical use, like DB operations and ORM use, to see the frameworks perform what they are built to perform and compare them there. My intention is not to blame the effort. My point is: how many rookies get distracted by this type of benchmark? I say many. Just read the comments. There are a lot more examples. This one is at least among PHP frameworks; people are comparing cross-language and cross-purpose tools like that.

If you are either impressed or repelled by the arguments up to now, I suggest you listen to this talk by brian d foy about benchmarking, given at the Nordic Perl Workshop 2007 a few months ago. It is one of the most interesting talks I have ever heard. He starts by saying why you should never use benchmarking ever in your life. Well, not never, but not while your brain is off, as he later clarifies. He also advises using profiling, which will help you more than benchmarking.

And about speed: speed is a very relative metric for judgement. It is sensible that RAD tools give away a little bit of execution speed for development speed. The Symfony guys posted a good explanation on their blog to clear up what is how and why. My final thoughts: there are a lot of good frameworks to choose from; choose depending on your particular needs. But above all, choose with realistic expectations.
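When a micro-benchmark really is called for, Perl's core Benchmark module at least keeps the comparison honest and repeatable. A minimal sketch, with two made-up subs (not anything from the posts above) timed head to head:

```perl
use strict;
use warnings;
use Benchmark qw(timethese cmpthese);

# Two hypothetical ways to build the same string, timed head to head.
my $results = timethese(
    50_000,
    {
        concat => sub { my $s = ''; $s .= $_ for 1 .. 50; $s },
        join   => sub { join '', 1 .. 50 },
    }
);
cmpthese($results);   # prints a table of relative rates
```

Even here, the ranking only tells you about this one tiny operation, which is exactly the point of the talk above.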



Flock 0.9 Beta testing ended

24 06 2007

I have been part of the small group of beta testers who were invited to download and test Flock 0.9 beta for the last few weeks. Finally the carnival of the 10-mails-per-hour mailing list is quiet. The bugs are verified, listed and assigned for fixing. We were advised not to air the download link while it was under testing, but now I can invite the enthusiasts to visit the wiki that has the results of the testing phase. I would not like to forget mentioning that my name is listed on that page and will be in Flock Help Menu > “About Flock” > “Credits” > “Friends” of the final version of Flock 0.9. (Off-track: I had a hard time explaining what a “bragging right” is to an Indian programmer last week. He should read this post to get an idea.)

Flock is technically a cousin of Firefox, because both run on the Gecko engine, as do Camino, SeaMonkey, K-Meleon, and Netscape. Flock came onto the scene with many original ideas which Firefox and other browsers are planning to implement in the future. The main idea is to integrate social networking tools into the browser: blogging, uploading/downloading images to popular social networking sites right from the browser, an integrated news reader, saving web snippets, etc. Amongst these, I like the photo-stream feature most. I can gaze at colorful pictures from Flickr and Photobucket, easily filter the streams, and also upload from my hard drive by just drag and drop. I am also a fan of the news reader it has embedded in it. What I don't like is Yahoo as the default search engine; the first thing I customized was setting Google in its place.

Flock 0.9 has enhanced all these features along with adding a few new ones. The list of supported social sites is extended. Video streams from YouTube and Truveo are added to the media-streams array. The look is prettier than any other browser's, and a new home page feature called My World is introduced. It aggregates your favorite sites, feeds and media streams in a customizable homepage, reusing the otherwise redundant “home” button. There are three buttons (RSS, media-stream and search-engine detectors) in the address bar that become active depending on the metadata or content of a web page. They have kept their creative flicks alive, to say the least. I am very excited about the release candidates now, with the requested features in place. Here is a sneak peek of 0.9 beta with the default home, or My World, page:

Flock n Roll…



A research on Web 2.0 webserver demographics

8 06 2007

Background: I had an interesting chat with one of my friends about what technology Web 2.0 uses and promotes. Does it break any trend, or has it at least started to break any, in the field of web servers? So for the welfare of science, technology and humanity, I decided to do some research: I could knock on all the Web 2.0 websites, learn their web server info, and analyze it to find out whether there is any significant visible trend. Alexa pays their researchers a lot of money to do this type of research. Anyway, I am doing it for free this time.

Plan: So I had to write a web crawler that somehow makes a list of all the Web 2.0 sites and knocks each of them on the head (pulls the HTTP header only) to learn what web server it uses. I chose eConsultants.com to create the list. They have a list of all Web 2.0 sites, as quite a few sites do, as you may know. And they are easier to crawl, as they are less bloated than the others (and I cannot pull data from websites made with Flash).
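The header-only knock boils down to reading the Server line of the HTTP response. This is a minimal illustration, not the actual crawler: the sub below just extracts that field from a raw header string (with LWP::UserAgent you would get the same field via `$response->header('Server')` after a HEAD request):

```perl
use strict;
use warnings;

# Extract the web server name from a raw HTTP response header.
sub server_of {
    my ($raw_header) = @_;
    return ($raw_header =~ /^Server:\s*(.+?)\s*$/im) ? $1 : 'unknown';
}

# A sample raw header, as a HEAD request might return it.
my $header = join "\r\n",
    'HTTP/1.1 200 OK',
    'Date: Fri, 08 Jun 2007 12:00:00 GMT',
    'Server: Apache/2.2.3',
    'Content-Type: text/html';

print server_of($header), "\n";   # Apache/2.2.3
```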

Research: So crawl I did, and found a list of 1269 sites in total. So now you know how many Web 2.0 websites there are. Hang on a minute! What makes them qualify as Web 2.0 sites? Let's leave that responsibility with eConsultants.com. But if you want to know what I think about Web 2.0, here you go; I found these comments on a Digg story:

Digger X: What exactly is Web 2.0? What kind of features can I expect? I keep hearing about this buzz, but I’m not sure exactly what it is.
Digger Y: web 2.0 is a new buzz word that will allow startups to get funding again if they can tag themselves as web 2.0 If your website has gradient colors and uses ajax you’re already web 2.0 baby!!!

Okay, so our minds are clear again. Here is the list of Web 2.0 websites. Another web crawler retrieved the web server info from all these sites and created a Web 2.0 list with web server info. Then a text analyzer program I wrote made me a sorted list of all the web servers.
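The analyzer step is simple tallying. A hedged sketch of the idea (the sample server list here is made up for illustration, not the real crawl data):

```perl
use strict;
use warnings;

# One entry per crawled site: the value of its Server header.
my @servers = ('Apache', 'Microsoft-IIS', 'Apache', 'Lighttpd', 'Apache');

# Tally each server name, then print by count, descending.
my %count;
$count{$_}++ for @servers;
my $total = @servers;

for my $server (sort { $count{$b} <=> $count{$a} } keys %count) {
    printf "%d (%.2f%%) ==> %s\n",
        $count{$server}, 100 * $count{$server} / $total, $server;
}
```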

Result: The final list is not that big, so I can post it here:

725 (57.13%) ==> Apache
176 (13.87%) ==> Microsoft-IIS
173 (13.63%) ==> unknown
52 (4.10%) ==> Lighttpd
37 (2.92%) ==> Apache-Coyote
25 (1.97%) ==> Mongrel
10 (0.79%) ==> nginx
7 (0.55%) ==> Zope
7 (0.55%) ==> Jetty
6 (0.47%) ==> GFE/1.3
6 (0.47%) ==> LiteSpeed
6 (0.47%) ==> Resin
4 (0.32%) ==> Oversee Webserver v1.3.18
3 (0.24%) ==> GWS/2.1
2 (0.16%) ==> AOLserver/4.0.10
2 (0.16%) ==> Apache-AdvancedExtranetServer
2 (0.16%) ==> SWS
2 (0.16%) ==> Zeus
1 (0.08%) ==> Web Crossing(r)
1 (0.08%) ==> Juniper Networks NitroCache/v1.0
1 (0.08%) ==> Japache/2.2.4
1 (0.08%) ==> AZTK – dido
1 (0.08%) ==> Web Server
1 (0.08%) ==> TwistedWeb/2.2.0
1 (0.08%) ==> JoyWeb 1.0b1
1 (0.08%) ==> Server
1 (0.08%) ==> LuMriX
1 (0.08%) ==> JWS 1.2
1 (0.08%) ==> Lotus-Domino
1 (0.08%) ==> mfe
1 (0.08%) ==> netvibes.com
1 (0.08%) ==> Concealed by Juniper Networks DX
1 (0.08%) ==> Sparky
1 (0.08%) ==> Sun Java System Application Server Platform Edition
1 (0.08%) ==> Mittwald HTTPD
1 (0.08%) ==> Yaws/1.65 Yet Another Web Server
1 (0.08%) ==> igfe
1 (0.08%) ==> Phillips Data v1
1 (0.08%) ==> bsfe
1 (0.08%) ==> SimplyServer 1.0
1 (0.08%) ==> DMS/1.0.42
1 (0.08%) ==> Sun-ONE-Web-Server/6.1

You can compare it with Netcraft's research results for all web servers. I won't claim that mine is a very accurate piece of research (it does have some rubbish data), but it shows the picture more or less. You can see that Mongrel gets a greater share here than in the general list; at least those are Rails sites (does anyone use Ruby without Rails as a web platform?). I am happy to see Lighttpd (lighty) getting a spot in the top 4. If any web server goes up that list significantly in the near future, it will be lighty. The “unknowns” listed in 3rd place did not give any web server info in their HTTP headers, so let us assume they have the same demographics as the visible ones.

And Apache is the leader by far, with IIS being the second biggest web server on the market.

I would be happier if more info could be retrieved this way. I will definitely try to learn more about the web servers here I had not heard of before. I would also love to take part in any research project of a similar sort in the future. Finally, if you have any suggestions or critiques to make this research better, or would just like to let me know your appreciation, please drop me a line.

Update: I just found that the primary list had some duplications. Too bad. I have updated the post with the new data, and replaced the bad list files with new ones too.



Steve Jobs and Bill Gates interview video

3 06 2007

Steve Jobs and Bill Gate$ were interviewed together at the D5 conference on 30th May, 2007. This is the highlight video. You can catch the complete 7-part interview on YouTube.

And this is a cartoon about Jobs and Gates…

Perl commandline tool for reading google groups posts

3 06 2007

This is a little command-line tool I wrote in Perl to help myself read the posts in Google Groups. It fetches the thread URLs with the latest posts and opens them in Firefox tabs. I use it from the Cygwin command prompt on my Windows machine. You can easily customize it to your needs. Long live open source.

Why did I create it: I use a feed reader to read the posts of the groups I subscribe to on Google Groups. To read the posts, I click on the links in the summary/description of each post in my reader. Then I reach a page with only one post, and then I click another link to reach the thread. I find doing that a few times every day really painful. So this one will keep my sanity intact for now.


# example usage:
# browse.pl
# browse.pl perl.beginners
# browse.pl comp.unix.shell
# (comp.lang.perl.misc is the default group)

use strict;
use warnings;
use WWW::Mechanize;

my $browser_path = '/cygdrive/c/Program\\ Files/Mozilla\\ Firefox/firefox.exe ';
my $group_name = 'comp.lang.perl.misc'; # default group, used if you don't provide one as a parameter
$group_name = $ARGV[0] if @ARGV;        # user-specified group, from the command line
my $url = 'http://groups.google.com/group/'.$group_name.'/topics?gvc=2';
my $limit = 10; # maximum number of threads to open

print "Group: [$group_name]\n";
my $m = WWW::Mechanize->new();
print "Getting $limit links of threads...\n";
$m->get($url);
die "oooops! could not load main page\n" unless $m->success;
my $html = $m->content();
#print $html;

my @links = ($html =~ m{<td><a href="(/group/$group_name/browse_thread/thread/[^/]+/[^/]+)#[^/]+">}igs);
if (@links) {
	@links = map { 'http://groups.google.com' . $_ } @links;
	my @links_limited = splice @links, 0, $limit;
	my $url_string = sprintf (qq/"%s"/, join q/" "/, @links_limited);
	#print $url_string . "\n";
	print "Opening browser...\n";
	system ($browser_path . $url_string . ' &');
}
else {
	print "oooops! no thread links found on that page\n";
}

exit 0;

Note: There are better ways to do this. Using HTML::TreeBuilder or HTML::Parser would be more standard. But I like to write regexes by hand ( 😉 actually, that's the main reason I bothered writing this script).
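For anyone who prefers the standard route, here is a hedged sketch of the same link extraction with HTML::TreeBuilder (a CPAN module; the sample HTML below is made up for illustration):

```perl
use strict;
use warnings;
use HTML::TreeBuilder;

my $html = <<'HTML';
<td><a href="/group/comp.lang.perl.misc/browse_thread/thread/abc123/def456#msg1">A thread</a></td>
<td><a href="/elsewhere">Not a thread</a></td>
HTML

my $tree = HTML::TreeBuilder->new_from_content($html);

# Keep only the anchors whose href looks like a thread link.
my @links =
    map  { 'http://groups.google.com' . $_->attr('href') }
    $tree->look_down(
        _tag => 'a',
        href => qr{^/group/[^/]+/browse_thread/thread/},
    );
$tree->delete;   # free the parse tree

print "$_\n" for @links;
```

The parser survives markup changes (attribute order, whitespace, extra tags) that would silently break a hand-written regex.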

Click here to download the script.



An article about SQL injection

2 06 2007

This is a must-read for anyone who has ever coded an application that uses a database. The title says it all:

SQL Injection Attacks by Example

Read on…
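To illustrate the point in Perl terms: building SQL by string interpolation lets the input rewrite the query, while DBI placeholders keep data as data. A minimal sketch (the table and column names are made up; the DBI lines are shown but not executed, since they need a live database handle):

```perl
use strict;
use warnings;

my $name = "x' OR '1'='1";   # a classic injection payload

# Vulnerable: the input is pasted straight into the SQL text.
my $bad_sql = "SELECT * FROM users WHERE name = '$name'";
print "$bad_sql\n";
# SELECT * FROM users WHERE name = 'x' OR '1'='1'
# The WHERE clause is now always true, so every row matches.

# Safe: with DBI, use a placeholder and pass the value separately.
# my $sth = $dbh->prepare('SELECT * FROM users WHERE name = ?');
# $sth->execute($name);   # the value is bound as data, not parsed as SQL
```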

