Have an interesting project you want to talk about?

It's that time of the year again and we're having Ubuntu Developer Week soon (Feb 28 to Mar 4).  This time around we're having Lightning Talks as the last session (thank you Mike for the suggestion!).  If you've built an interesting app on Ubuntu, we'd love to have you talk for about 5 minutes about your app and why you made it.  You can add in a little technical info about it, say what help you could use, and provide a link to it!  Add your name at the bottom of this list if you're interested!

We also have 2 suggestions for talks about Debian:

QA uploads, Non-Maintainer Uploads, joining teams, and committing things on $VCS_of_choice

How to co-ordinate with Debian, a how-to for forwarding patches and bugs, and other Ubuntu-specific Debian things like how to use reportbug

If you’re interested in taking one of those 2 sessions, add your name to an empty slot in the timetable!

Working with Google Maps

There was a time when there was a huge maps craze.  It has since passed, but Google Maps remains the most recognized map application on the internet.  Recently, I worked with the Google Maps API for a client.  This post is a retrospective look at how it went.  I've not worked with other map systems, so I cannot compare my experience.

The task at hand was to create a store locator that would take an address as input and plot all the points on a map that were within 100 miles of the given location.  A fairly simple map application, except I decided to innovate.  My first stop was the articles page on the Google Maps API Reference.  I found a very handy tutorial which was exactly about creating a store locator; wow, that made my work much easier.  What I found most helpful from that tutorial was this formula:

SELECT id, ( 3959 * acos( cos( radians(37) ) * cos( radians( lat ) ) * cos( radians( lng ) - radians(-122) ) + sin( radians(37) ) * sin( radians( lat ) ) ) ) AS distance FROM markers HAVING distance < 25 ORDER BY distance LIMIT 0 , 20;

That is the heart of the entire module.  That formula returns the markers that are within 25 miles of the point with coordinates (37, -122).  What little complexity the application has lies in fetching data from the database using PHP (or another server-side language) and passing it into a JavaScript function.  The tutorial that I was looking at used XML to pass data to the JavaScript function.  That is of course nice, but I was a bit lazy and a bit innovative.
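To sanity-check the query by hand, the same spherical-law-of-cosines math can be sketched in JavaScript.  This is my own illustration, not from the tutorial, and the function name is made up; 3959 is the Earth's radius in miles, so swap in 6371 if you want kilometres.

```javascript
// Great-circle distance between two points, mirroring the SQL above.
// Not from the tutorial; just a sketch for checking results by hand.
function distanceInMiles(lat1, lng1, lat2, lng2) {
    var rad = function(deg) { return deg * Math.PI / 180; };
    // 3959 = Earth's radius in miles, the same constant as in the query
    return 3959 * Math.acos(
        Math.cos(rad(lat1)) * Math.cos(rad(lat2)) *
            Math.cos(rad(lng2) - rad(lng1)) +
        Math.sin(rad(lat1)) * Math.sin(rad(lat2))
    );
}
```

Any marker for which this returns less than 25 would survive the HAVING clause in the query.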

In my quest for something better, I discovered JSON.  Now, this seemed simple enough since it is 2010 and most languages have JSON support, including PHP.  So, I put all the results into a hidden textbox as JSON and wrote a JavaScript function that would execute on window load.  Using that information, I could then loop through it and mark points on the map.
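The round trip looks roughly like this.  The store rows here are made up for illustration, and I'm using the browser's native JSON object where the actual code used jQuery's $.parseJSON; the result is the same.

```javascript
// Hypothetical rows, standing in for what the server side pulls from the database
var rows = [
    { name: "Store A", lat: 37.7749, lng: -122.4194 },
    { name: "Store B", lat: 37.8044, lng: -122.2712 }
];

// Server side: serialize and drop the string into the hidden textbox's value
var hiddenValue = JSON.stringify(rows);

// Client side, on window load: read the textbox back and parse
var mapPoints = JSON.parse(hiddenValue);
```

Each element of mapPoints then has the lat, lng, and name fields the plotting function needs.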

jQuery, being an awesome library, provided a means for me to do exactly that.  I could loop through each of them and plot points on the map quite painlessly.

function markOnMap(x, y) {
    var geocoder = new google.maps.Geocoder();
    var latlng = new google.maps.LatLng(x, y);  // center the map on the coordinates of the searched address
    var myOptions = {
        center: latlng,
        zoom: 8,
        mapTypeId: google.maps.MapTypeId.ROADMAP,
        mapTypeControl: false
    };
    var map = new google.maps.Map(document.getElementById("map_canvas"), myOptions);
    var markers = document.getElementById('marker').value;
    var mapPoints = $.parseJSON(markers);
    var marker = [];
    var i = 0;
    $.each(mapPoints, function() {
        var pointLatlng = new google.maps.LatLng(this.lat, this.lng);
        marker[i] = new google.maps.Marker({
            map: map,
            draggable: true,
            position: pointLatlng,
            content: '<b>Name : </b>' + this.name  // store the content inside the marker; very important
        });
        google.maps.event.addListener(marker[i], 'click', function() {
            var infowind = new google.maps.InfoWindow({
                content: this.content  // 'this' is the clicked marker
            });
            infowind.open(map, this);
        });
        i++;
    });
}

(WordPress seems to gobble up the indenting, so if you want the code, it's on pastebin.com)

When using infowindows, it's very important that the content is stored inside the marker and then used to pop out the infowindow; that's the only way that works.  I spent about 5 hours figuring that one out.

Time flies

The past few weeks have forced me to think carefully about how I spend my time.  Deadlines at work inching closer cut into my volunteering time.  However, this has had a positive effect overall.  I've been able to sit back and think about how I spend my day.  I've noticed that most of the time, I just start my day without thinking or planning what I intend to do during that day.  In the last day or two though, I've been writing down what I wish to accomplish before I go to bed (note: not before night, but before I sleep :D).  I put down every single small item and cross it off before the end of the day.  I've seen that I'm more productive this way.  The most pleasure comes when I get to cross an item off the list.  It might well be a very small item like

Fill up gas before going to work

But when I do cross it off the list, the feeling is just great!  Being a geek, I've tried geeky solutions like Tomboy Notes, but I've found nothing works as well as good old paper and pen.  I tried prioritizing, but I ended up spending time classifying stuff as priority and non-priority, which further forced me to think in terms of "Will I get this done today? If I don't intend to, I'm not writing it in here."  This way my list is trimmed down to stuff I really want to do today, not a wish list of things I want to do in the next 10 years.

What about you? What methods do you use to manage your time?

Progress Report

Over the last week, we had David Futcher, who was on a week's internship with Canonical.  Jono assigned him to the Reviewers Team.  He totally rocked the week.

David did a thorough review of the docs with a good number of changes.  David and Daniel Holbach conducted a classroom session with good participation.  Daniel and Adnane Belmadiaf also got together to make a new pretty-looking meter.  It uses JavaScript and we'd be very glad if you put it on your websites and blogs.  (WordPress doesn't allow me to put it on mine, sigh.)  David also wrote a supybot plugin to print out the number of patches in the review queue.

This weekly progress report will check if we're on target to finish reviewing all the current patches by the Maverick release.  I calculated the number of days that we have from the start date to the end date.  The operation runs from June 4th to October 10th, clearly fewer days than I expected when I set the target of 15 bugs per day.  There are only 128 days for us to operate.  Not that we won't be reviewing patches after that, but the goal is to get the queue down to 0 by the Maverick release.  (No! I'm not dreaming!)

15 bugs per day makes us fall badly short; it only gets us to 1920.  So, I'm announcing an increase of the daily target from 15 bugs per day to 20 bugs per day.  That should keep us on target and deal with the new patches that come in every day.

20 bugs per day means we should have reviewed at least 200 bugs by now for us to be on target.  As of writing this post, we've reviewed 388 bugs, leaving only 1564 bugs to go!  Yes, we rock!
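For the curious, here's the arithmetic behind those targets as a throwaway sketch, using only the figures from this post:

```javascript
// Figures from this post: 128 operating days, 388 reviewed, 1564 remaining
var days = 128;
var total = 388 + 1564;        // 1952 patches in all
var atOldRate = 15 * days;     // 1920: falls short of 1952
var atNewRate = 20 * days;     // 2560: covers the queue, with room for new patches
```

So the old target of 15 per day could never have emptied the queue, while 20 per day leaves a margin of about 600 reviews for patches that arrive along the way.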

Even though, we’re well on our way to meet the target, we need your help! Even 1 patch per day helps us.  When you have free time, please help patch review.

PS:  When a patch moves out of the review queue, that doesn't mean the patch is integrated; it means that we have done a review and it's either in the upstream tracker or the Debian BTS, or the patch wasn't good enough and needs work.

UDS-M Day 5

Phew, I finally get down to writing the day 5 overview, a few days after UDS.  Generally, I write the previous day's blog post the next day.  After day 5 though, I had to get to work (yeah, on a Saturday).  On Friday, I decided to tackle my power trouble by going outside during the hours that I knew in advance I wouldn't have power.  Overall, a good idea, but they decided to cut power at different times.  Sigh.

First thing in the morning was a call with Daniel Holbach to discuss the Cleansweep Project.  Skype kinda gave us trouble and we ended up using Facebook chat to discuss stuff.

Community Roundtable
A morning round-up of all the community stuff, including what we have coming up ahead.  My memory is faint about what we talked about, but I vaguely remember everyone summing up the week and the progress that was made.  Also, someone was playing music from Benjamin's laptop, which included the Titanic song.  Fun times 😉

Ubuntu Women Session
A session I didn’t want to miss.  This session was very goal oriented from all the other sessions.  I liked the mentorship discussion and revival of the whole thing.  I’ll probably sign up to be a mentor.  I’ve already helped a few friends that I know through UW in other teams like Bug Squad.  The idea was not to replace the other mentorship options but to work with the others and to give a list of folks on the UW wiki who can be contacted for particular stuff.

I decided to take a break from the next session to plan for Operation Cleansweep, a project that I have volunteered to coordinate.  I put up wiki pages and came to the realization that we needed more time to get things together.  I'd rather wait and have a proper start with all the documentation ready than rush it.  I pinged Daniel and we decided to postpone the start date to May 24th, 2010.

Lightning Talks
As usual, James Tatum rocked us with pictures of the slides, since most of the talks used slides.  I forgot a lot of them, but the ones that rocked included one by Jonathan from the Launchpad team about 'How to be an evil overlord' or something to that extent, Popey's Momubuntu talk, James Westby's talk about launchpadlib (and yes, try try try until you succeed), a talk from the Google Chrome guys about how speed matters, Chris Johnston talking about Classbot, Alan Bell about Etherpad (we overloaded the pad 😉 ), and more that I've forgotten.  I'll wait for the videos.

Travis Hartwell talked about how he wanted a way to pull the source for all the dependencies of a package with one command instead of typing out many different commands.  I was pretty sure sed or awk could do something coupled with apt-cache.  My sed-fu is pretty weak, so I asked my good friend Mackenzie Morgan, who wrote something up for this.  Travis, this one's for you, buddy:

apt-get source $(apt-cache depends gwibber | awk '/Depends/{ print $2 }')

That command will get you the source of all of Gwibber's dependencies.  You can change the package name to get the source of the dependencies of any package.  The source will be downloaded into the current directory you run it from.  Perhaps someone could make the whole thing prettier, but hey, this is a start 🙂  Thanks again maco!

Advocate the use of daily builds
One of the projects that Daniel Holbach has been assigned for this cycle.  It's been given high importance, and I realize why.  A daily build means every time you write new code, it will be built for you, and a whole lot of folks can test it for you and give you bug reports.  Various improvements to LP were discussed, including a rollback option among others.

Ubuntu News Team
Amber is the chief editor of the Ubuntu Weekly Newsletter, so I attended this one hoping it would be interesting, and it was!  A lot of discussion about unifying teams, etc.  There was a thought of doing away with the Fridge, which I stopped right away.  Reminding you folks again, we WANT the Fridge!  Well, it wasn't a serious consideration, just a thought someone had.  All in all, they made some tough calls, which will happen internally.  Also, the Fridge is going to be on WordPress soon, which should help make a lot of things easier.  I don't remember who, I think Joey, will be working with the Design Team on a new theme for the Fridge.

Closing Session
Finally, UDS comes to a close.  Everyone had great fun for a week and did lots of work.  Most people were tired and close to burnout (yeah, from all the staying up late in the bar or out partying 😉 ).  Seriously, it was tiring.  Even from remote, I was burned out.  The last 2 days I've been so tired.  Hopefully I can recharge this week.

All the track leads summed up their tracks.  Important stuff included Robbie confirming that 10.10.10 could be a release date, pending TB approval.  He talked about how much time each cycle has had, and it seemed okay.  The Jaunty cycle only had 25 weeks, so for 10.10.10 we'll have only 23 weeks, and it seems possible.  Scott talked about btrfs and how it may be the default option for Maverick.  Keyword there being 'may'.  Scott blogged about what needs to happen for that.  Leann summed up the kernel track decisions.  I didn't understand much of it, so I'm skipping that.  The Design, Desktop, and Cloud tracks also had small summaries which I don't particularly recall.  This is why I should perhaps write blog posts then and there.  Oh yeah, now I remember one decision from desktop: Chromium will be the default browser for the netbook edition.

Finally, Jono summed up the community track.  A huge list of summing up, most of which I think I've already written in the previous posts.  He announced Project Cleansweep.  Well, he announced it as Project Babu and explained how it was renamed to Project Cleansweep.  I wonder why I even bothered to oppose if he was going to call it Project Cleansweep a.k.a. Project Babu 😀

The final quote from Jono: 'Let's get seriously drunk, people.'  He did say he was kidding, but the tone he said it in was awesome.  Marianna arranged a treasure hunt and was given a small token of appreciation from the community for all the hard work she did over the week.  Finally, UDS is over!

Now, time to get to work.

UDS-M Day 4

I'm probably taking the blogging thing too far with 3 back-to-back posts, but whatever.  One big reason I missed out on going to this UDS was that my passport had expired and I hadn't renewed it.  I finally decided it was time to re-apply and set out to the nearest Bangalore One to get a form.

The form and my old passport

It took me 1 whole hour and around 20 km of roaming around to find the place.  Absolutely no one knew the place when I asked around.  Even a policeman I asked gave me wrong directions.  Eventually, I had done everything else on my to-do list and was on the verge of giving up when I located the place.  Turns out, it's less than 4 km from my place.  Sigh.  I circled around for an extra 10 km.  Before you mention Google Maps, yes, I tried it out there and it didn't know what I was talking about.

Overall, Thursday was frustrating in terms of power availability.  I kept getting power cuts and missing sessions.  The first half of the day was pathetic.  I lost power halfway through the community roundtable and only got back online during the Maverick Governance Changes and Needs session.  I can't believe I missed the BugSquad Roadmap!  Then I lost power again halfway through Debian Healthcheck.  Sigh.  Today, I'll just go out to some internet cafe for the first half.  I have a call scheduled, and after that I'll just go some place for 4 hours.

Community Roundtable
Another general discussion at the roundtable; I lost a bit of it thanks to the power situation.

Maverick Governance Changes And Needs
This session was very interesting.  Though I'm not on any councils, we had a lot of folks from different councils and we were exploring the possibility of working the CIVS system into Launchpad.  Jono has a task for that, and perhaps we'll have an awesome voting system by the end of the cycle.  Most of the council elections use the CIVS system.  We even used the same system for the Beginners Team Council voting.

Debian Healthcheck
This session started with Jorge introducing the good parts and the bad parts.  Zack, the DPL, was in the session and he gave some good suggestions on how to go about uploading to Debian.  A lot of packages designed for Ubuntu don't go into Debian, and Zack said in particular that they want them.  He explained how we could upload to experimental and sync from there.  We've agreed to do this.  Personally, I agreed to work on Gwibber in Debian some time back; I guess it's time to actually start working on it.  I had spoken to Ken earlier and he specifically said he was happy to help me.  Any delays are my fault and my lack of time.  As I said earlier, I lost power halfway through this session.

“Collaboration with Ubuntu” Plenary
Stefano Zacchiroli, or Zack, the current DPL, talked about collaboration with Ubuntu from the Debian point of view for the first plenary.  This was the most awesome plenary.  Zack totally changed my vision of Debian Developers.  There are more than 1000 Debian Developers, and he mentioned that though opposition to Ubuntu exists, it's a corner case.  His talk encouraged Ubuntu to collaborate with Debian all the time.  Yes, you heard me right: Debian wants us.  Uploading to Debian would give all the other distros that fork from Debian a chance to get those packages.  He also mentioned that if we have a bug and a patch, he wants Debian to get the patch too, because DMs and DDs are the people who know the package best.  A growing trend is Ubuntu developers becoming Debian Maintainers and Debian Developers!  Wow, that's interesting.  All in all, it was a very impressive talk, and I'm waiting for the videos to be uploaded so I can watch it again.  Again, special mention to James Tatum for the pictures.  We all love you James!

Photo Credit: James Tatum.

“What does this bit do?” Plenary
Scott James Remnant, a.k.a. Keybuk, talked about the plumbing layer: the past, the present, the future.  Yes, it was like sitting through 'A Christmas Carol' about the plumbing layer.  Some of the stuff flew over my head since I didn't know much about it, but it was nice listening to it all the same.  He explained upstart and how it's planned to become awesome in the coming releases.  I can't write more since I didn't understand much of it.

During this session, I was also in the Ubuntu Women Project meeting, so I was only partly listening in.

Crystalizing Project Cleansweep
I've agreed to coordinate this project.  Project Cleansweep is about cleaning up all the bugs with patches by the Maverick release.  It needs a lot of work and a lot of visibility to be successful.  Stephan and Daniel Holbach are going to be helping me, along with Brian Murray, Jono, and everyone else.  We want to get a lot of community attention for this project and use the 'buckets' that we use in the Reviewers Team to coordinate patch review for all the patches in Ubuntu.  I've arranged a call with Daniel to discuss the specific actions that we need to take.  I'll be posting more updates as time goes on.

I gave up after this.  All the roaming around in the morning and the 4 days of UDS were too much for me.  I switched off IRC, got off the audio feed, and started planning for Project Cleansweep and what needs to be done.  I hope to have a productive call with Daniel today.