Category: Technology

Farewell To An IP

Slicehost, the company, stopped existing a while ago after it was acquired by Rackspace. So the title is a bit inaccurate that way, but having kept the same VPS with them since 2008 (04/04/2008, to be precise), I still think of the server as a Slicehost box rather than a Rackspace one. It went through a couple of rebuilds in the years since 2008, but the IP remained the same. It had not hosted anything worthwhile in a long time, though; I had kept it going since I was still using the DNS management, which remained an item on the to-do list that was finally dealt with in the past couple of days.
So much of the technology stack has changed in the past 4-5 years. There is now literally an ocean of high-quality managed services available for so little that it is ushering in a new generation of companies that do only integration and value addition by connecting the dots. Once you dip your toes into those waters, either as a customer or as someone who builds the integrated solutions, the possibilities are endless. Those possibilities are also rapidly changing the world of enterprise, especially at the smaller end.
What used to cost thousands of dollars and at least a small team to build and maintain can now be put together and maintained at a fraction of the cost, if you know how to go about doing it. The playing field changes almost every 8-12 months as new technology and pricing become available in the market. This, though, may not always be a good thing, as the choices can often be quite confusing and cost is not the only determining factor. Organizations that leverage these changes early and effectively will have a huge advantage over the ones that are slow.
It should also be a worrying fact that the cycle from hyper growth to consistent growth, to plateauing growth, to declining growth is rapidly shrinking these days. Multi-billion-dollar companies are created and disappear well within a decade, while at one point it used to take close to that long just to reach hyper growth. The high-end consulting and service companies won’t have it that easy for a lot longer either.

Filed under: Technology

Data On The Move: Lava W150 + Tata Indicom 32GB Plan

If you move around a lot, being able to access data on your phone alone won’t cut it for long. There is the option of tethering your phone for that, but it eats up your phone battery pretty quickly if you are dependent on the phone for things other than using it as an access point.
After trying out various approaches — dongles, tethering — I have figured that the optimal solution is to carry one of those pocket wifi routers. They cost little, tend to be more stable than tethering your phone and save you the trouble of installing the drivers and horrible dialer software that data dongles usually require.
I had picked up the Lava W150 in November 2012 along with a Tata Indicom (Docomo for the rest of India) dongle. The device is Huawei-made and branded as Lava (as is the case with most cheap Indian phone devices these days) and runs embedded Linux.
The web admin UI is powered by the GoAhead web server and provides access to a good number of advanced options. It is not the most user-friendly experience you can have, but it does its job quite well, even if it has a bad habit of restarting everything for major configuration changes.
The device is only one part of the data-on-the-move equation, the other (and the more important part) is finding a data plan that won’t ruin you. I have a preference here for pre-paid plans as my usage is erratic and I don’t want to pay a fixed high amount for capacity that I’ll rarely use.
The golden rule with pre-paid data plan pricing is that you have to hit the road and find out from the vendors what the best available plan is. The plans that companies advertise online are often not the best ones out there; I went looking for a 30 GB for Rs 5,000 plan and found one that gave me 32 GB instead.
The other issue with picking a provider is knowing your travel pattern well. The overall coverage and quality of coverage differs from state-to-state and provider-to-provider. My strategy is to use Airtel on the phone (2G plan that has a quota of 2GB of transfer every month at Rs 149), Indicom on the pocket router and a backup on the Micromax A73 with a 1.1GB 3G plan on MTNL.
It has been a good experience overall, and with controlled usage I have used only 8 GB of the 32 GB that I am allowed. The good thing about the Indicom plan is that it is valid for a year, so at my current burn rate I can probably use it all year.

Filed under: India, Mobile Data, Technology

Samsung Galaxy S, GT-I9000: Two-Year Review

My trusted Android road-warrior — the Galaxy S — completed two years sometime late last year. I had picked up the phone as a replacement for my tough-as-nails Nokia E71, long before the platforms were burning and tablets and mini-tablets had become the rage. To say that the phone has exceeded my expectations would be an accurate statement. As my first touchscreen phone and my first Android phone, I had expected the experience to be ghastly and the phone to not last more than a year. I could not have been more wrong about all that.
The phone is currently running a stock build of Jelly Bean (Android 4.2.1) and, other than a deadboot (completely my fault, fixed at a local mobile phone repair shop with a JTAG flash), the device has been flawless. OK, not entirely: I also managed to make the camera unusable after scratching the lens cover pretty badly. In spite of Samsung trying its best to shaft its customers with all the TouchWiz madness and glacially slow firmware updates, this phone will easily go into the pages of history for reasons other than being the phone that started the thermonuclear war with Apple.
Other than being one of the best developer-supported Android handsets, what I love about the phone is how sturdily it is built. It has been dunked in water multiple times, survives falls with unfailing regularity (to the extent that I often ‘demo’ it to friends, eliciting their unparalleled shock) and has withstood my generally grubby and clumsy usage. These were the qualities that endeared the Nokias to me a long, long time ago, and I still retain the E71 (which is fully functional) as one of my backup phones.
I can say with reasonable certainty that come August 2013 I will still be using the phone, as long as it keeps going and does not get lost or stolen. The strange part is that even though the Android ecosystem has changed drastically in the two years I have had this phone, my desire to switch to a different handset has always been fairly low. It should be kept in mind that my smartphone requirements have only regressed over that period. I don’t game at all on the device, there are a few productivity apps, and it is used to play music in the car when I drive.
If you take out two important factors — a superb camera and games — Android phones across the board, all the way down to Gingerbread, can perform 90% of the remaining functions (forget NFC for the time being). I know this for a fact thanks to my second phone, the Micromax A73. I prefer shooting photos with a proper camera now and don’t use mobile phones for that purpose, and I can say that my gaming days are pretty much behind me. For a while I kept evaluating the Nexus 4, but I just could not convince myself that it was worth the premium I’d have to pay for it.
On the other hand, my regular run-ins with iOS only serve to reaffirm my belief that while it is a fine, polished OS and ecosystem, it is simply not the right option for me. Having ruled that option out, I am not sure what my next smartphone at the higher end of the market will be.

Filed under: Android, Mobile, Technology

Setting up Fat Free CRM on Webfaction

Fat Free CRM is a lightweight Ruby on Rails application that is very similar to the Highrise application by 37signals. These days I am trying to reduce my dependency on SaaS providers by hosting the applications I use on a managed service. Why I am following this particular approach is beyond the scope of this post, but I could not find any instructions online that would get Fat Free CRM running on my hosting provider, Webfaction.
Before you start, I should warn you that I know precious little about Rails other than my usage of Redmine and FatFree. If you were to ask me to help you with any issue you’d face following these instructions, the odds are that I would not be able to help you at all. So proceed at your own risk.
1. Set up a Ruby on Rails application as detailed here.
2. SSH into your host and change to your webapp’s directory
cd ~/webapps/your_webapp_name/
3. Export the environment variables as specified in the Webfaction guide:
export PATH=$PWD/bin:$PATH
export GEM_HOME=$PWD/gems
export RUBYLIB=$PWD/lib
4. Check if the correct Ruby and Gem versions are being called. It should use the app’s versions and not the ones installed by default on the server.
which ruby
which gem
5. Install some gems.
gem install bundler passenger activesupport mysql2 activerecord-mysql2-adapter
6. Get a copy of FatFreeCRM from the Git repo.
git clone https://github.com/fatfreecrm/fat_free_crm.git fatfree
7. Create a new MySQL database for the app (on Webfaction, this is done from the control panel).
8. Create the database.yml file and change the connection details in it.
cd fatfree/
cp config/database.mysql.yml config/database.yml
vi config/database.yml
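For reference, the production block of config/database.yml ends up looking something like the sketch below; the database name, username and password are placeholders for whatever you created on Webfaction:

```yaml
production:
  adapter: mysql2
  encoding: utf8
  reconnect: false
  database: your_database_name   # placeholder
  pool: 5
  username: your_database_user   # placeholder
  password: your_password        # placeholder
  host: localhost
```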
9. Edit the Gemfile to uncomment the mysql2 adapter.
vi Gemfile
10. Check the bundler and rake versions, just to be doubly sure.
which bundle
which rake
11. Run the db:migrate and ffcrm:setup tasks.
RAILS_ENV=production bundle exec rake db:migrate
RAILS_ENV=production bundle exec rake ffcrm:setup
12. Set config.assets.compile = true in production.rb
vi config/environments/production.rb
13. Edit the nginx configuration to point to ‘fatfree’ instead of ‘hello_world’
vi ../nginx/conf/nginx.conf
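The change amounts to pointing the document root at the FatFreeCRM checkout instead of the hello_world skeleton. A rough sketch, assuming the stock Webfaction Rails nginx template (the username and app name below are placeholders):

```nginx
# inside the server { } block of ../nginx/conf/nginx.conf
# was: root /home/username/webapps/your_webapp_name/hello_world/public;
root /home/username/webapps/your_webapp_name/fatfree/public;
```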
14. Restart nginx
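Webfaction's Rails stack ships with its own start/stop scripts inside the app directory; assuming that standard layout, the restart amounts to something like this (paths are illustrative):

```shell
# stop and start the app's bundled nginx from the webapp directory
cd ~/webapps/your_webapp_name/
./bin/stop && ./bin/start
```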

Filed under: Business, Start-ups, Technology

On Online Privacy, Snooping

We are gradually entering an age where the data generated by an average person about herself or himself is beyond anything we could have imagined twenty years ago. From mobile phone towers to cashless transactions, we leave trails and pointers to ourselves all over the world now. There used to be a time when there were only a handful of touch-points for non-cash transactions and everything else was done using cash. CCTVs were a rarity, phones were attached to wires rather than mobile, and there was no internet.
We have not yet grasped the degree of this change in its entirety, and the impact it has on an individual’s privacy and safety is far beyond what we can imagine today. Just take the case of the internet: everything from your DNS lookups to the requests made to another computer online is logged, and in the case of unencrypted traffic it can easily be examined too. It is not just spyware and backdoors that are a threat these days; the entire intermediary infrastructure is open to abuse by the good guys and the bad guys equally.
In a manner of speaking, privacy really does not exist and nothing is really a secret anymore. Any secure system is only as secure as its weakest link, and our basic communication infrastructure is riddled with multiple weak points. Does that mean it is wrong to expect or to ask for privacy? I don’t think so. As citizens of a free country we have every right to ask for it. More importantly, law and enforcement should not criminalize the citizens of a country by default just to have an easier time nabbing the bad guys.
Coming from that line of thinking, I see the government’s overtures to somehow censor content on the internet more as a case of having no idea about a scary new world than as an outright attempt at curtailing dissent or any such thing. As people who publish content, we often don’t have a clear handle on how to deal with things that are written on our sites by visitors, or with what is written about us on other sites. If we don’t have a clear-cut idea on this, it is easy to imagine the confusion of people in power, who have to deal with it on a much bigger scale.
I had a first-hand experience of this in 2008, when we were dragged into a case regarding some defamatory comments published on my then-employer’s website. Both parties in the case were well-meaning, but neither of them had a clear understanding of how these things work, nor did the people who were enforcing the law. I advise a law firm on issues related to technology and I find the same problem there. Both law and enforcement are dealing with issues over which they have little understanding or clarity. When you work backwards from “this needs to stop,” you are a hammer and the world is full of nails.
The risk in all of this is a blanket criminalization of anything anonymous or pseudonymous in the digital world, as these are the favourite paths taken by people who abuse its wonders. Laws are increasingly being drafted that treat abuse as the rule and everything else as an outlier. There are enough individuals and organizations willing to help governments and agencies bridge this crucial gap in understanding the inner workings of the digital world. I hope we can find a better tomorrow where we all work together to bridge the gaps and progress towards a healthier place.

Filed under: India, Technology

Different ways of manipulating views in Elgg

There are numerous ways in which you can affect the output of any view in Elgg. Some of these are well documented, while others are not. Following are the ways in which it can be done:

  1. Overload existing views
  2. Extend existing views
  3. Post-process views

If you want to override any core view completely, the overloading method is the best way to go about it.

If you only want to append code to existing views, the second method is the way you want to go.

If you want to filter or modify the final output in a selective manner, the third approach is the way to go.
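As a concrete illustration, here is how the three approaches look in a plugin's start.php on Elgg 1.8. The plugin name (myplugin), the views being touched and the replacement logic are all hypothetical, and older Elgg releases used differently named functions, so check against the version you are running:

```php
<?php
// 1. Overloading: no code needed. Ship your own copy of the view at, e.g.,
//    mod/myplugin/views/default/page/elements/header.php and Elgg will use
//    it in place of the core view.

// 2. Extending: append another view's output to an existing view.
elgg_extend_view('css/elgg', 'myplugin/css');

// 3. Post-processing: filter the rendered output of a view via the 'view' hook.
elgg_register_plugin_hook_handler('view', 'page/elements/header', 'myplugin_filter_header');

function myplugin_filter_header($hook, $view_name, $output, $params) {
    // selectively modify the final markup before it is sent to the browser
    return str_replace('Powered by Elgg', 'Powered by MySite', $output);
}
```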

Filed under: Technology