Month: January 2013

New Gmail Ad Unit

This morning, I noticed a new ad unit within Gmail. The usual practice for Google, and almost everyone else, is to have the click open a new window that shows the target URL. In this case, the ad unit opens a first level of content within Gmail, just like an email, and a further click-through takes the user to the final page.
Gmail now has four ad units on the mail content page, and Google has been fairly innovative with the formats. One of the units — the top one — starts as a band across the top and, when you scroll down, collapses into a smaller unit on the right panel, above the usual tower of four link units.
As always, it is possible that Google is just bucket-testing this unit.

Filed under: Advertising, Internet

Grokking Growth: Part I

The topic is a vast one in itself, so I’ll address only a small part of it, and that too from a business-to-consumer perspective.
Growth is one of the key drivers of a business and it is not as uni-dimensional as it is often made out to be. For example, you keep hearing about Facebook hitting a billion users, then some more after that, and so on and so forth. Does that mean that once a vast majority of the world’s population is on Facebook the company will stop growing? Obviously not. Market saturation is a business reality for all companies; it marks an inflection point that often leads to diversification or a change in strategy. Smart companies pre-empt this, change course and pursue another kind of growth, while the not-so-smart ones stagnate and expose themselves to significant risk of disruption.
 
Chasing growth is, though, quite simple for most companies (keeping aside M&A, which doesn’t happen for most). You can either:
a) Keep getting more users (usually known as the hyper-growth stage), or
b) Get more from the same users (happens after hyper growth, when market saturation has kicked in).
It is harder for companies to segment growth strategies to address both (a) and (b), as you need to deploy 2x of everything (strategy, resources, measurement) to make the hybrid approach happen, while keeping even a single strategy going is tough enough for most companies. Companies that accomplish it, though, tend to be significantly agile. A good example of a company that didn’t manage a dual strategy would be Nokia.
This dual approach can be applied to specific components of a company’s operations:
Revenue:  How do you increase revenue by getting more users? How do you get more revenue from the existing users?
Profitability: How do you increase profits by acquiring new users (net of customer acquisition cost)? How do you increase profits from the existing users?
Putting in place a growth strategy also requires a good understanding of which stage your company is at.
Stage-I: Companies that have low to moderate turnover and a revenue growth rate that at least outpaces inflation (and ideally outpaces other obvious investment options). The healthy ones tend to be debt/financing-free and privately held. They also have a low risk appetite and low profit margins.
Stage-II: Companies that have medium turnover and aim for explosive revenue growth through hyper growth. These companies carry extremely high risk and have limited runways to make the strategy work. They usually involve significant external investment and/or debt and tend to be privately held. They have a high risk appetite, and profit margins are nearly non-existent. Companies in this stage either die or make it to Stage-III.
Stage-III: Companies that have massive turnovers. Their revenue growth rate is low but predictable, and they use debt as a routine path to fund growth. These tend to be public companies with a good ability to absorb risk. Their profit margins tend to be steady, and companies like this die a slow, agonizing death when they do die.
S-I companies typically grow organically. S-II companies typically grow by spending vast amounts on customer acquisition. S-III companies accelerate growth usually through M&A or by diversification.
I’ll examine each theme in detail in a later post.

Filed under: Business, Start-ups

Facebook Graph Search: Nobody's Killing Anyone Yet

The launch of Facebook’s Graph Search has led to the expected feeding frenzy in the media. The product was long expected from the company, but what was delivered was different from what was anticipated. The hope was that the company would launch a product to take Google head-on; what was delivered was a query parser that uses a restricted dialect close to natural language, along with considerably improved result display and filtering options.
The buildup to this event was visible in the company’s stock price over the past week, which rose to its highest point in a while. Going by the reaction to the launch, the product has not been received well: the stock is down, though still holding the $30 mark it had risen to. While the stock market is hardly a good indicator of the health of a company (ask Apple), it would seem that all is not well.
The fundamental problem with the Facebook v/s Google narrative is that till Facebook starts crawling the open web, it does not represent a threat to Google as far as search goes. The same holds true in the opposite direction: till Google explicitly starts building a social network to pull people off Facebook, it does not represent a threat to Facebook. In short, Facebook Graph Search is not search as you know it, and Google+ is not the social network you use on Facebook.
Most of the divergence in the search approaches of Facebook and Google boils down to two things:
1. Intent & Context: Most of the content posted on Facebook is posted with an active intent to be consumed within Facebook. This context is vastly different from content on the open web, where the context is determined by Google using its secret sauce.
2. Universe: In Facebook, the universe of data is what is created and shared within the network of users. If something is not shared or liked by someone on the network, it does not exist in Facebook. In Google, the universe is every page that can be crawled out there.
For both companies, battling the other is not the most significant challenge they face. Google needs a framework that will, over time, reduce its dependence on open crawling (pull) and move in a direction where content publishers intentionally push data into its index. This has the additional benefit of fending off lawsuits regarding sourcing (crawling) and preferential display (using Google+ pages for local data).
For Facebook, user retention and overcoming Facebook fatigue is the big challenge. It can build many wonderful things on mobile and elsewhere, but it will all come to naught if a good chunk of its users find it no longer fun to be on the service or active on it. The company has a long way to go to de-risk the core part of its business.
Coming back to the product specifics, it will be interesting to see user reactions once the feature is rolled out across the entire user base. Natural language querying has been an interesting niche for a while. It was once considered a panacea for all search ailments, but we discovered that query parsing is only one half of a brilliant search experience; the other, most critical, part is result quality and relevance.
On Facebook, quality is going to be awesome when results are available, but I’ll wait and see how availability works out across a large spectrum of queries. The trouble with socially networked data is that the results I see may not be the results you get to see. Having designed and run a private network for a while, I can tell you that this is a significant challenge that few understand clearly.
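To make that concrete, here is a toy sketch in Python (with entirely made-up names and structures, not Facebook’s actual design) of why socially scoped search behaves this way: every candidate result has to pass a per-viewer visibility check, so the same query can return different results for different users.

```python
# Toy sketch: the same query returns different results per viewer,
# because every candidate hit must pass a visibility check.
# All names and structures here are illustrative.

posts = [
    {"id": 1, "text": "great sushi place in town", "owner": "asha", "audience": {"ravi"}},
    {"id": 2, "text": "sushi recipe I tried", "owner": "ravi", "audience": {"asha", "meera"}},
    {"id": 3, "text": "sushi photos from Tokyo", "owner": "meera", "audience": set()},  # private
]

def visible_to(viewer, post):
    """A post is visible to its owner and to anyone it was shared with."""
    return viewer == post["owner"] or viewer in post["audience"]

def search(viewer, term):
    """Match on text first, then filter by the viewer's visibility."""
    return [p["id"] for p in posts if term in p["text"] and visible_to(viewer, p)]

print(search("ravi", "sushi"))   # [1, 2]
print(search("meera", "sushi"))  # [2, 3]
```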

Filed under: Internet

India Telco Scenario: Re-living 2005 in 2013

Looks like Airtel’s move to hike the price of its 1GB 2G plan by 25% will soon be aped by other operators. And thus continues the decimation of mobile data (and, to a lesser extent, voice) in India, thanks to policies put in place by successive governments of various formations. We are now in the unique situation where 3G is too expensive to be used in any reasonable quantity by the masses, so we are now marketing 2G in 2013, which should have been the case in 2005. What a mess.
Due to the crazy amounts of money spent on 3G licenses, the telcos have no choice but to eke out every possible paisa from subscribers by hook or by crook. Even at today’s really expensive 3G data prices, the telcos won’t recover the money they plonked into the licenses, so reducing prices is no longer a feasible option for them. Many attempt it anyway by putting in place new plans with minuscule transfer limits, after which the subscriber is billed at a per-KB or per-10KB rate, which can easily throw up bills that run into the thousands.
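A quick back-of-the-envelope calculation shows how brutal per-10KB billing gets once a small cap is exhausted. The cap and rate below are hypothetical, purely for illustration; actual tariffs varied by operator:

```python
# Hypothetical post-cap tariff, for illustration only.
cap_mb = 100          # paid-for high-speed transfer limit, in MB
rate_per_10kb = 0.10  # Rs 0.10 per 10 KB beyond the cap
usage_mb = 1024       # what a smartphone can easily pull in a month

overage_kb = (usage_mb - cap_mb) * 1024
bill = (overage_kb / 10) * rate_per_10kb
print(f"Post-cap bill: Rs {bill:,.0f}")  # Post-cap bill: Rs 9,462
```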
The net result of such adventures is that subscribers get sucked into the trap one large group at a time. Such a scalding puts subscribers off these services, while the telco balance sheets brighten up a bit for as long as they can keep finding more subscribers to die on the 3G data sword. The data billing process and plans are so convoluted now that we are seeing a new level of innovation in both pricing and products (even though some may alternatively call it dishonest business practice).
Some of the innovations:

  1. Capped high-speed 3G connections with an ‘unlimited’ slower connection after the FUP (fair usage policy) kicks in, which, for some reason, also puts a billing limit in place.
  2. Cheap volume plans with extremely low validity. Basically, you can’t use all of the data you have paid for within the given time. You’re actually paying more for less, even though it looks otherwise.
  3. 3G connections that bill fallback 2G usage on a per-KB or per-10KB basis. There is no way to track this during normal usage, as the 3G tower footprint is dicey even within metros.
  4. Airtel also sells 2G plans whose transfer speed drops after the plan’s limit is reached, with ambiguous terms on how they define ‘unlimited’ after that.
  5. Allocations that are hard to exhaust: it is really hard to use even 1GB of data in a month on a 2G connection. Most customers I know underutilize that allocation every month. They were already paying more for using less; now they’ll pay even more.

What is even more troubling is that 2G cannot support bulk usage due to limited spectrum; you can already experience this in places with a high concentration of people using mobile data over 2G under the same towers. This is one of those spectacular cases that has failure built into it as a fact.
Only the government is capable of altering this terrible state of affairs, but since it is the party that came out smelling the sweetest of all those involved (the other two being the telcos and the customers), it would be foolhardy to expect it to alter course on this front. That is a real pity, as affordable 3G data had the potential to transform our internet penetration scenario. That said, we are in good company in the 3G mess: over in China, the story is no different, with 3G eroding margins for operators thanks to lousy government policy.
The story for me as a consumer is different. I spent the last three months of 2012 streamlining my connectivity scenario. I was spending an average of Rs 4000 per month on data and voice till then, and a bit of moving things around has almost halved that amount. I switched to a Rs 149 per month 2G plan on Airtel (2GB of transfer and ‘unlimited’ slow transfer after the cap), changed to a voice plan with a higher upfront payment but a lot of free minutes and messages, picked up a Tata Photon data card on a Rs 5000 plan with 32GB of transfer (valid for a year) and stuck to a Rs 1200 plan for home broadband.

Filed under: India, Mobile

Go Through This Checklist If You Want To Build A New Twitter

There is a lot of ire about what Twitter is doing to its developer ecosystem and its users these days. An oft-mentioned suggestion that comes out of it is to build another Twitter, this time one that will set right all the wrongs. Even before we get to the question of whether any product developed thus will eventually face the same problems as Twitter, we have to figure out what exactly Twitter is before we attempt to outdo it.
Most of the current ire with Twitter stems from the recent changes to the usage of the API and to how tweets and timelines can be displayed in third-party clients, but Twitter itself goes a lot beyond just an API and a bunch of clients. Competing effectively with it does not mean it is good enough to create an awesome API that developers will love. Twitter is a lot more than that.
Messaging Network: At its heart, Twitter is a massive messaging framework. It does nothing more than pick up a message, check against a set of rules who gets to see it, and dump it with the bunch of people authorized to view it. Clients can look up a user’s dump and render it. By the latest count, the platform handles 340 million tweets a day, each replicated into numerous timelines (the dumps mentioned earlier). It is not trivial to engineer even just this part of Twitter.
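As a rough illustration of that delivery model, here is a minimal fan-out-on-write sketch in Python. The names and data structures are mine, not Twitter’s internals; the point is that each message is copied into every authorized timeline at post time so that reads stay cheap:

```python
from collections import defaultdict, deque

followers = defaultdict(set)                        # author -> set of followers
timelines = defaultdict(lambda: deque(maxlen=800))  # user -> recent messages

def follow(user, author):
    followers[author].add(user)

def post(author, text):
    message = (author, text)
    timelines[author].appendleft(message)  # the author sees their own tweet
    for user in followers[author]:
        # A real system would apply the rules here: blocks,
        # protected accounts, muted users and so on.
        timelines[user].appendleft(message)

follow("bob", "alice")
post("alice", "hello world")
print(list(timelines["bob"]))  # [('alice', 'hello world')]
```

Doing this for 340 million tweets a day, with enormous fan-outs for popular accounts, is where the engineering difficulty lies.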
API: Every client (including the official web frontend) displays data consumed from the API endpoints provided by Twitter. Almost all user-facing applications built atop Twitter leverage this API, whether as web clients, standalone apps or mobile apps. Thankfully, Twitter seems to have so far steered clear of a restrictive license on the API specification itself, so replicating the API to allow existing clients to reuse their code should not be a problem. There is, though, no certainty that Twitter won’t change its mind about this, irrespective of the very problematic question of whether APIs can be copyrighted at all.
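For instance, a replacement service could expose endpoints shaped like Twitter’s public REST API so that existing client code keeps working. A minimal sketch using Flask, with the route path merely mimicking the v1.1 convention and the response trimmed to made-up sample fields:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Stand-in for a user_timeline-style endpoint; the data is fabricated
# and the response carries only a couple of illustrative fields.
@app.route("/1.1/statuses/user_timeline.json")
def user_timeline():
    screen_name = request.args.get("screen_name", "")
    count = int(request.args.get("count", 20))
    tweets = [{"id": i, "user": {"screen_name": screen_name}, "text": f"tweet {i}"}
              for i in range(count)]
    return jsonify(tweets)

if __name__ == "__main__":
    app.run(port=8080)
```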
Social Graph: Twitter, like any other large network, is as much about the finesse of the product as it is about the number of people using it. For most users, the restrictions of the API and the changes to it are inconsequential. For every developer adversely affected by the changes, there are probably a million users who could not care less. Without the users, no Twitter replacement will work, no matter how good, open or flexible it may be. If the reason you are building a new Twitter is to address the grievances of developers rather than consumers, it just will not work. Developers augment the network, but they are not the network by themselves.
Brand Partnerships: One of the reasons why Twitter has done well is the partnerships the company has built over time. The first iteration of this was the ‘verified’ badge. We often underestimate the importance of that little badge and how quickly it has been adopted, often by people who have struggled to even maintain a basic website. Historically, there is no parallel to the variety of users and brands who have come forward to claim their digital presence the way it has happened on Twitter. The same is the case with non-verified official handles. We have never before seen television channels voluntarily carry the branding of a communication platform that does not belong to them (by their usual standards, they would build a Twitter of their own and promote it; for example, NDTV Social, which now augments Twitter rather than trying to aggregate it). Twitter is gradually solidifying and extending these partnerships beyond the ‘verified’ badge with initiatives like cards. Over time, due to the authenticity (of the brand’s presence) and the reach (of Twitter), it will be really tough for a competitor to sell itself.
Mobile: This one is simple. Twitter has partnerships with 175 carriers around the world. These provide it with various advantages, including discounted rates for access over SMS compared to other numbers. In an environment where voice revenue is falling or stagnating, non-voice and data revenue is a main focus area for telcos across the world. Anyone looking to take on Twitter will have to take this into account. Another aspect to keep in mind on mobile is the money spent on supporting SMS messages: it is not cheap.
Funding: As an indirect fallout of the Facebook IPO debacle, the valuation bubble in pre-IPO companies is now taking a bit of a beating. When hypergrowth alone does not perk up a company’s potential valuation, it limits the ability of companies to raise money and stay true to their core ideals.
Revenue: For ads to work on services like Twitter, you need scale. For scale, you need a massive number of users using the service regularly. At $5 per user per month, you won’t get anywhere with user acquisition. How do you eventually scale this business?
Conclusion: If you can address at least some of the above points you could probably wind up successfully building a new Twitter. The odds are that nobody is going to be able to do that.
 

Filed under: Internet, Social, Start-ups

Review Of Prismatic

Wavii and Prismatic are two of the latest warriors on the perilous battlefield of automated social content discovery and recommendation, and over the past few weeks I have grown quite fond of using Prismatic. The domain of automated content discovery has seen much money and effort invested in it, with companies like Evri/Twine, SocialMedian and Summify either shutting down or being acquired into larger products, to be integrated as smaller features or as talent acquisitions. So it is surprising to see even more resources being plonked into a domain that has repeatedly proven to be plumbing with no real scope as a consumer-facing business.
One reason why this keeps happening is that nobody has successfully cracked this space by exposing the underlying technology as a useful, simple service rather than as something inherently nerdy. Recommendation engines that work on unstructured text require a mixture of content crawling, classification and content clustering to work right. Each of those three aspects is hard to crack by itself, requiring a lot more than your average web development chops. Together, they are a nearly unattainable holy trinity. There is a good reason why so many companies and smart people have failed at it. That is also one of the best reasons to have a go at it again.
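As a taste of why the trinity is hard, even the simplest usable version of just the recommendation leg involves turning unstructured text into vectors and ranking by similarity to what a user has clicked. A toy sketch with scikit-learn, using invented sample data; real systems need crawling, classification and far better clustering on top of this:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Candidate stories (in reality: crawled and classified beforehand).
docs = [
    "new javascript framework released for building web apps",
    "startup raises funding for mobile payments in india",
    "recipe for the perfect winter stew",
    "telecom regulator reviews 3g data pricing",
]
clicked = ["mobile data pricing hurts 3g adoption in india"]  # user's click history

vectors = TfidfVectorizer(stop_words="english").fit_transform(docs + clicked)
scores = cosine_similarity(vectors[-1], vectors[:-1]).ravel()

# Rank candidates by similarity to the click profile.
for score, doc in sorted(zip(scores, docs), reverse=True):
    print(f"{score:.2f}  {doc}")
```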
Then there is the matter of RSS readers, another domain that has seen many a brave product, person and purse eventually call it quits, with not a single big product from five years ago still alive in 2013. You can, justifiably, argue that Google Reader is still alive in a somewhat whittled-down form, but even in its heyday the product could not grow into the mainstream. RSS is inherently plumbing for the connected web. It is not meant to be consumed by humans. Yet the entire workflow around RSS was built on human intervention. There is a good reason why it never took off.
The reason I like Prismatic a lot is that it executes this holy trinity really well. The on-boarding is ridiculously simple: you hook up your Twitter account (in my case, since I am not on Facebook) and it figures out a list of things you like based on your profile. It presents you with a list of links to read right when you log in. There is no “hey, check back in 10 hours, once we’ve crawled content for you” in its case. For a non-technical audience this is crucial. Even more crucial is the fact that you don’t need to curate anything at all; it figures out what you like by tracking what you click and open.
Comparatively, Wavii is a more nerdy experience. The on-boarding is nowhere close to as polished as Prismatic’s, and even after a while on the service it leaves me quite confused. Consumer-facing applications cannot afford to appear convoluted and complicated, especially in the iOS era. Wavii is much more of an alpha/beta product than Prismatic, but that is understandable, as Prismatic is better capitalized and has been around for quite a while, with some top-notch talent trying to solve the hard problems associated with getting the product right.
In my opinion, Prismatic is the next step from Google Reader. Underneath the shiny bits, Prismatic polls RSS feeds, crawls updated pages, classifies the information, generates content summaries, titles and images, and presents the story to you as a simple news item. But, as a regular user, you are not exposed to any of those shenanigans. Instead, all you get is an endless stream of news and information with an exceptionally high degree of relevance and zero active input from you.
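The polling leg of such a pipeline is the easy part; here is a minimal sketch using only the Python standard library, assuming a generic RSS 2.0 feed (the URL is a placeholder). The classification and summarization stages layered on top are where the real difficulty lives:

```python
import urllib.request
import xml.etree.ElementTree as ET

def poll_feed(url):
    """Fetch an RSS 2.0 feed and return its items as plain dicts."""
    with urllib.request.urlopen(url) as response:
        tree = ET.parse(response)
    return [
        {"title": item.findtext("title", ""), "link": item.findtext("link", "")}
        for item in tree.iter("item")
    ]

# Downstream stages (classify, summarize, de-duplicate) would consume these.
for story in poll_feed("https://example.com/feed.xml"):
    print(story["title"], "->", story["link"])
```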
If I had money to invest, I would certainly invest a fair chunk with Prismatic. I am still not sold on the idea that niche products like these can stand on their own two legs and scale into the hyper growth phase. But I am pretty certain that should the team choose to sell in the future, the exit would be fairly large for everyone involved and deservedly so too.
Disclosure: I have no connection with either product, directly (other than a couple of bug reports) or through my clients.

Filed under: Internet, Media, Social, Start-ups

Samsung Galaxy S, GT-I9000: Two-Year Review

My trusted Android road-warrior — the Galaxy S — completed two years sometime late last year. I had picked up the phone as a replacement for my tough-as-nails Nokia E71, long before the platforms were burning and tablets and mini-tablets had become the rage. To say that the phone has exceeded my expectations would be an accurate statement. As my first touchscreen phone and my first Android phone, I had expected the experience to be ghastly and the phone to not last more than a year. I could not have been more wrong about all that.
The phone is currently running a stock build of Jelly Bean (Android 4.2.1) and, other than a deadboot (completely my fault, fixed at a local mobile phone repair shop with a JTAG flash), the device has been flawless. OK, not entirely: I also managed to make the camera unusable after scratching the lens cover pretty badly. In spite of Samsung trying its best to shaft its customers with all the TouchWiz madness and glacially slow firmware updates, this phone will easily go into the pages of history for reasons other than being the phone that started the thermonuclear war with Apple.
Other than being one of the best developer-supported Android handsets, what I love about the phone is how sturdily it is built. It has been dunked in water multiple times, survives falls with unfailing regularity (to the extent that I often ‘demo’ this to friends, eliciting their unparalleled shock) and has withstood my generally grubby and clumsy usage. These are the qualities that endeared the Nokias to me a long, long time ago, and I still retain the E71 (which is fully functional) as one of my backup phones.
I can say with reasonable certainty that come August 2013, I will still be using the phone, as long as it keeps going and does not get stolen or lost. The strange part is that even though the Android ecosystem has changed drastically in the two years I have had this phone, my desire to switch to a different handset has always been fairly low. Keep in mind that my smartphone requirements have only regressed over that period: I don’t game at all on the device, there are a few productivity apps, and it is used to play music in the car when I drive.
If you take out two important factors — a superb camera and games — and forget NFC for the time being, Android phones all the way down to Gingerbread can deliver 90% of the functionality across the board. I know this for a fact thanks to my second phone, a Micromax A73. I prefer shooting photos with a proper camera now and don’t use mobile phones for that purpose, and I can say that my gaming days are pretty much behind me. For a while I kept evaluating the Nexus 4, but I just could not convince myself that it was worth the premium I’d have to pay for it.
On the other hand, my regular run-ins with iOS only serve to reaffirm my belief that it is a fine, polished OS and ecosystem, but simply not the right option for me. Having ruled that option out, I am not sure what my next smartphone at the higher end of the market will be.

Filed under: Android, Mobile, Technology

The Price Is Right: YouTube Versus Labels

The Guardian has an interesting story on how record labels are learning how to make money from YouTube.
Five years ago, this is what I had written in ‘Youtube: The future of music distribution?’:

Now, if Youtube were to give the record companies a fixed amount of money (they had set apart $500 million for copyright litigation-related costs in escrow), for legally playing out music/music videos and if that fixed amount of money is higher by even a cent, compared to what the industry might make legally a few years down the line, who can honestly complain about it?

Since I wrote that many moons ago, I have been quite convinced that the labels’ fight with YouTube was less about copyright and licenses and more about the right price. The internet has not much changed the manner in which music is created; what it has ripped to bits is how discovery and distribution work. The stark truth for the labels is that in trying to keep the old model going for as long as possible, they have lost their place at the top of the game as far as distributing music goes.
Another way to ask the question is: where is the Hulu for music for record labels?
With iTunes and YouTube they have given up a key point of leverage and that horse has long bolted. The situation is largely beyond repair with the limited imagination the industry has. So the only option left is to cut overheads, haggle over the price and ride the wave.
If the price is right, they could not care less whether the clip or stream with the largest number of eyeballs is the official one or not, as long as they are getting paid for any stream or clip (official or user-generated) with their artist in it. In fact, it is more of a headache for YouTube, as official channels can get it better ad revenue than the same clip playing on an unofficial channel.
To a great extent, this change is already visible on YouTube through two developments. First, YouTube’s fingerprinting service has of late improved dramatically; it now builds artist pages and playlists from official and unofficial channels. Secondly, it is aggressively pruning non-official versions of various clips, leaving only what the copyright holder wants to show available to viewers.
For users, this has one major downside. In a manner of speaking, YouTube was the true successor to Napster in the breadth of content it used to host. You could find some really obscure clips on it earlier. As the cleanup continues, that advantage will slowly fade away. On the other hand, products like Grooveshark, Spotify, Gaana and Saavn are making it increasingly easy to find and consume legal content, thus removing one of the major cases for piracy: convenience.
For the labels, eventually, it is not that bad a deal. It is incredibly complicated to run a digital operation that manages both geo-restriction and monetization effectively. Given enough time, a new product is eventually bound to emerge that will do that better than a YouTube or an iTunes store. Till that happens, they can cut costs, make more money and let others do all the hard work.

Filed under: Advertising, Industry, Internet

Avoiding Entrepreneurial Meltdowns

One of the inevitable side effects of being an entrepreneur is that the first few years of trying to be one find you never tuned out from a state of being always tuned in. It starts with the extremely long working hours when you toil mostly on your own, and later switches to managing people, business relationships and firefighting.
The lucky few get to take a break from this state of affairs maybe once a year and disappear into a zone of no phones or email, but most don’t get/take that opportunity for years to come. For most of us it is hard to get the mind to shut down even on a break. It is hard to not think about cashflow, growth opportunities, potential leads and product lines all the time.
The problem is worse for those who are responsible for keeping services running 24/7. Being at work, actively and passively, may be a glorified way of working, but it has detrimental effects that accumulate over time. Having done that for a good four years, I often found it hard to focus well for extended periods. Any attempt to take a break from doing, or trying to do, or thinking about doing, breaks open the dreaded can of worms called guilt: popular culture says that entrepreneurs cannot take it easy.
Flameouts for entrepreneurs are often neither absolute nor immediate. It is a slow wind-down: like the companies they run, especially the ones not doing well, they remain alive only in appearance while inside they are long gone. Being an entrepreneur takes, more than anything else, a sense of absolute conviction in yourself, and part of keeping the game face on is not letting anyone know you no longer possess that sense.
In the early years I would always aim for an absolute break – time off from everything and everyone related to work. But by the time the break would come I’d be exhausted beyond words and most of the break would be spent dreading the eventual return to work. Then, by mistake, I discovered taking a couple of days off — sometimes on weekends, sometimes in the middle of a week — and found it to my liking. It took a lot to convince myself that the world won’t end if I take a mid-week break. It was harder to convince my ego than my clients that I was not that important.
Now I have moved to a slightly evolved system. Even in a full working week I don’t attempt to work on all projects on all working days. Scheduling tasks is a major part of this and it has taken time for a workable system to evolve on that front. Of course, there are exceptions to the rule and emergencies that break with it.
Then there are external factors. Sometimes I stay off Twitter on my personal account for days or weeks on end. Information is useless if you can’t process it efficiently and effectively, and when I am unfocussed, high levels of incoming information trip me up like nothing else. I’ll give up reading most things, pick up a book and finish it in a couple of days.
There is no one-size-fits-all solution to taking a break, and different business segments have compulsions of their own, so you have to come up with an approach that works best for you. But finding ways to ease off and take a break is important for everyone.

Filed under: Start-ups