Author: Shyam Somanadh

Entertainment Industry Wins Pirate Bay Case, Loses $390 MM Business Opportunity

By now it is all over the place that The Pirate Bay (or at least the four defendants) has lost the lawsuit with the authorities over the website's enablement of piracy. The verdict will neither bring entertainment-related piracy to an end, nor will it help artists find another way to monetize their efforts beyond what the entertainment industry can offer.

I have no quibble with the fact that piracy is bad. What I do have a problem with is that organizations like the RIAA are only too happy to sue the pants off file sharers, citing ever-declining numbers in their businesses, while refusing to see that not everyone using P2P considers zero the only acceptable price point.

At this very moment, The Pirate Bay's trackers are supporting over 22,387,439 seeders and leechers over IPv4, with a much smaller number on their IPv6 network. That is a huge number of people acquiring content concurrently: over 22 million in one go. And we are not even counting the users on the other networks.

If the entertainment industry can see this only as an opportunity to sue 22 million people and get a dollar off each of them, they deserve the painful death they are undergoing now. If they can see the 22 million as a live and kicking market, they deserve to live, and live well.

Those who argue that piracy is free are sadly mistaken. Every download is paid for at some point in the chain. If you do it at your office, your company foots the bill. If you do it at home, you pay for it with your DSL/cable bill. In effect, there is no 'free' in the equation here; there is already a value attached to it, even when you are leeching off a torrent. The only problem in the whole picture is that the ones getting paid are neither the content creators nor the distributors.

So, who gets paid?

The telcos who sell you the bandwidth and the connection are the ones who profit most from it. As sneaky as that may seem, it is not their fault that people use torrents and P2P to get their fix. The fact remains that there is no simple, sane and legitimate way to consume content at a reasonable price point at the moment. The entertainment industry has always refused to embrace innovation on that front. They should de-incentivize piracy by working hard to make non-pirated content easily available.

Instead, what do they do? They spend their time trying to keep prices at the highest possible points, maximizing their margins and holding on to the glory days of CD and cassette tape sales, rather than opening their eyes to the new reality that this will work in their favour only if they play for scale. The average person probably does not download more than 30 songs a month on the internet. Why is it not possible to address that need at a flat rate, which would make it much easier for everyone to understand and legitimately participate in the process?

You don't even need to convert all 22 million users connected to the Pirate Bay tracker to make this work. At even a 30% conversion, that is about 6.5 million users. Those users shelling out $5 per month for 30 tracks, without DRM, are worth about $32.5 million in monthly revenues and $390 million in annual revenues. I am playing easy with the numbers here, but my point stands: there is a massive business opportunity here, and it is being ignored.
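The back-of-envelope math above can be sketched in a few lines. The subscriber figure is the article's rounded assumption (roughly 30% of the 22 million concurrent users), not market data:

```python
# Back-of-envelope revenue model for converting P2P users into
# flat-rate subscribers. All inputs are illustrative assumptions.

SUBSCRIBERS = 6_500_000   # assumed paying converts from the tracker
PRICE_PER_MONTH = 5       # USD, flat rate for 30 DRM-free tracks

monthly_revenue = SUBSCRIBERS * PRICE_PER_MONTH
annual_revenue = monthly_revenue * 12

print(f"Monthly: ${monthly_revenue / 1e6:.1f} MM")  # $32.5 MM
print(f"Annual:  ${annual_revenue / 1e6:.0f} MM")   # $390 MM
```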

Even with DRM, Apple and the iTunes franchise have shown that people can and will pay for entertainment, given the right experience and a sensible price point. But really, why is Apple owning that space, with Amazon competing hard on the same front? Why is the entertainment industry a no-show here, other than the rare moments when it shakes itself out of its stupor to try to up its margins per track sold?

Piracy is nothing new. It existed before the internet was created and it will remain with us as long as we are around. What the entertainment industry needs to recognize is that its business models and the way it operates have to change, now. Every day spent chasing these people in court, for victories that amount to a minor blip on the global piracy radar, is another day lost in saving their own livelihoods.

Filed under: Misc

State Of Mobile GPS in India

Ever since I bought the Nokia E71, I have become a bona fide GPS junkie. This addiction is helped in no small measure by my increasing ability to get lost anywhere and everywhere. In the past four months, I have tried a few of the available solutions, and I am going to jot down my observations. To date I have used Nokia Maps, MapmyIndia and Google Maps, and I will be trying out Satguide's software soon to complete the product line-up.

Nokia Maps (powered by their Navteq acquisition) is the most refined, classy-looking and user-friendly of the three products. The level of detail is reasonable and so is the accuracy. One thing to note about all the solutions is that data for India is rather sparse, which means you don't get traffic data in any of them; the only determinants in route calculations are distance and time.

Nokia Maps is also one of the pricier solutions in the market. While you can try it out for free in the first three months after purchasing any GPS-enabled Nokia handset these days, the subscription charges that follow are quite steep. An all-region (India) 1-year subscription will cost you USD 219, while a 1-year subscription for select cities costs USD 118.99 (USD 42.99 for a 90-day license and USD 15.99 for a 30-day license). I can't imagine why Nokia would use dollar pricing for a product they are pushing very strongly in the Indian market.

In all likelihood, the product has not yet been customized for India. There is no INR pricing for the licenses anywhere on their Indian website, while USD and Euro prices are available from the other websites, and it could wind up hurting them really badly.

Nokia Maps Features:
Voice Prompts: Yes (English)
Map Loader: Yes
OTA license renewal/Purchase: Yes
OTA software upgrade: No
Trial Version: Yes (3 months when it comes pre-installed on Nokia handsets)

InavConnect is MapmyIndia's mobile GPS solution. The company has tied up with Wayfinder to bring out the product, and for all practical purposes the application is just a co-branded version of Wayfinder's software. Of the three, I have used InavConnect the most, within Delhi/NCR and also on my road trips to Bikaner and Manali. The application is more than a bit clunky and not very refined, and the search can be quite odd at times. There is a workaround in saving places within the application itself, but in this age a dysfunctional search can really kill you in the market.

The application also seems to have a very low threshold for calculating what counts as going 'off-track'. While I was driving around in the hills and on highways, it used to happen quite a lot, even if we had gone only slightly off on either side of the road. There is also an update lag with the application. If you are driving above 40 kmph, the software can often take an additional second or two before it updates to the current location. This can have interesting consequences when you are approaching forks in the road or roundabouts, as we found out. It led to us taking the wrong turn about three times before we figured out what the problem was.

That said, MapmyIndia's maps are fairly detailed and quite accurate, which makes up for the clunky software. The license is not very expensive either, with the 1-year subscription priced at INR 1990, including coverage for all of India, making it the best value-for-money proposition among the paid products; even Satguide's software costs INR 2183, though it is not transferable to another handset.

InavConnect Features:
Voice Prompts: Yes (English)
Map Loader: Yes
OTA license renewal/Purchase: No
OTA software upgrade: No
Trial Version: Yes (5 days)

When Google Maps was launched in its latest version, it had a glaring problem. Like the other two products, the application was not really customized for India, leaving features like directions entirely dysfunctional. That changed a couple of months ago when Google upgraded its India maps to provide directions. The maps are very accurate, and the search is what swings the deal massively for Google, since it also incorporates the placemarks users have added over time to Google Earth/Maps.

But the real reason why Google has set this market alight is the price point. The application itself is free; beyond the rates you pay for transferring data over GPRS, it costs nothing. Add Google's location-based social networking product, Latitude, to the mix and it is not too difficult to imagine why they will turn this segment upside down with the product. Yes, there is no voice assistance for the navigational features, but at price point zero, most users will more than happily live with that.

Google Maps Features:
Voice Prompts: No
Map Loader: No
OTA license renewal/Purchase: NA
OTA software upgrade: NA
Trial Version: NA

Conclusion and tips

The market for GPS devices and software in India is very nascent. Honestly, the companies have more to gain from targeting the business segment (fleet operators) than individual consumers if they want to make any decent cash for a while to come. The maps themselves have limited use within cities; they largely come into their own on the highways. Besides, India is a country where directions are what we get by pulling over to the roadside and asking a local bloke, and it is hard to break that comfort factor.

One feature I would like to focus on a bit more, across all the software, is the map loader functionality. This is one major weak point for Google Maps. Without a map loader, the application has to pull the data over GPRS each and every time. If your data plan makes you pay by the kilobyte, this can turn out quite expensive over the long run, which is where a map loader can make a significant difference.

Wayfinder's map loader allows you to preload map data for India from the computer. Even though this weighs in at a hefty 50 MB, it makes the maps run much faster, and the data transfer is quite low as a result. Nokia Maps also offers a similar facility. Do note that neither product bundles the map loader with the standard software; you have to download it separately.
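To see why this matters on a pay-per-kilobyte plan, here is a rough sketch. The GPRS tariff and the monthly tile traffic below are illustrative assumptions, not quoted rates:

```python
# Rough cost of streaming map tiles over pay-per-KB GPRS versus
# preloading them once with a map loader. Inputs are assumptions.

PAISE_PER_10_KB = 10       # assumed pay-per-use GPRS tariff
TILE_MB_PER_MONTH = 100    # assumed map data pulled without a loader

kb_streamed = TILE_MB_PER_MONTH * 1024
cost_rupees = (kb_streamed / 10) * PAISE_PER_10_KB / 100

print(f"Streaming ~{TILE_MB_PER_MONTH} MB of tiles: Rs {cost_rupees:.0f}/month")
# Preloading the ~50 MB India map over the computer turns this
# recurring GPRS bill into a one-time desktop download.
```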

In the coming months, I will test Satguide's software and post an update.

Filed under: Misc

Welcome To The Great Indian Intranet

The greatest price that terrorism extracts, and it does so silently, is fear. This fear is often leveraged by authorities around the world to enforce laws that do little more than constrain the freedom of the majority, while presenting only a minor inconvenience to the tiny minority who cause all the harm. We see yet another example of it in India today, with the news that the latest amendments to the IT Act will force email providers to keep their servers in India, so that terror emails can be tracked and hunted down more easily.

It does not require a doctorate in computer science to figure out how silly this plan is and how easily it can be worked around. All the terrorists need to do is not use one of these email service providers, and the bright plan falls apart in a second. If this is the level of domain knowledge behind the decisions that are supposed to protect the nation from everyone from script kiddies to organized Chinese and Russian hackers meticulously drilling holes into every computer network in the world, I would be more scared of what our authorities are incapable of doing than of what the bad guys are capable of doing.

The greater problem is not even that. There is a massive business cost attached to decisions like these. I run a server of my own in the US because it gives me a cost-effective deal for my needs, which I won't get in India. If, at some point, the logic mentioned earlier is extended to stipulate that any online business operating in India must host its servers here, it would impact the country very negatively. As one of the shining examples of democracy and freedom in the world, it would be a pity for India to be mentioned in the same breath as the Great Chinese Firewall as an example of oppression.

Moreover, you can extend the logic to cover everything. Why stop at email servers? We have to track every DNS request, every database query and every search query, so that we can read the minds of terrorists and go back in time after an attack to see what the culprit was surfing or using. This can even be extended to comments posted on websites, with rules declaring that sites accepting comments or usage from Indian IPs are illegal businesses unless they are hosted in India.

This can very easily escalate into a situation where we have a version of the internet that is more or less an Indian intranet, where all traffic originating in India has to terminate in India. As incredible as it sounds, it is easily possible if we follow further down this road. Comments such as this, “we don’t know how they are going to do it, but the government has decided to do it. Every domain company has their branch office here, and they are bound to pass on the information immediately, whenever investigation agency needs it,” only make it clearer that the worst is very much possible.

I wish there were a more open and public debate on such things. There are plenty of experts in the field in India who would only be glad to help draft and design these requirements, in ways that are effective without being intrusive. Honestly, I am more worried about how easy it still is to sneak anything into most hotels and markets in Delhi than about people sending crazy “we shall overcome” emails to media houses. We are getting our focus wrong here and barking up the wrong tree with a vengeance.

Filed under: Misc

Indian Broadband's Mbit Eyewash

The state of broadband in India is a rather saddening affair. Any connection above 256 Kbps either costs over a thousand rupees per month or comes with download caps that put using broadband the way it is meant to be used out of the question.

The problem exists on two fronts: 1) pricing, which is unaffordable for a vast majority of Indians and thus stands in the way of broadband penetration, and 2) data transfer limits, which eventually wind up as (1) when you exceed them. Until recently, the telcos were being chastised by TRAI and other bodies for often trying to sell narrowband as broadband. The telcos responded by upping the speeds and putting in insane usage limits, which effectively force the user to treat the high-speed connection as a narrowband line unless they are willing to spend vast amounts paying off their bills.

Now, if you think paying more will make the plan any better, you are gravely mistaken. Airtel recently announced a 16 Mbit plan (Speed Combo 2999) for their residential connections. The plan will cost you Rs 2999 per month, with a data transfer limit of 20 GB per month. This roughly translates into about 600 MB worth of data transferred per day. You can easily and legitimately burn up 600 MB of data in a day if you are fond of watching a lot of multimedia content (on sites like Youtube) and downloading a lot of podcasts.

Every additional MB on top of that will cost you 50 paise. So let us assume you go over by 100 MB every day, which is a trivial thing on a 16 Mbit line. That is an additional Rs 50 per day, adding up to a cool Rs 1,500 per month on top of the rent. If your total daily transfer doubles to 1,200 MB (hey, this is a high-speed line, it is meant for data-heavy usage), the 600 MB of daily overage costs Rs 300 per day, adding up to an additional Rs 9,000 per month. You can see where we are going with this.
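The overage arithmetic can be sketched out as below. The plan figures are from the announcement as described above; the daily-usage scenarios are illustrative:

```python
# Overage math for a capped "high-speed" plan: a flat monthly rent
# plus 50 paise for every MB beyond the daily allowance.

PLAN_RENT = 2999          # Rs per month, Speed Combo 2999
OVERAGE_PER_MB = 0.50     # Rs per MB beyond the cap

def extra_per_month(extra_mb_per_day, days=30):
    """Overage charge for exceeding the allowance by this much daily."""
    return extra_mb_per_day * days * OVERAGE_PER_MB

print(extra_per_month(100))              # 100 MB over/day -> Rs 1500.0
print(extra_per_month(600))              # 600 MB over/day -> Rs 9000.0
print(PLAN_RENT + extra_per_month(600))  # the bill climbs to Rs 11999.0
```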

The point I am trying to make is that most of these big-bang Mbit plans don't have any real use case beyond being nice sound bites for the telcos. They will sell it to you saying it is meant for multimedia and all that jazz, but if you really wind up using it for that, it will burn a crater in your pocket. They should price it more realistically and bring the caps to a realistic level too, or make it unmetered.

Filed under: Misc

Indian Internet's Three Million Pageview Conundrum

For many years we have been trying very hard to unlock the potential of India's internet market. Beyond all the hype, the fact is that we have made little headway in realizing that dream. Even with the stellar growth numbers quoted year after year by IAMAI and other industry bodies, most individual internet properties seem to have a lot of trouble growing, organically, beyond the line in the sand of three million page views a day.

The 3 MM number is part pulled out of a hat and part anecdotal. It is the upper limit I have seen in standalone properties over the years, after which growth is really sluggish unless a considerable amount of money is spent on SEM to draw in the crowds. It may, of course, get bumped up by a million in the next couple of years, but the point I am making is that it is way too low to be even a hint of what is possible.

The 3 MM figure has vexed me for a while now, as I try to understand what will push that number up with the products that are out there. Only recently did it strike me that there is nothing wrong with the products per se. They are all fine, and in some cases even too good for the audience to appreciate. The problem is with the language. The Indian internet is overwhelmingly English, while India is not.

Yes, the promise of local-language internet has been around for a long time, but we have done little towards it. Most of the content there now is at best translations of existing English news content (since that is most readily available in digitized form) or online versions of the few local-language media houses that have aimed to go online. This is the classic chicken-and-egg problem of the Indian vernacular internet.

Where this hooks up with the 3 MM problem is at the point where new users who are not part of the English internet demographic have nowhere to go, and nothing much to do other than follow the casual, language-independent routes of internet usage: porn and social networking.

The conundrum, as mentioned in the title, is this: all the money being spent on the Indian internet goes into the 3 MM-band properties that target the Indian English internet. The growth in subscriber numbers is not going to translate into equivalent growth for the 3 MM properties, because they have little to offer those new users beyond the usual. This is why the significant volume of investment that has gone into the English internet (citing the potential of the billion-strong populace) has seen so little return.

One of the things I am aiming to work towards with FrontierNxt is fixing this mismatch. It is going to be a long and arduous journey, using tools and methods far from what I have been used to, in a marketplace we all know little about.

Filed under: Misc

Remembrance of Things Past: Media's Reaction To Its Own Demise

There is sheer and absolute panic in the media markets these days. We have industry veterans advocating a switch in online content distribution from the free model to the subscription model as if it were a panacea for all ailments, and everyone is rushing about trying to increase rack rates. To all this hectic activity, there is only one thing I can say: stop, take a deep breath, and try to think long term rather than short term.

It is rather ironic to see all this activity at this very stage. For years the media sat on its haunches while the internet and other factors steadily changed the way information is created, distributed and consumed in the world. Now, after the horse has bolted (and taken the barn along with it), they think effecting the above-mentioned changes will somehow magically set things right.

Let me break it to everyone: it just won’t do much to help things.

I have previously written at length on the matter, so I will link to it rather than repeat myself:

Such seismic shifts in creation, distribution and consumption don’t happen overnight. They happen right under our noses, in plain daylight, and the industry leaders would have seen it if they had cared enough to observe and adapt. In looking to milk money out of the consumers of content, most of the leadership winds up asking the wrong question, “how do we charge?”, rather than the right question, “why would the user pay?”

The content business is broken beyond redemption in two ways, which means the fancy dreams of being able to charge won’t work out. It costs way too much at this point in time to create non-unique content. To make matters worse, creating unique content is not only even more expensive, it is also something very few have the talent or capability to pull off. Traditional media has over time specialized in non-unique content; when it comes to unique content, it is the new publishers (bloggers and others) who have excelled.

Thus, for the media companies to attempt specialized unique content will mean getting their cost structure, which is already tied up in a big knot, tied into an even bigger one. It is nothing short of a conundrum. To improve coverage, they need to cover less, trading quantity for quality. But doing that will make them irrelevant in the current news scenario. And we still have not answered the question of why the user would pay for any of this content anyway.

But it is not all gloom and doom out there. There are people in leadership positions who are trying to address the problem before it gets entirely out of control and wipes out a significant part of the industry. In his recent memo detailing Hearst’s 100-day plan, Steven Swartz (president of Hearst Newspapers) points out that the issue is not one of audience, but of a flawed business model and rampant inefficiency. You should read the memo in full; it is a very interesting read.

Another interesting development is the move by NYTCo to go hyperlocal (something I’d suggested earlier) with local blogs, though I think working with the existing local blogs would have been a better fit there. It would also have saved the company the time it will now spend learning what others have already learnt. The ‘we-will-do-it-in-house-because-we-can’ approach has led to a lot of strategic faux pas in the media before, getting companies invested way too deep in areas they have no business getting into at that level. We will have to wait and watch how this plays out.

In conclusion, this is the deal: what has taken years to get to this stage won’t get fixed overnight, and desperation is a bad guiding principle for making any kind of change. Most of the hectic activity we are seeing now has no long-term view or strategy behind it. If these measures fail (which is very likely, in my opinion), they will end up wiping out even the few strengths these companies have left. Hopefully, they know something that I don’t.

Filed under: Misc

The awful practice of cross-pollinating status updates

If there is one thing that has gone awfully wrong with the open APIs for pushing and pulling data across online communities, it is the horrible practice of plugging one thought stream into another through the API, in an automated manner.

What exactly is a thought stream? Well, thought streams are frequently updated status messages. They have been around for years as custom status messages on instant messengers, before being spun around, made over and turned into a superstar product by the guys at Twitter. These days, status messages are in every online networking product, be it LinkedIn, Facebook, Orkut or Hi5.

Which is all fine. A few more wisecracks a day does not really make the world a better or suckier place. What does make it suckier is that cross-pollination of these messages often leads to broken conversation threads, misplaced context and other byproducts of the law of unintended consequences.

In real life, this is how it happens: you can plug your Twitter stream into your Facebook status message. But replies to that Facebook status remain within Facebook. So you post a message to Twitter, and it gets replicated on Facebook. Someone replies to that update on Facebook, but unless you are a Facebook maniac (a dying breed these days, if you ask me), odds are that you won't see the response till much later.
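The broken thread can be modeled in a few lines. This is a toy sketch of a one-way mirror between two networks; the class and method names are invented for illustration:

```python
# Toy model: a status is mirrored from one network to another, but
# replies attach to whichever copy they were posted against, so the
# author watching only the source network never sees them.

class Network:
    def __init__(self, name):
        self.name = name
        self.replies = {}  # status text -> list of reply texts

    def post(self, status):
        self.replies.setdefault(status, [])

    def reply(self, status, text):
        self.replies[status].append(text)

twitter, facebook = Network("Twitter"), Network("Facebook")

status = "Off to the mountains!"
twitter.post(status)
facebook.post(status)                 # auto-mirrored by the cross-posting hook
facebook.reply(status, "Have fun!")   # a friend replies, on Facebook only

print(twitter.replies[status])    # [] -- seen from Twitter, no conversation
print(facebook.replies[status])   # ['Have fun!']
```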

Killing the conversation: The primary issue this cross-pollination creates is that it breaks the conversation. While tiny updates are the core functionality of Twitter, that is not the case with the larger networking sites like LinkedIn and Facebook. When you plug updates from one product into another, it is hard to know, at face value, where an update originated, leading to instances where the message origin and the message responses belong to different networks. The end result? Death of the conversation.

Context breakage: Secondly, different networks have different contexts attached to them. My Twitter stream is a whole lot more casual than my LinkedIn activity stream, which is the case, I think, with most people. The fact is that the same messaging does not work across different groups and communities. Plugging one into another can have disastrous consequences. What may be kosher on one network, vis-à-vis the audience, may not be acceptable on another. With cross-pollination, it is hard to know what exactly is seen by whom.

Network latency, throttling issues: Third and last, when you cross-pollinate, you are also introducing network latency and throttling into the equation. Different networks allow injection and extraction of data under terms that make sense to them, which may not mean pushing out and pulling in updates as fast as possible. As a result, your updates may be throttled at the other network's end, with only a handful of them showing up there.
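The throttling point can be sketched with a toy model. The per-window limit here is invented for illustration, not any real network's quota:

```python
# Minimal sketch of why cross-posting silently loses updates: the
# destination network accepts only so many API writes per window,
# so a burst from the source only partially arrives.

class ThrottledNetwork:
    """Accepts at most `limit` updates per window; the rest are dropped."""
    def __init__(self, limit):
        self.limit = limit
        self.accepted = []

    def push_window(self, updates):
        self.accepted.extend(updates[: self.limit])  # overflow is silently lost

source_updates = [f"tweet {i}" for i in range(10)]  # a chatty hour at the source
destination = ThrottledNetwork(limit=3)             # assumed 3 writes per window
destination.push_window(source_updates)

print(f"{len(destination.accepted)} of {len(source_updates)} updates made it across")
```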

Filed under: Misc

Six reasons why Facebook is losing its way

I would have loved to sum it up rather simply: Facebook is a story of accidental success. Zuck started the product to have some fun and, incredibly, five years down the line, the fun has not ended. But it is not all fun and games. Even with its astounding growth, the fact is that Facebook is struggling to find a clear direction in which to head, and it is also having trouble finding enough revenue to offset its ever-increasing burn rate.

Facebook has zero value when it is not public: This makes a world of difference when you are trying to introduce a set of services on the platform that go beyond its primary use case. Platforms that are primarily meant for public (restricted, yes, but public still) consumption never fare too well in the private world. This is the reason why you don't see social networking in your email inbox (sorry, Xobni is NOT social networking) and why your online banking service does not have a 'social' aspect to it. The simple reason is that it is counterintuitive.

Facebook's main use is letting people connect with each other in an environment that allows you to pick and choose the audience. You can use Facebook without any connections, but that would send the value derived from the product spiraling downward. In effect, the main use case for Facebook and the extended world it is trying to embrace work at cross purposes. Facebook is a network that functions on the basis of exclusion, while the newer things it is attempting work best on the basis of inclusion.
Mixing up identity with connections: Mixing your identity with the connections you make is a fairly subtle but significant mistake that all of us make. Who I am on Facebook is not my identity (though poking thrice a day can say a lot about who you are), but it certainly gives me a dashboard on the state of my personal connections. With the new features, Facebook is trying to sew my identity together with my Facebook account. Beyond the counterintuitive problem mentioned earlier, it also has the problem of profiling based on a restricted context, which will always give you an inaccurate picture.
Context, not connections, drives content discovery: One of the major thrust areas for Facebook is peer-to-peer discovery of almost everything, which conveniently ignores some major points. 1) Friends are not always your peers; friends are sometimes just friends. 2) The converse is also true: your peers are not always your friends. Content discovery is almost always best done when it is led by context (event, topic, theme) rather than by relationships (remember email forwards?). This is why half of the suggested content you see from your friends on Facebook is junk. You have much better luck with content discovery on Delicious, Digg and Reddit.
Context often needs to be non-shared and private: One of the reasons you freely consume so much content on the internet is that the context in which you find it is not public. If your Google search history could be shared on Facebook, it would have an immediate effect on what you search for and why. Man is often known for the thoughts he expresses, not for how the thought was formed.
The 80:20 art of network maintenance: Every network has a crucial tipping point where the effort you put into maintaining your network starts to exceed the value you derive from being a part of it. After a point, it is hard work to have a clear idea of who gets to see what you do on Facebook, and maintaining that alone is a huge chore. On a platform like Google, there is no core value erosion if you don't share; on Facebook, if you don't share or participate, the network immediately becomes useless to you, and vice versa.

This is the reason why I have held that social networks peak for an average of three years; by then, everyone you know and their dog is on the same network, and the last thing you need is a parallel virtual universe that mirrors the real-life network you are part of. Basically, this is the point where the whole network effect starts to break down. And this is what eventually marks the death of every social network.
First, just be a good social network: Yes, it is a great thing to have a BHAG, but that is possible only when you have done well in your primary context. Facebook needs to nail the primary need of being an easy-to-use and uncomplicated social network before it gets into anything else. Somewhere along the way, the company has gotten into nooks and crannies it has no business being in. In fulfilling that need, it may not wind up being valued at $15 billion again, but it certainly will keep the company and its users happy in the longer run.

Filed under: Misc