Good question. A friend pointed me to a couple of articles on the subject that I think are very interesting. They're by Robert X. Cringely, and I'm reposting them below. Science fiction or reality? Judge for yourselves; they certainly provoke thought! Happy reading, and let me know what you think by leaving me a comment.
By the way, if you're interested in the links to the original articles, here they are:
When Being a Verb is Not Enough: Google wants to be YOUR Internet
Oh Brother Where Art Thou?: The only way left to compete with Google is P2P with a twist
--------------------------------------------------------------------------
When Being a Verb is Not Enough: Google wants to be YOUR Internet.
I spoke recently with an old friend who is a bandwidth broker. He buys and sells bandwidth on fiber-optic networks around the world. And he told me something that I found not completely surprising, but I certainly hadn't known: Google controls more network fiber than any other organization. This is not to say that Google OWNS all that fiber, just that they control it through agreements with network operators. I find two very interesting aspects to this story: 1) that Google has acquired -- or even needs to acquire -- so much bandwidth, and; 2) that they don't own it, since probably the cheapest way to pick up that volume of fiber would be to simply buy out any number of backbone providers like Level 3 Communications.
Google loves secrecy. That they've been acquiring fiber assets hasn't been a secret, but the sheer volume of these acquisitions HAS been. Why? One thought is that it kept down the price since people didn't really know it was Google snatching up this stuff (they've done it under a number of different corporate names). But if price was the issue, then why hasn't Google just bought the companies that own the fiber? It made no sense until I scratched my head and thought a bit further, at which point it became obvious that Google wants to -- in its own way -- control the Internet. In fact, they probably control it already and we just haven't noticed.
There are two aspects to this control issue, but let's take the legal one first. If Google bought a bunch of Internet backbone providers, such a move would of course get the attention of regulators from the U.S. Department of Justice and the U.S. Federal Trade Commission, the two federal agencies charged with looking at large corporate mergers for signs of anti-competitive activity. But simply acquiring legal control of those same assets through leases and other long-term contracts doesn't trigger such an examination, though perhaps it should. By renting instead of buying, Google was able to acquire its fiber assets primarily in secret. The game was over before most of us even knew there WAS a game.
The second aspect of this is the whole idea that the game is already over for control of the Internet. I touched on this concept back in 1998 when I wrote my first column about PayPal, which at the time had been offering its core service for less than a year and already had eight million members. I wrote then that PayPal had already won the Internet payments race, which time has since shown they had. PayPal's confidence was based on analysis of its own growth. Understanding the potential range of growth, looking at the rate of subscriber acceleration, and using second derivative analysis of these data, PayPal was pretty darned sure, even back in 1998, that its competitors at the time would never be able to catch up.
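To make that "second derivative" idea concrete, here is a minimal sketch in Python using made-up monthly subscriber counts (not PayPal's actual figures): the first difference of the series is the growth rate, and the second difference is the acceleration of that growth.

```python
# Minimal sketch of second-derivative growth analysis with hypothetical data.
subscribers = [0.5, 0.9, 1.6, 2.6, 4.1, 6.0, 8.0]  # millions per month, made up

# First difference: new subscribers added each month (growth rate).
growth = [b - a for a, b in zip(subscribers, subscribers[1:])]

# Second difference: how quickly the growth itself is increasing (acceleration).
acceleration = [b - a for a, b in zip(growth, growth[1:])]

print("growth per month:", growth)
print("acceleration    :", acceleration)
# If your acceleration stays positive while a rival's is flat or negative,
# the gap widens every month -- the basis of the "race is already over" claim.
```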
Topix.net founder Rich Skrenta recently took a similar approach to argue that Google, like PayPal, has already won the game and represents to most users the face of the Internet. Skrenta (in this week's links) argues that Google's dominance of search and advertising is so profound that most competitors -- especially Yahoo -- would probably be better off NOT even attempting to compete and simply let Google handle search and advertising while Yahoo provides content. He's probably correct. Skrenta argues that even if services come along that are superior to Google's, in order to become dominant they'll have to overcome Google's brand recognition with users, which is almost impossible to do. So just being better than Google isn't enough.
All this is prelude for understanding what Google intends to actually DO with all this technology, which I have only lately begun to figure out.
I live in South Carolina, a state that I can argue qualifies as a technology backwater despite being the shrimp and grits capital of the world. Why, then, are the local business pages filled with stories about Google preparing to build massive data centers here? Google is apparently negotiating to build data centers in Goose Creek, a town not far from Charleston, where I live; in Columbia, the state capital; and at a third location across the border in Georgia. To read the papers, Google might choose one or another of these locations, but according to people I have spoken with who are fairly close to the action, Google actually seems intent on building in all three locations.
Why?
Why would Google need two data centers in a state with only four million residents? Why would they need to buy 520 acres in a Goose Creek industrial park when that's probably 100 times as much land as any conceivable data center would require?
Google is building a LOT of data centers. The company appears to be as attracted to cheap and reliable electric power as it is to population proximity. In Goose Creek they bought those 520 acres from the local state-owned electric utility, which probably answers the land question posed above. By buying out all the remaining building sites in an industrial park owned by an electric utility, Google guarantees itself a vast and uninterruptible supply of power, much as it has done in Oregon by building a data center next to a hydroelectric dam or back here again in Columbia by building near a nuclear power station.
Of course this doesn't answer the question why Google needs so much capacity in the first place, but I have a theory on that. I think Google is building for a future they see but most of the rest of us don't. I'll go further and guess that Google is planning to build similar data centers in many states and that the two centers they are apparently preparing to build here in South Carolina are probably intended mainly to SERVE South Carolina. That's perhaps 100,000 servers for four million potential users or 40 users per server. What computing service could possibly require such resources?
The answer is pretty simple. Google intends to take over most of the functions of existing fixed networks in our lives, notably telephone and cable television.
The Internet as we know it is a shell game, with ISPs building their profits primarily on how many users they can have practically share the same Internet connection. Based on the idea that most users aren't on the net at the same time and even when they are online they are mainly between keystrokes and doing little or nothing when viewed on a per-millisecond basis, ISPs typically leverage the Internet bandwidth they have purchased by a factor of at least 20X and sometimes as much as 100X, which means that DSL line or cable modem that you think is delivering multi-megabits per second is really only guaranteeing you as much bandwidth as you could get with most dial-up accounts.
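Here is that back-of-the-envelope arithmetic as a small sketch; the advertised speed and oversubscription factors below are illustrative assumptions, not figures from any particular ISP.

```python
# Contention-ratio arithmetic: how much bandwidth is actually guaranteed
# when an advertised connection is shared 20x or 100x. Numbers are illustrative.
advertised_mbps = 6.0  # hypothetical advertised DSL/cable speed

for oversubscription in (20, 100):
    guaranteed_kbps = advertised_mbps / oversubscription * 1000
    print(f"{oversubscription:>3}x sharing -> about {guaranteed_kbps:.0f} kbps guaranteed per user")
# At 100x, the guaranteed share is on the order of dial-up speed.
```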
This bandwidth leveraging hasn't been a problem to date, but it is about to become a huge problem as we all embrace Internet video. When we are all grabbing one to two hours of high-quality video per day off the net, there is no way the current network infrastructure will support that level of use. At that point we can accept that the Internet can't do what we are asking it to do OR we can find a way to make the Internet do what we are asking it to do. Enter Google and its many, many regional data centers to fill this gap.
Looking at this problem from another angle, right now somewhat more than half of all Internet bandwidth is being used for BitTorrent traffic, which is mainly video. Yet if you surveyed your neighbors you'd find that few of them are BitTorrent users. Less than 5 percent of all Internet users are presently consuming more than 50 percent of all bandwidth. Broadband ISPs hate these super users and would like to find ways to isolate or otherwise reject them. It's BitTorrent -- not Yahoo or Google -- that has been the target of the anti-net neutrality trash talk from telcos and cable companies. But the fact is that rather than being an anomaly, these are simply early adopters and we'll all soon follow in their footsteps. And when that happens, there won't be enough bandwidth to support what we want to do from any centralized perspective. A single data center, no matter how large, won't be enough. Google is just the first large player to recognize this fact as their building program proves.
It is becoming very obvious what will happen over the next two to three years. More and more of us will be downloading movies and television shows over the net and with that our usage patterns will change. Instead of using 1-3 gigabytes per month, as most broadband Internet users have in recent years, we'll go to 1-3 gigabytes per DAY -- a 30X increase that will place a huge backbone burden on ISPs. Those ISPs will be faced with the option of increasing their backbone connections by 30X, which would kill all profits, OR they could accept a peering arrangement with the local Google data center.
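To see what that 30X means for an ISP's backbone, here is a rough sketch for a hypothetical pool of 10,000 subscribers; every input is an illustrative assumption.

```python
# Rough backbone-load arithmetic behind the "30X" claim. Illustrative only.
subscribers = 10_000
gb_today = 2        # GB per subscriber per month (mid-range of 1-3 GB/month)
gb_future = 2 * 30  # GB per subscriber per month if usage becomes 2 GB/day

def average_backbone_mbps(gb_per_sub_per_month, subs):
    bits = gb_per_sub_per_month * subs * 8e9   # gigabytes -> bits
    seconds_per_month = 30 * 24 * 3600
    return bits / seconds_per_month / 1e6      # average megabits per second

today = average_backbone_mbps(gb_today, subscribers)
future = average_backbone_mbps(gb_future, subscribers)
print(f"today : {today:8.1f} Mbps average backbone load")
print(f"future: {future:8.1f} Mbps average backbone load ({future / today:.0f}x)")
```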
Seeing Google as their only alternative to bankruptcy, the ISPs will all sign on, and in doing so will transfer most of their subscriber value to Google, which will act as a huge proxy server for the Internet. We won't know if we're accessing the Internet or Google and for all practical purposes it won't matter. Google will become our phone company, our cable company, our stereo system and our digital video recorder. Soon we won't be able to live without Google, which will have marginalized the ISPs and assumed most of the market capitalization of all the service providers it has undermined -- about $1 trillion in all -- which places today's $500 Google share price about eight times too low.
It's a grand plan, but can Google pull it off? Yes they can.
--------------------------------------------------------------------------
Oh Brother Where Art Thou?: The only way left to compete with Google is P2P with a twist.
Last week's column was about my prediction that Google intends to use these data centers it is building to essentially act as proxies for the Internet and come to the aid of all the broadband ISPs as our Internet video fix drives their backbone costs through the roof. Google will do this through peering arrangements with the ISPs that will give the search giant unique high-bandwidth access to broadband users with the result that most big media companies will go through Google, rather than through the public Internet, to reach their customers. Think of Google combined with a Super Akamai. Think of Google literally becoming the Internet.
The only reaction I received from Google, itself, was from a worker in one of those data centers who said, "I can neither confirm nor deny how very, very wrong he is."
Clever.
But when I ran that by another friend who is a top technical guy at one of the top Internet companies in the world, he said, "Last I knew, most folks who actually 'work' in the data centers of any tech company fall into the Operations' org of the company, even if they are 'engineers' with engineering degrees. And I'm sure you can infer what I think of Operations' folks' knowledge of what their company is planning. In all my years, I've never told any of my Ops folks at any of my companies anything about the future plans, technical or otherwise. Hate to sound like a tech elitist, but, there's a reason why they're in Ops versus Eng/R&D."
Only time will tell whether I am right or wrong about Google's strategy, but I am totally convinced, which brings me to this week's column on potential strategies to counter Google.
In one sense, I have no reason to want to counter Google. What do I care if they beat their most obvious competitors -- AOL, MSN, and Yahoo? It doesn't matter much to me, because Google's theme is one of helping users, and if AOL, MSN, and Yahoo go under, who cares? Still, it is always nice to have alternatives just in case Eric Schmidt undergoes a megalomaniacal transformation and turns Google toward the dark side of the Force.
How do you compete with Google if this is indeed their strategy? Well the first thing to decide is whom I mean by "you." It's probably not the broadband ISPs, themselves, because even though they'll be undermined by Google and huge chunks of potential revenue will be taken from them, they may still come out ahead. That's because Google will also allow them to cut back their bandwidth and server costs. No, the potential Google competitors aren't ISPs, they are portals like AOL, MSN, Yahoo, and many others -- content companies not viewed as potential customers by Google.
How will these three companies compete with the Google proxy strategy? As far as I can see, they can't compete. They come up short in too many ways, but the biggest way is money, moolah, cash, loot. None of these companies can afford to do what Google is doing, building hundreds of huge data centers around the globe.
It may be surprising to think that Microsoft doesn't have enough money to compete with Google since Redmond is sitting on something just under $29 billion in cash right now, and could borrow tens of billions more if needed. But while Microsoft COULD come up with the money, I seriously doubt that they WILL come up with it, at least not in time to have a real impact. Microsoft has too many businesses going right now with operating systems, applications, games, services, and other hardware. Bent on its own course of world domination through the Xbox 360, Microsoft probably sees little reason to rush after Google in a fit of data center building. This is something I am sure Google is counting on.
As for AOL and Yahoo, neither has the financial resources to compete. Starting after Google, they couldn't raise money fast enough to even get in the game. If I am correct, these two companies are doomed.
So of the three logical Google competitors, one won't compete and two others can't compete. That leaves you and me as the only potential competitors to Google in this race.
We started this, remember, first with our BitTorrent and eMule antics and then with our YouTube compulsion. We are creating the bandwidth demand that will ultimately force our ISPs into the arms of Google. So if there is going to be an alternative to Google, that will have to be us, too.
It's pretty simple, really. As more and more video hits the web, ISPs will find themselves crushed by demand that will drive up their backbone costs until all profit is driven from their businesses. Google will come to the rescue with regional data centers that will peer with local ISPs and relieve them of much of that burden, allowing the ISPs to actually cut back their backbone connections and run fewer servers, though at the cost of losing the big movie studio and TV network business deals those ISPs currently think will eventually make them rich. If we look at what Google will be offering, it is bandwidth and server power. So to compete with Google will require bandwidth and server power.
Server power is easy if we embrace peer to peer. Let BitTorrent or VeriSign's Kontiki or Grid Networks or even the new Joost video distribution system from the founders of Skype carve server power out of millions of user PCs. Joost, by the way, is probably the best thing that ever happened to Google, since it will drive a stake into the hearts of broadband ISPs with more panache than old BitTorrent could ever display. So let's allow our problem to provide its own solution. If Google is throwing one million servers at the market, the market can easily respond with 10+ million PC peers, no problem.
The great advantage of P2P for this application is not only that it costs a lot less, but that it appears exactly where you need it and, with proper promotion, the capacity is almost infinite.
But that still leaves us without enough bandwidth. Google will be using its own fiber connections to reach all the broadband ISPs, so any successful response would have to do that, too -- something that I see coming together quickest and cheapest as a kind of confederation of ISPs and optical fiber networks.
The trick here is to see the difference between dark fiber, lighted fiber, and Internet fiber. The most expensive of these is Internet fiber -- fiber connected directly to the Internet and for which ISPs are paying premium prices. What those ISPs need to make P2P work better, however, isn't fiber connected to the Internet but fiber that's connected to other ISPs but NOT to the Internet.
From an ISP's perspective, P2P is annoying in any case but becomes REALLY annoying when peers have to find each other by reaching out over the public Internet. If somehow that copy of American Idol could be found by polling only local nodes, then the cost of P2P would be much lower for ISPs. In fact, it would be almost nothing. The trick, then, is to expand the number of local peers to increase the likelihood that all the bits can be found on the local net.
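A toy sketch of that "poll local nodes first" idea: prefer peers that fall inside the ISP's own address block and fall back to the public Internet only when a piece isn't available on-net. The addresses and the single-prefix test are hypothetical simplifications, not how any particular P2P client works.

```python
# Prefer on-net peers so P2P traffic stays off the ISP's expensive backbone.
import ipaddress

ISP_PREFIX = ipaddress.ip_network("203.0.113.0/24")  # hypothetical ISP block

def order_peers(peer_addresses):
    """Return candidate peers with on-net addresses listed first."""
    local, remote = [], []
    for addr in peer_addresses:
        (local if ipaddress.ip_address(addr) in ISP_PREFIX else remote).append(addr)
    return local + remote

candidates = ["203.0.113.15", "198.51.100.7", "203.0.113.200", "192.0.2.44"]
print(order_peers(candidates))
# -> ['203.0.113.15', '203.0.113.200', '198.51.100.7', '192.0.2.44']
```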
Broadband ISPs with huge subscriber bases in major markets have an advantage when it comes to local peering, but in the end there aren't generally enough local peers to ever do the job completely. So why not make every user in the whole darned country a local peer?
This could be achieved by creating a parallel fiber network that interconnects all the ISPs but DOESN'T in turn connect to the Internet. This can be done to a certain extent with peering agreements between ISPs that share data centers, but it can't really be done without a lot of dedicated fiber to cover the many gaps in a network that was never deliberately designed.
Fortunately, there is still a LOT of available fiber, much of it owned by regional networks. The trick is to sign up all these regional networks that often follow power lines and gas pipelines. These smaller network players are generally interconnected already to all the local ISPs in their areas. They just don't know there is an opportunity to provide specific P2P service.
What would make this service viable from an ISP's point of view is if P2P bandwidth costs substantially less than Internet bandwidth. Here's where we face a pricing dilemma that is really more of a perception dilemma. If bandwidth on the parallel P2P network costs, say, one tenth as much as regular Internet bandwidth, well this would be a huge attraction for ISPs. They'd all sign up overnight. But can P2P bandwidth providers make money at such low rates? Yes they can, so long as the P2P rates don't drive down the rates on any separate deals they have to provide Internet bandwidth. Keeping these two sides apart may be too difficult; only time will tell.
But why would a bandwidth provider on the P2P network be willing to accept such low rates? There are two reasons: 1) they still have excess capacity, and selling that capacity at anything over their fixed costs puts them ahead, and; 2) the P2P network would give this confederation of networks the same kind of backdoor into ISPs that Google is spending billions right now to leverage. For almost no expense the P2P network can offer to TV networks and movie studios a service that is in many ways superior to Google's plan while also being substantially cheaper.
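For reason number one, a quick illustrative calculation shows why even a tenth of the going Internet-transit rate can still be profitable on capacity that would otherwise sit idle; every figure here is a made-up assumption.

```python
# Why selling idle capacity cheaply can still pay. All numbers are hypothetical.
internet_price_per_mbps = 10.00   # $/Mbps/month, assumed transit rate
p2p_price_per_mbps = internet_price_per_mbps / 10
cost_to_carry_per_mbps = 0.50     # assumed cost of carrying the extra traffic

idle_capacity_mbps = 5_000        # capacity that would otherwise earn nothing
revenue = idle_capacity_mbps * p2p_price_per_mbps
cost = idle_capacity_mbps * cost_to_carry_per_mbps
print(f"P2P revenue: ${revenue:,.0f}/month")
print(f"Added cost : ${cost:,.0f}/month")
print(f"Margin     : ${revenue - cost:,.0f}/month on otherwise idle capacity")
```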
Short of Microsoft waking up and smelling the coffee, this is the only way I can think of to beat Google or to even compete at all.