
Saturday, December 11, 2010

How evolution begat the cloud revolution

commentary Asking why cloud computing is happening today is something of a tautology. That's because an inclusive definition of cloud computing essentially equates it with a broad swath of the major advances happening in how IT is operated and delivered today.
Pervasive virtualization, fast application and service provisioning, elastic response to load changes, low-touch management, network-centric access, and the ability to move workloads from one location to another are all hallmarks of cloud computing. In other words, cloud computing is more of a shorthand for the "interesting stuff going on in IT" than it is a specific technology or approach.
Archaeopteryx is widely considered to be the first bird, but it actually had more in common with theropod dinosaurs than with modern birds.
(Credit: H. Raab/CC Wikimedia)
But that doesn't make the question meaningless. It would be hard to argue that there isn't a huge amount of excitement (and, yes, hype) around changing the way that we operate data centers, access applications, and deploy new services. So forget the cloud computing moniker if you will. Why is this broad-based rush to do things differently happening right now?
The answer lies in how largely evolutionary trends can, given the right circumstances, come together in a way that results in something that's quite revolutionary.
Take the Internet. The first ARPANET link--the Internet's predecessor--dates to 1969. Something akin to hypertext was first described by Vannevar Bush in a 1945 article and Apple shipped Hypercard in 1984. But it took the convergence of things like inexpensive personal computers with graphical user interfaces, faster and more standardized networking, the rise of scale-out servers, the World Wide Web, the Mosaic browser, open source software like Linux and Apache, and the start-up culture of Silicon Valley to usher in the Internet as we know it today. And that convergence, once it began, happened quite quickly and dramatically.
The same could be said of cloud computing. The following interrelated trends are among those converging to make cloud computing possible.
Comfort level with and maturation of mainstream server virtualization. Virtualization serves as the foundation for several types of cloud computing, including public Infrastructure-as-a-Service clouds like Amazon's and most private cloud implementations. So, in this respect, mature server virtualization software is a prerequisite for cloud computing. But the connection goes beyond technology. Increasingly ubiquitous virtualization has required that users get comfortable with the idea that they don't know exactly where their applications are physically running. Cloud computing is even more dependent on accepting a layer of abstraction between software and its hardware infrastructure.
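To make that abstraction concrete, here is a minimal sketch of provisioning a virtual server through an IaaS API, using the boto3 library for Amazon EC2. The image ID, instance type, and region are placeholders rather than anything from the article; the point is that the caller requests capacity without ever learning which physical host runs the workload.

    # A minimal provisioning sketch, assuming the boto3 library and
    # valid AWS credentials; the image ID and instance type below are
    # placeholders, not real values.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Ask the cloud for capacity; which physical host the hypervisor
    # schedules the VM onto is abstracted away from the caller.
    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # placeholder machine image
        InstanceType="t3.micro",          # placeholder instance size
        MinCount=1,
        MaxCount=1,
    )

    print("Provisioned", response["Instances"][0]["InstanceId"])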
The build-out of a vendor and software ecosystem alongside and on top of virtualization. From a technology perspective, cloud computing is about the layering of automation tools, including, over time, those for policy-based administration and self-service management. From this perspective, cloud computing is the logical outgrowth of virtualization-based services or--put another way--the layering of resource abstraction on top of the hardware abstraction that virtualization provides. Cloud computing can also involve concepts like pay-per-use pricing, but these too have existed in various forms in earlier generations of computing.
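As a rough illustration of the pay-per-use idea, here is a toy metering calculation; the instance sizes, rates, and usage records are invented for the example, and real cloud billing is far more granular.

    # Toy pay-per-use billing; rates and usage records are invented.
    HOURLY_RATES = {"small": 0.02, "medium": 0.08, "large": 0.32}  # USD/hour

    # (instance size, hours run), as a metering layer might record them
    usage = [("small", 720), ("medium", 48), ("large", 6)]

    bill = sum(HOURLY_RATES[size] * hours for size, hours in usage)
    print(f"Month-end charge: ${bill:.2f}")  # pay only for what ran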
Browser-based application access. The flip side of mobile workloads is mobility of access devices. Many enterprise applications historically depended on the use of specific client software. (In this respect, client-server and then PCs represented something of a step back relative to applications accessed with just a green-screen terminal.) The trend towards being able to access applications from any browser is essentially a prerequisite for the public cloud model and helps make internal IT more flexible as well. I'd argue that ubiquitous browser-based application access is one of the big differences between today's hosted software and Application Service Providers circa 2000.
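A trivial sketch of why browser-based access lowers the bar: the service below, built only on Python's standard library, is reachable from any device with a browser, with no client software to install (the port number is arbitrary).

    # A minimal HTTP service using only the standard library; any
    # browser can reach it, with no dedicated client software.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class AppHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(200)
            self.send_header("Content-Type", "text/html; charset=utf-8")
            self.end_headers()
            # Everything the client needs arrives as HTML over HTTP.
            self.wfile.write(b"<h1>Reachable from any browser</h1>")

    HTTPServer(("0.0.0.0", 8080), AppHandler).serve_forever()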
Mobility and the consumerization of IT are also driving the move to applications that aren't dependent on a specific client configuration or location. For more than a decade, we've seen an inexorable shift from PCs connected to a local area network to laptops running on Wi-Fi to an increasing diversity of devices hooked to all manner of networks. Fewer and fewer of these devices are even supplied by the company and many are used for both personal and business purposes. All this further reinforces the shift away from dedicated, hard-wired corporate computing assets.
The expectations created by consumer-oriented Web services. The likes of Facebook, Flickr, 37signals, Google, and Amazon (from both the Amazon Web Services and e-commerce perspectives) have raised the bar enormously when it comes to user expectations around ease of use, speed of improvement, and richness of interface. Enterprise IT departments rightly retort that they operate under a lot of constraints--whether data security, line-of-business requirements, or uptime--that a free social-media site does not. Nonetheless, the consumer Web sets the standard, and IT departments increasingly find users taking their IT into their own hands when the official solution isn't good enough. This forces IT to be faster and more flexible about deploying new services.
And none of these trends really had a single pivotal moment. Arguably, virtualization came closest with the advent of native hypervisors for x86 servers. But, even there, the foundational pieces dated to IBM mainframes in the 1960s, and it took a good decade after x86 virtualization arrived on the scene for it to move beyond consolidation and lightweight applications and become widespread for heavyweight business production.
The richness of Web applications and the way they're accessed are even more clearly evolutionary trends which, even now, are still morphing down a variety of paths, some of which will prove more viable than others. HTML5, Android, Chrome OS, smartphones, tablets, and 4G are just a few of the developments affecting how we access applications and what those applications look like.
Collectively, there's a big change afoot and cloud computing is as good a term for it as any. But we got here through largely evolutionary change that has come together into something more.
And that's a good thing. New computing ideas that require lots of ripping and replacing have a generally poor track record. So the fact that cloud computing is in many ways the result of evolution makes it more interesting, not less.


From: http://news.cnet.com/8301-13556_3-20025116-61.html#ixzz17uc0zjxJ

MPAA, RIAA: Lawsuits won't protect content



Trade groups representing the music and film sectors say copyright law gives ISPs too many excuses to do nothing to protect copyrighted content.
(Credit: Daniel Terdiman/CNET)
Lawyers representing independent filmmakers, including the studio that produced Oscar-winner "The Hurt Locker," might learn something from a document filed with the U.S. Department of Commerce today by music, television, and film industry trade groups.
The Commerce Department recently sent out a request for information, known as a "Notice of Inquiry," on "copyright policy, creativity, and innovation in the Internet economy." What the Commerce Department intends to do with the information it obtains was unclear this afternoon, but it did receive a response from nine trade groups representing the entertainment sector. That report contained a few notable points.
"The role of lawsuits in solving the online theft problem is clearly limited," wrote the coalition that included the Motion Picture Association of America (MPAA), the Recording Industry Association of America (RIAA), and American Federation of Television and Radio Artists (AFTRA). "For instance, bringing clear-cut claims against major commercial infringers is not by itself a solution in the long run," the coalition wrote. "These cases take years to litigate and are an enormous resource drain."

"Bringing clear-cut claims against major commercial infringers is not by itself a solution in the long run. These cases take years to litigate and are an enormous resource drain."
--Big media trade groups

As an example, the coalition cited the litigation against the company behind the LimeWire file-sharing network, which concluded this year with a federal district court ordering the company to shut down the network. The coalition wrote that though the four largest recording companies prevailed in the case, "the LimeWire defendants were able to drag out the litigation for four years. Such massive civil cases do not provide a scalable solution to the full scope of the problem."
In the case of Lime Wire, the company that operated the LimeWire software, the RIAA's antipiracy approach meant bringing suit against a company. That's different from the strategy adopted by Dunlap, Grubb & Weaver (DGW), the law firm that has filed copyright complaints against thousands of individuals accused of illegally sharing movies made by indie studios. But what the two approaches appear to have in common is that they cost a lot.
DGW has seen considerable opposition from the accused, and many of the cases now appear likely to drag on in the courts for some time. That likely means higher costs for the plaintiffs. The top-four labels pursued a similar legal strategy against individuals for five years but ended the practice in 2008.
The plan now by the labels and big Hollywood studios is to seek more copyright protection from the government. Here is some of what the coalition wrote in its report to the Commerce Department about the state of online piracy:
• Peer-to-peer file sharing continues to account for at least 25 percent of all broadband traffic worldwide. A very high proportion of this traffic involves unauthorized copies of movies, TV programming, sound recordings, and other copyrighted works.
• A recent Princeton University study found that approximately 99 percent of 1,021 BitTorrent files reviewed violated copyright. It is true that P2P's percentage share of total traffic is down from previous years, but in large part this is attributable to increased use of streaming services and cyberlockers as means for making stolen copyrighted materials available.
• McAfee estimates that the number of "live, active sites delivering illegitimate content" has sextupled since 2007.
The coalition complained that the Digital Millennium Copyright Act, the law that offers Internet service providers a safe harbor from copyright liability, gives companies too many loopholes. As the law reads now, the groups say, ISPs have too much of "an excuse to do nothing to combat pervasive and even blatant infringement."
Elsewhere in the report, the coalition used Google as an example of a company that once resisted requests for greater antipiracy efforts but is now moving in the right direction. Last week, Google announced it would stop doing business with members of AdSense, the advertising program that pays sites for running ads on their pages, if they were found to be trafficking in pirated content. Google also said it would be quicker to remove links to pirated content from its search results once notified by copyright owners.
The coalition noted, however, that not every search engine is cooperating.
"Even though highly effective automated systems for matching online content to copyright reference databases are readily available and are currently in use by some service providers," the coalition wrote, "other providers feel no obligation to implement them."