The Complexities of Content

Gone are the days of a simple Web page with one or two people updating it. Nowadays, you have dozens, if not hundreds, of servers and computers all over the world trying to serve up this content as fast as possible, with updates coming from any number of people, including the public.
For Web operations and IT managers who are responsible for uptime and availability, this makes things a whole lot more complex. Back near the turn of the century, when sites created and pushed content out more or less from a single source, performance was in the hands of the site owners.
They were responsible for implementing and configuring the capacity they needed to handle their expected traffic - and when there were performance lapses, they looked in the mirror to find the sources and solutions.
After 9/11, for example, many of the major news sites went down for hours or days.
They simply weren't designed to handle the tremendous surge in visitors as Americans flocked to the Internet for news and updates.
These were perhaps the most significant, spontaneous flash crowds that the adolescent Internet saw, and many sites' inability to handle them was painfully obvious.
But the problems were mainly internal capacity and external bandwidth.
Fast forward to today.
In less than a decade since 9/11, the capacity of the Internet overall and of individual Web sites to handle traffic has increased exponentially - as have user expectations that sites will be available and fast 24/7/365.
Today, Web sites are no longer single-source, singly hosted affairs; content is often fed from multiple external sources to populate a page.
Bandwidth-intensive, processor-hungry video is everywhere, and is the life blood of many media sites.
Flash-crowd events large and small are not uncommon, and by and large, most sites take them reasonably in stride.
Site crashing is a much rarer phenomenon, even in the crush of traffic after a tsunami, an historic election, or a plane landing in the Hudson.
Site performance, however, can still be significantly degraded by a major surge in traffic.
Who's on First? What's on Second?
Site owners are more pressured than ever to deliver the fast, flawless experiences users now demand, and can often find at a competitor's site.
Monitoring and measuring a site's performance is no longer the simple task of measuring overall page load time. On its own, the information that the site is running slow gives a webmaster nothing to act on.
Is it their own content? The CDN that's pushing out their videos? The sister site that's hosting their image library? The Flash banner promoting upcoming programming on their TV network? Or the ad network servers that supply the bulk of the site's revenue? How does the site owner identify the bottlenecks, and gain actionable data to demand better performance from weak providers in the content chain?
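One way to start answering those questions is to break timings out by the host that served each resource, rather than looking only at a single aggregate load time. The sketch below is a minimal, hypothetical example using the browser's standard Resource Timing API (performance.getEntriesByType("resource")) to group resource durations by hostname; the reporting format and any thresholds you might apply are illustrative assumptions, not a prescribed monitoring setup.

```typescript
// Minimal sketch: group resource load times by serving host so slow
// third-party providers (CDN, ad network, image host, etc.) stand out.
// Assumes it runs in the browser after page load; names are illustrative.

interface HostStats {
  requests: number;
  totalMs: number;
  slowestMs: number;
}

function summarizeByHost(): Map<string, HostStats> {
  const stats = new Map<string, HostStats>();
  const entries = performance.getEntriesByType("resource") as PerformanceResourceTiming[];

  for (const entry of entries) {
    const host = new URL(entry.name).hostname;
    const current = stats.get(host) ?? { requests: 0, totalMs: 0, slowestMs: 0 };
    current.requests += 1;
    current.totalMs += entry.duration;
    current.slowestMs = Math.max(current.slowestMs, entry.duration);
    stats.set(host, current);
  }
  return stats;
}

// Log the hosts that contribute the most total load time, worst first.
const ranked = [...summarizeByHost()].sort((a, b) => b[1].totalMs - a[1].totalMs);
for (const [host, s] of ranked) {
  console.log(`${host}: ${s.requests} requests, ${s.totalMs.toFixed(0)} ms total, slowest ${s.slowestMs.toFixed(0)} ms`);
}
```

A breakdown like this won't fix a slow ad server, but it does tell the site owner which provider in the content chain to take the performance conversation to.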