Microsoft Goes Green

Microsoft's new Redmond Ridge 1 facility, at 57,000 square feet, will eventually take the place of the server labs currently operating on the company's main campus just down the road. According to Microsoft's Environmental Sustainability Blog, the new facility will use one-third less energy than those server labs, delivering "overall carbon savings of 12,000 metric tons per year."

This rollover to a consolidated facility will mark a change in the workflow for developers on Microsoft's main campus. Previously, product groups managed their own in-building server labs for development and testing, which probably worked out well on those late nights when a frustrated programmer wanted to march downstairs and personally threaten a misbehaving piece of equipment with a screwdriver; now, in the name of boosted corporate efficiency and environmental sustainability, that same programmer will need to climb into his or her car and drive eight miles down the road.

By the time Redmond Ridge 1 operates at full capacity, predicted by Microsoft to be spring 2010, it will host 35,000 servers within self-contained pods. "Each Pod has an individual Uninterrupted Power Source (UPS) and rooftop air handler, direct network connectivity and dynamic generator backup," says the blog. "A state-of-the-art cooling design that relies on evaporative coolers and basic physics to cool the facility—instead of chillers—helps drive much of the reduction in energy for the facility." The facility's servers will leverage Windows Server 2008 R2 and Hyper-V, so that testers and developers can access the machines they need from locations either on Redmond's main campus or around the world.

Server farms will only increase in size in coming years; if you're a business like Microsoft, your annual cost for the millions of kilowatt-hours required to power those facilities will soon approach the GDP of a small to medium-size country. In those cases, building a new, "green" facility makes economic sense (and it supplies a PR boost, since you can say you've gone greener). For enterprises and SMBs (small to medium-sized businesses) without billions of dollars in a vault, a Gartner report published late in 2008 lists a few simple ways to save energy (and money) with their own server setups:

1. Plug holes in any raised floor.
2. Install blanking panels to manage airflow.
3. Coordinate CRAC (computer room air-conditioning) units, pairing older ones with newer to best combine their efforts.
4. Implement "hot" aisles and "cold" aisles.
5. Install sensors to measure temperature data and correct problems.
6. Implement cold-aisle or hot-aisle containment, to better separate cold supply air from hot exhaust air.
7. Raise the data center temperature to an efficient standard.
8. Install variable-speed fans and pumps.
9. Exploit "free cooling," such as air-side economization and water-side economization, which can provide up to 8,000 free cooling hours per year, depending on the climate.
10. Design new data centers to exploit "modular cooling" (in-row or on-rack).
11. Improve underfloor airflow, which will allow for more efficient distribution of cold air.

The number of server-intensive facilities, and their carbon footprint, will likely only increase as cloud computing gains prevalence among both consumers and the enterprise; any steps that can be taken to reduce that footprint not only save money (always a good thing) but might even spare a few polar bears (also a good thing, unless one of them is trying to turn you into an appetizer).
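To get a feel for why cooling fixes like the ones above pay off, it helps to look at power usage effectiveness (PUE): total facility power divided by the power actually consumed by IT equipment. The sketch below is a back-of-envelope calculator, not anything from Microsoft or the Gartner report; the IT load, electricity rate, and before/after PUE figures are all assumed numbers chosen purely for illustration.

```python
# Back-of-envelope data-center energy cost model (illustrative only).
# All figures below are assumptions, not numbers from Microsoft or Gartner.

def annual_energy_cost(it_load_kw, pue, rate_per_kwh, hours=8760):
    """Yearly electricity cost for a facility.

    IT load multiplied by PUE gives total facility draw in kW;
    multiply by hours per year and the utility rate to get dollars.
    """
    return it_load_kw * pue * hours * rate_per_kwh

it_load_kw = 2000   # assumed 2 MW of server (IT) load
rate = 0.07         # assumed $0.07 per kWh

# Assumed PUE of 2.0 before cooling improvements, 1.3 after.
before = annual_energy_cost(it_load_kw, pue=2.0, rate_per_kwh=rate)
after = annual_energy_cost(it_load_kw, pue=1.3, rate_per_kwh=rate)

print(f"Cost at PUE 2.0: ${before:,.0f}")
print(f"Cost at PUE 1.3: ${after:,.0f}")
print(f"Annual savings:  ${before - after:,.0f}")
```

Even with these made-up inputs, dropping PUE from 2.0 to 1.3 cuts the hypothetical facility's power bill by roughly a third, which is the kind of margin that makes blanking panels and aisle containment look cheap.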


Posted By: KirubaKaran

Microsoft Certified Technology Specialist
