Mestex and the National Science Foundation Advisors Meet in Dallas

Although it has been far too long since I have posted to this blog due to my travel schedule, I have some news to share.

Over the last few days (Oct. 1 & 2) we have been participating in the Industry Advisory Board ("IAB") meeting of the NSF-I/UCRC ES2 ("National Science Foundation-Industry/University Cooperative Research Centers Energy Smart Electronic Systems") research consortium.  That mouthful of letters represents a group of universities and companies whose expressed goal is to reduce the energy consumption of data centers by 20-35%.

The consortium is currently working on fourteen research projects and Mestex serves as an advisor ("mentor") on three of those projects. Two of the Mestex-mentored projects cover evaporative and fresh air cooling of data centers; the third covers contaminants in data centers that use fresh air cooling. As you might guess, the projects on evaporative and fresh air cooling offer the greatest opportunity for the consortium to reach its stated goals. In order to support that research, Mestex has installed a small data pod at its facility in Dallas and is cooling that data pod with a commercially available Aztec ASC-5 unit. The ASC-5 has built-in DDC controls that facilitate the use of multiple temperature and humidity sensors for control without any special modifications. The controls also include a provision for pressure sensing control, which is also implemented in this case.
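For readers curious about the mechanics, pressure-based fan control of this kind generally amounts to a feedback loop that trims supply fan speed to hold a small positive pressure in the cold aisle. Here is a minimal sketch in Python; the setpoint, gains, and interface names are all hypothetical, and this shows only the general shape of such a loop, not the actual Aztec DDC logic.

```python
import time

# Illustrative PI loop holding cold-aisle pressure by trimming supply
# fan speed. All names and gains are hypothetical; this is not the
# actual Aztec DDC algorithm, just the general shape of the control.
SETPOINT_PA = 5.0        # target cold-aisle pressure above ambient, Pa
KP, KI = 2.0, 0.1        # proportional and integral gains (tuned per site)
DT = 1.0                 # control interval, seconds

def run_pressure_loop(read_pressure_pa, set_fan_percent):
    """Long-running PI loop; the two arguments stand in for the real
    sensor-read and fan-command interfaces."""
    integral = 0.0
    while True:
        error = SETPOINT_PA - read_pressure_pa()
        integral += error * DT
        output = 50.0 + KP * error + KI * integral
        set_fan_percent(max(20.0, min(100.0, output)))  # clamp to fan range
        time.sleep(DT)
```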

In addition to the data presented by the standard Aztec DDC controls, additional thermocouples and sensors are installed and streaming data to researchers at the University of Texas at Arlington.

One of the most critical considerations that prevents many data center operators from reducing their energy consumption by huge amounts is the reluctance to introduce outside air to the facility. The second Mestex project is focused on that research, and we were fortunate to have one of the world's experts on contamination control provide test coupons and laboratory analysis of the results. Dr. Prabjit "PJ" Singh, of IBM, provides guidance and analysis to companies around the world and is a major source of information for the ASHRAE TC 9.9 committee on data center cooling. Dr. Singh, Dr. Dereje Agonafer from the University of Texas at Arlington, and several members of the NSF IAB toured the Mestex facility at the conclusion of the meetings this week.

Drs. Singh and Agonafer are shown here learning about the technology behind the patented "Digital High Turndown Burner" that was developed at Mestex. Jim Jagers, Mestex Sales Manager, conducted the tour and provided a "deep dive" into how this unique technology works before the group proceeded to the research data pod for additional discussions.



An "Open Access Project" Update


The Mestex "Open Access Project" continues to move forward so I thought I would provide a brief update on the current research activity and the plans for the next few months.

The installation at the Mestex facilities in Dallas has been brought up to the expected final configuration with a total of 120 servers, intelligent PDUs, and switches distributed over 4 cabinets. We have separated the hot and cold aisles with a combination of a hard wall and flexible "curtains"...this has turned out to be one of the more important features of the installation. The indirect/direct evaporative cooling system is fully functional, although we have also found the need to increase the hot aisle exhaust pressure relief in order to reduce the "back pressure" in the hot aisle.

In addition to the combination temperature and humidity sensors that are part of the standard Aztec control system, and that the DDC system uses to manage the operation of the Aztec unit, we have also installed thirty-two 10K thermistors. These sensors feed information to our data acquisition system, which runs in the background collecting more granular detail about system performance. They are located on the fronts and backs of the cabinets.
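For anyone wiring similar 10K thermistors into a data acquisition system, the resistance-to-temperature conversion is commonly done with the Beta approximation. A minimal sketch, assuming a typical Beta value of 3950 (this varies by thermistor model, so check the datasheet):

```python
import math

def thermistor_temp_f(resistance_ohms, r0=10_000.0, t0_k=298.15, beta=3950.0):
    """Convert a 10K NTC thermistor resistance to deg F using the Beta
    approximation: 1/T = 1/T0 + (1/B)*ln(R/R0). B = 3950 is a common
    value but varies by part; check the thermistor datasheet."""
    inv_t = 1.0 / t0_k + math.log(resistance_ohms / r0) / beta
    t_kelvin = 1.0 / inv_t
    return (t_kelvin - 273.15) * 9.0 / 5.0 + 32.0

# Example: 10k ohms reads 77 F (25 C); lower resistance means warmer air.
print(round(thermistor_temp_f(10_000.0), 1))  # 77.0
print(round(thermistor_temp_f(8_000.0), 1))   # ~86.2 F (warmer)
```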

As I mentioned, we have spent some time resolving hot aisle/cold aisle separation issues. Although the Aztec unit is monitoring cold aisle pressure and operating the supply fan to maintain a target positive pressure in the cold aisle, we found that we still had hot aisle air migrating back into the cold aisle. Over the last few days we have spent time filling small gaps and sealing around the cabinets more carefully, and the results were immediately noticeable. The cold aisle temperature was reduced by 5 to 6 degrees F.

The other factor contributing to better separation was the reduction of the "back pressure" in the hot aisle. We had addressed some of this earlier by removing the standard room exhaust grill and replacing it with a screen that had much greater free area. While that made a measurable difference in server temperature rise, we had simply moved the pressure issue from inside the data pod to the return air ductwork on the Aztec unit. That has now been resolved by doubling the size of the pressure relief openings in the return ductwork. Supply fan operation is now improved, server temperature rise is on target, and supply fan motor power consumption has been reduced. We monitor and report real-time PUE for the pod, and these changes have lowered it to between 1.08 and 1.35, depending upon the system operating mode.
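For context, PUE (power usage effectiveness) is simply total facility power divided by IT equipment power, so a reading of 1.08 means cooling and power distribution overhead add only 8% on top of the IT load. A quick worked example with illustrative numbers (not the pod's actual meter readings):

```python
def pue(total_facility_kw, it_equipment_kw):
    """PUE = total facility power / IT equipment power; 1.0 is ideal."""
    return total_facility_kw / it_equipment_kw

# Illustrative numbers only: 12 kW of servers plus 1 kW of
# fans/pumps/controls gives a PUE of about 1.08.
print(round(pue(13.0, 12.0), 2))   # 1.08
```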

Now that we are beginning to see the kind of stable operation that we were anticipating we have started to plan the next phases of the research.

The Aztec unit is designed to operate in three modes, or some mixture of those modes, depending upon the sensor inputs.  The unit can operate in 100% fresh air cooling mode, in an indirect evaporative cooling mode, or in an indirect/direct evaporative cooling mode.  Each of those modes introduces characteristics that the data center industry wants to research. 
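To make the idea concrete, here is a simplified sketch of how sensor-driven mode selection might look. The thresholds and structure are illustrative only and are not the actual Aztec control algorithm.

```python
def select_cooling_mode(outside_db_f, outside_wb_f, cold_aisle_setpoint_f):
    """Pick an operating mode from outside dry-bulb and wet-bulb temps.
    Thresholds are illustrative, not the actual Aztec control logic."""
    if outside_db_f <= cold_aisle_setpoint_f - 5.0:
        return "100% fresh air"              # economizer alone is enough
    if outside_wb_f <= cold_aisle_setpoint_f - 15.0:
        return "indirect evaporative"        # wet-side cooling via the tower
    return "indirect/direct evaporative"     # add the direct stage for more cooling

print(select_cooling_mode(65.0, 55.0, 80.0))  # "100% fresh air"
print(select_cooling_mode(95.0, 62.0, 80.0))  # "indirect evaporative"
```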

The next round of research will focus on two aspects of fresh air/evaporative cooling:

  • We will be installing coupons in the space to collect data on contaminants and their potential impact on the circuits in the servers.  This project is expected to run for at least one month, with support provided by IBM.
  • Following the collection of this data (and possibly overlapping) we will be installing particle count measuring devices.  These devices will be installed upstream of the filters in the Aztec unit, downstream of the filters, within the cold aisle, and within the hot aisle.  The filter racks in the Aztec unit will allow us to evaluate filters of different MERV ratings and see how well they perform in a typical HVAC unit installation versus the controlled lab environment; the basic capture-efficiency arithmetic is sketched below.
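Once upstream and downstream particle counts are in hand, a filter's capture efficiency falls out of simple arithmetic. A minimal sketch with hypothetical counts:

```python
def filter_efficiency(upstream_count, downstream_count):
    """Fractional capture efficiency from particle counts taken
    upstream and downstream of the filter bank."""
    return 1.0 - downstream_count / upstream_count

# Hypothetical counts per cubic foot for 0.5-micron particles:
up, down = 120_000, 18_000
print(f"{filter_efficiency(up, down):.0%}")  # 85%
```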

As you can tell, this site offers a unique opportunity for researchers to take their lab research findings and compare them to a real world application with real world equipment.  Mestex is pleased to be a part of this NSF sponsored research into data center cooling technologies.  We will be hosting a tour for the industry advisory board of the NSF-I/UCRC during their upcoming meeting at the University of Texas at Arlington.

Education and Training

Since I have been traveling extensively over the last few weeks, I have not been able to give much thought to our blog. However, the travels have also provided a little fuel for some comments.

First, I continue to be surprised/pleased to hear more and more presentations and discussions about evaporative cooling of data centers.  It seems that "the big guys" get it...cooling data centers costs a fortune using compressors/chillers and the servers can handle much higher temperatures than people realize.  If you run down the roster of large international web service or cloud service providers you will find that most of them have already implemented evaporative cooling or they have it in the construction plans. 

As great as this is, there are still market forces conspiring against this highly efficient cooling solution. One is the concern over humidity levels in the data center. This concern is compounded by the common use of relative humidity as the point of discussion when it is actually absolute humidity that should be considered. This topic will likely be a point of debate for a long time to come, since some of the larger companies have concluded that absolute humidity doesn't matter in their facilities...especially with 2 or 3 year server refresh rates...while other members of this progressive group are not sure and choose the "safe path" of limiting absolute humidity or dewpoint in their spaces.
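A quick worked example shows why the distinction matters: two spaces at the same relative humidity can hold very different amounts of moisture depending on temperature. The sketch below uses the standard Magnus approximation for saturation vapor pressure; the conditions chosen are illustrative.

```python
import math

def humidity_ratio(dry_bulb_c, rh_percent, pressure_kpa=101.325):
    """Humidity ratio (lb of water per lb of dry air) from dry bulb and
    RH, using the Magnus approximation for saturation vapor pressure."""
    p_sat = 0.61094 * math.exp(17.625 * dry_bulb_c / (dry_bulb_c + 243.04))
    p_vap = (rh_percent / 100.0) * p_sat
    return 0.622 * p_vap / (pressure_kpa - p_vap)

# Same 50% RH, very different absolute moisture content:
print(round(humidity_ratio(18.0, 50.0), 4))  # ~0.0064 at 64 F
print(round(humidity_ratio(32.0, 50.0), 4))  # ~0.0149 at 90 F
```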

The one area where it seems that all of the large players agree is with regard to temperature. It is virtually universal that the ASHRAE TC 9.9 recommended guidelines are acceptable and, for many of these users, the ASHRAE TC 9.9 allowable temperatures are OK.

The challenge for the industry is still finding a way to filter this information and confidence down to smaller operators and owners.  I have heard it described as an education issue but is that truly the case?  It is hard to find a computer or data center related design publication these days that does not promote higher temperatures as a feasible solution for cutting operating costs.  Are we just too busy to read these articles or do we not believe the wealth of research and experience that backs up the statements?

At a recent conference on data center design I sat at a lunch table with a group of design engineers and a manager of 13 data centers. When asked how he learned about managing those centers, he responded that he was self-taught, by attending conferences and talking to "experienced" data center managers. So his knowledge of the work by ASHRAE and others was not a major factor in deciding on appropriate operating temperatures. What he was learning was what these other managers had been doing over the last decade...going back to the "old days" when electric costs were low, low data center temperatures were the norm, and research had not yet shown those low temperatures to be unnecessary.

So, if education is the issue, then how do we go about it? What mechanism will get the message through the daily clutter of information and time demands? I don't have the answer...if I did I would implement it immediately...but it seems to be a key to moving the industry forward.

NEWS RELEASE


New data center construction expected to boom as demand triples

 

Mestex Open Access Project helps data center operators plan for “build as you grow” expansion

 
DALLAS, April 28, 2014 – The digital revolution is sapping the power grid, but a new approach to data center construction may help reverse the trend of ever-increasing energy consumption for powering and cooling these facilities. To help data center operators better understand their options, Mestex, the industry leader in evaporative cooling systems, is providing a free tool to demonstrate how infrastructure can be better deployed to manage competing demands for more capacity and greater energy efficiency.

 “Data centers are the enablers of this digital revolution,” said Mike Kaler, president of Mestex. “The increase in global digital demand and cloud computing is exponential. As demand rises, data centers that house digital information consume more electricity, half of it being used to cool the facility. We wanted to help people see how energy is being consumed and ways for managing infrastructure and costs.”

The company believes intelligent technology combined with a flexible, scalable and energy-saving approach is the best way to “build as you grow.” Adding plug-and-play cooling units – such as Mestex’s own Aztec Evaporative Cooling Units – as capacity increases is the most economical strategy for data centers to manage expansion or new construction while reducing total cost of ownership. Aztec systems are proven to lower power usage by 70% when compared to traditional air conditioning; the system’s digital controls, when integrated with other building automation systems, can extend that savings even further.

To help data center operators get a realistic picture of how their own expansion might play out, the company recently launched the Mestex Open Access Project to provide information technologists, facility managers and financial executives the ability to evaluate energy-saving concepts in a real-world environment. 

“We’ve opened access to our equipment, controls and data, because we want to encourage energy savings and demonstrate to data center decision makers that there are smart, effective ways to increase efficiency and optimize operations,” Kaler said.

The web-based interface offers visibility into the physical plant and air conditioning system of an operating data center being tested as a part of a project spearheaded by the National Science Foundation. The “open access” gives anyone with Internet access an unembellished look at how a data center is operating, in real time, 24/7.

The Open Access Project harnesses the power of Mestex’s direct digital control (DDC) system, which comes standard on all of its HVAC products and can be easily integrated with other HVAC vendors’ products and building automation systems to create an intelligent network that controls cooling for optimal efficiency, performance and longevity, as well as provides web-based system monitoring and management.

 

Note:

Mestex President Mike Kaler will be hosting a presentation on mission-critical cooling systems on Wednesday, April 30, at 10:30 a.m. at AFCOM Data Center World at the Mirage Casino-Hotel, Las Vegas, Nevada. The company is exhibiting (booth #1227) at the conference April 28 – May 2.

Links:


Mestex Open Access Project Live View (To access, Internet Explorer 10 or above is required. Enter “guest” as the user name and password.)

 

# # #

 

Mestex (www.Mestex.com), a division of Mestek, Inc., is a group of HVAC manufacturers with a focus on air handling and a passion for innovation. Mestex is the only HVAC manufacturer offering industry-standard direct digital controls on virtually all of its products, which include Applied Air, Alton, Aztec, Koldwave, Temprite and LJ Wing HVAC systems.

 

Media Contact:

Christina Divigard

Divigard & Associates

413 341 6780 or Christina@Divigard.com

 

 

The Amazon Effect

I recently attended an interesting conference on the impact of E-commerce on the world of logistics and warehousing.  Of course, virtually everyone knows about Amazon and their E-commerce model.  "The Amazon Effect" is used to describe E-commerce in general and the economic impact on communities of having an Amazon facility in their area.

But I think that the part of the story that is not discussed enough is the changing nature of the facilities and the people in the facilities. 


Years ago a warehouse was a warehouse.  These big, uninsulated boxes were filled with metal racks stretching to the ceiling and covering the floor from wall to wall.  Those racks were at least partially filled with finished goods that would eventually be located manually and pulled from the racks with a human-driven forklift.  From there the finished product would be taken to a truck at the loading dock and sent on its way.  I actually spent a summer during my school years, eons ago, locating products and driving the forklift to the truck.  I had a clipboard (that was like a tablet computer but held actual paper and used something called a "pen" to mark things off on the paper) and the process was tedious.  The building itself was hot in the summer and cold in the winter, but we just dressed for it.

An E-commerce facility today is only similar in that the box is still big and there are still metal racks, but that is about it.  In the first place, those racks may not be used to store finished goods but components that can be used to create a finished good.  Maybe it is a shirt and tie that are packaged into a set.  Maybe it is a cell phone, battery, charging cable, etc., packaged into a retail package.

This approach is designed to give the end customer flexibility.  Order the color, size, accessories that you want to make the purchase unique to you and the E-commerce company "fulfills" that order to your specification.  So, now, you have "fulfillment" centers instead of warehouses and they are occupied by dozens of workers using computers to configure your package to your needs.  In many cases the components that these workers put together are not retrieved by a human but by a robotic retrieval system.  The robots, and the human workers, all receive their instructions from on-site servers processing thousands of orders a day.

So, the old warehouses of my school days are now air conditioned, filtered, well-lit, high tech "factories" with their own small data center.  The challenge for the HVAC systems is how to handle three different requirements in those buildings.

The area used by workers to fulfill the orders needs comfortable temperature conditions but only in the lowest 7 or 8 feet of the building height.  The rack area where the robots run around might need temperature control for the entire 35 or 40 feet of the building height depending upon the product storage requirements.  And the on-site data center needs filtered fresh air, or evaporative cooling, to keep the servers running at an affordable operating cost. 

And to compound the problem the E-commerce company is probably growing so fast that the system configuration today will be obsolete in 2 or 3 years.

Satisfying all of those requirements, and providing future flexibility, requires the kind of analysis that a tool like CFD can provide.  Being able to create the building and experiment with equipment locations and sizes before the space is built, or reconfigured, has tremendous value.  Mistakes can be avoided and performance can be optimized to the requirements of the area being served.  Mestex has invested heavily over the last 15 years in CFD software, computers, and training so that we can perform the kind of analysis required.  We use this tool almost daily to help designers and owners make the best equipment selection for their project.

If you are involved in the E-commerce world and need to know the best type and location of the HVAC equipment for your project please feel free to contact us at www.mestex.com.

NSF/Mestex Research Project Update at SEMI-THERM Conference in San Jose, California

Students from the University of Texas at Arlington will be presenting an update on the progress of a research consortium, partially funded by the National Science Foundation, that is focused on improving the efficiency of data center cooling. This presentation will be made during the SEMI-THERM conference in San Jose, California from March 9-13.

This exhibit presents updates on the project since the last industrial advisory board (IAB) meeting at Villanova University in September 2013. The updates include completion of construction of an Aztec ASC-15 cooling unit, attachment of the cooling unit to an IT pod, construction of internal details of the IT pod, construction of a duct for testing various cooling pads, and creation of a computational fluid dynamics (CFD) model of the IT pod and the ASC-15 unit.

The cooling unit, the ASC-15, which is capable of operating in pure air-side economization, direct evaporative cooling, indirect evaporative cooling, and/or hybrid modes, contains two blowers that can deliver up to 7000 CFM to the IT pod. Various parameters of the cooling unit, such as blower rotational speed, inlet air temperature, supply air temperature, outside air humidity, etc., are available through an online portal. The ASC-15 is connected to the IT pod at the Mestex facility, which provides power and water to the modular research data center. Inside the IT pod, four cabinets, each containing thirty HP SE1102 servers, are placed in a hot/cold aisle configuration.

One of the HP SE1102 servers was tested in the UT-Arlington lab to determine its maximum power consumption. The maximum measured power consumption is used to calculate total dissipated heat per rack in the CFD model of the modular research data center. This CFD model will continue to be updated as changes are made to the IT pod or the cooling unit. For example, updates to the cooling pad model will be applied based on results from the various wet cooling pad tests to be performed at UT-Arlington.
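As a rough illustration of how a measured server power number becomes a CFD boundary condition, consider the arithmetic below. The 180 W per server figure is purely illustrative, not the measured UT-Arlington value; the airflow relation is the standard air-side sensible heat formula.

```python
def rack_heat_kw(servers_per_rack, watts_per_server):
    """Total heat dissipated per rack, assuming essentially all server
    power ends up as heat in the airstream."""
    return servers_per_rack * watts_per_server / 1000.0

def required_cfm(heat_kw, delta_t_f):
    """Airflow needed for a given temperature rise, from the standard
    air-side sensible heat relation Q[BTU/h] ~= 1.08 * CFM * dT[F]."""
    return heat_kw * 3412.0 / (1.08 * delta_t_f)

# Illustrative: 30 servers at an assumed 180 W each, 12 F rise target.
q = rack_heat_kw(30, 180.0)            # 5.4 kW per rack
print(round(required_cfm(q, 12.0)))    # ~1422 CFM per rack
```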

"Open Access Project" Update

Mestex is continuing to refine the systems and information stream as part of our "Open Access Project".  As a reminder, Mestex has established a research data pod as a member of an NSF project to improve the efficiency of data centers.  This pod is being cooled with an Aztec IDEC (indirect/direct evaporative cooling) system managed by the Mestex DDC control system.

The goals of the research project require frequent experiments.  Currently, Mestex is continuing to complete the basic infrastructure of the pod itself.  The video in the link below shows a quick walk-through update on the installation of a containment system.  As you can see, the installation is nearing completion.  In the meantime we are running the 51 servers, a 5 kW resistive load, and some ancillary equipment in the 4 cabinets that are in the pod.  https://onedrive.live.com/redir?resid=B173953F824409BC!7016&authkey=!AKcxEKWl7409R-8&ithint=video%2c.MP4

Since the IT equipment is operating while we finish the containment installation, we are able to watch the performance of the IDEC cooling system in real time via http://webctrl.aztec-server-cooling.com/.  This website is accessible to anyone; viewers can log on using "guest" as both the username and password.

One of the areas that concerns data center designers and operators is the performance of evaporative cooling systems during "shoulder" conditions, when ambient dry bulb temperatures are relatively low and ambient RH is relatively high.  Last week we had an opportunity to observe just that situation.

The time was 7:55 in the morning.  The ambient conditions were what most people would consider the worst scenario for evaporative cooling…relatively low DB and relatively high RH…in this case, 59.6°F DB and 89% RH.  Under those conditions most people would expect the cold aisle conditions to be either too warm or too humid.

However, the cold aisle was operating at 78.9°F DB and 56.7% RH.  The cold aisle setpoint is 80°F DB.  The cooling tower integrated into the Aztec IDEC unit was on, the airflow dampers were positioned for 100% return from the hot aisle, and the system was providing enough cooling (even with this high ambient RH) to maintain the cold aisle temperature and an 11 to 12 degree F rise across the servers.  Note also that these conditions are inside the ASHRAE TC 9.9 A1 Allowable limits.
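The reason this works is that evaporative performance is governed by wet-bulb temperature, not RH: at 59.6°F DB, even 89% RH corresponds to a wet bulb of only about 57 to 58°F, leaving a large depression below the 80°F setpoint. A back-of-the-envelope check of the indirect stage, using an illustrative heat exchanger effectiveness of 0.7:

```python
def indirect_evap_supply_f(process_air_f, outdoor_wet_bulb_f, effectiveness=0.7):
    """Approximate indirect-evaporative supply temperature: process air
    is cooled toward the outdoor wet bulb, scaled by the heat exchanger
    effectiveness (0.7 here is an illustrative value)."""
    return process_air_f - effectiveness * (process_air_f - outdoor_wet_bulb_f)

# Hot-aisle return near 91 F (78.9 F cold aisle + ~12 F server rise),
# outdoor wet bulb ~58 F at 59.6 F DB / 89% RH:
print(round(indirect_evap_supply_f(91.0, 58.0), 1))  # ~67.9 F, well under the 80 F setpoint
```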
 
The "Open Access Project" will be continuing throughout 2014 and, most likely, into 2015.  This will provide ample opportunity to observe a wide range of operating environments for the IDEC system.  During that time there will also be research on filtration, fresh air cooling, and further refinement of control algorithms for fully integrated IDEC systems such as the Aztec product.
 

What Doesn't Kill You Makes You Stronger

Over the last week or so I have weighed whether or not to post a small rant.  I have decided that I should say something about education...STEM education in particular.  For those who don't keep up with the thousands of acronyms that float around us all the time STEM stands for "Science, Technology, Engineering, and Math".  As you might guess this is an area of great interest to me as president of a business that depends upon technical talent for growth.

The catalyst for this rant was a conversation with a mathematics professor at a local community college.  It seems that this professor was named to a committee whose mission was to find ways to increase the "success rate" of the math department at her campus.  The issue at hand was money.  Funding for the college is linked to the number of students who "pass" the courses.  I am sure that you can guess what the college administration wants to see..."adjust" the testing standards to allow more students to pass.  It seems that under that "recommendation" a student who scores only a 67 on a basic math test will be awarded a "C".  My guess is that if this does not produce the desired "success rate" then the bar will be lowered again.  Is this really the right direction for education...especially at the college level?

To add fuel to my internal fire the state legislature has decided that Algebra 2 is too difficult for high school students so they have voted to have it dropped from the graduation requirements.  What?  Haven't the news media talking heads spread the message that we need STEM students and that our country is falling behind others in those key areas of study? 

One argument that I have heard is that students no longer need as much math because they have computers to solve problems for them.  Well, where did the computers come from?  Who wrote the software that allows the computers to reach the conclusions that these students will depend upon?

I have also heard that students who have no intention of going into engineering or science should not be saddled with learning as much math.  Those students who intend to go into STEM fields can get their math at the college level (where they can pass more easily so that the school can get funding).  But that thinking misses one of the intrinsic values of a math education...the ability to think logically and develop critical thinking skills.

A journalism major, an art major, a liberal arts major will all want a job at some point.  Whether they are writing an article, creating appealing art, or curating a museum collection the ability to clearly communicate is paramount to success.  And clear communication requires logical thinking skills that a solid math education will instill in any person. 

Is math "hard"?  Yes, for some people but that does not mean it should be avoided.  Remember, what doesn't kill you makes you stronger...and taking math hasn't killed anyone that I know.

SHOWTIME!!!

I guess it must be that time of the year because the trade shows are starting to pick up steam.  The Dallas division of Mestek, Mestex, has several upcoming trade show events.

First up will be the annual AHR exhibit in New York from January 21 to 23.  Mestex will be exhibiting a demonstration model of the industry's most advanced indirect evaporative cooling system, the Aztec ASC.  This small-scale model illustrates the use of high heat transfer copper tube/aluminum fin cooling coils, redundant direct drive plenum fans, and the industry-leading DDC control system. You can see this display in booth 1503.


Next will be the annual Rental Show in Orlando from February 8 to 13.  For this trade show Mestex will be exhibiting our new line of Koldwave air-cooled portable air conditioners.  These units are perfect for rental companies.  You can meet with Koldwave sales manager Jeff Wilson in booth 1864.

The "Open Access Project"



In the data center and technology world there is an on-going "discussion" about "openness".  Facebook really kicked this into high gear with their "Open Compute Project" where they share design specifics for their data center product designs in order to encourage energy savings across the data center design community.  This bold move allows virtually anyone to take a peek into how Facebook manages the hardware side of their business and how their equipment operates in the real world environment.

Here at Mestex we have taken our own steps in the same direction.  Anyone, anywhere in the world, who has Internet access can see some of our HVAC product designs operating in the real world environment of heating and cooling our facilities in Dallas, Texas.  How many other HVAC product manufacturers provide such "open access" to their equipment in real time, in the real world?

By logging in as a guest user (adaptaire.appliedair.com and use "guest" for both username and password) anyone can see how our DOAS unit is operating to serve our engineering department, or how a direct fired unit is operating to heat one of our production buildings, or how an air turnover unit is performing to cool a large uninsulated warehouse building, or how an indirect/direct evaporative cooling unit is functioning to control temperatures in a research server room pod located outside our buildings.


No filters, no marketing BS, no "wordsmithing" of performance...just raw, unadulterated data and graphics showing what the equipment is doing in real time and how it is performing against the setpoints and goals set for it.   Does this take guts?  Of course it does, but if you have been in the HVAC business for as long as Mestex has, and you have the proven track record that we have, then why not share how equipment really functions and give people a chance to evaluate concepts in the real world?

Our control software continues to be refined and developed based upon user feedback and our own research projects, so the information on the display changes from time to time.  It now includes electrical energy data: kW demand and kWh consumption for each piece of equipment on the display.  Again, this gives viewers an idea of the true cost of operation for various types of equipment.
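For viewers unfamiliar with the distinction, kW demand is the instantaneous draw, while kWh consumption is that demand integrated over time. A trivial sketch with hypothetical 15-minute readings:

```python
def kwh_from_samples(kw_samples, interval_minutes):
    """Integrate evenly spaced kW demand samples into kWh consumption."""
    return sum(kw_samples) * interval_minutes / 60.0

# Hypothetical 15-minute demand readings over one hour:
print(kwh_from_samples([12.0, 12.5, 13.0, 12.5], 15))  # 12.5 kWh
```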

So, for real world information about data center equipment you can go to Facebook's "Open Compute Project" and for real world information about HVAC equipment you can go to Mestex's "Open Access Project".