A little ditty for you tech and gear heads out there:
Some of you may spend your weekends and nights with family, on the computer
or partaking in various hobbies. I do most of that but I somehow keep finding
myself turning a wrench lately. Here is a little background. I own a beautiful,
plush and sporty, ’97 Dodge Dakota pickup truck. Yes, it’s old, and yes, there is
some paint missing =}. At some point, call it a few years ago, I took my
truck into the shop for a small repair. It ended up costing me big time. It was
amazing how something that seemed like a simple problem at the time ended up
costing a fortune to fix. Oh, and by the way, after the problem was
"fixed", I ended up bringing the truck back as it broke down again.
Same part failure a week after repair, thanks!
After this experience I swore I would do all my own work. The old take-ownership
model at its very finest. Well, that mentality has slowly escalated
from changing my own oil to doing my own brake jobs all the way up to
rebuilding my engine (ugh). A few weekends ago, I decided to tinker with a
broken bolt to get it out of my cylinder head, as I believed it was causing an
exhaust leak that in turn was creating an error code on my dash. Specifically,
a cylinder misfire in combustion chambers 1, 3 and 5.
So I thought up the splendid idea to break out the trusty power drill and
dig that bolt right out. With drill in hand I proceeded to start digging. After
a few moments, I actually thought I had poked through the stud and hit the back
of the bolt hole as I felt the drill pop forward slightly and then hit a solid
object. The feeling of poking through a bolt startled me a bit but I didn’t
think anything of it at the time. With a grin on my face and a triumphant swagger
I walked back over to the driver’s side seat and turned the key to start the
truck up. The motor started instantly, but to my horror, I could hear a loud
water splashing sound even over the absolutely ferocious roar of my epic motor.
From the driver seat I looked over to the right hand corner of my garage where
the sound was coming from and saw something that I don't think many car owners
will ever see. There was a high pressure stream of neon yellow fluid splashing
against the wall (yes this is coolant). Immediately, I shut off my engine and
sprinted over to the side of my truck. Sure enough, I had poked a hole right
through my cylinder head into a coolant chamber.
So to make this story short, I have spent the last two weeks tearing down my
engine to the block. Not fun. The moral of this story gives me the opportunity to reflect on our recent launch of the ST EYE. The ST EYE is a unique application that gives mobile device users, technicians and professionals in the data center an unprecedented ability to utilize Bluetooth technology to interface with Server Technology PDUs. This technology is the first of its kind and enables users to remotely view and monitor critical power, environmental and system statistics right from the palm of their hand. The nice part is that there is no need to connect a computer, open the rack up or physically connect to the rack PDU in question over the network. If there had been a similar application for my truck before I started my project, I would be better off today. There is a good read on the ST EYE application here: http://www.processor.com/ if you want to learn more and stop reading about my mess.
I will spare you all the rest of the details on the truck but stay tuned, I
will post how the rebuild goes.
I relate to a recent piece in Information Week. It discusses the balance between the value of the cloud and the comfort that comes with visibility into the electronics operating underneath it that make it go. The thought of turning over all of the technical underpinnings that run my business to a firm I don’t know in depth leaves me queasy. Yet the appeal of letting go of the worry and moving on to more important, value-added activities holds a great attraction for me.
Powering a datacenter is a complex undertaking. Getting the right amount of power to the right point of use, and making it always available, can be a daunting task. When you move your business over to a cloud-based delivery model, you want to know that there are people and systems in place that value your business and your customers the same way you do. To
deliver on that requires a lot of knowledge, such as the insight gained on power through the use of Server Technology PDUs in the datacenter. You owe it to yourself to ask the hard questions: How does your cloud provider achieve power path redundancy to the cabinet? What is their SLA for you? What brand of power infrastructure are they using? If Server Technology is not on their list, steer clear. We offer the best engineered PDUs and the best customer service in the industry. Period. Why trust your cloud and your business to anything less?
I'm not really much of a techie, though I play one on TV. The season of gift giving does bring out the tech geek in me. I like to start with the Top 10 type lists like the one on CNNMoney. I've already purchased one of those for a gift and am leaning toward a couple more. Then comes the nearly daily search for hot tech toys, many of which are for myself <sigh>.
Going back to Cisco's forecast showing tremendous growth in internet traffic over the next few years, it seems that tech toys really are the driving factor. What really blows my mind is that we haven't even broken the threshold of general adoption of the connected home yet.
As we go forward, I imagine it won't just be what we now call "mobile devices" connecting on our home networks, nor even simply the expected additional fixed home devices like refrigerators; but, it will be all manner of tools and toys. Though many of us parents lament the loss of the days of the rocking horse as they get buried behind the new days of the tablet, I see kids still engaging with physical toys and likely to engage in the merger of the tablet with new physical toys of the future. It will be the remote control car or helicopter or robot on steroids, so to speak. Program the tablet "game" with actions and watch the toy follow. Ooh, I can't wait...
Calvin Nicholson, Sr. Director of SW and FW Eng.
Server Technology Inc.
The back of the data center equipment cabinet is becoming a more hostile environment to work in, for several reasons:
1) Due to security and uptime concerns, more cabinets are now being locked, and data center cabinet access is harder to come by. In fact, most data center operators know that the more they can keep people out of the cabinets, the less likely they are to have problems that result in downtime.
2) Due to higher costs and the resulting need for greater efficiency, changes like hot aisle isolation and increasing inlet temperatures are resulting in greater temperatures within the hot aisle, making it a harder place to work for extended periods of time, or even requiring special gear to work there.
3) If you do access the hot aisle looking for information such as current load values on a cabinet Power Distribution Unit (PDU), then due to poor lighting conditions, numerous cables and other factors, the display needs to be very large and bright. Trying to read information from smaller displays is trying at best.
These factors, along with new trends like BYOD (bring your own device) and proven technologies like Bluetooth communications and QR codes, suggest interesting solutions to the problems mentioned above. Imagine walking down the data center aisle and, on your Android tablet or phone, being able to easily access system, power and environmental information from a device in the back of the cabinet, like a cabinet PDU. Not only do you not have to go into the hot aisle, but once there you don’t have to log into the device using a PC or try to push buttons while on your hands and knees, peering at a small, dim display through a mass of cables.
This technology is now available using a simple app called “ST Eye” (available on Google Play) and a plug-and-play Bluetooth module connected to a Server Technology CDU. Simply launch the app, scan the QR code or activate the module, and you have instant access via your Android tablet to key system, power and environmental information from an STI cabinet power distribution unit.
Peyton Manning has racked up some very impressive statistics in the 2013 NFL football season. 5 games, 5 wins, 20 touchdown passes, 1 interception, a pass completion rate over 80%. An outstanding performance for his 16th season of professional football.
That NSA data center (in Utah) was never going to store a yottabyte of our data…
“Want to store a yottabyte? Expect to pay trillions.” (1) We spent $2B to build it (so far). The only technology that could yield a yottabyte of storage in a volume compatible with the NSA building is DNA based storage, which would have an estimated volume of 1 cubic meter for a yottabyte of data. But the speed and accuracy of retrieval from this technology remains to be seen.
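That "trillions" claim is easy to sanity-check with back-of-the-envelope arithmetic. The sketch below assumes an illustrative commodity-drive price of $25 per terabyte (my assumption, not a figure from the article):

```python
# Rough cost of a yottabyte of raw disk storage.
# ASSUMPTION: $25/TB is an illustrative commodity-drive price, not from the article.
YOTTABYTE_BYTES = 10**24
TERABYTE_BYTES = 10**12
USD_PER_TB = 25

# A yottabyte is a trillion terabytes.
terabytes = YOTTABYTE_BYTES // TERABYTE_BYTES

# At $25/TB that comes to $25 trillion in drives alone,
# before power, cooling, controllers or the building.
cost_usd = terabytes * USD_PER_TB

print(f"{terabytes:,} TB, roughly ${cost_usd:,}")
```

Even under generous pricing assumptions, "expect to pay trillions" holds up.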
The design for the Utah data center is for 65 megawatts of power into the facility. But this week’s news indicates that the facility may be having some power issues.
“Chronic electrical surges at the massive new data-storage facility central to the National Security Agency's spying operation have destroyed hundreds of thousands of dollars worth of machinery and delayed the center's opening for a year, according to project documents and current and former officials.
There have been 10 meltdowns in the past 13 months that have prevented the NSA from using computers at its new Utah data-storage center, slated to be the spy agency's largest, according to project documents reviewed by The Wall Street Journal” (2)
With those kinds of numbers, I would say the NSA is having an Eli Manning season so far (0-6, averaging 3 interceptions per game), not a Peyton Manning season.
“Five colors, any unit, no extra charge with standard
shipping time”—it’s not just a slogan. We supply our customers with color PDUs
every day. Server Technology’s Color PDU selection allows data center and
electrical professionals to quickly and easily identify power distribution
pathways to critical data center equipment. Our easy-to-manage selection
of Color PDUs (red, green, blue, black and white) allows for easy visual
identification of power paths at and in the rack. Our solution offers visual recognition
at many different angles and lighting conditions within the data center
equipment cabinet. Ultimately, this easy-to-understand color offering
reduces human errors when plugging in PDUs to power sources or in creating
redundancy for critical equipment.
When you choose Server Technology’s Color PDUs, you receive
unmatched flexibility, cost, delivery times and quality. For example, if your
color scheme changes down the road, you can simply change the color format when
needed without having to order new PDUs. Additionally, since our color
options are in stock, you never face a minimum order quantity, price increase
or delivery delay that may come with other coloring methods on the market.
Bottom line, if you want all the benefits of a Color Coded PDU to match your upstream infrastructure without the associated costs, minimum orders and longer lead times, Server Technology is the right choice for you and your team.
After a fabulous dinner at my favorite Italian restaurant in NY on West 46th street http://becco-nyc.com/ the night before, I was up early and ready to attend my first Open Compute Engineering Summit. This one was hosted at Goldman Sachs and after a cab ride and a short ferry ride, I was on site looking at a great view of The Statue of Liberty and Ellis Island.
Facebook has always been interested in increasing efficiencies and promoting open standards within the data center community, and they originally founded Open Compute based on these principles. Server Technology was the original supplier of their 480/277 V cabinet PDU, designed to take advantage of a couple of key facts. The first is that servers run more efficiently at higher voltages (just take a look at the specifications for the power supplies of the servers you are now purchasing). The second is that the North American power scheme, which takes 480 V into the building and reduces it all the way down to 120 V (two 120 V phases measured line-to-line create North American 208 V), is very inefficient.
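The voltage figures above fall out of basic three-phase math: in a wye system, line-to-neutral voltage is line-to-line voltage divided by √3. A quick sketch (generic electrical math, nothing Open-Compute-specific):

```python
import math

# In a three-phase wye system: line-to-neutral = line-to-line / sqrt(3).
# 480 V line-to-line gives ~277 V line-to-neutral, which the Open Compute
# design feeds directly to the server power supplies.
v_ln_from_480 = 480 / math.sqrt(3)   # about 277 V

# The legacy scheme works the other way: 120 V line-to-neutral phases
# measured line-to-line give the familiar North American 208 V.
v_ll_from_120 = 120 * math.sqrt(3)   # about 208 V

print(round(v_ln_from_480), round(v_ll_from_120))  # 277 208
```

Skipping the intermediate step-down to 120 V avoids the conversion losses of an extra transformer stage, which is the efficiency argument behind the 480/277 V distribution.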
The Open Compute group started out with 3 engineers from Facebook but has evolved, with multiple people from multiple organizations working together to help drive industry standards with open IP. There are a number of different designs and specifications being actively pursued at this time. Along with written standards, there are also systems and processes being set up to test and approve products to these new standards.
Overall this Open Compute Engineering Summit had very informative meetings and was a great way to exchange information with your peers. I sat in on the Open Rack session which offered some very unique and interesting designs. It is all about increasing efficiency and challenging the current designs and thinking within the data center today.
We have a Blog Squad here at Server Tech, and I have been blogging with the company throughout my 5+ year tenure. Sometimes an idea just comes and I know exactly what I want to say and other times I search the web for something to trigger an opinion. This process got me to "Google" the term "blogging infographic" as I was thinking about the proliferation of blogging and its effectiveness.
The first result was for Pinterest in which there were 79 pins. I find this mind boggling. Consider this: I searched for something specific, got what I wanted immediately, and found it to be an overload. What am I supposed to do with 79 infographics? And, that was just the first result of my search! It is very hard to get a sense for authority in information delivery. I could just read a few and cross my fingers that they are accurate, or I could go the next step and cross-reference between the data points to look for the most common answers, or I can try to access raw data from actual analysis. There is such a thing as too much.
So, you may ask what my point is. The point is that you cannot rely on "getting out the word" by simply publishing information to the web through blogs, search terms, pins, or even ads. Personal contact is still the best way to transfer information. You can't ask an ad or a pin a question, and even blogs are disconnected by time, such that relevant thoughts are lost in the lag. This is what is great about how Server Technology does business. We aren't a Marketing juggernaut. We don't shotgun ads with misleading information about our competition. Instead, we meet with our potential customers face-to-face, providing education as well as evidence of our 30-year commitment to quality products for the data center and beyond.
So, today, I was in Dusseldorf. I woke up at 4:45 am, kissed the dog, petted the wife and made my way to Heathrow airport.
Server Tech’s central operations are in an office center that was originally army barracks. And it kind of felt like that, too, the way my Country Sales Manager frog-marched me into the office. Just as well I got a coffee at the airport—there was no time to waste!
Frank was very excited to show me his new power rig. It's a new way of demonstrating our PDU technology without dragging customers out to our office, or asking for access to their data center.
The STI PDU rig is a set of horizontal PDU units that possess all of the capabilities of our vertical range, but with the convenience that we can now visit a client and easily show them all of the technology in their office. I have to say I was impressed. Frank Hauser, Country Manager for Germany, has been with STI for 6 years now and is one of our most experienced Power Strategy Experts. He holds the record for most power management software sold (take that, US) and closed some of our largest projects in 2013. So I guess I have to listen to him for longer than normal.
Frank claims that his new setup has improved the way his clients have made decisions on what they need. Because the rig has technology of varying intelligence, Frank is able to demonstrate different scenarios and help explore multiple solutions, live with data centre managers, facilities managers, etc. (with relative ease).
As he runs the latest version of our power infrastructure management software (SPM) on his laptop, he also simulates an environment that may require centralised measuring, monitoring, management and reporting of power in the data center.
This led to a long discussion today. A common issue we have found is that people don't really know what they want, as they don't know everything that's out there. I have seen this throughout EMEA and APAC: customers wanting to improve their knowledge of their power infrastructure and how much power they consume, but needing advice on what else can be done.
For the three of you who are reading this blog, how many of you have a basic PDU infrastructure? If you could do it all again, would you buy intelligent PDUs?
And if you could, what intelligence level would you buy? A slightly bright PDU that could pass high school economics, or a boffin that could rewire Marty's flux capacitor and hold a meaningful conversation with Stephen Hawking over tea?
OK, so maybe that's my third glass of wine kicking in, but hopefully you get my point. There are many flavours out there in PDU land, and it can be a minefield. Because unlike servers and other appliances, PDUs aren't easy to upgrade. Once they are in, nobody wants to touch them.
So what do you buy that is right for you now and for tomorrow?
(Shameless sales pitch alert)
Come and talk to us. Our team of power strategy experts has the knowledge to help you make decisions on your power infrastructure that will satisfy your needs today and include capabilities that you may need tomorrow. With the widest range of PDUs on the market, our team can provide a solution for any type of AC or DC environment.
Or you could do the equivalent of what I did, and buy a 3D TV with all the technological capabilities...and wear the 3D Glasses twice over two years.
Over at Discovery News, Nic Halverson has written about a Skin Tattoo That Takes Body Temperature. This is just the sort of technology that most captures my imagination. Exciting possibilities extend out to any futurist's wildest dreams ... and nightmares.
I previously wrote about the Beginning of the Cybernetic Data Center and the Promise of Truly Unlimited Data, but I think it could go far beyond that. Why stop at communications between some far off data center and your car and communications from biometric sensors to your doctor's computers? The real optimal end point is to be rid of silicon computing and the associated data centers altogether.
The individual human, or animal or plant, becomes a compute/network/storage device in its own right. Communications could then take place in the network that is the whole biosphere, directly between organisms as willed, and even encrypted to pass around the globe, hopping from organism to organism without their knowledge. Information becomes like the most virulent virus imaginable, with no immune system and the greatest of speed.
Hmm! Maybe this goes too far. Wouldn't such an information utopia strip the value from information to the point that the benefit of such great technology then becomes the downfall to that same technology?!
What do you think?...