We spoke with HudsonAlpha CIO Peyton McNully about his experience beta testing HPE Synergy and how it transformed the way his team meets its computing requirements, from the inside out.
Q: As an early-stage beta user—and now a Synergy customer—can you describe your overall experience with the platform?
A: Previously, we were an HPE c7000 blade customer, but our computing requirements were changing and we needed a flexible platform that would let our researchers convert huge volumes of data into useful information. We’ve been able to build on our previous interactions with HPE OneView, the infrastructure management platform that pairs with Synergy, and on certain aspects of composability that we now apply to networking as well as storage within the Synergy frame. Our containerized workflows, and the way we deploy those workloads on Synergy, are very similar (using the same toolsets) to how we would provision in the public cloud.
Early benefits of Synergy, now over one year in, include a downward trend in help desk tickets for server provisioning tasks, an IT Ops team that can better articulate needs to our end users, and faster deployment of software for research and clinical applications.
Our new approach allows users to declare resource templates, containerize apps with Docker, and deploy those apps to a Synergy frame at either of our data centers with little to no “Ops intervention.” Anytime we can empower users with better awareness and tighter tool integration to achieve business value with technology, the whole organization wins.
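To give a rough sense of what that self-service flow can look like, here is a minimal sketch using the Docker SDK for Python. It is not HudsonAlpha’s actual tooling; the build path, image name, and port mapping are placeholders.

```python
# Minimal self-service deployment sketch using the Docker SDK for Python
# (pip install docker). Build path, image tag, and port are hypothetical.
import docker

client = docker.from_env()  # talks to whichever Docker engine is configured

# Build the app image from a Dockerfile in ./genomics-app (placeholder path)
image, _ = client.images.build(path="./genomics-app", tag="genomics-app:latest")

# Run the container; in practice this would target a Docker EE / Synergy-backed
# engine rather than a workstation, but the API calls are the same.
container = client.containers.run(
    "genomics-app:latest",
    detach=True,
    ports={"8080/tcp": 8080},
    name="genomics-app",
)
print(container.status)
```

The point of the sketch is simply that the same small set of calls works whether the underlying engine sits on a laptop, in a public cloud, or on a composable frame.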
Q: What specific business and technology outcomes did you recognize during your time as a Synergy beta tester?
A: Most noticeably, we improved efficiency with a standardized delivery mechanism for new apps and computational workloads on cloud-based or local Synergy instances. All of this delivers new value to the research being conducted at HudsonAlpha because more research gets done, fewer issues surface at the last second, and end users have tools and APIs to engage with instead of email, helpdesk tickets, or “portals.”
I will also add that as genomics workloads increased in scale (both in number and in the size of each unit), we needed to look at IT differently. Our ability to efficiently add or subtract storage, compute, and fabric for some of our workloads became crucial from the outset. We moved to a model where we could orchestrate resources based on the task at hand rather than continually attempting to “scale out” and hold on for dear life.
Simply put, our organization streamlined the process of app creation, making turnaround faster and the end result better than before.
Q: Did you set any goals or milestones around the launch of Synergy at HudsonAlpha?
A: Our real goal early on was to be better stewards of our computing resources. We were quite impressed with the ability to take any open source tool and use Synergy as a platform to better customize the environment to the workloads—not the other way around.
An example of this: using only Synergy, Image Streamer, and the latest release of OpenStack, our team deployed an entire cluster in a matter of minutes, added resources, removed resources, and then proceeded to schedule jobs on those same nodes. We were very impressed by how far this simple orchestration element could take us.
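As a rough illustration of that add/remove pattern (not the exact tooling or names used at HudsonAlpha), the sketch below uses the openstacksdk Python library to grow and then reclaim a set of worker nodes; the cloud name, image, flavor, and network are assumed placeholders.

```python
# Illustrative scale-up/scale-down of OpenStack worker nodes with openstacksdk
# (pip install openstacksdk). Cloud name, image, flavor, and network are placeholders.
import openstack

conn = openstack.connect(cloud="synergy-cloud")  # cloud defined in clouds.yaml

# Add three worker nodes to the cluster
for i in range(3):
    conn.create_server(
        name=f"worker-{i}",
        image="ubuntu-22.04",   # placeholder image name
        flavor="m1.large",      # placeholder flavor
        network="cluster-net",  # placeholder network
        wait=True,
    )

# Later, reclaim those same resources for other workloads
for server in conn.compute.servers():
    if server.name.startswith("worker-"):
        conn.delete_server(server.id, wait=True)
```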
Regarding our computing stewardship goals, we can better leverage a single pool of resources for both hyperscale and standard enterprise needs. During off-peak hours at night, we can recover resources and put them toward the large-scale genomics pipeline. Having a tool that makes that kind of fast recovery feasible is new for us.
Ultimately, we were impressed with the ability to use an open source tool on the Synergy platform, because it is our products and apps that add to our business value.
Q: Why did you decide to run Docker Enterprise Edition on Synergy as opposed to another solution?
A: There were a few reasons why we selected Docker EE with Synergy, specifically its robust security scanning and role-based permissions. We needed strong data security measures to ensure we were aligned with best practices for data sovereignty and compliant with HIPAA privacy rules. Genomic medicine workloads hold a significant amount of data, and it’s critically important to me that we keep patients secure from cybersecurity threats.
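As a simplified illustration of that “only run vetted images” posture (this is not Docker EE’s scanning or RBAC API, just a generic pattern), the sketch below logs in to a private registry and pulls an image by its immutable digest, so the exact build that passed review is the one that gets deployed. The registry address, repository, and digest are placeholders.

```python
# Illustrative "deploy only a pinned, vetted image" step using the Docker SDK
# for Python. Registry address, credentials, repository, and digest are hypothetical.
import docker

client = docker.from_env()

# Authenticate against the private registry (placeholder credentials/URL)
client.login(username="ci-bot", password="********", registry="registry.example.org")

# Pull by digest so the exact image that passed review is the one that runs
pinned = "registry.example.org/genomics/pipeline@sha256:<digest-of-approved-build>"
image = client.images.pull(pinned)

client.containers.run(image.id, detach=True, name="pipeline")
```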