The National Science Foundation has awarded the universities of Utah, Chicago and Michigan a $4 million, four-year grant to produce SLATE, a new online platform for stitching together large amounts of data from multiple institutions and reducing the friction commonly found in multifaceted collaborations.
When complete, SLATE, which stands for Services Layer At The Edge, will be a local software and hardware platform that connects and interacts with cloud resources. The platform will reduce the technical expertise, physical resources such as servers, and staff time demanded of individual IT departments, especially at smaller universities that lack the resources of larger institutions and computing centers.
From the cosmic radiation measurements of the South Pole Telescope to the particle physics of CERN, multi-institutional research collaborations require computing environments that connect instruments, data and storage servers. Because of the complexity of the science and the scale of the data, these resources are often distributed among university research computing centers, national high-performance computing centers or commercial cloud providers. With resources scattered in this way, scientists often spend more time on the technical aspects of computation than on discovery and knowledge creation.
As a partner on the project, the University of Utah will contribute to the reference architecture, advanced networking aspects, core design, implementation and outreach to other principal investigators from science disciplines and partner universities. Senior IT architect Joe Breen explained that the goal is to have one simple platform for the end user.
“Software will be updated just like an app by experts from the platform operations and research teams, with little to no assistance required from local IT personnel. The SLATE platform is designed to work in any data center environment and will utilize advanced network capabilities, if available.”
The platform
Once installed, SLATE will let central research teams connect with far-flung research groups, automating the exchange of data, software and computing tasks among institutions without burdening local system administrators with the installation and operation of highly customized scientific computing services. By stitching together these resources, SLATE will also expand the reach of domain-specific “science gateways.”
SLATE works by implementing “cyberinfrastructure as code”: it augments high-bandwidth science networks with a programmable “underlayment” edge platform that hosts the advanced services needed for higher-level capabilities such as data and software delivery, workflow services and science gateway components.
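To give a flavor of the “cyberinfrastructure as code” idea, a participating site might describe an edge service in a short declarative manifest that platform operators maintain centrally, rather than configuring each server by hand. The format, field names and values below are hypothetical illustrations, not SLATE's actual configuration syntax:

```
# Hypothetical declarative manifest for one edge service.
# All names and values here are illustrative placeholders.
kind: EdgeService
metadata:
  name: data-cache            # service instance name (example)
  site: example-university    # hosting campus (placeholder)
spec:
  application: software-cache # software deployed at the edge (assumed example)
  version: "2.1"              # updated centrally, "just like an app"
  network:
    bandwidth: 100Gbps        # advanced network capability, if available
  storage:
    size: 50Ti                # local capacity contributed to the platform
```

In a model like this, operators change the manifest once and the update propagates to every participating site, which is what would let local IT staff stay largely hands-off.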
“A central goal of SLATE is to lower the threshold for campuses and researchers to create research platforms within the national cyberinfrastructure,” said University of Chicago senior fellow Robert Gardner.
Practical applications
Today’s most ambitious scientific investigations are too large for a single university or laboratory to tackle alone. Dozens of international collaborations comprising scientific groups and institutions must coordinate the collection and analysis of immense data streams, which feed dark matter searches, the detection of new particles at the high-energy frontier and precise measurements of radiation from the early universe. The data can come from telescopes, particle accelerators and other advanced instruments.
Today, many universities and research laboratories use a “Science DMZ” architecture to balance the need for security with the ability to rapidly move large amounts of data in and out of the local network. As sciences from physics to biology to astronomy become more data-heavy, the complexity and need for these subnetworks grows rapidly, placing additional strain on local IT teams.
Since 2003, a team of computation and Enrico Fermi Institute scientists led by Gardner has partnered with global projects to create the advanced cyberinfrastructure necessary for rapidly sharing data, computing cycles and software between partner institutions.
User benefits
“Science, ultimately, is a collective endeavor. Most scientists don’t work in a vacuum; they work in collaboration with their peers at other institutions,” said Shawn McKee, director of the Center for Network and Storage-Enabled Collaborative Computational Science at the University of Michigan. “They often need to share not only data, but systems that allow execution of workflows across multiple institutions. Today, it is a very labor-intensive, manual process to stitch together data centers into platforms that provide the research computing environment required by forefront scientific discoveries.”
With SLATE, local research groups will be able to fully participate in multi-institutional collaborations and contribute resources to their collective platforms with minimal hands-on effort from their local IT team. When joining a project, the researchers and admins can select a package of software from a cloud-based service — a kind of app store — that allows them to connect and work with the other partners.
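The “app store” workflow described above might look something like the following sketch. The command name, subcommands and package names are hypothetical placeholders invented for illustration, not the project's documented interface:

```
# Browse the catalog of curated science applications (hypothetical CLI).
$ platform-cli app list

# Install a chosen package onto the local cluster, connecting it to the
# collaboration's shared platform (all names are illustrative).
$ platform-cli app install data-transfer-service --cluster my-campus-cluster
```

The point of a curated catalog like this is that the collaboration's experts package and update the services, so a campus admin installs a vetted component instead of hand-building it.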
By reducing the technical expertise and time demands for participating in multi-institution collaborations, the SLATE platform will be especially helpful to smaller universities that lack the resources and staff of larger institutions and computing centers. The SLATE functionality can also support the development of “science gateways” that make it easier for individual researchers to connect to HPC resources such as the Open Science Grid and XSEDE.
This release was adapted from a version by Robert Mitchum, University of Chicago.