Mining millions of genomes for the next powerful antibiotic (November 01, 2024)
Interview with Erik Wright, assistant professor of biomedical informatics at the University of Pittsburgh, about his use of the OSPool.
Seeking to unlock the secrets of the “glue” binding visible matter in the universe, the ePIC Collaboration stands at the forefront of innovation. Led by a collective of hundreds of scientists and engineers, the Electron-Proton/Ion Collider (ePIC) Collaboration was formed to design, build, and operate the first experiment at the Electron-Ion Collider (EIC).
NOAA-funded marine scientist uses OSPool access to high throughput computing to expand the boundaries of her research.
Aashish Tripathee has used multiple file transfer systems and experienced challenges with each before using the Open Science Data Federation (OSDF). With the OSDF, Tripathee has already seen an improvement in data transfers.
The Event Horizon Telescope Collaboration furthers black hole research with a little help from the OSPool open capacity.
A spotlight on two newer contributors to the OSPool and the onboarding process.
The National Radio Astronomy Observatory’s collaboration with the NSF-funded PATh and Pelican projects leads to successful imaging of deep space.
Data Processing for Very Large Array Makes Deepest Radio Image of the Hubble Ultra Deep Field
Studying the impact of two high-fire years in California on over 600 species, ecologists enlist help from CHTC.
Associate State Cartographer Jim Lacy works with CHTC to digitize and preserve historical aerial photography for the public.
The OSG Consortium hosted its annual OSG School in August 2023, guiding participants from a wide range of campuses and research areas through learning HTC.
Researching land use change in the cattle sector is just one of several large projects where the GLUE Lab is working to apply HTC.
By combining this technique with the power of high throughput computing (HTC), researchers can run thousands of simulations to study the pathology of tendons and ligaments.
In the Hanna Lab, researchers use high throughput computing as a critical tool for training robots with reinforcement learning.
The Spalding Lab uses high throughput computing to study plant physiology.
Jimena González and Aashish Tripathee named 2023’s David Swanson awardees
Great Plains Augmented Gateway to the OSG (GP-ARRGO) receives National Science Foundation (NSF) CC* award
USGS uses HTCondor to pre-process 100,000+ images, enabling machine learning and AI analysis of the Martian surface.
Assistant Professor Eric Jonas uses OSG resources to understand the structure of molecules based on their measurements and derived properties.
Over 50 students chose to participate in a distributed computing workshop at the 7th biennial African School of Physics (ASP) 2022, held at Nelson Mandela University in Gqeberha, South Africa.
Google’s newly launched Quantum Virtual Machine emulates the experience and results of programming one of Google’s quantum computers, managed by an HTCondor system running in Google Cloud.
Ajay Annamareddy, a research scientist at the University of Wisconsin-Madison, describes how he utilizes high-throughput computing in computational materials science.
Cody Messick, a Postdoc at the Massachusetts Institute of Technology (MIT) working for the LIGO lab, describes LIGO’s use of HTCondor to search for new gravitational wave sources.
Eric Wilcots, UW-Madison dean of the College of Letters & Science and the Mary C. Jacoby Professor of Astronomy, dazzles the HTCondor Week 2022 audience.
Jacqueline M. Fulvio, lab manager and research scientist for the Postle Lab at the University of Wisconsin-Madison, explains how she used the HTCondor Software Suite to investigate neural oscillations in visual working memory.
Matthew Garcia, a Postdoctoral Research Associate in the Department of Forest & Wildlife Ecology at the University of Wisconsin–Madison, discusses how he used the HTCondor Software Suite to combine HTC and HPC capacity to perform simulations that modeled the dispersal of budworm moths.
For the first time, UW Statistics undergraduates could participate in a course teaching high throughput computing (HTC). John Gillett, lecturer of Statistics at the University of Wisconsin-Madison, designed and taught the course with the support of the Center for High Throughput Computing (CHTC).
Justin Hiemstra, a Machine Learning Application Specialist for CHTC’s GPU Lab, discusses the testing suite developed to test CHTC’s support for GPU and ML framework compatibility.
Arrielle C. Opotowsky, a 2021 Ph.D. graduate from the University of Wisconsin-Madison’s Department of Engineering Physics, describes how she utilized high throughput computing to expedite nuclear forensics investigations.
Postdoctoral researcher Parul Johri uses OSG services, the HTCondor Software Suite, and the population genetics simulation program SLiM to investigate historical patterns of genetic variation.
The stunning new image of a supermassive black hole in the center of the Milky Way was created by eight telescopes, 300 international astronomers and more than 5 million computational tasks. This Morgridge Institute article describes how the Wisconsin-based Open Science Pool helped make sense of it all.
A mutually beneficial partnership between Jefferson Lab and the OSG Consortium at both the organizational and individual levels has delivered a prolific impact for the CLAS12 Experiment.
The U.S. National Institute of Allergy and Infectious Diseases (NIAID) and the African Centers for Excellence in Bioinformatics and Data-Intensive Science (ACE) partnered with the OSG Consortium to host a virtual high throughput computing training session for graduate students from Makerere University and the University of Sciences, Techniques, and Technologies of Bamako (USTTB).
David Swanson Memorial Award winner, Connor Natzke’s journey with the OSG Consortium began in 2019 as a student of the OSG User School. Today, nearly three years later, Natzke has executed 600,000 simulations with the help of OSG staff and prior OSG programming. These simulations, each of them submitted as a job, logged over 135,000 core hours provided by the Open Science Pool (OSPool). Natzke’s history with the OSG Consortium reflects a pattern of learning, adapting, and improving that translates to the acceleration and expansion of scientific discovery.
In this presentation from HTCondor Week 2021, Joao Dorea from the Digital Livestock Lab explains how high-throughput computing is used in the field of animal and dairy sciences.
Collaborating with CHTC research computing facilitation staff, UW-Madison researcher Gaylen Fronk is using HTC to improve cigarette cessation treatments by accounting for the complex differences among patients.
Researchers at the USGS are using HTC to pinpoint potential invasive species for the United States.
BAnQ’s digital collections team recently used HTCSS to tackle their largest computational endeavor yet: completing text recognition on all newspapers in their digital archives.
An evolutionary biologist at the AMNH used HTC services provided by the OSG to unlock a genomic basis for convergent evolution in bats.
In the face of the pandemic, scientists needed to adapt. This article by the Morgridge Institute for Research provides a thoughtful look into how individuals and organizations, including the CHTC, have pivoted in these challenging times.
Anirvan Shukla, a User School participant in 2016, spoke at this year’s Showcase about how high throughput computing has transformed his research of antimatter in the last five years.
During the OSG School Showcase, Hannah Moshontz, a postdoctoral fellow at UW-Madison’s Department of Psychology, described her experience of using high throughput computing (HTC) for the very first time, when taking on an entirely new project within the field of psychology.
During the OSG Virtual School Showcase, three different researchers shared how high throughput computing has made lasting impacts on their work.
Kicking off the OSG User School Showcase, Spencer Ericksen, a researcher at the University of Wisconsin-Madison’s Carbone Cancer Center, described how high throughput computing (HTC) has made his work in early-stage drug discovery vastly more scalable.
How undergraduates at the University of Nebraska-Lincoln developed a science gateway that enables researchers to build RNA nanomachines for therapeutic, engineering, and basic science applications.
When Greg Daues at the National Center for Supercomputing Applications (NCSA) needed to transfer 460 terabytes of NCSA files from the National Institute of Nuclear and Particle Physics (IN2P3) in Lyon, France, to Urbana, Illinois, for a project with FNAL, CC-IN2P3, and the Rubin Data Production team, he turned to the HTCondor High Throughput system, not to run computationally intensive jobs, as many do, but to manage the hundreds of thousands of I/O-bound transfers.