Synopsis
The Texas Advanced Computing Center (TACC) is part of the University of Texas at Austin. TACC designs and operates some of the world's most powerful computing resources. The center's mission is to enable discoveries that advance science and society through the application of advanced computing technologies.
Episodes
-
Code @ TACC Wearables Summer Camp
25/07/2017 Duration: 05min. Our technology is becoming more personal and wearable. Everything from fitness trackers, to sleep trackers, to heart rate headphones aims to keep vital information about us at our fingertips. In June 2017, TACC hosted a summer camp for high school students to learn how to make and program their own custom wearable technology, called Code @ TACC Wearables. The camp guided 27 high school students from the Austin area in fashioning wearable circuits that responded to stimuli such as light and temperature and were connected to the Internet of Things. Podcast host Jorge Salazar interviews Joonyee Chuah, Outreach Coordinator at the Texas Advanced Computing Center.
-
A Retrospective Look at the Stampede Supercomputer - Science Highlights
20/07/2017 Duration: 20min. Welcome to a retrospective look at a few of the science highlights of the Stampede supercomputer, one of the most powerful supercomputers in the U.S. for open science research from 2013 to 2017. Funded by the National Science Foundation and hosted by The University of Texas at Austin, the Stampede system at the Texas Advanced Computing Center achieved nearly 10 quadrillion operations per second. Podcast host Jorge Salazar interviews Peter Couvares, staff scientist at LIGO; University of California Santa Barbara physicist Robert Sugar; and Ming Xue, Professor in the School of Meteorology at the University of Oklahoma and Director of the Center for Analysis and Prediction of Storms. Stampede helped researchers make discoveries across the full spectrum of science, including insight into diseases like cancer and Alzheimer's; the insides of stars and the signals of gravitational waves; natural disaster prediction of hurricanes, earthquakes, and tornadoes; and more efficient engineering in projects such as designing
-
A Retrospective Look at the Stampede Supercomputer - The Technology
19/07/2017 Duration: 17min. In 2017, the Stampede supercomputer, funded by the National Science Foundation, completed its five-year mission to provide world-class computational resources and support staff to more than 11,000 U.S. users on over 3,000 projects in the open science community. But what made it special? Stampede was like a bridge that moved thousands of researchers off of soon-to-be decommissioned supercomputers, while at the same time building a framework that anticipated the imminent trends that came to dominate advanced computing. Podcast host Jorge Salazar interviews Dan Stanzione, Executive Director of the Texas Advanced Computing Center; Bill Barth, Director of High Performance Computing and a Research Scientist at the Texas Advanced Computing Center; and Tommy Minyard, Director of Advanced Computing Systems at the Texas Advanced Computing Center.
-
Code @TACC Robotics Camp delivers on self-driving cars
03/07/2017 Duration: 08min. From June 11 through 16, 2017, TACC hosted a week-long summer camp called Code @TACC Robotics, funded by the Summer STEM Funders Organization under the supervision of the KDK Harmon Foundation. Thirty-four students attended. Five staff scientists at TACC and two guest high school teachers from Dallas and Del Valle also instructed the students. The students divided themselves into teams, with each member taking a specific role: principal investigator, validation engineer, software developer, or roboticist. They assembled a robotic car from a kit and learned how to program its firmware. The robotic cars had sensors that measured the distance to objects in front, and they could be programmed to respond to that information by stopping, turning, or even relaying it to a nearby car. Teams were assigned a final project based on a real-world problem, such as what action to take when cars arrive together at a four-way stop. Podcast host Jorge Salazar interviews Joonyee Chuah, outreach coordinator at the
-
Reaching for the Stormy Cloud with Chameleon
04/05/2017 Duration: 12min. Podcast host Jorge Salazar interviews Xian-He Sun, Distinguished Professor of Computer Science at the Illinois Institute of Technology. What if scientists could realize their dreams with big data? On the one hand you have parallel file systems for number crunching. On the other, you have Hadoop file systems, made for cloud computing with data analytics. The problem is that one doesn't know what the other is doing. You have to copy files from parallel to Hadoop, and doing so is so slow it can turn a supercomputer into a super slow computer. In 2015, computer scientists developed a way for parallel and Hadoop file systems to talk to each other: a cross-platform Hadoop reader called PortHadoop, short for portable Hadoop. The scientists have since improved it, and it's now called PortHadoop-R. It's good enough to start work with real data in the NASA Cloud library project. The data are used for real-time forecasts of hurricanes and other natural disasters, and also for long-term climate prediction. A supercomputer at TACC he
-
When Data's Deep, Dark Places Need to be Illuminated
07/02/2017 Duration: 23min. The World Wide Web is like an iceberg, with most of its data hidden below the surface. There lies the 'deep web,' estimated at 500 times bigger than the 'surface web' that most people see through search engines like Google. An innovative data-intensive supercomputer at TACC called Wrangler is helping researchers get meaningful answers from the hidden data of the public web. Wrangler uses 600 terabytes of flash storage that speedily reads and writes files. This lets it fly past bottlenecks with big data that can slow down even the fastest computers. Podcast host Jorge Salazar interviews graduate student Karanjeet Singh; and Chris Mattmann, Chief Architect in the Instrument and Science Data Systems Section of NASA's Jet Propulsion Laboratory at the California Institute of Technology. Mattmann is also an adjunct Associate Professor of Computer Science at the University of Southern California and a member of the Board of Directors of the Apache Software Foundation.
-
How to See Living Machines
02/12/2016 Duration: 15min. Podcast host Jorge Salazar interviews Eva Nogales, Professor in the Department of Molecular and Cellular Biology at UC Berkeley and Senior Faculty Scientist and Howard Hughes Medical Investigator at Lawrence Berkeley National Laboratory; and Ivaylo Ivanov, Associate Professor in the Department of Chemistry at Georgia State University. Scientists have taken the closest look yet at molecule-sized machinery called the human preinitiation complex. It basically opens up DNA so that genes can be copied and turned into proteins. The science team formed from Northwestern University, Berkeley National Laboratory, Georgia State University, and UC Berkeley. They used a cutting-edge technique called cryo-electron microscopy and combined it with supercomputer analysis. They published their results in May 2016 in the journal Nature. Over 1.4 million 'freeze frames' of the human preinitiation complex, or PIC, were obtained with cryo-electron microscopy. They were initially processed using supercomputers at the National E
-
Lori Diachin Highlights Supercomputing Technical Program
02/12/2016 Duration: 11min. Podcast host Jorge Salazar interviews Lori Diachin of Lawrence Livermore National Laboratory. She's the Director of the Center for Applied Scientific Computing and the Research Program Manager and Point of Contact for the Office of Science Advanced Scientific Computing Research organization. She also leads the Frameworks, Algorithms and Scalable Technologies for Mathematics (FASTMath) SciDAC center. This year Dr. Diachin was the Chair of the Technical Program at SC16. Right before the conference she spoke by phone about the highlights and some changes happening at SC16. Lori Diachin: I think the most important thing I'd like people to know about SC16 is that it is a great venue for bringing the entire community together, having these conversations about what we're doing now, what the environment looks like now and what it'll look like in five, ten, fifteen years. The fact that so many people come to this conference allows you to really see a lot of diversity in the technologies being pursued, in the kind
-
John McCalpin Surveys Memory Bandwidth of Supercomputers
02/12/2016 Duration: 24min. Podcast host Jorge Salazar reports on SC16 in Salt Lake City, the 28th annual International Conference for High Performance Computing, Networking, Storage and Analysis. The event showcases the latest in supercomputing to advance scientific discovery, research, education and commerce. The podcast interview features John McCalpin, a Research Scientist in the High Performance Computing Group at the Texas Advanced Computing Center and Co-Director of the Advanced Computing Evaluation Laboratory at TACC. Twenty-five years ago, as an oceanographer at the University of Delaware, Dr. McCalpin developed the STREAM benchmark. It continues to be widely used as a simple synthetic benchmark program that measures sustainable memory bandwidth and the corresponding computation rate for simple vector kernels. Dr. McCalpin was invited to speak at SC16. His talk is titled "Memory Bandwidth and System Balance in HPC Systems." John McCalpin: The most important thing for an application-oriented scientist or developer to understand
-
Sadasivan Shankar Proposes Co-design 3.0 for Supercomputing
02/12/2016 Duration: 21min. Podcast host Jorge Salazar interviews Sadasivan Shankar, the Margaret and Will Hearst Visiting Lecturer in Computational Science and Engineering at the John A. Paulson School of Engineering and Applied Sciences at Harvard University. Computer hardware speeds have grown exponentially for the past 50 years. We call this Moore's Law. But we haven't seen a Moore's Law for software. That's according to Sadasivan Shankar of Harvard University. He said the reason is a lack of communication and close collaboration between hardware developers and the users trying to solve problems in fields like social networking, cancer modeling, personalized medicine, or designing the next-generation battery for electrical storage. Dr. Shankar proposes a new paradigm in which software applications should be part of the design of new computer architectures. He calls this paradigm Co-Design 3.0. Shankar was invited to speak about it at the SC16 conference. Sadasivan Shankar: We want to see what will make high perfor
-
Kelly Gaither Starts Advanced Computing for Social Change
02/12/2016 Duration: 20min. Podcast host Jorge Salazar interviews Kelly Gaither, Director of Visualization at the Texas Advanced Computing Center (TACC). Gaither is also the Director of Community Engagement and Enrichment for XSEDE, the Extreme Science and Engineering Discovery Environment, funded by the National Science Foundation. XSEDE identified 20 graduate and undergraduate students to participate in a week-long event called Advanced Computing for Social Change, hosted by XSEDE, TACC and SC16. The SC16 Social Action Network student cohort will tackle a computing challenge. They will learn how to mine a variety of data sets, such as social media data spanning a number of years and large geographic regions. To complete their analysis in a timely fashion, they will learn how to organize the large data sets to allow fast queries. The students of the SC16 Social Action Network will also use a computational modeling tool called risk terrain modeling that has been used to predict crime from crime statistics. Thi
-
John West Leads Diversity Efforts in Supercomputing
02/12/2016 Duration: 17min. The SC16 Supercomputing Conference has focused on raising awareness and helping to change attitudes about diversity. That's according to SC16 General Chair John West, Director of Strategic Initiatives at the Texas Advanced Computing Center. West explained that long-term efforts are underway at SC16 to promote diversity in the supercomputing community. These include a new double-blind review of technical papers; a new standing subcommittee on diversity and inclusion added to the conference organizing committee; adoption of demographic measurements of the SC16 conference committee and attendees; active recruitment of student volunteers at organizations and universities that serve underrepresented groups; on-site child care; a new official code of conduct; fellowships that promote inclusivity; and continued support of the Women in IT Networking program. John West: For me, (diversity) is a numbers problem. If you look at HPC (high performance computing), more and more communities are adopting advance
-
New Hikari Supercomputer Starts Solar HVDC
16/09/2016 Duration: 11min. A new kind of supercomputer system has come online at the Texas Advanced Computing Center. It's called Hikari, which is Japanese for "light." What's new is that Hikari is the first supercomputer in the US to use solar panels and High Voltage Direct Current, or HVDC, for its power. The Hikari project aims to demonstrate that HVDC works not only for supercomputers, but also for data centers and commercial buildings. The project is a collaboration headed by NTT Facilities, based in Japan, with the support of the New Energy and Industrial Technology Development Organization, or NEDO. NTT Facilities partnered with the University of Texas at Austin to begin demonstration tests of the HVDC power feeding system for the Hikari project in late August 2016. The aim is to show that the high-capacity HVDC power equipment and lithium-ion batteries of Hikari can save 15 percent in energy compared to conventional systems. Podcast host Jorge Salazar discusses the Hikari HVDC project with Toshihiro Hayashi, Assistant Ma
-
Soybean science blooms with supercomputers
16/08/2016 Duration: 11min. It takes a supercomputer to grow a better soybean. A project called the Soybean Knowledge Base, or SoyKB for short, aims to do just that. Scientists at the University of Missouri-Columbia developed SoyKB as a publicly available web resource for all soybean data, from molecular data to field data, that includes several analytical tools. SoyKB has grown to be used by thousands of soybean researchers in the U.S. and beyond. They did it with the support of XSEDE, the Extreme Science and Engineering Discovery Environment, funded by the National Science Foundation. The SoyKB team needed XSEDE resources to sequence and analyze the genomes of over a thousand soybean lines, using about 370,000 core hours on the Stampede supercomputer at the Texas Advanced Computing Center. They've since moved that work from Stampede to Wrangler, TACC's newest data-intensive system. And they're getting more users onboard with an allocation on XSEDE's Jetstream, a fully configurable cloud environment for scien
-
Supercomputers Fire Lasers to Shoot Gamma Ray Beam
11/07/2016 Duration: 13min. Supercomputers might have helped unlock a new way to make controlled beams of gamma rays, according to scientists at the University of Texas at Austin. The simulations done on the Stampede and Lonestar systems at TACC will guide a real experiment later this summer in 2016 with the recently upgraded Texas Petawatt Laser, one of the most powerful in the world. The scientists say the quest to produce gamma rays from non-radioactive materials will advance basic understanding of things like the insides of stars. What's more, hospitals use gamma rays to eradicate cancer and image the brain, and they're also used to scan cargo containers for terrorist materials. Unfortunately, no one has yet been able to produce gamma ray beams from non-radioactive sources. These scientists hope to change that. On the podcast are the three researchers who published their work in May 2016 in the journal Physical Review Letters. Alex Arefiev is a research scientist at the Institute for Fusion Studies and at the Center for High Energy
-
UT Chancellor William McRaven on TACC supercomputers - "We need to be the best in the world"
14/06/2016 Duration: 06min. University of Texas System Chancellor William McRaven gave a podcast interview at TACC during a visit for its building expansion dedication and the announcement of a $30 million award from the National Science Foundation for the new Stampede 2 supercomputer system. Chancellor McRaven spoke of his path to leading the UT System of 14 institutions, the importance of supercomputers to Texans and to the nation, the new Dell Medical School, and more. William McRaven: "Behind all of this magnificent technology are the fantastic faculty, researchers, interns, our corporate partners that are part of this, the National Science Foundation, there are people behind all of the success of the TACC. I think that's the point we can never forget."
-
Zika Hackathon Fights Disease with Big Data
24/05/2016 Duration: 07min. On May 15, Austin, Texas held a Zika Hackathon. More than 50 data scientists, engineers, and UT Austin students gathered downtown at the offices of Cloudera, a big data company, to use big data to help fight the spread of Zika. Mosquitoes carry and spread the Zika virus, which can cause birth defects and other symptoms like fever. The U.S. Centers for Disease Control is now ramping up collection of data that tracks Zika's spread. But big gaps exist in linking different kinds of data, and that makes it tough for experts to predict where the virus will go next and how to prevent it. The Texas Advanced Computing Center provided time on the Wrangler data-intensive supercomputer as a virtual workspace for the Zika hackers. Featured on the podcast are Ari Kahn, Texas Advanced Computing Center; and Eddie Garcia, Cloudera. Podcast hosted by Jorge Salazar of TACC.
-
Sudden Collapse: Supercomputing Spotlight on Gels
03/05/2016 Duration: 11min. Chemical engineering researcher Roseanna Zia has begun to shed light on the secret world of colloidal gels - liquids dispersed in a solid. Yogurt, shampoo, and Jell-O are just a few examples. Sometimes gels act like liquids, and sometimes they act like solids. Understanding the theory behind these transitions can translate to real-world applications, such as helping explain why mucus - also a colloidal gel - in the airways of people with cystic fibrosis can thicken, resist flow, and possibly threaten life. Roseanna Zia is an Assistant Professor of Chemical and Biomolecular Engineering at Cornell. She led development of the biggest dynamic computer simulations of colloidal gels yet, with over 750,000 particles. The Zia Group used the Stampede supercomputer at TACC through an allocation from XSEDE, the Extreme Science and Engineering Discovery Environment, a single virtual system funded by the National Science Foundation (NSF) that allows scientists to interactively share computing resources, data and expertise. Podca
-
Docker for Science
28/04/2016 Duration: 14min. Scientists might find a friend in the open source software called Docker. It's a platform that bundles up the loose ends of applications - the software and the dependencies that sustain it - into something fairly light that can run on any system. As more scientists share not only their results but also their data and code, Docker is helping them reproduce the computational analysis behind the results. What's more, Docker is one of the main tools used in the Agave API platform, a platform-as-a-service solution for hybrid cloud computing developed at TACC and funded in part by the National Science Foundation. Podcast host Jorge Salazar talks with software developer and researcher Joe Stubbs about using Docker for science. Stubbs is a Research Engineering and Scientist Associate in the Web & Cloud Services group at the Texas Advanced Computing Center.
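To illustrate the bundling idea the episode describes, here is a minimal, hypothetical Dockerfile for a scripted analysis. The file names (`requirements.txt`, `analysis.py`) and the base image are illustrative assumptions, not taken from the episode or from the Agave platform.

```dockerfile
# Start from a pinned base image so the runtime is fixed.
FROM python:3.10-slim

WORKDIR /app

# Pin dependency versions so collaborators rebuild the same environment.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Bundle the analysis code itself into the image.
COPY analysis.py .

# Anyone with Docker can rerun the computation:
#   docker build -t my-analysis . && docker run my-analysis
CMD ["python", "analysis.py"]
```

The point is that the image captures both code and dependencies, so the same computational analysis can be rerun on any system that runs Docker.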
-
Dark Energy of a Million Galaxies
01/04/2016 Duration: 12min. UT Austin astronomer Steven Finkelstein eyes the Wrangler supercomputer for the HETDEX extragalactic survey, in this interview with host Jorge Salazar. A million galaxies far, far away are predicted to be discovered before the year 2020, thanks to a monumental mapping of the night sky in search of a mysterious force. That's according to scientists working on HETDEX, the Hobby-Eberly Telescope Dark Energy Experiment. They're going to transform the big data from galaxy spectra billions of light-years away into meaningful discoveries with the help of the Wrangler data-intensive supercomputer. "You can imagine that would require an immense amount of computing storage and computing power. It was a natural match for us and TACC to be able to make use of these resources," Steven Finkelstein said. Finkelstein is an assistant professor in the Department of Astronomy at The University of Texas at Austin (UT Austin). He's one of the lead scientists working on HETDEX. "HETDEX is one of the largest galaxy surveys that has ever b