Center for High Throughput Computing (CHTC)
About this resource
Established in 2006, the Center for High Throughput Computing (CHTC) is committed to democratizing access to powerful computing resources across all research domains. From their website:

> We are the University of Wisconsin-Madison's core computational service provider for large scale computing. CHTC services are open to UW-Madison staff, students, faculty, and external collaborators.
>
> We offer both a High Throughput Computing system and a High Performance Computing cluster. Access to CPUs/GPUs, high-memory servers, data storage capacity, as well as personalized consultations and classroom support, are provided at no-cost.
Are you a researcher at UW-Madison looking to extend your computing capabilities beyond local resources, particularly for machine learning tasks? Request an account to take advantage of the open computing services offered by CHTC!
Run machine learning jobs on CHTC — GPUs available
This guide offers recommendations for successfully running machine learning (specifically deep learning) jobs on CHTC. If you need help leveraging any of the CHTC resources, please reach out to chtc@cs.wisc.edu.
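CHTC's high-throughput system schedules jobs with HTCondor, where each job is described by a submit file. Below is a minimal sketch of a submit file requesting a single GPU; the executable name `train.sh` and the specific resource amounts are placeholder assumptions for illustration, not CHTC defaults — consult CHTC's own GPU job documentation for current requirements.

```
# train.sub -- minimal HTCondor submit file for a single-GPU job.
# train.sh and the resource amounts below are placeholder assumptions.
universe = vanilla
executable = train.sh

log    = train.log
output = train.out
error  = train.err

# Request one GPU plus CPU, memory, and disk for the job
request_gpus   = 1
request_cpus   = 4
request_memory = 16GB
request_disk   = 20GB

# Transfer files to/from the execute node
should_transfer_files   = YES
when_to_transfer_output = ON_EXIT

queue
```

A file like this would be submitted from a CHTC submit server with `condor_submit train.sub`, and its progress monitored with `condor_q`.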
Questions?
If you have any lingering questions about this resource, please feel free to post to the Nexus Q&A on GitHub. We will improve materials on this website as additional questions come in.
See also
- Compute: Intro to AWS SageMaker for Predictive ML/AI – Learn how to launch and scale machine learning workflows in the cloud using AWS SageMaker.
- Compute: Google Colab – Learn how to use Google Colab for machine learning workflows.
- Compute: BadgerCompute – UW–Madison's lightweight, NetID-authenticated Jupyter service for short interactive sessions and classroom use. Includes a 4-hour runtime limit (which may sometimes beat the free version of Colab).