
The ARCHER Service is now closed and has been superseded by ARCHER2.


Advanced MPI

ARCHER, the UK's national supercomputing service, offers training in software development and high-performance computing to scientists and researchers across the UK. As part of our training service, we will be running a 2-day Advanced MPI training session.

Trainer


David Henty

David teaches on a wide range of EPCC's technical training courses, including MPI and OpenMP, and is overall course organiser for EPCC's MSc in High Performance Computing.

 

Details

This course is aimed at programmers seeking to deepen their understanding of MPI and explore some of its more recent and advanced features. We cover topics including communicator management, non-blocking and neighbourhood collectives, single-sided MPI and the new MPI memory model. We also look at performance aspects, such as which MPI routines to use for scalability, overlapping communication and calculation, and MPI internal implementation issues.

This course is free to all academics.

Intended learning outcomes

  • Understanding of how internal MPI implementation details affect performance
  • Familiarity with neighbourhood collective operations in MPI
  • Knowledge of MPI memory models for RMA operations
  • Familiarity with MPI RMA operations and single-sided communication
  • Understanding of best practice for MPI+OpenMP programming

Pre-requisites

Attendees should be familiar with MPI programming in C, C++ or Fortran, e.g. have attended the ARCHER MPI course.

Pre-course setup

All attendees should bring their own wireless-enabled laptop. Practical exercises will be done using a guest account on ARCHER. You will need an SSH client, such as Terminal on a Mac or Linux machine, or PuTTY or MobaXterm on Windows. The course tutor will be able to assist with connection settings on the day. You should also have a web browser, a PDF reader and a simple text editor.

Draft Timetable

(May be subject to change)

All sessions will include hands-on practical exercises in addition to lecture material.

Day 1: Thursday 26th April

  • 09:00 - 09:30 Welcome and Registration
  • 09:30 - 11:00 MPI Internals
  • 11:00 - 11:30 Coffee
  • 11:30 - 13:00 MPI Tools
  • 13:00 - 14:00 Lunch
  • 14:00 - 15:30 MPI Optimisations
  • 15:30 - 16:00 Coffee
  • 16:00 - 17:30 Advanced Collectives
  • 17:30 CLOSE

Day 2: Friday 27th April

  • 09:30 - 11:00 MPI + OpenMP (i)
  • 11:00 - 11:30 Coffee
  • 11:30 - 13:00 MPI + OpenMP (ii)
  • 13:00 - 14:00 Lunch
  • 14:00 - 15:30 New MPI shared-memory model
  • 15:30 - 16:00 Coffee
  • 16:00 - 17:00 Individual consultancy session
  • 17:00 CLOSE

Course Materials

Links to the slides and exercise material for this course.

Location

The course will be held at the University of Exeter.

Registration

Please use the registration page to register for ARCHER courses.

Questions?

If you have any questions, please contact the ARCHER Helpdesk (support@archer.ac.uk).

Copyright © Design and Content 2013-2019 EPCC. All rights reserved.
