Course details 2012-2013

Parallel Computing
Course code: 2001WETCLC
Study domain: Computer Science
Semester: 2nd semester
Contact hours: 45
Study load (hours): 168
Contract restrictions: No contract restriction
Language of instruction: English
Exam period: exam in the 2nd semester
Lecturer(s): Jan Broeckhove


1. Prerequisites

At the start of this course the student should have acquired the following competences:
Specific prerequisites for this course:
This course requires a relatively broad background. It contains a systems-oriented part that requires knowledge of operating systems, networks and, to some extent, distributed systems. It also contains a part on developing cluster programs based on MPI, which requires a thorough knowledge of C++ programming.

2. Learning outcomes

The ''cluster systems'' part of the course aims to reach the following objectives:
  • you are knowledgeable about cluster architecture and middleware
  • you are experienced in installing middleware on a Beowulf cluster
  • you are capable of using cluster benchmarking tools
The ''MPI based parallel programming'' part of the course aims to reach the following objectives:
  • you are knowledgeable about the basic concepts of MPI message passing
  • you are capable of programming and running parallel MPI programs
  • you can analyze the efficiency of MPI programs
This course combines knowledge of and insight into grids with experience in working with operational grid systems.

3. Course contents

The course consists of two parts. The first part, ''cluster systems'', is oriented towards cluster middleware, with attention focused on Beowulf clusters. The second part, ''MPI based parallel programming'', deals with traditional, message-based parallel programming. It covers algorithms and their implementation in MPI programs.

The ''cluster systems'' part contains the following topics:
  • Cluster concepts
  • Middleware components
  • Cluster monitoring
  • Job management
  • Benchmarking
It is geared towards providing experience in managing and working with cluster systems. The second part is geared towards program development and contains the following topics:
  • Background on parallel computing
  • A first look at MPI
  • MPI: Basics
  • MPI: Advanced Features
  • Timing and Profiling
It provides insight into the possibilities and limitations of the message-passing paradigm and its use through MPI.

4. Teaching method

Class contact teaching:
  • Lectures
  • Practice sessions
  • Tutorials
  • Laboratory sessions

Personal work:
  • Assignments: in group

5. Assessment method and criteria

  • Oral exam, without written preparation
  • Portfolio, with oral presentation

6. Study material

Required reading

Course notes are provided, and after each theory session handouts are made available through the Blackboard system.

Optional reading

The following study material can be studied on a voluntary basis:
[1]  Beowulf Cluster Computing with Linux, Second Edition
       W. Gropp, E. Lusk and T. Sterling, 2002, The MIT Press, Cambridge, Massachusetts
[2]  Using MPI: Portable Parallel Programming with the Message Passing Interface
       W. Gropp, E. Lusk and A. Skjellum, 1999, The MIT Press, Cambridge, Massachusetts

7. Contact information

For questions concerning the theory sessions, contact
  • Jan Broeckhove, email:, building G, G205
  • Frans Arickx, email:, building G, G206
For questions concerning the lab sessions, contact
  • Sam Verboven, email:, building G, G212

Last update: 02/09/2011 15:16, jan.broeckhove