9%
07.06.2019
Jeff Layton regaled readers of the last issue with a survey of Terminal User Interfaces (TUIs) [1], a term describing tools sporting in-terminal graphics powered by the curses library [2] or one
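
As a minimal illustration of the curses-based tools this snippet refers to, the short C program below draws one line of text and waits for a keypress. It is only a sketch, assuming the ncurses implementation of the library and a build along the lines of cc tui.c -lncurses.

/* Minimal curses sketch: print a message and wait for a key. */
#include <curses.h>

int main(void)
{
    initscr();                          /* enter curses mode             */
    printw("Hello from a terminal UI"); /* draw into the virtual screen  */
    refresh();                          /* push the update to the screen */
    getch();                            /* wait for a keypress           */
    endwin();                           /* restore the terminal          */
    return 0;
}
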
8%
18.02.2018
with threads in a shared main memory) [6] and Message Passing Interface (MPI, a library-based system for parallelization across distributed main memory, typically via a high-speed network connecting the nodes) [7
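
As a minimal sketch of the shared-memory side described here, the OpenMP loop below lets several threads split one array within a single address space; the array size and values are arbitrary, and the build assumes an OpenMP-capable compiler (e.g., cc -fopenmp sum.c). An MPI version of the same sum would instead exchange partial results between processes over the network.

/* Shared-memory parallelism with OpenMP: threads share one address space. */
#include <omp.h>
#include <stdio.h>

#define N 1000000

int main(void)
{
    static double a[N];
    double sum = 0.0;

    for (int i = 0; i < N; i++)
        a[i] = 1.0;

    /* Threads split the loop iterations; the reduction clause combines
     * the per-thread partial sums safely. */
    #pragma omp parallel for reduction(+:sum)
    for (int i = 0; i < N; i++)
        sum += a[i];

    printf("sum = %.0f using up to %d threads\n", sum, omp_get_max_threads());
    return 0;
}
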
8%
07.11.2011
-project of the larger Open MPI community [2], is a set of command-line tools and API functions that allows system administrators and C programmers to examine the NUMA topology and to provide details about each processor
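
For orientation, here is a short sketch against the hwloc C API that counts NUMA nodes, cores, and hardware threads on the local machine; the object-type names follow hwloc 2.x, and the program links with -lhwloc. The bundled lstopo tool prints the same topology without writing any code.

/* Query the local NUMA topology with hwloc (object names per hwloc 2.x). */
#include <hwloc.h>
#include <stdio.h>

int main(void)
{
    hwloc_topology_t topology;

    hwloc_topology_init(&topology);   /* allocate a topology context */
    hwloc_topology_load(topology);    /* detect the current machine  */

    printf("NUMA nodes: %d\n",
           hwloc_get_nbobjs_by_type(topology, HWLOC_OBJ_NUMANODE));
    printf("Cores:      %d\n",
           hwloc_get_nbobjs_by_type(topology, HWLOC_OBJ_CORE));
    printf("HW threads: %d\n",
           hwloc_get_nbobjs_by_type(topology, HWLOC_OBJ_PU));

    hwloc_topology_destroy(topology);
    return 0;
}
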
8%
12.03.2015
definitely stress the processor(s) and memory, especially the bandwidth. I would recommend running single-core tests and tests that use all of the cores (e.g., MPI or OpenMP).
A number of benchmarks
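
One way to stress memory bandwidth first on a single core and then on all of them is a STREAM-style triad loop, sketched below with OpenMP; the array size and the GB/s arithmetic are illustrative, not a calibrated benchmark. Build with something like cc -O2 -fopenmp triad.c and vary OMP_NUM_THREADS.

/* STREAM-style triad loop to stress memory bandwidth (illustrative only).
 * Run once with OMP_NUM_THREADS=1, then with all cores, and compare. */
#include <omp.h>
#include <stdio.h>
#include <stdlib.h>

#define N (1L << 24)   /* ~16M doubles per array, well beyond cache */

int main(void)
{
    double *a = malloc(N * sizeof *a);
    double *b = malloc(N * sizeof *b);
    double *c = malloc(N * sizeof *c);
    if (!a || !b || !c)
        return 1;

    for (long i = 0; i < N; i++) { b[i] = 1.0; c[i] = 2.0; }

    double t0 = omp_get_wtime();
    #pragma omp parallel for
    for (long i = 0; i < N; i++)
        a[i] = b[i] + 3.0 * c[i];
    double t1 = omp_get_wtime();

    /* Roughly 3 * 8 bytes move per element: two reads and one write. */
    printf("a[0]=%.1f  triad: %.2f GB/s\n",
           a[0], 3.0 * N * sizeof(double) / (t1 - t0) / 1e9);

    free(a); free(b); free(c);
    return 0;
}
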
8%
31.05.2012
Applying these lessons to HPC, you might ask, “How do I tinker with HPC?” The answer is far from simple. In terms of hardware, a few PCs, an Ethernet switch, and MPI get you a small cluster; or, a video card
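
Once MPI is installed on those PCs, a first smoke test of the homemade cluster can be as small as the sketch below: every process reports its rank and the node it runs on. It assumes the usual wrappers, built with mpicc hello.c -o hello and launched with mpirun -np 4 ./hello.

/* Minimal MPI check for a small home-built cluster. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, size, len;
    char name[MPI_MAX_PROCESSOR_NAME];

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* which process am I?  */
    MPI_Comm_size(MPI_COMM_WORLD, &size);   /* how many in total?   */
    MPI_Get_processor_name(name, &len);     /* which node am I on?  */

    printf("rank %d of %d on %s\n", rank, size, name);

    MPI_Finalize();
    return 0;
}
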
8%
04.12.2024
maintenance.
Ken Hess, Senior ADMIN Editor
Infos
"Dealing with IT Burnout" by Jeff Layton, ADMIN, issue 50, 2019: https://www.admin-magazine.com/Archive/2019/50/Dealing-with-IT-Burnout
"Mental

				        
    8%
    
    
    23.04.2013
        
    
    	
         on a separate computer. The results can be combined when the job is finished because the map step has no dependencies. The popular mpiBLAST tool takes the same approach by breaking the human genome file
    
 
		    
				        
    8%
    
    
    29.06.2012
        
    
    	
         standard “MPI is still great” disclaimer. Higher level languages often try to hide the details of low-level parallel communication. With this “feature” comes some loss of efficiency, similar to writing
    
 
		    
				        
    8%
    
    
    22.02.2017
        
    
    	
         to build the HDF5 libraries since they will require an MPI library with MPI-IO support. MPI-IO is a low-level interface for carrying out parallel I/O. It gives you a great deal of flexibility but also
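
To make that relationship concrete, here is a sketch of writing one shared dataset through HDF5 built with MPI-IO support: each rank selects its own hyperslab and the write is performed collectively. The file name, dataset name, and per-rank chunk size are illustrative; the build assumes the parallel wrapper, e.g., h5pcc pwrite.c.

/* Parallel HDF5 sketch: every rank writes its own slice of one dataset. */
#include <hdf5.h>
#include <mpi.h>

#define CHUNK 1024   /* elements written by each rank (illustrative) */

int main(int argc, char *argv[])
{
    int rank, size, data[CHUNK];

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    for (int i = 0; i < CHUNK; i++)
        data[i] = rank;              /* dummy payload: the rank number */

    /* Open the file collectively through the MPI-IO driver. */
    hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
    H5Pset_fapl_mpio(fapl, MPI_COMM_WORLD, MPI_INFO_NULL);
    hid_t file = H5Fcreate("parallel.h5", H5F_ACC_TRUNC, H5P_DEFAULT, fapl);

    /* One shared dataset; each rank selects its own hyperslab. */
    hsize_t dims[1]   = { (hsize_t)size * CHUNK };
    hsize_t count[1]  = { CHUNK };
    hsize_t offset[1] = { (hsize_t)rank * CHUNK };

    hid_t filespace = H5Screate_simple(1, dims, NULL);
    hid_t dset = H5Dcreate2(file, "data", H5T_NATIVE_INT, filespace,
                            H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
    hid_t memspace = H5Screate_simple(1, count, NULL);
    H5Sselect_hyperslab(filespace, H5S_SELECT_SET, offset, NULL, count, NULL);

    /* Collective transfer: all ranks participate in one write call. */
    hid_t dxpl = H5Pcreate(H5P_DATASET_XFER);
    H5Pset_dxpl_mpio(dxpl, H5FD_MPIO_COLLECTIVE);
    H5Dwrite(dset, H5T_NATIVE_INT, memspace, filespace, dxpl, data);

    H5Pclose(dxpl); H5Sclose(memspace); H5Dclose(dset);
    H5Sclose(filespace); H5Fclose(file); H5Pclose(fapl);
    MPI_Finalize();
    return 0;
}
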
				        
    8%
    
    
    18.07.2013
        
    
    	
         no dependencies. The popular mpiBLAST tool takes the same approach by breaking the human genome file into chunks and performing "BLAST" mapping on separate cluster nodes.
Suppose you want to calculate the total
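
Picking up that thought with a sketch: when every rank's chunk yields an independent partial result, a single MPI_Reduce combines them into the total on rank 0. The per-rank value below is a stand-in for whatever a real chunk of work would return.

/* Combine independent per-rank results into one total with MPI_Reduce.
 * Build: mpicc total.c -o total    Run: mpirun -np 4 ./total */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Pretend each rank processed its own chunk and got a partial count. */
    double partial = (double)(rank + 1);
    double total = 0.0;

    /* The map step had no dependencies, so the only communication
     * needed is this single combine at the end. */
    MPI_Reduce(&partial, &total, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("total over %d ranks: %f\n", size, total);

    MPI_Finalize();
    return 0;
}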