14%
27.09.2021
="test.bin", status="replace", &
18 action="readwrite", &
19 iostat = ierr)
20 if (ierr > o) then
21 write(*,*) "error in opening file Stopping"
22 stop
23 else
24 do
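Read in context, this is the usual open-with-iostat error check. A complete, minimal version of the pattern might look like the sketch below; the unit number and the unformatted form are assumptions for illustration, not taken from the listing, and checking iostat /= 0 catches any nonzero error code:

program open_check
   implicit none
   integer, parameter :: iunit = 10    ! unit number is an assumption
   integer :: ierr
   open(unit=iunit, file="test.bin", form="unformatted", &
        status="replace", action="readwrite", iostat=ierr)
   if (ierr /= 0) then
      write(*,*) "Error in opening file. Stopping."
      stop
   end if
   ! ... write data here, then close the unit ...
   close(iunit)
end program open_check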
14%
20.05.2014
.50 for one hour with 100 instances for Hadoop (100 x US$ 0.015) and up to US$ 6.00 for up to 100 instances that run on-demand (100 x US$ 0.06). The bottom line is that you are billed for US$ 7.50 per hour
14%
17.06.2017
time (no dynamic memory), so if you declare an array x(100,100), you cannot change the dimensions or size after the code has been compiled. One trick was to define one very large vector and then "give
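The sentence is cut off here, but the workspace trick it alludes to is usually this: declare one oversized static array once and hand out non-overlapping pieces of it wherever scratch space is needed. A minimal sketch under that assumption (the sizes and names are illustrative, not from the article):

program workspace
   implicit none
   integer, parameter :: nwork = 1000000   ! total workspace size, illustrative
   real :: work(nwork)                     ! the single large static vector
   integer :: n1, n2
   n1 = 100*100                            ! space for a "100x100" block
   n2 = 5000                               ! space for a scratch vector
   call clear(work(1:n1))                  ! hand out the first chunk
   call clear(work(n1+1:n1+n2))            ! hand out the next, non-overlapping chunk
contains
   subroutine clear(a)
      real, intent(out) :: a(:)
      a = 0.0
   end subroutine clear
end program workspace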
14%
17.05.2017
, ALLOCATABLE, TARGET :: DATA(:,:)          ! Data to write
INTEGER :: RANK = 2                          ! Dataset rank

CHARACTER(MPI_MAX_PROCESSOR_NAME) HOSTNAME
CHARACTER(LEN=100) :: FILENAME               ! File name
CHARACTER(LEN=3) :: C
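These declarations suggest the common pattern of querying the processor name and building a per-rank file name before the HDF5 calls. A self-contained sketch of just that part, under that assumption (the file-name scheme is invented for illustration):

program per_rank_file
   use mpi
   implicit none
   character(mpi_max_processor_name) :: hostname
   character(len=100) :: filename
   character(len=3)   :: c
   integer :: myrank, namelen, ierr

   call mpi_init(ierr)
   call mpi_comm_rank(mpi_comm_world, myrank, ierr)
   call mpi_get_processor_name(hostname, namelen, ierr)

   write(c, '(i3.3)') myrank                ! rank as a zero-padded string
   filename = 'data_' // c // '.h5'         ! one file per rank (assumed scheme)
   write(*,*) 'rank ', myrank, ' on ', hostname(1:namelen), ' writes ', trim(filename)

   call mpi_finalize(ierr)
end program per_rank_file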
14%
06.05.2014
US$ 1.50 for one hour with 100 instances for Hadoop (100 x US$ 0.015) and up to US$ 6.00 for up to 100 instances that run on-demand (100 x US$ 0.06). The bottom line is that you are billed for US$ 7
14%
12.02.2014
: 0.0% steal: 0.0%
guest: 0.0%
CPU6
user: 0.0% nice: 0.0%
system: 0.0% idle: 100.0%
iowait: 0.0% irq: 0.0%
softirq: 0.0% steal: 0.0%
guest: 0
14%
02.02.2021
--reserve-mb='auto'
%end

%anaconda
pwpolicy root --minlen=6 --minquality=1 --notstrict --nochanges --notempty
pwpolicy user --minlen=6 --minquality=1 --notstrict --nochanges --emptyok
13%
22.12.2017
containers can be found online [6].
For a better understanding of the setup, a brief introduction to the OpenShift PaaS framework follows, before I look at the configuration of the storage system. Open
13%
01.06.2024
Laboratory hosts a number of excellent parallel programming tutorials at its Leadership Computing Facility, including one demonstrating the Monte Carlo method in both serial and parallel implementations [6]. I
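As a reminder of what such a tutorial covers, the serial version of the method is typically the classic dartboard estimate of pi; a minimal sketch (not taken from the tutorial itself, sample count chosen arbitrarily) might look like:

program mc_pi
   implicit none
   integer, parameter :: nsamples = 1000000   ! number of random points, illustrative
   integer :: i, hits
   real :: x, y

   hits = 0
   call random_seed()
   do i = 1, nsamples
      call random_number(x)
      call random_number(y)
      if (x*x + y*y <= 1.0) hits = hits + 1   ! point falls inside the quarter circle
   end do
   write(*,*) 'pi is approximately ', 4.0*real(hits)/real(nsamples)
end program mc_pi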
13%
09.08.2015
.bz2 archive out of the latest stable Cockpit version from the GitHub repository [6], unpack it, and then compile Cockpit using the well-known rule of three:
./configure
make
make install
You then need