100%
31.10.2025
conservatively originally
if {$force_conservative} {
    # With -s, send one character every 0.1 seconds
    set send_slow {1 .1}
    # Redefine send so all output goes through the slowed exp_send
    proc send {ignore arg} {
        sleep .1
        exp_send -s -- $arg
    }
}
...
99%
31.10.2025
"."EUR_VALUE">=10)
3 - access("S"."SALE_DATE">TRUNC(SYSDATE@!)
    -INTERVAL'+00-06' YEAR(2) TO MONTH)
1. The EXPLAIN PLAN FOR command only stores the execution plan in the PLAN_TABLE.
2. The DBMS ...
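The two-step flow the snippet describes (EXPLAIN PLAN FOR writes rows into PLAN_TABLE, DBMS_XPLAN then formats the stored plan) can be sketched in a few lines of Python with the python-oracledb driver; the connection details and the sample table and columns below are placeholders modeled on the predicates shown above, not the article's own listing:

import oracledb

# Connection details are placeholders.
conn = oracledb.connect(user="report", password="secret", dsn="dbhost/orclpdb1")
cur = conn.cursor()

# Step 1: EXPLAIN PLAN FOR only writes the plan into PLAN_TABLE; the query is not run.
cur.execute("""
    EXPLAIN PLAN FOR
    SELECT *
      FROM sales s
     WHERE s.eur_value >= 10
       AND s.sale_date > TRUNC(SYSDATE) - INTERVAL '+00-06' YEAR(2) TO MONTH
""")

# Step 2: DBMS_XPLAN.DISPLAY reads PLAN_TABLE and formats the stored plan.
cur.execute("SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())")
for (line,) in cur:
    print(line)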
97%
31.10.2025
seconds, for example:
# nc -p 16000 -w 30 examplehost.tld 22
If firewalling is in place and you need to originate your connection from a specific IP address to open a port, then you can enter:
# nc -s 1.2.3 ...
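The same reachability check can be scripted; a rough Python equivalent of the nc call above, with the timeout and source-port options mapped to the socket module (the host, the ports, and the local address are placeholders):

import socket

# Mirrors "nc -p 16000 -w 30 examplehost.tld 22"; a specific local IP here mirrors "nc -s".
LOCAL_ADDR = ("0.0.0.0", 16000)
TARGET = ("examplehost.tld", 22)

try:
    with socket.create_connection(TARGET, timeout=30, source_address=LOCAL_ADDR) as conn:
        print(f"{TARGET[0]}:{TARGET[1]} is reachable")
except OSError as exc:
    print(f"connection failed or timed out: {exc}")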
96%
31.10.2025
starting 12.34, so, for example, 12.34.56.78 will be allowed to connect to ALL services and not just SSH. As well as these flexible options, you can also declare old school subnets directly:
sshd: 1.2.3 ...
96%
31.10.2025
Service (Amazon S3)
- ami plugin for Amazon Simple Storage Service (Amazon S3)
- sftp plugin for SSH File Transfer Protocol
- ebs plugin for Elastic Block Storage
- local plugin ...
78%
01.06.2024
Rubén Llorente ... because Sake sets a number of shell variables that are passed to the task being run, which can include the following (a short sketch of reading them follows the list):
S_NAME
S_HOST
S_USER
S_PORT
S_TAGS
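A minimal sketch of what a task script could do with those values, assuming Sake exposes them to the task as environment variables (this helper is hypothetical and is not the article's Listing 3):

import os

def describe_target():
    # Assumes the S_* variables arrive in the task's environment.
    name = os.environ.get("S_NAME", "?")
    host = os.environ.get("S_HOST", "?")
    user = os.environ.get("S_USER", "?")
    port = os.environ.get("S_PORT", "22")
    tags = os.environ.get("S_TAGS", "")
    return f"{name}: {user}@{host}:{port} [{tags}]"

if __name__ == "__main__":
    print(describe_target())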
Listing 3: Multi-OS Upgrade Task
61%
04.12.2024
Rubén Llorente ... " {
  endpoint = "https://192.168.3.15:8006/"
  username = "root@pam"
  password = "proxmox"
  insecure = true
  tmp_dir  = "/var/tmp"

  ssh {
    agent = true
  }
61%
25.09.2023
Rubén Llorente ... is easy enough (superuser privileges are required):
arp -s 192.168.90.55 00:0c:29:c1:91:b1
The last field of the command is the MAC address, which is the unique identifier of the network device to which
60%
30.01.2024
Rubén Llorente ...
8, 9
Debian 11, 12
Ubuntu 20.04, 22.04
FreeBSD 13.x
OpenBSD 7.3
I recommend the downloadable installer, but I will skip
34%
09.10.2017
if page.get('Contents') is not None:
    for file in page.get('Contents'):
        s3pump(file.get('Key'), bucket)
Data Highway?
For large S3 buckets with data in the multiterabyte ...
Data on AWS S3 is not necessarily stuck there. If you want your data back, you can siphon it out all at once with a little Python pump. ...
Data Exchange with AWS S3
Getting data from AWS S3 via Python scripts
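The loop shown above is the tail end of such a pump. A self-contained sketch of the same idea, with the article's s3pump helper reconstructed as a plain boto3 download_file call (the bucket name and the helper body are assumptions, not the original listing):

import os
import boto3

s3 = boto3.client("s3")

def s3pump(key, bucket):
    # Hypothetical reimplementation: download one object, keeping the S3 key as the local path.
    if key.endswith("/"):
        return  # skip zero-byte "directory" markers
    directory = os.path.dirname(key)
    if directory:
        os.makedirs(directory, exist_ok=True)
    s3.download_file(bucket, key, key)

def drain(bucket):
    # Page through the complete object listing and pump down every key.
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket):
        if page.get("Contents") is not None:
            for file in page.get("Contents"):
                s3pump(file.get("Key"), bucket)

drain("example-bucket")  # the bucket name is a placeholder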