Code: Select all
#!/bin/python
print('import packages')
from datetime import datetime
import time

now = datetime.now().strftime("%Y-%m-%d_%H-%M-%S")
tarfile = now + '_stardist.tar'
print(tarfile)

rule all:
    input: {tarfile}

rule main_rule:
    input:
    output: {tarfile}
    run:
        shell('touch ' + output[0])
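One possible workaround, assuming the goal is a single stable name per workflow run: compute the timestamp only if it has not been pinned already, e.g. via an environment variable. `RUN_STAMP` is a hypothetical name chosen for illustration, not part of the original snakefile; a sketch of the idea in plain Python:

```python
import os
from datetime import datetime

# Hypothetical workaround sketch: pin the timestamp in an environment
# variable (RUN_STAMP is an assumed name). setdefault() computes the
# timestamp only when the variable is unset, so repeated evaluations
# inside one process all see the value from the first evaluation.
now = os.environ.setdefault("RUN_STAMP",
                            datetime.now().strftime("%Y-%m-%d_%H-%M-%S"))
tarfile = now + '_stardist.tar'
print(tarfile)
```

Note this only helps if every evaluation of the snakefile sees the same environment: on a cluster, each submitted job typically re-parses the snakefile in a fresh process, so the variable would have to be exported to the jobs (or set before the first `snakemake` call) for the name to stay stable.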
Code: Select all
import packages
2025-02-04_11-33-38_stardist.tar
Building DAG of jobs...
Using shell: /usr/bin/bash
Provided cores: 1 (use --cores to define parallelism)
Rules claiming more threads will be scaled down.
Job stats:
job count min threads max threads
--------- ------- ------------- -------------
all 1 1 1
main_rule 1 1 1
total 2 1 1
Select jobs to execute...
[Tue Feb 4 11:33:38 2025]
rule main_rule:
output: 2025-02-04_11-33-38_stardist.tar
jobid: 1
reason: Missing output files: 2025-02-04_11-33-38_stardist.tar
resources: tmpdir=/tmp
import packages
2025-02-04_11-33-39_stardist.tar
Building DAG of jobs...
Using shell: /usr/bin/bash
Provided cores: 1 (use --cores to define parallelism)
Rules claiming more threads will be scaled down.
Select jobs to execute...
Waiting at most 5 seconds for missing files.
MissingOutputException in rule main_rule in file /lustre/projects/xxx/test_timestamp_minimal/snakefile, line 14:
Job 1 completed successfully, but some output files are missing. Missing files after 5 seconds. This might be due to filesystem latency. If that is the case, consider to increase the wait time with --latency-wait:
2025-02-04_11-33-38_stardist.tar
Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
Complete log: .snakemake/log/2025-02-04T113338.450326.snakemake.log
Can someone please help? 1. I have tested my code on a local Windows 10 machine (Miniforge) and on CentOS 7 (the head node of an HPC environment).
This is the output on the local machine (I am not posting the HPC node's output again, since it is comparable to what I already reported):
import packages
2025-02-05_13-13-27_stardist.tar
Assuming unrestricted shared filesystem usage.
host: screening-pc-4
Building DAG of jobs...
Provided cores: 1 (use --cores to define parallelism)
Rules claiming more threads will be scaled down.
Job stats:
job count
--------- -------
all 1
main_rule 1
total 2
Select jobs to execute...
Execute 1 jobs...
[Wed Feb 5 13:13:27 2025]
localrule main_rule:
output: 2025-02-05_13-13-27_stardist.tar
jobid: 1
reason: Missing output files: 2025-02-05_13-13-27_stardist.tar
resources: tmpdir=C:\Users\xxx\AppData\Local\Temp
[Wed Feb 5 13:13:27 2025]
Finished job 1.
1 of 2 steps (50%) done
Select jobs to execute...
Execute 1 jobs...
[Wed Feb 5 13:13:27 2025]
localrule all:
input: 2025-02-05_13-13-27_stardist.tar
jobid: 0
reason: Input files updated by another job: 2025-02-05_13-13-27_stardist.tar
resources: tmpdir=C:\Users\xxx\AppData\Local\Temp
[Wed Feb 5 13:13:27 2025]
Finished job 0.
2 of 2 steps (100%) done
Complete log: .snakemake\log\2025-02-05T131327.214796.snakemake.log
What I don't understand: in the case of the Win10 environment (where it works), Snakemake reports the following lines only once:
import packages
2025-02-04_14-14-23_stardist.tar
Whereas when it goes wrong, Snakemake seems to execute the snakefile twice (once for each job), which would produce a new timestamp for the second job.
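The difference between the two logs is consistent with that: `datetime.now()` runs at parse time, so two parses of the snakefile happening a second apart (as in the HPC log, `11-33-38` vs. `11-33-39`) expect different output files. A minimal sketch of the mismatch, with a sleep standing in for the one-second gap between parses:

```python
from datetime import datetime
import time

def parse_snakefile_name():
    # Same expression as in the snakefile, evaluated anew on every parse.
    return datetime.now().strftime("%Y-%m-%d_%H-%M-%S") + "_stardist.tar"

first = parse_snakefile_name()   # name the first parse expects (and creates)
time.sleep(1.1)                  # second parse happens at least a second later
second = parse_snakefile_name()  # name the second parse looks for
print(first, second)             # the two names disagree -> "missing" output
```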