I am trying to run multiple jobs at once. I have written a script that runs the jobs one by one, but I want to submit all of them at once, without each job waiting for the previous one to finish. The script is as follows:
```
#!/bin/bash --login
# Default job location in current directory
#SBATCH -p multicore
#SBATCH -n 4   # Use 4 cores

module load gaussian/g16c01_em64t
export GAUSS_SCRDIR="/scratch/$USER/gau_temp_$JOB_ID"
mkdir -p "$GAUSS_SCRDIR"
export GAUSS_PDEF=$NSLOTS

jobnames=($(seq 1 100))
for i in "${jobnames[@]}"; do
    "$g16root/g16/g16" < "job_${i}.gjf" > "job_${i}.out"
done
```
The input files are in the same directory, named job_1.gjf, job_2.gjf, ..., job_100.gjf. The script works fine when the jobs run sequentially.
GNU `parallel` may suit you? If you run `cmd &`, that `cmd` will execute in the background, and you can launch any number of jobs this way.
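A minimal sketch of the backgrounding approach: append `&` inside the loop so each iteration starts without waiting for the previous one, then call `wait` so the batch script does not exit while jobs are still running. Here `my_cmd` is a hypothetical stand-in for the real `$g16root/g16/g16` invocation, and only 4 dummy jobs are launched for illustration:

```shell
#!/bin/bash
# Stand-in for the real Gaussian command (assumption, for demonstration only).
my_cmd() { sleep 0.1; echo "done $1"; }

for i in $(seq 1 4); do
    my_cmd "$i" > "job_${i}.log" &   # '&' sends this job to the background
done
wait                                 # block until every background job finishes
```

Note that `&` launches everything at once; with 100 Gaussian jobs on a 4-core allocation you would likely want to cap concurrency, which is exactly what GNU `parallel` (e.g. with `-j 4`) is designed for.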