[You can download the code from MPjobs at my Google Drive site. Once you select "MPjobs_jconto_xxxx.zip", an icon at the top-center of the screen will perform the download. The demo data are set up for PSSe v.33, but updating the "psseversion" and "pssepath" entries in the INI file makes them usable with v.34.]
MPjobs
MPjobs, using the multiprocessing module available in Python 2.7 (installed with PSSe v.33), activates several instances of PSSe to run a PSSe script on the same data set, with one variable changing between runs.
As an example for loadflow studies, MPjobs can run activity ACCC (contingency analysis) on n cases using the same set of input data files (*.sub, *.mon, *.con) on a fixed number of CPUs set by the user (of course, fewer than the PC's total number of CPUs ;) The changing variable is the name of the case to be tested.
As an example for dynamic studies, MPjobs can run 16 fault events for a 5 sec. simulation on an 8-CPU PC, in parallel, by running the first 8 faults and then the remaining 8 faults (the allocation of tasks to each CPU is done by the Windows OS, over which MPjobs has no control). The changing variable is the name of the file describing the fault to be tested, typically an idev file name.
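As a rough illustration of the idea (this is not MPjobs' internal code), Python's multiprocessing module can dispatch one worker per value of the changing variable, each worker launching its own python/PSSe instance; the script name, case names and CPU count below are placeholders:

# Minimal sketch of the parallel-dispatch idea (not MPjobs' actual code)
import subprocess
from multiprocessing import Pool

def run_one(xvar):
    # the target script name and the way xvar is passed are placeholders for illustration
    return subprocess.call(['python', r'SCRIPTs\run_accc1D.py', xvar])

if __name__ == '__main__':
    xvars = ['savnw', 'savnw32']          # values of the changing variable
    pool = Pool(processes=4)              # number of CPUs chosen by the user
    results = pool.map(run_one, xvars)    # blocks until all jobs are done
    pool.close()
    pool.join()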
Timing Performance (PSSe v33): 7k-bus case, 16 normal-clearing faults, 5 sec. simulation, 1/4 cycle step size, busdim = 80k, 3249 channels. [Timing results chart not reproduced here.]
In power system studies, a study consists of several processing tasks. Many tasks are, and will continue to be, done in series. Tasks suitable for parallel processing can use MPjobs, but their output may still go through a final aggregation step for reports, plotting, etc., as needed.
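As a hedged illustration of such a final aggregation step (not part of MPjobs itself), the per-run log files could be merged into a single report:

# Illustrative post-processing step: merge per-run log files into one report file
import glob
with open('summary_report.txt', 'w') as report:
    for logname in sorted(glob.glob(r'LOGs\*.log')):   # one log file per parallel run
        report.write('==== %s ====\n' % logname)
        with open(logname) as f:
            report.write(f.read())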
MPjobs uses two python scripts and one data input file:
MPjobs.py – contains the code for parallelizing the multiple calls to the target python script (which includes the call to PSSe). Its design has no dependency on PSSe or on the target python script, so no changes to this code are expected for normal parallel runs.
*.ini (default name: mpjobs.ini) – a text file with the data input needed to run the process. It contains the name of the target python script and declares the text files holding the data for the independent variables (one file per independent variable: X, Y, Z). These independent variable values are sent to the target python script once per run.
Data entry in an INI file:
Xvec = [5,10,15,20,25,30]            /<- array input
//Xvec = frange(5.0,35.0,5.0)        /<- commented data line
mycnv = CASEs\savnw_flat33_cnv.sav   /<- string, no quotes
LOGspath = LOGs\
initLoading = 0.6
A few rules about data input using an INI file:
- comments start with, or from, the '/' character
- a path must end with '\'
- in-line comments are stripped before reading the keyword value
- a string value shall be left empty, not set to ='' (the two quote characters would be read as part of the string)
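A minimal sketch of how such INI lines could be parsed into typed values, following the rules above (this is an illustration only, not the actual MPjobs/JCtools parser):

# Illustrative INI-line parser following the rules above (not the actual MPjobs code)
def parse_ini_line(line):
    line = line.split('/')[0].strip()            # strip comments starting with '/'
    if not line or '=' not in line:
        return None                              # blank or fully commented line
    key, value = [s.strip() for s in line.split('=', 1)]
    if value.startswith('['):                    # array input, e.g. Xvec = [5,10,15,20,25,30]
        value = [float(v) for v in value.strip('[]').split(',')]
    else:
        try:
            value = float(value)                 # numeric input, e.g. initLoading = 0.6
        except ValueError:
            pass                                 # plain string, no quotes, e.g. LOGspath = LOGs\
    return key.upper(), value                    # keys are upper-cased, as in My['INITLOADING']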
x.py (e.g.: SCRIPTs\run_accc1D.py) – the target python script that actually runs the PSSe process. Since the PSSe run is done outside the GUI, python code is added up front to identify the correct path for loading psspy.pyc:
import psse33 #updates sys.path with the location of the psspy module
import psspy #library of PSSe APIs
If x.py is coded to run a contingency analysis, the python script will include all PSSe activities to create a DFAX file and then run an ACCC activity. It receives from MPjobs.py all the data it needs, contained in a dictionary structure (a python data type).
All functions and module imports made in MPjobs are available to x.py, together with the study data in dictionary format: the "My" dictionary variable.
Data read by MPjobs from the INI file retains its data type and is made available to x.py:
In INI file       in x.py               value      type
initLoading       My['INITLOADING']     0.6        float
LOGspath          My['LOGSPATH']        'LOGs\'    string
An additional file, 'JCtools.py', contains common procedures like parsing a string, opening a file, reading an INI file, etc.
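As a hedged sketch (not the released run_accc1D.py), a minimal target script for the contingency example could be outlined as follows. It assumes MPjobs has already made the 'My' dictionary available; the DFAX/ACCC calls are only indicated in comments because their exact arguments depend on the study setup:

# Illustrative outline of a target script for the ACCC example (placeholders only)
import os
import psse33                       # updates sys.path with the location of the psspy module
import psspy                        # library of PSSe APIs

mysav = os.path.join(os.path.dirname(My['XFILE']), My['XVAR'] + '.sav')   # e.g. CASEs\savnw.sav
psspy.psseinit(80000)               # initialize PSSe (bus dimension as in the timing example)
psspy.case(mysav)                   # load the base case named by the independent variable
# ...create the DFAX file from the *.sub / *.mon / *.con files named in the INI data,
# then run the ACCC activity (e.g. via psspy.dfax(...) and psspy.accc(...));
# see the PSSe API manual for the exact argument lists.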
MPjobs.py has been configured to run data sets with up to three independent variables (a 3D run, or a triple-loop run over the Z, Y, X vars; see the sketch after this list):
- 1D runs: a single list of variables, in addition to the study data (i.e., in dynamics runs, a list of fault names = X vars)
- 2D runs: two independent lists of variables (i.e., in dynamics runs, a list of fault names = X vars and a list of base case names = Y vars)
- 3D runs: three independent lists of variables (i.e., in dynamics runs, a list of fault names = X vars, a list of base case names = Y vars, and a list of study region load levels = Z vars)
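Conceptually, a 3D run walks the full cross product of the three lists; a minimal sketch of that triple loop (the lists and values are hypothetical):

# Illustrative triple loop over the independent variables of a 3D run
zvars = [0.6, 0.8, 1.0]                  # e.g. study region load levels
yvars = ['savnw', 'savnw32']             # e.g. base case names
xvars = ['fault01.idv', 'fault02.idv']   # e.g. fault description files
jobs = []
for zvar in zvars:
    for yvar in yvars:
        for xvar in xvars:
            jobs.append((zvar, yvar, xvar))   # one PSSe run per (Z, Y, X) combination
# MPjobs then dispatches these jobs to the available CPUs, a batch at a time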
If an application can run under DOS, with a command-line argument or data contained in a file, then instances of that application can run in parallel using MPjobs. It is all about how the target python script is coded!
MPjobs can also run in python 2.5 (installed with PSSe v.32) after installation of the add-on module ‘multiprocessing’ (available at https://pypi.python.org/pypi/multiprocessing/), which might require admin rights on your PC.
Example 1 - Loadflow study demo
(All files are part of the MPjobs release)
Let's run a "Multiple ACCC run" for multiple base cases, where an ACCC run is executed for each base case. In this scenario, the base case file name is the independent variable. A text file ‘sav.lst’ contains the names of the base cases to be used:
sav.lst:
savnw    # = xvar, in the first pass of the X-loop in MPjobs
savnw32  # = xvar, in the next pass of the X-loop in MPjobs
The ‘mpACCC.ini’ file
has all study data needed for this run, including the name of the target python
script:
..
Script = SCRIPTs\run_accc1D.py
Xfile = CASEs\sav.lst
..
File names can have full paths or new vars can be defined to
hold those path strings.
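For instance, a path keyword could be added to the INI file and combined with the received file name inside the target script (the keyword name 'CASEpath' is only an illustration):

CASEpath = CASEs\
and then, in the target python script:
mysav = My['CASEPATH'] + My['XVAR']   # e.g. CASEs\savnw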
The target python script (‘run_accc1D.py’) will contain the code to run PSSe activities [execution of DFAX and ACCC activities], utilizing the data made available by MPjobs through the dictionary My. This target python code can also parse data and create new variables based on the received independent variable, like creating unique file names for output data:
xvarpath = os.path.dirname(My['XFILE'])         # = CASEs\
xvarkey, xext = os.path.splitext(My['XVAR'])    # = savnw & ''
..
My['MYSAV'] = '%s\%s' % (xvarpath, My['XVAR'])  # = CASEs\savnw
Within a loop (over the Xvar values), MPjobs opens a new instance of python and executes the script run_accc1D.py, effectively running an instance of PSSe to execute an ACCC study.
To run this loadflow demo, open a DOS window (the "PSSe Command Prompt" link can be used, or any other means of opening a DOS window) and type the command plus the INI file name, without extension, as:
C:..>mpjobs mpaccc
Once the parallel run is completed, output files (*.dfx, *.acc) will be created in the “ACCC” folder.
Example 2 - Dynamic study demos
1D-dynamic run:
target script = run_faults1D.py
inifile = ‘mpjobs.ini’
xfile = events\sb.lst #list of faults filenames, each one to be assigned to xvar
run as C:..>mpjobs
The data in ‘mpjobs.ini’ (the default INI file name) will call ‘run_faults1D.py’ as the target python script, to run dynamics simulations for 16 fault scenarios.
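Inside such a target dynamics script, the received fault file name can be turned into unique per-fault output names; a small sketch, assuming the fault name arrives as My['XVAR'] and the LOGs path as My['LOGSPATH'] (as in the INI example above, with hypothetical fault file names):

# Illustrative derivation of per-fault output names inside a dynamics target script
import os
xvarkey, xext = os.path.splitext(My['XVAR'])                # e.g. 'sb_fault01', '.idv'
outfile = os.path.join(My['LOGSPATH'], xvarkey + '.out')    # channel output file for this fault
logfile = os.path.join(My['LOGSPATH'], xvarkey + '.log')    # progress/log file for this fault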
1D-dynamic plotting:
target script = mpplot.py
inifile = ‘mpplots1D.ini’
xfile = events\sb.lst #list of faults filenames, each one to be assigned to xvar
The data in ‘mpplots1D.ini’ identifies ‘mpplot.py’ as the target python script, which runs PSSPLT, the plotting engine in PSSe, to create plots in PostScript format (*.ps) for each fault scenario created during the 1D-dynamic run.
The created PostScript files can be converted to *.pdf by Acrobat’s Distiller tool in a second process, not included here.
2D-dynamic run:
target script = run_faults2D.py
inifile = ‘mp2D.ini’
xfile = events\sb.lst #list of faults filenames, each one to be assigned to xvar
yfile = CASEs\cases.lst #list of base cases filenames, each one to be assigned to yvar
As an example of a 2D process, using two independent variables [also called a double-loop run], the data in ‘mp2D.ini’ will identify ‘run_faults2D.py’ as the target python script, to run dynamics simulations for two different base cases and 16 fault scenarios (2 × 16 = 32 runs in total).
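In a 2D run the target script receives both independent values, so output names typically combine them to stay unique. A small sketch, assuming the second variable is exposed as My['YVAR'] (the key name and file names are assumptions for illustration):

# Illustrative unique output name combining the two independent variables of a 2D run
import os
yvarkey = os.path.splitext(os.path.basename(My['YVAR']))[0]   # base case name, e.g. 'savnw'
xvarkey = os.path.splitext(os.path.basename(My['XVAR']))[0]   # fault name, e.g. 'sb_fault01'
outfile = '%s_%s.out' % (yvarkey, xvarkey)                    # e.g. savnw_sb_fault01.out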
1D-parametric run:
target script = grun.py
inifile = ‘mpgrun_droop.ini’
Xvec = [5,10,15,20,25,30] #Xvec is a list of all xvar values (droop) to be tested
run as C:..>mpjobs mpgrun_droop
In the IEESGO governor model, a parameter representing the droop is changed over a range to get the governor response using PSSe GRUN tests.
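A hedged sketch of how the received droop value could be used inside such a parametric script, assuming it arrives as My['XVAR'] as in the earlier examples; the actual CON change and GRUN call are only indicated in comments, since their exact form depends on the machine, the model, and the PSSe manual:

# Illustrative use of the parametric value (droop) received from MPjobs
droop = My['XVAR']                        # one value from Xvec, e.g. 10
outname = 'grun_droop_%s' % str(droop)    # unique name for this run's output files
# ...the script would then set the IEESGO CON that represents the droop to this value
# (e.g. via psspy.change_con(...)) before running the GRUN governor response test.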