Merged · 24 commits
2 changes: 1 addition & 1 deletion .github/workflows/process.yml
@@ -191,7 +191,7 @@ jobs:
 path: tracking/
 - run: mv tracking/large_tokamak_nof.SIG_TF.json tracking/large_tokamak_nof_SIG_TF.json
 - name: Create an example plot proc
-run: hatch run python process/io/plot_proc.py -f tracking/large_tokamak_nof_MFILE.DAT
+run: hatch run python process/core/io/plot_proc.py -f tracking/large_tokamak_nof_MFILE.DAT
 - name: Move plot proc file to docs images
 run: mv tracking/large_tokamak_nof_MFILE.DATSUMMARY.pdf documentation/source/images/plot_proc.pdf
 - run: hatch run docs:build
2 changes: 1 addition & 1 deletion .gitignore
@@ -41,7 +41,7 @@ com.msg
 *.gz
 ford_site/
 lib/
-process/io/python_fortran_dicts.json
+process/core/io/python_fortran_dicts.json
 *.egg-info
 fortran.py
 .coverage
2 changes: 1 addition & 1 deletion documentation/source/eng-models/machine-build.md
@@ -2,7 +2,7 @@


 Simplified scale diagrams of the vertical and horizontal cross-sections of the machine can be
-output in the summary created by using the utility `plot_proc.py` (currently stored in `process/process/io`).
+output in the summary created by using the utility `plot_proc.py` (currently stored in `process/process/core/io`).

 The coordinate system is $(R,Z)$ system, where $R$ is the radial distance from the vertical
 centreline (axis) of the torus, and $Z$ is the vertical distance from the equatorial midplane.
4 changes: 2 additions & 2 deletions documentation/source/io/python-lib-guide.md
@@ -1,6 +1,6 @@
 ## Library for `IN.DAT` Files

-> `process/io/in_dat.py`
+> `process/core/io/in_dat.py`

 A set of Python classes to read, modify and write an `IN.DAT` file.

@@ -64,7 +64,7 @@ i.write_in_dat(output_filename="new_IN.DAT")

 ## Library for `MFILE.DAT` Files

-> `process/io/mfile.py`
+> `process/core/io/mfile.py`


 A set of Python classes to read and extract data from the `MFILE.DAT`.
38 changes: 19 additions & 19 deletions documentation/source/io/utilities.md
@@ -20,13 +20,13 @@ make sure their directory is in your Python path.

 ## Compare MFILEs

-`process/io/mfile_comparison.py`
+`process/core/io/mfile_comparison.py`

 Tool for comparing two MFILEs and outputting significant differences in numerical values.

 ### Usage
 ```bash
-python process/io/mfile_comparison.py [-f path/to/first_MFILE.DAT path/to/second_MFILE.DAT] [-s] [--acc] [--verbose]
+python process/core/io/mfile_comparison.py [-f path/to/first_MFILE.DAT path/to/second_MFILE.DAT] [-s] [--acc] [--verbose]
 ```
 ### Options
 | Argument | Description |
@@ -43,15 +43,15 @@ Outputs variables and their values which differ significantly between the two MF
 ## CSV Exporter

 ```
-process/io/mfile_to_csv.py
+process/core/io/mfile_to_csv.py
 ```

-This script reads from a PROCESS MFILE and writes values into a CSV file. The variable list is given in a .json file which is defined by the user; a pre-made one can be found in `process/io/mfile_to_csv_vars.json`.
+This script reads from a PROCESS MFILE and writes values into a CSV file. The variable list is given in a .json file which is defined by the user; a pre-made one can be found in `process/core/io/mfile_to_csv_vars.json`.

 ### Usage

 ```bash
-python process/io/mfile_to_csv.py [-h] [-f path/to/MFILE] [-v path/to/variable_list.json]
+python process/core/io/mfile_to_csv.py [-h] [-f path/to/MFILE] [-v path/to/variable_list.json]
 ```
 ### Options
 | Argument | Description |
@@ -67,14 +67,14 @@ A `.csv` file will be saved to the directory of the input file.

 ## PROCESS Summary

-> `process/io/plot_proc.py`
+> `process/core/io/plot_proc.py`

 A utility to produce a summary of the output from PROCESS, including the major parameters, poloidal and toroidal cross-sections, temperature and density profiles, TF coil layout and turn structure, density, bootstrap and L-H scaling comparisons, plasma and PF coil current over the pulse duration plot and first wall geometry and pumping plot.

 ### Usage

 ```bash
-python process/io/plot_proc.py [-h] [-f path/to/MFILE.DAT] [-s] [-n N] [-d] [-c COLOUR] [-o OUTPUT-FORMAT]
+python process/core/io/plot_proc.py [-h] [-f path/to/MFILE.DAT] [-s] [-n N] [-d] [-c COLOUR] [-o OUTPUT-FORMAT]
 ```

 If no `-f` argument is provided it assumes a file named `MFILE.DAT` is in the current directory.
@@ -194,7 +194,7 @@ for the reference year of the inflation used.

 ## Sankey Diagram

-> `process/io/plot_sankey.py`
+> `process/core/io/plot_sankey.py`

 The power flows of the power plant will be extracted from MFILE.DAT and used to populate a
 Sankey diagram. The diagram will start from the initial fusion power and show the inputs
@@ -204,7 +204,7 @@ heating back into the fusion power.
 ### Usage

 ```
-python process/io/plot_sankey.py [-h] [-e END] [-m path/to/MFILE.DAT]
+python process/core/io/plot_sankey.py [-h] [-e END] [-m path/to/MFILE.DAT]
 ```
 If no `-m` argument is provided it assumes a file named `MFILE.DAT` is in the current directory.

@@ -233,7 +233,7 @@ N.B. Rounding to whole integer can cause errors of $\pm$1 between adjacent arrow

 ## TF Stress distribution plots

-> `process/io/plot_stress_tf.py`
+> `process/core/io/plot_stress_tf.py`

 Program to plot stress, strain and displacement radial distributions at the inboard mid-plane section of the TF coil.
 This program uses the `SIG_TF.json` file created by running `PROCESS`, that stores stress distributions of the VMCON point and stores the output
@@ -250,7 +250,7 @@ done in the near future.
 ### Usage

 ```bash
-python process/io/plot_stress_tf.py [-h] [-f path/to/SIG_TF.json] [-p [PLOT_SELEC]] [-sf [SAVE_FORMAT]] [-as [AXIS_FONT_SIZE]]
+python process/core/io/plot_stress_tf.py [-h] [-f path/to/SIG_TF.json] [-p [PLOT_SELEC]] [-sf [SAVE_FORMAT]] [-as [AXIS_FONT_SIZE]]
 ```

 ### Option
@@ -271,7 +271,7 @@ python process/io/plot_stress_tf.py [-h] [-f path/to/SIG_TF.json] [-p [PLOT_SELE

 ## Turn output into input

-`process/io/write_new_in_dat.py`
+`process/core/io/write_new_in_dat.py`

 This program creates a new `IN.DAT` file with the initial values of all the iteration variables
 replaced by their results in `OUT.DAT`, if that output is a feasible solution.
@@ -285,7 +285,7 @@ the new starting values. There is also an option to select the first feasible so

 ### Usage
 ```
-python process/io/write_new_in_dat.py [-h] [-f path/to/MFILE.DAT] [-i path/to/IN.DAT] [-o path/to/new_IN.DAT]
+python process/core/io/write_new_in_dat.py [-h] [-f path/to/MFILE.DAT] [-i path/to/IN.DAT] [-o path/to/new_IN.DAT]
 ```

 ### Options
@@ -304,7 +304,7 @@ python process/io/write_new_in_dat.py [-h] [-f path/to/MFILE.DAT] [-i path/to/IN

 ## Plot scan results

-`process/io/plot_scans.py`
+`process/core/io/plot_scans.py`

 This utility plots the output of a PROCESS scan. PROCESS must be run on a scan-enabled input file to create an MFILE on which `plot_scans.py` can be run. More than one input file can be used and the different files will be plotted on the same graph.

@@ -315,7 +315,7 @@ This utility plots the output of a PROCESS scan-e
 ### Usage

 ```
-python process/io/plot_scans.py [-h] [-f path/to/MFILE(s)] [-yv output vars] [-yv2 2nd axis output variable] [-o [path/to/directory]] [-out] [-sf [SAVE_FORMAT]] [-as [AXIS_FONT_SIZE]] [-ln LABEL_NAME] [-2DC] [-stc]
+python process/core/io/plot_scans.py [-h] [-f path/to/MFILE(s)] [-yv output vars] [-yv2 2nd axis output variable] [-o [path/to/directory]] [-out] [-sf [SAVE_FORMAT]] [-as [AXIS_FONT_SIZE]] [-ln LABEL_NAME] [-2DC] [-stc]
 ```

 ### Options
@@ -338,7 +338,7 @@ python process/io/plot_scans.py [-h] [-f path/to/MFILE(s)] [-yv output vars] [-y

 ## Plot a pie chart of the cost breakdown

-`process/io/costs_pie.py`
+`process/core/io/costs_pie.py`

 This utility plots the cost breakdown as a pie chart giving each component as a percentage. This allows for the most expensive areas to be easily identified. For the 1990 cost model, an additional plot showing how direct, indirect and contingency costs contribute to the overall budget is shown.

@@ -348,7 +348,7 @@ This utility plots the cost breakdown as a pie chart giving each component as a

 ### Usage
 ```
-python process/io/costs_pie.py [-h] [-f path/to/MFILE] [-s]
+python process/core/io/costs_pie.py [-h] [-f path/to/MFILE] [-s]
 ```
 If no `-f` argument is provided it assumes a file named `MFILE.DAT` is in the current directory.

@@ -363,7 +363,7 @@ If no `-f` argument is provided it assumes a file named `MFILE.DAT` is in the cu

 ## Plot a bar chart of the cost breakdown

-`process/io/costs_bar.py`
+`process/core/io/costs_bar.py`

 This utility plots the cost breakdown as a bar chart giving the cost of each component. This allows for the most expensive areas to be easily identified. For the 1990 cost model, an additional plot showing how the direct, indirect and contingency costs contribute to the overall budget is shown. Multiple MFILEs can be specified allowing for different PROCESS runs to be compared on the same plot. An inflation factor can be specified using the `-inf` argument, which multiplies all the costs by that value.

@@ -373,7 +373,7 @@ This utility plots the cost breakdown as a bar chart giving the cost of each com

 ### Usage
 ```
-python process/io/costs_bar.py [-h] [-f f [f ...]] [-s] [-inf INF]
+python process/core/io/costs_bar.py [-h] [-f f [f ...]] [-s] [-inf INF]
 ```

 ### Options
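Every hunk in `utilities.md` above applies the same mechanical substitution: `process/io/` becomes `process/core/io/`, and (per the later hunks in this PR) solver-related modules move under `process.core.solver`. The following is a hedged sketch of a one-off helper that applies the corresponding module-path moves to Python source; the mapping table is inferred from this diff and is illustrative only, not an official PROCESS migration tool:

```python
import re

# Old -> new module locations, inferred from the hunks in this PR
# (illustrative subset; not an official migration table).
MODULE_MOVES = [
    (r"\bprocess\.io\b", "process.core.io"),
    (r"\bprocess\.repository\b", "process.core.repository"),
    (r"\bprocess\.constraints\b", "process.core.solver.constraints"),
    (r"\bprocess\.iteration_variables\b", "process.core.solver.iteration_variables"),
    (r"\bprocess\.objectives\b", "process.core.solver.objectives"),
    (r"\bprocess\.process_output\b", "process.core.process_output"),
    (r"\bprocess\.final\b", "process.core.final"),
]

def rewrite_imports(source: str) -> str:
    """Rewrite pre-refactor PROCESS import paths to the new package layout."""
    for old, new in MODULE_MOVES:
        # Word boundaries keep e.g. "process.io" from matching inside
        # already-migrated "process.core.io" paths, so the rewrite is idempotent.
        source = re.sub(old, new, source)
    return source
```

Because none of the new dotted paths contain an old one as a substring, running the helper twice over the same file is harmless.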
8 changes: 4 additions & 4 deletions documentation/source/usage/plotting.md
@@ -9,10 +9,10 @@

 ### Summary document | `plot_proc.py`

-`plot_proc` is used for plotting an overview of the results from an MFILE. It can be run using its own CLI (see [here](https://ukaea.github.io/PROCESS/io/utilities/) for full details):
+`plot_proc` is used for plotting an overview of the results from an MFILE. It can be run using its own CLI (see [here](https://ukaea.github.io/process/core/io/utilities/) for full details):

 ```bash
-python process/io/plot_proc.py -f path/to/MFILE.DAT
+python process/core/io/plot_proc.py -f path/to/MFILE.DAT
 ```

 or through PROCESS's main CLI:
@@ -32,7 +32,7 @@ An example of a plot proc output PDF for the large tokamak regression test is sh
 Scans can be done in one or two dimensions.

 ```bash
-python process/io/plot_scans.py -f path/to/MFILE.DAT
+python process/core/io/plot_scans.py -f path/to/MFILE.DAT
 ```
 <figure markdown>
 ![2D_contour_plot](../images/2D_contour_plot_example.png){figures-side, width="100%"}
@@ -51,7 +51,7 @@ python process/io/plot_scans.py -f path/to/MFILE.DAT
 `plot_plotly_sankey` is to plot an interactive `.html` Sankey diagram for looking at the plants power balance. It can be run as follows:

 ```bash
-python process/io/plot_plotly_sankey.py -m path/to/MFILE.DAT
+python process/core/io/plot_plotly_sankey.py -m path/to/MFILE.DAT
 ```

 <figure markdown>
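Path updates like the documentation hunks above are easy to miss one file at a time, so a small checker can confirm that no stale `process/io/` reference survives in a docs tree. This is a hypothetical sketch; the markdown-only scope and the idea of scanning a `docs` root are assumptions for illustration, not part of the PR:

```python
import re
from pathlib import Path

# Old path prefix that every documentation hunk in this PR replaces.
STALE = re.compile(r"\bprocess/io/")

def find_stale_lines(text: str) -> list[str]:
    """Return the lines of `text` that still use the old process/io/ layout."""
    return [line for line in text.splitlines() if STALE.search(line)]

def scan_markdown_tree(root: Path) -> dict[str, list[str]]:
    """Map each markdown file under `root` to any stale-path lines it contains."""
    return {
        str(path): hits
        for path in sorted(root.rglob("*.md"))
        if (hits := find_stale_lines(path.read_text(encoding="utf-8")))
    }
```

An empty dictionary from `scan_markdown_tree(Path("documentation"))` would indicate the rename was applied everywhere; the regex deliberately does not match the new `process/core/io/` prefix.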
8 changes: 4 additions & 4 deletions examples/introduction.ex.py
@@ -42,8 +42,8 @@
 import tempfile
 from pathlib import Path

+from process.core.repository import get_process_root
 from process.main import SingleRun
-from process.repository import get_process_root

 # Define project root dir; this is using the current working directory
 PROJ_DIR = Path.cwd().parent
@@ -65,10 +65,10 @@
 # ## Plot summary
 # Create a summary of the generated `MFILE.DAT` using `plot_proc`.
 #
-# You can also call `plot_proc` from the cli with `python -m process.io.plot_proc`
+# You can also call `plot_proc` from the cli with `python -m process.core.io.plot_proc`

 # %%
-from process.io import plot_proc
+from process.core.io import plot_proc

 # Pdf and png output are also available
 plot_proc.main(
@@ -92,7 +92,7 @@
 # This demonstrates how you would read from a PROCESS MFILE and write specified values into a csv using the `mfile_to_csv` function

 # %%
-from process.io import mfile_to_csv
+from process.core.io import mfile_to_csv

 # mfile_to_csv requires two inputs:
 # - path to the MFILE
4 changes: 2 additions & 2 deletions examples/plot_solutions.ex.py
@@ -38,12 +38,12 @@
 # %load_ext autoreload
 # %autoreload 2

-from process.io.plot_solutions import (
+from process.core.io.plot_solutions import (
     RunMetadata,
     plot_mfile_solutions,
     plot_mfile_solutions_constraints,
 )
-from process.repository import get_process_root
+from process.core.repository import get_process_root

 # %% [markdown]
 # ## Plot single solution
4 changes: 2 additions & 2 deletions examples/scan.ex.py
@@ -33,8 +33,8 @@
 # - `sweep`: array of values for the scanned variable to take; one for each run. Should be of length `isweep`

 # %% slideshow={"slide_type": "subslide"}
+from process.core.repository import get_process_root
 from process.main import SingleRun
-from process.repository import get_process_root

 data_dir = get_process_root() / "../examples/data/"
 input_name = data_dir / "scan_example_file_IN.DAT"
@@ -48,7 +48,7 @@

 # %% slideshow={"slide_type": "subslide"}
 # %matplotlib inline
-from process.io import plot_scans
+from process.core.io import plot_scans

 # Define working directory relative to project dir and input file name
 mfile_name = data_dir / "scan_example_file_MFILE.DAT"
4 changes: 2 additions & 2 deletions examples/single_model_evaluation.ex.py
@@ -38,7 +38,7 @@
 # In order to initialise all variables in Process with their values at a given point (design parameter vector), run an evaluation input file (one with no optimisation) to initialise values in all models. The "large tokamak" regression test solution is used here.

 # %%
-from process.repository import get_process_root
+from process.core.repository import get_process_root

 data_dir = get_process_root() / "../examples/data/"
 single_run = SingleRun((data_dir / "large_tokamak_eval_IN.DAT").as_posix())
@@ -80,7 +80,7 @@ def print_values():
 # Now investigate effect of varying W impurity on impurity radiation power, divertor power and constraint 15 (L-H threshold constraint).

 # %%
-from process.constraints import ConstraintManager
+from process.core.solver.constraints import ConstraintManager


 def run_impurities(w_imp_fracs):
4 changes: 2 additions & 2 deletions examples/vary_run_example.ex.py
@@ -79,9 +79,9 @@
 import tempfile
 from pathlib import Path

-from process.io.mfile_utils import get_mfile_initial_ixc_values
+from process.core.io.mfile_utils import get_mfile_initial_ixc_values
+from process.core.repository import get_process_root
 from process.main import VaryRun
-from process.repository import get_process_root

 # Define project root dir; when running a notebook, the cwd is the dir the notebook is in
 PROJ_DIR = Path.cwd().parent
File renamed without changes.
15 changes: 8 additions & 7 deletions process/caller.py → process/core/caller.py
@@ -7,13 +7,14 @@
 import numpy as np
 from tabulate import tabulate

-import process.constraints as constraints
-from process import constants, data_structure
-from process.final import finalise
-from process.io.mfile import MFile
-from process.iteration_variables import set_scaled_iteration_variable
-from process.objectives import objective_function
-from process.process_output import OutputFileManager, ovarre
+import process.core.solver.constraints as constraints
+from process import data_structure
+from process.core import constants
+from process.core.final import finalise
+from process.core.io.mfile import MFile
+from process.core.process_output import OutputFileManager, ovarre
+from process.core.solver.iteration_variables import set_scaled_iteration_variable
+from process.core.solver.objectives import objective_function

 if TYPE_CHECKING:
     from process.main import Models
File renamed without changes.
File renamed without changes.
File renamed without changes.
10 changes: 5 additions & 5 deletions process/final.py → process/core/final.py
@@ -2,12 +2,12 @@

 from tabulate import tabulate

-import process.constraints as constraints
-from process import constants
-from process import output as op
-from process import process_output as po
+import process.core.solver.constraints as constraints
+from process.core import constants
+from process.core import output as op
+from process.core import process_output as po
+from process.core.solver.objectives import objective_function
 from process.data_structure import numerics
-from process.objectives import objective_function


 def finalise(models, ifail: int, non_idempotent_msg: str | None = None):
15 changes: 8 additions & 7 deletions process/init.py → process/core/init.py
@@ -6,10 +6,14 @@
 from warnings import warn

 import process
-import process.iteration_variables as iteration_variables
-import process.process_output as process_output
-from process import constants, data_structure
-from process.constraints import ConstraintManager
+import process.core.process_output as process_output
+import process.core.solver.iteration_variables as iteration_variables
+from process import data_structure
+from process.core import constants
+from process.core.exceptions import ProcessValidationError
+from process.core.input import parse_input_file
+from process.core.log import logging_model_handler
+from process.core.solver.constraints import ConstraintManager
 from process.data_structure.blanket_library import init_blanket_library
 from process.data_structure.build_variables import init_build_variables
 from process.data_structure.buildings_variables import init_buildings_variables
@@ -56,9 +60,6 @@
 from process.data_structure.times_variables import init_times_variables
 from process.data_structure.vacuum_variables import init_vacuum_variables
 from process.data_structure.water_usage_variables import init_watuse_variables
-from process.exceptions import ProcessValidationError
-from process.input import parse_input_file
-from process.log import logging_model_handler
 from process.models.stellarator.initialization import st_init