Introduce mocksnakemake, which actually parses the Snakefile (#107)

* rewrite mocksnakemake to parse the real Snakefile

* continue adding the function to scripts

* go through all scripts, setting up the new mocksnakemake

* fix plotting scripts

* fix build_country_flh

* fix build_country_flh II

* adjust config files

* fix make_summary for tutorial network

* create dir also for output

* incorporate suggestions

* consistent import of mocksnakemake

* consistent import of mocksnakemake II

* Update scripts/_helpers.py

Co-Authored-By: euronion <42553970+euronion@users.noreply.github.com>

* Update scripts/_helpers.py

Co-Authored-By: euronion <42553970+euronion@users.noreply.github.com>

* Update scripts/_helpers.py

Co-Authored-By: euronion <42553970+euronion@users.noreply.github.com>

* Update scripts/_helpers.py

Co-Authored-By: euronion <42553970+euronion@users.noreply.github.com>

* Update scripts/plot_network.py

Co-Authored-By: euronion <42553970+euronion@users.noreply.github.com>

* Update scripts/plot_network.py

Co-Authored-By: euronion <42553970+euronion@users.noreply.github.com>

* Update scripts/retrieve_databundle.py

Co-Authored-By: euronion <42553970+euronion@users.noreply.github.com>

* use pathlib for mocksnakemake

* rename mocksnakemake into mock_snakemake

* revert change in data

* Update scripts/_helpers.py

Co-Authored-By: euronion <42553970+euronion@users.noreply.github.com>

* remove setting logfile in mock_snakemake, use Path in configure_logging

* fix fallback path and base_dir
fix return type of make_io_accessable

* reformulate mock_snakemake

* incorporate suggestion, fix typos

* mock_snakemake: apply absolute paths again, add assertion error
*.py: make hard-coded io paths accessible for mock_snakemake

* retrieve_natura_raster: use snakemake.output for fn_out

* include suggestion

* Apply suggestions from code review

Co-Authored-By: Jonas Hörsch <jonas.hoersch@posteo.de>

* linting, add return at end of file

* Update scripts/plot_p_nom_max.py

Co-Authored-By: Jonas Hörsch <jonas.hoersch@posteo.de>

* Update scripts/plot_p_nom_max.py

fixes #112

Co-Authored-By: Jonas Hörsch <jonas.hoersch@posteo.de>

* plot_p_nom_max: small correction

* config.tutorial.yaml: fix snapshots end

* use techs instead of technology

* revert try-out from previous commit, complete the replacement

* change clusters -> clusts in plot_p_nom_max due to wildcard constraints of clusters

* change clusters -> clusts in plot_p_nom_max due to wildcard constraints of clusters II
FabianHofmann 2019-12-09 21:29:15 +01:00 committed by GitHub
parent 5143a258ac
commit eaf30a9b65
30 changed files with 241 additions and 316 deletions

View File

@ -124,7 +124,7 @@ rule build_bus_regions:
if config['enable'].get('build_cutout', False):
rule build_cutout:
output: directory("cutouts/{cutout}")
log: "logs/build_cutout.log"
log: "logs/build_cutout/{cutout}.log"
resources: mem=config['atlite'].get('nprocesses', 4) * 1000
threads: config['atlite'].get('nprocesses', 4)
benchmark: "benchmarks/build_cutout_{cutout}"
@ -357,11 +357,11 @@ rule plot_summary:
def input_plot_p_nom_max(wildcards):
return [('networks/{network}_s{simpl}{maybe_cluster}.nc'
.format(maybe_cluster=('' if c == 'full' else ('_' + c)), **wildcards))
for c in wildcards.clusters.split(",")]
for c in wildcards.clusts.split(",")]
rule plot_p_nom_max:
input: input_plot_p_nom_max
output: "results/plots/{network}_s{simpl}_cum_p_nom_max_{clusters}_{technology}_{country}.{ext}"
log: "logs/plot_p_nom_max/{network}_s{simpl}_{clusters}_{technology}_{country}_{ext}.log"
output: "results/plots/{network}_s{simpl}_cum_p_nom_max_{clusts}_{techs}_{country}.{ext}"
log: "logs/plot_p_nom_max/{network}_s{simpl}_{clusts}_{techs}_{country}_{ext}.log"
script: "scripts/plot_p_nom_max.py"
rule build_country_flh:
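
As a side note (not part of the diff): the renamed input function above simply drops the cluster suffix for the 'full' network. With the tutorial-style wildcards used later in scripts/plot_p_nom_max.py it expands roughly like this:

# Illustrative sketch only: mimics input_plot_p_nom_max with example wildcard values.
wildcards = {'network': 'elec', 'simpl': '', 'clusts': '5,full'}
inputs = [('networks/{network}_s{simpl}{maybe_cluster}.nc'
           .format(maybe_cluster=('' if c == 'full' else ('_' + c)), **wildcards))
          for c in wildcards['clusts'].split(',')]
print(inputs)  # ['networks/elec_s_5.nc', 'networks/elec_s.nc']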

View File

@ -14,7 +14,7 @@ scenario:
clusters: [37, 100]
opts: [Co2L-3H]
countries: ['AL', 'AT', 'BA', 'BE', 'BG', 'CH', 'CZ', 'DE', 'DK', 'EE', 'ES', 'FI', 'FR', 'GB', 'GR', 'HR', 'HU', 'IE', 'IT', 'LT', 'LU', 'LV', 'ME', 'MK', 'NL', 'NO', 'PL', 'PT', 'RO', 'RS', 'SE', 'SI', 'SK']
countries: ['AL', 'AT', 'BA', 'BE', 'BG', 'CH', 'CZ', 'DE', 'DK', 'EE', 'ES', 'FI', 'FR', 'GB', 'GR', 'HR', 'HU', 'IE', 'IT', 'LT', 'LU', 'LV', 'ME', 'MK', 'NL', 'NO', 'PL', 'PT', 'RO', 'RS', 'SE', 'SI', 'SK']
snapshots:
start: "2013-01-01"
@ -224,7 +224,7 @@ plotting:
storage_techs: ["hydro+PHS", "battery", "H2"]
load_carriers: ["AC load"]
AC_carriers: ["AC line", "AC transformer"]
link_carriers: ["DC line", "Converter AC-DC"]
link_carriers: ["DC line", "Converter AC-DC"]
tech_colors:
"onwind" : "#235ebc"
"onshore wind" : "#235ebc"
@ -278,7 +278,7 @@ plotting:
"helmeth" : "#a31597"
"DAC" : "#d284ff"
"co2 stored" : "#e5e5e5"
"CO2 sequestration" : "#e5e5e5"
"CO2 sequestration" : "#e5e5e5"
"battery" : "#b8ea04"
"battery storage" : "#b8ea04"
"Li ion" : "#b8ea04"
@ -287,7 +287,7 @@ plotting:
"transport fuel cell" : "#e884be"
"retrofitting" : "#e0d6a8"
"building retrofitting" : "#e0d6a8"
"heat pumps" : "#ff9768"
"heat pumps" : "#ff9768"
"heat pump" : "#ff9768"
"air heat pump" : "#ffbea0"
"ground heat pump" : "#ff7a3d"

View File

@ -13,11 +13,11 @@ scenario:
clusters: [5]
opts: [Co2L-24H]
countries: ['DE']
countries: ['DE']
snapshots:
start: "2013-03-01"
end: "2014-04-01"
end: "2013-04-01"
closed: 'left' # end is not inclusive
enable:
@ -202,7 +202,7 @@ plotting:
storage_techs: ["hydro+PHS", "battery", "H2"]
load_carriers: ["AC load"]
AC_carriers: ["AC line", "AC transformer"]
link_carriers: ["DC line", "Converter AC-DC"]
link_carriers: ["DC line", "Converter AC-DC"]
tech_colors:
"onwind" : "#235ebc"
"onshore wind" : "#235ebc"
@ -256,7 +256,7 @@ plotting:
"helmeth" : "#a31597"
"DAC" : "#d284ff"
"co2 stored" : "#e5e5e5"
"CO2 sequestration" : "#e5e5e5"
"CO2 sequestration" : "#e5e5e5"
"battery" : "#b8ea04"
"battery storage" : "#b8ea04"
"Li ion" : "#b8ea04"
@ -265,7 +265,7 @@ plotting:
"transport fuel cell" : "#e884be"
"retrofitting" : "#e0d6a8"
"building retrofitting" : "#e0d6a8"
"heat pumps" : "#ff9768"
"heat pumps" : "#ff9768"
"heat pump" : "#ff9768"
"air heat pump" : "#ffbea0"
"ground heat pump" : "#ff7a3d"
@ -317,4 +317,3 @@ plotting:
H2: "Hydrogen\nStorage"
lines: "Transmission\nlines"
ror: "Run of\nriver"

View File

@ -1,4 +1,6 @@
import pandas as pd
from pathlib import Path
def configure_logging(snakemake, skip_handlers=False):
"""
@ -25,11 +27,14 @@ def configure_logging(snakemake, skip_handlers=False):
kwargs.setdefault("level", "INFO")
if skip_handlers is False:
fallback_path = Path(__file__).parent.joinpath('..', 'logs', f"{snakemake.rule}.log")
logfile = snakemake.log.get('python', snakemake.log[0] if snakemake.log
else fallback_path)
kwargs.update(
{'handlers': [
# Prefer the 'python' log, otherwise take the first log for each
# Snakemake rule
logging.FileHandler(snakemake.log.get('python', snakemake.log[0] if snakemake.log else f"logs/{snakemake.rule}.log")),
logging.FileHandler(logfile),
logging.StreamHandler()
]
})
@ -64,9 +69,6 @@ def load_network(fn, tech_costs, config, combine_hydro_ps=True):
# bus_carrier = n.storage_units.bus.map(n.buses.carrier)
# n.storage_units.loc[bus_carrier == "heat","carrier"] = "water tanks"
for name in opts['heat_links'] + opts['heat_generators']:
n.links.loc[n.links.index.to_series().str.endswith(name), "carrier"] = name
Nyears = n.snapshot_weightings.sum()/8760.
costs = load_costs(Nyears, tech_costs, config['costs'], config['electricity'])
update_transmission_costs(n, costs)
@ -105,7 +107,7 @@ def aggregate_p_curtailed(n):
def aggregate_costs(n, flatten=False, opts=None, existing_only=False):
from six import iterkeys, itervalues
components = dict(Link=("p_nom", "p0"),
Generator=("p_nom", "p"),
StorageUnit=("p_nom", "p"),
@ -150,3 +152,56 @@ def progress_retrieve(url, file):
urllib.request.urlretrieve(url, file, reporthook=dlProgress)
def mock_snakemake(rulename, **wildcards):
"""
This function is expected to be executed from the 'scripts'-directory of
the snakemake project. It returns a snakemake.script.Snakemake object,
based on the Snakefile.
If a rule has wildcards, you have to specify them in **wildcards.
Parameters
----------
rulename: str
name of the rule for which the snakemake object should be generated
**wildcards:
keyword arguments fixing the wildcards. Only necessary if wildcards are
needed.
"""
import snakemake as sm
import os
from pypsa.descriptors import Dict
from snakemake.script import Snakemake
script_dir = Path(__file__).parent.resolve()
assert Path.cwd().resolve() == script_dir, \
f'mock_snakemake has to be run from the repository scripts directory {script_dir}'
os.chdir(script_dir.parent)
for p in sm.SNAKEFILE_CHOICES:
if os.path.exists(p):
snakefile = p
break
workflow = sm.Workflow(snakefile)
workflow.include(snakefile)
workflow.global_resources = {}
rule = workflow.get_rule(rulename)
dag = sm.dag.DAG(workflow, rules=[rule])
wc = Dict(wildcards)
job = sm.jobs.Job(rule, dag, wc)
def make_accessable(*ios):
for io in ios:
for i in range(len(io)):
io[i] = os.path.abspath(io[i])
make_accessable(job.input, job.output, job.log)
snakemake = Snakemake(job.input, job.output, job.params, job.wildcards,
job.threads, job.resources, job.log,
job.dag.workflow.config, job.rule.name, None,)
# create log and output dir if not existent
for path in list(snakemake.log) + list(snakemake.output):
Path(path).parent.mkdir(parents=True, exist_ok=True)
os.chdir(script_dir)
return snakemake
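
To summarize the intended workflow (an illustration, not part of the diff): from within the scripts directory, a rule's Snakemake object can now be mocked directly from the Snakefile. This is the pattern every script below adopts in its __main__ guard; rule name and wildcard values here are examples.

# Minimal usage sketch, run e.g. from an interactive session in scripts/.
if 'snakemake' not in globals():
    from _helpers import mock_snakemake
    snakemake = mock_snakemake('cluster_network', network='elec',
                               simpl='', clusters='5')
# snakemake.input, snakemake.output, snakemake.wildcards and snakemake.config
# are then filled exactly as if Snakemake had invoked the rule itself.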

View File

@ -507,23 +507,9 @@ def add_nice_carrier_names(n, config=None):
if __name__ == "__main__":
# Detect running outside of snakemake and mock snakemake for testing
if 'snakemake' not in globals():
from vresutils.snakemake import MockSnakemake, Dict
snakemake = MockSnakemake(output=['networks/elec.nc'])
snakemake.input = snakemake.expand(
Dict(base_network='networks/base.nc',
tech_costs='data/costs.csv',
regions="resources/regions_onshore.geojson",
powerplants="resources/powerplants.csv",
hydro_capacities='data/bundle/hydro_capacities.csv',
opsd_load='data/bundle/time_series_60min_singleindex_filtered.csv',
nuts3_shapes='resources/nuts3_shapes.geojson',
**{'profile_' + t: "resources/profile_" + t + ".nc"
for t in snakemake.config['renewable']})
)
from _helpers import mock_snakemake
snakemake = mock_snakemake('add_electricity')
configure_logging(snakemake)
n = pypsa.Network(snakemake.input.base_network)

View File

@ -173,15 +173,10 @@ def attach_hydrogen_pipelines(n, costs):
carrier="H2 pipeline")
if __name__ == "__main__":
# Detect running outside of snakemake and mock snakemake for testing
if 'snakemake' not in globals():
from vresutils.snakemake import MockSnakemake, Dict
snakemake = MockSnakemake(output=['networks/elec_s_5_ec.nc'])
snakemake.input = snakemake.expand(
Dict(network='networks/elec_s_5.nc',
tech_costs='data/costs.csv'))
from _helpers import mock_snakemake
snakemake = mock_snakemake('add_extra_components', network='elec',
simpl='', clusters=5)
configure_logging(snakemake)
n = pypsa.Network(snakemake.input.network)

View File

@ -548,28 +548,9 @@ def base_network():
return n
if __name__ == "__main__":
# Detect running outside of snakemake and mock snakemake for testing
if 'snakemake' not in globals():
from vresutils.snakemake import MockSnakemake, Dict
snakemake = MockSnakemake(
path='..',
wildcards={},
input=Dict(
eg_buses='data/entsoegridkit/buses.csv',
eg_lines='data/entsoegridkit/lines.csv',
eg_links='data/entsoegridkit/links.csv',
eg_converters='data/entsoegridkit/converters.csv',
eg_transformers='data/entsoegridkit/transformers.csv',
parameter_corrections='data/parameter_corrections.yaml',
links_p_nom='data/links_p_nom.csv',
links_tyndp='data/links_tyndp.csv',
country_shapes='resources/country_shapes.geojson',
offshore_shapes='resources/offshore_shapes.geojson',
europe_shape='resources/europe_shape.geojson'
),
output = ['networks/base.nc']
)
from _helpers import mock_snakemake
snakemake = mock_snakemake('base_network')
configure_logging(snakemake)
n = base_network()

View File

@ -51,6 +51,9 @@ import geopandas as gpd
import pypsa
if __name__ == "__main__":
if 'snakemake' not in globals():
from _helpers import mock_snakemake
snakemake = mock_snakemake('build_bus_regions')
configure_logging(snakemake)
countries = snakemake.config['countries']

View File

@ -112,8 +112,8 @@ def plot_area_solar(area, p_area, countries):
d.plot.bar(ax=ax, legend=False, align='edge', width=1.)
# ax.set_ylabel(f"Potential {c} / GW")
ax.set_title(c)
ax.legend()
ax.set_xlabel("Full-load hours")
ax.legend()
ax.set_xlabel("Full-load hours")
fig.savefig(snakemake.output.plot, transparent=True, bbox_inches='tight')
@ -147,38 +147,13 @@ def build_aggregate(flh, countries, areamatrix, breaks, p_area, fn):
agg.to_csv(fn)
if __name__ == '__main__':
# Detect running outside of snakemake and mock snakemake for testing
if 'snakemake' not in globals():
from vresutils.snakemake import MockSnakemake, Dict
snakemake = MockSnakemake(
wildcards=Dict(technology='solar'),
input=Dict(
base_network="networks/base.nc",
corine="data/bundle/corine/g250_clc06_V18_5.tif",
natura="resources/natura.tiff",
gebco="data/bundle/GEBCO_2014_2D.nc",
country_shapes='resources/country_shapes.geojson',
offshore_shapes='resources/offshore_shapes.geojson',
pietzker="data/pietzker2014.xlsx"
),
output=Dict(
area="resources/country_flh_area_{technology}.csv",
aggregated="resources/country_flh_aggregated_{technology}.csv",
uncorrected="resources/country_flh_uncorrected_{technology}.csv",
plot="resources/country_flh_{technology}.pdf",
exclusion="resources/country_exclusion_{technology}"
)
)
snakemake.input['regions'] = os.path.join(snakemake.path, "resources",
"country_shapes.geojson"
if snakemake.wildcards.technology in ('onwind', 'solar')
else "offshore_shapes.geojson")
snakemake.input['cutout'] = os.path.join(snakemake.path, "cutouts",
snakemake.config["renewable"][snakemake.wildcards.technology]['cutout'])
from _helpers import mock_snakemake
snakemake = mock_snakemake('build_country_flh', technology='solar')
configure_logging(snakemake)
pgb.streams.wrap_stderr()
configure_logging(snakemake)
config = snakemake.config['renewable'][snakemake.wildcards.technology]
@ -198,7 +173,7 @@ if __name__ == '__main__':
# Use GLAES to compute available potentials and the transition matrix
paths = dict(snakemake.input)
init_globals(bounds, dx, dy, config, paths)
init_globals(bounds.xXyY, dx, dy, config, paths)
regions = gk.vector.extractFeatures(paths["regions"], onlyAttr=True)
countries = pd.Index(regions["name"], name="country")

View File

@ -20,7 +20,7 @@ Relevant Settings
cutouts:
{cutout}:
.. seealso::
.. seealso::
Documentation of the configuration file ``config.yaml`` at
:ref:`atlite_cf`
@ -74,11 +74,11 @@ Outputs
=================== ========== ========== =========================================================
.. image:: ../img/era5.png
:scale: 40 %
:scale: 40 %
A **SARAH-2 cutout** can be used to amend the fields ``temperature``, ``influx_toa``, ``influx_direct``, ``albedo``,
``influx_diffuse`` of ERA5 using satellite-based radiation observations.
.. image:: ../img/sarah.png
:scale: 40 %
@ -95,6 +95,9 @@ import os
import atlite
if __name__ == "__main__":
if 'snakemake' not in globals():
from _helpers import mock_snakemake
snakemake = mock_snakemake('build_cutout', cutout='europe-2013-era5')
configure_logging(snakemake)
cutout_params = snakemake.config['atlite']['cutouts'][snakemake.wildcards.cutout]

View File

@ -64,8 +64,11 @@ import geopandas as gpd
from vresutils import hydro as vhydro
if __name__ == "__main__":
if 'snakemake' not in globals():
from _helpers import mock_snakemake
snakemake = mock_snakemake('build_hydro_profile')
configure_logging(snakemake)
config = snakemake.config['renewable']['hydro']
cutout = atlite.Cutout(config['cutout'],
cutout_dir=os.path.dirname(snakemake.input.cutout))

View File

@ -36,24 +36,29 @@ Description
"""
import logging
logger = logging.getLogger(__name__)
from _helpers import configure_logging
import numpy as np
import atlite
import geokit as gk
from pathlib import Path
logger = logging.getLogger(__name__)
def determine_cutout_xXyY(cutout_name):
cutout = atlite.Cutout(cutout_name, cutout_dir="cutouts")
cutout = atlite.Cutout(cutout_name, cutout_dir=cutout_dir)
x, X, y, Y = cutout.extent
dx = (X - x) / (cutout.shape[1] - 1)
dy = (Y - y) / (cutout.shape[0] - 1)
return [x - dx/2., X + dx/2., y - dy/2., Y + dy/2.]
if __name__ == "__main__":
if 'snakemake' not in globals():
from _helpers import mock_snakemake
snakemake = mock_snakemake('build_natura_raster') #has to be enabled
configure_logging(snakemake)
cutout_names = np.unique([res['cutout'] for res in snakemake.config['renewable'].values()])
cutout_dir = Path(snakemake.input.cutouts[0]).parent.resolve()
cutout_names = {res['cutout'] for res in snakemake.config['renewable'].values()}
xs, Xs, ys, Ys = zip(*(determine_cutout_xXyY(cutout) for cutout in cutout_names))
xXyY = min(xs), max(Xs), min(ys), max(Ys)
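
The extent handling above boils down to padding each cutout by half a grid cell (determine_cutout_xXyY) and taking the union bounding box over all cutouts; a small sketch with made-up extents (not part of the diff):

# Illustrative only: union bounding box over padded cutout extents (x, X, y, Y).
extents = [(-12.15, 35.15, 33.85, 72.15),   # hypothetical cutout A
           (  4.85, 15.15, 46.85, 55.15)]   # hypothetical cutout B
xs, Xs, ys, Ys = zip(*extents)
xXyY = min(xs), max(Xs), min(ys), max(Ys)
print(xXyY)  # (-12.15, 35.15, 33.85, 72.15)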

View File

@ -91,14 +91,8 @@ def add_custom_powerplants(ppl):
if __name__ == "__main__":
if 'snakemake' not in globals():
from vresutils.snakemake import MockSnakemake, Dict
snakemake = MockSnakemake(
input=Dict(base_network='networks/base.nc',
custom_powerplants='data/custom_powerplants.csv'),
output=['resources/powerplants.csv']
)
from _helpers import mock_snakemake
snakemake = mock_snakemake('build_powerplants')
configure_logging(snakemake)
n = pypsa.Network(snakemake.input.base_network)

View File

@ -35,7 +35,7 @@ Relevant settings
clip_p_max_pu:
resource:
.. seealso::
.. seealso::
Documentation of the configuration file ``config.yaml`` at
:ref:`snapshots_cf`, :ref:`atlite_cf`, :ref:`renewable_cf`
@ -91,24 +91,24 @@ Outputs
.. image:: ../img/profile_ts.png
:scale: 33 %
- **p_nom_max**
.. image:: ../img/p_nom_max_hist.png
:scale: 33 %
- **potential**
.. image:: ../img/potential_heatmap.png
:scale: 33 %
- **average_distance**
.. image:: ../img/distance_hist.png
:scale: 33 %
- **underwater_fraction**
.. image:: ../img/underwater_hist.png
:scale: 33 %
@ -239,10 +239,13 @@ def calculate_potential(gid, save_map=None):
if __name__ == '__main__':
pgb.streams.wrap_stderr()
if 'snakemake' not in globals():
from _helpers import mock_snakemake
snakemake = mock_snakemake('build_renewable_profiles', technology='solar')
configure_logging(snakemake)
pgb.streams.wrap_stderr()
config = snakemake.config['renewable'][snakemake.wildcards.technology]
time = pd.date_range(freq='m', **snakemake.config['snapshots'])

View File

@ -64,7 +64,6 @@ Description
"""
import logging
logger = logging.getLogger(__name__)
from _helpers import configure_logging
import os
@ -77,9 +76,11 @@ import pandas as pd
import geopandas as gpd
from shapely.geometry import MultiPolygon, Polygon
from shapely.ops import cascaded_union
import pycountry as pyc
logger = logging.getLogger(__name__)
def _get_country(target, **keys):
assert len(keys) == 1
try:
@ -202,29 +203,9 @@ def save_to_geojson(df, fn):
df.to_file(fn, driver='GeoJSON', schema=schema)
if __name__ == "__main__":
# Detect running outside of snakemake and mock snakemake for testing
if 'snakemake' not in globals():
from vresutils.snakemake import MockSnakemake, Dict
snakemake = MockSnakemake(
path='..',
wildcards={},
input=Dict(
naturalearth='data/bundle/naturalearth/ne_10m_admin_0_countries.shp',
eez='data/bundle/eez/World_EEZ_v8_2014.shp',
nuts3='data/bundle/NUTS_2013_60M_SH/data/NUTS_RG_60M_2013.shp',
nuts3pop='data/bundle/nama_10r_3popgdp.tsv.gz',
nuts3gdp='data/bundle/nama_10r_3gdp.tsv.gz',
ch_cantons='data/bundle/ch_cantons.csv',
ch_popgdp='data/bundle/je-e-21.03.02.xls'
),
output=Dict(
country_shapes='resources/country_shapes.geojson',
offshore_shapes='resource/offshore_shapes.geojson',
europe_shape='resources/europe_shape.geojson',
nuts3_shapes='resources/nuts3_shapes.geojson'
)
)
from _helpers import mock_snakemake
snakemake = mock_snakemake('build_shapes')
configure_logging(snakemake)
country_shapes = countries()

View File

@ -20,7 +20,7 @@ Relevant Settings
lines:
length_factor:
.. seealso::
.. seealso::
Documentation of the configuration file ``config.yaml`` at
:ref:`toplevel_cf`, :ref:`renewable_cf`, :ref:`solving_cf`, :ref:`lines_cf`
@ -46,7 +46,7 @@ Outputs
:scale: 33 %
- ``resources/clustermaps_{network}_s{simpl}_{clusters}.h5``: Mapping of buses and lines from ``networks/elec_s{simpl}.nc`` to ``networks/elec_s{simpl}_{clusters}.nc``; has keys ['/busmap', '/busmap_s', '/linemap', '/linemap_negative', '/linemap_positive']
- ``networks/{network}_s{simpl}_{clusters}.nc``:
- ``networks/{network}_s{simpl}_{clusters}.nc``:
.. image:: ../img/elec_s_X.png
:scale: 40 %
@ -57,7 +57,7 @@ Description
.. note::
**Why is clustering used both in** ``simplify_network`` **and** ``cluster_network`` **?**
Consider for example a network ``networks/elec_s100_50.nc`` in which
``simplify_network`` clusters the network to 100 buses and in a second
step ``cluster_network``` reduces it down to 50 buses.
@ -86,7 +86,7 @@ Description
.. tip::
The rule :mod:`cluster_all_networks` runs
for all ``scenario`` s in the configuration file
for all ``scenario`` s in the configuration file
the rule :mod:`cluster_network`.
"""
@ -168,7 +168,7 @@ def distribute_clusters(n, n_clusters, focus_weights=None, solver_name=None):
total_focus = sum(list(focus_weights.values()))
assert total_focus <= 1.0, "The sum of focus weights must be less than or equal to 1."
for country, weight in focus_weights.items():
L[country] = weight / len(L[country])
@ -292,27 +292,9 @@ def cluster_regions(busmaps, input=None, output=None):
save_to_geojson(regions_c, getattr(output, which))
if __name__ == "__main__":
# Detect running outside of snakemake and mock snakemake for testing
if 'snakemake' not in globals():
from vresutils.snakemake import MockSnakemake, Dict
snakemake = MockSnakemake(
wildcards=Dict(network='elec', simpl='', clusters='45'),
input=Dict(
network='networks/{network}_s{simpl}.nc',
regions_onshore='resources/regions_onshore_{network}_s{simpl}.geojson',
regions_offshore='resources/regions_offshore_{network}_s{simpl}.geojson',
clustermaps='resources/clustermaps_{network}_s{simpl}.h5',
tech_costs='data/costs.csv',
),
output=Dict(
network='networks/{network}_s{simpl}_{clusters}.nc',
regions_onshore='resources/regions_onshore_{network}_s{simpl}_{clusters}.geojson',
regions_offshore='resources/regions_offshore_{network}_s{simpl}_{clusters}.geojson',
clustermaps='resources/clustermaps_{network}_s{simpl}_{clusters}.h5'
)
)
from _helpers import mock_snakemake
snakemake = mock_snakemake('cluster_network', network='elec', simpl='', clusters='5')
configure_logging(snakemake)
n = pypsa.Network(snakemake.input.network)

View File

@ -191,7 +191,7 @@ def calculate_supply(n,label,supply):
items = c.df.index[c.df.bus.map(bus_map)]
if len(items) == 0:
if len(items) == 0 or c.pnl.p.empty:
continue
s = c.pnl.p[items].max().multiply(c.df.loc[items,'sign']).groupby(c.df.loc[items,'carrier']).sum()
@ -209,7 +209,7 @@ def calculate_supply(n,label,supply):
items = c.df.index[c.df["bus" + end].map(bus_map)]
if len(items) == 0:
if len(items) == 0 or c.pnl["p"+end].empty:
continue
#lots of sign compensation for direction and to do maximums
@ -237,7 +237,7 @@ def calculate_supply_energy(n,label,supply_energy):
items = c.df.index[c.df.bus.map(bus_map)]
if len(items) == 0:
if len(items) == 0 or c.pnl.p.empty:
continue
s = c.pnl.p[items].sum().multiply(c.df.loc[items,'sign']).groupby(c.df.loc[items,'carrier']).sum()
@ -255,7 +255,7 @@ def calculate_supply_energy(n,label,supply_energy):
items = c.df.index[c.df["bus" + end].map(bus_map)]
if len(items) == 0:
if len(items) == 0 or c.pnl['p' + end].empty:
continue
s = (-1)*c.pnl["p"+end][items].sum().groupby(c.df.loc[items,'carrier']).sum()
@ -471,7 +471,16 @@ def to_csv(dfs):
if __name__ == "__main__":
# Detect running outside of snakemake and mock snakemake for testing
if 'snakemake' not in globals():
from _helpers import mock_snakemake
snakemake = mock_snakemake('make_summary', network='elec', simpl='',
clusters='5', ll='copt', opts='Co2L-24H', country='all')
network_dir = os.path.join('..', 'results', 'networks')
else:
network_dir = os.path.join('results', 'networks')
configure_logging(snakemake)
def expand_from_wildcard(key):
w = getattr(snakemake.wildcards, key)
return snakemake.config["scenario"][key] if w == "all" else [w]
@ -483,14 +492,9 @@ if __name__ == "__main__":
else:
ll = [snakemake.wildcards.ll]
configure_logging(snakemake)
networks_dict = {(simpl,clusters,l,opts) : ('results/networks/{network}_s{simpl}_{clusters}_ec_l{ll}_{opts}.nc'
.format(network=snakemake.wildcards.network,
simpl=simpl,
clusters=clusters,
opts=opts,
ll=l))
networks_dict = {(simpl,clusters,l,opts) :
os.path.join(network_dir, f'{snakemake.wildcards.network}_s{simpl}_'
f'{clusters}_ec_l{l}_{opts}.nc')
for simpl in expand_from_wildcard("simpl")
for clusters in expand_from_wildcard("clusters")
for l in ll

View File

@ -17,7 +17,8 @@ Description
import logging
logger = logging.getLogger(__name__)
from _helpers import load_network, aggregate_p, aggregate_costs, configure_logging
from _helpers import (load_network, aggregate_p, aggregate_costs,
configure_logging)
import pandas as pd
import numpy as np
@ -73,7 +74,7 @@ def plot_map(n, ax=None, attribute='p_nom', opts={}):
## DATA
line_colors = {'cur': "purple",
'exp': to_rgba("red", 0.7)}
'exp': mpl.colors.rgb2hex(to_rgba("red", 0.7), True)}
tech_colors = opts['tech_colors']
if attribute == 'p_nom':
@ -188,7 +189,7 @@ def plot_total_energy_pie(n, ax=None):
labels = e_primary.rename(opts['nice_names_n']).index,
autopct='%.0f%%',
shadow=False,
colors = [tech_colors[tech] for tech in e_primary.index])
colors = [opts['tech_colors'][tech] for tech in e_primary.index])
for t1, t2, i in zip(texts, autotexts, e_primary.index):
if e_primary.at[i] < 0.04 * e_primary.sum():
t1.remove()
@ -200,6 +201,7 @@ def plot_total_cost_bar(n, ax=None):
ax = plt.gca()
total_load = (n.snapshot_weightings * n.loads_t.p.sum(axis=1)).sum()
tech_colors = opts['tech_colors']
def split_costs(n):
costs = aggregate_costs(n).reset_index(level=0, drop=True)
@ -210,13 +212,15 @@ def plot_total_cost_bar(n, ax=None):
costs, costs_cap_ex, costs_cap_new, costs_marg = split_costs(n)
costs_graph = pd.DataFrame(dict(a=costs.drop('load', errors='ignore')),
index=['AC-AC', 'AC line', 'onwind', 'offwind-ac', 'offwind-dc', 'solar', 'OCGT','CCGT', 'battery', 'H2']).dropna()
index=['AC-AC', 'AC line', 'onwind', 'offwind-ac',
'offwind-dc', 'solar', 'OCGT','CCGT', 'battery', 'H2']).dropna()
bottom = np.array([0., 0.])
texts = []
for i,ind in enumerate(costs_graph.index):
data = np.asarray(costs_graph.loc[ind])/total_load
ax.bar([0.5], data, bottom=bottom, color=tech_colors[ind], width=0.7, zorder=-1)
ax.bar([0.5], data, bottom=bottom, color=tech_colors[ind],
width=0.7, zorder=-1)
bottom_sub = bottom
bottom = bottom+data
@ -236,8 +240,8 @@ def plot_total_cost_bar(n, ax=None):
texts.append(text)
ax.set_ylabel("Average system cost [Eur/MWh]")
ax.set_ylim([0,80]) # opts['costs_max']])
ax.set_xlim([0,1])
ax.set_ylim([0, 80]) # opts['costs_max']])
ax.set_xlim([0, 1])
#ax.set_xticks([0.5])
ax.set_xticklabels([]) #["w/o\nEp", "w/\nEp"])
ax.grid(True, axis="y", color='k', linestyle='dotted')
@ -245,21 +249,12 @@ def plot_total_cost_bar(n, ax=None):
if __name__ == "__main__":
if 'snakemake' not in globals():
from vresutils.snakemake import MockSnakemake, Dict
from snakemake.rules import expand
snakemake = Dict()
snakemake = MockSnakemake(
path='..',
wildcards=dict(network='elec', simpl='', clusters='90', lv='1.25', opts='Co2L-3H', attr='p_nom', ext="pdf"),
input=dict(network="results/networks/{network}_s{simpl}_{clusters}_lv{lv}_{opts}.nc",
tech_costs="data/costs.csv"),
output=dict(only_map="results/plots/{network}_s{simpl}_{clusters}_lv{lv}_{opts}_{attr}.{ext}",
ext="results/plots/{network}_s{simpl}_{clusters}_lv{lv}_{opts}_{attr}_ext.{ext}")
)
from _helpers import mock_snakemake
snakemake = mock_snakemake('plot_network', network='elec', simpl='',
clusters='5', ll='copt', opts='Co2L-24H',
attr='p_nom', ext="pdf")
configure_logging(snakemake)
set_plot_style()
opts = snakemake.config['plotting']
@ -273,8 +268,7 @@ if __name__ == "__main__":
fig, ax = plt.subplots(figsize=map_figsize, subplot_kw={"projection": ccrs.PlateCarree()})
plot_map(n, ax, snakemake.wildcards.attr, opts)
fig.savefig(snakemake.output.only_map, dpi=150,
bbox_inches='tight', bbox_extra_artists=[l1,l2,l3])
fig.savefig(snakemake.output.only_map, dpi=150, bbox_inches='tight')
ax1 = fig.add_axes([-0.115, 0.625, 0.2, 0.2])
plot_total_energy_pie(n, ax1)
@ -292,5 +286,4 @@ if __name__ == "__main__":
fig.suptitle('Expansion to {amount} {label} at {clusters} clusters'
.format(amount=amnt, label=lbl, clusters=snakemake.wildcards.clusters))
fig.savefig(snakemake.output.ext, transparent=True,
bbox_inches='tight', bbox_extra_artists=[l1, l2, l3, ax1, ax2])
fig.savefig(snakemake.output.ext, transparent=True, bbox_inches='tight')

View File

@ -22,6 +22,7 @@ import pypsa
import pandas as pd
import matplotlib.pyplot as plt
import logging
def cum_p_nom_max(net, tech, country=None):
carrier_b = net.generators.carrier == tech
@ -42,31 +43,17 @@ def cum_p_nom_max(net, tech, country=None):
if __name__ == "__main__":
# Detect running outside of snakemake and mock snakemake for testing
if 'snakemake' not in globals():
from vresutils.snakemake import MockSnakemake, Dict
snakemake = MockSnakemake(
path='..',
wildcards={'clusters': '45,90,181,full',
'country': 'all'},
params=dict(techs=['onwind', 'offwind-ac', 'offwind-dc', 'solar']),
input=Dict(
**{
'full': 'networks/elec_s.nc',
'45': 'networks/elec_s_45.nc',
'90': 'networks/elec_s_90.nc',
'181': 'networks/elec_s_181.nc',
}
),
output=['results/plots/cum_p_nom_max_{clusters}_{country}.pdf']
)
from _helpers import mock_snakemake
snakemake = mock_snakemake('plot_p_nom_max', network='elec', simpl='',
techs='solar,onwind,offwind-dc', ext='png',
clusts= '5,full', country= 'all')
configure_logging(snakemake)
plot_kwds = dict(drawstyle="steps-post")
clusters = snakemake.wildcards.clusters.split(',')
techs = snakemake.params.techs
clusters = snakemake.wildcards.clusts.split(',')
techs = snakemake.wildcards.techs.split(',')
country = snakemake.wildcards.country
if country == 'all':
country = None
@ -75,14 +62,15 @@ if __name__ == "__main__":
fig, axes = plt.subplots(1, len(techs))
for cluster in clusters:
net = pypsa.Network(getattr(snakemake.input, cluster))
for j, cluster in enumerate(clusters):
net = pypsa.Network(snakemake.input[j])
for i, tech in enumerate(techs):
cum_p_nom_max(net, tech, country).plot(x="p_max_pu", y="c_p_nom_max", label=cluster, ax=axes[0][i], **plot_kwds)
cum_p_nom_max(net, tech, country).plot(x="p_max_pu", y="cum_p_nom_max",
label=cluster, ax=axes[i], **plot_kwds)
for i, tech in enumerate(techs):
ax = axes[0][i]
ax = axes[i]
ax.set_xlabel(f"Capacity factor of {tech}")
ax.set_ylabel("Cumulative installable capacity / TW")

View File

@ -18,6 +18,7 @@ Description
import os
import logging
logger = logging.getLogger(__name__)
from _helpers import configure_logging
import pandas as pd
import matplotlib.pyplot as plt
@ -181,7 +182,11 @@ def plot_energy(infn, fn=None):
if __name__ == "__main__":
if 'snakemake' not in globals():
from _helpers import mock_snakemake
snakemake = mock_snakemake('plot_summary', summary='energy', network='elec',
simpl='', clusters=5, ll='copt', opts='Co2L-24H',
attr='', ext='png', country='all')
configure_logging(snakemake)
summary = snakemake.wildcards.summary

View File

@ -38,7 +38,9 @@ from _helpers import configure_logging
import pandas as pd
if __name__ == "__main__":
if 'snakemake' not in globals():
from _helpers import mock_snakemake #rule must be enabled in config
snakemake = mock_snakemake('prepare_links_p_nom', simpl='', network='elec')
configure_logging(snakemake)
links_p_nom = pd.read_html('https://en.wikipedia.org/wiki/List_of_HVDC_projects', header=0, match="SwePol")[0]

View File

@ -178,15 +178,10 @@ def average_every_nhours(n, offset):
if __name__ == "__main__":
# Detect running outside of snakemake and mock snakemake for testing
if 'snakemake' not in globals():
from vresutils.snakemake import MockSnakemake
snakemake = MockSnakemake(
wildcards=dict(network='elec', simpl='', clusters='37', ll='v2', opts='Co2L-3H'),
input=['networks/{network}_s{simpl}_{clusters}.nc'],
output=['networks/{network}_s{simpl}_{clusters}_ec_l{ll}_{opts}.nc']
)
from _helpers import mock_snakemake
snakemake = mock_snakemake('prepare_network', network='elec', simpl='',
clusters='5', ll='copt', opts='Co2L-24H')
configure_logging(snakemake)
opts = snakemake.wildcards.opts.split('-')

View File

@ -45,6 +45,12 @@ import tarfile
from _helpers import progress_retrieve, configure_logging
if __name__ == "__main__":
if 'snakemake' not in globals():
from _helpers import mock_snakemake
snakemake = mock_snakemake('retrieve_cutout')
rootpath = '..'
else:
rootpath = '.'
configure_logging(snakemake) # TODO Make logging compatible with progressbar (see PR #102)
@ -54,13 +60,13 @@ if __name__ == "__main__":
url = "https://zenodo.org/record/3517949/files/pypsa-eur-cutouts.tar.xz"
# Save location
tarball_fn = Path("./cutouts.tar.xz")
tarball_fn = Path(f"{rootpath}/cutouts.tar.xz")
logger.info(f"Downloading cutouts from '{url}'.")
progress_retrieve(url, tarball_fn)
logger.info(f"Extracting cutouts.")
tarfile.open(tarball_fn).extractall()
tarfile.open(tarball_fn).extractall(path=rootpath)
tarball_fn.unlink()

View File

@ -36,7 +36,13 @@ from pathlib import Path
import tarfile
if __name__ == "__main__":
# Detect running outside of snakemake and mock snakemake for testing
if 'snakemake' not in globals():
from _helpers import mock_snakemake
snakemake = mock_snakemake('retrieve_databundle')
rootpath = '..'
else:
rootpath = '.'
configure_logging(snakemake) # TODO Make logging compatible with progressbar (see PR #102)
if snakemake.config['tutorial']:
@ -45,15 +51,15 @@ if __name__ == "__main__":
url = "https://zenodo.org/record/3517935/files/pypsa-eur-data-bundle.tar.xz"
# Save locations
tarball_fn = Path("./bundle.tar.xz")
to_fn = Path("./data")
tarball_fn = Path(f"{rootpath}/bundle.tar.xz")
to_fn = Path(f"{rootpath}/data")
logger.info(f"Downloading databundle from '{url}'.")
progress_retrieve(url, tarball_fn)
logger.info(f"Extracting databundle.")
tarfile.open(tarball_fn).extractall(to_fn)
tarball_fn.unlink()
logger.info(f"Databundle available in '{to_fn}'.")
logger.info(f"Databundle available in '{to_fn}'.")

View File

@ -22,7 +22,7 @@ This rule, as a substitute for :mod:`build_natura_raster`, downloads an already
- ``resources/natura.tiff``: Rasterized version of `Natura 2000 <https://en.wikipedia.org/wiki/Natura_2000>`_ natural protection areas to reduce computation times.
.. seealso::
For details see :mod:`build_natura_raster`.
For details see :mod:`build_natura_raster`.
"""
@ -33,16 +33,14 @@ from pathlib import Path
from _helpers import progress_retrieve, configure_logging
if __name__ == "__main__":
if 'snakemake' not in globals():
from _helpers import mock_snakemake
snakemake = mock_snakemake('retrieve_natura_raster')
configure_logging(snakemake) # TODO Make logging compatible with progressbar (see PR #102)
# Save location, ensure folder existence
to_fn = Path("resources/natura.tiff")
to_fn.parent.mkdir(parents=True, exist_ok=True)
url = "https://zenodo.org/record/3518215/files/natura.tiff"
logger.info(f"Downloading natura raster from '{url}'.")
progress_retrieve(url, to_fn)
logger.info(f"Natura raster available as '{to_fn}'.")
progress_retrieve(url, snakemake.output[0])
logger.info(f"Natura raster available as '{snakemake.output[0]}'.")

View File

@ -331,26 +331,9 @@ def cluster(n, n_clusters):
return clustering.network, clustering.busmap
if __name__ == "__main__":
# Detect running outside of snakemake and mock snakemake for testing
if 'snakemake' not in globals():
from vresutils.snakemake import MockSnakemake, Dict
snakemake = MockSnakemake(
path='..',
wildcards=Dict(simpl='1024', network='elec'),
input=Dict(
network='networks/{network}.nc',
tech_costs="data/costs.csv",
regions_onshore="resources/regions_onshore.geojson",
regions_offshore="resources/regions_offshore.geojson"
),
output=Dict(
network='networks/{network}_s{simpl}.nc',
regions_onshore="resources/regions_onshore_{network}_s{simpl}.geojson",
regions_offshore="resources/regions_offshore_{network}_s{simpl}.geojson",
clustermaps='resources/clustermaps_{network}_s{simpl}.h5'
)
)
from _helpers import mock_snakemake
snakemake = mock_snakemake('simplify_network', simpl='', network='elec')
configure_logging(snakemake)
n = pypsa.Network(snakemake.input.network)

View File

@ -370,17 +370,10 @@ def solve_network(n, config=None, solver_log=None, opts=None, callback=None,
return n
if __name__ == "__main__":
# Detect running outside of snakemake and mock snakemake for testing
if 'snakemake' not in globals():
from vresutils.snakemake import MockSnakemake, Dict
snakemake = MockSnakemake(
wildcards=dict(network='elec', simpl='', clusters='45', lv='1.0', opts='Co2L-3H'),
input=["networks/{network}_s{simpl}_{clusters}_lv{lv}_{opts}.nc"],
output=["results/networks/s{simpl}_{clusters}_lv{lv}_{opts}.nc"],
log=dict(solver="logs/{network}_s{simpl}_{clusters}_lv{lv}_{opts}_solver.log",
python="logs/{network}_s{simpl}_{clusters}_lv{lv}_{opts}_python.log")
)
from _helpers import mock_snakemake
snakemake = mock_snakemake('solve_network', network='elec', simpl='',
clusters='5', ll='copt', opts='Co2L-24H')
configure_logging(snakemake)
tmpdir = snakemake.config['solving'].get('tmpdir')

View File

@ -21,7 +21,7 @@ Relevant Settings
name:
(solveroptions):
.. seealso::
.. seealso::
Documentation of the configuration file ``config.yaml`` at
:ref:`solving_cf`
@ -79,23 +79,16 @@ def set_parameters_from_optimized(n, n_optim):
return n
if __name__ == "__main__":
# Detect running outside of snakemake and mock snakemake for testing
if 'snakemake' not in globals():
from vresutils.snakemake import MockSnakemake
snakemake = MockSnakemake(
wildcards=dict(network='elec', simpl='', clusters='45', lv='1.5', opts='Co2L-3H'),
input=dict(unprepared="networks/{network}_s{simpl}_{clusters}.nc",
optimized="results/networks/{network}_s{simpl}_{clusters}_lv{lv}_{opts}.nc"),
output=["results/networks/{network}_s{simpl}_{clusters}_lv{lv}_{opts}_op.nc"],
log=dict(solver="logs/s{simpl}_{clusters}_lv{lv}_{opts}_op_solver.log",
python="logs/s{simpl}_{clusters}_lv{lv}_{opts}_op_python.log")
)
from _helpers import mock_snakemake
snakemake = mock_snakemake('solve_operations_network', network='elec',
simpl='', clusters='5', ll='copt', opts='Co2L-24H')
configure_logging(snakemake)
tmpdir = snakemake.config['solving'].get('tmpdir')
if tmpdir is not None:
patch_pyomo_tmpdir(tmpdir)
configure_logging(snakemake)
n = pypsa.Network(snakemake.input.unprepared)

View File

@ -49,21 +49,16 @@ from solve_network import patch_pyomo_tmpdir, prepare_network, solve_network
import pypsa
if __name__ == "__main__":
# Detect running outside of snakemake and mock snakemake for testing
if 'snakemake' not in globals():
from vresutils.snakemake import MockSnakemake
snakemake = MockSnakemake(
wildcards=dict(network='elec', simpl='', clusters='45', lv='1.25', opts='Co2L-3H'),
input=["networks/{network}_s{simpl}_{clusters}_lv{lv}_{opts}.nc"],
output=["results/networks/s{simpl}_{clusters}_lv{lv}_{opts}_trace.nc"],
log=dict(python="logs/{network}_s{simpl}_{clusters}_lv{lv}_{opts}_python_trace.log")
)
from _helpers import mock_snakemake
snakemake = mock_snakemake('trace_solve_network', network='elec', simpl='',
clusters='5', ll='copt', opts='Co2L-24H')
configure_logging(snakemake)
tmpdir = snakemake.config['solving'].get('tmpdir')
if tmpdir is not None:
patch_pyomo_tmpdir(tmpdir)
configure_logging(snakemake)
n = pypsa.Network(snakemake.input[0])

View File

@ -13,7 +13,7 @@ scenario:
clusters: [5]
opts: [Co2L-24H]
countries: ['DE']
countries: ['DE']
snapshots:
start: "2013-03-01"
@ -25,7 +25,7 @@ enable:
retrieve_databundle: true
build_cutout: false
build_natura_raster: false
electricity:
voltages: [220., 300., 380.]
co2limit: 100.e+6
@ -202,7 +202,7 @@ plotting:
storage_techs: ["hydro+PHS", "battery", "H2"]
load_carriers: ["AC load"]
AC_carriers: ["AC line", "AC transformer"]
link_carriers: ["DC line", "Converter AC-DC"]
link_carriers: ["DC line", "Converter AC-DC"]
tech_colors:
"onwind" : "#235ebc"
"onshore wind" : "#235ebc"
@ -256,7 +256,7 @@ plotting:
"helmeth" : "#a31597"
"DAC" : "#d284ff"
"co2 stored" : "#e5e5e5"
"CO2 sequestration" : "#e5e5e5"
"CO2 sequestration" : "#e5e5e5"
"battery" : "#b8ea04"
"battery storage" : "#b8ea04"
"Li ion" : "#b8ea04"
@ -265,7 +265,7 @@ plotting:
"transport fuel cell" : "#e884be"
"retrofitting" : "#e0d6a8"
"building retrofitting" : "#e0d6a8"
"heat pumps" : "#ff9768"
"heat pumps" : "#ff9768"
"heat pump" : "#ff9768"
"air heat pump" : "#ffbea0"
"ground heat pump" : "#ff7a3d"
@ -317,4 +317,3 @@ plotting:
H2: "Hydrogen\nStorage"
lines: "Transmission\nlines"
ror: "Run of\nriver"