New representation of time
Opened this issue · 8 comments
Currently, all `addto!` functions, which formulate the equations per device, call `T = axis(m, :snapshots)`, effectively requesting the snapshots coordinate from the NetCDF dataset as an `AxisArrays.Axis`. In this scheme, we can only support a single time dimension, and rolling horizon is effectively not possible without intercepting this axis call in the Data layer.
Wishlist
- Special handling of time dimensions as functions on a separate -- probably mutable -- object, which is either passed as an argument to `addto!` calls, or kept and updated on the `EnergyModel` object (I'm favoring the latter).
- We want to be able to model several investment periods, snapshots, different durations for snapshots (called weights in PyPSA), and maybe several groups of continuous snapshots with different weights per group.
Plan (WIP)
Snapshots
Typically hourly.
Works as in PyPSA. It would be good for time clustering methods if the duration of a single snapshot were variable. There was an argument that there should be weights separately from durations (?) in the context of storage units. Can you still remember, @fneum ?
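To make the variable-duration idea concrete, here is a minimal sketch (plain Python with illustrative names and numbers, not the actual EnergyModel API) of how per-snapshot durations would scale the operational cost term:

```python
# Hedged sketch: after time clustering, one snapshot may stand in for several
# hours, so each snapshot carries its own duration, which scales its
# contribution to the operational cost.
durations = [1.0, 1.0, 4.0]     # hours represented by each (clustered) snapshot
generation = [10.0, 12.0, 8.0]  # MW dispatched in each snapshot
marginal_cost = 5.0             # currency per MWh

operational_cost = sum(d * g * marginal_cost for d, g in zip(durations, generation))
print(operational_cost)  # 270.0
```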
Investment periods
For instance each year every five years.
Represented by a separate time coordinate `:periods` on which capacity values and capital costs can depend.
If that coordinate `:periods` is managed orthogonally, it's easy to represent changing costs while using weather data for just a single year, i.e. per snapshot. On the other hand, it's not possible to change the snapshot representation for later investment periods (for instance, PLEXOS allows reducing the number of representative days for later periods; then again, representative days are a difficult concept with seasonal storage units).
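To illustrate what an orthogonal `:periods` coordinate would buy (a sketch with made-up names and numbers, not the EnergyModel data model): capital cost varies only along periods, while one representative year of per-snapshot weather is reused in every period.

```python
# Hedged sketch: capital cost lives on the :periods coordinate, weather
# availability on the :snapshots coordinate of a single representative year.
capital_cost = [1000.0, 900.0, 800.0]    # currency/MW, one value per investment period
capacity = [50.0, 70.0, 70.0]            # MW installed per investment period
availability = [0.3, 0.8, 0.5, 0.1]      # per-snapshot wind availability (single year)

# Investment cost depends only on the :periods coordinate ...
investment_cost = sum(c * p for c, p in zip(capital_cost, capacity))

# ... while potential generation combines both coordinates into (period, snapshot)
potential = [[cap * a for a in availability] for cap in capacity]

print(investment_cost)                    # 169000.0
print(len(potential), len(potential[0]))  # 3 4
```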
@lisazeyen @FabianHofmann Can you elaborate how investment periods are represented in the PSA.jl pathway_optimization branch? What worked, what was the difficulty?
Groups of continuous snapshots (maybe later?)
The paper on modeling seasonal storage by Leander suggests it's possible to use a typical-day approach even with seasonal storage to represent an energy system with a small error margin, provided the selected days are adequately linked. I do not understand how everything fits together, but it effectively boils down to being able to group the operational time coordinate `:snapshots` into multiple continuous groups and link them together.
Can someone read that paper or similar descriptions and try to distill how to model this? @fneum , you were playing with Leander's `tsam` library before! Care to comment?
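My hedged reading of the linking idea (a sketch with illustrative names, not Leander's actual formulation or the `tsam` API): the horizon is split into consecutive groups, each assigned a typical day, and a coarse inter-group state of charge chains the typical days' net energy deltas together.

```python
# Hedged sketch: seasonal storage across linked typical days. Each real day is
# assigned a typical day; an inter-period state of charge chains together the
# net energy change of the assigned typical days (a real model would also
# bound the state of charge and track the intra-day profile).
typical_day_delta = {"winter": -4.0, "summer": 6.0}  # net MWh change per day type
assignment = ["winter", "winter", "summer", "summer", "winter"]

soc = [0.0]  # inter-period state of charge, starting empty
for day in assignment:
    soc.append(soc[-1] + typical_day_delta[day])

print(soc)  # [0.0, -4.0, -8.0, -2.0, 4.0, 0.0]
```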
--> Investment Periods
The Investment Periods (IP) in PSA.jl are not explicitly an extra dimension. Rather, there are investment points (a subset of snapshots) at which capacities of components can be extended or reduced. So basically the range of snapshots becomes very long (e.g. 10 years), but infrastructure investments are possible only at the IPs.
The variables for infrastructure capacity are then not the capacity of a generator itself but the expansion/reduction of capacity. Therefore the capacity of a generator at a specific IP is the initial capacity plus the cumulative sum of expansions minus reductions up to that IP.
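That accumulation can be sketched as follows (plain Python with illustrative numbers, not the PSA.jl code):

```python
# Hedged sketch: capacity at each investment point (IP) as initial capacity
# plus the cumulative net expansion up to that IP; expansion and reduction
# are the model's decision variables.
initial_capacity = 100.0        # MW, illustrative number
expansion = [20.0, 0.0, 50.0]   # capacity added at each IP
reduction = [0.0, 30.0, 0.0]    # capacity removed at each IP

capacity = []
current = initial_capacity
for add, remove in zip(expansion, reduction):
    current += add - remove
    capacity.append(current)

print(capacity)  # [120.0, 90.0, 140.0]
```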
A major argument for this approach was to put a price on infrastructure cutbacks (even if a very low one), rather than, as in PyPSA, gaining money from reducing infrastructure.
The approach works well for generator, line and link expansion. It solves large systems (the last was 50 nodes, 11 years, 3-hourly, 10 IPs). But as soon as both transmission and storage capacity expansion are allowed, the solver often runs into numerical trouble, which we couldn't track down so far. At the moment it seems the formulation was a good first try but might not be appropriate, since it's not numerically stable...
Did you check and/or compare to the formulation of OSeMOSYS?
Judging by their MathProg formulation:
- They minimize the `TotalDiscountedCost`, which splits into operational costs (marginal + emissions) and investment costs minus salvage values (refer to line 371).
- Investment costs are calculated as discounted `Capital_costs * NewCapacity` (line 352), while salvage values seem to track the value loss over the lifetime (refer to line 357), probably only to avoid the optimization planning for system collapse immediately after the modelled expansion horizon.
- As variables they use a `NewCapacity[region,technology,year] >= 0` variable (only additions) and keep an `AccumulatedNewCapacity` variable which is the cumulative sum respecting lifetimes (line 261). So they do not have capacity reduction!
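A hedged sketch of that accumulation (illustrative Python, not the actual MathProg code): additions contribute to the accumulated capacity only while still within their operational lifetime.

```python
# Hedged sketch of OSeMOSYS-style accumulation: AccumulatedNewCapacity in
# year y sums all NewCapacity additions that have not yet reached the end of
# their lifetime (names and numbers are illustrative).
def accumulated_new_capacity(new_capacity, lifetime):
    years = sorted(new_capacity)
    return {
        y: sum(cap for yb, cap in new_capacity.items()
               if yb <= y < yb + lifetime)
        for y in years
    }

new = {2020: 10.0, 2025: 5.0, 2030: 8.0}   # NewCapacity per build year
acc = accumulated_new_capacity(new, lifetime=10)
print(acc)  # {2020: 10.0, 2025: 15.0, 2030: 13.0}
```

Note that the 2020 addition has dropped out again by 2030, which is exactly why there is no need for an explicit capacity-reduction variable in this scheme.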
This seems to be in line with what you did.
But mind: An orthogonal coordinate for investment periods is no contradiction to using capacity expansions/reductions as separate variables. It's only a matter of data management. FWIW: OSeMOSYS uses such a separate coordinate: YEARS.
@FabianHofmann Where did you store the different `capital_cost` values? Was that a time series?
Yes, the cost division in OSeMOSYS is very similar. The cost types in PSA are
- investment costs for capacity addition
- infrastructure removal costs
- operational costs for power production
- maintenance costs for power installation
All costs were fixed, so no time series, only static values for the components.
And one more datapoint: SWITCH 2.0 also uses positive-only capacity additions (see BuildGen) and separately keeps variables for the aggregated available capacity (see genCapacity). Also of interest is the model formulation as documented in their Supporting information.
So, I suppose formulation-wise that's the most well-trodden path, even though you had stability issues.
> Works as in PyPSA. It would be good for time clustering methods if the duration of a single snapshot were variable. There was an argument that there should be weights separately from durations (?) in the context of storage units. Can you still remember, @fneum ?
I am not sure about the context of storage units, but, if we are talking about naming, the term `weight` is just more general than `duration`, and in PyPSA the `weight` can also be used in contexts other than time, e.g. to represent a probability (https://pypsa.org/doc/optimal_power_flow.html?highlight=weighting). I find the term `duration` quite intuitive, but if it should be kept general, `weights` may be the better option.
@fneum I am not saying that we should abolish `weight`s, but there was a suggestion earlier that `snapshot_weighting`, as it is in PyPSA, mixes two different concepts, which would be more flexible if they were separated cleanly. But that's as far as my recollection goes. @nworbmot Can you jump in again, since you already had to correct me the last few times?
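The proposed split could look like this (a hedged sketch with illustrative names and numbers, not the PyPSA or EnergyModel API): `duration` scales physical energy accounting, while `weight` scales a snapshot's objective contribution, and the two need not coincide.

```python
# Hedged sketch: durations enter the physical energy balance (e.g. storage
# state of charge), weights enter the objective (e.g. carrying a probability).
durations = [1.0, 3.0, 3.0]     # hours each snapshot physically represents
weights   = [0.5, 3.0, 3.0]     # objective weighting, here with a probability on snapshot 0
charge    = [5.0, -2.0, 4.0]    # MW into a storage unit per snapshot
dispatch  = [10.0, 12.0, 8.0]   # MW generated per snapshot
marginal_cost = 5.0             # currency per MWh

# Energy accounting uses durations; the objective uses weights.
state_of_charge = sum(p * d for p, d in zip(charge, durations))
operating_cost  = sum(w * g * marginal_cost for w, g in zip(weights, dispatch))

print(state_of_charge)  # 11.0
print(operating_cost)   # 325.0
```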
@coroa One further note: A problem we see in pathway optimisation in PSA.jl is that, for reductions of infrastructure, generator capacities are always reduced in continuous amounts that cannot be mapped onto specific generator blocks. This is not very useful as soon as one tries to tackle questions about decommissioning blocks of power plants, e.g. the case of the German coal phase-out. I am not sure how a MIP formulation would then affect solving times, but discrete reduction of capacities might be a good feature to introduce.