Units and behavior of diffusive particles
In the documentation, mmol/L is mentioned as the unit for the diffusive concentration of ion X, but it is not stated what it refers to (e.g., to point mechanisms, as is stated for other quantities on the same documentation page). More importantly, the unit of the amount of diffusive particles is neither stated in the documentation nor does it become immediately clear.
Here is an example to specify the issue. Consider a compartment with the surface area A = 2⋅π⋅radius⋅length. To get from a point measurement of the diffusive particle concentration Xd to the amount of particles on the whole surface of the compartment, one needs to compute Xd⋅A⋅10^9. Then, with Xd in units of mmol/L, A in units of m^2, and 10^9 in units of µm, the unit of the amount of diffusive particles would be µmol, which seems fine. Nevertheless, the exact definition of the amount of particles (and thus its biophysical meaning) remains unclear, because the origin and interpretation of the huge factor 10^9 µm is unclear. I guess the problem may originate from the fact that the concentration is defined 'per volume', while all the dynamics currently considered by Arbor happen on a surface.
Furthermore, it is not clear from the documentation how the particle diffusion behaves in general (but some information has already been provided in this PR).
To summarize, it would be nice to have a thorough description of the behavior of the particle diffusion in the documentation. Moreover, fixing the unit of the amount of particles (perhaps µmol) is likely necessary, and its definition should be mentioned in the documentation.
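To make the conversion described above concrete, here is a short Python sketch. The geometry and the concentration value are taken from the r=1 output of the single-compartment example further down in this thread; the computation itself is just the Xd⋅A⋅10^9 rule described above, not Arbor code.

```python
import math

# Compartment geometry (values from the r=1 example further below)
radius = 1.0  # µm
length = 2.0  # µm

# Surface area A = 2*pi*radius*length, converted from µm^2 to m^2
A = 2.0 * math.pi * radius * length * 1e-12  # m^2

# Point measurement of the diffusive concentration Xd in mmol/L
# (value copied from the r=1 output further below)
Xd = 143.2394544745301

# Amount of particles on the whole surface, as described above:
# Xd [mmol/L] * A [m^2] * 1e9 [µm]  ->  result interpreted in µmol
amount_umol = Xd * A * 1e9
print(amount_umol)  # close to 1.8
```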
So, on the behaviour the section you quoted says this:
To enable diffusion of ion species along the morphology (axial diffusion) one sets the per-species diffusivity to a positive value. It can be changed per region and defaults to zero. This is strictly passive transport according to the diffusion equation X' = ß ∆X where X is the species’ diffusive concentration and ß the diffusivity constant.
and the NMODL docs teach how to manipulate the quantity X from a mechanism. The developer section talks about the mathematical details, namely that X basically follows the cable equation. It's a simple process, so I couldn't think of anything else of interest. Other than that, I'll add the units; good point.
Thanks @thorstenhater. Regarding the mathematical details, do you mean this document?
And thanks for including the unit in the documentation. However, I think we should still have a discussion about its foundation (I'll write you on Gitter).
Please note that I've edited the initial post once more.
Here's a formal derivation of the units, which may be of interest (note that this is for the current implementation, and there seems to be a flaw). If we want to compute the amount of particles in µmol, and x is the factor that we need to multiply with to arrive at the amount of particles, then [x] is the unit of that factor:
µmol = mmol/L ⋅ m^2 ⋅ [x]
10^(-3)⋅mmol = mmol/(10^(-3)⋅m^3) ⋅ m^2 ⋅ [x]
10^(-6) = m^2 / m^3 ⋅ [x]
10^(-6) ⋅ m = [x]
[x] = µm
Choosing µmol as I did above, and thus getting µm as the unit for the 10^9 factor, is arbitrary. One could also have pmol and pm, for example. But in any case, the relationship between the particle concentration and the amount of particles is biased such that the amount is unrealistically low.
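The derivation above can be double-checked mechanically. Below is a minimal dimensional-analysis sketch (not a units library): each quantity is a pair of a scale factor to SI and a dict of dimension exponents, and the product must land exactly on µmol.

```python
# Minimal dimensional-analysis check of the derivation above:
# a quantity is (scale_to_SI, {dimension: exponent}).

def mul(a, b):
    sa, da = a
    sb, db = b
    dims = dict(da)
    for k, v in db.items():
        dims[k] = dims.get(k, 0) + v
        if dims[k] == 0:
            del dims[k]
    return (sa * sb, dims)

mmol_per_L = (1e-3 / 1e-3, {"mol": 1, "m": -3})  # mmol/L == mol/m^3 in SI
m2         = (1.0, {"m": 2})
micrometre = (1e-6, {"m": 1})                    # candidate unit for the factor x

product = mul(mul(mmol_per_L, m2), micrometre)
micromol = (1e-6, {"mol": 1})

# The product lands exactly on µmol, so [x] = µm as derived above.
assert product == micromol
```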
I am a bit confused by now. You mention
mmol/L is mentioned as the unit for ion X diffusive concentration
This is correct. However,
Xd to the amount of particles on the whole surface of the compartment
in the model we used here, Xd is the volume concentration and the surface density is not determined. Are you looking for a shell model?
Nevertheless, the exact definition of the amount of particles (and thus its biophysical meaning) remains unclear because the origin and interpretation of the huge factor 10^9 µm is unclear
As noted elsewhere, the diffusion coefficient is given in SI units, while the internal model deals with µm-based units; changing that would break the API.
But in any case, the relationship between the particle concentration and the amount of particles is biased such that the amount is unrealistically low.
That sentence is unclear to me.
Now to my confusion. So far I was assuming the kind-of-mysterious factor arose when handling deltas in Xd in POINT_PROCESSes:
Xd += fX
where fX is supposed to be some kind of flux. It is totally possible that I made a mistake somewhere that drops proper normalisation. But now we are talking about surface densities?
Back to the normalisation question. By definition, Xd describes the particle concentration in the CV.
We also want that an injection by a point process is CV-size-independent, thus we should be normalising changes to X_d in a POINT_PROCESS (and only there) by the CV area.
I remember testing this during development, but no longer have the same hardware. Is this broken? If not, please explain what else is wrong.
We also want that an injection by a point process is CV-size-independent, thus we should be normalising changes to X_d in a POINT_PROCESS (and only there) by the CV area. I remember testing this during development, but no longer have the same hardware.
I'm not sure I understand this. The effect of a point process should be the injection of a total amount of ions irrespective of the CV volume or area, the same way as a synapse injects a total amount of charge. Therefore we would need to normalise by the CV volume.
Xd is the volume concentration and the surface density is not determined.
@thorstenhater okay, that's good to know.
Are you looking for a shell model?
I personally wouldn't mind whether it'd be a shell or a volume model, but I think at the moment it is neither. For a volume model, which it's supposed to be at the moment if I understood the quote above correctly,
- the amount of particles injected in a point process should be normalized with the volume, as @schmitts has pointed out,
- the third distance dimension (in addition to the two dimensions of the surface area) has to be well-defined, for example, depending on the radius.
For now, however, it seems that the third dimension is given simply by the factor of 10^9. To compute the amount of particles in units of µmol, we would thus assume a distance of 10^9 µm, which is very big. That's what I've wanted to point out before by saying
in any case, the relationship between the particle concentration and the amount of particles is biased such that the amount is unrealistically low.
Ok, now I understand, our scaling is such that the actual concentration is lower than you would expect.
For now, however, it seems that the third dimension is given simply by the factor of 10^9.
No, this is still the conversion from SI -> biophysical unit that's attached to setting the coefficient $\beta$.
the amount of particles injected in a point process should be normalized with the volume, as @schmitts has pointed out,
I would disagree, the injection is a flux through a surface and thus we should normalize by CV area not volume.
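To illustrate the difference between the two positions, here is a sketch contrasting area and volume normalization of a point injection into a cylindrical CV. The helper functions are hypothetical, not Arbor API; they only make the radius scaling of each proposal explicit.

```python
import math

# Concentration increment from injecting an amount delta_w into a cylindrical
# CV (radius r, length l), under the two proposed normalizations.

def conc_increment_area(delta_w, r, l):
    return delta_w / (2.0 * math.pi * r * l)   # area-normalised  ~ 1/r

def conc_increment_volume(delta_w, r, l):
    return delta_w / (math.pi * r**2 * l)      # volume-normalised ~ 1/r^2

dw, l = 1.0, 2.0
a1, a2 = conc_increment_area(dw, 1.0, l), conc_increment_area(dw, 2.0, l)
v1, v2 = conc_increment_volume(dw, 1.0, l), conc_increment_volume(dw, 2.0, l)

# Doubling the radius halves the area-normalised increment ...
assert math.isclose(a1 / a2, 2.0)
# ... but quarters the volume-normalised one.
assert math.isclose(v1 / v2, 4.0)
```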
That said, I would still like to see what your experiment, observed outcome, and expected outcome are. Overall, I understand your general dissatisfaction; the cable model can be unintuitive. Here, we have Xi, which is considered to be the volume concentration in a thin shell inside the membrane (the width being arbitrarily small). The cable model assumes a basically infinite buffer further inside, which leads to the resetting of Xi on infinitely small time scales. Xd had to be added for this reason, so we can still use Xi as before and have diffusion in another channel (concentration models are a no-go, since we can never extract state values from mechanisms). That makes Xd somewhat of a chimera, not quite part of the cable model and not quite outside it either. Its definition is therefore subtly different and somewhat vague: it simply (!) drops the buffering. Then Xd is the average concentration across the neurite's radius. That muddies the water a bit with respect to the injection and localization of Xd.
I hope that at least clarifies the motivations.
Thanks for the elaboration @thorstenhater , I think we're getting closer to a solution.
our scaling is such that the actual concentration is lower than you would expect.
Almost. The total amount of particles is lower than we would expect, or alternatively, the concentration is too high.
this is still the conversion from SI -> biophysical unit that's attached to setting the coefficient $\beta$.
Okay, it's 10^9 along with the unit conversion. If [diffusivity] = m^2/s, and given that Arbor usually computes with µm, we'd have 10^9 µm for the third dimension, and the resulting amount of particles should be in µmol. I guess it's just the scaling that needs to be different. What could be possible ways to achieve this?
Here is a toy example of one compartment, which should demonstrate how the concentration and amount of two types of diffusing particles depend on the normalization factor and the radius.
In this example, we inject an amount of particles sd at a synapse (see the point mechanism synapse_with_diffusion). The particles then diffuse across the neuron, and the concentration s is measured from the density process neuron_with_diffusion, while the total amount of particles is given by the normalized variable s⋅A⋅10^9.
The amount of sd particles then drives the dynamics of the particles pd, as described in the density mechanism neuron_with_diffusion. These particles also diffuse across the neuron. Their concentration p is measured from the point process synapse_with_diffusion, while their total amount is reflected by the normalized variable p⋅A⋅10^9.
The gray-shaded dotted lines in the plots below show that the mechanisms and the normalization are working.
From the plots, you can also see that the concentration is inversely proportional to the radius while the total amount remains invariant with respect to the radius. This is as it should be for surface area normalization, but for volume normalization the concentration should depend on the radius in a quadratic manner.
Output for r=1:
length = 2.0 (µm)
radius = 1.0 (µm)
A = 1.2566370614359172e-11 (m^2)
p_factor = 0.012566370614359171 (unit??)
diffusivity = 1 (m^2/s??)
max(s) = 143.2394544745301
max(p) = 795.738887608604
Output for r=2:
length = 2.0 (µm)
radius = 2.0 (µm)
A = 2.5132741228718345e-11 (m^2)
p_factor = 0.025132741228718343 (unit??)
diffusivity = 1 (m^2/s??)
max(s) = 71.61972723726505
max(p) = 397.869443804302
Output for r=4:
length = 2.0 (µm)
radius = 4.0 (µm)
A = 5.026548245743669e-11 (m^2)
p_factor = 0.050265482457436686 (unit??)
diffusivity = 1 (m^2/s??)
max(s) = 35.80986361863253
max(p) = 198.934721902151
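The three outputs above can be cross-checked numerically. The following sketch (values copied from the printouts) verifies the two observations about the plots: the total s⋅A⋅10^9 is numerically radius-invariant, while max(s) scales inversely with the radius.

```python
import math

# radius: (A [m^2], max(s) [mmol/L]), copied from the three outputs above
runs = {
    1.0: (1.2566370614359172e-11, 143.2394544745301),
    2.0: (2.5132741228718345e-11, 71.61972723726505),
    4.0: (5.026548245743669e-11, 35.80986361863253),
}

# The total amount s*A*1e9 (interpreted in µmol above) is the same for all radii
totals = {r: s * A * 1e9 for r, (A, s) in runs.items()}
ref = totals[1.0]
assert all(abs(t - ref) < 1e-9 * ref for t in totals.values())

# ... while the concentration scales inversely with the radius
assert math.isclose(runs[1.0][1] / runs[2.0][1], 2.0)
assert math.isclose(runs[2.0][1] / runs[4.0][1], 2.0)
```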
We normalise the flux to the area. The reason (lifted from a discussion with @schmitts last fall) is that we want our experiments to be independent of discretisation, i.e., halving the CV length should not result in double the total concentration added to the CV via the same injection. This is different from the current injection, which is CV-size dependent.
Thus
Xd += Z
is understood as a change to Xd of Z/A_cv.
We do not normalise Xd itself to the CV volume nor its area. It might appear that way if your Xd is generated solely by POINT_PROCESSes.
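Under this rule, the total amount added by a single injection does not depend on the CV length, which is the discretisation independence described above. A small sketch for a cylindrical CV (illustrative helper, not Arbor code):

```python
import math

# "Xd += Z" in a POINT_PROCESS is applied as a concentration increment Z/A_cv.
# The total amount added to the CV is then (Z/A_cv) * V_cv = Z * r / 2,
# which does not depend on the CV length.

def added_amount(Z, radius, length):
    area = 2.0 * math.pi * radius * length    # CV membrane area
    volume = math.pi * radius**2 * length     # CV volume
    dC = Z / area                             # concentration increment Z/A_cv
    return dC * volume                        # total amount added to the CV

full = added_amount(1.0, 1.0, 2.0)   # one CV of length 2
half = added_amount(1.0, 1.0, 1.0)   # after halving the CV length
assert math.isclose(full, half)      # injection is CV-length independent
assert math.isclose(full, 0.5)       # equals Z*r/2
```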
Okay, that sounds sensible. So this normalization is not done in density processes, correct?
What remains is the unit issue. As a suggestion, I think either the concentration could be re-defined as mmol/cm^2 or similar (per area), or the factor 10^9 should be replaced by 1 or similar.
Why would we redefine [Xd]? It's mM = mmol/l, and it needs to be, since we assume the equivalency Xd ~ Xi.
The issue is the flux (called Z in the example above), which has a hidden unit conversion from mM to mM/µm^2 (z ~ Z) and which is then summed across the CV's POINT_PROCESSes and added to the concentration.
So, again, this was a request from @schmitts wrt discretisation independence; if opinions have changed about this, we can talk about changing this towards the model for current injection, which depends on CV sizes and thus on discretisation.
So, again, this was a request from @schmitts wrt discretisation independence; if opinions have changed about this, we can talk about changing this towards the model for current injection, which depends on CV sizes and thus on discretisation.
I will go into deep meditation about this. In case I resurface, I'll let you know.
🧘
I'd want to say 'I warned you', but I thought it was sensible, too ;)
Here is a newer summary of this issue (we've discussed this here).
Definitions
All variables are defined per CV.
- volume: $V$ $[\text{SI: m}^3,\text{Arbor: µm}^3]$
- area: $A$ $[\text{SI: m}^2,\text{Arbor: µm}^2]$
- amount of ions (or particles) $x$: $N_x$ $[\text{SI: }\text{mol},\text{Arbor: }\text{1e-18 mol}]$
- concentration of ions (or particles) $x$: $C_x = \frac{N_x}{V}$ $\left[\text{SI: }\frac{\text{mol}}{\text{m}^3},\text{Arbor: }\frac{\text{mmol}}{\text{L}}\right]$
Assumptions and Requirements
- a biological synapse injects particles (following some time course) and does not immediately act on the concentration (but it may take the concentration as input to its computations)
- the equilibrium concentration over the full neuron is independent of the spatial discretization (the user must never be required to know the discretization when setting parameters, neither when setting up the model nor inside NMODL mechanisms)
- in the absence of injection or decay processes, the number of particles in the whole neuron must stay constant
- concentrations are always greater than or equal to zero
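The discretization-independence and conservation requirements can be illustrated with a toy flux-form diffusion scheme. This is a sketch of the mathematical requirement only, not Arbor's solver; the cable has unit volume split into equal-volume CVs.

```python
# Toy 1D finite-volume diffusion on a cable of unit volume split into n equal
# CVs. Flux-form updates conserve the total particle number, and the
# equilibrium concentration is independent of the discretization.

def diffuse(C, lam, steps):
    C = list(C)
    for _ in range(steps):
        # inter-CV fluxes, all computed before being applied (conserves the sum)
        flux = [lam * (C[i + 1] - C[i]) for i in range(len(C) - 1)]
        for i, f in enumerate(flux):
            C[i] += f
            C[i + 1] -= f
    return C

def equilibrium(n_cv, total_amount):
    cv_volume = 1.0 / n_cv
    C = [0.0] * n_cv
    C[0] = total_amount / cv_volume   # inject everything into the first CV
    return diffuse(C, 0.2, 5000)

# Equilibrium concentration is the same for coarse and fine discretizations,
# and the total amount sum(C_i * V_i) stays at the injected value.
coarse = equilibrium(4, 1.8)
fine = equilibrium(8, 1.8)
```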
Expected Behavior
Given delta-function-like particle input
- a fixed amount of particles $\Delta w$ is injected: $N_x = N_x + \Delta w$
- the resulting concentration, or the concentration increment, must scale inversely with the CV volume: $C_x = C_x + \frac{\Delta w}{V}$
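As a sanity check on the Arbor unit choices listed in the definitions: since µm^3 = 1e-18 m^3, an amount in units of 1e-18 mol divided by a volume in µm^3 directly yields mol/m^3, which is numerically equal to mmol/L. A sketch (the numbers are taken from the volume-normalized example further below):

```python
# N_x in 1e-18 mol, V in µm^3  ->  C_x in mmol/L without any extra factor,
# because µm^3 = 1e-18 m^3 and mol/m^3 = mmol/L.

N_arbor = 1800.0              # amount in units of 1e-18 mol
V_arbor = 6.283185307179586   # volume in µm^3 (r=1, l=2 cylinder: 2*pi)

# Ratio in SI units:
C_SI = (N_arbor * 1e-18) / (V_arbor * 1e-18)  # mol/m^3 == mmol/L

# Same ratio taken directly in Arbor units:
C_arbor = N_arbor / V_arbor   # mmol/L

assert abs(C_SI - C_arbor) < 1e-9
```

The result matches the max(s) value printed for r=1 in the example below.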
Current Behavior
The expected behavior of C_xd is currently achieved in an NMODL mechanism:
- via C_xd = C_xd + delta_w*A/V/1000 for a point mechanism
- via C_xd = C_xd + delta_w/V for a density mechanism
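A sketch of the two update rules quoted above (the helper functions are illustrative Python, not NMODL): the two expressions differ by a factor of A/1000, so to obtain the same concentration change as a density mechanism, the amount handed to a point mechanism has to be rescaled by 1000/A.

```python
import math

def density_update(C, delta_w, V):
    # density mechanism: C_xd = C_xd + delta_w/V
    return C + delta_w / V

def point_update(C, delta_w, A, V):
    # point mechanism: C_xd = C_xd + delta_w*A/V/1000
    return C + delta_w * A / V / 1000.0

r, l = 1.0, 2.0
A = 2.0 * math.pi * r * l      # µm^2
V = math.pi * r**2 * l         # µm^3

# Rescaling the point-mechanism input by 1000/A reproduces the density update:
target = density_update(0.0, 1.8, V)
via_point = point_update(0.0, 1.8 * 1000.0 / A, A, V)
assert math.isclose(target, via_point)
```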
Proposal
The solution to issue #2105 will help to solve the issue here, since it enables the normalization of concentration values – according to the current behavior described above – directly within the mechanisms.
Next, the units of the concentration and of the amount of particles can be sorted out more easily.
I've just edited the previous post to include more recent 'findings' and to make it more accessible.
And here is a new example with volume normalization (similar to the example with surface area normalization above).
The new example (also see the plots below) shall make clear how the diffusion in Arbor currently behaves (master version 235b291). First, we inject an amount of particles s at a synapse (see the point mechanism synapse_with_diffusion); this amount is scaled by area / volume / 1000 to capture the subsequent normalization that is automatically applied in point mechanisms (see the previous post). The particles s then diffuse across the neuron, and we can measure their concentration and total amount (see the top and second panels, respectively). Moreover, the amount of particles s triggers the injection of particles p (if sV is greater than theta), as described in the density mechanism neuron_with_diffusion. The particles p also diffuse across the neuron, and we can measure their concentration and total amount as well (see the third and fourth panels, respectively). The dotted lines in the third panel demonstrate that the mechanisms and the assumed normalization are working.
With the mathematical formulation given in the previous post, the diffusion now almost behaves as expected; in particular, the concentration depends on the CV radius in a quadratic manner. It now appears that the amount of injected particles is given in units of 1e-18 mol for a density mechanism (if the concentration is given in units of mmol/L). This may seem small, but in terms of particles (one mole equaling about 6.022⋅10^23 particles) it is not. What remains is the factor of 1000 that occurs when injecting particles in a point mechanism (see the previous post). Thus, it appears that for a point mechanism the amount of injected particles has to be provided in units of 1e-15 mol (unless a division by 1000 is performed in the mechanism, as in the example code).
One might discuss whether to leave things like this or to adapt Arbor so as to get rid of this distinction between density and point mechanisms. Beyond this, I think we've arrived at a proper description of Arbor's particle diffusion mechanisms, which should in condensed form be transferred to the documentation. This should also include the unit of the diffusivity parameter; I remember that @thorstenhater once mentioned it'd be m^2/s, but I couldn't find this anywhere in the documentation.
Output for r=1:
length = 2 (µm)
radius = 1 (µm)
area = 12.566370614359172 (µm^2)
volume = 6.283185307179586 (µm^3)
diffusivity = 1 (m^2/s??)
max(s) = 286.47889756541207 (mmol/L)
max(s⋅V) = 1800.000000000003 (1e-18 mol)
max(p) = 1.591477775217208 (mmol/L)
max(p⋅V) = 9.999549773947617 (1e-18 mol)
Output for r=2:
length = 2 (µm)
radius = 2 (µm)
area = 25.132741228718345 (µm^2)
volume = 25.132741228718345 (µm^3)
diffusivity = 1 (m^2/s??)
max(s) = 71.61972439135288 (mmol/L)
max(s⋅V) = 1799.9999999999993 (1e-18 mol)
max(p) = 0.397869443804302 (mmol/L)
max(p⋅V) = 9.999549773947617 (1e-18 mol)
Output for r=4:
length = 2 (µm)
radius = 4 (µm)
area = 50.26548245743669 (µm^2)
volume = 100.53096491487338 (µm^3)
diffusivity = 1 (m^2/s??)
max(s) = 17.90493109783822 (mmol/L)
max(s⋅V) = 1799.9999999999993 (1e-18 mol)
max(p) = 0.0994673609510755 (mmol/L)
max(p⋅V) = 9.999549773947617 (1e-18 mol)
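The three outputs above can again be cross-checked numerically (values copied from the printouts): the totals s⋅V and p⋅V are radius-invariant, and the concentrations scale quadratically with the radius.

```python
# radius: (volume [µm^3], max(s) [mmol/L], max(p) [mmol/L]), from the outputs
runs = {
    1.0: (6.283185307179586, 286.47889756541207, 1.591477775217208),
    2.0: (25.132741228718345, 71.61972439135288, 0.397869443804302),
    4.0: (100.53096491487338, 17.90493109783822, 0.0994673609510755),
}

for r, (V, s, p) in runs.items():
    assert abs(s * V - 1800.0) < 1e-6              # total s*V in 1e-18 mol
    assert abs(p * V - 9.999549773947617) < 1e-6   # total p*V in 1e-18 mol
    # concentrations depend on the radius in a quadratic manner:
    assert abs(s - runs[1.0][1] / r**2) < 1e-9
```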
While for a single-compartment neuron everything seems to work (if the factor of 1000 mentioned above is considered), I'm still encountering issues with multi-compartment neurons. Accordingly, I've updated the requirements in the summary post above, and I've opened a new issue because some of the requirements are not fulfilled.
I'd suggest that we keep the issue here open until the description of the diffusion processes has been integrated into the documentation (I'm working on that).