Generic Delay implementation
TeXitoi opened this issue · 4 comments
Almost all the HAL implementations:
- have the same time.rs module defining time objects like Hz
- implement (more or less correctly) a Delay object on each timer.
I think that this can be solved in embedded-hal by:
- defining something like time.rs
- having a trait CalibratedCountDown that provides the functions needed to implement delays
- a wrapper type over a CalibratedCountDown that provides the Delay implementation.

Then, the HAL only has to provide a CalibratedCountDown object that can be created from a timer and a Clock object.
An idea of the implementation:
```rust
use core::time::Duration;
use void::Void;

pub trait CalibratedCountDown {
    /// The duration of a tick.
    fn tick(&self) -> Duration;
    /// Maximum ticks the counter can count.
    fn max_ticks(&self) -> usize;
    /// Start a new countdown of `count` ticks.
    fn start(&mut self, count: usize);
    /// Non-blockingly "waits" until the count down finishes.
    fn wait(&mut self) -> Result<(), Void>;
}
```

Then you can generically implement Delay as in atsamd-rs/atsamd#14.
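As a rough sketch of what that wrapper could look like (the names and the chunking over max_ticks are only illustrative, and wait is treated here as returning once the count down has elapsed; a real non-blocking design would probably return an nb::Result, like embedded-hal's existing CountDown):

```rust
use core::time::Duration;

/// Generic Delay built on top of any CalibratedCountDown.
pub struct Delay<C> {
    countdown: C,
}

impl<C: CalibratedCountDown> Delay<C> {
    pub fn new(countdown: C) -> Self {
        Delay { countdown }
    }

    /// Blocks for at least `duration`, splitting it into chunks that fit
    /// into the counter's range.
    pub fn delay(&mut self, duration: Duration) {
        let tick_ns = self.countdown.tick().as_nanos();
        // Round up so we never wait less than requested (assumes a non-zero tick).
        let mut ticks = (duration.as_nanos() + tick_ns - 1) / tick_ns;
        let max = self.countdown.max_ticks() as u128;
        while ticks > 0 {
            let chunk = ticks.min(max);
            self.countdown.start(chunk as usize);
            // Void means waiting can never fail, so the result is ignored.
            let _ = self.countdown.wait();
            ticks -= chunk;
        }
    }
}
```

The blocking DelayMs and DelayUs traits could then be implemented once on this wrapper instead of in every HAL.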
I fully agree there's plenty of room for improvement. We do need to be a bit careful about how we design something generic so it can actually be used by any implementation and will lead to efficient code everywhere (especially division operations can be problematic).
I'm not quite sure how ergonomic your suggestion would be in real life; it would be great to have an implementation available to play with on different systems, to get a feel for how it might work.
the timer can implement a fn into_delay(self, Clock) -> Delay<Self> that returns the generic Delay. That would be ergonomic, no?
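Roughly something like this, where Timer, Clocks and timer_hz are just placeholders for whatever a concrete HAL exposes (register accesses and timer construction are stubbed out), and Delay and CalibratedCountDown are the ones sketched above:

```rust
use core::time::Duration;
use void::Void;

pub struct Clocks {
    pub timer_hz: u32, // frequency driving the timer (placeholder)
}

pub struct Timer {
    tick: Duration,
}

impl Timer {
    /// Consumes the timer and returns the generic Delay, calibrated
    /// against the given clock configuration.
    pub fn into_delay(mut self, clocks: Clocks) -> Delay<Timer> {
        self.tick = Duration::from_nanos(1_000_000_000 / u64::from(clocks.timer_hz));
        Delay::new(self)
    }
}

impl CalibratedCountDown for Timer {
    fn tick(&self) -> Duration {
        self.tick
    }
    fn max_ticks(&self) -> usize {
        u16::MAX as usize // e.g. a 16-bit counter
    }
    fn start(&mut self, _count: usize) {
        // A real implementation would write the count to the timer's
        // reload register and start the counter; stubbed out here.
    }
    fn wait(&mut self) -> Result<(), Void> {
        // A real implementation would poll the timer's update flag;
        // stubbed out here.
        Ok(())
    }
}
```

Usage would then just be timer.into_delay(clocks), and the result can be passed around like any other Delay.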
That would indeed be ergonomic. But will it actually blend? 😅