Tracking issue for string patterns
alexcrichton opened this issue Β· 69 comments
(Link to original RFC: rust-lang/rfcs#528)
This is a tracking issue for the unstable pattern
feature in the standard library. We have many APIs which support searching a string generically with any number of pattern types (e.g. substrings, characters, closures, etc.), but implementing your own pattern type (e.g. a regex) is not possible on stable. It would be nice if these implementations could indeed be stable!
Some open questions are:
- Have these APIs been audited for naming and consistency?
- Are we sure these APIs are as conservative as they need to be?
- Are we sure that these APIs are as performant as they can be?
- Are we sure that these APIs can be used to implement all the necessary forms of searching?
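For context, the consumer side of this machinery already works on stable; what this issue tracks is the ability to implement the `Pattern`/`Searcher` traits yourself. A quick illustration of the existing generic APIs:

```rust
fn main() {
    // The same generic search APIs accept several pattern types on stable:
    let hay = "Hello, world!";
    assert_eq!(hay.find(','), Some(5));                // a char
    assert_eq!(hay.find("world"), Some(7));            // a substring
    assert_eq!(hay.find(char::is_uppercase), Some(0)); // a fn/closure
    assert_eq!(hay.find(&['!', '?'][..]), Some(12));   // a char slice (set)
}
```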
cc @Kimundi
String Patterns RFC tracking issue: #22477
Stabilization not only impacts implementing the pattern traits, but of course also detailed use of the Searcher trait: .next_match(), .next_reject() and so on.
I'm having trouble seeing the purpose of the next() method; in all the examples I've looked at, it's faster and cleaner to just implement next_match() and next_reject() individually. For example, these two functions can be implemented for CharEqSearcher as one-liners using Iterator::find. Moreover, if you want an optimized implementation of Searcher for an ASCII char, then in implementing next() you need to choose between an implementation that returns rejections quickly and an implementation that skips quickly over the input (e.g. using SIMD) in order to quickly find a match.
@jneem the StrSearcher uses the same code for next() and next_match(), but specialized for each case, so that the next_match() case is much faster. It works well.
@bluss I think that helps make my point: AFAICT, all the users of Searcher only call next_match() and next_reject(). Therefore, the only purpose of next(), AFAICT, is to make implementing Searcher easier: you only need to implement one function instead of two. But that benefit isn't borne out in practice. StrSearcher implements two methods anyway, and it would be simpler to implement next_reject() instead of next(). CharEqSearcher implements next() only, but it would be simpler and cleaner to optimize if it implemented next_match() and next_reject() instead.
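For reference, deriving next_match() from next() is essentially a filtering loop over the step stream. A minimal stable-Rust sketch of that shape, with a hand-rolled mirror of SearchStep (the real type is unstable):

```rust
// Local mirror of the unstable `SearchStep` type, to sketch how
// `next_match()` can be expressed as a loop over `next()` results.
#[derive(Debug, PartialEq)]
enum SearchStep {
    Match(usize, usize),
    Reject(usize, usize),
    Done,
}

// Skip Reject steps until a Match or Done arrives — the shape of the
// default implementation being discussed.
fn next_match(next: &mut impl FnMut() -> SearchStep) -> Option<(usize, usize)> {
    loop {
        match next() {
            SearchStep::Match(a, b) => return Some((a, b)),
            SearchStep::Done => return None,
            SearchStep::Reject(..) => continue,
        }
    }
}

fn main() {
    // A canned step sequence standing in for a real Searcher:
    let mut steps = vec![
        SearchStep::Reject(0, 1),
        SearchStep::Match(1, 3),
        SearchStep::Done,
    ]
    .into_iter();
    let mut next = || steps.next().unwrap();
    assert_eq!(next_match(&mut next), Some((1, 3)));
    assert_eq!(next_match(&mut next), None);
}
```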
Oops, I missed one: Pattern uses Searcher::next() for the default implementation of is_prefix_of.
This is useful, I'd like to use it on stable.
cc @BurntSushi, Regex is the major pattern user and it would be great for everyone if it was stable because of that.
@BurntSushi Are the Pattern traits sufficient for Regex? Any design issues?
I wrote the impl for Regex a while back, which does indeed only implement next: https://github.com/rust-lang-nursery/regex/blob/master/src/re.rs#L1104 I seem to recall that it was tricky to get the corner cases right, but that isn't too surprising. I otherwise found it pretty straightforward.
One thing that I don't think is representable with the Pattern API is fast retrieval of all matches from a suffix table. In particular, matches are reported as part of lexicographically sorted suffixes rather than in order of occurrence in the string, so it can't satisfy the contract of Searcher without an additional cost. (I've already talked to @Kimundi about this, and I don't think it's a major issue. Suffix tables are a pretty niche data structure.)
Nominating for discussion. Not sure whether this is ready for stabilization, but I'd like for the team to dig into it a bit.
Unfortunately the libs team didn't get a chance to talk about this in terms of stabilization for 1.8, but I'm going to leave the nominated tag as I think we should chat about this regardless.
Today I investigated a bit what would need to be done to generalize the Pattern API to arbitrary slice types. As part of that I also took a more pragmatic route to the involved types and interfaces based on these assumptions:
- Generally, the API should suit the std library, with more complex scenarios being handled by external libraries.
- Thus, providing the same feature set for libstd slice types like str, [T] or OsStr is more important than having an API that can accommodate the most general kind of "slice type".
- Having a simpler-to-maintain API suitable for the existing consumers like find, split and match_indices is a higher priority than having a more complicated API that could accommodate exotic consumers.
- Being able to reuse the API for mutable slices is worth a moderate increase in API verbosity and unsafety-based interface surface.
A very rough sketch of the result can be seen below. Note that this covers only the Pattern traits themselves, without actual integration into the std lib or a comprehensive reimplementation of the existing types.
https://github.com/Kimundi/pattern_api_sketch/blob/master/src/v5.rs
The core changes are:
- The Pattern trait now has an input parameter for the slice type. This turns Pattern<'a> into Pattern<&'a str> and allows for Pattern<&'a mut [T]>.
- next() got removed, leaving only next_{match/reject}(). This has been done because none of the existing types could make use of the return value of next(), and none of the existing implementations benefited from needing to implement it.
- As part of that change, the Pattern::is_{prefix/suffix}_of() methods are no longer default-implemented. But seeing how the manual implementation for a given pattern-slice combination is usually simple and straightforward, this does not appear to be a problem.
- In order to allow for mutable slices and shared code of generic iterators like match_indices between slice types, the return values of next_*() are now associated types of the slice type. They have an API that allows them to be used as abstract raw pointer types, which means generic code can use them relatively freely to create slices at or between matches, at the cost of requiring unsafe code and needing to follow aliasing rules.
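To make the "haystack type as a trait parameter" idea concrete, here is a compilable toy sketch in that spirit. Everything here (names, signatures, the single-char searcher) is illustrative, not the linked sketch's actual API:

```rust
// Illustrative only: a haystack-generic Pattern/Searcher pair. `next()` is
// gone, leaving only the search-loop methods, and the haystack type is a
// trait parameter, so Pattern<&'a str> and Pattern<&'a mut [T]> are both
// expressible with one trait.
trait Searcher<H> {
    fn next_match(&mut self) -> Option<(usize, usize)>;
}

trait Pattern<H>: Sized {
    type Searcher: Searcher<H>;
    fn into_searcher(self, haystack: H) -> Self::Searcher;
}

// A toy searcher for a single `char` needle over `&str`.
struct CharSearcher<'a> {
    hay: &'a str,
    pos: usize,
    needle: char,
}

impl<'a> Searcher<&'a str> for CharSearcher<'a> {
    fn next_match(&mut self) -> Option<(usize, usize)> {
        let off = self.hay[self.pos..].find(self.needle)?;
        let start = self.pos + off;
        self.pos = start + self.needle.len_utf8();
        Some((start, self.pos))
    }
}

impl<'a> Pattern<&'a str> for char {
    type Searcher = CharSearcher<'a>;
    fn into_searcher(self, haystack: &'a str) -> Self::Searcher {
        CharSearcher { hay: haystack, pos: 0, needle: self }
    }
}

fn main() {
    let mut s = 'b'.into_searcher("abcb");
    assert_eq!(s.next_match(), Some((1, 2)));
    assert_eq!(s.next_match(), Some((3, 4)));
    assert_eq!(s.next_match(), None);
}
```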
Changes in regard to the open questions:
- In addition to the existing interface, the new SearchPtrs trait has gained new ad-hoc named elements which would require an additional naming audit. Also, Pattern continues to be a confusing name in a language with pattern matching...
- The sketched API change is somewhat more conservative by getting rid of the next() methods, concentrating instead on the pure search loops.
- The prior point should also make the performance aspect easier to validate. However, should the existing slice-type-specific iterator types like split() be replaced by shared, generic ones, there might be some regressions due to optimizations possibly not carrying over. This is optional though.
- We should probably specify the Pattern API more closely as only being intended for iterator-like linear search operations.
Is the focus of Pattern for slices going to be specific to &[u8] only? I think that's OK; I'm unsure how to really extend "substring search" further to generic &[T]. Technically, the two-way algorithm that we use for str::find(&str) can be made to work on any ordered alphabet, i.e. T: Ord is enough, but I don't know how performant or realistic this is.
@bluss: I only used &[u8] as a quick proof of concept that the traits work with different slice types. I'm assuming that there is some sensible way to make it work with generic slices. The &[u8] case seems like a good candidate for a specialization impl, though.
I'm currently trying out some runtime scanners for scan-rules. Basically: I want users to be able to parse text based on Patterns (e.g. accept anything that matches this pattern, consume input until this pattern is matched, etc.). The current example is being able to do something like the following (where until accepts a Pattern):
let_scan!("before:after", (let word_0 <| until(":"), ":", let word_1: Everything));
assert_eq!(word_0, "before");
assert_eq!(word_1, "after");
The problem I'm having is that it's really painful to actually do this. The interface as it exists seems to assume that ownership of the pattern can be passed to the method which will use it, and as a result, can only be used exactly once. This doesn't make much sense to me. Searching for a pattern should not (in general) consume the pattern.
What I want is: for every P: Pattern, &P: Pattern should hold as well. Currently, it's only true for &str. If I had this, I could take ownership of the pattern from the caller, store it, and loan it to the underlying find calls. If the caller wants to use the pattern in multiple places, and it's expensive to construct, they can instead pass my function a borrow, which will also work.
The more I think about this, the more it comes down to the FnMut implementation. The only closure kind that would allow for the kind of interface I want is Fn... but that excludes closures that test based on accumulated state, though I wonder how often that's even desirable. I can work around this by requiring Copy patterns, but again, callables are the most useful kind of pattern (until(char::is_whitespace), etc.), so it seems a deep shame to exclude them.
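The ownership issue is visible on stable today: patterns are taken by value, so only Copy patterns (like &str or non-capturing closures) can be reused across calls, while a stateful closure is consumed by a single call. A small demonstration:

```rust
fn main() {
    let hay = "hello world";

    // A non-capturing closure is `Copy`, so the same pattern value can be
    // passed to several search calls:
    let is_space = |c: char| c.is_whitespace();
    assert_eq!(hay.find(is_space), Some(5));
    assert_eq!(hay.rfind(is_space), Some(5));

    // A closure capturing mutable state is only `FnMut`, not `Copy`, so it
    // is moved into (and consumed by) a single call:
    let mut count = 0;
    let counting = |c: char| {
        count += 1;
        c == 'o'
    };
    assert_eq!(hay.find(counting), Some(4));
    // hay.find(counting); // error[E0382]: use of moved value `counting`
}
```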
Oh, another thing I just realised: there doesn't appear to be any way to find out how long a match is, given a pattern and, say, str::starts_with.
You can use .next_reject for that.
Hey, I don't know a lot about this API, but I noticed that StrSearcher is not a double-ended searcher, while CharSliceSearcher is. Is this an error? The explanation of why StrSearcher is not double-ended seems to apply equally well to CharSliceSearcher:
(&str)::Searcher is not a DoubleEndedSearcher because the pattern "aa" in the haystack "aaa" matches as either "[aa]a" or "a[aa]", depending from which side it is searched.
@withoutboats A slice of chars represents a set of possibilities, so it's not like a string; either of the chars can be matched by themselves.
@bluss That makes sense. The semantics of CharSliceSearcher are not documented; I assumed the char slice was treated as an ordered sequence rather than a set.
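A small stable-Rust illustration of the set semantics, and of why per-char matches keep double-ended search unambiguous:

```rust
fn main() {
    // `&[char]` is a *set* of alternatives, not a sequence: any listed
    // char matches on its own.
    let hay = "a,b;c";
    let parts: Vec<&str> = hay.split(&[',', ';'][..]).collect();
    assert_eq!(parts, ["a", "b", "c"]);

    // Each match is exactly one char, so searching from the right cannot
    // disagree with searching from the left about match boundaries —
    // which is why CharSliceSearcher can be a DoubleEndedSearcher.
    assert_eq!(hay.rfind(&[',', ';'][..]), Some(3));
}
```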
Can we say that, after returning Done once, a Searcher must continue to return Done forever more? That is, can I make this assumption when implementing FusedIterator?
I'm a big fan of extending this to be more generic than just str, needed/wanted it for [u8] quite frequently. Unfortunately, there's an API inconsistency already - str::split takes a Pattern, whereas slice::split takes a predicate function that only looks at a single T in isolation.
@shahn That shouldn't be a big problem: if Pattern is implemented for such a function, then slice::split could support Pattern without breaking backwards compatibility.
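The inconsistency is easy to see side by side on stable:

```rust
fn main() {
    // str::split accepts any Pattern, including a multi-char substring:
    let s: Vec<&str> = "a--b--c".split("--").collect();
    assert_eq!(s, ["a", "b", "c"]);

    // slice::split only takes a predicate over a single element, so a
    // multi-element separator has no direct equivalent:
    let b: Vec<&[u8]> = b"a-b-c".split(|&x| x == b'-').collect();
    assert_eq!(b, vec![&b"a"[..], &b"b"[..], &b"c"[..]]);
}
```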
I strongly agree with widening the API to slice and maybe Vec.
What's the status here? The original RFC is quite old -- from 2014, before 1.0 -- do we need to completely revisit the design? What are the current blockers?
With SIMD stabilizing in 1.27, Jetscii will be available on stable, but its Pattern support won't be. I never expected that SIMD would arrive before this!
Would be nice to see these for OsStr, CStr, [u8] and similar, as mentioned a year ago. Non-UTF in Rust feels very much a second class citizen and this would be a start.
I think that Searcher needs skip method(s) that can be used to advance the search position without performing a search. For example, this proof of concept of in-place unescaping requires such a method to work correctly.
I'm looking at this, and there doesn't seem to be any way to efficiently perform a search for a single needle in many haystacks. The only way to make a Searcher currently is by starting with a str for both the needle and the haystack. This leads to an expensive call to TwoWaySearcher::new, which performs computations that only depend on the needle.
(See how expensive these calls are here: https://users.rust-lang.org/t/why-my-rust-code-is-2-times-slower-than-my-js-code/31189/8)
But it doesn't look like this is impossible with the current API. It seems to me that a new public type StrPattern could be added which implements Pattern and contains these precomputed results.
Aho-Corasick (or similar) is probably the right answer there, but the bstr crate does provide a way to do what you want (including not needing to do utf8 validation, if that helps): https://docs.rs/bstr/0.2.7/bstr/struct.Finder.html
I do think it is a somewhat niche concern though. Consider that libc APIs like memmem don't permit this either.
Why is Pattern not implemented for String?
It is implemented for &String; is there a reason to implement it for String as well?
Sorry if there is already a way to do this, but as part of this issue, can Pattern be implemented for &[&str] to avoid allocation? (IIRC slices avoid allocation.)
e.g.
let s = "opt=val";
let x = &["opt", "="];
assert!(s.starts_with(x));
That could conceivably work in either of two ways: what I suspect you want (one followed by the other) or one or the other. That by itself is probably reason enough to avoid that impl.
You'd still be able to avoid allocations by using .as_bytes(), though.
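For the "one followed by the other" reading, stable strip_prefix chaining is another allocation-free option; it sidesteps Pattern entirely rather than extending it:

```rust
fn main() {
    let s = "opt=val";
    // "one followed by the other" without allocating a combined pattern:
    // strip each expected prefix in sequence; Some(..) survives only if
    // all of them matched in order.
    let rest = s.strip_prefix("opt").and_then(|r| r.strip_prefix("="));
    assert_eq!(rest, Some("val"));
    // rest.is_some() plays the role of the hoped-for starts_with(&["opt", "="]).
    assert!(rest.is_some());
}
```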
I think it's reasonable to prefer "one followed by the other" over "one or the other". "one or the other" can be done relatively cleanly by iterating over the strs, e.g.
let x = &["apt", "apc"];
assert_eq!(
    x.iter().any(|e| s.starts_with(e)),
    s.starts_with(x) // hypothetical "one or the other"
);
Meanwhile "one followed by the other" needs to use as_bytes (as you said) to avoid allocations, which is much more opaque (IMO).
As a further point, &[char] checks whether any char in the slice is in the string, rather than whether the characters in the slice appear in the string in order, so I don't know if "conceivably working in two different ways" is a good justification for not implementing this, since &[char] already chooses one anyway (though admittedly it chooses "one or the other").
One or the other seems rather more intuitive. Maybe there should be wrapper newtypes or an enum to separate the cases (it would make for more explicit code, albeit less discoverable): .contains(AllInOrder(["a", "b"])) or .contains(Any(["a", "b", "c"])) and so on.
With following in sequence, there is also the question of whether the matches must follow each other exactly (no gap between them) or not. For .contains(), both alternatives could be useful; it's like contains-all (any order), contains-all (in order), contains-any, etc.
Wrapper newtypes sound good. They are probably more useful than enums in this case, since the accepting function can distinguish newtypes at the type level, and newtypes are more easily extendable, as the different cases of patterns don't seem to be well enumerated. Maybe implement some reasonably common ones like Or (one or the other), ExactOrder (one after the other, no gaps), and InOrder (one after the other, gaps allowed), so users don't have to get their hands into an unsafe Searcher. Then add some examples to Pattern to make them more discoverable.
This would let us have default implementations for types like &[char] while also getting rid of concerns about working in several different ways, since we can just disambiguate with the newtypes. It also opens up nesting, which could be useful. (Though I'll mention that we seem to be slowly approaching regex.)
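A sketch of how such wrappers could behave as plain functions, before any Pattern integration. The names Any and InOrder follow the discussion; everything else is made up for illustration:

```rust
// Hypothetical wrapper newtypes from the discussion, disambiguating the
// two readings of a &[&str] pattern at the type level.
struct Any<'a>(&'a [&'a str]);     // one OR the other
struct InOrder<'a>(&'a [&'a str]); // one after the other, gaps allowed

fn contains_any(hay: &str, p: Any) -> bool {
    p.0.iter().any(|s| hay.contains(s))
}

fn contains_in_order(hay: &str, p: InOrder) -> bool {
    // Find each needle in turn, resuming after the previous match.
    let mut rest = hay;
    for s in p.0 {
        match rest.find(s) {
            Some(i) => rest = &rest[i + s.len()..],
            None => return false,
        }
    }
    true
}

fn main() {
    assert!(contains_any("hello", Any(&["x", "ell"])));
    assert!(contains_in_order("a..b..c", InOrder(&["a", "b", "c"])));
    assert!(!contains_in_order("c b a", InOrder(&["a", "b", "c"])));
}
```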
FWIW, I would probably be opposed to implementing the "or" variant in std on strings. The simple implementation of this is a performance footgun as soon as it gets beyond a few patterns. We would invariably find ourselves recommending people "not use it" unless the inputs are small. The alternative, to make it fast, is a lot of work that probably shouldn't live inside std: https://github.com/BurntSushi/aho-corasick/blob/master/DESIGN.md
This might be obvious for some of you, but it has not been mentioned yet:
Before stabilizing the API (seems like this might take some time) one should consider waiting for #44265:
use core::str::pattern::{ReverseSearcher, Searcher};

pub trait Pattern {
    type Searcher<'a>: Searcher<'a>;
    // lifetime can be elided:
    fn into_searcher(self, haystack: &str) -> Self::Searcher<'_>;
    fn is_contained_in(self, haystack: &str) -> bool;
    fn is_prefix_of(self, haystack: &str) -> bool;
    fn is_suffix_of<'a>(self, haystack: &'a str) -> bool
    where
        Self::Searcher<'a>: ReverseSearcher<'a>;
    fn strip_prefix_of(self, haystack: &str) -> Option<&'_ str>;
    fn strip_suffix_of<'a>(self, haystack: &'a str) -> Option<&'a str>
    where
        Self::Searcher<'a>: ReverseSearcher<'a>;
}
I think using an associated lifetime for the Searcher makes more sense, because it depends on the lifetime of the haystack, which is passed as a parameter to the function. I cannot really think of many practical benefits of this approach, except for lifetime elision and patterns without lifetimes:
pub struct SomePattern;

impl Pattern for SomePattern {
    type Searcher<'a> = SomeSearcher<'a>;
    fn into_searcher(self, haystack: &str) -> Self::Searcher<'_> {
        SomeSearcher(haystack)
    }
}

pub struct SomeSearcher<'a>(&'a str);

unsafe impl<'a> Searcher<'a> for SomeSearcher<'a> {
    fn haystack(&self) -> &'a str {
        self.0
    }
    fn next(&mut self) -> SearchStep {
        unimplemented!()
    }
}
Pattern is implemented for FnMut(char) -> bool. Is the function allowed to return different values for the same input? Specifically, say that we want to strip at most three a's from a string; then currently we may prepare a counter so that the function returns true for input a while the counter is less than 3, and false otherwise. Is that an intended use? In particular, is it guaranteed that the FnMut(char) -> bool function is called for each character in order?
@TonalidadeHidrica I wouldn't rely on that myself. To me that's not an idiomatic use case; I'd expect the function to be pure.
Why has this trait been implemented for FnMut? If the function should be pure, then wouldn't it be better to only implement it for Fn?
From the docs of FnMut
:
Use FnMut as a bound when you want to accept a parameter of function-like type and need to call it repeatedly, while allowing it to mutate state. If you don't want the parameter to mutate state, use Fn as a bound; if you don't need to call it repeatedly, use FnOnce.
This seems like something that should be documented in the docs of the Pattern
trait.
I agree that this should be documented. I guess the reason the bound is FnMut instead of Fn is that we may sometimes want something like a cache over a large database of chars, the update of which requires mutation. The issue with FnMut here is somewhat similar to: "Iterator::map accepts FnMut, so can I update an external counter during the iteration of map? No, it's discouraged."
Would it be better to ban the mutation and, where it is needed, use interior mutability (Cell, etc.)?
Not possible due to backwards compatibility. Pattern, while unstable, is exposed in some stable APIs.
What is the holdup for making this stable?
I'm a little confused by the API; the API docs for std::str::pattern::Searcher
state that:
This trait provides methods for searching for non-overlapping matches of a pattern starting from the front (left) of a string.
I see the following possible interpretations of this, and I want to be sure which is in use to prevent any ambiguity in implementations.
Greedy approach
If you're looking for the string aa
in the haystack aaaaa
, then you'll always get a sequence like the following if you require a greedy search:
Match(0,2)
Match(2,4)
Reject(4,5)
Done
Starts at the start of the string, but skips some letters because why not?
Greedy is overrated. Let's skip the first letter and match on the rest!
Reject(0,1)
Match(1,3)
Match(3,5)
Done
Getting the most matches is so overrated, how about we skip some?
The API definition doesn't require that the maximal number of matches be returned, so we could just ignore some matching sub-strings.
Reject(0,1)
Reject(1,2)
Match(2,4)
Reject(4,5)
Done
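For what it's worth, the behavior stable match_indices exposes for this example corresponds to the first, greedy left-to-right interpretation:

```rust
fn main() {
    // Non-overlapping matches of "aa" in "aaaaa", as reported by the
    // standard StrSearcher via `match_indices`: greedy, left to right.
    let hits: Vec<_> = "aaaaa".match_indices("aa").collect();
    assert_eq!(hits, [(0, "aa"), (2, "aa")]);
    // The trailing "a" at index 4 is the rejected remainder.
}
```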
Suggestions for documentation improvements.
I'd like to suggest that matching is always greedy and always maximal. Roughly the following pseudo-code (don't use this in production, it will overflow your stack, and finite state machines are faster anyways):
// `results` is empty when this function is first called.
// `index` is 0 when first called.
fn string_matcher(pattern: &str, haystack: &str, results: &mut Vec<SearchStep>, index: usize) {
    if haystack.len() < pattern.len() {
        if haystack.len() > 0 {
            results.push(SearchStep::Reject(index, index + haystack.len()));
        }
        results.push(SearchStep::Done);
    } else if pattern == &haystack[0..pattern.len()] {
        results.push(SearchStep::Match(index, index + pattern.len()));
        string_matcher(
            pattern,
            &haystack[pattern.len()..],
            results,
            index + pattern.len(),
        );
    } else {
        results.push(SearchStep::Reject(index, index + 1));
        string_matcher(pattern, &haystack[1..], results, index + 1);
    }
}
Also, what about when you want overlapping matches? I can see cases where I would want all overlapping matches in addition to what the Searcher API currently provides.
The docs could certainly be improved. I'm not sure if "greedy" or "maximal" are the right words.
Overlapping matches are a bit of a niche case, and I don't think there is a compelling reason for the standard library to support them. Overlapping searches are available in the aho-corasick crate.
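For anyone who does need overlapping matches without a crate, a repeated find with a sliding start offset is a workable (if not optimal) stable workaround. This is a sketch outside the Pattern API, whose Searcher contract is explicitly non-overlapping:

```rust
// Overlapping substring matches built from repeated `find` calls with a
// sliding start offset.
fn overlapping_matches(hay: &str, needle: &str) -> Vec<usize> {
    let mut out = Vec::new();
    let mut start = 0;
    while let Some(pos) = hay[start..].find(needle) {
        let at = start + pos;
        out.push(at);
        // Advance past only the first char of the match (not the whole
        // match), so overlapping occurrences are still found. Advancing
        // by one char keeps `start` on a UTF-8 boundary.
        let step = hay[at..].chars().next().map_or(1, |c| c.len_utf8());
        start = at + step;
    }
    out
}

fn main() {
    assert_eq!(overlapping_matches("aaaaa", "aa"), vec![0, 1, 2, 3]);
    assert_eq!(overlapping_matches("ababa", "aba"), vec![0, 2]);
}
```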
@BurntSushi the main reason for the overlapping case is because then you can say that the searcher needs to return all matches, even the overlapping ones. The user is then responsible for deciding which overlapping case is the interesting one(s). If the searcher implements the Iterator trait, then you can use filtering to get the parts you want.
I don't think that's worth doing and likely has deep performance implications.
I don't think that's worth doing
I disagree, though I do think that it should be a completely separated from the current API (different function, different trait, whatever is deemed best)
and likely has deep performance implications.
Hah! I agree 110% with you on this! And it's the reason why having it as a separate API is likely the best way to do it.
Overlapping searches are way way way too niche to put into std. If you want to convince folks otherwise, I would recommend giving more compelling reasons for why it should be in std.
The best example I can give you off the top of my head is very niche, and likely not applicable to str
.
I sometimes have to decode streams of bytes coming in from a receiver that can make errors [1] because of clock skew, mismatched oscillator frequencies, and noise in general. These can show up as bit flips, missing bits, or extra bits. Despite this, I want to know when a legitimate byte stream is starting. The normal (and fast) way is to define some kind of known pattern that signals that a frame is starting, which I'm going to call the start of frame pattern [2]. To make your own life simple, this pattern is going to be chosen to be highly unlikely to occur by accident in your environment, but it's also really, really easy to look for. One example might be just to have a stream of bits like 0101010101 as your start pattern.
Now here is where things get interesting; while you could use some form of forward error correction (FEC) code to encode the start of frame pattern, continuously decoding all incoming bits to look for the pattern is energy intensive, which means battery life goes down. What you want to do is find the probable start of a frame, and then start the computationally (and therefore power) expensive process of decoding bits only when you are pretty sure you've found the start of a frame. So, you don't bother with proper FEC of the frame pattern. Instead, you make your pattern simple, and your pattern matcher will be just as simple. If it sees a pattern that looks like it could be a start of frame, you turn on your full FEC decoder and start decoding bits until you either decide that you made a mistake, or you have a frame (checksums, etc. come later).
The issue is that the noise I mentioned earlier can show up anywhere, including at the head of the start of frame pattern. So instead of looking for the full 0101010101 start of frame pattern, you might just look for 0101 in overlapping substrings, starting a new FEC decode task as soon as you match the pattern [3]. Which is where you need the overlapping pattern search.
All of that makes good sense in a byte stream, and that is where the windows
method can be helpful. Does any of this make sense for a UTF-8 encoded string that is not subject to errors in encoding? Probably not. But, this is the best I could come up with on the spur of the moment for a practical use case.
Footnotes
[1] 'Receiver' in this case might be hardware, like a radio receiver. If the receiver is in a noisy environment, then it's constantly receiving bits, including bits that are just noise.
[2] I'm skipping so many details and algorithms that can be used under various conditions, it isn't even funny. If you know that background, just fill in the details in your head; if you don't, just ignore them. I'm just trying to give a very simplified example here.
[3] The assumption is that since the noise could be anywhere, including in the start of frame section, you might have to start correcting for errors that occurred right at the start of your actual byte string.
Yeah I totally grant that there exist use cases for overlapping search. That's not really what I'm looking for, although I appreciate you outlining your use case. What I'm trying to get at here is that they are not frequent enough to be in std. Frequency isn't our only criterion, but I can't see any other reason why std should care at all about overlapping search. If you want it to be in std, you really need to answer the question, "why can't you use a crate for it?" with a specific reason for this particular problem. (i.e., Not general complaints like "I don't want to add dependencies to my project.")
You're right, on all counts. I don't have a good enough reason for why it should be in std and not some crate, so I'm fine with it being dropped.
That said, I would like to see the documentation clarified on which non-overlapping patterns need to be returned. I'm fine with the docs stating that you can return an arbitrary set of non-overlapping matches, I just want it to be 100% clear as to what is expected of implementors.
Could Pattern have a flag that could be set on whether to use eq_ignore_ascii_case for its comparisons?
(Note: I do not know how Pattern works, so maybe it's not possible. But it would be very handy!)
@Fishrock123 It is doable, but substring search algorithms are usually not amenable to being adapted straight-forwardly to support case insensitivity. (This isn't true for all of them, but I think is likely true for Two-Way at least, which is the algorithm currently used for substring search.) So in order to support such a flag, you'd probably need to dispatch to an entirely different algorithm.
Another option is the regex crate. It's a bit of a beefy dependency for just a simple ASCII case-insensitive search, but it will do it for you. The aho-corasick crate also supports ASCII case insensitivity. While Aho-Corasick is typically used for multi-substring search, it can of course be used with just one substring. aho-corasick is a less beefy dependency than regex.
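For simple cases there is also a dependency-free workaround: normalize both sides with to_ascii_lowercase before searching. This sketch (the helper name is made up) handles only ASCII casing, not full Unicode case folding:

```rust
// ASCII case-insensitive substring search by lowercasing both sides.
// Offsets in the lowered string remain valid in the original because
// `to_ascii_lowercase` rewrites ASCII bytes in place and leaves all other
// bytes untouched (it is length-preserving).
fn find_ascii_ci(hay: &str, needle: &str) -> Option<usize> {
    hay.to_ascii_lowercase().find(&needle.to_ascii_lowercase())
}

fn main() {
    assert_eq!(find_ascii_ci("Hello World", "WORLD"), Some(6));
    assert_eq!(find_ascii_ci("Hello World", "xyz"), None);
}
```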
This issue has been open for over 7 years now; I, and likely a lot of other people, would like to use this on stable.
Would it be possible to move the unstable attributes from the root onto the API methods instead? Then at least one could export it in a stable way, as &str already does (on stable). Example (illustration only, I leave the working bits out):
use std::str::pattern::Pattern;

struct MyStr {
    /* ... */
}

impl MyStr {
    fn from(s: &str) -> Self { /* ... */ }
    fn as_str(&self) -> &str { /* ... */ }

    pub fn split_once<'a, P: Pattern<'a>>(&'a self, delimiter: P) -> Option<(MyStr, MyStr)> {
        match self.as_str().split_once(delimiter) {
            Some((a, b)) => Some((Self::from(a), Self::from(b))),
            None => None,
        }
    }
}
I agree, it is sad to see such an issue being abandoned.
Before stabilizing the API (seems like this might take some time) one should consider waiting for #44265:
use core::str::pattern::{ReverseSearcher, Searcher};
pub trait Pattern {
    type Searcher<'a>: Searcher<'a>;
    // ...
}
Is GAT in its current state suitable for this? I know it has some limitations. If it is, then I imagine it must be worth considering this API while the feature is still unstable?
Related to this, I'm writing a function that ultimately searches a &str; the obvious (to me) signature was:
fn until(pattern: impl Pattern);
But I guess it would need some lifetime generic with the current API.
I think the summary here is that this API needs rework and somebody to champion it. There was some good discussion at #71780, including a rough proposal from @withoutboats in #71780 (comment). This kinda sorta echoes @Luro02's outline in #27721 (comment) (it seems like GATs provide us with a more ergonomic solution in any case).
Another thing to keep in mind is that slice patterns were removed (#76901 (comment)), but we may want some way to work with &[u8] byte strings. It is a bit of a pain point that many std APIs require a UTF-8 &str when it isn't always needed, meaning there is a runtime cost for str::from_utf8 to do things without unsafe when you have mostly-but-maybe-not-completely-UTF-8 sequences (e.g., the OsStr/Read interaction).
So the next steps forward, probably:
- Somebody puts forth a design proposal. I don't think this needs to be a RFC since the concept was already accepted, but it has been so long that I think we just need a from-scratch design with justification and documented limitations. An ACP is probably a good place for this (acps are just an issue template at https://github.com/rust-lang/libs-team, link it here if you post one)
- Implement that proposal
- Revisit stabilization after it has been around for a while
It is unfortunate that we more or less have to go back to square one with stabilization, but there have been a lot of lessons learned and better ways to do things since the 2014 RFC (a decade!). Really this is probably just in need of somebody to take charge of the redesign and push everything forward.
All I'd really asked for above is to stabilize the existence of the Pattern API; that would already address a lot of problems by removing unstable bits from the stable Rust stdlib API.
If the API/implementation behind it needs more work, that's OK. But honestly, after that many years and with many people relying on patterns, overly big changes would be quite surprising.
All I'd really asked for above is to stabilize the existence of the Pattern API, that would already address a lot of problems removing unstable bits from the stable Rust stdlib API.
That would of course be nice, but we don't want to do that until we know for sure that we won't need to change the generics from what there currently is (a single lifetime). Probably unlikely, but there's no way of knowing without a concrete proposal.
When the API/implementation behind needs more work, that's Ok. But honestly after that much years and many people relying on patterns, overly big changes would be quite surprising.
I think it's the opposite: all the discussion here, the very long time with no stabilization, and the proposed replacements I linked in #27721 (comment) seem to indicate that nobody is happy enough with this API as-is. This feature needs a champion who is willing to experiment and push things along.
Do we have some news about this feature?
Do we have some news about this feature?
Please stop polluting the issue tracker. If there is an update, someone will link to this issue.
I realize there are a lot of issues with stabilizing the pattern feature itself, but if I may suggest a possible middle ground: would it be possible to detect particular search patterns in user code and optimize (a subset of) those (possibly via some specific hint, perhaps only when using iterators instead of a for loop) to internally use some version of TwoWaySearcher (or anything, really)?
There are no guarantees for compiler optimizations, there's no specific API we would have to stabilize, and we don't need to handle all the cases (not all at once, nor even eventually), but it could give a decent performance boost to certain code and be an asset both in the short term (until a stable pattern-matching API becomes available) and in the long term (code not using the pattern-matching API could still benefit).
Not sure how doable this is in the technical sense, but at least on paper it might be worth considering?