UKPLab/sentence-transformers

SentenceTransformer._first_module() and issues created with bespoke architectures

slobstone opened this issue · 0 comments

At the moment, `_first_module()` does exactly what it says on the tin -- it returns the first module in the modules list. However, all the calls to this method (e.g. here, here, here) carry an implicit assumption that the first module is a transformer, when there is no strict need for it to be.
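For context, the method itself is a one-liner, and callers reach through it directly -- roughly the following (a sketch of the current source, shown with `tokenize` as one example caller; exact code may differ between versions):

```python
def _first_module(self):
    """Returns the first module of this sequential embedder"""
    return self._modules[next(iter(self._modules))]

def tokenize(self, texts):
    # implicitly assumes the first module exposes tokenize()
    return self._first_module().tokenize(texts)
```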

I've come across (especially with the recent rise in custom modules) instances where SentenceTransformer can be [initialised/trained/used for inference] with module stacks that are more bespoke, though still containing a transformer at the heart of all the functionality (a sketch of such a stack follows the list below). I suppose my question is whether there is any possibility of supporting such cases by, e.g.

1. defining a new method that is more explicit about this, e.g.

```python
import logging

import torch

logger = logging.getLogger(__name__)

def _transformer_module(self) -> torch.nn.Module:
    if hasattr(self._first_module(), "auto_model"):  # better criteria possibly available
        return self._first_module()
    logger.warning("First module has no auto_model; searching the full module stack.")
    try:
        return [x for x in self.modules() if hasattr(x, "auto_model")][0]
    except IndexError:
        # suitable handling, e.g. fail loudly:
        raise ValueError("No transformer module found in this SentenceTransformer")
```
2. changing the functions that implicitly depend on a transformer (examples given previously) to reference this new method (a sketch of that change is at the end).
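To make the first point concrete, here is a minimal sketch of such a bespoke stack. `FeatureLogger` is a made-up module purely for illustration; the point is only that the first module in the stack is not the transformer:

```python
import torch.nn as nn
from sentence_transformers import SentenceTransformer, models

class FeatureLogger(nn.Module):
    """Hypothetical pass-through module, e.g. for inspecting features while debugging."""
    def forward(self, features):
        print({k: tuple(v.shape) for k, v in features.items() if hasattr(v, "shape")})
        return features

transformer = models.Transformer("bert-base-uncased")
pooling = models.Pooling(transformer.get_word_embedding_dimension())
model = SentenceTransformer(modules=[FeatureLogger(), transformer, pooling])

model._first_module()          # FeatureLogger, not models.Transformer
model.tokenize(["some text"])  # AttributeError -- FeatureLogger has no tokenize()
```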
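For the second point, each call-site change would then be mechanical; sketched here for `tokenize`, assuming the `_transformer_module` method proposed above:

```python
def tokenize(self, texts):
    # was: return self._first_module().tokenize(texts)
    return self._transformer_module().tokenize(texts)
```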