# Attention Accelerator

This repository collects hardware implementations of transformer attention-based accelerators, reproducing designs from published papers and projects, written in SpinalHDL.

## References

[1] Hardware Accelerator for Multi-Head Attention and Position-Wise Feed-Forward in the Transformer
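For orientation, the core operation such accelerators implement is scaled dot-product attention, softmax(QKᵀ/√d)·V. The sketch below is a plain-Python software reference of that computation (not taken from this repo or from [1]); it is only meant to define the function the hardware datapath computes.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V.

    Q, K, V are lists of row vectors; d is the key/query width.
    """
    d = len(K[0])
    # Score matrix: dot product of each query with each key, scaled by sqrt(d).
    scores = [[sum(q[i] * k[i] for i in range(d)) / math.sqrt(d) for k in K]
              for q in Q]
    # Row-wise softmax turns scores into attention weights.
    weights = [softmax(row) for row in scores]
    # Each output row is the weight-blended combination of the value rows.
    return [[sum(w[j] * V[j][i] for j in range(len(V))) for i in range(len(V[0]))]
            for w in weights]
```

A hardware accelerator typically pipelines the two matrix multiplications and the softmax, and multi-head attention runs several such units in parallel over sliced Q/K/V.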