Found 166,516 repositories (showing 30)
Dao-AILab
Fast and memory-efficient exact attention
xmu-xiaoma666
PyTorch implementation of various Attention Mechanisms, MLP, Re-parameter, Convolution, which is helpful for further understanding papers.
jadore801120
A PyTorch implementation of the Transformer model in "Attention is All You Need".
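The scaled dot-product attention from "Attention is All You Need", which most of the repositories in this list implement or accelerate, can be sketched in a few lines of NumPy (an illustrative sketch, not code from any listed repository):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                      # (n_q, n_k) similarity scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)       # row-wise softmax
    return weights @ V                                   # weighted sum of value vectors

# Tiny example: 2 queries attending over 3 key/value pairs of dimension 4
rng = np.random.default_rng(0)
Q = rng.standard_normal((2, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4): one output vector per query
```

Projects like flash-attention keep this computation mathematically exact while tiling it to avoid materializing the full `(n_q, n_k)` weight matrix in memory.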
cmhungsteve
An ultimately comprehensive paper list of Vision Transformer/Attention, including papers, codes, and related websites
fla-org
Efficient implementations for emerging model architectures
Attendize
Attendize is an open-source ticket selling and event management platform built on Laravel.
thu-ml
[ICLR2025, ICML2025, NeurIPS2025 Spotlight] Quantized Attention achieves speedup of 2-5x compared to FlashAttention, without losing end-to-end metrics across language, image, and video models.
MoonshotAI
No description available
MenghaoGuo
Summary of related papers on visual attention. Related code will be released based on Jittor gradually.
philipperemy
Keras Attention Layer (Luong and Bahdanau scores).
heykeetae
Pytorch implementation of Self-Attention Generative Adversarial Networks (SAGAN)
Jongchan
Official PyTorch code for "BAM: Bottleneck Attention Module (BMVC2018)" and "CBAM: Convolutional Block Attention Module (ECCV2018)"
ozan-oktay
Use of Attention Gates in a Convolutional Neural Network / Medical Image Classification and Segmentation
fossasia
Open Event Attendee Android General App https://github.com/fossasia/open-event-android/blob/apk/open-event-dev-app-playStore-debug.apk
openai
Examples of using sparse attention, as in "Generating Long Sequences with Sparse Transformers"
fossasia
iOS app for open event
peteanderson80
Bottom-up attention model for image captioning and VQA, based on Faster R-CNN and Visual Genome
szagoruyko
Improving Convolutional Networks via Attention Transfer (ICLR 2017)
bojone
Some attention implementations
wouterkool
Attention based model for learning to solve different routing problems
bloc97
Unofficial implementation of "Prompt-to-Prompt Image Editing with Cross Attention Control" with Stable Diffusion
Awesome List of Attention Modules and Plug&Play Modules in Computer Vision
mjun0812
Provides pre-built flash-attention package wheels for Linux and Windows, built with GitHub Actions
The-AI-Summer
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Neighborhood Attention Transformer, arXiv 2022 / CVPR 2023. Dilated Neighborhood Attention Transformer, arXiv 2022
meta-pytorch
Helpful tools and examples for working with flex-attention
xiangwang1223
KGAT: Knowledge Graph Attention Network for Recommendation, KDD2019
houqb
Code for our CVPR2021 paper on coordinate attention
epfml
Source code for "On the Relationship between Self-Attention and Convolutional Layers"
da03
Visual Attention based OCR