Found 21 repositories (showing 21)
TsinghuaC3I
[ICML 2025] Fourier Position Embedding: Enhancing Attention’s Periodic Extension for Length Generalization
radrumond
Learning complex time series forecasting models usually requires a large amount of data, as each model is trained from scratch for each task/dataset. Leveraging learning experience with similar datasets is a well-established technique for classification problems, called few-shot classification. However, existing approaches cannot be applied to time-series forecasting because i) multivariate time-series datasets have different channels and ii) forecasting is principally different from classification. In this paper, we formalize the problem of few-shot forecasting of time series with heterogeneous channels for the first time. Extending recent work on heterogeneous attributes in vector data, we develop a model composed of permutation-invariant deep set-blocks which incorporate a temporal embedding. We assemble the first meta-dataset of 40 multivariate time-series datasets and show through experiments that our model provides good generalization, outperforming baselines carried over from simpler scenarios that either fail to learn across tasks or miss temporal information.
oshapio
Official code for the paper "Compositional Generalization Requires Linear, Orthogonal Representations in Vision Embedding Models"
lugiavn
Generalization in Metric Learning: Should the Embedding Layer be the Embedding Layer?
Rlag1998
Exploring the Generalization Performance of Quantum Metric Learning Classifiers
jasl1
A self-supervised sentence embedding framework that improves both generalization and robustness on standard benchmarks
Code for magnification generalization in histopathology image embedding
kjw11
Robust MAML for domain generalization in speaker embedding.
ntunlplab
With the aid of recently proposed word embedding algorithms, the study of semantic similarity has progressed and advanced rapidly. However, many natural language processing tasks need sense-level representation. To address this issue, some studies propose sense embedding learning algorithms. In this paper, we present a generalized model derived from an existing sense retrofitting model. The generalization incorporates three major components: the semantic relations between senses, the relation strengths, and the semantic strengths. In our experiments, we show that the generalized model outperforms previous approaches on three types of tasks: semantic relatedness, contextual word similarity, and semantic difference.
Niranjan6752
Empirical study on how embedding dimension affects NLP model accuracy, generalization, and training efficiency.
bqzhendehenqiang
This is the official code implementation of our paper titled "Nonpolarized Embedding Learning in Multimodal Domain Generalization".
Implementation and results of the paper 'CNN-Based Geometric Feature Embedding Using Coordinates for Cartographic Generalization Tasks on Building Footprints'
RyanLin0727
• Trained a Quantum Embedding Kernel for prediction • Increased classification accuracy on data instances from 0.58 to 0.83 • Made the program robust enough to avoid poor generalization
Designed and trained a self-supervised image encoder on a subset of ImageNet data using contrastive learning; evaluated generalization on in-distribution and OOD sets, and analyzed learned representations through embedding visualizations.
kareem-gameel
GemNet-based delta-learning for molecular energy prediction, covering random split, OOD generalization, cross-domain transfer, and data-efficiency experiments on QM9/tmQM benchmarks. Includes reproducible scripts, embedding analysis, and figure notebooks for the paper.
ardavano
Benchmarking MLIPs for cleavage energy prediction using a DFT-calculated dataset. Fine-tuning and transfer learning adapt pre-trained models, embedding physics principles like symmetry invariance and energy extensivity. Demonstrates how scientific principles enhance ML accuracy, generalization, and interpretability.
mamoon-17
A research project comparing Classical ML models, Hybrid LLM-embedding methods, and fine-tuned Transformers for emotion classification. The study evaluates performance on two datasets, song lyrics and GoEmotions, to analyze how model choice and dataset quality impact accuracy and generalization.
amiTanmayNath
Processed data using tokenization, stop-word removal, stemming, lemmatization, and word embedding techniques • Deployed RNN, LSTM, and GRU models using binary cross-entropy loss and the SGD algorithm, achieving 91% accuracy with LSTM • Curbed overfitting in sentiment models via dropout, gradient clipping, and L1/L2 regularization, ensuring robust generalization
riyaaggarrwal
Processed data using tokenization, stop-word removal, stemming, lemmatization, and word embedding techniques • Deployed RNN, LSTM, and GRU models using binary cross-entropy loss and the SGD algorithm, achieving 91% accuracy with LSTM • Curbed overfitting in sentiment models via dropout, gradient clipping, and L1/L2 regularization, ensuring robust generalization
2021sshah
This repository explores strategies to improve semantic structure in CLIP’s shared embedding space. The work investigates supervised contrastive and sliced Wasserstein loss functions to refine representation quality without compromising zero-shot generalization. Quantitative and qualitative evaluations are conducted on the Hateful Memes dataset.
dharmendrakumar21
Built an LSTM-based deep learning model to classify news articles as real or fake using textual analysis. Utilized embedding layers, stacked LSTM units, and dense layers for accurate binary classification. Achieved 97.9% accuracy and demonstrated strong generalization across training and validation data.