TEST: Text Prototype Aligned Embedding to Activate LLM's Ability for Time Series

This work aims to activate the LLM's ability to handle time-series (TS) data by designing a TS embedding method suited to LLMs, named TEST, and shows that a pre-trained LLM using the TEST strategy can achieve performance better than or comparable to today's SOTA TS models, while offering benefits for few-shot learning and generalization.

Published: Wed, Aug 16, 2023
Citations: 22
Authors: Chenxi Sun, Yaliang Li, and others
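
The TL;DR above describes embedding time series so that a frozen, pre-trained LLM can consume them, by aligning TS embeddings with "text prototypes" drawn from the LLM's own token-embedding space. Below is a minimal, illustrative sketch of that idea, not the authors' implementation: it assumes the series is split into fixed-length patches, that prototypes are a small subset of a frozen LLM's input embedding matrix, and that alignment is a simple cosine-similarity objective. All names (TSEncoder, align_loss, the patch sizes) are hypothetical.

import torch
import torch.nn as nn
import torch.nn.functional as F


class TSEncoder(nn.Module):
    """Encode time-series patches into the LLM's embedding dimension (illustrative)."""

    def __init__(self, patch_len: int, d_model: int):
        super().__init__()
        self.proj = nn.Sequential(
            nn.Linear(patch_len, d_model),
            nn.GELU(),
            nn.Linear(d_model, d_model),
        )

    def forward(self, patches: torch.Tensor) -> torch.Tensor:
        # patches: (batch, num_patches, patch_len)
        return self.proj(patches)  # (batch, num_patches, d_model)


def align_loss(ts_emb: torch.Tensor, prototypes: torch.Tensor) -> torch.Tensor:
    """Pull each patch embedding toward its most similar text prototype.

    ts_emb:     (batch, num_patches, d_model), output of the TS encoder
    prototypes: (num_prototypes, d_model), e.g. a frozen subset of the
                LLM's token-embedding matrix
    """
    ts = F.normalize(ts_emb, dim=-1)
    proto = F.normalize(prototypes, dim=-1)
    sim = ts @ proto.T  # cosine similarity, (batch, num_patches, num_prototypes)
    # Encourage every patch embedding to sit close to at least one prototype,
    # so the encoded series lives in (a region of) the LLM's input space.
    return (1.0 - sim.max(dim=-1).values).mean()


if __name__ == "__main__":
    batch, num_patches, patch_len, d_model = 8, 16, 24, 768
    encoder = TSEncoder(patch_len, d_model)
    # Stand-in for frozen LLM token embeddings (in practice, e.g. GPT-2's wte weights).
    prototypes = torch.randn(100, d_model)

    patches = torch.randn(batch, num_patches, patch_len)
    loss = align_loss(encoder(patches), prototypes)
    loss.backward()  # only the TS encoder receives gradients; the LLM stays frozen
    print(float(loss))

Once trained with such an alignment objective (the paper additionally uses contrastive losses and soft prompts), the encoder's outputs can be fed to the frozen LLM in place of ordinary token embeddings for downstream TS tasks.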