Time series forecasting holds significant importance in many real-world dynamic systems and has been extensively studied. Unlike natural language processing (NLP) and computer vision (CV), where a single large model can tackle multiple tasks, models for time series forecasting are often specialized, necessitating distinct designs for different tasks and applications. While pre-trained foundation models have made impressive strides in NLP and CV, their development in time series domains has been constrained by data sparsity. Recent studies have revealed that large language models (LLMs) possess robust pattern recognition and reasoning abilities over complex sequences of tokens. However, the challenge remains in effectively aligning the modalities of time series data and natural language to leverage these capabilities. In this work, we present Time-LLM, a reprogramming framework to repurpose LLMs for general time series forecasting with the backbone language models kept intact. We begin by reprogramming the input time series with text prototypes before feeding it into the frozen LLM to align the two modalities. To augment the LLM's ability to reason with time series data, we propose Prompt-as-Prefix (PaP), which enriches the input context and directs the transformation of reprogrammed input patches. The transformed time series patches from the LLM are finally projected to obtain the forecasts. Our comprehensive evaluations demonstrate that Time-LLM is a powerful time series learner that outperforms state-of-the-art specialized forecasting models. Moreover, Time-LLM excels in both few-shot and zero-shot learning scenarios.
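
The abstract describes a three-stage pipeline: time series patches are reprogrammed against learned text prototypes, a Prompt-as-Prefix context is prepended, and the frozen LLM's outputs are projected into forecasts. Below is a minimal structural sketch of how those stages could fit together in PyTorch with a frozen GPT-2 backbone. The module and parameter names (PatchReprogramming, TimeLLMSketch, n_prototypes, and so on) are illustrative assumptions, not the paper's released implementation.

```python
# Sketch of the Time-LLM pipeline described in the abstract.
# Hypothetical names and hyperparameters; not the authors' code.
import torch
import torch.nn as nn
from transformers import GPT2Model

class PatchReprogramming(nn.Module):
    """Map time series patches into the LLM's embedding space by
    cross-attending over a small set of learned text prototypes."""
    def __init__(self, patch_len: int, d_model: int, n_prototypes: int = 100):
        super().__init__()
        self.patch_proj = nn.Linear(patch_len, d_model)  # embed raw patches
        self.prototypes = nn.Parameter(torch.randn(n_prototypes, d_model))
        self.cross_attn = nn.MultiheadAttention(d_model, num_heads=8,
                                                batch_first=True)

    def forward(self, patches: torch.Tensor) -> torch.Tensor:
        # patches: (batch, n_patches, patch_len)
        q = self.patch_proj(patches)                     # queries from patches
        kv = self.prototypes.unsqueeze(0).expand(q.size(0), -1, -1)
        out, _ = self.cross_attn(q, kv, kv)              # align to text space
        return out

class TimeLLMSketch(nn.Module):
    def __init__(self, patch_len: int = 16, n_patches: int = 64,
                 horizon: int = 96):
        super().__init__()
        self.llm = GPT2Model.from_pretrained("gpt2")
        for p in self.llm.parameters():                  # backbone kept intact
            p.requires_grad = False
        d = self.llm.config.n_embd
        self.reprogram = PatchReprogramming(patch_len, d)
        self.head = nn.Linear(n_patches * d, horizon)    # project to forecasts

    def forward(self, patches: torch.Tensor,
                prompt_embeds: torch.Tensor) -> torch.Tensor:
        # prompt_embeds: embedded Prompt-as-Prefix text (e.g. domain
        # description, task instructions, input statistics),
        # shape (batch, prompt_len, d)
        x = self.reprogram(patches)
        x = torch.cat([prompt_embeds, x], dim=1)         # PaP: prompt as prefix
        h = self.llm(inputs_embeds=x).last_hidden_state
        h = h[:, prompt_embeds.size(1):, :]              # keep patch positions
        return self.head(h.flatten(1))                   # (batch, horizon)
```

In this sketch, `prompt_embeds` could be produced by tokenizing a textual description of the dataset and passing the token ids through the backbone's embedding table (`model.llm.wte`). The paper's actual prompt template, prototype construction, and patching details differ, so treat this only as an outline of the frozen-backbone reprogramming idea.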
