
When a human programmer implements a complicated algorithm, the work usually starts with outlining a rough control flow, followed by iterative enrichment, eventually yielding carefully crafted syntactic structures and variables in a hierarchy. However, state-of-the-art large language models generate code in a single pass, without intermediate warm-ups that reflect the structured thought process of "outline-then-detail". Inspired by the recent success of chain-of-thought prompting, we propose ChainCoder, a program synthesis language model that generates Python code progressively, i.e., from coarse to fine in multiple passes. We first decompose source code into layout frame components and accessory components via abstract syntax tree parsing, constructing a hierarchical representation.
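
To make the decomposition concrete, here is a minimal sketch of such an AST-based split, using Python's standard ast module. It is our own illustration, not ChainCoder's actual tokenizer: the example function, the FrameExtractor class, and the convention of eliding leaf expressions into "..." placeholders are all assumptions.

import ast

SOURCE = '''
def count_positives(nums):
    total = 0
    for x in nums:
        if x > 0:
            total += 1
    return total
'''

class FrameExtractor(ast.NodeTransformer):
    """Keep the control-flow structure as the coarse 'layout frame';
    record elided leaf expressions as fine-grained 'accessories'."""

    def __init__(self):
        self.accessories = []

    def _elide(self, node):
        # Record the concrete expression, return a placeholder in its place.
        self.accessories.append(ast.unparse(node))
        return ast.Constant(value=Ellipsis)

    def visit_Assign(self, node):
        node.value = self._elide(node.value)
        return node

    def visit_AugAssign(self, node):
        node.value = self._elide(node.value)
        return node

    def visit_Return(self, node):
        if node.value is not None:
            node.value = self._elide(node.value)
        return node

tree = ast.parse(SOURCE)
extractor = FrameExtractor()
frame = ast.fix_missing_locations(extractor.visit(tree))

print(ast.unparse(frame))     # coarse frame: headers kept, leaves become '...'
print(extractor.accessories)  # accessory components: ['0', '1', 'total']

Keeping the for/if headers in the frame while pushing leaf expressions into a separate accessory stream mirrors the coarse-to-fine split described above.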
We then reformulate the prediction target as a multi-pass objective: each pass generates a subsequence, and the subsequences are concatenated according to the hierarchy.
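
The multi-pass target can be pictured with a toy serialization. Everything below, including the layer contents and the multi_pass_targets helper, is a hypothetical sketch of how the target for pass k could be the concatenation of subsequences from passes 1 through k; it is not the released training code.

# Toy illustration of a coarse-to-fine, multi-pass prediction target.
# Each "pass" appends the next-finer layer of the hierarchy, so the model
# warms up on the outline before committing to the details.

hierarchy = [
    # pass 1: coarsest layout frame
    ["def count_positives(nums):", "    <body>"],
    # pass 2: control-flow skeleton filling <body>
    ["    total = ...", "    for x in nums:", "        <loop>", "    return ..."],
    # pass 3: accessory details
    ["0", "if x > 0: total += 1", "total"],
]

def multi_pass_targets(hierarchy):
    """Target for pass k = concatenation of subsequences from passes 1..k."""
    prefix = []
    targets = []
    for layer in hierarchy:
        prefix = prefix + layer
        targets.append(list(prefix))
    return targets

for k, target in enumerate(multi_pass_targets(hierarchy), start=1):
    print(f"pass {k}: {len(target)} subsequences")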
Finally, a tailored transformer architecture is leveraged to jointly encode the natural language description and the syntactically aligned I/O data samples.
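
One plausible reading of this joint encoding is a single encoder input that concatenates the problem description with serialized I/O pairs. The sketch below assumes invented sentinel tokens (<nl>, <io>, <pair>) and a hypothetical build_encoder_input helper; the released architecture's actual scheme may differ.

# Hypothetical input construction for jointly encoding the natural language
# description with I/O example pairs; sentinels and serialization are
# illustrative assumptions, not the released model's scheme.

NL_SEP, IO_SEP, PAIR_SEP = "<nl>", "<io>", "<pair>"

def build_encoder_input(description, io_pairs):
    """Serialize the description plus I/O pairs into one token stream."""
    parts = [NL_SEP, description, IO_SEP]
    for inp, out in io_pairs:
        parts += [repr(inp), "->", repr(out), PAIR_SEP]
    return " ".join(parts)

example = build_encoder_input(
    "Count how many numbers in the list are positive.",
    [([1, -2, 3], 2), ([], 0)],
)
print(example)
# <nl> Count how many numbers in the list are positive. <io> [1, -2, 3] -> 2 <pair> [] -> 0 <pair>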
Extensive evaluations show that ChainCoder outperforms state-of-the-art methods, demonstrating that our progressive generation eases the reasoning procedure and guides the language model to generate higher-quality solutions. Our code is available at: https://github.com/VITA-Group/ChainCoder.
