There has been a recent surge of interest in automating software engineering
tasks using deep learning. This paper addresses the problem of code generation
where the goal is to generate target code given source code in a different
language or a natural language description. Most of the state-of-the-art deep
learning models for code generation use training strategies primarily designed
for natural language. However, understanding and generating code requires a
more rigorous comprehension of the code syntax and semantics. With this
motivation, we develop an encoder-decoder Transformer model where both the
encoder and decoder are explicitly trained to recognize the syntax and data
flow in the source and target codes, respectively. We not only make the encoder
structure-aware by leveraging the source code's syntax tree and data flow
graph, but we also support the decoder in preserving the syntax and data flow
of the target code by introducing two novel auxiliary tasks: AST (Abstract
Syntax Tree) paths prediction and data flow prediction. To the best of our
knowledge, this is the first work to introduce a structure-aware Transformer
decoder that models both syntax and data flow to enhance the quality of
generated code. The proposed StructCoder model achieves state-of-the-art
performance on code translation and text-to-code generation tasks in the
CodeXGLUE benchmark, and improves over baselines of similar size on the APPS
code generation benchmark. Our code is publicly available at
https://github.com/reddy-lab-code-research/StructCoder/.
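
As a concrete illustration of the two auxiliary decoder tasks described above, the following is a minimal PyTorch-style sketch of training an AST node-type head and a pairwise data-flow head alongside the usual next-token loss. The names and weights here (StructAwareHeads, num_ast_node_types, alpha, beta) are illustrative assumptions, not the authors' implementation; the actual StructCoder code is in the linked repository.

import torch
import torch.nn as nn

class StructAwareHeads(nn.Module):
    # Sketch of decoder output heads: the usual LM head plus two
    # structure-aware auxiliary heads in the spirit of StructCoder.
    def __init__(self, hidden_size: int, vocab_size: int,
                 num_ast_node_types: int):
        super().__init__()
        self.lm_head = nn.Linear(hidden_size, vocab_size)           # next token
        self.ast_head = nn.Linear(hidden_size, num_ast_node_types)  # aux task 1
        self.df_query = nn.Linear(hidden_size, hidden_size)         # aux task 2
        self.df_key = nn.Linear(hidden_size, hidden_size)

    def forward(self, hidden: torch.Tensor):
        # hidden: (batch, tgt_len, hidden_size) decoder states
        lm_logits = self.lm_head(hidden)
        ast_logits = self.ast_head(hidden)
        # Score every ordered token pair (i, j) for a data-flow edge i -> j.
        q, k = self.df_query(hidden), self.df_key(hidden)
        df_logits = q @ k.transpose(1, 2) / hidden.size(-1) ** 0.5
        return lm_logits, ast_logits, df_logits

def total_loss(lm_logits, ast_logits, df_logits,
               tokens, ast_types, df_edges, alpha=0.1, beta=0.1):
    # Weighted sum of the main generation loss and the two auxiliary
    # losses; alpha and beta are illustrative weights, not paper values.
    ce, bce = nn.CrossEntropyLoss(), nn.BCEWithLogitsLoss()
    loss = ce(lm_logits.flatten(0, 1), tokens.flatten())
    loss = loss + alpha * ce(ast_logits.flatten(0, 1), ast_types.flatten())
    loss = loss + beta * bce(df_logits, df_edges.float())
    return loss

Note that this sketch collapses the AST task to a single node-type label per target token, whereas the abstract's AST paths task predicts the nodes along each token's root-to-leaf path; the overall training pattern of adding weighted auxiliary terms to the generation loss is the same.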
