LiuFang816/xl_mt_code_completion

A Self-Attentional Neural Architecture for Code Completion with MTL

Prerequisites

  • Python 2.7
  • Tensorflow 1.12.0

Data

Python and JavaScript datasets: http://plml.ethz.ch

Java dataset: https://drive.google.com/open?id=1xxnYAu8L5i6TpNpMNDWxSNsOs3XYxS6T

Each program is represented by its AST, and the AST is serialized via an in-order depth-first traversal to produce the AST node sequence.

The data preprocessing code is in the "preprocess_code" directory; a small sketch of the flattening step is shown below.
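
The sketch below illustrates the flattening step, assuming the public py150/js150 JSON format (one AST per line, each node an object with a "type", an optional "value", and "children" given as node indices) and a plain depth-first, node-before-children walk. It is illustrative only; the scripts in "preprocess_code" are the authoritative implementation.

import json

def flatten_ast(nodes, root=0):
    """Depth-first traversal yielding a (type, value) pair per AST node."""
    seq = []
    stack = [root]
    while stack:
        node = nodes[stack.pop()]
        seq.append((node["type"], node.get("value", "EMPTY")))
        # Push children in reverse so the leftmost child is visited first.
        for child in reversed(node.get("children", [])):
            stack.append(child)
    return seq

with open("python100k_train.json") as f:  # one serialized AST per line
    for line in f:
        node_seq = flatten_ast(json.loads(line))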

Train models

python train_gpu_mt.py --alpha ${weight for type prediction loss} --mem_len ${memory length} --model_dir ${path_to_save} 
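
For example, to train with a type-prediction loss weight of 0.5 and a memory length of 256 (these values are illustrative, not the reported settings):

python train_gpu_mt.py --alpha 0.5 --mem_len 256 --model_dir ./models/python_mtl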
