Figure 4.
Overall framework. In the encoder, besides the relation-aware self-attention layer, we add a pruned self-attention layer that discards irrelevant elements from the sequence. The relation selection mechanism is used in both layer types. The decoder follows the same implementation as RAT-SQL [5].
