[Question] Does dropout layer after word embedding improve neural machine translation? #12

@SkyAndCloud

Description

Hi, I saw that you add a dropout layer after the word embedding, which is not mentioned in the RNNsearch paper "Neural Machine Translation by Jointly Learning to Align and Translate". Does this trick improve performance? Is it also implemented in the vanilla Theano version, GroundHog?
Thanks!
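To be concrete, by "dropout after word embedding" I mean looking up the embedding vectors and then randomly zeroing units during training. A minimal NumPy sketch of that idea (function and variable names are hypothetical, not from this repo; this uses inverted dropout so no rescaling is needed at test time):

```python
import numpy as np

rng = np.random.default_rng(0)

def embed_with_dropout(token_ids, emb_table, p=0.5, training=True):
    # Look up embedding vectors for the token ids.
    vecs = emb_table[token_ids]
    if training and p > 0:
        # Inverted dropout: zero each unit with probability p,
        # scale survivors by 1/(1-p) so the expected value is unchanged.
        mask = (rng.random(vecs.shape) >= p).astype(vecs.dtype)
        vecs = vecs * mask / (1.0 - p)
    return vecs

emb_table = rng.standard_normal((1000, 8))
ids = np.array([1, 2, 3])

train_out = embed_with_dropout(ids, emb_table, p=0.5, training=True)
eval_out = embed_with_dropout(ids, emb_table, p=0.5, training=False)
print(train_out.shape)  # (3, 8); eval_out equals the plain lookup
```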
