National Science Foundation

Cyber-Physical Systems Virtual Organization

Read-only archive of site from September 29, 2023.
A Hierarchy-to-Sequence Attentional Neural Machine Translation Model

Submitted by grigby1 on Mon, 10/05/2020 - 2:01pm
  • semantic compositionality modeling
  • neural nets
  • optimal model parameters
  • parameter learning
  • pubcrawl
  • recurrent neural nets
  • Recurrent neural networks
  • segmented clause sequence
  • segmented clauses
  • neural machine translation
  • Semantics
  • sequence-to-sequence attentional neural machine translation
  • short clauses
  • Speech
  • speech processing
  • text analysis
  • Training
  • translation prediction
  • hierarchical neural network structure
  • Chinese-English translation
  • clause level
  • Compositionality
  • Context modeling
  • conventional NMT model
  • Decoding
  • English-German translation
  • grammars
  • attention models
  • Hierarchy-to-sequence
  • hierarchy-to-sequence attentional neural machine translation model
  • hierarchy-to-sequence attentional NMT model
  • language translation
  • learning (artificial intelligence)
  • long parallel sentences
  • natural language processing
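The tags above describe the paper's core idea: segment long parallel sentences into short clauses, then apply attention at two levels — within each clause and across the resulting clause sequence — rather than attending over one flat word sequence as in a conventional sequence-to-sequence NMT model. As a rough illustration only (not the paper's actual architecture), the following toy NumPy sketch computes a decoder context vector via word-level dot-product attention inside each clause followed by clause-level attention over the clause summaries; all function names, dimensions, and the use of plain dot-product scoring are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(query, keys):
    """Dot-product attention: weight the key vectors by similarity to the query."""
    weights = softmax(keys @ query)          # (n,) attention distribution
    return weights @ keys, weights           # weighted sum -> (d,)

def hierarchical_context(decoder_state, clauses):
    """Two-level (hierarchy-to-sequence style) attention sketch:
    first summarize each clause with word-level attention,
    then attend over the clause summaries."""
    clause_vecs = np.stack([attend(decoder_state, c)[0] for c in clauses])
    context, clause_weights = attend(decoder_state, clause_vecs)
    return context, clause_weights

# Toy example: a hypothetical sentence segmented into three clauses
# of 5, 3, and 6 word vectors, each of dimension 8.
rng = np.random.default_rng(0)
d = 8
clauses = [rng.standard_normal((n, d)) for n in (5, 3, 6)]
state = rng.standard_normal(d)               # stand-in decoder hidden state
context, w = hierarchical_context(state, clauses)
print(context.shape, w.shape)                # (8,) (3,)
```

In a real model the scores would come from learned parameters and the clause summaries from recurrent encoders; this sketch only shows why clause-level attention shortens the spans each attention step must cover for long sentences.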