---
license: mit
task_categories:
- text-classification
tags:
- AI
- ICLR
- ICLR2023
---
## ICLR 2023 (International Conference on Learning Representations) Accepted Paper Meta Info Dataset

This dataset is collected from the ICLR 2023 OpenReview website (https://openreview.net/group?id=ICLR.cc/2023/Conference#tab-accept-oral) as well as the DeepNLP paper arxiv page (http://www.deepnlp.org/content/paper/iclr2023). If you are interested in analyzing ICLR 2023 accepted papers and potential research trends, you can use the already cleaned-up JSON files; a short loading sketch is given after the schema example below.

Each row contains the meta information of one paper accepted to ICLR 2023. To explore more AI & robotics papers (NIPS/ICML/ICLR/IROS/ICRA/etc.) and AI equations, feel free to use the Equation Search Engine (http://www.deepnlp.org/search/equation) as well as the AI Agent Search Engine (http://www.deepnlp.org/search/agent) to find deployed AI apps and agents in your domain.

### Meta Information of the JSON File
```
{
    "title": "Encoding Recurrence into Transformers",
    "url": "https://openreview.net/forum?id=7YfHla7IxBJ",
    "detail_url": "https://openreview.net/forum?id=7YfHla7IxBJ",
    "authors": "Feiqing Huang,Kexin Lu,Yuxi CAI,Zhen Qin,Yanwen Fang,Guangjian Tian,Guodong Li",
    "tags": "ICLR 2023,Top 5%",
    "abstract": "This paper novelly breaks down with ignorable loss an RNN layer into a sequence of simple RNNs, each of which can be further rewritten into a lightweight positional encoding matrix of a self-attention, named the Recurrence Encoding Matrix (REM). Thus, recurrent dynamics introduced by the RNN layer can be encapsulated into the positional encodings of a multihead self-attention, and this makes it possible to seamlessly incorporate these recurrent dynamics into a Transformer, leading to a new module, Self-Attention with Recurrence (RSA). The proposed module can leverage the recurrent inductive bias of REMs to achieve a better sample efficiency than its corresponding baseline Transformer, while the self-attention is used to model the remaining non-recurrent signals. The relative proportions of these two components are controlled by a data-driven gated mechanism, and the effectiveness of RSA modules are demonstrated by four sequential learning tasks.",
    "pdf": "https://openreview.net/pdf/70636775789b51f219cb29634cc7c794cc86577b.pdf"
}
```
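As a minimal sketch of how you might load and analyze these records with the standard library (the filename `iclr2023_papers.jsonl` and the one-JSON-object-per-line layout are assumptions here, not part of this dataset's documentation; adjust them to the actual cleaned-up files):

```python
import json
from collections import Counter

# Load the paper records. The filename and one-object-per-line layout
# are assumptions; point this at the actual JSON file in the dataset.
papers = []
with open("iclr2023_papers.jsonl", "r", encoding="utf-8") as f:
    for line in f:
        line = line.strip()
        if line:
            papers.append(json.loads(line))

# "tags" is a comma-separated string (e.g. "ICLR 2023,Top 5%");
# count how often each tag appears across the accepted papers.
tag_counts = Counter(
    tag.strip() for paper in papers for tag in paper["tags"].split(",")
)
print(tag_counts.most_common(10))

# "authors" is also a comma-separated string; split it into a list per paper.
for paper in papers[:3]:
    print(paper["title"], "->", paper["authors"].split(","))
```

From here it is straightforward to, for example, filter for papers tagged "Top 5%" or run text classification over the abstracts.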
## Related

## AI Equation

[List of AI Equations and Latex](http://www.deepnlp.org/equation/category/ai) <br>
[List of Math Equations and Latex](http://www.deepnlp.org/equation/category/math) <br>
[List of Physics Equations and Latex](http://www.deepnlp.org/equation/category/physics) <br>
[List of Statistics Equations and Latex](http://www.deepnlp.org/equation/category/statistics) <br>
[List of Machine Learning Equations and Latex](http://www.deepnlp.org/equation/category/machine%20learning) <br>

## AI Agent Marketplace and Search

[AI Agent Marketplace and Search](http://www.deepnlp.org/search/agent) <br>
[Robot Search](http://www.deepnlp.org/search/robot) <br>
[Equation and Academic Search](http://www.deepnlp.org/search/equation) <br>
[AI & Robot Comprehensive Search](http://www.deepnlp.org/search) <br>
[AI & Robot Question](http://www.deepnlp.org/question) <br>
[AI & Robot Community](http://www.deepnlp.org/community) <br>
[AI Agent Marketplace Blog](http://www.deepnlp.org/blog/ai-agent-marketplace-and-search-portal-reviews-2025) <br>

## AI Agent Reviews

[AI Agent Marketplace Directory](http://www.deepnlp.org/store/ai-agent) <br>
[Microsoft AI Agents Reviews](http://www.deepnlp.org/store/pub/pub-microsoft-ai-agent) <br>
[Claude AI Agents Reviews](http://www.deepnlp.org/store/pub/pub-claude-ai-agent) <br>
[OpenAI AI Agents Reviews](http://www.deepnlp.org/store/pub/pub-openai-ai-agent) <br>
[Salesforce AI Agents Reviews](http://www.deepnlp.org/store/pub/pub-salesforce-ai-agent) <br>
[AI Agent Builder Reviews](http://www.deepnlp.org/store/ai-agent/ai-agent-builder) <br>