From 3d83895c3224188602deb23354eb5c18c53a84bd Mon Sep 17 00:00:00 2001
From: lucidrains
Date: Thu, 11 Jan 2024 13:30:12 -0800
Subject: [PATCH] update

---
 README.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/README.md b/README.md
index 4ed5668..6eb9845 100644
--- a/README.md
+++ b/README.md
@@ -32,6 +32,8 @@ Update 13: Still clearly worse
 
 Update 14: See some synergy when mixing gateloop and attention on a small scale, when holding parameters constant. Will be adding a tiny bit of simplified gateloop layers to transformers to address a main weakness in attention for future projects.
 
+Update 15: There may be a way to combine associative-scan-based works with the findings from the recently proposed Taylor series linear attention. Will carry out some independent research before the end of January 2024 and share the results here.
+
 ### Appreciation
 
 - StabilityAI, A16Z Open Source AI Grant Program, and 🤗 Huggingface for the generous sponsorships, as well as my other sponsors, for affording me the independence to open source current artificial intelligence research