
MAMBA2: State-Space Semiseparable Matrices & Structured Linear Causal Attention

In the second installment of this multi-part series, Aaron and Juan discuss the evolution of the Mamba architecture into Mamba2. The focus is a mathematical overview of how State-Space Model components and their block operations are equivalent to Linear Causal Attention.
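To make this duality concrete, here is a minimal NumPy sketch (our own illustration, not material from the webinar; all variable names are illustrative). It checks that a scalar state-space recurrence h_t = a_t*h_{t-1} + b_t*x_t, y_t = c_t*h_t produces the same output as multiplying the input by a lower-triangular 1-semiseparable matrix, which plays the role of a causal (masked) attention matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 8  # sequence length

# Per-step scalar SSM parameters (illustrative values).
a = rng.uniform(0.5, 1.0, T)  # state decay
b = rng.normal(size=T)        # input projection
c = rng.normal(size=T)        # output projection
x = rng.normal(size=T)        # input sequence

# 1) Recurrent (scan) form: h_t = a_t*h_{t-1} + b_t*x_t, y_t = c_t*h_t.
h = 0.0
y_scan = np.empty(T)
for t in range(T):
    h = a[t] * h + b[t] * x[t]
    y_scan[t] = c[t] * h

# 2) Matrix (attention-like) form: y = M @ x, where M is lower triangular
#    with M[t, s] = c_t * (a_t * ... * a_{s+1}) * b_s, a 1-semiseparable matrix.
M = np.zeros((T, T))
for t in range(T):
    for s in range(t + 1):
        M[t, s] = c[t] * np.prod(a[s + 1 : t + 1]) * b[s]
y_mat = M @ x

assert np.allclose(y_scan, y_mat)  # both forms produce the same output
```

Roughly speaking, the block operations covered in the webinar decompose this matrix into blocks so that most of the computation can be carried out with plain matrix multiplications.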

The goal of this second webinar is to give our viewers the context needed to understand how Mamba2 connects to more popular architectures such as the Transformer.

Who should watch this?

Any technical folks looking to stay up to date on language-modeling techniques and expand their development skill set.

Like its predecessor, Mamba2 is well suited to RAG systems with long context-length requirements, and, unlike Mamba, it can be implemented and deployed without specialized hardware requirements. This makes for a more level playing field for experimenting with and developing this emerging architecture!

Sign up for the webinar
Presented by Aimpoint Digital

Meet the speakers

Aaron McClendon

Head of AI, Aimpoint Digital

Juan Morinelli

Lead Data Scientist, Aimpoint Digital

Let’s talk data.
We’ll bring the solutions.

Whether you need advanced AI solutions, strategic data expertise, or tailored insights, our team is here to help.

Meet an Expert