LIMoE: Learning Multiple Modalities with One Sparse Mixture-of-Experts Model
Prodramp
LIMoE is a large-scale multimodal architecture built on a sparse mixture of experts: it processes both images and text simultaneously, using sparsely activated experts that naturally specialize.
In this video we take a deep dive into LIMoE: Learning Multiple Modalities with One Sparse Mixture-of-Experts Model, covering how it works, its internal architecture, and how text and image data are processed.
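To make the "sparsely activated experts" idea concrete, here is a minimal, illustrative PyTorch sketch of a top-1 sparse MoE layer. This is an assumption-laden toy, not code from the LIMoE paper or repo: the class name and sizes are made up, and LIMoE's actual routers add auxiliary losses (such as entropy regularizers) that are omitted here.
```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    """Toy top-1 sparse mixture-of-experts layer (illustrative only).
    Each token is routed to the single expert with the highest gate score,
    so only a fraction of the parameters is activated per token."""

    def __init__(self, d_model: int, d_hidden: int, num_experts: int):
        super().__init__()
        self.router = nn.Linear(d_model, num_experts)  # gating network
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, d_hidden),
                nn.GELU(),
                nn.Linear(d_hidden, d_model),
            )
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_tokens, d_model) -- image patches and text tokens alike
        gate_probs = F.softmax(self.router(x), dim=-1)  # (tokens, experts)
        top_prob, top_idx = gate_probs.max(dim=-1)      # top-1 routing
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            mask = top_idx == e
            if mask.any():
                # scale each expert's output by its gate probability
                out[mask] = top_prob[mask].unsqueeze(-1) * expert(x[mask])
        return out

# Toy usage: 16 tokens of width 64 routed across 8 experts.
layer = SparseMoELayer(d_model=64, d_hidden=256, num_experts=8)
tokens = torch.randn(16, 64)
print(layer(tokens).shape)  # torch.Size([16, 64])
```
In the actual LIMoE model this kind of MoE block replaces the dense feed-forward layers of a Transformer, and both modalities share the same pool of experts.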
GitHub Resources: https://github.com/prodramp/DeepWorks/tree/main/LIMoE
Research Paper and Code:
- https://arxiv.org/abs/2206.02770
- https://ai.googleblog.com/2022/06/limoe-learning-multiple-modalities-with.html
⬇⬇⬇⬇⬇⬇ ⏰ TUTORIAL TIME STAMPS ⏰ ⬇⬇⬇⬇⬇⬇
- (00:00) Research Paper intro
- (01:27) Topics Covered
- (02:35) LIMoE Internals
- (05:54) Training System
- (08:18) Multimodal Contrastive Learning (see the loss sketch after this list)
- (10:18) LIMoE Behavior Understanding
- (12:36) LIMoE Performance
- (14:42) Conclusion
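For the Multimodal Contrastive Learning segment, here is a minimal, illustrative sketch of a symmetric image-text contrastive loss of the kind LIMoE is trained with (a CLIP-style objective; the function name and temperature value are assumptions for this example, not taken from the paper's code):
```python
import torch
import torch.nn.functional as F

def contrastive_loss(img_emb: torch.Tensor, txt_emb: torch.Tensor,
                     temperature: float = 0.07) -> torch.Tensor:
    """Symmetric image-text contrastive loss over a batch of aligned pairs:
    matching pairs are pulled together, all other pairings are pushed apart.
    Illustrative CLIP-style objective; temperature value is an assumption."""
    img_emb = F.normalize(img_emb, dim=-1)
    txt_emb = F.normalize(txt_emb, dim=-1)
    logits = img_emb @ txt_emb.t() / temperature  # (batch, batch) similarities
    targets = torch.arange(logits.size(0))        # i-th image matches i-th text
    loss_i2t = F.cross_entropy(logits, targets)   # image -> text direction
    loss_t2i = F.cross_entropy(logits.t(), targets)  # text -> image direction
    return 0.5 * (loss_i2t + loss_t2i)

# Toy usage: a batch of 4 image/text embedding pairs of width 32.
loss = contrastive_loss(torch.randn(4, 32), torch.randn(4, 32))
print(loss.item())
```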
Connect
- Prodramp LLC (@prodramp)
- Website - https://prodramp.com
- LinkedIn - https://www.linkedin.com/company/prodramp
- GitHub - https://github.com/prodramp/
- AngelList - https://angel.co/company/prodramp
- Facebook - https://www.facebook.com/Prodramp
Content Creator: Avkash Chauhan (@avkashchauhan)
Tags: #limoe #ai #google #cnn #ml #lime #aicloud #h2oai #driverlessai #machinelearning #cloud #mlops #model #collaboration #deeplearning #modelserving #modeldeployment #pytorch #datarobot #datahub #streamlit #modeltesting #codeartifact #dataartifact #modelartifact #onnx #aws #kaggle #mapbox #lightgbm #xgboost #classification #dataengineering #pandas #keras #tensorflow #tensorboard #prodramp #avkashchauhan #mli #xai