WHAT YOU WILL LEARN
- Understand the core architecture of foundation models, covering both single-modal and multimodal designs.
- Follow a step-by-step approach to developing transformer-based Machine Learning models.
- Utilize various open-source models to solve your business problems.
- Train and fine-tune various open-source models using PyTorch 2.0 and the Hugging Face ecosystem.
- Deploy and serve transformer models.
- Apply best practices and guidelines for building transformer-based models.
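As a taste of the model-building the book walks through, here is a minimal sketch (not taken from the book) of assembling a small transformer encoder with PyTorch's built-in modules; the dimensions are illustrative placeholders.

```python
import torch
import torch.nn as nn

# Illustrative sketch: a two-layer transformer encoder built from
# PyTorch's stock modules (the book covers building such models in depth).
encoder_layer = nn.TransformerEncoderLayer(
    d_model=64,        # embedding size per token (placeholder value)
    nhead=4,           # number of attention heads
    batch_first=True,  # inputs shaped (batch, sequence, embedding)
)
encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)

# A dummy batch of 8 sequences, 16 tokens each, 64-dim embeddings.
tokens = torch.randn(8, 16, 64)
out = encoder(tokens)
print(out.shape)  # torch.Size([8, 16, 64]) - shape is preserved
```

The encoder maps each token embedding to a contextualized embedding of the same size, which is the starting point for the classification, regression, and generation tasks covered in later chapters.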
WHO THIS BOOK IS FOR
This book caters to data scientists, Machine Learning engineers, developers, and software architects interested in the world of generative AI.
TABLE OF CONTENTS
1. Transformer Architecture
2. Hugging Face Ecosystem
3. Transformer Model in PyTorch
4. Transfer Learning with PyTorch and Hugging Face
5. Large Language Models: BERT, GPT-3, and BART
6. NLP Tasks with Transformers
7. CV Model Anatomy: ViT, DETR, and DeiT
8. Computer Vision Tasks with Transformers
9. Speech Processing Model Anatomy: Whisper, SpeechT5, and Wav2Vec
10. Speech Tasks with Transformers
11. Transformer Architecture for Tabular Data Processing
12. Transformers for Tabular Data Regression and Classification
13. Multimodal Transformers: Architectures and Applications
14. Explore Reinforcement Learning for Transformers
15. Model Export, Serving, and Deployment
16. Transformer Model Interpretability and Experimental Visualization
17. PyTorch Models: Best Practices and Debugging