Open-Source Generative AI for the Enterprise is a comprehensive 4-day course that teaches practical applications of AI in the business environment. Combining lectures with hands-on labs, the course gives participants a solid understanding of AI concepts and the skills to design and implement AI solutions. Throughout the course, you will learn about transformer-based AI architectures, the fundamentals of Python programming for AI deployments, and the deployment of open-source transformer models. You will also explore how hardware affects AI performance, comparing different GPU architectures and learning to match AI workload requirements with suitable hardware. The course covers training techniques, including backpropagation and gradient descent, and core AI tasks such as classification, regression, and clustering. You will gain practical experience through hands-on exercises with open-source large language model (LLM) frameworks, working with fine-tuned models and running workloads across different models to understand their strengths and weaknesses. Additionally, the course covers the conversion of model formats and provides an in-depth exploration of AI programming environments such as PyTorch with the transformers library, including low-level interactive inspection of transformer internals. Toward the end of the course, you will delve into advanced topics such as context extension through fine-tuning and quantization for specific target environments. On completion, you will have the opportunity to earn an AI certification, further enhancing your credentials in the field of artificial intelligence. This course is ideal for Python developers, DevSecOps engineers, and managers or directors seeking a practical overview of AI and its application in the enterprise.
Previous exposure to any programming language, preferably Python.
4 Days/Lecture & Lab
This course is designed for Project Managers, Architects, CKA Developers, and Data Acquisition Specialists.
- The Mechanics of Deep Learning - Gain an intuitive understanding of the most current generative AI architecture.
- The Transformer Model - Develop an intuitive understanding of the transformer model, without math.
- Hardware Requirements
- Pre-trained LLM Model Essentials
- Pre-trained LLM Hands-on - Run inference on multiple models on diverse prompts
- Transformer Model Training Essentials
- Conversion of Model Formats
- Model Fine Tuning
- Introduction to Application Interfacing with Models
- Application Augmentation with LangChain and Guidance
- Advanced Topics
- Use llama to perform Natural Language Processing tasks
- Deploy a Natural Language Model Capstone
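To give a flavor of the hands-on labs, the "run inference on multiple models on diverse prompts" exercise can be sketched as a small Python harness that sends each prompt to each model and collects the outputs for comparison. The model callables below are hypothetical stubs for illustration only; in the actual lab you would load real checkpoints, for example with the Hugging Face transformers pipeline API.

```python
# Sketch of a multi-model, multi-prompt inference harness.
# The "models" here are stub callables (an assumption for illustration);
# in practice each would be a real text-generation pipeline, e.g.:
#   from transformers import pipeline
#   generate = pipeline("text-generation", model="gpt2")

def run_prompt_matrix(models, prompts):
    """Run every prompt against every model and collect the outputs."""
    results = {}
    for model_name, generate in models.items():
        results[model_name] = {p: generate(p) for p in prompts}
    return results

# Stub "models" standing in for real pipelines (illustrative only).
models = {
    "model-a": lambda prompt: f"[A] {prompt.upper()}",
    "model-b": lambda prompt: f"[B] {prompt[::-1]}",
}
prompts = ["Summarize the report.", "Translate 'hello' to French."]

outputs = run_prompt_matrix(models, prompts)
for name, answers in outputs.items():
    for prompt, answer in answers.items():
        print(f"{name} | {prompt} -> {answer}")
```

Comparing the collected outputs side by side is what lets you judge each model's strengths and weaknesses on the same inputs, which is the point of the exercise.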