Info:
- Intermediate to Advanced
- ₹ 2400
- 20 hours
- 1 week (40 hours total) with 8 hours/day
- 2 weeks (20 hours total) with 4 hours/day
- 11/17/2024
- 12/29/2024
- 03/23/2025
- 03/23/2025
- Institute of Engineering Sciences
- A.I.
Prerequisites:
● Basic understanding of machine learning and deep learning fundamentals.
● Proficiency in Python programming.
● Familiarity with neural networks and basic concepts of AI/ML.
Who is this for:
● Experienced AI/ML practitioners aiming to deepen their knowledge in advanced topics like transformers, GANs, and prompt engineering.
● Data scientists and engineers looking to explore cutting-edge techniques in generative AI.
● Professionals in BFSI, marketing, or tech sectors seeking to innovate using advanced AI tools.
Who should attend:
● Senior data scientists and AI researchers.
● Software engineers and developers working on AI-driven products.
● Professionals interested in mastering advanced AI techniques and exploring their applications in various industries.
What do we offer:
● In-depth exploration of advanced AI concepts including neural networks, transformers, and GANs.
● Hands-on labs focusing on real-world applications such as fraud detection, sentiment analysis, and text generation.
● Practical experience with state-of-the-art tools like Hugging Face, OpenAI, and closed-source foundation models.
● Guidance on prompt engineering and fine-tuning techniques for generative AI tasks.
● Exposure to cutting-edge topics like Retrieval-Augmented Generation (RAG) and their applications.
● Opportunities to implement and test advanced AI models, preparing you to lead AI innovation in your organization.
Course Objectives:
- Equip participants with foundational knowledge of Deep Learning and Natural Language Processing (NLP) techniques used in Generative AI (Gen AI).
- Enable participants to leverage industry-specific applications of Gen AI for tasks like anomaly detection, text classification, and sentiment analysis.
- Empower participants to utilize cutting-edge NLP models and prompt engineering techniques for real-world problem-solving.
Course Outcomes:
● Gain a comprehensive understanding of Deep Learning architectures like Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs).
● Apply Variational Autoencoders (VAEs) to address data challenges in the BFSI (Banking, Financial Services, and Insurance) sector.
● Effectively navigate the Transformer architecture, including self-attention mechanisms and encoder-decoder models.
● Utilize pre-trained NLP models like ChatGPT and Hugging Face pipelines for efficient text analysis and generation.
● Master prompt engineering techniques for zero-shot, few-shot, and chain-of-thought learning in Gen AI applications.
● Gain exposure to advanced topics like Retrieval-Augmented Generation (RAG), fine-tuning, and Generative Adversarial Networks (GANs).
Course Content:
- Building the Deep Learning Foundation
- Introduction to Neural Networks
- Demystifying Neural Network Architecture
- Understanding Forward and Backward Propagation
- Exploring Activation Functions and Dropout Regularization
- Hands-on Lab: Implementing Basic Neural Networks (see sketch after this outline)
- Convolutional Neural Networks (CNNs)
- Unveiling the Power of CNNs
- Practical Applications of CNNs in Industry
- Hands-on Lab: Building a CNN Model (Example: Fraud Detection in Transactions)
- Mastering the Language of Business with NLP
- Introduction to Recurrent Neural Networks (RNNs)
- Understanding RNN Architecture and its Applications
- Deep Dive into Text Classification and Sentiment Analysis Techniques
- Hands-on Lab: Text Classification Using RNNs (see sketch after this outline)
- Powering NLP with Pre-Trained Models and Tools
- Introduction to Hugging Face Pipelines and APIs
- Leveraging Pre-Trained NLP Models for Base Model Inference and Evaluation
- Hands-on Lab: Sentiment Analysis of Customer Reviews Using Hugging Face Models (see sketch after this outline)
- Unveiling the Transformer Revolution
- Beyond RNNs: Introduction to Transformers
- Evolution from RNNs to Transformers
- Understanding Embeddings, Tokenization, and Self-Attention Mechanisms
- Hands-on Lab: Tokenization and Embedding Techniques (see sketch after this outline)
- Exploring Transformers and Variational Autoencoders (VAEs)
- Demystifying the Transformer Architecture
- Applications of Transformers in NLP Tasks (Example: Text Summarization for Financial Reports)
- Hands-on Lab: Training Simple Language Models (SLMs) from Scratch
- Exploring Variational Autoencoders (VAEs): Concepts, Architecture, and Applications in BFSI
- Hands-on Lab: Anomaly Detection in Financial Data Using VAEs
- Redefining Creativity with Prompt Engineering
- The Art of Prompt Engineering
- Mastering Zero-Shot, Few-Shot, and Chain-of-Thought (CoT) Learning
- Practical Applications of Prompt Engineering in Gen AI Tasks (Example: Generating Creative Marketing Copy)
- Hands-on Lab: Designing Effective Prompts for Various Applications (see sketch after this outline)
- Exploring Closed-Source Foundation Models and OpenAI Applications
- Introduction to Closed-Source Foundation Models (e.g., ChatGPT)
- Unveiling the Architecture of ChatGPT
- Exploring OpenAI Applications: Text Generation, Question Answering, Language Translation, Summarization, and Conversational Agents
- Hands-on Lab: Building and Testing ChatGPT-Based Applications
- Unveiling the Cutting Edge
- Introduction to Retrieval-Augmented Generation (RAG)
- Understanding the Concept and Applications of RAG
- Hands-on Lab: Implementing RAG for Enhanced Text Generation (see sketch after this outline)
- Exploring Advanced Topics (if time permits)
- Introduction to Fine-Tuning and PEFT Fine-Tuning
- Introduction to Generative Adversarial Networks (GANs)
- Hands-on Lab: Fine-Tuning Pre-Trained Models
- Hands-on Lab: Basic GAN Implementation and Applications
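Illustrative Lab Sketches:
The following minimal sketches hint at the kind of code covered in selected hands-on labs. Library choices, model names, and datasets below are illustrative assumptions, not the official lab material.

A minimal sketch for the "Implementing Basic Neural Networks" lab, assuming TensorFlow/Keras and a toy tabular dataset; the forward/backward propagation, activation functions, and dropout regularization covered in the module appear here as standard layer and training calls.

```python
# Minimal feed-forward neural network sketch (assumed: TensorFlow/Keras, toy data).
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Placeholder data: 1,000 samples with 20 features and binary labels.
X = np.random.rand(1000, 20).astype("float32")
y = np.random.randint(0, 2, size=(1000, 1)).astype("float32")

model = keras.Sequential([
    keras.Input(shape=(20,)),
    layers.Dense(64, activation="relu"),    # hidden layer with ReLU activation
    layers.Dropout(0.3),                    # dropout regularization
    layers.Dense(32, activation="relu"),
    layers.Dense(1, activation="sigmoid"),  # binary output
])

# Compiling and fitting runs forward propagation, loss computation, and
# backpropagation of gradients under the hood.
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2)
```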
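A minimal sketch for the "Text Classification Using RNNs" lab, again assuming Keras; the token-id sequences and binary labels are placeholders for the lab's real text-preprocessing pipeline.

```python
# Minimal LSTM text classifier sketch (assumed: Keras, toy integer-encoded text).
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

vocab_size, max_len = 5000, 50
X = np.random.randint(1, vocab_size, size=(500, max_len))     # placeholder token-id sequences
y = np.random.randint(0, 2, size=(500, 1)).astype("float32")  # placeholder sentiment labels

model = keras.Sequential([
    keras.Input(shape=(max_len,)),
    layers.Embedding(vocab_size, 64),       # learn word embeddings
    layers.LSTM(64),                        # recurrent layer reads the sequence
    layers.Dense(1, activation="sigmoid"),  # e.g., positive vs. negative
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=32, validation_split=0.2)
```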
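A minimal sketch for the "Sentiment Analysis of Customer Reviews Using Hugging Face Models" lab, assuming the transformers library; the default pipeline model and the sample reviews are placeholders.

```python
# Sentiment analysis with a pre-trained Hugging Face pipeline (assumed: transformers installed).
from transformers import pipeline

sentiment = pipeline("sentiment-analysis")  # downloads a default pre-trained model

reviews = [
    "The onboarding process was smooth and the support team was helpful.",
    "The app keeps crashing whenever I try to transfer funds.",
]
for review, result in zip(reviews, sentiment(reviews)):
    print(f"{result['label']} ({result['score']:.2f}): {review}")
```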
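A minimal sketch for the "Tokenization and Embedding Techniques" lab, assuming the bert-base-uncased checkpoint from Hugging Face; any other transformer checkpoint would work the same way.

```python
# Tokenization and contextual embeddings sketch (assumed: transformers + PyTorch).
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

text = "Transformers rely on self-attention over token embeddings."
inputs = tokenizer(text, return_tensors="pt")
print(tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist()))  # subword tokens

with torch.no_grad():
    outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, tokens, hidden_size) contextual embeddings
```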
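A minimal sketch for the "Designing Effective Prompts" lab, assuming the openai Python SDK (v1+) with an API key in the OPENAI_API_KEY environment variable; the model name is a placeholder, and the few-shot prompt illustrates the in-context examples discussed in the module.

```python
# Few-shot prompting sketch (assumed: openai SDK >= 1.0, OPENAI_API_KEY set, placeholder model name).
from openai import OpenAI

client = OpenAI()

few_shot_prompt = (
    "Classify the sentiment of each review as Positive or Negative.\n"
    "Review: 'Loan approval took only two days.' -> Positive\n"
    "Review: 'Hidden charges on my statement again.' -> Negative\n"
    "Review: 'The mobile app redesign is excellent.' ->"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; use whichever chat model is available to you
    messages=[{"role": "user", "content": few_shot_prompt}],
)
print(response.choices[0].message.content)
```

Dropping the worked examples turns this into a zero-shot prompt, and adding an instruction such as "Think step by step before answering" is the simplest form of chain-of-thought prompting.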
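A minimal sketch of the retrieve-then-generate idea behind the "Implementing RAG for Enhanced Text Generation" lab, assuming sentence-transformers for retrieval; the corpus is a placeholder and the final generation call (e.g., to a chat model) is left out.

```python
# Minimal retrieval step of a RAG pipeline (assumed: sentence-transformers, toy corpus).
from sentence_transformers import SentenceTransformer, util

retriever = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model

corpus = [
    "Our savings account offers 4% annual interest with no minimum balance.",
    "Credit card disputes are resolved within 30 working days.",
    "Home loan prepayment carries no penalty after the first year.",
]
corpus_emb = retriever.encode(corpus, convert_to_tensor=True)

question = "Is there a penalty for prepaying my home loan?"
query_emb = retriever.encode(question, convert_to_tensor=True)

best = util.cos_sim(query_emb, corpus_emb).argmax().item()  # most relevant passage
augmented_prompt = (
    f"Context: {corpus[best]}\n"
    f"Question: {question}\n"
    "Answer using only the context."
)
print(augmented_prompt)  # this grounded prompt would then be sent to a generator model
```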
Trainer Profile:
Dr. Darshan Ingle (Ph.D. Computer Engineering)
The trainer is an experienced professional with over 10 years of expertise in both the corporate and education sectors. He is highly skilled in data analytics, machine learning, and programming languages such as R, Python, and Julia, and has deep knowledge of statistics, data structures, and tools such as Excel, Tableau, and Java. Throughout his career, he has trained over 30,000 students and 40,000 professionals. His corporate clients include major organizations such as HDFC, Tata Steel, AstraZeneca, and PwC. He actively contributes to the fields of deep learning and natural language processing, particularly in the realm of generative AI.