Realtime Streaming with Data Lakehouse - End to End Data Engineering Project

CodeWithYu · 8,962 views · 12 months ago

In this video, you will learn to design, implement, and maintain secure, scalable, and cost-effective lakehouse architectures leveraging Apache Spark, Apache Kafka, Apache Flink, Delta Lake, AWS, and open-source tools, unlocking your data's full potential through advanced analytics and machine learning.

Part 1: https://youtu.be/p36YixNqGLg
FULL COURSE AVAILABLE: https://sh.datamasterylab.com/costsaver
Like this video? Support us: https://www.youtube.com/@CodeWithYu/join

Timestamps:
0:00 Setting up Kafka Broker in KRaft Mode
21:30 Setting up Minio
35:30 Producing data into Kafka
39:10 Acquiring Secret and Access Key for S3
59:00 Creating S3 Bucket Event Listener for Lakehouse
1:05:53 Data Preview and Results
1:07:42 Outro

Resources:
YouTube Source Code: https://buymeacoffee.com/yusuf.ganiyu/youtube-source-code-building-cost-effective-data-lakehouse

🌟 Please LIKE ❤️ and SUBSCRIBE for more AMAZING content! 🌟

👦🏻 My LinkedIn: https://www.linkedin.com/in/yusuf-ganiyu-b90140107/
🚀 X (Twitter): https://x.com/YusufOGaniyu
📝 Medium: https://medium.com/@yusuf.ganiyu

Hashtags:
#dataengineering #bigdata #dataanalytics #realtimeanalytics #streaming #datalakehouse #datalake #datawarehouse #dataintegration #datatransformation #datagovernance #datasecurity #apachespark #apachekafka #apacheflink #deltalake #aws #opensource #dataingestion #structureddata #unstructureddata #semistructureddata #dataanalysis #advancedanalytics #dataarchitecture #costoptimization #cloudcomputing #awscloud
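To give a taste of the "Producing data into Kafka" step, below is a minimal sketch of a JSON producer using the kafka-python client. The broker address (localhost:9092), topic name ("transactions"), and record fields are placeholder assumptions, not the names used in the video; see the source code link above for the actual project.

Example (Python, kafka-python):

# Minimal sketch: send a handful of JSON records to a Kafka topic.
# Assumes a broker at localhost:9092 and a topic named "transactions".
import json
import time
from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

for i in range(10):
    record = {"event_id": i, "amount": round(10.5 * i, 2), "ts": time.time()}
    producer.send("transactions", value=record)

producer.flush()  # ensure all buffered records reach the broker
producer.close()

In the video's pipeline, records landing in the topic are then picked up downstream and written to the MinIO-backed lakehouse, which is where the S3 bucket event listener comes in.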
