πŸ“˜ Foundation Phase Completed – Starting Phase 2 of My Journey

I started my foundation phase on August 11 with the goal of building comfort in scripting, querying, and managing environments. Over the past 7 weeks, I’ve explored Linux, the cloud, and SQL, and built my first ETL pipeline. Each week had its own focus, and I’ve documented everything on both Hashnode and GitHub.

πŸ”Ή Week 1: Linux Basics and Cloud Introduction

This week was about getting familiar with Linux commands and understanding cloud architecture. I practiced workflows like navigation, file management, permissions, and package handling. On the cloud side, I explored service models and AWS basics.
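To make that concrete, here’s a minimal sketch of the kind of commands I practiced; the paths and the example package are placeholders, not lifted from my actual notes:

```bash
# Navigation and file management (paths are placeholders)
mkdir -p ~/practice/logs          # create a working directory
cd ~/practice
echo "first notes" > logs/notes.txt
ls -lh logs/                      # list files with human-readable sizes

# Permissions: owner read/write, group read, others nothing
chmod 640 logs/notes.txt

# Package handling (Debian/Ubuntu style; 'tree' is just an example package)
sudo apt update && sudo apt install -y tree
tree -L 2 ~/practice
```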

πŸ“„ Hashnode Article: Week 1 & Week 2 – Linux and Cloud Fundamentals

πŸ“‚ GitHub Documentation: Week 1 – Reflections

πŸ”Ή Week 2: Continued Linux Practice and Cloud Concepts

I refined my Linux skills with process control, scheduling tasks, and monitoring system resources. I also explored IAM roles and EC2 setup in AWS.
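A hedged sample of what that practice looked like on the Linux side (the cron line and script path are placeholders, shown as a comment so the snippet stays runnable):

```bash
# Process control and resource monitoring
ps aux --sort=-%mem | head -5     # five biggest memory consumers
top -b -n 1 | head -15            # one-shot snapshot of load and tasks
df -h /                           # disk usage on the root filesystem
free -m                           # memory usage in MB

# Scheduling: run a (placeholder) backup script daily at 02:00,
# added via `crontab -e`:
# 0 2 * * * /home/me/scripts/backup.sh >> /home/me/backup.log 2>&1
```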

πŸ“„ Hashnode Article: Week 1 & Week 2 – Linux and Cloud Fundamentals

πŸ“‚ GitHub Documentation: Week 2 – Reflections

πŸ”Ή Week 3: PostgreSQL Practice and Query Mastery

This week was focused on SQL. I designed a mini sales database and solved 50+ queries across filtering, joins, aggregations, and window functions. It helped me understand relational logic and query design.
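The real schema and the full query set live in the repo; as a rough illustration of the query styles I practiced (table and column names here are made up), a join plus aggregation plus a window function looks like this:

```bash
# Illustrative only: the tables below are placeholders, not the repo schema
psql -d sales_db <<'SQL'
-- Monthly revenue per customer, plus a running total via a window function
SELECT c.customer_name,
       date_trunc('month', o.order_date)  AS month,
       SUM(o.amount)                      AS monthly_revenue,
       SUM(SUM(o.amount)) OVER (
           PARTITION BY c.customer_name
           ORDER BY date_trunc('month', o.order_date)
       )                                  AS running_total
FROM   orders o
JOIN   customers c ON c.customer_id = o.customer_id
GROUP  BY c.customer_name, date_trunc('month', o.order_date)
ORDER  BY c.customer_name, month;
SQL
```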

πŸ“„ Hashnode Article: Week 3 – PostgreSQL Practice & Query Mastery

πŸ“‚ GitHub Documentation: Week 3 – Reflections

⏸️ Sep 1 – Sep 6: Break Week

I stepped away from foundation topics this week to handle work unrelated to this phase, so I haven’t counted it toward the learning timeline.

πŸ”Ή Week 4: ETL Pipeline Project

This week was all about building. I created a beginner-friendly ETL pipeline using Linux shell scripting, Python (pandas), and PostgreSQL. I extracted, transformed, and loaded drug label data into a structured database.
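The actual pipeline is in the repository linked below; this is only a simplified sketch of the extract β†’ transform β†’ load flow, with placeholder URLs, file names, and table names:

```bash
#!/usr/bin/env bash
set -euo pipefail

# 1. Extract: download the raw data (placeholder URL)
curl -fsSL "https://example.com/drug_labels.json" -o raw_labels.json

# 2. Transform: clean and flatten with a small pandas script (name is a placeholder)
python3 transform_labels.py raw_labels.json clean_labels.csv

# 3. Load: bulk-copy the cleaned CSV into PostgreSQL (placeholder db/table)
psql -d etl_db -c "\copy drug_labels FROM 'clean_labels.csv' WITH (FORMAT csv, HEADER true)"
```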

πŸ“„ Hashnode Article: My First ETL Pipeline

πŸ“‚ GitHub Repository: Linux_ETL_Pipeline

πŸ”Ή Week 5: Thinking Like a Builder

This week was a mindset shift. I stopped just running commands and started designing systems.

  • Practiced shell scripting with error handling (a small sketch follows this list)
  • Explored Docker basics
  • Worked with EC2, Lambda, and S3 lifecycle rules
  • Documented daily reflections with clarity
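For the first bullet, here’s a minimal sketch of the error-handling pattern I mean; the URL and file names are placeholders:

```bash
#!/usr/bin/env bash
# Fail fast: exit on errors, unset variables, and failures inside pipes
set -euo pipefail

tmp_dir="$(mktemp -d)"                          # scratch space
cleanup()  { rm -rf "$tmp_dir"; }               # always runs, success or failure
on_error() { echo "Failed at line $1" >&2; }    # report the offending line
trap cleanup EXIT
trap 'on_error $LINENO' ERR

# Placeholder work step: download a file and verify it is non-empty
curl -fsSL "https://example.com/data.csv" -o "$tmp_dir/data.csv"
[[ -s "$tmp_dir/data.csv" ]] || { echo "Empty download" >&2; exit 1; }
echo "Fetched $(wc -l < "$tmp_dir/data.csv") lines"
```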

πŸ“„ Hashnode Article: Week 5 – The Week I Started Thinking Like a Builder

πŸ“‚ GitHub Documentation: Week 5 – Reflections

🧠 Week 6: Internship Work – Smart Fridge Annotation

After Week 5, I planned to start Phase 2. But as part of my internship, I was assigned to work on data collection and annotation for smart fridge images.
This week, I focused on:

  • Collecting diverse fridge images
  • Annotating items shelf by shelf

πŸš€ Starting Phase 2: Core Workflows

Now that the foundation phase is complete, I’m officially starting Phase 2 of my journey.
This phase will focus on how data moves, transforms, and gets scheduled.

πŸ”§ Phase 2 Plan (Sep 29 – Early Dec)

  • Data Extraction: APIs, web scraping, Selenium (a tiny teaser sketch after this list)
  • Data Ingestion: Kafka, Spark, Flink
  • Orchestration: Airflow, dbt
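As a tiny, hypothetical taste of the extraction work ahead (the endpoint and fields are made up):

```bash
# Hypothetical API endpoint; jq flattens the JSON response into CSV rows
curl -fsSL "https://api.example.com/v1/products?limit=10" \
  | jq -r '.items[] | [.id, .name, .price] | @csv' \
  > products_sample.csv
```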

I’ll continue documenting each week with clarity, sharing both technical progress and mindset shifts.

β€œFoundation gave me clarity. Now I’m building momentum.”
