Experience: 4–6 Years
Job Overview:
We are seeking a skilled GCP Data Engineer to design, build, and maintain scalable data pipelines on Google Cloud Platform. The ideal candidate has strong experience handling large datasets and optimizing data workflows.
Key Responsibilities:
- Design and develop ETL/ELT pipelines on GCP
- Work with BigQuery, Dataflow, Pub/Sub, and Cloud Storage
- Build and optimize data architectures for performance and scalability
- Integrate data from multiple sources (structured & unstructured)
- Ensure data quality, governance, and security
- Collaborate with data analysts and business teams
- Monitor, troubleshoot, and improve data pipelines
Required Skills:
- Hands-on experience with GCP services (BigQuery, Dataflow, Pub/Sub)
- Strong knowledge of SQL and Python
- Experience in data modeling and ETL processes
- Understanding of data warehousing concepts
- Familiarity with Airflow or other orchestration tools
Preferred Skills:
- Experience with Spark, Kafka, or Hadoop ecosystem
- Knowledge of CI/CD pipelines and DevOps practices
- Exposure to data security and compliance
Soft Skills: