Industry: IT Consulting Company
Location: Bangkok
Responsibilities:
Design and manage ETL (Extract, Transform, Load) processes within the Google Cloud environment to ensure efficient data integration.
Develop and implement database schemas, and write queries for data retrieval and manipulation.
Build data pipelines using Airflow, dbt, and Google Cloud Platform services (BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Functions/Cloud Run, Vertex AI, Cloud Build).
Automate data workflows and processes using scripting languages such as Python and Bash.
Utilize containerization (e.g., Docker) and orchestration tools (e.g., Kubernetes) for effective deployment and management of data applications.
Work closely with data scientists, analysts, and other stakeholders to understand and meet data requirements.
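The ETL responsibilities above can be sketched as a minimal, self-contained pipeline. This is an illustrative sketch only: an in-memory list and sqlite3 stand in for the real source system and warehouse (on GCP these would typically be Pub/Sub or Cloud Storage feeding BigQuery via an Airflow DAG), and all table and field names are hypothetical.

```python
import sqlite3

# Hypothetical raw records, standing in for rows extracted from a source system.
RAW_ROWS = [
    {"id": 1, "name": " Alice ", "amount": "120.50"},
    {"id": 2, "name": "Bob", "amount": "80.00"},
    {"id": 3, "name": "Carol", "amount": None},
]

def extract():
    """Extract: return raw rows (in practice, read from an API, file, or queue)."""
    return RAW_ROWS

def transform(rows):
    """Transform: trim whitespace, cast amounts to float, drop incomplete records."""
    cleaned = []
    for row in rows:
        if row["amount"] is None:
            continue  # skip rows missing a required field
        cleaned.append((row["id"], row["name"].strip(), float(row["amount"])))
    return cleaned

def load(rows, conn):
    """Load: write the cleaned rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales (id INTEGER PRIMARY KEY, name TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
    conn.commit()

def run_pipeline(conn):
    load(transform(extract()), conn)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    run_pipeline(conn)
    # The row with a missing amount is dropped during transform.
    print(conn.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone())
```

In an orchestrated setting, each of the three steps would typically become its own Airflow task (or dbt model for the transform), so failures can be retried per stage rather than rerunning the whole pipeline.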
Requirements:
Minimum of 5 years of experience in data engineering, big data, and data warehousing.
Extensive experience with cloud computing platforms, particularly Google Cloud Platform (GCP).
Proficiency with data pipeline and workflow management tools (e.g., Airflow, dbt).
Strong knowledge of data storage and processing systems.
Expertise in Python and SQL.
Excellent communication skills, with fluency in English.