📄 Job Description
Key Responsibilities:
- Design, build, and maintain scalable ETL/ELT pipelines to collect and process large datasets.
- Develop and manage data architecture across data lakes, warehouses, and cloud platforms.
- Ensure data quality, consistency, and governance throughout the pipeline.
- Work closely with data analysts, scientists, and product teams to understand data needs and deliver clean, reliable data.
- Monitor and optimize the performance of data systems.
- Develop and maintain automated data workflows using tools like Airflow or dbt.
- Document data processes, definitions, and pipeline workflows.
✅ Requirements
Required Skills & Experience:
- Proficiency in SQL and at least one programming language (Python, Java, or Scala).
- Experience with cloud platforms such as AWS (e.g., S3, Redshift, Glue), GCP (e.g., BigQuery, Dataflow), or Azure.
- Hands-on experience with data pipeline tools (e.g., Apache Airflow, dbt, Kafka, Spark).
- Strong understanding of data modeling, data warehousing, and performance tuning.
- Familiarity with both structured and unstructured data systems.
- Version control with Git and experience working in Agile environments.
⭐ What we offer
A good work environment
💰 Salary
The salary range for this position is $50,000 to $75,000.
ⓘ About the Company: Annex IT Solutions
Annex IT Solutions is a training institute specializing in online, corporate, and classroom training, offering programs in India and abroad.