Experience: 4+ Years

Location: Bangalore

Yes

Job Type: Full Time
Required Skills & Qualifications
• Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
• 4–7 years of experience in Data Engineering, ETL Development, or Data Warehousing.
• Proficiency in SQL and Python for data manipulation and scripting.
• Experience with big data technologies such as Spark, Hadoop, Hive, or Kafka.
• Hands-on experience with cloud data platforms (e.g., AWS Redshift).
• Strong knowledge of ETL tools (e.g., Apache Airflow, Talend, Informatica, or AWS Glue).
• Familiarity with data modeling, schema design, and performance optimization.
• Experience with version control (Git) and CI/CD pipelines is a plus.
• Understanding of data governance, security, and privacy best practices.

Key Responsibilities
• Design, develop, and maintain robust ETL/ELT pipelines to collect, process, and store data from multiple sources.
• Build and optimize data architectures and data models for analytics and reporting.
• Work with stakeholders to identify data requirements and implement reliable data solutions.
• Ensure data quality, consistency, and reliability across databases and data warehouses.
• Manage and optimize data storage and processing in cloud and on-premises environments.
• Collaborate with data scientists, analysts, and business teams to make data accessible and actionable.
• Implement and monitor data security, governance, and compliance processes.
• Troubleshoot and improve the performance of existing data workflows and systems.

Notice Period: Immediate to 1 Month