Key Responsibilities
• Design and develop data architecture and ETL processes to ensure efficient data flow and management.
• Collaborate with data analysts and data scientists to identify data requirements, develop data models, and implement data pipelines.
• Optimize and tune data pipelines and data storage systems to ensure reliability and scalability.
• Implement and maintain data security and data quality standards.
• Work closely with cross-functional teams to ensure the timely delivery of data solutions.
• Continuously evaluate emerging technologies and tools to ensure the optimal performance of our data infrastructure.
• Establish best practices for developer operations.
• (For more experienced engineers) Provide technical leadership across multiple product teams.
• Share your expertise and mentor other engineers.
• Help with recruiting.

Requirements
• Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
• At least 3 years of hands-on experience with SQL, Tableau, Airflow, and Apache Spark (Databricks).
• Strong understanding of data modeling, data warehousing, and ETL processes.
• Experience with big data technologies and cloud-based solutions such as AWS (preferred), GCP, or Azure.
• Excellent analytical and problem-solving skills.
• Strong communication and interpersonal skills.
• Ability to work independently as well as in a team environment.
• Able to converse in English.
• Proven track record of delivering high-quality data solutions in a timely manner.