
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.
Moon#168 - Senior Data Engineer
Who is Mastercard?
Mastercard is a global technology company in the payments industry. Our mission is to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential.
Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. With connections across more than 210 countries and territories, we are building a sustainable world that unlocks priceless possibilities for all.
Overview:
Ethoca, a Mastercard company, is seeking a Senior Data Engineer to join our team in Pune, India to drive data enablement and explore big data solutions within our technology landscape. The role is visible and critical as part of a high-performing team – it will appeal to you if you have an effective combination of domain knowledge, relevant experience and the ability to execute on the details.
You will bring cutting-edge software and full-stack development skills, along with advanced cloud and data lake experience, while working with massive data volumes. You will own this work – our teams are small, agile and focused on the needs of the high-growth fintech marketplace.
You will work with cross-functional teams within Ethoca and Mastercard to deliver on our cloud strategy.
We are committed to making our systems resilient and responsive, yet easily maintainable, in the cloud.
Key Responsibilities:
• Design, develop, and optimize batch and real-time data pipelines using Snowflake, Snowpark, Python, and PySpark.
• Build data transformation workflows using dbt, with a strong focus on Test-Driven Development (TDD) and modular design.
• Implement and manage CI/CD pipelines using GitLab and Jenkins, enabling automated testing, deployment, and monitoring of data workflows.
• Deploy and manage Snowflake objects using schemachange, ensuring controlled, auditable, and repeatable releases across environments.
• Administer and optimize the Snowflake platform, handling performance tuning, access management, cost control, and platform scalability.
• Drive DataOps practices by integrating testing, monitoring, versioning, and collaboration into every phase of the data pipeline lifecycle.
• Build scalable and reusable data models that support business analytics and dashboarding in Power BI.
• Develop and support real-time data streaming pipelines (e.g., using Kafka, Spark Structured Streaming) for near-instant data availability.
• Establish and implement data observability practices, including monitoring data quality, freshness, lineage, and anomaly detection across the platform.
• Plan and own deployments, migrations, and upgrades across data platforms and pipelines to minimize service impacts, including developing and executing mitigation plans.
• Collaborate with stakeholders to understand data requirements and deliver reliable, high-impact data solutions.
• Document pipeline architecture, processes, and standards, promoting consistency and transparency across the team.
• Apply exceptional problem-solving and analytical skills to troubleshoot complex data and system issues.
• Demonstrate excellent written and verbal communication skills when collaborating across technical and non-technical teams.
Required Qualifications:
• Proven experience in computer science/engineering or software engineering.
• Bachelor's degree in computer science or a related technical field that includes programming.
• Deep hands-on experience with Snowflake (including administration), Snowpark, and Python.
• Strong background in PySpark and distributed data processing.
• Proven track record using dbt for building robust, testable data transformation workflows following TDD.
• Familiarity with schemachange for Snowflake object deployment and version control.
• Proficient in CI/CD tooling, especially GitLab and Jenkins, with a focus on automation and DataOps.
• Experience with real-time data processing and streaming pipelines.
• Strong grasp of cloud-based database infrastructure (AWS, Azure, or GCP).
• Skilled in developing insightful dashboards and scalable data models using Power BI.
• Expert in SQL development and performance optimization.
• Demonstrated success in building and maintaining data observability tools and frameworks.
• Proven ability to plan and execute deployments, upgrades, and migrations with minimal disruption to operations.
• Strong communication, collaboration, and analytical thinking across technical and non-technical stakeholders.
Ideally, you have experience in banking, e-commerce, credit cards or payment processing, and exposure to both SaaS and on-premises architectures. In addition, you have a post-secondary degree in computer science, mathematics, or a quantitative science.
Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
• Abide by Mastercard's security policies and practices;
• Ensure the confidentiality and integrity of the information being accessed;
• Report any suspected information security violation or breach; and
• Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.