Data Engineer, M-DAQ


About M-DAQ Pte. Ltd.

M-DAQ builds over-the-top (OTT) applications to facilitate cross-border business for various industries. These include securities markets, e-Commerce platforms and payment solutions providers. M-DAQ achieves this through our proprietary technology, together with the remittance license awarded by the Monetary Authority of Singapore. M-DAQ was awarded “Best Tech Company To Work For 2019” by Singapore Computer Society (SCS).

M-DAQ is a high-growth FinTech start-up that focuses on proprietary best-in-class corporate FX solutions across Asia. We have to date processed over $10 billion in FX transactions and generated hundreds of millions of dollars in revenue for our partners and savings for their end-customers. Having recently concluded our Series D financing round, we have exciting plans to leverage our FX expertise in even more verticals.


Why Us?

Have a positive impact on the world’s economy by creating a World Without Currency Borders™

Innovation mindset, people-oriented team
Challenging environment offering great opportunities to learn and grow
Creative and innovative workplace


Roles & Responsibilities:

Design, develop and implement Big Data platforms in cloud
Build a centralized datalake and data warehouse for data feeds from all M-DAQ entities
Set up tools for data ingesting (batch and real-time), data cleaning, building data cubes in the data warehouse
Build tools to monitor data quality and manage data changes
Build end-user facing data catalogue and data lineage to facilitate data discovery


Requirements:

5+ years of experience with excellent coding skills (Python, Java or Scala)
Bachelor’s or Master’s degree in Computer Science or Engineering, or equivalent industry experience, is required
Experience with Big Data technology architecture (Hadoop, Spark, Kafka, Flink, Hive, etc.), including tuning, troubleshooting and scaling these systems, with a deep understanding of their internal principles
Hands-on experience designing and implementing a Data Warehouse (Snowflake, Athena or Redshift)
Solid understanding of ETL and ELT approaches and large-scale data ingestion/integration frameworks
Experience designing data quality monitoring and fault-tolerant pipelines
Hands-on experience designing, developing and integrating applications on Kubernetes
Exposure to building real-time processing pipelines using Flink or Spark Streaming
Exposure to data and machine learning services from Amazon Web Services (AWS) and/or Google Cloud (GCP) is a plus

What We Offer:

M-DAQ offers competitive remuneration including employee stock options and employee benefits.




Please send your CV in confidence to
