Job Description
As a Data Architect at Acumatica, you will have the opportunity to envision, design, implement and operate a state-of-the-art data platform, Lakehouse infrastructure, data pipelines, and associated processes.
The ideal candidate will have a strong background in the creation of scalable, unified architectures to support both real-time and batch analytics & machine learning, ensuring seamless data integration, governance, and accessibility.
Responsibilities
- Architect a robust data platform and technology ecosystem, combining structured and unstructured data storage with analytical processing.
- Define strategies and develop data pipelines for ingestion, transformation, and optimization within the lakehouse.
- Establish policies for data governance, quality, privacy, and security in compliance with global regulations such as GDPR, PCI, HIPAA, and CCPA.
- Champion data governance principles and evangelize them company-wide.
- Ensure secure data access and robust role-based permissions.
- Leverage tools like Delta Lake & Apache Iceberg for the lakehouse architecture.
- Make recommendations for improvements, weighing cost and time to value.
- Monitor & enhance query performance, storage costs, and system scalability.
- Enable self-service analytics and facilitate efficient data exploration.
- Partner with IT, Analytics, Business & Engineering teams to align data strategies.
- Lead cross-functional projects, mentoring engineers on modern data architectures.
- Regularly present proposals to cross-functional audiences, such as the Architectural Review Board, to build alignment and support.
- Establish enterprise data service performance standards and SLAs.
Requirements
- 8+ years in data architecture & engineering, with expertise in data lakehouse or other modern data architectures.
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
- Expertise in designing & managing big data platforms, data warehouses/lakehouses or similar hybrid architectures.
- Ability to serve as technical lead for a team of 3–5 data engineers.
- Strong skills in SQL, Python, Java, and cloud-native tools for data storage & analytics.
- Strong data modeling experience.
- Strong knowledge of ETL/ELT pipelines, streaming data, and orchestration frameworks.
- Understanding of data governance frameworks & compliance standards.
- Experience integrating machine learning workflows into lakehouse environments.
- Experience with database performance testing and SQL optimization preferred.
- Excellent communication & collaboration abilities with remote team members.
- Excellent documentation skills for writing policies, defining business metrics/data glossary terms, writing proposals, and detailing architectures.
- Ability to articulate vision & strategy to senior leadership.
Work Perks:
- Industry competitive salary
- Comprehensive health & life insurance
- Hybrid working
- Office lunch
- Exposure to working with distributed global teams
- Paid paternity leave