iLabs is a global software product engineering company headquartered in Sri Lanka, with deep roots in the US Silicon Valley. We deliver world-class solutions in Web, eCommerce, Mobile, AI/ML, and Cloud technologies, serving industries such as fintech, edtech, medtech, martech, and hospitality. Guided by our vision “to become a global powerhouse in information technology to push humanity forward,” we focus on delivering innovative, impactful solutions that empower businesses and create meaningful change in the world.
With a global talent network, we build agile remote teams for leading tech companies worldwide, including Silicon Valley pioneers. Our in-house ventures include Cloud of Goods, a fast-growing eCommerce rental marketplace, and Xenia, a customizable web platform for modern businesses.
At iLabs, we’re on a mission to advance lifestyles through technology and empower our partners to scale smarter and faster. Our culture is driven by creativity, innovation, ownership, teamwork, and global impact, giving you the opportunity and freedom to challenge the norm, spark change, and make a real difference.
If you’re ready to break boundaries and create your defining moment, we’re here to make it happen. Be part of something bigger. Join iLabs.
Job Responsibilities
- Design and implement scalable, reliable distributed data processing frameworks and analytical infrastructure.
- Be part of a team to define, design, and implement data integration, management, storage, consumption, backup, and recovery solutions that ensure the high performance of the organization's enterprise data.
- Develop Structured Query Language (SQL), Data Definition Language (DDL), and Python or equivalent programming scripts to support data pipeline development, problem-solving, data validation, and performance tuning.
- Work with software engineers, DevOps engineers, ML engineers, and data scientists to achieve the organization’s goals.
Location: Battaramulla (On-Site)
Job Requirements
- Minimum 0.6–1 years of experience in a similar capacity, with a BS/MS degree in Computer Science, Engineering, or a related field.
- Strong knowledge of the Python programming language is required.
- Experience with FastAPI, Django, Django REST Framework, and Celery for background task processing.
- Strong understanding of SOLID principles and ability to write clean, reusable, and maintainable code.
- Knowledge of MongoDB or AWS DocumentDB, along with caching technologies such as Redis.
- Experience with data engineering tools and platforms, including Delta Lake and Data Lakehouse architectures.
- Proficiency in ELT/ETL data pipeline orchestration tools, such as Apache Airflow.
- Understanding of data warehousing solutions, relational database theory, and NoSQL databases.
- Good knowledge of new and emerging tools for extracting, ingesting, and processing large datasets (Kafka, Spark, Hadoop, Databricks, or equivalent).
- Hands-on experience with Amazon Web Services (AWS) is a big plus, as is knowledge of other cloud platforms (Azure/GCP).
- Understanding of Docker containerization, Kubernetes, data modeling, and design patterns is a big plus.
- Knowledge of web scraping technologies is a big plus (Selenium, Beautiful Soup, etc.).
- Familiarity with Linux.
- Excellent interpersonal, communication, and organizational skills are required.