Intern Data Engineering




What’s the opportunity?

We are looking for a driven and curious Data Engineering Intern with excellent SQL knowledge to join our growing data team. In this role, you will contribute to building and optimizing data pipelines, warehouses, and dashboards across cloud and on-prem environments. You will work with Google Cloud (GCP) tools such as BigQuery, Cloud Spanner, and Dataflow, and gain real-world experience across data engineering and analytics use cases.


This role is ideal for someone early in their career who wants to grow technically while supporting business-critical data initiatives.

Octopus BI, a part of Qoria, is committed to delivering cutting-edge data analytics and integration solutions that drive informed decision-making.


Key Responsibilities

  • Develop SQL queries and scripts for transforming, joining, and aggregating data.
  • Assist in building and maintaining scalable ETL/ELT pipelines across cloud and on-prem sources.
  • Support development and optimization of cloud data warehouses and databases (e.g., BigQuery, Cloud Spanner, Bigtable).
  • Automate data processing pipelines using tools like Google Cloud Dataflow and Cloud Composer/Workflows.
  • Collaborate with team members to ensure data integrity, accuracy, and security during processing.
  • Perform exploratory data analysis (EDA) to identify trends, patterns, and anomalies.
  • Assist in building reports and dashboards using tools like Power BI or Looker.
  • Work with IT and DevOps to solve integration and access issues.
  • Participate in sprint planning, standups, code reviews, and QA processes.
  • Apply statistical and analytical techniques under supervision to support reporting use cases.
  • Keep up to date with modern data engineering tools and best practices.
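To give a flavor of the transformation and aggregation work described above, here is a minimal, hypothetical sketch. It uses Python's built-in sqlite3 as a lightweight stand-in for a cloud warehouse such as BigQuery; the table names, columns, and values are invented for illustration only.

```python
# Hypothetical example: tables and data are invented for illustration;
# sqlite3 stands in for a cloud warehouse like BigQuery.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE customers (id INTEGER, region TEXT);
CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL);
INSERT INTO customers VALUES (1, 'APAC'), (2, 'EMEA');
INSERT INTO orders VALUES (10, 1, 120.0), (11, 1, 80.0), (12, 2, 50.0);
""")

# Join, aggregate, and order -- the same SQL shapes used day to day
# when transforming and summarizing warehouse data.
rows = cur.execute("""
    SELECT c.region, COUNT(*) AS n_orders, SUM(o.amount) AS total
    FROM orders AS o
    JOIN customers AS c ON c.id = o.customer_id
    GROUP BY c.region
    ORDER BY total DESC
""").fetchall()

print(rows)  # [('APAC', 2, 200.0), ('EMEA', 1, 50.0)]
```

The same join/group-by/order pattern carries over directly to BigQuery SQL, where optimization typically centers on partitioning, clustering, and avoiding unnecessary scans.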


Must-Have

  • Bachelor's degree in Computer Science, Data Science, or a related technical field.
  • Strong SQL skills with some hands-on experience in writing optimized queries.
  • Understanding of data pipelines, ETL/ELT workflows, and data modeling concepts.
  • Basic programming knowledge in Python, Go, or Java.


Nice-to-Have

  • Familiarity with GCP services like BigQuery, Cloud Spanner, and Dataflow.
  • Exposure to data validation, data quality frameworks, or monitoring systems.
  • Awareness of statistical methods and their use in reporting and dashboards.
  • Familiarity (academic or internship) with at least one BI tool (e.g., Power BI, Looker, Tableau).
  • Experience handling structured and semi-structured data (CSV, JSON, Parquet, etc.).
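As a small illustration of working with the structured and semi-structured formats mentioned above, the following hypothetical sketch parses CSV and JSON with Python's standard library; the file contents are invented for the example.

```python
# Hypothetical example: the CSV and JSON payloads are invented for illustration.
import csv
import io
import json

# Structured data: CSV rows parsed into dictionaries.
csv_text = "id,name\n1,Ada\n2,Grace\n"
records = list(csv.DictReader(io.StringIO(csv_text)))

# Semi-structured data: a nested JSON document.
json_text = '{"id": 3, "tags": ["etl", "gcp"]}'
doc = json.loads(json_text)

print(records[0]["name"], doc["tags"][0])  # Ada etl
```

Parquet handling would typically rely on an external library such as pyarrow, but the read-parse-validate flow is the same idea.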