Avyka Hiring Data Engineer & AI/ML Intern

If you’re aiming to build a career in Data Engineering, Cloud Computing, or Artificial Intelligence, this ML internship opportunity at Avyka is worth serious consideration. With the growing demand for professionals skilled in Google Cloud Platform (GCP), data pipelines, and machine learning, this role gives you a strong foundation in all three areas.

This internship is designed for freshers and recent graduates who want hands-on exposure to real-world data workflows and AI/ML projects while learning from experienced professionals.

About the Company

Avyka is a technology-focused organization working on modern data solutions and AI-driven innovations. The company emphasizes cloud-first development, enabling businesses to build scalable, efficient, and intelligent systems using platforms like Google Cloud.

With a focus on learning, mentorship, and innovation, Avyka provides an environment where freshers can gain practical exposure to industry-level tools and technologies.

Role Overview

As a Data Engineer & AI/ML Intern, you will be working at the intersection of data engineering and machine learning. Your primary focus will be on building and supporting data pipelines, cloud workflows, and AI/ML proof-of-concepts using Google Cloud tools.

This role is not just theoretical—you will actively contribute to projects involving data ingestion, transformation, and model development, making it a valuable stepping stone toward roles like Data Engineer, ML Engineer, or Cloud Engineer.

Responsibilities

  • Assist in building scalable data pipelines using GCP tools like BigQuery, Bigtable, Spanner, and AlloyDB
  • Support data ingestion and transformation using Dataflow, Dataproc, and Pub/Sub
  • Help develop basic machine learning models and contribute to proof-of-concepts (PoCs)
  • Monitor workflows and troubleshoot issues under guidance
  • Participate in team discussions, brainstorming sessions, and documentation
  • Stay updated with new GCP features and share insights with the team
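The pipeline responsibilities above can be pictured in miniature. The sketch below uses only the Python standard library to mimic an ingest-transform-load flow; the event data and function names are made up for illustration, and the real role would use GCP services such as Pub/Sub, Dataflow, and BigQuery rather than plain Python.

```python
import json
from io import StringIO

# Hypothetical raw events, standing in for messages arriving via Pub/Sub.
raw_events = StringIO(
    '{"user": "a", "amount": "10.5"}\n'
    '{"user": "b", "amount": "3.0"}\n'
    '{"user": "a", "amount": "2.5"}\n'
)

def ingest(stream):
    """Yield parsed records from a newline-delimited JSON stream."""
    for line in stream:
        yield json.loads(line)

def transform(records):
    """Cast amounts to float -- the kind of cleanup a Dataflow step might do."""
    for rec in records:
        yield {"user": rec["user"], "amount": float(rec["amount"])}

def load(records):
    """Aggregate totals per user, standing in for a write to BigQuery."""
    totals = {}
    for rec in records:
        totals[rec["user"]] = totals.get(rec["user"], 0.0) + rec["amount"]
    return totals

totals = load(transform(ingest(raw_events)))
print(totals)  # {'a': 13.0, 'b': 3.0}
```

Each stage is a generator, so records flow through one at a time, which is the same streaming shape real Dataflow pipelines follow at much larger scale.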

Who Can Apply

  • Education: B.Tech / B.Sc / M.Sc (CS, IT, Math, Stats, or related)
  • Experience: Freshers / Recent Graduates
  • Location: Pune
  • Work Mode: On-site / Hybrid (as per company policy)
  • Skills: Python / Java, SQL, basic ML knowledge

Preferred Skills

  • Basic understanding of Google Cloud Platform (GCP)
  • Knowledge of SQL and database concepts
  • Familiarity with Python or Java programming
  • Interest in Data Engineering, AI/ML, or Cloud Computing
  • Strong analytical thinking and problem-solving ability
  • Good communication and teamwork skills

What You’ll Get

  • Mentorship from experienced GCP professionals
  • Hands-on training in data engineering and AI/ML workflows
  • Exposure to real-world cloud-based projects
  • Opportunity to work on live data pipelines and ML use cases
  • GCP certification (for selected interns)
  • Internship certificate and potential full-time offer (PPO)
  • Flexible and learning-focused work environment

Stipend (Market Estimate) 💰

The company has not disclosed the stipend. However, based on similar Data Engineering & AI/ML internships in Pune, you can expect:

👉 ₹10,000 – ₹25,000/month (estimated)

Note: The actual stipend may vary depending on your technical skills, academic background, and interview performance.

Why This Internship is Valuable

This role stands out because it combines three high-demand domains:

  • Cloud Computing (GCP) ☁️
  • Data Engineering (Pipelines & Big Data) 📊
  • Artificial Intelligence & Machine Learning 🤖

Most internships focus on just one area, but this one gives you exposure to all three, which significantly boosts your career prospects.

Additionally, GCP is becoming increasingly popular among companies, and having hands-on experience with tools like BigQuery, Dataflow, and Pub/Sub can give you a strong edge in interviews.

How to Apply 🚀

To maximize your chances of getting selected:

👉 Build a strong resume that highlights:

  • Projects related to Python, SQL, or data analysis
  • Any machine learning models you’ve worked on
  • Basic exposure to cloud platforms (even small projects count)

👉 Practice:

  • SQL queries (joins, aggregations, filtering)
  • Python basics (data handling, scripting)
  • Basic ML concepts (regression, classification)
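The SQL topics above can be practiced locally without any cloud setup. This sketch uses Python's built-in sqlite3 module with made-up tables to exercise a join, an aggregation, and a HAVING filter in a single query; table and column names are invented for the example.

```python
import sqlite3

# In-memory database with hypothetical practice tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER, amount REAL);
    INSERT INTO users VALUES (1, 'Asha'), (2, 'Ravi'), (3, 'Meena');
    INSERT INTO orders VALUES (1, 1, 100.0), (2, 1, 50.0), (3, 2, 75.0);
""")

# Join users to orders, aggregate per user, then filter groups with HAVING.
rows = conn.execute("""
    SELECT u.name, SUM(o.amount) AS total
    FROM users u
    JOIN orders o ON o.user_id = u.id
    GROUP BY u.name
    HAVING SUM(o.amount) > 60
    ORDER BY total DESC
""").fetchall()

print(rows)  # [('Asha', 150.0), ('Ravi', 75.0)]
```

Note that Meena drops out of the result because an inner JOIN keeps only users with matching orders; rewriting the query with LEFT JOIN is a good follow-up exercise.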

👉 Bonus Tip:
If you have a GitHub portfolio with data projects, it will significantly improve your chances.

👉 Apply by clicking the application link/button provided in the job post and ensure your resume reflects your practical skills clearly.

Disclaimer: This job information is collected from official or public sources. We do not charge any fees for job updates and do not guarantee recruitment. Please verify details from the official source before applying. We are not responsible for any loss arising from the use of this information.
