Job Brief.
Role: Senior AI Engineer
Location: Hyderabad
Work Mode: Hybrid
Experience Required: 8–12 years
Job Description.
This high-impact role focuses on building scalable machine learning and generative AI solutions that drive measurable improvements across business units.
Key Responsibilities.
- Collaborate with cross-functional teams to gather business requirements and understand the scope of analytical problems.
- Partner closely with data scientists to implement, test, and scale ML/GenAI models on the Azure platform.
- Build and manage scalable data pipelines (real-time and batch) using tools like Databricks, DBT, and Airflow.
- Develop and maintain secure, high-quality, production-level code that adheres to best practices.
- Integrate with structured and unstructured data sources via APIs and automated pipelines.
- Develop and consume scalable APIs (multi-threaded/batched), using frameworks and servers such as FastAPI and Uvicorn.
- Design and maintain CI/CD pipelines using tools like Jenkins and container technologies (Docker, Kubernetes).
- Communicate complex technical concepts clearly and effectively to business stakeholders.
- Stay up to date with the latest advancements in AI/ML, particularly in Generative AI, Prompt Engineering, and Retrieval-Augmented Generation (RAG) with LLMs.
- Foster a collaborative team culture focused on knowledge sharing and continuous improvement.
Qualifications.
- Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, Mathematics, or a related technical field.
- 8–12 years of total professional experience, with 5+ years focused on Machine Learning (ML).
- Strong proficiency in Python for developing and deploying scalable ML solutions.
- Hands-on experience with ML libraries and frameworks such as TensorFlow, PyTorch, and scikit-learn.
- Proven ability to design, build, and deploy end-to-end ML/NLP pipelines in production environments.
- Deep understanding of NLP techniques: text cleaning and pre-processing, entity extraction, encoder-decoder architectures, semantic similarity, etc.
- Experience working in distributed, high-throughput, low-latency systems.
- Proficient in containerization technologies such as Docker and Kubernetes, with experience in Linux command-line scripting.
- Working knowledge of NoSQL databases like MongoDB, Elasticsearch, or Azure Cosmos DB.
- Exposure to orchestration and workflow tools such as Apache Airflow, Luigi, or DBT.
- Familiarity with scripting/programming languages such as R, Scala, Go, or Java is a plus.
- Strong communication, collaboration, and stakeholder engagement skills.
- Passion for continuous learning, with a keen interest in Generative AI, LLMs, Prompt Engineering, and RAG.
- This role is open exclusively to female candidates in support of our commitment to building a diverse and inclusive workforce.