Job Description
We are seeking an expert-level ETL Engineer to design, develop, and optimize scalable, high-performance ETL/ELT pipelines for our US Healthcare Revenue Cycle Management (RCM) platform. This role is critical for enabling AI/ML, GenAI, LLM Ops, Agentic AI, and RPA workflows, supporting analytics, predictive modeling, and automation across RCM modules such as Claims, Prior Authorization, Coding, Collections, Scheduling, and EDI.
The role requires deep expertise in data engineering, cloud-native ETL, Big Data pipelines, and compliance governance, ensuring reliable, secure, and production-ready data solutions.
Location: Pune
Candidate Requirements
Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
5-10 years of experience in ETL/ELT development, data engineering, or Big Data pipelines.
Hands-on experience with cloud ETL platforms, Big Data tools, and modern AI/ML data requirements.
Proven experience integrating pipelines for AI/ML, GenAI, LLM Ops, RPA, and autonomous agent workflows.
Strong knowledge of HIPAA, SOC 2, GDPR, and healthcare compliance standards.
Technical Expertise
Programming: Python, SQL, R, Scala, Java
ETL / Data Pipeline Tools: Apache Airflow, Apache NiFi, Talend, Informatica, AWS Glue, Azure Data Factory, Google Dataflow, dbt
Big Data Platforms: Hadoop, Spark, Snowflake, Redshift, BigQuery, EMR
Cloud Platforms: AWS, Azure, GCP (serverless functions, storage, pipelines)
Data Integration: APIs, microservices, event-driven streaming (Kafka, Kinesis, Pub/Sub)
RPA / AI Integration: UiPath, Automation Anywhere, Blue Prism (feeding AI models)
CI/CD / DevOps: Git, Docker, Kubernetes, Jenkins, MLOps/LLMOps pipelines
Analytics / Visualization: Tableau, Power BI, Looker (preferred)