
About Superclaims
Superclaims modernizes health insurance claims adjudication with intelligent automation. We help insurers and TPAs replace manual, document-heavy workflows with faster, more accurate decisions at scale.
Role: Python Backend Developer
We are looking for a Python Backend Developer who is excited to build AI-powered automation products in a fast-paced startup environment.
What you'll do
- Build and maintain scalable backend systems and APIs
- Develop intelligent data extraction pipelines using AI/ML
- Design and implement agentic workflows with LangGraph
- Design efficient database schemas and optimize queries in PostgreSQL
- Integrate and work with LLMs (OpenAI, Gemini, or similar)
- Collaborate with product, frontend, and data teams to deliver end-to-end features
- Write clean, tested, and well-documented code
Must-have skills
- Strong proficiency in Python and a modern web framework (FastAPI or similar)
- Experience with PostgreSQL and an ORM (SQLAlchemy preferred)
- Solid understanding of RESTful API design and best practices
- Hands-on experience or strong familiarity with LangGraph
- Experience working with LLMs (OpenAI, Gemini, or similar providers)
- Comfort with Git/version control and collaborative development workflows
Nice-to-have skills
- Experience with Docker and containerized deployments
- Knowledge of Redis for caching or background tasks
- Exposure to cloud platforms (GCP, AWS, or Azure)
- Experience with vector databases and retrieval-augmented generation
- Basic prompt engineering skills
- Experience with object storage (S3/MinIO)
What we're looking for
- 1+ years of Python backend development experience (open to exceptional freshers)
- Fast learner with genuine curiosity about AI/ML and automation
- Prior startup experience preferred
- Ownership mindset, bias for action, and comfort with ambiguity
- Ready to relocate to Hyderabad (work location)
How to apply
Please share:
- Your resume
- GitHub/Portfolio link
- A brief note on why you're interested in AI-powered automation and Superclaims

Strong Senior Backend Engineer profiles
Mandatory (Experience 1) – Must have 5+ years of hands-on Backend Engineering experience building scalable, production-grade systems
Mandatory (Experience 2) – Must have strong backend development experience using one or more frameworks: FastAPI/Django (Python), Spring (Java), or Express (Node.js)
Mandatory (Experience 3) – Must have deep understanding of relevant libraries, tools, and best practices within the chosen backend framework
Mandatory (Experience 4) – Must have strong experience with databases, including SQL and NoSQL, along with efficient data modeling and performance optimization
Mandatory (Experience 5) – Must have experience designing, building, and maintaining APIs, services, and backend systems, including system design and clean code practices
Preferred (Domain) – Experience with financial systems, billing platforms, or fintech applications is highly preferred (a fintech background is a strong plus)
Mandatory (Company) – Must have worked in product companies / startups, preferably Series A to Series D
Mandatory (Education) – Candidates from Tier-1 engineering institutes (IITs, BITS) are highly preferred
Role: Sr. Azure Data Engineer
Experience: 8–10 Years
Work Timings: 1:30 PM – 10:30 PM IST
Location: Bellandur, Bengaluru (Work from Office)
Company: Chevron
Employment Type: 6–12 month contract
Role Overview
We are seeking an experienced Senior Data Engineer to design and deliver scalable cloud data solutions on Azure. The ideal candidate will have strong expertise in Databricks, PySpark, and modern data architectures, with exposure to energy domain standards like OSDU.
Key Responsibilities
- Architect and design robust Azure-based data solutions using Databricks, ADLS, and PaaS services
- Define and implement scalable data lakehouse architectures aligned with OSDU standards
- Build and manage end-to-end data pipelines for batch and real-time processing using PySpark
- Establish data governance frameworks including metadata, lineage, security, and access control
- Implement DevOps best practices (CI/CD, Azure Pipelines, GitHub, automated deployments)
- Collaborate with stakeholders to translate business needs into technical solutions
- Develop and maintain architecture documentation, solution patterns, and standards
- Provide technical leadership and mentorship to engineering teams
- Optimize solutions for performance, cost, reliability, and security
- Ensure alignment with enterprise architecture and compliance standards
- Drive adoption of modular and reusable cloud data components
Required Skills & Qualifications
Core Technical Skills
- Azure Databricks, Apache Spark (PySpark), Delta Lake, Unity Catalog
- Azure Data Lake Storage (ADLS), Azure Data Factory, Synapse Analytics
- Strong experience in Python-based data engineering
- Data pipeline development (batch + real-time)
Architecture & Advanced Skills
- Data Lakehouse architecture and distributed systems
- Microservices, APIs, and integration frameworks
- OSDU (Open Subsurface Data Universe) or similar energy data models
DevOps & Tools
- CI/CD tools: Azure Pipelines, GitHub Actions
- Infrastructure as Code: Terraform or similar
Other Skills
- Data governance, security, compliance, and cost optimization
- Strong analytical and problem-solving skills
- Excellent communication and stakeholder management
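To ground the PySpark/Delta responsibilities above, here is a minimal batch-step sketch. The source path, the `value` column, and the validity thresholds are assumptions for illustration; OSDU-specific modeling is omitted. The validation rule is kept as a pure function so it can be tested without a Spark cluster:

```python
def in_valid_range(value, low=0.0, high=1000.0):
    """Pure validation rule, framework-free so it is unit-testable without Spark."""
    return value is not None and low <= value <= high

def run_batch(spark, src_path, dst_path):
    """Batch step: read raw CSV from the lake, flag out-of-range rows, write Delta."""
    # Imported lazily so the pure code above has no Spark dependency.
    from pyspark.sql import functions as F
    df = spark.read.option("header", True).csv(src_path)
    flag = F.udf(in_valid_range, "boolean")
    (df.withColumn("valid", flag(F.col("value").cast("double")))
       .write.format("delta").mode("overwrite").save(dst_path))
```

Separating pure rules from the Spark plumbing in this way also makes CI/CD checks (Azure Pipelines, GitHub Actions) cheap to run.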
Senior Python Developer
Experience: 4–8 Years
About the Role
We are looking for a Senior Python Developer to join our team. This role focuses on building and maintaining data-intensive backend systems, handling large-scale datasets, and exposing insights through robust, scalable APIs.
You will work closely with operational and transactional data, design efficient data pipelines, and build backend services that power analytics, reports, and ERP workflows. The ideal candidate is strong in Python, excellent with data and databases, and capable of owning features end-to-end.
Key Responsibilities
- Analyze large datasets to identify trends, inconsistencies, and operational insights.
- Design, build, and maintain backend services and REST APIs using Python and FastAPI.
- Perform advanced data manipulation and aggregation using Pandas, NumPy, and SQL.
- Design and optimize data pipelines for analytics, reporting, and downstream systems.
- Implement automated data quality checks, validations, and monitoring scripts.
- Work closely with product, application, and business teams to translate raw data into clear, actionable outputs.
- Optimize query performance across relational and analytical databases.
- Expose processed data and insights via APIs or dashboards for consumption by web or ERP applications.
- Ensure high standards of code quality, performance, scalability, and maintainability.
- Write clear documentation for APIs, data flows, and processing logic.
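As a small sketch of the Pandas aggregation and automated data-quality work described above (column names such as `customer_id`, `date`, and `amount` are hypothetical):

```python
import pandas as pd

def monthly_totals(df: pd.DataFrame) -> pd.DataFrame:
    """Aggregate transaction amounts per customer per calendar month."""
    return (
        df.assign(month=df["date"].dt.to_period("M"))
          .groupby(["customer_id", "month"], as_index=False)["amount"]
          .sum()
    )

def quality_report(df: pd.DataFrame) -> dict:
    """Simple automated checks: null and negative amounts."""
    return {
        "null_amounts": int(df["amount"].isna().sum()),
        "negative_amounts": int((df["amount"] < 0).sum()),
    }
```

In practice these aggregates would feed the reporting APIs, and the quality report would run as a scheduled monitoring script.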
Required Skills & Qualifications
- 4–8 years of strong, hands-on experience with Python in production systems.
- Extensive experience with data handling, processing, and large datasets.
- Strong experience building APIs using FastAPI (or similar frameworks).
- Deep expertise in Pandas, NumPy, and SQL.
- Solid experience with MySQL and PostgreSQL.
- Experience working with analytical or reporting workloads.
- Strong understanding of data modeling, joins, aggregations, and performance tuning.
- Proficiency with Git and collaborative development workflows.
- Strong analytical and problem-solving skills with the ability to work independently.
Good to Have
- Experience with ClickHouse, Databricks, or Elasticsearch.
- Exposure to data engineering concepts such as ETL/ELT, batch processing, and data pipelines.
- Experience with workflow orchestration tools (Airflow, Prefect, Dagster).
- Familiarity with data visualization libraries (Plotly, Matplotlib, Seaborn).
- Experience with AWS services (S3, EC2, RDS, Lambda).
- Prior experience integrating data services into ERP or business applications.
Soft Skills
- Strong analytical mindset and attention to detail.
- High ownership and accountability.
- Ability to work independently with minimal supervision.
- Clear communication and documentation skills.
- Proactive, solution-oriented approach.
We are seeking an experienced Senior Software Engineer to join our Vet Healthcare Technology team. In this role, you will design, develop, and maintain cloud-native applications on Azure that power our Practice Management platform. You'll collaborate closely with cross-functional teams (clinical SMEs, architects, QA, and DevOps) to deliver robust, scalable, and secure solutions using .NET 8, React, and modern Azure services.
Key Responsibilities
- Architecture & Design
- Lead design discussions and apply proven design patterns (e.g., CQRS, Repository, Factory) to ensure clean, maintainable code.
- Define microservices boundaries and integration strategies (APIs, messaging) for HL7 and FHIR data flows.
- Development & Integration
- Build backend services in .NET 8, leveraging Azure Functions, Logic Apps, Service Bus, API Gateway, and Storage Services.
- Develop responsive front-end interfaces using React, TypeScript, and state-management libraries (e.g., Redux or Context API).
- Implement data persistence layers for SQL Server and PostgreSQL, including schema design, stored procedures, and performance tuning.
- Integrate with healthcare standards (HL7 v2/v3, FHIR R4) and third-party systems via secure, high-throughput interfaces.
- Quality & Compliance
- Write unit and integration tests to ensure code quality; participate in code reviews and pair-programming sessions.
- Follow best practices for security, privacy, and compliance in healthcare (HIPAA, GDPR, etc.).
- Mentorship & Collaboration
- Mentor mid-level engineers, drive knowledge-sharing sessions, and contribute to technical roadmaps.
- Work in an Agile/Scrum environment: estimate user stories, attend sprint ceremonies, and deliver incremental value.
Role Overview:
We are seeking skilled Backend Developers to design, build, and maintain efficient, scalable, and secure server-side logic and services. The ideal candidate will have strong expertise in Python, Flask, and Google Cloud Platform (GCP), with experience building APIs, handling databases, and integrating cloud services in production environments.
Required Experience: 4+ Years
Location: Chennai; open to remote for strong candidates
Key Responsibilities:
- Collaborate with project teams to understand business requirements and develop efficient, high-quality code.
- Design and implement low-latency, high-availability, and performant applications using frameworks such as Flask or FastAPI.
- Integrate multiple data sources and databases into a unified system while ensuring seamless data storage and third-party library/package integration.
- Create scalable and optimized database schemas to support complex business logic and manage large volumes of data.
- Conduct thorough testing using pytest and unittest, debugging applications to ensure they run smoothly.
Required Skills & Qualifications:
- 3+ years of experience as a Python developer with strong communication skills.
- Bachelor's degree in Computer Science, Software Engineering or a related field.
- In-depth knowledge of Python frameworks such as Flask or FastAPI.
- Strong expertise in cloud technologies, GCP preferred.
- Deep understanding of microservices architecture, multi-tenant architecture, and best practices in Python development.
- Familiarity with serverless architecture and frameworks like GCP Cloud Functions.
- Experience with deployment using Docker, Nginx, Gunicorn.
- Hands-on experience with SQL and NoSQL databases such as MySQL and Firebase.
- Proficiency with Object Relational Mappers (ORMs) like SQLAlchemy.
- Demonstrated ability to handle multiple API integrations and write modular, reusable code.
- Strong knowledge of user authentication and authorization mechanisms across multiple systems and environments.
- Familiarity with scalable application design principles and event-driven programming in Python.
- Solid experience in unit testing, debugging, and code optimization.
- Hands-on experience with modern software development methodologies, including Agile and Scrum.
- Experience with CI/CD pipelines and automation tools like Jenkins, GitLab CI, or CircleCI.
- Experience with version control systems.
Driving Results:
- A strong individual contributor and a good team player.
- A flexible attitude toward work as needs evolve.
- Proactively identify & communicate issues and risks.
Other Personal Characteristics:
- Dynamic, engaging, self-reliant developer
- Ability to deal with ambiguity
- Maintains a collaborative and analytical approach
- Self-confident and humble
- Open to continuous learning
- Intelligent, rigorous thinker who can operate successfully amongst bright people
Experience:
The candidate should have 2+ years of experience with design and development in Java/Scala. Experience with algorithms, data structures, databases, and distributed systems is mandatory.
Required Skills:
Mandatory:
- Core Java or Scala
- Experience in Big Data and Spark
- Extensive experience developing Spark jobs; good OOP knowledge and awareness of enterprise application design patterns
- Ability to analyze, design, develop, and test complex Spark jobs
- Working knowledge of Unix/Linux
- Hands-on experience in Spark: creating RDDs and applying transformations and actions
Good to have:
- Python
- Spark Streaming
- PySpark
- Azure/AWS cloud knowledge on the data storage and compute side
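Since PySpark is listed as good-to-have, here is a sketch of the classic RDD transformation/action flow mentioned above. The input format (`word,count` lines) and the paths are assumptions; the parsing logic is a pure function so it can be tested without a cluster:

```python
def parse_line(line: str):
    """Parse a 'word,count' text line into a (word, int) pair; pure and testable without Spark."""
    word, count = line.split(",")
    return word.strip(), int(count)

def total_counts(sc, path):
    """Classic RDD flow: create an RDD, apply transformations (map, reduceByKey),
    then trigger computation with an action (collect)."""
    return (sc.textFile(path)          # create the RDD
              .map(parse_line)         # transformation
              .reduceByKey(lambda a, b: a + b)  # transformation
              .collect())              # action
```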
Role
- As a Golang developer, you will build optimized, scalable, and modular software using the required technologies. You will develop and code back-end components and connect applications to other web services.
Authority
- Research and test new technologies
- Collaborate with others to build and deliver quality software
- Monitor and oversee the company's data
- Manage users and user roles
- Detect, report, and correct errors
- Propose alternative solutions
Responsibility
- Cooperate with other stakeholders to design, develop, test, release, and improve services
- Maintain development standards, practices & principles
- Build scalable and maintainable software
- Take an analytical approach to what is built and how
Requirements
- At least 4 years of experience with Golang.
- Expertise in implementing microservices (using tools and technologies for messaging, RPC, containerization, etc.)
- Experience working with SQL/NoSQL databases, ability to write complex queries and optimize them
- Understanding of containerization technologies (Docker, RKT, Kubernetes, etc.)
- Basic experience with CI/CD systems (Jenkins, TeamCity, GoCD, Concourse, etc.)
- Basic experience working with AWS/Google Cloud
Requirements:
- Experience in Enterprise Java building restful microservices
- Strong exposure to any of the Java enterprise frameworks such as Spring, Vert.x, Quarkus, or others
- Good exposure to databases such as PostgreSQL, MongoDB, etc.
- Good analytical and problem-solving capabilities along with excellent communication skills.
- Any exposure to UI programming using React or AngularJS is a plus
- Preference for candidates who can join within 15 days, or 30 days at most.