11+ Foundry Jobs in Pune | Foundry Job openings in Pune
Apply to 11+ Foundry Jobs in Pune on CutShort.io. Explore the latest Foundry Job opportunities across top companies like Google, Amazon & Adobe.
- Sr. Data Engineer:
Core Skills – Data Engineering, Big Data, PySpark, Spark SQL, and Python
Candidates with a prior Palantir Foundry or Clinical Trial Data Model background are preferred
Major accountabilities:
- Responsible for data engineering: Foundry data pipeline creation, Foundry analysis and reporting, Slate application development, reusable code development and management, and integration of internal or external systems with Foundry for high-quality data ingestion
- Have a good understanding of the Foundry platform landscape and its capabilities
- Perform the data analysis required to troubleshoot and resolve data-related issues
- Define company data assets (data models) and the PySpark/Spark SQL jobs that populate them
- Design data integrations and the data-quality framework
- Design and implement integrations with internal and external systems and the F1 AWS platform using the Foundry Data Connector or Magritte agent
- Collaborate with data scientists, data analysts, and technology teams to document and leverage their understanding of Foundry's integration with different data sources
- Actively participate in agile work practices
- Coordinate with quality engineers to ensure that all quality controls, naming conventions, and best practices are followed
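As a concrete sketch of the data-quality work described in the accountabilities above, the snippet below shows a framework-free pipeline stage in plain Python that splits incoming records into clean and quarantined sets. A real Foundry pipeline would express this as a PySpark transform; the record schema here (`subject_id`, `visit_date`, `value`) is purely hypothetical.

```python
# Minimal sketch of a pipeline stage with data-quality checks.
# Assumption: records are dicts with a hypothetical clinical-style schema.
from datetime import datetime

REQUIRED_FIELDS = ("subject_id", "visit_date", "value")

def is_valid(record):
    """A record passes if all required fields are present and typed sensibly."""
    if any(record.get(f) in (None, "") for f in REQUIRED_FIELDS):
        return False
    try:
        datetime.strptime(record["visit_date"], "%Y-%m-%d")  # ISO date check
        float(record["value"])                               # numeric check
    except (ValueError, TypeError):
        return False
    return True

def run_stage(records):
    """Split input into clean rows and quarantined rows held for review."""
    clean = [r for r in records if is_valid(r)]
    quarantined = [r for r in records if not is_valid(r)]
    return clean, quarantined
```

Quarantining (rather than silently dropping) bad rows is one common way to keep the quality controls mentioned above auditable.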
Desired Candidate Profile :
- Strong data engineering background
- Experience with Clinical Data Model is preferred
- Experience in
- SQL Server, Postgres, Cassandra, Hadoop, and Spark for distributed data storage and parallel computing
- Java and Groovy for our back-end applications and data integration tools
- Python for data processing and analysis
- Cloud infrastructure based on AWS EC2 and S3
- 7+ years of IT experience, including 2+ years with the Palantir Foundry platform and 4+ years with Big Data platforms
- 5+ years of Python and PySpark development experience
- Strong troubleshooting and problem solving skills
- BTech or master's degree in computer science or a related technical field
- Experience designing, building, and maintaining big data pipeline systems
- Hands-on experience with the Palantir Foundry platform and custom Foundry app development
- Able to design and implement data integration between Palantir Foundry and external applications using the Foundry data connector framework
- Hands-on with programming languages, primarily Python, R, Java, and Unix shell scripting
- Hands-on experience with the AWS or Azure cloud platform and stack
- Strong grasp of API-based architecture and concepts; able to build quick PoCs using API integration and development
- Knowledge of machine learning and AI
- Skill and comfort working in a rapidly changing environment with dynamic objectives and iteration with users
- Demonstrated ability to continuously learn, work independently, and make decisions with minimal supervision
JOB DETAILS:
* Job Title: Lead I - Azure, Terraform, GitLab CI
* Industry: Global Digital Transformation Solutions Provider
* Salary: Best in Industry
* Experience: 3-5 years
* Location: Trivandrum/Pune
Job Description
Job Title: DevOps Engineer
Experience: 4–8 Years
Location: Trivandrum & Pune
Job Type: Full-Time
Mandatory skills: Azure, Terraform, GitLab CI, Splunk
We are looking for an experienced and driven DevOps Engineer with 4 to 8 years of experience to join our team in Trivandrum or Pune. The ideal candidate will take ownership of automating cloud infrastructure, maintaining CI/CD pipelines, and implementing monitoring solutions to support scalable and reliable software delivery in a cloud-first environment.
Key Responsibilities
- Design, manage, and automate Azure cloud infrastructure using Terraform.
- Develop scalable, reusable, and version-controlled Infrastructure as Code (IaC) modules.
- Implement monitoring and logging solutions using Splunk, Azure Monitor, and Dynatrace.
- Build and maintain secure and efficient CI/CD pipelines using GitLab CI or Harness.
- Collaborate with cross-functional teams to enable smooth deployment workflows and infrastructure updates.
- Analyze system logs and performance metrics to troubleshoot and optimize performance.
- Ensure infrastructure security, compliance, and scalability best practices are followed.
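The log-analysis responsibility above can be illustrated with a small Python sketch that tallies severity levels from application log lines and computes an error rate. In practice this query would run inside Splunk (SPL) or Azure Monitor; the `TIMESTAMP LEVEL message` log format assumed here is hypothetical.

```python
# Minimal sketch: count log severities and compute an error rate.
# Assumed line format (hypothetical): "TIMESTAMP LEVEL message...".
from collections import Counter

def severity_counts(lines):
    """Count the severity token (second field) of each well-formed line."""
    counts = Counter()
    for line in lines:
        parts = line.split(maxsplit=2)
        if len(parts) >= 2:        # skip malformed lines
            counts[parts[1]] += 1
    return counts

def error_rate(counts):
    """Fraction of well-formed lines whose severity is ERROR."""
    total = sum(counts.values())
    return counts.get("ERROR", 0) / total if total else 0.0
```

An alerting rule would then simply compare `error_rate` against a threshold, which is the same shape as a Splunk alert on a scheduled search.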
Mandatory Skills
Candidates must have hands-on experience with the following technologies:
- Azure – Cloud infrastructure management and deployment
- Terraform – Infrastructure as Code for scalable provisioning
- GitLab CI – Pipeline development, automation, and integration
- Splunk – Monitoring, logging, and troubleshooting production systems
Preferred Skills
- Experience with Harness (for CI/CD)
- Familiarity with Azure Monitor and Dynatrace
- Scripting proficiency in Python, Bash, or PowerShell
- Understanding of DevOps best practices, containerization, and microservices architecture
- Exposure to Agile and collaborative development environments
Skills Summary
Azure, Terraform, GitLab CI, Splunk (Mandatory) Additional: Harness, Azure Monitor, Dynatrace, Python, Bash, PowerShell
******
Notice period - 0 to 15 days only
Job stability is mandatory
Location: Trivandrum/Pune
Strong UX Researcher Profile
Mandatory (Experience 1) - Must have 3+ years of hands-on UX research / user research experience for B2C products
Mandatory (Experience 2) - Should have hands-on experience with user research, personas, user stories, workflows, and usability testing
Mandatory (Experience 3) - Must have worked with both qualitative and quantitative research methodologies
Mandatory (Portfolio) - Portfolio showcasing UX research work for strong B2C apps, websites, or product companies
Mandatory (CTC) - The CTC breakup offered will be 80% fixed and 20% variable, as per company policy
Preferred
Preferred (Company) - Product Company
AccioJob is conducting a Walk-In Hiring Drive with Turtle Software for the position of SDE Intern.
To apply, register and select your slot here: https://go.acciojob.com/6Hmbnb
Required Skills: HTML, CSS, JavaScript, React, Node, Python, Django
Eligibility:
Degree: All
Branch: All
Graduation Year: 2025, 2026
Work Details:
Work Location: Pune
CTC: 4 LPA to 5 LPA
Evaluation Process:
Round 1: Offline Assessment at AccioJob Pune Skill Centre
Further Rounds (for shortlisted candidates only):
Resume Evaluation, Technical Interview 1, Technical Interview 2, Technical Interview 3, HR Discussion
Important Note: Bring your laptop & earphones for the test.
Register here:
https://go.acciojob.com/qGCDy3
1. Develop and maintain user interfaces for web applications using Angular:
- Build and implement high-quality user interfaces using JavaScript and the Angular framework
- Write efficient JavaScript code, along with HTML and CSS
- Ensure high performance on mobile and desktop
- Write unit test cases
2. Work on the latest version of Angular (Angular 14 and above) to implement robust and scalable solutions.
3. Designing, coding, testing, and deploying the application.
4. Debugging issues in the application code to ensure it is working correctly
5. Collaborate with cross-functional teams to integrate RESTful APIs and ensure seamless application functionality.
6. Work according to Agile methodologies, participating in sprints, retrospectives, and other Agile ceremonies.
7. Demonstrate effective communication skills in English, enabling smooth collaboration in an international environment.
8. Adaptability and flexibility to work closely with deployment teams across various time zones.
Job Responsibilities
Responsibilities for this position include, but are not limited to, the following:
- Development experience of 3-6 years
- Experience working with Azure cloud-hosted web applications and technologies
- Design and develop back-end microservices and REST APIs for connected devices, web applications, and mobile applications
- Stay up to date on relevant technologies, plug into user groups, and understand trends and opportunities to ensure we are using the best techniques and tools
- Meeting with the software development team to define the scope and scale of software projects.
- Designing software system architecture.
- Completing data structures and design patterns.
- Designing and implementing scalable web services, applications, and APIs.
- Developing and maintaining internal software tools.
- Writing low-level and high-level code.
- Troubleshooting and bug fixing.
- Identifying bottlenecks and improving software efficiency.
- Collaborating with the design team on developing micro-services.
- Writing technical documents.
- Be an active professional in continuous learning resulting in enhancement in organizational objectives.
- Provide technical support to all internal teams and customers as it relates to the product.
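As a sketch of the microservice/REST API work listed above, the snippet below implements a single health-check endpoint using only the Python standard library. A production service would use a proper framework (the posting mentions Node.js back ends); the `/health` route and JSON payload are hypothetical.

```python
# Minimal REST-style endpoint sketch using only the standard library.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class ApiHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):
        # Silence the default per-request logging for this sketch.
        pass

def serve(port=8000):
    """Block forever, serving the API on localhost."""
    HTTPServer(("127.0.0.1", port), ApiHandler).serve_forever()
```

A health endpoint like this is typically what load balancers and orchestrators poll to decide whether an instance should receive traffic.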
Requirements:
- Bachelor’s degree in computer engineering or computer science.
- Previous experience as a full-stack engineer and with IoT products.
- Advanced knowledge of front-end languages including HTML5, CSS, JavaScript, Angular, and React.
- Proficient in back-end languages including Node.js, with basic knowledge of Java and C#.
- Experience with cloud computing APIs and Cloud Providers such as Azure or AWS.
- Working knowledge of database systems (Cassandra, CosmosDB, Redis, PostgreSQL)
- Messaging systems (RabbitMQ, MQTT, Kafka)
- Cloud-based distributed application scaling and data processing in the cloud
- Agile/Scrum methodology
- Advanced troubleshooting skills.
- Familiarity with JavaScript frameworks.
- Good communication skills.
- High-level project management skills.
- Well versed with Programming Languages C, C++
- At least 2 years of experience working in projects in CAD/CAM or 3D graphics APIs like OpenGL, OpenGL-ES, or WebGL, etc.
- Good hold on Geometry
- Able to work independently as needed.
- Able to participate in and drive customer scoping sessions; able to do designs, design reviews, code reviews, etc.
- Good communication and client facing skills
- Quick learner
- Self-driven person
- We are looking for someone who can join soon
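The "good hold on geometry" requirement above can be made concrete with a small example: computing a 3D triangle's area from the cross product of two edge vectors, the kind of primitive that underpins CAD/CAM kernels and OpenGL/WebGL mesh code. This is a generic illustration, not any particular library's API.

```python
# Triangle area in 3D via the cross product (pure Python sketch).
import math

def sub(a, b):
    """Component-wise vector difference a - b."""
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(u, v):
    """Cross product u x v; its direction is the triangle's normal."""
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def triangle_area(p0, p1, p2):
    """Half the magnitude of the cross product of two edge vectors."""
    n = cross(sub(p1, p0), sub(p2, p0))
    return 0.5 * math.sqrt(n[0] ** 2 + n[1] ** 2 + n[2] ** 2)
```

The same cross-product normal is what a renderer would feed to its lighting calculations after normalization.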
Years of Experience – 2-3 years
Location – Flexible (Pune/Jaipur Preferred), India
Position Summary
At Clarista.io, we are driven to create a connected data world for enterprises, empowering their employees with the information they need to compete in the digital economy. Information is power, but only if it can be harnessed by people.
Clarista turns current enterprise data silos into a ‘Live Data Network’ that is easy to use and always available, with the flexibility to create any analytics and controls to ensure the quality and security of the information.
Clarista is designed with business teams in mind; hence, ensuring performance with large datasets and a superior user experience is critical to the success of the product.
What You'll Do
You will be part of our data platform and data engineering team. As part of this agile team, you will work in our cloud-native environment and perform the following activities to support core product development and client-specific projects:
• You will develop the core engineering frameworks for an advanced self-service data analytics product.
• You will work with multiple types of data storage technologies such as relational, blobs, key-value stores, document databases and streaming data sources.
• You will work with latest technologies for data federation with MPP (Massive Parallel Processing) capabilities
• Your work will entail backend architecture to enable product capabilities, data modeling, data queries for UI functionality, data processing for client specific needs and API development for both back-end and front-end data interfaces.
• You will build real-time monitoring dashboards and alerting systems.
• You will integrate our product with other data products through APIs
• You will partner with other team members in understanding the functional / non-functional business requirements, and translate them into software development tasks
• You will follow the software development best practices in ensuring that the code architecture and quality of code written by you is of high standard, as expected from an enterprise software
• You will be a proactive contributor to team and project discussions
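The MPP (Massive Parallel Processing) idea mentioned above follows a simple pattern: partition the data, compute a partial aggregate per partition, then merge the partials. The toy sketch below computes a global mean that way in pure Python; real engines such as Spark or an MPP federation layer run the partition step on many nodes in parallel.

```python
# Toy partition-then-merge aggregation (the MPP/map-reduce pattern).
def partition(rows, n):
    """Round-robin rows into n partitions (stand-ins for worker nodes)."""
    parts = [[] for _ in range(n)]
    for i, row in enumerate(rows):
        parts[i % n].append(row)
    return parts

def partial_sum_count(part):
    """Per-partition partial aggregate: (sum, count)."""
    return (sum(part), len(part))

def merge(partials):
    """Combine partials into the global mean."""
    total = sum(s for s, _ in partials)
    count = sum(c for _, c in partials)
    return total / count if count else 0.0

def parallel_mean(rows, n=4):
    return merge([partial_sum_count(p) for p in partition(rows, n)])
```

The key design point is that `(sum, count)` pairs merge exactly, whereas averaging per-partition means directly would be wrong for unevenly sized partitions.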
Who you are
• Strong education track record - a bachelor's or an advanced degree in Computer Science or a related engineering discipline from the Indian Institutes of Technology or an equivalent premier institute.
• 2-3 years of experience in Big Data and Data Engineering.
• Strong knowledge of advanced SQL, data federation and distributed architectures
• Excellent Python programming skills. Familiarity with Scala and Java is highly preferred
• Strong knowledge of and experience with modern distributed data stack components such as Spark, Hive, Airflow, Kubernetes, Docker, etc.
• Experience with cloud environments (AWS, Azure) and native cloud technologies for data storage and data processing
• Experience with relational SQL and NoSQL databases, including Postgres, blob stores, MongoDB, etc.
• Experience with data pipeline and workflow management tools: Airflow, Dataflow, Dataproc etc.
• Experience with Big Data processing and performance optimization
• Should know how to write modular and optimized code.
• Should have good knowledge around error handling.
• Fair understanding of responsive design and cross-browser compatibility issues.
• Experience with version control systems such as Git
• Strong problem solving and communication skills.
• Self-starter, continuous learner.
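The "modular and optimized code" and "good knowledge around error handling" points above can be sketched with a small reusable retry helper for transient failures, a common pattern in data pipelines. The exception types and backoff policy chosen here are illustrative assumptions, not a prescribed standard.

```python
# Reusable retry helper for transient failures (error-handling sketch).
import time

def retry(fn, attempts=3, base_delay=0.0, exceptions=(Exception,)):
    """Call fn(); on a listed exception, retry up to `attempts` times.

    Sleeps base_delay * attempt_index between tries (linear backoff).
    Re-raises the last exception if every attempt fails.
    """
    last = None
    for i in range(attempts):
        try:
            return fn()
        except exceptions as exc:
            last = exc
            time.sleep(base_delay * i)
    raise last
```

Keeping the retry policy in one helper, rather than scattering try/except blocks through pipeline code, is exactly the kind of modularity the posting asks for.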
Good to have some exposure to
• Start-up experience is highly preferred
• Exposure to any Business Intelligence (BI) tools like Tableau, Dundas, Power BI etc.
• Agile software development methodologies.
• Working in multi-functional, multi-location teams
What You'll Love About Us – Do ask us about these!
• Be an integral part of the founding team. You will work directly with the founder
• Work Life Balance. You can't do a good job if your job is all you do!
• Prepare for the Future. Academy – we are all learners; we are all teachers!
• Diversity & Inclusion. HeForShe!
• Internal Mobility. Grow with us!
• Business knowledge of multiple sectors
Advanced degree in computer science, math, statistics, or a related discipline (a master's degree is required)
Extensive data modeling and data architecture skills
Programming experience in Python, R
Background in machine learning frameworks such as TensorFlow or Keras
Knowledge of Hadoop or other distributed computing systems
Experience working in an Agile environment
Advanced math skills (important):
- Linear algebra
- Discrete math
- Differential equations (ODEs and numerical methods)
- Theory of statistics 1
- Numerical analysis 1 (numerical linear algebra) and 2 (quadrature)
- Abstract algebra
- Number theory
- Real analysis
- Complex analysis
- Intermediate analysis (point-set topology)
Strong written and verbal communications
Hands-on experience with NLP and NLG
Experience with advanced statistical techniques and concepts (GLM/regression, random forests, boosting, trees, text mining) and their application
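One technique from the statistics list above, ordinary least-squares regression, has a closed-form solution for the simple (one-predictor) case, shown below in pure Python with no numpy. GLMs generalize this with link functions and iterative fitting; this sketch covers only the linear identity-link case.

```python
# Simple linear regression via the closed-form OLS formulas.
def fit_ols(xs, ys):
    """Return (slope, intercept) minimizing squared error for y = a*x + b.

    slope = sum((x - mean_x)(y - mean_y)) / sum((x - mean_x)^2)
    intercept = mean_y - slope * mean_x
    """
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx
```

For multiple predictors the same idea becomes the normal equations in matrix form, which is where the linear algebra background listed above comes in.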




