11+ Data Services Jobs in Pune
Apply to 11+ Data Services jobs in Pune on CutShort.io. Explore the latest Data Services job opportunities across top companies like Google, Amazon & Adobe.
TVARIT GmbH develops and delivers artificial intelligence (AI) solutions for the manufacturing, automotive, and process industries. With its software products, TVARIT enables its customers to make intelligent, well-founded decisions, e.g., in predictive maintenance, OEE improvement, and predictive quality. Renowned reference customers, proven technology, a strong research team from renowned universities, and awards from prestigious AI programs (e.g., EU Horizon 2020) make TVARIT one of the most innovative AI companies in Germany and Europe.
We are looking for a self-motivated person with a positive "can-do" attitude and excellent oral and written communication skills in English.
We are seeking a skilled and motivated Senior Data Engineer from the manufacturing industry, with over four years of experience, to join our team. The Senior Data Engineer will oversee the department’s data infrastructure, including developing a data model, integrating large amounts of data from different systems, building and enhancing a data lakehouse and the analytics environment on top of it, and writing scripts to facilitate data analysis. The ideal candidate will have a strong foundation in ETL pipelines and Python; experience with Azure and Terraform is a plus. This role requires a proactive individual who can contribute to our data infrastructure and support our analytics and data science initiatives.
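For context, here is a minimal sketch of the kind of ETL scripting this role describes, written in Python with pandas and SQLAlchemy; the file path, connection string, column names, and table name are illustrative assumptions, not details of TVARIT's actual stack:

```python
# Minimal ETL sketch: extract a raw sensor export, clean it, and load it
# into a warehouse table. All names here are illustrative placeholders.
import pandas as pd
from sqlalchemy import create_engine


def run_etl(csv_path: str, db_url: str, table: str) -> int:
    # Extract: read the raw machine/sensor export.
    raw = pd.read_csv(csv_path, parse_dates=["timestamp"])

    # Transform: drop duplicates and rows missing the key fields.
    clean = (
        raw.drop_duplicates(subset=["machine_id", "timestamp"])
           .dropna(subset=["machine_id", "timestamp"])
    )

    # Load: append the cleaned rows into the analytics table.
    engine = create_engine(db_url)
    clean.to_sql(table, engine, if_exists="append", index=False)
    return len(clean)


if __name__ == "__main__":
    rows = run_etl("sensor_export.csv", "postgresql://user:pass@host/analytics", "sensor_readings")
    print(f"Loaded {rows} rows")
```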
Skills Required:
- Experience in the manufacturing industry (metal industry is a plus)
- 4+ years of experience as a Data Engineer
- Experience in data cleaning, structuring, and manipulation
- Ability to architect and optimize complex data pipelines, leading the design and implementation of scalable data infrastructure and ensuring data quality and reliability at scale
- ETL Pipelines: Proven experience in designing, building, and maintaining ETL pipelines.
- Python: Strong proficiency in Python programming for data manipulation, transformation, and automation.
- Experience in SQL and data structures
- Knowledge of big data technologies such as Apache Spark, Flink, and Hadoop, and of NoSQL databases.
- Knowledge of at least one cloud platform, such as AWS, Azure, or Google Cloud Platform.
- Proficient in data management and data governance
- Strong analytical and problem-solving skills, with the ability to extract actionable insights from raw data to improve the business.
- Excellent communication and teamwork abilities.
Nice To Have:
- Azure: Experience with Azure data services (e.g., Azure Data Factory, Azure Databricks, Azure SQL Database).
- Terraform: Knowledge of Terraform for infrastructure as code (IaC) to manage cloud resources.
- Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field from top-tier Indian Institutes of Information Technology (IIITs).
Benefits And Perks:
- A culture that fosters innovation, creativity, continuous learning, and resilience
- Progressive leave policy promoting work-life balance
- Mentorship opportunities with highly qualified internal resources and industry-driven programs
- Multicultural peer groups and supportive workplace policies
- Annual workcation program allowing you to work from various scenic locations
- Experience the unique environment of a dynamic start-up
Why should you join TVARIT?
Working at TVARIT, a deep-tech German IT startup, offers a unique blend of innovation, collaboration, and growth opportunities. We seek individuals eager to adapt and thrive in a rapidly evolving environment.
If this opportunity excites you and aligns with your career aspirations, we encourage you to apply today!
We are looking for an experienced and organized Project Manager to oversee end-to-end project execution, ensure timely delivery, and coordinate between teams, clients, and stakeholders. The role requires strong leadership, planning, and communication skills to successfully manage projects and achieve business objectives.
Key Responsibilities:
- Plan, execute, and deliver projects within scope, budget, and timelines.
- Define project objectives, deliverables, and resource requirements.
- Coordinate with cross-functional teams (sales, operations, documentation, counseling, etc.).
- Track project progress, prepare reports, and communicate updates to management and stakeholders.
- Identify project risks, issues, and bottlenecks; implement corrective actions.
- Ensure client requirements are clearly understood and met with quality standards.
- Manage project documentation, compliance, and reporting.
- Lead, mentor, and motivate team members to achieve performance goals.
- Optimize workflows and improve efficiency in project execution.
Requirements:
- Proven work experience as a Project Manager.
- Strong knowledge of project management tools, methodologies, and reporting.
- Excellent leadership, problem-solving, and organizational skills.
- Strong communication and stakeholder management abilities.
- Ability to handle multiple projects simultaneously and meet deadlines.
- Proficiency in MS Office and project management tools (e.g., Trello, Asana, MS Project, Jira).
Hiring: Azure Data Engineer
⭐ Experience: 2+ Years
📍 Location: Pune, Bhopal, Jaipur, Gurgaon, Bangalore
⭐ Work Mode: Hybrid
⏱️ Notice Period: Immediate joiners only (candidates currently serving their notice period may also apply)
Passport: Mandatory & valid
Mandatory Skills:
Azure Synapse, Azure Databricks, Azure Data Factory (ADF), SQL, Delta Lake, ADLS, ETL/ELT, PySpark.
Responsibilities:
- Build and maintain data pipelines using ADF, Databricks, and Synapse.
- Develop ETL/ELT workflows and optimize SQL queries.
- Implement Delta Lake for scalable lakehouse architecture (see the sketch after this list).
- Create Synapse data models and Spark/Databricks notebooks.
- Ensure data quality, performance, and security.
- Collaborate with cross-functional teams on data requirements.
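As a rough illustration of the Delta Lake responsibility above, here is a minimal PySpark sketch, assuming a Databricks or Synapse Spark environment; the ADLS container, storage account, and table names are hypothetical placeholders, not details from this posting:

```python
# Minimal bronze-layer Delta Lake sketch (PySpark on Databricks/Synapse).
# The ADLS path and table name are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # already provided in Databricks/Synapse notebooks

# Ingest raw landing-zone files from ADLS.
raw = (
    spark.read
    .option("header", True)
    .csv("abfss://landing@examplestorage.dfs.core.windows.net/orders/")
)

# Light cleanup before persisting to the bronze Delta table.
bronze = (
    raw.dropDuplicates(["order_id"])
       .withColumn("ingested_at", F.current_timestamp())
)

# Write as a managed Delta table for downstream Synapse/Databricks models.
bronze.write.format("delta").mode("append").saveAsTable("bronze.orders")
```

In practice a pipeline like this would typically be orchestrated from ADF and followed by silver/gold transformations, but the Delta write pattern stays the same.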
Nice to Have:
Azure DevOps, Python, Streaming (Event Hub/Kafka), Power BI, Azure certifications (DP-203).
About the Company:
Verinite is a global technology consulting and services company laser-focused on the banking & financial services sector, especially in cards, payments, lending, trade, and treasury.
They partner with banks, fintechs, payment processors, and other financial institutions to modernize their systems, improve operational resilience, and accelerate digital transformation. Their services include consulting, digital strategy, data, application modernization, quality engineering (testing), cloud & infrastructure, and application maintenance.
Skill – Authorization, Clearing and Settlement
1. Should have worked on card schemes (Visa, Amex, Discover, RuPay & Mastercard), on either the authorization or clearing side.
2. Should be able to read scheme specifications and create business requirements/mappings for authorization and clearing.
3. Should have hands-on experience in implementing scheme-related changes.
4. Should be able to validate and certify changes post-development based on the mapping created.
5. Should be able to work with the development team, explaining and guiding them as needed.
6. Should be able to communicate with various teams & senior stakeholders.
7. Go-getter who is resourceful at researching solutions independently.
8. Schemes – VISA/MC/AMEX/JCB/CUP/Mercury – Discover and Diners, CBUAE, Jaywan (local scheme from the UAE).
9. Experience on the issuing side is a plus (good to have).
🚀 Hiring: SAP Vistex Consultant
📍 Locations: Pune | Mumbai | Bangalore | Gandhinagar
🧠 Experience: 6+ years in SAP Vistex
✅ Requirements:
- Minimum 6 years of hands-on experience in SAP Vistex
- Expertise in Incentive Management, Rebates, Chargebacks, and Claims Processing
- Strong integration experience with SAP SD/MM
- Should have done at least 1 full-cycle implementation
- Good understanding of Vistex modules like Deal Management, Pricing, and Contract Setup
- Support & Enhancement project experience preferred
- Excellent communication skills and ability to interact with clients
💼 Nice to Have:
- Exposure to SAP S/4HANA
- Knowledge of ABAP debugging
⏱️ Notice Period:
Immediate to 30 days preferred
He/she must demonstrate a high level of ownership, integrity, and leadership skills, and be flexible and adaptive, with a strong desire to learn and excel.
Required Skills:
- Strong experience working with tools and platforms like Helm charts, CircleCI, Jenkins, and/or Codefresh
- Excellent knowledge of AWS offerings around Cloud and DevOps
- Strong expertise in containerization platforms like Docker and container orchestration platforms like Kubernetes & Rancher
- Should be familiar with leading Infrastructure as Code tools such as Terraform, CloudFormation, etc.
- Strong experience in Python, Shell Scripting, Ansible, and Terraform
- Good command of monitoring tools like Datadog, Zabbix, ELK, Grafana, CloudWatch, Stackdriver, Prometheus, JFrog, Nagios, etc.
- Experience with Linux/Unix systems administration.
Requirements:
a) CSS / Bootstrap: Flex layout, positioning, and the ability to clearly demonstrate the cascading nature of CSS with examples.
b) Pure JavaScript: The prototype chain and the inner workings of inheritance in JS; the DOM, events, and event bubbling and capturing, with the ability to demonstrate them with examples; Promises and their use cases.
c) TypeScript: Basic understanding of using TS, including union types and index signatures.
d) REST: Backend experience; REST principles and URL structures for APIs; integration using native JS (fetch and Promises) and in Angular; use cases of GET vs POST vs PUT vs PATCH.
e) Angular: Angular abstractions (Component, Service, Pipe, Directive, Module, lifecycles); advanced patterns such as dynamic components, content projection, Reactive Forms, ViewChild, and ContentChild; Observables and their common operators; design skills, i.e., the ability to create the component and module hierarchy for a moderately complex application, along with their interactions.
- Working on interesting technical challenges in a product-centric and open-source-driven environment.
- Providing architectural direction on large-scale enterprise project implementations.
- Structuring teams to ensure there is capacity to work on larger architectural redesign to meet scalability, performance, security or compliance needs.
- Working closely with clients as they build features, functionality, and applications to make the ideal applications for the end customers.
- Developing visioning skills and ability to see the big picture
- Implementing and contributing to engineering practices and processes
What you need to have:
- B.Tech / B.E.; M.Tech
- Node.js, Express.js, and Java-based applications
- LoopBack (loopback.io) as a framework
- Elasticsearch and MongoDB are used as databases
- AngularJS and Angular 2 are the dominant front-end frameworks used for development
- HTML5, CSS3, Angular, and Angular 2/4/6 stacks
- What we look for in a front-end engineer is someone who has deep hands-on experience with Angular/AngularJS and proficiency with visual design for a mobile-first product.
- Experience with the MEAN stack is a plus.
- The role spans both front-end and back-end work.
- Docker
- Kubernetes
- GitHub
- Third-party API integrations
Location: Pune
Employment Type: Full Time/Part-Time
Experience: 2-5 years
Eligibility: Qualified Company Secretary (women willing to restart their career are also welcome to apply)
JOB DESCRIPTION:
The candidate must have good knowledge of:
• The Companies Act, 2013
• Companies Rules made thereunder
• Foreign Exchange Management Act
• Securities and Exchange Board of India Act and the Regulations.