11+ Training & Development Jobs in Pune
Apply to 11+ Training & Development Jobs in Pune on CutShort.io. Explore the latest Training & Development job opportunities across top companies like Google, Amazon & Adobe.
Profile: Azure Data Engineer
Experience: 5 years
Locations: Nagpur | Nashik | Pune | Indore | Jaipur | Gurgaon | Noida | Bhopal | Gwalior | Mumbai
Mode: Hybrid
Mandatory Skills
- Azure Databricks - Autoloader, DLT (Delta Live Tables)
- Azure Data Factory (ADF) - ETL/ELT pipeline development
- Azure Synapse Analytics - Data warehousing solutions
- Delta Lake - ACID transactions and schema evolution
- Apache Kafka & Azure Event Hub - Real-time streaming
- SQL - Advanced querying and optimization
- Data Quality - Validation and monitoring frameworks
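The data-quality skill above can be made concrete with a tiny sketch of a row-level validation check in Python. The column names (`id`, `amount`) and the rules are illustrative assumptions, not requirements from this posting; in Databricks, checks like these would typically live as DLT expectations.

```python
def validate_rows(rows, required=("id", "amount")):
    """Split rows into valid/invalid using simple null and range rules.

    A toy stand-in for the kind of per-batch check a data-quality
    framework would run; columns and rules here are illustrative.
    """
    valid, invalid = [], []
    for row in rows:
        missing = [col for col in required if row.get(col) is None]
        negative = row.get("amount") is not None and row["amount"] < 0
        if missing or negative:
            invalid.append({"row": row, "missing": missing, "negative_amount": negative})
        else:
            valid.append(row)
    return valid, invalid

good, bad = validate_rows([
    {"id": 1, "amount": 10.0},   # passes both rules
    {"id": 2, "amount": None},   # fails the null check
    {"id": 3, "amount": -5.0},   # fails the range check
])
```

Returning the failure reasons alongside the rejected rows is what makes such a check usable for monitoring, not just filtering.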
Key Responsibilities
- Design and develop data pipelines using ADF and Databricks
- Implement real-time and batch processing with Kafka/Event Hub
- Build Delta Live Tables for declarative ETL workflows
- Manage Delta Lake architecture with schema evolution
- Optimize data ingestion using Databricks Autoloader
- Implement data quality and monitoring solutions
- Performance tuning of Synapse Analytics queries
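The Delta Lake responsibilities above center on MERGE (upsert) semantics and schema evolution. As a hedged illustration only, here is the same idea over plain Python dictionaries; in Databricks this would be a `MERGE INTO` on a Delta table, with schema evolution enabled for additive columns.

```python
def upsert(target, updates, key="id"):
    """Merge update rows into target rows by key, Delta MERGE style.

    Columns appearing only in `updates` are kept, mimicking additive
    schema evolution (new columns are merged, not rejected).
    """
    merged = {row[key]: dict(row) for row in target}
    for row in updates:
        merged[row[key]] = {**merged.get(row[key], {}), **row}
    return sorted(merged.values(), key=lambda r: r[key])

current = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]
incoming = [{"id": 2, "name": "B", "country": "IN"},  # update + new column
            {"id": 3, "name": "c"}]                   # insert
result = upsert(current, incoming)
```

The matched row is updated in place, the unmatched row is inserted, and the new `country` column simply appears on the updated row.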
Required Qualifications
- 5+ years of hands-on Azure data engineering experience
- Strong programming skills in Python/Scala/Spark
- Experience with data warehousing and dimensional modeling
- Knowledge of CI/CD practices for data pipelines
Preferred
- Azure Data Engineer Associate or Databricks certification
- Experience with Power BI and version control (Git)
Job Title: PySpark/Scala Developer
Functional Skills: Experience in Credit Risk/Regulatory risk domain
Technical Skills: Spark, PySpark, Python, Hive, Scala, MapReduce, Unix shell scripting
Good to Have Skills: Exposure to Machine Learning Techniques
Job Description:
5+ years of experience developing, fine-tuning, and implementing programs/applications using Python/PySpark/Scala on the Big Data/Hadoop platform.
Roles and Responsibilities:
a) Work with a leading bank's Risk Management team on specific projects/requirements pertaining to risk models in consumer and wholesale banking
b) Enhance machine learning models using PySpark or Scala
c) Work with data scientists to build ML models based on business requirements and follow the ML lifecycle to deploy them all the way to the production environment
d) Participate in feature engineering, model training, scoring, and retraining
e) Architect data pipelines and automate data ingestion and model jobs
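Item (d) above covers feature engineering and scoring. One tiny, framework-free example of a feature-engineering step: min-max scaling a numeric feature to the [0, 1] range. In a real pipeline this would be a Spark ML `MinMaxScaler` stage; the numbers here are purely illustrative.

```python
def min_max_scale(values):
    """Rescale a numeric feature to the [0, 1] range."""
    lo, hi = min(values), max(values)
    if hi == lo:                       # constant feature: no spread to scale
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

scaled = min_max_scale([100, 150, 200])
```

Guarding the constant-feature case matters: without it, a zero denominator would crash a scoring job on degenerate input.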
Skills and competencies:
Required:
- Strong analytical skills in conducting sophisticated statistical analysis using bureau/vendor data, customer performance data, and macroeconomic data to solve business problems
- Working experience in PySpark and Scala to develop code that validates and implements models in credit risk/banking
- Experience with distributed systems such as Hadoop/MapReduce, Spark, streaming data processing, and cloud architecture
- Familiarity with machine learning frameworks and libraries (e.g., scikit-learn, Spark ML, TensorFlow, PyTorch)
- Experience in systems integration, web services, and batch processing
- Experience migrating code to PySpark/Scala is a big plus
- Ability to act as a liaison, conveying the information needs of the business to IT and data constraints to the business, with equal fluency in business strategy and IT strategy, business processes, and workflow
- Flexibility in approach and thought process
- Willingness to learn and keep up with periodic changes in regulatory requirements per the Fed

Tech Lead (India) — Help Build WebLager’s Next Engineering Hub
Location: India
Team: Product & Development
Reporting to: Head of Product & Development (Denmark)
Why this role exists
WebLager is scaling fast, and 2026 will be a breakout year. We’re building an Indian IT office that’s not an outsourced extension of Denmark.
This is a real “build it right from day one” leadership role.
You’ll be our right hand in India — shaping the team, culture, and delivery. If you want to build something meaningful that’s expected to grow a lot next year, keep reading.
What you’ll do
You’re not here to babysit Jira. You’re here to ship, lead, and raise the bar.
● Build and lead our India engineering team from early stage into real scale in 2026.
● Set standards for quality and delivery — clean code, stable systems, smart execution.
● Coach and grow people across levels: students, juniors, mid-levels, seniors.
● Create a local WebLager community that feels like one company, not two offices.
● Work tightly with Denmark on product, architecture, and delivery — as a partner, not a follower.
● Stay hands-on: design, code, review, refactor, deploy.
● Scale enterprise systems: performance, reliability, maintainability, observability.
● Improve how we work: CI/CD, engineering rituals, docs that matter, fewer surprises.
● Be the technical anchor when things are complex, messy, or moving fast.
What you bring
We don’t care about buzzwords. We care about proof you can build and lead.
Must-haves:
● 5+ years as a developer, with real production systems behind you.
● Strong backend skills, ideally Python or another scripting language plus Java/C# or similar, along with solid knowledge of both relational and non-relational databases.
● Frontend experience with a reactive framework like Angular, React, Vue, etc.
● Experience scaling enterprise-grade systems and making architecture tradeoffs that hold up.
● You’ve led people before (formally or naturally) and enjoy helping others grow.
● Excellent problem-solving skills — you don’t freeze when things are unclear; you untangle them.
● Near-perfect English (spoken and written). This is non-negotiable — you’ll work daily across countries and levels.
● You take ownership by default and don’t need a map for every step.
Nice-to-haves:
● You’ve helped build or grow a team from scratch.
● Cloud + DevOps experience.
● Product-minded engineering: you care about outcomes, not just tasks.
The kind of person who’ll thrive here
Let’s be direct:
● You’re driven to create real results, not just “do your part.”
● You want to build something from the ground up and shape the future of a company.
● You lead with calm, clarity, and high standards.
● You’re motivated beyond the norm — you don’t settle for “good enough.”
● You know a Tech Lead is someone who steps up, helps others win, and keeps shipping.
● You’re hungry to learn, and confident enough to challenge weak solutions.
The kind of person who won’t
Also direct:
● If you expect everything to be built around you, look for another job.
● If you want Denmark to hand you tasks, this isn’t it.
● If you avoid responsibility or hard conversations, this will hurt.
● If “average and comfortable” is your goal, don’t apply.
We’re building an exceptional team. Mediocre doesn’t survive here.
What you get
● A rare chance to build an office, a culture, and a high-performing team in India from scratch.
● Direct partnership with Danish leadership and product org.
● Real influence over architecture, standards, and execution.
● A company that values ownership and speed over politics.
● Massive growth opportunity as the India office scales in 2026.
● Competitive salary + benefits.
How to apply
Only reach out if you genuinely believe you’re the right fit and you’re motivated to build something one-of-a-kind.
Send (this is mandatory):
● A short page about you and what you’ve built.
● CV/LinkedIn/GitHub/portfolio.
● 2–3 projects you’re proud of, and why.
Job Description
- Minimum 5 years of experience with Test Data Management (TDM) tools, including data de-identification and masking capabilities
- Minimum 3 years of experience working on the Delphix tool, with data de-identification TDM experience
- Minimum 2 years of experience with synthetic data generation capabilities
- Nice to have Experience in Python and .NET
- Understand and align with the roadmaps of key consumers with faster turnaround of test data generation.
- Optional: Prior experience working in cloud-hosted platforms, CICD pipeline, and integration of test data into the pipeline.
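A core TDM property behind the de-identification bullets above is deterministic masking: the same input always maps to the same masked value, so referential integrity survives across tables. A minimal Python sketch; the salt and truncation length are arbitrary choices for illustration, not Delphix behavior.

```python
import hashlib

def mask_email(email, salt="demo-salt"):
    """De-identify an email deterministically, preserving the domain.

    Hashing the local part (with a salt) means the same person masks
    to the same value everywhere, keeping joins across tables intact.
    """
    local, _, domain = email.partition("@")
    digest = hashlib.sha256((salt + local).encode()).hexdigest()[:10]
    return f"{digest}@{domain}"

masked = mask_email("jane.doe@example.com")
```

Keeping the domain intact preserves the format, so downstream validation and test cases that expect a plausible email still pass.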
🔧 Key Responsibilities:
- Design and develop customized Salesforce solutions using Apex, Visualforce, and Lightning Web Components (LWC)
- Manage and implement Salesforce APIs and third-party integrations
- Work with Salesforce Data Cloud (CDP) to unify and manage enterprise data
- Collaborate with cross-functional teams to deliver scalable, secure, and high-performing Salesforce solutions
- Adhere to best practices in data security, performance, and system architecture
- Participate actively in Agile/Scrum processes and contribute to sprint planning and delivery
- Conduct thorough code reviews, unit testing, and maintain clear technical documentation
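For the API-integration duties above, one small hedged sketch in Python: building a query URL for the Salesforce REST API's standard `/services/data/vXX.X/query` endpoint. The API version and instance URL below are placeholders, not values from this posting.

```python
from urllib.parse import quote

def soql_query_url(instance_url, soql, api_version="v59.0"):
    """Build a Salesforce REST API SOQL query URL.

    The SOQL string must be URL-encoded before going into the
    `q` parameter of the standard /query resource.
    """
    return f"{instance_url}/services/data/{api_version}/query?q={quote(soql)}"

url = soql_query_url(
    "https://example.my.salesforce.com",        # placeholder org
    "SELECT Id, Name FROM Account LIMIT 5",
)
```

A real integration would send this with an OAuth bearer token and page through `nextRecordsUrl` in the response; that part is omitted here.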
✅ Required Skills & Qualifications:
- 4+ years of hands-on Salesforce development experience
- Proficiency in Apex, LWC, Visualforce, and Salesforce APIs
- Experience with third-party integrations and middleware platforms
- Familiarity with Salesforce Data Cloud or similar data management platforms
- Strong grasp of Salesforce configuration, data modeling, flows, and triggers
- Knowledge of Git, DevOps, and CI/CD pipelines
- Excellent problem-solving and communication skills
🌟 Preferred:
- Salesforce Platform Developer I/II certification(s)
- Experience working in large-scale enterprise environments
- Familiarity with DevOps tools and CI/CD best practices
Job Location: Baner, Pune (On-site)
Key Duties and Responsibilities
● Design and develop websites from initial concept, site architecture, and user interface to finished deliverable
● Code and implement custom themes, functions, and plugins using best practices and relevant coding standards.
● Design and implement new features, enhancements, and content of existing websites
● Create and update re-usable code libraries to streamline the WordPress development cycle
Requirements
● 3+ years’ experience working in web development and interface design.
● In-depth WordPress experience with high-quality outcomes
● Gutenberg block development experience.
● Working knowledge of WordPress REST API or GraphQL.
● Strong knowledge of current web development languages (including HTML5, CSS3, PHP/MySQL)
● Comfort with Git-based development and deployment workflow.
● Familiarity with web standards and usability
● Flexibility and eagerness to identify, learn, and use new and changing technologies
● Ability to manage multiple projects at a time
● Attention to detail
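The WordPress REST API requirement above refers to the core `/wp-json/wp/v2/` routes. A small sketch (in Python rather than PHP, for brevity) that builds the paginated posts endpoint; the site URL is a placeholder.

```python
def wp_posts_url(site, per_page=10, page=1):
    """Build the core WordPress REST API posts route with pagination.

    /wp-json/wp/v2/posts is the built-in posts collection; per_page
    and page are the standard pagination parameters.
    """
    return f"{site}/wp-json/wp/v2/posts?per_page={per_page}&page={page}"

url = wp_posts_url("https://example.com", per_page=5, page=2)
```

Custom Gutenberg blocks often fetch exactly this kind of route from the editor, so knowing the URL shape carries over directly.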
- Overall 2-4 years' experience in inventory management systems and customization with third-party systems
- At least 2 years of hands-on experience in Cramer 8 or above, and customizations using Cramer integrations
- Hands-on Java/PL SQL development experience
- Relevant domain experience in Sales & Distribution: SAP SD / SAP LE (sales order processing, pricing and conditions, availability check and requirements, credit management, output determination, foreign trade/customs, billing, shipping and transportation)
- Experience supporting medium/large-scale support/implementation of SAP ECC on HANA
- Worked with SAP SD and HCM; experience in S/4HANA implementation
Roles and responsibilities
- Develop well-designed, performant, and scalable applications and microservices
- Writing reusable, testable, and efficient code aligning to software development best practices
- Integrate data storage solutions including databases, key-value stores, blob stores, etc.
- Build integrations with third-party applications through APIs to ingest and process data
- Develop state-of-the-art analytics tools to support diverse tasks ranging from ad hoc analysis to production-grade pipelines and workflows for customer applications
- Ensure security and data protection aspects within the applications
- Partner with Data Scientists and Analytics Engineers to improve the performance and reliability of advanced algorithms
- Ensure high performance and availability of distributed systems and applications
- Interact directly with client project team members and operational staff to support live customer deployments and production issues
- 4+ years of experience in developing applications using Python and related technologies.
- Familiarity with data ingestion and processing libraries in Python.
- Thorough understanding of REST and gRPC technologies.
- Experience in using ORM (Object Relational Mapper) libraries for data access.
- Experience in developing and hosting APIs and integrating with external applications.
- Experience in building data models and repositories using relational and NoSQL databases.
- Knowledge of Jira, Bitbucket, and agile methodologies.
- Good to have: knowledge of AWS services like Lambda, DynamoDB, Kinesis, and others.
- Understanding of fundamental design principles behind a scalable application
- Familiarity with event-driven programming
- Strong unit test and debugging skills
- Affinity for learning and applying new technologies and solving new problems
- Effective organizational skills with strong attention to detail
- Experience in working with docker is a plus
- Comfortable working in a Unix/Linux environment
- Strong communication skills — both written and verbal
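The event-driven programming bullet above can be sketched as a tiny in-process publish/subscribe bus. This is an illustrative toy only, not a stand-in for a real broker such as Kafka or Kinesis.

```python
from collections import defaultdict

class EventBus:
    """Minimal in-process publish/subscribe dispatcher."""

    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a callable to receive payloads published on a topic."""
        self._handlers[topic].append(handler)

    def publish(self, topic, payload):
        """Deliver the payload to every handler subscribed to the topic."""
        for handler in self._handlers[topic]:
            handler(payload)

bus = EventBus()
received = []
bus.subscribe("order.created", received.append)
bus.publish("order.created", {"id": 1, "amount": 250})
bus.publish("order.shipped", {"id": 1})   # no subscriber: silently dropped
```

Decoupling publishers from subscribers this way is the same design that scales up to broker-backed systems, where delivery also becomes asynchronous and durable.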