11+ Dataflow Architecture Jobs in Pune
Apply to 11+ Dataflow architecture jobs in Pune on CutShort.io. Explore the latest Dataflow architecture job opportunities across top companies like Google, Amazon & Adobe.
The candidate will be deployed at a financial captive organization in Pune (Kharadi).
Below are the job details:
Experience: 10 to 18 years
Mandatory skills:
- Data migration
- Dataflow
The ideal candidate for this role will have the following experience and qualifications:
- Experience building a range of services on a cloud provider (ideally GCP)
- Hands-on design and development on Google Cloud Platform (GCP) across a wide range of services, including hands-on experience with GCP storage and database technologies
- Hands-on experience architecting, designing, or implementing solutions on GCP, Kubernetes, and other Google technologies, including security and compliance (e.g., IAM and cloud compliance/auditing/monitoring tools)
- Desired skills within the GCP stack: Cloud Run, GKE, serverless, Cloud Functions, Vision API, DLP, Dataflow, Data Fusion
- Prior experience migrating on-prem applications to cloud environments; knowledge and hands-on experience of Stackdriver, Pub/Sub, VPCs, subnets, route tables, load balancers, and firewalls, both on-premise and in GCP
- Integrate, configure, deploy, and manage centrally provided common cloud services (e.g., IAM, networking, logging, operating systems, containers)
- Manage SDN in GCP; knowledge and experience of DevOps technologies for continuous integration and delivery in GCP using Jenkins
- Hands-on experience with Terraform, Kubernetes, Docker, and Stackdriver
- Programming experience in one or more of the following languages: Python, Ruby, Java, JavaScript, Go, Groovy, Scala
- Knowledge of or experience with DevOps tooling such as Jenkins, Git, Ansible, Splunk, Jira or Confluence, AppDynamics, Docker, and Kubernetes
- Act as a consultant and subject matter expert for internal teams to resolve technical deployment obstacles and improve the product vision; ensure compliance with centrally defined security standards
- Financial services experience is preferred
- Ability to learn new technologies and rapidly prototype newer concepts
- Top-down thinker, excellent communicator, and great problem solver
Experience: 10 to 18 years
Location: Pune
The candidate must have experience with the following:
- GCP Data Platform
- Data processing: Dataflow, Dataprep, Data Fusion
- Data storage: BigQuery, Cloud SQL
- Pub/Sub, GCS buckets
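To make the data-platform stack above concrete, here is a minimal, hedged sketch of the kind of streaming pipeline this role implies: Apache Beam (the SDK behind Dataflow) reading from Pub/Sub and writing to BigQuery. The project, topic, bucket, and table names are hypothetical placeholders.

```python
# Minimal Apache Beam (Dataflow) sketch: read JSON events from Pub/Sub,
# parse them, and append them to a BigQuery table.
# All project/topic/bucket/table names below are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    streaming=True,
    project="my-gcp-project",            # hypothetical project
    runner="DataflowRunner",
    temp_location="gs://my-bucket/tmp",  # hypothetical GCS bucket
    region="asia-south1",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadPubSub" >> beam.io.ReadFromPubSub(
            topic="projects/my-gcp-project/topics/events")
        | "ParseJSON" >> beam.Map(json.loads)
        | "WriteBQ" >> beam.io.WriteToBigQuery(
            "my-gcp-project:analytics.events",  # hypothetical table
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```

For local testing, the same pipeline can be run by swapping `DataflowRunner` for `DirectRunner`.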
Job Description: Field Sales Executive
Company: DetoXyFi Technologies Pvt. Ltd.
Product: Jal Kavach (Portable Water Filters)
Role Objective
Drive the distribution and sales of Jal Kavach by building a robust network of dealers and distributors. You will be the face of the brand on the field, ensuring our life-saving technology reaches every household.
Key Responsibilities
- Channel Expansion: Identify and appoint new dealers, distributors, and retail partners.
- Sales Growth: Achieve primary and secondary sales targets within your assigned territory.
- Demonstrations: Conduct product demos to showcase the efficiency of our low-cost filters.
- Relationship Management: Maintain strong ties with partners and ensure consistent product stock.
- Market Reporting: Track competitor trends and provide daily field activity reports.
Required Skills & Experience
- Experience: 1–5 years in Field/Channel Sales (Water Purifier or FMCG background preferred).
- Hustle: Proven track record of territory mapping and network building.
- Travel: Must be comfortable with extensive daily field travel.
- Communication: Strong negotiation skills in Hindi and local languages.
A Desktop Support Engineer is responsible for providing technical support and assistance to end-users in an organization. This role involves troubleshooting hardware and software issues, ensuring that desktop systems are functioning efficiently, and maintaining a high level of customer satisfaction.
Key Responsibilities:
- Respond to user inquiries and provide technical support via phone, email, or in-person.
- Diagnose and resolve hardware and software problems, including operating systems, applications, and network connectivity issues.
- Install, configure, and upgrade desktop hardware and software, ensuring compliance with company standards.
- Maintain inventory of desktop equipment and software licenses, ensuring proper documentation and tracking.
- Collaborate with IT teams to implement new technologies and improve existing systems.
- Provide training and support to users on new software applications and tools.
- Assist in the setup and deployment of new workstations and peripherals.
- Monitor and maintain system performance, applying updates and patches as necessary.
- Document technical procedures and solutions for future reference.
Qualifications:
- Bachelor’s degree in computer science, information technology, or a related field, or equivalent experience.
- Proven experience in a desktop support role or similar technical support position.
- Strong knowledge of Windows and macOS operating systems, as well as common software applications.
- Familiarity with networking concepts and troubleshooting techniques.
- Excellent problem-solving skills and the ability to work under pressure.
- Strong communication skills, both verbal and written, with a customer-focused attitude.
- Relevant certifications (e.g., CompTIA A+, Microsoft Certified Desktop Support Technician) are a plus.
This role is essential for maintaining the productivity of employees by ensuring that their desktop environments are operational and efficient. A successful Desktop Support Engineer will be proactive, detail-oriented, and able to work independently as well as part of a team.
Data Integration Developer Role
Responsibilities:
§ As a Data Integration Developer/Senior Developer, be hands-on with ETL/ELT data pipelines, the Snowflake DWH, CI/CD deployment pipelines, and data-readiness (data quality) design, development, and implementation, and address code or data issues.
§ Experience designing and implementing modern data pipelines for a variety of data sets, including internal/external data sources, complex relationships, various data formats, and high volumes.
§ Experience with and understanding of ETL job performance techniques, exception handling, and query performance tuning/optimization, with data loads meeting runtime/schedule SLAs for both batch and real-time use cases.
§ Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions.
§ Demonstrated strong collaboration across regions (APAC, EMEA, and NA) to establish design standards, high-level design documents, cross-training, and resource onboarding activities.
§ Good understanding of the SDLC process, governance clearance, peer code reviews, unit test results, code deployments, code security scanning, and Confluence/Jira Kanban stories.
§ Strong attention to detail during root cause analysis, SQL query debugging and defect issue resolution by working with multiple business/IT stakeholders.
Educational Qualifications:
B.Tech in Computer Science or another technical course of study is required.
Experience:
§ A minimum of 4-10 years of experience in data integration/orchestration services, service architecture, and providing data-driven solutions for client requirements
§ Experience with the Microsoft Azure cloud and Snowflake SQL, including database query/performance tuning.
§ Experience with Qlik Replicate and Qlik Compose (Change Data Capture) tools is considered a plus
§ Strong data warehousing concepts and ETL tools such as Talend Cloud Data Integration are a must
§ Exposure to financial domain knowledge is considered a plus.
§ Cloud managed services such as GitHub source control and MS Azure/DevOps are considered a plus.
§ Prior experience with State Street and Charles River Development (CRD) is considered a plus.
§ Experience with tools such as Visio, PowerPoint, and Excel.
§ Exposure to third-party data providers such as Bloomberg, Reuters, MSCI, and other rating agencies is a plus.
§ Strong SQL knowledge and debugging skills are a must; a brief sketch of the kind of SQL-from-code work involved follows this list.
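As a small illustration of the hands-on SQL work referenced above, here is a hedged Python sketch that runs a data-readiness check against Snowflake using the snowflake-connector-python library. The account, credentials, warehouse, and table names are hypothetical placeholders.

```python
# Hedged sketch: run a data-quality check against Snowflake from Python.
# Account, credentials, and table names below are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # hypothetical account identifier
    user="etl_user",        # hypothetical service user
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # Simple data-readiness check: count rows loaded today.
    cur.execute("SELECT COUNT(*) FROM trades WHERE load_date = CURRENT_DATE")
    (row_count,) = cur.fetchone()
    print(f"Rows loaded today: {row_count}")
finally:
    conn.close()
```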
Job Description:
· Extensive experience in Appian BPM application development
· Knowledge of Appian architecture and best practices for its objects
· Participate in analysis, design, and new development of Appian based applications
· Team leadership is mandatory: provide technical leadership to Scrum teams. Appian certification (L1, L2, or L3) is mandatory
· Must be able to multi-task, work in a fast-paced environment, and resolve problems faced by the team
· Build applications: interfaces, process flows, expressions, data types, sites, integrations, etc.
· Proficient with SQL queries and with accessing data present in DB tables and views
· Experience in analysis and in designing process models, records, reports, SAIL, forms, gateways, smart services, integration services, and web services
· Experience working with different Appian Object types, query rules, constant rules and expression rules
Qualifications
· At least 6 years of experience implementing BPM solutions using Appian 19.x or higher
· Over 8 years implementing IT solutions using BPM or integration technologies
· Experience with Scrum/Agile methodologies on enterprise-level application development projects
· Good understanding of database concepts and strong working knowledge of at least one major database (e.g., Oracle, SQL Server, MySQL)
Additional Information: Skills Required
· Appian BPM application development on version 19.x or higher
· Experience with integrations using web services (e.g., XML, REST, WSDL, SOAP APIs, JDBC, JMS); see the sketch after this list
· Good leadership skills and the ability to lead a team of software engineers technically
· Experience working in Agile Scrum teams
· Good communication skills
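Since the role emphasizes REST-based integrations, here is a hedged Python sketch of calling a web API of the kind an Appian application might expose. The host, endpoint path, auth header, and payload are hypothetical placeholders, not a confirmed Appian contract; adapt them to the site's actual web API and security scheme.

```python
# Hedged sketch: invoke a (hypothetical) Appian web API over REST.
import requests

BASE_URL = "https://example.appiancloud.com/suite/webapi"  # hypothetical host
API_KEY = "***"  # hypothetical credential

resp = requests.post(
    f"{BASE_URL}/startCase",               # hypothetical endpoint
    headers={"Appian-API-Key": API_KEY},   # assumed API-key header
    json={"caseType": "dispute", "priority": "high"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```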
JD for Cloud Engineer
Job Summary:
We are looking for an experienced GCP Cloud Engineer to design, implement, and manage cloud-based solutions on Google Cloud Platform (GCP). The ideal candidate should have expertise in GKE (Google Kubernetes Engine), Cloud Run, Cloud Load Balancing, Cloud Functions, Azure DevOps, and Terraform, with a strong focus on automation, security, and scalability.
You will work closely with development, operations, and security teams to ensure robust cloud infrastructure and CI/CD pipelines while optimizing performance and cost.
Key Responsibilities:
1. Cloud Infrastructure Design & Management
- Architect, deploy, and maintain GCP cloud resources via Terraform or other automation.
- Implement Google Cloud Storage, Cloud SQL, and Filestore for data storage and processing needs.
- Manage and configure Cloud Load Balancers (HTTP(S), TCP/UDP, and SSL Proxy) for high availability and scalability.
- Optimize resource allocation, monitoring, and cost efficiency across GCP environments.
2. Kubernetes & Container Orchestration
- Deploy, manage, and optimize workloads on Google Kubernetes Engine (GKE).
- Work with Helm charts for microservices deployments.
- Automate scaling, rolling updates, and zero-downtime deployments.
3. Serverless & Compute Services
- Deploy and manage applications on Cloud Run and Cloud Functions for scalable, serverless workloads (see the sketch after this list).
- Optimize containerized applications running on Cloud Run for cost efficiency and performance.
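As an illustration of the serverless side of this role, below is a minimal sketch of an HTTP-triggered Cloud Function in Python using the functions-framework library. The function name, response shape, and deploy command are illustrative only.

```python
# Minimal sketch of an HTTP-triggered Cloud Function (Python).
# Deploy (illustrative):
#   gcloud functions deploy hello --gen2 --runtime=python312 \
#       --trigger-http --entry-point=hello
import functions_framework


@functions_framework.http
def hello(request):
    """Respond to an HTTP request with a small JSON greeting."""
    name = request.args.get("name", "world")
    return {"message": f"hello {name}"}, 200
```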
4. CI/CD & DevOps Automation
- Design, implement, and manage CI/CD pipelines using Azure DevOps.
- Automate infrastructure deployment using Terraform and Bash/PowerShell scripting.
- Integrate security and compliance checks into the DevOps workflow (DevSecOps).
Required Skills & Qualifications:
✔ Experience: 8+ years in Cloud Engineering, with a focus on GCP.
✔ Cloud Expertise: Strong knowledge of GCP services (GKE, Compute Engine, IAM, VPC, Cloud Storage, Cloud SQL, Cloud Functions).
✔ Kubernetes & Containers: Experience with GKE, Docker, GKE Networking, Helm.
✔ DevOps Tools: Hands-on experience with Azure DevOps for CI/CD pipeline automation.
✔ Infrastructure-as-Code (IaC): Expertise in Terraform for provisioning cloud resources.
✔ Scripting & Automation: Proficiency in Python, Bash, or PowerShell for automation.
✔ Security & Compliance: Knowledge of cloud security principles, IAM, and compliance standards.
Job Summary
We are looking for a skilled Java + Cloud Developer to design, develop, and maintain high-performance applications. The ideal candidate will have strong expertise in Core Java, Spring Framework, multithreading, and database management, along with exposure to cloud platforms and containerization technologies.
Job Title: Java + Cloud Developer
Location: Pune / Mumbai / Bangalore
Experience Level: 4-8 years
Employment Type: Full-time
Key Responsibilities
- Design, develop, and maintain scalable Java applications using Core Java, Spring Framework, JDBC, and multithreading concepts.
- Implement and integrate database solutions using relational and NoSQL databases.
- Utilize JDBC for database connectivity and manipulation.
- Work with cloud platforms such as Azure or GCP; experience with DevOps practices is an added advantage.
- Develop, deploy, and manage applications using containerization technologies (Docker, Kubernetes).
- Debug and troubleshoot applications through log analysis and monitoring tools.
- Collaborate with cross-functional teams to ensure seamless integration between multi-service components.
- Handle large-scale data processing tasks effectively; hands-on experience with Apache Spark is a plus (a PySpark sketch follows this list).
- Apply Agile methodologies (Scrum/Kanban) in daily development activities.
- Continuously research and adopt new technologies to improve development processes and methodologies.
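For the Spark item above, here is a minimal PySpark sketch of a large-scale aggregation. The input path and column names are hypothetical placeholders.

```python
# Minimal PySpark sketch: daily order counts and revenue from Parquet data.
# The input path and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("order-stats").getOrCreate()

orders = spark.read.parquet("gs://my-bucket/orders/")  # hypothetical path
daily = (
    orders.groupBy("order_date")
    .agg(F.count("*").alias("orders"), F.sum("amount").alias("revenue"))
    .orderBy("order_date")
)
daily.show()
spark.stop()
```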
Required Skills & Qualifications
- Strong proficiency in Core Java (Java 8 or higher) with a deep understanding of threading and concurrent programming.
- Solid experience with the Spring Framework and its various modules (Spring Boot, Spring MVC, Spring Data, etc.).
- Experience with RDBMS (e.g., MySQL, PostgreSQL, Oracle) and NoSQL databases (e.g., MongoDB, Cassandra).
- Basic understanding of cloud platforms (Azure, GCP, or AWS).
- Knowledge of DevOps practices (CI/CD, version control, monitoring tools) is a plus.
- Familiarity with Docker and Kubernetes for application deployment and scaling.
- Strong analytical and problem-solving skills.
- Good communication skills and ability to work in a collaborative environment.
Preferred Qualifications
- Hands-on experience with Apache Spark for big data processing.
- Exposure to microservices architecture and API integrations.
- Familiarity with log monitoring tools (ELK, Splunk, etc.).
Note: Candidates should be serving their notice period or have a 30-day notice period.
XpressBees, a logistics company started in 2015, is among the fastest-growing companies in its sector. Our vision to evolve into a strong full-service logistics organization reflects itself in our various lines of business, such as B2C logistics, 3PL, B2B Xpress, hyperlocal, and cross-border logistics.
Our strong domain expertise and constant focus on innovation have helped us rapidly evolve into the most trusted logistics partner of India. XB has progressively carved its way towards best-in-class technology platforms, an extensive logistics network reach, and a seamless last-mile management system.
While on this aggressive growth path, we seek to become the one-stop-shop for end-to-end logistics solutions. Our
big focus areas for the very near future include strengthening our presence as service providers of choice and
leveraging the power of technology to drive supply chain efficiencies.
Job Overview
XpressBees is enriching and scaling its end-to-end logistics solutions at a high pace. This is a great opportunity to join
the team working on forming and delivering the operational strategy behind Artificial Intelligence / Machine Learning
and Data Engineering, leading projects and teams of AI Engineers collaborating with Data Scientists. In your role, you
will build high performance AI/ML solutions using groundbreaking AI/ML and BigData technologies. You will need to
understand business requirements and convert them to a solvable data science problem statement. You will be
involved in end to end AI/ML projects, starting from smaller scale POCs all the way to full scale ML pipelines in
production.
Seasoned AI/ML Engineers will own the implementation and productionization of cutting-edge AI-driven algorithmic
components for search, recommendation and insights to improve the efficiencies of the logistics supply chain and
serve the customer better.
You will apply innovative ML tools and concepts to deliver value to our teams and customers and make an impact to
the organization while solving challenging problems in the areas of AI, ML, data analytics, and computer science.
Opportunities for application:
- Route Optimization
- Address / Geo-Coding Engine
- Anomaly detection, Computer Vision (e.g. loading / unloading)
- Fraud Detection (fake delivery attempts)
- Promise Recommendation Engine etc.
- Customer & Tech support solutions, e.g. chat bots.
- Breach detection / prediction
An Artificial Intelligence Engineer will apply themselves in the areas of:
- Deep Learning, NLP, Reinforcement Learning
- Machine Learning: Logistic Regression, Decision Trees, Random Forests, XGBoost, etc.
- Driving Optimization via LPs, MILPs, Stochastic Programs, and MDPs
- Operations Research, Supply Chain Optimization, and Data Analytics/Visualization
- Computer Vision and OCR technologies
The AI Engineering team enables internal teams to add AI capabilities to their apps and workflows easily via APIs, without needing to build AI expertise in each team: Decision Support, NLP, and Computer Vision, for public clouds and the enterprise in NLU, Vision, and Conversational AI. The candidate is adept at working with large data sets to find opportunities for product and process optimization, and at using models to test the effectiveness of different courses of action. They must have experience using a variety of data mining/data analysis methods and data tools, building and implementing models, using and creating algorithms, and creating and running simulations. They must be comfortable working with a wide range of stakeholders and functional teams. The right candidate will have a passion for discovering solutions hidden in large data sets and for working with stakeholders to improve business outcomes.
Roles & Responsibilities
● Develop scalable infrastructure, including microservices and backend, that automates training and
deployment of ML models.
● Building cloud services in Decision Support (Anomaly Detection, Time series forecasting, Fraud detection,
Risk prevention, Predictive analytics), computer vision, natural language processing (NLP) and speech that
work out of the box.
● Brainstorm and design various POCs using ML/DL/NLP solutions for new or existing enterprise problems.
● Work with fellow data scientists/SW engineers to build out other parts of the infrastructure, effectively communicating your needs and understanding theirs, and address external and internal stakeholders' product challenges.
● Build core of Artificial Intelligence and AI Services such as Decision Support, Vision, Speech, Text, NLP, NLU,
and others.
● Leverage cloud technology: AWS, GCP, Azure
● Experiment with ML models in Python using machine learning libraries (PyTorch, TensorFlow) and big data tools such as Hadoop, HBase, and Spark (a brief training sketch follows this list).
● Work with stakeholders throughout the organization to identify opportunities for leveraging company data to
drive business solutions.
● Mine and analyze data from company databases to drive optimization and improvement of product
development, marketing techniques and business strategies.
● Assess the effectiveness and accuracy of new data sources and data gathering techniques.
● Develop custom data models and algorithms to apply to data sets.
● Use predictive modeling to increase and optimize customer experience, supply chain metrics, and other business outcomes.
● Develop company A/B testing framework and test model quality.
● Coordinate with different functional teams to implement models and monitor outcomes.
● Develop processes and tools to monitor and analyze model performance and data accuracy.
● Deliver machine learning and data science projects with data science techniques and associated libraries such as AI/ML or equivalent NLP (Natural Language Processing) packages. Such techniques include a good-to-phenomenal understanding of statistical models, probabilistic algorithms, classification, clustering, deep learning, and related approaches as they apply to financial applications.
● The role will encourage you to learn a wide array of capabilities, toolsets and architectural patterns for
successful delivery.
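To ground the responsibilities above, here is an illustrative sketch (not the company's actual pipeline) of training and evaluating a simple fraud-detection classifier on synthetic data with scikit-learn; real features would come from the logistics data warehouse.

```python
# Illustrative sketch: train and evaluate a simple fraud-detection
# classifier on synthetic, imbalanced data with scikit-learn.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for delivery-attempt features (~3% positive class).
X, y = make_classification(
    n_samples=10_000, n_features=20, weights=[0.97], random_state=42
)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Score held-out data and report ranking quality.
scores = model.predict_proba(X_test)[:, 1]
print(f"ROC AUC: {roc_auc_score(y_test, scores):.3f}")
```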
What is required of you?
You will get an opportunity to build and operate a suite of massive scale, integrated data/ML platforms in a broadly
distributed, multi-tenant cloud environment.
● B.S., M.S., or Ph.D. in Computer Science or Computer Engineering
● Coding knowledge and experience with several languages: C, C++, Java, JavaScript, etc.
● Experience with building high-performance, resilient, scalable, and well-engineered systems
● Experience in CI/CD and development best practices, instrumentation, logging systems
● Experience using statistical computing languages (R, Python, SQL, etc.) to manipulate data and draw insights from large data sets.
● Experience working with and creating data architectures.
● Good understanding of various machine learning and natural language processing technologies, such as
classification, information retrieval, clustering, knowledge graph, semi-supervised learning and ranking.
● Knowledge and experience in statistical and data mining techniques: GLM/Regression, Random Forest,
Boosting, Trees, text mining, social network analysis, etc.
● Knowledge of web services: Redshift, S3, Spark, Digital Ocean, etc.
● Knowledge of creating and using advanced machine learning algorithms and statistics: regression, simulation, scenario analysis, modeling, clustering, decision trees, neural networks, etc.
● Knowledge of analyzing data from third-party providers: Google Analytics, Site Catalyst, Coremetrics, AdWords, Crimson Hexagon, Facebook Insights, etc.
● Knowledge of distributed data/computing tools: MapReduce, Hadoop, Hive, Spark, MySQL, Kafka, etc.
● Knowledge of visualizing/presenting data for stakeholders using QuickSight, Periscope, Business Objects, D3, ggplot, Tableau, etc.
● Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural
networks, etc.) and their real-world advantages/drawbacks.
● Knowledge of advanced statistical techniques and concepts (regression, properties of distributions,
statistical tests, and proper usage, etc.) and experience with applications.
● Experience building data pipelines that prep data for Machine learning and complete feedback loops.
● Knowledge of Machine Learning lifecycle and experience working with data scientists
● Experience with Relational databases and NoSQL databases
● Experience with workflow scheduling / orchestration such as Airflow or Oozie
● Working knowledge of current techniques and approaches in machine learning and statistical or
mathematical models
● Strong Data Engineering & ETL skills to build scalable data pipelines. Exposure to data streaming stack (e.g.
Kafka)
● Relevant experience in fine-tuning and optimizing ML (especially deep learning) models to bring down serving latency.
● Exposure to the ML model productionization stack (e.g. MLflow, Docker); a brief MLflow sketch follows this list.
● Excellent exploratory data analysis skills to slice & dice data at scale using SQL in Redshift/BigQuery.
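For the MLflow item above, a hedged sketch of experiment tracking: the run name, parameter, and metric value are illustrative only.

```python
# Hedged sketch: log a training run to MLflow for experiment tracking.
import mlflow

with mlflow.start_run(run_name="fraud-rf-baseline"):  # illustrative name
    mlflow.log_param("n_estimators", 200)
    mlflow.log_metric("roc_auc", 0.93)  # value taken from the evaluation step
    # mlflow.sklearn.log_model(model, "model")  # persist the trained model
```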
Responsibilities:
- Manage the entire sales cycle
- Find clients from existing resources
- Call leads to arrange demos
- Follow up on leads until conversion
- Create and share proposals
- Provide professional after-sales support to improve customer relationships
- Present periodic reports on, and be accountable for, inside sales activities
- Establish and maintain strong relationships with clients and new prospects
Requirements:
- Minimum 2-3 years of experience in B2B Edtech or education
- Must be an exceptional communicator
- Good at email writing
- MBA preferred, otherwise minimum of a Bachelor’s Degree.
- Must be self-motivated, flexible, collaborative, with an eagerness to learn
Job Title: Full Stack Developer
Experience: 6-8 years
Location: Pune
Mandatory Skills Sets
- Expert-level design and coding skills in the .NET framework. Must have wide exposure to .NET framework components like ASP.NET with C#, MVC, ADO.NET, WPF, WCF, etc., plus a unit test framework
- Good exposure to UI technologies like HTML, CSS, JavaScript, jQuery, Bootstrap, Angular
- Work on relational database systems, Object Oriented Programming and web application development
- Non-relational database experience (MongoDB), plus Postgres
- Agile development methodologies and techniques & tools
- Experience in web services, Microservices
- Experience in developing complex/custom applications
- Should be able to mentor software engineers on design, coding, and software quality standards, as well as independently design and deliver projects
- Technically hands-on; a passionate, creative team player
- Focused on solving critical business problems with a strong delivery orientation
- Excellent Communication and problem-solving skills
Preferred Skills Sets
- Experience in cloud native application development using Azure or AWS will be an advantage
- Direct communication with European/US customers.
- Customer site work experience