4+ Data processing Jobs in Pune | Data processing Job openings in Pune

Solutions Engineer – Technical Customer Success Owner
Location: Pune | Experience: 2–4 years
About FlytBase
FlytBase powers 24/7 autonomous drone and robot operations across industrial sites—solar farms, refineries, rail yards, and more. With deployments at 300+ sites globally and customers like Oxy, CSX, and Anglo-American, we’re scaling the future of Physical AI.
What You’ll Own
This isn’t a demo-and-disappear job. You’ll lead from discovery to deployment—owning the full technical cycle for enterprise drone automation.
- Customer Success & Technical Discovery: Be the go-to advisor for clients—understand pain points, run PoCs, and deliver solutions they can scale.
- Sales Engineering: Assist in GTM execution, drive revenue with compelling value stories, and close enterprise deals.
- Product Feedback Loop: Turn customer insight into roadmap impact—collaborate with product & engineering to build what the market really needs.
- Thought Leadership: Represent FlytBase at expos, train customers, lead webinars, and create high-signal success stories.
Who You Are
- 2–4 years in enterprise SaaS, robotics, drones, or IoT (customer-facing)
- Proven experience in technical pre-sales or onboarding
- Excellent communicator—comfortable with CEOs and CTOs alike
- Obsessively curious, a fast learner, with a bias for execution
- Writes clearly, thinks logically, and simplifies complexity
- Bonus: Experience with UAV systems, DJI, APIs, or deployment architecture
AI-Native, Full-Stack Mindset
- Use AI to prototype, scale outreach, and optimize delivery
- Write docs, guides, and customer content that actually helps
- Solve problems—not just show features.
Who We’re Not Hiring
- Slide flippers who disappear after demos
- “That’s not my job” types
- Activity-over-outcome operators
- Anyone allergic to complexity or documentation
Who You’ll Work With
- Sales → Demo, close, and support GTM
- Product → Shape roadmap with insights from the field
- Engineering → Translate problems into features and fixes
- Marketing → Co-build stories that drive adoption
Ready to Fly?
Apply now: https://forms.gle/4QAUqJxgT7TevpX86
If this felt too intense, it’s probably not for you. But if it sparked a fire—you know what to do.

Job Location: Hyderabad / Bangalore / Chennai / Pune / Nagpur
Notice period: Immediate to 15 days
Python Developer with Snowflake
Job Description:
- 5.5+ years of strong Python development experience with Snowflake.
- Strong hands-on experience with SQL and the ability to write complex queries.
- Strong understanding of how to connect to Snowflake from Python; should be able to handle files of any type.
- Development of data analysis and data processing engines using Python.
- Good experience in data transformation using Python.
- Experience in Snowflake data loads using Python.
- Experience in creating user-defined functions in Snowflake.
- SnowSQL implementation.
- Knowledge of query performance tuning is an added advantage.
- Good understanding of data warehouse (DWH) concepts.
- Interpret/analyze business requirements & functional specifications.
- Good to have: dbt, Fivetran, and AWS knowledge.
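As an illustration of the Snowflake-from-Python loading workflow described above, here is a minimal sketch of the bulk-load statements (PUT + COPY INTO) such a loader would issue. The table and file names are placeholders; real code would send each statement through the official snowflake-connector-python package's `cursor.execute()`.

```python
def build_load_statements(table: str, local_csv: str) -> list[str]:
    """Return the PUT/COPY INTO statements for a table-stage bulk load."""
    return [
        # Upload (and auto-compress) the local file into the table's internal stage.
        f"PUT file://{local_csv} @%{table} AUTO_COMPRESS=TRUE",
        # Copy the staged file into the table, skipping the CSV header row.
        f"COPY INTO {table} "
        f"FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '\"')",
    ]

for stmt in build_load_statements("SALES_RAW", "/tmp/sales.csv"):
    print(stmt)
```

Using the table's own internal stage (`@%table`) keeps the sketch simple; a named stage plus an external location (e.g. S3) is the more common pattern at scale.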



We are looking for an experienced, hands-on Technical Architect to lead our video analytics & surveillance product.
• The ideal candidate has worked on large-scale video platforms (YouTube, Netflix, Hotstar, etc.) or surveillance software
• As a Technical Architect, you are hands-on and a top contributor to product development
• Leads teams on time-sensitive projects
Skills Required:
• Expert-level Python programming skills are a MUST
• Hands-on experience with deep learning & machine learning projects is a MUST
• Must have experience in the design and development of products
• Reviews code & mentors the team to improve the quality and efficiency of delivery
• Ability to troubleshoot and address complex technical problems
• Must be a quick learner with the ability to adapt to growing customer demands
• Hands-on experience designing and deploying large-scale Docker and Kubernetes workloads
• Can lead a technically strong team in sharpening the product further
• Strong design capability with microservices-based architecture and awareness of its pitfalls
• Should have worked on large-scale data processing systems
• Good understanding of DevOps processes
• Familiar with identity management and authorization & authentication frameworks
• Very strong software design, enterprise networking, and advanced problem-solving skills
• Experience writing technical architecture documents
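As a small illustration of the authorization & authentication point above, here is a stdlib-only sketch of HMAC-signed service-to-service tokens. In a real microservices deployment this would normally be delegated to an identity provider (OAuth2/OIDC) rather than hand-rolled; the secret and claim names here are illustrative.

```python
import base64
import hashlib
import hmac
import json
from typing import Optional

SECRET = b"demo-shared-secret"  # placeholder: load from a secret manager in production

def sign(claims: dict) -> str:
    """Serialize claims and append an HMAC-SHA256 signature."""
    body = base64.urlsafe_b64encode(json.dumps(claims, sort_keys=True).encode()).decode()
    sig = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    return f"{body}.{sig}"

def verify(token: str) -> Optional[dict]:
    """Return the claims if the signature checks out, else None."""
    body, _, sig = token.rpartition(".")
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None  # tampered or wrongly keyed token: reject
    return json.loads(base64.urlsafe_b64decode(body.encode()))

token = sign({"svc": "video-ingest", "scope": "read"})
print(verify(token))
```

`hmac.compare_digest` is used instead of `==` to avoid timing side channels when comparing signatures.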
Candidates will be deployed at a financial captive organization in Pune (Kharadi).
Below are the job details:
Experience: 10 to 18 years
Mandatory skills:
- Data migration
- Data flow
The ideal candidate for this role will have the experience and qualifications below:
- Experience building a range of services on a cloud service provider (ideally GCP)
- Hands-on design and development on Google Cloud Platform (GCP) across a wide range of GCP services, including hands-on experience with GCP storage & database technologies
- Hands-on experience architecting, designing, or implementing solutions on GCP, K8s, and other Google technologies, including security and compliance (e.g., IAM and cloud compliance/auditing/monitoring tools)
- Desired skills within the GCP stack: Cloud Run, GKE, serverless, Cloud Functions, Vision API, DLP, Dataflow, Data Fusion
- Prior experience migrating on-prem applications to cloud environments. Knowledge and hands-on experience with Stackdriver, Pub/Sub, VPCs, subnets, route tables, load balancers, and firewalls, both on premises and in GCP
- Integrate, configure, deploy, and manage centrally provided common cloud services (e.g., IAM, networking, logging, operating systems, containers)
- Manage SDN in GCP. Knowledge and experience of DevOps technologies for continuous integration & delivery in GCP using Jenkins
- Hands-on experience with Terraform, Kubernetes, Docker, and Stackdriver
- Programming experience in one or more of the following languages: Python, Ruby, Java, JavaScript, Go, Groovy, Scala
- Knowledge or experience of DevOps tooling such as Jenkins, Git, Ansible, Splunk, Jira or Confluence, AppD, Docker, Kubernetes
- Act as a consultant and subject matter expert for internal teams to resolve technical deployment obstacles and improve the product's vision. Ensure compliance with centrally defined security standards
- Financial-services experience is preferred
- Ability to learn new technologies and rapidly prototype new concepts
- Top-down thinker, excellent communicator, and great problem solver
Experience: 10 to 18 years
Location: Pune
Candidates must have experience in the areas below:
- GCP Data Platform
- Data processing: Dataflow, Dataprep, Data Fusion
- Data storage: BigQuery, Cloud SQL
- Pub/Sub, GCS buckets
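The Pub/Sub-to-BigQuery path implied by the list above can be sketched as a single transform step. The field names and the (simplified) push-envelope shape below are illustrative; a real pipeline would run this inside Dataflow or a Cloud Function and write rows with the BigQuery client's `insert_rows_json()`.

```python
import base64
import json

def pubsub_to_bq_row(envelope: dict) -> dict:
    """Decode one Pub/Sub push envelope into a BigQuery-ready row dict."""
    # Pub/Sub delivers the message payload base64-encoded in the "data" field.
    payload = json.loads(base64.b64decode(envelope["data"]).decode("utf-8"))
    return {
        "event_id": envelope.get("messageId"),
        "published_at": envelope.get("publishTime"),
        "payload": payload,  # maps to a nested RECORD column in the target table
    }

# Illustrative envelope, as a push subscription might deliver it (simplified).
envelope = {
    "messageId": "42",
    "publishTime": "2024-01-01T00:00:00Z",
    "data": base64.b64encode(json.dumps({"user": "a", "amount": 10}).encode()).decode(),
}
print(pubsub_to_bq_row(envelope))
```

Keeping the decode step as a pure function like this makes it easy to unit-test the transform separately from the Pub/Sub and BigQuery plumbing.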