11+ Dataflow Architecture Jobs in Pune
Apply to 11+ Dataflow architecture jobs in Pune on CutShort.io. Explore the latest Dataflow architecture job opportunities across top companies like Google, Amazon, and Adobe.
Pune-based multinational IT company
The candidate will be deployed at a financial captive organization in Pune (Kharadi).
Job details:
Experience: 10 to 18 years
Mandatory skills:
- Data migration
- Dataflow
The ideal candidate will have the following experience and qualifications:
- Experience building a range of services on a cloud service provider (ideally GCP)
- Hands-on design and development on Google Cloud Platform (GCP) across a wide range of services, including GCP storage and database technologies
- Hands-on experience architecting, designing, or implementing solutions on GCP, Kubernetes, and other Google technologies, including security and compliance (e.g., IAM and cloud compliance/auditing/monitoring tools)
- Desired skills within the GCP stack: Cloud Run, GKE, serverless, Cloud Functions, Vision API, DLP, Dataflow, Data Fusion
- Prior experience migrating on-premises applications to cloud environments; knowledge and hands-on experience with Stackdriver, Pub/Sub, VPCs, subnets, route tables, load balancers, and firewalls, both on-premises and in GCP (see the Pub/Sub sketch after this list)
- Integrate, configure, deploy, and manage centrally provided common cloud services (e.g., IAM, networking, logging, operating systems, containers)
- Manage SDN in GCP; knowledge and experience of DevOps practices for continuous integration and delivery in GCP using Jenkins
- Hands-on experience with Terraform, Kubernetes, Docker, and Stackdriver
- Programming experience in one or more of the following languages: Python, Ruby, Java, JavaScript, Go, Groovy, Scala
- Knowledge of or experience with DevOps tooling such as Jenkins, Git, Ansible, Splunk, Jira or Confluence, AppDynamics, Docker, Kubernetes
- Act as a consultant and subject-matter expert for internal teams to resolve technical deployment obstacles and improve the product vision; ensure compliance with centrally defined security policies
- Financial-services experience is preferred
- Ability to learn new technologies and rapidly prototype new concepts
- Top-down thinker, excellent communicator, and great problem solver
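Since Pub/Sub features prominently in the list above, here is a minimal sketch of publishing a message with the google-cloud-pubsub Python client. The project and topic names are placeholders, not values from the posting.

```python
# Minimal sketch: publish a message to a Pub/Sub topic.
# "my-project" and "my-topic" are placeholder names.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "my-topic")

# Pub/Sub payloads are bytes; keyword arguments become string attributes.
future = publisher.publish(topic_path, b"hello from GCP", origin="onboarding-demo")
print(f"Published message ID: {future.result()}")
```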
Experience: 10 to 18 years
Location: Pune
The candidate must have experience with the following:
- GCP Data Platform
- Data processing: Dataflow, Dataprep, Data Fusion (a minimal pipeline sketch follows this list)
- Data storage: BigQuery, Cloud SQL
- Messaging and object storage: Pub/Sub, GCS buckets
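To make the stack above concrete, here is a minimal Apache Beam pipeline in Python of the kind that runs on Dataflow, reading lines from a GCS bucket and writing results back. The project, region, and bucket values are placeholders.

```python
# Minimal Apache Beam pipeline sketch for Dataflow.
# Project, region, and bucket values are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner="DataflowRunner",          # use "DirectRunner" to test locally
    project="my-project",
    region="us-central1",
    temp_location="gs://my-bucket/tmp",
)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadFromGCS" >> beam.io.ReadFromText("gs://my-bucket/input/*.txt")
        | "Uppercase" >> beam.Map(str.upper)   # trivial per-element transform
        | "WriteToGCS" >> beam.io.WriteToText("gs://my-bucket/output/result")
    )
```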
Job Description:
• Proficient in Python.
• Good knowledge of stress/load testing and performance testing.
• Knowledge of Linux.
Technologies/Frameworks:
• Core Java, J2EE
• Spring Core, Spring MVC, Spring Boot, Spring Security
• JDBC, Hibernate, RESTful APIs, SOAP web services
• Knowledge of JavaScript, jQuery, AJAX, HTML5, and CSS3; Angular is an added advantage
• JUnit or Mockito frameworks
• Maven, Git
• Knowledge of data structures
• SQL, MySQL
• Designing relational database schemas
• Basics of AWS, cloud, and microservices
• Domain: BFSI, FinTech
WHO WILL LOVE THIS JOB?
• Attracted to creativity and innovation, with an eagerness to learn
• Alignment with a fast-paced organization and its short-term and long-term goals
• An engaging, open, genuine personality that naturally encourages interaction with individuals at all levels
• A strong value system and sense of ethics
• Absolute dedication to premium quality
• A desire to build a strong core product team capable of developing solutions to complex, industry-first problems
• A balance of experience, knowledge, and new learning
ROLES AND RESPONSIBILITIES
• Driving the success of the software engineering team at Datamotive.
• Collaborating with senior and peer engineers to prioritize and deliver features on the roadmap.
• Building a strong development team focused on optimized, usable solutions.
• Researching, designing, and developing distributed solutions to handle workload mobility across multi-cloud and hybrid-cloud environments.
• Assisting in identifying, researching, and designing new features and cloud platform support in areas such as disaster recovery, data protection, and workload migration.
• Assisting in building the product roadmap.
• Conducting pilot tests to assess the functionality of newly developed programs.
• Engaging directly with customers for product introductions, knowledge transfer, solutioning, bug triaging, etc.
• Assisting customers through product demos, POCs, training, etc.
• Managing Datamotive infrastructure and introducing automation to optimize infrastructure usage through monitoring and scripting.
• Designing test environments that simulate customer behaviors and use cases on VMware vSphere, AWS, GCP, and Azure.
• Helping write technical documentation and generating marketing content such as blogs, webinars, and seminars.
TECHNICAL SKILLS
• 3–8 years of experience in software development, with relevant domain understanding of data protection, disaster recovery, and ransomware recovery.
• A strong understanding of, and demonstrable experience with, at least one of the major cloud platforms (GCP, AWS, Azure, or VMware).
• A strong understanding of, and experience with, designing and developing architectures for complex distributed systems.
• Insight into developing client-server SaaS applications, with good breadth across networking, storage, microservices, and other web technologies.
• Experience building and leading strong development teams, with a systems product development background.
• Programming knowledge in one of Go, C, C++, Python, or shell scripting.
• A computer science graduate with strong fundamentals and problem-solving abilities.
• A good understanding of virtualization, storage, and cloud platforms such as VMware, AWS, GCP, Azure, and/or Kubernetes is preferred.
- Hands-on experience with Node.js and Angular (2+ years).
- Experience with popular frameworks such as Express or LoopBack.
- Experience creating and integrating backend REST APIs.
- Hands-on experience with any RDBMS.
- A passion for complex, interactive applications with thoughtful UX/UI.
- The ability to understand project requirements thoroughly.
- A deep understanding of the importance of building maintainable, efficient, clean code while balancing the urgency of business needs.
- A collaborative attitude and experience working with cross-functional teams.
Responsibilities
- Drive and deliver web applications.
- Architect and design project solutions during the requirements phase.
- Work with the team to enhance collaboration, adding value to the project through critical thinking on project design.
- Create and integrate REST APIs.
Good to have
- Experience with AWS / Azure.
- Experience with CI/CD tools.
- Experience with any testing and automation frameworks.
The key aspects of this role include:
• Design, build, and maintain scalable applications using Python.
• Contribute to the entire implementation process, including driving the definition of improvements based on business needs and architectural considerations.
• Act as a subject matter expert for Application Software developers and Engineers.
• Handle server-side code for a production platform and contribute to new features.
To be the right fit, you'll need:
• 4+ years of experience as a software developer in Python, with knowledge of at least one Python web framework such as Django or Flask (a minimal Flask sketch follows this list)
• A good understanding of common design patterns and architecture principles for designing reliable and scalable applications
• Strong communication skills
• Knowledge of NoSQL databases such as MongoDB
• Good to have: AWS, Docker, or web services
• A basic understanding of front-end technologies such as JavaScript, HTML5, and CSS3
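As an illustration of the web-framework requirement above, a minimal Flask REST endpoint might look like the following. The route and payload shape are purely illustrative, not part of the posting.

```python
# Minimal Flask REST endpoint sketch; route and payload are illustrative.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/items", methods=["POST"])
def create_item():
    payload = request.get_json(force=True)
    # Validation and persistence would go here in a real application.
    return jsonify({"status": "created", "item": payload}), 201

if __name__ == "__main__":
    app.run(debug=True)
```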
Responsibilities for Data Scientist / NLP Engineer
• Work with customers to identify opportunities for leveraging their data to drive business solutions.
• Develop custom data models and algorithms to apply to data sets.
• Perform basic data cleaning and annotation for incoming raw data.
• Use predictive modeling to increase and optimize customer experience, revenue generation, ad targeting, and other business outcomes.
• Develop the company's A/B testing framework and test model quality.
• Deploy ML models to production.
Qualifications for Junior Data Scientist/ NLP Engineer
• BS or MS in Computer Science, Engineering, or a related discipline.
• 3+ years of experience in data science / machine learning.
• Experience with the Python programming language.
• Familiarity with at least one database query language, such as SQL.
• Knowledge of text classification and clustering, question answering and query understanding, and search indexing and fuzzy matching (see the classification sketch after this list).
• Excellent written and verbal communication skills for coordinating across teams.
• Willingness to learn and master new technologies and techniques.
• Knowledge of and experience with statistical and data-mining techniques: GLM/regression, random forests, boosting, trees, text mining, NLP, etc.
• Experience with chatbots is a bonus but not required.
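As a concrete illustration of the text-classification requirement, here is a minimal scikit-learn sketch using TF-IDF features and logistic regression. The tiny training set and label names are invented purely for the example.

```python
# Minimal text-classification sketch: TF-IDF + logistic regression.
# The tiny training set below is invented purely for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "how do I reset my password",
    "my login keeps failing",
    "cancel my order",
    "where is my shipment",
]
labels = ["account", "account", "orders", "orders"]

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),   # unigram + bigram features
    LogisticRegression(max_iter=1000),
)
model.fit(texts, labels)

print(model.predict(["I forgot my password"]))  # expected: ['account']
```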
As a QA engineer at MindTickle, you will be required to:
• Work on web and mobile applications.
• Be keen on finding loopholes and bugs in systems, and be able to think outside the box.
• Have good communication skills and be a good team player as well as an individual contributor.
• Take responsibility and ownership of assignments.
• Grow gradually from manual QA into a full-stack QA role (manual + automation).
• Have good programming skills.