11+ Market Automation Tools Jobs in Pune | Market Automation Tools Job openings in Pune
Apply to 11+ Market Automation Tools Jobs in Pune on CutShort.io. Explore the latest Market Automation Tools Job opportunities across top companies like Google, Amazon & Adobe.
- NLU, Text, Speech, Image Processing
- Excited about Machine Learning and AI
- Want to work on a highly scalable, performance-optimized infrastructure that elastically handles customer needs
- Innovate, Design, and prototype solutions for Digital Assistant and Bots platform to handle heavy loads
- Build and maintain our platform and automation frameworks to ensure maximum uptime and predictability while preventing outages and service interruptions or degradations
- Analyze system failures and develop rapid response solutions to ensure such failures do not reoccur
- Work cross-functionally with product development, Product Management, Program Management and Cloud Infra operations teams
- Partner with Engineering to provide the infrastructure and services required to enable innovation
- Predict and provide notice of potential system vulnerabilities for current and future solutions and implementations. Provide specific recommendations and guidance to address such vulnerabilities
- Analyze, build and maintain all automation tools and processes to ensure the highest standards of reliability and robustness
- Fully understand our customer's service needs and ensure we meet these needs
- No matter your role in our team, you will find yourself in an exciting and challenging environment where every person is empowered to show initiative, be outspoken, and be proactive rather than reactive. Oracle is dedicated to the continual growth and development of its staff, striving constantly to strengthen our expertise as well as develop new skills. Our team is spread across four continents - we provide a full range of opportunities and challenges to apply your skills and grow your career in this new and exciting arena.
- Position: Appian Tech Lead
- Job Description:
- Extensive experience in Appian BPM application development
- Knowledge of Appian architecture and its objects best practices
- Participate in analysis, design, and new development of Appian based applications
- Team leadership is mandatory: provide technical leadership to Scrum teams. Appian certification mandatory: L1, L2, or L3
- Must be able to multi-task, work in a fast-paced environment, and resolve problems faced by the team
- Build applications: interfaces, process flows, expressions, data types, sites, and integrations
- Proficient with SQL queries and with accessing data present in DB tables and views
- Experience in Analysis, Designing process models, Records, Reports, SAIL, forms, gateways, smart services, integration services and web services
- Experience working with different Appian Object types, query rules, constant rules and expression rules
Qualifications
- At least 6 years of experience in Implementing BPM solutions using Appian 19.x or higher
- Over 8 years in Implementing IT solutions using BPM or integration technologies
- Experience in Scrum/Agile methodologies with Enterprise level application development projects
- Good understanding of database concepts and strong working knowledge of at least one major database, e.g. Oracle, SQL Server, or MySQL
- Appian BPM application development on version 19.x or higher
- Experience with integrations using web services, e.g. XML, REST, WSDL, SOAP APIs, JDBC, JMS
- Good leadership skills and the ability to lead a team of software engineers technically
- Experience working in Agile Scrum teams
- Good Communication skills
POSITION / TITLE: Data Science Lead
Location: Offshore – Hyderabad/Bangalore/Pune
Who are we looking for?
Individuals with 8+ years of experience implementing and managing data science projects. Excellent working knowledge of traditional machine learning and LLM techniques.
The candidate must demonstrate the ability to navigate and advise on complex ML ecosystems from a model building and evaluation perspective. Experience in NLP and chatbots domains is preferred.
We acknowledge that the job market is blurring the line between data roles: while software skills are necessary, the emphasis of this position is on data science skills, not on data, ML, or software engineering.
Responsibilities:
· Lead data science and machine learning projects, contributing to model development, optimization and evaluation.
· Perform data cleaning, feature engineering, and exploratory data analysis.
· Translate business requirements into technical solutions, document and communicate project progress, manage non-technical stakeholders.
· Collaborate with other DS and engineers to deliver projects.
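The cleaning and feature-engineering responsibilities above can be sketched in plain Python. This is a minimal illustration; the record fields (`age`, `income`) and the derived feature are made up for the example, and a real project would typically use pandas:

```python
from statistics import median

# Toy records with a missing value, standing in for a real dataset.
records = [
    {"age": 34, "income": 52000},
    {"age": None, "income": 61000},
    {"age": 45, "income": 48000},
]

# Cleaning: impute missing ages with the median of the observed ages.
observed = [r["age"] for r in records if r["age"] is not None]
fill = median(observed)
for r in records:
    if r["age"] is None:
        r["age"] = fill

# Feature engineering: derive a simple ratio feature per record.
for r in records:
    r["income_per_age"] = r["income"] / r["age"]
```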
Technical Skills – Must have:
· Experience in and understanding of the natural language processing (NLP) and large language model (LLM) landscape.
· Proficiency with Python for data analysis, supervised & unsupervised learning ML tasks.
· Ability to translate complex machine learning problem statements into specific deliverables and requirements.
· Should have worked with major cloud platforms such as AWS, Azure or GCP.
· Working knowledge of SQL and no-SQL databases.
· Ability to create data and ML pipelines for more efficient and repeatable data science projects using MLOps principles.
· Keep abreast of new tools, algorithms, and techniques in machine learning, and work to implement them in the organization.
· Strong understanding of evaluation and monitoring metrics for machine learning projects.
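As a concrete instance of the evaluation metrics mentioned above, binary-classification precision, recall, and F1 can be computed directly from predictions; a minimal stdlib sketch (in practice one would reach for `sklearn.metrics`):

```python
def precision_recall_f1(y_true, y_pred):
    """Compute binary precision, recall, and F1 from parallel label lists."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

p, r, f1 = precision_recall_f1([1, 0, 1, 1, 0], [1, 0, 0, 1, 1])
```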
Technical Skills – Good to have:
· Track record of getting ML models into production
· Experience building chatbots.
· Experience with closed and open source LLMs.
· Experience with frameworks and technologies like scikit-learn, BERT, langchain, autogen…
· Certifications or courses in data science.
Education:
· Bachelor's/Master's/PhD degree in Computer Science, Engineering, Data Science, or a related field.
Process Skills:
· Understanding of Agile and Scrum methodologies.
· Ability to follow SDLC processes and contribute to technical documentation.
Behavioral Skills :
· Self-motivated and capable of working independently with minimal management supervision.
· Well-developed design, analytical & problem-solving skills
· Excellent communication and interpersonal skills.
· Excellent team player, able to work with virtual teams in several time zones.
Dear Candidate,
We are urgently looking for a Release / Big Data Engineer for our Pune location.
Experience: 5-8 yrs
Location: Pune
Skills: Big Data Engineer, Release Engineer, DevOps, AWS/Azure/GCP cloud experience
JD:
- Oversee the end-to-end release lifecycle, from planning to post-production monitoring. Coordinate with cross-functional teams (DBA, BizOps, DevOps, DNS).
- Partner with development teams to resolve technical challenges in deployment and automation test runs
- Work with shared services DBA teams for schema-based multi-tenancy designs and smooth migrations.
- Drive automation for batch deployments and DR exercises, including YAML-based microservice deployment using shell/Python/Go
- Provide oversight for Big Data toolsets for deployment (e.g., Spark, Hive, HBase) in private cloud and public cloud CDP environments
- Ensure high-quality releases with a focus on stability and long-term performance.
- Able to run automation batch scripts, debug deployment and functional aspects, and work with dev leads to resolve release-cycle issues
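YAML-based microservice deployment as described above typically reduces to validating a manifest and expanding it into per-service deploy commands. A minimal Python sketch, assuming the YAML has already been parsed (e.g. with PyYAML's `safe_load`) into a dict; the field names (`services`, `image`, `replicas`) and the `deploy` command are illustrative, not a real schema or CLI:

```python
# Parsed form of a hypothetical YAML deployment manifest.
manifest = {
    "services": [
        {"name": "payments-api", "image": "payments:1.4.2", "replicas": 3},
        {"name": "ledger", "image": "ledger:2.0.0", "replicas": 2},
    ]
}

def build_deploy_commands(manifest):
    """Validate each service entry and render an illustrative deploy command."""
    commands = []
    for svc in manifest.get("services", []):
        for field in ("name", "image", "replicas"):
            if field not in svc:
                raise ValueError(f"service missing required field: {field}")
        commands.append(
            f"deploy --name {svc['name']} --image {svc['image']} "
            f"--replicas {svc['replicas']}"
        )
    return commands

cmds = build_deploy_commands(manifest)
```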
Regards,
Minakshi Soni
Executive- Talent Acquisition
Rigel Networks
Position Responsibilities
•Design, develop and test responsive and modular web applications providing optimal user experience on desktop and mobile devices
•Coordinate with other developers and teams in a fast-paced, collaborative development environment
•Research, build and coordinate the conversion and/or integration of new features
•Troubleshoot and analyse root cause for pre-prod or production problems and resolve issues
•Address problems with systems integration and compatibility
•Demonstrate impact of design on scalability, performance, and reliability
•Follow established coding and software tools standards in adherence to established security and quality control standards for software development
•Provide technical guidance to junior team members
Requirements and Qualifications
- Bachelor’s degree in Computer Science or related field
- 8+ years of experience as a frontend engineer building large, cross-platform applications
- SME level experience in Angular and/or React
- Excellent experience in Graphql, WebRTC, WebSockets and REST, PWA, Service Workers
- Excellent understanding of DOM, component rendering and client side performance issues
- Deep knowledge of Webpack and similar bundling/build mechanisms, and of optimizing builds
Good-to-have Qualifications
- Experience with building maps, reporting and analytics solutions
- Solid understanding of creating cross-platform mobile applications and publishing them on various channels
- Experience with native Android, Swift, or reactive interfaces using RxJS
- Experience with Cloud Technologies
- Solve complex performance problems and architectural challenges
- Troubleshoot, test and maintain the core product software and databases to ensure strong optimization and functionality.
- Help team in technical challenges
- Code review and deployment
- Technical documentation
- Thorough understanding of React.js and its core principles.
- Familiarity with RESTful APIs.
- Build efficient, testable, and reusable PHP modules.
- Should be able to understand requirements and develop web applications accordingly.
- Ability to work on multiple projects at the same time and complete tasks in a timely manner.
Company Profile:
Easebuzz is a payment solutions company (fintech organisation) which enables online merchants to accept, process, and disburse payments through developer-friendly APIs. We are focused on building plug-and-play products, including the payment infrastructure, to solve complete business problems. It is a place where everything related to payments, lending, subscriptions, and eKYC happens at the same time.
We have been consistently profitable and are constantly developing new innovative products; as a result, we have grown 4x over the past year alone. We are well capitalised and recently closed a fundraise of $4M in March 2021 from prominent VC firms and angel investors. The company is based in Pune and has a total strength of 180 employees. Easebuzz’s corporate culture is tied to the vision of building a workplace that breeds open communication and minimal bureaucracy. An equal opportunity employer, we welcome and encourage diversity in the workplace. One thing you can be sure of is that you will be surrounded by colleagues who are committed to helping each other grow.
Easebuzz Pvt. Ltd. has its presence in Pune, Bangalore, Gurugram.
Salary: As per company standards.
Designation: Data Engineer
Location: Pune
Experience with ETL, Data Modeling, and Data Architecture
Design, build and operationalize large scale enterprise data solutions and applications using one or more of AWS data and analytics services in combination with 3rd parties
- Spark, EMR, DynamoDB, RedShift, Kinesis, Lambda, Glue.
Experience with AWS cloud data lake for development of real-time or near real-time use cases
Experience with messaging systems such as Kafka/Kinesis for real time data ingestion and processing
Build data pipeline frameworks to automate high-volume and real-time data delivery
Create prototypes and proof-of-concepts for iterative development.
Experience with NoSQL databases, such as DynamoDB, MongoDB etc
Create and maintain optimal data pipeline architecture.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
Evangelize a very high standard of quality, reliability, and performance for data models and algorithms that can be streamlined into the engineering and science workflows.
Build and enhance data pipeline architecture by designing and implementing data ingestion solutions.
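The extract-transform-load flow described above can be sketched as three composable stages. A minimal local Python illustration only; the record fields are made up, and a real pipeline would read from Kafka/Kinesis or S3 and write to a warehouse rather than using in-memory lists:

```python
def extract():
    """Stand-in for reading raw events from a source such as Kinesis or S3."""
    return [
        {"user": "a", "amount": "10.5"},
        {"user": "b", "amount": "bad"},  # malformed record
        {"user": "a", "amount": "4.5"},
    ]

def transform(rows):
    """Parse and validate records, dropping rows that fail validation."""
    clean = []
    for row in rows:
        try:
            clean.append({"user": row["user"], "amount": float(row["amount"])})
        except (KeyError, ValueError):
            continue  # in production this would go to a dead-letter queue
    return clean

def load(rows):
    """Stand-in for writing to a warehouse; here, aggregate totals per user."""
    totals = {}
    for row in rows:
        totals[row["user"]] = totals.get(row["user"], 0.0) + row["amount"]
    return totals

totals = load(transform(extract()))
```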
Employment Type
Full-time
You will be responsible for designing, developing, testing, and debugging responsive mobile and web applications.
Qualifications Required -
· Bachelor's degree or equivalent in Computer Science
· Experience - 3-4 Years
· Some experience with mobile app development will be a plus
Skillsets:
- Experience with Node.js, Express.js, and REST APIs
- Experience with NoSQL database - MongoDB
- Experience with AWS services
Looking for a Python Backend Developer to join our client’s award-winning, talented team,
building the next generation automated drone applications for our global customers.
Some Roles/Responsibilities
● Designing, implementing and deploying updates, features to the company’s cloud
backend system. The work typically requires writing RESTful APIs.
● Derive key metrics from the system to estimate system health and performance.
● Prepare automated test suites for continuous health monitoring of the backend
systems.
● Perform load testing to understand potential scale issues and take corrective
measures.
● Integrate tools for incident reporting and response, to ensure high availability of
the backend system.
● Drive Security practices implementation. (Troubleshoot incidents, identify root
cause, configuration, fix and document problems, patching systems for
vulnerabilities, and implement preventive measures)
● Maintain code, user, and API documentation.
● Setting up the CI/CD automation environment for web applications using
Jenkins/ECS.
● Manage Linux Virtual machines on EC2 ensuring correct VPC, Security Group
settings.
● Integrate automated data pipelines to enable analysis of usage patterns on 3rd
party analytics platforms.
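Since the role centres on writing RESTful APIs (with Flask listed as a must-have below), here is a minimal Flask sketch of a metrics endpoint in the spirit of the responsibilities above; the route and payload fields are illustrative only:

```python
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/metrics")
def metrics():
    # Illustrative payload; a real endpoint would derive these values
    # from live system telemetry rather than constants.
    return jsonify({"status": "ok", "requests_served": 1234})
```

In development this would be served with `flask run`; Flask's built-in test client can exercise the route without starting a server.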
Experience required: 3+ years relevant experience
Must-Have Skills:
Flask/Django, Python backend experience, WebSockets, Async IO
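For the Async IO requirement, a minimal stdlib sketch of an asyncio TCP echo exchange; WebSockets would typically use a library such as `websockets` on top of the same event loop, so this only illustrates the async style, not the wire protocol:

```python
import asyncio

async def handle(reader, writer):
    # Echo a single message back to the client, then close.
    data = await reader.read(100)
    writer.write(data)
    await writer.drain()
    writer.close()

async def main():
    # Bind to an OS-assigned free port, then connect to ourselves.
    server = await asyncio.start_server(handle, "127.0.0.1", 0)
    port = server.sockets[0].getsockname()[1]
    async with server:
        reader, writer = await asyncio.open_connection("127.0.0.1", port)
        writer.write(b"ping")
        await writer.drain()
        reply = await reader.read(100)
        writer.close()
        return reply

reply = asyncio.run(main())
```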
Notice Period- Immediate/1months/ 2 months negotiable