11+ Quantitative Research Jobs in Pune | Quantitative Research Job Openings in Pune
Apply to 11+ quantitative research jobs in Pune on CutShort.io and explore the latest quantitative research job openings across top companies.
About the Company
Our client works in the area of skilling and livelihoods for underserved youth. It is a pioneering program with a strong PPP model, an agency-led approach to livelihoods, and a vision of socio-economic transformation. The NGO runs through a public-private partnership that empowers the government, corporates, NGOs, and citizens to work together toward changing lives.
Responsibilities
● Design and execution of research into the impact of an agency-led approach to livelihoods.
● Design and execution of the appropriate baseline surveys in communities.
● Gaining an in-depth understanding of the impact of the NGO program at the community level.
● Identification of research partners as required.
● Building tools and systems for monitoring and evaluation of the Lighthouse program and guiding the work of the data & reporting team.
● Data quality audits for program-specific data across the organization.
Requirements
● Master’s degree in statistics or economics with a specialization in statistical methods, or social sciences with training in qualitative and quantitative research methods.
● Minimum 5 years of work experience in the social or corporate sector.
● Critical thinking
● Analytical ability
● Result orientation
● Excellent written and oral communication in Hindi and English (knowledge of Marathi will be an added advantage).
● Creativity
Node JS Developer
We are looking for a Node JS Developer to join our team.
In this role, you will be entrusted with developing JavaScript applications using Node.js (NestJS framework).
Apply now for the role of Node JS Developer at Codebuddy, which is opening doors for skilled, energetic developers looking for Node JS developer jobs.
If you can help us design and create multiple layers of applications while working cross-functionally across different infrastructures, and if you love problem-solving, designing systems, and creating quality offerings, you will fit perfectly in Codebuddy.
Responsibilities:
- Developing and maintaining all server-side network components.
- Ensuring optimal performance of the central database and responsiveness to front-end requests.
- Collaborating with front-end developers on the integration of elements.
- Designing back-end services for various business processes.
- Developing high-performance applications by writing testable, reusable, and efficient code.
- Implementing effective security protocols, data protection measures, and storage solutions.
- Running diagnostic tests, repairing defects, and providing technical support.
- Documenting Node.js processes, including database schemas, as well as preparing reports.
- Recommending and implementing improvements to processes and technologies.
- Keeping informed of advancements in the field of Node.js development.
- Analyze requests for enhancements/changes and write amendment/program specifications.
- Understand the inter-dependencies of the service (application, system, and database) and be able to pinpoint problem areas accurately to improve overall efficiency.
- Translate storyboards and various use cases to create high-performing apps.
- Help with code automation.
What You Need To Apply (Technical Competency):
- Bachelor's degree in computer science, information science, or similar.
- Minimum two years of experience as a Node.js developer.
- Minimum 6 months of hands-on experience with TypeScript.
- Basic knowledge of NestJS and Express.js (see the NestJS sketch after this list).
- Extensive knowledge of JavaScript, web stacks, libraries, and frameworks.
- Knowledge of MongoDB.
- Knowledge of front-end technologies such as HTML5 and CSS3.
- Basics of Linux commands and basic Git knowledge for everyday workflow.
- Superb interpersonal, communication, and collaboration skills.
- Exceptional analytical and problem-solving aptitude.
- Great organizational and time management skills.
- Availability to resolve urgent web application issues outside of business hours.
- Self-motivated with the ability to work independently or jointly in a self-directed way.
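The role above asks for basic NestJS and TypeScript knowledge alongside experience building layered server-side applications. Below is a minimal, illustrative sketch of that kind of structure: a controller delegating to an injectable service, wired into a module. The module, route, and service names are hypothetical and are not taken from Codebuddy's actual codebase.

```ts
import { Controller, Get, Injectable, Module, Param } from '@nestjs/common';
import { NestFactory } from '@nestjs/core';

// Hypothetical service layer: holds the "business logic".
@Injectable()
class GreetingService {
  greet(name: string): { message: string } {
    return { message: `Hello, ${name}!` };
  }
}

// Controller layer: exposes GET /greetings/:name and delegates to the service.
@Controller('greetings')
class GreetingController {
  constructor(private readonly greetingService: GreetingService) {}

  @Get(':name')
  getGreeting(@Param('name') name: string) {
    return this.greetingService.greet(name);
  }
}

// Module wiring the controller and provider together.
@Module({
  controllers: [GreetingController],
  providers: [GreetingService],
})
class AppModule {}

// Bootstrap the HTTP server (NestJS runs on Express by default).
async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  await app.listen(3000);
}
bootstrap();
```

Separating the controller, service, and module like this is what the listing refers to as "multiple layers of applications": routing concerns stay in the controller while the service remains independently testable.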
Avegen is a digital healthcare company empowering individuals to take control of their health and supporting healthcare professionals in delivering life-changing care.
Avegen's health management platform is being used by healthcare providers in the U.K. and India to take care delivery beyond the four walls of the hospital/clinic, ensuring that patients have access to personalized care. The platform enables better patient management leading to improved outcomes.
Responsibilities and Skills:
- Own:
  - Taking deliverables through the full lifecycle: spec review, test design, test execution and defect reporting, segregation of tests into smoke, sanity, and regression suites, customer issue reproduction, test case additions/updates, and automation to build a regression suite for customer-reported issues.
  - Helping with issue reproduction and debugging.
  - Keeping the required documentation up to date for compliance on every deliverable.
- Perform under guidance:
  - Review requirements, specifications, and technical design documents to provide timely and meaningful feedback.
  - Create detailed, comprehensive, and well-structured test plans and test cases.
  - Estimate, prioritize, plan, and coordinate testing activities.
  - Liaise with internal teams (e.g. developers and product managers) to identify system requirements.
  - Track quality assurance metrics and QMS compliance.
  - Automate the smoke, sanity, and regression suites for the backlog (see the sketch after this list).
- Expected to take initiative:
  - Stay up to date with new testing and test automation tools and test strategies.
  - Contribute to SDLC discussions and tailor processes to Avegen-specific needs, including waterfall, agile, hybrid, etc.
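One of the responsibilities above is automating the smoke, sanity, and regression suites. The listing does not name a tool, so purely as an illustration, here is a minimal smoke suite written with Playwright Test in TypeScript; the base URL, route, and button label are placeholders rather than Avegen's actual application.

```ts
import { test, expect } from '@playwright/test';

// Placeholder base URL; real staging environments are not public.
const BASE_URL = 'https://staging.example.com';

// A tiny smoke suite: verifies the app is up and the login page renders.
test.describe('smoke', () => {
  test('home page loads', async ({ page }) => {
    const response = await page.goto(BASE_URL);
    expect(response?.ok()).toBeTruthy();
  });

  test('login page shows the sign-in form', async ({ page }) => {
    await page.goto(`${BASE_URL}/login`); // hypothetical route
    await expect(page.getByRole('button', { name: /sign in/i })).toBeVisible();
  });
});
```

A suite like this typically runs on every build, while the larger regression suite runs nightly or before releases.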
Educational Qualifications:
● Bachelor’s Degree in Engineering
● Additional business degree/Certification in Agile or Scrum preferred
● ISTQB certification is good to have.
We will build a comprehensive backtesting platform for trading in the NSE F&O segment.
Any knowledge of financial markets is a bonus.
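The note above is terse, so purely as an illustration, here is a minimal sketch of the core loop a backtesting platform typically runs: iterate over historical bars, apply a strategy signal, and mark positions to market. The moving-average strategy, data shape, and lot size are assumptions for the example, not a specification of the actual platform.

```ts
// Minimal backtest loop sketch; the strategy and data shape are assumptions.
interface Bar {
  date: string;
  close: number; // settlement price of, e.g., an index futures contract
}

type Signal = 'LONG' | 'FLAT';

// Hypothetical strategy: go long when price is above its 20-bar simple moving average.
function smaSignal(history: number[], period = 20): Signal {
  if (history.length < period) return 'FLAT';
  const window = history.slice(-period);
  const sma = window.reduce((a, b) => a + b, 0) / period;
  return history[history.length - 1] > sma ? 'LONG' : 'FLAT';
}

function backtest(bars: Bar[], lotSize = 50): number {
  let pnl = 0;
  let position: Signal = 'FLAT';
  const closes: number[] = [];

  for (let i = 0; i < bars.length; i++) {
    closes.push(bars[i].close);
    // Mark an open long position to market against the previous close.
    if (position === 'LONG' && i > 0) {
      pnl += (bars[i].close - bars[i - 1].close) * lotSize;
    }
    // Decide the position to hold into the next bar.
    position = smaSignal(closes);
  }
  return pnl;
}

// Usage with toy data:
const toyBars: Bar[] = Array.from({ length: 60 }, (_, i) => ({
  date: `2024-01-${String((i % 28) + 1).padStart(2, '0')}`,
  close: 22000 + i * 10,
}));
console.log(`Net P&L: ${backtest(toyBars).toFixed(2)}`);
```

A production platform would add transaction costs, margin handling, and contract rollovers on top of a loop like this.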
- Knowledge of the automation and robotics industry
- Knowledge of the bottling industry
- Material handling: lifts, cranes, conveyors, etc.
- Getting job work for VMC and CNC machines
- Hands-on experience in food technology
- Experience in IoT
- Lead generation
- Seeking an individual with around 5+ years of experience.
- Must-have skills: Jenkins, Groovy, Ansible, shell scripting, Python, Linux administration.
- Deep knowledge of Terraform and AWS to automate and provision EC2, EBS, and SQL Server, along with cost optimization and CI/CD pipelines using Jenkins; serverless automation is a plus (see the provisioning sketch after this list).
- Excellent writing and communication skills in English. Enjoys writing crisp and understandable documentation.
- Comfortable programming in one or more scripting languages.
- Enjoys tinkering with tooling and finding easier ways to handle systems through research. Strong awareness of build vs. buy.
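The skills above center on Terraform and Jenkins, which use their own configuration languages. To keep the examples in one language, the sketch below shows the equivalent EC2 provisioning step through the AWS SDK for JavaScript v3 in TypeScript instead, purely as an illustration of the automation task; the region, AMI ID, and tag values are placeholders.

```ts
import { EC2Client, RunInstancesCommand } from '@aws-sdk/client-ec2';

// Placeholder region; real values come from the team's AWS account.
const client = new EC2Client({ region: 'ap-south-1' });

// Launch a single tagged instance, e.g. a CI worker node.
async function provisionCiWorker(): Promise<string | undefined> {
  const result = await client.send(
    new RunInstancesCommand({
      ImageId: 'ami-0123456789abcdef0', // placeholder AMI ID
      InstanceType: 't3.micro',
      MinCount: 1,
      MaxCount: 1,
      TagSpecifications: [
        {
          ResourceType: 'instance',
          Tags: [{ Key: 'Name', Value: 'ci-worker' }],
        },
      ],
    }),
  );
  return result.Instances?.[0]?.InstanceId;
}

provisionCiWorker().then((id) => console.log(`Launched instance: ${id}`));
```

In practice the same resource would usually be declared in Terraform so that state, drift detection, and teardown are handled declaratively; the SDK call just makes the underlying API step concrete.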
Required:
- Strong background in, and at least 3+ years of experience working in, tooling or QA automation
- Thorough understanding of SDLC, specifically automated QA processes in agile development environments
- Experience in writing, executing and monitoring automated test suites using a variety of technologies
- Proficient with bug tracking and test management toolsets to support development processes
- Strong working knowledge of testing fundamentals such as TDD & BDD (see the test sketch after the Desired Skills list)
- Proficient working with relational databases such as MySQL & PostgreSQL
- Some knowledge of Unix/Linux
Desired Skills:
- Building test infrastructures using containerization technologies such as Docker and working within continuous delivery / continuous release pipeline processes
- Testing enterprise applications deployed to cloud environments such as Google Cloud Platform
- Experience mentoring QA staff and end users on quality objectives and testing processes
- Understanding of coding enterprise applications within Java, PHP, and other languages
- Understanding of NoSQL database technologies such as MongoDB or DynamoDB
- Degree level qualifications in a technical related subject
- Proactive 'self-starter' attitude
- Lifelong learner - thrives from developing and sharing knowledge
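Several requirements above mention TDD/BDD and automated test suites without naming a framework. As one common choice, here is a minimal TDD-style example using Jest with TypeScript; the `calculateDiscount` function and its business rule are made up for the illustration.

```ts
// discount.ts: a trivial unit under test (hypothetical business rule).
export function calculateDiscount(total: number, isMember: boolean): number {
  if (total < 0) throw new Error('total must be non-negative');
  const rate = isMember ? 0.1 : 0;
  return total * rate;
}

// discount.test.ts: Jest specs written first, describing the expected behaviour.
import { calculateDiscount } from './discount';

describe('calculateDiscount', () => {
  it('gives members a 10% discount', () => {
    expect(calculateDiscount(200, true)).toBeCloseTo(20);
  });

  it('gives non-members no discount', () => {
    expect(calculateDiscount(200, false)).toBe(0);
  });

  it('rejects negative totals', () => {
    expect(() => calculateDiscount(-1, true)).toThrow('non-negative');
  });
});
```

In a TDD workflow these specs would be written before the implementation and kept in the regression suite that the CI pipeline runs on every change.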
Summary
Our Kafka developer combines technical skills, communication skills, and business knowledge, and should be able to work on multiple medium-to-large projects. The successful candidate will have excellent technical skills in Apache/Confluent Kafka and an enterprise data warehouse (preferably GCP BigQuery or an equivalent cloud EDW), and will be able to take oral and written business requirements and develop efficient code to meet set deliverables.
Must Have Skills
- Participate in the development, enhancement and maintenance of data applications both as an individual contributor and as a lead.
- Lead the identification, isolation, resolution, and communication of problems within the production environment.
- Act as lead developer, applying technical skills in Apache/Confluent Kafka (preferred) or AWS Kinesis (optional), and a cloud enterprise data warehouse: Google BigQuery (preferred), AWS Redshift, or Snowflake (optional).
- Design and recommend the best approach for data movement from different sources to the cloud EDW using Apache/Confluent Kafka.
- Perform independent functional and technical analysis for major projects supporting several corporate initiatives.
- Communicate and work with IT partners and the user community at various levels, from senior management to developers to business SMEs, for project definition.
- Work on multiple platforms and multiple projects concurrently.
- Perform code and unit testing for complex-scope modules and projects.
- Provide expertise and hands-on experience working with Kafka Connect and Schema Registry in a very high-volume environment (~900 million messages); see the consumer sketch after this list.
- Provide expertise in Kafka brokers, ZooKeeper, KSQL, KStreams, and Kafka Control Center.
- Provide expertise and hands-on experience working with AvroConverter, JsonConverter, and StringConverter.
- Provide expertise and hands-on experience working with Kafka connectors such as MQ connectors, Elasticsearch connectors, JDBC connectors, FileStream connectors, and JMS source connectors, as well as tasks, workers, converters, and transforms.
- Provide expertise and hands-on experience with custom connectors using the Kafka core concepts and API.
- Working knowledge of Kafka REST Proxy.
- Ensure optimum performance, high availability and stability of solutions.
- Create topics, set up redundancy clusters, deploy monitoring tools and alerts, and bring good knowledge of best practices.
- Create stubs for producers, consumers, and consumer groups to help onboard applications from different languages/platforms. Leverage Hadoop ecosystem knowledge to design and develop capabilities that deliver our solutions using Spark, Scala, Python, Hive, Kafka, and other components of the Hadoop ecosystem.
- Use automation tools for provisioning, such as Jenkins, uDeploy, or relevant technologies.
- Ability to perform data-related benchmarking, performance analysis, and tuning.
- Strong skills in in-memory applications, database design, and data integration.
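The list above covers producers, consumers, and consumer groups. As a minimal illustration only (the listing does not prescribe a client library), here is a producer/consumer stub using the kafkajs Node.js client in TypeScript; the broker address, topic, and group ID are placeholders for whatever the actual cluster uses.

```ts
import { Kafka } from 'kafkajs';

// Placeholder broker/topic/group names; real values depend on the cluster.
const kafka = new Kafka({ clientId: 'demo-app', brokers: ['localhost:9092'] });
const TOPIC = 'orders';

// Producer stub: send one JSON-encoded message.
async function produceOne() {
  const producer = kafka.producer();
  await producer.connect();
  await producer.send({
    topic: TOPIC,
    messages: [{ key: 'order-1', value: JSON.stringify({ amount: 42 }) }],
  });
  await producer.disconnect();
}

// Consumer-group stub: log every message on the topic.
async function consume() {
  const consumer = kafka.consumer({ groupId: 'demo-group' });
  await consumer.connect();
  await consumer.subscribe({ topic: TOPIC, fromBeginning: true });
  await consumer.run({
    eachMessage: async ({ topic, partition, message }) => {
      console.log(`${topic}[${partition}] ${message.key}: ${message.value}`);
    },
  });
}

produceOne().then(consume).catch(console.error);
```

Stubs like these are what the listing means by helping teams onboard: each application team starts from a known-good producer or consumer skeleton and only swaps in its own topic, serialization (e.g. Avro via Schema Registry), and processing logic.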




