11+ Qualitative Research Jobs in Pune
Apply to 11+ Qualitative Research job openings in Pune on CutShort.io.

About Company
Our client works in the area of skilling and livelihoods for underserved youth. It is a pioneering program with a strong PPP model, an agency-led approach to livelihoods, and a vision of socio-economic transformation. The NGO runs through a public-private partnership that empowers the Government, corporates, NGOs, and citizens to work together toward changing lives.
Responsibilities
● Design and execution of research into the impact of an agency-led approach to livelihoods.
● Design and execution of the appropriate baseline surveys in communities.
● Gaining an in-depth understanding of the impact of the NGO program at the community level.
● Identification of research partners as required.
● Building tools and systems for monitoring and evaluation of the Lighthouse program and guiding the work of the data & reporting team.
● Data quality audits for program-specific data across the organization.
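The data-quality audit responsibility above typically means checking survey records for completeness, validity, and duplicates before they feed reporting. A minimal sketch in Python, where the field names ("respondent_id", "age", "village") are hypothetical examples, not the NGO's actual schema:

```python
# Minimal data-quality audit sketch for survey records.
# Field names ("respondent_id", "age", "village") are hypothetical.

def audit_records(records, required_fields):
    """Count missing required fields, out-of-range ages, and duplicate IDs."""
    issues = {"missing": 0, "invalid_age": 0, "duplicates": 0}
    seen_ids = set()
    for rec in records:
        if any(rec.get(f) in (None, "") for f in required_fields):
            issues["missing"] += 1
        age = rec.get("age")
        if age is not None and not (0 <= age <= 120):
            issues["invalid_age"] += 1
        rid = rec.get("respondent_id")
        if rid in seen_ids:
            issues["duplicates"] += 1
        seen_ids.add(rid)
    return issues

records = [
    {"respondent_id": 1, "age": 19, "village": "A"},
    {"respondent_id": 2, "age": 250, "village": "B"},  # invalid age
    {"respondent_id": 2, "age": 30, "village": "C"},   # duplicate ID
    {"respondent_id": 3, "age": 22, "village": ""},    # missing field
]
report = audit_records(records, ["respondent_id", "age", "village"])
```

In practice such checks would run automatically against each data collection round, with failures routed back to the field team.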
Requirements
● Master’s degree in statistics or economics with a specialization in statistical methods, or social sciences with training in qualitative and quantitative research methods.
● Minimum 5 years of work experience in the social sector/ corporate sector.
● Critical thinking
● Analytical ability
● Result orientation
● Excellent written and oral communication in Hindi and English (knowledge of Marathi will be an added advantage)
● Creativity
Good understanding of and experience with HTML / CSS / JavaScript.
Hands-on experience with ES6 / ES7 / ES8 features.
Thorough understanding of the request lifecycle (including the event queue, event loop, worker threads, etc.).
Familiarity with security principles including SSL protocols, data encryption, XSS, CSRF.
Expertise in Web Services / REST APIs will be beneficial.
Proficiency in Linux and deployment on Linux are valuable.
Knowledge of ORMs like Sequelize and ODMs like Mongoose, and the ability to handle DB transactions, is a necessity.
Experience with AngularJS / React will be an added advantage.
Expertise with RDBMS like MySQL / PostgreSQL will be a plus.
Knowledge of AWS services like S3, EC2 will be helpful.
Understanding of Agile and CI/CD will be of value.
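The DB-transaction requirement above is language-agnostic: commit a group of writes atomically, and roll everything back on failure. A minimal sketch using Python's stdlib sqlite3 as a stand-in (Sequelize and Mongoose expose equivalent transaction APIs in Node.js); the accounts table is an illustrative example:

```python
import sqlite3

# Transaction sketch: commit on success, roll back on any error.
# sqlite3 stands in for an ORM-managed connection; Sequelize/Mongoose
# offer equivalent transaction APIs in Node.js.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INT)")
conn.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 0)")
conn.commit()

def transfer(conn, src, dst, amount):
    try:
        with conn:  # opens a transaction; commits, or rolls back on exception
            conn.execute(
                "UPDATE accounts SET balance = balance - ? WHERE name = ?",
                (amount, src),
            )
            (bal,) = conn.execute(
                "SELECT balance FROM accounts WHERE name = ?", (src,)
            ).fetchone()
            if bal < 0:
                raise ValueError("insufficient funds")
            conn.execute(
                "UPDATE accounts SET balance = balance + ? WHERE name = ?",
                (amount, dst),
            )
        return True
    except ValueError:
        return False

ok = transfer(conn, "alice", "bob", 60)    # succeeds and commits
bad = transfer(conn, "alice", "bob", 500)  # fails and rolls back

balances = dict(conn.execute("SELECT name, balance FROM accounts"))
```

The failed transfer leaves balances untouched, which is exactly the property "handling DB transactions" refers to.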
Experience: 4+ years.
Location: Vadodara & Pune
Skill set: Snowflake, Power BI, ETL, SQL, data pipelines
What you'll be doing:
- Develop, implement, and manage scalable Snowflake data warehouse solutions using advanced features such as materialized views, task automation, and clustering.
- Design and build real-time data pipelines from Kafka and other sources into Snowflake using Kafka Connect, Snowpipe, or custom solutions for streaming data ingestion.
- Create and optimize ETL/ELT workflows using tools like DBT, Airflow, or cloud-native solutions to ensure efficient data processing and transformation.
- Tune query performance, warehouse sizing, and pipeline efficiency by utilizing Snowflake's Query Profile, Resource Monitors, and other diagnostic tools.
- Work closely with architects, data analysts, and data scientists to translate complex business requirements into scalable technical solutions.
- Enforce data governance and security standards, including data masking, encryption, and RBAC, to meet organizational compliance requirements.
- Continuously monitor data pipelines, address performance bottlenecks, and troubleshoot issues using monitoring frameworks such as Prometheus, Grafana, or Snowflake-native tools.
- Provide technical leadership, guidance, and code reviews for junior engineers, ensuring best practices in Snowflake and Kafka development are followed.
- Research emerging tools, frameworks, and methodologies in data engineering and integrate relevant technologies into the data stack.
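Streaming ingestion from Kafka into Snowflake via Snowpipe is, in practice, micro-batched: events are buffered and flushed when the batch is big enough or old enough. A service-free sketch of that buffering logic (the thresholds and the `sink` callback are illustrative assumptions, not Snowpipe's actual interface):

```python
import time

class MicroBatcher:
    """Buffer events and flush by count or age, as Snowpipe-style
    ingestion pipelines do; `sink` stands in for the load into Snowflake."""

    def __init__(self, sink, max_events=3, max_age_s=60.0, clock=time.monotonic):
        self.sink = sink
        self.max_events = max_events
        self.max_age_s = max_age_s
        self.clock = clock
        self.buffer = []
        self.opened_at = None

    def add(self, event):
        if not self.buffer:
            self.opened_at = self.clock()  # batch age starts at first event
        self.buffer.append(event)
        if (len(self.buffer) >= self.max_events
                or self.clock() - self.opened_at >= self.max_age_s):
            self.flush()

    def flush(self):
        if self.buffer:
            self.sink(list(self.buffer))
            self.buffer.clear()

batches = []
batcher = MicroBatcher(batches.append, max_events=3)
for event in ["e1", "e2", "e3", "e4"]:
    batcher.add(event)
batcher.flush()  # flush the trailing partial batch
```

The count/age trade-off here is the same one tuned in real pipelines: larger batches are cheaper per row, smaller ones reduce end-to-end latency.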
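The data-masking and RBAC point above is normally enforced in the warehouse itself (in Snowflake, via `CREATE MASKING POLICY` attached to columns and role grants), but the core rule, show the raw value only to authorized roles, can be sketched in plain Python. The role names and the redaction format below are illustrative assumptions:

```python
# Masking-policy sketch: return the raw value only for privileged roles,
# a redacted form otherwise. Role names are illustrative assumptions;
# in Snowflake this logic lives in a masking policy plus RBAC grants.

PRIVILEGED_ROLES = {"PII_ADMIN", "COMPLIANCE"}

def mask_email(value, role):
    """Raw email for privileged roles, redacted form for everyone else."""
    if role in PRIVILEGED_ROLES:
        return value
    local, _, domain = value.partition("@")
    return local[:1] + "***@" + domain

analyst_view = mask_email("ravi@example.com", "ANALYST")
admin_view = mask_email("ravi@example.com", "PII_ADMIN")
```

Keeping the policy in one place (rather than in every query) is what makes audits and compliance reviews tractable.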
What you need:
Basic Skills:
- 3+ years of hands-on experience with Snowflake data platform, including data modeling, performance tuning, and optimization.
- Strong experience with Apache Kafka for stream processing and real-time data integration.
- Proficiency in SQL and ETL/ELT processes.
- Solid understanding of cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with scripting languages like Python, Shell, or similar for automation and data integration tasks.
- Familiarity with tools like dbt, Airflow, or similar orchestration platforms.
- Knowledge of data governance, security, and compliance best practices.
- Strong analytical and problem-solving skills with the ability to troubleshoot complex data issues.
- Ability to work in a collaborative team environment and communicate effectively with cross-functional teams
Responsibilities:
- Design, develop, and maintain Snowflake data warehouse solutions, leveraging advanced Snowflake features like clustering, partitioning, materialized views, and time travel to optimize performance, scalability, and data reliability.
- Architect and optimize ETL/ELT pipelines using tools such as Apache Airflow, DBT, or custom scripts, to ingest, transform, and load data into Snowflake from sources like Apache Kafka and other streaming/batch platforms.
- Work in collaboration with data architects, analysts, and data scientists to gather and translate complex business requirements into robust, scalable technical designs and implementations.
- Design and implement Apache Kafka-based real-time messaging systems to efficiently stream structured and semi-structured data into Snowflake, using Kafka Connect, KSQL, and Snowpipe for real-time ingestion.
- Monitor and resolve performance bottlenecks in queries, pipelines, and warehouse configurations using tools like Query Profile, Resource Monitors, and Task Performance Views.
- Implement automated data validation frameworks to ensure high-quality, reliable data throughout the ingestion and transformation lifecycle.
- Deploy and maintain pipeline monitoring solutions using Prometheus, Grafana, or cloud-native tools, ensuring efficient data flow, scalability, and cost-effective operations.
- Implement and enforce data governance policies, including role-based access control (RBAC), data masking, and auditing to meet compliance standards and safeguard sensitive information.
- Provide hands-on technical mentorship to junior data engineers, ensuring adherence to coding standards, design principles, and best practices in Snowflake, Kafka, and cloud data engineering.
- Stay current with advancements in Snowflake, Kafka, cloud services (AWS, Azure, GCP), and data engineering trends, and proactively apply new tools and methodologies to enhance the data platform.
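The automated data-validation responsibility above usually amounts to running a set of declarative checks against each batch before it is promoted downstream. A minimal sketch, where the check names and rules are illustrative examples rather than a specific framework's API:

```python
# Minimal batch-validation sketch: each check is a (name, predicate) pair
# run over the batch; a batch is promoted only if every check passes.
# The checks shown (non-empty, non-null IDs, non-negative amounts) are
# illustrative examples.

def validate_batch(rows, checks):
    """Return the names of all failing checks (empty list means pass)."""
    return [name for name, predicate in checks if not predicate(rows)]

checks = [
    ("non_empty", lambda rows: len(rows) > 0),
    ("no_null_ids", lambda rows: all(r.get("id") is not None for r in rows)),
    ("amounts_positive", lambda rows: all(r["amount"] >= 0 for r in rows)),
]

good = [{"id": 1, "amount": 10}, {"id": 2, "amount": 5}]
bad = [{"id": None, "amount": -3}]

good_failures = validate_batch(good, checks)
bad_failures = validate_batch(bad, checks)
```

In a real pipeline the failure list would gate promotion of the batch and feed the monitoring stack (Prometheus/Grafana) mentioned above.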


Position Overview: We are seeking an experienced ASP.NET Developer to join our development team. As an ASP.NET Developer, you will be responsible for designing, developing, and maintaining web applications using the ASP.NET framework. You will work closely with cross-functional teams to deliver high-quality software solutions that meet our clients' needs.
Responsibilities:
- Collaborate with project managers, business analysts, and other stakeholders to gather requirements and understand project objectives.
- Design and develop web applications using ASP.NET, ASP.NET Core, C#, and related technologies.
- Write clean, scalable, and maintainable code following coding standards and best practices.
- Develop database structures, queries, and stored procedures using Microsoft SQL Server or other database systems.
- Implement and integrate third-party libraries, APIs, and frameworks as required.
- Conduct thorough testing of developed applications to ensure functionality, performance, and security.
- Troubleshoot and debug issues in existing applications and provide timely resolutions.
- Participate in code reviews and provide constructive feedback to team members.
- Stay up-to-date with industry trends, technologies, and best practices to continuously improve development processes and techniques.
Qualifications:
- Bachelor's degree in Computer Science, Software Engineering, or a related field (or equivalent work experience).
- Proven experience as an ASP.NET Developer or in a similar role.
- Strong knowledge of ASP.NET framework and proficiency in C# programming language.
- Experience with web technologies such as HTML, CSS, JavaScript, and jQuery.
- Solid understanding of relational databases and proficiency in SQL.
- Familiarity with front-end frameworks/libraries (e.g., Angular, React) is a plus.
- Experience with version control systems (e.g., Git) and agile development methodologies.
- Ability to work independently as well as in a collaborative team environment.
- Strong problem-solving skills and attention to detail.
Excellent communication and interpersonal skills.
Roles and Responsibilities
The ideal candidate holds a Bachelor's degree in Marine Engineering, Electrical/Instrumentation Engineering, Automation, or an adjacent field, with experience in maritime systems. Alternatively, experienced electrical officers with maritime or naval experience and exposure to the areas below are also welcome.
Prior work experience in the maritime industry, preferably with strong competence in some or all of the following, is desirable:
- Marine Automation, Marine Electrical Systems
- Communication Systems, NAVCOM systems
- Experience in working with PLCs, RTU, Gateway, Edge Devices/ Data Loggers
- Systems operation and maintenance
- In-depth knowledge of maintenance and commissioning of marine electrical and NAVCOM systems; ability to carry out in-depth troubleshooting and guide vessels to effective case resolutions
- Work Experience as an Electrical/ Technical Superintendent will be a Plus
- Prior experience in the review of vessel drawings and machinery specifications is a must
- Carry out vessel inspection visits as required, and supervise retrofit installations during digital automation & Smart Ship Platform implementation.
A challenging role with one of the fastest-growing maritime digital automation startups.
- The candidate will be driving global delivery, conducting ship inspection to identify vessel machinery from which data signals can be fetched.
- The candidate must have past experience in data telemetry from machinery such as the Main Engine, DG sets, boilers, flowmeters, emissions systems, ballast systems, etc.
- The candidate must have experience working with data signals such as RS485, NMEA, 4-20 mA, 0-10 V, etc.
- The candidate will work in close coordination with technology specialists, data scientists, and chief engineers with experience in vessel and voyage performance management.
- Prior experience with various control systems, especially data telemetry, gateway configuration, and working with different PLCs such as Siemens, Allen-Bradley, etc.
- Candidates with experience in only navigation & communication systems are also welcome.
- Candidates will be working on cutting-edge technology & new-age maritime business practices to come up with solutions for the global maritime industry.
- Any experience across electrical systems of the vessels and awareness of various machinery parameters are welcome.
- Awareness & knowledge of power generation, automation systems, and reefer container monitoring; experience in PMS is an added advantage.
Candidates will get tremendous learning experience of innovative solutions & emerging business models in the maritime vertical.
- Consulting & Advisory positions are also available in addition to full-time positions.
- Work from home also available.
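Two of the signal types named above have simple, well-known handling rules: a 4-20 mA current loop maps linearly onto the sensor's engineering range, and an NMEA 0183 sentence carries an XOR checksum of all characters between `$` and `*`. A sketch of both, where the 0-500 m3/h flowmeter range is an illustrative assumption:

```python
def scale_4_20ma(current_ma, range_min, range_max):
    """Map a 4-20 mA loop reading linearly onto an engineering range."""
    return range_min + (current_ma - 4.0) / 16.0 * (range_max - range_min)

def nmea_checksum(sentence):
    """XOR of all characters between '$' and '*', as a 2-digit hex string,
    per NMEA 0183."""
    payload = sentence[sentence.index("$") + 1 : sentence.index("*")]
    cksum = 0
    for ch in payload:
        cksum ^= ord(ch)
    return f"{cksum:02X}"

# A 12 mA reading on a hypothetical 0-500 m3/h flowmeter is mid-range:
flow = scale_4_20ma(12.0, 0.0, 500.0)
# Widely used example GGA sentence; its published checksum is 47:
ck = nmea_checksum(
    "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
)
```

Comparing the computed checksum against the two hex digits after `*` is the standard way a data logger rejects corrupted NAVCOM sentences.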

*3+ years of US Sales experience is mandatory (Product/Service)
*Min Graduation
*Extraordinarily fluent in English
*Confident, with strong convincing skills.
*To go "Extra Mile" to meet the sales target.
*Target driven with a positive attitude.
*Standard hike on last drawn Salary + Deal Based Incentive
*Work from Office.
*Immediate joiner
AM Infoweb is looking to add bright, focused, resourceful, and highly goal-oriented Sales & Business Development Executives. You will be responsible for supporting sales success by ensuring customers have the products, services, and support to meet their technology and business needs. You will use your potential to build strong relationships with customers over the phone and to sell AM Infoweb's offerings.

We are offering a role in MuleSoft and are considering trainable resources who are keen to learn and work on new technologies.
Our Requirements:
We are looking for candidates with 1-8 years of experience in technical skills like Java, Python, etc., who are willing to learn and move to MuleSoft and related technologies
Ability to work in a fast-paced, demanding, and rapidly changing environment
Preferred:
Basic knowledge of MuleSoft
Responsibilities
Assist in translating business objectives into technical solutions through the use of MuleSoft Anypoint Platform
Coding, testing, debugging, implementing and documenting MuleSoft based flows and integrations
Apply integration design patterns such as message routing, content enrichment, batch processing, error handling and reconciliation mechanisms to deliver required functionality
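The integration design patterns listed above (content enrichment, message routing, error handling) are generic enterprise-integration concepts; MuleSoft implements them as Anypoint flow components, but the underlying logic can be sketched in plain Python. The queue names and the customer-tier lookup table below are illustrative assumptions:

```python
# Sketch of two patterns from the list above: content enrichment
# (add a looked-up field to each message) and content-based routing
# (dispatch the enriched message to a queue by its content).
# Queue names and the lookup table are illustrative assumptions; in
# MuleSoft these would be Anypoint Platform flow components.

CUSTOMER_TIERS = {"c-001": "gold", "c-002": "standard"}  # enrichment source

def enrich(message):
    """Content enrichment: attach the customer's tier to the message."""
    out = dict(message)
    out["tier"] = CUSTOMER_TIERS.get(message["customer_id"], "unknown")
    return out

def route(message, queues):
    """Content-based routing: gold-tier messages go to the priority queue."""
    queue = "priority" if message["tier"] == "gold" else "default"
    queues.setdefault(queue, []).append(message)
    return queue

queues = {}
for msg in [{"customer_id": "c-001", "order": 1},
            {"customer_id": "c-002", "order": 2}]:
    route(enrich(msg), queues)
```

In a MuleSoft flow the same shape appears as a lookup/transform step followed by a choice router, with error handling wrapped around both.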
WHAT WE ARE OFFERING
Learning and Certification
Best in Industry Salary
Health & Wellness Benefits
Employee Rewards Program
Retirement & Savings
Flexible Schedules
Maternity & Paternity Leave


