11+ Vulnerability Assessment Jobs in Pune | Vulnerability Assessment Job Openings in Pune
Apply to 11+ Vulnerability Assessment Jobs in Pune on CutShort.io. Explore the latest Vulnerability Assessment job opportunities across top companies like Google, Amazon & Adobe.
As a Security Researcher in SaaS security posture management, your primary responsibility will be to conduct research on emerging security threats and vulnerabilities in SaaS environments and to develop and implement strategies to mitigate those risks. Specifically, your job duties will include:
- Conducting in-depth research on emerging security threats and vulnerabilities in SaaS environments.
- Analyzing data and security logs to identify potential threats and take proactive measures to prevent them.
- Developing and implementing security policies and procedures to protect against security threats in SaaS environments.
- Collaborating with other members of the IT team to implement security measures and ensure compliance with industry standards and regulations.
- Keeping up-to-date with the latest security technologies and trends in SaaS security posture management.
- Communicating findings and recommendations to management and other stakeholders.
- Participating in incident response and resolution activities in the event of a security breach in SaaS environments.
To be successful in this role, you should have a Bachelor's or Master's degree in Computer Science, Information Security, or a related field, and have experience in researching emerging security threats and vulnerabilities in SaaS environments. You should also have strong analytical and problem-solving skills, and hold industry certifications such as CISSP, CEH, or OSCP. Excellent communication and collaboration skills are essential to work effectively with cross-functional teams.
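To give a flavor of the log-analysis duty listed above, below is a minimal, hypothetical sketch that flags SaaS accounts signing in from an unusually large number of source IPs. The JSON-lines log schema, field names, and threshold are illustrative assumptions, not any specific product's format.

```python
# Hypothetical sketch: flag accounts that sign in from many distinct IPs in a day,
# a simple signal worth triaging in SaaS audit logs. The log schema and threshold
# below are illustrative assumptions only.
import json
from collections import defaultdict

IP_THRESHOLD = 5  # assumed cutoff for "unusual" in this sketch


def suspicious_accounts(log_path: str) -> dict[str, set[str]]:
    """Return accounts seen logging in from more than IP_THRESHOLD distinct IPs."""
    ips_by_user: dict[str, set[str]] = defaultdict(set)
    with open(log_path) as fh:
        for line in fh:
            event = json.loads(line)  # assumed JSON-lines audit log
            if event.get("action") == "login.success":
                ips_by_user[event["user"]].add(event["source_ip"])
    return {u: ips for u, ips in ips_by_user.items() if len(ips) > IP_THRESHOLD}


if __name__ == "__main__":
    for user, ips in suspicious_accounts("saas_audit.log").items():
        print(f"{user}: {len(ips)} distinct source IPs")
```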
Job Title : QA Lead (AI/ML Products)
Employment Type : Full Time
Experience : 4 to 8 Years
Location : On-site
Mandatory Skills : Strong hands-on experience in testing AI/ML (LLM, RAG) applications with deep expertise in API testing, SQL/NoSQL database validation, and advanced backend functional testing.
Role Overview :
We are looking for an experienced QA Lead who can own end-to-end quality for AI-influenced products and backend-heavy systems. This role requires strong expertise in advanced functional testing, API validation, database verification, and AI model behavior testing in non-deterministic environments.
Key Responsibilities :
- Define and implement comprehensive test strategies aligned with business and regulatory goals.
- Validate AI/ML and LLM-driven applications, including RAG pipelines, hallucination checks, prompt injection scenarios, and model response validation.
- Perform deep API testing using Postman/cURL and validate JSON/XML payloads.
- Execute complex SQL queries (MySQL/PostgreSQL) and work with MongoDB for backend and data integrity validation.
- Analyze server logs and transactional flows to debug issues and ensure system reliability.
- Conduct risk analysis and report key QA metrics such as defect leakage and release readiness.
- Establish and refine QA processes, templates, standards, and agile testing practices.
- Identify performance bottlenecks and basic security vulnerabilities (e.g., IDOR, data exposure).
- Collaborate closely with developers, product managers, and domain experts to translate business requirements into testable scenarios.
- Own feature quality independently from conception to release.
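To illustrate the API and database validation responsibilities above, here is a minimal, hypothetical sketch of a backend test that checks a JSON response and cross-verifies the record in PostgreSQL. The endpoint, table, credentials, and seeded test record are assumptions for illustration only, not the actual product's API.

```python
# Hypothetical API + database cross-check, in the spirit of the responsibilities above.
# The endpoint URL, table name, credentials, and test record are illustrative assumptions.
import requests
import psycopg2

API_BASE = "https://staging.example.com/api"   # assumed test environment
ORDER_ID = 1042                                # assumed seeded test record


def test_order_api_matches_database():
    # 1. API layer: validate status code and JSON payload structure.
    resp = requests.get(f"{API_BASE}/orders/{ORDER_ID}", timeout=10)
    assert resp.status_code == 200
    body = resp.json()
    assert {"id", "status", "total"} <= body.keys()

    # 2. Data layer: confirm the API response matches what is stored in PostgreSQL.
    conn = psycopg2.connect(host="localhost", dbname="qa", user="qa", password="qa")
    try:
        with conn.cursor() as cur:
            cur.execute("SELECT status, total FROM orders WHERE id = %s", (ORDER_ID,))
            status, total = cur.fetchone()
        assert body["status"] == status
        assert float(body["total"]) == float(total)
    finally:
        conn.close()
```

A test like this would typically run under pytest alongside the wider regression suite.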
Required Skills & Experience :
- 4+ years of hands-on experience in software testing and QA.
- Strong understanding of testing AI/ML products, LLM validation, and non-deterministic behavior testing.
- Expertise in API Testing, server log analysis, and backend validation.
- Proficiency in SQL (MySQL/PostgreSQL) and MongoDB.
- Deep knowledge of SDLC and Bug Life Cycle.
- Strong problem-solving ability and structured approach to ambiguous scenarios.
- Awareness of performance testing and basic security testing practices.
- Excellent communication skills to articulate defects and QA strategies.
What We’re Looking For :
A proactive QA professional who can go beyond UI testing, understands backend systems deeply, and can confidently test modern AI-driven applications while driving quality standards across the team.
Wissen Technology is hiring for Data Engineer
About Wissen Technology: At Wissen Technology, we deliver niche, custom-built products that solve complex business challenges across industries worldwide. Founded in 2015, our core philosophy is built around a strong product engineering mindset—ensuring every solution is architected and delivered right the first time. Today, Wissen Technology has a global footprint with 2000+ employees across offices in the US, UK, UAE, India, and Australia.
Our commitment to excellence translates into delivering 2X impact compared to traditional service providers. How do we achieve this? Through a combination of deep domain knowledge, cutting-edge technology expertise, and a relentless focus on quality. We don’t just meet expectations—we exceed them by ensuring faster time-to-market, reduced rework, and greater alignment with client objectives. We have a proven track record of building mission-critical systems across industries, including financial services, healthcare, retail, manufacturing, and more.
Wissen stands apart through its unique delivery models. Our outcome-based projects ensure predictable costs and timelines, while our agile pods provide clients the flexibility to adapt to their evolving business needs. Wissen leverages its thought leadership and technology prowess to drive superior business outcomes. Our success is powered by top-tier talent. Our mission is clear: to be the partner of choice for building world-class custom products that deliver exceptional impact—the first time, every time.
Job Summary: Wissen Technology is hiring a Data Engineer with expertise in Python, Pandas, Airflow, and Azure Cloud Services. The ideal candidate will have strong communication skills and experience with Kubernetes.
Experience: 4-7 years
Notice Period: Immediate to 15 days
Location: Pune, Mumbai, Bangalore
Mode of Work: Hybrid
Key Responsibilities:
- Develop and maintain data pipelines using Python and Pandas.
- Implement and manage workflows using Airflow.
- Utilize Azure Cloud Services for data storage and processing.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Ensure data quality and integrity throughout the data lifecycle.
- Optimize and scale data infrastructure to meet business needs.
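As a sketch of the first two responsibilities above (Pandas pipelines orchestrated by Airflow), the example below defines a single-task daily DAG. Airflow 2.4+ is assumed, and the file paths and schedule are illustrative assumptions rather than project specifics.

```python
# Minimal sketch of a daily Airflow DAG wrapping a Pandas transformation.
# File paths and schedule are illustrative assumptions (Airflow 2.4+ assumed).
from datetime import datetime

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator


def clean_orders(ds: str, **_) -> None:
    """Read the day's raw extract, drop duplicates and nulls, and write a cleaned file."""
    df = pd.read_csv(f"/data/raw/orders_{ds}.csv")
    df = df.drop_duplicates(subset="order_id").dropna(subset=["amount"])
    df.to_parquet(f"/data/clean/orders_{ds}.parquet", index=False)


with DAG(
    dag_id="orders_daily_clean",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="clean_orders", python_callable=clean_orders)
```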
Qualifications and Required Skills:
- Proficiency in Python (Must Have).
- Strong experience with Pandas (Must Have).
- Expertise in Airflow (Must Have).
- Experience with Azure Cloud Services.
- Good communication skills.
Good to Have Skills:
- Experience with Pyspark.
- Knowledge of Kubernetes.
Wissen Sites:
- Website: http://www.wissen.com
- LinkedIn: https://www.linkedin.com/company/wissen-technology
- Wissen Leadership: https://www.wissen.com/company/leadership-team/
- Wissen Live: https://www.linkedin.com/company/wissen-technology/posts/feedView=All
- Wissen Thought Leadership: https://www.wissen.com/articles/
Key Responsibilities
- Design and implement ETL/ELT pipelines using Databricks, PySpark, and AWS Glue
- Develop and maintain scalable data architectures on AWS (S3, EMR, Lambda, Redshift, RDS)
- Perform data wrangling, cleansing, and transformation using Python and SQL
- Collaborate with data scientists to integrate Generative AI models into analytics workflows
- Build dashboards and reports to visualize insights using tools like Power BI or Tableau
- Ensure data quality, governance, and security across all data assets
- Optimize performance of data pipelines and troubleshoot bottlenecks
- Work closely with stakeholders to understand data requirements and deliver actionable insights
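To make the first responsibility above concrete, here is a minimal PySpark ETL sketch that reads raw CSV data from S3, applies a few cleansing transformations, and writes partitioned Parquet back to S3. The bucket names, columns, and rules are illustrative assumptions; on vanilla Spark outside Databricks the paths would typically use the s3a:// scheme.

```python
# Minimal PySpark ETL sketch: read raw CSV from S3, standardize, write Parquet back to S3.
# Bucket names and column names are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

raw = (
    spark.read.option("header", True)
    .csv("s3://example-raw-bucket/orders/")          # assumed landing zone
)

curated = (
    raw.dropDuplicates(["order_id"])
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount") > 0)
    .withColumn("order_date", F.to_date("order_ts"))
)

(
    curated.write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-curated-bucket/orders/")  # assumed curated zone
)
```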
🧪 Required Skills
- Cloud Platforms: AWS (S3, Lambda, Glue, EMR, Redshift)
- Big Data: Databricks, Apache Spark, PySpark
- Programming: Python, SQL
- Data Engineering: ETL/ELT, Data Lakes, Data Warehousing
- Analytics: Data Modeling, Visualization, BI Reporting
- Gen AI Integration: OpenAI, Hugging Face, LangChain (preferred)
- DevOps (Bonus): Git, Jenkins, Terraform, Docker
📚 Qualifications
- Bachelor's or Master’s degree in Computer Science, Data Science, or related field
- 3+ years of experience in data engineering or data analytics
- Hands-on experience with Databricks, PySpark, and AWS
- Familiarity with Generative AI tools and frameworks is a strong plus
- Strong problem-solving and communication skills
🌟 Preferred Traits
- Analytical mindset with attention to detail
- Passion for data and emerging technologies
- Ability to work independently and in cross-functional teams
- Eagerness to learn and adapt in a fast-paced environment
Job Title: Python Developer (FastAPI)
Experience Required: 4+ years
Location: Pune, Bangalore, Hyderabad, Mumbai, Panchkula, Mohali
Shift: Night Shift, 6:30 PM to 3:30 AM IST
About the Role
We are seeking an experienced Python Developer with strong expertise in FastAPI to join our engineering team. The ideal candidate should have a solid background in backend development, RESTful API design, and scalable application development.
Required Skills & Qualifications
· 4+ years of professional experience in backend development with Python.
· Strong hands-on experience with FastAPI (or Flask/Django with migration experience).
· Familiarity with asynchronous programming in Python.
· Working knowledge of version control systems (Git).
· Good problem-solving and debugging skills.
· Strong communication and collaboration abilities.
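For context on the stack described above, a minimal FastAPI sketch with async endpoints and Pydantic validation follows; the resource, its fields, and the in-memory store are illustrative assumptions (Pydantic v2 is assumed for model_dump).

```python
# Minimal FastAPI sketch: async REST endpoints with request validation.
# The resource and its fields are illustrative assumptions; Pydantic v2 assumed.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Example Service")

# In-memory store standing in for a real database layer.
_items: dict[int, dict] = {}


class Item(BaseModel):
    name: str
    price: float


@app.post("/items/{item_id}", status_code=201)
async def create_item(item_id: int, item: Item) -> dict:
    if item_id in _items:
        raise HTTPException(status_code=409, detail="Item already exists")
    _items[item_id] = item.model_dump()
    return {"id": item_id, **_items[item_id]}


@app.get("/items/{item_id}")
async def read_item(item_id: int) -> dict:
    if item_id not in _items:
        raise HTTPException(status_code=404, detail="Item not found")
    return {"id": item_id, **_items[item_id]}
```

Run locally with, for example, `uvicorn main:app --reload`.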
- Development/technical support experience, preferably in DevOps.
- Looking for an engineer to be part of GitHub Actions support, with experience in CI/CD tools like Bamboo, Harness, Ansible, and Salt scripting.
- Hands-on expertise with GitHub Actions and CI/CD tools like Bamboo and Harness, CI/CD pipeline stages, build tools, SonarQube, Artifactory, NuGet, ProGet, Veracode, LaunchDarkly, GitHub/Bitbucket repos, and monitoring tools.
- Handling xMatters, Techlines, and incidents.
- Strong scripting skills (PowerShell, Python, Bash/Shell) for implementing automation scripts and tools to streamline administrative tasks and improve efficiency.
- An Atlassian Tools Administrator is responsible for managing and maintaining Atlassian products such as Jira, Confluence, Bitbucket, and Bamboo.
- Expertise in Bitbucket and GitHub for version control and collaboration at a global level.
- Good experience with Linux/Windows system activities and databases.
- Aware of SLA and error concepts and their implementations; provide support and participate in incident management and Jira stories. Continuously monitor system performance and availability, and respond to incidents promptly to minimize downtime.
- Well-versed with observability tools such as Splunk for monitoring, alerting, and logging solutions to identify and address potential issues, especially in infrastructure.
- Expert at troubleshooting production issues and bugs: identifying and resolving issues in production environments.
- Experience in providing 24x5 support.
- GitHub Actions
- Atlassian Tools (Bamboo, Bitbucket, Jira, Confluence)
- Build Tools (Maven, Gradle, MSBuild, NodeJS)
- SonarQube, Veracode
- Nexus, JFrog, NuGet, ProGet
- Harness
- Salt Services, Ansible
- PowerShell, Shell scripting
- Splunk
- Linux, Windows
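As an example of the automation scripting this role emphasizes, the sketch below uses the GitHub REST API to list a repository's recent failed GitHub Actions workflow runs. The organization, repository, and token environment variable are illustrative assumptions.

```python
# Sketch: report recent failed GitHub Actions workflow runs for a repository.
# Repository name and token environment variable are illustrative assumptions.
import os

import requests

OWNER = "example-org"        # assumed organization
REPO = "example-service"     # assumed repository
TOKEN = os.environ["GITHUB_TOKEN"]


def failed_runs(limit: int = 10) -> list[dict]:
    """Return the most recent failed workflow runs via the GitHub REST API."""
    resp = requests.get(
        f"https://api.github.com/repos/{OWNER}/{REPO}/actions/runs",
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Accept": "application/vnd.github+json",
        },
        params={"status": "failure", "per_page": limit},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["workflow_runs"]


if __name__ == "__main__":
    for run in failed_runs():
        print(f"{run['name']} #{run['run_number']} failed: {run['html_url']}")
```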
• Hands-on experience in creating frontend applications using React or Angular.
• Highly proficient in both JavaScript and TypeScript.
• Good command over both SQL and NoSQL databases (MongoDB, MySQL).
• Hands-on experience with Git (version control tools).
Responsibilities
- Source candidates using a variety of search methods to build a robust candidate pipeline
- Screen candidates by reviewing resumes and job applications, and performing phone screenings
- Take ownership of candidate experience by designing and managing it
- Develop job postings, job descriptions, and position requirements
- Perform reference checks as needed
- Facilitate the offer process by extending the offer and negotiating employment terms
- Manage onboarding and new hire process
- Stay abreast of recruiting trends and best practices
- Manage the overall interview, selection, and closing process
- Ensure all screening, hiring, and selection is done in accordance with employment laws and regulations
- Assist in Admin and HR related activities
Key Points:
- Quick Learner
- Smart worker
- Responsible & humble
Job Type: Full-time
JD for NodeJS
Mandatory Skills - Nodejs, Javascript, Express.js, MongoDB, Data Structures, Algorithms.
Please find the JD below:-
- Expertise in Node.js web frameworks like Meteor, Express, and Kraken.js
- Expertise in building highly scalable web services using Node.js; creating REST APIs with the help of Node middleware
- Deep understanding of REST and API design
- Experience designing APIs for consistency, simplicity, and extensibility.
- Expertise with JavaScript testing frameworks like Jasmine, QUnit, Mocha, Sinon, and Chai.
- Expertise with build tools like Webpack, Gulp, and Grunt.
- Integration of various application components
- Experience in various phases of Software Development Life Cycle (SDLC) such as requirements analysis, design, and implementation in an agile environment, etc.
About Us :-
Mobile Programming LLC is a US-based digital transformation company. We help enterprises transform ideas into innovative and intelligent solutions spanning the Internet of Things, Digital Commerce, Business Intelligence Analytics, and Cloud Programming. Bring your challenges to us, and we will give you the smartest solutions. From conceptualizing and engineering to advanced manufacturing, we help customers build and scale products fit for the global marketplace.
Mobile programming LLC has offices located in Los Angeles, San Jose, Glendale, San Diego, Phoenix, Plano, New York, Fort Lauderdale and Boston. Mobile programming is SAP Preferred Vendor, Apple Adjunct Partner, Google Empaneled Mobile Vendor and Microsoft Gold Certified Partner.
Moreover, we have a presence in India across 7 locations: Gurgaon, Mohali, Panchkula, Pune, Bangalore, Dehradun & Chennai.
For more information, please visit our website:
https://www.mobileprogramming.com
The Database Developer will perform day-to-day database management, maintenance, and troubleshooting by providing Tier 1 and Tier 2 support for diverse platforms including, but not limited to, MS SQL, Azure SQL, MySQL, PostgreSQL, and Amazon Redshift. They are responsible for maintaining functional/technical support documentation and operational documentation, as well as reporting on performance metrics associated with job activity and platform stability. They must adhere to SLAs pertaining to data movement and provide evidence and supporting documentation for incidents that violate those SLAs. Other responsibilities include API development and integrations via Azure Functions, C#, or Python.
Essential Duties and Responsibilities
• Advanced problem-solving skills
• Excellent communication skills
• Advanced T-SQL scripting skills
• Query optimization and performance tuning; familiarity with traces, execution plans, and server logs
• SSIS package development and support
• PowerShell scripting
• Report visualization via SSRS, Power BI, and/or Jupyter Notebook
• Maintain functional/technical support documentation
• Maintain operational documentation specific to automated jobs and job steps
• Develop, implement and support user defined stored procedures, functions and (indexed) views
• Monitor database activities and provide Tier 1 and Tier 2 production support
• Provide functional and technical support to ensure performance, operation, and stability of database systems
• Manage data ingress and egress
• Track issue and/or project deliverables in Jira
• Assist in RDBMS patching, upgrades and enhancements
• Prepare database reports for managers as needed
• API integrations and development
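To illustrate the API-development side of the role, below is a minimal sketch of an HTTP-triggered Azure Function (Python v2 programming model) that runs a parameterized T-SQL query through pyodbc. The route, table, and connection-string setting name are illustrative assumptions, not details of any specific system.

```python
# Sketch: HTTP-triggered Azure Function (Python v2 model) running a T-SQL query via pyodbc.
# Connection-string setting name, table, and route are illustrative assumptions.
import json
import os

import azure.functions as func
import pyodbc

app = func.FunctionApp(http_auth_level=func.AuthLevel.FUNCTION)


@app.route(route="orders/{order_id:int}", methods=["GET"])
def get_order(req: func.HttpRequest) -> func.HttpResponse:
    order_id = int(req.route_params["order_id"])
    conn = pyodbc.connect(os.environ["SQL_CONNECTION_STRING"])  # assumed app setting
    try:
        cursor = conn.cursor()
        cursor.execute(
            "SELECT OrderId, Status, Total FROM dbo.Orders WHERE OrderId = ?", order_id
        )
        row = cursor.fetchone()
    finally:
        conn.close()

    if row is None:
        return func.HttpResponse("Not found", status_code=404)
    body = {"orderId": row.OrderId, "status": row.Status, "total": float(row.Total)}
    return func.HttpResponse(json.dumps(body), mimetype="application/json")
```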
Background/Experience
• Bachelor's or advanced degree in Computer Science
• Microsoft SQL Server 2016 or higher
• Working knowledge of MySQL, PostgreSQL and/or Amazon Redshift
• C# and/or Python
Supervisory/Budget Responsibility
• No Supervisory Responsibility/No Budget Responsibility
Level of Authority to Make Decisions
The Database Developers expedite issue resolution pursuant to the functional/technical documentation available. Issue escalation is at their discretion and should result in additional functional/technical documentation for future reference. However, individual problem solving, decision making, and performance tuning will constitute 75% of their time.






