11+ OWASP Jobs in Hyderabad | OWASP Job openings in Hyderabad
Apply to 11+ OWASP jobs in Hyderabad on CutShort.io. Explore the latest OWASP job openings across top companies like Google, Amazon & Adobe.
- 3+ years of experience in cybersecurity, with a focus on application and cloud security.
- Proficiency in security tools such as Burp Suite, Metasploit, Nessus, OWASP ZAP, and SonarQube.
- Familiarity with data privacy regulations (GDPR, CCPA) and best practices.
- Basic knowledge of AI/ML security frameworks and tools.
API Developer (.NET Core 8/9)
Location: Hyderabad/Vijayawada- India
Navitas is seeking a Senior API Developer (.NET Core 8/9) to join our development team in building robust, high-performance microservices and APIs. You will play a key role in designing scalable, secure, and maintainable backend services that power our web and mobile applications. In this role, you will collaborate with product managers, front-end developers, and DevOps engineers to deliver seamless digital experiences and ensure smooth partner integration. This is a mission-critical position that contributes directly to our organization’s digital transformation initiatives.
Responsibilities will include but are not limited to:
- Microservices & API Development: Design, develop, and maintain RESTful APIs and microservices using .NET Core 8/9 and ASP.NET Core Web API.
- API Design & Documentation: Create secure, versioned, and well-documented endpoints for internal and external consumption.
- Asynchronous Processing: Build and manage background jobs and message-driven workflows using Azure Service Bus and Azure Storage Queues.
- Authentication & Security: Implement OAuth 2.0, JWT, and Azure AD for securing APIs; enforce best practices for secure coding (a minimal token-validation sketch follows this list).
- Caching Integration: Enhance performance through caching mechanisms (Redis, in-memory caching).
- Performance Optimization: Profile APIs and database queries to identify bottlenecks; tune services for speed, scalability, and resilience.
- Clean Code & Architecture: Follow SOLID principles, Clean Architecture, and domain-driven design to write modular, testable code.
- Technical Collaboration: Participate in Agile development processes; collaborate with cross-functional teams to plan and deliver solutions.
- Troubleshooting & Maintenance: Use debugging tools and logging strategies to maintain uptime and resolve production issues.
- Documentation: Maintain clear, accessible technical documentation for services, endpoints, and integration requirements.
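The role itself is .NET-centric, but the bearer-token flow named in the Authentication & Security item is language-agnostic. Purely as a hedged illustration of that flow, here is a minimal Python sketch using the PyJWT package; the issuer, audience, and key values are hypothetical placeholders, not anything specified by the posting.

```python
# Illustrative sketch only: verifying an OAuth 2.0 bearer token (JWT) before
# serving a request. Assumes the PyJWT package; all values are placeholders.
import jwt  # PyJWT

EXPECTED_ISSUER = "https://login.example.com/v2.0"  # hypothetical issuer
EXPECTED_AUDIENCE = "api://example-api"             # hypothetical audience

def validate_bearer_token(token: str, public_key: str) -> dict:
    """Decode and verify a JWT; raises jwt.InvalidTokenError on any failure."""
    return jwt.decode(
        token,
        public_key,
        algorithms=["RS256"],  # pin the algorithm to reject alg-confusion tokens
        issuer=EXPECTED_ISSUER,
        audience=EXPECTED_AUDIENCE,
    )
```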
What You’ll Need:
- Bachelor’s degree in Computer Science, Information Systems, or a related technical field.
- 8+ years of backend development experience using .NET Core (.NET 6+ preferred; experience with .NET 8/9 strongly desired).
- Strong understanding of RESTful API design, versioning, and integration.
- Experience with Clean Architecture and Domain-Driven Design (DDD).
- Deep knowledge of SOLID principles, design patterns, and reusable code practices.
- Skilled in SQL Server, including schema design, query tuning, and optimization.
- Proficiency in Entity Framework Core and Dapper for data access.
- Familiarity with API security standards (OAuth2.0, JWT, API keys).
- Experience writing unit/integration tests using xUnit, Moq, or similar frameworks.
- Basic experience with Azure services, including message queues and storage.
- Proficiency with Git, Agile workflows, and collaboration tools.
- Strong communication and problem-solving skills.
Set Yourself Apart With:
- Hands-on experience with Azure components (e.g., Service Bus, Functions, App Services, AKS).
- Experience with Azure Application Insights, Datadog, or other observability tools.
- Familiarity with Docker, containerization, and CI/CD pipelines.
- Performance testing and load testing experience.
- Familiarity with Postman, Swagger/OpenAPI, and other dev/test tools.
- Exposure to Agile/Scrum methodologies and sprint planning processes.
Equal Employer/Veterans/Disabled
Navitas Business Consulting is an affirmative action and equal opportunity employer. If reasonable accommodation is needed to participate in the job application or interview process, to perform essential job functions, and/or to receive other benefits and privileges of employment, please contact Navitas Human Resources.
Navitas is an equal opportunity employer. We provide employment and opportunities for advancement, compensation, training, and growth according to individual merit, without regard to race, color, religion, sex (including pregnancy), national origin, sexual orientation, gender identity or expression, marital status, age, genetic information, disability, veteran or military status, or any other characteristic protected under applicable Federal, state, or local law. Our goal is for each staff member to have the opportunity to grow to the limits of their abilities and to achieve personal and organizational objectives. We will support positive programs for equal treatment of all staff and full utilization of all qualified employees at all levels within Navitas.
Company: Snaptics
Job title: Senior Fullstack Developer
Job type: Full time
Location: Hyderabad
Work schedule: 6 days (weekend off)
Shift timings: 10:00am-6:30pm
Eligibility: Any degree
Experience: 2-4 years
Requirements:
Solid expertise in CMS platforms, PHP, WordPress, HTML5, CSS3, JavaScript (ES6+), and React.js.
2+ years of experience in web development
Proficiency in TypeScript and experience with popular React State Management solutions like Redux or Context API.
Experience with responsive design and cross-browser compatibility.
Excellent design skills with an eye for aesthetics and usability.
Strong understanding of UI/UX design principles and the ability to transform complex concepts into intuitive, accessible, and visually appealing interfaces.
Strong communication and collaboration skills.
Experience with version control systems like Git.
Familiarity with RESTful APIs and integration with backend services.
Excellent problem-solving abilities with an aptitude for learning and adapting to new technologies.
CORE RESPONSIBILITIES
- Create and manage cloud resources in AWS
- Data ingestion from different sources that expose data through different technologies, such as RDBMS, REST/HTTP APIs, flat files, streams, and time-series data from various proprietary systems; implement ingestion and processing with Big Data technologies (see the sketch after this list)
- Data processing/transformation using technologies such as Spark and cloud services; understand your part of the business logic and implement it in the language supported by the underlying data platform
- Develop automated data quality checks to ensure the right data enters the platform and to verify the results of calculations
- Develop an infrastructure to collect, transform, combine and publish/distribute customer data.
- Define process improvement opportunities to optimize data collection, insights and displays.
- Ensure data and results are accessible, scalable, efficient, accurate, complete and flexible
- Identify and interpret trends and patterns from complex data sets
- Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders.
- Key participant in regular Scrum ceremonies with the agile teams
- Proficient at developing queries, writing reports, and presenting findings
- Mentor junior members and bring best industry practices
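As a loose illustration of the ingestion and automated quality-check responsibilities above, here is a minimal PySpark sketch. The bucket paths and the customer_id column are hypothetical; real sources (RDBMS, REST APIs, streams) would each need their own readers.

```python
# Minimal sketch, assuming PySpark; paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("customer-ingestion").getOrCreate()

# Ingest a flat-file source (one of several source types named above).
raw = spark.read.option("header", "true").csv("s3://example-bucket/raw/customers/")

# Automated data quality check: abort the load if required keys are missing.
null_ids = raw.filter(F.col("customer_id").isNull()).count()
if null_ids > 0:
    raise ValueError(f"{null_ids} rows missing customer_id; aborting load")

# Transform, then publish for downstream consumers.
cleaned = raw.withColumn("ingested_at", F.current_timestamp())
cleaned.write.mode("append").parquet("s3://example-bucket/curated/customers/")
```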
QUALIFICATIONS
- 5-7+ years of experience as a data engineer in consumer finance or an equivalent industry (consumer loans, collections, servicing, optional products, and insurance sales)
- Strong background in math, statistics, computer science, data science or related discipline
- Advanced knowledge of at least one language: Java, Scala, Python, or C#
- Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie / Airflow, Amazon Web Services (AWS), Docker / Kubernetes, Snowflake
- Proficient with:
  - Data mining/programming tools (e.g., SAS, SQL, R, Python)
  - Database technologies (e.g., PostgreSQL, Redshift, Snowflake, and Greenplum)
  - Data visualization tools (e.g., Tableau, Looker, MicroStrategy)
- Comfortable learning about and deploying new technologies and tools.
- Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines.
- Good written and oral communication skills and ability to present results to non-technical audiences
- Knowledge of business intelligence and analytical tools, technologies and techniques.
Mandatory Requirements
- Experience in AWS Glue
- Experience in Apache Parquet
- Proficient in AWS S3 and data lake
- Knowledge of Snowflake
- Understanding of file-based ingestion best practices.
- Scripting languages: Python and PySpark (a minimal Glue job sketch follows this list)
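A minimal AWS Glue job sketch tying these mandatory skills together (Glue, Parquet, S3, PySpark). The bucket paths are hypothetical, and this follows the standard Glue job boilerplate rather than any pipeline specific to this role.

```python
# Minimal AWS Glue job sketch; bucket paths are hypothetical.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read Parquet files from the S3 data lake into a DynamicFrame.
source = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://example-lake/raw/orders/"]},
    format="parquet",
)

# Write to a curated zone, still as Parquet.
glue_context.write_dynamic_frame.from_options(
    frame=source,
    connection_type="s3",
    connection_options={"path": "s3://example-lake/curated/orders/"},
    format="parquet",
)
job.commit()
```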
Data Scientist
Job Title: Data Scientist – Data and Artificial Intelligence
Location: Hyderabad
Job Type: Full-time
Company Description:
Qylis is a leading provider of innovative IT solutions, specializing in Cloud, Data & AI, and Cyber Security. We help businesses unlock the full potential of these technologies to achieve their goals and gain a competitive edge. Our unique approach focuses on delivering value through bespoke solutions tailored to customer-specific needs. We are driven by a customer-centric mindset and committed to delivering continuous value through intellectual property accelerators and automation. Our team of experts is passionate about technology and dedicated to making a positive impact. We foster an environment of growth and innovation, constantly pushing the boundaries to deliver exceptional results.
Website: www.qylis.com | LinkedIn: www.linkedin.com/company/qylis
Job Summary
We are an engineering organization that collaborates directly with clients to address challenges using the latest technologies. Our focus is on joint code development with clients' engineers for cloud-based solutions, accelerating organizational progress. Working with product teams, partners, and open-source communities, we contribute to open source and strive to improve the platform. This role involves creating impactful solution patterns and open-source assets. As a team member, you'll collaborate with engineers from both teams, applying your skills and creativity to solve complex challenges and contribute to open source, while fostering professional growth.
Responsibilities
- Research and develop production-grade models (forecasting, anomaly detection, optimization, clustering, etc.) for the global cloud business using statistical and machine learning techniques (a small forecasting sketch follows this list).
- Manage large volumes of data, and create new and improved solutions for data collection, management, analysis, and data science model development.
- Drive the onboarding of new data and the refinement of existing data sources through feature engineering and feature selection.
- Apply statistical concepts and cutting-edge machine learning techniques to analyze cloud demand and optimize data science model code for distributed computing platforms and task automation.
- Work closely with other data scientists and data engineers to deploy models that drive cloud infrastructure capacity planning.
- Present analytical findings and business insights to project managers, stakeholders, and senior leadership; keep abreast of new statistical and machine learning techniques and implement them as appropriate to improve predictive performance.
- Oversee the analysis of data and lead the team in identifying trends, patterns, correlations, and insights to develop new forecasting models and improve existing ones.
- Lead collaboration across the team and leverage data to identify pockets of opportunity where state-of-the-art algorithms can improve the solution to a business problem.
- Consistently leverage knowledge of techniques to optimize analysis using algorithms.
- Modify statistical analysis tools for evaluating machine learning models. Solve deep and challenging problems in ambiguous circumstances, such as when model predictions are incorrect, when models do not match the training data or the design outcomes, when the data is not clean, when it is unclear which analyses to run, and when the process is ambiguous.
- Provide coaching to team members on business context, interpretation, and the implications of findings. Interpret findings and their implications for multiple businesses, and champion methodological rigour by calling attention to the limitations of knowledge wherever biases in data, methods, or analysis exist.
- Generate and leverage insights that inform future studies and reframe the research agenda. Inform current business decisions by implementing and adapting supply-chain strategies through complex business intelligence.
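As a small, hedged sketch of the forecasting work named in the first responsibility, here is one way a demand forecast might look with statsmodels (listed in the qualifications below). The monthly demand series is synthetic and purely illustrative.

```python
# Minimal forecasting sketch, assuming pandas and statsmodels; data is synthetic.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Synthetic monthly cloud-demand series (trend plus noise), for illustration only.
idx = pd.date_range("2022-01-01", periods=36, freq="MS")
rng = np.random.default_rng(0)
demand = pd.Series(100 + 2.5 * np.arange(36) + rng.normal(0, 3, 36), index=idx)

# Fit an additive-trend exponential smoothing model; forecast six months ahead.
model = ExponentialSmoothing(demand, trend="add", seasonal=None).fit()
print(model.forecast(6))
```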
Qualifications
- M.Sc. in Statistics, Applied Mathematics, Applied Economics, Computer Science or Engineering, Data Science, Operations Research, or a similar applied quantitative field.
- 7+ years of industry experience developing production-grade statistical and machine learning code in a collaborative team environment.
- Experience in machine learning using R or Python (scikit-learn / NumPy / pandas / statsmodels), with skill level at or near fluency.
- Prior experience working on computer vision projects is a plus.
- Knowledge of AWS and Azure cloud platforms.
- Prior experience in time series forecasting.
- Prior experience with typical data management systems and tools such as SQL.
- Knowledge of, and ability to work within, a large-scale computing or big data context, with hands-on experience in Hadoop, Spark, Databricks, or similar.
- Excellent analytical skills; ability to understand business needs and translate them into technical solutions, including analysis specifications and models.
- Experience with deep learning frameworks (e.g., TensorFlow, PyTorch, CNTK) and solid knowledge of theory and practice.
- Practical and professional experience contributing to and maintaining a large code base with code versioning systems such as Git.
- Creative thinking skills, with an emphasis on developing innovative methods to solve hard problems under ambiguity.
- Good interpersonal and communication (verbal and written) skills, including the ability to write concise and accurate technical documentation and communicate technical ideas to non-technical audiences.
Job responsibilities
- Performs development, deployment, administration, management, configuration, testing, and integration tasks related to the cloud security platforms.
- Develops automated security and compliance capabilities in support of DevOps processes in a large-scale computing environment for the storage program within the firm (a minimal compliance-check sketch follows this list).
- Champions a DevOps security model so that security is automated and elastic across all platforms and cultivate a cloud first mindset in transitioning workloads.
- Leverages DevOps tools to build, harden, maintain, and instrument a comprehensive security orchestration platform for infrastructure as code.
- Provides support to drive the maturity of the Cybersecurity software development lifecycle and develop & improve the quality of technical engineering documentation.
- Makes decisions of a global, strategic nature by analyzing complex data systems and incorporating knowledge of other lines of business & JPMC standards.
- Provides quality control of engineering deliverables, technical consultation to product management and technical interface between development and operations teams.
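As a hedged illustration of the automated security and compliance checks described above, here is a minimal Python sketch using boto3 (Python is one of the languages listed below). The chosen policy, flagging S3 buckets without default encryption, is an illustrative assumption, not a control specified by the posting; a real check would run on a schedule and feed a reporting pipeline.

```python
# Minimal compliance-check sketch, assuming boto3 and AWS credentials in the
# environment. The specific policy checked here is an illustrative assumption.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

def unencrypted_buckets() -> list[str]:
    """Return names of buckets with no default server-side encryption configured."""
    flagged = []
    for bucket in s3.list_buckets()["Buckets"]:
        try:
            s3.get_bucket_encryption(Bucket=bucket["Name"])
        except ClientError as err:
            code = err.response["Error"]["Code"]
            if code == "ServerSideEncryptionConfigurationNotFoundError":
                flagged.append(bucket["Name"])
            else:
                raise
    return flagged

if __name__ == "__main__":
    print(unencrypted_buckets())
```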
Required qualifications, capabilities, and skills
- Formal training or certification in security engineering and 3+ years of applied experience
- Proficiency in programming languages like Python or Java with strong coding skills
- Understanding of one or more public cloud platforms (AWS, GCP, Azure)
- Experience with highly scalable systems, release management, software configuration, design, development, and implementation is required
- Ability to analyze complex data systems (failure analysis / root cause analysis) and to develop, improve, and maintain technical engineering documentation
Role: Python-Django Developer
Location: Noida, India
Description:
- Develop web applications using Python and Django (a minimal sketch follows this list).
- Write clean and maintainable code following best practices and coding standards.
- Collaborate with other developers and stakeholders to design and implement new features.
- Participate in code reviews and maintain code quality.
- Troubleshoot and debug issues as they arise.
- Optimize applications for maximum speed and scalability.
- Stay up-to-date with emerging trends and technologies in web development.
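A minimal sketch of the kind of Django work described above: a model, a simple JSON view, and a URL route. The names (Article, articles/) are hypothetical; in a real project these pieces live in an installed app's models.py, views.py, and urls.py.

```python
# Minimal Django sketch; names are hypothetical. In a real project these parts
# live in separate files (models.py, views.py, urls.py) inside an installed app.
from django.db import models
from django.http import JsonResponse
from django.urls import path

class Article(models.Model):
    """A tiny example model."""
    title = models.CharField(max_length=200)
    created_at = models.DateTimeField(auto_now_add=True)

def article_list(request):
    """A simple RESTful read endpoint returning all articles as JSON."""
    data = list(Article.objects.values("id", "title", "created_at"))
    return JsonResponse(data, safe=False)

urlpatterns = [
    path("articles/", article_list),
]
```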
Requirements:
- Bachelor's or Master's degree in Computer Science, Computer Engineering or a related field.
- 4+ years of experience in web development using Python and Django.
- Strong knowledge of object-oriented programming and design patterns.
- Experience with front-end technologies such as HTML, CSS, and JavaScript.
- Understanding of RESTful web services.
- Familiarity with database technologies such as PostgreSQL or MySQL.
- Experience with version control systems such as Git.
- Ability to work in a team environment and communicate effectively with team members.
- Strong problem-solving and analytical skills.
AWS Glue Developer
Work Experience: 6 to 8 Years
Work Location: Noida, Bangalore, Chennai & Hyderabad
Must Have Skills: AWS Glue, DMS, SQL, Python, PySpark, data integration, and DataOps
Job Reference ID: BT/F21/IND
Job Description:
Design, build and configure applications to meet business process and application requirements.
Responsibilities:
7 years of work experience with ETL, data modelling, and data architecture. Proficient in ETL optimization, designing, coding, and tuning big data processes using PySpark. Extensive experience building data platforms on AWS using core AWS services (Step Functions, EMR, Lambda, Glue, Athena, Redshift, Postgres, RDS, etc.) and designing/developing data engineering solutions. Orchestration using Airflow.
Technical Experience:
Hands-on experience developing a data platform and its components: data lake, cloud data warehouse, APIs, and batch and streaming data pipelines. Experience building data pipelines and applications to stream and process large datasets at low latency.
- Enhancements, new development, defect resolution, and production support of big data ETL development using AWS native services.
- Create data pipeline architecture by designing and implementing data ingestion solutions.
- Integrate data sets using AWS services such as Glue, Lambda functions, and Airflow (an Airflow orchestration sketch follows this list).
- Design and optimize data models on AWS Cloud using AWS data stores such as Redshift, RDS, S3, and Athena.
- Author ETL processes using Python and PySpark.
- Build Redshift Spectrum direct transformations and data modelling using data in S3.
- Monitor ETL processes using CloudWatch events.
- Work in collaboration with other teams; good communication is a must.
- Must have experience using AWS service APIs, the AWS CLI, and SDKs.
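As a small illustration of the Airflow orchestration mentioned above, here is a minimal DAG sketch using the AWS provider package (apache-airflow-providers-amazon). The Glue job name and region are hypothetical, and the schedule syntax assumes Airflow 2.4+.

```python
# Minimal Airflow DAG sketch (Airflow 2.4+); job name and region are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.glue import GlueJobOperator

with DAG(
    dag_id="daily_orders_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Trigger an existing (hypothetical) Glue job once per day.
    run_glue_job = GlueJobOperator(
        task_id="run_orders_glue_job",
        job_name="orders-etl",
        region_name="us-east-1",
    )
```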
Professional Attributes:
- Experience operating very large data warehouses or data lakes; expert-level skills in writing and optimizing SQL; extensive real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technology.
- Must have 6+ years of big data ETL experience using Python, S3, Lambda, DynamoDB, Athena, and Glue in an AWS environment.
- Expertise in S3, RDS, Redshift, Kinesis, and EC2 clusters is highly desired.
Qualification:
- Degree in Computer Science, Computer Engineering, or equivalent.
Salary: Commensurate with experience and demonstrated competence
About Apexon:
Apexon is a digital-first technology services firm specializing in accelerating business transformation and delivering human-centric digital experiences. For over 17 years, Apexon has been meeting customers wherever they are in the digital lifecycle and helping them outperform their competition through speed and innovation. Our reputation is built on a comprehensive suite of engineering services, a dedication to solving our clients’ toughest technology problems, and a commitment to continuous improvement. We focus on three broad areas of digital services: User Experience (UI/UX, Commerce); Engineering (QE/Automation, Cloud, Product/Platform); and Data (Foundation, Analytics, and AI/ML), and have deep expertise in BFSI, healthcare, and life sciences.
Apexon is backed by Goldman Sachs Asset Management and Everstone Capital.
To know more about us please visit: https://www.apexon.com/
About the role:
- Experience: 7+ years of experience building modern web applications, working across the full stack
- Proficiency in TypeScript and JavaScript with a thorough understanding of React.js or Vue.js and their core principles preferred
- Implementing RESTful APIs using .NET/C# preferred, with experience in .NET Core and .NET 5 or 6 a bonus!
- Experience with SQL and relational database design; MS SQL Server experience is an added advantage
- Experience with NoSQL/document database technologies
- Experience writing automated unit tests in the full stack environment
- Knowledge of modern authentication and authorization mechanisms
- Familiarity with modern front-end and backend build pipelines and tools
- Experience with common front-end development tools such as Babel, Webpack, NPM, etc.
- Experience with modern responsive web application design and development
- Familiarity with Node.js
- Experience with microservice architecture
- Experience using Git version control
- Experience with VS Code, Visual Studio, or other relevant development tools
- Familiarity with Scrum/Agile principles
- Strong communication skills
- Ability to understand business requirements and translate them into technical requirements.
Required skill set:
- Candidates must be strong in JavaScript and have experience in at least one modern JavaScript framework such as Vue, Angular, or React, but must be willing to work in Vue/React.
- Must have experience with the .NET Framework; experience with .NET Core/.NET 5/6/7 is good to have, but candidates must be willing and able to learn .NET Core.
- Should be able to work independently with minimal supervision.
- Must be strong in programming concepts such as OOP, unit testing, Web API, SQL, etc.
Desired Candidate Profile
- A team focus with strong collaboration and communication skills
- Exceptional ability to quickly grasp high-level business goals, derive requirements, and translate them into effective technical solutions
- Exceptional object-oriented thinking, design and programming skills (Java 8 or 11)
- Expertise with the following technologies: data structures, design patterns, code versioning tools (GitHub/Bitbucket/...), XML, JSON, Spring Batch, RESTful services, Spring Cloud, Grafana (knowledge/experience), Kafka, Spring Boot, microservices, DB/NoSQL, Docker, Kubernetes, AWS/GCP, architecture design (patterns), Agile, JIRA.
- Penchant toward self-motivation and continuous improvement; these words should describe you: dedicated, energetic, curious, conscientious, and flexible.