Threat Modeling Jobs in Bangalore (Bengaluru)

About us:
HappyFox is a software-as-a-service (SaaS) support platform. We offer an enterprise-grade help desk ticketing system and intuitively designed live chat software.
We serve over 12,000 companies in 70+ countries. HappyFox is used by companies that span across education, media, e-commerce, retail, information technology, manufacturing, non-profit, government and many other verticals that have an internal or external support function.
To know more, visit https://www.happyfox.com/
Responsibilities:
- Perform manual and automated application penetration tests and provide suggestions to harden our products
- Participate regularly in the development and release process to identify and report security vulnerabilities in the code being shipped
- Conduct regular audits on all features/APIs of the product and report vulnerabilities to the development team
- Keep up with industry trends in the security space
- Triage inbound vulnerability reports with an appropriate level of urgency and track them until they are resolved by Engineering teams
- Understand the different elements of our Node.js, Python, and similar stacks and provide guidance on secure software development practices to the team
- Scale our application security engineering team
Requirements:
- Strong verbal and written communication skills
- Has worked on web application security testing for a reasonably complex application; mobile application security experience is a plus
- Good knowledge of secure software development guidelines from authoritative bodies like NIST, OWASP, SANS
- Hands-on experience in performing manual/automated security assessments with open-source/commercial security tools (a minimal automated check is sketched below)
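As one flavor of the automated assessments this role involves, here is a minimal sketch that flags missing HTTP security headers. The target URL is a hypothetical placeholder, and a real engagement would lean on purpose-built tools such as OWASP ZAP or Burp Suite.

```python
# Minimal sketch: flag recommended security headers missing from a response.
# The target URL below is a hypothetical placeholder.
import requests

EXPECTED_HEADERS = [
    "Content-Security-Policy",
    "Strict-Transport-Security",
    "X-Content-Type-Options",
    "X-Frame-Options",
]

def check_security_headers(url: str) -> list[str]:
    """Return the recommended security headers missing from `url`."""
    response = requests.get(url, timeout=10)
    return [h for h in EXPECTED_HEADERS if h not in response.headers]

if __name__ == "__main__":
    missing = check_security_headers("https://staging.example.com")  # placeholder target
    for header in missing:
        print(f"Missing recommended header: {header}")
```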

Data Architect/Engineer
Job Summary:
We are seeking an experienced Data Engineer/Architect to join our data and analytics team. The ideal candidate will have a strong background in data engineering, ETL pipeline development, and experience working with one or more data visualization tools (e.g., Power BI, Tableau, Looker). This role will involve designing, building, and maintaining scalable data solutions that empower business decision-making.
Experience: 8 to 12 yrs
Work location: JP Nagar 3rd phase, Bangalore.
Work type: work from office
Key Responsibilities:
- Define and maintain the overall data architecture strategy in line with business goals.
- Design and implement scalable, reliable, and secure data models, data lakes, and data warehouses.
- Design, develop, and maintain robust data pipelines and ETL workflows.
- Work with stakeholders to understand data requirements and translate them into technical solutions.
- Build and manage data models, data marts, and data lakes.
- Collaborate with BI and analytics teams to support dashboards and data visualizations.
- Ensure data quality, performance, and reliability across systems.
- Optimize data processing using modern cloud-based data platforms and tools.
- Support data governance and security best practices.
- Support the development of enterprise dashboards and reporting frameworks using tools like Power BI, Tableau, or Looker.
- Ensure compliance with data security and privacy regulations.
Required Skills & Qualifications:
- 8–12 years of experience in data engineering or related roles.
- Deep understanding of data modeling, database design, and data warehousing concepts.
- Technology evaluation and selection: execute proofs of concept and proofs of value for candidate technology solutions and frameworks.
- Strong knowledge of SQL, Python, and/or Scala.
- Experience with ETL tools (e.g., Apache Airflow, Talend, Informatica, dbt); see the pipeline sketch after this listing.
- Hands-on experience with cloud platforms (AWS, Azure, or GCP) and data services (e.g., Redshift, BigQuery, Snowflake).
- Exposure to one or more data visualization tools like Power BI, Tableau, Looker, or QlikView.
- Familiarity with data modeling, data warehousing, and real-time data streaming.
- Strong problem-solving and communication skills.
Preferred Qualifications:
- Experience working in Agile environments.
- Knowledge of CI/CD for data pipelines.
- Exposure to ML/AI data preparation is a plus.
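To make the ETL-orchestration expectation concrete, below is a minimal sketch of a daily Apache Airflow DAG. The dag_id and the three task callables are hypothetical placeholders, and the `schedule` argument assumes Airflow 2.4+ (older releases use `schedule_interval`).

```python
# Minimal daily ETL DAG sketch; names and callables are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    print("pull raw records from the source system")

def transform(**context):
    print("clean and conform records for the warehouse model")

def load(**context):
    print("upsert conformed records into the data mart")

with DAG(
    dag_id="daily_sales_etl",        # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+; earlier: schedule_interval
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3
```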
Job Summary:
Seeking a seasoned SQL + ETL Developer with 4+ years of experience in managing large-scale datasets and cloud-based data pipelines. The ideal candidate is hands-on with MySQL, PySpark, AWS Glue, and ETL workflows, with proven expertise in AWS migration and performance optimization.
Key Responsibilities:
- Develop and optimize complex SQL queries and stored procedures to handle large datasets (100+ million records).
- Build and maintain scalable ETL pipelines using AWS Glue and PySpark.
- Work on data migration tasks in AWS environments.
- Monitor and improve database performance; automate key performance indicators and reports.
- Collaborate with cross-functional teams to support data integration and delivery requirements.
- Write shell scripts for automation and manage ETL jobs efficiently.
Required Skills:
- Strong experience with MySQL, complex SQL queries, and stored procedures.
- Hands-on experience with AWS Glue, PySpark, and ETL processes (a skeleton Glue job is sketched after this list).
- Good understanding of AWS ecosystem and migration strategies.
- Proficiency in shell scripting.
- Strong communication and collaboration skills.
Nice to Have:
- Working knowledge of Python.
- Experience with AWS RDS.
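As a rough illustration of the Glue/PySpark work described above, here is a skeleton Glue job. The database, table, filter column, and S3 path are hypothetical placeholders, not details from any real pipeline.

```python
# Minimal AWS Glue job skeleton in PySpark; all names are illustrative.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a catalogued source table (database/table names are placeholders)
source = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="orders"
)

# Example transformation: keep only completed orders
completed = source.filter(lambda row: row["status"] == "COMPLETED")

# Write the result back to S3 as Parquet (bucket is a placeholder)
glue_context.write_dynamic_frame.from_options(
    frame=completed,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/orders/"},
    format="parquet",
)
job.commit()
```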

Responsibilities:
- Work closely with Product Managers to drive product improvements through data-driven decisions.
- Conduct analysis to determine new project pilot settings, new features, user behavior, and in-app behavior.
- Present insights and recommendations to leadership using high-quality visualizations and concise messaging.
- Own the implementation of data collection and tracking, and coordinate with the engineering and product team.
- Create and maintain dashboards for product and business teams.
- Lead and own the analysis of highly complex data sources, identifying trends and patterns in the data, and provide insights/recommendations based on analysis results.
- Track feature success of new launches, and set systems to identify causation of change in metrics post every release.
- Deep dive on experiment results and present insights to support subsequent product decisions (a minimal significance check is sketched below).
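As a toy illustration of reading out an experiment, the sketch below computes a two-proportion z-score for conversion counts between two variants; the counts and the 1.96 threshold are illustrative assumptions, not figures from any real launch.

```python
# Minimal A/B readout sketch; all numbers are illustrative.
from math import sqrt

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """z-score for the difference in conversion rates between two variants."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

z = two_proportion_z(conv_a=480, n_a=10_000, conv_b=540, n_b=10_000)
print(f"z = {z:.2f}")  # |z| > 1.96 suggests significance at the 5% level
```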

Requirements:
- 3+ years of experience in Java development.
- Strong Java basics
- Linux
- Spring Boot or Spring MVC
- Hands-on experience in Relational Databases (SQL query or Hibernate) + Mongo (JSON parsing)
- Proficient in REST API development
- Messaging queue (RabbitMQ or Kafka)
- Microservices
- Java 8
- Any caching mechanism (an LRU sketch follows this list)
- Good at problem-solving
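The listing is Java-centric, but the caching requirement is language-agnostic; here is a minimal LRU cache sketch in Python (the capacity and keys are illustrative) showing the eviction behavior any such mechanism provides.

```python
# Minimal LRU cache sketch; capacity and keys are illustrative.
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self._store: OrderedDict[str, str] = OrderedDict()

    def get(self, key: str) -> str | None:
        if key not in self._store:
            return None
        self._store.move_to_end(key)         # mark as most recently used
        return self._store[key]

    def put(self, key: str, value: str) -> None:
        if key in self._store:
            self._store.move_to_end(key)
        self._store[key] = value
        if len(self._store) > self.capacity:
            self._store.popitem(last=False)  # evict least recently used

cache = LRUCache(capacity=2)
cache.put("user:1", "alice")
cache.put("user:2", "bob")
cache.get("user:1")            # touch user:1 so user:2 becomes LRU
cache.put("user:3", "carol")   # evicts user:2
print(cache.get("user:2"))     # None
```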
Good to Have Skills:
- 3+ years of experience using Java/J2EE tech stacks
- Good understanding of data structures and algorithms.
- Excellent analytical and problem-solving skills.
- Ability to work in a fast-paced internet start-up environment.
- Experience in technical mentorship/coaching is highly desirable.
- Understanding AI/ML algorithms is a plus.
Looking for a strong candidate with good technical skills and communication.
- Must have a minimum of 3 years of experience in domestic/international IT or non-IT staffing.
- Should have a minimum of 3 years of experience working on contract-hire and permanent positions.
- Must have 1 to 2 years of experience in product hiring.
We need a recruiter with strong sourcing capability to handle volume hiring. Should have experience in services and in technical or non-technical hiring.
- Ready to join immediately.

- Building and operationalizing large-scale enterprise data solutions and applications using one or more Azure data and analytics services in combination with custom solutions: Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, Cosmos DB, Event Hub/IoT Hub.
- Experience in migrating on-premises data warehouses to data platforms on the Azure cloud.
- Designing and implementing data engineering, ingestion, and transformation functions.
- Azure Synapse or Azure SQL Data Warehouse
- Spark on Azure, available in HDInsight and Databricks (see the PySpark sketch below)
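As a concrete flavor of the Spark-on-Azure work, here is a minimal PySpark sketch of an ingestion/transformation step as it might run in a Databricks notebook; the storage account, container paths, and column names (event_timestamp, event_type) are hypothetical placeholders.

```python
# Minimal PySpark ingestion/transformation sketch; paths and columns are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("adls-ingest").getOrCreate()

# Read raw events from Azure Data Lake Storage Gen2 (abfss path is a placeholder)
raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/events/")

# Simple transformation: daily counts per event type
daily = (
    raw.withColumn("event_date", F.to_date("event_timestamp"))
       .groupBy("event_date", "event_type")
       .count()
)

# Land the curated output as Parquet for the warehouse/Synapse layer
daily.write.mode("overwrite").parquet(
    "abfss://curated@examplelake.dfs.core.windows.net/daily_event_counts/"
)
```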
- Understand the overall requirements of the product/platform and specifically for the assigned feature/module.
- Design and Develop Software as per the product architecture and requirements.
- Code and unit testing of the feature/modules including appropriate design reviews and code inspections.
- Ensure adherence to the software development processes.
- Investigate software/system problems to isolate the root cause and provide innovative solutions.
- Collaborate with cross-functional teams (locally & globally) to ensure product releases meet quality, performance, scalability, reliability, and schedule goals.
Qualifications
- Bachelor’s or Master’s degree in Computer Science, Software Engineering or Information Technology.
- 5-8+ years of experience in software development using J2EE technologies
- Experience with Core Java, JEE5 (JSP/JMS/Web Services/Servlets), Spring, Hibernate, REST, JBoss/Tomcat servers
- 1+ years of experience using Azure IOT technologies
- Work experience in the Azure IoT Suite: IoT Hub, Azure TSI, DPS, Service Bus, Azure Functions, Azure Key Vault, MCI/AKS, Azure AD (a device telemetry sketch follows this listing)
- Knowledge in Azure Data Lake, Azure SQL
- Strong understanding of object-oriented programming
- Knowledge of ReactJS and jQuery is a plus
- Good knowledge of SQL is preferable
- Must have worked in an agile/scrum environment
- Good understanding of software development processes, preferably for a regulated medical devices environment (ISO 13485/FDA)
- Ability to work with cross-functional and remote teams.
- Flexibility to work on different areas of the product/platform
- Good written and verbal English communication
- Ability to travel as needed (minimal)
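To ground the IoT Hub requirement above, below is a minimal device-to-cloud telemetry sketch using the azure-iot-device Python SDK; the connection string, device ID, and payload fields are hypothetical placeholders.

```python
# Minimal device-to-cloud telemetry sketch; connection string and payload are placeholders.
import json
import time

from azure.iot.device import IoTHubDeviceClient, Message

CONNECTION_STRING = "HostName=example-hub.azure-devices.net;DeviceId=dev-01;SharedAccessKey=..."

client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
client.connect()

for reading in range(3):
    payload = {"temperature_c": 21.5 + reading, "ts": time.time()}
    msg = Message(json.dumps(payload))
    msg.content_type = "application/json"
    client.send_message(msg)  # lands on IoT Hub's built-in event endpoint

client.shutdown()
```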

