11+ Data security Jobs in Hyderabad | Data security Job openings in Hyderabad
What You Will Do:
As a Data Governance Lead at Kanerika, you will be responsible for defining, leading, and operationalizing the data governance framework, ensuring enterprise-wide alignment and regulatory compliance.
Required Qualifications:
- 7+ years of experience in data governance and data management.
- Proficient in Microsoft Purview and Informatica data governance tools.
- Strong in metadata management, lineage mapping, classification, and security.
- Experience with ADF, REST APIs, Talend, dbt, and automation via Azure tools.
- Knowledge of GDPR, CCPA, HIPAA, SOX and related compliance needs.
- Skilled in bridging technical governance with business and compliance goals.
Tools & Technologies:
- Microsoft Purview, Collibra, Atlan, Informatica Axon, IBM IG Catalog
- Microsoft Purview capabilities:
1. Label creation & policy setup
2. Auto-labeling & DLP
3. Compliance Manager, Insider Risk, Records & Lifecycle Management
4. Unified Catalog, eDiscovery, Data Map, Audit, Compliance alerts, DSPM.
Key Responsibilities:
1. Governance Strategy & Stakeholder Alignment:
- Develop and maintain enterprise data governance strategies, policies, and standards.
- Align governance with business goals: compliance, analytics, and decision-making.
- Collaborate across business, IT, legal, and compliance teams for role alignment.
- Drive governance training, awareness, and change management programs.
2. Microsoft Purview Administration & Implementation:
- Manage Microsoft Purview accounts, collections, and RBAC aligned to org structure.
- Optimize Purview setup for large-scale environments (50TB+).
- Integrate with Azure Data Lake, Synapse, SQL DB, Power BI, Snowflake.
- Schedule scans, set classification jobs, and maintain collection hierarchies.
3. Metadata & Lineage Management:
- Design metadata repositories and maintain business glossaries and data dictionaries.
- Implement ingestion workflows via ADF, REST APIs, PowerShell, Azure Functions.
- Ensure lineage mapping (ADF → Synapse → Power BI) and impact analysis.
4. Data Classification & Security Governance:
- Define classification rules and sensitivity labels (PII, PCI, PHI).
- Integrate with MIP, DLP, Insider Risk Management, and Compliance Manager.
- Enforce records management, lifecycle policies, and information barriers.
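To make the classification work above concrete, here is a minimal, framework-agnostic sketch of rule-based sensitivity labeling. The regex patterns and label names are illustrative assumptions only; in practice Microsoft Purview's built-in sensitive information types would do this, not hand-rolled rules.

```python
import re

# Illustrative rules mapping regex patterns to sensitivity labels.
# Patterns and label names are assumptions for demonstration, not
# Purview's actual built-in classifiers.
CLASSIFICATION_RULES = {
    "PII.Email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "PII.USPhone": re.compile(r"\b\d{3}-\d{3}-\d{4}\b"),
    "PCI.CardNumber": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}

def classify(text: str) -> set:
    """Return the set of sensitivity labels whose patterns match the text."""
    return {label for label, pattern in CLASSIFICATION_RULES.items()
            if pattern.search(text)}

print(classify("Contact jane.doe@example.com or 555-123-4567"))
```

A real deployment would attach the resulting labels to assets in the data map and let DLP policies act on them.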
5. Data Quality & Policy Management:
- Define KPIs and dashboards to monitor data quality across domains.
- Collaborate on rule design, remediation workflows, and exception handling.
- Ensure policy compliance (GDPR, HIPAA, CCPA, etc.) and risk management.
6. Business Glossary & Stewardship:
- Maintain business glossary with domain owners and stewards in Purview.
- Enforce approval workflows, standard naming, and steward responsibilities.
- Conduct metadata audits for glossary and asset documentation quality.
7. Automation & Integration:
- Automate governance processes using PowerShell, Azure Functions, Logic Apps.
- Create pipelines for ingestion, lineage, glossary updates, tagging.
- Integrate with Power BI, Azure Monitor, Synapse Link, Collibra, BigID, etc.
8. Monitoring, Auditing & Compliance:
- Set up dashboards for audit logs, compliance reporting, metadata coverage.
- Oversee data lifecycle management across its phases.
- Support internal and external audit readiness with proper documentation.
We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.
Key Responsibilities:
- Design, develop, test, and maintain scalable ETL data pipelines using Python.
- Work extensively on Google Cloud Platform (GCP) services such as:
- Dataflow for real-time and batch data processing
- Cloud Functions for lightweight serverless compute
- BigQuery for data warehousing and analytics
- Cloud Composer for orchestration of data workflows (based on Apache Airflow)
- Google Cloud Storage (GCS) for managing data at scale
- IAM for access control and security
- Cloud Run for containerized applications
- Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
- Implement and enforce data quality checks, validation rules, and monitoring.
- Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
- Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
- Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
- Document pipeline designs, data flow diagrams, and operational support procedures.
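The data quality checks mentioned above can be sketched as a small, framework-agnostic validation step of the kind a pipeline might run before loading rows into BigQuery. The field names and rules are illustrative assumptions, not a specific schema.

```python
# Row-level quality checks; invalid rows are routed aside (e.g., to a
# dead-letter table) instead of polluting the warehouse. Field names
# ("id", "amount") are illustrative assumptions.
def validate_record(record: dict) -> list:
    """Return a list of rule violations for one record (empty = valid)."""
    errors = []
    if not record.get("id"):
        errors.append("missing id")
    if not isinstance(record.get("amount"), (int, float)):
        errors.append("amount is not numeric")
    elif record["amount"] < 0:
        errors.append("amount is negative")
    return errors

def split_valid_invalid(records):
    """Partition records so invalid rows can be sent to a dead-letter sink."""
    valid, invalid = [], []
    for r in records:
        errs = validate_record(r)
        (invalid if errs else valid).append((r, errs))
    return valid, invalid
```

In a Dataflow or Cloud Composer pipeline the same logic would typically live in a transform step, with counts of invalid rows exported as monitoring metrics.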
Required Skills:
- 4–8 years of hands-on experience in Python for backend or data engineering projects.
- Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
- Solid understanding of data pipeline architecture, data integration, and transformation techniques.
- Experience in working with version control systems like GitHub and knowledge of CI/CD practices.
- Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).
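The kind of SQL extraction-and-validation query the role describes can be illustrated with an in-memory SQLite database; the table and column names are assumptions for demonstration only.

```python
import sqlite3

# Hypothetical "orders" table; in the role this would be SQL Server,
# Oracle, or PostgreSQL rather than SQLite.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (order_id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
INSERT INTO orders VALUES (1, 10, 99.5), (2, 10, 12.0), (3, 11, NULL);
""")

# Validation query: surface customers with NULL order amounts alongside
# their aggregates -- a typical extraction + validation pattern.
rows = conn.execute("""
    SELECT customer_id,
           COUNT(*)                                        AS order_count,
           SUM(COALESCE(amount, 0))                        AS total_amount,
           SUM(CASE WHEN amount IS NULL THEN 1 ELSE 0 END) AS null_amounts
    FROM orders
    GROUP BY customer_id
    HAVING SUM(CASE WHEN amount IS NULL THEN 1 ELSE 0 END) > 0
""").fetchall()
print(rows)
```

The same query shape ports directly to the enterprise databases named above, since it uses only standard aggregates and CASE expressions.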
CORE RESPONSIBILITIES
- Create and manage cloud resources in AWS
- Data ingestion from different data sources that expose data through various technologies, such as RDBMS, REST HTTP APIs, flat files, streams, and time-series data from proprietary systems. Implement data ingestion and processing with the help of Big Data technologies
- Data processing/transformation using various technologies such as Spark and cloud services. You will need to understand the relevant business logic and implement it in a language supported by the base data platform
- Develop automated data quality checks to make sure the right data enters the platform and to verify the results of calculations
- Develop an infrastructure to collect, transform, combine and publish/distribute customer data.
- Define process improvement opportunities to optimize data collection, insights and displays.
- Ensure data and results are accessible, scalable, efficient, accurate, complete and flexible
- Identify and interpret trends and patterns from complex data sets
- Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders.
- Key participant in regular Scrum ceremonies with the agile teams
- Proficient at developing queries, writing reports and presenting findings
- Mentor junior members and bring best industry practices
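As a small illustration of "identifying trends and patterns from complex data sets" above: a trailing moving average smooths a noisy series so its direction becomes visible. The window size and sample data are arbitrary assumptions for demonstration.

```python
# Trailing moving average: one output value per full window.
def moving_average(values, window=3):
    """Smooth a numeric series; returns [] if the window does not fit."""
    if window < 1 or window > len(values):
        return []
    return [sum(values[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(values))]

series = [10, 12, 11, 13, 15, 14, 17]   # noisy but rising
smoothed = moving_average(series, window=3)
# Monotonically increasing smoothed values indicate an upward trend.
trend_up = all(b >= a for a, b in zip(smoothed, smoothed[1:]))
```

In production this sort of smoothing would more likely be done in Spark or SQL window functions over much larger data, but the idea is the same.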
QUALIFICATIONS
- 5-7+ years' experience as a data engineer in consumer finance or an equivalent industry (consumer loans, collections, servicing, optional products, and insurance sales)
- Strong background in math, statistics, computer science, data science or related discipline
- Advanced knowledge of one of the following languages: Java, Scala, Python, C#
- Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie / Airflow, Amazon Web Services (AWS), Docker / Kubernetes, Snowflake
- Proficient with:
- Data mining/programming tools (e.g. SAS, SQL, R, Python)
- Database technologies (e.g. PostgreSQL, Redshift, Snowflake, and Greenplum)
- Data visualization (e.g. Tableau, Looker, MicroStrategy)
- Comfortable learning about and deploying new technologies and tools.
- Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines.
- Good written and oral communication skills and ability to present results to non-technical audiences
- Knowledge of business intelligence and analytical tools, technologies and techniques.
Mandatory Requirements
- Experience in AWS Glue
- Experience in Apache Parquet
- Proficient in AWS S3 and data lake
- Knowledge of Snowflake
- Understanding of file-based ingestion best practices.
- Scripting languages: Python & PySpark
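One file-based ingestion best practice mentioned above is idempotency: reprocessing the same file must be a no-op. A hedged, stdlib-only sketch using a content-checksum manifest follows; the manifest shape is an assumption, and on AWS it would more likely live in DynamoDB or S3 than in local memory.

```python
import hashlib
import os

def file_checksum(path: str) -> str:
    """SHA-256 of a file's contents, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def ingest(path: str, manifest: dict) -> bool:
    """Process a file only if its content is new; returns True if ingested.

    The manifest maps file name -> checksum of the last ingested content.
    A real pipeline would convert the file here (e.g., CSV -> Parquet via
    AWS Glue) before recording the checksum.
    """
    digest = file_checksum(path)
    name = os.path.basename(path)
    if manifest.get(name) == digest:
        return False  # identical content already ingested
    manifest[name] = digest
    return True
```

Recording the checksum only after a successful conversion also gives natural retry semantics: a failed run leaves the manifest untouched.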
Job Role: React Developer
Location: Hyderabad
Job Description:
• Develop end-to-end features (full stack) for users and be proficient with aspects of design and coding.
• Proficiency in automated testing/testing methodologies is a must.
• Create, maintain, and modify (where necessary) the data structure.
• Work with product owners to handle backlogs and new requests.
• Work on React projects using specified tools and databases.
• Should have very good understanding of the technical approach and its viability.
• Should be able to push back and accurately assess the viability of their solutions.
• Should be comfortable adopting testing frameworks and developing with common industry practices such as test-driven development.
Person Specifications/ Requirements
• Good understanding and proven experience of web development and basic languages such as HTML and JavaScript.
• Strong programming experience.
• Must have experience with React.js, Node.js.
• Experience with Vue.js and Go is required.
• Strong understanding of software development tools, data structures and data modelling.
• Strong Data Analytical Skills.
• Should have Knowledge and experience of databases, e.g., MS SQL, MongoDB.
• Comfortable with optimizing your code and refactoring where necessary, researching ways to have responsive solutions.
• Good understanding of Test-Driven Development (TDD), including benchmarking and refactoring during the TDD process.
• Familiarity with popular authorization techniques such as JSON Web Tokens (JWTs) is required.
• Strong understanding of the software development lifecycle and awareness of commercial deadlines.
• Strong self-discipline and detail-oriented, important to meet client requirements in pressured timelines to a high quality.
• Bachelor's-level business-related qualification or 2+ years' relevant experience in a similar environment.
• Strong verbal and written communications skills.
• Inquisitive, enthusiastic, passionate, and diligent.
• Keen to develop a broad range of business, technical and soft skills
Strong in TypeScript/JavaScript.
Strong in Agile methodology.
Should have knowledge of writing unit test cases using Jasmine/Jest
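The JWT authorization technique named in the requirements above can be sketched with the standard library alone: an HS256 token's signature is an HMAC-SHA256 over `<header>.<payload>` in base64url form. The secret and claims here are illustrative assumptions, and real services should use a vetted JWT library rather than hand-rolled verification.

```python
import base64
import hashlib
import hmac
import json

def b64url_encode(data: bytes) -> str:
    """Base64url without padding, as JWT requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(claims: dict, secret: bytes) -> str:
    """Build a signed HS256 JWT: header.payload.signature."""
    header = b64url_encode(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url_encode(json.dumps(claims).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = b64url_encode(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"

def verify_jwt(token: str, secret: bytes) -> bool:
    """Recompute the HMAC and compare in constant time."""
    header, payload, sig = token.split(".")
    signing_input = f"{header}.{payload}".encode()
    expected = b64url_encode(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return hmac.compare_digest(sig, expected)
```

The constant-time comparison (`hmac.compare_digest`) matters: a naive `==` can leak signature bytes through timing differences.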
- You solve problems at their root, stepping back to understand the broader context.
- You develop pragmatic solutions and build flexible systems that balance engineering complexity and timely delivery, creating business impact.
- You understand a broad range of data structures and algorithms and apply them to deliver high-performing applications.
- You recognize and use design patterns to solve business problems.
- You understand how operating systems work, perform and scale.
- You continually align your work with Amazon’s business objectives and seek to deliver business value.
- You collaborate to ensure that decisions are based on the merit of the proposal, not the proposer.
- You proactively support knowledge-sharing and build good working relationships within the team and with others in Amazon.
- You communicate clearly with your team and with other groups and listen effectively.
The company has raised $14.5M in seed funding, partners with over 30 manufacturers around the world, and was founded by seasoned entrepreneurs and technology leaders from institutions like IIT Bombay and Stanford GSB.
As a Talent Acquisition Manager, you will be responsible for managing the full-cycle recruitment process, from sourcing to onboarding, while delivering an excellent candidate experience.
What you will do:
- Working closely with the Founders of the company to design and implement the overall Talent Acquisition and Organization development strategies
- Developing an effective hiring process for each requirement and setting evaluation parameters for the purpose of assessment
- Utilizing effective networking methods to build a robust pipeline of candidates for potential hiring needs
- Collaborating with hiring managers to refine and implement innovative hiring strategies
- Proactively sourcing candidates through multiple channels like job portals, headhunting, referrals, social media and others
- Screening resumes, interviewing and managing candidates throughout the interview process till final offer negotiation
- Managing candidate status in the internal applicant tracking system
- Gaining market intelligence by performing competitor analysis and talent mapping
- Maximizing ROI on recruitment costs
Desired Candidate Profile
What you need to have:
- MBA from tier 1/2 colleges
- 5-10 years' experience in core recruitment
- Tech and startup experience preferred
- Knowledge of applicant tracking systems
- Good presentation skills
- Excellent verbal and written communication skills
- Proficiency in MS Office Applications
Implementing various development, testing, automation tools, and IT infrastructure
Planning the team structure, activities, and involvement in project management activities.
Managing stakeholders and external interfaces
Setting up tools and required infrastructure
Defining and setting development, test, release, update, and support processes for DevOps operation
Have the technical skill to review, verify, and validate the software code developed in the project.
Troubleshooting and fixing code bugs
Monitoring processes throughout the lifecycle for adherence, and updating or creating new processes to drive improvement and minimize waste
Encouraging and building automated processes wherever possible
Identifying and deploying cybersecurity measures by continuously performing vulnerability assessment and risk management
Incident management and root cause analysis
Coordination and communication within the team and with customers
Selecting and deploying appropriate CI/CD tools
Strive for continuous improvement and build continuous integration, continuous delivery, and continuous deployment pipelines (CI/CD pipelines)
Mentoring and guiding the team members
Monitoring and measuring customer experience and KPIs
Managing periodic reporting on the progress to the management and the customer
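The KPI monitoring and reporting duties above can be illustrated with one common DevOps metric, mean time to recovery (MTTR), computed from resolved incident records. The record shape (start/resolution times in minutes) is an illustrative assumption.

```python
# MTTR over resolved incidents only; open incidents (resolved=None)
# are excluded. Timestamps are minutes for simplicity -- a real system
# would use datetime objects from an incident tracker.
def mttr_minutes(incidents):
    """Mean time to recovery across resolved incidents, in minutes."""
    durations = [i["resolved"] - i["started"]
                 for i in incidents if i.get("resolved") is not None]
    return sum(durations) / len(durations) if durations else 0.0

incidents = [
    {"id": 1, "started": 0,  "resolved": 30},
    {"id": 2, "started": 10, "resolved": 70},
    {"id": 3, "started": 20, "resolved": None},  # still open, excluded
]
```

Trending this figure per sprint, alongside deployment frequency and change-failure rate, is one common way to report DevOps health to management.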
- Create and maintain new hire and personnel files and enter them into Human Resources Information Systems.
- Assist with orientation of new employees.
- Ensure accurate maintenance of all employee records and files (e.g., interview documents).
- Support processing and maintenance of payroll records in accordance with policies and procedures, as necessary.
- Generate Human Resources data reports as necessary.
- Answer phone calls and record messages.
- Create and type office correspondence using a computer.
- Serve as Human Resources subject matter expert and participate on project teams.
- Day to day operations of the HR function
- End to End Recruitment process like Sourcing of candidates from job portals, taking care of interview and selection process.
- HR Administration (e.g. leave management, employee database, salary inputs, training history cards)
- Strong communication and organizational skills
- Must be self-motivated with a strong sense of urgency
- Ability to handle multiple priorities while maintaining strong attention to detail and adhering to deadlines
- Strong interpersonal and communication skills, both oral and written
- Able to create a positive employee experience so that people want to join and stay with the organisation
- Sound judgment and problem-solving skills
- Familiarity with MS Office/Excel/Outlook