
Role & Responsibilities
As a DevOps Engineer, you will build and operate infrastructure at scale, design and implement tools that enable product teams to build and deploy their services independently, improve observability across the board, and design for security, resiliency, availability, and stability. If the prospect of ensuring system reliability at scale and exploring cutting-edge technology to solve problems excites you, then this role is a great fit for you.
Job Responsibilities:
- Own end-to-end infrastructure, from non-prod to prod environments, including self-managed databases
- Codify infrastructure
- Ensure uptime above 99.99%
- Understand the bigger picture and navigate through ambiguities
- Scale technology considering cost and observability and manage end-to-end processes
- Understand DevOps philosophy and evangelize the principles across the organization
- Demonstrate strong communication and collaboration skills to break down silos
Ideal Candidate
Strong DevOps / Infrastructure Engineer Profiles
Mandatory Requirements:
Experience 1:
Must have 4+ years of hands-on experience working as a DevOps Engineer / Infrastructure Engineer / SRE / DevOps Consultant.
Experience 2:
Must have hands-on experience with Kubernetes and Docker, including deployment, scaling, or containerized application management.
Experience 3:
Must have experience with Infrastructure as Code (IaC) or configuration management tools such as Terraform, Ansible, Chef, or Puppet.
Experience 4:
Must have strong automation and scripting experience using Python, Go, Bash, Shell, or similar scripting languages.
Experience 5:
Must have working experience with distributed databases or data systems such as MongoDB, Redis, Cassandra, Elasticsearch, or Kafka.
Experience 6:
Must demonstrate strong expertise in at least one of the following areas:
- Databases / Distributed Data Systems
- Observability & Monitoring
- CI/CD Pipelines
- Networking Concepts
- Kubernetes / Container Platforms
Company Background:
Candidates must be from B2C product-based companies only.
Education:
BE / B.Tech or equivalent.
Preferred:
- Experience working with microservices or event-driven architectures.
- Exposure to cloud infrastructure, monitoring, reliability, and scalability practices.
- Understanding of programming languages such as Go, Python, or Java.
- Experience working in high-scale production or fast-growing product startups.
Perks, Benefits and Work Culture
We take our work seriously and are proud of the associations we have built along the way. But we also know how to have fun. With a seamless communication structure and a “no cubicle culture,” the people here are extremely approachable. You will have several opportunities to exercise your potential. We break the regular office monotony and believe in a free-flowing work culture. It’s a great place to be, and we are confident you will enjoy working here.

Review Criteria:
- Strong Dremio / Lakehouse Data Architect profile
- 5+ years of experience in Data Architecture / Data Engineering, with minimum 3+ years hands-on in Dremio
- Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
- Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, Iceberg along with distributed query planning concepts
- Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
- Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
- Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
- Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
- Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments
Role & Responsibilities:
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
Ideal Candidate:
- Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
- Exposure to Snowflake, Databricks, or BigQuery environments.
- Experience in high-tech, manufacturing, or enterprise data modernization programs.
Job Description:
We’re looking for a Software Engineer who’s passionate about building scalable, high-performance applications using Java and Kotlin on AWS. You’ll collaborate closely with cross-functional teams — including DevOps, QA, and Product — to design, develop, and deliver robust software solutions.
Our culture values collaboration, ownership, and continuous learning. We embrace modern technologies to solve real-world problems and continuously evolve our platform to meet changing business needs.
Key Responsibilities
- Develop and enhance features to collect, process, and deliver user-generated content.
- Collaborate with engineers, product managers, and designers to build end-to-end solutions.
- Write clean, maintainable, and efficient code following best practices.
- Participate in design discussions, code reviews, and technical brainstorming sessions.
- Identify and fix performance bottlenecks and technical issues.
- Contribute to CI/CD pipelines, infrastructure automation, and developer tooling initiatives.
- Maintain and improve application reliability, scalability, and security.
Required Skills & Experience
- 4+ years of full-stack development experience using Java/Kotlin and AWS.
- Hands-on experience in API development for front-end applications.
- Strong understanding of relational databases (PostgreSQL, MySQL) and handling large datasets.
- Experience working with CI/CD tools (CircleCI, GitHub Actions, Drone).
- Experience with Infrastructure as Code using Terraform.
- Exposure to event-driven architectures (SQS, SNS, Kafka, Kinesis, Pub/Sub) and idempotent patterns.
Nice-to-Have Skills
- Familiarity with automated database migration tools (Liquibase, Flyway).
- Experience with cloud storage (AWS S3, GCP Cloud Storage, Azure Blob) or document data stores (DynamoDB, MongoDB).
- Experience containerizing applications using Docker and deploying to ECS or Kubernetes.
- Proficiency with Git and collaborative development workflows.
Company information:
About Kanerika
Who we are:
Kanerika Inc. is a premier global software products and services firm that specializes in providing innovative solutions and services for data-driven enterprises. Our focus is to empower businesses to achieve their digital transformation goals and maximize their business impact through the effective use of data and AI. We leverage cutting-edge technologies in data analytics, data governance, AI-ML, GenAI/LLM and industry best practices to deliver custom solutions that help organizations optimize their operations, enhance customer experiences, and drive growth.
Awards and Recognitions:
Kanerika has won several awards over the years, including:
· CMMI Level 3 Appraised in 2024.
· Best Place to Work 2022 & 2023 by Great Place to Work®.
· Top 10 Most Recommended RPA Start-Ups in 2022 by RPA Today.
· NASSCOM Emerge 50 Award in 2014.
· Frost & Sullivan India 2021 Technology Innovation Award for its Kompass composable solution architecture.
· Recognized for ISO 27701, 27001, SOC2, and GDPR compliances.
· Featured as Top Data Analytics Services Provider by GoodFirms.
What You Will Do:
As a Data Governance Developer at Kanerika, you will be responsible for building and managing metadata, lineage, and compliance frameworks across the organization’s data ecosystem.
Key Responsibilities:
- Set up and manage Microsoft Purview accounts, collections, and access controls (RBAC).
- Integrate Purview with data sources: Azure Data Lake, Synapse, SQL DB, Power BI, Snowflake.
- Schedule and monitor metadata scanning and classification jobs.
- Implement and maintain collection hierarchies aligned with data ownership.
- Design metadata ingestion workflows for technical, business, and operational metadata.
- Enrich data assets with business context: descriptions, glossary terms, tags.
- Synchronize metadata across tools using REST APIs, PowerShell, or ADF.
- Validate end-to-end lineage for datasets and reports (ADF → Synapse → Power BI).
- Resolve lineage gaps or failures using mapping corrections or scripts.
- Perform impact analysis to support downstream data consumers.
- Create custom classification rules for sensitive data (PII, PCI, PHI).
- Apply and manage Microsoft Purview sensitivity labels and policies.
- Integrate with Microsoft Information Protection (MIP) for DLP.
- Manage business glossary in collaboration with domain owners and stewards.
- Implement approval workflows and term governance.
- Conduct audits for glossary and metadata quality and consistency.
- Automate Purview operations using:
- PowerShell, Azure Functions, Logic Apps, REST APIs
- Build pipelines for dynamic source registration and scanning.
- Automate tagging, lineage, and glossary term mapping.
- Enable operational insights using Power BI, Synapse Link, Azure Monitor, and governance APIs.
Tools & Technologies:
- Microsoft Purview, Collibra, Atlan, Informatica Axon, IBM IG Catalog
- Experience in Microsoft Purview areas:
1. Label creation and policy management
2. Publish/Auto-labeling
3. Data Loss Prevention & Compliance handling
4. Compliance Manager, Communication Compliance, Insider Risk Management
5. Records Management, Unified Catalog, Information Barriers
6. eDiscovery, Data Map, Lifecycle Management, Compliance Alerts, Audit
7. DSPM, Data Policy
Required Qualifications:
- Experience in data governance or data management.
- Strong experience in Microsoft Purview and Informatica governance tools.
- Proficient in tracking and visualizing data lineage across systems.
- Familiar with Azure Data Factory, Talend, dbt, and other integration tools.
- Understanding of data regulations: GDPR, CCPA, SOX, HIPAA.
- Ability to translate technical data governance concepts for business stakeholders.
Employee Benefits
1. Culture:
a. Open Door Policy: Encourages open communication and accessibility to management.
b. Open Office Floor Plan: Fosters a collaborative and interactive work environment.
c. Flexible Working Hours: Allows employees to have flexibility in their work schedules.
d. Employee Referral Bonus: Rewards employees for referring qualified candidates.
e. Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.
2. Inclusivity and Diversity:
a. Hiring practices that promote diversity: Ensures a diverse and inclusive workforce.
b. Mandatory POSH training: Promotes a safe and respectful work environment.
3. Health Insurance and Wellness Benefits:
a. GMC and Term Insurance: Offers medical coverage and financial protection.
b. Health Insurance: Provides coverage for medical expenses.
c. Disability Insurance: Offers financial support in case of disability.
4. Child Care & Parental Leave Benefits:
a. Company-sponsored family events: Creates opportunities for employees and their families to bond.
b. Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child.
c. Family Medical Leave: Offers leave for employees to take care of family members' medical needs.
5. Perks and Time-Off Benefits:
a. Company-sponsored outings: Organizes recreational activities for employees.
b. Gratuity: Provides a monetary benefit as a token of appreciation.
c. Provident Fund: Helps employees save for retirement.
d. Generous PTO: Offers more than the industry standard for paid time off.
e. Paid sick days: Allows employees to take paid time off when they are unwell.
Job Overview:
We are seeking an experienced and results-driven Sales Manager to lead and oversee our real estate sales team. The ideal candidate will have a strong background in real estate sales, with the ability to set strategies, manage a team, and drive sales targets. The Sales Manager will play a key role in developing and executing sales plans to achieve growth, ensure customer satisfaction, and maintain market competitiveness.
Key Responsibilities:
- Develop and execute sales strategies to achieve revenue targets.
- Identify and qualify potential clients.
- Build and maintain strong relationships with clients.
- Negotiate and close deals.
- Manage and develop a team of sales representatives.
- Prepare and deliver sales presentations.
- Analyze sales data and identify trends.
- Stay up-to-date on industry trends and best practices.
Qualifications:
- Bachelor's degree in a related field.
- 6+ years of experience in sales management.
- Strong communication, interpersonal, and negotiation skills.
- Ability to work independently and as part of a team.
- Experience with CRM software.
ZORANG is Hiring for Android Developer
Experience Range: 3+ Yrs.
Job Location: Gurgaon
How your skills and passion will come to life at Zorang:
● Design and build applications for the Android platform (Kotlin)
● Collaborate with cross-functional teams to design, and ship new features
● Understand product specifications and come up with optimal scalable solutions
● Unit-test code for robustness, including edge cases, usability, and general reliability
● Strive to follow best coding practices throughout designing, development, and testing.
What you've accomplished:
● 3+ years of software engineering and product delivery experience, including at least 2 years with Android (Kotlin)
● Experience working with Product Managers and UX Designers, with a strong product sense
● Excellent teamwork skills, flexibility, and ability to handle multiple tasks
● Capability to bring in software engineering life cycle best practices
● Excellent analytical and problem-solving skills
● Deep understanding of algorithms and data structures
● Executed on RCAs / tough eng problems at work
● Won awards / Recognised by upper management for going above and beyond
● Open source/pet projects contributions
Role: IT Infrastructure Services - Other
IndustryType: IT Services & Consulting
EmploymentType: Full Time, Permanent
Key Skills: Android Application Development, Java, Mobile Development, Kotlin, Third-Party Integration
- Writing, editing and publishing engaging, compelling and trending posts for social media networks (e.g. Facebook, Instagram, Linkedin and Twitter)
- Proofread the content for grammar, punctuation and spelling.
- Create monthly content calendars and take approval from clients.
- Maintain content hygiene of the social media pages.
- Tracking and reporting on social media insights (e.g. traffic, engagement, conversion rates, shares etc.)
- You will be responsible for establishing and nurturing strong relationships with clients through regular communication and a clear understanding of their needs.
Requirements
- Experience: Fresher
- Ability to create innovative social media content, supported by relevant images and videos
- Creative thinking and Creative Writing skills.
- Proficiency in English
- Strong verbal as well as written communication skills
- Exceptional time-management and organizational skills
- A keen eye for details
Job Type: Fresher
Salary: ₹20,000.00 - ₹25,000.00 per month
Schedule:
- Morning shift
Expected Start Date: 17/04/2023
Nest is a gamified investment platform for the next-generation and young millennials of the world, to invest, play and earn returns more than a regular savings bank account. The world of gaming is merging with finance to offer a more immersive user experience and help users learn how to manage their money. We’re creating a new breed of asset managers and fin-fluencers with Nest.
We are seeking a dedicated Golang developer to join our growing company.
You will collaborate with other technical staff to deliver and maintain a fully functional mobile application, putting your passion for software engineering to work on highly immersive user applications.
Your duties will include maintaining code repositories via GitHub and internal server storage modules.
Responsibilities
Below are some of the responsibilities a Golang developer is expected to undertake in their position:
- Implement AWS containers to support Go implementation and successful repository maintenance
- Utilize Kubernetes to ensure successful application development, deployment, and scaling
- Implement Docker for smaller-scale applications that require simpler deployments
- Employ Linux terminal commands to simplify back-end operations for less technical staff
- Structure our user interface with React and ensure REST API access is available for enterprise-grade finance customers on-demand
- Collaborate with other technical staff to ensure consistent and smooth DevOps workflows
- Choose and implement other JavaScript libraries that will optimize performance without sacrificing security and base functionality
- Manage multiple projects within reasonable design specifications and budget restrictions
- QA design for errors
- Implement feedback and changes whenever possible
- Create visualizations that convey accurate messaging for the project.
Job Qualifications and Skill Sets
Bachelor’s degree in computer science, IT, or a related field, such as programming. Other critical skills required are:
- Provable proficiency in Go programming
- Excellent written and verbal communication skills
- Minimum of two years’ experience working in programming or information technology
- Attention to detail; knowledge of Java is recommended
- Portfolio of work with examples that showcase technical skill
- Strong analytical skills
- Time management and organizational skills
- Knowledge of Go templating, common frameworks, and tools
- Experience working with a team
What you’ll receive
- A compensation competitive to premier firms
- Incredible learning and exposure to every aspect of a funded financial startup as an early member, working directly with the founders.
- Recognition and experience of working on a platform with an exponential user growth curve.
- A fun and receptive work culture which promises to never get mundane!
- If you love to party, it's on us!
- Define and document detailed requirements detailing market opportunities and associated technical integrations in collaboration with architects, engineers, and key stakeholders with ISVs and strategic partners.
- Drive product execution across a large set of engineers, product designers, and product managers.
- Collaborate with Product Marketing on company Platform collaterals and sales tools such as articles, datasheets, white papers, blogs, how-to videos, demos, etc.
- Conduct usability testing interviews to gather feedback on Product integration prototypes.
- Identify market opportunities and conduct customer/user research interviews in conjunction with lead Product Managers.
- Deliver product management analysis, including market requirements, and customer experience artifacts.
- Develop a deep understanding of the RDBMS competitive market landscape and product analysis.
- Own the roadmap of active and future projects and maintain the backlog of features and priorities.
- Be outcome- and KPI-driven to grow your products and take the right actions accordingly.
- Work with the engineering and product team to ensure proper investment and execution in support of your strategy.
- Communicate regularly with senior leadership on status, risks, and change control.
Requirements:
- Have 5+ years of relevant experience in a technical role, with at least 3 years of those in a product management role.
- Have experience defining a roadmap and managing incremental execution through successful launches.
- Detail-oriented and able to understand the bigger picture by using your technical expertise and problem-solving abilities to prioritize and manage blocking issues.
- Thrive in the dynamic environment that comes with being part of a fast-moving startup. That means flexibility as well as dealing with ambiguity.
- Experience working across organizations with Sales, Marketing, Support, Product, Engineering, and Design.
- 2+ years of experience in data infrastructure or related IaaS products and/or services; AWS and/or Google Cloud background preferred.
- Strong communication skills, analytical skills, and data-driven product decision-making mindset.
- Experience with product management planning and design tools such as Confluence, JIRA, Balsamiq, and Aha!
Nice to have:
- Strong proficiency in the big data ecosystem e.g. Kafka, Spark, or similar technologies.
- Technically savvy and experienced with the cloud, APIs, enterprise software.
- Worked on data platforms or streaming applications.
- Experience using and writing REST APIs.
- Ability to provide technical leadership across UI engineering and help guide the overall engineering vision.