
Hi Kirti,
Job Title: Data Analytics Engineer
Experience: 3 to 6 years
Location: Gurgaon (Hybrid)
Employment Type: Full-time
Job Description:
We are seeking a highly skilled Data Analytics Engineer with expertise in Qlik Replicate, Qlik Compose, and Data Warehousing to build and maintain robust data pipelines. The ideal candidate will have hands-on experience with Change Data Capture (CDC) pipelines from various sources, an understanding of Bronze, Silver, and Gold data layers, SQL querying for data warehouses like Amazon Redshift, and experience with Data Lakes using S3. A foundational understanding of Apache Parquet and Python is also desirable.
Key Responsibilities:
1. Data Pipeline Development & Maintenance
- Design, develop, and maintain ETL/ELT pipelines using Qlik Replicate and Qlik Compose.
- Ensure seamless data replication and transformation across multiple systems.
- Implement and optimize CDC-based data pipelines from various source systems.
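For context only (not part of the role description): a minimal sketch of what applying a batch of CDC change records to a target looks like in principle. The event structure, keys, and columns below are hypothetical and independent of the actual output format of tools like Qlik Replicate.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ChangeEvent:
    """One CDC record: the operation, the primary key, and the new row image (hypothetical schema)."""
    op: str              # "INSERT", "UPDATE", or "DELETE"
    key: int             # primary key of the affected row
    row: Optional[dict]  # full row image for INSERT/UPDATE; None for DELETE

def apply_changes(target: dict, events: list) -> dict:
    """Merge a batch of change events into an in-memory 'target table' keyed by primary key."""
    for event in events:
        if event.op in ("INSERT", "UPDATE"):
            target[event.key] = event.row    # upsert the latest row image
        elif event.op == "DELETE":
            target.pop(event.key, None)      # remove the row if present
    return target

# Example: an insert, an update, then a delete of the same row
table = apply_changes({}, [
    ChangeEvent("INSERT", 1, {"id": 1, "status": "new"}),
    ChangeEvent("UPDATE", 1, {"id": 1, "status": "shipped"}),
    ChangeEvent("DELETE", 1, None),
])
print(table)  # {}
```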
2. Data Layering & Warehouse Management
- Implement Bronze, Silver, and Gold layer architectures to optimize data workflows (see the sketch after this list).
- Design and manage data pipelines for structured and unstructured data.
- Ensure data integrity and quality within Redshift and other analytical data stores.
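As an illustration of the Bronze/Silver/Gold layering referenced above, the sketch below walks the three layers with pandas. The in-memory DataFrames, column names, and cleanup rules are invented for the example; they are not a description of the team's actual pipeline.

```python
import pandas as pd

# Bronze: raw ingested data, stored as-is (schema is hypothetical)
bronze = pd.DataFrame({
    "order_id": [1, 1, 2, 3],
    "amount":   ["10.5", "10.5", None, "7.0"],
    "country":  ["IN", "IN", "US", "in"],
})

# Silver: cleaned and deduplicated, with types and values standardized
silver = (
    bronze
    .drop_duplicates(subset="order_id")
    .dropna(subset=["amount"])
    .assign(
        amount=lambda df: df["amount"].astype(float),
        country=lambda df: df["country"].str.upper(),
    )
)

# Gold: business-level aggregate ready for analytics/BI
gold = silver.groupby("country", as_index=False)["amount"].sum()
print(gold)
```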
3. Database Management & SQL Development
- Write, optimize, and troubleshoot complex SQL queries for data warehouses like Redshift (see the example below).
- Design and implement data models that support business intelligence and analytics use cases.
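As a small example of the kind of Redshift SQL work described above, the snippet below issues an aggregate query from Python. The cluster endpoint, schema, and table are hypothetical, and psycopg2 is just one of several PostgreSQL-compatible drivers that work against Redshift.

```python
import psycopg2  # any PostgreSQL-compatible driver can connect to Redshift

# Hypothetical connection details, for illustration only
conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439,
    dbname="analytics",
    user="analyst",
    password="********",
)

# A typical warehouse query: daily revenue per country over the last 30 days.
# Filtering on sort-key columns such as order_date helps Redshift prune blocks.
sql = """
    SELECT order_date, country, SUM(amount) AS revenue
    FROM gold.daily_orders
    WHERE order_date >= DATEADD(day, -30, CURRENT_DATE)
    GROUP BY order_date, country
    ORDER BY order_date, country;
"""

with conn, conn.cursor() as cur:
    cur.execute(sql)
    for order_date, country, revenue in cur.fetchall():
        print(order_date, country, revenue)
```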
4. Data Lakes & Storage Optimization
- Work with AWS S3-based Data Lakes to store and manage large-scale datasets.
- Optimize data ingestion and retrieval using Apache Parquet.
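To illustrate the Parquet point above: columnar storage plus partitioning is what keeps ingestion and retrieval cheap on an S3 data lake. The bucket, prefix, and columns below are made up, and the sketch assumes pandas with pyarrow and s3fs installed.

```python
import pandas as pd

# Hypothetical dataset; in practice this would come from the pipeline above
df = pd.DataFrame({
    "order_id": [1, 2, 3],
    "country":  ["IN", "US", "IN"],
    "amount":   [10.5, 3.2, 7.0],
})

# Write compressed, columnar Parquet to a data-lake prefix, partitioned by country
df.to_parquet(
    "s3://example-data-lake/silver/orders/",   # bucket name is invented
    engine="pyarrow",
    compression="snappy",
    partition_cols=["country"],
)

# Read back only the columns and partition needed, avoiding a full scan
subset = pd.read_parquet(
    "s3://example-data-lake/silver/orders/",
    columns=["order_id", "amount"],
    filters=[("country", "=", "IN")],
)
print(subset)
```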
5. Data Integration & Automation
- Integrate diverse data sources into a centralized analytics platform.
- Automate workflows to improve efficiency and reduce manual effort.
- Leverage Python for scripting, automation, and data manipulation where necessary.
6. Performance Optimization & Monitoring
- Monitor data pipelines for failures and implement recovery strategies.
- Optimize data flows for better performance, scalability, and cost-effectiveness.
- Troubleshoot and resolve ETL and data replication issues proactively.
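Much of the failure handling described in item 6 reduces in practice to wrapping each pipeline step in retry logic with alerting. The sketch below is a generic illustration with arbitrary names and delays, not a description of how Qlik Replicate itself recovers.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def run_with_retries(task, max_attempts=3, base_delay_seconds=5):
    """Run a pipeline step, retrying with exponential backoff before giving up."""
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception as exc:  # in practice, catch the specific errors the tooling raises
            log.warning("attempt %d/%d failed: %s", attempt, max_attempts, exc)
            if attempt == max_attempts:
                raise  # surface the failure so it can be alerted on
            time.sleep(base_delay_seconds * 2 ** (attempt - 1))

# Hypothetical usage: wrap a flaky load step
run_with_retries(lambda: print("loading batch..."))
```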
Technical Expertise Required:
- 3 to 6 years of experience in Data Engineering, ETL Development, or related roles.
- Hands-on experience with Qlik Replicate & Qlik Compose for data integration.
- Strong SQL expertise, with experience in writing and optimizing queries for Redshift.
- Experience working with Bronze, Silver, and Gold layer architectures.
- Knowledge of Change Data Capture (CDC) pipelines from multiple sources.
- Experience working with AWS S3 Data Lakes.
- Experience working with Apache Parquet for data storage optimization.
- Basic understanding of Python for automation and data processing.
- Experience in cloud-based data architectures (AWS, Azure, GCP) is a plus.
- Strong analytical and problem-solving skills.
- Ability to work in a fast-paced, agile environment.
Preferred Qualifications:
- Experience in performance tuning and cost optimization in Redshift.
- Familiarity with big data technologies such as Spark or Hadoop.
- Understanding of data governance and security best practices.
- Exposure to data visualization tools such as Qlik Sense, Tableau, or Power BI.

Similar jobs


Senior Backend Developer (C# and .NET)
Hybrid / On-site (Bangalore)
What is the role?
Xoxoday is looking for a candidate who has a strong background in the design and implementation of scalable architecture and a good understanding of Algorithms, Data structures, and design patterns. Candidates must be ready to learn new tools, languages, and technologies. We are offering hybrid / remote options as well.
Basic Qualifications:
- At least 4-7 years of experience as a software developer.
- At least 3 years of experience in .NET Core, C#, the AWS stack, MS SQL Server, and MVC; NodeJS experience is a plus.
- Strong working knowledge of distributed event-driven messaging architectures/platforms.
- Strong knowledge of the data access layer, especially the ability to work with stored procedures.
- Experience establishing and promoting software development standards, processes, and best practices for delivering scalable, high-quality software.
- Production experience with AWS stack
- Fluent English speaker
Preferred Qualifications:
- Experience working with OOP languages.
- Experience designing and developing Microservices and SOA.
- Experience working with AWS Kinesis, Lambda, SQS, S3, ElastiCache, Elasticsearch, Kubernetes, EventBridge, RDS, CloudWatch, and API Gateway.
- Experience designing and building high-performance scalable web services.
- Experience in REST API design and implementation.
- Experience in unit testing, test automation, and continuous delivery.
- Experience with stream-processing and message-broker software.
Nice to have:
- Experience working with distributed teams.
- Ability to work independently and as part of a team.
- Ability to work quickly toward tight deadlines, and make smart tradeoffs between speed, accuracy, and maintainability.
- Bachelor's or Master's degree in computer science (or equivalent professional experience).
What can you look for?
A wholesome opportunity in a fast-paced environment that will enable you to juggle between concepts, yet maintain the quality of content, interact, and share your ideas and have loads of learning while at work. Work with a team of highly talented young professionals and enjoy the benefits of being at Xoxoday.
We are
Xoxoday is a rapidly growing fintech SaaS firm that propels business growth while focusing on human motivation. Backed by Giift and Apis Partners Growth Fund II, Xoxoday offers a suite of three products - Plum, Empuls, and Compass. Xoxoday works with more than 2000 clients across 10+ countries and over 2.5 million users. Headquartered in Bengaluru, Xoxoday is a 300+ strong team with four global offices in San Francisco, Dublin, Singapore, New Delhi.
Way forward
We look forward to connecting with you. Since you may take time to review this opportunity, we will wait around 3-5 days before screening the collected applications and lining up job discussions with the hiring manager. We assure you, however, that we will aim to keep the window for closing this requirement reasonable. Candidates will be kept informed and updated on feedback and application status.



What you need to succeed in this job?
- MS or BS/B.Tech in computer science, or equivalent experience, from a top college.
- Minimum of 2+ years of experience in Java 8, Spring Boot, Spring Cloud, Spring Cloud Gateway, etc.
- Good understanding of design pattern usage and implementation.
- Understanding and implementation of REST services and microservices architecture.
- Unit testing tools: JUnit & Mockito.
- Experience with the PostgreSQL database is a must.
- Excellent data structure, algorithm, and problem-solving skills.
- Active contribution to developer communities like Stack Overflow is an added advantage.
- Experience with and knowledge of open-source tools & frameworks and broader cutting-edge server-side technologies (Prometheus, Elasticsearch, Kafka).
- Must be a proven performer and team player who enjoys challenging assignments in a high-energy, fast-growing startup workplace.
- Must be a self-starter who can work well with minimal guidance and in a fluid environment.
Job Title: Lead Generation Executive
Location: Remote
Job Summary:
We are in search of a "Lead Generation Executive" for a remote role. Your primary responsibility is to help us find potential clients, create leads, and contribute to our sales goals.
Responsibilities:
- Find and create leads to support our sales goals, screen them, and set up meetings.
- Identify new markets and potential customers through research.
- Reach out to potential clients through various methods such as calls, networking, and social media.
- Communicate with potential clients, understand their needs, and introduce our products.
- Keep track of sales and performance data and report on progress.
Qualifications and Skills:
- Minimum of 3 years of experience in lead generation or a related field.
- Proven success in lead generation or sales.
- Strong experience in generating leads, cold calling, email campaigns, LinkedIn outreach, and rate negotiations.
- Ability to plan and execute effective lead generation strategies.
- A good understanding of sales and marketing.
- Current knowledge of market trends and competitor activities.
- Strong negotiation and problem-solving skills.
- Excellent communication skills.
- Proficiency in phone calls and presentations.
- Self-motivated, results-driven, and passionate about lead generation.
- Ability to work independently and proactively.
- Familiarity with Microsoft Office, CRM systems, and sales software.
ABOUT THE COMPANY
Augment was started in 2015 in Madison, Wisconsin USA (three hours North of Chicago). We offer Digital Innovation, Application Development, and Staff Augmentation services to clients in the USA. We wanted to help not only our clients but also to provide a good home for developers, designers, and QA team members to work, grow and thrive.
We are a 70+ person team headquartered in Madison, Wisconsin, USA with a larger office in Coimbatore, India. We are also building our teams in Brazil and Eastern Europe. Our company takes pride in genuinely attracting and retaining the best talent. If you want to be part of a high-achieving, knowledgeable, and fun team, then here’s your chance to join a rapidly growing startup that offers an international and diverse work atmosphere. If you are interested in learning more about us, please contact us. You can also talk to our team members to see how they like working with Augment.
Technical/Core skills
- Minimum 3 years of experience with Informatica Big Data Developer (BDM) in a Hadoop environment.
- Knowledge of Informatica PowerExchange (PWX).
- Minimum 3 years of experience with big data querying tools like Hive and Impala.
- Ability to design and develop complex mappings using Informatica Big Data Developer.
- Create and manage Informatica PowerExchange and real-time CDC implementations.
- Strong Unix skills for writing shell scripts and troubleshooting existing scripts.
- Good knowledge of big data platforms and their frameworks.
- Good to have experience with Cloudera Data Platform (CDP).
- Experience building stream-processing systems using Kafka and Spark (see the sketch after this list).
- Excellent SQL knowledge
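As a minimal sketch of the Kafka-plus-Spark stream processing mentioned in the list above: the PySpark job below reads JSON messages from a Kafka topic, parses them, and writes them to Parquet. Broker addresses, topic name, schema, and paths are hypothetical, and the job assumes the spark-sql-kafka connector package is available on the cluster.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

# Hypothetical schema for JSON messages on the topic
schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
])

# Read from Kafka (broker and topic names are made up for illustration)
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "orders")
    .load()
)

# Parse the message value and keep only the parsed fields
orders = raw.select(from_json(col("value").cast("string"), schema).alias("o")).select("o.*")

# Write the parsed stream to Parquet with checkpointing for fault tolerance
query = (
    orders.writeStream.format("parquet")
    .option("path", "/tmp/orders_parquet")
    .option("checkpointLocation", "/tmp/orders_checkpoint")
    .start()
)
query.awaitTermination()
```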
Soft skills:
- Ability to work independently
- Strong analytical and problem-solving skills
- Willingness to learn new technologies
- Regular interaction with vendors, partners and stakeholders
- The Recruitment Lead will be responsible for overall Annalect India recruitment strategy development, implementation, and administration of recruitment programs, and will execute the recruitment strategy to drive business results for internal departments and appropriate stakeholders.
- The Recruitment Lead will possess and exercise exceptional skills in problem solving, strategic thinking, project/process management, coaching, and fostering collaboration, and will monitor and continually reduce the costs of the recruitment process.
Functions and Responsibilities:
40% Recruiting Strategy: Provide day-to-day support on the overall recruiting life cycle and build trusted relationships with hiring managers.
- Overall Recruiting Strategy for Annalect
- Accountable to Annalect senior leadership for planning, developing, executing, and directing hiring strategies, processes, and programs for Annalect.
- Lead sourcing strategies through a variety of sourcing channels across all of Annalect’s recruiting needs.
- Manage and maintain external search agency relationships, contracts, negotiations and effectiveness.
- Develop and manage university and external program relationships to build additional sourcing pipelines.
- Oversight of all Annalect open roles and the health of their pipelines.
- Identifying key industry-related events and coordinating participation.
- Responsible for partnering with Marketing and Design on all recruiting related activities including managing our social media brand related to hiring
- Partnering effectively with the Annalect Global recruiting team, ensuring alignment on practices, and alignment on use of external agencies.
- Hiring Manager Relationships
- Responsible for building an understanding of our hiring needs, conducting a needs analysis and evaluation of job descriptions that best target our talent profiles.
- Responsible for providing counsel to hiring managers on approaches to recruiting and interviewing practices
- Act as a relationship manager between hiring managers and candidates, being responsible towards Annalect and balancing our needs and expectations with that of the candidate.
- Partnering with the Annalect HR team on supporting Annalect Hiring manager needs.
- Educating and enabling managers through the Annalect interview framework
30% Recruiting
- Manage open job postings and candidate pipeline
- Responsible for managing an active and passive pipeline search through a variety of recruiting channels.
- Responsible for ensuring a positive candidate experience throughout the life cycle of the recruitment process.
- Provide support on building out job descriptions
- Gathering candidate feedback in a timely manner
- Coordinating and scheduling candidates in a timely manner being respectful of the hiring managers’ schedules.
- Ensuring alignment of communications through job postings and candidate experience align with our overall Annalect culture and brand.
20% Recruiting Operations
- Reporting and Metrics:
- Ensuring the recruiting report is updated on a weekly basis and providing analysis for any further metrics or reporting needs.
- Leadership-level communication: Managing communications to Leadership and, as appropriate, Managers on the overall health of our recruitment.
- Financials:
- Oversight of the budgets aligned against search agencies, external events, memberships
- Oversight of referral program partnering with Finance, including processing of any referral payments
- Agency Relationships: Responsible for negotiating and managing any external agency relationships. Fiscally responsible for the costs and management of these relationships.
- Responsible for the overall recruiting flow and offer approval flow.
Experience Guidelines:
- Ensure that the HR Policies & Procedures on recruitment are fully complied with throughout the selection process. Strong ability to communicate effectively (verbal and written) at all levels within and outside the organization.
- Solid understanding of on-going Technology trends. Past experience in managing and mentoring a team of at least 4-6 recruiters is a must.
- About 10 years of related experience, of which a minimum of 6 years should be in a similar position/responsibility (HR, Recruitment, Recruitment Manager), preferably in a similar industry, ideally an Analytics or IT domain company.
- Hands-on MS Office skills.
Ways of Working
- Think and act strategically to resolve issues, problem solve, add business value
- Approach work with a sense of urgency
- Strong organizational/planning, project management, and multi-tasking skills required to deliver on time and within budget
- Work independently with minimal supervision to set priorities and demonstrate excellent program/process management skills
- Demonstrate a strong attention to detail
- Demonstrated experience in fostering a collaborative working environment between individuals and teams through coaching & personal example
- Exceptional time and priority management skills
- Use excellent communication skills to collaborate and provide transparency on all things
- Be a collaborative, high-energy, proactive self-starter
• 5+ years’ experience developing and maintaining modern ingestion pipelines using technologies like Spark, Apache NiFi, etc.
• 2+ years’ experience with Healthcare Payors (focusing on Membership, Enrollment, Eligibility, Claims, Clinical)
• Hands-on experience with AWS Cloud and its native components like S3, Athena, Redshift & Jupyter Notebooks (see the sketch after this list)
• Strong in Spark Scala & Python pipelines (ETL & Streaming)
• Strong experience in metadata management tools like AWS Glue
• Strong experience in coding with languages like Java, Python
• Worked on designing ETL & streaming pipelines in Spark Scala / Python
• Good experience in requirements gathering, design & development
• Working with cross-functional teams to meet strategic goals
• Experience in high-volume data environments
• Critical thinking and excellent verbal and written communication skills
• Strong problem-solving and analytical abilities; should be able to work and deliver individually
• Good to have: AWS Developer certification, Scala coding experience, Postman-API, and Apache Airflow or similar scheduler experience
• Nice to have: experience in healthcare messaging standards like HL7, CCDA, EDI, 834, 835, 837
• Good communication skills
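As one small illustration of the AWS-native tooling listed above: the snippet below drives an Athena query over S3-backed data from Python with boto3. The database, table, and result bucket are invented for the example, and valid AWS credentials are assumed.

```python
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")

# Start a query against a hypothetical Glue/Athena database and table
start = athena.start_query_execution(
    QueryString="SELECT member_id, plan_code FROM claims_db.enrollment LIMIT 10",
    QueryExecutionContext={"Database": "claims_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = start["QueryExecutionId"]

# Poll until the query finishes (simple loop; production code would add a timeout)
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    results = athena.get_query_results(QueryExecutionId=query_id)
    for row in results["ResultSet"]["Rows"]:
        print([field.get("VarCharValue") for field in row["Data"]])
```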

