
* Formulates and recommends standards for achieving maximum performance
and efficiency of the DW ecosystem.
* Participates in pre-sales activities, proposing solutions for a variety of customer
problem statements and situations.
* Develops business cases and ROI analyses for customers/clients.
* Interviews stakeholders and develops a BI roadmap for success, given project
prioritization.
* Evangelizes self-service BI and visual discovery while helping to automate
manual processes at the client site.
* Works closely with the Engineering Manager to ensure prioritization of
customer deliverables.
* Champions data quality, integrity, and reliability throughout the organization by
designing and promoting best practices.
* Implementation (20%)
* Helps DW/DE team members with issues that require deep technical expertise or
knowledge of complex systems and programming.
* Provides on-the-job training for new or less experienced team members.
* Develops a technical excellence team.
Requirements
- Experience designing business intelligence solutions
- Experience with ETL processes and data warehouse architecture
- Experience with Azure data services, e.g., ADF, ADLS Gen2, Azure SQL DB,
Synapse, Azure Databricks, and Power BI
- Good analytical and problem-solving skills
- Fluent in relational database concepts and flat file processing concepts
- Must be knowledgeable in software development lifecycles and methodologies

Similar jobs
- Position: Appian Tech Lead
- Job Description:
- Extensive experience in Appian BPM application development
- Knowledge of Appian architecture and best practices for its objects
- Participate in analysis, design, and new development of Appian-based applications
- Team leadership is mandatory: provide technical leadership to Scrum teams
- Appian certification mandatory: L1, L2, or L3
- Must be able to multi-task, work in a fast-paced environment, and resolve problems faced by the team
- Build applications: interfaces, process flows, expressions, data types, sites, and integrations
- Proficient with SQL queries and with accessing data present in DB tables and views
- Experience in analysis and in designing process models, Records, Reports, SAIL forms, gateways, smart services, integration services, and web services
- Experience working with different Appian object types, query rules, constant rules, and expression rules
Qualifications
- At least 6 years of experience implementing BPM solutions using Appian 19.x or higher
- Over 8 years of experience implementing IT solutions using BPM or integration technologies
- Experience in Scrum/Agile methodologies on enterprise-level application development projects
- Good understanding of database concepts and strong working knowledge of at least one major database, e.g., Oracle, SQL Server, or MySQL
- Appian BPM application development on version 19.x or higher
- Experience with integrations using web services and related technologies, e.g., XML, REST, WSDL, SOAP APIs, JDBC, JMS
- Good leadership skills and the ability to lead a team of software engineers technically
- Experience working in Agile Scrum teams
- Good communication skills
- We are looking for a strong backend developer with good experience in AWS.
- Should be able to write solid and clean code.
- Should be good with algorithms and architecture.
Job Title: Data Analytics Engineer
Experience: 3 to 6 years
Location: Gurgaon (Hybrid)
Employment Type: Full-time
Job Description:
We are seeking a highly skilled Data Analytics Engineer with expertise in Qlik Replicate, Qlik Compose, and Data Warehousing to build and maintain robust data pipelines. The ideal candidate will have hands-on experience with Change Data Capture (CDC) pipelines from various sources, an understanding of Bronze, Silver, and Gold data layers, SQL querying for data warehouses like Amazon Redshift, and experience with Data Lakes using S3. A foundational understanding of Apache Parquet and Python is also desirable.
Key Responsibilities:
1. Data Pipeline Development & Maintenance
- Design, develop, and maintain ETL/ELT pipelines using Qlik Replicate and Qlik Compose.
- Ensure seamless data replication and transformation across multiple systems.
- Implement and optimize CDC-based data pipelines from various source systems.
2. Data Layering & Warehouse Management
- Implement Bronze, Silver, and Gold layer architectures to optimize data workflows.
- Design and manage data pipelines for structured and unstructured data.
- Ensure data integrity and quality within Redshift and other analytical data stores.
3. Database Management & SQL Development
- Write, optimize, and troubleshoot complex SQL queries for data warehouses like Redshift.
- Design and implement data models that support business intelligence and analytics use cases.
4. Data Lakes & Storage Optimization
- Work with AWS S3-based Data Lakes to store and manage large-scale datasets.
- Optimize data ingestion and retrieval using Apache Parquet.
5. Data Integration & Automation
- Integrate diverse data sources into a centralized analytics platform.
- Automate workflows to improve efficiency and reduce manual effort.
- Leverage Python for scripting, automation, and data manipulation where necessary.
6. Performance Optimization & Monitoring
- Monitor data pipelines for failures and implement recovery strategies.
- Optimize data flows for better performance, scalability, and cost-effectiveness.
- Troubleshoot and resolve ETL and data replication issues proactively.
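The CDC responsibilities above are normally handled by Qlik Replicate itself; as a rough illustration of the apply step, here is a minimal pure-Python sketch. The event format, keys, and column names are hypothetical, not Qlik's actual wire format:

```python
# Minimal sketch of applying Change Data Capture (CDC) events to a target
# table (a dict keyed by primary key). The event schema is illustrative only.

def apply_cdc_events(target: dict, events: list) -> dict:
    """Apply insert/update/delete change events in order."""
    for event in events:
        op, key, row = event["op"], event["key"], event.get("row")
        if op == "insert":
            target[key] = row
        elif op == "update":
            # Merge changed columns over the existing row
            target[key] = {**target.get(key, {}), **row}
        elif op == "delete":
            target.pop(key, None)
    return target

bronze = {}  # raw landing layer, kept in sync with the source via CDC
events = [
    {"op": "insert", "key": 1, "row": {"name": "Asha", "city": "Gurgaon"}},
    {"op": "update", "key": 1, "row": {"city": "Delhi"}},
    {"op": "insert", "key": 2, "row": {"name": "Ravi", "city": "Pune"}},
    {"op": "delete", "key": 2},
]
apply_cdc_events(bronze, events)
print(bronze)  # {1: {'name': 'Asha', 'city': 'Delhi'}}
```

Downstream, the Silver and Gold layers would be derived from this Bronze state by cleansing and aggregation transforms.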
Technical Expertise Required:
- 3 to 6 years of experience in Data Engineering, ETL Development, or related roles.
- Hands-on experience with Qlik Replicate & Qlik Compose for data integration.
- Strong SQL expertise, with experience in writing and optimizing queries for Redshift.
- Experience working with Bronze, Silver, and Gold layer architectures.
- Knowledge of Change Data Capture (CDC) pipelines from multiple sources.
- Experience working with AWS S3 Data Lakes.
- Experience working with Apache Parquet for data storage optimization.
- Basic understanding of Python for automation and data processing.
- Experience in cloud-based data architectures (AWS, Azure, GCP) is a plus.
- Strong analytical and problem-solving skills.
- Ability to work in a fast-paced, agile environment.
Preferred Qualifications:
- Experience in performance tuning and cost optimization in Redshift.
- Familiarity with big data technologies such as Spark or Hadoop.
- Understanding of data governance and security best practices.
- Exposure to data visualization tools such as Qlik Sense, Tableau, or Power BI.
- Identifying new sales leads via direct customer engagement.
- Qualifying pre-generated leads over calls, email, and software tools.
- Generating own leads/references from new customers at the time of acquisition.
- Achieving sales targets and implementing sales promotional strategies for the assigned region.
- Speaking with business owners, CEOs, CXOs, and decision makers, and explaining the product.
- Understanding the requirements of B2B customers and demonstrating the services along with the value proposition.
- Establishing and developing a network of customers.
- Coordinating with the sales team.
- Passion, desire, and motivation to sell and close new business.
- Strong relationship building and management skills.
- Understanding of Web, Mobile Apps, IoT, Blockchain, and Game technologies.
Requirements
- Excellent verbal and written English.
- Excellent knowledge of MS Office.
- Aggressive attitude, driven by sales numbers.
- Thorough understanding of sales and marketing, including negotiating techniques.
- Fast learner with a passion for sales.
- Self-motivated with a results-driven approach.
- Aptitude for delivering attractive presentations.
- Qualification: BE/B.Tech (Computer Science) with MBA (Sales)
Node.js Developer Responsibilities:
- Developing and maintaining all server-side network components.
- Ensuring optimal performance of the central database and responsiveness to front-end requests.
- Collaborating with front-end developers on the integration of elements.
- Designing customer-facing UI and back-end services for various business processes.
- Developing high-performance applications by writing testable, reusable, and efficient code.
- Implementing effective security protocols, data protection measures, and storage solutions.
- Running diagnostic tests, repairing defects, and providing technical support.
- Documenting Node.js processes, including database schemas, as well as preparing reports.
- Recommending and implementing improvements to processes and technologies.
- Keeping informed of advancements in the field of Node.js development.
Node.js Developer Requirements:
- Bachelor's degree in computer science, information science, or similar.
- At least two years' experience as a Node.js developer.
- Extensive knowledge of JavaScript, web stacks, libraries, and frameworks.
- Knowledge of front-end technologies such as HTML5 and CSS3.
- Superb interpersonal, communication, and collaboration skills.
- Exceptional analytical and problem-solving aptitude.
- Great organizational and time management skills.
- Availability to resolve urgent web application issues outside of business hours.
Founded in 2019 by VIT alumni, this B2B marketplace works on the bottom line of hotels, helping them boost their profits by at least 20%. It has already empowered over 1,000 hotels, including ITC Hotels, Radisson, and Taj.
This platform is disrupting the hospitality sector by leveraging the power of omnichannel procurement.
- Generating and Managing Leads in the Hospitality Industry
- Training and monitoring the Sales team’s performance
- Designing and formulating strategies to achieve maximum Sales
- Negotiating deals and supporting negotiation from the team
- Achieving the monthly sales targets by enabling the team.
- Preparing and submitting sales reports to senior management.
What you need to have:
- Any graduate degree
- Proven experience in Sales and management of a Sales team
- Experience working in a startup and building a team from scratch.
- Strong leadership and communication skills
- A knack for strategizing to drive actual sales and maximize revenue

Job Summary
SQL development for our Enterprise Resource Planning (ERP) product offered to SMEs. Regular modification, creation, and validation with testing of stored procedures, views, and functions on MS SQL Server.
Responsibilities and Duties
Understanding the ERP Software and use cases.
Regular creation, modification, and testing of:
- Stored Procedures
- Views
- Functions
- Nested Queries
- Table and Schema Designs
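As a sketch of the create-and-validate workflow above: the role targets MS SQL Server, but SQLite (from the Python standard library) stands in here to keep the example self-contained. The `orders` table, the view, and the `with_tax` function are all illustrative, not part of the actual ERP schema:

```python
# Hedged sketch: create a view and a scalar function, then validate them
# with a nested query. SQLite stands in for MS SQL Server, where the
# equivalents would be CREATE VIEW and CREATE FUNCTION (T-SQL).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL, status TEXT)")
conn.executemany("INSERT INTO orders (amount, status) VALUES (?, ?)",
                 [(100.0, "paid"), (250.0, "paid"), (75.0, "open")])

# A view encapsulating business logic, as it would in the ERP schema
conn.execute("CREATE VIEW paid_orders AS "
             "SELECT id, amount FROM orders WHERE status = 'paid'")

# A scalar function registered from Python (SQL Server would define this in T-SQL)
conn.create_function("with_tax", 1, lambda amount: round(amount * 1.18, 2))

# A nested query validating the view: total taxed revenue over paid orders
total = conn.execute(
    "SELECT SUM(t) FROM (SELECT with_tax(amount) AS t FROM paid_orders)"
).fetchone()[0]
print(total)  # 413.0
```

The same pattern (create the object, then assert on a query against it) extends naturally to testing stored procedures.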
Qualifications and Skills
MS SQL
- Procedural Language
- Datatypes
- Objects
- Databases
- Schema

- 6+ years of relevant experience in DB2 LUW Administration
- Good experience with:
- Performance tuning and troubleshooting
- High availability solutions: HACMP, TSA, MSCS Cluster
- Monitoring, backup, and recovery: IBM Data Server Manager, TSM, Commvault
- Data replication: Q Replication and CDC
- Implementing key DB2 features
- Desirable: experience with Db2 pureScale
- Experience with tools such as ITM, Nagios, and ServiceNow
- Experience with automation and scripting, e.g., cron, PowerShell, shell scripting
- Experience configuring and using clustering, db2diag and notification logs, and snapshot and event monitors
- Experience with use of problem and change management tools


Qualifications
- Bachelor’s/Master’s degree in Engineering with 3+ years of experience.
- Strong knowledge of HTML5, CSS, SASS/Bootstrap, OOP JavaScript, jQuery, and AJAX is preferred.
- Strong knowledge of developing responsive web apps, designing models (JSON), and server communication (REST APIs) is preferred.
- Strong knowledge of modern frontend frameworks and libraries such as Angular and React + Redux is preferred.
- Strong knowledge of HTML templates and modern templating technologies is preferred.
- Strong knowledge of modern libraries like React is preferred.
- Knowledge of Node.js and NPM packages is a huge plus.
- Knowledge of Gulp and Bower is a plus.
- Knowledge of Express.js is desirable.
- Knowledge of Docker is desirable.
- Basic knowledge of relational DBMS and SQL.

