
Experience: 4+ years.
Location: Vadodara & Pune
Skill Set: Snowflake, Power BI, ETL, SQL, Data Pipelines
What you'll be doing:
- Develop, implement, and manage scalable Snowflake data warehouse solutions using advanced features such as materialized views, task automation, and clustering.
- Design and build real-time data pipelines from Kafka and other sources into Snowflake using Kafka Connect, Snowpipe, or custom solutions for streaming data ingestion (see the sketch after this list).
- Create and optimize ETL/ELT workflows using tools like dbt, Airflow, or cloud-native solutions to ensure efficient data processing and transformation.
- Tune query performance, warehouse sizing, and pipeline efficiency using Snowflake's Query Profile, Resource Monitors, and other diagnostic tools.
- Work closely with architects, data analysts, and data scientists to translate complex business requirements into scalable technical solutions.
- Enforce data governance and security standards, including data masking, encryption, and RBAC, to meet organizational compliance requirements.
- Continuously monitor data pipelines, address performance bottlenecks, and troubleshoot issues using monitoring frameworks such as Prometheus, Grafana, or Snowflake-native tools.
- Provide technical leadership, guidance, and code reviews for junior engineers, ensuring best practices in Snowflake and Kafka development are followed.
- Research emerging tools, frameworks, and methodologies in data engineering and integrate relevant technologies into the data stack.
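Below is a minimal sketch of the streaming-ingestion pattern referenced above: a Python consumer that reads JSON events from a Kafka topic and lands them in a Snowflake table. The broker address, topic, table, and credentials are hypothetical placeholders, and the per-message INSERT is for illustration only; in practice this flow is usually handled by Kafka Connect with the Snowflake sink connector or by Snowpipe, with batched loads.

```python
from confluent_kafka import Consumer
import snowflake.connector

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # assumed broker address
    "group.id": "snowflake-loader",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["events"])  # hypothetical topic name

conn = snowflake.connector.connect(
    user="LOADER", password="***", account="my_account",  # placeholders
    warehouse="LOAD_WH", database="RAW", schema="EVENTS",
)
cur = conn.cursor()

try:
    while True:
        msg = consumer.poll(1.0)          # wait up to 1s for a message
        if msg is None or msg.error():
            continue
        payload = msg.value().decode("utf-8")
        # PARSE_JSON is only allowed in the INSERT ... SELECT form.
        cur.execute(
            "INSERT INTO raw_events (payload) SELECT PARSE_JSON(%s)",
            (payload,),
        )
finally:
    cur.close()
    conn.close()
    consumer.close()
```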
What you need:
Basic Skills:
- 3+ years of hands-on experience with Snowflake data platform, including data modeling, performance tuning, and optimization.
- Strong experience with Apache Kafka for stream processing and real-time data integration.
- Proficiency in SQL and ETL/ELT processes.
- Solid understanding of cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with scripting languages like Python, Shell, or similar for automation and data integration tasks.
- Familiarity with tools like dbt, Airflow, or similar orchestration platforms.
- Knowledge of data governance, security, and compliance best practices.
- Strong analytical and problem-solving skills with the ability to troubleshoot complex data issues.
- Ability to work in a collaborative team environment and communicate effectively with cross-functional teams.
Responsibilities:
- Design, develop, and maintain Snowflake data warehouse solutions, leveraging advanced Snowflake features like clustering, partitioning, materialized views, and time travel to optimize performance, scalability, and data reliability.
- Architect and optimize ETL/ELT pipelines using tools such as Apache Airflow, dbt, or custom scripts to ingest, transform, and load data into Snowflake from sources like Apache Kafka and other streaming/batch platforms.
- Work in collaboration with data architects, analysts, and data scientists to gather and translate complex business requirements into robust, scalable technical designs and implementations.
- Design and implement Apache Kafka-based real-time messaging systems to efficiently stream structured and semi-structured data into Snowflake, using Kafka Connect, KSQL, and Snowpipe for real-time ingestion.
- Monitor and resolve performance bottlenecks in queries, pipelines, and warehouse configurations using tools like Query Profile, Resource Monitors, and Task Performance Views.
- Implement automated data validation frameworks to ensure high-quality, reliable data throughout the ingestion and transformation lifecycle (see the sketch after this list).
- Deploy and maintain pipeline monitoring solutions using Prometheus, Grafana, or cloud-native tools, ensuring efficient data flow, scalability, and cost-effective operations.
- Implement and enforce data governance policies, including role-based access control (RBAC), data masking, and auditing to meet compliance standards and safeguard sensitive information.
- Provide hands-on technical mentorship to junior data engineers, ensuring adherence to coding standards, design principles, and best practices in Snowflake, Kafka, and cloud data engineering.
- Stay current with advancements in Snowflake, Kafka, cloud services (AWS, Azure, GCP), and data engineering trends, and proactively apply new tools and methodologies to enhance the data platform.
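As a rough illustration of the automated data validation mentioned above, the sketch below runs a couple of SQL checks against a freshly loaded table and fails loudly if any check does not pass. The table name, column name, and credentials are hypothetical; dedicated frameworks such as dbt tests or Great Expectations cover far more ground.

```python
import snowflake.connector

# Each check: (name, SQL returning a single number, pass condition).
CHECKS = [
    ("row_count", "SELECT COUNT(*) FROM raw_events", lambda n: n > 0),
    ("no_null_ids",
     "SELECT COUNT(*) FROM raw_events WHERE event_id IS NULL",
     lambda n: n == 0),
]

conn = snowflake.connector.connect(
    user="LOADER", password="***", account="my_account",  # placeholders
    warehouse="LOAD_WH", database="RAW", schema="EVENTS",
)
cur = conn.cursor()

failures = []
for name, sql, passes in CHECKS:
    (value,) = cur.execute(sql).fetchone()
    if not passes(value):
        failures.append(f"{name}: got {value}")
conn.close()

if failures:
    raise SystemExit("Validation failed: " + "; ".join(failures))
print("All validation checks passed.")
```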

About Intellikart Ventures LLP
Similar jobs
Job Title: Lead – Application Engineering
Remote | Chennai | Hyderabad | Bangalore
Tiger Analytics is a global AI and analytics consulting firm. With data and technology at the core of our solutions, our 2800+ tribe is solving problems that eventually impact the lives of millions globally. Our culture is modelled around expertise and respect with a team-first mindset. Headquartered in Silicon Valley, you’ll find our delivery centres across the globe and offices in multiple cities across India, the US, the UK, Canada, and Singapore, including a substantial remote global workforce.
We’re Great Place to Work-Certified™. Working at Tiger Analytics, you’ll be at the heart of an AI revolution. You’ll work with teams that push the boundaries of what is possible and build solutions that energize and inspire.
Curious about the role? What would your typical day look like?
As an Application Engineer, you will work with our Application Engineering team on designing and developing web applications.
More specifically, you might:
- Collaborate with business analysts and technical managers to understand functional and non-functional requirements and scope.
- Design and develop multi-tier, cloud-native, high performance, and scalable solutions
- Build world-class, robust solutions by applying benchmark software engineering principles and design patterns.
- Lead cross-functional agile teams across the software development lifecycle.
- Analyse design alternatives using proofs of concept, and engage with architects to choose the optimal solution.
- Obsess about writing high-quality code and performing reviews of the design and code.
- Interact and collaborate with project/program managers to estimate, plan, and address technical concerns at the module or project level.
- Ideate with your peers, supporting their work and providing constructive feedback on their solutions.
What do we expect?
- 4+ years of experience
- Experience in building scalable, reliable, and high-performance web applications.
- A desire to write clean yet simple programs using Java / .NET Core / Python with Django or Flask
- Keen interest in creating web servers using Node.js / Nginx
- A passion to store, organize, and process information by using database technologies - MySQL / Oracle / PostgreSQL / MongoDB
- Experience in creating and designing test cases using JUnit / Selenium
- Exposure to Cloud environments - AWS / Azure / GCP
- Good understanding of API design and development (see the sketch after this list)
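As a small illustration of the web application and API work described above, here is a minimal Flask service (one of the frameworks named in the requirements) exposing a create/read resource. The resource name and fields are hypothetical, and the in-memory dictionary stands in for a real database such as MySQL or PostgreSQL.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)
ORDERS = {}  # in-memory store standing in for a real database


@app.post("/orders")
def create_order():
    # Create a hypothetical "order" resource from the JSON request body.
    body = request.get_json(force=True)
    order_id = len(ORDERS) + 1
    ORDERS[order_id] = {"id": order_id, "item": body.get("item")}
    return jsonify(ORDERS[order_id]), 201


@app.get("/orders/<int:order_id>")
def get_order(order_id):
    # Fetch a single order, returning 404 if it does not exist.
    order = ORDERS.get(order_id)
    if order is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(order)


if __name__ == "__main__":
    app.run(debug=True)  # development server only
```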
You are important to us, let’s stay connected!
Every individual comes with a different set of skills and qualities, so even if you don’t tick all the boxes for the role today, we urge you to apply, as there might be a suitable/unique role for you tomorrow.
We are an equal opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire.
Position Profile
Qlik Sense Developer
Location: Mumbai/Gurgaon
Job Description
Role Summary:
Lead and drive development in the BI domain using the Qlik Sense ecosystem, with deep technical and BI ecosystem knowledge. The resource will be responsible for dashboard design, development, and delivery of BI services using the Qlik Sense ecosystem.
Key functions & responsibilities:
- QlikView/Qlik Sense Data Architect (possibly certified) with extensive knowledge of QlikView and Qlik Sense including best practices for data modelling, application design and development.
- Familiarity with the use of GeoAnalytics, NPrinting, extensions, widgets, mashups, ODAG and various other advanced features used in Qlik Sense development.
- Good working knowledge of Set Analysis.
- Experience working with Qlik Sense sites and the Qlik Management Console, creating rules, and managing the streams, as well as user and application security.
- Knowledge of Active Directory, proxies, load balancers, etc.
- Experience in troubleshooting connectivity, configuration, performance, etc.
- Strong communication and presentation skills.
Candidate’s Profile
Academics:
- Bachelor’s degree, preferably in Computer Science.
- A Master’s degree would be an added advantage.
Experience:
2-6 years of experience in Qlik Sense design and development.
Job Description
1) Ensure effective design, development, and management of the team and their activities.
2) Validation and support activities in line with client needs and architectural requirements, continual knowledge management, and adherence to organisational guidelines and processes.
3) Technical and Professional Requirements:
- Minimum 3-4 years of experience required
- Experience in developing PHP, Vue.js, and Angular applications, with in-depth knowledge of the Laravel/Lumen framework
- In-depth knowledge of MySQL and PostgreSQL databases
- In-depth knowledge of JavaScript
- In-depth knowledge of deployments on AWS using Elastic Beanstalk
- Knowledge of project management tools such as Jira, DevOps, etc.
- Knowledge of payment gateway integration
- Experience in handling large data volumes
- Experience with the Microservices Architecture pattern
- Collaborate with headhunters to hire and retain top talent.
- Screen resumes and job application forms.
- Help structure job descriptions.
- Assist hiring managers in conducting interviews.
- Interview job candidates via calls and conduct on-site interviews.
- Manage new employee relocation: determine new employee requirements and arrange temporary housing.
- Coordinate with management and corporate recruiters to determine the details of staffing requirements.
- Build close relationships with important stakeholders such as experts, key opinion leaders (KOLs), government and regulatory bodies, and medical and other industry associations
- Ensure a strong network with hospitals and other external partners and organizations
- Build a strong network for our services
- Build and maintain strong relationships with regional HCPs and hospitals
- Work with the operations team to develop and support the implementation of strategies to help build a sustainable organization
- Understand business and evolving organizational priorities from an onboarding/collaboration perspective
- Drive business results by supporting business plans, identifying local growth opportunities, and supporting ongoing commercial activities
- Act as the custodian for knowledge-related assets of products or services, and for sales performance, by setting up effective communication with HCPs to effectively disseminate information to the relevant entities
- Ensure the creation and development of a high-performance team (in the coming months)
- Build good relationships with HCPs and handle HCP queries
- Excellent communication skills (written and oral), demonstrated experience in data presentation and PPT creation, strong management, interpersonal, and problem-solving skills; entrepreneurial, a self-motivator, and a team player with an understanding of the big picture.
- Exposure to working in the same domain is a must.
Lead and drive development in the BI domain using the Tableau ecosystem, with deep technical and BI ecosystem knowledge. The resource will be responsible for dashboard design, development, and delivery of BI services using the Tableau ecosystem.
Key functions & responsibilities:
- Communicate and interact with the Project Manager to understand requirements
- Dashboard design, development, and deployment using the Tableau ecosystem
- Ensure delivery within a given time frame while maintaining quality
- Stay up to date with current technology and bring relevant ideas to the table
- Proactively work with the management team to identify and resolve issues
- Perform other related duties as assigned or advised
- He/she should be a leader who sets the standard and expectations through example in his/her conduct, work ethic, integrity, and character
- Contribute to dashboard design, R&D, and project delivery using Tableau
Candidate’s Profile
Academics:
Bachelor’s degree, preferably in Computer Science.
A Master’s degree would be an added advantage.
Experience:
Overall 2-5 years of experience in DWBI development projects, having worked on BI and visualization technologies (Tableau, QlikView) for at least 2 years.
At least 2 years of experience covering the Tableau implementation lifecycle, including hands-on development/programming, managing security, data modelling, data blending, etc.
Technology & Skills:
Hands-on expertise in Tableau administration and maintenance
Strong working knowledge of and development experience with Tableau Server and Tableau Desktop
Strong knowledge of SQL, PL/SQL, and data modelling
Knowledge of databases such as Microsoft SQL Server, Oracle, etc.
Exposure to alternative visualization technologies such as QlikView, Spotfire, Pentaho, etc.
Good communication and analytical skills with excellent creative and conceptual thinking abilities
Superior organizational skills, attention to detail/level of quality, and strong communication skills, both verbal and written
