
Experience: 4+ years.
Location: Vadodara & Pune
Skill Set: Snowflake, Power BI, ETL, SQL, Data Pipelines
What you'll be doing:
- Develop, implement, and manage scalable Snowflake data warehouse solutions using advanced features such as materialized views, task automation, and clustering.
- Design and build real-time data pipelines from Kafka and other sources into Snowflake using Kafka Connect, Snowpipe, or custom solutions for streaming data ingestion.
- Create and optimize ETL/ELT workflows using tools like DBT, Airflow, or cloud-native solutions to ensure efficient data processing and transformation.
- Tune query performance, warehouse sizing, and pipeline efficiency by utilizing Snowflake's Query Profiling, Resource Monitors, and other diagnostic tools.
- Work closely with architects, data analysts, and data scientists to translate complex business requirements into scalable technical solutions.
- Enforce data governance and security standards, including data masking, encryption, and RBAC, to meet organizational compliance requirements.
- Continuously monitor data pipelines, address performance bottlenecks, and troubleshoot issues using monitoring frameworks such as Prometheus, Grafana, or Snowflake-native tools.
- Provide technical leadership, guidance, and code reviews for junior engineers, ensuring best practices in Snowflake and Kafka development are followed.
- Research emerging tools, frameworks, and methodologies in data engineering and integrate relevant technologies into the data stack.
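By way of illustration, the streaming-ingestion work described above typically includes a light transformation step between Kafka and the Snowflake load. The sketch below shows that step in plain Python under assumed conditions: the event shape and field names (`id`, `type`, `user`, `amount`) are hypothetical, not from this posting.

```python
import json

def flatten_event(raw: str) -> dict:
    """Flatten one JSON event into a tabular row ready for a Snowflake load.

    The field names here are illustrative assumptions, not a real schema.
    """
    event = json.loads(raw)
    return {
        "event_id": event["id"],
        "event_type": event.get("type", "unknown"),
        "user_id": event.get("user", {}).get("id"),
        "amount": float(event.get("amount", 0.0)),
    }

def to_rows(messages: list[str]) -> list[dict]:
    """Transform a micro-batch of Kafka messages, skipping malformed ones."""
    rows = []
    for raw in messages:
        try:
            rows.append(flatten_event(raw))
        except (json.JSONDecodeError, KeyError, TypeError, ValueError):
            continue  # in production, route to a dead-letter queue instead
    return rows
```

In a real pipeline the resulting rows would be staged and loaded via Snowpipe or a `COPY INTO` statement rather than returned in memory.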
What you need:
Basic Skills:
- 3+ years of hands-on experience with Snowflake data platform, including data modeling, performance tuning, and optimization.
- Strong experience with Apache Kafka for stream processing and real-time data integration.
- Proficiency in SQL and ETL/ELT processes.
- Solid understanding of cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with scripting languages like Python, Shell, or similar for automation and data integration tasks.
- Familiarity with tools like dbt, Airflow, or similar orchestration platforms.
- Knowledge of data governance, security, and compliance best practices.
- Strong analytical and problem-solving skills with the ability to troubleshoot complex data issues.
- Ability to work in a collaborative team environment and communicate effectively with cross-functional teams
Responsibilities:
- Design, develop, and maintain Snowflake data warehouse solutions, leveraging advanced Snowflake features like clustering, partitioning, materialized views, and time travel to optimize performance, scalability, and data reliability.
- Architect and optimize ETL/ELT pipelines using tools such as Apache Airflow, DBT, or custom scripts, to ingest, transform, and load data into Snowflake from sources like Apache Kafka and other streaming/batch platforms.
- Work in collaboration with data architects, analysts, and data scientists to gather and translate complex business requirements into robust, scalable technical designs and implementations.
- Design and implement Apache Kafka-based real-time messaging systems to efficiently stream structured and semi-structured data into Snowflake, using Kafka Connect, KSQL, and Snowpipe for real-time ingestion.
- Monitor and resolve performance bottlenecks in queries, pipelines, and warehouse configurations using tools like Query Profile, Resource Monitors, and Task Performance Views.
- Implement automated data validation frameworks to ensure high-quality, reliable data throughout the ingestion and transformation lifecycle.
- Pipeline Monitoring and Optimization: Deploy and maintain pipeline monitoring solutions using Prometheus, Grafana, or cloud-native tools, ensuring efficient data flow, scalability, and cost-effective operations.
- Implement and enforce data governance policies, including role-based access control (RBAC), data masking, and auditing to meet compliance standards and safeguard sensitive information.
- Provide hands-on technical mentorship to junior data engineers, ensuring adherence to coding standards, design principles, and best practices in Snowflake, Kafka, and cloud data engineering.
- Stay current with advancements in Snowflake, Kafka, cloud services (AWS, Azure, GCP), and data engineering trends, and proactively apply new tools and methodologies to enhance the data platform.
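The automated data validation mentioned in the responsibilities above can be sketched as a small rule engine: each rule pairs a column with a predicate, and failing rows are quarantined rather than dropped. This is a minimal sketch only; the rule set and column names (`order_id`, `amount`, `currency`) are invented for illustration.

```python
def validate(rows, rules):
    """Apply (column, predicate) rules to each row; split into good and bad."""
    good, bad = [], []
    for row in rows:
        errors = [col for col, check in rules if not check(row.get(col))]
        if errors:
            bad.append({"row": row, "failed": errors})  # quarantine with reasons
        else:
            good.append(row)
    return good, bad

# Illustrative rules: these column names are assumptions, not from the posting.
RULES = [
    ("order_id", lambda v: isinstance(v, int) and v > 0),
    ("amount", lambda v: isinstance(v, (int, float)) and v >= 0),
    ("currency", lambda v: v in {"USD", "EUR", "INR"}),
]
```

Keeping the failure reasons alongside each quarantined row makes it straightforward to report data-quality metrics from the same pass.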

About Intellikart Ventures LLP
Role Purpose: Contribute to developing and optimizing industrial processes and improving current processes to enhance fabrication performance and productivity, applying technical skills and manufacturing best practices.
SKILLS:
1. Analyzing and interpreting skills for data and situations
2. Creative thinking and problem-solving skills
3. Communications skills
KNOWLEDGE:
1. Should have proven work experience in process engineering in the same domain.
2. Should have excellent technical skills to understand drawings and their application.
3. Should have experience in welding process validation and implementation.
4. Should be familiar with health and safety regulations.
5. Should have good analytical and interpersonal skills.
6. Should have knowledge of Lean, welding, and other problem-solving tools and techniques.
DETAILED JOB DESCRIPTION: (HIGHLIGHT CRITICAL ACTIVITIES)
1. Establish critical process parameters and prepare performance reports.
2. Manage and validate all processes and establish corrective actions for any non-conformance.
3. Analyze data and monitor the implementation and quality of all processes.
4. Monitor projects, identify risks and issues, and prepare the related documents.
5. Develop welder competency and improve the fabrication culture toward excellence.
6. Perform risk assessments on new developments.
7. Evaluate current processes and operating procedures and recommend improvements where needed.
8. Analyze and troubleshoot process problems to make continuous, permanent improvements.
9. Initiate improvement projects and complete them on time and within budget.
Role - Digital Marketing Manager
About Company -
Our client is a tech-based platform for freelance photographers, videographers, food stylists, and cinematographers, founded by alumni of IIM Bangalore. They operate in about 100 cities across 28 countries, are a leader in food and real estate photography, and have completed 7,000+ projects worldwide.
Job description -
1. Design, maintain, and supply content for the organization's website
2. Formulate strategies to build lasting digital connection with customers
3. Monitor company presence on social media
4. Launch advertisements to increase brand awareness
5. Plan, monitor & execute social media activities on channels like Facebook, Instagram, Linkedin, Twitter, etc.
6. Run ads on Facebook, Instagram, Google, and other similar channels.
7. Be actively involved in SEO efforts (keyword and image optimisation, etc.)
Minimum experience - 2 years
Position: Marketing Manager
CTC: 7 LPA - 9 LPA
Work Location: HRBR Layout, Bangalore, Karnataka
Key Performance Indicators:
- Marketing Qualified Leads
- Growth in Website Visitors
- RoI on Digital & Offline Marketing Campaigns
- Improve the Performance of Digital Marketing Channels (Email, BPN, WPN, Social Media etc)
- Generate Leads through Event Participation or other Channels
Key Responsibilities:
- Create and manage content for the Company's social media handles, emails, and other forms of digital communication
- Prepare periodic marketing plans (monthly and quarterly)
- Prepare marketing budgets
- Meet marketing revenue targets
- Generate the expected MQLs
- Handle branding and sales promotional campaigns
- Stay up to date on the latest social media trends and implement them in the Company's marketing campaigns
- Prepare reports and analytics on the overall performance of various marketing campaigns, including ROIs and KPIs
Marketing Channels:
- Browser Push Notifications
- Web Push Notification
- Email, SMS, WhatsApp
- Social Media - Facebook, LinkedIn, Twitter, Instagram
- Blog & Other guest posting platforms
- Trade Promotional Events
Skills & Ability:
- Social media savvy, with a passion for staying on top of trends
- Strong sense of creativity, and innovation to create and develop content.
- Strong project management and managerial skills
- Ability to multitask.
Technical Skills:
Knowledge of the following tools is an added advantage:
- SEO and SEM strategy and keyword research
- WordPress blogging
- Facebook, LinkedIn, Instagram business page management
- Canva for image editing
- Browser push notifications software
- Web-push notifications
- Google Docs
- GetSiteControl
Education & Experience:
- MBA (Marketing) is preferred
- Experience: 3 to 5 Years of experience in Digital Marketing
- Location: Bangalore (Work from office only)
Additional Benefits:
- Health Insurance is provided
- Meal allowance is provided
- Performance based incentives
Job Types: Full-time, Permanent
Pay: ₹700,000.00 - ₹900,000.00 per year
Benefits:
- Health insurance
Schedule:
- Day shift
Supplemental Pay:
- Performance bonus
Ability to commute/relocate:
- Bangalore, Karnataka: Reliably commute or planning to relocate before starting work (Required)
Education:
- Master's (Preferred)
Experience:
- total work: 1 year (Preferred)
- Digital marketing: 3 years (Required)
Your Roles and Responsibilities:
Business Development is a critical aspect of our platform business.
1. Actively seeking out new sales opportunities through cold calling, networking, and social media.
2. Calling 65-70 leads every day
3. Setting up meetings with potential clients (parents)
4. Generating Trial Classes - Pitch Parents to take PlanetSpark Trial Classes
5. Negotiate/close deals and handle complaints or objections
6. Follow and achieve the department’s sales goals on a monthly, quarterly, and yearly basis (3L revenue per month)
7. “Go the extra mile” to drive sales
What are we looking for?
1. Proficiency in English
2. Thorough understanding of marketing and negotiating techniques
3. Fast learner and passion for sales
4. Self-motivated with a results-driven approach
5. Proven experience in sales or relevant roles is a plus
6. A friendly and energetic personality with a customer service focus
Criteria
1. Willing to work 5 days a week in a fast-paced startup
2. Ready to work from Office and join immediately.
Mission Statement
1. During your 1st month of training, we will actively guide you through the process using Training Decks, Live Experiences, and Re-Training Programs that aim to give you a well-rounded learning experience, along with a paid stipend throughout the journey.
2. In this 1 month we create career-oriented Business Development Trainees (BDTs) whose skills become unmatched across the sector.
3. You are required to use the right sales strategy and achieve INR 1 lakh of revenue during the training period, thereby earning a License to Sell (L-2-S).
4. You will be entitled to a training stipend of INR 21,428. In-office attendance of 90% is mandatory during the 1-month training.
5. After achieving INR 1 lakh in revenue, you will be entitled to the CTC below:
India Shift – INR 6.6 LPA (4.23 LPA Fixed + 2.4 LPA Variable)
US/ Canada Shift - INR 7.10 LPA (4.8 Fixed + 2.3 LPA Variable)
Skill Required: Excellent Communication
Experience Required: 0-1 year (freshers also welcome)
Minimum Education: Graduation
2018, 2019, 2020, 2021, 2022.
We are looking for an experienced Sr. DevOps Consultant Engineer to join our team. The ideal candidate should have at least 5 years of experience.
We are retained by a promising startup located in Silicon Valley, backed by a Fortune 50 firm, with veterans from firms such as Zscaler, Salesforce, and Oracle. The founding team has been part of three unicorns and two successful IPOs and is well funded by Dell Technologies and Westwave Capital. The company is widely recognized as an industry innovator in the data privacy and security space, built by proven cybersecurity executives who have successfully built and scaled high-growth security companies and led privacy programs as executives.
Responsibilities:
- Develop and maintain infrastructure as code using tools like Terraform, CloudFormation, and Ansible
- Manage and maintain Kubernetes clusters on EKS and EC2 instances
- Implement and maintain automated CI/CD pipelines for microservices
- Optimize AWS costs by identifying cost-saving opportunities and implementing cost-effective solutions
- Implement best security practices for microservices, including vulnerability assessments, SOC2 compliance, and network security
- Monitor the performance and availability of our cloud infrastructure using observability tools such as Prometheus, Grafana, and Elasticsearch
- Implement backup and disaster recovery solutions for our microservices and databases
- Stay up to date with the latest AWS services and technologies and provide recommendations for improving our cloud infrastructure
- Collaborate with cross-functional teams, including developers, and product managers, to ensure the smooth operation of our cloud infrastructure
- Experience with large scale system design and scaling services is highly desirable
Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field
- At least 5 years of experience in AWS DevOps and infrastructure engineering
- Expertise in Kubernetes management, Docker, EKS, EC2, Queues, Python Threads, Celery Optimization, Load balancers, AWS cost optimizations, Elasticsearch, Container management, and observability best practices
- Experience with SOC2 compliance and vulnerability assessment best practices for microservices
- Familiarity with AWS services such as S3, RDS, Lambda, and CloudFront
- Strong scripting skills in languages like Python, Bash, and Go
- Excellent communication skills and the ability to work in a collaborative team environment
- Experience with agile development methodologies and DevOps practices
- AWS certification (e.g. AWS Certified DevOps Engineer, AWS Certified Solutions Architect) is a plus.
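Since the requirements above mention Python threads alongside observability, here is a minimal sketch of one common pattern in that space: fanning a health check out over many endpoints with bounded concurrency via a thread pool. The `check_endpoint` logic and the endpoint names are hypothetical stand-ins, not part of this role's actual stack.

```python
from concurrent.futures import ThreadPoolExecutor

def check_endpoint(endpoint: str) -> tuple[str, bool]:
    """Hypothetical health probe; a real check would make an HTTP or AWS API call."""
    return endpoint, not endpoint.endswith("-down")

def run_checks(endpoints: list[str], workers: int = 8) -> dict[str, bool]:
    """Probe all endpoints concurrently with a bounded thread pool."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(check_endpoint, endpoints))
```

Threads suit this kind of I/O-bound automation; CPU-bound work would call for processes or a task queue such as Celery instead.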
Notice Period: Can join within a month
Looking for a Data Engineer for our own organization.
Notice Period: 15-30 days
CTC: Up to 15 LPA
Preferred Technical Expertise
- Expertise in Python programming.
- Proficient in Pandas/Numpy Libraries.
- Experience with Django framework and API Development.
- Proficient in writing complex queries using SQL
- Hands on experience with Apache Airflow.
- Experience with source code versioning tools such as Git, Bitbucket, etc.
Good to have Skills:
- Create and maintain Optimal Data Pipeline Architecture
- Experienced in handling large structured data.
- Demonstrated ability in solutions covering data ingestion, data cleansing, ETL, Data mart creation and exposing data for consumers.
- Experience with any cloud platform (GCP is a plus)
- Experience with jQuery, HTML, JavaScript, CSS is a plus.
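To illustrate the "complex queries using SQL" skill listed above, here is a small self-contained sketch using Python's standard-library sqlite3 module: a join plus aggregation over an in-memory database. The customers/orders schema is invented purely for illustration.

```python
import sqlite3

def top_spenders(customers, orders):
    """Join two tables and rank customers by total order amount (SQL sketch)."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
    conn.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO customers VALUES (?, ?)", customers)
    conn.executemany("INSERT INTO orders VALUES (?, ?)", orders)
    query = """
        SELECT c.name, SUM(o.amount) AS total
        FROM customers c
        JOIN orders o ON o.customer_id = c.id
        GROUP BY c.id, c.name
        ORDER BY total DESC
    """
    result = conn.execute(query).fetchall()
    conn.close()
    return result
```

The same query shape (join, group, aggregate, order) carries over directly to Snowflake or any other warehouse SQL dialect.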


