
Experience: 4+ years.
Location: Vadodara & Pune
Skill Set: Snowflake, Power BI, ETL, SQL, Data Pipelines
What you'll be doing:
- Develop, implement, and manage scalable Snowflake data warehouse solutions using advanced features such as materialized views, task automation, and clustering.
- Design and build real-time data pipelines from Kafka and other sources into Snowflake using Kafka Connect, Snowpipe, or custom solutions for streaming data ingestion (a minimal sketch follows this list).
- Create and optimize ETL/ELT workflows using tools like dbt, Airflow, or cloud-native solutions to ensure efficient data processing and transformation.
- Tune query performance, warehouse sizing, and pipeline efficiency using Snowflake's Query Profiling, Resource Monitors, and other diagnostic tools.
- Work closely with architects, data analysts, and data scientists to translate complex business requirements into scalable technical solutions.
- Enforce data governance and security standards, including data masking, encryption, and RBAC, to meet organizational compliance requirements.
- Continuously monitor data pipelines, address performance bottlenecks, and troubleshoot issues using monitoring frameworks such as Prometheus, Grafana, or Snowflake-native tools.
- Provide technical leadership, guidance, and code reviews for junior engineers, ensuring best practices in Snowflake and Kafka development are followed.
- Research emerging tools, frameworks, and methodologies in data engineering and integrate relevant technologies into the data stack.
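As an illustration of the Kafka-to-Snowflake ingestion mentioned above, here is a minimal Snowpipe-style sketch using the snowflake-connector-python package; the stage (raw_stage), table, pipe, and credential values are all placeholders, not details from this posting:

```python
# Minimal sketch of Snowpipe-style ingestion set up from Python, assuming the
# snowflake-connector-python package and a pre-existing external stage named
# raw_stage. The stage, table, pipe, and credential values are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder credentials
    user="etl_user",
    password="...",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)
cur = conn.cursor()

# Landing table for semi-structured Kafka events.
cur.execute("""
    CREATE TABLE IF NOT EXISTS kafka_events (
        record VARIANT,
        loaded_at TIMESTAMP_LTZ DEFAULT CURRENT_TIMESTAMP()
    )
""")

# A pipe with AUTO_INGEST = TRUE lets Snowflake load new files from the stage
# as cloud-storage notifications arrive, which is the usual Snowpipe pattern.
cur.execute("""
    CREATE PIPE IF NOT EXISTS kafka_events_pipe AUTO_INGEST = TRUE AS
    COPY INTO kafka_events (record)
    FROM (SELECT $1 FROM @raw_stage)
    FILE_FORMAT = (TYPE = 'JSON')
""")

cur.close()
conn.close()
```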
What you need:
Basic Skills:
- 3+ years of hands-on experience with the Snowflake data platform, including data modeling, performance tuning, and optimization.
- Strong experience with Apache Kafka for stream processing and real-time data integration.
- Proficiency in SQL and ETL/ELT processes.
- Solid understanding of cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with scripting languages like Python, Shell, or similar for automation and data integration tasks.
- Familiarity with tools like dbt, Airflow, or similar orchestration platforms.
- Knowledge of data governance, security, and compliance best practices.
- Strong analytical and problem-solving skills with the ability to troubleshoot complex data issues.
- Ability to work in a collaborative team environment and communicate effectively with cross-functional teams.
Responsibilities:
- Design, develop, and maintain Snowflake data warehouse solutions, leveraging advanced Snowflake features like clustering, partitioning, materialized views, and time travel to optimize performance, scalability, and data reliability.
- Architect and optimize ETL/ELT pipelines using tools such as Apache Airflow, dbt, or custom scripts to ingest, transform, and load data into Snowflake from sources like Apache Kafka and other streaming/batch platforms.
- Work in collaboration with data architects, analysts, and data scientists to gather and translate complex business requirements into robust, scalable technical designs and implementations.
- Design and implement Apache Kafka-based real-time messaging systems to efficiently stream structured and semi-structured data into Snowflake, using Kafka Connect, KSQL, and Snowpipe for real-time ingestion.
- Monitor and resolve performance bottlenecks in queries, pipelines, and warehouse configurations using tools like Query Profile, Resource Monitors, and Task Performance Views.
- Implement automated data validation frameworks to ensure high-quality, reliable data throughout the ingestion and transformation lifecycle (see the validation sketch after this list).
- Deploy and maintain pipeline monitoring solutions using Prometheus, Grafana, or cloud-native tools, ensuring efficient data flow, scalability, and cost-effective operations.
- Implement and enforce data governance policies, including role-based access control (RBAC), data masking, and auditing to meet compliance standards and safeguard sensitive information.
- Provide hands-on technical mentorship to junior data engineers, ensuring adherence to coding standards, design principles, and best practices in Snowflake, Kafka, and cloud data engineering.
- Stay current with advancements in Snowflake, Kafka, cloud services (AWS, Azure, GCP), and data engineering trends, and proactively apply new tools and methodologies to enhance the data platform.
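For the automated data validation responsibility, a minimal sketch of a post-load check might look like the following; it reuses the hypothetical kafka_events table from the earlier sketch, and the row-count threshold and order_id field are invented for illustration:

```python
# Illustrative post-load validation sketch. It reuses the hypothetical
# kafka_events table from the earlier sketch; the row-count threshold and
# the order_id field are invented for illustration.
import snowflake.connector

EXPECTED_MIN_ROWS = 1000  # hypothetical freshness threshold

def validate_load(conn) -> None:
    """Fail the pipeline run if the last hour's load looks wrong."""
    cur = conn.cursor()
    total, missing_ids = cur.execute(
        "SELECT COUNT(*), COUNT_IF(record:order_id IS NULL) "
        "FROM kafka_events "
        "WHERE loaded_at >= DATEADD('hour', -1, CURRENT_TIMESTAMP())"
    ).fetchone()
    if total < EXPECTED_MIN_ROWS:
        raise ValueError(f"Load too small: only {total} rows in the last hour")
    if missing_ids:
        raise ValueError(f"{missing_ids} rows arrived without an order_id")
```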

About Intellikart Ventures LLP
Similar jobs
We are looking for an experienced professional to lead our lead generation and PPC strategy. The role focuses on driving high-quality leads, managing ad campaigns, and ensuring strong ROI. The ideal candidate will have leadership skills, hands-on PPC expertise, and a proven record of improving lead conversions.
Responsibilities
- Create and manage lead generation strategy to meet business targets.
- Plan and run PPC campaigns on Google Ads, LinkedIn, Meta, and Display networks.
- Optimize ad budgets to lower cost per lead (CPL) and improve ROI.
- Guide and mentor a team of lead generation and PPC executives.
- Work with sales teams to improve lead-to-customer conversion.
- Track and report KPIs like leads, CPL, CTR, CAC, and ROI.
Skills Needed
- 8–12 years of experience in lead generation and PPC.
- Strong knowledge of Google Ads, LinkedIn Ads, Meta Ads, and YouTube.
- Experience with CRM and automation tools (HubSpot, Salesforce, Zoho).
- Ability to analyze data and make decisions to improve performance.
- Team management and leadership skills.
- Edtech experience is mandatory.
Benefits
- Health insurance
- Life insurance
- Provident Fund
• Prepare Balance Sheet Reconciliations
• Reconcile intercompany activity
• Perform Trial Balance Variance analysis
• Perform Accounts Payable Journal Review (GL Account, Cost Center, VAT & Withholding)
• Provide support for local audits
• Prepare monthly FP&A reports
• Manage local compliance, including VAT, Withholding, Quarterly Income Tax Return, and Audited Financial Statements
• Ad-hoc tasks as necessary
• Aid in project development
Enterprise Minds, with a core focus on engineering products, automation, and intelligence, partners with customers on a trajectory towards increased outcomes, relevance, and growth.
Harnessing the power of data and the forces that define AI, machine learning, and data science, we believe in institutionalizing go-to-market models, not just exploring possibilities.
We believe in a customer-centric ethic without and a people-centric paradigm within. With a strong sense of community, ownership, and collaboration, our people work in a spirit of co-creation, co-innovation, and co-development to engineer next-generation software products with the help of accelerators.
Through communities, we connect and attract talent that shares skills and expertise. Through innovation labs and global design studios, we deliver creative solutions.
We create vertically isolated pods with a narrow but deep focus, as well as horizontal pods that collaborate to deliver sustainable outcomes.
We follow Agile methodologies to fail fast and deliver scalable, modular solutions, and we constantly self-assess and realign to work with each customer in the most impactful manner.
Pre-requisites for the Role
- Job ID-EMBD0120PS
- Primary skill: GCP DATA ENGINEER, BIGQUERY, ETL
- Secondary skill: HADOOP, PYTHON, SPARK
- Years of Experience: 5-8 years
- Location: Remote
Budget: Open
Notice period: Immediate
GCP DATA ENGINEER
Position description
- Designing and implementing software systems
- Creating systems for collecting data and for processing that data
- Using Extract, Transform, Load operations (the ETL process; see the sketch after this list)
- Creating data architectures that meet the requirements of the business
- Researching new methods of obtaining valuable data and improving its quality
- Creating structured data solutions using various programming languages and tools
- Mining data from multiple areas to construct efficient business models
- Collaborating with data analysts, data scientists, and other teams.
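As a hedged sketch of such an ETL step on GCP, the following uses the google-cloud-bigquery client library; the bucket, project, dataset, and table names are placeholders, not details from this posting:

```python
# Hedged sketch of a small extract-load-transform step on GCP using the
# google-cloud-bigquery client library. The bucket, project, dataset, and
# table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Extract + load: pull raw CSV files from Cloud Storage into a staging table.
load_job = client.load_table_from_uri(
    "gs://my-bucket/raw/orders_*.csv",    # hypothetical source files
    "my_project.staging.orders_raw",      # hypothetical staging table
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
    ),
)
load_job.result()  # block until the load finishes

# Transform: materialize a cleaned table with standard SQL.
client.query(
    """
    CREATE OR REPLACE TABLE my_project.analytics.orders AS
    SELECT CAST(order_id AS INT64) AS order_id,
           LOWER(TRIM(customer_email)) AS customer_email,
           order_total
    FROM my_project.staging.orders_raw
    WHERE order_id IS NOT NULL
    """
).result()
```

Staging raw data first and transforming it in SQL keeps each step re-runnable, which is one common way to structure BigQuery ETL.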
Candidate profile
- Bachelor’s or master’s degree in information systems/engineering, computer science and management, or a related field.
- 5-8 years of professional experience as a Big Data Engineer.
- Proficiency in modelling and maintaining Data Lakes, preferably with PySpark.
- Experience with Big Data technologies (e.g., Databricks).
- Ability to model and optimize workflows on GCP.
- Experience with Streaming Analytics services (e.g., Kafka, Grafana).
- Analytical, innovative, and solution-oriented mindset.
- Teamwork, strong communication, and interpersonal skills.
- Rigor and organizational skills.
- Fluency in English (spoken and written).
Card91 is simplifying business payments by providing a plug-and-play issuance infrastructure to businesses for domestic and cross-border payments. Our full-stack platform enables businesses to gain control and visibility into their payment flows, covering the entire payment lifecycle from onboarding to issuance, transaction, and data reconciliation. The platform aims to provide holistic solutions around payment issuance and distribution management. Our deep focus on payments will help us bring new issuance formats as well as technologies to businesses.
The company was founded by seasoned technology entrepreneurs who previously founded successful companies such as Myntra and Mastiff Technologies. Headquartered in Bangalore, with a presence in Mumbai and NCR, Card91 aims to disrupt the payment infrastructure space in the issuer-processor segment.
Position: Python Developer
We are building our core team, which will be responsible for a highly scalable, always-available, microservices-based backend for payment processing. The developer will be involved with the design, implementation, and execution from day one and will build a robust, secure, and scalable payment processing engine.
Roles & Responsibilities
● Individual contributor taking full ownership of the microservices
● Design and implement the microservices from scratch
● Be creative and always evaluate new strategies to execute faster
● Team player, always eager to teach new team members about the code, structure, and design
● Not afraid of bringing in new designs (even new languages) that will make the system execute faster
Experience and Qualifications
● 2-5 years of experience
● Degree from a premier institute like IIT/NIT/BITS is desirable
● Strong expertise in a Python framework (such as Django, Flask, or Pyramid)
● Experience in creating APIs with design standards, patterns, and best practices (a minimal sketch follows this list)
● Knowledge of object-relational mapping (ORM)
● GoLang experience is a big plus
● Experience building RESTful web services
● Knowledge of and hands-on experience with API security standards and implementation (OAuth, OpenID)
● Experience in developing highly scalable and reliable web applications, including integration with internal and external systems
● Experience with databases like MySQL and NoSQL databases like Cassandra
● Experience with cloud computing (AWS) and microservices architecture
● Exposure to front-end technologies like VueJS, ReactJS, etc. is an advantage
● Docker & Kubernetes experience is a big plus
● Previous experience at a fintech company is an added advantage
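By way of a minimal sketch, assuming Flask 2+ as the framework, a payments-style REST endpoint might look like the following; the /v1/cards resource, its fields, and the in-memory store are illustrative only and are not Card91's actual API:

```python
# Minimal Flask 2+ sketch of a payments-style REST endpoint. The /v1/cards
# resource, its fields, and the in-memory store are illustrative only.
from uuid import uuid4

from flask import Flask, jsonify, request

app = Flask(__name__)
CARDS = {}  # in-memory stand-in for a real datastore

@app.post("/v1/cards")
def issue_card():
    """Issue a new card and return its representation with a 201 status."""
    payload = request.get_json(force=True)
    card_id = str(uuid4())
    CARDS[card_id] = {
        "id": card_id,
        "holder": payload.get("holder"),
        "status": "ACTIVE",
    }
    return jsonify(CARDS[card_id]), 201

@app.get("/v1/cards/<card_id>")
def get_card(card_id):
    """Fetch a card by id, returning 404 if it does not exist."""
    card = CARDS.get(card_id)
    if card is None:
        return jsonify({"error": "card not found"}), 404
    return jsonify(card)

if __name__ == "__main__":
    app.run(port=8000)
```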
What’s on offer
● Vibrant, fun and rewarding culture that nurtures and promotes excellence
● Opportunities to learn and interact with payment industry leaders
● Competitive remuneration, group health insurance & PF
● Other office perks of being part of an early-stage startup
Location: HSR Layout, Bangalore
Responsibilities
- Convert shared custom designs into Shopify websites
- Expert-level knowledge of Shopify Liquid templating language to edit the website
- Add new sections, filters, products, etc. (see the API sketch after this list)
- Integrating third-party and platform-supported apps into the websites
- Analyze the requirements and provide solutions
- Work on issue debugging and troubleshooting
- Use Shopify Liquid, jQuery, CSS, and HTML to deliver interfaces as required
- Use Shopify JS APIs (storefront, AJAX Cart, Sections, etc.) to deliver the required functionality
Additional: Candidates with knowledge of SEO will be preferred.
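For illustration, adding a product programmatically could go through the Shopify Admin REST API; in this sketch the shop domain, access token, and API version are placeholders:

```python
# Hedged sketch of creating a product through the Shopify Admin REST API with
# the requests library. The shop domain, access token, and API version below
# are placeholders.
import requests

SHOP = "example-store.myshopify.com"   # hypothetical shop
TOKEN = "shpat_..."                    # private-app access token (placeholder)
URL = f"https://{SHOP}/admin/api/2024-01/products.json"

resp = requests.post(
    URL,
    headers={
        "X-Shopify-Access-Token": TOKEN,
        "Content-Type": "application/json",
    },
    json={"product": {"title": "Sample Tee", "product_type": "Apparel"}},
    timeout=10,
)
resp.raise_for_status()
print(resp.json()["product"]["id"])  # id of the newly created product
```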
Education and Experience:
- A Bachelor's degree in Computer Science or equivalent.
- Strong knowledge of the Shopify platform
- Knowledge of SEO optimization
- 2+ years of development experience
- To appoint Block Sales Managers in their respective districts.
- To manage sales operations in the assigned district to achieve revenue goals.
- To supervise sales team members on a daily basis and provide guidance whenever needed.
- To identify skill gaps and conduct training for the sales team.
- To work with the team to implement new sales techniques to obtain profits.
- To assist in employee recruitment, promotion, retention and termination activities.
- To conduct employee performance evaluation and provide feedback for improvements.
- To contact potential customers and identify new business opportunities.
- To stay abreast of customer needs, market trends, and competitors.
- To maintain clear and complete sales reports for management review.
- To build strong relationships with customers for business growth.
- To analyze sales performances and recommend improvements.
- To ensure that sales team follows company policies and procedures at all times.
- To develop promotional programs to increase sales and revenue.
- To plan and coordinate sales activities for assigned projects.
- To provide outstanding services and ensure customer satisfaction.