
Experience: 4+ years.
Location: Vadodara & Pune
Skill Set: Snowflake, Power BI, ETL, SQL, Data Pipelines
What you'll be doing:
- Develop, implement, and manage scalable Snowflake data warehouse solutions using advanced features such as materialized views, task automation, and clustering.
- Design and build real-time data pipelines from Kafka and other sources into Snowflake using Kafka Connect, Snowpipe, or custom solutions for streaming data ingestion (a minimal Snowpipe sketch follows this list).
- Create and optimize ETL/ELT workflows using tools like dbt, Airflow, or cloud-native solutions to ensure efficient data processing and transformation.
- Tune query performance, warehouse sizing, and pipeline efficiency using Snowflake's Query Profile, Resource Monitors, and other diagnostic tools.
- Work closely with architects, data analysts, and data scientists to translate complex business requirements into scalable technical solutions.
- Enforce data governance and security standards, including data masking, encryption, and RBAC, to meet organizational compliance requirements.
- Continuously monitor data pipelines, address performance bottlenecks, and troubleshoot issues using monitoring frameworks such as Prometheus, Grafana, or Snowflake-native tools.
- Provide technical leadership, guidance, and code reviews for junior engineers, ensuring best practices in Snowflake and Kafka development are followed.
- Research emerging tools, frameworks, and methodologies in data engineering and integrate relevant technologies into the data stack.
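For illustration, here is a minimal sketch of the Snowpipe-style ingestion described above, using snowflake-connector-python; the account, credentials, stage, and table names are placeholders, not part of this posting.
```python
# Minimal sketch: a Snowpipe that auto-ingests staged Kafka-connector files
# into a landing table. Account, credentials, and object names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder account identifier
    user="etl_user",        # placeholder credentials
    password="***",
    warehouse="LOAD_WH",
    database="RAW",
    schema="EVENTS",
)
cur = conn.cursor()

# The pipe re-runs this COPY INTO as new files land on the stage; AUTO_INGEST
# assumes cloud-storage event notifications are configured for the stage.
cur.execute("""
    CREATE PIPE IF NOT EXISTS raw.events.clickstream_pipe
      AUTO_INGEST = TRUE
    AS
    COPY INTO raw.events.clickstream
      FROM @raw.events.kafka_stage
      FILE_FORMAT = (TYPE = 'JSON')
""")
cur.close()
conn.close()
```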
What you need:
Basic Skills:
- 3+ years of hands-on experience with the Snowflake data platform, including data modeling, performance tuning, and optimization.
- Strong experience with Apache Kafka for stream processing and real-time data integration.
- Proficiency in SQL and ETL/ELT processes.
- Solid understanding of cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with scripting languages like Python, Shell, or similar for automation and data integration tasks.
- Familiarity with tools like dbt, Airflow, or similar orchestration platforms (a minimal Airflow sketch follows this list).
- Knowledge of data governance, security, and compliance best practices.
- Strong analytical and problem-solving skills with the ability to troubleshoot complex data issues.
- Ability to work in a collaborative team environment and communicate effectively with cross-functional teams.
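As an illustration of the orchestration tooling mentioned above, a minimal Airflow DAG that chains a Snowflake load with a dbt build might look like the following; the DAG id, loader script, and project path are assumptions made for the sketch.
```python
# Minimal sketch of an Airflow DAG (Airflow 2.4+) that chains a Snowflake load
# with a dbt build. The DAG id, load script, and project path are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="snowflake_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
) as dag:
    load = BashOperator(
        task_id="load_raw",
        bash_command="python /opt/pipelines/load_to_snowflake.py",  # hypothetical loader
    )
    transform = BashOperator(
        task_id="dbt_build",
        bash_command="dbt build --project-dir /opt/dbt/warehouse",  # hypothetical dbt project
    )
    load >> transform  # run the transformation only after the load succeeds
```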
Responsibilities:
- Design, develop, and maintain Snowflake data warehouse solutions, leveraging advanced Snowflake features like clustering, partitioning, materialized views, and time travel to optimize performance, scalability, and data reliability.
- Architect and optimize ETL/ELT pipelines using tools such as Apache Airflow, dbt, or custom scripts to ingest, transform, and load data into Snowflake from sources like Apache Kafka and other streaming/batch platforms.
- Work in collaboration with data architects, analysts, and data scientists to gather and translate complex business requirements into robust, scalable technical designs and implementations.
- Design and implement Apache Kafka-based real-time messaging systems to efficiently stream structured and semi-structured data into Snowflake, using Kafka Connect, KSQL, and Snowpipe for real-time ingestion.
- Monitor and resolve performance bottlenecks in queries, pipelines, and warehouse configurations using tools like Query Profile, Resource Monitors, and Task Performance Views.
- Implement automated data validation frameworks to ensure high-quality, reliable data throughout the ingestion and transformation lifecycle.
- Deploy and maintain pipeline monitoring solutions using Prometheus, Grafana, or cloud-native tools, ensuring efficient data flow, scalability, and cost-effective operations.
- Implement and enforce data governance policies, including role-based access control (RBAC), data masking, and auditing, to meet compliance standards and safeguard sensitive information (see the masking-policy sketch after this list).
- Provide hands-on technical mentorship to junior data engineers, ensuring adherence to coding standards, design principles, and best practices in Snowflake, Kafka, and cloud data engineering.
- Stay current with advancements in Snowflake, Kafka, cloud services (AWS, Azure, GCP), and data engineering trends, and proactively apply new tools and methodologies to enhance the data platform.
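To make the governance item concrete, here is a minimal sketch of a Snowflake masking policy and RBAC grant issued through the Python connector; the policy, table, and role names are hypothetical, not taken from this posting.
```python
# Minimal sketch: a column-level masking policy plus an RBAC grant, issued
# through snowflake-connector-python. All names and roles are hypothetical,
# and masking policies require a Snowflake edition/role that supports them.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="admin_user", password="***", role="SECURITYADMIN"
)
cur = conn.cursor()

# Only PII_READER sees raw values; every other role sees a masked token
cur.execute("""
    CREATE MASKING POLICY IF NOT EXISTS raw.pii.email_mask AS (val STRING)
      RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
           ELSE '***MASKED***' END
""")
# Attach the policy to a column, then grant read access to an analyst role
cur.execute(
    "ALTER TABLE raw.events.users MODIFY COLUMN email SET MASKING POLICY raw.pii.email_mask"
)
cur.execute("GRANT SELECT ON TABLE raw.events.users TO ROLE ANALYST")
conn.close()
```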
