
Experience: 4+ years.
Location: Vadodara & Pune
Skill Set: Snowflake, Power BI, ETL, SQL, Data Pipelines
What you'll be doing:
- Develop, implement, and manage scalable Snowflake data warehouse solutions using advanced features such as materialized views, task automation, and clustering.
- Design and build real-time data pipelines from Kafka and other sources into Snowflake using Kafka Connect, Snowpipe, or custom solutions for streaming data ingestion.
- Create and optimize ETL/ELT workflows using tools like DBT, Airflow, or cloud-native solutions to ensure efficient data processing and transformation.
- Tune query performance, warehouse sizing, and pipeline efficiency by utilizing Snowflake's query profiling, Resource Monitors, and other diagnostic tools.
- Work closely with architects, data analysts, and data scientists to translate complex business requirements into scalable technical solutions.
- Enforce data governance and security standards, including data masking, encryption, and RBAC, to meet organizational compliance requirements.
- Continuously monitor data pipelines, address performance bottlenecks, and troubleshoot issues using monitoring frameworks such as Prometheus, Grafana, or Snowflake-native tools.
- Provide technical leadership, guidance, and code reviews for junior engineers, ensuring best practices in Snowflake and Kafka development are followed.
- Research emerging tools, frameworks, and methodologies in data engineering and integrate relevant technologies into the data stack.
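The streaming-ingestion duties above (Kafka into Snowflake via Snowpipe) follow a micro-batch pattern: records accumulate, then a batch is flushed to the warehouse. A minimal in-memory sketch of that pattern, with all class and field names being illustrative stand-ins rather than any real Snowflake or Kafka API:

```python
import json

class MicroBatchBuffer:
    """Buffers incoming Kafka-style records and flushes them in batches,
    mimicking the micro-batch pattern Snowpipe applies to staged files.
    Everything here is a toy stand-in, not a real Snowflake API."""

    def __init__(self, flush_size=3):
        self.flush_size = flush_size
        self.pending = []
        self.flushed_batches = []

    def ingest(self, record: dict) -> None:
        self.pending.append(record)
        if len(self.pending) >= self.flush_size:
            self.flush()

    def flush(self) -> None:
        if self.pending:
            # In a real pipeline this step would COPY a staged file
            # into a Snowflake table; here we just serialize the batch.
            self.flushed_batches.append([json.dumps(r) for r in self.pending])
            self.pending = []

buffer = MicroBatchBuffer(flush_size=2)
for i in range(5):
    buffer.ingest({"event_id": i})
buffer.flush()  # drain the remainder
```

With five records and a flush size of two, this produces three batches of sizes 2, 2, and 1, which is the same trade-off (latency vs. per-file overhead) that tuning Snowpipe file sizes involves.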
What you need:
Basic Skills:
- 3+ years of hands-on experience with Snowflake data platform, including data modeling, performance tuning, and optimization.
- Strong experience with Apache Kafka for stream processing and real-time data integration.
- Proficiency in SQL and ETL/ELT processes.
- Solid understanding of cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with scripting languages like Python, Shell, or similar for automation and data integration tasks.
- Familiarity with tools like dbt, Airflow, or similar orchestration platforms.
- Knowledge of data governance, security, and compliance best practices.
- Strong analytical and problem-solving skills with the ability to troubleshoot complex data issues.
- Ability to work in a collaborative team environment and communicate effectively with cross-functional teams.
Responsibilities:
- Design, develop, and maintain Snowflake data warehouse solutions, leveraging advanced Snowflake features like clustering, partitioning, materialized views, and time travel to optimize performance, scalability, and data reliability.
- Architect and optimize ETL/ELT pipelines using tools such as Apache Airflow, DBT, or custom scripts, to ingest, transform, and load data into Snowflake from sources like Apache Kafka and other streaming/batch platforms.
- Work in collaboration with data architects, analysts, and data scientists to gather and translate complex business requirements into robust, scalable technical designs and implementations.
- Design and implement Apache Kafka-based real-time messaging systems to efficiently stream structured and semi-structured data into Snowflake, using Kafka Connect, KSQL, and Snowpipe for real-time ingestion.
- Monitor and resolve performance bottlenecks in queries, pipelines, and warehouse configurations using tools like Query Profile, Resource Monitors, and Task Performance Views.
- Implement automated data validation frameworks to ensure high-quality, reliable data throughout the ingestion and transformation lifecycle.
- Deploy and maintain pipeline monitoring solutions using Prometheus, Grafana, or cloud-native tools, ensuring efficient data flow, scalability, and cost-effective operations.
- Implement and enforce data governance policies, including role-based access control (RBAC), data masking, and auditing to meet compliance standards and safeguard sensitive information.
- Provide hands-on technical mentorship to junior data engineers, ensuring adherence to coding standards, design principles, and best practices in Snowflake, Kafka, and cloud data engineering.
- Stay current with advancements in Snowflake, Kafka, cloud services (AWS, Azure, GCP), and data engineering trends, and proactively apply new tools and methodologies to enhance the data platform.
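The automated data validation responsibility above can be sketched as a small gatekeeper that splits incoming rows into accepted and rejected sets. This is a toy illustration; the field names and rules are hypothetical, not taken from any specific framework:

```python
def validate_rows(rows, required_fields=("id", "amount")):
    """Split rows into valid and rejected sets using two toy rules:
    all required fields present, and amount non-negative. A sketch of
    the kind of check a validation framework automates at scale."""
    valid, rejected = [], []
    for row in rows:
        if all(f in row for f in required_fields) and row.get("amount", -1) >= 0:
            valid.append(row)
        else:
            rejected.append(row)
    return valid, rejected

rows = [
    {"id": 1, "amount": 10.0},
    {"id": 2},                  # rejected: missing amount
    {"id": 3, "amount": -5.0},  # rejected: negative amount
]
valid, rejected = validate_rows(rows)
```

In practice a check like this would run between ingestion and transformation, with rejected rows routed to a quarantine table for inspection rather than silently dropped.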
Key Result Areas:
● Communication skills and clarity in your reporting and communication.
● Knowledge of the Java programming language.
● Knowledge of the Spring Framework and the libraries you use.
● Knowledge of the tool sets you use.
● Analytical thinking and experience (practical when you design the architecture of the “thing” prior to coding it).
● Technological understanding (ability to see your new “thing” in a wider perspective, for example how a small library fits into a large project or product).
● Creativity (finding better ways to achieve your project goals).
● Coding (testable, clean, reusable, maintainable, readable, bug-free, beautiful code).
● Correctness (few bugs, few refactoring iterations).
● Learning (your ability to learn about and use new technologies, protocols, libraries, or even languages as needed).
● Durability (staying on track no matter what, even when you feel bored or in over your head).
● Adherence to effort and schedule estimates.
● Hand-holding the team in day-to-day activities and monitoring their progress.
● Leading the team technically to ensure on-time delivery.
Essential Skills:
● Strong hands-on experience in Core Java, the Spring Framework, Maven, and relational databases.
● Comfortable with the GitHub source code repository.
● Experience in developing REST APIs using Spring MVC or the Play Framework.
● Good to have: NoSQL, Neo4j, Cassandra, Elasticsearch.
● Experience in developing Apache Samza jobs (optional).
● Good understanding of CI/CD pipelines.

- Solid understanding of data structures and algorithms.
- Exceptional coding skills in an object-oriented programming language (Golang/Python).
- Must have a basic understanding of AWS (EC2, Lambda, Boto, CI/CD), Celery, RabbitMQ, and similar task queue management tools/libraries.
- Experience with web technologies: Python, Linux, Apache, Solr, Memcache, Redis, gRPC.
- Experience with high-performance services handling millions of daily requests is a plus.
- Strong understanding of Python and Django.
- Good knowledge of various Python libraries, APIs, and toolkits.
- Basic understanding of front-end technologies such as JavaScript, HTML5, and CSS3.
- Proficient understanding of code versioning tools such as Git.
- Understanding of the threading limitations of Python and of multi-process architecture.
- Understanding of databases and MySQL.
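The task-queue tools named above (Celery with RabbitMQ) implement a producer/worker pattern: producers enqueue tasks on a broker, and worker processes pull and execute them. A minimal in-process stand-in using only the standard library, where `queue.Queue` plays the broker and threads play the workers; this illustrates the pattern, not Celery's actual API:

```python
import queue
import threading

# A tiny in-memory stand-in for the producer/worker pattern that
# Celery + RabbitMQ implement at scale; the "broker" is queue.Queue.
task_queue = queue.Queue()
results = []
lock = threading.Lock()

def worker() -> None:
    while True:
        item = task_queue.get()
        if item is None:                  # poison pill: stop the worker
            task_queue.task_done()
            break
        with lock:
            results.append(item * item)   # the "task": square a number
        task_queue.task_done()

workers = [threading.Thread(target=worker) for _ in range(2)]
for w in workers:
    w.start()

for n in range(5):                        # producer enqueues work
    task_queue.put(n)
for _ in workers:                         # one pill per worker
    task_queue.put(None)

task_queue.join()
for w in workers:
    w.join()
```

Because of Python's GIL, threads like these only help with I/O-bound tasks; for CPU-bound work, Celery's default of separate worker processes (a multi-process architecture) is the standard answer to the threading limitation mentioned above.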
Responsibilities :
- Comply with coding standards and technical design.
- Adopt structured coding styles for easy review, testing, and maintenance of the code.
- Participate actively in troubleshooting and debugging.
- Prepare technical documentation for code.
- 3+ years of experience with paid Facebook ad campaigns is a must.
- Manage Facebook Ads for the company as well as for clients.
- Achieve the best ROI and ROAS.
- Run ad campaigns.
- Execute Google Analytics and Google Tag Manager.
- Have good command of social media management.
- Have good SEO knowledge, both on-page and off-page.
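The ROI and ROAS targets above reduce to two standard formulas: ROAS is revenue divided by ad spend, and ROI is profit divided by cost. A short sketch with hypothetical campaign numbers:

```python
def roas(revenue: float, ad_spend: float) -> float:
    """Return on ad spend: revenue generated per unit of ad spend."""
    if ad_spend <= 0:
        raise ValueError("ad spend must be positive")
    return revenue / ad_spend

def roi(revenue: float, cost: float) -> float:
    """Return on investment, expressed as a fraction of cost."""
    if cost <= 0:
        raise ValueError("cost must be positive")
    return (revenue - cost) / cost

# Hypothetical campaign: 5000 in revenue on 1000 of ad spend.
campaign_roas = roas(revenue=5000.0, ad_spend=1000.0)  # 5.0, i.e. 5x
campaign_roi = roi(revenue=5000.0, cost=1000.0)        # 4.0, i.e. 400%
```

Note the difference: the same campaign has a 5x ROAS but a 400% ROI, because ROI subtracts the cost before dividing.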
Desired Candidate Profile
- 3+ years of experience
- Proven results with Facebook Ads
- Professional expertise in Facebook Ads
Perks and Benefits
- Flexible work schedule until COVID ends.
Job Description :
- Plan, manage and execute all digital marketing, including SEO/SMM, marketing database, email, social media and display advertising campaigns.
- Understanding client needs and offering solutions and support, answering potential client questions and follow-up call questions.
- Google AdWords experience would be a plus.
- Handling social media platforms like Instagram, Facebook, Twitter, etc.
- Research, plan, and create social media marketing and content plans for clients.
- Have good knowledge and handling of social platforms like LinkedIn, Twitter, Reddit, Quora, Pinterest, Facebook, and Instagram.
- Build, plan and implement the overall digital marketing strategy
Job Type: Full-time
Salary: As per company norms.
Education: Bachelor’s (required), Master’s (preferred)
Location: Indore, Madhya Pradesh (Required)
Best Regards,
HR Department
Office Address: 519, 6th Floor, Onam Plaza, Near Industry House, AB Road, Indore, Madhya Pradesh, 452001

Backend skills: Java development, HTML, MVC, JavaScript, Spring Boot

The candidate will be responsible for:
- Designing and developing responsive web applications
- Strong knowledge of HTML5, CSS, and JavaScript
- Strong understanding of the concept of the DOM (Document Object Model)
- Understanding of how browsers (Chrome, Mozilla) work behind the scenes
- Knowledge of React JS and CSS Flexbox (preferred/optional)
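The DOM mentioned above is the tree of nodes a browser builds from HTML before rendering or running scripts against it. A toy model of that tree, built with Python's standard-library HTML parser purely for illustration (a browser's real DOM is far richer):

```python
from html.parser import HTMLParser

class DomSketch(HTMLParser):
    """Builds a nested (tag, children) tree from HTML: a toy model of
    the DOM tree a browser constructs before rendering."""

    def __init__(self):
        super().__init__()
        self.root = ("document", [])
        self.stack = [self.root]

    def handle_starttag(self, tag, attrs):
        node = (tag, [])
        self.stack[-1][1].append(node)  # attach under current parent
        self.stack.append(node)         # descend into the new node

    def handle_endtag(self, tag):
        if len(self.stack) > 1:
            self.stack.pop()            # climb back to the parent

parser = DomSketch()
parser.feed("<div><p>hello</p><p>world</p></div>")
tree = parser.root
```

The resulting tree nests both `p` elements under the `div`, which is exactly the parent/child structure that CSS selectors and JavaScript traversal methods operate on.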
We are looking to hire a Senior Backend Developer with 4+ years of experience to work directly with the CTO and contribute to building and developing new products and feature sets for NostraGamus. The work will entail the opportunity to explore and utilise the latest developments in the world of technology, and to formulate ways to incorporate them into day-to-day work to create stellar products. The candidate must have had significant exposure to building real products, including working with various stakeholders across product and marketing teams. Prior knowledge of building games at scale is highly desirable, but not necessary.
The following skillsets are highly essential:
- Expertise in Node.js & JavaScript, and hobbyist interest in a few other languages like Python, Ruby, or PHP. Must have done C/C++ programming in school/college.
- Deep knowledge of database systems, especially PostgreSQL and any NoSQL clones, including knowledge of optimisation.
- Awareness of Docker, virtualization, and Redis or similar caching toolsets.
- Absolute mastery of Unix-based systems and Bash. Ability to work seamlessly in headless environments is a must.
- Disciplined approach to problems, ability to maintain composure under catastrophic failures, and propensity to work under deadlines.
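The caching toolsets mentioned above typically evict entries by least-recently-used (LRU) order once a memory cap is hit, which is what Redis does under its `allkeys-lru` eviction policy. A minimal in-process sketch of that policy using `OrderedDict`; this illustrates the idea only and is not Redis's implementation:

```python
from collections import OrderedDict

class LRUCache:
    """A minimal least-recently-used cache: the eviction policy Redis
    offers via maxmemory-policy allkeys-lru, sketched in-process."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.store = OrderedDict()

    def get(self, key):
        if key not in self.store:
            return None
        self.store.move_to_end(key)         # mark as recently used
        return self.store[key]

    def put(self, key, value) -> None:
        if key in self.store:
            self.store.move_to_end(key)
        self.store[key] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict least recently used

cache = LRUCache(capacity=2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")       # "a" becomes most recently used
cache.put("c", 3)    # capacity exceeded: evicts "b", not "a"
```

The key design point is that reads also reorder entries, so frequently accessed keys survive eviction even when newer keys arrive.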
Bonus Points :
- Passionate about building new products - having done a few personal projects, regardless of success achieved or goals completed.
- Prior knowledge in building games at scale.
- Good understanding of AWS and its various services: EC2, ECS, RDS, SQS, Elastic Beanstalk, ElastiCache, Route53.
- Strong interest in Mathematics, Probability Theory, Statistics, and Machine Learning, and the ability to apply math in real-world applications.
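Applying probability theory to a game product usually starts with expected value: summing each outcome's payout weighted by its probability. A short sketch with a hypothetical prediction-game payout table (the numbers are invented for illustration):

```python
def expected_value(outcomes):
    """Expected payout of a game given (probability, payout) pairs.
    Probabilities must sum to 1 (within floating-point tolerance)."""
    total_p = sum(p for p, _ in outcomes)
    if abs(total_p - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    return sum(p * payout for p, payout in outcomes)

# Hypothetical payout table: rare big win, occasional small win,
# otherwise lose the 10-unit entry fee.
game = [(0.05, 100.0), (0.25, 20.0), (0.70, -10.0)]
ev = expected_value(game)  # 0.05*100 + 0.25*20 + 0.70*(-10) = 3.0
```

A positive expected value like this one (3 units per play, up to float rounding) is the kind of check a games-at-scale team runs before shipping any payout structure.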

