
About the Role
We are looking for a Data Analyst with expertise in building data pipelines, designing data marts, and leveraging BI tools (ideally Metabase) to drive data-driven decision-making. In this role, you will work with Peliqan to create reliable, scalable data infrastructure and translate data into actionable insights.
Key Responsibilities
- Design and build data pipelines to collect, process, and transform raw data into usable formats.
- Develop and optimize data marts to support efficient querying and analysis (a minimal sketch follows this list).
- Utilize BI tools (preferably Metabase) to create reports, dashboards, and visualizations that enable data-driven decision-making.
- Collaborate with stakeholders to understand data needs and deliver insights that support business objectives.
- Ensure data quality, governance, and accuracy across pipelines and analytics solutions.
- Optimize queries and database performance to ensure smooth data retrieval and reporting.
- Work with integration and engineering teams to ensure seamless data flow between systems.
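For illustration only, here is a minimal sketch of the pipeline-to-data-mart step described above, with sqlite3 standing in for the warehouse; all table and column names are hypothetical:

    # Purely illustrative: roll raw events up into a small data-mart table.
    # Table and column names are hypothetical; sqlite3 stands in for the warehouse.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE raw_orders (order_id INTEGER, customer TEXT, amount REAL, ordered_on TEXT);
        INSERT INTO raw_orders VALUES
            (1, 'acme',   120.0, '2024-01-03'),
            (2, 'acme',    80.0, '2024-01-04'),
            (3, 'globex',  45.5, '2024-01-04');
        -- Data mart: one row per customer per day, ready to chart in a BI tool.
        CREATE TABLE mart_daily_sales AS
        SELECT ordered_on, customer, COUNT(*) AS orders, SUM(amount) AS revenue
        FROM raw_orders
        GROUP BY ordered_on, customer;
    """)
    for row in conn.execute("SELECT * FROM mart_daily_sales ORDER BY ordered_on"):
        print(row)

In practice, the same shape of aggregate would live in a warehouse table that Metabase queries directly.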
Requirements
- Strong experience in SQL and database design for analytics.
- Hands-on experience with ETL/ELT processes, data transformation, and building data pipelines.
- Experience with BI tools (Metabase preferred, but experience with Tableau, Looker, Power BI, etc., is also valuable).
- Understanding of data modeling and best practices for analytics data structures.
- Familiarity with Python or other scripting languages for data manipulation.
- Strong analytical skills with the ability to translate complex data into actionable insights.
- Experience with cloud data warehouses (e.g., BigQuery, Snowflake, Redshift) is a plus.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
Preferred Qualifications
- Experience working with Peliqan or similar data integration/automation platforms.
- Knowledge of data governance, security, and compliance best practices.
- Ability to automate reporting and data workflows for efficiency.
If you’re a data-driven professional who loves building scalable data solutions and transforming raw data into actionable insights, we’d love to hear from you!

Core responsibilities
• Collaborate with product teams to more accurately define new features
• Collaborate with UX in the building of a design system/component library
• Ensure high performance of all systems developed
• Understand the relationship between development and business, with a strong sense of ownership
• Employ the latest techniques such as agile software development
• Basic working knowledge of Unix/Linux
• Excellent problem-solving and coding skills in JavaScript/Angular
• Strong interpersonal, communication and analytical skills
• Able to clearly express design ideas and thoughts
Job Brief
• JavaScript and Angular expertise
• Experience with libraries such as Bootstrap, AG Grid, Formly, RxJS observables, and NgRx
• You enjoy working with new technologies, and are curious and energetic
• Experience working with REST APIs
• Good understanding of authentication and security
• Create and maintain various unit and integration tests
• You enjoy working in a creative and agile environment that moves fast
• Ability and interest in providing mentorship to junior members of the team
• Write high-quality code and conduct code reviews
• Resourceful and detail-oriented, with an outside-the-box mindset!
• A great team player who works collaboratively and brings a positive attitude

Experience: 4+ years.
Location: Vadodara & Pune
Skill set: Snowflake, Power BI, ETL, SQL, data pipelines
What you'll be doing:
- Develop, implement, and manage scalable Snowflake data warehouse solutions using advanced features such as materialized views, task automation, and clustering.
- Design and build real-time data pipelines from Kafka and other sources into Snowflake using Kafka Connect, Snowpipe, or custom solutions for streaming data ingestion (see the sketch after this list).
- Create and optimize ETL/ELT workflows using tools like DBT, Airflow, or cloud-native solutions to ensure efficient data processing and transformation.
- Tune query performance, warehouse sizing, and pipeline efficiency by utilizing Snowflake's Query Profile, Resource Monitors, and other diagnostic tools.
- Work closely with architects, data analysts, and data scientists to translate complex business requirements into scalable technical solutions.
- Enforce data governance and security standards, including data masking, encryption, and RBAC, to meet organizational compliance requirements.
- Continuously monitor data pipelines, address performance bottlenecks, and troubleshoot issues using monitoring frameworks such as Prometheus, Grafana, or Snowflake-native tools.
- Provide technical leadership, guidance, and code reviews for junior engineers, ensuring best practices in Snowflake and Kafka development are followed.
- Research emerging tools, frameworks, and methodologies in data engineering and integrate relevant technologies into the data stack.
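As a hedged sketch of the streaming-ingestion step flagged above, this is one way a Snowpipe that auto-loads Kafka-landed JSON files could be declared via snowflake-connector-python; the account, stage, and table names are hypothetical:

    # Sketch only: declare a Snowpipe that auto-ingests JSON files that
    # Kafka Connect lands on a stage. All names below are hypothetical.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",   # hypothetical account identifier
        user="etl_user",        # hypothetical service user
        password="...",         # supply via a secrets manager in practice
        warehouse="INGEST_WH",
        database="ANALYTICS",
        schema="RAW",
    )
    conn.cursor().execute("""
        CREATE PIPE IF NOT EXISTS raw.events_pipe
          AUTO_INGEST = TRUE
          AS COPY INTO raw.events
             FROM @raw.kafka_stage
             FILE_FORMAT = (TYPE = 'JSON')
    """)

With AUTO_INGEST enabled, Snowflake loads new stage files as they arrive, which is the Snowpipe path the posting names.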
What you need:
Basic Skills:
- 3+ years of hands-on experience with Snowflake data platform, including data modeling, performance tuning, and optimization.
- Strong experience with Apache Kafka for stream processing and real-time data integration.
- Proficiency in SQL and ETL/ELT processes.
- Solid understanding of cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with scripting languages like Python, Shell, or similar for automation and data integration tasks.
- Familiarity with tools like dbt, Airflow, or similar orchestration platforms.
- Knowledge of data governance, security, and compliance best practices.
- Strong analytical and problem-solving skills with the ability to troubleshoot complex data issues.
- Ability to work in a collaborative team environment and communicate effectively with cross-functional teams
Responsibilities:
- Design, develop, and maintain Snowflake data warehouse solutions, leveraging advanced Snowflake features like clustering, partitioning, materialized views, and time travel to optimize performance, scalability, and data reliability.
- Architect and optimize ETL/ELT pipelines using tools such as Apache Airflow, DBT, or custom scripts to ingest, transform, and load data into Snowflake from sources like Apache Kafka and other streaming/batch platforms (a sketch follows this list).
- Work in collaboration with data architects, analysts, and data scientists to gather and translate complex business requirements into robust, scalable technical designs and implementations.
- Design and implement Apache Kafka-based real-time messaging systems to efficiently stream structured and semi-structured data into Snowflake, using Kafka Connect, KSQL, and Snowpipe for real-time ingestion.
- Monitor and resolve performance bottlenecks in queries, pipelines, and warehouse configurations using tools like Query Profile, Resource Monitors, and Task Performance Views.
- Implement automated data validation frameworks to ensure high-quality, reliable data throughout the ingestion and transformation lifecycle.
- Pipeline Monitoring and Optimization: Deploy and maintain pipeline monitoring solutions using Prometheus, Grafana, or cloud-native tools, ensuring efficient data flow, scalability, and cost-effective operations.
- Implement and enforce data governance policies, including role-based access control (RBAC), data masking, and auditing to meet compliance standards and safeguard sensitive information.
- Provide hands-on technical mentorship to junior data engineers, ensuring adherence to coding standards, design principles, and best practices in Snowflake, Kafka, and cloud data engineering.
- Stay current with advancements in Snowflake, Kafka, cloud services (AWS, Azure, GCP), and data engineering trends, and proactively apply new tools and methodologies to enhance the data platform.
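As a hedged illustration of the Airflow/DBT orchestration item above (not a prescribed implementation), here is a minimal Airflow 2.x DAG that runs and then tests a dbt project; the DAG id, schedule, and project path are hypothetical:

    # Illustrative Airflow 2.x DAG for a dbt-style ELT flow.
    # DAG id, schedule, and the dbt project path are hypothetical.
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="kafka_to_snowflake_elt",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@hourly",
        catchup=False,
    ) as dag:
        # Transform raw ingested events into analytics models with dbt.
        run_dbt = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /opt/dbt/analytics",
        )
        # Validate the transformed models before exposing them downstream.
        test_dbt = BashOperator(
            task_id="dbt_test",
            bash_command="dbt test --project-dir /opt/dbt/analytics",
        )
        run_dbt >> test_dbt
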
- Work in close coordination with senior management to define strategy and targets for FI Partnerships
- Develop the pitch, collateral, presentations and other supporting material to adequately demonstrate risk/reward for the Financial Institution partners
- Generate leads pipeline (inbound and outbound) and implementation plan
- Lead and prepare presentations to Financial Institutions
- End-to-end management of the process for acquiring partners for the platform, including conclusion of commercials, legal agreements, system integration, etc.
- Coordination with internal teams like Legal, Credit Risk, Product, etc., and with external teams of the Bank or Financial Institutions
- Progressively reduce financing cost profile across the platform and NBFC
- Work on arranging for debt funds for the NBFC from other Financial Institutions
- Work with Banking partners to actively participate in joint marketing and other efforts to drive customers both from open market and existing customers of the partner
- Work with the external stakeholders to develop products and new markets, identify business opportunities
- Track and improve on conversion from leads to closure
- Manage relationships with the Financial Institutions and help resolve any issues that arise.
- Regular reporting of performance
- Develop deep relationships with Financial Institutions like banks, NBFCs, etc.

Responsibilities:
- Must have hands-on working experience in Node.js with TypeScript
- Must have experience with large-scale product development processes
- Must have detailed knowledge of designing and developing performant REST APIs
- Must have hands-on experience with PostgreSQL and MongoDB or another NoSQL DB
- Must be comfortable working with Git, CI/CD pipelines, and deployment strategies
- Should be able to build a new team, work with your engineers, and mentor them
- Collaborate with cross-functional teams to define, design, and ship new features
- Ensure the performance, quality, and responsiveness of applications
- Identify and correct bottlenecks and fix bugs
- Continuously discover, evaluate, and implement new technologies to maximise development efficiency
- Should be open to learning new domains and working in a fast-paced environment
- Responsible for delivering end-to-end module/product/project scope
Requirements:
- 4+ years of experience as a Full Stack Developer with a strong focus on Node.js with TypeScript and React
- Solid understanding of web application architecture, including RESTful API design and development
- Experience with front-end development using Tailwind, Bootstrap, HTML, CSS, JavaScript, and TypeScript
- Strong analytical and problem-solving skills
- Excellent communication and interpersonal skills


- Building customer products that operate at scale
- Working on a back-end system using Python and Java
- Working on MySQL, NoSQL, Solr, Thrift, Flask, RabbitMQ, Redis, etc. (a sketch follows this list)
- Working on analytics and data science
- Working on server management (Google Cloud)
- Explore different functions of building a tech product/company early in your career.
- Learn to prioritize work and to weigh some technical decisions against others at a very early stage of your career
- Be instilled with the value of hard work, ownership, and self-sustainability.
- Be more actively involved in the decision-making and functioning of the company.
- Understand the great importance of personal ownership and the liberty needed to pursue that ownership
- Hungry for Growth & Learning
- Willing to go all out for an accelerated career in software engineering
- Ready to experiment with things never done before – conquer uncharted waters
- Think out-of-the-box – innovative and effective solutions
- Top-gun command of the technologies mentioned above
- You'll learn how to target efficiency rather than perfection.
- This empowerment will help you become a better leader at a very early stage of your career.
- Join the Core team – Get real insights into building a Product and Business from the ground up
- Enjoy the freedom, that comes with a lot of ownership
- Challenging and fun work environment
- Flexible Work Culture – We are a target-driven organization. We like hard workers, but we adore smart workers more!
- Unlimited Vacation Policy – Work hard, and take a break when you need to
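Purely for illustration, here is a minimal Flask endpoint with Redis caching in the spirit of the stack listed above; the route, key names, and the stand-in database lookup are hypothetical:

    # Illustrative only: a Flask endpoint backed by a Redis cache.
    # Route, key names, and the stand-in database lookup are hypothetical.
    from flask import Flask, jsonify
    import redis

    app = Flask(__name__)
    cache = redis.Redis(host="localhost", port=6379, decode_responses=True)

    @app.route("/products/<int:product_id>")
    def get_product(product_id: int):
        # Serve from cache when possible; fall back to the primary store.
        key = f"product:{product_id}"
        name = cache.get(key)
        if name is None:
            name = f"product-{product_id}"  # stand-in for a MySQL lookup
            cache.setex(key, 300, name)     # cache for five minutes
        return jsonify({"id": product_id, "name": name})
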
The ideal candidate is a competitive self-starter who thrives in a fast-paced environment. You must be comfortable making dozens of calls and emails per day, working with partners, generating interest, qualifying prospects, and closing sales.
Responsibilities:
• Source new sales opportunities through inbound lead follow-up and outbound cold calls and emails.
• Prospect call preparation, including company background research and other pertinent lead information.
• Identify customers’ buying trends and provide reports to management
• Enter, update, and maintain CRM information on leads, prospects, and opportunities.
• Must have knowledge of email marketing.
Requirements:
• Strong verbal & written communication
• Source new sales opportunities through cold calling.
• Confident in speaking.
• Strong listening and presentation skills.
• Presence of mind to influence and persuade.
• Identify B2B key players and research the accounts.
• Proficient with CRM tools and hands-on with Microsoft Office.
• Strong interpersonal relations.
• Able to work to targets.
• Email marketing.
• Email content writing.

- An understanding of core JavaScript
- In-depth knowledge of the Angular framework
- Good command of TypeScript
- Thorough knowledge of web markup, primarily HTML and CSS
- A degree of experience with RESTful API integration

This includes working on:
a) The main Django application, a large, modern Django app built using Python 3.8 and the latest Python and Django libraries;
b) The API, built using Django Rest Framework (DRF) that is used both by our web-app and client libraries to build and run data analyses;
c) Backend code that integrates our web server with the rest of our cloud architecture, including our PaaS, data science code, general integrations such as payments, DevOps code, and more.
Ideally, you should have experience working on Django codebases that serve both server-side rendered pages and APIs via DRF. Frontend/full-stack knowledge is an advantage but not essential. Familiarity with modern development practices, such as CI/CD, testing, DevOps, Docker, Linux, and git, would be a big plus. You must have very strong familiarity with Python development and be excited to pick up new technologies and skills – for instance, we use Python type hints extensively across our codebase.
You should like the idea of releasing to real customers regularly, and prioritise getting a great product into users’ hands for feedback and iteration. You will have extensive scope to build and architect the backend, and to help grow the team in the future.
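As a hedged sketch of the DRF side of the role (not the actual codebase), here is a minimal serializer/viewset pair; the Analysis model and its fields are hypothetical:

    # Sketch of a DRF endpoint of the kind this role involves.
    # "Analysis" is a hypothetical model; its fields are illustrative.
    from rest_framework import routers, serializers, viewsets
    from .models import Analysis  # hypothetical app model

    class AnalysisSerializer(serializers.ModelSerializer):
        class Meta:
            model = Analysis
            fields = ["id", "name", "status", "created_at"]

    class AnalysisViewSet(viewsets.ModelViewSet):
        """CRUD endpoint consumed by both the web-app and client libraries."""
        queryset = Analysis.objects.all()
        serializer_class = AnalysisSerializer

    router = routers.DefaultRouter()
    router.register(r"analyses", AnalysisViewSet)
    urlpatterns = router.urls

Given the posting's emphasis on type hints, the real codebase would likely annotate far more aggressively than this sketch.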


