
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Advanced working knowledge of SQL, experience with relational databases and query authoring, and working familiarity with a variety of databases.
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- We are looking for a candidate with 5+ years of experience in a Data Engineer role, who has a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field. They should also have experience using the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift.
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
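The extract/transform/load work named above can be sketched in miniature. This is a toy illustration only: it uses Python's built-in sqlite3 in place of a real source system and warehouse, and the `orders` and `daily_revenue` tables and their columns are invented for the example.

```python
import sqlite3

# Toy ETL sketch: extract raw order rows, transform them with a SQL
# aggregation, and load the result into a reporting table.
# An in-memory SQLite database stands in for a real source/warehouse;
# the "orders" and "daily_revenue" tables are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, '2024-01-01', 120.0),
        (2, '2024-01-01', 80.0),
        (3, '2024-01-02', 200.0);
    CREATE TABLE daily_revenue (order_date TEXT, revenue REAL);
""")

# Transform + load in one SQL statement: aggregate revenue per day
# and insert the rollup into the reporting table.
conn.execute("""
    INSERT INTO daily_revenue
    SELECT order_date, SUM(amount) FROM orders GROUP BY order_date
""")

rows = conn.execute(
    "SELECT order_date, revenue FROM daily_revenue ORDER BY order_date"
).fetchall()
print(rows)  # [('2024-01-01', 200.0), ('2024-01-02', 200.0)]
```

In a production pipeline the same extract/transform/load steps would typically run as scheduled tasks in a workflow manager such as Airflow, with each step as a separate node in the DAG.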

Job Title : React Native Developer
Experience : 3+ Years
Location : Gurgaon
Working Days : 6 Days (Monday to Saturday)
Job Summary :
We are looking for a skilled React Native Developer with experience in converting Figma designs into high-performance mobile applications.
The ideal candidate should have some exposure to Blockchain/Web3 technologies and be capable of converting mobile applications into SDKs.
Key Responsibilities :
✅ Develop and maintain React Native applications.
✅ Convert Figma designs into pixel-perfect UI.
✅ Optimize app performance and ensure smooth user experience.
✅ Work with Blockchain/Web3 integrations (preferred).
✅ Convert mobile applications into SDKs for seamless integration.
✅ Collaborate with designers, backend developers, and blockchain engineers.
Technical Skills :
🔹 Strong proficiency in React Native, JavaScript, TypeScript.
🔹 Experience with Redux, Context API, Hooks.
🔹 Familiarity with Blockchain/Web3 (Ethereum, Solidity, Wallet integrations).
🔹 Understanding of mobile SDK development.
🔹 Knowledge of REST APIs, GraphQL, and third-party integrations.
We’re looking for someone who doesn’t just collect leads, but actually understands how to find the right ones. You’ll research high-intent prospects, qualify them with clear logic, and build targeted lists that make outreach easier and more predictable. You’ll run structured campaigns across LinkedIn, cold email, and other channels, while keeping the CRM clean, updated, and usable for the sales team.
What you’ll handle
• Deep research on companies, industries, and decision-makers
• Build segmented, high-accuracy lead lists
• Qualify leads based on relevance, authority, timelines, and potential
• Plan and execute multi-channel outreach sequences
• Refine messaging, subject lines, and connection strategies
• Maintain CRM hygiene and ensure every conversation has a next step
• Track performance data and share insights with the sales team
• Identify new markets, ICP variations, and emerging opportunities
• Collaborate closely with sales and marketing to improve conversions
What we’re looking for
• Strong experience in B2B lead generation, especially for tech, design, or service businesses
• Comfort with LinkedIn search, automation tools, prospecting databases, and email outreach platforms
• Ability to evaluate prospects quickly and filter out poor-fit leads
• Clear writing skills and a good understanding of how to spark interest
• Sharp attention to detail and the discipline to maintain clean data
• Someone who can plan, prioritise, and work independently
• A strategic mindset with a marketing touch is a bonus
What this really means is you’ll be the one shaping the quality of our pipeline. When you do your job well, the sales team gets better conversations, faster cycles, and stronger deals.
Location
Bestech Business Tower, Mohali (this is an onsite role).
Five-day workweek
Our office operates from 9 am to 2 am across three shifts, so you should be comfortable working within these hours.
Salary
Up to 6 LPA for each role, negotiable based on experience, plus additional incentives for hitting targets.
Job Overview
We are looking for a talented Senior Backend Developer to join our team in developing backend services. As a key member of our development team, you will work with cross-functional teams to design, develop, and maintain backend services and APIs using Node.js. The ideal candidate is passionate about distributed systems, has experience with backend services, and is excited about optimising applications for performance, scalability, and reliability.
Responsibilities:
- Design, develop, and maintain backend services and APIs using Node.js.
- Collaborate with cross-functional teams to define requirements, architect solutions, and implement features.
- Write clean, modular, and maintainable code following best practices and coding standards.
- Optimise applications for performance, scalability, and reliability.
- Implement security measures to ensure data privacy and integrity in backend systems.
- Troubleshoot and debug issues to ensure smooth operation of production systems.
- Perform code reviews to maintain code quality, consistency, and adherence to coding standards.
- Stay updated with the latest trends and technologies in backend development, Node.js, and related ecosystems.
Requirements:
- Bachelor's degree in Computer Science, Engineering, or related field.
- Proven experience in backend development, with at least 4 years of experience using Node.js.
- Strong understanding of data structures, algorithms, and software design principles.
- Experience with web frameworks such as Express.js.
- Proficiency in building RESTful APIs and microservices architecture.
- Familiarity with relational and NoSQL databases (e.g., MongoDB, PostgreSQL).
- Solid understanding of asynchronous programming and event-driven architectures.
- Experience with version control systems (e.g., Git) and collaborative development workflows.
- Excellent problem-solving, analytical, and troubleshooting skills.
- Ability to work independently and in a team environment, with excellent communication skills.
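The asynchronous, event-driven style called out in the requirements above is a language-agnostic idea, so it can be sketched briefly; the sketch below uses Python's asyncio rather than Node.js, and `fetch_user` and `fetch_orders` are hypothetical stand-ins for non-blocking I/O such as database or API calls.

```python
import asyncio

# Minimal sketch of asynchronous, event-driven request handling.
# The fetch_* coroutines are hypothetical stand-ins for non-blocking
# I/O; asyncio.sleep simulates waiting on a database or remote API.

async def fetch_user(user_id):
    await asyncio.sleep(0.01)  # simulated non-blocking I/O
    return {"id": user_id, "name": "demo"}

async def fetch_orders(user_id):
    await asyncio.sleep(0.01)
    return [{"user_id": user_id, "total": 42.0}]

async def handle_request(user_id):
    # Run both I/O calls concurrently instead of sequentially,
    # the same pattern as Promise.all in Node.js.
    user, orders = await asyncio.gather(
        fetch_user(user_id), fetch_orders(user_id)
    )
    return {"user": user, "orders": orders}

result = asyncio.run(handle_request(7))
print(result["user"]["id"], len(result["orders"]))  # 7 1
```

The concurrency here is cooperative: both waits overlap on a single thread, which is the same event-loop model Node.js uses for its I/O-bound workloads.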
This will primarily be for Microsoft Windows operating systems.
He/She will be responsible for the design, coding, and testing of software components.
Requirements:
- Experience in Windows NDIS networking driver development.
- Experience in developing and maintaining network, iWARP, TCP offload, iSCSI, and/or FCoE drivers is preferable.
- Experience in analyzing and debugging kernel crash dumps.
- Knowledge of RDMA/iWARP/RoCE, TCP offload, iSCSI, FCoE, FC, and networking device drivers.
- Good knowledge of TCP/IP.
- Performance tuning, crash dump analysis, and solving critical customer issues.
1. Experience and in-depth knowledge of HTML and CSS. (PRIMARY)
2. Knowledge of Principles of Responsive Web Design.
3. Experience with JavaScript and, ideally, knowledge of one of these frameworks: Angular, React, or Polymer.
4. Knowledge and experience integrating web apps with a back-end using REST APIs.
5. Knowledge of version control tools like Git.
Extras:
1. Principles of web performance standards and PWAs.
2. Build and automation tooling using a tool like Gulp or Grunt.
3. Comfortable with the command line and with package managers like npm.
Job Description:
Roles & Responsibilities:
· You will be involved in every part of the project lifecycle, right from identifying the business problem and proposing a solution, to data collection, cleaning, and preprocessing, to training and optimizing ML/DL models and deploying them to production.
· You will often be required to design and execute proof-of-concept projects that can demonstrate business value and build confidence with CloudMoyo’s clients.
· You will be involved in designing and delivering data visualizations that utilize the ML models to generate insights and intuitively deliver business value to CXOs.
Desired Skill Set:
· Candidates should have strong Python coding skills and be comfortable working with various ML/DL frameworks and libraries.
· Hands-on skills and industry experience in one or more of the following areas are necessary:
1) Deep Learning (CNNs/RNNs, Reinforcement Learning, VAEs/GANs)
2) Machine Learning (Regression, Random Forests, SVMs, K-means, ensemble methods)
3) Natural Language Processing
4) Graph Databases (Neo4j, Apache Giraph)
5) Azure Bot Service
6) Azure ML Studio / Azure Cognitive Services
7) Log Analytics with NLP/ML/DL
· Previous experience with data visualization, C#, or the Azure cloud platform and services will be a plus.
· Candidates should have excellent communication skills and be highly technical, with the ability to discuss ideas at any level from executive to developer.
· Creative problem-solving, unconventional approaches, and a hacker mindset are highly desired.
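The "Regression" item in the skill list above can be illustrated with a minimal example: a from-scratch sketch of ordinary least squares for a single feature, with no ML framework involved. The data and names are made up for the example, chosen so the fit is exact.

```python
# Toy illustration of simple regression: ordinary least squares
# for one feature, in pure Python (no ML framework).

def fit_simple_ols(xs, ys):
    """Return (slope, intercept) minimizing squared error."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Closed-form OLS: slope = cov(x, y) / var(x).
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Points lying exactly on y = 2x + 1, so the fit should recover it.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]
slope, intercept = fit_simple_ols(xs, ys)
print(round(slope, 6), round(intercept, 6))  # 2.0 1.0
```

In practice a library such as scikit-learn would be used instead, but the closed-form solution above is the same computation `LinearRegression` performs for the one-feature case.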
