
- Must have 4 to 7 years of experience in ETL Design and Development using Informatica Components.
- Should have extensive knowledge of Unix shell scripting.
- Understanding of DW principles (fact and dimension tables, dimensional modelling, and data warehousing concepts).
- Research, develop, document, and modify ETL processes per data architecture and modelling requirements.
- Ensure appropriate documentation for all new development and modifications of the ETL processes and jobs.
- Should be proficient in writing complex SQL queries (a brief example follows this list).
- Selected candidates will be provided training opportunities in one or more of the following: Google Cloud, AWS, DevOps tools, and Big Data technologies such as Hadoop, Pig, Hive, Spark, Sqoop, Flume, and Kafka.
- Will get the chance to be part of enterprise-grade implementations of Cloud and Big Data systems.
- Will play an active role in setting up a modern data platform based on Cloud and Big Data.
- Would be part of teams with rich experience in various aspects of distributed systems and computing.
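To give a concrete flavour of the SQL and dimensional-modelling skills above, here is a minimal sketch of a star-schema aggregation query, run through Python's sqlite3 only to keep it self-contained; the table and column names (sales_fact, date_dim, product_dim) are hypothetical, not from the posting.

```python
import sqlite3

# Illustrative star-schema aggregation: roll a fact table up across two
# dimensions. Table and column names are hypothetical.
STAR_SCHEMA_REPORT = """
SELECT d.calendar_year,
       p.product_category,
       SUM(f.sales_amount) AS total_sales
FROM   sales_fact f
JOIN   date_dim    d ON f.date_key    = d.date_key
JOIN   product_dim p ON f.product_key = p.product_key
GROUP  BY d.calendar_year, p.product_category
ORDER  BY total_sales DESC;
"""

def run_report(db_path: str) -> list[tuple]:
    # Any DB-API connection would do; sqlite3 keeps the sketch self-contained.
    with sqlite3.connect(db_path) as conn:
        return conn.execute(STAR_SCHEMA_REPORT).fetchall()
```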

Role Description
This is a full-time on-site role for a Python Developer located in Pune. The Python Developer will be responsible for back-end web development, software development, and programming using Python. Day-to-day tasks include developing, testing, and maintaining scalable web applications and server-side logic, optimizing performance, and integrating user-facing elements with back-end services. The role also demands collaboration with cross-functional teams to define, design, and ship new features.
Key Responsibilities
- Lead the backend development team, ensuring best practices in coding, architecture, and performance optimization.
- Design, develop, and maintain scalable backend services using Python and FastAPI.
- Architect and optimize databases, ensuring efficient storage and retrieval of data using MongoDB.
- Integrate AI models and data science workflows into enterprise applications.
- Implement and manage AWS cloud services, including Lambda, S3, EC2, and other AWS components.
- Automate deployment pipelines using Jenkins and CI/CD best practices.
- Ensure security and reliability, implementing best practices for authentication, authorization, and data privacy.
- Monitor and troubleshoot system performance, optimizing infrastructure and codebase.
- Collaborate with data scientists, front-end engineers, and product teams to build AI-driven solutions.
- Stay up to date with the latest technologies in AI, backend development, and cloud computing.
Required Skills & Qualifications
- 3-4 years of experience in backend development with Python.
- Strong experience with the FastAPI framework (see the sketch after this list).
- Proficiency in MongoDB or other NoSQL databases.
- Hands-on experience with AWS services (Lambda, S3, EC2, etc.).
- Experience with Jenkins and CI/CD pipelines.
- Data Science knowledge with experience integrating AI models and data pipelines.
- Strong understanding of RESTful API design, microservices, and event-driven architecture.
- Experience in performance tuning, caching, and security best practices.
- Proficiency in working with Docker and containerized applications.
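As a rough illustration of the FastAPI-plus-MongoDB stack this role centres on, here is a minimal sketch of an async CRUD endpoint. The connection string, database/collection names, and document shape are assumptions, not part of the posting.

```python
from fastapi import FastAPI, HTTPException
from motor.motor_asyncio import AsyncIOMotorClient
from pydantic import BaseModel

app = FastAPI()
# Connection string, database, and collection names are illustrative.
client = AsyncIOMotorClient("mongodb://localhost:27017")
items = client["demo_db"]["items"]

class Item(BaseModel):
    name: str
    price: float

@app.post("/items")
async def create_item(item: Item) -> dict:
    # Motor is MongoDB's async driver, so the handler can await the insert.
    result = await items.insert_one(item.model_dump())
    return {"id": str(result.inserted_id)}

@app.get("/items/{name}")
async def read_item(name: str) -> Item:
    # Exclude Mongo's _id so the document maps cleanly onto the Pydantic model.
    doc = await items.find_one({"name": name}, {"_id": 0})
    if doc is None:
        raise HTTPException(status_code=404, detail="Item not found")
    return Item(**doc)
```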
Job Title: Senior Data Engineer
No. of Positions: 3
Employment Type: Full-Time, Permanent
Location: Remote (Pan India)
Shift Timings: 10:00 AM – 7:00 PM IST
Experience Required: 3+ years
Mandatory Skills: Scala & PySpark
Role Overview
We are looking for an experienced Senior Data Engineer to design, build, and optimize scalable data pipelines and architectures. The ideal candidate should have hands-on experience working with Big Data technologies, distributed systems, and ETL pipelines. You will work closely with cross-functional teams including Data Analysts, Data Scientists, and Software Engineers to ensure efficient data flow and reliable data infrastructure.
Key Responsibilities
- Design and build scalable data pipelines for extraction, transformation, and loading (ETL) from various data sources (a minimal PySpark sketch follows this list).
- Enhance internal processes by automating tasks, optimizing data workflows, and improving infrastructure performance.
- Collaborate with Product, Engineering, Data, and Business teams to understand data needs and provide solutions.
- Work closely with machine learning and analytics teams to support advanced data modeling and innovation.
- Ensure systems are highly reliable, maintainable, and optimized for performance.
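Since PySpark is a mandatory skill here, a minimal sketch of such an ETL pipeline follows; the S3 paths, column names, and transformation logic are placeholders: read raw CSV, clean and derive columns, and write partitioned Parquet.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw CSV landed by upstream systems (path is a placeholder).
raw = spark.read.option("header", True).csv("s3://bucket/raw/orders/")

# Transform: cast types, drop bad rows, and derive a partition column.
orders = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)
       .withColumn("order_date", F.to_date("created_at"))
)

# Load: write partitioned Parquet; on Databricks, .format("delta") would
# target Delta Lake instead.
orders.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://bucket/curated/orders/"
)
```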
Required Qualifications & Skills
- Bachelor’s degree in Computer Science, Engineering, or related field.
- 3+ years of hands-on experience in Data Engineering.
- Strong experience with Apache Spark, with solid understanding of distributed data processing.
- Proficiency in Scala and PySpark is mandatory.
- Strong SQL skills and experience working with relational and non-relational data.
- Experience with cloud-based data platforms (preferably Databricks).
- Good understanding of Delta Lake architecture, Parquet, JSON, CSV, and related data file formats.
- Comfortable working in Linux/macOS environments with scripting capabilities.
- Ability to work in an Agile environment and deliver independently.
- Good communication and collaboration skills.
- Knowledge of Machine Learning concepts is an added advantage.
Reporting
- This role will report to the CEO or a designated Team Lead.
Benefits & Work Environment
- Remote work flexibility across India.
- Encouraging and diverse work culture.
- Paid leaves, holidays, performance incentives, and learning opportunities.
- Supportive environment that promotes personal and professional growth.
What We’re Looking For:
- Strong experience in Python (5+ years).
- Hands-on experience with any database (SQL or NoSQL).
- Experience with frameworks like Flask, FastAPI, or Django.
- Knowledge of ORMs, API development, and unit testing (see the sketch below).
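A minimal sketch of the ORM and unit-testing skills listed above, using SQLAlchemy 2.0-style declarative models with an in-memory SQLite database; the model and test names are illustrative.

```python
import unittest

from sqlalchemy import Integer, String, create_engine
from sqlalchemy.orm import DeclarativeBase, Mapped, Session, mapped_column

class Base(DeclarativeBase):
    pass

class User(Base):
    __tablename__ = "users"
    id: Mapped[int] = mapped_column(Integer, primary_key=True)
    name: Mapped[str] = mapped_column(String(50))

class UserModelTest(unittest.TestCase):
    def setUp(self) -> None:
        # An in-memory SQLite database keeps the test fast and self-contained.
        self.engine = create_engine("sqlite://")
        Base.metadata.create_all(self.engine)

    def test_insert_and_query(self) -> None:
        with Session(self.engine) as session:
            session.add(User(name="Ada"))
            session.commit()
            self.assertEqual(session.query(User).count(), 1)

if __name__ == "__main__":
    unittest.main()
```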
- 5-10 years of experience in ETL testing, Snowflake, and DWH concepts.
- Strong SQL knowledge & debugging skills are a must.
- Experience with Azure and Snowflake testing is a plus.
- Experience with Qlik Replicate and Qlik Compose (Change Data Capture) tools is considered a plus.
- Strong data warehousing concepts and experience with ETL tools such as Talend Cloud Data Integration and Pentaho/Kettle.
- Experience with JIRA and the Xray defect management tool is good to have.
- Exposure to the financial domain knowledge is considered a plus
- Testing data readiness (data quality) and addressing code or data issues (a reconciliation-check sketch follows this list).
- Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions
- Demonstrated ability to collaborate across regions (APAC, EMEA, and NA) to effectively and efficiently identify the root cause of code/data issues and arrive at a permanent solution.
- Prior experience with State Street and Charles River Development (CRD) considered a plus
- Experience with tools such as PowerPoint, Excel, and SQL.
- Exposure to third-party data providers such as Bloomberg, Reuters, MSCI, and other rating agencies is a plus.
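As an illustration of the data-readiness testing mentioned in this list, here is a minimal sketch of a source-versus-target row-count reconciliation. It assumes generic DB-API 2.0 connections (snowflake-connector-python provides one for Snowflake), and the table names are hypothetical.

```python
# Hypothetical data-readiness check: compare row counts between a source
# table and its warehouse target over any two DB-API 2.0 connections.

def row_count(conn, table: str) -> int:
    # Table names are assumed to come from a trusted allow-list,
    # never from user input.
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    return cur.fetchone()[0]

def reconcile(source_conn, target_conn, table: str) -> None:
    src = row_count(source_conn, table)
    tgt = row_count(target_conn, table)
    if src != tgt:
        raise AssertionError(
            f"{table}: source has {src} rows, target has {tgt} (diff {src - tgt})"
        )
    print(f"{table}: row counts match ({src})")
```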
Key Attributes include:
- Team player with professional and positive approach
- Creative, innovative and able to think outside of the box
- Strong attention to detail during root cause analysis and defect issue resolution
- Self-motivated & self-sufficient
- Effective communicator both written and verbal
- Brings a high level of energy with enthusiasm to generate excitement and motivate the team
- Able to work under pressure with tight deadlines and/or multiple projects
- Experience in negotiation and conflict resolution
Merito is a curated talent platform that connects talent to the right opportunities and helps our clients hire quality talent faster.
About our Client :-
Our client is an award-winning digital marketing company providing state-of-the-art software and services for businesses around the globe.
What are we looking for?
We are looking for an experienced, results-driven executive to lead and manage our team of experts across SEO, paid media, content, and strategy. The Head of Digital should have exceptional knowledge of all disciplines of search and proven experience managing and leading teams across those disciplines. This candidate should be well connected and respected in the search industry, and should have excellent communication and presentation skills. The candidate should understand current and future digital trends, be hands-on with strategies and tactics, and bring a customer-centric mindset with the ability to operate at both a strategic and a tactical level.
This position is based in Bangalore/Ahmedabad India.
Responsibilities :-
- Leadership – The Head of COE will oversee delivery across all departments, strategic vision, and operational execution, and lead teams to accomplish company-level OKRs
- Omnichannel strategy deployment – customer journey mapping, gap analysis, need identification, and strategy and solution design
- Training, coaching, and mentoring various teams
- RFP support
- Flagship account growth – QBR support and media planning
- Upholding quality and NPS standards across all COE team members
- Building a high-performance culture, including capacity and staffing models and the use of platforms to deliver and scale SEO and paid services
- Nurturing and building an innovation-driven culture
Key Skills Set :-
o 15+ years with a proven track record of building SEO, social, content, and performance marketing strategies
o Proven track record of building high-performing teams
o Detail-oriented, with excellent analytical and problem-solving skills
o Strong communication skills
o Ability to thrive in a fast-paced environment
As a Golang Developer, you will be working on a Blockchain Layer 1.
● Advanced proficiency in the Golang programming language; skills in languages such as C++, Java, Solidity, and Python are good to have.
● Extensive experience in back-end development, algorithms, and data structures.
● Extensive knowledge of blockchain structure, protocol development, or smart contracts (a minimal sketch follows this list).
● Writing clean, efficient, and reusable code that follows best practices and coding standards.
● Knowledge of distributed and decentralized network protocols.
● Knowledge of various decentralized ledger technologies and protocols.
● Understanding of gossip and consensus protocols.
● Knowledge of best practices in data protection.
● Collaborating with managers to determine technology needs and envisaged functionalities.
● Creating application features and interfaces using programming languages and writing multithreaded code.
● Applying the latest cryptographic techniques to protect digital transaction data against cyberattacks and information hacks.
● Maintaining client- and server-side applications.
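To make "blockchain structure" concrete: a minimal hash-linked chain, sketched in Python purely for illustration (a production Layer 1 in Go would add transactions, signatures, networking, and a consensus protocol). All names here are hypothetical.

```python
import hashlib
import json
import time
from dataclasses import asdict, dataclass

@dataclass
class Block:
    index: int
    timestamp: float
    data: str
    prev_hash: str

    def digest(self) -> str:
        # Hash the canonical JSON encoding of the block's fields.
        payload = json.dumps(asdict(self), sort_keys=True).encode()
        return hashlib.sha256(payload).hexdigest()

def append_block(chain: list[Block], data: str) -> None:
    prev = chain[-1]
    chain.append(Block(prev.index + 1, time.time(), data, prev.digest()))

def verify(chain: list[Block]) -> bool:
    # Each block must reference the digest of its predecessor; altering any
    # earlier block breaks every link after it.
    return all(
        chain[i].prev_hash == chain[i - 1].digest() for i in range(1, len(chain))
    )

chain = [Block(0, time.time(), "genesis", "0" * 64)]
append_block(chain, "tx: alice -> bob 5")
assert verify(chain)
```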
Work from home presently due to COVID; later from the Bangalore office.
CTC – up to 11 lakh per annum + performance pay
5-day work week
Post: BDA / Sr. BDA (Business Development Associate) – inside sales
Age criteria: maximum 31 years
1 year of ed-tech experience is a must
Both male and female candidates can apply
About the job:
You'll be responsible for building the backend of our platforms. You must know MongoDB and Node.js.
Expectations:
Develop backend APIs.
Design and configure schemas, with strong MongoDB data modelling skills (a schema-validation sketch follows this list).
Ensure the complete stack is designed and built for high performance and scalability.
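A minimal sketch of collection-level schema design in MongoDB via a JSON Schema validator. It is shown in Python with pymongo purely for brevity; the validator document itself is identical when issued from a Node.js driver. Database, collection, and field names are assumptions.

```python
from pymongo import ASCENDING, MongoClient

client = MongoClient("mongodb://localhost:27017")
db = client["demo_db"]

# Enforce a document shape at the collection level (field names are assumed).
db.create_collection(
    "users",
    validator={
        "$jsonSchema": {
            "bsonType": "object",
            "required": ["email", "created_at"],
            "properties": {
                "email": {"bsonType": "string"},
                "created_at": {"bsonType": "date"},
            },
        }
    },
)

# Index the lookup key so reads stay fast as the collection grows.
db["users"].create_index([("email", ASCENDING)], unique=True)
```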
Your Requirement:
Must know MongoDB and Node.js.
Work experience of minimum 2 years.
Worked with startups and evolved products for scale.
Preferred:
Growth mindset & leadership abilities to work in a frugal environment.
- Team Name - SDET
- Skills and Stacks - Java, Spring Boot, MySQL, AWS stack, HTTP/gRPC
- Project one-line description - The team will be required to close the P0 E2E automation