
Job Description:
We are looking for a Big Data Engineer who has worked across the entire ETL stack: someone who has ingested data in batch and live-stream formats, transformed large volumes of daily data, built data warehouses to store the transformed data, and integrated different visualization dashboards and applications with the data stores. The primary focus will be on choosing optimal solutions for these purposes, then implementing, maintaining, and monitoring them.
Responsibilities:
- Develop, test, and implement data solutions based on functional / non-functional business requirements.
- Code daily in Scala and PySpark on cloud as well as on-premises infrastructure.
- Build data models that store data in the most optimized manner.
- Identify, design, and implement process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Implementing the ETL process and optimal data pipeline architecture
- Monitoring performance and advising on any necessary infrastructure changes.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Proactively identify potential production issues and recommend and implement solutions.
- Must be able to write quality code and build secure, highly available systems.
- Create design documents that describe the functionality, capacity, architecture, and process.
- Review peers' code and pipelines before deployment to production, checking for optimization issues and adherence to code standards.
Skill Sets:
- Good understanding of optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and ‘big data’ technologies.
- Proficient understanding of distributed computing principles
- Experience working with batch-processing and real-time systems using open-source technologies such as Spark, Pig, Hive, Apache Airflow, and NoSQL stores.
- Implemented complex projects dealing with considerable data sizes (petabyte scale).
- Optimization techniques (performance, scalability, monitoring, etc.)
- Experience with integration of data from multiple data sources
- Experience with NoSQL databases, such as HBase, Cassandra, MongoDB, etc.
- Knowledge of various ETL techniques and frameworks, such as Flume
- Experience with various messaging systems, such as Kafka or RabbitMQ
- Creation of DAGs for data engineering
- Expert in Python/Scala programming, especially for data engineering/ETL purposes
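As a rough illustration of the DAG creation mentioned above, here is a minimal, framework-free sketch in pure Python. The task names and dependency graph are hypothetical; a real pipeline would typically express each step as an Apache Airflow operator rather than a plain function.

```python
from graphlib import TopologicalSorter

# Hypothetical ETL task graph: each key lists the tasks it depends on,
# mirroring the extract -> transform -> load stages described above.
PIPELINE = {
    "extract_batch": set(),
    "extract_stream": set(),
    "transform": {"extract_batch", "extract_stream"},
    "load_warehouse": {"transform"},
    "refresh_dashboards": {"load_warehouse"},
}

def run_pipeline(graph):
    """Execute tasks in dependency order; returns the execution order."""
    order = list(TopologicalSorter(graph).static_order())
    for task in order:
        # In Airflow each entry would be an operator; here we just record it.
        print(f"running {task}")
    return order

if __name__ == "__main__":
    run_pipeline(PIPELINE)
```

The topological sort guarantees that `transform` never runs before both extracts have finished, which is the core scheduling property a DAG-based orchestrator provides.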

About Ganit Business Solutions
Ganit Inc. is in the business of enhancing the Decision Making Power (DMP) of businesses by offering solutions that lie at the crossroads of discovery-based artificial intelligence, hypothesis-based analytics, and the Internet of Things (IoT).
The company's offerings consist of a functioning product suite and bespoke services. The goal is to integrate these solutions into the core of its clients' decision-making processes as seamlessly as possible. Customers in the FMCG/CPG, Retail, Logistics, Hospitality, Media, Insurance, and Banking sectors are served by Ganit's offices in both India and the United States. The company views data as a strategic resource that can help businesses grow both their top and bottom lines. It builds and implements AI and ML solutions, purpose-built for specific sectors, to increase decision velocity and decrease decision risk.
Similar jobs
Job Title: Generative AI Engineer (Specialist in Deep Learning)
Location: Gandhinagar, Ahmedabad, Gujarat
Company: Rayvat Outsourcing
Salary: Up to ₹2,50,000 per annum
Job Type: Full-Time
Experience: 0 to 1 Year
Job Overview:
We are seeking a talented and enthusiastic Generative AI Engineer to join our team. As an Intermediate-level engineer, you will be responsible for developing and deploying state-of-the-art generative AI models to solve complex problems and create innovative solutions. You will collaborate with cross-functional teams, working on a variety of projects that range from natural language processing (NLP) to image generation and multimodal AI systems. The ideal candidate has hands-on experience with machine learning models, deep learning techniques, and a passion for artificial intelligence.
Key Responsibilities:
· Develop, fine-tune, and deploy generative AI models using frameworks such as GPT, BERT, DALL·E, Stable Diffusion, etc.
· Research and implement cutting-edge machine learning algorithms in NLP, computer vision, and multimodal systems.
· Collaborate with data scientists, ML engineers, and product teams to integrate AI solutions into products and platforms.
· Create APIs and pipelines to deploy models in production environments, ensuring scalability and performance.
· Analyze large datasets to identify key features, patterns, and use cases for model training.
· Debug and improve existing models by evaluating performance metrics and applying optimization techniques.
· Stay up-to-date with the latest advancements in AI, deep learning, and generative models to continually enhance the solutions.
· Document technical workflows, including model architecture, training processes, and performance reports.
· Ensure ethical use of AI, adhering to guidelines around AI fairness, transparency, and privacy.
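The model development and debugging work described above ultimately rests on the basic decoding step behind generative language models: temperature-scaled softmax sampling. Below is a toy, stdlib-only sketch of that step; the token names and logit values are made up, and a real model would produce logits over a vocabulary of tens of thousands of tokens.

```python
import math
import random

def sample_token(logits, temperature=1.0, rng=None):
    """Sample one token from raw model scores using temperature-scaled softmax.

    `logits` maps candidate tokens to raw scores (hypothetical values here).
    Lower temperature sharpens the distribution toward the top-scoring token.
    """
    rng = rng or random.Random()
    scaled = {t: score / temperature for t, score in logits.items()}
    m = max(scaled.values())  # subtract the max for numerical stability
    exps = {t: math.exp(s - m) for t, s in scaled.items()}
    total = sum(exps.values())
    probs = {t: e / total for t, e in exps.items()}
    # Draw one token according to the resulting distribution.
    r, acc = rng.random(), 0.0
    for token, p in probs.items():
        acc += p
        if r <= acc:
            return token, probs
    return token, probs  # floating-point fallback: last token

token, probs = sample_token({"cat": 2.0, "dog": 1.0, "axolotl": 0.1},
                            temperature=0.5, rng=random.Random(0))
```

At temperature 0.5 the gap between "cat" and the other candidates widens, which is why low temperatures make generation more deterministic and high temperatures more diverse.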
Qualifications:
· Bachelor’s/Master’s degree in Computer Science, Machine Learning, Data Science, or a related field.
· 2-4 years of hands-on experience in machine learning and AI development, particularly in generative AI.
· Proficiency with deep learning frameworks such as TensorFlow, PyTorch, or similar.
· Experience with NLP models (e.g., GPT, BERT) or image-generation models (e.g., GANs, diffusion models).
· Strong knowledge of Python and libraries like NumPy, Pandas, scikit-learn, etc.
· Experience with cloud platforms (e.g., AWS, GCP, Azure) for AI model deployment and scaling.
· Familiarity with APIs, RESTful services, and microservice architectures.
· Strong problem-solving skills and the ability to troubleshoot and optimize AI models.
· Good understanding of data preprocessing, feature engineering, and handling large datasets.
· Excellent written and verbal communication skills, with the ability to explain complex concepts clearly.
Preferred Skills:
· Experience with multimodal AI systems (combining text, image, and/or audio data).
· Familiarity with ML Ops and CI/CD pipelines for deploying machine learning models.
· Experience in A/B testing and performance monitoring of AI models in production.
· Knowledge of ethical AI principles and AI governance.
What We Offer:
· Competitive salary and benefits package.
· Opportunities for professional development and growth in the rapidly evolving AI field.
· Collaborative and dynamic work environment, with access to cutting-edge AI technologies.
· Work on impactful projects with real-world applications.
Job Title: E-Commerce Executive
Salary: ₹16,000 - ₹25,000 per month
Experience: Minimum 1 year in e-commerce
Job Description:
We are seeking a motivated and detail-oriented E-Commerce Executive to join our dynamic team. The ideal candidate will have a solid understanding of e-commerce platforms, particularly Amazon, and will be responsible for managing and optimizing our online sales channels.
Key Responsibilities:
- Manage and optimize product listings on Amazon and other e-commerce platforms.
- Monitor inventory levels and coordinate with suppliers to ensure timely stock replenishment.
- Analyze sales data and market trends to identify opportunities for growth.
- Implement marketing strategies to enhance product visibility and drive sales.
- Handle customer inquiries and resolve issues related to online orders.
- Collaborate with the marketing team to create promotional campaigns.
- Stay updated on e-commerce trends and best practices.
Qualifications:
- Minimum 1 year of experience in e-commerce management.
- Knowledge of Amazon Seller Central and other e-commerce platforms.
- Strong analytical skills and attention to detail.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Proficiency in Microsoft Office and e-commerce management tools.
Company Description
Nadcab Labs is a dynamic and innovative team of blockchain enthusiasts based in Prayagraj. Our primary focus is on developing cutting-edge applications for the decentralized world, including smart contracts, decentralized applications (dApps), and various DeFi (Decentralized Finance) products. We are dedicated to building robust and secure blockchain solutions.
Role & Responsibilities
- Be expert in the English language (grammar, vocabulary, and training methodology) and Business Communication.
- Have error-free fluency in spoken and written English.
- Conduct daily training sessions with employees
- Include soft-skills and behavioral training (according to in-house observation of employees).
- Identify individual areas of improvement and help employees with them via periodical 1-on-1 meetings.
- Coordinate with management to better customize training modules and sessions.
- Create engaging training material.
- Promote a global communication culture with language and soft skills.
Qualifications
- 2+ years of experience with English-language and soft-skills training.
- Knowledge of the global standards of English and Business Communication.
- Research skills required to carry out diagnostic check-ups for communication hurdles within the firm.
- Ability to prepare periodic tests for the various stages of training.
Primary Duties and Responsibilities
- Experience with Informatica Multidomain MDM 10.4 tool suite preferred
- Partnering with data architects and engineers to ensure an optimal data model design and implementation for each MDM domain in accordance with industry and MDM best practices
- Works with data governance and business steward(s) to design, develop, and configure business rules for data validation, standardization, matching, and merging.
- Implementation of data quality policies, procedures, and standards along with the Data Governance Team for maintenance of customer, location, product, and other data domains; experience with the Informatica IDQ tool suite preferred.
- Performs data analysis and source-to-target mapping for ingest and egress of data.
- Maintain compliance with change control, SDLC, and development standards.
- Champion the creation and contribution to technical documentation and diagrams.
- Establishes a technical vision and strategy with the team and works with the team to turn it into reality.
- Emphasis on coaching and training to cultivate skill development of team members within the department.
- Responsible for keeping up with industry best practices and trends.
- Monitor, troubleshoot, maintain, and continuously improve the MDM ecosystem.
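The match-and-merge rule configuration described above can be illustrated with a toy, tool-agnostic sketch. The similarity threshold, field names, and survivorship rule here are all hypothetical; Informatica MDM expresses these as declarative match and merge rules rather than hand-written code.

```python
from difflib import SequenceMatcher

def is_match(rec_a, rec_b, threshold=0.85):
    """Fuzzy-match two customer records on name similarity —
    a simplified stand-in for an MDM match rule."""
    score = SequenceMatcher(None, rec_a["name"].lower(),
                            rec_b["name"].lower()).ratio()
    return score >= threshold

def merge(rec_a, rec_b):
    """Toy survivorship rule: prefer non-empty values from the first record."""
    return {k: rec_a.get(k) or rec_b.get(k)
            for k in set(rec_a) | set(rec_b)}

# Hypothetical duplicate records from two source systems.
a = {"name": "Acme Corporation", "phone": "", "city": "Des Moines"}
b = {"name": "ACME Corporation", "phone": "555-0100", "city": ""}
golden = merge(a, b) if is_match(a, b) else None
```

The merged "golden record" keeps the best-populated value for each field, which is the essential outcome a match-and-merge configuration is tuned to produce.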
Secondary Duties and Responsibilities
- May participate in off-hours on-call rotation.
- Attends and is prepared to participate in team, department and company meetings.
- Performs other job related duties and special projects as assigned.
Supervisory Responsibilities
This is a non-management role.
Education and Experience
- Bachelor's degree in MIS, Computer Sciences, Business Administration, or related field; or High School Degree/General Education Diploma and 4 years of relevant experience in lieu of Bachelor's degree.
- 5+ years of experience in implementing MDM solutions using Informatica MDM.
- 2+ years of experience in data stewardship, data governance, and data management concepts.
- Professional working knowledge of Customer 360 solution
- Professional working knowledge in multi domain MDM data modeling.
- Strong understanding of company master data sets and their application in complex business processes and support data profiling, extraction, and cleansing activities using Informatica Data Quality (IDQ).
- Strong knowledge in the installation and configuration of the Informatica MDM Hub.
- Familiarity with real-time, near real-time and batch data integration.
- Strong experience and understanding of Informatica toolsets including Informatica MDM Hub, Informatica Data Quality (IDQ), Informatica Customer 360, Informatica EDC, Hierarchy Manager (HM), Business Entity Service Model, Address Doctor, Customizations & Composite Services
- Experience with event-driven architectures (e.g. Kafka, Google Pub/Sub, Azure Event Hub, etc.).
- Professional working knowledge of CI/CD technologies such as Concourse, TeamCity, Octopus, Jenkins, and CircleCI.
- Team player that exhibits high energy, strategic thinking, collaboration, direct communication and results orientation.
Physical Requirements
- Visual requirements include: ability to see detail at near range with or without correction. Must be physically able to perform sedentary work: occasionally lifting or carrying objects of no more than 10 pounds, and occasionally standing or walking, reaching, handling, grasping, feeling, talking, hearing and repetitive motions.
Working Conditions
- The duties of this position are performed through a combination of an open office setting and remote work options. Full remote work options available for employees that reside outside of the Des Moines Metro Area. There is frequent pressure to meet deadlines and handle multiple projects in a day.
Equipment Used to Perform Job
- Windows, or Mac computer and various software solutions.
Financial Responsibility
- Responsible for company assets including maintenance of software solutions.
Contacts
- Has frequent contact with office personnel in other departments related to the position, as well as occasional contact with users and customers. Engages stakeholders from other areas of the business.
Confidentiality
- Has access to confidential information including trade secrets, intellectual property, various financials, and customer data.
Build relationships with customers and explain the product.
It is an ed-tech company authorized by the Government of India and a knowledge partner of NSDC.



Responsibilities:
- Investigate technologies and solutions which help the team achieve deliverables and results.
- Working with business process analysts and technical leads to derive conceptual designs and solutions.
- Assisting project managers with sizing estimates and work-breakdown structures.
- Develop and unit test software within teams of varied sizes and geographical distributions.
- Benchmarking and load testing developed modules and integrated applications.
- Assisting quality assurance team in the conduct of solution testing.
- Documenting software specifications and requirements.
- Working with infrastructure teams to define and establish application environments and implementation of software applications.
- Provide post-implementation support for developed software applications and solutions.
Measures of Success:
- The delivery of software solutions that meet requirements within budget and project timeline.
- Achievement of good development-quality metrics and standards aligned with industry best practices.
Skill Requirements:
- Knowledge of general open-system / distributed-systems platforms
- Knowledge of cloud services such as AWS, Azure, and GCP
- Experience in developing and implementing RESTful APIs
- Good understanding of network ports and transfer protocols.
- Ability to process/analyze large amounts of data and logs.
- UI and Backend Development experience (Full stack).
- 6+ years of experience in software development within a Linux environment, with a strong focus on development using Java, JavaScript, C, and Bash shell and Python scripting.
- Experienced in structured and agile development methodologies.
- Experienced in load-testing and benchmarking tools such as JMeter, LoadRunner, etc.
- Experienced in Linux-based database technology and versed with stored procedures.
- Experienced in web services developments using XML, SOAP, JSON.
- Knowledge of application security setup using common security standards like SSL, TLS, secured ciphers, etc.
- Experienced with software version control using Git.
Soft Skills:
- Strong interpersonal and communication skills.
- Strong problem solving and analytical skills.
- Self-motivated and detail-oriented with a desire to design and develop top quality software products.
Basic Qualifications for Consideration:
- Bachelor’s degree in Computer Science or a related field
- Minimum 4 years of experience
- Implementing various development, testing, and automation tools, and IT infrastructure
- Planning the team structure and activities, and involvement in project-management activities
- Managing stakeholders and external interfaces
- Setting up tools and required infrastructure
- Defining and setting development, test, release, update, and support processes for DevOps operation
- Having the technical skill to review, verify, and validate the software code developed in the project
- Troubleshooting and fixing code bugs
- Monitoring the processes during the entire lifecycle for adherence, and updating or creating new processes for improvement and to minimize waste
- Encouraging and building automated processes wherever possible
- Identifying and deploying cybersecurity measures by continuously performing vulnerability assessment and risk management
- Incident management and root-cause analysis
- Coordination and communication within the team and with customers
- Selecting and deploying appropriate CI/CD tools
- Striving for continuous improvement and building a continuous integration, continuous delivery, and continuous deployment (CI/CD) pipeline
- Mentoring and guiding team members
- Monitoring and measuring customer experience and KPIs
- Managing periodic reporting on progress to management and the customer
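The CI/CD pipeline responsibilities above can be sketched as a minimal fail-fast stage runner. The stage names and shell commands below are placeholders; a real pipeline would be defined declaratively in a tool such as Jenkins, Concourse, or TeamCity, with each stage gating the next.

```python
import subprocess

# Hypothetical pipeline: each stage is a shell command. A failing stage
# stops the run, mirroring how CI/CD tools gate later stages on earlier ones.
STAGES = [
    ("lint",  "echo linting"),
    ("test",  "echo running tests"),
    ("build", "echo building artifact"),
]

def run(stages):
    """Run stages in order; return (completed stages, first failed stage or None)."""
    completed = []
    for name, cmd in stages:
        result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
        if result.returncode != 0:
            # Fail fast: report which stage broke and stop the pipeline.
            return completed, name
        completed.append(name)
    return completed, None

done, failed = run(STAGES)
```

The fail-fast behavior is the design point: a broken test stage should never let a build reach deployment, which is exactly what selecting and configuring a CI/CD tool enforces at scale.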
Job Description
Problem Formulation: Identifies possible options to address business problems; must possess a good understanding of dimensional modelling.
Must have worked on at least one end-to-end project using a cloud data warehouse (Azure Synapse, AWS Redshift, Google BigQuery).
Good to have an understanding of Power BI and its integration with cloud services such as Azure or GCP.
Experience working with SQL Server and SSIS (preferred).
Applied Business Acumen: Supports the development of business cases and recommendations. Owns delivery of project activity and tasks assigned by others. Supports process updates and changes. Solves business issues.
Data Transformation/Integration/Optimization:
The ETL developer is responsible for designing and creating the data warehouse and all related data extraction, transformation, and load functions in the company.
The developer should provide oversight and planning of data models, database structural design, and deployment, and work closely with the data architect and business analyst.
Duties include working in cross-functional software development teams (business analysts, testers, developers), following agile ceremonies and development practices.
The developer plays a key role in contributing to the design, evaluation, selection, implementation, and support of database solutions.
Development and Testing: Develops code for the required solution by determining the appropriate approach and leveraging business, technical, and data requirements.
Creates test cases to review and validate the proposed solution design. Works on POCs and deploys the software to production servers.
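As a toy illustration of the dimensional-model design work described above, the following builds a one-fact, one-dimension star schema in an in-memory SQLite database. The table and column names are hypothetical; a production warehouse on Synapse, Redshift, or BigQuery would follow the same shape at far larger scale.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: descriptive attributes keyed by a surrogate key.
cur.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT)")
# Fact table: measures plus foreign keys into the dimensions.
cur.execute("""CREATE TABLE fact_sales (
                 product_key INTEGER REFERENCES dim_product(product_key),
                 amount REAL)""")

cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "widget"), (2, "gadget")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                [(1, 10.0), (1, 5.0), (2, 7.5)])

# Typical star-schema query: join the fact to its dimension and aggregate.
cur.execute("""SELECT d.name, SUM(f.amount)
               FROM fact_sales f JOIN dim_product d USING (product_key)
               GROUP BY d.name ORDER BY d.name""")
totals = dict(cur.fetchall())
```

Keeping measures in a narrow fact table and descriptions in dimensions is what makes these join-and-aggregate queries cheap, which is the central trade-off of star-schema design.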
Good to Have (Preferred Skills):
- Minimum 4-8 years of experience in data warehouse design and development for large-scale applications
- Minimum 3 years of experience with star schemas, dimensional modelling, and extract-transform-load (ETL) design and development
- Expertise working with various databases (SQL Server, Oracle)
- Experience developing packages, procedures, views, and triggers
- Nice to have: big data technologies
- Good written and oral communication skills
- Nice to have: SSIS
Education and Experience
- Minimum 4-8 years of software development experience
- Bachelor's and/or Master’s degree in computer science
Please reply with the details below.
Total Experience:
Relevant Experience:
Current CTC:
Expected CTC:
Any offers: Y/N
Notice Period:
Qualification:
DOB:
Present Company Name:
Designation:
Domain
Reason for job change:
Current Location:


