

DATA ENGINEER
Overview
They started with a singular belief - what is beautiful cannot and should not be defined in marketing meetings. It's defined by regular people like us, our sisters, our next-door neighbours, and the friends we make on the playground and in lecture halls. That's why we stand for people-proofing everything we do. From the inception of a product idea to testing the final formulations before launch, our consumers are a part of each and every process. They guide and inspire us by sharing their stories with us. They tell us not only about the products they need and the skincare issues they face, but also the tales of their struggles, dreams and triumphs. Skincare goes deeper than skin. It's a form of self-care for many. Wherever someone is on this journey, we want to cheer them on through the products we make, the content we create and the conversations we have. What we wish to build is more than a brand. We want to build a community that grows and glows together - cheering each other on, sharing knowledge, and ensuring people always have access to skincare that really works.
Job Description:
We are seeking a skilled and motivated Data Engineer to join our team. As a Data Engineer, you will be responsible for designing, developing, and maintaining the data infrastructure and systems that enable efficient data collection, storage, processing, and analysis. You will collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to implement data pipelines and ensure the availability, reliability, and scalability of our data platform.
Responsibilities:
Design and implement scalable and robust data pipelines to collect, process, and store data from various sources.
Develop and maintain data warehouse and ETL (Extract, Transform, Load) processes for data integration and transformation (see the sketch after this list).
Optimize and tune the performance of data systems to ensure efficient data processing and analysis.
Collaborate with data scientists and analysts to understand data requirements and implement solutions for data modeling and analysis.
Identify and resolve data quality issues, ensuring data accuracy, consistency, and completeness.
Implement and maintain data governance and security measures to protect sensitive data.
Monitor and troubleshoot data infrastructure, perform root cause analysis, and implement necessary fixes.
Stay up-to-date with emerging technologies and industry trends in data engineering and recommend their adoption when appropriate.
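For illustration only, here is a minimal sketch of the kind of ETL process referred to above, assuming a CSV source and a local SQLite warehouse; the file, column and table names are hypothetical placeholders, not specifics of this role.

```python
# Minimal ETL sketch (illustrative only): extract order data from a CSV,
# apply a simple transformation, and load it into a local SQLite table.
# File name, column names and table name are hypothetical placeholders.
import sqlite3
import pandas as pd

def extract(path: str) -> pd.DataFrame:
    # Extract: read raw records from a source file.
    return pd.read_csv(path)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Transform: drop incomplete rows and derive a total-value column.
    df = df.dropna(subset=["order_id", "quantity", "unit_price"])
    df["order_value"] = df["quantity"] * df["unit_price"]
    return df

def load(df: pd.DataFrame, db_path: str = "warehouse.db") -> None:
    # Load: append the cleaned records into a warehouse table.
    with sqlite3.connect(db_path) as conn:
        df.to_sql("orders", conn, if_exists="append", index=False)

if __name__ == "__main__":
    load(transform(extract("raw_orders.csv")))
```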
Qualifications:
Bachelor’s or higher degree in Computer Science, Information Systems, or a related field.
Proven experience as a Data Engineer or similar role, working with large-scale data processing and storage systems.
Strong programming skills in languages such as Python, Java, or Scala.
Experience with big data technologies and frameworks like Hadoop, Spark, or Kafka.
Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, or Oracle).
Familiarity with cloud platforms like AWS, Azure, or GCP, and their data services (e.g., S3, Redshift, BigQuery).
Solid understanding of data modeling, data warehousing, and ETL principles.
Knowledge of data integration techniques and tools (e.g., Apache Nifi, Talend, or Informatica).
Strong problem-solving and analytical skills, with the ability to handle complex data challenges.
Excellent communication and collaboration skills to work effectively in a team environment.
Preferred Qualifications:
Advanced knowledge of distributed computing and parallel processing.
Experience with real-time data processing and streaming technologies (e.g., Apache Kafka, Apache Flink); a minimal consumer sketch follows this list.
Familiarity with machine learning concepts and frameworks (e.g., TensorFlow, PyTorch).
Knowledge of containerization and orchestration technologies (e.g., Docker, Kubernetes).
Experience with data visualization and reporting tools (e.g., Tableau, Power BI).
Certification in relevant technologies or data engineering disciplines.
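To illustrate the streaming item above, here is a minimal sketch of consuming events from Kafka using the kafka-python client; the broker address, topic name and event fields are assumptions for the example, not details of this role's stack.

```python
# Minimal streaming-consumption sketch (illustrative only).
# Requires the kafka-python package; broker and topic are assumed.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "clickstream-events",                  # hypothetical topic
    bootstrap_servers="localhost:9092",    # assumed local broker
    auto_offset_reset="earliest",
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # In a real pipeline, events would be validated, enriched and
    # forwarded to a sink (warehouse, cache, alerting, ...) here.
    print(event.get("event_type"), event.get("user_id"))
```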


About OJ Commerce:
OJ Commerce is a fast-growing, profitable online retailer based in Florida, USA, with a full-fledged India office in Chennai, driven by a sophisticated, data-driven system that runs operations with virtually no human intervention. We strive to be a best-in-class ecommerce company, delivering exceptional value to customers by leveraging technology, innovation and brand partnerships to provide a seamless & enjoyable shopping experience with high-quality products at the best prices.
Responsibilities:
- Work with business stakeholders to understand requirements, then prototype, build and deploy solutions.
- Create, maintain and evolve the backend code you own, keeping maintenance, performance and security in mind.
- Keep abreast of the latest technologies and their ecosystems, and adopt those that aid safe product delivery at speed.
- Automate the boring and mundane stuff, for you prefer being productive to being busy.
- We are flat. Be responsible for the professional growth of yourself and the team.
- Tune applications for performance.
- Take initiatives and manage change to work towards business goals at speed without compromising safety.
- Coach full-stack developers on backend skills.
- Provide problem-resolution support for application issues: identify and resolve problems in application software, determine symptoms and ensure accurate problem definition.
- Develop functional, architectural and other documentation as required for the productive functioning of teams.
- Be the brand ambassador for OJ Commerce by speaking at meetups, conferences, etc.
- We are fluid. Be ready for changing dynamics in responsibilities from time to time. Exciting, isn't it?
- Take the lead in the digital transformation of legacy applications.
What you need to shine:
- You have prior experience in modernising legacy applications.
- You are a passionate hands-on developer with deep experience in building enterprise-grade software in Microsoft ASP.NET Core, ASP.NET MVC, Web API, SOA, microservices and RESTful services, with knowledge of the SQL Server database.
- You have the ability to see and work on both the big picture (application architecture) and the devilish details (complex code).
- Strong experience in developing web applications using C#, VB.NET, .NET, LINQ, .NET Framework 4.0, MVC 3/4/5, ASP.NET Web API, .NET Core, etc.
- You are cloud savvy, preferably with Google Cloud.
- You have rich experience in Object-Oriented Programming (OOP) with good knowledge of practical design patterns and their applications.
- Hands-on experience in building SOA or microservices, preferably on .NET Core.
- Proven architectural skills with high standards of code quality.
- Knowledge of ReactJS/TypeScript would be an added advantage.
- Practical experience with Agile development methodologies and CI/CD.
- Extreme Programming (TDD) experience is highly valued.
What we Offer
- Greenfield opportunity to transform legacy backend applications to latest technology stack.
- Fast-paced start-up environment: this is not for the faint-hearted; you need grit and passion as much as you need the core skills.
- Work in an interdisciplinary team where learning from one another and developing solutions cross-functionally is a key part of our culture.
- Golden opportunity to make history by making big business impact.
- Competitive salary to take good care of self and family.
- Insurance Benefits: Medical and Accident cover.
- Flexible Working Hours


Minimum Skills:
- Creating RESTful services with Node.js (Express)
- React & React Native
- Mongoose & MongoDB
Candidates must have good knowledge of Node debugging, understanding of and proficiency in REST APIs, and experience integrating data storage solutions (NoSQL databases, especially MongoDB), and should be comfortable using a Git repository and Jira.
Preferred Skills:
- Hands-on experience with Ubuntu-based servers
- Hands-on experience in deploying on DigitalOcean Droplets
- Hands-on experience with Google Maps integration
We are looking for immediate joiners. Candidates who can join the company within 15 days will be given preference.
Job Location: Pune / Bangalore / Hyderabad / Indore
- Very good knowledge of MuleSoft components.
- Prior work experience in setting up a CoE (Centre of Excellence) using MuleSoft integration software.
- Good understanding of various integration patterns.
- Ability to deliver projects independently with little or no supervision.
- Previous experience working in a multi-geographic team.
- Previous experience applying programming best practices.
- Good written and oral communication skills – English.
About the Company: We are among the top five global media agencies, providing access and scale everywhere our clients do business. Intelligent and imaginative, we create, integrate and scale technology-enabled services with premium partners.
Job Location: Mumbai
Roles and Responsibilities
- Devising solutions for the client by identifying and configuring the technology, data and processes that improve media targeting, and deepen analytical learning. This can encompass comprehensive audits of a client’s current setup, as well as leading data and tech partner evaluations, and defining measurement frameworks to capture the most valuable signals
- Aid in data partner discovery, vetting, onboarding and cataloguing, providing the relevant teams with a view on the latest solutions in the market, as well as current trends and issues relating to data management
- Support the implementation of client audience data strategy programs, specifically how client data is captured, managed, enhanced (through partnerships), deployed and measured for effectiveness, working closely with adjacent disciplines in Planning, AdOps, Activation and Analytics to ensure audience management best practices flow down and through all client media campaigns
- Maintain strong, effective relationships with key partners and suppliers within the ad/mar-tech data space
- Build strong relationships with key client stakeholders, including communicating with them on subjects outside your remit, and build a reputation for excellence in communication
Desired background experience:
Required
- Demonstrable knowledge of the ad tech and data landscape, including hands-on experience managing audience data for your own company or your clients. This includes a working knowledge of the leading data management platforms (DMPs) and customer data platforms (CDPs), knowledge of data onboarding and cross-device matching approaches, and an understanding of the tools and technologies that can activate this data (e.g. DSPs, social platforms)
- Familiarity and experience with cloud-based data solutions (e.g. Google Cloud Platform, Amazon Web Services) and the tools within them (e.g. BigQuery, Ads Data Hub) for data warehousing, insight generation and/or deployment (a minimal query sketch follows this list).
- Strong analytical skills and a natural affinity for numbers are key; you must be able to analyze raw data, draw insights and develop actionable recommendations as needed
- Exceptional verbal & written communication skills, able to build and develop strong relationships with, and communicate effectively with people at all levels of seniority; in-agency, client-side, and in the supplier space
- Strong organizational skills, including experience with resource management, to effectively manage the smooth flow of work through the agency
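As a rough illustration of the BigQuery work mentioned above, here is a minimal sketch using the google-cloud-bigquery client; the project, dataset, table and column names are hypothetical, and credentials are assumed to be configured in the environment.

```python
# Minimal sketch of pulling audience-level aggregates out of BigQuery
# (illustrative only; project, dataset and table names are assumptions).
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

query = """
    SELECT audience_segment, COUNT(DISTINCT user_id) AS reach
    FROM `my-project.media.campaign_events`
    GROUP BY audience_segment
    ORDER BY reach DESC
"""

# Run the query and print one row per audience segment.
for row in client.query(query).result():
    print(row.audience_segment, row.reach)
```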
Regards
Team Merito


Job description
● You'll design and build scalable systems using AI/ML to improve productivity in manufacturing operations
● You'll work on building web apps that are intuitive, intelligent and highly performant
● You will work with Designers, a Solution Architect and a Functional Consultant to define the architecture and build solutions
● You should be able to work in unstructured situations and help structure problems through discussion and solutioning. Taking initiative, listening to others and working collaboratively on technology, product and business would be really important
● You will work directly with founders from IIM, XLRI, DCE
Required Candidate profile
● Have 2-6 years of experience
● Come up with your own goals and don't need heavy direction or daily check-ins
● Have command and confidence on Python Django. Should be able to build complex solutions
● Have an understanding of databases - relational and non-relational - their data models and performance tradeoffs
● Have experience with database design and querying with a focus on performance (see the sketch after this list)
● Have knowledge of REST paradigm, service-oriented architecture and distributed systems
● Have a clear understanding of data structures and algorithms
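For illustration, here is a minimal sketch of the schema and query-performance thinking referred to above, assuming an existing Django project and app; the models and field names are hypothetical.

```python
# Hypothetical models for a manufacturing-productivity app (illustrative only).
from django.db import models

class Machine(models.Model):
    name = models.CharField(max_length=100)
    plant = models.CharField(max_length=100, db_index=True)  # indexed filter column

class DowntimeEvent(models.Model):
    machine = models.ForeignKey(Machine, on_delete=models.CASCADE, related_name="downtimes")
    started_at = models.DateTimeField(db_index=True)
    minutes_lost = models.PositiveIntegerField()

def recent_downtime_for_plant(plant: str):
    # select_related issues one JOINed query instead of one query per machine
    # (avoids the N+1 pattern when the related Machine is accessed later).
    return (
        DowntimeEvent.objects.select_related("machine")
        .filter(machine__plant=plant)
        .order_by("-started_at")[:50]
    )
```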
1. SV, UVM, USB, DDR, PCIe, Ethernet, AXI, MIPI: knowledge of any one of these protocols will be an added advantage.
2. Experience in verification of complex IPs or SoCs.
3. Expertise in SoC verification using C and SV/UVM.
4. Expertise in AMBA protocols (AXI/AHB/APB) and experience working with ARM processors.
5. Expertise in test plan creation and verification technologies such as code coverage, functional coverage and assertions.

We're looking for experienced Python developers with at least 2 years of production experience and strong expertise in building web applications and APIs using Python, Django and DRF. We are looking for candidates who are go-getters and are leaning towards leadership positions. Candidates must have a proven history of building, scaling, optimising and securing Python-based backends and APIs using a microservice architecture.
Bonus Skills : Experience with ReactJs, Node.js, FastAPI
We would prefer candidates who can join immediately or are currently serving their notice period. Frequent job-hoppers, please excuse us; such applications will not be considered.
Key Skills Required
- Proficiency in Python 3.x based web and backend development
- Solid understanding of Python concepts
- Strong experience in building web applications using Django
- Experience building REST APIs using DRF or Flask (a minimal DRF sketch follows this list)
- Experience with some form of Machine Learning (ML)
- Experience in using libraries such as NumPy and pandas
- Hands on experience with RDBMS such as Postgres or MySQL including querying
- Comfort with Git repositories, branching and deployment using Git
- Working experience with Docker
- Basic working knowledge of ReactJs
- Experience in deploying Django applications to AWS, DigitalOcean or Heroku
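To illustrate the DRF item above, here is a minimal sketch of a model, serializer, viewset and router, assuming an existing Django project with rest_framework installed; the model and route names are hypothetical.

```python
# Minimal Django REST Framework sketch (illustrative only).
from django.db import models
from rest_framework import serializers, viewsets, routers

class Article(models.Model):
    title = models.CharField(max_length=200)
    body = models.TextField()
    created_at = models.DateTimeField(auto_now_add=True)

class ArticleSerializer(serializers.ModelSerializer):
    class Meta:
        model = Article
        fields = ["id", "title", "body", "created_at"]

class ArticleViewSet(viewsets.ModelViewSet):
    # Provides list/retrieve/create/update/destroy endpoints.
    queryset = Article.objects.all().order_by("-created_at")
    serializer_class = ArticleSerializer

router = routers.DefaultRouter()
router.register(r"articles", ArticleViewSet)
# The router's URLs would then be included from the project's urls.py.
```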
Responsibilities
- Understanding requirements and contributing to engineering solutions at a conceptual stage to provide the best possible solution to the task/challenge
- Building high-quality code using coding standards, based on the SRS/documentation
- Building component-based, maintainable, scalable and reusable backend libraries/modules
- Building & documenting scalable APIs to the OpenAPI specification
- Unit testing development modules and APIs
- Conducting code reviews to ensure that the highest quality standards are maintained
- Securing backend applications and APIs using industry best practices
- Troubleshooting issues and fixing bugs raised by the QA team efficiently
- Optimizing code
- Building and deploying the applications
Employment type: Full-time


Job Description:
In this role, you will:
- Be responsible for building APIs, coding, documenting and maintaining scalable web/mobile applications in a fast-paced environment
- Be involved in the conceptualization of IoT product features & AI model integrations, and in designing, development and debugging in a real-time environment
- Collaborate with multiple stakeholders to deliver products in an agile environment
- Practice continuous integration and continuous deployment (CI/CD)
Required Skills :
- Good understanding of Data Structures & Algorithms with strong analytical skills.
- Deep understanding of JavaScript and web fundamentals like HTML, CSS3, Bootstrap 3, SDKs, cookies and sockets.
- Experience in programming languages and frameworks: Node.js, Python, Java, Angular/ReactJS.
- Knowledge of relational and non-relational databases and caches like MySQL, MongoDB, Elasticsearch, Redis, etc.
- Knowledge of RESTful paradigms and experience building/consuming APIs, microservices & system design architecture.
- Experience in client-side technologies like JavaScript, jQuery, TypeScript and ReactJS.
- Good at building the UI/UX of web applications and integrating the backend and frontend.
- Capable of hosting AI models on the AWS cloud by developing APIs (a minimal serving sketch follows this list).
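As a rough illustration of serving a model behind an API as mentioned above, here is a minimal FastAPI sketch; the pickled model file, feature names and endpoint are hypothetical, and packaging it for AWS (e.g. in a container) is out of scope here.

```python
# Minimal model-serving sketch (illustrative only; model file and
# feature names are hypothetical placeholders).
import pickle
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

with open("model.pkl", "rb") as f:   # assumed pre-trained, pickled model
    model = pickle.load(f)

class Features(BaseModel):
    temperature: float
    vibration: float
    pressure: float

@app.post("/predict")
def predict(features: Features):
    # Assumes a scikit-learn-style model exposing predict() on a 2-D array
    # and returning a numeric prediction.
    prediction = model.predict(
        [[features.temperature, features.vibration, features.pressure]]
    )
    return {"prediction": float(prediction[0])}
```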
Desired Skills:
- Ability to break complex projects into modules and propose effective solutions in view of capabilities of existing platforms and infrastructure.
- Experience in managing build/deployment pipelines for continuous integration and continuous delivery to improve the quality and availability of products & services.
- Understanding of cloud architecture and cloud deployments - AWS SQS, Lambda, EC2, S3, Azure, and other cloud technologies

