


Proactively fetch information from various sources and analyze it to better understand how the business performs, and build AI tools that automate processes within the company.
Roles & Responsibilities
- Develop novel computer vision/NLP algorithms
- Build large datasets that will be used to train the models
- Empirically evaluate related research works
- Train and evaluate deep learning architectures on multiple large-scale datasets
- Collaborate with the rest of the research team to produce high-quality research
- Manage a team of 2+ interns
Must-have skills
- 2+ years of experience building deep learning models
- Strong fundamentals in probability and statistics, linear algebra, and data structures & algorithms
- Good knowledge of classic ML algorithms (regression, SVM, PCA, etc.) and deep learning
- Strong programming skills
Nice-to-have skills
- Familiarity with PyTorch
- Knowledge of SOTA techniques in NLP and Vision
Benefits
- High level of responsibility and ownership for a product impacting billions of lives.
- Extremely high-quality talent to work with, on a global team across the US and India.
- Work from anywhere, anytime!
- Best-of-breed industry benefits packages.

Job Title: Data Engineer
Cargill’s size and scale allows us to make a positive impact in the world. Our purpose is to nourish the world in a safe, responsible and sustainable way. We are a family company providing food, ingredients, agricultural solutions and industrial products that are vital for living. We connect farmers with markets so they can prosper. We connect customers with ingredients so they can make meals people love. And we connect families with daily essentials — from eggs to edible oils, salt to skincare, feed to alternative fuel. Our 160,000 colleagues, operating in 70 countries, make essential products that touch billions of lives each day. Join us and reach your higher purpose at Cargill.
Job Purpose and Impact
As a Data Engineer at Cargill, you will work across the full stack to design, develop and operate high-performance, data-centric solutions using our comprehensive and modern data capabilities and platforms. You will play a critical role in enabling analytical insights and process efficiencies for Cargill’s diverse and complex business environments. You will work in a small team that shares your passion for building innovative, resilient and high-quality solutions while sharing, learning and growing together.
Key Accountabilities
· Collaborate with business stakeholders, product owners and your team on product or solution designs.
· Develop robust, scalable and sustainable data products or solutions utilizing cloud-based technologies.
· Provide moderately complex technical support through all phases of product or solution life cycle.
· Perform data analysis, handle data modeling, and configure and develop data pipelines to move and optimize data assets.
· Build moderately complex prototypes to test new concepts and provide ideas on reusable frameworks, components and data products or solutions and help promote adoption of new technologies.
· Independently solve moderately complex issues with minimal supervision, while escalating more complex issues to appropriate staff.
· Other duties as assigned.
Qualifications
MINIMUM QUALIFICATIONS
· Bachelor’s degree in a related field or equivalent experience
· Minimum of two years of related work experience
· Other minimum qualifications may apply
PREFERRED QUALIFICATIONS
· Experience developing modern data architectures, including data warehouses, data lakes, data meshes, hubs and associated capabilities including ingestion, governance, modeling, observability and more.
· Experience with data collection and ingestion capabilities, including AWS Glue, Kafka Connect and others.
· Experience with data storage and management of large, heterogeneous datasets, including formats, structures and cataloging, with tools such as Iceberg, Parquet, Avro, ORC, S3, HDFS, Hive, Kudu or others.
· Experience with transformation and modeling tools, including SQL-based transformation frameworks and orchestration and quality frameworks such as dbt, Apache NiFi, Talend, AWS Glue, Airflow, Dagster, Great Expectations, Oozie and others.
· Experience working in big data environments, including tools such as Hadoop and Spark.
· Experience working in cloud platforms, including AWS, GCP or Azure.
· Experience with streaming and stream-integration or middleware platforms, tools and architectures such as Kafka, Flink, JMS or Kinesis.
· Strong programming knowledge of SQL, Python, R, Java, Scala or equivalent
· Proficiency in engineering tooling, including Docker, Git and container orchestration services.
· Strong experience working in DevOps models, with a demonstrable understanding of associated best practices for code management, continuous integration and deployment strategies.
· Experience and knowledge of data governance considerations, including quality, privacy and security, and their implications for data product development and consumption.
Equal Opportunity Employer, including Disability/Vet.
Numerator is looking for an experienced, talented and quick-thinking DevOps Manager to join our team and work with the global DevOps groups to keep infrastructure up to date and continuously advancing. This is a unique opportunity where you will get the chance to work on the infrastructure of both established and greenfield products. Our technology harnesses consumer-related data in many ways, including gamified mobile apps, sophisticated web crawling and enhanced deep learning algorithms, to deliver an unmatched view of the consumer shopping experience. As a member of the Numerator DevOps Engineering team, you will make an immediate impact as you help build out and expand our technology platforms from on-premises to the cloud across a wide range of software ecosystems. Many of your daily tasks and engagements with application teams will help shape how new projects are delivered at scale to meet our clients’ demands. This role requires a balance between hands-on infrastructure-as-code deployments with application teams and working with the Global DevOps team to roll out new initiatives.
Location: Mumbai
Notice period: currently serving
Client: one of the Big 4
Job Location: New Delhi
What you’ll do:
- Contribute to all phases of the Software development lifecycle
- Write well-designed, testable and efficient code, including complex code built on data structures and algorithms.
- Ensure designs comply with specifications for Java and REST APIs.
- Prepare and produce releases of software components.
- Support continuous improvement by investigating alternative technologies and presenting these for architectural review.
- Write complex code, test it and fix bugs.
- Design and maintain the database and organize collected information.
- Work as a specialized programmer across multiple types of development.
What you’ll bring:
- 2 to 5 years of proven hands-on software development experience.
- Strong experience in Java programming
- Expert knowledge of data structures and algorithms
- Proven working experience in a product-based organization (preferred)
- Understanding of Java frameworks: Spring Boot, microservices and ORM frameworks.
- Proven experience designing and building scalable REST APIs.
- Object-oriented analysis and design using common design patterns.

Software Engineer - II (Backend)
As a Software Engineer – II, you'll bring a good understanding of design and coding practices and apply these independently in development. In this role, you can expect to:
- Own the lifecycle of a feature from requirement analysis to deployment and post production processes
- Perform code reviews and support your team implement best practices
- Explore trade-offs in module designs
- Continuously improve performance metrics of owned modules; perform RCAs and deploy long-term fixes.
- Implement initiatives that improve engineering efficiency and excellence
- Participate in the hiring and interview process for junior developers on your team
A leading Indian pharmaceutical MNC is looking for a Zonal Sales Manager (ZSM) to take care of its FMCG portfolio of personal care, baby care and food & beverages businesses in the southern states of India.
The ZSM will own all sales and revenue-generating activities in the zone, leading a team of ASMs to drive trade and marketing initiatives and expand retail coverage profitably.
The ZSM will be responsible for distribution in General Trade for the zone, for running robust daily, weekly and monthly review mechanisms within the team, and for ensuring that all trade-initiative reporting is completed on time and that SFA and DMS are 100% utilised.
Working experience in the state of Karnataka or Kerala is essential.



You can code comfortably in Python
Working knowledge of streaming media protocols, technologies, and standards (streaming, compression, and transcoding): HTTP Live Streaming (HLS), RTMP, RTSP, etc.
Good grasp of Linux and cloud servers (AWS or Azure).
Working knowledge of data structures and algorithms.
Working knowledge of SQL, NoSQL and graph databases (MySQL, MongoDB, Cassandra, Redis, SQL/JSON).
Working knowledge of API architectures and microservices.
Good working knowledge of GitHub and Docker.
Working knowledge of distributed computing and multiprocessing.
GOOD TO KNOW
Knowledge of pandas, Luigi, Celery, Django and Flask.
Experience working with big data.
Experience with message-queuing tools such as Kafka, ZeroMQ and others.
You have exceptional knowledge of encryption, security & networking.
The present role is a Backend Developer – SDE role for the Goscale - Apna Tech collaboration.
About Goscale
Goscale is a premium technology company focused on helping companies build world-class, scalable products.
Goscale works with premium product companies (Indian and international) such as Swiggy, ShareChat, Grab, Capillary, Uber, Workspan, Ovo and many more. We are responsible for managing infrastructure for Swiggy as well.
We focus on building only world-class tech products, and our USP is building technology that can handle scale from 1 million to 1 billion hits.
We invite candidates who have a zeal to develop world-class products to come and work with us.
About Apna
apna is India’s most trusted and fastest-growing professional platform. apna aims to connect the right jobs to the right people in the shortest possible time and deliver a delightful, holistic experience to job seekers and recruiters.
It is a one-of-a-kind comprehensive professional app where, alongside job search, you can do a lot more: learn new skills, network with like-minded people, get support to start your business, and engage with professional communities to hone your skills and share your expertise.
In the words of our users, apna is one of the most rewarding interactive career apps in India.
Responsibilities:
Take ownership of end-to-end product development
Be a champion of the Test-Driven Development (TDD) methodology
Build reusable code and libraries for future use
Optimize the application for maximum speed and scalability
Implement security and data protection
Write technical documentation for the owned product
Skills and Qualifications:
Expert level in OOP concepts and REST API development; typically 4+ years of professional experience
Proficiency in one or more of the following languages: Python, Java, Ruby, Go
Knowledge of Elasticsearch, RabbitMQ, Redis
Integration of multiple data sources and databases into one system
Implementing automated testing platforms and unit tests
Proficient understanding of Git
Knowledge of GCP would be a plus
Identify and assess customers’ needs to achieve satisfaction.
Build sustainable relationships and trust with customer accounts through open and interactive communication.
Provide accurate, valid and complete information by using the right methods and tools.
Handle customer complaints, provide appropriate solutions and alternatives within the time limits, and follow up to ensure resolution.
Keep records of customer interactions and process customer accounts.
Follow communication procedures, guidelines and policies.
Go the extra mile to engage customers.

